[NTLUG:Discuss] OT Perl vs C question

Tom Hayden tom.hayden.iii at mail.airmail.net
Tue Apr 26 23:08:10 CDT 2005


Fred,
	This sounds similar to a problem I encountered when I was working at 
Weblink Wireless. It sounds like the SQL may be doing a sort on the data 
before it puts out the results. If that is the case, it will not matter 
whether you use C or Perl; you will have to do something about the SQL. 
One solution would be to run a daily SQL query, save the results in a 
separate table for that day, and then "consolidate" all the tables when 
you want to run the report. Another thing to look for is whether you are 
using a "correlated" subquery. If you are, you need to rewrite it as a 
non-correlated query. Your SQL guys should know what I am talking 
about, and should know that it is ALWAYS possible to rewrite the query 
in a non-correlated form. If they do not understand this, FIRE THEM!
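To illustrate the difference (a sketch only, using Python's built-in 
sqlite3 as a stand-in for whatever RDBMS you are actually on; the 
`calls` table and its columns are invented for the example): a 
correlated subquery re-executes its inner SELECT once per outer row, 
while the join rewrite lets the server aggregate once and scan once.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE calls (acct INTEGER, dur INTEGER);
    INSERT INTO calls VALUES (1, 10), (1, 30), (2, 5), (2, 25);
""")

# Correlated form: the inner SELECT runs once for every outer row.
correlated = con.execute("""
    SELECT acct, dur FROM calls c
    WHERE dur = (SELECT MAX(dur) FROM calls m WHERE m.acct = c.acct)
""").fetchall()

# Non-correlated rewrite: aggregate once, then join the result back.
rewritten = con.execute("""
    SELECT c.acct, c.dur
    FROM calls c
    JOIN (SELECT acct, MAX(dur) AS md FROM calls GROUP BY acct) m
      ON m.acct = c.acct AND c.dur = m.md
""").fetchall()

print(sorted(correlated))  # same rows either way
print(sorted(rewritten))
```

Both queries return the same rows; on a large table the second form is 
usually the one the optimizer can do in a single pass.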
	One other thing that stands out: you say you are using "flat files 
produced by SQL queries." I do not know which SQL implementation you 
are using, but with Oracle you can use Pro*C (Oracle's extension to C) 
to process the results of the SQL queries on the fly. This may speed 
things up. Other SQL implementations should have a similar facility. 
These would all be akin to the Perl DBI module that Chris suggested. In 
all cases, it would not make much difference whether you used C or 
Perl; it is the direct interface to the SQL server that provides the 
increased speed.
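The shape is the same in any of these interfaces. Here is a sketch 
using Python's DB-API (sqlite3) purely as a stand-in for Pro*C or 
Perl's DBI, with an invented `traffic` table: you iterate the cursor 
and each row arrives already split into typed columns, so there is no 
flat file to write and nothing to re-parse.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE traffic (acct INTEGER, day TEXT, dur INTEGER);
    INSERT INTO traffic VALUES (1, '2005-04-25', 10),
                               (1, '2005-04-26', 30),
                               (2, '2005-04-25', 5);
""")

# Stream the result set row by row -- each row is already a tuple of
# typed columns, so there is no split() or parsing step.
totals = {}
cur = con.execute("SELECT acct, dur FROM traffic")
for acct, dur in cur:          # the cursor fetches as you iterate
    totals[acct] = totals.get(acct, 0) + dur

print(totals)  # {1: 40, 2: 5}
```

The point is not the language; it is that the aggregation happens as 
the rows come off the server, instead of after a detour through disk.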

	I hope this helps.

Fred James wrote:
> Chris J Albertson wrote:
> 
>> It seems to me that it would be easier, more reliable, and save on disk
>> space to access the SQL server directly via Perl, instead of copying the
>> SQL result sets into temporary local files. When you read in the files,
>> you're going to have to parse the data, eating up CPU cycles. The SQL
>> result sets are going to be parsed into columns already, by the nature of
>> the RDBMS. Perl's DBI module comes to mind here as a solution.
>> Flat files are evil. :)
>>
>> Chris
>>
> Chris Albertson
> Admitting that I should not be considered a prize SQL programmer, I have 
> relied upon our SQL developers to produce as efficient a query as 
> possible.  That said ...
> 
> Evil flat files notwithstanding, the SQL has been tested and it runs 
> for over 24 hours to acquire a 30-day range of data (that test run 
> being on a weekend, which is non-peak hours for us).  Our final run must 
> collect a 365-day range of data - the estimated running time for the 
> SQL would therefore be about 304 hours (12.67 days).  We were kind of 
> hoping to run this query at least once a week, if not daily.
> 
> All sad, but true.
> Regards
> Fred James
> 


-- 
Tom Hayden III

Coherent solutions for chaotic situations

tom.hayden.iii at mail.airmail.net
214-435-4174

1531 San Antone ln.
Lewisville Texas 75077



