[NTLUG:Discuss] OT Perl vs C question

Paul Ingendorf pauldy at wantek.net
Tue Apr 26 16:34:43 CDT 2005


It will be faster in C, using an array of pointers and reallocating space as
needed, but it is far easier for someone who doesn't know how to manage
memory in C to simply script it out in Perl.  Chances are the speed hit on a
modern machine will not be a major factor unless you are processing hundreds
of these files a day.  I can't imagine each of the files you have described
taking more than 20-30 seconds apiece to complete.
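
For what it's worth, the Perl version of steps 2 and 3 below could be
sketched in about a dozen lines.  The record layout here is hypothetical
(pipe-delimited fields, first field a grouping key, last field a numeric
amount to total), but the shape would be the same for any layout: the hash
and arrays size themselves, so there is no memory to manage.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical layout: pipe-delimited records, first field is a
    # grouping key, last field is a numeric amount to be totaled.
    my %total;
    while (my $line = <>) {              # reads every file named on the command line
        chomp $line;
        my @fields = split /\|/, $line;  # @fields sizes itself per record
        $total{ $fields[0] } += $fields[-1];  # hash entries spring into being
    }

    # Step 3: a few final calculations and a formatted report.
    for my $key (sort keys %total) {
        printf "%-20s %12.2f\n", $key, $total{$key};
    }

Run it as "perl report.pl file1.dat ... file7.dat" and the seven flat files
are consolidated in one pass.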

-----Original Message-----
From: discuss-bounces at ntlug.org [mailto:discuss-bounces at ntlug.org] On
Behalf Of Fred James
Sent: Tuesday, April 26, 2005 3:30 PM
To: NTLUG Discussion List
Subject: [NTLUG:Discuss] OT Perl vs C question


All
I have a multi-part data-to-report processing project:
(1) get some data (flat files produced by SQL queries)
(2) process the data (dump it all into array(s), consolidating and
totaling on the fly)
(3) produce reports (pass back through the array(s), do a few final
calculations and formatting, and output formatted report(s))

I am considering Perl because it seems to have the nice feature of
unlimited arrays without having to declare them or allocate space (is that true?)
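
It is: a Perl array needs no declared size and grows whenever an element
is assigned or pushed.  A minimal illustration:

    my @a;                    # no size declared, no space allocated
    $a[999] = 'x';            # array grows to 1,000 slots on assignment
    push @a, 'y';             # and keeps growing as needed
    print scalar(@a), "\n";   # prints 1001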

So, my question:  In a moderate-volume data-processing project, say
reading 7 flat files of 3 to 7 fields each, each with > 500,000 records,
and doing something like steps 2 and 3 above, how does Perl compare to
C in terms of speed?
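
If you want a hard number rather than a guess, you can time the
consolidation loop on one sample extract with Perl's core Benchmark
module.  A rough sketch, again assuming the hypothetical pipe-delimited
layout above ('sample.dat' is a stand-in name for one extract):

    use strict;
    use warnings;
    use Benchmark qw(timethis);

    # Time 10 full passes over the sample file; timethis prints
    # wall-clock and CPU time when it finishes.
    timethis(10, sub {
        open my $fh, '<', 'sample.dat' or die "sample.dat: $!";
        my %total;
        while (my $line = <$fh>) {
            my @f = split /\|/, $line;
            $total{ $f[0] } += $f[-1];
        }
        close $fh;
    });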

Thank you in advance for any help you may be able to offer
Regards
Fred James

--
Compassion alone stands apart from the continuous traffic between good and
evil proceeding within us.  "Om Mani Padme Hum"


_______________________________________________
https://ntlug.org/mailman/listinfo/discuss

More information about the Discuss mailing list