[NTLUG:Discuss] find, xargs, grep
Robert Thompson
ntlug at thorshammer.org
Thu Aug 4 16:17:46 CDT 2005
> Can you give a bit more details on what exactly you did when your
> computer balked?
I don't remember exactly what I did, as it was a few years ago. I do
remember it was on Solaris when it died (I don't remember the version,
though). I just filed the lesson away: if you want to search through a
long list of files, do the "find -type f -exec grep" dance instead of
the "grep <long list of files>" thing.
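For the record, the difference between the variants discussed in this
thread looks roughly like this. It is only a sketch: the directory is a
throwaway temp dir and the pattern "foo" is the placeholder from the
thread.

```shell
# Throwaway test directory standing in for /path in the thread.
dir=$(mktemp -d)
printf 'foo\n' > "$dir/a.txt"
printf 'bar\n' > "$dir/b.txt"

# Shell expansion builds one huge argv and can fail with
# "Argument list too long" once it exceeds the kernel's ARG_MAX:
#   grep "foo" $(find "$dir" -type f)

# One grep per file: never hits the limit, but forks a grep per file:
find "$dir" -type f -exec grep -l "foo" {} \;

# xargs batches filenames into argv-sized groups: same safety, far
# fewer processes; -print0 / -0 also copes with spaces in names:
find "$dir" -type f -print0 | xargs -0 grep -l "foo"

rm -r "$dir"
```

Both safe forms print only the matching file (a.txt here); the xargs
form simply does it with far fewer grep invocations on a big tree.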
Rob T
Robert Citek wrote:
>
> On Jul 28, 2005, at 8:01 PM, Robert Thompson wrote:
>
>> > find /path -type f | xargs grep "foo"
>>
>> I've had that command puke and die because the arg list to grep was
>> too long.
>
>
> I don't doubt that a command puked and died, but are you sure it was
> because of using xargs? Are you sure you are not thinking of command
> expansion? For example:
>
> grep "foo" $(find /path -type f )
>
> The whole point of using xargs is so that the argument list doesn't get
> too long.
>
> Can you give a bit more details on what exactly you did when your
> computer balked? A specific example would be ideal.
>
>> A better version is:
>>
>> find /path -type f -exec grep "foo" {} \;
>
>
> That depends on what you mean by better. In your example a new grep
> process will spawn and die for each file found. 100,000 files means
> 100,000 processes. Using xargs, a new process is spawned for a group
> of files. In a recent test I got groups of roughly 500 files per
> command, so 100,000 files means only about 200 processes, which is a
> lot faster and a lot less resource-intensive.
>
> Regards,
> - Robert
> http://www.cwelug.org/downloads
> Help others get OpenSource software. Distribute FLOSS
> for Windows, Linux, *BSD, and MacOS X with BitTorrent
>
>
> _______________________________________________
> https://ntlug.org/mailman/listinfo/discuss
>
>