Large number of files in single directory

Stephen Carville stephen at totalflood.com
Wed May 25 17:50:58 UTC 2005


Burke, Thomas G. wrote:
> I delete them by character...  e.g. rm -rf *1.tmp, rm -rf *2.tmp, and so on.  I don't know of any other way to do it, although I once wrote a little C program to handle it for me.
>  
>     -Tom
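
A C program shouldn't be necessary; find can delete the files one at a
time (or in batches via xargs) without ever building an oversized
argument list.  A sketch, reusing the *.tmp pattern from your example:

$ find . -name '*.tmp' -exec rm -f {} \;

or, with fewer rm invocations, since xargs batches arguments to stay
under the limit:

$ find . -name '*.tmp' -print | xargs rm -f

(Use -print0 with xargs -0 instead if any names contain spaces.)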
> 
> -----Original Message-----
> From: redhat-list-bounces at redhat.com [mailto:redhat-list-bounces at redhat.com] On Behalf Of Chris
> Sent: Wednesday, May 25, 2005 1:19 PM
> To: redhat-list at redhat.com
> Subject: Large number of files in single directory
> 
> 
> 
> There seems to be a filesystem limitation on most flavors of Linux I've 
> worked on in terms of the maximum number of files in a single directory, 
> beyond which tools like tar, gzip, rm, mv, and cp stop working properly.  
> For example, I have some users with 2000+ files in a single directory 
> (some with as many as 10,000 files), and trying to tar these directories 
> always fails with "argument list too long." 

This is because of the limit on the length of the command line.  There is 
also a limit on the number of files in a directory, but I don't know what 
it is and I haven't hit it yet.  IIRC, the largest directories I've dealt 
with held 40K+ files.
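
To be precise, the "Argument list too long" error comes from the kernel, 
not from tar or rm: the shell expands the glob and passes every matching 
name to exec(), and the kernel rejects the call once the combined 
argument bytes exceed ARG_MAX (traditionally 128 KB on Linux).  You can 
check the limit and reproduce the failure; the directory below is just a 
hypothetical example with enough files to overflow it:

$ getconf ARG_MAX
131072
$ cd /var/tmp/bigdir
$ ls *.tmp
bash: /bin/ls: Argument list too long

Plain "ls" with no arguments still works, because it reads the directory 
itself rather than receiving the names through exec().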

> Is there a way for tar and these other tools to "see" all these files and 
> process them as normal?  I recall once I had to resort to something like 
> "find . -print | xargs rm -fr" to remove thousands of files from a single 
> directory.  Is doing something similar but replacing "rm" with "tar" the 
> only way to make this work, or does tar have some sort of command line 
> switch (I couldn't find one) to work with extremely long argument lists? 

Try:

$ find <directory> -type f -exec tar -rvf foo.tar {} \;

Note the -r (append) rather than -c: tar runs once per file here, and -c 
would overwrite foo.tar on every invocation.  The -type f keeps find 
from handing tar the directory itself, which tar would then archive 
recursively on top of the individual files.
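
Running tar once per file is slow for thousands of files, though.  Since 
Red Hat ships GNU tar, a faster variant (a sketch; foo.tar and 
<directory> are placeholders as above) is to let find print the names 
and have tar read the list from stdin with -T:

$ find <directory> -type f -print | tar -cvf foo.tar -T -

If any filenames contain spaces or newlines, the null-terminated form is 
safer:

$ find <directory> -type f -print0 | tar -cvf foo.tar --null -T -

Either way tar is invoked exactly once, so -c is correct and no long 
argument list is ever built.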

> Chris 
> 
> 


-- 
Stephen Carville <stephen at totalflood.com>
Unix and Network Admin
Nationwide Totalflood
6033 W. Century Blvd
Los Angeles, CA 90045
310-342-3602



