compression tools

Steven W. Orr steveo at syslang.net
Wed Jun 22 18:33:11 UTC 2005


On Wednesday, Jun 22nd 2005 at 19:42 +0200, quoth Gérard Milmeister:

=>On Wed, 2005-06-22 at 13:25 -0400, Scot L. Harris wrote:
=>> On Wed, 2005-06-22 at 12:54, sharif islam wrote:
=>> > Is there any compression mechanism that's rated for tar files as big
=>> > as, say, 100GB?
=>> 
=>> I don't believe anyone rates compression programs based on the size of
=>> the file to be compressed.
=>
=>For files of that size, it is probably useful for the compression program
=>to be fast. Bzip2, for example, would take ages to compress 100GB. Gzip
=>with a low compression level (< 5) would be more useful.

I think you'll find that gzip and bzip2 will give very similar results 
in both time and space if you lower the compression level appropriately.

e.g., bzip2 -1 and gzip -1 will be very comparable. 
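A quick way to check this yourself is to time both tools at their lowest
setting on the same data. A rough sketch (assumes gzip and bzip2 are on
your PATH; the file names here are placeholders, with `seq` output standing
in for a real tarball):

```shell
# Generate some stand-in test data (substitute your real tar file).
seq 1 100000 > sample.dat

# Lowest compression level for each tool; compare wall-clock time and size.
time gzip  -1 -c sample.dat > sample.dat.gz
time bzip2 -1 -c sample.dat > sample.dat.bz2

# Compare the resulting sizes.
ls -l sample.dat sample.dat.gz sample.dat.bz2
```

On real data you would point both at the same tarball and see how close
the times and ratios come out at level 1.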

The point is that the larger the chunk any compression program gets to 
operate on, the more compression you'll end up with, and the greater the 
time required for that chunk.

-- 
Time flies like the wind. Fruit flies like a banana. Stranger things have  .0.
happened but none stranger than this. Does your driver's license say Organ ..0
Donor? Black holes are where God divided by zero. Listen to me! We are all 000
individuals! What if this weren't a hypothetical question?
steveo at syslang.net


More information about the fedora-list mailing list