VS: Unzipping problem | write error (disk full?)

Tatu Salin tatusalin at hotmail.com
Mon Apr 12 11:35:14 UTC 2010


Another possibility is that your file has become corrupted, which is why the CRC check for it is wrong. Please try transferring the file again in binary mode to the machine where you are unzipping it; I believe the file cannot be unzipped because the transfer broke it.
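A quick way to rule transfer corruption in or out is to compare a checksum of the archive on both machines and to CRC-test the archive without extracting it (from the shell, `unzip -t Master032010.zip` does the latter). Here is a minimal sketch of both checks using Python's standard zipfile and hashlib modules; the member name and data below are invented for the demonstration:

```python
import hashlib
import io
import zipfile

def first_corrupt_member(path_or_buf):
    """Return the name of the first member whose CRC check fails,
    or None if the whole archive is intact (like `unzip -t`)."""
    with zipfile.ZipFile(path_or_buf) as zf:
        return zf.testzip()

def md5sum(data: bytes) -> str:
    """Checksum to compare on both ends of the transfer."""
    return hashlib.md5(data).hexdigest()

# Demonstration with an in-memory archive; the member name is made up.
# ZIP_STORED keeps the data uncompressed, so a flipped byte shows up as
# a clean CRC mismatch rather than a decompression error.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_STORED) as zf:
    zf.writestr("SongMaster.txt", "some song data\n" * 1000)
good = buf.getvalue()
print(md5sum(good))                            # compare this on both machines
print(first_corrupt_member(io.BytesIO(good)))  # None: archive is intact

# Flip one byte in the member data, as a bad transfer might.
bad = bytearray(good)
bad[len(bad) // 2] ^= 0xFF
print(first_corrupt_member(io.BytesIO(bad)))   # SongMaster.txt
```

If the check names a corrupt member right after the transfer, re-transfer the archive in binary mode and test again before unzipping.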

Hope that helps.
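Regarding the disk-space suggestion quoted below: the space an extraction will need can be read from the archive's own directory before unzipping anything (from the shell, `unzip -l Master032010.zip` prints each member's uncompressed size). A small sketch of the same check with Python's zipfile module; the file name and contents here are invented:

```python
import io
import zipfile

def bytes_needed_to_extract(path_or_buf) -> int:
    """Sum of uncompressed member sizes, read from the archive's
    central directory without extracting anything."""
    with zipfile.ZipFile(path_or_buf) as zf:
        return sum(info.file_size for info in zf.infolist())

# Demonstration with an in-memory archive; name and content are invented.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("SongMaster.txt", "la " * 100_000)  # repetitive, compresses well

archive_size = len(buf.getvalue())
needed = bytes_needed_to_extract(io.BytesIO(buf.getvalue()))
# The archive on disk is far smaller than the data it will extract.
print(archive_size, needed)
```

Compare the printed total against the free space `df -h` reports for the target filesystem before extracting.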

> Date: Mon, 12 Apr 2010 13:11:25 +0200
> From: rodrigo.garcia at kotasoft.com
> To: redhat-list at redhat.com
> Subject: Re: VS:  Unzipping problem | write error (disk full?)
> 
> Hi,
> 
> I have no problem unzipping the file on my own machine (Ubuntu 9.10) and 
> on Windows, but on this other machine I have the mentioned problem.
> 
> The partition where I'm trying to unzip has more than 600GB of space, 
> and the zip file content is 2,5 GB.
> 
> There isn't any command
> 
> uncompress
> 
> in the machine.
> 
> Thanks
> 
> Tatu Salin escribió:
> > Hello. Have you tried the uncompress command? That might help. There is also a possibility that your zip file cannot be unzipped, because zip typically shrinks files to a fraction of their original size, so the extracted file needs much more room than the archive itself. The "disk full" error means that the filesystem fills up while you are unzipping the file. Please try unzipping on another filesystem with more free space. Kind regards, IT specialist.
> >
> >
> > -----Original Message-----
> > From: rodrigo.garcia at kotasoft.com
> > Sent: 4/12/2010 9:53:46 AM
> > To: redhat-list at redhat.com
> > Subject: Unzipping problem | write error (disk full?)
> > Hi,
> >
> > I'm new to redhat. In fact I'm not the systems administrator, but I have
> > a strange problem unzipping a file. I think it is a problem of memory or
> > swap space or something similar, but I'm going to explain the problem in detail:
> >
> > Distribution:
> > [B]Red Hat Enterprise Linux ES release 4 (Nahant Update 3)[/B]
> >
> > [B]I'm connected as root.[/B]
> >
> > I have this zipped file:
> >
> > -rw-r--r--   1 root root 678183271 Apr  7 15:30 Master032010.zip
> >
> > it contains a 2,4G file
> >
> >
> > [CODE]df -h
> > Filesystem            Size  Used Avail Use% Mounted on
> > /dev/cciss/c0d0p6     4,0G  3,5G  282M  93% /
> > /dev/cciss/c0d0p1     124M   13M  105M  11% /boot
> > none                  4,0G     0  4,0G   0% /dev/shm
> > /dev/cciss/c0d0p7      56G   17G   37G  31% /opt
> > /dev/cciss/c0d0p3     985M   18M  917M   2% /tmp
> > /dev/cciss/c0d0p5     5,0G  404M  4,3G   9% /var
> > /dev/sda1             270G  220G   37G  86% /database
> > /dev/sdb1             125G   63G   56G  54% /data
> > /dev/sdc1             826G  138G  646G  18% /data2[/CODE]
> >
> >
> > The file is in /dev/sdc1  under /data2/elos/files/in
> > as you can see this partition has space enough to unzip a 2,4G file
> > I have also tried to do the same in the /dev/sda1 under /database but in
> > both cases I have the same problem:
> >
> > [CODE][root at ELOS-BD in]# unzip Master032010.zip
> > Archive:  Master032010.zip
> >   inflating: SongMaster201003.txt
> > SongMaster201003.txt:  write error (disk full?).  Continue? (y/n/^C) y
> > bad CRC 9695f189  (should be 0cab6361)[/CODE]
> >
> >
> > I have typed free:
> >
> > [CODE][root at ELOS-BD in]# free -m
> >              total       used       free     shared    buffers     cached
> > Mem:          8054       8038         16          0          5       7222
> > -/+ buffers/cache:        809       7244
> > Swap:         1023         72   [/CODE]
> >
> >
> > I have unzipped the same file on my own machine (Ubuntu 9.10) and on
> > Windows with no problem.
> >
> >
> > Any idea?
> >
> > Thanks and regards.
> >   
> 
