VS: Unzipping problem | write error (disk full?)
rodrigo.garcia at kotasoft.com
Mon Apr 12 13:23:15 UTC 2010
I can't reproduce the fix; I still have the same problem...
rodrigo.garcia at kotasoft.com wrote:
> No problem with the file:
>
> [root at ELOS-BD in]# unzip -t Master032010.zip
> Archive: Master032010.zip
> testing: SongMaster201003.txt OK
> No errors detected in compressed data of Master032010.zip.
>
> I'm connected as root.
>
> But you are right: it seems that after changing the permissions to
> 777 it unzips OK.
>
> I thought it might be a problem with the unzip program lacking
> large-file support, but changing the permissions seems to have
> solved it.
>
> Thanks.
>
> Best regards.
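A quick way to test the large-file suspicion above: Info-ZIP unzip can only extract files over 2 GB when it was compiled with large-file support, and `unzip -v` (with no archive argument) lists the build options. A minimal sketch, assuming a Linux box (the grep pattern matches the option string printed by UnZip 6.0 builds):

```shell
# Count how many times LARGE_FILE_SUPPORT appears in unzip's
# compile-time option listing; 0 means no large-file support
# (or that unzip is not installed at all).
support=$(unzip -v 2>/dev/null | grep -c LARGE_FILE_SUPPORT)
if [ "$support" -gt 0 ]; then
  echo "large-file support present"
else
  echo "large-file support missing (or unzip not installed)"
fi
```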
>
> Burke, Thomas G. wrote:
>> What he's saying is that perhaps the file was corrupted during the
>> transfer. This is rare these days, but not unheard of - especially
>> if it was an FTP transfer and the wrong file type was set.
>> Also, someone else mentioned checking the properties of the file -
>> hit it with a "chmod 777 Master032010.zip" before trying to unzip it.
>> One last thing to check is whether or not you have write access to
>> the place where you are trying to unzip the file. If not, this could
>> be the cause of the write errors. If the directory contains
>> "sensitive" databases, it's entirely possible it's been set to
>> read-only. Check that, too.
>>
>> -Tom
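The write-access check suggested above can be scripted before attempting the extraction. A minimal sketch, using a temporary directory as a stand-in for the real destination (/data2/elos/files/in in this thread):

```shell
# Confirm the target directory is writable by the current user
# before unzipping into it. The mktemp directory below is a
# stand-in for the real destination path.
dest=$(mktemp -d)
if [ -w "$dest" ] && touch "$dest/.write-test" 2>/dev/null; then
  echo "writable: $dest"
  rm -f "$dest/.write-test"
else
  echo "not writable: $dest"
fi
```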
>>
>> -----Original Message-----
>> From: redhat-list-bounces at redhat.com
>> [mailto:redhat-list-bounces at redhat.com] On Behalf Of
>> rodrigo.garcia at kotasoft.com
>> Sent: Monday, April 12, 2010 8:14 AM
>> To: General Red Hat Linux discussion list
>> Subject: Re: VS: Unzipping problem | write error (disk full?)
>>
>> The file is not broken; I can unzip it on my machine after
>> downloading it from the machine where I'm having the problem...
>>
>> Thanks anyway
>>
>> Best regards
>>
>> Tatu Salin wrote:
>>
>>> Another possibility is that your file was corrupted, which means it
>>> now has a wrong CRC check. Please try transferring the file in
>>> binary mode to the machine where you are unzipping it. I believe it
>>> cannot be unzipped because the transfer corrupted the file.
>>>
>>> Hope that helps.
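One way to confirm whether a transfer corrupted the file is to compare checksums on both machines (md5sum is standard on Linux). A minimal local sketch - the filenames are stand-ins, and the `cp` stands in for the FTP copy:

```shell
# Matching checksums mean the copy is intact; differing checksums
# mean the transfer (e.g. an ASCII-mode FTP session) corrupted it.
printf 'demo payload\n' > /tmp/master.src
cp /tmp/master.src /tmp/master.dst        # stand-in for the FTP transfer
src_sum=$(md5sum < /tmp/master.src)
dst_sum=$(md5sum < /tmp/master.dst)
if [ "$src_sum" = "$dst_sum" ]; then
  echo "transfer intact"
else
  echo "transfer corrupted"
fi
```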
>>>
>>>> Date: Mon, 12 Apr 2010 13:11:25 +0200
>>>> From: rodrigo.garcia at kotasoft.com
>>>> To: redhat-list at redhat.com
>>>> Subject: Re: VS: Unzipping problem | write error (disk full?)
>>>>
>>>> Hi,
>>>>
>>>> I have no problem unzipping the file on my own machine (Ubuntu
>>>> 9.10) or on Windows, but on this other machine I get the error
>>>> mentioned above.
>>>>
>>>> The partition where I'm trying to unzip has more than 600 GB of
>>>> free space, and the zip file's content is 2.5 GB.
>>>>
>>>> There is no uncompress command on the machine.
>>>>
>>>> Thanks
>>>>
>>>> Tatu Salin wrote:
>>>>
>>>>> Hello. Have you tried the uncompress command? That might help.
>>>>> There is also the possibility that your zip file cannot be
>>>>> unzipped where it is: zip compresses files by about 90 percent of
>>>>> their original size, so the extracted file is much larger than the
>>>>> archive. Please try unzipping on another filesystem where there is
>>>>> more space. The "disk full" message suggests that your filesystem
>>>>> fills up while you are unzipping the file.
>>>>> Kind regards, IT specialist.
>>>>>
>>>>>
>>>>> -----Original Message-----
>>>>> From: rodrigo.garcia at kotasoft.com
>>>>> Sent: 4/12/2010 9:53:46 AM
>>>>> To: redhat-list at redhat.com
>>>>> Subject: Unzipping problem | write error (disk full?)
>>>>> Hi,
>>>>>
>>>>> I'm new to Red Hat. In fact I'm not the systems administrator,
>>>>> but I have a strange problem unzipping a file. I think it is a
>>>>> problem of memory or swap space or something similar, but I'll
>>>>> explain the problem in detail:
>>>>>
>>>>> Distribution:
>>>>> [B]Red Hat Enterprise Linux ES release 4 (Nahant Update 3)[/B]
>>>>>
>>>>> [B]I'm connected as root.[/B]
>>>>>
>>>>> I have this zipped file:
>>>>>
>>>>> -rw-r--r-- 1 root root 678183271 abr 7 15:30 Master032010.zip
>>>>>
>>>>> it contains a 2.4 GB file
>>>>>
>>>>>
>>>>> [CODE]df -h
>>>>> Filesystem Size Used Avail Use% Mounted on
>>>>> /dev/cciss/c0d0p6 4,0G 3,5G 282M 93% /
>>>>> /dev/cciss/c0d0p1 124M 13M 105M 11% /boot
>>>>> none 4,0G 0 4,0G 0% /dev/shm
>>>>> /dev/cciss/c0d0p7 56G 17G 37G 31% /opt
>>>>> /dev/cciss/c0d0p3 985M 18M 917M 2% /tmp
>>>>> /dev/cciss/c0d0p5 5,0G 404M 4,3G 9% /var
>>>>> /dev/sda1 270G 220G 37G 86% /database
>>>>> /dev/sdb1 125G 63G 56G 54% /data
>>>>> /dev/sdc1 826G 138G 646G 18% /data2[/CODE]
>>>>>
>>>>>
>>>>> The file is in /dev/sdc1 under /data2/elos/files/in
>>>>> as you can see this partition has space enough to unzip a 2,4G file
>>>>> I have also tried to do the same in the /dev/sda1 under /database
>>>>> but in
>>>>> both cases I have the same problem:
>>>>>
>>>>> [CODE][root at ELOS-BD in]# unzip Master032010.zip
>>>>> Archive: Master032010.zip
>>>>> inflating: SongMaster201003.txt
>>>>> SongMaster201003.txt: write error (disk full?). Continue?
>>>>> (y/n/^C) y
>>>>> bad CRC 9695f189 (should be 0cab6361)[/CODE]
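For what it's worth, a "disk full" write error with plenty of free blocks can also come from exhausted inodes or a hit quota rather than space; `df -i` shows inode usage. A sketch, using the current directory as a stand-in for /data2/elos/files/in:

```shell
# -P forces POSIX single-line output so awk's column positions
# are stable; $4 on line 2 is the available-blocks column.
df -iP .                                   # free inodes on this filesystem
blocks_free=$(df -kP . | awk 'NR==2 {print $4}')
echo "free 1K-blocks: $blocks_free"
```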
>>>>>
>>>>>
>>>>> I have run free:
>>>>>
>>>>> [CODE][root at ELOS-BD in]# free -m
>>>>>              total       used       free     shared    buffers     cached
>>>>> Mem:          8054       8038         16          0          5       7222
>>>>> -/+ buffers/cache:        809       7244
>>>>> Swap:         1023         72 [/CODE]
>>>>>
>>>>>
>>>>> I have unzipped the same file on my own machine (Ubuntu 9.10) and
>>>>> on Windows with no problem.
>>>>>
>>>>>
>>>>> Any idea?
>>>>>
>>>>> Thanks and regards.
>>>>>
>>>
>>>
>>>
>>
>>
>>
>
More information about the redhat-list mailing list