How to fix "Segmentation fault"
Herta Van den Eynde
herta.vandeneynde at gmail.com
Wed Mar 19 22:45:39 UTC 2008
On 19/03/2008, David Nguyen <DNguyen at dallascounty.org> wrote:
>
> Yes, it's about 20G, but I don't have this issue on our lab server, which
> is also Red Hat Enterprise 4. I wonder what's missing or wrong on this
> troubled server; I need to find a fix.
>
> Thanks,
> David
>
>
> >>> bill at magicdigits.com 03/19/08 4:59 PM >>>
>
> Any chance the file size is >2 GB? If so, you'd have to pipe the data
> through something like dd. For instance:
>
> # cat coldbackup.tar | compress | dd of=coldbackup.tar.Z
>
> (or a syntactically correct equivalent).
>
> Bill Watson
> bill at magicdigits.com
>
> -----Original Message-----
> From: redhat-sysadmin-list-bounces at redhat.com
> [mailto:redhat-sysadmin-list-bounces at redhat.com] On Behalf Of David Nguyen
> Sent: Wednesday, March 19, 2008 2:50 PM
> To: redhat-sysadmin-list at redhat.com
> Subject: How to fix "Segmentation fault"
>
>
> Hi,
>
> When I run the compress command on Red Hat Enterprise 3, I get a
> "Segmentation fault" error and compress stops compressing the file.
> Does anyone know what causes this error and how to fix it?
>
> [1]+ Segmentation fault nohup compress -f coldbackup.tar
>
>
> Thanks,
> David
>
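On Bill's >2 GB point above, a minimal sketch of the pipe workaround he
describes (the filenames are the ones from this thread; whether the
installed compress build really trips over large files is an assumption):

  # Check the size of the tar file first; a compress build without
  # large-file support can fail on anything over 2 GB.
  ls -l coldbackup.tar

  # Feed the data in on stdin so compress never opens the large file
  # itself, then write the compressed stream out with dd.
  cat coldbackup.tar | compress | dd of=coldbackup.tar.Z
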
What is your maximum file size limit set to ("ulimit -f -H" and
"ulimit -f -S")?
Kind regards,
Herta
--
"Life on Earth may be expensive,
but it comes with a free ride around the Sun."