Linux backup

Ryan Golhar golharam at umdnj.edu
Thu Aug 19 12:40:47 UTC 2004


What are you backing up to?

The procedure I use is to copy all the home directories from the main
server to the backup server every night, run from the backup server:

rsync -arp --delete -e /usr/bin/ssh mainserver:/home /home

I have RSA keys in place to avoid password prompts...
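
If it helps, the setup is just the standard ssh-keygen/authorized_keys
arrangement plus a cron entry -- roughly like this (a sketch from memory,
not my exact commands; the key path and the 2:30am schedule are only
examples):

# On the backup server, generate a passphrase-less key pair so the
# nightly job can run unattended (example key path):
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Install the public key on the main server:
cat ~/.ssh/id_rsa.pub | ssh mainserver 'cat >> ~/.ssh/authorized_keys'

# Crontab entry on the backup server to pull /home every night:
30 2 * * * rsync -arp --delete -e /usr/bin/ssh mainserver:/home /home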

-----
Ryan Golhar
Computational Biologist
The Informatics Institute at
The University of Medicine & Dentistry of NJ

Phone: 973-972-5034
Fax: 973-972-7412
Email: golharam at umdnj.edu

-----Original Message-----
From: redhat-list-bounces at redhat.com
[mailto:redhat-list-bounces at redhat.com] On Behalf Of Malcolm Kay
Sent: Thursday, August 19, 2004 8:26 AM
To: General Red Hat Linux discussion list
Subject: Linux backup


Some weeks ago I enquired here about using 'dump' with ext3 file
systems, and was strongly advised that Linux and 'dump' don't play
well together.

Reading the arguments, including Linus Torvalds's comment
'  Right now, the cpio/tar/xxx solutions are definitely 
   the best ones, and will work on multiple filesystems 
   (another limitation of "dump"). Whatever problems they 
   have, they are still better than the _guaranteed_(*)  
   data corruptions of "dump".'
I was and am still convinced that 'dump' is not the way to
go under Linux.

So I've spent some time scripting to marry 'tar' backups for the
recently acquired Linux machines into a backup system that uses
'dump' for our Unix machines.

Yesterday I ran this for the first time on one of the Linux machines and
found the backup aborted with the following 
error in the log file:
   /bin/tar: /home/thi/OM5438/test.hir1: file changed as we read it
   /bin/tar: Error exit delayed from previous errors
   Backup /data/pstar/root-0-z.tgz FAILED at Wed 18 Aug 2004 15:29:20 CST

So 'dump' leads to corrupt backups, and 'tar' leads to aborted backups.
The abort message is undoubtedly correct -- the file in question is a
temporary file used during circuit simulation analysis. Individual
simulation runs can take from a few seconds up to a week, so it is not
practical to shut everything down for backups. (If it were, then
partitions could be unmounted for backup and the principal problem
cited against 'dump' would disappear.) Such files are not crucial to the
backup. If tar simply skipped them, or marked them as corrupt in the
archive while correctly preserving the rest of the file system, that
would be satisfactory -- but instead it aborts.
 

So is there some way to get 'tar' to continue when an odd file or two
exhibits this sort of problem? I know about the option:
  --ignore-failed-read
              don't exit with non-zero status on unreadable files
but from my interpretation of the man page it is not relevant to this
problem.
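
For concreteness, the sort of workaround I have in mind is a small
wrapper that pattern-matches the "file changed as we read it" message
and downgrades it to a warning -- an untested sketch; the tar options
and archive path below are just stand-ins, not what my script actually
does:

#!/bin/sh
# Untested sketch: treat "file changed as we read it" as a warning,
# but still fail the backup on any other tar error.
LOG=/tmp/tar-backup.$$.err

/bin/tar -czf /data/pstar/root-0-z.tgz --one-file-system / 2> "$LOG"
status=$?

if [ $status -ne 0 ]; then
    # Drop the changed-file lines (and the summary line they trigger);
    # if anything else remains on stderr, it is a real failure.
    if grep -v 'file changed as we read it' "$LOG" | \
       grep -v 'Error exit delayed from previous errors' | grep -q . ; then
        cat "$LOG" >&2
        rm -f "$LOG"
        exit 1
    fi
    echo "backup completed, but some files changed while being read" >&2
fi
rm -f "$LOG"
exit 0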

Does 'cpio' have the same problem?
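
For reference, the cpio run I'd try would be something along these
lines, though I don't yet know how cpio reacts when a file changes
underneath it (the paths here are just examples):

# Example only: archive /home with find+cpio, staying on one filesystem.
find /home -xdev -depth -print | cpio -o -H newc | gzip > /data/pstar/home-0.cpio.gz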

Some have suggested 'amanda', but my understanding is that this is just
a wrapper that optionally uses 'dump' or 'tar', so it seems to take us
nowhere.

What else is out there for backup? I am not looking for a backup
system, just a reasonably reliable backup utility that can be used so
that the Linux machines can be incorporated into the backup system that
works well for our Unix machines.

Some advice please.

Malcolm Kay


