Linux Backup Administration

Les Mikesell lesmikesell at gmail.com
Sat Jul 2 21:25:50 UTC 2005


On Fri, 2005-07-01 at 13:07, Mike McCarty wrote:
>  It seems that there are two schools of thought
> 
> cpio
> tar

No, both of those work only through the filesystem.  Dump is
the other alternative - it is excessively intimate with the
filesystem's on-disk structure.  Straight dd images of the raw
disk are even more extreme.
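
Roughly, and with the device and output names here only as
placeholders for whatever your system actually uses:

  # dump reads the ext2/ext3 on-disk structures directly; -0 = full
  # backup, -u updates /etc/dumpdates, -f names the output file
  dump -0uf /backup/root.dump /dev/hda1

  # dd copies the raw device bit for bit, filesystem and all
  dd if=/dev/hda of=/backup/hda.img bs=1M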

> It also seems that each side thinks the other side is nuts.
> It also seems that using links (soft or otherwise) is not
> well handled by either technique.

Gnutar and star provide extensions to the original tar (which did
omit support for some essential things).
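
For example, gnutar records both kinds of links instead of storing
extra copies (the paths here are made up purely for illustration):

  # file2 is a hard link to file1, file3 is a symlink to it
  ln /home/les/file1 /home/les/file2
  ln -s file1 /home/les/file3

  # the verbose listing shows file2 stored as a hard-link entry and
  # file3 as a symlink, not as full second copies of the data
  tar -cf /tmp/links.tar -C /home/les file1 file2 file3
  tar -tvf /tmp/links.tar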

> It also seems that everyone agrees that using tape is the
> Way To Go(tm).

Tape is OK for archiving large amounts of data that you probably
don't want to restore, or for keeping offsite for disaster recovery,
which again you don't really ever plan to do...

> Can anyone tell me whether my impressions on this matter
> be correct? Is there a good tutorial which can give me
> relative pros and cons of cpio style vs. tar style backup?

Both of these copy files through the filesystem with extensions
for special files.  The only real difference is that tar knows
how to recurse through directories itself, where cpio must be
handed a list of files, generally by some contortions with 'find'.
A side effect of this is that gnutar has a '--listed-incremental'
mode that will catch renamed directories in an incremental run and
pick up the unchanged files under their new names, which is basically
impossible with find-driven cpio.
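
In practice the difference looks something like this (the paths and
the snapshot-file name are just examples):

  # tar recurses on its own
  tar -cf /backup/home.tar /home

  # cpio has to be fed the file list by find
  find /home -depth | cpio -o -H newc > /backup/home.cpio

  # gnutar's incremental mode keeps its state in a snapshot file; the
  # first run is a full backup, later runs with the same snapshot file
  # are incrementals and will notice a renamed directory
  tar -cf /backup/home-0.tar --listed-incremental=/backup/home.snar /home
  tar -cf /backup/home-1.tar --listed-incremental=/backup/home.snar /home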

> I also don't want to use a tape drive, being (as some are)
> on a restricted budget, both for time to learn new stuff
> and monetarily, being among the Great Telecom Layoff. There
> are very nice Windows programs which create initial/disaster
> recovery CDs which can completely rebuild a system to the way
> it was when initially created, and then do backups to CD after
> that. *nix seems not to have any such concept.

There are at least three different reasons you might want to
make backups; different systems work better for the different
cases, and there are ways to handle the compromises.  If your
goal is to be able to restore an exactly identical system as fast
as possible, you might want mondo, a disk-image solution, or
a 'warm spare' machine with periodic rsync to its drives.
Dump is also suitable for this, along with tar/cpio archives
of the whole system, as long as you know a few tricks about
restoring from a bootable CD, making your own filesystems and
making them bootable with grub.
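
A rough outline of that kind of restore, with the device names and
archive path standing in for whatever you actually have:

  # booted from a rescue/install CD:
  mkfs.ext3 /dev/hda1                        # recreate the filesystem
  mount /dev/hda1 /mnt
  tar -xpf /backup/fullsystem.tar -C /mnt    # put the files back
  grub-install --root-directory=/mnt /dev/hda    # make it bootable again

You may also need to adjust /mnt/etc/fstab if the partition layout
changed.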

Another goal might be to have long-term archives, in which case tar
and cpio are likely to be the most portable and readable at some
future date.  These will also be a good option if you are rebuilding
a new and better machine instead of restoring exactly the one that
died, since you will have the option to restore onto different
filesystem types.
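
If readability years from now is the worry, it helps to stay with the
plainest formats; for instance (the naming is just one choice):

  # plain ustar format, readable by essentially any tar implementation
  tar --format=ustar -cf - /home | gzip > /archive/home-200507.tar.gz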

The most likely reason to need the backup is that someone accidentally
deleted some files or directories and you are restoring back to the
same machine.  In this case, something online instead of tapes is a
big win.  If you have more than one machine involved, look at
backuppc (http://backuppc.sourceforge.net/), which uses a scheme of
compression and hard-linking to keep copies on disk in a web-browsable
form, plus it can generate tar images of any version of any
filesystem within the range you are keeping.  It is as handy as you
can get for individual file/directory restores, and you can
do bare-metal restores with a small amount of work.
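
If backuppc is more than you need, the same hard-link idea can be
sketched by hand with rsync (directory names here are hypothetical,
and this is only the bare idea, not what backuppc actually does
internally):

  # each run hard-links unchanged files against the previous snapshot,
  # so only changed files take new space; rotate daily.0 -> daily.1
  # before each run
  rsync -a --delete --link-dest=/backups/daily.1 /home/ /backups/daily.0/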

-- 
  Les Mikesell
   lesmikesell at gmail.com
