
Re: help needed: setting up rdist



On Tue, 11 Mar 2003, Blaise Canzian wrote:

> I want to manage (install software, patches, etc.) a network of RH7.3 
> systems using rdist.  Can someone help or provide pointers to info 
> regarding internal firewalling issues (ports used, etc.) and access 
> permission issues (is a .rhosts file needed?) to accomplish this?  Thanks.
> 


I've never used rdist, and it's not installed on any of my systems.

What I would use to maintain copies of files across systems is rsync.
rsync can use either its own connexion (requires an rsync server at the
remote end) or rsh/ssh as a pipe for its data.
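For example (hostnames and paths here are made up), a pull over ssh
looks like:

    rsync -av -e ssh server.example.com:/export/updates/ /var/local/updates/

or, against an rsync daemon on the server:

    rsync -av rsync://server.example.com/updates/ /var/local/updates/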

I recommend you don't use rsh as it's inherently insecure.

If you use ssh, then rsync's port requirements are those of ssh (TCP 22
by default). If you use an rsync server, the daemon listens on TCP 873
by default; the details are in the rsync documentation.
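If you need to open the server's firewall, something like this should
do; this is only a sketch for the iptables shipped with RH7.3, and the
source network is my invention:

    # allow ssh (for rsync-over-ssh) from the internal network
    iptables -A INPUT -p tcp -s 192.168.1.0/24 --dport 22 -j ACCEPT
    # or, if you run an rsync daemon instead
    iptables -A INPUT -p tcp -s 192.168.1.0/24 --dport 873 -j ACCEPT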

I would use rsync on the clients to pull the files from the server, as
I think this is easier to manage than pushing files from the server to
clients that may be down, or that are new and you forgot to add. You
can configure new clients at install time in your kickstart (ks) file,
so there's no great difficulty remembering to do that;-)
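As a sketch (paths, hostname and schedule are my invention), the %post
section of the ks file could drop in a cron job that does the pull:

    %post
    cat > /etc/cron.d/pull-updates <<'EOF'
    # fetch the update repository from the server nightly
    30 2 * * * root rsync -a --delete rsync://server.example.com/updates/ /var/local/updates/
    EOF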


You might also consider sharing a repository via NFS; that's your
choice and may be driven by the (network) locality of clients &
servers.
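If you go that way, each client needs little more than an fstab entry
along these lines (server name and paths assumed):

    # mount the shared repository read-only
    server.example.com:/export/updates  /var/local/updates  nfs  ro,hard,intr  0 0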

Another alternative is using an HTTP client such as wget, but I don't
know that wget has the ability to remove obsolete files. That aside, it
has one advantage, which the scenario below illustrates.
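For the fetch itself, a mirror run might look something like this (the
URL and directory layout are my invention):

    # -m mirrors recursively with timestamping; -np stays below /updates/
    wget -m -np -nH --cut-dirs=1 -P /var/local/updates \
        http://server.example.com/updates/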

Suppose your server is (here) in Midland, and you have a branch office
in Joondalup. Using rsync (or rdist), you would transmit each file from
Midland to Joondalup once per client, quite likely across the Internet.
That uses lots of bandwidth and time.

If you use wget, you would configure Squid in your Joondalup branch
office to (transparently) proxy the requests.

On the first request for each file, Squid would get a copy from
Midland, and serve that copy to each client in Joondalup, so the data
would only cross the network between your two offices once.
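A minimal sketch of the Squid side, for the Squid 2.x shipped around
RH7.3 (the accelerator directives changed in later releases), together
with the redirect rule on the Joondalup gateway:

    # /etc/squid/squid.conf
    http_port 3128
    httpd_accel_host virtual
    httpd_accel_port 80
    httpd_accel_with_proxy on
    httpd_accel_uses_host_header on

    # on the gateway: divert outbound web traffic into Squid
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
        -j REDIRECT --to-port 3128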


Now there is the question of automatically updating packages. This is an
idea I dread, because it's certain that at some time something will go
wrong. Perhaps it won't be visible immediately, perhaps it will, but if
you're not consciously aware that something has been changed then you
won't quickly make the connexion.

I think it best to have scheduled maintenance time. If the files are in
place ready, then this needn't take a long time, and you may choose to
have it occur "almost automatically," by something you trigger.
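One way to make it "almost automatic" is a small wrapper you run by
hand during the maintenance window; everything in it is illustrative:

    #!/bin/sh
    # apply-updates: run by hand during scheduled maintenance
    set -e
    # the files were already pulled by the nightly cron job
    rpm -Fvh /var/local/updates/RPMS/*.rpm   # freshen installed packages only
    logger -t apply-updates "updates applied"

rpm -F (freshen) only touches packages that are already installed,
which is usually what you want on a fleet of mixed systems.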


Finally, you need to have a recovery plan for when you foul it up. It
might use a separate partition that you can boot to regain access, or
could involve a CD of recovery tools.



-- 


Cheers
John.

Please, no off-list mail. You will fall foul of my spam treatment.
Join the "Linux Support by Small Businesses" list at 
http://mail.computerdatasafe.com.au/mailman/listinfo/lssb





