[Date Prev][Date Next]   [Thread Prev][Thread Next]   [Thread Index] [Date Index] [Author Index]

Re: Maintaining a local yum repository



On Sat, 2008-01-26 at 15:58 -0500, Derek Tattersall wrote:
> macgyver wrote:
> >
> > When I had two servers running Fedora2 and both my wife and I's
> > workstation, and my laptop all running FC2 - what I did was get one of
> > the servers to do an automatic update at say 02:00 - and do not delete
> > the resultant packages...
> >
> > The resultant packages were then NFS shared to the rest of the systems
> > as a background mount.....
> >
> > The result - when yum ran on the rest of the machines, if it found those
> > packages already on disk - it didn't re-download them - thus saving
> > downloading everything 5 times..
> >
> > Also no problems with keys - as still using the repo ;-)
> >
> > Ok - didn't stop the download of what needed to be downloaded - but that
> > traffic was minimal compared to the actual packages.....
> >
> > Helluva lot better (IMHO) that rsyncing the entire repo  - 'cause you
> > won't need a vast majority of what that repo has..
> >
> >
> > Just my 2c
> > AM
> >
> >   
> I like that idea.
> 
> I could make a script on one machine that looks like this:
> 
> #!/bin/bash
> yum $*
> rsync -vRd /var/cache/yum/*/packages/*rpm /var/www/html/yum
> 
> and then use that to either make a local repository, or just copy it to 
> the /var/cache/yum directories on my other machines.
> 
> I think that would do what I have in mind.
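(As an aside on the quoted script: `yum $*` splits any argument containing spaces; `yum "$@"` preserves quoting. A slightly hardened sketch - assuming the usual Fedora cache layout under /var/cache/yum, which is what the original rsync pattern implies - might look like:)

```shell
#!/bin/bash
# Run yum with the caller's arguments, preserving quoting.
yum "$@"
# Mirror any freshly downloaded packages into the web root.
# -R keeps the relative path under the transfer root,
# -d copies the directories themselves (no deep recursion).
rsync -vRd /var/cache/yum/*/packages/*.rpm /var/www/html/yum/
```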




My method on all machines (including the server):
Move /var/cache/yum to /var/cache/yum_old
mkdir /var/cache/yum
Export /data/adminfiles/yum as a read/write NFS share (add it to
/etc/exports and start the NFS service - and portmapper, of course)
mount <server>:/data/adminfiles/yum /var/cache/yum
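(In case it helps, the pieces above look roughly like this - a sketch only; the subnet and the rw,no_root_squash options are assumptions for a trusted LAN where yum runs as root, so adjust to taste:)

```shell
# On the server: an /etc/exports entry, one line, e.g.
#   /data/adminfiles/yum  192.168.1.0/24(rw,sync,no_root_squash)
exportfs -ra          # re-read /etc/exports
service nfs start     # portmapper must also be running

# On each client: swap the live cache for the shared one.
mv /var/cache/yum /var/cache/yum_old
mkdir /var/cache/yum
mount <server>:/data/adminfiles/yum /var/cache/yum
```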

No rsyncing or copying of data at any time - when *any* machine ran
yum, it looked to see whether the files were already in /var/cache/yum,
and if not, downloaded them.
The first machine to run would download the files; all the others then
didn't need to - they just used the cached copy.

Now, /var/cache/yum wasn't local to each machine, but it was on a server
local to them - and the server was up all the time, so that was not an
issue..

Yeah - opening up NFS and portmapper is theoretically a security issue,
but if all the machines are on the same LAN it might not be a huge one.

YMMV if you don't have a dedicated "server" machine - but your method
above would copy that data twice: once into the /var/www/html/yum
area, and then once again down to the "secondary" machines. The
downloaded packages can get rather large over time unless you are
clearing them out - so it might be worth updating one machine first,
then running the following on the other two "client" machines:

rsync -av <firsthost>:/var/cache/yum/ /var/cache/yum/
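(Or push from the first host instead of pulling - just a sketch; the host names in CLIENTS are made up, substitute your own:)

```shell
#!/bin/bash
# After updating this machine, push its yum cache out to the
# other clients so they skip re-downloading the same packages.
CLIENTS="client1 client2"    # hypothetical hostnames
for host in $CLIENTS; do
    rsync -av /var/cache/yum/ "$host":/var/cache/yum/
done
```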

Yeah - you'd be copying the XML and sqlite metadata as well, but that
was never a problem for me - all the machines were effectively using
the same data anyway, via the gift of that NFS mount...


again - only my 2c.

AM

