[Pulp-list] Using Pulp in a server-only configuration?

Baird, Josh jbaird at follett.com
Tue Jan 27 22:59:53 UTC 2015


We update several hundred.  We typically do them in batches of anywhere from 5 to 50, depending on the group.

I have several mco actions written that do things like cleaning the yum cache, etc.  I haven't had problems with rpmdb corruption; I always clean the yum cache before doing anything.
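
Roughly, kicking one of those actions off looks like the sketch below; the agent and action names are just placeholders rather than our exact ones, while --batch, --batch-sleep and the -W fact filter are stock mco client options:

    # Clean the yum cache on the qa group, 25 hosts at a time, pausing between batches
    mco rpc yumhelper clean_cache -W "environment=qa" --batch 25 --batch-sleep 30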

Josh

From: Andrea Giardini [mailto:contact at andreagiardini.com]
Sent: Tuesday, January 27, 2015 5:45 PM
To: Baird, Josh
Cc: pulp-list at redhat.com; Trey Dockendorf; Mathew Crane
Subject: Re: [Pulp-list] Using Pulp in a server-only configuration?

@Josh
How many machines do you update with this method? I use mco as well, but it's not always efficient with a high number of machines (especially if they are under high load).
How do you deal with rpmdb corruption, stuck transactions, and all the other errors that can prevent a machine from updating correctly?

Cheers
Andrea
On 01/25/2015 06:29 PM, Baird, Josh wrote:
We also use Pulp in this exact way.  Puppet drops repo definitions onto each host; the definitions are associated with the host's Puppet environment (dev, qa, prod) and point to "snapshot" repositories in Pulp.  We promote packages up through the environments as you are describing.
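
For example, the promotion can be as simple as copying units between the snapshot repos on the Pulp server and republishing.  A rough pulp-admin (Pulp 2) sketch, with placeholder repo and package names (check your pulp-admin version for the exact criteria flags):

    # Copy one package from the dev snapshot repo to the qa snapshot repo
    pulp-admin rpm repo copy rpm --from-repo-id=dev-base --to-repo-id=qa-base --str-eq="name=somepackage"

    # Republish the qa repo so hosts pointed at it can see the new package
    pulp-admin rpm repo publish run --repo-id=qa-base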

We do not use the pulp-consumer client.  Instead, we trigger yum updates using mCollective on groups of hosts.
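
As a rough illustration only (not our exact invocation, since our full-update action is a custom agent), the stock mco package agent shows the same group-targeting and batching idea for a single package:

    # Update one package across the prod group, 25 hosts at a time
    mco package update some-package -W "environment=prod" --batch 25 --batch-sleep 30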

Josh

From: pulp-list-bounces at redhat.com [mailto:pulp-list-bounces at redhat.com] On Behalf Of Trey Dockendorf
Sent: Sunday, January 25, 2015 12:22 PM
To: Mathew Crane
Cc: pulp-list at redhat.com
Subject: Re: [Pulp-list] Using Pulp in a server-only configuration?


Your use case matches exactly how we use Pulp to manage repo contents for an HPC cluster where a consumer service is not possible.  I've had no issues; I just push out repo files for all Pulp-managed repos using Puppet.  Since I'm still using self-signed certs in Pulp and our network is private, I made sure to serve all repos via HTTP.
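
The pushed-out repo files are nothing fancy; roughly like the sketch below, where the hostname and relative path are placeholders (Pulp 2 publishes over HTTP under /pulp/repos/ when HTTP publishing is enabled):

    # /etc/yum.repos.d/pulp-os.repo (managed by Puppet; values here are only illustrative)
    [pulp-os]
    name=Pulp-managed OS repo
    baseurl=http://pulp.example.com/pulp/repos/centos/7/os/x86_64/
    enabled=1
    gpgcheck=0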

- Trey
On Jan 21, 2015 3:06 PM, "Mathew Crane" <mathew.crane at gmail.com> wrote:
In my environment, it doesn't really make sense to have a single point propagating changes to numerous hosts. Instead, we'd opt to have the consumers pull from the Pulp server manually. I understand that this gives up a portion of Pulp's feature set (consumer management and reporting), but what I'm more interested in is the ability to manually 'promote' packages into different repos, with required or updated deps, on the server. Is there any downside to keeping the consumers 'dumb' and hitting the Pulp-managed repositories manually via standard /etc/yum.repos.d/*.repo files?
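
Concretely, I'm picturing each host just doing something along these lines, with the repo IDs invented for the example:

    # Refresh metadata and pull updates from the Pulp-managed repos only
    yum --disablerepo='*' --enablerepo='pulp-*' clean metadata
    yum --disablerepo='*' --enablerepo='pulp-*' -y update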

_______________________________________________
Pulp-list mailing list
Pulp-list at redhat.com
https://www.redhat.com/mailman/listinfo/pulp-list