[Pulp-dev] Stages API Performance Data Collection

Brian Bouterse bbouters at redhat.com
Wed Sep 19 17:04:06 UTC 2018

On Mon, Sep 17, 2018 at 3:10 PM Dana Walker <dawalker at redhat.com> wrote:

> I love this idea!  Running benchmarks as we go will allow us to react
> quickly if there are unforeseen performance pain points.

> Have you run anything similar to this proposal back in Pulp2 or
> elsewhere?  I'm a little concerned about the storage capacity needed for
> the sheer number of sqlite3 databases generated.  Maybe a script could
> periodically empty /var/lib/pulp/debug/ as it reaches certain configured
> size/age limits?
We did have a similar feature in Pulp2 that would output a cProfile file
named after the task UUID (docs link below). I don't think storage was an
issue there, but users would have to confirm that for us. In practice, users
would turn the feature on, run the troublesome workload, then turn it off
again, so it usually covered only a few tasks. I think each db will be very
small, probably < 1MB.
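That per-task profiling pattern can be sketched roughly like this (a
hypothetical illustration, not the actual Pulp2 implementation; the
`run_with_profile` name, the `profile_dir` path, and the generated task id
are all assumptions for the example):

```python
import cProfile
import os
import uuid


def run_with_profile(task, profile_dir="/var/lib/pulp/c_profiles"):
    # Hypothetical wrapper: profile a task callable and dump the stats
    # to a file named after a task UUID, mirroring the Pulp2 behavior
    # described above. Path and names are illustrative only.
    os.makedirs(profile_dir, exist_ok=True)
    task_id = uuid.uuid4()
    profiler = cProfile.Profile()
    profiler.enable()
    try:
        return task()
    finally:
        profiler.disable()
        # One stats file per task run, keyed by the task's UUID.
        profiler.dump_stats(os.path.join(profile_dir, str(task_id)))
```

The dumped file can then be loaded with the standard-library `pstats`
module (or a viewer like snakeviz) to inspect where the task spent its
time.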


> --Dana
> Dana Walker
> Associate Software Engineer
> Red Hat
> <https://www.redhat.com>
> <https://red.ht/sig>
> On Mon, Sep 17, 2018 at 2:36 PM, Brian Bouterse <bbouters at redhat.com>
> wrote:
>> I'm interested in implementing a data collection feature for Pulp3. This
>> will allow us to easily and accurately benchmark pipeline performance to
>> clearly show improvement as we make changes. Borrowing from my old queueing
>> theory days... here is a data collection feature proposal:
>> https://pulp.plan.io/issues/4021
>> Any comment/ideas are welcome. Thank you!
>> -Brian
