Summary from yesterday's (mini) FESCo meeting

Axel Thimm Axel.Thimm at ATrpms.net
Sat Dec 30 15:52:22 UTC 2006


Michael Schwendt wrote:
> You should only feel the need to revisit the guidelines if they
> contain things that contradict with your own packaging habits.

> However, if you read them the first time and find nothing unusual,
> nothing special, there is no need to revisit them

> > I can't reasonably assume that the guidelines I remember from 3
> > months ago are completely appropriate now.
> 
> Who cares? Three months in the future you should still be able to
> spot questionable packaging techniques which require a closer look.

> I do custom reviews

> When I say "APPROVED", all that matters is whether anybody can point
> me to something I've missed.

> On Sat, 30 Dec 2006 12:00:43 +0100, Axel Thimm wrote:
> 
> > > When the reviewer is forced to include a commented mandatory
> > > incomplete checklist, this would require the reviewer to document
> > > all additional checks (among them things more important than what's
> > > in the checklist), too, for completeness.
> > 
> > Why does it have to be all or nothing?
> 
> Why do you ask when I've explained it before? It's in previous messages in
> this thread. I don't want to transcribe my thoughts during reviewing. But
> if I'm forced to include a commented checklist, that would not match what
> I've examined beyond the MUST items, and I would need to extend the list,
> which would be tiresome.

Again, all or nothing. Just do the checklist in the bugzilla; it does
not have to be followed up by a book on your methodology. You bundle
the two in order to demonstrate that checklists are bad, but the
bundling is wrong to start with.

> > Thanks for putting effort into allowing good packages to evolve, but
> > any custom reviews, or reviews driven by personal packaging habits,
> > need to come on top of the base checklist.
> 
> Another loop.

Anything that disagrees with your opinion isn't necessarily a loop.

> From the right perspective, however, *every* packager ought to check
> his own package against the reviewing guidelines prior to submitting
> it for review.

Indeed, but more to the point, what does that say about reviews? That
you as a reviewer no longer need to check against the guidelines? No,
because that's exactly why we need the reviewer: to catch any errors
made by the packager. And the reviewer needs to set an example by not
being sloppy.

> Again, reviews are interesting when something is found, not when nothing
> is found.
> 
> > > > > APPROVAL => all MUST items must have passed the check
> > > > 
> > > > ... using the easy way out.
> > > 
> > > No, there is no excuse if the approved package does not pass the checklist
> > > actually.
> > 
> > And how will one be able to tell? Only by doing a complete
> > review himself ...
> 
> Exactly. That's the only way to verify whether all MUST items have been
> checked.

So to check whether your single-word "APPROVED" is correct, the whole
work needs to be repeated, instead of simply verifying that you
reviewed the mandatory items. Sorry, but that's nowhere near quality
control.

> > And "quid custodiet ipsos custodes?".
> 
> I'm not fluent in Latin, so I've had to look this up. Please don't talk in
> riddles.

Sorry for that. The phrase is used quite often outside of pure Latin
contexts and is about self-policing organizations:

http://en.wikipedia.org/wiki/Quis_custodiet_ipsos_custodes%3F

In our context "Who reviewes the reviewers themselves?"

> There really is only one way to verify a review, and that is to do
> an own review of the same package. Do it! Find sloppy reviews, where
> serious problems have slipped through, and then give reason to put
> an eye on reviewers.

I can only check a review's validity if there is a review to start with.

> > Reviewers aren't gods; they are on the same level as contributors,
> > and when we ask contributors to invest time in packaging and
> > explaining package decisions, we have to ask reviewers to put some
> > visible effort into the process, too.
> 
> Sigh. A cut'n'pasted list?

No, a cut'n'pasted *empty* list that gets properly filled out. Cut and
paste provides its frame, not its content. On the one hand you want us
to fully trust reviewers who simply paste in an "APPROVED" stamp, whose
approval can only be verified by repeating the whole review; on the
other hand you assume malicious cut-and-paste practices from everyone
else.

> [ms: The rest of the message ignored deliberately. You've overstepped the
> mark in there. I've done hundreds of gpg signed reviews, plus more in the
> new system, and surely don't need your judgement about the quality of my
> reviewing.]

I didn't imply anything about the quality of your reviewing, only
about the quality of your reviews, which, if they are simple "APPROVED"
stamps as you yourself say, are no proper reviews at all and are the
reason for this thread.

If your review efforts are that detailed and thorough, why not
document each of them with a one-liner? Writing "Built under mock" and
"tested create/save under GNOME" into the review takes less than 1% of
the time it took to actually do those checks.
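
As a purely hypothetical sketch (the package name, version and the
exact items are made up for illustration, not quoted from the
guidelines), a filled-in review comment could be as short as:

  MUST: rpmlint silent on foo-1.0-1.src.rpm and the binary rpms
  MUST: license tag matches the source, license text included as %doc
  MUST: builds cleanly in mock for FC6/i386
  SHOULD: ran the program, tested create/save under GNOME
  APPROVED

That is hardly more typing than the "APPROVED" stamp alone, and it lets
anyone verify the mandatory items without redoing the whole review.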

Please get back in line with the rest of us; no one is a review god
around here. Are the other reviewers, who use proper checklists and
regularly check for changes in the review guidelines, really that much
lesser than you?
-- 
Axel.Thimm at ATrpms.net