Summary from yesterday's (mini) FESCo meeting

Michael Schwendt bugs.michael at gmx.net
Sat Dec 30 14:42:22 UTC 2006


On Sat, 30 Dec 2006 12:00:43 +0100, Axel Thimm wrote:

> > When the reviewer is forced to include a commented mandatory
> > incomplete checklist, this would require the reviewer to document
> > all additional checks (among them things more important than what's
> > in the checklist), too, for completeness.
> 
> Why does it have to be all or nothing?

Why do you ask when I've explained it before? It's in previous messages in
this thread. I don't want to transcribe my thoughts during reviewing. But
if I'm forced to include a commented checklist, it would not cover
everything I've examined beyond the MUST items, and I would need to extend
the list, which would be tiresome.

> So you either just stamp off a package review with a terse approval
> notice or have to write a book on it?

Strange question.

Have you ever before observed a review done by me? The interesting reviews
are those where multiple problems are pointed out, not those where no
problems are found.

> > hereby refuse to do that and will rather stop doing reviews completely.
> > I do custom reviews and adapt to what is contained within a package,
>        ^^^^^^^^^^^^^^
> > and more often than not that has helped in blocking crap.
> 
> Thanks for putting effort into allowing good packages to evolve, but
> any custom or packaging-habit-controlled reviews need to be on top of
> the base checklist.

Another loop. Listen, we've tried to keep the reviewing guidelines short
and to the point, not least because of mixed experience with the shorter
"QA Checklist" from fedora.us, which had been described as too complicated
to check. And still, over time the new list of MUST/SHOULD items has grown
in length. A fundamental problem with such checklists is that you need to
find volunteers who believe they are able to perform all the checks. From
the right perspective, however, *every* packager ought to check his own
package against the reviewing guidelines prior to submitting it for
review.

> Otherwise the reviews will differ in quality
> too much. Someone else may think that according to his packaging
> habits it is enough to build and run the package and stamp it in the
> same way you do. How will one see the difference in the quality of the
> review?

Reviewer X: "I've checked the package in accordance with the reviewing
guidelines. All MUST items are fine. APPROVED."

Reviewer Y: "The package compiles and links against a bundled copy of libbind."

Again, reviews are interesting when something is found, not when nothing
is found.

> > > > APPROVAL => all MUST items must have passed the check
> > > 
> > > ... using the easy way out.
> > 
> > No, there is no excuse if the approved package does not actually pass
> > the checklist.
> 
> And how will one be able to tell? Only by doing a complete review
> himself ...

Exactly. That's the only way to verify whether all MUST items have been
checked.

> > > > The only interesting point is when after approval it turns out that the
> > > > reviewer has NOT checked something and has NOT noticed one or more flaws
> > > > that should have been noticed when processing the MUST items.
> > > 
> > > Better to be proactive than to find whom to blame afterwards: forcing
> > > the reviewer to interact with the checklist makes missed items less
> > > likely, especially when compared to "wild reviews".
> > 
> > No, thank you. This is a big turn-off for me. When I say "APPROVED",
> > all that matters is whether anybody can point me to something I've missed.
> 
> And "quid custodiet ipsos custodes?".

I'm not fluent in Latin, so I've had to look this up. Please don't talk in
riddles. There really is only one way to verify a review, and that is to
do your own review of the same package. Do it! Find sloppy reviews where
serious problems have slipped through, and then there will be reason to
keep an eye on those reviewers.

> Reviewers aren't gods, they are
> on the same level as contributors, and when we ask contributors to
> invest time in packaging and explaining package decisions, we have to
> ask reviewers to put some visible effort into the process, too.

Sigh. A cut'n'pasted list?
 
[ms: The rest of the message is deliberately ignored. You've overstepped
the mark in there. I've done hundreds of GPG-signed reviews, plus more in
the new system, and surely don't need your judgement about the quality of
my reviewing.]



