[sos-devel] Proposal: Synergy of upstream and downstream testing of sos

Miroslav Hradilek mhradile at redhat.com
Wed Nov 2 12:20:16 UTC 2016


Hello Bryn,

answers inline:

On 10/26/2016 01:12 PM, Bryn M. Reeves wrote:
> On Wed, Oct 26, 2016 at 10:41:51AM +0200, Miroslav Hradilek wrote:
>> For downstream, we develop tests for a) specific issues, b) whole plugins,
>> and c) basic functionality. Unlike the sos internal test suite, these are
>> blackbox integration tests written in bash, as opposed to Python whitebox
>> tests checking the functions themselves. Sometimes the tests are run in a
>> real environment, but very often the files and commands to be collected are
>> mocked. To make our lives easier there are libraries of setup functions,
>> mock functions, assertions and so on.
>
> Right: these are integration tests (that work on 'sosreport' as a whole),
> where the upstream suite is primarily unit testing.
>
> One thing to remember here though: there are more downstreams than just the
> Red Hat / Fedora world today: sos is actively maintained in both Debian and
> Ubuntu, is used in several hypervisor products (notably IBM's PowerKVM),
> and has at least some users on SuSE and other distributions.
>

Yes, I'm not planning to test these myself, but I believe that their 
respective QE people can also benefit from the tests being maintained 
upstream. People who actually do the testing on those systems would have 
to keep an eye on the test code so that it runs smoothly there too. 
I'd like the test code and libraries to support this variability.
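
For example (just an illustrative sketch; the function and variable names 
are made up), the shared library could detect the distribution once and 
let tests branch on that instead of hard-coding one downstream:

  # Sketch only: detect the distribution so tests can branch on DISTRO_ID
  # instead of assuming Red Hat / Fedora.
  detect_distro() {
      if [ -r /etc/os-release ]; then
          . /etc/os-release          # sets ID and VERSION_ID on most distros
          DISTRO_ID="$ID"
          DISTRO_VERSION="$VERSION_ID"
      else
          DISTRO_ID="unknown"
          DISTRO_VERSION=""
      fi
  }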

>> To ensure these tests can be reused, a lot of branching needs to be done
>> within the code, depending on the environment but mostly on the version of
>> sosreport and the patches applied downstream. Code branching is very
>> inefficient and takes a lot of effort. While developing the tests, the
>> library is developed along with them and needs to be maintained too.
>> Upstream gets only bug reports from these efforts.
>
> It might help some of the other readers on the list to give a brief example,
> or overview, of the downstream testing we do in RHEL, and why maintaining
> multiple branches becomes painful.
>

I guess I could do that. I will send another email explaining it.

One thing to note now is that we are not maintaining multiple branches. 
Instead we branch the differences in the test code using IF/ELSE 
statements where the tests differ. Maintaining a different git branch for 
each version and environment would create a really large matrix of test 
code to backport patches into.

This would be the major benefit of the tests being maintained upstream. 
There would no longer be any need for IF statements caused by sosreport 
version differences, because each test would only have to work for the 
version of the sosreport source the test suite is part of. There would 
still be a need for branching (IF/ELSE or git) based on environmental 
differences.
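
To give a rough idea of what that downstream branching looks like today 
(assert_collected is a made-up helper from our library and the file names 
are only placeholders):

  # Sketch of the version-based branching we would like to get rid of.
  SOS_VERSION=$(rpm -q --qf '%{VERSION}' sos 2>/dev/null)

  if [[ "$SOS_VERSION" == 3.0* ]]; then
      assert_collected "sos_commands/foo/old_command_output"
  elif [[ "$SOS_VERSION" == 3.2* ]]; then
      assert_collected "sos_commands/foo/new_command_output"
  else
      :   # ...and another branch for every rebase and downstream patch
  fi

With the tests living in the sosreport tree, each version would simply 
carry the assertions that match it.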

>> Problems I suggest to solve:
>>   1. Avoid downstream branching and test library development.
>>   2. Extend upstream test coverage and the test library by downstream testers.
>>   3. Write test library functions and tests so that they can be run in a
>> mocked environment as well as in a real environment with the flick of a
>> switch.
>
> I think this is an excellent goal. We tried a few years ago to get a docker
> based testing system together, that was trying to address similar needs:
>
>   - coverage of multiple distributions (policies/environments)
>   - reproducible environments for testing
>   - repeatable tests
>
> Some of the ideas were tracked in the following GitHub issue:
>
>   https://github.com/sosreport/sos/issues/335
>

Unfortunately, I do not plan to solve these for sos:
   * The way to obtain, prepare, and maintain images of environment setups.
   * Scheduling (test) x (environment) matrices.
   * Collecting, interpreting and reviewing test results.
   * ...

What I'd like to work on is tests and test libraries that will run on 
any system (e.g. CI, a developer's workstation, ...) via mocking and will 
work seamlessly in real environments once the things mentioned above are 
solved by sos.
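
A minimal sketch of the "flick of a switch" idea (MOCK_ENV and run_cmd are 
made-up names, nothing that exists in the library yet):

  # With the switch on, return a pre-recorded chunk of real output;
  # with it off, run the real command on the live system.
  MOCK_ENV=${MOCK_ENV:-1}

  run_cmd() {
      if [ "$MOCK_ENV" -eq 1 ]; then
          cat "fixtures/$(echo "$*" | tr ' /' '__')"
      else
          "$@"
      fi
  }

  # The same assertion works in both modes:
  run_cmd ip address show | grep -q "inet " || echo "FAIL: no addresses"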

>> Extra work I suggest we put on our shoulders:
>>   1. Developers and plugin contributors being forced to modify tests and
>> libraries to ensure they work with their commits.
>
> Ack. This is something we've gotten a little better at for API changes -
> asking developers to also amend the unit test suite - but extending this
> to also include integration testing would be really useful.
>

Cool.

>>   2. Downstream testers extending the upstream test suite and libraries to
>> later use them downstream.
>>
>> Proposal I make:
>>   * Let's choose a test framework + assertion library, and develop our
>> library and fixtures upstream. Develop at least part of the integration
>> test suite [ b) and c) ] upstream and use the upstream library for a)
>> downstream.
>>   * Write tests and mock functions so that they use real file chunks and,
>> by a change of the environment, the mock setups can be disabled, thus
>> collecting and asserting against real files and commands.
>>   * Require the integration test suite to pass in order to accept a commit.
>> Encourage submitting integration tests with new plugins and plugin
>> functionality.
>
> I'm curious what it would look like, and how we would manage to support
> the range of different downstreams from a single upstream testing base,
> but I think it's definitely worth more investigation.
>
> As you're more experienced with this kind of testing and the available
> tools than most of the team, would you be OK to write up a more detailed
> proposal, or something that demos how some of this would work?
>

Thanks, I will try to do this and let you know when I have something. I 
needed to know that it makes sense to put effort into it.

> Regards,
> Bryn.
>

Hopefully you are not disappointed that my goals are actually less 
ambitious than they sounded.

-- 
Miroslav Hradilek
Quality Assurance Engineer
Base OS Quality Engineering

Red Hat Czech, s. r. o.
Purkynova 99
612 45 Brno, Czech Republic



