[Automated-testing] Working Group Pages

Neil Williams neil.williams at linaro.org
Mon Nov 19 03:23:15 PST 2018


On Sat, 17 Nov 2018 at 02:26, <Tim.Bird at sony.com> wrote:
>
>
>
> > -----Original Message-----
> > From: Carlos Hernandez on Thursday, November 15, 2018 7:06 AM
> >
> > I think we should create a page called working groups, milestones or
> > something else along those lines under the top page
> > https://elinux.org/Automated_Testing
> >
> > The working groups page should then point to individual pages focused on
> > solving a specific problem. People can follow (i.e. add to their watchlist)
> > the working group pages that they are interested in.
> >
> > A couple of working group (WG) pages to start with could be 'test results
> > definition WG' and 'test case definition WG'.
> >
> > I can set up the first couple of pages to get this rolling.
>
> This sounds great to me.  I have been gathering information related
> to some of this on sub-pages off of https://elinux.org/Test_Standards
>
> For example, I've been gathering dependency information on:
> https://elinux.org/Test_Dependencies

With LAVA, we have been trying to move away from the test framework
knowing anything about dependencies, because that knowledge breaks the
portability of the test operation.

For example, the test writer needs to do the work of installing
packages, or of ensuring that the support is built into deployments
which don't have a package manager. This allows the same unchanged
test operation to run on Debian or in OE, and the test can then be
reproduced easily on Red Hat or Ubuntu without anyone needing to work
out how to unpick the dependency format for their own system.
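
As a rough sketch (untested, and the list of package managers here is
just an example, not a prescription), the setup step can look like:

    #!/bin/sh
    # Install this test's own dependencies with whatever package
    # manager the deployment provides. If there is none (e.g. a
    # minimal OE image), assume the support was built into the image.
    set -e
    if command -v apt-get >/dev/null 2>&1; then
        apt-get -q update
        apt-get -q -y install "$@"
    elif command -v dnf >/dev/null 2>&1; then
        dnf -y install "$@"
    elif command -v opkg >/dev/null 2>&1; then
        opkg update
        opkg install "$@"
    else
        echo "no package manager: assuming dependencies are built in"
    fi

The test operation itself stays identical everywhere; only this one
script knows anything about package formats.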

Test writers then create a script which checks whether any setup work
is needed and performs the required steps. That script also analyses
metadata and parameters to work out what it should do. The same script
gets re-used for multiple test operations with different dependencies
and works both inside the test framework and outside it, e.g. by
checking for lava-test-case in $PATH, it is easy to call
lava-test-case if it exists or to use echo | print | log to declare
what happened when running on a developer machine outside of any test
framework.
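
That $PATH check is only a few lines of shell, something like this
sketch (lava-test-case takes the test case name and a --result value):

    # Report via lava-test-case when running under LAVA, fall back
    # to plain echo on a developer machine.
    report_result() {
        name="$1"
        result="$2"    # pass | fail | skip | unknown
        if command -v lava-test-case >/dev/null 2>&1; then
            lava-test-case "$name" --result "$result"
        else
            echo "TEST ${name}: ${result}"
        fi
    }

    report_result boot-to-shell pass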

Such portability is an important aid in getting problems found in CI
fixed in upstream tools, because developers who are not invested in
the test framework need to be able to reproduce the failure at their
own desks without jumping through the hoops of setting up another
instance of an unfamiliar test framework.

It would be much better to aim for portable test definitions which
know about their own dependencies than to prescribe a method of
exporting those dependencies. This reduces the number of assumptions
in the test operation and makes it easier to get value out of the CI
by getting bugs fixed outside the narrow scope of the CI itself.

> and information about result codes on:
> https://elinux.org/Test_Result_Codes

Test results are not the only indicator of a test operation. A simple
boot test, as in KernelCI, does not care about pass, fail, skip or
unknown. A boot test cares about whether the entire test job was
Complete or Incomplete and, if Incomplete, what the error type and
error message were. If the error type was InfrastructureError, then
resubmit: a different device will pick up the retry. (LAVA will
already have submitted a specialised test job called a health check,
which has the authority to take that one device offline if the device
repeats an InfrastructureError exception.)
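
In rough terms the triage logic is something like the sketch below
(get_job_state, get_error_type and resubmit_job are hypothetical
stand-ins for whatever query tooling the lab provides, not real LAVA
commands):

    # Triage an Incomplete job. Only infrastructure failures are
    # retried automatically; everything else needs a human.
    JOB_ID="$1"
    if [ "$(get_job_state "$JOB_ID")" = "Incomplete" ]; then
        case "$(get_error_type "$JOB_ID")" in
        InfrastructureError)
            # Safe to retry: the scheduler hands the new job to a
            # different device, and health checks will offline a
            # device which repeats an InfrastructureError.
            resubmit_job "$JOB_ID"
            ;;
        *)
            echo "job $JOB_ID Incomplete: needs investigation"
            ;;
        esac
    fi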

Test results only apply to "functional" test jobs - there is a whole
class of boot-only test jobs where test results are completely absent
and only the test job completion matters.

>
> However, I think that there is information needed for different
> efforts, other than the information related to, say, the standard
> itself, that should have a home somewhere.
>
> Different projects are going to have different audiences
> and different activities, so I think having separate pages for them
> is a good idea.
>
> >
> > Are there other suggestions?
> I would like to see a page for build artifacts.  This would be particularly
> around standards for test packages.  The other effort I think we
> agreed to start was something about the Test Execution API (API 'E').
>
> I'll wait to see what pages you create, and try to create some
> in the same style.
>  -- Tim
>



-- 

Neil Williams
=============
neil.williams at linaro.org
http://www.linux.codehelp.co.uk/

