[Automated-testing] Test Stack Survey

Neil Williams neil.williams at linaro.org
Thu Oct 4 00:46:21 PDT 2018


On Wed, 3 Oct 2018 at 20:59, <Tim.Bird at sony.com> wrote:

> > -----Original Message-----
> > From: Matt Hart
> > Answers for LAVA,
> >
> Thanks for filling this out.  I have a few questions below.
> (I'll omit some lines that I don't have questions about)
>
> I created a page for this response at:
> https://elinux.org/LAVA_survey_response
>
> Note that I'm marking a few things that I think are interesting
> (and may be good to discuss at the summit) in red.


Live test result reporting is a key feature. It ensures that a test job
produces some results even if the device locks up part of the way through
the test action.


>
> >
> > On Wed, 19 Sep 2018 at 07:39, <Tim.Bird at sony.com> wrote:
> > > ==== test definitions ====
> > > Does your test system:
> > > * have a test definition repository?
> >
> > LAVA does not come with tests, however Linaro does maintain a set of
> > job definitions
>
> Does Linaro make these available for LAVA users?


Yes. With a few exceptions, the test definitions are in public git
repositories and the URL of each test definition is in the test job
definition. LAVA records the git commit hash of the version actually used
within the results metadata of each test job.
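
As a rough illustration (the repository URL and paths below are invented,
not a Linaro-maintained definition), the test action in a job definition
references the test definition repository like this:

- test:
    timeout:
      minutes: 10
    definitions:
      # repository, path and name are placeholders, not a Linaro repo
      - repository: https://git.example.org/qa/test-definitions.git
        from: git
        path: automated/linux/smoke/smoke.yaml
        name: linux-smoke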


>   Are they generalized
> for common usage, or do they apply only to Linaro test projects or labs?
>

Depends. Some are wrappers around existing test suites, like LTP and
kselftest. Test definitions are being made more and more portable, so that
the operations within the test can be reproduced more easily outside LAVA:
a test should check whether the LAVA helpers are available and continue
without them where possible.
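
A minimal sketch of that pattern in a test definition (names and commands
are illustrative):

metadata:
  format: Lava-Test Test Definition 1.0
  name: portable-smoke-example
  description: "Hypothetical portable test"
run:
  steps:
    # placeholder for the real test commands
    - ./run-tests.sh
    # report through the LAVA helper when present, otherwise fall back
    # to plain output so the same steps work outside LAVA
    - command -v lava-test-case > /dev/null && lava-test-case run-tests --result pass || echo "run-tests pass"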

Some test definitions expect to find peripheral hardware attached to the
DUT, which means the test would fail when run on a similar device without
that peripheral. Such test job definitions contain tags which determine
which devices can be scheduled for those test jobs. Otherwise, most test
definitions will work outside Linaro labs.
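
For example (the device type and tag name are hypothetical), a job which
needs an attached probe might carry:

# Hypothetical fragment: only devices which the lab admin has tagged
# with "energy-probe" will be scheduled for this job.
device_type: beaglebone-black
job_name: example job requiring an attached probe
tags:
  - energy-probe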


> Where could I find out more about these?
>

Start with an existing test job for a device which is familiar to you and
follow the links from the test job definition. There are a lot of
permutations and variations, so just browsing the git repositories would
quickly get confusing. Linaro also has a lot of AOSP (Android Open Source
Project) related test jobs; those are typically more complex than tests
for operating systems like Yocto, OE or Debian.


>
> > > ** if so, what data format or language is used (e.g. yaml, json, shell
> script)
> >
> > YAML
>
> ...
> > > ==== build management ====
> > > Does your test system:
> > > * build the software under test (e.g. the kernel)?
> >
> > No
> >
> > > * build the test software?
> >
> > Sometimes, quite a lot of LAVA users will build the test software on
> > the device they are testing before executing it
>
> I'm familiar with test systems that rely on the distro build system (e.g.
> Yocto, Android
> or buildroot) to build the test software and include it in the distro.
> But this sounds
> like something different.  Just to verify that I understand - the test
> program is built
> on the DUT itself?  Is that right?
>

It can be, yes, as long as the deployed OS has enough support to do the
build. It's useful for small utilities or where the test program needs to
know something specific about the device or kernel. Some test jobs need to
run for hours, so spending a few minutes at the start compiling a small
part of the test isn't a problem.
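
A sketch of what that looks like in the run steps (the package, paths and
commands are invented; the sources would ship alongside the test
definition):

run:
  steps:
    # install a toolchain on the DUT (assumes a Debian-based rootfs
    # with network access)
    - apt-get -y install build-essential
    # build the small helper shipped in the test definition repository
    - make -C ./tests/latency-probe
    - ./tests/latency-probe/probe --report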

LAVA also supports running tests in an LXC which can then use USB or TCP/IP
to communicate with the DUT. This suits operating systems which do not
directly support POSIX on the DUT but which provide tools to interact in
this way, e.g. AOSP. The power management team regularly builds a small
utility in the LXC to communicate with an energy probe attached to the DUT.
This use case is becoming more common.
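
The container is requested through the lava-lxc protocol in the job
definition, roughly like this (the container details are illustrative;
check the protocol documentation for your release):

# Rough sketch of the lava-lxc protocol block; the container name and
# release are illustrative. Deploy, boot and test actions then target
# either the container or the DUT through namespaces.
protocols:
  lava-lxc:
    name: lxc-energy-probe
    template: debian
    distribution: debian
    release: stretch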


>
> Is this done: 1) during a test, 2) just before the test as part of test
> setup, 3) as part
> of DUT provisioning, or 4) some other time?
>

Typically at the start of one of the test actions as a setup phase. LAVA
supports raising an exception to kill the test job early if this setup
stage fails (e.g. due to a build error) to avoid producing spurious results
from a broken setup phase.
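
Something along these lines, assuming the lava-test-raise helper (the
names and paths are invented):

run:
  steps:
    # abort the whole job if the setup/build step fails, rather than
    # letting later test cases report spurious failures
    - make -C ./tests/helper || lava-test-raise "helper build failed"
    - ./tests/helper/run-all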

Other operations can be done during DUT deployment to add the LAVA test
helpers to the rootfs.


>
> ...
> > > * store the build artifacts for generated software?
> >
> > No, however pushing to external storage is supported (Artifactorial)
>
> I never heard of this one.  It looks interesting.
> I presume it is this?: https://github.com/ivoire/Artifactorial


Yes. Implemented as a service here: https://archive.validation.linaro.org/

The same functionality can be used to push to any third-party service; it
doesn't have to be Artifactorial. Any service with which the DUT can
authenticate and transfer data will work. The returned URL can then be
stored as a test reference in the test job results.
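
A sketch of that flow from a test shell, assuming an external service
which accepts a plain HTTP upload (the upload URL and form field are
placeholders) and the lava-test-reference helper:

run:
  steps:
    # upload the artifact to external storage and attach the returned
    # URL to a named test case in the job results
    - lava-test-reference results-archive --result pass --reference "$(curl -sF 'path=@results.tar.gz' https://archive.example.org/artifacts/team/job/)"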


>
> ...
> > > ==== Requirements ====
> > > Does your test framework:
> > > * require minimum software on the DUT?
> >
> > Bootloader
> >
> > > * require minimum hardware on the DUT (e.g. memory)
> >
> > Serial port
> >
> > > ** If so, what? (e.g. POSIX shell or some other interpreter, specific
> > libraries, command line tools, etc.)
> >
> > POSIX shell for most DUT, however IOT devices are supported without a
> shell
>
> Interesting.  I assume this precludes any "interactive" commands, or
> DUT-side
> operations initiated by LAVA.  (But please correct me if I'm wrong).
>

Currently, yes, although some form of limited interaction is being scoped
for a future release.


> Does LAVA just scan the serial console output for its results tokens?
>

Essentially, yes. Unique start and end strings must be pre-defined, and the
test writer provides a Python regular expression to parse the test results
which appear between those strings.
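
For a shell-less board this is expressed as a monitors test action,
roughly like this (the strings and regex are illustrative):

- test:
    monitors:
      - name: sample-tests
        start: "Running test suite"
        end: "PROJECT EXECUTION SUCCESSFUL"
        # named groups map the match onto a result and a test case id
        pattern: '(?P<result>(PASS|FAIL)) - (?P<test_case_id>\w+)\.'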


> I presume that in this circumstance the test initiation is baked into
> the boot sequence for the DUT/SUT?
>

Yes. The IoT board typically provides an interface to deploy a new boot
file; the board is then reset and executes the boot file immediately. Test
jobs often finish in under a minute.


>
> > > ==== APIS ====
> > > Does your test framework:
> > > * use existing APIs or data formats to interact within itself, or with
> 3rd-
> > party modules?
> >
> > ZMQ
>
>
> > > * have a published API for any of its sub-module interactions (any of
> the
> > lines in the diagram)?
> > > ** Please provide a link or links to the APIs?
> >
> > No
> >
> > >
> > > Sorry - this is kind of open-ended...
> > > * What is the nature of the APIs you currently use?
> > > Are they:
> > > ** RPCs?
> > > ** Unix-style? (command line invocation, while grabbing sub-tool
> output)
> > > ** compiled libraries?
> > > ** interpreter modules or libraries?
> > > ** web-based APIs?
> > > ** something else?
> >
> > ZMQ is used between LAVA workers (dispatchers) and the master, to
> > schedule jobs
> > XML-RPC is for users to submit jobs, access results, and control the lab


Test results and other data can be extracted through web-based APIs as YAML
and other formats. There is also a limited REST API.


> >
> > >
> > > ==== Relationship to other software: ====
> > > * what major components does your test framework use (e.g. Jenkins,
> > Mondo DB, Squad, Lava, etc.)
> >
> > Jenkins, Squad
> >
> > > * does your test framework interoperate with other test frameworks or
> > software?
> > > ** which ones?
> >
> > A common LAVA setup is Jenkins to create the builds, LAVA to execute
> > the tests, and Squad/KernelCI to consume the results
>
> That is helpful to know.  Thanks.
>
>
> > >
> > > == Overview ==
> > > Please list the major components of your test system.
> > >
> > > Please list your major components here:
> > > *
> >
> > * LAVA Server - (DUT Scheduler) - UI, Results storage, Device
> > configuration files, Job scheduling, User interaction
> > * LAVA Dispatcher - (DUT Controller) - Device interaction (deploy,
> > boot, execute tests), Device Power Control, Test results Parsing
> >
> > >
> > > == Glossary ==
> > > Here is a glossary of terms.  Please indicate if your system uses
> different
> > terms for these concepts.
> > > Also, please suggest any terms or concepts that are missing.
> >
> > PDU - Power Distribution Unit, however has become a standard term for
> > automation power control.
>
> Good addition.  Thanks.
>  -- Tim
>
> --
> _______________________________________________
> automated-testing mailing list
> automated-testing at yoctoproject.org
> https://lists.yoctoproject.org/listinfo/automated-testing
>


-- 

Neil Williams
=============
neil.williams at linaro.org
http://www.linux.codehelp.co.uk/