[yocto] Automating image building and testing, what approach to use!?

Bruce Ashfield bruce.ashfield at windriver.com
Wed Aug 31 15:43:45 PDT 2016


On 2016-08-31 6:23 PM, Paul Eggleton wrote:
> Hi Daniel,
>
> On Tue, 30 Aug 2016 17:18:44 Daniel. wrote:
>> While writing software we're used to delivering packages, libraries and
>> stacks. There are a lot of continuous integration solutions out there
>> to automatically build and test these kinds of software. But when
>> dealing with images things are trickier.
>>
>> I can't run the tests on the same machine that builds the image, because
>> cross-compilation takes place in 99% of the cases. What approach are
>> you guys using to automate and increase the quality of your
>> images?
>>
>> Automating the build is the easy part; my concern is automating the
>> runtime tests that need the target board to run. In my case I
>> depend on hardware to fully test the image features. Is there any
>> reliable way to automate image installation and boot!?
>
> There are some folks here working on automated hardware tests (on CC), perhaps
> they can expand on what we're currently doing in that area. At least in the
> existing code we do have basic support for running tests on real hardware that
> may be worth looking into - at the moment, though, it's pretty rudimentary when
> it comes to interacting directly with the hardware. You can see what
> we've currently got here:
>
>   http://www.yoctoproject.org/docs/current/dev-manual/dev-manual.html#hardware-image-enabling-tests
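>
> For reference, a minimal sketch of the kind of local.conf setup that
> section describes for running the image tests against already-deployed
> hardware over the network (variable names as in the docs; the IP
> addresses are just placeholders for your own setup):
>
>   INHERIT += "testimage"
>   TEST_TARGET = "simpleremote"
>   TEST_TARGET_IP = "192.168.7.2"
>   TEST_SERVER_IP = "192.168.7.1"
>
> and then the tests are kicked off with something like:
>
>   bitbake core-image-sato -c testimage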
>
> We've looked at LAVA several times, and I'm sorry to say the conclusion each
> time is that it's a mess - both from a usage perspective and looking at the
> code. It was disappointing to us because initially it looked like it was going
> to solve a lot of our problems. Maybe others have had different experiences -
> I'd love to hear details if anyone is prepared to share.

We've been using LAVA extensively within Wind River, and haven't run
into any major issues with the code or its usage. Perhaps it depends on
the type of test cases that are being run?

LAVA has been actively developed, extensible, and able to integrate with
our wide range of targets.

That's not to say we didn't add a lot of our own tests, infrastructure,
etc., but that was expected work with whatever we chose.

Cheers,

Bruce

>
> Cheers,
> Paul
>



