8. Linaro Test and Validation Summit
Fathi Boudra
Builds and Baselines
LCE13 - Dublin, July 2013
How Do We Better Test our Engineering
9. ● CI Present
○ Anatomy of CI loop
● CI Future
○ What is on the CI roadmap
Overview
10. ● Get the source
○ Source code is under SCM
■ Git (git.linaro.org)
■ Bazaar (bazaar.launchpad.net)
● Build the code
○ Use a build system
■ Jenkins (ci.linaro.org and android-build.linaro.org)
■ LAVA (yes, LAVA can be used!!!)
● Publish the build results
○ Build artifacts are available (snapshots.linaro.org)
Anatomy of CI loop
11. ● Submit the results for testing
○ LAVA (validation.linaro.org)
● Get the test results
○ E-mail notifications with filters (validation.linaro.org/lava-server/dashboard/filters)
○ LAVA dashboard (validation.linaro.org/lava-server/dashboard)
Anatomy of CI loop
12. ● Different types of jobs
○ Kernel CI
○ Engineering builds
○ Components
● Build triggers
○ manual, periodic, URL trigger, post-commit
● Do the build
○ shell script(s)
■ can be maintained under SCM (linux-preempt-rt)
○ Groovy script(s)
● Publish
○ to snapshots.linaro.org
○ to package repositories (PPA, other)
Build jobs in depth
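The "URL trigger" above is Jenkins' remote build trigger: an HTTP GET on JENKINS_URL/job/&lt;name&gt;/build?token=&lt;token&gt; queues a build. A minimal sketch in Python; the server URL, job name and token are hypothetical placeholders:

```python
# Sketch of Jenkins' remote build trigger ("URL trigger"): an HTTP GET on
# JENKINS_URL/job/<name>/build?token=<token> queues a build on that job.
# The server URL, job name and token below are hypothetical placeholders.
from urllib.parse import quote


def build_trigger_url(base_url: str, job_name: str, token: str) -> str:
    """Compose the remote-trigger URL for a Jenkins job."""
    return "%s/job/%s/build?token=%s" % (
        base_url.rstrip("/"), quote(job_name), quote(token))


url = build_trigger_url("https://ci.example.org", "linux-preempt-rt", "SECRET")
print(url)  # https://ci.example.org/job/linux-preempt-rt/build?token=SECRET
# An SCM post-commit hook can simply fetch this URL to kick off a build.
```

The token has to be configured on the Jenkins job itself for the remote trigger to be accepted; periodic and post-commit triggers are configured in the job as well.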
13. ● Submit to LAVA
○ Generate a LAVA job file (JSON)
○ Test definitions are pulled from SCM (git.linaro.org/gitweb?p=qa/test-definitions.git)
● Misc
○ Jenkins can run unit tests (e.g. qemu-ltp job)
■ junit
■ xunit
○ CI helpers
■ post-build-lava
■ post-build-ppa
■ Linaro CI build tools
Build jobs in detail
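The "generate a LAVA job file" step can be sketched as below. This assumes the LAVA v1 JSON dispatcher format; the device type, image URL and result stream are hypothetical placeholders, so check the LAVA documentation for the exact schema:

```python
# Sketch: generate a LAVA (v1) JSON job file whose test definition is
# pulled from git.linaro.org/qa/test-definitions.git.
# device_type, the image URL and the result stream are hypothetical
# placeholders -- adjust them for a real lab.
import json

job = {
    "job_name": "example-smoke-test",
    "device_type": "panda",  # hypothetical target board
    "timeout": 18000,
    "actions": [
        {
            "command": "deploy_linaro_image",
            "parameters": {
                "image": "http://snapshots.linaro.org/path/to/image.img.gz"
            },
        },
        {
            "command": "lava_test_shell",
            "parameters": {
                "testdef_repos": [
                    {
                        "git-repo": "git://git.linaro.org/qa/test-definitions.git",
                        "testdef": "ubuntu/smoke-tests-basic.yaml",
                    }
                ],
                "timeout": 1800,
            },
        },
        {
            "command": "submit_results",
            "parameters": {
                "server": "http://validation.linaro.org/lava-server/RPC2/",
                "stream": "/anonymous/example/",
            },
        },
    ],
}

# Write the job file that would be handed to the scheduler.
with open("job.json", "w") as f:
    json.dump(job, f, indent=2)
```

A job file like this could then be submitted to the scheduler, e.g. with lava-tool (lava-tool submit-job &lt;server&gt; job.json) or via the XML-RPC API.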
14. ● LAVA CI Runtime
○ LAVA as a build system
● LAVA Publishing API
○ LAVA's ability to publish artifacts on a remote host
● Build time optimization
○ persistent slaves
○ mirrors and caching
● Better documentation
CI Future
18. Manual Testing
Current approach:
● test results are not very detailed
● no connection between test case description and result sheet
● results stored in a Google spreadsheet
● bug linking done manually (makes it hard to extract the list of 'known issues')
19. Future:
● store test cases in a better-suited place than the wiki
● preserve test case change history
● store manual test results alongside automated ones (in LAVA)
● link bugs from various tracking systems to failed cases (in LAVA)
● generate reports easily (known issues, fixed problems, etc.)
○ might be done using LAVA if there is an easy way to extract test results (for example, a REST API)
Manual Testing
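The "generate reports easily" idea can be sketched as below. The result records are a hypothetical example of what such a REST API might return, not LAVA's actual schema:

```python
# Sketch of report generation from test results fetched from LAVA
# (e.g. via a REST API returning JSON). "Known issues" are failed cases
# that already have a bug linked; the record shape below is a hypothetical
# example, not LAVA's actual API.
from collections import defaultdict

results = [
    {"test_case": "boot", "result": "pass", "bugs": []},
    {"test_case": "suspend-resume", "result": "fail", "bugs": ["LP#1234"]},
    {"test_case": "hdmi-hotplug", "result": "fail", "bugs": []},
]


def known_issues(results):
    """Failed cases with at least one linked bug, grouped by bug."""
    by_bug = defaultdict(list)
    for r in results:
        if r["result"] == "fail" and r["bugs"]:
            for bug in r["bugs"]:
                by_bug[bug].append(r["test_case"])
    return dict(by_bug)


def unlinked_failures(results):
    """Failed cases that nobody has triaged (no bug linked yet)."""
    return [r["test_case"] for r in results
            if r["result"] == "fail" and not r["bugs"]]


print(known_issues(results))       # {'LP#1234': ['suspend-resume']}
print(unlinked_failures(results))  # ['hdmi-hotplug']
```

With bug links stored in LAVA itself, this kind of report replaces the manual spreadsheet bookkeeping described above.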
20. ● Monitoring dashboard
○ adding bugs
○ debugging failed runs
● Creating custom dashboards
○ Dashboard from filter
○ No need to edit python code to create/edit dashboard
○ Private/public dashboards
○ Dashboard email notification (fits the filter-as-dashboard approach)
Dashboards
21. ● Use only binaries that were already automatically tested
● Don't repeat automated tests in manual runs (we have to be confident that automated results are reliable)
Release workflow
22. LAVA: Faster, Higher, Stronger (& easier to use)
Antonio Terceiro
LAVA
LCE13 - Dublin, July 2013
Test and Validation Summit
23. ● Improvements
● New testing capabilities
● Engineering Progress Overview
● What are we missing?
○ Open Discussion
○ We want to hear from you
Overview
24. ● ~90 ARM devices
● ~300 ARM CPUs
● ~150 jobs submitted per day
● ~99% reliability
Context (0): the size of LAVA, today
25. ● LAVA started as an in-house solution
● Open source since day 1
● Other organizations (incl. Linaro members) interested in running their own LAVA lab
We need to go from an in-house service to a solid product
Context (1)
26. ● No bootloader testing
● Tests only involve single devices
We need to provide features to support new demands in test and validation
Context (2)
28. ● Queue size monitored with Munin
● Nagios monitoring all sorts of things (e.g. temperature on Calxeda Highbank nodes)
● Health check failures
Monitoring
29. Easing LAVA installation
● Effort on proper upstream packaging so that packages for any (reasonable) OS can be easily made
● WIP on Debian and Fedora packaging
$ apt-get install lava-server
$ yum install lava-server
Packaging enhancements
30. Easing LAVA learning
● Documentation is
○ scattered
○ outdated
○ confusing
Documentation overhaul is on the LAVA roadmap.
Documentation overhaul
31. Easing LAVA usage
At the moment, a lava-test-shell job requires
● 1 JSON job file
● 1 YAML test definition file
● + the test code itself
$ sudo apt-get install lava-tool
$ lava script submit mytestscript.sh
$ lava job list
LAVA test suite helper tool
32. Getting more out of LAVA data
More information out of LAVA data
● Improvements in test results visualization in
the LAVA dashboard
34. LAVA is too hard to develop
● Too many separate components
○ Also a mess for bug/project management
● Requires almost a full deployment for development
● Consolidated client components (3 to 1)
● Will consolidate server components (3+ to 1)
Developer-friendliness
36. ● LAVA Multi-purpose Probe
● 1 base design, 5 boards now
● USB serial connection(s) to the host
● management of other connections to/from devices under test
LMP
37. ● prototype sets manufactured and under test
● Use cases: Ethernet hotplug, SATA hotplug, HDMI hotplug and EDID faking, USB OTG testing, USB mux (sort of), lsgpio, audio hotplug, SD-Mux for bootloader testing
LMP (2)
39. LMP (3) - how it works (e.g. SD-MUX)
[Diagram: the LMP SD-MUX sits between the DUT's SD card slot (SDC1) and the host; the host sees the card as a USB mass-storage device (USB MSD) and talks to the LMP over USB serial.]
40. Multi-node testing (1)
● Schedule jobs across multiple target devices
○ Client-server, peer-to-peer and other scenarios
● Combine multiple results into a single result
● LAVA will provide a generic interface; test writers can program any tests they need
○ (special hardware setups possible, but they need to be handled case-by-case)
Other sessions:
● LAVA multi-node testing on Thursday
● LNG multi-node use-cases on Friday
41. Multi-node testing (2)
● Logistics challenge!
● We might end up needing 20 of every device type in the lab
● Need to manage the needed growth in the lab in a sensible way
42. Other projects
● Lightweight interface for kernel developers
● Boot from test UEFI on all Versatile Express boards
● Support for new member boards
44. In Progress vs. Planned
In Progress
● LAVA LMP
● Multi-node testing
● Helper tool
● Test result visualization improvements
● Lightweight interface for kernel devs
● UEFI on Versatile Express
● Support for new member boards
Planned (for soon)
● Server components consolidation
● QA improvements
● Doc overhaul
47. ● What is your experience getting started with LAVA?
● What would have made your experience easier?
● Any suggestions for the LAVA team? Let us know!
● Feedback about the image reports revamp?
Seed Questions