SLAV: Test Stack Abstraction Layers - Pawel Wieczorek, Samsung R&D Institute Poland [Automated Testing Summit 2019]

Pawel took part in Samsung's research on automated testing for embedded devices.

Direct access to a device doesn't differ much from automated testing: the device has to be provisioned, the tests are run while you observe their output, and afterwards the device has to be restored and returned to the pool.
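The lifecycle is the same whether a human or a framework drives it. A minimal sketch in Go, where every name is an illustrative placeholder rather than anything from the SLAV code:

```go
package main

import "fmt"

type Device struct{ Name string }

func provision(d *Device)     { fmt.Println("provisioning", d.Name) } // e.g. flash image, boot
func runAndObserve(d *Device) { fmt.Println("testing", d.Name) }      // run tests, watch output
func restore(d *Device)       { fmt.Println("restoring", d.Name) }    // wipe state, power-cycle
func returnToPool(d *Device)  { fmt.Println("releasing", d.Name) }    // make available again

func main() {
	d := &Device{Name: "board-07"}
	provision(d)
	runAndObserve(d)
	restore(d)
	returnToPool(d)
}
```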

For target management, several controller boards were designed. This includes the SDmux in various iterations, documented at https://wiki.tizen.org/SD_MUX. The SDmux has to be able to control the power and switch the data lines. The SDWire is a follow-up that only switches control of the SD card; however, it doesn't allow intercepting traffic. Finally they designed the MuxPi board, which combines the SDWire functionality with an ARMv8 SBC that can fully control the device: https://github.com/SamsungSLAV/muxpi

Running a test requires a number of actions. Where are these best performed, and how? They decided to make the test manager minimal: it only initiates actions and lists or cancels ongoing ones. It provides four REST API paths: create a job, list jobs, get job info, and cancel a job. A job is a simple task scenario.
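A minimal Go client sketch for such a four-path job API; the paths, payloads and server address here are assumptions for illustration, not the actual API of the SLAV test manager:

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

const server = "http://testmanager.example:8080" // hypothetical address

// createJob submits a job definition (a simple task scenario).
func createJob(yamlDef []byte) error {
	resp, err := http.Post(server+"/jobs", "application/yaml", bytes.NewReader(yamlDef))
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	fmt.Println("create:", resp.Status)
	return nil
}

// listJobs, jobInfo and cancelJob cover the remaining three paths.
func listJobs() (*http.Response, error) { return http.Get(server + "/jobs") }

func jobInfo(id string) (*http.Response, error) { return http.Get(server + "/jobs/" + id) }

func cancelJob(id string) (*http.Response, error) {
	req, err := http.NewRequest(http.MethodPost, server+"/jobs/"+id+"/cancel", nil)
	if err != nil {
		return nil, err
	}
	return http.DefaultClient.Do(req)
}

func main() {
	_ = createJob([]byte("device_type: tv\nactions: []\n"))
}
```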

The test scheduler manages requests for access to devices. It has APIs to list requests and to prolong or terminate an ongoing one. To keep users from hogging devices, requests are automatically closed after some time, so prolonging is necessary. Workers (target managers) appear when they register with the scheduler; it is then possible to set their state (IDLE, MAINTENANCE or FAILED) and to assign tags ("groups").
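A rough sketch of the scheduler-side bookkeeping this implies: requests that expire unless prolonged, and workers with a state and tags. The type and field names are assumptions, not the actual types of the SLAV scheduler:

```go
package main

import "time"

// WorkerState mirrors the three states mentioned in the talk.
type WorkerState int

const (
	Idle WorkerState = iota
	Maintenance
	Failed
)

// Worker is a registered target manager with its tags ("groups").
type Worker struct {
	ID    string
	State WorkerState
	Tags  map[string]string
}

// Request is an access request that is closed automatically at Deadline.
type Request struct {
	ID       int
	Owner    string
	Deadline time.Time
	Closed   bool
}

// Prolong pushes the deadline forward so the user keeps the device.
func (r *Request) Prolong(d time.Duration) {
	r.Deadline = r.Deadline.Add(d)
}

// Expire closes requests whose deadline has passed, freeing the device.
func Expire(reqs []*Request, now time.Time) {
	for _, r := range reqs {
		if !r.Closed && now.After(r.Deadline) {
			r.Closed = true
		}
	}
}

func main() {
	w := Worker{ID: "muxpi-1", State: Idle, Tags: map[string]string{"device_type": "tv"}}
	_ = w
	r := &Request{ID: 1, Owner: "pawel", Deadline: time.Now().Add(30 * time.Minute)}
	r.Prolong(30 * time.Minute) // keep the device for another half hour
	Expire([]*Request{r}, time.Now())
}
```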

The possible interactions between the test manager and the device are implemented as shell scripts, which may be customized for the specific DUT. Once a script is written, there's no need to know exactly how to interact with the board, e.g. which DIP switch settings to use. Still, some scripts remain hardware-specific. The advantage is that test plans can be written without knowledge of the test setup and can be reused between projects. However, it takes a lot of effort to keep up with changes in LAVA, and it means redoing work that others have already done.
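One way such an abstraction could look is a generic action name mapped onto a board-specific script; the directory layout and script names in this Go sketch are assumptions:

```go
package main

import (
	"fmt"
	"os/exec"
	"path/filepath"
)

// RunAction executes e.g. scripts/muxpi/power_on.sh for board "muxpi".
// The generic action name is the stable interface; the script hides
// the hardware specifics (DIP switches, mux settings, ...).
func RunAction(board, action string, args ...string) error {
	script := filepath.Join("scripts", board, action+".sh")
	out, err := exec.Command(script, args...).CombinedOutput()
	fmt.Printf("%s: %s", script, out)
	return err
}

func main() {
	// A test plan only knows the action names, not the board internals.
	_ = RunAction("muxpi", "power_on")
	_ = RunAction("muxpi", "flash", "image.img")
}
```

The point of the design is that swapping the board only means swapping the script directory; the test plan stays untouched.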

The strength of the test scheduler is that it can give users priority over tests, and that it is agnostic of resource types. However, it requires a separate task manager agent, and it can only use a single manager per test/device. Also, the capabilities of a device/manager are fixed, so any change has to be applied manually; it is thus not possible to dynamically reconfigure a DUT.
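The priority mechanism could be as simple as ordering the pending queue by requester class; a sketch, with the two priority levels chosen purely for illustration:

```go
package main

import (
	"fmt"
	"sort"
)

type Priority int

const (
	TestJob     Priority = iota // lowest priority
	UserSession                 // users win over automated tests
)

type Pending struct {
	ID       int
	Priority Priority
}

// Next pops the highest-priority pending request; the stable sort keeps
// FIFO order within a level. Assumes a non-empty queue for brevity.
func Next(queue []Pending) (Pending, []Pending) {
	sort.SliceStable(queue, func(i, j int) bool {
		return queue[i].Priority > queue[j].Priority
	})
	return queue[0], queue[1:]
}

func main() {
	q := []Pending{{ID: 1, Priority: TestJob}, {ID: 2, Priority: UserSession}}
	next, _ := Next(q)
	fmt.Println("assigned request:", next.ID) // request 2: the user session
}
```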

The DUT control layer reduces the knowledge required to work with a specific board and unifies different DUT devices: it doesn't matter whether it's a TV, a fridge or a smartwatch. However, this means the initial setup of a device is hard. Also, the setup is often specific to the test lab, so in practice there is not that much unification.

The project is divided into separate repositories, which makes it modular but also makes integration more difficult. In theory the components can be swapped or used independently. The project reused too little and rewrote too many already existing components. However, all of the source is available on GitHub so others can reuse it.