Yocto Project Test Environment Manual

Scott Rifenbark

Scotty's Documentation Services, INC

Permission is granted to copy, distribute and/or modify this document under the terms of the Creative Commons Attribution-Share Alike 2.0 UK: England & Wales as published by Creative Commons.

Manual Notes

  • This version of the Yocto Project Test Environment Manual is for the 2.7 release of the Yocto Project. To be sure you have the latest version of the manual for this release, go to the Yocto Project documentation page and select the manual from that site. Manuals from the site are more up-to-date than manuals derived from the Yocto Project released TAR files.

  • If you located this manual through a web search, the version of the manual might not be the one you want (e.g. the search might have returned a manual much older than the Yocto Project version with which you are working). You can see all Yocto Project major releases by visiting the Releases page. If you need a version of this manual for a different Yocto Project release, visit the Yocto Project documentation page and select the manual set by using the "ACTIVE RELEASES DOCUMENTATION" or "DOCUMENTS ARCHIVE" pull-down menus.

  • To report any inaccuracies or problems with this manual, send an email to the Yocto Project discussion group at yocto@yoctoproject.org or log into the freenode #yocto channel.

Revision History
Revision 2.7TBD
Released with the Yocto Project 2.7 Release.

Table of Contents

1. The Yocto Project Test Environment Manual
1.1. Welcome
1.2. Yocto Project Autobuilder Overview
1.3. Yocto Project Tests - Type Overview
1.4. How Tests Map to Areas of Code
1.5. Test Examples
1.5.1. bitbake-selftest
1.5.2. oe-selftest
1.5.3. testimage
1.5.4. testsdk_ext
1.5.5. testsdk
1.5.6. oe-build-perf-test
1.6. New Section on the Periodic Builds
1.7. Configuring and Triggering Autobuilder Helper Build Scripts
1.8. Deploying Yocto Autobuilder
1.8.1. Upstream Autobuilder Deployment on the Controller
1.8.2. Upstream Autobuilder Deployment on the Worker
1.8.3. Upstream Autobuilder Deployment No Upstream Users
1.9. Setting Up Headless Sanity Tests
1.10. Adding Additional Build Workers
1.11. Setting Up Build History
1.12. Some More Notes
1.13. Yocto Project Autobuilder Helper Scripts

Chapter 1. The Yocto Project Test Environment Manual

1.1. Welcome

Welcome to the Yocto Project Test Environment Manual! This manual is a work in progress. The manual contains information about the testing environment used by the Yocto Project to make sure each major and minor release works as planned. Other organizations can leverage the process and testing environment used by the Yocto Project to create their own automated, production test environments.

Currently, the Yocto Project Test Environment Manual has no projected release date. The manual is being initially loaded with information from the following README files and notes from key engineers:

  • yocto-autobuilder: This README.md is no longer maintained. However, some of its information still applies, although it could need modification. In particular, this is true of the information about setting up headless sanity tests and build history. The sections on these topics will be changing.


    The yocto-autobuilder repository is obsolete and is no longer maintained. The new "Autobuilder" is maintained in the yocto-autobuilder2 repository.

  • yocto-autobuilder2: This README.md is the main README for Yocto Project Autobuilder. The yocto-autobuilder2 repository represents the Yocto Project's testing codebase and exists to configure and use Buildbot to do testing.

  • yocto-autobuilder-helper: This README describes the yocto-autobuilder-helper repository, which contains the Yocto Project Autobuilder Helper Scripts. The repository holds the "glue" logic that connects any Continuous Integration (CI) system to run builds: getting the correct code revisions, configuring builds and layers, running builds, and collecting results. The code is independent of any CI system, which means the code can work with Buildbot, Jenkins, or others.

1.2. Yocto Project Autobuilder Overview

The Yocto Project Autobuilder collectively refers to the software, tools, scripts, and procedures used by the Yocto Project to test released software across supported hardware in an automated and regular fashion. Basically, during the development of a Yocto Project release, the Autobuilder tests whether things work. The Autobuilder builds all test targets and runs all the tests.

The Yocto Project uses the unpatched Buildbot Nine to drive its integration and testing. Buildbot Nine has a plug-in interface that the Yocto Project customizes using code from the yocto-autobuilder2 repository. The resulting customized UI plug-in allows you to visualize builds in a way suited to the project.

A "helper" layer provides configuration and job management through scripts found in the yocto-autobuilder-helper repository. The helper layer contains the bulk of the build configuration information and is release-specific, which makes it highly customizable on a per-project basis. The layer is CI system-agnostic and contains a number of helper scripts that can generate build configurations from simple JSON files.


It is possible to use the outer layers from another Continuous Integration (CI) system such as Jenkins instead of Buildbot.
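For illustration only, a build definition in one of these JSON files might look like the following sketch. The key names here are loosely patterned after the helper's configuration and should not be taken as the actual config.json schema:

```json
{
    "example-quick-build": {
        "MACHINE": "qemux86-64",
        "DISTRO": "poky",
        "BBTARGETS": "core-image-minimal core-image-sato"
    }
}
```

The helper scripts read such definitions and expand them into the concrete build steps a worker executes.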

The following figure shows the Yocto Project Autobuilder stack with a topology that includes a controller and a cluster of workers:

1.3. Yocto Project Tests - Type Overview

Two kinds of tests exist within the Yocto Project:

  • Build Testing: Tests whether specific configurations build by varying MACHINE, DISTRO, and the specific target images being built (or world).

  • Build Performance Testing: Tests whether or not commonly used steps during builds work efficiently and avoid regressions.

Across these two categories, the Autobuilder tests different pieces of the codebase by using the following types of tests:

  • Build Testing: Trigger builds of all the different test configurations on the Autobuilder. Builds usually cover each target for different architectures, machines, and distributions.

  • Build Performance Testing: Tests that time commonly used usage scenarios; these run through oe-build-perf-test.

  • eSDK Testing: Image tests initiated through the following command:

         $ bitbake image -c testsdkext

    The tests utilize the testsdkext class and the do_testsdkext task.

  • Feature Testing: Various scenario-based tests are run through the OpenEmbedded Self-Test (oe-selftest).

  • Image Testing: Image tests initiated through the following command:

         $ bitbake image -c testimage

    The tests utilize the testimage* classes and the do_testimage task.

  • Package Testing: A Package Test (ptest) runs tests against packages built by the OpenEmbedded build system on the target machine. See the "Testing Packages With ptest" section in the Yocto Project Development Tasks Manual and the "Ptest" Wiki page for more information on Ptest.

  • Sanity Checks During the Build Process: Tests initiated through the insane class.

  • SDK Testing: Image tests initiated through the following command:

         $ bitbake image -c testsdk

    The tests utilize the testsdk class and the do_testsdk task.

  • Unit Testing: Unit tests on various components of the system run through oe-selftest and bitbake-selftest.

1.4. How Tests Map to Areas of Code

Tests map into the codebase as follows:

  • bitbake-selftest:

    • These tests are self-contained and test BitBake as well as its APIs, which include the fetchers. The tests are located in bitbake/lib/*/tests.

    • From within the BitBake repository, run the following:

           $ bitbake-selftest

    • The tests are based on Python unittest.

  • oe-selftest:

    • These tests use OE to test the workflows, which include testing specific features, behaviors of tasks, and API unit tests. The tests take advantage of parallelism through the "-j" option to run in multiple threads.

    • The tests are based on Python unittest.

    • The code for the tests resides in meta/lib/oeqa/selftest.

    • To run all the tests, enter the following command:

           $ oe-selftest -a

    • To run a specific test, use the following command form where testname is the name of the specific test:

           $ oe-selftest -r testname

  • testimage:

    • These tests build an image, boot it, and run tests against the image's content.

    • The code for these tests resides in meta/lib/oeqa/runtime.

    • You need to set the IMAGE_CLASSES variable as follows:

           IMAGE_CLASSES += "testimage"

    • Run the tests using the following command form:

           $ bitbake image -c testimage

  • testsdk:

    • These tests build an SDK, install it, and then run tests against that SDK.

    • The code for these tests resides in meta/lib/oeqa/sdk.

    • Run the test using the following command form:

           $ bitbake image -c testsdk

  • testsdk_ext:

    • These tests build an extended SDK (eSDK), install that eSDK, and run tests against the eSDK.

    • The code for these tests resides in meta/lib/oeqa/esdk.

    • To run the tests, use the following command form:

           $ bitbake image -c testsdkext

  • oe-build-perf-test:

    • These tests run through commonly used usage scenarios and measure the performance times.

    • The code for these tests resides in NEED A DIRECTORY HERE.


           some setting

    • Run the tests using the following command form:

           $ some command

1.5. Test Examples

This section provides example tests for each of the tests listed in the "How Tests Map to Areas of Code" section.

1.5.1. bitbake-selftest

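bitbake-selftest cases are standard Python unittest cases that live alongside the code they exercise in bitbake/lib/*/tests. The following self-contained sketch mirrors that style; the expand() function is a hypothetical stand-in for whatever module a real test would import from bitbake/lib:

```python
import unittest

# Hypothetical function under test; real bitbake-selftest cases import
# the code they exercise from bitbake/lib (for example, the fetchers).
def expand(s, values):
    """Tiny variable-substitution helper used only for this illustration."""
    for key, val in values.items():
        s = s.replace("${%s}" % key, val)
    return s

class ExpansionTests(unittest.TestCase):
    """Shaped like a bitbake-selftest case: plain Python unittest."""

    def test_simple_expansion(self):
        self.assertEqual(expand("${A} world", {"A": "hello"}), "hello world")

    def test_unknown_variable_left_alone(self):
        self.assertEqual(expand("${B}", {"A": "x"}), "${B}")

# Run the suite programmatically so the example is self-contained.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ExpansionTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In the BitBake repository itself, the bitbake-selftest command discovers and runs all such cases for you.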

1.5.2. oe-selftest
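oe-selftest cases are also Python unittest cases, found under meta/lib/oeqa/selftest, and they typically drive a build through helpers such as bitbake() from the oeqa utility modules. The sketch below only illustrates the shape of such a test; it substitutes a dummy bitbake() so it can run outside a build environment, and the target name is just an example:

```python
import unittest

def bitbake(target):
    """Stand-in for the real oeqa bitbake() helper so this sketch runs
    outside a build environment; the real helper invokes BitBake and
    returns an object carrying the exit status and output."""
    class Result:
        status = 0
        output = "Tasks Summary: attempted 42 tasks... all succeeded."
    return Result()

class TinyFeatureTest(unittest.TestCase):
    """Shaped like an oe-selftest case: build something, check results."""

    def test_target_builds(self):
        result = bitbake("quilt-native")
        self.assertEqual(result.status, 0)
        self.assertIn("all succeeded", result.output)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TinyFeatureTest)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

A real test would be selected and run with oe-selftest -r, as shown in the "How Tests Map to Areas of Code" section.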


1.5.3. testimage
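As a starting point, the local.conf additions needed before running testimage typically look like the following fragment. The TEST_SUITES value is just an example; omit it to run the default suites:

```
IMAGE_CLASSES += "testimage"

# Optional: restrict which runtime tests run against the booted image
TEST_SUITES = "ping ssh"
```

With those settings in place, running bitbake core-image-minimal -c testimage builds the image, boots it, and runs the selected tests against it.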


1.5.4. testsdk_ext


1.5.5. testsdk


1.5.6. oe-build-perf-test


1.6. New Section on the Periodic Builds

The following is going to be the replacement content for the section on "Nightly Builds". Not sure what we are going to call these builds. We need a name to replace "Nightly Builds".

Here is the content from Richard's email:

     In 1.6, we actually dropped the "nightly" bit pretty much everywhere.
     They are now named MACHINE or MACHINE-DISTRO, e.g. qemuarm or qemuarm-
     lsb (which tests poky-lsb with qemuarm). We now parallelise not just
     architecture but by machine so machine and real hardware are now
     separate. The flow is therefore to build the images+sdks, then test the
     images+sdks, trying to do as much as possible in parallel.

     We have two types of build trigger, "quick" and "full". quick runs all
     the things which commonly fail and one random oe-selftest. "full" runs
     all our targets, runs oe-selftest on all distros and includes ptest and
     build performance tests. Its slower but more complete and would be used
     for release builds.

1.7. Configuring and Triggering Autobuilder Helper Build Scripts


This section is created from the information in the yocto-autobuilder2  README.md file. I am making an assumption that we do not want to refer to the Autobuilder stuff as "Autobuilder2". My guess is that since this is the first documentation of any automated test environment and process in the Yocto Project user documentation, we will treat it as the start of things.

Automatic testing is based on workers executing builds with Buildbot Nine, configured for specific build jobs that are triggered in an automatic and regular fashion. Worker configuration and triggering are accomplished through the Yocto Project Autobuilder layer and a set of helper scripts.

The configuration and helper scripts have as little code and as few custom Buildbot extensions as possible. The configuration collects required input from the user to furnish the helper scripts with the input needed for workers to accomplish their builds. The input consists of minimal user-customizable parameters used to trigger the helper build scripts.

Each builder maps to a named configuration in the helper scripts. The configuration is created with the steps and properties required to invoke the helper scripts for a worker's builds.

Each worker has a custom scheduler created for it. The scheduler contains configured parameters that supply custom versions of the required values for the helper script parameters.

Following is the code layout for the Autobuilder:

  • builders.py: Configures the builders with the minimal buildsteps to invoke the Yocto Project Autobuilder helper scripts.

  • lib/wiki.py: Implements functionality related to MediaWiki; effectively, it provides helper functions used by the wikilog plug-in.


    Much of this code can be replaced by porting the plug-in so that it is implemented as a buildbot.util.service.HTTPClient.

  • reporters/wikilog.py: A custom plug-in that is a Buildbot service that listens for build failures and then writes information about the failure to the configured wiki page.

  • steps/writelayerinfo.py: Implements a simple, custom buildstep that iterates the repo_, branch_, and commit_ properties, which are set by the schedulers, and then writes a JSON file with the user's values.

  • config.py: Contains all values that might need changing to redeploy the Autobuilder code elsewhere.


    This redeployment goal has not yet been met.

  • master.cfg: Performs most configuration by making calls into other scripts. Configuration specific for a worker cluster (i.e. a Controller URL) resides here.

  • schedulers.py: Sets up the force schedulers with controls for modifying inputs for each worker.

  • services.py: Configures IRC, mail, and Wikilog reporters.

  • workers.py: Configures the worker objects.

  • www.py: Sets up the Web User Interface.
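The writelayerinfo.py step described above can be pictured with a small sketch. This is hypothetical code, not the actual plug-in: it gathers the repo_, branch_, and commit_ properties set by the schedulers and serializes them to a JSON file.

```python
import json

def write_layer_info(properties, path):
    """Collect repo_/branch_/commit_ scheduler properties into one
    JSON document, grouped by layer name (illustrative only)."""
    layers = {}
    for key, value in properties.items():
        for prefix in ("repo_", "branch_", "commit_"):
            if key.startswith(prefix):
                name = key[len(prefix):]
                layers.setdefault(name, {})[prefix.rstrip("_")] = value
    with open(path, "w") as f:
        json.dump(layers, f, indent=4, sort_keys=True)
    return layers

# Example scheduler properties for a single layer.
props = {
    "repo_poky": "https://git.yoctoproject.org/git/poky",
    "branch_poky": "master",
    "commit_poky": "HEAD",
}
info = write_layer_info(props, "layerinfo.json")
```

The helper scripts can then consume the resulting JSON file to check out the requested revisions for each layer.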

The goal is to keep custom code minimized throughout the Autobuilder. The few customizations implemented support the Yocto Project Autobuilder Helper Script workflows and help replicate the workflows established with the Yocto Autobuilder layer. In particular, the following files accomplish this customization:

  • writelayerinfo.py

  • wikilog.py

  • wiki.py

1.8. Deploying Yocto Autobuilder

Steps to deploy the Yocto Project Autobuilder assume each target system has a copy of Buildbot installed. Additionally, various pieces of functionality require that a copy of the Autobuilder Helper Scripts (i.e. yocto-autobuilder-helper) is available at ~/yocto-autobuilder-helper in the home directory of the user running Buildbot.


If you are using a reverse proxy, be aware that modern Buildbot uses a web socket for various communications between the master and the web user interface. Refer to the Buildbot documentation for information on how to correctly configure a reverse proxy.
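As a rough illustration of the web socket concern, an nginx-based proxy must pass the HTTP upgrade headers through for the /ws endpoint. The fragment below is an assumption-laden sketch (nginx, default web port 8010, a master on the same host); consult the Buildbot reverse proxy documentation for an authoritative configuration:

```nginx
# Illustrative nginx fragment for Buildbot's web socket endpoint
location /ws {
    proxy_pass http://127.0.0.1:8010/ws;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_read_timeout 6000s;
}
```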

The following sections provide steps for Yocto Autobuilder deployment.

1.8.1. Upstream Autobuilder Deployment on the Controller

Follow these steps to deploy Yocto Autobuilder on an upstream controller:

  1. Create the Master Yocto Controller:

         $ buildbot create-master yocto-controller

  2. Change Your Working Directory to the Master Yocto Controller:

         $ cd yocto-controller

  3. Create a Local Git Repository of the Yocto Project Autobuilder:

         $ git clone https://git.yoctoproject.org/git/yocto-autobuilder2 yoctoabb

    The previous command creates the local repository in a yoctoabb directory inside the Master Yocto Controller directory.

  4. Change Your Working Directory Back Out of the Master Yocto Controller:

         $ cd ..

  5. Create a Relative Symbolic Link to master.cfg:

         $ ln -rs yocto-controller/yoctoabb/master.cfg yocto-controller/master.cfg

    The previous command sets up a relative symbolic link to the master.cfg using a link of the same name.

  6. Update the Buildbot URL in master.cfg: Use your $EDITOR to edit the Buildbot URL in the master.cfg file. Find the following line and replace the URL with the URL for your Buildbot:

         c['buildbotURL'] = "https://autobuilder.yoctoproject.org/main/"

  7. Enable services in services.py: Use your $EDITOR to edit the services.py file. Set appropriate configuration values to enable desired services.

  8. Enable Authorization in www.py: Use your $EDITOR to edit the www.py file. Configure authorization if desired.

  9. Modify Configuration Options in config.py: Use your $EDITOR to edit the config.py file. Modify configuration options such as worker configurations.

  10. Start Buildbot:

         $ buildbot start yocto-controller

  11. Create a Local Git Repository of the Yocto Autobuilder Helper Scripts: Move up a directory so that you are above the Master Yocto Controller directory, and then clone the helper script repository:

         $ cd ..
         $ git clone https://git.yoctoproject.org/git/yocto-autobuilder-helper

1.8.2. Upstream Autobuilder Deployment on the Worker

Follow these steps to deploy Yocto Autobuilder on an upstream worker:

  1. Create the Worker:

         $ buildbot-worker create-worker yocto-worker localhost example-worker pass


    You do not have to hard-code the third parameter (i.e. example-worker). For example, you can pass `hostname` to use the host's configured name.

  2. Start the Worker:

         $ buildbot-worker start yocto-worker

1.8.3. Upstream Autobuilder Deployment No Upstream Users

This case has yet to be defined. It requires a custom config.json file for yocto-autobuilder-helper.

1.9. Setting Up Headless Sanity Tests

If you plan on using the Yocto Project Autobuilder to run headless sanity testing, you need to do the following:

  1. Install TightVNC client and server.

  2. Create a bank of tap network devices (tap devs) by running the runqemu-gen-tapdevs script found in the Source Directory at https://git.yoctoproject.org/cgit/cgit.cgi/poky/tree/scripts.

    You must prevent network interface management services from controlling these new tap devices. Examples of such services include NetworkManager, connman, and wicd.

  3. Add "xterm*vt100*geometry: 80x50+10+10" to .Xdefaults

  4. Set up and start the TightVNC session as the Autobuilder user.

  5. Manually connect to the VNC session at least once prior to running a QEMU sanity test.


    Something is getting set during the initial connection that has not been figured out yet. Manually connecting seems to set up the session correctly.

1.10. Adding Additional Build Workers

The production Yocto Autobuilder uses a cluster of build workers. The cluster shares the same SSTATE_DIR and DL_DIR through an NFS4 mounted Network Attached Storage (NAS). The main nightly trigger pre-populates the DL_DIR, which allows the workers to not have to deal with a lot of downloading. In theory, you could also run your build workers with NO_NETWORK to enforce a single point for populating DL_DIR.
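As an illustration of the shared setup (the paths here are examples, and the exact variable names depend on your autobuilder.conf), every worker points its downloads and shared state at the NAS mount:

```
# Illustrative autobuilder.conf fragment: all workers share downloads
# and shared state over the NFS-mounted NAS
DL_DIR = "/nas/downloads"
SSTATE_DIR = "/nas/sstate"
```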

Running multiple build workers is fairly simple, but does require some setup:

  1. Ensure the settings in autobuilder.conf are valid for each worker. Certain variables are set within this file that work with the local configurations on each worker.

  2. Within yocto-controller/controller.cfg, add your worker to the c['workers'] list inside the BUILDWORKERS section.

  3. For each worker change the WORKER SETTINGS section of yocto-worker/buildbot.tac to match the settings in controller.cfg.

  4. Workers must reside in the same path as the Build Controller, even if they are on completely different machines.

1.11. Setting Up Build History

Build History is used to track changes to packages and images. By default, the Autobuilder does not collect build history. The production Autobuilder does have this functionality enabled.

Setting up build history requires the following steps:

  1. Create an empty Git repository. Make a single commit to it and then create and push branches for each of the nightly core architectures (i.e. mips, ppc, x86, and so forth).

  2. Find a central location to create a clone for the repository created in the previous step. This works best if you have a setup similar to the production Autobuilder (i.e. NAS with many workers).

  3. Run the following:

         # This is an example of how to set up a local build history checkout. Paths
         # obviously are situationally dependent.
         $ mkdir /nas/buildhistory
         $ cd /nas/buildhistory
         $ git clone ssh://git@git.myproject.org/buildhistory
         $ git clone ssh://git@git.myproject.org/buildhistory nightly-arm
         $ git clone ssh://git@git.myproject.org/buildhistory nightly-x86
         $ git clone ssh://git@git.myproject.org/buildhistory nightly-x86-64
         $ git clone ssh://git@git.myproject.org/buildhistory nightly-ppc
         $ git clone ssh://git@git.myproject.org/buildhistory nightly-mips
         $ for x in $(ls | grep nightly); do cd $x; git checkout $x; cd /nas/buildhistory; done

  4. Within the autobuilder.conf of each worker, change the following:

         BUILD_HISTORY_DIR = "/nas/buildhistory"
         BUILD_HISTORY_REPO = "ssh://git@git.myproject.org/buildhistory"

1.12. Some More Notes

  • Yocto Autobuilder: The Git repository is at http://git.yoctoproject.org/clean/cgit.cgi/yocto-autobuilder/tree/.

    The Yocto Autobuilder is essentially an extension to vanilla Buildbot. This extension mainly addresses configuration file handling and Yocto-specific build steps.

    For better maintainability, the Autobuilder (see Autobuilder.py located at http://git.yoctoproject.org/clean/cgit.cgi/yocto-autobuilder/tree/lib/python2.7/site-packages/autobuilder) handles configuration from multiple files.

    Additional build steps such as CheckOutLayers.py or CreateBBLayersConf are Yocto-specific and simplify the worker's configuration.

  • TightVNC: Virtual Network Computing (VNC) is a client/server software package that allows remote network access to graphical desktops. With VNC, you can access your machine from anywhere, provided that your machine is connected to the Internet. VNC is free (released under the GNU General Public License) and available on most platforms.

    TightVNC is an enhanced version of VNC, which includes new features, improvements, optimizations, and bug fixes over the original VNC version. See the list of features at http://www.tightvnc.com/intro.php.

    You need TightVNC in order to run headless sanity tests. See the bullet on headless sanity tests for more information.

  • Files Used for Yocto-Autobuilder Configuration:

    • config/autobuilder.conf: Used to set Autobuilder-wide parameters, such as where various build artifacts are published (e.g. DL_DIR and SSTATE_DIR). Another example is whether build artifacts should be published, which is necessary for production Autobuilders but not for desktop builders.

    • buildset-config/yoctoAB.conf: The main Yocto Project Autobuilder configuration file. Documentation for this file and its associated format is in the README-NEW-AUTOBUILDER file.

1.13. Yocto Project Autobuilder Helper Scripts


Deferring this topic per Richard's suggestion. It is placed here temporarily.

The helper scripts work in conjunction with the Yocto Project Autobuilder. These scripts do the actual build configuration and execution for tests on a per release basis.

You can use pre-commit-hook.sh to verify the JSON file before committing it. Create a symbolic link as follows:

     $ ln -s ../../scripts/pre-commit-hook.sh .git/hooks/pre-commit

Most users will have to customize the helper script repository to meet their needs. The repository is located at http://git.yoctoproject.org/clean/cgit.cgi/yocto-autobuilder-helper. The scripts themselves should be more generically reusable. The config.json is less reusable as it represents the Yocto Project Autobuilder test matrix.

Two customization options are possible: 1) variable substitution, and 2) overlaying configuration files. The standard config.json minimally attempts to allow substitution of the paths. The helper script repository includes a local-example.json to show how you could override these from a separate configuration file. Pass the following into the environment of the autobuilder:

     ABHELPER_JSON="config.json local-example.json"

As another example, you could also pass the following into the environment:

     ABHELPER_JSON="config.json /some/location/local.json"
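The shape of such an override file is simply a JSON object whose keys shadow values from config.json, with later files in ABHELPER_JSON taking precedence. The following is a purely hypothetical example; the key names are illustrative, not the actual config.json schema:

```json
{
    "BASE_HOMEDIR": "/home/autobuilder",
    "BASE_SHAREDDIR": "/nas/autobuilder-shared"
}
```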