Merge lp:~linaro-validation/lava-test/connect-training-session-materials into lp:lava-test/0.0

Proposed by Zygmunt Krynicki
Status: Rejected
Rejected by: Neil Williams
Proposed branch: lp:~linaro-validation/lava-test/connect-training-session-materials
Merge into: lp:lava-test/0.0
Diff against target: 293 lines (+289/-0)
1 file modified
INTRO (+289/-0)
To merge this branch: bzr merge lp:~linaro-validation/lava-test/connect-training-session-materials
Reviewer: Linaro Validation Team
Status: Pending
Review via email: mp+109087@code.launchpad.net

Description of the change

This branch adds a few things I wrote for Linaro Connect 12 in Hong Kong; I think they are worth keeping.


Unmerged revisions

154. By Zygmunt Krynicki

Add INTRO for the lava-test training session at Linaro Connect

Preview Diff

=== added file 'INTRO'
--- INTRO 1970-01-01 00:00:00 +0000
+++ INTRO 2012-06-07 08:53:21 +0000
@@ -0,0 +1,289 @@
This session will cover:
* What lava-test is.
* How to install lava-test on a board.
* What tests are available in lava-test, and what they do.
* How to install and execute tests.
* How to submit test results to the dashboard.
* How to find the results in LAVA from a lava-test run.
* How to wrap existing tests to extend lava-test's capabilities.
* What to keep in mind when designing new tests to allow them to run efficiently in lava-test.
* Where to add tests to lava-test that can be shared.
* Where to find the source, overview of development rules for new shared tests.
* How to do private testing.

* What lava-test is.

lava-test is an integration layer between third-party test programs/scripts
and the LAVA infrastructure. Each test wrapper that lava-test can use must
define how to install, execute and qualify the outcome of a test.

Android differs in the way it is integrated into LAVA: lava-android-test does
not share a code base with lava-test at this time. We plan to integrate the
two so that they share APIs and implementation as much as possible.

* How to install lava-test on a board.

There are two ways to install lava-test. You can either use a PPA or install
from source directly. Since lava-test is written in Python it shares the
installation tools from the Python ecosystem. In practice that means
installation from source is very easy and portable to multiple systems (it
works the same on Debian-based systems, Fedora-like systems and Windows).

For the purpose of this presentation we will start with a developer or desktop
image and install from source. We will use the pip tool, which is the current
standard installer for Python modules. We'll install pip from the repository to
get started.

$ sudo apt-get install python-pip

We'll now install lava-test into /usr/local:

$ sudo pip install lava-test

You will find that pip has quirks but is generally comparable in spirit to
apt-get or yum. When we make releases we first publish our source packages
of LAVA to the Python Package Index (also known as PyPI). With pip you should
be able to get the latest version of lava-test at any time.

NOTE: you will find that many, if not all, Python developers use virtualenvs to
manage their installations. Unless you plan to re-image your board and don't
care what gets installed where, it is almost always better to use virtualenv to
manage multiple branches or versions of installed components. For simplicity I
won't cover that here but I'll be using it in the demo.
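
For reference, a minimal virtualenv-based setup might look roughly like this
(assuming the python-virtualenv package is available on your image; the
directory name is just an example):

$ sudo apt-get install python-virtualenv
$ virtualenv ~/lava-sandbox
$ . ~/lava-sandbox/bin/activate
(lava-sandbox)$ pip install lava-test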

* What tests are available in lava-test, and what they do.

lava-test comes preloaded with a few test wrappers. There are commands that
allow you to enumerate tests, install or remove tests, and so on.

$ lava-test list-tests
LAVA: Tests built directly into LAVA Test:
LAVA: - bluetooth-enablement
LAVA: - bootchart
...

Here lava-test lists all the tests it knows about; tests are grouped by the
provider they came from. You can think of providers as wholesale test
merchants. LAVA itself has a built-in provider with the test wrappers it ships
with, as well as two other providers that allow you (end users) to install
additional tests in two different ways. More advanced users can define a new
provider but we won't cover that topic here. All you have to know is that
lava-test, much like the rest of the framework, is extensible and loads
plug-ins to provide content and functionality.
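
The exact set of sub-commands (for installing, removing and running tests)
depends on the version you have installed; the built-in help lists them:

$ lava-test --help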

The LAVA team does not maintain the actual test definitions we ship, which is
somewhat confusing. We accept patches from people who wish to add their test
to the default collection, but we cannot fix bugs or run them each time they
are proposed (we only look at the integration side of the code).

Tests are documented on our documentation page; test maintainers are encouraged
to provide a description for their test wrappers to help others understand how
to use them effectively. LAVA has documentation that is maintained in the
source tree but built for proper presentation by a third-party service called
readthedocs.org:

http://lava-test.readthedocs.org/en/latest/tests.html

* How to install and execute tests.

To install a test, simply issue the command:

$ lava-test install xxx

where xxx is the name of the test you wish to install. This step will refer
to the data or code defined in the test wrapper. Most tests use the default
implementation and simply provide a few shell commands to install some
packages from the archive, download and compile a simple module, or something
similar. Advanced users can freely customize this step to perform any action
necessary.

LAVA keeps track of the installed tests by creating a directory tree in
~/.local/share/lava_test/installed-tests/xxx, with xxx replaced by the test
name.
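
As a concrete example, using the 'stream' test that also appears in the parsing
examples later in this document, the basic install-and-run cycle is:

$ lava-test install stream
$ lava-test run stream

Each run produces a result identified by a timestamped name (such as
stream.2012-05-29T04:54:02Z in the parse examples below).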

* How to submit test results to the dashboard.

This step is a little out of order, as we have not yet covered how to produce
test results that the dashboard would understand. Still, it's important to
discuss how individual developers submit individual test results versus how
automated testing (using the LAVA scheduler) submits them.

Technically both approaches do the same thing. They use the XML-RPC API of the
LAVA dashboard and call the put() or put_ex() methods to store the test result
bundle in a particular bundle stream.

If you are experimenting in your local environment (typically while working on
a new test wrapper) you will be running a test multiple times, adjusting the
parser logic or test code, and submitting the result to the dashboard for
analysis. In that cycle you will be using lava-test to run your test code and
lava-dashboard-tool to send the test result bundle. The latter tool is simply a
command-line front-end to all of the dashboard XML-RPC functions. It comes with
built-in documentation (just run it with --help to get started) and is pretty
easy to use.

Caveat: you will need to create a bundle stream on the dashboard first. You
can do that from the command line by running:

$ lava-dashboard-tool make-stream /personal/zyga/demo/

You will want to set DASHBOARD_URL _prior_ to running this command; otherwise
you will have to keep passing the --dashboard-url=... option over and over.

You will _have_ to create an authentication token in the LAVA web interface
and associate that token with your machine for the above command to work.
LAVA uses those tokens to authenticate your requests.

(shows how to navigate to the authentication page and generate a token)

To save your token on your machine, simply run this command:

$ lava-dashboard-tool auth-add https://zkrynicki@validation.linaro.org/lava-server/RPC2/

It is _essential_ to use https for security (otherwise your token is
transmitted in plain text). It is also essential to use your username in the
URL so that LAVA knows how to match the token to your account. Lastly, it is
essential to use the proper XML-RPC endpoint, including the trailing slash.

Once the auth-add command finishes successfully you will be able to
authenticate all your remote requests. You will also be able to submit test
results to private bundle streams.

So, having used the auth-add command, I would set my DASHBOARD_URL like this
(I have something like that in my shell configuration files):

export DASHBOARD_URL=https://zkrynicki@validation.linaro.org/lava-server/RPC2/
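
With the token saved and DASHBOARD_URL exported, submitting a locally produced
bundle boils down to calling the dashboard's put method through
lava-dashboard-tool. A rough sketch (the bundle file name is hypothetical and
the exact arguments may differ between versions, so check --help first):

$ lava-dashboard-tool put my-bundle.json /personal/zyga/demo/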

* How to find the results in LAVA from a lava-test run.

Each time you submit a bundle you will get a bundle SHA1 identifier (which is
just the SHA1 of the entire text of the bundle). You can append that SHA1 to
a permalink to go directly to that bundle.

Here is an example permalink I pulled from our production instance:

http://validation.linaro.org/lava-server/dashboard/permalink/bundle/f536c5ce3b0d979be7cdc5934e27e248beb66dff/

You can also navigate to the bundle stream in the web UI and search for the
bundle ID directly.

* How to wrap existing tests to extend lava-test's capabilities.

This is a pretty simple process, but you need to know a few things to be
efficient.

First, write down a short description of your test and the requirements it
places on the test environment. That will be useful to add to the
documentation later on.

Write down the installation instructions as a series of shell commands. You
will need to pass them as an argument to TestInstaller or AndroidTestInstaller.

Now install your test and write down the run instructions; you will need them
for the TestRunner or AndroidTestRunner. Run your test and save the output to a
file.

Now you should look at the output and figure out which parts you want to
capture and how to capture them. Typically you will try to write a regular
expression that describes the text you want to grab. At that point you should
make a decision: you can either use the built-in test parsers (TestParser or
AndroidTestParser) and tweak the output to be easier to parse, or write your
own custom parser in Python. Starting with line-oriented, record-like output is
usually best, as it is trivial to parse. Once this is done it's time to put
your test together. Don't worry, it's not going to work initially.
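
A quick way to iterate on the pattern before wiring it into a parser is to try
a rough equivalent against the saved output with grep (the file name below is
whatever you saved the output to; the real pattern will use Python regular
expression syntax, including named groups, rather than grep's ERE):

$ grep -E '^[a-z0-9.-]+: (pass|fail|[0-9.]+)$' test-output.log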

You will need to create a 'testobj' that glues your test installer, runner and
parser together. Now add this test to the list of tests in the project you are
working on (either a built-in test in lava-test / lava-android-test or just
another test in your custom test project). You may need to regenerate
'egg-info', as this is where the data is stored for Python to find. Check that
you can see your test with lava-test list-tests. If everything is fine, install
your test and run it once; this will allow you to ensure your installation and
run instructions are correct.
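
Regenerating the metadata is straightforward if the project uses a standard
setuptools setup.py (a reasonable assumption for lava-test and
lava-android-test); from the project's source tree run:

$ python setup.py egg_info
$ lava-test list-tests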

Now we're ready to work on the regular expression for the default parsers, or
on your full-blown custom parser. We'll be continuously changing the test
definition (the patterns) and running a command like this:

$ lava-test parse stream stream.2012-05-29T04:54:02Z -o foo.json

Then we'll be looking at foo.json to check if our parser grabbed the data we
care about. Typically the file is huge because it contains the full log file
(the entire output of the test) and the environment context (hardware and
software description). To make it easier to work with, we'll re-run the test
but pass two extra options to it.

$ lava-test run stream -SH

The -S option (or --skip-software-context) will remove the verbose list of
packages that were installed. The -H option (or --skip-hardware-context) will
remove the much shorter, but still unneeded at this stage, list of devices and
SoC information.

Make sure you note the result id of the second run. Now re-run parse with that
result and pass the -A (or --skip-attachments) option.

$ lava-test parse stream stream.2012-05-29T05:06:59Z -o foo.json -A

Now foo.json is small and tidy and you can instantly see all the data you
actually care about.

* What to keep in mind when designing new tests to allow them to run efficiently in lava-test.

There are no strong requirements, but the things I would think about are:

1) Try not to destroy the machine you are running on; this helps with
development on a laptop and away from an ARM development board. If you need to
remove everything in /home or do something equally destructive for any reason,
consider adding a safety argument like --enable-destructive-actions that, if
left unspecified, will just abort the process with an appropriate error
message (see the sketch after this list).

2) Try to have sane output; ease of parsing is paramount. Remember that the
goal is to capture a test case name (a string) together with either one of the
pass/fail keywords or a decimal number. Don't put multiple 'results' on one
line. Don't use free-form test case identifiers; DNS-like names work best
(lowercase ASCII letters, digits, dots and dashes). See the example output
after this list.
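
As a sketch of point 1, a destructive test script could guard itself like this
(the option name just mirrors the example above; how you pass it through your
test's run command is up to you):

#!/bin/sh
# Refuse to do anything destructive unless the caller explicitly opts in.
if [ "$1" != "--enable-destructive-actions" ]; then
    echo "refusing to run: pass --enable-destructive-actions to confirm" >&2
    exit 1
fi
# ... destructive test steps go here ...

For point 2, output along these lines is trivial for a line-oriented parser to
pick up (the names and numbers are made up):

memcpy-1k: pass
memcpy-64k: fail
memory-bandwidth.read: 1253.7
memory-bandwidth.write: 1102.4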

* Where to add tests to lava-test that can be shared.

You have a few options; the 'correct' choice depends on the nature of those
tests and the kind of development cycle and maintainership you would like to
see.

1) You can submit them to lava-test or lava-android-test directly. Here they
should be able to work without special, rare hardware or peripherals. Ideally
they would work on our Ubuntu or Android images. Typically each merge request
will take a few days to review and accept.

2) You can create your own open-source project and keep your tests there. This
use case makes sense if you plan on providing many tests and want to be free to
change the code at any time without waiting for the LAVA team to review your
changes. I would hope that dedicated testing projects, focused around specific
topics, would emerge, but this has not happened as of this time.

3) If your test is very simple in terms of the wrapper code, you can also use a
declarative test wrapper and simply publish a single .json file somewhere for
everyone to use.
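
If you publish such a file, the end user only needs to point lava-test at it.
This is typically done with a register-test style sub-command; double-check
the exact name with lava-test --help on your version, and note that the URL
below is made up:

$ lava-test register-test http://example.com/tests/my-test.json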

* Where to find the source, overview of development rules for new shared tests.

It's always worth having a look at the LAVA documentation at
http://lava.readthedocs.org/ which has links for lava-test and
lava-android-test. If in doubt, feel free to ask a question on the mailing
list (linaro-validation@lists.linaro.org) or on IRC in #linaro-lava.

Usually it's easier to start with an existing test wrapper, copy it, and change
the things that are relevant to you. Typically a wrapper is very short unless
it has to change the parser (because the default parser we provide is
unsuitable for the particular output that the test program generates).

* How to do private testing.

You need to follow three rules:

1) Use https to access your LAVA instance.
2) Create a token and authenticate your requests with that token.
3) Send your results to private bundle streams (or instruct your jobs to send
results to a private bundle stream); see the example at the end of this
section.

This way you will ensure that:

1) Nobody will eavesdrop on your password.
2) Nobody will eavesdrop on your test results.
3) Unauthorized users will not be able to see the details of your test jobs in
LAVA.
4) Unauthorized users will not be able to see the test results in LAVA.
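
Putting the three rules together, a private setup could look roughly like this.
The exact pathname scheme for private bundle streams can vary between dashboard
versions, so treat the stream name below as an assumption and check your
instance:

$ export DASHBOARD_URL=https://zkrynicki@validation.linaro.org/lava-server/RPC2/
$ lava-dashboard-tool auth-add https://zkrynicki@validation.linaro.org/lava-server/RPC2/
$ lava-dashboard-tool make-stream /private/personal/zyga/demo/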
