Merge lp:~linaro-validation/lava-test/connect-training-session-materials into lp:lava-test/0.0
- connect-training-session-materials
- Merge into trunk
Proposed by
Zygmunt Krynicki
Status: | Rejected |
---|---|
Rejected by: | Neil Williams |
Proposed branch: | lp:~linaro-validation/lava-test/connect-training-session-materials |
Merge into: | lp:lava-test/0.0 |
Diff against target: |
293 lines (+289/-0) 1 file modified
INTRO (+289/-0) |
To merge this branch: | bzr merge lp:~linaro-validation/lava-test/connect-training-session-materials |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Linaro Validation Team | Pending | ||
Review via email: mp+109087@code.launchpad.net |
Commit message
Description of the change
This branch adds a few things I wrote for Linaro Connect 12 in Hong Kong; I think they are worth keeping.
Unmerged revisions
- 154. By Zygmunt Krynicki
-
Add INTRO for the lava-test training session at Linaro Connect
Preview Diff
1 | === added file 'INTRO' |
2 | --- INTRO 1970-01-01 00:00:00 +0000 |
3 | +++ INTRO 2012-06-07 08:53:21 +0000 |
4 | @@ -0,0 +1,289 @@ |
5 | +This session will cover: |
6 | +* What lava-test is. |
7 | +* How to install lava-test on a board |
8 | +* What tests are available in lava-test, and what they do. |
9 | +* How to install and execute tests. |
10 | +* How to submit test results to the dashboard. |
11 | +* How to find the results in LAVA from a lava-test run. |
12 | +* How to wrap existing tests to extend lava-test's capabilities |
13 | +* What to keep in mind when designing new tests to allow them to run efficiently in lava-test |
14 | +* Where to add tests to lava-test that can be shared. |
15 | +* Where to find the source, overview of development rules for new shared tests. |
16 | +* how to do private testing |
17 | + |
18 | +* What lava-test is. |
19 | + |
20 | +LAVA Test is an integration layer between third-party test programs / scripts |
21 | +and the LAVA infrastructure. Each test wrapper that lava-test can use must |
22 | +define how to install, execute and qualify the outcome of a test. |
23 | + |
24 | +Android also differs in the way it is integrated into LAVA; it does not |
25 | +currently share a code base with lava-test. We plan on integrating both so |
26 | +that they share APIs and implementation as much as possible. |
27 | + |
28 | +* How to install lava-test on a board |
29 | + |
30 | +There are two ways to install lava-test. You can either use a PPA or install |
31 | +from source directly. Since lava-test is written in Python it shares the |
32 | +installation tools of the Python ecosystem. In practice that means installation |
33 | +from source is very easy and portable to multiple systems (it works the same on |
34 | +Debian-based systems, Fedora-like systems and Windows). |
35 | + |
36 | +For the purpose of this presentation we will start with a developer or desktop |
37 | +image and install from source. We will use the pip tool, which is the current |
38 | +standard installer for Python modules. We'll install pip from the repository to |
39 | +get started. |
40 | + |
41 | +$ sudo apt-get install python-pip |
42 | + |
43 | +We'll now install lava-test into /usr/local |
44 | + |
45 | +$ sudo pip install lava-test |
46 | + |
47 | +You will find that pip has quirks but is generally comparable in spirit to |
48 | +apt-get or yum. When we make releases we first publish our source packages |
49 | +of lava to the Python Package Index (also known as PyPI). With pip you should |
50 | +be able to get the latest version of lava-test at any time. |
51 | + |
52 | +NOTE: you will find that many, if not all, Python developers use virtualenvs to |
53 | +manage their installations. Unless you plan to re-image your board and don't |
54 | +care about what gets installed where, it's almost always better to use |
55 | +virtualenv to manage multiple branches or versions of installed components. For |
56 | +simplicity I won't cover that here but I'll be using it in the demo. |
57 | + |
58 | +* What tests are available in lava-test, and what they do. |
59 | + |
60 | +LAVA test comes preloaded with a few test wrappers. There are commands |
61 | +that allow you to enumerate tests, install or remove tests and so on. |
62 | + |
63 | +$ lava-test list-tests |
64 | +LAVA: Tests built directly into LAVA Test: |
65 | +LAVA: - bluetooth-enablement |
66 | +LAVA: - bootchart |
67 | +... |
68 | + |
69 | +Here lava lists all the tests it knows about; tests are grouped by the provider |
70 | +they came from. You can think of providers as wholesale test merchants. LAVA |
71 | +itself has a built-in provider with the test wrappers it ships with, as well as |
72 | +two other providers that allow you (end users) to install additional tests in |
73 | +two different ways. More advanced users can define a new provider but we won't |
74 | +cover that topic here. All you have to know is that lava-test, much like the |
75 | +rest of the framework, is extensible and loads plug-ins to provide content and |
76 | +functionality. |
77 | + |
78 | +The LAVA team does not maintain the actual test definitions we ship, which is |
79 | +somewhat confusing. We accept patches from people who wish to add their test |
80 | +to the default collection but we cannot fix bugs or run them each time they are |
81 | +proposed (we only look at the integration side of the code). |
82 | + |
83 | +Tests are documented on our documentation page; test maintainers are encouraged |
84 | +to provide a description for their test wrappers to help others understand how |
85 | +to use them effectively. LAVA has documentation that is maintained in the |
86 | +source tree but built for proper presentation by a third-party service called |
87 | +readthedocs.org: |
88 | + |
89 | +http://lava-test.readthedocs.org/en/latest/tests.html |
90 | + |
91 | +* How to install and execute tests. |
92 | + |
93 | +To install a test simply issue the command |
94 | + |
95 | +$ lava-test install xxx |
96 | + |
97 | +Where xxx is the name of the test you wish to install. This step will refer |
98 | +to the data or code defined in the test wrapper. Typically most tests use the |
99 | +default implementation and simply provide a few shell commands to install some |
100 | +packages from the archive, download and compile a simple module or something |
101 | +similar. Advanced users can freely customize this step to perform any action |
102 | +necessary. |
103 | + |
104 | +LAVA keeps track of the installed tests by creating a directory tree in |
105 | +~/.local/share/lava_test/installed-tests/xxx with xxx being replaced by the |
106 | +test name. |
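The directory layout described above can be inspected directly. A minimal sketch, assuming only what the text states (each installed test gets its own sub-directory under that path):

```python
import os

# Where lava-test records installed tests, per the description above.
INSTALLED_TESTS_DIR = os.path.expanduser(
    "~/.local/share/lava_test/installed-tests")

def list_installed_tests(base=INSTALLED_TESTS_DIR):
    """Return installed test names, one per sub-directory; [] if none."""
    if not os.path.isdir(base):
        return []
    return sorted(os.listdir(base))
```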
107 | + |
108 | +* How to submit test results to the dashboard. |
109 | + |
110 | +This step is a little out of order as we have not yet covered how to produce |
111 | +test results that the dashboard would understand. Still, it's important to |
112 | +discuss how individual developers will submit individual test results versus |
113 | +how automated testing (using lava scheduler) submits them. |
114 | + |
115 | +Technically both approaches do the same thing. They use the XML-RPC API of the |
116 | +lava dashboard and call the put() or put_ex() methods to store the test result |
117 | +bundle in a particular bundle stream. |
118 | + |
119 | +If you are experimenting in your local environment (typically while working on |
120 | +a new test wrapper) you will be running a test multiple times, adjusting the |
121 | +parser logic or test code, and submitting the result to the dashboard for |
122 | +analysis. In that cycle you will be using lava-test to run your test code and |
123 | +lava-dashboard-tool to send the test result bundle. The latter tool is simply a |
124 | +command line front-end to all of the dashboard XML-RPC functions. It comes with |
125 | +built-in documentation (just run it with --help to get started) and is pretty |
126 | +easy to use. |
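A rough sketch of what such a submission looks like in code. The put() signature used below (bundle content, filename, stream pathname) is an assumption for illustration, not taken from the dashboard API docs, and the 2012-era tools used Python 2's xmlrpclib rather than xmlrpc.client:

```python
import json
import xmlrpc.client  # xmlrpclib in the Python 2 of this era

def load_bundle(path):
    """Read a result bundle, raising early if it is not valid JSON."""
    with open(path) as stream:
        content = stream.read()
    json.loads(content)  # a bundle is a JSON document
    return content

def submit_bundle(dashboard_url, bundle_path, stream_pathname):
    """Send one bundle to a bundle stream over the dashboard XML-RPC API.

    The put() argument order below is assumed, not documented here.
    """
    content = load_bundle(bundle_path)
    server = xmlrpc.client.ServerProxy(dashboard_url)
    return server.put(content, bundle_path, stream_pathname)
```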
127 | + |
128 | +Caveat: you will need to create a bundle stream on the dashboard first. You |
129 | +can do that from the command line by running: |
130 | + |
131 | +$ lava-dashboard-tool make-stream /personal/zyga/demo/ |
132 | + |
133 | +You will want to set DASHBOARD_URL _prior_ to running this command, otherwise |
134 | +you will have to keep passing the --dashboard-url=... option over and over. |
135 | + |
136 | +You will _have_ to create an authentication token in the lava web interface |
137 | +and associate that token with your machine for the above command to work. |
138 | +LAVA uses those tokens to authenticate your requests. |
139 | + |
140 | +(shows how to navigate to the authentication page and generate a token) |
141 | + |
142 | +To save your token on your machine, simply run this command: |
143 | + |
144 | +$ lava-dashboard-tool auth-add https://zkrynicki@validation.linaro.org/lava-server/RPC2/ |
145 | + |
146 | +It is _essential_ to use https for security (otherwise your token is |
147 | +transmitted in plain text). It is also essential to use your username in the |
148 | +URL so that lava knows how to match the token to your account. Lastly it is |
149 | +essential to use the proper XML-RPC endpoint, including the trailing slash. |
150 | + |
151 | +Once the auth-add command finishes successfully you will be able to |
152 | +authenticate all your remote requests. You will be able to submit test results |
153 | +to private bundle streams as well. |
154 | + |
155 | +So having used the auth-add command I would set my DASHBOARD_URL to: |
156 | +export DASHBOARD_URL=https://zkrynicki@validation.linaro.org/lava-server/RPC2/ |
157 | +(I have something like that in my shell configuration files.) |
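The precedence implied above (an explicit --dashboard-url option wins, otherwise DASHBOARD_URL is read from the environment) can be sketched like this; how lava-dashboard-tool actually resolves it internally is an assumption on my part:

```python
import os

def resolve_dashboard_url(cli_value=None, environ=None):
    """Explicit --dashboard-url wins; fall back to DASHBOARD_URL.

    Returns None when neither is set (nothing to talk to).
    """
    if environ is None:
        environ = os.environ
    if cli_value:
        return cli_value
    return environ.get("DASHBOARD_URL")
```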
158 | + |
159 | +* How to find the results in LAVA from a lava-test run. |
160 | + |
161 | +Each time you submit a bundle you will get a bundle SHA1 identifier (which is |
162 | +just the SHA1 of the entire text of the bundle). You can append that SHA1 to |
163 | +a permalink to go directly to that bundle. |
164 | + |
165 | +Here is an example permalink I pulled from our production instance: |
166 | + |
167 | +http://validation.linaro.org/lava-server/dashboard/permalink/bundle/f536c5ce3b0d979be7cdc5934e27e248beb66dff/ |
168 | + |
169 | +You can also navigate to the bundle stream in the web UI and search for the |
170 | +bundle ID directly. |
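Since the bundle ID is just the SHA1 of the entire bundle text, the permalink can be computed locally before (or without) looking it up in the web UI. A small sketch, with the prefix taken from the example URL above:

```python
import hashlib

# Permalink prefix taken from the example above.
PERMALINK_PREFIX = (
    "http://validation.linaro.org/lava-server/dashboard/permalink/bundle/")

def bundle_permalink(bundle_text, prefix=PERMALINK_PREFIX):
    """Build a bundle permalink from the SHA1 of the entire bundle text."""
    sha1 = hashlib.sha1(bundle_text.encode("utf-8")).hexdigest()
    return prefix + sha1 + "/"
```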
171 | + |
172 | +* How to wrap existing tests to extend lava-test's capabilities |
173 | + |
174 | +So this is a pretty simple process but you need to know a few things to be efficient. |
175 | + |
176 | +First, write down a short description of your test and the requirements it has |
177 | +from the test environment. That will be useful to add to the documentation later on. |
178 | + |
179 | +Write down the installation instructions as a series of shell commands. You will need to |
180 | +pass them as an argument to TestInstaller or AndroidTestInstaller. |
181 | + |
182 | +Now install your test and write down the run instructions; you will need them |
183 | +for the TestRunner or AndroidTestRunner. Run your test and save the output to a file. |
184 | + |
185 | +Now you should look at the output and figure out which parts you want to |
186 | +capture and how to capture them. Typically you will try to write a regular |
187 | +expression that describes the text you want to grab. At that point you should |
188 | +make a decision: you can either use the built-in test parsers (TestParser or |
189 | +AndroidTestParser) and tweak the output to be easier to parse, or write your |
190 | +own custom parser in Python. Usually starting with line-oriented, record-like |
191 | +output is best as it's trivial to parse. Once this is done it's time to put |
192 | +your test together. Don't worry, it's not going to work initially. |
193 | + |
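A custom line-oriented parser of the kind described above can be as small as one regular expression with named groups. The sample output format and test names below are hypothetical; a real wrapper would hand a pattern like this to TestParser, or run code like this as its custom parser:

```python
import re

# Hypothetical record-like output, one result per line, e.g.:
#   test_case_id: memcpy-bandwidth result: pass
PATTERN = re.compile(
    r"^test_case_id:\s*(?P<test_case_id>[a-z0-9.-]+)\s+"
    r"result:\s*(?P<result>pass|fail|skip|unknown)$")

def parse_output(text):
    """Collect one dict per matching line, ignoring all other output."""
    results = []
    for line in text.splitlines():
        match = PATTERN.match(line.strip())
        if match:
            results.append(match.groupdict())
    return results
```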
194 | +You will need to create a 'testobj' that glues your test installer, runner and |
195 | +parser together. Now add this test to the list of tests in the project you are |
196 | +working on (either a built-in test in lava-test/lava-android-test or just another |
197 | +test in your custom test project). You may need to regenerate 'egg-info' as |
198 | +this is where the data is stored for Python to find. Check that you can |
199 | +see your test with lava-test list-tests. If everything is fine, install your |
200 | +test and run it once; this will allow you to ensure your installation and run |
201 | +instructions are correct. |
202 | + |
203 | +Now we're ready to work on the regular expression for the default parsers or on |
204 | +your full-blown custom parser. We'll be continuously changing the test |
205 | +definition (the patterns) and running a command like this: |
206 | + |
207 | +$ lava-test parse stream stream.2012-05-29T04:54:02Z -o foo.json |
208 | + |
209 | +Then we'll be looking at foo.json to check if our parser grabbed the data we |
210 | +care about. Typically the file is huge because it contains the full log file |
211 | +(the entire output of the test) and the environment context (hardware and |
212 | +software description). To make it easier to work with we'll re-run the test but pass |
213 | +two extra options to it. |
214 | + |
215 | +$ lava-test run stream -SH |
216 | + |
217 | +The -S option, or --skip-software-context, will remove the verbose list of |
218 | +packages that were installed. The -H option, or --skip-hardware-context, will |
219 | +remove the much shorter, but still unneeded at this stage, list of devices and |
220 | +SoC information. |
221 | + |
222 | +Make sure you note the result id of the second run. Now re-run parse with that |
223 | +result and pass the -A (or --skip-attachments) option. |
224 | + |
225 | +$ lava-test parse stream stream.2012-05-29T05:06:59Z -o foo.json -A |
226 | + |
227 | +Now foo.json is small and tidy and you can instantly see all the data you |
228 | +actually care about. |
229 | + |
230 | +* What to keep in mind when designing new tests to allow them to run efficiently in lava-test |
231 | + |
232 | +There are no strong requirements but the things I would think about are: |
233 | + |
234 | +1) Try not to destroy the machine you are running on; this helps with |
235 | +development on a laptop and away from an ARM development board. If you need to |
236 | +remove everything in /home or do something equally destructive for any reason, |
237 | +then consider adding a safety argument like --enable-destructive-actions that, |
238 | +if left unspecified, will just abort the process with an appropriate error |
239 | +message. |
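The safety switch in point 1 can be sketched with argparse. The flag name comes from the text above; everything else is illustrative:

```python
import argparse

def make_parser():
    parser = argparse.ArgumentParser(
        description="hypothetical destructive test wrapper")
    # Destructive steps run only when the user explicitly opts in.
    parser.add_argument("--enable-destructive-actions", action="store_true",
                        help="really perform destructive setup steps")
    return parser

def main(argv):
    args = make_parser().parse_args(argv)
    if not args.enable_destructive_actions:
        print("refusing to run: pass --enable-destructive-actions to confirm")
        return 1
    # ... destructive setup and the actual test would go here ...
    return 0
```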
240 | + |
241 | +2) Try to have sane output; ease of parsing is paramount. Remember that the |
242 | +goal is to capture a test case name (a string) together with either one of the |
243 | +pass/fail keywords or a decimal number. Don't put multiple 'results' on one |
244 | +line. Don't use free-form test case identifiers; DNS-like names work best |
245 | +(lowercase ASCII letters, digits, dots and dashes). |
246 | + |
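The identifier convention in point 2 is easy to enforce mechanically. A sketch, with the character set taken from the text above:

```python
import re

# DNS-like test case ids: lowercase ASCII letters, digits, dots and dashes.
TEST_CASE_ID = re.compile(r"^[a-z0-9][a-z0-9.-]*$")

def is_valid_test_case_id(name):
    """True for ids like 'stream.copy-bandwidth', False for free-form text."""
    return bool(TEST_CASE_ID.match(name))
```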
247 | +* Where to add tests to lava-test that can be shared. |
248 | + |
249 | +You have a few options; the 'correct' choice depends on the nature of those |
250 | +tests and the kind of development cycle and maintainership you would like to |
251 | +see. |
252 | + |
253 | +1) You can submit them to lava-test or lava-android-test directly. Here they |
254 | +should be able to work without special or rare hardware or peripherals. Ideally |
255 | +they would work on our Ubuntu or Android images. Typically each merge request |
256 | +will take a few days to review and accept. |
257 | + |
258 | +2) You can create your own open-source project and keep your tests there. This |
259 | +use case makes sense if you plan on providing many tests and want to be free to |
260 | +change the code at any time without waiting for the lava team to review your |
261 | +changes. I would hope that dedicated testing projects, focused around specific |
262 | +topics, would emerge, but this has not happened yet. |
263 | + |
264 | +3) If your test is very simple in terms of the wrapper code you can also use a |
265 | +declarative test wrapper and simply publish a single .json file somewhere for |
266 | +everyone to use. |
267 | + |
268 | +* Where to find the source, overview of development rules for new shared tests. |
269 | + |
270 | +It's always worth having a look at the lava documentation at |
271 | +http://lava.readthedocs.org/ which has links for lava-test and |
272 | +lava-android-test. If in doubt, feel free to ask a question on the mailing |
273 | +list (linaro-validation@lists.linaro.org) or on IRC in #linaro-lava. |
274 | + |
275 | +Usually it's easier to start with an existing test wrapper, copy it and change |
276 | +the things that are relevant to you. Typically a wrapper is very short unless |
277 | +it has to change the parser (because the default parser we provide is |
278 | +unsuitable for the particular output that the test program generates). |
279 | + |
280 | +* how to do private testing |
281 | + |
282 | +You need to follow three rules: |
283 | + |
284 | +1) Use https to access your lava instance |
285 | +2) Create a token and authenticate your requests with that token |
286 | +3) Send your results to private bundle streams (or instruct your jobs to send results to a private bundle stream) |
287 | + |
288 | +This way you will ensure that: |
289 | + |
290 | +1) nobody will eavesdrop on your password |
291 | +2) nobody will eavesdrop on test results |
292 | +3) unauthorized users will not be able to see the details of your test job in lava |
293 | +4) unauthorized users will not be able to see the test results in lava |