Merge lp:~linaro-validation/lava-test/connect-training-session-materials into lp:lava-test/0.0

Proposed by: Zygmunt Krynicki
Status: Rejected
Rejected by: Neil Williams
Proposed branch: lp:~linaro-validation/lava-test/connect-training-session-materials
Merge into: lp:lava-test/0.0
Diff against target: 293 lines (+289/-0), 1 file modified (INTRO +289/-0)
To merge this branch: bzr merge lp:~linaro-validation/lava-test/connect-training-session-materials
Reviewer: Linaro Validation Team (status: Pending)
Review via email: mp+109087@code.launchpad.net
Commit message
Description of the change
This branch adds a few things I wrote for Linaro Connect 12 in Hong Kong; I think they are worth keeping.
Unmerged revisions

154. By Zygmunt Krynicki

    Add INTRO for the lava-test training session at Linaro Connect
Preview Diff
=== added file 'INTRO'
--- INTRO	1970-01-01 00:00:00 +0000
+++ INTRO	2012-06-07 08:53:21 +0000
@@ -0,0 +1,289 @@
This session will cover:
* What lava-test is.
* How to install lava-test on a board.
* What tests are available in lava-test, and what they do.
* How to install and execute tests.
* How to submit test results to the dashboard.
* How to find the results in LAVA from a lava-test run.
* How to wrap existing tests to extend lava-test's capabilities.
* What to keep in mind when designing new tests to allow them to run efficiently in lava-test.
* Where to add tests to lava-test that can be shared.
* Where to find the source, overview of development rules for new shared tests.
* How to do private testing.

* What lava-test is.

lava-test is an integration layer between third-party test programs/scripts
and the LAVA infrastructure. Each test wrapper that lava-test can use must
define how to install, execute and qualify the outcome of a test.

Android testing also differs in the way it is integrated into LAVA: it does
not share a code base with lava-test at this time. We plan on integrating
both so that they share APIs and implementation as much as possible.

* How to install lava-test on a board

There are two ways to install lava-test: you can either use a PPA or
install from source directly. Since lava-test is written in Python, it
shares the installation tools of the Python ecosystem. In practice that
means installation from source is very easy and portable to multiple
systems (it works the same on Debian-based systems, Fedora-like systems
and Windows).

For the purpose of this presentation we will start with a developer or
desktop image and install from source. We will use the pip tool, which is
the current standard installer for Python modules. We'll install pip from
the repository to get started.

$ sudo apt-get install python-pip

We'll now install lava-test into /usr/local:

$ sudo pip install lava-test

You will find that pip has quirks but generally is comparable in spirit to
apt-get or yum. When we make releases we first publish our source packages
of lava to the Python Package Index (also known as PyPI). With pip you
should be able to get the latest version of lava-test at any time.

NOTE: you will find that many, if not all, Python developers use
virtualenvs to manage their installations. Unless you plan to re-image your
board and don't care about what gets installed where, it's almost always
better to use virtualenv to manage multiple branches or versions of
installed components. For simplicity I won't cover that here, but I'll be
using it in the demo.
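
For reference, a minimal virtualenv-based installation looks roughly like
this (the sandbox directory name is just an example):

$ sudo apt-get install python-virtualenv
$ virtualenv ~/lava-sandbox
$ . ~/lava-sandbox/bin/activate
$ pip install lava-test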

* What tests are available in lava-test, and what they do.

lava-test comes preloaded with a few test wrappers. There are commands
that allow you to enumerate tests, install or remove tests and so on.

$ lava-test list-tests
LAVA: Tests built directly into LAVA Test:
LAVA: - bluetooth-enablement
LAVA: - bootchart
...

Here lava-test lists all the tests it knows about; tests are grouped by the
provider they came from. You can think of providers as wholesale test
merchants. LAVA itself has a built-in provider with the test wrappers it
ships with, as well as two other providers that allow you (end users) to
install additional tests in two different ways. More advanced users can
define a new provider, but we won't cover that topic here. All you have to
know is that lava-test, much like the rest of the framework, is extensible
and loads plug-ins to provide content and functionality.

The LAVA team does not maintain the actual test definitions we ship, which
is somewhat confusing. We accept patches from people who wish to add their
tests to the default collection, but we cannot fix bugs in them or run them
each time they are proposed (we only look at the integration side of the
code).

Tests are documented on our documentation page; test maintainers are
encouraged to provide a description for their test wrappers to help others
understand how to use them effectively. LAVA has documentation that is
maintained in the source tree but built for proper presentation by a
third-party service called readthedocs.org:

http://lava-test.readthedocs.org/en/latest/tests.html

* How to install and execute tests.

To install a test, simply issue the command:

$ lava-test install xxx

where xxx is the name of the test you wish to install. This step will refer
to the data or code defined in the test wrapper. Typically most tests use
the default implementation and simply provide a few shell commands to
install some packages from the archive, download and compile a simple
module, or something similar. Advanced users can freely customize this step
to perform any action necessary.

LAVA keeps track of the installed tests by creating a directory tree in
~/.local/share/lava_test/installed-tests/xxx, with xxx being replaced by
the test name.
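
Once a test is installed, executing it is a second command; using the
stream test (which also appears in the parse examples later in this
document), the whole cycle looks like this:

$ lava-test install stream
$ lava-test run stream

Each run is recorded under a result id such as stream.2012-05-29T04:54:02Z;
note it down, as the parse examples further down take it as an argument.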

* How to submit test results to the dashboard.

This step is a little out of order, as we have not yet covered how to
produce test results that the dashboard would understand. Still, it's
important to discuss how individual developers will submit individual test
results versus how automated testing (using the lava scheduler) submits
them.

Technically both approaches do the same thing. They use the XML-RPC API of
the lava dashboard and call the put() or put_ex() method to store the test
result bundle in a particular bundle stream.
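
Under the hood this is plain XML-RPC. A rough Python 2 sketch of what a
submission amounts to follows; the put() parameter list is quoted from
memory, so treat the API documentation on your dashboard instance as
authoritative:

import xmlrpclib

server = xmlrpclib.ServerProxy(
    'https://validation.linaro.org/lava-server/RPC2/')
with open('bundle.json') as f:
    content = f.read()
# put(content, content_filename, pathname) stores the bundle in the given
# bundle stream and returns the bundle's SHA1 (see the permalink section
# below).
print server.put(content, 'bundle.json', '/personal/zyga/demo/')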

If you are experimenting in your local environment (typically while working
on a new test wrapper) you will be running a test multiple times, adjusting
the parser logic or test code, and submitting the result to the dashboard
for analysis. In that cycle you will be using lava-test to run your test
code and lava-dashboard-tool to send the test result bundle. The latter
tool is simply a command-line front-end to all of the dashboard XML-RPC
functions. It comes with built-in documentation (just run it with --help to
get started) and is pretty easy to use.
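
A submission from that cycle might look like this, with bundle.json being a
bundle saved by lava-test (the put sub-command name mirrors the XML-RPC
method, but I am quoting the argument order from memory; run
lava-dashboard-tool put --help for the authoritative synopsis):

$ lava-dashboard-tool put bundle.json /personal/zyga/demo/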

Caveat: you will need to create a bundle stream on the dashboard first. You
can do that from the command line by running:

$ lava-dashboard-tool make-stream /personal/zyga/demo/

You will want to set DASHBOARD_URL _prior_ to running this command;
otherwise you will have to keep passing the --dashboard-url=... option over
and over.

You will _have_ to create an authentication token in the lava web interface
and associate that token with your machine for the above command to work.
LAVA uses those tokens to authenticate your requests.

(shows how to navigate to the authentication page and generate a token)

To save your token on your machine, simply run this command:

$ lava-dashboard-tool auth-add https://zkrynicki@validation.linaro.org/lava-server/RPC2/

It is _essential_ to use https for security (otherwise your token is
transmitted in plain text). It is also essential to use your username in
the URL so that lava knows how to match the token to your account. Lastly,
it is essential to use the proper XML-RPC endpoint, including the trailing
slash.

Once the auth-add command finishes successfully you will be able to
authenticate all your remote requests. You will also be able to submit test
results to private bundle streams.

So, having used the auth-add command, I would set my DASHBOARD_URL like
this (I have something like that in my shell configuration files):

export DASHBOARD_URL=https://zkrynicki@validation.linaro.org/lava-server/RPC2/

* How to find the results in LAVA from a lava-test run.

Each time you submit a bundle you will get a bundle SHA1 identifier (which
is just the SHA1 of the entire text of the bundle). You can append that
SHA1 to a permalink to go directly to that bundle.

Here is an example permalink I pulled from our production instance:

http://validation.linaro.org/lava-server/dashboard/permalink/bundle/f536c5ce3b0d979be7cdc5934e27e248beb66dff/

You can also navigate to the bundle stream in the web UI and search for the
bundle ID directly.

* How to wrap existing tests to extend lava-test's capabilities

So this is a pretty simple process, but you need to know a few things to be
efficient.

First, write down a short description of your test and the requirements it
has of the test environment. That will be useful to add to the
documentation later on.

Write down the installation instructions as a series of shell commands. You
will need to pass them as an argument to TestInstaller or
AndroidTestInstaller.

Now install your test and write down the run instructions; you will need
them for the TestRunner or AndroidTestRunner. Run your test and save the
output to a file.

Now you should look at the output and figure out which parts you want to
capture and how to capture them. Typically you will try to write a regular
expression that describes the text you want to grab. At that point you
should make a decision: you can either use the built-in test parsers
(TestParser or AndroidTestParser) and tweak the output to be easier to
parse, or write your custom parser in Python. Usually starting with
line-oriented, record-like output is best, as it's trivial to parse. Once
this is done it's time to put your test together. Don't worry, it's not
going to work initially.

You will need to create a 'testobj' that glues your test installer, runner
and parser together (a sketch follows below). Now add this test to the list
of tests in the project you are working on (either a built-in test in
lava-test/lava-android-test or just another test in your custom test
project). You may need to regenerate 'egg-info', as this is where the data
is stored for Python to find. Check that you can see your test with
lava-test list-tests. If everything is fine, install your test and run it
once; this will let you ensure your installation and run instructions are
correct.
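
To make that concrete, here is a minimal sketch of such a test definition,
modeled on the built-in stream wrapper; the module paths and keyword
arguments are quoted from memory and may differ between lava-test versions:

from lava_test.core.installers import TestInstaller
from lava_test.core.parsers import TestParser
from lava_test.core.runners import TestRunner
from lava_test.core.tests import Test

# Shell commands executed by 'lava-test install', plus distro packages
# that are installed first.
INSTALLSTEPS = ['cc stream.c -O2 -fopenmp -o stream']
DEPS = ['gcc']

# Shell commands executed by 'lava-test run'; their output gets parsed.
RUNSTEPS = ['./stream']

# Line-oriented regex; named groups become fields of each test result.
PATTERN = r"^(?P<test_case_id>\w+):\s+(?P<measurement>\d+\.\d+)"

testobj = Test(
    test_id='stream',
    installer=TestInstaller(INSTALLSTEPS, deps=DEPS),
    runner=TestRunner(RUNSTEPS),
    parser=TestParser(PATTERN,
                      appendall={'units': 'MB/s', 'result': 'pass'}))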

Now we're ready to work on the regular expression for the default parsers,
or on your full-blown custom parser. We'll be continuously changing the
test definition (the patterns) and running a command like this:

$ lava-test parse stream stream.2012-05-29T04:54:02Z -o foo.json

Then we'll be looking at foo.json to check if our parser grabbed the data
we care about. Typically the file is huge, because it contains the full log
file (the entire output of the test) and the environment context (hardware
and software description). To make it easier to work with, we'll re-run the
test but pass two extra options to it.

$ lava-test run stream -SH

The -S option (or --skip-software-context) will remove the verbose list of
packages that were installed. The -H option (or --skip-hardware-context)
will remove the much shorter, but still unneeded at this stage, list of
devices and SoC information.

Make sure you note the result id of the second run. Now re-run parse with
that result and pass -A (or --skip-attachments):

$ lava-test parse stream stream.2012-05-29T05:06:59Z -o foo.json -A

Now foo.json is small and tidy, and you can instantly see all the data you
actually care about.

* What to keep in mind when designing new tests to allow them to run efficiently in lava-test

There are no strong requirements, but the things I would think about are:

1) Try not to destroy the machine you are running on; this helps with
development on a laptop, away from an ARM development board. If you need to
remove everything in /home or do something equally destructive for any
reason, then consider adding a safety argument like
--enable-destructive-actions that, if left unspecified, will just abort the
process with an appropriate error message.

2) Try to have sane output; ease of parsing is paramount. Remember that the
goal is to capture a test case name (a string) together with either one of
the pass/fail keywords or a decimal number. Don't put multiple 'results' on
one line. Don't use free-form test case identifiers; DNS-like names work
best (lowercase ASCII letters, digits, dot and dash).
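
Output in that spirit looks something like this (a made-up example; the
test case names and numbers are purely illustrative):

memcpy-bandwidth: 1523.4
cache-flush: pass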

* Where to add tests to lava-test that can be shared.

You have a few options; the 'correct' choice depends on the nature of those
tests and the kind of development cycle and maintainership you would like
to see.

1) You can submit them to lava-test or lava-android-test directly. Here
they should be able to work without special, rare hardware or peripherals.
Ideally they would work on our Ubuntu or Android images. Typically each
merge request will take a few days to review and accept.

2) You can create your own open-source project and keep your tests there.
This use case makes sense if you plan on providing many tests and want to
be free to change the code at any time without waiting for the lava team to
review your changes. I would hope that dedicated testing projects, focused
around specific topics, would emerge, but this has not happened as of this
time.

3) If your test is very simple in terms of the wrapper code, you can also
use a declarative test wrapper and simply publish a single .json file
somewhere for everyone to use.
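
For illustration, a declarative wrapper is a single JSON document along
these lines; the field names and format string are reconstructed from
memory, so treat the lava-test documentation as the authoritative schema
reference:

{
    "format": "LAVA-Test Test Definition 1.0",
    "test_id": "example-test",
    "install": {
        "steps": ["echo installing example-test"]
    },
    "run": {
        "steps": ["echo example-case: pass"]
    },
    "parse": {
        "pattern": "^(?P<test_case_id>[\\w-]+): (?P<result>\\w+)$"
    }
}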

* Where to find the source, overview of development rules for new shared tests.

It's always worth having a look at the lava documentation at
http://lava.readthedocs.org/; there are links for lava-test and
lava-android-test there. If in doubt, feel free to ask a question on the
mailing list (linaro-validation@lists.linaro.org) or on IRC in #linaro-lava.

Usually it's easier to start with an existing test wrapper: copy it and
change the things that are relevant to you. Typically a wrapper is very
short, unless it has to change the parser (because the default parser we
provide is unsuitable for the particular output that the test program
generates).

* How to do private testing

You need to follow three rules:

1) Use https to access your lava instance.
2) Create a token and authenticate your requests with that token.
3) Send your results to private bundle streams (or instruct your jobs to
send results to a private bundle stream).
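
Combining rules 2 and 3 might, for example, look like this; the /private/
pathname prefix is how private streams were named on the instances I have
used, so verify it against your instance's stream listing:

$ lava-dashboard-tool make-stream /private/personal/zyga/demo/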

This way you will ensure that:

1) nobody will eavesdrop on your password,
2) nobody will eavesdrop on your test results,
3) unauthorized users will not be able to see the details of your test job
in lava,
4) unauthorized users will not be able to see the test results in lava.