Merge lp:~cprov/britney/testclient-api into lp:~canonical-ci-engineering/britney/queued-announce-and-collect
Status: Merged
Merged at revision: 435
Proposed branch: lp:~cprov/britney/testclient-api
Merge into: lp:~canonical-ci-engineering/britney/queued-announce-and-collect
Diff against target: 473 lines (+446/-0), 3 files modified:
britney.py (+57/-0), testclient.py (+178/-0), tests/test_testclient.py (+211/-0)
To merge this branch: bzr merge lp:~cprov/britney/testclient-api
Related bugs:

Reviewer | Review Type | Date Requested | Status
---|---|---|---
Para Siva (community) | Approve | |
Canonical CI Engineering | Pending | |

Review via email: mp+259668@code.launchpad.net
Commit message
Initial implementation of a generic interface for proposed-migration (britney) to produce new-candidates events and consume test results from the CI-infrastructure.
Description of the change
Initial implementation of a generic interface for proposed-migration (britney) to produce new-candidates events and consume test results from the CI-infrastructure.
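The message flow is easiest to see from the payloads themselves. The sketch below shows the two JSON message shapes this branch defines: the candidate announcement britney publishes, and the test result a testing subsystem publishes back. Field names are taken from testclient.py in the diff; the concrete values (source names, series, URL) are hypothetical examples.

```python
import json

# Published by britney on the candidates fanout exchange for each
# new source candidate (see TestClient.announce).
candidate = {
    "source_name": "foo",        # hypothetical source package
    "source_version": "1.0",
    "series": "vivid",
}

# Published by a testing subsystem on the results fanout exchange;
# britney aggregates these into its per-series JSON cache
# (see TestClient.collect).
result = {
    "source_name": "foo",
    "source_version": "1.0",
    "series": "vivid",
    "test_name": "snappy",
    "test_status": "PASS",       # one of PASS / SKIP / FAIL / RUNNING
    "test_url": "http://example.com/results/foo",  # hypothetical URL
}

print(json.dumps(candidate))
print(json.dumps(result))
```

Results whose `series` does not match the consumer's context series are dropped on collection, which is what the "series mismatch" test in revision 443 exercises.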
Celso Providelo (cprov) wrote:
Psivaa,
Thanks for the review.
Regarding config changes: they will only be necessary when we do integration tests (which, as I mentioned, are deliberately missing from this MP).
On point 2), I don't get the registry duplication; it's impossible, since it's a dictionary. Perhaps you mean re-announcement?
On point 3), take a look at the corresponding test; it may clarify the effects of cleanup().
On message ack-ing, you are right, it was missing in the collect() action; fixed.
[]
Para Siva (psivaa) wrote:
Thanks for addressing the comments, cprov.
> On point 2) I don't get the registry duplication, it's impossible, since it's
> a dictionary, perhaps you mean re-announcement ?
>
Point 2 was a notification about my inline comment (the part not using make_key), which you've fixed.
And thanks a lot again for this MP. It's a whole lot different from what I proposed.
- 442. By Celso Providelo: addressing review comments
- 443. By Celso Providelo: extend existing tests for ignored test results (series mismatch).
Preview Diff
1 | === modified file 'britney.py' |
2 | --- britney.py 2015-02-20 19:02:00 +0000 |
3 | +++ britney.py 2015-05-22 17:40:09 +0000 |
4 | @@ -227,6 +227,7 @@ |
5 | PROVIDES, RDEPENDS, RCONFLICTS, MULTIARCH, ESSENTIAL) |
6 | from autopkgtest import AutoPackageTest, ADT_PASS, ADT_EXCUSES_LABELS |
7 | from boottest import BootTest |
8 | +from testclient import TestClient |
9 | |
10 | |
11 | __author__ = 'Fabio Tranchitella and the Debian Release Team' |
12 | @@ -1984,6 +1985,62 @@ |
13 | upgrade_me.remove(excuse.name) |
14 | unconsidered.append(excuse.name) |
15 | |
16 | + if (getattr(self.options, "testclient_enable", "no") == "yes" and |
17 | + self.options.series): |
18 | + |
19 | + # Filter only new source candidates excuses. |
20 | + testing_excuses = [] |
21 | + for excuse in self.excuses: |
22 | + # Skip removals, binary-only candidates, proposed-updates |
23 | + # and unknown versions. |
24 | + if (excuse.name.startswith("-") or |
25 | + "/" in excuse.name or |
26 | + "_" in excuse.name or |
27 | + excuse.ver[1] == "-"): |
28 | + continue |
29 | + testing_excuses.append(excuse) |
30 | + |
31 | + amqp_uris = getattr( |
32 | + self.options, "testclient_amqp_uris", "").split() |
33 | + testclient = TestClient(self.options.series, amqp_uris) |
34 | + |
35 | + # Announce new candidates and collect new test results. |
36 | + if not self.options.dry_run: |
37 | + testclient.announce(testing_excuses) |
38 | + testclient.collect() |
39 | + testclient.cleanup(testing_excuses) |
40 | + |
41 | + # Update excuses considering hints and required_tests (for gating). |
42 | + required_tests = getattr( |
43 | + self.options, "testclient_required_tests", "").split() |
44 | + for excuse in testing_excuses: |
45 | + hints = self.hints.search('force', package=excuse.name) |
46 | + hints.extend( |
47 | + self.hints.search('force-badtest', package=excuse.name)) |
48 | + forces = [x for x in hints |
49 | + if same_source(excuse.ver[1], x.version)] |
50 | +                for test in testclient.getTests(excuse.name, excuse.ver[1]): |
51 | +                    label = TestClient.LABELS.get( |
52 | +                        test['status'], 'UNKNOWN STATUS') |
53 | +                    excuse.addhtml( |
54 | +                        "%s result: %s (<a href=\"%s\">results</a>)" % ( |
55 | +                            test['name'], label, test['url'])) |
56 | +                    if forces: |
57 | +                        excuse.addhtml( |
58 | +                            "Should wait for %s %s %s, but forced by " |
59 | +                            "%s" % (excuse.name, excuse.ver[1], |
60 | +                                    test['name'], forces[0].user)) |
61 | +                        continue |
62 | +                    if test['name'] not in required_tests: |
63 | +                        continue |
64 | +                    if test['status'] not in TestClient.VALID_STATUSES: |
65 | + excuse.addreason(test.name) |
66 | + if excuse.is_valid: |
67 | + excuse.is_valid = False |
68 | + excuse.addhtml("Not considered") |
69 | + upgrade_me.remove(excuse.name) |
70 | + unconsidered.append(excuse.name) |
71 | + |
72 | # invalidate impossible excuses |
73 | for e in self.excuses: |
74 | # parts[0] == package name |
75 | |
76 | === added file 'testclient.py' |
77 | --- testclient.py 1970-01-01 00:00:00 +0000 |
78 | +++ testclient.py 2015-05-22 17:40:09 +0000 |
79 | @@ -0,0 +1,178 @@ |
80 | +# -*- coding: utf-8 -*- |
81 | + |
82 | +# Copyright (C) 2015 Canonical Ltd. |
83 | + |
84 | +# This program is free software; you can redistribute it and/or modify |
85 | +# it under the terms of the GNU General Public License as published by |
86 | +# the Free Software Foundation; either version 2 of the License, or |
87 | +# (at your option) any later version. |
88 | + |
89 | +# This program is distributed in the hope that it will be useful, |
90 | +# but WITHOUT ANY WARRANTY; without even the implied warranty of |
91 | +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
92 | +# GNU General Public License for more details. |
93 | +from __future__ import print_function |
94 | + |
95 | + |
96 | +from contextlib import ( |
97 | + contextmanager, |
98 | + nested, |
99 | +) |
100 | +import json |
101 | +import os |
102 | + |
103 | + |
104 | +import kombu |
105 | +from kombu.pools import producers |
106 | + |
107 | + |
108 | +@contextmanager |
109 | +def json_cached_info(path): |
110 | + """Context manager for caching a JSON object on disk.""" |
111 | + info = {} |
112 | + if os.path.exists(path): |
113 | + with open(path) as fp: |
114 | + try: |
115 | + info = json.load(fp) |
116 | + except ValueError: |
117 | + # cache is empty or corrupted (!). |
118 | + info = {} |
119 | + else: |
120 | + dirname = os.path.dirname(path) |
121 | + if not os.path.exists(dirname): |
122 | + os.makedirs(dirname) |
123 | + yield info |
124 | + with open(path, 'w') as fp: |
125 | + json.dump(info, fp, indent=2) |
126 | + |
127 | + |
128 | +def make_cache_key(name, version): |
129 | +    """Return a JSON-serializable key for the given source & version.""" |
130 | + return '{}_{}'.format(name, version) |
131 | + |
132 | + |
133 | +class TestClient(object): |
134 | + """Generic test client implementation. |
135 | + |
136 | + announce: announcing new source candidates to a pre-defined rabbitmq |
137 | + exchange (testing subsystems can hook/subscribe). |
138 | + |
139 | + collect: collect test results for the context series from a pre-defined |
140 | + rabbitmq exchange (other promotion agents can do the same). |
141 | + |
142 | + cleanup: sanitize internal announcement/testing registry after runs. |
143 | + """ |
144 | + |
145 | + EXCHANGE_CANDIDATES = 'candidates.exchange' |
146 | + EXCHANGE_RESULTS = 'results.exchange' |
147 | + |
148 | + VALID_STATUSES = ('PASS', 'SKIP') |
149 | + |
150 | + LABELS = { |
151 | + "PASS": '<span style="background:#87d96c">Pass</span>', |
152 | + "SKIP": '<span style="background:#ffff00">Skip</span>', |
153 | + "FAIL": '<span style="background:#ff6666">Regression</span>', |
154 | + "RUNNING": '<span style="background:#99ddff">Test in progress</span>', |
155 | + } |
156 | + |
157 | + def __init__(self, series, amqp_uris): |
158 | + self.series = series |
159 | + self.amqp_uris = amqp_uris |
160 | + |
161 | + @property |
162 | + def cache_path(self): |
163 | + """Series-specific test announcement/result cache.""" |
164 | + return 'testclient/{}.json'.format(self.series) |
165 | + |
166 | + @property |
167 | + def results_queue(self): |
168 | + """Series-specific queue for collecting tests results.""" |
169 | + return 'pm.results.{}'.format(self.series) |
170 | + |
171 | + def announce(self, excuses): |
172 | + """Announce new source candidates. |
173 | + |
174 | +        Post a message to the EXCHANGE_CANDIDATES for every new given |
175 | +        excuse (cache announcements so excuses do not get re-announced). |
176 | + """ |
177 | + with nested(json_cached_info(self.cache_path), |
178 | + kombu.Connection(self.amqp_uris)) as (cache, connection): |
179 | + # XXX cprov 20150521: nested() is deprecated, and multi-statement |
180 | + # 'with' does not support nesting (i.e. the previous context |
181 | + # manager is not available to the next, in this case |
182 | + # 'connection'). |
183 | + with producers[connection].acquire(block=True) as producer: |
184 | + publisher = connection.ensure( |
185 | + producer, producer.publish, max_retries=3) |
186 | + exchange = kombu.Exchange( |
187 | + self.EXCHANGE_CANDIDATES, type="fanout") |
188 | + for excuse in excuses: |
189 | + if make_cache_key( |
190 | + excuse.name, excuse.ver[1]) in cache.keys(): |
191 | + continue |
192 | + payload = { |
193 | + 'source_name': excuse.name, |
194 | + 'source_version': excuse.ver[1], |
195 | + 'series': self.series, |
196 | + } |
197 | + publisher(payload, exchange=exchange, declare=[exchange]) |
198 | + cache[make_cache_key(excuse.name, excuse.ver[1])] = [] |
199 | + |
200 | + def collect(self): |
201 | + """Collect available test results. |
202 | + |
203 | + Consume all messages from the EXCHANGE_RESULTS (routed to a series- |
204 | + specific queue). Ignore test results for other series and update |
205 | + test results registry. |
206 | + """ |
207 | + with nested(json_cached_info(self.cache_path), |
208 | + kombu.Connection(self.amqp_uris)) as (cache, connection): |
209 | + exchange = kombu.Exchange( |
210 | + self.EXCHANGE_RESULTS, type="fanout") |
211 | + queue = kombu.Queue(self.results_queue, exchange) |
212 | + # XXX cprov 20150521: same as above about nested context managers. |
213 | + with connection.SimpleQueue(queue) as q: |
214 | + for i in range(len(q)): |
215 | + msg = q.get() |
216 | + payload = msg.payload |
217 | + if payload.get('series') != self.series: |
218 | + continue |
219 | + tests = cache.setdefault( |
220 | + make_cache_key( |
221 | + payload.get('source_name'), |
222 | + payload.get('source_version') |
223 | + ), []) |
224 | + tests.append({ |
225 | + 'name': payload.get('test_name'), |
226 | + 'status': payload.get('test_status'), |
227 | + 'url': payload.get('test_url'), |
228 | + }) |
229 | + msg.ack() |
230 | + |
231 | + def cleanup(self, excuses): |
232 | + """Remove test result entries without corresponding excuse. |
233 | + |
234 | +        If there is no excuse, the test results are no longer relevant. |
235 | + """ |
236 | + with json_cached_info(self.cache_path) as cache: |
237 | + current_keys = [ |
238 | + make_cache_key(e.name, e.ver[1]) for e in excuses] |
239 | + cached_keys = list(cache.keys()) |
240 | + for k in cached_keys: |
241 | + if k not in current_keys: |
242 | + del cache[k] |
243 | + |
244 | + def getTests(self, name, version): |
245 | + """Yields test results for a given source-version pair. |
246 | + |
247 | +        Test results are dictionaries containing test name, status |
248 | +        and url. |
249 | + """ |
250 | + with json_cached_info(self.cache_path) as cache: |
251 | + tests = cache.get(make_cache_key(name, version), []) |
252 | + for test in tests: |
253 | + yield { |
254 | + 'name': test.get('name'), |
255 | + 'status': test.get('status'), |
256 | + 'url': test.get('url'), |
257 | + } |
258 | |
259 | === added file 'tests/test_testclient.py' |
260 | --- tests/test_testclient.py 1970-01-01 00:00:00 +0000 |
261 | +++ tests/test_testclient.py 2015-05-22 17:40:09 +0000 |
262 | @@ -0,0 +1,211 @@ |
263 | +#!/usr/bin/python |
264 | +# (C) 2015 Canonical Ltd. |
265 | +# |
266 | +# This program is free software; you can redistribute it and/or modify |
267 | +# it under the terms of the GNU General Public License as published by |
268 | +# the Free Software Foundation; either version 2 of the License, or |
269 | +# (at your option) any later version. |
270 | + |
271 | +import os |
272 | +import shutil |
273 | +import sys |
274 | +import tempfile |
275 | +import unittest |
276 | + |
277 | +import kombu |
278 | +from kombu.pools import producers |
279 | + |
280 | +PROJECT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) |
281 | +sys.path.insert(0, PROJECT_DIR) |
282 | + |
283 | +from excuse import Excuse |
284 | +from testclient import ( |
285 | + json_cached_info, |
286 | + make_cache_key, |
287 | + TestClient, |
288 | +) |
289 | + |
290 | + |
291 | +class TestJsonCachedInfo(unittest.TestCase): |
292 | + |
293 | + def setUp(self): |
294 | + super(TestJsonCachedInfo, self).setUp() |
295 | + (_dummy, self.test_cache) = tempfile.mkstemp() |
296 | + self.addCleanup(os.unlink, self.test_cache) |
297 | + |
298 | + def test_simple(self): |
299 | + # `json_cached_info` context manager correctly persists a |
300 | + # python dictionary on disk. |
301 | + with json_cached_info(self.test_cache) as cache: |
302 | + self.assertEqual({}, cache) |
303 | + cache['foo'] = 'bar' |
304 | + |
305 | + with open(self.test_cache) as fp: |
306 | + self.assertEqual( |
307 | + ['{\n', |
308 | + ' "foo": "bar"\n', |
309 | + '}'], fp.readlines()) |
310 | + |
311 | + with json_cached_info(self.test_cache) as cache: |
312 | + self.assertEqual(cache['foo'], 'bar') |
313 | + |
314 | + |
315 | +def make_excuse(name, version): |
316 | +    """Return an `Excuse` for the given source name and version.""" |
317 | + e = Excuse(name) |
318 | + e.set_vers('-', version) |
319 | + return e |
320 | + |
321 | + |
322 | +class TestTestClient(unittest.TestCase): |
323 | + |
324 | + def setUp(self): |
325 | + super(TestTestClient, self).setUp() |
326 | + self.path = tempfile.mkdtemp(prefix='testclient') |
327 | + os.makedirs(os.path.join(self.path, 'testclient/')) |
328 | + self.addCleanup(shutil.rmtree, self.path) |
329 | + os.chdir(self.path) |
330 | + |
331 | + def test_announce(self): |
332 | +        # 'announce' posts messages to the EXCHANGE_CANDIDATES exchange and |
333 | + # updates its internal cache. |
334 | + amqp_uris = ['memory://'] |
335 | + testclient = TestClient('vivid', amqp_uris) |
336 | + test_excuses = [ |
337 | + make_excuse('foo', '1.0'), |
338 | + make_excuse('bar', '2.0'), |
339 | + ] |
340 | + |
341 | + with kombu.Connection(amqp_uris) as connection: |
342 | + exchange = kombu.Exchange( |
343 | + testclient.EXCHANGE_CANDIDATES, type="fanout") |
344 | + queue = kombu.Queue('testing', exchange) |
345 | + with connection.SimpleQueue(queue) as q: |
346 | + q.queue.purge() |
347 | + testclient.announce(test_excuses) |
348 | + self.assertEqual( |
349 | + [{'series': 'vivid', |
350 | + 'source_name': 'foo', |
351 | + 'source_version': '1.0'}, |
352 | + {'series': 'vivid', |
353 | + 'source_name': 'bar', |
354 | + 'source_version': '2.0'}], |
355 | + [q.get().payload for i in range(len(q))]) |
356 | + |
357 | + with json_cached_info(testclient.cache_path) as cache: |
358 | + self.assertEqual( |
359 | + {'bar_2.0': [], 'foo_1.0': []}, |
360 | + cache) |
361 | + |
362 | + def test_collect(self): |
363 | + # 'collect' collects test results and aggregates them in its |
364 | + # internal cache. |
365 | + amqp_uris = ['memory://'] |
366 | + testclient = TestClient('vivid', amqp_uris) |
367 | + |
368 | + result_payloads = [ |
369 | + {'source_name': 'foo', |
370 | + 'source_version': '1.0', |
371 | + 'series': testclient.series, |
372 | + 'test_name': 'snappy', |
373 | + 'test_status': 'RUNNING', |
374 | + 'test_url': 'http://snappy.com/foo'}, |
375 | + {'source_name': 'bar', |
376 | + 'source_version': '1.0', |
377 | + 'series': testclient.series, |
378 | + 'test_name': 'ubuntu', |
379 | + 'test_status': 'RUNNING', |
380 | + 'test_url': 'http://ubuntu.com/foo'}, |
381 | + {'source_name': 'foo', |
382 | + 'source_version': '1.0', |
383 | + 'series': testclient.series, |
384 | + 'test_name': 'bbb', |
385 | + 'test_status': 'RUNNING', |
386 | + 'test_url': 'http://bbb.com/foo'}, |
387 | + # This result will be ignored due to the series mismatch. |
388 | + {'source_name': 'zoing', |
389 | + 'source_version': '1.0', |
390 | + 'series': 'some-other-series', |
391 | + 'test_name': 'ubuntu', |
392 | + 'test_status': 'RUNNING', |
393 | + 'test_url': 'http://ubuntu.com/foo'}, |
394 | + ] |
395 | + |
396 | + with kombu.Connection(amqp_uris) as connection: |
397 | + with producers[connection].acquire(block=True) as producer: |
398 | + # Just for binding destination queue to the exchange. |
399 | + testclient.collect() |
400 | + exchange = kombu.Exchange( |
401 | + testclient.EXCHANGE_RESULTS, type="fanout") |
402 | + publisher = connection.ensure( |
403 | + producer, producer.publish, max_retries=3) |
404 | + for payload in result_payloads: |
405 | + publisher(payload, exchange=exchange, declare=[exchange]) |
406 | + testclient.collect() |
407 | + |
408 | + with json_cached_info(testclient.cache_path) as cache: |
409 | + self.assertEqual( |
410 | + {'foo_1.0': [{'name': 'snappy', |
411 | + 'status': 'RUNNING', |
412 | + 'url': 'http://snappy.com/foo'}, |
413 | + {'name': 'bbb', |
414 | + 'status': 'RUNNING', |
415 | + 'url': 'http://bbb.com/foo'}], |
416 | + 'bar_1.0': [{'name': 'ubuntu', |
417 | + 'status': 'RUNNING', |
418 | + 'url': 'http://ubuntu.com/foo'}]}, |
419 | + cache) |
420 | + |
421 | + def test_cleanup(self): |
422 | +        # `cleanup` removes cache entries that are not present in the |
423 | + # given excuses list (i.e. not relevant for promotion anymore). |
424 | + amqp_uris = ['memory://'] |
425 | + testclient = TestClient('vivid', amqp_uris) |
426 | + test_excuses = [ |
427 | + make_excuse('foo', '1.0'), |
428 | + make_excuse('bar', '2.0'), |
429 | + ] |
430 | + |
431 | + with json_cached_info(testclient.cache_path) as cache: |
432 | + cache[make_cache_key('foo', '0.9')] = [] |
433 | + cache[make_cache_key('foo', '1.0')] = [] |
434 | + cache[make_cache_key('bar', '2.0')] = [] |
435 | + |
436 | + testclient.cleanup(test_excuses) |
437 | + |
438 | + with json_cached_info(testclient.cache_path) as cache: |
439 | + self.assertEqual( |
440 | + {'bar_2.0': [], 'foo_1.0': []}, |
441 | + cache) |
442 | + |
443 | + def test_getTests(self): |
444 | + # `getTests` yields cached tests results for a given source name |
445 | + # and version. |
446 | + amqp_uris = ['memory://'] |
447 | + testclient = TestClient('vivid', amqp_uris) |
448 | + |
449 | + with json_cached_info(testclient.cache_path) as cache: |
450 | + cache[make_cache_key('foo', '1.0')] = [ |
451 | + {'name': 'snappy', |
452 | + 'status': 'RUNNING', |
453 | + 'url': 'http://snappy.com/foo'}, |
454 | + {'name': 'bbb', |
455 | + 'status': 'RUNNING', |
456 | + 'url': 'http://bbb.com/foo'} |
457 | + ] |
458 | + |
459 | + self.assertEqual( |
460 | + [{'name': 'snappy', |
461 | + 'status': 'RUNNING', |
462 | + 'url': 'http://snappy.com/foo'}, |
463 | + {'name': 'bbb', |
464 | + 'status': 'RUNNING', |
465 | + 'url': 'http://bbb.com/foo'}], |
466 | + list(testclient.getTests('foo', '1.0'))) |
467 | + |
468 | + self.assertEqual( |
469 | + [], list(testclient.getTests('bar', '1.0'))) |
470 | + |
471 | + |
472 | +if __name__ == '__main__': |
473 | + unittest.main() |
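The diff above centres on a small on-disk JSON cache keyed by source name and version. The stdlib-only sketch below mirrors the announce/collect/cleanup lifecycle on that cache, leaving the AMQP transport out; `json_cached_info` and `make_cache_key` follow the implementations in testclient.py, while the package names, result payload, and URL are hypothetical.

```python
import json
import os
import tempfile
from contextlib import contextmanager


@contextmanager
def json_cached_info(path):
    """Persist a dict as JSON across runs (as in testclient.py)."""
    info = {}
    if os.path.exists(path):
        with open(path) as fp:
            try:
                info = json.load(fp)
            except ValueError:
                # Cache is empty or corrupted.
                info = {}
    yield info
    with open(path, 'w') as fp:
        json.dump(info, fp, indent=2)


def make_cache_key(name, version):
    return '{}_{}'.format(name, version)


path = os.path.join(tempfile.mkdtemp(), 'vivid.json')

# announce: register each new candidate once (setdefault makes
# re-announcement a no-op, matching the dictionary-based registry).
with json_cached_info(path) as cache:
    cache.setdefault(make_cache_key('foo', '1.0'), [])
    cache.setdefault(make_cache_key('bar', '2.0'), [])

# collect: append an incoming result to its candidate's entry.
with json_cached_info(path) as cache:
    cache[make_cache_key('foo', '1.0')].append(
        {'name': 'snappy', 'status': 'PASS', 'url': 'http://example.com'})

# cleanup: drop entries with no corresponding excuse anymore.
current = [make_cache_key('foo', '1.0')]
with json_cached_info(path) as cache:
    for key in list(cache):
        if key not in current:
            del cache[key]

with json_cached_info(path) as cache:
    final = dict(cache)
print(final)
```

This also illustrates psivaa's point 3: because keys include the version, a source with two in-flight versions gets two independent entries, and cleanup keeps only the ones still backed by an excuse.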
Para Siva (psivaa) wrote:
Thanks a lot, cprov, for the MP.
I really like your way of handling the cache; making keys from name and version is clever. And thanks for adding tests too.
I have a few points:
1. To include the conf file changes.
2. An inline comment on avoiding duplicate entries in the registry.
3. I think the cleanup will handle entries with one source having two or more versions at the same time, but I want to double-check.
4. Also, I'm not sure whether we're handling acking and nacking, etc. I'm asking because we're removing entries from the cache, and that may leave messages in the queue forever.
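Point 4 concerns standard AMQP delivery semantics: a broker removes a message from its queue only once the consumer acknowledges it, so a consumed-but-unacked message is redelivered (which is why the missing `msg.ack()` in collect() mattered). The toy model below illustrates that behaviour; it is a hypothetical simplification in plain Python, not kombu's actual API.

```python
from collections import deque


class ToyQueue:
    """Toy broker queue: a message leaves only when acked;
    unacked deliveries are requeued when the consumer goes away."""

    def __init__(self):
        self._messages = deque()
        self._unacked = {}
        self._next_tag = 0

    def put(self, payload):
        self._messages.append(payload)

    def get(self):
        # Deliver the message but keep it pending until acked.
        payload = self._messages.popleft()
        tag = self._next_tag
        self._next_tag += 1
        self._unacked[tag] = payload
        return tag, payload

    def ack(self, tag):
        del self._unacked[tag]

    def requeue_unacked(self):
        # What a broker does when the consumer disconnects.
        for tag in sorted(self._unacked):
            self._messages.append(self._unacked.pop(tag))


q = ToyQueue()
q.put({'test_name': 'snappy', 'test_status': 'PASS'})

tag, payload = q.get()          # consumed but NOT acked
q.requeue_unacked()             # consumer went away: message comes back
redelivered = len(q._messages)

tag, payload = q.get()
q.ack(tag)                      # acked: removed for good
q.requeue_unacked()
drained = len(q._messages)
```

With the ack in place, collect() can safely drop stale cache entries in cleanup(): the broker side is already settled, so nothing lingers in the queue.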