Merge lp:~cprov/britney/testclient-api into lp:~canonical-ci-engineering/britney/queued-announce-and-collect

Proposed by Celso Providelo
Status: Merged
Merged at revision: 435
Proposed branch: lp:~cprov/britney/testclient-api
Merge into: lp:~canonical-ci-engineering/britney/queued-announce-and-collect
Diff against target: 473 lines (+446/-0)
3 files modified
britney.py (+57/-0)
testclient.py (+178/-0)
tests/test_testclient.py (+211/-0)
To merge this branch: bzr merge lp:~cprov/britney/testclient-api
Reviewer Review Type Date Requested Status
Para Siva (community) Approve
Canonical CI Engineering Pending
Review via email: mp+259668@code.launchpad.net

Commit message

Initial implementation of a generic interface for proposed-migration (britney) to produce new-candidate events and consume test results from the CI infrastructure.

Description of the change

Initial implementation of a generic interface for proposed-migration (britney) to produce new-candidate events and consume test results from the CI infrastructure.
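As a quick orientation, here is a minimal sketch of the intended calling sequence, pieced together from the TestClient API and the test suite in this diff. The 'vivid' series, the in-memory broker URI and the hand-built excuse are illustrative only (a real run wires these up from britney's options), and it assumes it is executed from the britney tree so that excuse and testclient are importable:

from excuse import Excuse
from testclient import TestClient

def make_excuse(name, version):
    # Helper mirroring tests/test_testclient.py: a bare Excuse carrying
    # only the source name and the candidate version.
    e = Excuse(name)
    e.set_vers('-', version)
    return e

testing_excuses = [make_excuse('foo', '1.0')]
testclient = TestClient('vivid', ['memory://'])  # in-process broker, as in the tests

testclient.announce(testing_excuses)   # publish new candidates to the fanout exchange
testclient.collect()                   # drain the series results queue into the JSON cache
testclient.cleanup(testing_excuses)    # drop cached results with no matching excuse

for excuse in testing_excuses:
    # Nothing is printed until a test runner publishes results for foo 1.0.
    for test in testclient.getTests(excuse.name, excuse.ver[1]):
        print(test['name'], test['status'], test['url'])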

Revision history for this message
Para Siva (psivaa) wrote :

Thanks a lot, cprov, for the MP.
I really like the way you handle the cache; keying entries by source name and version is a clever approach. And thanks for adding tests too.

I have a few points:
1. The conf file changes should be included.
2. See my inline comment about avoiding duplicate entries in the registry.
3. I think cleanup() will handle entries where one source has two or more versions at the same time, but I want to double-check (see the sketch below).
4. Also, I'm not sure whether we handle acking, nacking, etc. I ask because we remove entries from the cache; might that leave messages sitting in the queue forever?
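
To make point 3 concrete, here is the scenario I have in mind, sketched against the new API (it mirrors test_cleanup from this MP; the names and versions are just illustrative):

from excuse import Excuse
from testclient import TestClient, json_cached_info, make_cache_key

testclient = TestClient('vivid', ['memory://'])

# Two versions of 'foo' get distinct cache keys, so both can sit in the
# registry at the same time.
with json_cached_info(testclient.cache_path) as cache:
    cache[make_cache_key('foo', '0.9')] = []   # superseded upload
    cache[make_cache_key('foo', '1.0')] = []   # current candidate

# Only foo 1.0 still has an excuse, so cleanup() drops the stale
# 'foo_0.9' entry and keeps 'foo_1.0'.
excuse = Excuse('foo')
excuse.set_vers('-', '1.0')
testclient.cleanup([excuse])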

review: Needs Fixing
Revision history for this message
Celso Providelo (cprov) wrote :

Psivaa,

Thanks for the review.

Regarding the config changes, they will only be necessary when we do integration tests (which, as I mentioned, are deliberately missing from this MP).
On point 2, I don't get the registry duplication; it's impossible, since the registry is a dictionary. Perhaps you mean re-announcement?
On point 3, take a look at the corresponding test; it should clarify what cleanup() does.
On message acking, you are right, it was missing in the collect() action; fixed.
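
For reference, the fix boils down to acknowledging each message after its payload has been recorded: with kombu's SimpleQueue, a message that is fetched but never acknowledged stays pending on the broker and is eventually redelivered, which is exactly the "stuck in the queue forever" scenario. A standalone sketch (not britney code; the queue name and payload are illustrative):

import kombu

with kombu.Connection('memory://') as connection:
    with connection.SimpleQueue('pm.results.vivid') as q:
        q.put({'series': 'vivid', 'test_name': 'snappy', 'test_status': 'PASS'})
        msg = q.get(block=True, timeout=1)
        payload = msg.payload   # record the result in the cache...
        msg.ack()               # ...then acknowledge it so it is not redelivered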

[]

Revision history for this message
Para Siva (psivaa) wrote :

Thanks for addressing the comments, cprov.

> On point 2, I don't get the registry duplication; it's impossible, since the
> registry is a dictionary. Perhaps you mean re-announcement?
>
Point 2 was just a heads-up about my inline comment, on the bit that was not using make_cache_key, which you've fixed.

And thanks a lot again for this MP. It's a whole lot different from what I had proposed.

review: Approve
lp:~cprov/britney/testclient-api updated
442. By Celso Providelo

addressing review comments

443. By Celso Providelo

extend existing tests for ignored test results (series mismatch).

Preview Diff

=== modified file 'britney.py'
--- britney.py 2015-02-20 19:02:00 +0000
+++ britney.py 2015-05-22 17:40:09 +0000
@@ -227,6 +227,7 @@
     PROVIDES, RDEPENDS, RCONFLICTS, MULTIARCH, ESSENTIAL)
 from autopkgtest import AutoPackageTest, ADT_PASS, ADT_EXCUSES_LABELS
 from boottest import BootTest
+from testclient import TestClient
 
 
 __author__ = 'Fabio Tranchitella and the Debian Release Team'
@@ -1984,6 +1985,62 @@
                         upgrade_me.remove(excuse.name)
                         unconsidered.append(excuse.name)
 
+        if (getattr(self.options, "testclient_enable", "no") == "yes" and
+                self.options.series):
+
+            # Filter only new source candidate excuses.
+            testing_excuses = []
+            for excuse in self.excuses:
+                # Skip removals, binary-only candidates, proposed-updates
+                # and unknown versions.
+                if (excuse.name.startswith("-") or
+                        "/" in excuse.name or
+                        "_" in excuse.name or
+                        excuse.ver[1] == "-"):
+                    continue
+                testing_excuses.append(excuse)
+
+            amqp_uris = getattr(
+                self.options, "testclient_amqp_uris", "").split()
+            testclient = TestClient(self.options.series, amqp_uris)
+
+            # Announce new candidates and collect new test results.
+            if not self.options.dry_run:
+                testclient.announce(testing_excuses)
+                testclient.collect()
+                testclient.cleanup(testing_excuses)
+
+            # Update excuses considering hints and required_tests (for gating).
+            required_tests = getattr(
+                self.options, "testclient_required_tests", "").split()
+            for excuse in testing_excuses:
+                hints = self.hints.search('force', package=excuse.name)
+                hints.extend(
+                    self.hints.search('force-badtest', package=excuse.name))
+                forces = [x for x in hints
+                          if same_source(excuse.ver[1], x.version)]
+                for test in testclient.getTests(excuse.name, excuse.ver[1]):
+                    label = TestClient.LABELS.get(
+                        test['status'], 'UNKNOWN STATUS')
+                    excuse.addhtml(
+                        "%s result: %s (<a href=\"%s\">results</a>)" % (
+                            test['name'], label, test['url']))
+                    if forces:
+                        excuse.addhtml(
+                            "Should wait for %s %s %s, but forced by "
+                            "%s" % (excuse.name, excuse.ver[1],
+                                    test['name'], forces[0].user))
+                        continue
+                    if test['name'] not in required_tests:
+                        continue
+                    if test['status'] not in TestClient.VALID_STATUSES:
+                        excuse.addreason(test['name'])
+                        if excuse.is_valid:
+                            excuse.is_valid = False
+                            excuse.addhtml("Not considered")
+                            upgrade_me.remove(excuse.name)
+                            unconsidered.append(excuse.name)
+
         # invalidate impossible excuses
         for e in self.excuses:
             # parts[0] == package name
=== added file 'testclient.py'
--- testclient.py 1970-01-01 00:00:00 +0000
+++ testclient.py 2015-05-22 17:40:09 +0000
@@ -0,0 +1,178 @@
+# -*- coding: utf-8 -*-
+
+# Copyright (C) 2015 Canonical Ltd.
+
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2 of the License, or
+# (at your option) any later version.
+
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+from __future__ import print_function
+
+
+from contextlib import (
+    contextmanager,
+    nested,
+)
+import json
+import os
+
+
+import kombu
+from kombu.pools import producers
+
+
+@contextmanager
+def json_cached_info(path):
+    """Context manager for caching a JSON object on disk."""
+    info = {}
+    if os.path.exists(path):
+        with open(path) as fp:
+            try:
+                info = json.load(fp)
+            except ValueError:
+                # cache is empty or corrupted (!).
+                info = {}
+    else:
+        dirname = os.path.dirname(path)
+        if not os.path.exists(dirname):
+            os.makedirs(dirname)
+    yield info
+    with open(path, 'w') as fp:
+        json.dump(info, fp, indent=2)
+
+
+def make_cache_key(name, version):
+    """Return a json-hashable key for the given source & version."""
+    return '{}_{}'.format(name, version)
+
+
+class TestClient(object):
+    """Generic test client implementation.
+
+    announce: announce new source candidates to a pre-defined rabbitmq
+    exchange (testing subsystems can hook/subscribe).
+
+    collect: collect test results for the context series from a pre-defined
+    rabbitmq exchange (other promotion agents can do the same).
+
+    cleanup: sanitize the internal announcement/testing registry after runs.
+    """
+
+    EXCHANGE_CANDIDATES = 'candidates.exchange'
+    EXCHANGE_RESULTS = 'results.exchange'
+
+    VALID_STATUSES = ('PASS', 'SKIP')
+
+    LABELS = {
+        "PASS": '<span style="background:#87d96c">Pass</span>',
+        "SKIP": '<span style="background:#ffff00">Skip</span>',
+        "FAIL": '<span style="background:#ff6666">Regression</span>',
+        "RUNNING": '<span style="background:#99ddff">Test in progress</span>',
+    }
+
+    def __init__(self, series, amqp_uris):
+        self.series = series
+        self.amqp_uris = amqp_uris
+
+    @property
+    def cache_path(self):
+        """Series-specific test announcement/result cache."""
+        return 'testclient/{}.json'.format(self.series)
+
+    @property
+    def results_queue(self):
+        """Series-specific queue for collecting test results."""
+        return 'pm.results.{}'.format(self.series)
+
+    def announce(self, excuses):
+        """Announce new source candidates.
+
+        Post a message to EXCHANGE_CANDIDATES for every new given excuse
+        (announcements are cached so excuses do not get re-announced).
+        """
+        with nested(json_cached_info(self.cache_path),
+                    kombu.Connection(self.amqp_uris)) as (cache, connection):
+            # XXX cprov 20150521: nested() is deprecated, and multi-statement
+            # 'with' does not support nesting (i.e. the previous context
+            # manager is not available to the next, in this case
+            # 'connection').
+            with producers[connection].acquire(block=True) as producer:
+                publisher = connection.ensure(
+                    producer, producer.publish, max_retries=3)
+                exchange = kombu.Exchange(
+                    self.EXCHANGE_CANDIDATES, type="fanout")
+                for excuse in excuses:
+                    if make_cache_key(
+                            excuse.name, excuse.ver[1]) in cache.keys():
+                        continue
+                    payload = {
+                        'source_name': excuse.name,
+                        'source_version': excuse.ver[1],
+                        'series': self.series,
+                    }
+                    publisher(payload, exchange=exchange, declare=[exchange])
+                    cache[make_cache_key(excuse.name, excuse.ver[1])] = []
+
+    def collect(self):
+        """Collect available test results.
+
+        Consume all messages from EXCHANGE_RESULTS (routed to a series-
+        specific queue). Ignore test results for other series and update
+        the test results registry.
+        """
+        with nested(json_cached_info(self.cache_path),
+                    kombu.Connection(self.amqp_uris)) as (cache, connection):
+            exchange = kombu.Exchange(
+                self.EXCHANGE_RESULTS, type="fanout")
+            queue = kombu.Queue(self.results_queue, exchange)
+            # XXX cprov 20150521: same as above about nested context managers.
+            with connection.SimpleQueue(queue) as q:
+                for i in range(len(q)):
+                    msg = q.get()
+                    payload = msg.payload
+                    if payload.get('series') != self.series:
+                        continue
+                    tests = cache.setdefault(
+                        make_cache_key(
+                            payload.get('source_name'),
+                            payload.get('source_version')
+                        ), [])
+                    tests.append({
+                        'name': payload.get('test_name'),
+                        'status': payload.get('test_status'),
+                        'url': payload.get('test_url'),
+                    })
+                    msg.ack()
+
+    def cleanup(self, excuses):
+        """Remove test result entries without a corresponding excuse.
+
+        If there is no excuse, the test results are not relevant anymore.
+        """
+        with json_cached_info(self.cache_path) as cache:
+            current_keys = [
+                make_cache_key(e.name, e.ver[1]) for e in excuses]
+            cached_keys = list(cache.keys())
+            for k in cached_keys:
+                if k not in current_keys:
+                    del cache[k]
+
+    def getTests(self, name, version):
+        """Yield test results for a given source-version pair.
+
+        Test results are a list of dictionaries containing test name, status
+        and url.
+        """
+        with json_cached_info(self.cache_path) as cache:
+            tests = cache.get(make_cache_key(name, version), [])
+            for test in tests:
+                yield {
+                    'name': test.get('name'),
+                    'status': test.get('status'),
+                    'url': test.get('url'),
+                }
=== added file 'tests/test_testclient.py'
--- tests/test_testclient.py 1970-01-01 00:00:00 +0000
+++ tests/test_testclient.py 2015-05-22 17:40:09 +0000
@@ -0,0 +1,211 @@
+#!/usr/bin/python
+# (C) 2015 Canonical Ltd.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2 of the License, or
+# (at your option) any later version.
+
+import os
+import shutil
+import sys
+import tempfile
+import unittest
+
+import kombu
+from kombu.pools import producers
+
+PROJECT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
+sys.path.insert(0, PROJECT_DIR)
+
+from excuse import Excuse
+from testclient import (
+    json_cached_info,
+    make_cache_key,
+    TestClient,
+)
+
+
+class TestJsonCachedInfo(unittest.TestCase):
+
+    def setUp(self):
+        super(TestJsonCachedInfo, self).setUp()
+        (_dummy, self.test_cache) = tempfile.mkstemp()
+        self.addCleanup(os.unlink, self.test_cache)
+
+    def test_simple(self):
+        # `json_cached_info` context manager correctly persists a
+        # python dictionary on disk.
+        with json_cached_info(self.test_cache) as cache:
+            self.assertEqual({}, cache)
+            cache['foo'] = 'bar'
+
+        with open(self.test_cache) as fp:
+            self.assertEqual(
+                ['{\n',
+                 '  "foo": "bar"\n',
+                 '}'], fp.readlines())
+
+        with json_cached_info(self.test_cache) as cache:
+            self.assertEqual(cache['foo'], 'bar')
+
+
+def make_excuse(name, version):
+    """Return an `Excuse` for the given source name and version."""
+    e = Excuse(name)
+    e.set_vers('-', version)
+    return e
+
+
+class TestTestClient(unittest.TestCase):
+
+    def setUp(self):
+        super(TestTestClient, self).setUp()
+        self.path = tempfile.mkdtemp(prefix='testclient')
+        os.makedirs(os.path.join(self.path, 'testclient/'))
+        self.addCleanup(shutil.rmtree, self.path)
+        os.chdir(self.path)
+
+    def test_announce(self):
+        # 'announce' posts messages to the EXCHANGE_CANDIDATES exchange and
+        # updates its internal cache.
+        amqp_uris = ['memory://']
+        testclient = TestClient('vivid', amqp_uris)
+        test_excuses = [
+            make_excuse('foo', '1.0'),
+            make_excuse('bar', '2.0'),
+        ]
+
+        with kombu.Connection(amqp_uris) as connection:
+            exchange = kombu.Exchange(
+                testclient.EXCHANGE_CANDIDATES, type="fanout")
+            queue = kombu.Queue('testing', exchange)
+            with connection.SimpleQueue(queue) as q:
+                q.queue.purge()
+                testclient.announce(test_excuses)
+                self.assertEqual(
+                    [{'series': 'vivid',
+                      'source_name': 'foo',
+                      'source_version': '1.0'},
+                     {'series': 'vivid',
+                      'source_name': 'bar',
+                      'source_version': '2.0'}],
+                    [q.get().payload for i in range(len(q))])
+
+        with json_cached_info(testclient.cache_path) as cache:
+            self.assertEqual(
+                {'bar_2.0': [], 'foo_1.0': []},
+                cache)
+
+    def test_collect(self):
+        # 'collect' collects test results and aggregates them in its
+        # internal cache.
+        amqp_uris = ['memory://']
+        testclient = TestClient('vivid', amqp_uris)
+
+        result_payloads = [
+            {'source_name': 'foo',
+             'source_version': '1.0',
+             'series': testclient.series,
+             'test_name': 'snappy',
+             'test_status': 'RUNNING',
+             'test_url': 'http://snappy.com/foo'},
+            {'source_name': 'bar',
+             'source_version': '1.0',
+             'series': testclient.series,
+             'test_name': 'ubuntu',
+             'test_status': 'RUNNING',
+             'test_url': 'http://ubuntu.com/foo'},
+            {'source_name': 'foo',
+             'source_version': '1.0',
+             'series': testclient.series,
+             'test_name': 'bbb',
+             'test_status': 'RUNNING',
+             'test_url': 'http://bbb.com/foo'},
+            # This result will be ignored due to the series mismatch.
+            {'source_name': 'zoing',
+             'source_version': '1.0',
+             'series': 'some-other-series',
+             'test_name': 'ubuntu',
+             'test_status': 'RUNNING',
+             'test_url': 'http://ubuntu.com/foo'},
+        ]
+
+        with kombu.Connection(amqp_uris) as connection:
+            with producers[connection].acquire(block=True) as producer:
+                # Just for binding the destination queue to the exchange.
+                testclient.collect()
+                exchange = kombu.Exchange(
+                    testclient.EXCHANGE_RESULTS, type="fanout")
+                publisher = connection.ensure(
+                    producer, producer.publish, max_retries=3)
+                for payload in result_payloads:
+                    publisher(payload, exchange=exchange, declare=[exchange])
+                testclient.collect()
+
+        with json_cached_info(testclient.cache_path) as cache:
+            self.assertEqual(
+                {'foo_1.0': [{'name': 'snappy',
+                              'status': 'RUNNING',
+                              'url': 'http://snappy.com/foo'},
+                             {'name': 'bbb',
+                              'status': 'RUNNING',
+                              'url': 'http://bbb.com/foo'}],
+                 'bar_1.0': [{'name': 'ubuntu',
+                              'status': 'RUNNING',
+                              'url': 'http://ubuntu.com/foo'}]},
+                cache)
+
+    def test_cleanup(self):
+        # `cleanup` removes cache entries that are not present in the
+        # given excuses list (i.e. not relevant for promotion anymore).
+        amqp_uris = ['memory://']
+        testclient = TestClient('vivid', amqp_uris)
+        test_excuses = [
+            make_excuse('foo', '1.0'),
+            make_excuse('bar', '2.0'),
+        ]
+
+        with json_cached_info(testclient.cache_path) as cache:
+            cache[make_cache_key('foo', '0.9')] = []
+            cache[make_cache_key('foo', '1.0')] = []
+            cache[make_cache_key('bar', '2.0')] = []
+
+        testclient.cleanup(test_excuses)
+
+        with json_cached_info(testclient.cache_path) as cache:
+            self.assertEqual(
+                {'bar_2.0': [], 'foo_1.0': []},
+                cache)
+
+    def test_getTests(self):
+        # `getTests` yields cached test results for a given source name
+        # and version.
+        amqp_uris = ['memory://']
+        testclient = TestClient('vivid', amqp_uris)
+
+        with json_cached_info(testclient.cache_path) as cache:
+            cache[make_cache_key('foo', '1.0')] = [
+                {'name': 'snappy',
+                 'status': 'RUNNING',
+                 'url': 'http://snappy.com/foo'},
+                {'name': 'bbb',
+                 'status': 'RUNNING',
+                 'url': 'http://bbb.com/foo'}
+            ]
+
+        self.assertEqual(
+            [{'name': 'snappy',
+              'status': 'RUNNING',
+              'url': 'http://snappy.com/foo'},
+             {'name': 'bbb',
+              'status': 'RUNNING',
+              'url': 'http://bbb.com/foo'}],
+            list(testclient.getTests('foo', '1.0')))
+
+        self.assertEqual(
+            [], list(testclient.getTests('bar', '1.0')))
+
+
+if __name__ == '__main__':
+    unittest.main()
