Merge lp:~canonical-ci-engineering/auto-package-testing/add-boottest-requests into lp:auto-package-testing

Proposed by Celso Providelo
Status: Merged
Merged at revision: 389
Proposed branch: lp:~canonical-ci-engineering/auto-package-testing/add-boottest-requests
Merge into: lp:auto-package-testing
Diff against target: 1530 lines (+1333/-50)
7 files modified
etc/boottest-britney.rc (+9/-0)
jenkins/adt-jenkins (+8/-2)
jenkins/adtjob.py (+69/-46)
jenkins/aptcache.py (+10/-2)
jenkins/boottest-britney (+961/-0)
jenkins/jenkins_boottest_config.xml.tmpl (+79/-0)
jenkins/run-boottest-jenkins.py (+197/-0)
To merge this branch: bzr merge lp:~canonical-ci-engineering/auto-package-testing/add-boottest-requests
Reviewer: Jean-Baptiste Lallement (Approve)
Review via email: mp+248936@code.launchpad.net

Description of the change

Support for boottesting

Revision history for this message
Martin Pitt (pitti) wrote :

I'm not very familiar with the current jenkins <-> britney transition logic, so I'll let Jean-Baptiste have another look at this.

Just to understand, this will mostly run on a new "phone testing gateway" host similar to alderamin & friends, which has a couple of phones attached to it? Which parts will run on britney? Or is this all done through the .state files?

I was hoping that we don't have to add new code to this project. This is all slated to go away with moving to AMQP/swift, and is rather complex and brittle code. Even today we still have several problems with it, and I'm afraid of breaking it even more with such large changes. Anyway, this needs good testing with the existing -proposed package testing; and you need to be aware that this will (hopefully) not survive very long.

I have a couple of inline comments, see below.

401. By Para Siva

Explain why the timestamp file differs from pkgcache.bin, and make the config settings use the QA deployment
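The fix this revision describes can be sketched as follows: instead of checking the mtime of pkgcache.bin (which apt.Cache.open() rewrites on every run), touch a dedicated stamp file only when the cache is actually updated. A minimal sketch under that reading of rev 401; the helper names are illustrative, not the project's API:

```python
import os
import time

def cache_is_stale(stampfile, cache_age_minutes):
    """True when the stamp file is missing or older than cache_age_minutes."""
    if not os.path.exists(stampfile):
        return True
    return (time.time() - os.path.getmtime(stampfile)) > cache_age_minutes * 60

def maybe_update(stampfile, cache_age_minutes, do_update):
    """Run do_update() and refresh the stamp only when the cache is stale."""
    if cache_is_stale(stampfile, cache_age_minutes):
        # Touching the stamp only here is the point of the fix: opening
        # the cache no longer makes it look freshly updated.
        open(stampfile, 'w').close()
        do_update()
        return True
    return False
```

Because the stamp is written only on a real update, repeated invocations within the cache-age window skip the expensive apt update, which is what the original pkgcache.bin check failed to do.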

Revision history for this message
Para Siva (psivaa) wrote :

Hi Martin,
Thanks for the review.

This is all done similarly to the other adt tests already running on tachash, i.e. kicked off from a .state file received from snakefruit and run on d-jenkins (tachash). The phone is attached to 'heymann', which hosts all the touch devices.
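For context, the .state files mentioned here are small JSON documents whose keys mirror AdtJob.job_status in adtjob.py (per-arch status plus package metadata); the field values below are invented for illustration:

```python
import json

# A hypothetical boottest .state file; keys follow AdtJob.job_status,
# values are made up for the example.
state_text = '''
{
  "status": {"krillin": "RUNNING", "all": "RUNNING"},
  "release": "vivid",
  "package": "systemd",
  "version": "219-4ubuntu1",
  "depends": {},
  "causes": {"systemd": "219-4ubuntu1"}
}
'''
state = json.loads(state_text)
# boottest-britney only keeps processing files whose overall
# status is still RUNNING; finished ones are archived.
needs_collect = state['status']['all'] == 'RUNNING'
```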

You might have seen Evan's reply as to why this was done using lp:auto-package-testing.

I have addressed your comments regarding apt-cache timestamp file and the aptroot concern in rev 401.

Leaving fginther to address the rest of the comments because he is more familiar with them.

402. By Para Siva

Use same apt cache but different data dir and creds for boottesting
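The data dir and credentials come from etc/boottest-britney.rc, a flat key: value file that boottest-britney merges over its built-in config_defaults. A simplified stand-in for that merge (the real code uses yaml.load; this sketch avoids the dependency and only keeps keys that exist in the defaults, as __load_config does):

```python
def load_rc(text, defaults):
    """Merge 'key: value' lines over the defaults, ignoring unknown keys."""
    config = dict(defaults)
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, sep, value = line.partition(':')
        if sep and key.strip() in config:
            config[key.strip()] = value.strip()
    return config

defaults = {'aptroot': '/tmp/boottest/apt',
            'datadir': '/tmp/boottest/data',
            'apturi': 'http://ports.ubuntu.com/'}
rc = '''# boottest settings
aptroot: /var/lib/jenkins/QA/adt/aptroot
datadir: /var/local/boottest/
'''
config = load_rc(rc, defaults)
```

Keys absent from the rc file (apturi above) keep their default values, which is why sharing the adt apt cache only required overriding aptroot.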

403. By Francis Ginther

Fix typo in default boottest aptroot, '/tmp/boottest/apt'.

Revision history for this message
Francis Ginther (fginther) wrote :

Martin, I've addressed some of your concerns with this MP: https://code.launchpad.net/~fginther/auto-package-testing/adtjob-separate-archs/+merge/250198

I left it unmerged as it looks a little easier to see the merge in isolation.

404. By Vincent Ladeuil

boottest output is different from the standard adt one, fix the descriptionsetter to catch the package version in the boottest output.

405. By Vincent Ladeuil

Merge fix for ARCHS_BOOTTEST

Revision history for this message
Jean-Baptiste Lallement (jibel) wrote :

LGTM. Some comments inline.

406. By Para Siva

Review comment fixes in run-boottest-jenkins

Revision history for this message
Para Siva (psivaa) wrote :

Thanks a bunch for the review, jibel. I've addressed your comments in rev 406.

407. By Para Siva

Dateformat year first

Revision history for this message
Para Siva (psivaa) wrote :

In fact please use r 407. Thanks

Revision history for this message
Jean-Baptiste Lallement (jibel) wrote :

releases = [distro_info.UbuntuDistroInfo().devel(), ] will fail when distro-info-data is outdated.

Revision history for this message
Para Siva (psivaa) wrote :

Thanks a lot jibel for looking at it again.
AFAUI, distro-info-data will only be outdated while a new release is being set up; at all other times it should be up to date. By using 'distro_info.UbuntuDistroInfo().devel()' the new release will be picked up as soon as distro-info-data is updated. If we used a config file for this instead, it would need to be modified manually at the start of each release anyway, so I don't see an advantage. But I could be missing something here, e.g. this failure itself could get in the way of registering a new release and releasing a new distro-info version.

408. By Para Siva

Exit with the right error message when devel release could not be found

Revision history for this message
Jean-Baptiste Lallement (jibel) wrote :

approved

review: Approve

Preview Diff

1=== added file 'etc/boottest-britney.rc'
2--- etc/boottest-britney.rc 1970-01-01 00:00:00 +0000
3+++ etc/boottest-britney.rc 2015-02-26 11:17:00 +0000
4@@ -0,0 +1,9 @@
5+# Default configuration file for boottest-britney
6+# This file should be placed under
7+# /var/lib/jenkins/CI/
8+# But using the adt apt cache in order to avoid
9+# the disk filling up
10+aptroot: /var/lib/jenkins/QA/adt/aptroot
11+apturi: http://ftpmaster.internal/ubuntu/
12+components: main restricted universe multiverse
13+datadir: /var/local/boottest/
14
15=== modified file 'jenkins/adt-jenkins'
16--- jenkins/adt-jenkins 2014-10-23 18:56:36 +0000
17+++ jenkins/adt-jenkins 2015-02-26 11:17:00 +0000
18@@ -115,6 +115,9 @@
19 metavar='state_file',
20 help='Submit a job to jenkins described in file '
21 '<state_file>')
22+ parser.add_argument('-b', '--boottest', action='store_true',
23+ default=False,
24+ help='Boottesting. A flag to enable only boottests')
25
26 self.args = parser.parse_args()
27 set_logging(self.args.debug)
28@@ -152,8 +155,11 @@
29 logging.debug('Loading state file: %s', statefile)
30 job = AdtJob(self.cache, statefile,
31 use_proposed=self.args.use_proposed,
32- credfile=self.args.use_jenkins)
33- ret = job.execute(update=True, dryrun=self.args.dry_run)
34+ credfile=self.args.use_jenkins,
35+ boottest=self.args.boottest)
36+ ret = job.execute(update=True,
37+ dryrun=self.args.dry_run,
38+ boottest=self.args.boottest)
39 if ret:
40 archivedir = os.path.join(self.config['datadir'],
41 self.config['release'],
42
43=== modified file 'jenkins/adtjob.py'
44--- jenkins/adtjob.py 2014-02-21 00:46:28 +0000
45+++ jenkins/adtjob.py 2015-02-26 11:17:00 +0000
46@@ -2,7 +2,7 @@
47 """ Interface between britney and autopkgtest running on Jenkins
48
49 """
50-# Copyright (C) 2012, Canonical Ltd (http://www.canonical.com/)
51+# Copyright (C) 2012-2015 Canonical Ltd (http://www.canonical.com/)
52 #
53 # Author: Jean-Baptiste Lallement <jean-baptiste.lallement@canonical.com>
54 #
55@@ -45,11 +45,13 @@
56 from urllib2 import urlopen
57
58
59-ARCHS = ('i386', 'amd64', 'all')
60+ARCHS_ADT = ('i386', 'amd64', 'all')
61+ARCHS_BOOTTEST = ('krillin', 'all')
62 BINDIR = os.path.dirname(__file__)
63 JENKINS_TMPL = 'jenkins_config.xml.tmpl'
64 JENKINS_TMPL_ARMHF = 'jenkins_config_armhf.xml.tmpl'
65 JENKINS_TMPL_PPC64EL = 'jenkins_config_ppc64el.xml.tmpl'
66+JENKINS_BOOTTEST_TMPL = 'jenkins_boottest_config.xml.tmpl'
67 DEFAULT_RECIPIENTS = ["ubuntu-testing-notifications@lists.ubuntu.com",
68 "jean-baptiste.lallement@canonical.com",
69 "martin.pitt@ubuntu.com"]
70@@ -57,11 +59,16 @@
71
72 class AdtJob(object):
73 '''Class that manages jobs'''
74- def __init__(self, cache, state_path, use_proposed=False, credfile=None):
75- global ARCHS
76+ def __init__(self, cache, state_path, use_proposed=False, credfile=None, boottest=False):
77+ global ARCHS_ADT
78+ global ARCHS_BOOTTEST
79
80+ if boottest:
81+ self.archs = ARCHS_BOOTTEST
82+ else:
83+ self.archs = ARCHS_ADT
84 self.job_status = {
85- 'status': dict([(arch, None) for arch in ARCHS]),
86+ 'status': dict([(arch, None) for arch in self.archs]),
87 'release': None,
88 'package': None,
89 'version': None,
90@@ -72,7 +79,7 @@
91 self.state_path = state_path
92 self.cache = cache
93
94- self.status = dict([(arch, None) for arch in ARCHS])
95+ self.status = dict([(arch, None) for arch in self.archs])
96 self.release = None
97 self.package = None
98 self.pkgname = None
99@@ -81,6 +88,7 @@
100 self.causes = {}
101
102 self.use_proposed = use_proposed
103+ self.boottest = boottest
104 self.load_status(state_path)
105
106 if credfile:
107@@ -139,7 +147,7 @@
108 '''Update package information'''
109 logging.debug('Updating job status')
110 if not self.status:
111- self.status = dict([(arch, 'NEEDRUN') for arch in ARCHS])
112+ self.status = dict([(arch, 'NEEDRUN') for arch in self.archs])
113
114 (self.version, pocket, self.depends) = self.get_package_info()
115
116@@ -217,32 +225,39 @@
117 causes[self.pkgname] = version
118 ret = True
119
120- # Verification of the dependencies
121- # Run a test if:
122- # - No package list is specified and
123- # - this is a new dependency
124- # - or version of a dependency in the archive is newer than previous run
125- # - Or if a package list is specified the following conditions must
126- # also be met:
127- # - and the dependency is in the list
128- # - the version in the archive is newer or equal to the version requested
129- #
130- # Note that request files use source package names while dependency
131- # lists returned by get_package_info use binary package names
132- for depname, depver in depends.iteritems():
133- depsrcname = self._get_sourcename(depname)
134- if (not pkglist or
135- (pkglist and depsrcname in pkglist.keys() and
136- apt_pkg.version_compare(pkglist[depsrcname]['version'], depver) <= 0)):
137- if not depname in self.depends:
138- logging.info("== New dependency '%s'.", depname)
139- causes[depsrcname] = depver
140- ret = True
141- elif apt_pkg.version_compare(self.depends[depname], depver) < 0:
142- logging.info("== New version of dependency '%s %s'.",
143- depname, depver)
144- causes[depsrcname] = depver
145- ret = True
146+ if not self.boottest:
147+ # Verification of the dependencies
148+ # Run a test if:
149+ # - No package list is specified and
150+ # - this is a new dependency
151+ # - or version of a dependency in the archive is newer than previous run
152+ # - Or if a package list is specified the following conditions must
153+ # also be met:
154+ # - and the dependency is in the list
155+ # - the version in the archive is newer or equal to the version requested
156+ #
157+ # Note that request files use source package names while dependency
158+ # lists returned by get_package_info use binary package names
159+ # XXX - fginther 20150129
160+ # Skip this for boottesting because the reverse dependences
161+ # will almost always already be a part of the image being tested
162+ # The only exception is when the reverse dependence is also in
163+ # the proposed pocket. We'll just acknowledge that these are
164+ # skipped for now.
165+ for depname, depver in depends.iteritems():
166+ depsrcname = self._get_sourcename(depname)
167+ if (not pkglist or
168+ (pkglist and depsrcname in pkglist.keys() and
169+ apt_pkg.version_compare(pkglist[depsrcname]['version'], depver) <= 0)):
170+ if not depname in self.depends:
171+ logging.info("== New dependency '%s'.", depname)
172+ causes[depsrcname] = depver
173+ ret = True
174+ elif apt_pkg.version_compare(self.depends[depname], depver) < 0:
175+ logging.info("== New version of dependency '%s %s'.",
176+ depname, depver)
177+ causes[depsrcname] = depver
178+ ret = True
179
180 if ret:
181 self.causes = causes
182@@ -269,7 +284,7 @@
183 '''
184 if force or self.run_required():
185 logging.debug('Starting job')
186- self.status = dict([(arch, 'RUNNING') for arch in ARCHS])
187+ self.status = dict([(arch, 'RUNNING') for arch in self.archs])
188 self.update_status()
189 self.write_status()
190 logging.info('Triggering remote job for package %s', self.pkgname)
191@@ -328,7 +343,7 @@
192 logging.debug('Dependency list: %s', depends)
193 return depends
194
195- def execute(self, update=False, dryrun=False):
196+ def execute(self, update=False, dryrun=False, boottest=False):
197 '''Execute a jenkins job'''
198 logging.debug('Executing job')
199
200@@ -345,9 +360,12 @@
201 jobname = self.release
202 if self.use_proposed:
203 opts += '-P'
204- jobname += '-adt-' + self.pkgname.replace('+', '-')
205- jobname_armhf = jobname + '-armhf'
206- jobname_ppc64el = jobname + '-ppc64el'
207+ if not self.boottest:
208+ jobname += '-adt-' + self.pkgname.replace('+', '-')
209+ jobname_armhf = jobname + '-armhf'
210+ jobname_ppc64el = jobname + '-ppc64el'
211+ else:
212+ jobname += '-boottest-' + self.pkgname.replace('+', '-')
213
214 # Who should be notified
215 notify = AdtNotify(package=self.pkgname, release=self.release)
216@@ -373,14 +391,19 @@
217 }
218
219 logging.debug('Generating job: %s', jobname)
220- tmpl = templates.get_template(JENKINS_TMPL)
221- tmpl_armhf = templates.get_template(JENKINS_TMPL_ARMHF)
222- tmpl_ppc64el = templates.get_template(JENKINS_TMPL_PPC64EL)
223- config = tmpl.render(ctxt)
224- config_armhf = tmpl_armhf.render(ctxt)
225- config_ppc64el = tmpl_ppc64el.render(ctxt)
226- self.__update_jenkins_job(jobname_armhf, config_armhf, update, dryrun)
227- self.__update_jenkins_job(jobname_ppc64el, config_ppc64el, update, dryrun)
228+ if not self.boottest:
229+ tmpl = templates.get_template(JENKINS_TMPL)
230+ tmpl_armhf = templates.get_template(JENKINS_TMPL_ARMHF)
231+ tmpl_ppc64el = templates.get_template(JENKINS_TMPL_PPC64EL)
232+ config = tmpl.render(ctxt)
233+ config_armhf = tmpl_armhf.render(ctxt)
234+ config_ppc64el = tmpl_ppc64el.render(ctxt)
235+ self.__update_jenkins_job(jobname_armhf, config_armhf, update, dryrun)
236+ self.__update_jenkins_job(jobname_ppc64el, config_ppc64el, update, dryrun)
237+ else:
238+ boottest_tmpl = templates.get_template(JENKINS_BOOTTEST_TMPL)
239+ config = boottest_tmpl.render(ctxt)
240+
241 if not self.__update_jenkins_job(jobname, config, update, dryrun):
242 return False
243
244
245=== modified file 'jenkins/aptcache.py'
246--- jenkins/aptcache.py 2014-04-17 09:56:58 +0000
247+++ jenkins/aptcache.py 2015-02-26 11:17:00 +0000
248@@ -53,13 +53,20 @@
249 self.components = ['main']
250 self.proposed = False
251
252+ # XXX psivaa 16-02-2015: apt.Cache.open() is invoked
253+ # every time adt-jenkins is called without `--no-update`,
254+ # which changed the timestamp of `var/cache/apt/pkgcache.bin`.
255+ # Hence pkgcache.bin always looked `new`, and update() below
256+ # never triggered apt.Cache.update.
257+ # Use a separate timestamp file that is touched only when
258+ # apt.Cache.update is called.
259 self.timestampfile = os.path.normpath(
260- os.path.join(rootdir, 'var/cache/apt/pkgcache.bin'))
261+ os.path.join(rootdir, 'var/cache/apt/cache.timestamp'))
262
263 super(AptCache, self).__init__(rootdir=rootdir, memonly=False)
264 self.rootdir = rootdir
265
266- def update(self, cache_age=60, force_update=False):
267+ def update(self, cache_age=10, force_update=False):
268 ''' Equivalent of apt-get update
269
270 :param cache_age: Update the cache when it is older than
271@@ -78,6 +85,7 @@
272 not os.path.exists(self.timestampfile)) or (
273 (currtime - timestamp) > cache_age * 60):
274 logging.debug('Updating cache')
275+ open(self.timestampfile, 'w').close()
276 super(AptCache, self).update()
277 super(AptCache, self).open()
278 else:
279
280=== added file 'jenkins/boottest-britney'
281--- jenkins/boottest-britney 1970-01-01 00:00:00 +0000
282+++ jenkins/boottest-britney 2015-02-26 11:17:00 +0000
283@@ -0,0 +1,961 @@
284+#! /usr/bin/python
285+""" Interface between britney and boottest running on Jenkins."""
286+# Copyright (C) 2012, Canonical Ltd (http://www.canonical.com/)
287+#
288+# Author: Jean-Baptiste Lallement <jean-baptiste.lallement@canonical.com>
289+#
290+# This software is free software: you can redistribute it
291+# and/or modify it under the terms of the GNU General Public License
292+# as published by the Free Software Foundation, either version 3 of
293+# the License, or (at your option) any later version.
294+#
295+# This software is distributed in the hope that it will
296+# be useful, but WITHOUT ANY WARRANTY; without even the implied warranty
297+# of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
298+# GNU General Public License for more details.
299+#
300+# You should have received a copy of the GNU General Public License
301+# along with this software. If not, see <http://www.gnu.org/licenses/>.
302+#
303+import apt_pkg
304+import os
305+import logging
306+import sys
307+from glob import glob
308+import subprocess
309+import yaml
310+import argparse
311+from adtjob import AdtJob
312+from aptcache import AptCache
313+from time import strftime
314+from shutil import copyfile
315+import json
316+
317+try:
318+ import kombu
319+except ImportError:
320+ sys.stderr.write('python-kombu is not installed. AMQP support is '
321+ 'disabled\n')
322+
323+# 'all' is used to track the status across all archs.
324+ARCHS = ('krillin', 'all')
325+DEFAULT_ARCH = 'krillin'
326+DEFAULT_RELEASE = 'vivid'
327+BINDIR = os.path.dirname(__file__)
328+# Do we run from a bzr checkout?
329+if os.path.exists(os.path.join(BINDIR, "../.bzr")):
330+ sys.path.insert(1, BINDIR)
331+
332+# Error codes
333+E_CONFIG = 1
334+E_APT = 2
335+
336+
337+class RecursiveDict(dict):
338+ """Implementation of perl's autovivification feature."""
339+ def __missing__(self, key):
340+ value = self[key] = type(self)()
341+ return value
342+
343+
344+class BoottestBritney(object):
345+ """Interface between britney and autopkgtest running in Jenkins"""
346+ DEFAULT_RC = os.path.join(BINDIR, 'boottest-britney.rc')
347+
348+ config_defaults = {
349+ 'arch': DEFAULT_ARCH,
350+ 'release': DEFAULT_RELEASE,
351+ 'aptroot': '/tmp/boottest/apt',
352+ 'datadir': '/tmp/boottest/data',
353+ 'apturi': 'http://ports.ubuntu.com/',
354+ 'components': 'main restricted',
355+ 'extrasources': '',
356+ 'rsync_host': '',
357+ 'amqp_host': '',
358+ 'amqp_queue': '',
359+ }
360+
361+ def __init__(self):
362+ """Constructor
363+
364+ Parse command line arguments and load the package list
365+ """
366+ self.cache = None
367+ self.pocket = ""
368+ self.args = None
369+
370+ self.config = dict(self.config_defaults)
371+ self.__parse_args()
372+ self.__load_config()
373+ self.__init_apt()
374+
375+ # Status Files
376+ datadir = os.path.abspath(
377+ os.path.expanduser(
378+ os.path.join(self.config['datadir'], 'boottest',
379+ self.config['release'] + self.pocket,
380+ self.config['arch'])))
381+ if not os.path.exists(datadir):
382+ os.makedirs(datadir)
383+ elif not os.path.isdir(datadir):
384+ logging.error("'%s' is not a directory, exiting!", datadir)
385+ sys.exit(E_CONFIG)
386+ # release(-pocket)?_arch_package
387+ self.status_pattern = os.path.join(datadir, '%s%s_%s_%s')
388+ self.boottest_cachedir = datadir
389+ self.boottest_workdir = os.path.join(self.boottest_cachedir, 'work')
390+ if not os.path.exists(self.boottest_workdir):
391+ os.makedirs(self.boottest_workdir)
392+ self.adt_archivedir = os.path.join(self.boottest_cachedir, 'archive')
393+ if not os.path.exists(self.adt_archivedir):
394+ os.makedirs(self.adt_archivedir)
395+
396+ # test runner AMQP queue
397+ if self.config.get('amqp_host'):
398+ self.amqp_con = kombu.Connection(self.config['amqp_host'])
399+ self.amqp_queue = self.amqp_con.SimpleQueue(self.config['amqp_queue'])
400+ else:
401+ self.amqp_con = None
402+ self.amqp_queue = None
403+
404+ self.args.func()
405+
406+ def __del__(self):
407+ if self.amqp_con:
408+ self.amqp_con.release()
409+ self.amqp_con = None
410+
411+ def __parse_args(self):
412+ """Parse command line arguments
413+
414+ This method parses command line arguments and configures logging
415+ """
416+
417+ parser = argparse.ArgumentParser(
418+ description='Check if a new version of a package or its '
419+ 'dependencies is available from the archive and trigger an '
420+ 'autopkgtest run if there is.')
421+ parser.add_argument('-d', '--debug', action='store_true',
422+ default=False, help='enable debug mode')
423+ parser.add_argument('-c', '--config', default=self.DEFAULT_RC,
424+ help='Path to configuration file (default=%s)' %
425+ self.DEFAULT_RC)
426+ parser.add_argument('-a', '--arch', metavar='ARCH',
427+ help='Architecture')
428+ parser.add_argument('-r', '--release', metavar='RELEASE',
429+ help='Release')
430+ parser.add_argument('-P', '--use-proposed', action='store_true',
431+ default=False,
432+ help='Check if new versions are in -proposed and '
433+ 'enable -proposed in testbed.')
434+ parser.add_argument('-U', '--no-update', action='store_true',
435+ default=False,
436+ help='Do not update apt cache')
437+ subparsers = parser.add_subparsers()
438+
439+ prequest = subparsers.add_parser(
440+ 'request',
441+ help='Generates a request file or update it if passed in argument')
442+ prequest.add_argument(
443+ 'requestfile', nargs='?',
444+ help='Request file. If no request file is specified, it will be '
445+ 'generated from the list of packages with an XS-Testsuite source '
446+ 'header')
447+ prequest.add_argument('-O', '--output', default=None,
448+ metavar='file',
449+ help='Request file will be written to \'file\''
450+ ' instead of stdout')
451+ prequest.set_defaults(func=self.cmd_request)
452+
453+ psubmit = subparsers.add_parser(
454+ 'submit',
455+ help='Submit a request file to Jenkins and trigger the tests')
456+ psubmit.add_argument(
457+ 'requestfile', help='Request file to submit to Jenkins')
458+ psubmit.set_defaults(func=self.cmd_submit)
459+
460+ pcollect = subparsers.add_parser(
461+ 'collect', help='Collect the results from the tests')
462+ pcollect.add_argument('-O', '--output', default=None,
463+ metavar='file',
464+ help='Request file will be written to \'file\''
465+ ' instead of stdout')
466+ pcollect.add_argument('-n', '--new-only', action='store_true',
467+ default=False,
468+ help='Display only new results otherwise the '
469+ 'history is displayed')
470+
471+ pcollect.set_defaults(func=self.cmd_collect)
472+
473+ pretry = subparsers.add_parser(
474+ 'retry',
475+ help='Resubmit a test request for a list of packages')
476+ pretry.add_argument('sourcepackages', metavar='SRCPKG', nargs='+',
477+ help='Name of the source package to retry')
478+ pretry.add_argument('-f', '--force', metavar='STATUS',
479+ choices=['PASS', 'FAIL', 'RUNNING'],
480+ help='Force the result to STATUS for a list of'
481+ ' source packages without considering the actual'
482+ ' result of the tests')
483+ pretry.set_defaults(func=self.cmd_retry)
484+
485+ self.args = parser.parse_args()
486+ set_logging(self.args.debug)
487+ logging.debug(self.args)
488+
489+ def __load_config(self):
490+ ''' Load a configuration file in yaml format
491+
492+ :param path: path to configuration file '''
493+ path = self.args.config
494+ logging.debug('Loading configuration file: %s', path)
495+ path = os.path.abspath(os.path.expanduser(path))
496+
497+ if not os.path.isfile(path):
498+ logging.warning('Configuration file doesn\'t exist (%s), using '
499+ 'default settings.', path)
500+ return
501+
502+ try:
503+ cfg = yaml.load(file(path, 'r'))
504+ except yaml.scanner.ScannerError as exc:
505+ logging.error('Error loading configuration file, exiting!\n' +
506+ '=' * 80 + '\n%s\n' + '=' * 80, exc)
507+ sys.exit(E_CONFIG)
508+
509+ for kcfg in self.config:
510+ if kcfg in cfg:
511+ self.config[kcfg] = cfg[kcfg]
512+
513+ if self.args.release:
514+ self.config['release'] = self.args.release
515+ if self.args.arch:
516+ self.config['arch'] = self.args.arch
517+
518+ def __init_apt(self):
519+ ''' Initialize apt and create all the required files '''
520+
521+ if not self.config['aptroot']:
522+ logging.error('No APTROOT defined, exiting!')
523+ sys.exit(E_CONFIG)
524+ # Create aptroot tree
525+ self.pocket = "-proposed" if self.args.use_proposed else ""
526+
527+ self.config['aptroot'] = os.path.abspath(
528+ os.path.expanduser(self.config['aptroot']))
529+
530+ if not os.path.isdir(self.config['aptroot']):
531+ logging.debug('Creating APT root directory (%s)',
532+ self.config['aptroot'])
533+ os.makedirs(self.config['aptroot'])
534+ apt_pkg.init()
535+ apt_pkg.config.set('Dir', self.config['aptroot'])
536+ self.cache = AptCache(rootdir=self.config['aptroot'])
537+
538+ # We don't need package indices in collect mode
539+ if self.args.func == self.cmd_collect:
540+ return
541+
542+ if not self.args.no_update:
543+ self.cache.write_sources_list(self.config['apturi'],
544+ self.config['arch'],
545+ self.config['release'],
546+ self.config['components'],
547+ self.args.use_proposed)
548+ self.cache.update()
549+ self.cache.open()
550+
551+ def __get_release_header(self):
552+ '''Extract headers from release files'''
553+ from itertools import islice
554+
555+ # Load header from Release files
556+ rfile_pattern = os.path.join(self.config['aptroot'],
557+ 'var/lib/apt/lists/') +\
558+ '*_{}*_Release'.format(self.config['release'])
559+
560+ header = []
561+ for rfilename in glob(rfile_pattern):
562+ if rfilename.endswith('{}-proposed_Release') and \
563+ not self.args.use_proposed:
564+ continue
565+ with open(rfilename) as rfd:
566+ header += [x.strip() for x in list(islice(rfd, 6))
567+ if x.startswith('Suite: ') or
568+ x.startswith('Date: ')]
569+ return header
570+
571+ def __write_result_file(self, lines, filename=None, header=True):
572+ '''Write packages to test to a request file or stdout if none is
573+ given
574+ '''
575+ # filename overrides self.args.output
576+ if filename is not None:
577+ outputfile = filename
578+ else:
579+ outputfile = self.args.output
580+
581+ if not outputfile:
582+ logging.debug('Writing output to stdout')
583+ fdh = sys.stdout
584+ else:
585+ logging.debug('Writing output to %s', outputfile)
586+ output_dir = os.path.dirname(outputfile)
587+ if not os.path.exists(output_dir):
588+ os.makedirs(output_dir)
589+ fdh = open(outputfile, 'w')
590+
591+ if header:
592+ header = self.__get_release_header()
593+ fdh.write('\n'.join(header) + '\n')
594+
595+ for line in lines:
596+ fdh.write(line + '\n')
597+
598+ if fdh != sys.stdout:
599+ fdh.close()
600+
601+ def cmd_request(self):
602+ ''' Process or generate a request file '''
603+ logging.debug('Cmd request')
604+
605+ lines = []
606+ reqlist = {}
607+ if self.args.requestfile:
608+ (header, reqlist) = load_request_file(self.args.requestfile)
609+ for pkgname in reqlist:
610+ causes = {}
611+ deps_path = self.status_pattern % (
612+ self.config['release'], self.pocket,
613+ self.config['arch'], pkgname)
614+ job = AdtJob(self.cache, deps_path,
615+ use_proposed=self.args.use_proposed,
616+ boottest=True)
617+ if not job.package:
618+ job.release = self.config['release']
619+ job.pkgname = pkgname
620+ (pkgvers, pocket, pkgdeps) = job.get_package_info()
621+ logging.info('== New job')
622+ causes = {pkgname: pkgvers}
623+ status = 'NEW'
624+ else:
625+ if job.run_required(reqlist):
626+ causes = job.causes
627+ (pkgvers, pocket, pkgdeps) = job.get_package_info()
628+ status = 'NEW'
629+ if causes:
630+ lines.append(
631+ ' '.join((
632+ pkgname, pkgvers, status,
633+ ' '.join([
634+ "%s %s" % (k, v) for k, v in causes.iteritems()
635+ ]))))
636+
637+ self.__write_result_file(lines)
638+
639+ def cmd_submit(self, packages=None):
640+ ''' Submit a request file to Jenkins '''
641+ logging.debug('Cmd submit')
642+ # Load the request file
643+ pkglist = {}
644+ if packages is None:
645+ packages = []
646+ if hasattr(self.args, 'requestfile'):
647+ (header, pkglist) = load_request_file(self.args.requestfile)
648+
649+ for pkg in packages:
650+ if pkg not in pkglist:
651+ logging.debug("Adding '%s' to request list", pkg)
652+ pkglist[pkg] = {
653+ 'version': None,
654+ 'status': 'NEW',
655+ 'causes': {pkg: None}
656+ }
657+
658+ for pkgname, pkgprops in pkglist.iteritems():
659+ deps_path = self.status_pattern % (
660+ self.config['release'], self.pocket,
661+ self.config['arch'], pkgname)
662+ job = AdtJob(self.cache, deps_path,
663+ use_proposed=self.args.use_proposed,
664+ boottest=True)
665+ if ('causes' in pkgprops and
666+ pkgname in pkgprops['causes'] and
667+ pkgprops['causes'][pkgname] is None and job.version):
668+ pkgprops['causes'][pkgname] = job.version
669+ if not job.package:
670+ logging.info('== New job')
671+ job.release = self.config['release']
672+ job.pkgname = pkgname
673+ if 'causes' in pkgprops:
674+ job.causes = pkgprops['causes']
675+ ret = job.submit(os.path.join(self.config['rsync_host'],
676+ self.config['release'], 'in/'),
677+ force=True)
678+ else:
679+ job.status = 'NEW'
680+ if 'causes' in pkgprops:
681+ job.causes = pkgprops['causes']
682+ force = self.args.func == self.cmd_retry
683+ ret = job.submit(os.path.join(self.config['rsync_host'],
684+ self.config['release'], 'in/'),
685+ force=force)
686+
687+ if ret:
688+ # Copy the state file to workdir
689+ dest = os.path.join(self.boottest_workdir,
690+ os.path.basename(deps_path) + '.' +
691+ strftime('%Y%m%d-%H%M%S') + '.state')
692+ copyfile(deps_path, dest)
693+
694+ # also submit the job to the UCI airline through AMQP; FIXME: this is
695+ # temporary to run airline testing side by side with Jenkins testing
696+ # until we switch over
697+ # FIXME: track architectures on a higher level; for now, just hardcode
698+ # them
699+ if self.amqp_queue:
700+ for arch in ['i386', 'amd64']:
701+ msg = json.dumps({'series': self.config['release'],
702+ 'architecture': arch,
703+ 'package': job.pkgname})
704+ logging.debug('submitting AMQP request: %s', msg)
705+ self.amqp_queue.put(msg)
706+
707+ if (len(pkglist) > 0 and hasattr(self.args, 'requestfile')):
708+ self.__copy_to_archive(self.args.requestfile)
709+
710+ def __load_results(self, filelist):
711+ """
712+ Loads result files
713+
714+ This method loads the results from a list of result files (*.result in
715+ working directory) and returns a dictionary containing the results.
716+ Result files for which all architectures have been tested are
717+ archived. Those without full arch coverage are kept to update the
718+ history later on, or to allow a test retry without a state file
719+ matching a request from britney.
720+
721+ Args:
722+ filelist: List of files to process
723+
724+ Returns:
725+ Dictionary containing the results that have been loaded
726+ """
727+ logging.debug("Loading result files")
728+ # Load all the results in memory
729+ test_results = RecursiveDict()
730+ result_files = RecursiveDict()
731+
732+ for result_file in filelist:
733+ try:
734+ (rel, arc, pkg, ver, res, dep) = load_result(result_file)
735+ except ValueError:
736+ logging.warning('Invalid file: %s', result_file)
737+ self.__move_to_archive(result_file)
738+ continue
739+
740+ if not pkg in test_results[rel][arc]:
741+ test_results[rel][arc][pkg] = []
742+ if not (res, dep) in test_results[rel][arc][pkg]:
743+ test_results[rel][arc][pkg].append((res, dep))
744+
745+ # Build a list of test result files where pkg and arc are inverted
746+ # compared to results. The goal is to identify the packages for
747+ # which we have all the result files and can archive the results.
748+ if not arc in result_files[rel][pkg]:
749+ result_files[rel][pkg][arc] = []
750+ result_files[rel][pkg][arc].append(result_file)
751+
752+ # Only archive files where all the archs have been tested
753+ for rel, pkgs in result_files.iteritems():
754+ for pkg, arcs in pkgs.iteritems():
755+ if len(arcs) >= len(ARCHS) - 1:
756+ for arc, files in arcs.iteritems():
757+ for resfile in files:
758+ self.__move_to_archive(resfile)
759+ return test_results
760+
761+ def __process_state_files(self, filelist, results):
762+ """
763+ Update state files from test results
764+
765+ Process a list of state files from a list of test results.
766+
767+ Args:
768+ filelist: List of state files
769+ results: Dictionary containing the test results
770+
771+ Returns:
772+ List of results that have been processed
773+ """
774+ all_results = []
775+ # Load the state files and compare them to the results
776+ for state_file in filelist:
777+ logging.debug('Processing %s', state_file)
778+ state = json.load(file(state_file))
779+
780+ # File already processed, move it away
781+ if state['status']['all'] != 'RUNNING':
782+ logging.debug('Already done, moving %s', state_file)
783+ self.__move_to_archive(state_file)
784+ continue
785+
786+ global_result = True
787+ accepted_arch = 0
788+ all_done = False
789+ for arch in ARCHS:
790+ if arch == 'all':
791+ continue
792+
793+ # Already collected a result for this file, moving on
794+ if state['status'][arch] != 'RUNNING':
795+ accepted_arch += 1
796+ global_result = global_result & (
797+ state['status'][arch] == 'PASS')
798+ continue
799+ try:
800+ results_for_arch = results[
801+ state['release']][arch][state['package']]
802+ except KeyError:
803+ # No result for package
804+ continue
805+
806+ # Check which result file matches the test
807+ # Stop at the first result that includes every cause package at
808+ # a version >= the version that triggered the test
809+ for (test_result, test_depends) in results_for_arch:
810+ logging.debug('result %s : %s', test_result, test_depends)
811+ accepted_causes = 0
812+ for cause_pkg, cause_version \
813+ in state['causes'].iteritems():
814+ logging.debug('cause: %s %s', cause_pkg, cause_version)
815+ if not cause_pkg in test_depends:
816+ logging.debug('rejected: not in test depends: %s',
817+ cause_pkg)
818+ break
819+ test_version = test_depends[cause_pkg]
820+ # Version tested < Version requested => reject result
821+ if apt_pkg.version_compare(test_version,
822+ cause_version) < 0:
823+ logging.debug('rejected: test version <'
824+ 'pkg version')
825+ break
826+ accepted_causes += 1
827+ logging.debug('accepted for cause %s %s', cause_pkg,
828+ cause_version)
829+
830+ # Exit when a result is found that matches all the criteria
831+ if accepted_causes == len(state['causes']):
832+ accepted_arch += 1
833+ logging.debug('accepted for arch: %s %s %s',
834+ arch, test_result, test_depends)
835+ state['status'][arch] = test_result
836+ global_result = global_result & (test_result == 'PASS')
837+ break
838+
839+ if accepted_arch == len(ARCHS) - 1: # Removes 'all'
840+ state['status']['all'] = 'PASS' if global_result else 'FAIL'
841+ all_done = True
842+ all_results.append(state)
843+ # Update state file
844+ with open(state_file, 'w') as fp:
845+ logging.debug("Writing status file")
846+ json.dump(state, fp)
847+ logging.debug('TEST RESULT: %s', global_result)
848+ if all_done:
849+ self.__move_to_archive(state_file)
850+
851+ logging.debug('Runs accepted: %d', len(all_results))
852+ return all_results
853+
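The per-arch acceptance rule used above (every cause package must appear in the tested dependencies at a version >= the one that triggered the test) can be sketched as a standalone predicate. This is a sketch only: `result_matches` and `int_compare` are hypothetical names, and `int_compare` is a simplistic integer comparator standing in for `apt_pkg.version_compare`.

```python
# Sketch of the acceptance rule from __process_state_files(): a result is
# accepted for an arch only when every cause package appears in the tested
# dependencies at a version >= the version that triggered the test.
def result_matches(causes, test_depends, version_compare):
    for pkg, wanted in causes.items():
        if pkg not in test_depends:
            return False  # rejected: not in test depends
        if version_compare(test_depends[pkg], wanted) < 0:
            return False  # rejected: tested version < requested version
    return True


def int_compare(a, b):
    # Hypothetical comparator; real code uses apt_pkg.version_compare
    return (int(a) > int(b)) - (int(a) < int(b))
```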
854+ def __update_history(self, history, results):
855+ """
856+ Update history with results even without state files
857+
858+ This method updates the history file used by britney to report test
859+ results, even if there is no state file. This allows replaying tests
860+ directly from jenkins without an explicit request from britney, which
861+ is useful when there is an infrastructure failure, for example.
862+
863+ We read the whole history and, for every test marked FAIL or RUNNING,
864+ check whether a newer version of the package has been tested: a FAIL
865+ is replaced by a newer successful run, and a RUNNING entry accepts
866+ either a newer PASS or a newer FAIL.
867+
868+ Args:
869+ history: Dictionary containing the historical results
870+ results: New results loaded from result files. Only results with
871+ all the archs will be processed
872+
873+ Returns:
874+ An updated history
875+
876+ """
877+
878+ def result_is_acceptable(history, tested):
879+ """
880+ Compare package/version in history to version of the packages that
881+ have been tested and return True if packages tested have a version
882+ >= versions in history
883+
884+ Args:
885+ history: History record
886+ tested: Tuple containing the result of the test and the
887+ packages that have been tested
888+ Returns:
889+ True if packages that have been tested are newer or equal to
890+ the version requested for the test
891+ """
892+ pkgs = history['causes']
893+ pkgs[history['package']] = history['version']
894+
895+ for pkg in pkgs:
896+ if pkg not in tested[1]:
897+ continue # Not a dependency anymore
898+ if apt_pkg.version_compare(pkgs[pkg], tested[1][pkg]) > 0:
899+ # Older version than requested -> discard
900+ return False
901+ return True
902+
903+ logging.debug("Reprocessing history")
904+ linenum = -1
905+ for hrow in history:
906+ linenum += 1
907+ if hrow['status']['all'] == 'PASS':
908+ continue
909+
910+ # Check if all the results are there for this result/package
911+ testres = 'UNK'
912+ # Flag that indicates if test results are available for all archs
913+ arch_seen = 0
914+ resrow = None
915+ for arch in ARCHS:
916+ if arch == 'all':
917+ continue
918+ # We need results for all the archs
919+ if hrow['package'] not in results[
920+ self.config['release']][arch]:
921+ break
922+ resrow = results[
923+ self.config['release']][arch][hrow['package']][-1]
924+ # Check if the package and causes have a version >= run in
925+ # history
926+ if not result_is_acceptable(hrow, resrow):
927+ break
928+
929+ if resrow[0] == 'FAIL':
930+ testres = 'FAIL'
931+ elif resrow[0] == 'PASS' and testres != 'FAIL':
932+ testres = 'PASS'
933+ arch_seen += 1
934+
935+ # At least one arch is missing or the result is unknown, skip
936+ if arch_seen < len(ARCHS) - 1 or testres == 'UNK':
937+ continue
938+
939+ # Update history with the latest result and versions
940+ if testres != 'UNK':
941+ history[linenum]['status']['all'] = testres
942+ if history[linenum]['package'] in resrow[1]:
943+ history[linenum]['version'] = resrow[1][
944+ history[linenum]['package']]
945+ else:
946+ logging.debug("package %s not found in tested packages",
947+ history[linenum]['package'])
948+
949+ for pkg in history[linenum]['causes']:
950+ if pkg in resrow[1]:
951+ history[linenum]['causes'][pkg] = resrow[1][pkg]
952+ else:
953+ logging.debug("cause %s not found in tested packages", pkg)
954+ return history
955+
956+ def cmd_collect(self):
957+ ''' Collect results from jenkins '''
958+ logging.debug('Cmd collect')
959+
960+ state_pattern = os.path.join(
961+ self.boottest_workdir, self.config['release'] + '*.state')
962+ result_pattern = os.path.join(
963+ self.boottest_workdir, self.config['release'] + '*.result')
964+
965+ if not self.__sync_results():
966+ return False
967+
968+ result_files = sorted(glob(result_pattern))
969+ state_files = sorted(glob(state_pattern))
970+
971+ # Load all the results in memory
972+ test_results = self.__load_results(result_files)
973+ all_results = self.__process_state_files(state_files, test_results)
974+
975+ # Update history file
976+ history_path = os.path.join(self.boottest_workdir, "results.history")
977+ logging.debug('Loading history from %s', history_path)
978+ history_res = load_results_history(history_path)
979+ history_all = merge_results(history_res, all_results)
980+ history_all = self.__update_history(history_all, test_results)
981+
982+ def resultdict2csv(results):
983+ lines = []
984+ for rec in results:
985+ lines.append(' '.join([rec['package'], rec['version'],
986+ rec['status']['all']] +
987+ [" ".join(x)
988+ for x in sorted(
989+ rec['causes'].iteritems())]))
990+ return lines
991+ lines = deduplicate_and_sort(resultdict2csv(history_all))
992+
993+ self.__write_result_file(lines, header=False, filename=history_path)
994+
995+ # Write new results
996+ if self.args.new_only:
997+ lines = resultdict2csv(all_results)
998+ self.__write_result_file(lines, header=False)
999+ else:
1000+ # Full history: print the lines again
1001+ self.__write_result_file(lines, header=False)
1002+
1003+ def cmd_retry(self):
1004+ """
1005+ Submit a test request for the list of packages given on the command line
1006+
1007+ Args:
1008+ None
1009+ Returns:
1010+ None
1011+ """
1012+ if self.args.force is not None:
1013+ # Read all the history and change the status of all the lines
1014+ # corresponding to the packages on the cmd line
1015+ history_path = os.path.join(self.boottest_workdir, "results.history")
1016+ lines = []
1017+ with open(history_path, 'r') as fhist:
1018+ for line in fhist:
1019+ linebits = line.strip().split()
1020+ for pkg in self.args.sourcepackages:
1021+ if pkg == linebits[0]:
1022+ logging.debug("Changing status of '%s' to '%s'",
1023+ pkg, self.args.force)
1024+ linebits[2] = self.args.force
1025+ lines.append(" ".join(linebits))
1026+ self.__write_result_file(lines, header=False,
1027+ filename=history_path)
1028+ return True
1029+ else:
1030+ self.cmd_submit(self.args.sourcepackages)
1031+
1032+ def __move_to_archive(self, source):
1033+ '''Move a file to the archive folder'''
1034+ self.__copy_to_archive(source, move=True)
1035+
1036+ def __copy_to_archive(self, source, move=False):
1037+ '''Copy a file to the archive folder, move it if move is True'''
1038+ num = 0
1039+ dest_dir = os.path.join(self.adt_archivedir, strftime('%Y/%m/%d'))
1040+ dest = os.path.join(dest_dir, os.path.basename(source))
1041+ if not os.path.exists(dest_dir):
1042+ os.makedirs(dest_dir)
1043+ tmpdest = dest
1044+ while os.path.exists(tmpdest):
1045+ num += 1
1046+ tmpdest = "{}.{}".format(dest, num)
1047+ dest = tmpdest
1048+ if move:
1049+ logging.debug('Moving %s -> %s', source, dest)
1050+ os.rename(source, dest)
1051+ else:
1052+ logging.debug('Copying %s -> %s', source, dest)
1053+ copyfile(source, dest)
1054+
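The date-partitioned archive layout used by `__copy_to_archive()` can be sketched as a small helper. The `exists` parameter is an assumption added here so the collision handling can be exercised without touching the filesystem; `unique_archive_dest` is a hypothetical name.

```python
import os
from time import strftime


def unique_archive_dest(archive_root, source, exists=os.path.exists):
    # Files are archived under <root>/YYYY/MM/DD/<basename>; name
    # collisions get a numeric suffix: file, file.1, file.2, ...
    dest_dir = os.path.join(archive_root, strftime('%Y/%m/%d'))
    dest = os.path.join(dest_dir, os.path.basename(source))
    num = 0
    candidate = dest
    while exists(candidate):
        num += 1
        candidate = "{}.{}".format(dest, num)
    return candidate
```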
1055+ def __sync_results(self):
1056+ ''' Update the latest results from remote host'''
1057+ logging.debug('Syncing latest results')
1058+ remote = os.path.join(self.config['rsync_host'],
1059+ self.config['release'], 'out/')
1060+ cmd = ['rsync', '-a', '--remove-source-files', remote,
1061+ self.boottest_workdir]
1062+ logging.debug('cmd: %s', cmd)
1063+ try:
1064+ subprocess.check_call(cmd)
1065+ except subprocess.CalledProcessError as exc:
1066+ logging.error('Command failed: %s', exc)
1067+ return False
1068+ return True
1069+
1070+def load_result(path):
1071+ ''' Load a result file '''
1072+ depends = []
1073+ with open(path, 'r') as f:
1074+ data = f.readline().split()
1075+ (release, arch, pkgname, pkgvers, result) = [data.pop(0) for x in
1076+ data[:5]]
1077+ i = iter(data)
1078+ depends = dict(zip(i, i))
1079+ depends[pkgname] = pkgvers
1080+
1081+ return (release, arch, pkgname, pkgvers, result, depends)
1082+
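For reference, `load_result()` expects a single-line file of the form `<release> <arch> <package> <version> <result> [<dep> <dep-version> ...]`. A minimal sketch of the same parsing (the sample values are hypothetical):

```python
def parse_result_line(line):
    data = line.split()
    release, arch, pkgname, pkgvers, result = data[:5]
    # Pair up the trailing "<dep> <dep-version>" tokens, as the
    # iter()/zip() trick in load_result() does
    it = iter(data[5:])
    depends = dict(zip(it, it))
    # The tested package itself is recorded among the dependencies
    depends[pkgname] = pkgvers
    return (release, arch, pkgname, pkgvers, result, depends)


rel, arc, pkg, ver, res, dep = parse_result_line(
    "vivid armhf libpng 1.2.51-0ubuntu3 PASS zlib1g 1.2.8")
```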
1083+
1084+def load_results_history(path):
1085+ """ Load history of results from path
1086+ """
1087+ depends = []
1088+ if not os.path.isfile(path):
1089+ logging.warning('File doesn\'t exist: %s', path)
1090+ return []
1091+ with open(path, 'r') as f:
1092+ lines = f.readlines()
1093+
1094+ results = []
1095+ for data in lines:
1096+ data = data.split()
1097+ (pkgname, pkgvers, result) = [data.pop(0) for x in data[:3]]
1098+ i = iter(data)
1099+ depends = dict(zip(i, i))
1100+ results.append({"package": pkgname,
1101+ "version": pkgvers,
1102+ "status": {"all": result},
1103+ "causes": depends,
1104+ "depends": {}})
1105+
1106+ return results
1107+
1108+
1109+def load_request_file(filename):
1110+ """Loads a request file with format
1111+ <package> <version> [<status> [<cause> <cause-version> ...]]
1112+ """
1113+
1114+ reqfile = filename
1115+ pkglist = {}
1116+
1117+ if not reqfile:
1118+ return pkglist
1119+
1120+ if not os.path.isfile(filename):
1121+ logging.error('File doesn\'t exist (%s), exiting!', filename)
1122+ sys.exit(E_CONFIG)
1123+
1124+ with open(filename, 'r') as fdh:
1125+ content = fdh.readlines()
1126+
1127+ header = []
1128+ for line in content:
1129+ # Header
1130+ if line.startswith('Suite:') or line.startswith('Date:'):
1131+ header.append(line.strip())
1132+ continue
1133+
1134+ # Packages
1135+ linebits = line.strip().split()
1136+ if len(linebits) < 2:
1137+ logging.warning('Invalid line format: %s, skipped',
1138+ line.strip())
1139+ continue
1140+ pkgname = linebits.pop(0)
1141+ pkgvers = linebits.pop(0)
1142+ pkglist[pkgname] = {
1143+ 'version': pkgvers,
1144+ 'status': 'NEW',
1145+ 'causes': {}
1146+ }
1147+
1148+ try:
1149+ pkglist[pkgname]['status'] = linebits.pop(0).upper()
1150+ while True:
1151+ cname = linebits.pop(0)
1152+ cvers = linebits.pop(0)
1153+ pkglist[pkgname]['causes'][cname] = cvers
1154+ except IndexError:
1155+ # End of the list
1156+ pass
1157+
1158+ logging.debug('Loaded request file:\n%s', pkglist)
1159+
1160+ return (header, pkglist)
1161+
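Per line, the parsing above can be condensed into a small helper. This is a sketch with hypothetical sample values and a hypothetical name `parse_request_line`; as in `load_request_file()`, the status defaults to NEW and a trailing unpaired token is dropped.

```python
def parse_request_line(line):
    bits = line.strip().split()
    if len(bits) < 2:
        return None  # invalid line, skipped by the caller
    name = bits[0]
    entry = {'version': bits[1], 'status': 'NEW', 'causes': {}}
    try:
        entry['status'] = bits[2].upper()
        # Remaining tokens are "<cause> <cause-version>" pairs
        it = iter(bits[3:])
        entry['causes'] = dict(zip(it, it))
    except IndexError:
        pass  # no status given, keep the NEW default
    return name, entry


name, entry = parse_request_line("libpng 1.2.51-0ubuntu3 running zlib1g 1.2.8")
```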
1162+
1163+def merge_results(results_orig, results_new):
1164+ """ Returns history+new result and merges the results when packages and
1165+ causes of the test are identical and versions are greater or equal
1166+ """
1167+ flattened = {}
1168+ for result in results_orig:
1169+ # Create a key made of the package names to make finding same set of
1170+ # package/causes easier
1171+ key = result['package'] + "".join(result['causes'].keys())
1172+ flattened[key] = result
1173+ for newres in results_new:
1174+ key = newres['package'] + "".join(newres['causes'].keys())
1175+ if key in flattened:
1176+ # Check if versions of package and causes are equal or higher
1177+ # than previous run
1178+ if (apt_pkg.version_compare(
1179+ flattened[key]['version'], newres['version']) <= 0):
1180+ alldepsge = True
1181+ for pkg, vers in flattened[key]['causes'].iteritems():
1182+ if not (pkg in newres['causes'] and
1183+ apt_pkg.version_compare(
1184+ vers, newres['causes'][pkg]) <= 0):
1185+ alldepsge = False
1186+ if alldepsge:
1187+ # Replace previous result
1188+ flattened[key] = newres
1189+ else:
1190+ # New package/deps
1191+ flattened[key] = newres
1192+ return flattened.values()
1193+
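The replace-only-if-newer rule in `merge_results` can be sketched in Python 3 with a simplified comparator. Assumptions: `version_compare` here handles only dotted integers (real Debian ordering via `apt_pkg.version_compare` is richer, with epochs and revisions), and cause keys are sorted for a stable key; `merge_results_sketch` is a hypothetical name.

```python
def version_compare(a, b):
    # Simplistic dotted-integer comparison standing in for
    # apt_pkg.version_compare (assumption: no epochs or revisions)
    av = [int(x) for x in a.split('.')]
    bv = [int(x) for x in b.split('.')]
    return (av > bv) - (av < bv)


def merge_results_sketch(results_orig, results_new):
    flattened = {}
    for res in results_orig:
        flattened[res['package'] + "".join(sorted(res['causes']))] = res
    for res in results_new:
        key = res['package'] + "".join(sorted(res['causes']))
        prev = flattened.get(key)
        if prev is None:
            flattened[key] = res  # new package/deps combination
            continue
        # Replace only if the package and every cause are at versions
        # greater than or equal to the previous run
        newer = version_compare(prev['version'], res['version']) <= 0 and all(
            pkg in res['causes'] and
            version_compare(vers, res['causes'][pkg]) <= 0
            for pkg, vers in prev['causes'].items())
        if newer:
            flattened[key] = res
    return list(flattened.values())
```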
1194+
1195+def deduplicate_and_sort(history_lines):
1196+ """ Remove duplicate, split list of dependency and sort by package name and
1197+ version
1198+ """
1199+ results = RecursiveDict()
1200+ lines = []
1201+ for line in history_lines:
1202+ line = line.split()
1203+ try:
1204+ (pname, pversion, result) = line[:3]
1205+ i = iter(line[3:])
1206+ causes = dict(zip(i, i))
1207+ except ValueError:
1208+ print('E: Invalid record: %s' % line)
1209+ continue
1210+ for cname, cversion in causes.iteritems():
1211+ if not results[pname][pversion][cname][cversion]:
1212+ results[pname][pversion][cname][cversion] = result
1213+ else:
1214+ if results[pname][pversion][cname][cversion] == 'PASS':
1215+ # If it passed once it cannot fail with the same version
1216+ continue
1217+ elif result == 'PASS':
1218+ # If it failed then pass, accept the pass. Failure might
1219+ # be caused by another dependency or an infra error
1220+ results[pname][pversion][cname][cversion] = 'PASS'
1221+
1222+ for pname in sorted(results):
1223+ pversions = sorted(results[pname].keys(), cmp=apt_pkg.version_compare)
1224+ for pversion in pversions:
1225+ for cname in sorted(results[pname][pversion]):
1226+ for cversion in sorted(results[pname][pversion][cname].keys(),
1227+ cmp=apt_pkg.version_compare):
1228+ lines.append(" ".join(
1229+ [pname, pversion,
1230+ results[pname][pversion][cname][cversion], cname,
1231+ cversion]))
1232+ return lines
1233+
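`RecursiveDict` is defined elsewhere in this project; judging by how it is indexed above, an autovivifying dict is a plausible minimal stand-in (an assumption). The PASS-beats-FAIL rule for identical versions then looks like:

```python
from collections import defaultdict


def recursive_dict():
    # Assumed stand-in for RecursiveDict: indexing a missing key
    # autovivifies a new nested level
    return defaultdict(recursive_dict)


results = recursive_dict()
results['libpng']['1.2.51']['zlib1g']['1.2.8'] = 'FAIL'
# A later PASS for the same versions wins: the earlier failure may have
# been caused by another dependency or an infrastructure error
if results['libpng']['1.2.51']['zlib1g']['1.2.8'] != 'PASS':
    results['libpng']['1.2.51']['zlib1g']['1.2.8'] = 'PASS'
```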
1234+
1235+def set_logging(debugmode=False):
1236+ """Initialize logging"""
1237+ logging.basicConfig(
1238+ level=logging.DEBUG if debugmode else logging.INFO,
1239+ format="%(asctime)s %(levelname)s %(message)s")
1240+ logging.debug('Debug mode enabled')
1241+
1242+
1243+if __name__ == "__main__":
1244+ BoottestBritney()
1245
1246=== added file 'jenkins/jenkins_boottest_config.xml.tmpl'
1247--- jenkins/jenkins_boottest_config.xml.tmpl 1970-01-01 00:00:00 +0000
1248+++ jenkins/jenkins_boottest_config.xml.tmpl 2015-02-26 11:17:00 +0000
1249@@ -0,0 +1,79 @@
1250+<?xml version='1.0' encoding='UTF-8'?>
1251+<project>
1252+ <actions/>
1253+ <description>boottest test - {{ test }}&lt;br/&gt;&#xd;
1254+Ref: lp:auto-package-testing&lt;br/&gt;&#xd;
1255+Owner: Ubuntu CI Team&lt;br/&gt;&#xd;
1256+ </description>
1257+ <logRotator>
1258+ <daysToKeep>-1</daysToKeep>
1259+ <numToKeep>20</numToKeep>
1260+ <artifactDaysToKeep>-1</artifactDaysToKeep>
1261+ <artifactNumToKeep>-1</artifactNumToKeep>
1262+ </logRotator>
1263+ <keepDependencies>false</keepDependencies>
1264+ <properties>
1265+ <hudson.queueSorter.PrioritySorterJobProperty>
1266+ <priority>100</priority>
1267+ </hudson.queueSorter.PrioritySorterJobProperty>
1268+ <hudson.plugins.throttleconcurrents.ThrottleJobProperty>
1269+ <maxConcurrentPerNode>0</maxConcurrentPerNode>
1270+ <maxConcurrentTotal>0</maxConcurrentTotal>
1271+ <categories/>
1272+ <throttleEnabled>false</throttleEnabled>
1273+ <throttleOption>project</throttleOption>
1274+ <configVersion>1</configVersion>
1275+ </hudson.plugins.throttleconcurrents.ThrottleJobProperty>
1276+ </properties>
1277+ <scm class="hudson.scm.NullSCM"/>
1278+ <assignedNode>boottest&amp;&amp;krillin</assignedNode>
1279+ <canRoam>false</canRoam>
1280+ <disabled>false</disabled>
1281+ <blockBuildWhenDownstreamBuilding>false</blockBuildWhenDownstreamBuilding>
1282+ <blockBuildWhenUpstreamBuilding>false</blockBuildWhenUpstreamBuilding>
1283+ <authToken>crobEeshEms3</authToken>
1284+ <triggers class="vector"/>
1285+ <concurrentBuild>false</concurrentBuild>
1286+ <builders>
1287+ <hudson.tasks.Shell>
1288+ <command>#!/bin/bash -x
1289+set +e
1290+rm -Rf *
1291+
1292+rm -r test-runner || true
1293+setup_branch=&quot;lp:ubuntu-test-cases/touch&quot;
1294+bzr_cmd=&quot;bzr branch $setup_branch test-runner&quot;
1295+$bzr_cmd || $bzr_cmd || $bzr_cmd
1296+
1297+
1298+annotate-output test-runner/scripts/boottest.sh {{ release }} {{ test }} ${NODE_NAME}
1299+</command>
1300+ </hudson.tasks.Shell>
1301+ </builders>
1302+ <publishers>
1303+ <hudson.tasks.ArtifactArchiver>
1304+ <artifacts>results/*stderr,results/*stdout,results/*log,results/log,results/*-packages,results/testpkg-version,results/*.crash,results/testresults/*.xml,results/syslog</artifacts>
1305+ <latestOnly>false</latestOnly>
1306+ </hudson.tasks.ArtifactArchiver>
1307+ <hudson.plugins.descriptionsetter.DescriptionSetterPublisher>
1308+ <regexp>\sO:\sGet:[0-9]+\s\S+\s(\S+)\s(\S+)\s(\S+)\s\(dsc\)</regexp>
1309+ <regexpForFailed>\sO:\sGet:[0-9]+\s\S+\s(\S+)\s(\S+)\s(\S+)\s\(dsc\)</regexpForFailed>
1310+ <description>\1 \3</description>
1311+ <descriptionForFailed>\1 \3</descriptionForFailed>
1312+ <setForMatrix>false</setForMatrix>
1313+ </hudson.plugins.descriptionsetter.DescriptionSetterPublisher>
1314+ <hudson.plugins.build__publisher.BuildPublisher>
1315+ <publishUnstableBuilds>true</publishUnstableBuilds>
1316+ <publishFailedBuilds>true</publishFailedBuilds>
1317+ <postActions class="vector"/>
1318+ </hudson.plugins.build__publisher.BuildPublisher>
1319+ </publishers>
1320+ <buildWrappers>
1321+ <hudson.plugins.build__timeout.BuildTimeoutWrapper>
1322+ <strategy class="hudson.plugins.build_timeout.impl.AbsoluteTimeOutStrategy">
1323+ <timeoutMinutes>30</timeoutMinutes>
1324+ </strategy>
1325+ <operationList/>
1326+ </hudson.plugins.build__timeout.BuildTimeoutWrapper>
1327+ </buildWrappers>
1328+</project>
1329
1330=== added file 'jenkins/run-boottest-jenkins.py'
1331--- jenkins/run-boottest-jenkins.py 1970-01-01 00:00:00 +0000
1332+++ jenkins/run-boottest-jenkins.py 2015-02-26 11:17:00 +0000
1333@@ -0,0 +1,197 @@
1334+#!/usr/bin/env python
1335+
1336+# Copyright 2015 Canonical Ltd.
1337+# This program is free software: you can redistribute it and/or modify it
1338+# under the terms of the GNU Affero General Public License version 3, as
1339+# published by the Free Software Foundation.
1340+
1341+# This program is distributed in the hope that it will be useful, but
1342+# WITHOUT ANY WARRANTY; without even the implied warranties of
1343+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
1344+# PURPOSE. See the GNU Affero General Public License for more details.
1345+
1346+# You should have received a copy of the GNU Affero General Public License
1347+# along with this program. If not, see <http://www.gnu.org/licenses/>.
1348+
1349+import os
1350+import distro_info
1351+import logging
1352+import datetime
1353+import subprocess
1354+import sys
1355+
1356+from os import listdir
1357+from os.path import isfile, join
1358+
1359+DATAROOT = '/var/local/boottest/'
1360+APTROOT = '/var/lib/jenkins/CI/boottest/aptroot'
1361+LOCKFILE = os.path.join('/var/lock',
1362+ '{}.lock'.format(os.path.basename(sys.argv[0])))
1363+CWD = os.path.dirname(os.path.realpath(__file__))
1364+
1365+
1366+def logger_inst(logdir, name):
1367+ logger = logging.getLogger(name)
1368+ logger.setLevel(logging.DEBUG)
1369+ file_name = os.path.join(logdir, name)
1370+ if not logger.handlers:
1371+ handler = logging.FileHandler(file_name)
1372+ formatter = logging.Formatter('%(asctime)s'
1373+ ' %(levelname)s: %(message)s')
1374+ handler.setFormatter(formatter)
1375+ handler.setLevel(logging.DEBUG)
1376+ logger.addHandler(handler)
1377+
1378+ return logger
1379+
1380+
1381+def exit_handler():
1382+ if os.path.exists(os.path.abspath(LOCKFILE)):
1383+ os.remove(os.path.abspath(LOCKFILE))
1384+
1385+
1386+def submit_job(release, statefile):
1387+ statefile = os.path.abspath(statefile)
1388+ cmd = ['python',
1389+ os.path.join(CWD, 'adt-jenkins'),
1390+ '-dbP',
1391+ '-c', os.path.expanduser('~/CI/boottest-britney.rc'),
1392+ '-J', os.path.expanduser('~/.boottest.credentials'),
1393+ '-r', release,
1394+ '--submit', statefile]
1395+ proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
1396+ stderr=subprocess.PIPE)
1397+ out, err = proc.communicate()
1398+ return proc, out, err
1399+
1400+
1401+def copy_results(log, release):
1402+ log.info('Preparing the results to be fetched by Britney')
1403+
1404+ tmpdir = os.path.join(DATAROOT, release, 'tmp')
1405+ outdir = os.path.join(DATAROOT, release, 'out')
1406+ updcmd = ['python',
1407+ os.path.join(CWD, '../bin/testbed/adt-export-result')]
1408+ log.info('Moving results files')
1409+
1410+ resultfiles = [f for f in listdir(tmpdir)
1411+ if (isfile(join(tmpdir, f))
1412+ and '.result' in f)]
1413+ for resfile in resultfiles:
1414+ try:
1415+ results = open(os.path.join(tmpdir, resfile)).read().split()
1416+ rel = results[0]
1417+ arc = results[1]
1418+ pkg = results[2]
1419+ res = results[4]
1420+ updcmd += ['-r', rel, '-a', arc,
1421+ '--rootdir={}/{}-proposed/amd64/'.format(APTROOT, rel),
1422+ '-D', outdir, pkg, res]
1423+ proc = subprocess.Popen(updcmd, stdout=subprocess.PIPE,
1424+ stderr=subprocess.PIPE)
1425+ out, err = proc.communicate()
1426+ if proc.returncode:
1427+ msg = ('Updating boottest results on {} package failed'
1428+ ' with stdout:\n{}\n stderr:\n{}\n'.format(pkg,
1429+ out,
1430+ err))
1431+ log.error(msg)
1432+ os.remove(os.path.join(tmpdir, resfile))
1433+ except (OSError, ValueError):
1434+ log.error('Moving result file from temp to out failed')
1435+ raise
1436+
1437+ errorfiles = [f for f in listdir(tmpdir)
1438+ if (isfile(join(tmpdir, f))
1439+ and '.error' in f)]
1440+ for errfile in errorfiles:
1441+ try:
1442+ errors = open(os.path.join(tmpdir, errfile)).read().split()
1443+ rel = errors[0]
1444+ arc = errors[1]
1445+ pkg = errors[2]
1446+ log.error('Errors found for {} {} {}'.format(rel, arc, pkg))
1447+ updcmd += ['-r', rel, '-a', arc,
1448+ '--rootdir={}/{}-proposed/amd64/'.format(APTROOT, rel),
1449+ '-D', outdir, pkg, 'FAIL']
1450+ proc = subprocess.Popen(updcmd, stdout=subprocess.PIPE,
1451+ stderr=subprocess.PIPE)
1452+ out, err = proc.communicate()
1453+ if proc.returncode:
1454+ msg = ('Updating boottest results on {} package failed'
1455+ ' with stdout:\n{}\n stderr:\n{}\n'.format(pkg,
1456+ out,
1457+ err))
1458+ log.error(msg)
1459+ os.remove(os.path.join(tmpdir, errfile))
1460+ except (OSError, ValueError) as e:
1461+ log.error('Moving error file from temp to out failed')
1462+ raise e
1463+
1464+
1465+def lock_script():
1466+ pid = str(os.getpid())
1467+ if os.path.exists(os.path.abspath(LOCKFILE)):
1468+ msg = ("The lock file {} is present. "
1469+ "Check for any stale runs".format(os.path.abspath(LOCKFILE)))
1470+ sys.exit(msg)
1471+
1472+ with open(os.path.abspath(LOCKFILE), 'w') as lock:
1473+ lock.write(pid)
1474+
1475+
1476+def run_cron():
1477+ try:
1478+ releases = [distro_info.UbuntuDistroInfo().devel(), ]
1479+ except Exception:
1480+ msg = ('No devel release found. `distro-info-data` could be outdated')
1481+ sys.exit(msg)
1482+
1483+ lock_script()
1484+
1485+ for release in releases:
1486+ logdir = os.path.join(os.path.expanduser('~/CI/logs/'), release)
1487+ indir = os.path.join(DATAROOT, release, 'in')
1488+ tmpdir = os.path.join(DATAROOT, release, 'tmp')
1489+ outdir = os.path.join(DATAROOT, release, 'out')
1490+
1491+ try:
1492+ for dirs in [logdir, indir, tmpdir, outdir]:
1493+ if not os.path.exists(dirs):
1494+ os.makedirs(dirs)
1495+ except OSError:
1496+ raise
1497+
1498+ today = datetime.date.today().strftime("%Y%m%d")
1499+ name = 'run-jenkins.{}.{}.log'.format(release, today)
1500+ log = logger_inst(logdir, name)
1501+ log.info("Checking for incoming requests")
1502+
1503+ statefiles = [f for f in listdir(indir)
1504+ if (isfile(join(indir, f))
1505+ and '{}-propose'.format(release) in f)]
1506+
1507+ for statefile in statefiles:
1508+ proc, out, err = submit_job(release,
1509+ os.path.join(indir, statefile))
1510+ if proc.returncode:
1511+ msg = ('Triggering boottest on {} release failed'
1512+ ' with stdout:\n{}\n stderr:\n{}\n'.format(release,
1513+ out,
1514+ err))
1515+ log.error(msg)
1516+ log.info(err)
1517+
1518+ copy_results(log, release)
1519+ log.info("===== Done =====")
1520+
1521+
1522+def main():
1523+ try:
1524+ run_cron()
1525+ finally:
1526+ exit_handler()
1527+
1528+
1529+if __name__ == '__main__':
1530+ main()

Subscribers

People subscribed via source and target branches