Merge lp:~fginther/ubuntu-ci-services-itself/bsb-dput-3 into lp:ubuntu-ci-services-itself

Proposed by Francis Ginther
Status: Merged
Approved by: Francis Ginther
Approved revision: 195
Merged at revision: 233
Proposed branch: lp:~fginther/ubuntu-ci-services-itself/bsb-dput-3
Merge into: lp:ubuntu-ci-services-itself
Diff against target: 2431 lines (+2300/-16)
16 files modified
branch-source-builder/cupstream2distro/branchhandling.py (+283/-0)
branch-source-builder/cupstream2distro/launchpadmanager.py (+157/-0)
branch-source-builder/cupstream2distro/packageinppa.py (+225/-0)
branch-source-builder/cupstream2distro/packageinppamanager.py (+109/-0)
branch-source-builder/cupstream2distro/packagemanager.py (+504/-0)
branch-source-builder/cupstream2distro/settings.py (+121/-0)
branch-source-builder/cupstream2distro/silomanager.py (+115/-0)
branch-source-builder/cupstream2distro/stack.py (+204/-0)
branch-source-builder/cupstream2distro/stacks.py (+89/-0)
branch-source-builder/cupstream2distro/tools.py (+94/-0)
branch-source-builder/cupstream2distro/utils.py (+29/-0)
branch-source-builder/run_worker (+48/-9)
branch-source-builder/upload_package.py (+137/-0)
branch-source-builder/watch_ppa.py (+182/-0)
charms/precise/lander-jenkins/templates/jobs/lander_master.xml (+0/-7)
juju-deployer/branch-source-builder.yaml (+3/-0)
To merge this branch: bzr merge lp:~fginther/ubuntu-ci-services-itself/bsb-dput-3
Reviewer Review Type Date Requested Status
Andy Doan (community) Approve
PS Jenkins bot (community) continuous-integration Approve
Francis Ginther Needs Resubmitting
Review via email: mp+205851@code.launchpad.net

Commit message

Adds the dput and ppa_watch functionality to the branch source builder. This is very raw; most of the functionality was pulled directly from lp:cupstream2distro, with small changes for the input parameters and the deployment environment.

Description of the change

Adds the dput and ppa_watch functionality to the branch source builder. This is very raw; most of the functionality was pulled directly from lp:cupstream2distro, with small changes for the input parameters and the deployment environment.

The series and archive PPA are currently hard-coded. A subsequent MP will remove this.

You'll also notice there are no tests here; the cupstream2distro code has no tests for these modules either. I need to address this. For now, I'm banking on the fact that this code is already exercised by the daily-release process.

Revision history for this message
Francis Ginther (fginther) wrote :

This is a follow up to bsb-dput-2. I screwed up the branch and had to resubmit the whole thing.

> I'm skipping the cupstream2distro code since it's mostly copy/paste.
>
> = run-worker:
>
> 2050 + if amqp_utils.progress_completed(trigger):
> 2051 + log.error('Unable to notify progress-trigger completion of action')
>
> the progress_completed function already logs an error, so you really
> only need to check the return code for it if you are going to handle
> failure in some special way. Since we don't here, I'd delete the
> log.error line.

Fixed.

> You also need to supply a payload to the "progress_completed" call. The
> data returned from this function winds up being the data that will be
> passed to the image builder. I think it needs to contain:
>
> {
> "ppa": "<foo>"
> }
>
> so we can remove the hack from:
>
> <http://bazaar.launchpad.net/~canonical-ci-engineering/ubuntu-ci-services-
> itself/trunk/view/head:/charms/precise/lander-
> jenkins/templates/jobs/lander_master.xml#L170>

Fixed, good catch on removing the hack.
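The payload shape Andy describes can be sketched as follows. Note that `amqp_utils` is project-internal, so the helper name and serialization below are assumptions for illustration, not the actual API:

```python
import json

# Hypothetical sketch (not the real amqp_utils API): build the completion
# payload handed back with progress_completed(), so the image builder
# learns which PPA to consume instead of relying on the lander_master hack.
def build_completion_payload(ppa_name):
    '''Serialize the data forwarded on progress-trigger completion.'''
    return json.dumps({'ppa': ppa_name})

payload = build_completion_payload('ppa:ci-engineering-airline/ci-archive')
```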

> 2052 + except EnvironmentError as e:
>
> I think I'd just catch "Exception" here. That way nothing slips through
> in a way where we won't call amqp_utils.progress_failed

Fixed

> = upload_package:
>
> 2141 + file_name = source.split('/')[-1]
>
> I think you can just os.path.basename to make it more obvious what your
> intent is:
>
> os.path.basename(source)

Fixed. I also removed the hack that stripped the container name from the filename (the data-store module has been updated to just use the filename).
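The suggested `os.path.basename` swap behaves like this (the example path is invented):

```python
import os.path

# os.path.basename states the intent directly and, unlike split('/')[-1],
# reads as "take the file name" rather than as ad-hoc string slicing.
source = 'some-container/foo_1.0-0ubuntu1.dsc'
file_name = os.path.basename(source)   # replaces source.split('/')[-1]
```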

> 2186 + logging.basicConfig(level=logging.INFO, format="%(asctime)s
> %(levelname)s "
> 2187 + "%(message)s")
>
> I think that should be moved to "main" instead. Otherwise you might just
> override the logging config of the process that's using this API.

Fixed, also fixed in watch_ppa.py

review: Needs Resubmitting
Revision history for this message
PS Jenkins bot (ps-jenkins) wrote :

PASSED: Continuous integration, rev:195
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/142/
Executed test runs:


review: Approve (continuous-integration)
Revision history for this message
Andy Doan (doanac) wrote :

the most exciting thing i've seen today!

I say we merge this and then start testing it hard. I noticed one small thing:

2035 + archive_ppa = 'ppa:ci-engineering-airline/ci-archive'
2036 + series = 'saucy'

once this is merged into trunk, I think you can pick up some work Chris added today:

  http://bazaar.launchpad.net/~canonical-ci-engineering/ubuntu-ci-services-itself/trunk/revision/231

Revision history for this message
Francis Ginther (fginther) wrote :

> the most exciting thing i've seen today!
>
> I say we merge this and then start testing it hard. I noticed one small thing:
>
> 2035 + archive_ppa = 'ppa:ci-engineering-airline/ci-archive'
> 2036 + series = 'saucy'
>
> once this is merged into trunk, I think you can pick up some work Chris added
> today:
>
> http://bazaar.launchpad.net/~canonical-ci-engineering/ubuntu-ci-services-
> itself/trunk/revision/231

Right, I also noticed the ticket system change today, but didn't want to further delay this MP.
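A minimal sketch of how the hard-coded values could later be lifted into the worker's input; the key names here are assumptions for illustration, not the real ticket schema:

```python
# Hypothetical sketch: read series/PPA from the request parameters instead
# of hard-coding them, falling back to today's defaults (key names assumed).
DEFAULT_SERIES = 'saucy'
DEFAULT_PPA = 'ppa:ci-engineering-airline/ci-archive'


def resolve_target(params):
    '''Pick the series and archive PPA from the request, with fallbacks.'''
    return (params.get('series', DEFAULT_SERIES),
            params.get('archive_ppa', DEFAULT_PPA))
```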

Revision history for this message
Andy Doan (doanac) :
review: Approve

Preview Diff

=== added directory 'branch-source-builder/cupstream2distro'
=== added file 'branch-source-builder/cupstream2distro/__init__.py'
=== added file 'branch-source-builder/cupstream2distro/branchhandling.py'
--- branch-source-builder/cupstream2distro/branchhandling.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/branchhandling.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,283 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20import ConfigParser
21from collections import defaultdict
22import logging
23import os
24import re
25import subprocess
26
27from .settings import BRANCH_URL, IGNORECHANGELOG_COMMIT, PACKAGING_MERGE_COMMIT_MESSAGE, PROJECT_CONFIG_SUFFIX, SILO_PACKAGING_RELEASE_COMMIT_MESSAGE
28
29
30def get_branch(branch_url, dest_dir):
31 '''Grab a branch'''
32 instance = subprocess.Popen(["bzr", "branch", branch_url, dest_dir], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
33 (stdout, stderr) = instance.communicate()
34 if instance.returncode != 0:
35 raise Exception(stderr.decode("utf-8").strip())
36
37
38def get_tip_bzr_revision():
39 '''Get latest revision in bzr'''
40 instance = subprocess.Popen(["bzr", "log", "-c", "-1", "--line"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
41 (stdout, stderr) = instance.communicate()
42 if instance.returncode != 0:
43 raise Exception(stderr.decode("utf-8").strip())
44 return (int(stdout.split(':')[0]))
45
46
47def collect_author_commits(content_to_parse, bugs_to_skip, additional_stamp=""):
48 '''return a tuple of a dict with authors and commits message from the content to parse
49
50 bugs_to_skip is a set of bugs we need to skip
51
52 Form: ({Author: [commit_message]}, set(bugs))'''
53
54 author_commit = defaultdict(list)
55 all_bugs = set()
56
57 current_authors = set()
58 current_commit = ""
59 current_bugs = set()
60 commit_message_stenza = False
61 for line in content_to_parse.splitlines():
62 # new revision, collect what we have found
63 if line.startswith("------------------------------------------------------------"):
64 # try to decipher a special case: we have some commits which were already in bugs_to_skip,
65 # so we eliminate them.
66 # Also ignore when having IGNORECHANGELOG_COMMIT
67 if (current_bugs and not (current_bugs - bugs_to_skip)) or IGNORECHANGELOG_COMMIT in current_commit:
68 current_authors = set()
69 current_commit = ""
70 current_bugs = set()
71 continue
72 current_bugs -= bugs_to_skip
73 commit_message = current_commit + _format_bugs(current_bugs)
74 for author in current_authors:
75 if not author.startswith("Launchpad "):
76 author_commit[author].append(commit_message)
77 all_bugs = all_bugs.union(current_bugs)
78 current_authors = set()
79 current_commit = ""
80 current_bugs = set()
81
82 # we ignore this commit if we have a changelog provided as part of the diff
83 if line.startswith("=== modified file 'debian/changelog'"):
84 current_authors = set()
85 current_commit = ""
86 current_bugs = set()
87
88 if line.startswith("author: "):
89 current_authors = _extract_authors(line[8:])
90 # if direct commit to trunk
91 elif not current_authors and line.startswith("committer: "):
92 current_authors = _extract_authors(line[11:])
93 # file the commit message log
94 elif commit_message_stenza:
95 if line.startswith("diff:"):
96 commit_message_stenza = False
97 current_commit, current_bugs = _extract_commit_bugs(current_commit, additional_stamp)
98 else:
99 line = line[2:] # Dedent the message provided by bzr
100 if line[0:2] in ('* ', '- '): # paragraph line.
101 line = line[2:] # Remove bullet
102 if line[-1] != '.': # Grammar nazi...
103 line += '.' # ... or the lines will be merged.
104 line = line + ' ' # Add a space to preserve lines
105 current_commit += line
106 # Maybe add something like that
107 #for content in mp.commit_message.split('\n'):
108 # content = content.strip()
109 # if content.startswith('-') or content.startswith('*'):
110 # content = content[1:]
111 # description_content.append(content.strip())
112 elif line.startswith("message:"):
113 commit_message_stenza = True
114 elif line.startswith("fixes bug: "):
115 current_bugs = current_bugs.union(_return_bugs(line[11:]))
116
117 return (dict(author_commit), all_bugs)
118
119
120def _format_bugs(bugs):
121 '''Format a list of launchpad bugs.'''
122 if bugs:
123 msg = ' (LP: {})'.format(', '.join(['#{}'.format(b) for b in bugs]))
124 else:
125 msg = ''
126 return msg
127
128
129def _extract_commit_bugs(commit_message, additional_stamp=""):
130 '''extract relevant commit message part and bugs number from a commit message'''
131
132 current_bugs = _return_bugs(commit_message)
133 changelog_content = " ".join(commit_message.rsplit('Fixes: ')[0].rsplit('Approved by ')[0].split())
134 if additional_stamp:
135 changelog_content = changelog_content + " " + additional_stamp
136 return (changelog_content, current_bugs)
137
138
139def _return_bugs(string):
140 '''return a set of bugs from string'''
141
142 # we are trying to match in the commit message:
143 # bug #12345, bug#12345, bug12345, bug 12345
144 # lp: #12345, lp:#12345, lp:12345, lp: 12345
145 # lp #12345, lp#12345, lp12345, lp 12345,
146 # Fix #12345, Fix 12345, Fix: 12345, Fix12345, Fix: #12345,
147 # Fixes #12345, Fixes 12345, Fixes: 12345, Fixes:12345, Fixes: #12345
148 # *launchpad.net/bugs/1234567890 and
149 # #12345 (but not 12345 for false positive)
150 # Support multiple bugs per commit
151 bug_numbers = set()
152 bug_regexp = re.compile("((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})", re.IGNORECASE)
153 for match in bug_regexp.findall(string):
154 logging.debug("Bug regexp match: {}".format(match[-1]))
155 bug_numbers.add(int(match[-1]))
156 return bug_numbers
157
158
159def _extract_authors(string):
160    '''return a set of authors from string, ignoring emails'''
161
162 authors = set()
163 for author_with_mail in string.split(", "):
164 author = author_with_mail.rsplit(' <')[0]
165 logging.debug("Found {} as author".format(author))
166 authors.add(author)
167 return authors
168
169
170def return_log_diff(starting_rev):
171    '''Return the relevant part of the bzr log since starting_rev'''
172
173 instance = subprocess.Popen(["bzr", "log", "-r", "{}..".format(starting_rev), "--show-diff", "--forward"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
174 (stdout, stderr) = instance.communicate()
175 if instance.returncode != 0:
176 raise Exception(stderr.decode("utf-8").strip())
177 return stdout
178
179
180def return_log_diff_since_last_release(content_to_parse):
181 '''From a bzr log content, return only the log diff since the latest release'''
182 after_release = content_to_parse.split(SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(''))[-1]
183 sep = '------------------------------------------------------------'
184 sep_index = after_release.find(sep)
185    if sep_index != -1:
186 after_release = after_release[sep_index:]
187 return after_release
188
189
190def commit_release(new_package_version, tip_bzr_rev=None):
191 '''Commit latest release'''
192 if not tip_bzr_rev:
193 message = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(new_package_version)
194 else:
195 message = "Releasing {}, based on r{}".format(new_package_version, tip_bzr_rev)
196 if subprocess.call(["bzr", "commit", "-m", message]) != 0:
197 raise Exception("The above command returned an error.")
198
199
200def _get_parent_branch(source_package_name):
201 '''Get parent branch from config'''
202 config = ConfigParser.RawConfigParser()
203 config.read("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX))
204 return config.get('Branch', 'branch')
205
206
207def propose_branch_for_merging(source_package_name, version, tip_rev, branch):
208 '''Propose and commit a branch upstream'''
209
210 parent_branch = _get_parent_branch(source_package_name)
211 # suppress browser opening
212 env = os.environ.copy()
213 env["BROWSER"] = "echo"
214 env["BZR_EDITOR"] = "echo"
215
216 os.chdir(source_package_name)
217 if subprocess.call(["bzr", "push", BRANCH_URL.format(source_package_name, version.replace("~", "").replace(":", "")), "--overwrite"]) != 0:
218 raise Exception("The push command returned an error.")
219 mergeinstance = subprocess.Popen(["bzr", "lp-propose-merge", parent_branch, "-m", PACKAGING_MERGE_COMMIT_MESSAGE.format(version, tip_rev, branch), "--approve"], stdin=subprocess.PIPE, env=env)
220 mergeinstance.communicate(input="y")
221 if mergeinstance.returncode != 0:
222 raise Exception("The lp-propose command returned an error.")
223 os.chdir('..')
224
225
226def merge_branch_with_parent_into(local_branch_uri, lp_parent_branch, dest_uri, commit_message, revision):
227 """Merge local branch into lp_parent_branch at revision"""
228 success = False
229 cur_dir = os.path.abspath('.')
230 subprocess.call(["bzr", "branch", "-r", str(revision), lp_parent_branch, dest_uri])
231 os.chdir(dest_uri)
232 if subprocess.call(["bzr", "merge", local_branch_uri]) == 0:
233 subprocess.call(["bzr", "commit", "-m", commit_message])
234 success = True
235 os.chdir(cur_dir)
236 return success
237
238
239def merge_branch(uri_to_merge, lp_parent_branch, commit_message, authors=set()):
240 """Resync with targeted branch if possible"""
241 success = False
242 cur_dir = os.path.abspath('.')
243 os.chdir(uri_to_merge)
244 lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:")
245 if subprocess.call(["bzr", "merge", lp_parent_branch]) == 0:
246 cmd = ["bzr", "commit", "-m", commit_message, "--unchanged"]
247 for author in authors:
248 cmd.extend(['--author', author])
249 subprocess.call(cmd)
250 success = True
251 os.chdir(cur_dir)
252 return success
253
254def push_to_branch(source_uri, lp_parent_branch, overwrite=False):
255 """Push source to parent branch"""
256 success = False
257 cur_dir = os.path.abspath('.')
258 os.chdir(source_uri)
259 lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:")
260 command = ["bzr", "push", lp_parent_branch]
261 if overwrite:
262 command.append("--overwrite")
263 if subprocess.call(command) == 0:
264 success = True
265 os.chdir(cur_dir)
266 return success
267
268def grab_committers_compared_to(source_uri, lp_branch_to_scan):
269 """Return unique list of committers for a given branch"""
270 committers = set()
271 cur_dir = os.path.abspath('.')
272 os.chdir(source_uri)
273 lp_branch_to_scan = lp_branch_to_scan.replace("https://code.launchpad.net/", "lp:")
274 instance = subprocess.Popen(["bzr", "missing", lp_branch_to_scan, "--other"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
275 (stdout, stderr) = instance.communicate()
276 if stderr != "":
277 raise Exception("bzr missing on {} returned a failure: {}".format(lp_branch_to_scan, stderr.decode("utf-8").strip()))
278 committer_regexp = re.compile("\ncommitter: (.*)\n")
279 for match in committer_regexp.findall(stdout):
280 for committer in match.split(', '):
281 committers.add(committer)
282 os.chdir(cur_dir)
283 return committers
0284
=== added file 'branch-source-builder/cupstream2distro/launchpadmanager.py'
--- branch-source-builder/cupstream2distro/launchpadmanager.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/launchpadmanager.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,157 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20from __future__ import unicode_literals
21
22from launchpadlib.launchpad import Launchpad
23import lazr
24import logging
25import os
26launchpad = None
27
28from .settings import ARCHS_TO_EVENTUALLY_IGNORE, ARCHS_TO_UNCONDITIONALLY_IGNORE, VIRTUALIZED_PPA_ARCH, CRED_FILE_PATH, COMMON_LAUNCHPAD_CACHE_DIR
29
30
31def get_launchpad(use_staging=False, use_cred_file=os.path.expanduser(CRED_FILE_PATH)):
32 '''Get THE Launchpad'''
33 global launchpad
34 if not launchpad:
35 if use_staging:
36 server = 'staging'
37 else:
38 server = 'production'
39
40 launchpadlib_dir = COMMON_LAUNCHPAD_CACHE_DIR
41 if not os.path.exists(launchpadlib_dir):
42        os.makedirs(launchpadlib_dir)
43
44 if use_cred_file:
45 launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"],
46 version='devel', # devel because copyPackage is only available there
47 credentials_file=use_cred_file,
48 launchpadlib_dir=launchpadlib_dir)
49 else:
50 launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"],
51 version='devel', # devel because copyPackage is only available there
52 launchpadlib_dir=launchpadlib_dir)
53
54 return launchpad
55
56
57def get_ubuntu():
58 '''Get the ubuntu distro'''
59 lp = get_launchpad()
60 return lp.distributions['ubuntu']
61
62
63def get_ubuntu_archive():
64 '''Get the ubuntu main archive'''
65 return get_ubuntu().main_archive
66
67
68def get_series(series_name):
69 '''Return the launchpad object for the requested series'''
70 return get_ubuntu().getSeries(name_or_version=series_name)
71
72
73def get_bugs_titles(author_bugs):
74 lp = get_launchpad()
75 author_bugs_with_title = author_bugs.copy()
76 for author in author_bugs:
77 bug_title_sets = set()
78 for bug in author_bugs[author]:
79 try:
80 bug_title_sets.add("{} (LP: #{})".format(lp.bugs[bug].title, bug))
81 except KeyError:
82                # still list bugs that don't exist, or when launchpad times out
83 bug_title_sets.add(u"Fix LP: #{}".format(bug))
84 author_bugs_with_title[author] = bug_title_sets
85
86 return author_bugs_with_title
87
88
89def open_bugs_for_source(bugs_list, source_name, series_name):
90 lp = get_launchpad()
91 ubuntu = get_ubuntu()
92
93 # don't nominate for current series
94 if ubuntu.current_series.name == series_name:
95 package = ubuntu.getSourcePackage(name=source_name)
96 else:
97 series = get_series(series_name)
98 package = series.getSourcePackage(name=source_name)
99
100 for bug_num in bugs_list:
101 try:
102 bug = lp.bugs[bug_num]
103 bug.addTask(target=package)
104 bug.lp_save()
105 except (KeyError, lazr.restfulclient.errors.BadRequest, lazr.restfulclient.errors.ServerError):
106            # ignore bugs that don't exist or aren't available
107 logging.info("Can't synchronize upstream/downstream bugs for bug #{}. Not blocking on that.".format(bug_num))
108
109
110def get_available_and_all_archs(series, ppa=None):
111 '''Return a set of available archs, and the all arch'''
112 available_arch = set()
113 if ppa and ppa.require_virtualized:
114 available_arch = set(VIRTUALIZED_PPA_ARCH)
115 arch_all_arch = VIRTUALIZED_PPA_ARCH[0]
116 else:
117 for arch in series.architectures:
118 # HACK: filters armel as it's still seen as available on raring: https://launchpad.net/bugs/1077257
119 if arch.architecture_tag == "armel":
120 continue
121 available_arch.add(arch.architecture_tag)
122 if arch.is_nominated_arch_indep:
123 arch_all_arch = arch.architecture_tag
124
125 return (available_arch, arch_all_arch)
126
127
128def get_ignored_archs():
129 '''The archs we can eventually ignore if nothing is published in dest'''
130 return (ARCHS_TO_EVENTUALLY_IGNORE, ARCHS_TO_UNCONDITIONALLY_IGNORE)
131
132
133def get_ppa(ppa_name):
134 '''Return a launchpad ppa'''
135 if ppa_name.startswith("ppa:"):
136 ppa_name = ppa_name[4:]
137 ppa_dispatch = ppa_name.split("/")
138 return get_launchpad().people[ppa_dispatch[0]].getPPAByName(name=ppa_dispatch[1])
139
140def is_series_current(series_name):
141 '''Return if series_name is the edge development version'''
142 return get_ubuntu().current_series.name == series_name
143
144def get_resource_from_url(url):
145 '''Return a lp resource from a launchpad url'''
146 lp = get_launchpad()
147 url = lp.resource_type_link.replace("/#service-root", "") + url.split("launchpad.net")[1]
148 return lp.load(url)
149
150def get_resource_from_token(url):
151 '''Return a lp resource from a launchpad token'''
152 lp = get_launchpad()
153 return lp.load(url)
154
155def is_dest_ubuntu_archive(series_link):
156 '''return if series_link is the ubuntu archive'''
157 return series_link == get_ubuntu_archive().self_link
0158
=== added file 'branch-source-builder/cupstream2distro/packageinppa.py'
--- branch-source-builder/cupstream2distro/packageinppa.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/packageinppa.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,225 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20import logging
21import os
22import re
23
24
25class PackageInPPA():
26
27 (BUILDING, FAILED, PUBLISHED) = range(3)
28
29 def __init__(self, source_name, version, ppa, destarchive, series,
30 available_archs_in_ppa, arch_all_arch, archs_to_eventually_ignore,
31 archs_to_unconditionually_ignore, package_archs=None):
32 self.source_name = source_name
33 self.version = version
34 self.series = series
35 self.arch_all_arch = arch_all_arch
36 self.ppa = ppa
37 self.current_status = {}
38
39 # Get archs we should look at
40 version_for_source_file = version.split(':')[-1]
41 if not package_archs:
42 dsc_filename = "{}_{}.dsc".format(source_name, version_for_source_file)
43 regexp = re.compile("^Architecture: (.*)\n")
44 for line in open(dsc_filename):
45 arch_lists = regexp.findall(line)
46 if arch_lists:
47 package_archs = arch_lists[0]
48 break
49
50 if not package_archs:
51 raise EnvironmentError("Can't determine package Architecture.")
52 if "any" in package_archs:
53 self.archs = available_archs_in_ppa.copy()
54 elif package_archs == "all":
55 self.archs = set([self.arch_all_arch])
56 else:
57 archs_supported_by_package = set()
58 for arch in package_archs.split():
59 archs_supported_by_package.add(arch)
60 self.archs = archs_supported_by_package.intersection(available_archs_in_ppa)
61        # eventually ignore some archs if they don't exist in the latest published version in dest
62 archs_to_eventually_ignore = archs_to_eventually_ignore.copy()
63 if archs_to_eventually_ignore:
64 try:
65 previous_source = destarchive.getPublishedSources(exact_match=True, source_name=self.source_name,
66 distro_series=self.series, status="Published")[0]
67 for binary in previous_source.getPublishedBinaries():
68 if binary.architecture_specific and binary.distro_arch_series.architecture_tag in archs_to_eventually_ignore:
69 archs_to_eventually_ignore -= set([binary.distro_arch_series.architecture_tag])
70 if not archs_to_eventually_ignore:
71 break
72
73 except IndexError:
74 # no package in dest, don't wait on any archs_to_eventually_ignore
75 pass
76 # remove from the inspection remaining archs to ignore
77 if archs_to_eventually_ignore:
78 self.archs -= archs_to_eventually_ignore
79 if archs_to_unconditionually_ignore:
80 self.archs -= archs_to_unconditionually_ignore
81
82
83 def __repr__(self):
84 return '{} - {}'.format(self.source_name, self.version)
85
86
87 def get_status(self, on_particular_arch=None):
88 '''Look at the package status in the ppa
89
90 Can scope to a particular arch to watch for'''
91
92 self._refresh_status()
93 if not self.current_status:
94 return None
95
96 current_package_building = False
97 current_package_failed = False
98 for arch in self.current_status:
99 if on_particular_arch and arch != on_particular_arch:
100 continue
101 str_status = "published"
102 if self.current_status[arch] == PackageInPPA.BUILDING:
103 current_package_building = True
104 str_status = "building"
105 if self.current_status[arch] == PackageInPPA.FAILED:
106 current_package_failed = True
107 str_status = "failed"
108 logging.info("arch: {}, status: {}".format(arch, str_status))
109
110 if current_package_building:
111 return self.BUILDING
112 # no more package is building, if one failed, time to signal it
113 if current_package_failed:
114 return self.FAILED
115 # if it's not None, not BUILDING, nor FAILED, it's PUBLISHED
116 return self.PUBLISHED
117
118 def _refresh_archs_skipped(self):
119 '''Refresh archs that we should skip for this build'''
120
121 for arch in self.archs.copy():
122 if os.path.isfile("{}.{}.ignore".format(self.source_name, arch)):
123 logging.warning("Request to ignore {} on {}.".format(self.source_name, arch))
124 try:
125 self.archs.remove(arch)
126 except ValueError:
127                    logging.warning("Request to ignore {} on {} has been processed, but this one wasn't in the list we were monitoring for.".format(self.source_name, arch))
128 try:
129 self.current_status.pop(arch)
130 except KeyError:
131 pass
132
133 def _refresh_status(self):
134 '''Refresh status from the ppa'''
135
136 self._refresh_archs_skipped()
137
138 # first step, get the source published
139 if not self.current_status:
140 (self.current_status, self.source) = self._get_status_for_source_package_in_ppa()
141 # check the binary status
142 if self.current_status:
143 self.current_status = self._get_status_for_binary_packages_in_ppa()
144
145 def _get_status_for_source_package_in_ppa(self):
146 '''Return current_status for source package in ppa.
147
148 The status is dict (if not None) with {arch: status} and can be:
149 - None -> not visible yet
150 - BUILDING -> currently Building (or waiting to build)
151 - FAILED -> Build failed for this arch or has been canceled
152 - PUBLISHED -> All packages (including arch:all from other archs) published.
153
154 Only the 2 first status are returned by this call. See _get_status_for_binary_packages_in_ppa
155 for the others.'''
156
157 try:
158 source = self.ppa.getPublishedSources(exact_match=True, source_name=self.source_name, version=self.version, distro_series=self.series)[0]
159 logging.info("Source available in ppa")
160 current_status = {}
161 for arch in self.archs:
162 current_status[arch] = self.BUILDING
163 return (current_status, source)
164 except (KeyError, IndexError):
165 logging.info("Source not yet available")
166 return ({}, None)
167
168 def _get_status_for_binary_packages_in_ppa(self):
169 '''Return current status for package in ppa
170
171 The status is dict (if not None) with {arch: status} and can be:
172 - None -> not visible yet
173 - BUILDING -> currently Building (or waiting to build)
174 - FAILED -> Build failed for this arch or has been canceled
175 - PUBLISHED -> All packages (including arch:all from other archs) published.
176
177 Only the 3 last statuses are returned by this call. See _get_status_for_source_package_in_ppa
178 for the other.'''
179
180        # Try to see if all binaries available for this arch are built, including arch:all on other archs
181 status = self.current_status
182 at_least_one_published_binary = False
183 for binary in self.source.getPublishedBinaries():
184 at_least_one_published_binary = True
185 # all binaries for an arch are published at the same time
186            # launchpad is lying: it claims that archs not in the ppa are built (for arch:all), even for non-supported archs!
187 # for instance, we can have the case of self.arch_all_arch (arch:all), built before the others and amd64 will be built for it
188 if binary.status == "Published" and (binary.distro_arch_series.architecture_tag == self.arch_all_arch or
189 (binary.distro_arch_series.architecture_tag != self.arch_all_arch and binary.architecture_specific)):
190 status[binary.distro_arch_series.architecture_tag] = self.PUBLISHED
191
192 # Looking for builds on archs still BUILDING (just loop on builds once to avoid too many lp requests)
193 needs_checking_build = False
194 build_state_failed = ('Failed to build', 'Chroot problem', 'Failed to upload', 'Cancelled build', 'Build for superseded Source')
195 for arch in self.archs:
196 if self.current_status[arch] == self.BUILDING:
197 needs_checking_build = True
198 if needs_checking_build:
199 for build in self.source.getBuilds():
200 # ignored archs
201 if not build.arch_tag in self.current_status:
202 continue
203 if self.current_status[build.arch_tag] == self.BUILDING:
204 if build.buildstate in build_state_failed:
205 logging.error("{}: Build {} ({}) failed because of {}".format(build.arch_tag, build.title,
206 build.web_link, build.buildstate))
207 status[build.arch_tag] = self.FAILED
208                    # Another launchpad trick: if a binary arch was published, but then is superseded, getPublishedBinaries() won't list
209 # those binaries anymore. So it's seen as BUILDING again.
210 # If there is a successful build record of it and the source is superseded, it means that it built fine at some point,
211 # Another arch will fail as superseeded.
212 # We don't just retain the old state of "PUBLISHED" because maybe we started the script with that situation already
213 elif build.buildstate not in build_state_failed and self.source.status == "Superseded":
214 status[build.arch_tag] = self.PUBLISHED
215
216 # There is no way to know if there are arch:all packages (they don't appear in getPublishedBinaries() for an arch
217 # until they're built on arch_all_arch). So mark every arch BUILDING if self.arch_all_arch is building, or FAILED if it failed.
218 if self.arch_all_arch in status and status[self.arch_all_arch] != self.PUBLISHED:
219 for arch in self.archs:
220 if status[arch] == self.PUBLISHED:
221 status[arch] = status[self.arch_all_arch]
222 if arch != self.arch_all_arch and status[arch] == self.FAILED:
223 logging.error("{} marked as FAILED because {} build FAILED and we may miss arch:all packages".format(arch, self.arch_all_arch))
224
225 return status
0226
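Reviewer's note: the arch:all propagation rule at the end of the hunk above (a still-building or failed `arch_all_arch` taints every other arch) can be modeled as a standalone helper. This is a hypothetical sketch for review purposes, not code from the branch; the status constants mirror `PackageInPPA.BUILDING/FAILED/PUBLISHED`.

```python
# Hypothetical model of the arch:all propagation in get_status() above:
# while the arch that builds arch:all packages has not PUBLISHED, no other
# arch can be considered fully published, since its arch:all binaries may
# still be missing or failed.
BUILDING, FAILED, PUBLISHED = range(3)

def propagate_arch_all(status, arch_all_arch):
    """Downgrade PUBLISHED archs to the arch_all_arch status when needed."""
    if status.get(arch_all_arch) == PUBLISHED:
        return dict(status)
    return {arch: status[arch_all_arch] if state == PUBLISHED else state
            for arch, state in status.items()}
```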
=== added file 'branch-source-builder/cupstream2distro/packageinppamanager.py'
--- branch-source-builder/cupstream2distro/packageinppamanager.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/packageinppamanager.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,109 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20import ConfigParser
21import logging
22import os
23import re
24
25from .branchhandling import _get_parent_branch
26from .packageinppa import PackageInPPA
27from .settings import PROJECT_CONFIG_SUFFIX
28
29
30def _ensure_removed_from_set(target_set, content_to_remove):
31 '''Silent removal from an existing set'''
32 try:
33 target_set.remove(content_to_remove)
34 except KeyError:
35 pass # in case we missed the "build" step
36
37
38def get_all_packages_uploaded():
39 '''Get (package, version, rev, branch) of all packages uploaded'''
40
41 # we do not rely on the .changes files but on the config file
42 # because we need the exact version (which can have an epoch)
43 result = set()
44 source_package_regexp = re.compile("(.*).{}$".format(PROJECT_CONFIG_SUFFIX))
45 for file in os.listdir('.'):
46 substract = source_package_regexp.findall(file)
47 if substract:
48 version = _get_current_packaging_version_from_config(substract[0])
49 rev = _get_current_rev_from_config(substract[0])
50 branch = _get_parent_branch(substract[0])
51 result.add((substract[0], version, rev, branch))
52 return result
59
60
61def get_packages_and_versions_uploaded():
62 '''Get (package, version) of all packages uploaded. We can have duplicates'''
63
64 # we do not rely on the .changes files but on the config file
65 # because we need the exact version (which can have an epoch)
66 result = set()
67 source_package_regexp = re.compile("(.*).{}.*$".format(PROJECT_CONFIG_SUFFIX))
68 for file in os.listdir('.'):
69 substract = source_package_regexp.findall(file)
70 if substract:
71 config = ConfigParser.RawConfigParser()
72 config.read(file)
73 result.add((substract[0], config.get('Package', 'packaging_version')))
74 return result
75
76
77def update_all_packages_status(packages_not_in_ppa, packages_building, packages_failed, particular_arch=None):
78 '''Update all packages status, checking in the ppa'''
79
80 for current_package in (packages_not_in_ppa.union(packages_building)):
81 logging.info("current_package: " + current_package.source_name + " " + current_package.version)
82 package_status = current_package.get_status(particular_arch)
83 if package_status is not None: # global package_status can be 0 (building), 1 (failed), 2 (published)
84 # if one arch building, still considered as building
85 if package_status == PackageInPPA.BUILDING:
86 _ensure_removed_from_set(packages_not_in_ppa, current_package) # maybe already removed
87 packages_building.add(current_package)
88 # if one arch failed, considered as failed
89 elif package_status == PackageInPPA.FAILED:
90 _ensure_removed_from_set(packages_building, current_package) # in case we missed the "build" step
91 _ensure_removed_from_set(packages_not_in_ppa, current_package) # in case we missed the "wait" step
92 packages_failed.add(current_package)
93 elif package_status == PackageInPPA.PUBLISHED:
94 _ensure_removed_from_set(packages_building, current_package) # in case we missed the "build" step
95 _ensure_removed_from_set(packages_not_in_ppa, current_package) # in case we missed the "wait" step
96
97
98def _get_current_packaging_version_from_config(source_package_name):
99 '''Get current packaging version from the saved config'''
100 config = ConfigParser.RawConfigParser()
101 config.read("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX))
102 return config.get('Package', 'packaging_version')
103
104
105def _get_current_rev_from_config(source_package_name):
106 '''Get current tip revision from the saved config'''
107 config = ConfigParser.RawConfigParser()
108 config.read("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX))
109 return config.get('Branch', 'rev')
0110
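Reviewer's note: it may help to see the per-package config file that `_get_current_packaging_version_from_config` and `_get_current_rev_from_config` above parse. The section and option names come from the code in the hunk; the sample values are invented for illustration. The sketch uses Python 3's `configparser`; the branch itself targets Python 2's `ConfigParser`, which has the same `RawConfigParser.get` API.

```python
# Sketch of a "<source_package_name>.project" file (the suffix comes from
# PROJECT_CONFIG_SUFFIX) as read by the helpers above. Values are made up.
import configparser
import io

SAMPLE = """\
[Package]
packaging_version = 1.3.1+14.04.20140211-0ubuntu1

[Branch]
rev = 105
"""

config = configparser.RawConfigParser()
config.read_file(io.StringIO(SAMPLE))
version = config.get('Package', 'packaging_version')
rev = config.get('Branch', 'rev')
```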
=== added file 'branch-source-builder/cupstream2distro/packagemanager.py'
--- branch-source-builder/cupstream2distro/packagemanager.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/packagemanager.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,504 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012-2014 Canonical
3#
4# Authors:
5# Didier Roche
6# Rodney Dawes
7#
8# This program is free software; you can redistribute it and/or modify it under
9# the terms of the GNU General Public License as published by the Free Software
10# Foundation; version 3.
11#
12# This program is distributed in the hope that it will be useful, but WITHOUT
13# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
14# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
15# details.
16#
17# You should have received a copy of the GNU General Public License along with
18# this program; if not, write to the Free Software Foundation, Inc.,
19# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
20
21import datetime
22import fileinput
23import logging
24import os
25import re
26import shutil
27import sys
28import subprocess
29import urllib
30
31import launchpadmanager
32import settings
33from .utils import ignored
34
35
36def get_current_version_for_series(source_package_name, series_name, ppa_name=None, dest=None):
37 '''Get current version for a package name in that series'''
38 series = launchpadmanager.get_series(series_name)
39 if not dest:
40 if ppa_name:
41 dest = launchpadmanager.get_ppa(ppa_name)
42 else:
43 dest = launchpadmanager.get_ubuntu_archive()
44 source_collection = dest.getPublishedSources(exact_match=True, source_name=source_package_name, distro_series=series)
45 try:
46 # cjwatson said the list always has the most recently published first (even if removed)
47 return source_collection[0].source_package_version
48 # was never in the dest, set the lowest possible version
49 except IndexError:
50 return "0"
51
52
53def is_version_for_series_in_dest(source_package_name, version, series, dest, pocket="Release"):
54 '''Return if version for a package name in that series is in dest'''
55 return dest.getPublishedSources(exact_match=True, source_name=source_package_name, version=version,
56 distro_series=series, pocket=pocket).total_size > 0
57
58def is_version_in_queue(source_package_name, version, dest_serie, queue):
59 '''Return if version for a package name in that series is in the given queue'''
60 return dest_serie.getPackageUploads(exact_match=True, name=source_package_name, version=version,
61 status=queue).total_size > 0
62
63
64def is_version1_higher_than_version2(version1, version2):
65 '''Return True if version1 is higher than version2'''
66 return (subprocess.call(["dpkg", "--compare-versions", version1, 'gt', version2], stdout=subprocess.PIPE, stderr=subprocess.PIPE) == 0)
67
68
69def is_version_in_changelog(version, f):
70 '''Return if the version is in the upstream changelog (released)'''
71
72 if version == "0":
73 return True
74
75 desired_changelog_line = re.compile("\({}\) (?!UNRELEASED).*\; urgency=".format(version.replace('+', '\+')))
76 for line in f.readlines():
77 if desired_changelog_line.search(line):
78 return True
79
80 return False
81
82
83def get_latest_upstream_bzr_rev(f, dest_ppa=None):
84 '''Report latest bzr rev in the file
85
86 If dest_ppa, first try to fetch the dest ppa tag. Otherwise, fallback to first distro version'''
87 distro_regex = re.compile("{} (\d+)".format(settings.REV_STRING_FORMAT))
88 destppa_regexp = re.compile("{} (\d+) \(ppa:{}\)".format(settings.REV_STRING_FORMAT, dest_ppa))
89 distro_rev = None
90 candidate_destppa_rev = None
91 candidate_distro_rev = None
92 destppa_element_found = False
93
94 # handle marker spread on two lines
95 end_of_line_regexp = re.compile(" *(.*\))")
96 previous_line = None
97
98 for line in f:
99 line = line[:-1]
100
101 if previous_line:
102 try:
103 line = previous_line + end_of_line_regexp.findall(line)[0]
104 except IndexError:
105 pass
106 previous_line = None
107
108 if dest_ppa:
109 try:
110 candidate_destppa_rev = int(destppa_regexp.findall(line)[0])
111 destppa_element_found = True
112 except IndexError:
113 destppa_element_found = False
114 if not distro_rev and not destppa_element_found and not "(ppa:" in line:
115 try:
116 candidate_distro_rev = int(distro_regex.findall(line)[0])
117 distro_element_found = True
118 except IndexError:
119 distro_element_found = False
120
121 # try to catchup next line if we have a marker start without anything found
122 if settings.REV_STRING_FORMAT in line and (dest_ppa and not destppa_element_found) and not distro_element_found:
123 previous_line = line
124
125 if line.startswith(" -- "):
126 # first grab the dest ppa
127 if candidate_destppa_rev:
128 return candidate_destppa_rev
129 if not distro_rev and candidate_distro_rev:
130 distro_rev = candidate_distro_rev
131 if not dest_ppa and distro_rev:
132 return distro_rev
133
134 # we didn't find any dest ppa result but there is a distro_rev one
135 if dest_ppa and distro_rev:
136 return distro_rev
137
138 # we force a bootstrap commit for new components
139 return 0
140
141
142def list_packages_info_in_str(packages_set):
143 '''Return the packages info in a string'''
144
145 results = []
146 for package in packages_set:
147 results.append("{} ({})".format(package.source_name, package.version))
148 return " ".join(results)
149
150
151def get_packaging_version():
152 '''Get current packaging rev'''
153 instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
154 (stdout, stderr) = instance.communicate()
155 if instance.returncode != 0:
156 raise Exception(stderr.decode("utf-8").strip())
157 expr = re.compile("Version: (.*)")
158 for line in stdout.splitlines():
159 packaging_version = expr.findall(line)
160 if packaging_version:
161 return packaging_version[0]
162
163 raise Exception("Didn't find any Version in the package: {}".format(stdout))
164
165
166def get_source_package_from_dest(source_package_name, dest_archive, dest_current_version, series_name):
167 '''Download and return a path containing a checkout of the current dest version.
168
169 None if this package was never published to dest archive'''
170
171 if dest_current_version == "0":
172 logging.info("This package was never released to the destination archive, don't return downloaded source")
173 return None
174
175 logging.info("Grab code for {} ({}) from {}".format(source_package_name, dest_current_version, series_name))
176 source_package_download_dir = os.path.join('ubuntu', source_package_name)
177 series = launchpadmanager.get_series(series_name)
178 with ignored(OSError):
179 os.makedirs(source_package_download_dir)
180 os.chdir(source_package_download_dir)
181
182 try:
183 sourcepkg = dest_archive.getPublishedSources(status="Published", exact_match=True, source_name=source_package_name, distro_series=series, version=dest_current_version)[0]
184 except IndexError:
185 raise Exception("Couldn't find the expected version in the destination")
186 logging.info('Downloading %s version %s', source_package_name, dest_current_version)
187 for url in sourcepkg.sourceFileUrls():
188 urllib.urlretrieve(url, urllib.unquote(url.split('/')[-1]))
189 instance = subprocess.Popen("dpkg-source -x *dsc", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
190 (stdout, stderr) = instance.communicate()
191 if instance.returncode != 0:
192 raise Exception(stderr.decode("utf-8").strip())
193
194 # check the dir exist
195 splitted_version = dest_current_version.split(':')[-1].split('-') # remove epoch if there is one
196 # TODO: debian version (like -3) is not handled here.
197 # We do handle 42ubuntu1 though (as splitted_version[0] can contain "ubuntu")
198 if "ubuntu" in splitted_version[-1] and len(splitted_version) > 1: # don't remove last item for the case where we had a native version (-0.35.2) without ubuntu in it
199 splitted_version = splitted_version[:-1]
200 version_for_source_file = '-'.join(splitted_version)
201 source_directory_name = "{}-{}".format(source_package_name, version_for_source_file)
202 if not os.path.isdir(source_directory_name):
203 raise Exception("We tried to download and check that the directory {} is present, but it's not the case".format(source_directory_name))
204 os.chdir('../..')
205 return (os.path.join(source_package_download_dir, source_directory_name))
206
207
208def is_new_content_relevant_since_old_published_source(dest_version_source):
209 '''Return True if a new snapshot is needed
210
211 dest_version_source can be None if no released version was done before.'''
212
213 # we always release something not yet in ubuntu, even if the criteria are not met.
214 if not dest_version_source:
215 return True
216
217 # now check the relevance of the committed changes compared to the version in the repository (if any)
218 diffinstance = subprocess.Popen(['diff', '-Nrup', '.', dest_version_source], stdout=subprocess.PIPE)
219 filterinstance = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE)
220 lsdiffinstance = subprocess.Popen(['lsdiff'], stdin=filterinstance.stdout, stdout=subprocess.PIPE)
221 (relevant_changes, err) = subprocess.Popen(['grep', '-Ev', '.bzr|.pc'], stdin=lsdiffinstance.stdout, stdout=subprocess.PIPE).communicate()
222
223 # detect if the only change is a Vcs* target changes (with or without changelog edit). We won't release in that case
224 number_of_changed_files = relevant_changes.count("\n")
225 if ((number_of_changed_files == 1 and "debian/control" in relevant_changes) or
226 (number_of_changed_files == 2 and "debian/control" in relevant_changes and "debian/changelog" in relevant_changes)):
227 (results, err) = subprocess.Popen(['diff', os.path.join('debian', 'control'), os.path.join(dest_version_source, "debian", "control")], stdout=subprocess.PIPE).communicate()
228 for diff_line in results.split('\n'):
229 if diff_line.startswith("< ") or diff_line.startswith("> "):
230 if not diff_line[2:].startswith("Vcs-") and not diff_line[2:].startswith("#"):
231 return True
232 return False
233
234 logging.debug("Relevant changes are:")
235 logging.debug(relevant_changes)
236
237 return (relevant_changes != '')
238
239
240def is_relevant_source_diff_from_previous_dest_version(newdsc_path, dest_version_source):
241 '''Extract and check if the generated source diff different from previous one'''
242
243 with ignored(OSError):
244 os.makedirs("generated")
245 extracted_generated_source = os.path.join("generated", newdsc_path.split('_')[0])
246 with ignored(OSError):
247 shutil.rmtree(extracted_generated_source)
248
249 # remove epoch if there is one
250 if subprocess.call(["dpkg-source", "-x", newdsc_path, extracted_generated_source]) != 0:
251 raise Exception("dpkg-source command returned an error.")
252
253 # now check the relevance of the committed changes compared to the version in the repository (if any)
254 diffinstance = subprocess.Popen(['diff', '-Nrup', extracted_generated_source, dest_version_source], stdout=subprocess.PIPE)
255 (diff, err) = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate()
256
257 # there is no important diff if the diff contains at most 12 lines, corresponding to the "Automatic daily release" marker in debian/changelog
258 if (diff.count('\n') <= 12):
259 return False
260 return True
261
262
263def _packaging_changes_between_dsc(oldsource_dsc, newsource_dsc):
264 '''Return if there has been a packaging change between two dsc files
265
266 We ignore the changelog only changes'''
267 if not oldsource_dsc:
268 return True
269 if not os.path.isfile(oldsource_dsc) or not os.path.isfile(newsource_dsc):
270 raise Exception("{} or {} doesn't exist, can't create a diff".format(oldsource_dsc, newsource_dsc))
271 diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE)
272 filterinstance = subprocess.Popen(['filterdiff', '--clean', '-i', '*debian/*', '-x', '*changelog'], stdin=diffinstance.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
273 (change_in_debian, filter_err) = filterinstance.communicate()
274 # we can't rely on diffinstance returncode as the signature key may not be present and it will exit with 1
275 if filterinstance.returncode != 0:
276 raise Exception("Error in diff: {}".format(filter_err.decode("utf-8").strip()))
277 return(change_in_debian != "")
278
279
280def generate_diff_between_dsc(diff_filepath, oldsource_dsc, newsource_dsc):
281 '''Generate a diff file in diff_filepath if there is a relevant packaging diff between 2 sources
282
283 The diff contains autotools files and cmakeries'''
284 if _packaging_changes_between_dsc(oldsource_dsc, newsource_dsc):
285 with open(diff_filepath, "w") as f:
286 if not oldsource_dsc:
287 f.write("This source is a new package, if the destination is ubuntu, please ensure it has been preNEWed by an archive admin before publishing that stack.")
288 return
289 f.write("/!\ Remember that this diff only represents packaging changes and build tools diff, not the whole content diff!\n\n")
290 diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE)
291 (changes_to_publish, err) = subprocess.Popen(['filterdiff', '--remove-timestamps', '--clean', '-i', '*setup.py',
292 '-i', '*Makefile.am', '-i', '*configure.*', '-i', '*debian/*',
293 '-i', '*CMakeLists.txt'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate()
294 f.write(changes_to_publish)
295
296
297def create_new_packaging_version(base_package_version, series_version, destppa=''):
298 '''Deliver a new packaging version, based on simple rules:
299
300 Version would be <upstream_version>.<series>.<yyyymmdd(.minor)>-0ubuntu1
301 if we already have something delivered today, it will be .minor, then, .minor+1…
302
303 We append the destination ppa name if we target a dest ppa and not distro'''
304 # to keep track of whether the package is native or not
305 native_pkg = False
306
307 today_version = datetime.date.today().strftime('%Y%m%d')
308 destppa = destppa.replace("-", '.').replace("_", ".").replace("/", ".")
309 # bootstrapping mode or direct upload or UNRELEASED for bumping to a new series
310 # TRANSITION
311 if not ("daily" in base_package_version or "+" in base_package_version):
312 # support both 42, 42-0ubuntu1
313 upstream_version = base_package_version.split('-')[0]
314 # if we have 42ubuntu1 like a wrong native version
315 if "ubuntu" in upstream_version:
316 upstream_version = upstream_version.split('ubuntu')[0]
317 elif not "-" in base_package_version and "+" in base_package_version:
318 # extract the day of previous daily upload and bump if already uploaded
319 regexp = re.compile("(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*")
320 try:
321 previous_day = regexp.findall(base_package_version)[0]
322 upstream_version = previous_day[0]
323 native_pkg = True
324 if (previous_day[1] == series_version and
325 previous_day[2] == today_version):
326 minor = 1
327 if previous_day[3]: # second upload of the day
328 minor = int(previous_day[3]) + 1
329 today_version = "{}.{}".format(today_version, minor)
330 except IndexError:
331 raise Exception(
332 "Unable to get previous day from native version: %s"
333 % base_package_version)
334 else:
335 # extract the day of previous daily upload and bump if already uploaded today
336 regexp = re.compile("(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*-.*")
337 try:
338 previous_day = regexp.findall(base_package_version)[0]
339 except IndexError:
340 # TRANSITION FALLBACK
341 try:
342 regexp = re.compile("(.*)(daily)([\d\.]{8})\.?([\d]*).*-.*")
343 previous_day = regexp.findall(base_package_version)[0]
344 # make the version compatible with the new version
345 previous_day = (previous_day[0], previous_day[1], "20" + previous_day[2].replace(".", ""), previous_day[3])
346 except IndexError:
347 raise Exception("Didn't find a correct versioning in the current package: {}".format(base_package_version))
348 upstream_version = previous_day[0]
349 if previous_day[1] == series_version and previous_day[2] == today_version:
350 minor = 1
351 if previous_day[3]: # second upload of the day
352 minor = int(previous_day[3]) + 1
353 today_version = "{}.{}".format(today_version, minor)
354
355 new_upstream_version = "{upstream}+{series}.{date}{destppa}".format(
356 upstream=upstream_version, series=series_version,
357 date=today_version, destppa=destppa)
358 if native_pkg is not True:
359 new_upstream_version = "{}-0ubuntu1".format(new_upstream_version)
360
361 return new_upstream_version
362
363
364def get_packaging_sourcename():
365 '''Get current packaging source name'''
366 instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
367 (stdout, stderr) = instance.communicate()
368 if instance.returncode != 0:
369 raise Exception(stderr.decode("utf-8").strip())
370 expr = re.compile("Source: (.*)")
371 for line in stdout.splitlines():
372 source_name = expr.findall(line)
373 if source_name:
374 return source_name[0]
375
376 raise Exception("Didn't find any source name in the package: {}".format(stdout))
377
378
379def collect_bugs_in_changelog_until_latest_snapshot(f, source_package_name):
380 '''Collect all bugs in the changelog until latest snapshot'''
381 bugs = set()
382 # matching only bug format that launchpad accepts
383 group_bugs_regexp = re.compile("lp: ?(.*\d{5,})", re.IGNORECASE)
384 bug_decipher_regexp = re.compile("(#\d{5,})+")
385 new_upload_changelog_regexp = re.compile(settings.NEW_CHANGELOG_PATTERN.format(source_package_name))
386 for line in f:
387 grouped_bugs_list = group_bugs_regexp.findall(line)
388 for grouped_bugs in grouped_bugs_list:
389 for bug in map(lambda bug_with_hash: bug_with_hash.replace('#', ''), bug_decipher_regexp.findall(grouped_bugs)):
390 bugs.add(bug)
391 # a released upload to distro (automated or manual): exit, as earlier bugs were already covered
392 if new_upload_changelog_regexp.match(line):
393 return bugs
394
395 return bugs
396
397
398def update_changelog(new_package_version, series, tip_bzr_rev, authors_commits, dest_ppa=None):
399 '''Update the changelog for the incoming upload'''
400
401 dch_env = os.environ.copy()
402 for author in authors_commits:
403 dch_env["DEBFULLNAME"] = author
404 for bug_desc in authors_commits[author]:
405 if bug_desc.startswith('-'):
406 # Remove leading '-' or dch thinks (rightly) that it's an option
407 bug_desc = bug_desc[1:]
408 if bug_desc.startswith(' '):
409 # Remove leading spaces, they are useless and the result is
410 # prettier without them anyway ;)
411 bug_desc = bug_desc.strip()
412 cmd = ["dch", "--multimaint-merge", "--release-heuristic", "changelog",
413 "-v{}".format(new_package_version), bug_desc]
414 subprocess.Popen(cmd, env=dch_env).communicate()
415
416 if tip_bzr_rev is not None:
417 commit_message = "{} {}".format(settings.REV_STRING_FORMAT, tip_bzr_rev)
418 if dest_ppa:
419 commit_message += " ({})".format(dest_ppa)
420 else:
421 commit_message = ""
422
423 dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME
424 dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL
425 instance = subprocess.Popen(["dch", "--release-heuristic", "changelog",
426 "-v{}".format(new_package_version), commit_message],
427 stderr=subprocess.PIPE, env=dch_env)
428 (stdout, stderr) = instance.communicate()
429 if instance.returncode != 0:
430 raise Exception(stderr.decode("utf-8").strip())
431 subprocess.call(["dch", "-r", "--distribution", series, "--force-distribution", ""], env=dch_env)
432
433 # in the case of no commit_message and no symbols file change, we have an additional [ DEBFULLNAME ] line followed by an empty line
434 # better to remove both lines
435 subprocess.call(["sed", "-i", "/ \[ " + settings.BOT_DEBFULLNAME + " \]/{$q; N; /\\n$/d;}", "debian/changelog"])
436
437
438def build_source_package(series, distro_version, ppa=None):
439 '''Build the source package using the internal helper
440
441 Add the additional ppa inside the chroot if requested.'''
442
443 chroot_tool_dir = os.path.join(settings.ROOT_CU2D, "chroot-tools")
444 buildsource = os.path.join(chroot_tool_dir, "buildsource-chroot")
445 branch_dir = os.path.abspath('.')
446 parent_dir = os.path.abspath(os.path.dirname(branch_dir))
447 cowbuilder_env = os.environ.copy()
448 cowbuilder_env["HOME"] = chroot_tool_dir # take the internal .pbuilderrc
449 cowbuilder_env["DIST"] = series
450 cmd = ["sudo", "-E", "cowbuilder", "--execute",
451 "--bindmounts", parent_dir,
452 "--bindmounts", settings.GNUPG_DIR,
453 "--", buildsource, branch_dir,
454 "--gnupg-parentdir", settings.GNUPG_DIR,
455 "--uid", str(os.getuid()), "--gid", str(os.getgid()),
456 "--gnupg-keyid", settings.BOT_KEY,
457 "--distro-version", distro_version]
458 if ppa:
459 cmd.extend(["--ppa", ppa])
460 instance = subprocess.Popen(cmd, env=cowbuilder_env)
461 instance.communicate()
462 if instance.returncode != 0:
463 raise Exception("%r returned: %s." % (cmd, instance.returncode))
464
465
466def upload_package(source, version, ppa, source_directory=None):
467 '''Upload the new package to a ppa'''
468 # remove epoch if there is one
469 here = None
470 if source_directory:
471 here = os.getcwd()
472 os.chdir(source_directory)
473 version_for_source_file = version.split(':')[-1]
474 cmd = ["dput", ppa,
475 "{}_{}_source.changes".format(source, version_for_source_file)]
476 if subprocess.call(cmd) != 0:
477 if here:
478 os.chdir(here)
479 raise Exception("%r returned an error." % (cmd,))
480 if here:
481 os.chdir(here)
482
483
484def refresh_symbol_files(packaging_version):
485 '''Refresh the symbols file having REPLACEME_TAG with version of the day.
486
487 Add a changelog entry if needed'''
488
489 new_upstream_version = packaging_version.split("-")[0]
490 replacement_done = False
491 for filename in os.listdir("debian"):
492 if filename.endswith("symbols"):
493 for line in fileinput.input(os.path.join('debian', filename), inplace=1):
494 if settings.REPLACEME_TAG in line:
495 replacement_done = True
496 line = line.replace(settings.REPLACEME_TAG, new_upstream_version)
497 sys.stdout.write(line)
498
499 if replacement_done:
500 dch_env = os.environ.copy()
501 dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME
502 dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL
503 subprocess.Popen(["dch", "debian/*symbols: auto-update new symbols to released version"], env=dch_env).communicate()
504 subprocess.call(["bzr", "commit", "-m", "Update symbols"])
0505
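Reviewer's note: the versioning scheme implemented by `create_new_packaging_version` above (`<upstream>+<series>.<yyyymmdd>(.minor)-0ubuntu1`, bumping `.minor` when a snapshot was already uploaded the same day) can be condensed into a small model. This is a hypothetical sketch only: it ignores the TRANSITION fallbacks, native packages, and the dest-ppa suffix handled by the real function.

```python
# Condensed model of the daily versioning rule in create_new_packaging_version.
# The regexp mirrors the non-native branch of the real code.
import re

def next_daily_version(previous, series_version, today):
    """Compute the next daily snapshot version from the previous one."""
    m = re.match(r"(.*)\+([\d.]{5})\.(\d{8})\.?(\d*).*-.*", previous)
    if m:
        upstream, prev_series, prev_day, minor = m.groups()
        day = today
        # second (or later) upload of the day: append/bump the minor
        if prev_series == series_version and prev_day == today:
            day = "{}.{}".format(today, int(minor) + 1 if minor else 1)
    else:
        # bootstrap: support both "42" and "42-0ubuntu1" style versions
        upstream, day = previous.split('-')[0], today
    return "{}+{}.{}-0ubuntu1".format(upstream, series_version, day)
```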
=== added file 'branch-source-builder/cupstream2distro/settings.py'
--- branch-source-builder/cupstream2distro/settings.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/settings.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,121 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20import os
21import yaml
22
23REV_STRING_FORMAT = "Automatic snapshot from revision"
24NEW_CHANGELOG_PATTERN = "^{} \(.*\) (?!UNRELEASED)"
25PACKAGING_MERGE_COMMIT_MESSAGE = "Releasing {} (revision {} from {})"
26REPLACEME_TAG = "0replaceme"
27BRANCH_URL = "lp:~ps-jenkins/{}/latestsnapshot-{}"
28
29IGNORECHANGELOG_COMMIT = "#nochangelog"
30
31PROJECT_CONFIG_SUFFIX = "project"
32
33BOT_DEBFULLNAME = "Ubuntu daily release"
34BOT_DEBEMAIL = "ps-jenkins@lists.canonical.com"
35home_dir = "/srv/bsb_worker"
36CU2D_DIR = home_dir
37GNUPG_DIR = CU2D_DIR
38if not os.path.isdir(os.path.join(GNUPG_DIR, '.gnupg')):
39 GNUPG_DIR = home_dir
40CRED_FILE_PATH = os.path.join("/tmp", "launchpad.credentials")
41
42# TODO refactor into a ci-utils module
43def _unit_config():
44 path = os.path.join(home_dir, 'unit_config')
45 config = {}
46 try:
47 with open(path) as f:
48 config = yaml.safe_load(f.read())
49 except Exception:
50 print('Unable to use unit_config(%s), defaulting values' % path)
51 return config
52_cfg = _unit_config()
53LAUNCHPAD_PPA_USER = _cfg.get('launchpad_user', None)
54LAUNCHPAD_API_BASE = _cfg.get(
55 'launchpad_api_base', 'https://api.launchpad.net/1.0')
56OAUTH_CONSUMER_KEY = _cfg.get('oauth_consumer_key', None)
57OAUTH_TOKEN = _cfg.get('oauth_token', None)
58OAUTH_TOKEN_SECRET = _cfg.get('oauth_token_secret', None)
59OAUTH_REALM = _cfg.get('oauth_realm', 'https://api.launchpad.net/')
60
61if not os.path.exists(CRED_FILE_PATH):
62 with open(CRED_FILE_PATH, 'w') as f:
63 f.write('[1]\n')
64 f.write('consumer_key = %s\n' % OAUTH_CONSUMER_KEY)
65 f.write('consumer_secret = \n')
66 f.write('access_token = %s\n' % OAUTH_TOKEN)
67 f.write('access_secret = %s\n' % OAUTH_TOKEN_SECRET)
68
69COMMON_LAUNCHPAD_CACHE_DIR = os.path.join("/tmp", "launchpad.cache")
70if not os.path.isdir(COMMON_LAUNCHPAD_CACHE_DIR):
71 os.makedirs(COMMON_LAUNCHPAD_CACHE_DIR)
72BOT_KEY = "B879A3E9"
73
74# selected arch for building arch:all packages
75VIRTUALIZED_PPA_ARCH = ["i386", "amd64"]
76# an arch we will ignore for publication if latest published version in dest doesn't build it
77ARCHS_TO_EVENTUALLY_IGNORE = set(['powerpc', 'arm64', 'ppc64el'])
78ARCHS_TO_UNCONDITIONALLY_IGNORE = set(['arm64', 'ppc64el'])
79SRU_PPA = "ubuntu-unity/sru-staging"
80
81TIME_BETWEEN_PPA_CHECKS = 60
82TIME_BETWEEN_STACK_CHECKS = 60
83TIME_BEFORE_STOP_LOOKING_FOR_SOURCE_PUBLISH = 20 * 60
84
85PUBLISHER_ARTEFACTS_FILENAME = 'publisher.xml'
86PREPARE_ARTEFACTS_FILENAME_FORMAT = 'prepare_{}.xml'
87
88OLD_STACK_DIR = 'old'
89PACKAGE_LIST_RSYNC_FILENAME_PREFIX = 'packagelist_rsync'
90PACKAGE_LIST_RSYNC_FILENAME_FORMAT = PACKAGE_LIST_RSYNC_FILENAME_PREFIX + '_{}-{}'
91RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*".format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX)
92
93ROOT_CU2D = os.path.join(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
94DEFAULT_CONFIG_STACKS_DIR = os.path.join(os.path.dirname(ROOT_CU2D), 'cupstream2distro-config', 'stacks')
95STACK_STATUS_FILENAME = "stack.status"
96STACK_STARTED_FILENAME = "stack.started"
97STACK_BUILDING_FILENAME = "stack.building"
98
99STACK_RUNNING_DIR = "/iSCSI/jenkins/cu2d/work"
100STACK_STATUS_PUBLISHING_DIR = "/iSCSI/jenkins/cu2d/result_publishing"
101
102# for citrain
103SILO_NAME_LIST = []
104for i in xrange(1, 11):
105 SILO_NAME_LIST.append("landing-{:03d}".format(i))
106SILO_CONFIG_FILENAME = "config"
107SILO_BUILDPPA_SCHEME = "ci-train-ppa-service/{}"
108SILO_PACKAGING_RELEASE_COMMIT_MESSAGE = "Releasing {}"
109SILOS_RAW_DIR = "~/silos"
110SILOS_DIR = os.path.expanduser(SILOS_RAW_DIR)
111SILO_RSYNCDIR = "~/out"
112SILO_STATUS_RSYNCDIR = os.path.expanduser("~/status")
113CITRAIN_BINDIR = "~/citrain/citrain"
114(SILO_EMPTY, SILO_BUILTCHECKED, SILO_PUBLISHED, SILO_DONE) = range(4)
115
116SERIES_VERSION = {
117 'precise': '12.04',
118 'raring': '13.04',
119 'saucy': '13.10',
120 'trusty': '14.04'
121}
0122
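The credentials file that settings.py writes above is a plain INI section named `[1]` holding the OAuth fields. A minimal standalone sketch of that round trip (shown with Python 3's `configparser`; the key names mirror the file written at `CRED_FILE_PATH`, the token values are invented):

```python
# Sketch of the credential-file format settings.py emits: an INI-style
# "[1]" section with OAuth fields, readable by launchpadlib-style tools.
# Token values here are illustrative placeholders, not real credentials.
import configparser


def write_credentials(consumer_key, token, token_secret):
    """Render credentials file content the same way settings.py does."""
    lines = ['[1]',
             'consumer_key = %s' % consumer_key,
             'consumer_secret = ',
             'access_token = %s' % token,
             'access_secret = %s' % token_secret]
    return '\n'.join(lines) + '\n'


content = write_credentials('bsb-worker', 'tok123', 'secret456')

# Reading it back with the stdlib parser proves the format is valid INI.
parser = configparser.ConfigParser()
parser.read_string(content)
creds = dict(parser['1'])
```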
=== added file 'branch-source-builder/cupstream2distro/silomanager.py'
--- branch-source-builder/cupstream2distro/silomanager.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/silomanager.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,115 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2014 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20import json
21import logging
22import os
23import shutil
24
25from cupstream2distro.settings import SILO_CONFIG_FILENAME, SILO_NAME_LIST, SILO_STATUS_RSYNCDIR
26from cupstream2distro.utils import ignored
27
28
29def save_config(config, uri=''):
30 """Save config in uri and copy to outdir"""
31 silo_config_path = os.path.abspath(os.path.join(uri, SILO_CONFIG_FILENAME))
32 with ignored(OSError):
33 os.makedirs(uri)
34 try:
35 json.dump(config, open(silo_config_path, 'w'))
36 except TypeError as e:
37 logging.error("Can't save configuration: " + e.message)
38 os.remove(silo_config_path)
39 return False
40 # copy to outdir
41 with ignored(OSError):
42 os.makedirs(SILO_STATUS_RSYNCDIR)
43 silo_name = os.path.dirname(silo_config_path).split(os.path.sep)[-1]
44 dest = os.path.join(SILO_STATUS_RSYNCDIR, silo_name)
45 logging.debug("Copying configuration from {} to {}".format(silo_config_path, dest))
46 shutil.copy2(silo_config_path, os.path.join(SILO_STATUS_RSYNCDIR, silo_name))
47 return True
48
49def load_config(uri=None):
50 """return a loaded config
51
52 If no uri, load in the current directory"""
53 if not uri:
54 uri = os.path.abspath('.')
55 logging.debug("Reading configuration in {}".format(uri))
56 try:
57 return json.load(open(os.path.join(uri, SILO_CONFIG_FILENAME)))
58 # if silo isn't configured
59 except IOError:
60 pass
61 except ValueError as e:
62 logging.warning("Can't load configuration: " + e.message)
63 return None
64
65def remove_status_file(silo_name):
66 """Remove status file"""
67 os.remove(os.path.join(SILO_STATUS_RSYNCDIR, silo_name))
68
69
70def is_project_not_in_any_configs(project_name, series, dest, base_silo_uri, ignore_silo):
71 """Return true if the project for that serie in that dest is not in any configuration"""
72 logging.info("Checking if {} is already configured for {} ({}) in another silo".format(project_name, dest.name, series.name))
73 for silo_name in SILO_NAME_LIST:
74 # we are reconfiguring current silo, ignoring it
75 if ignore_silo == silo_name:
76 continue
77 config = load_config(os.path.join(base_silo_uri, silo_name))
78 if config:
79 if (config["global"]["dest"] == dest.self_link and config["global"]["series"] == series.self_link and
80 (project_name in config["mps"] or project_name in config["sources"])):
81 logging.error("{} is already prepared for the same serie and destination in {}".format(project_name, silo_name))
82 return False
83 return True
84
85
86def return_first_available_silo(base_silo_uri):
87 """Check which silos are free and return the first one"""
88 for silo_name in SILO_NAME_LIST:
89 if not os.path.isfile(os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME)):
90 return silo_name
91 return None
92
93def get_config_step(config):
94 """Get configuration step"""
95 return config["global"]["step"]
96
97def set_config_step(config, new_step, uri=''):
98 """Set configuration step to new_step"""
99 config["global"]["step"] = new_step
100 return save_config(config, uri)
101
102def set_config_status(config, status, uri='', add_url=True):
103 """Change status to reflect latest status"""
104 build_url = os.getenv('BUILD_URL')
105 if add_url and build_url:
106 status = "{} ({}console)".format(status , build_url)
107 config["global"]["status"] = status
108 return save_config(config, uri)
109
110def get_all_projects(config):
111 """Get a list of all projets"""
112 projects = []
113 projects.extend(config["mps"])
114 projects.extend(config["sources"])
115 return projects
0116
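silomanager.py above is essentially a JSON save/load round trip keyed on a per-silo `config` file. A self-contained sketch of that round trip, using a temp directory instead of the real silo paths (the config layout mirrors what the module expects: a "global" section plus "mps" and "sources" lists; the filename constant is copied from `SILO_CONFIG_FILENAME`):

```python
# Standalone sketch of silomanager's save_config/load_config round trip.
import json
import os
import tempfile

CONFIG_FILENAME = 'config'  # mirrors SILO_CONFIG_FILENAME


def save_config(config, uri):
    os.makedirs(uri, exist_ok=True)
    with open(os.path.join(uri, CONFIG_FILENAME), 'w') as f:
        json.dump(config, f)


def load_config(uri):
    try:
        with open(os.path.join(uri, CONFIG_FILENAME)) as f:
            return json.load(f)
    except (IOError, ValueError):
        return None  # silo not configured, or config unreadable


silo_dir = os.path.join(tempfile.mkdtemp(), 'landing-001')
config = {'global': {'step': 0, 'status': ''},
          'mps': ['lp:~me/proj/fix'], 'sources': ['proj2']}
save_config(config, silo_dir)
loaded = load_config(silo_dir)
```

As in the original, a missing or corrupt config file is reported as `None` rather than raised, which is what lets `return_first_available_silo` treat "no config file" as "free silo".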
=== added file 'branch-source-builder/cupstream2distro/stack.py'
--- branch-source-builder/cupstream2distro/stack.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/stack.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,204 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20import logging
21import os
22import yaml
23
24from .settings import DEFAULT_CONFIG_STACKS_DIR, STACK_STATUS_FILENAME, STACK_STARTED_FILENAME
25from .utils import ignored
26
27_stacks_ref = {}
28
29# TODO: should be used by a metaclass
30def get_stack(release, stack_name):
31 try:
32 return _stacks_ref[release][stack_name]
33 except KeyError:
34 return Stack(release, stack_name)
35
36class Stack():
37
38 def __init__(self, release, stack_name):
39 self.stack_name = stack_name
40 self.release = release
41 self.statusfile = os.path.join('..', '..', release, stack_name, STACK_STATUS_FILENAME)
42 self.startedfile = os.path.join('..', '..', release, stack_name, STACK_STARTED_FILENAME)
43 self.stack_file_path = None
44 self._dependencies = None
45 self._rdependencies = None
46 for stack_file_path in Stack.get_stacks_file_path(release):
47 if stack_file_path.split(os.path.sep)[-1] == "{}.cfg".format(stack_name):
48 self.stack_file_path = stack_file_path
49 break
50 if not self.stack_file_path:
51 raise Exception("{}.cfg for {} doesn't exist anywhere in {}".format(stack_name, release, self.get_root_stacks_dir()))
52
53 with open(self.stack_file_path, 'r') as f:
54 cfg = yaml.load(f)
55 try:
56 self.forced_manualpublish = cfg['stack']['manualpublish']
57 except (TypeError, KeyError):
58 self.forced_manualpublish = False
59 # register to the global dict
60 _stacks_ref.setdefault(release, {})[stack_name] = self
61
62 def get_status(self):
63 '''Return a stack status
64
65 0: everything is fine and published
66 1: the stack failed in a step
67 2: the stack succeeded, but needs manual publishing
68
69 Return None if the status is not available yet'''
70
71 cfg = yaml.load(open(self.stack_file_path))
72 with ignored(KeyError):
73 if cfg['stack']['status_ignored']:
74 return 0
75 if not os.path.isfile(self.statusfile):
76 return None
77 with open(self.statusfile, 'r') as f:
78 return(int(f.read()))
79
80 def is_started(self):
81 '''Return True if the stack is started (dep-wait or building)'''
82 if os.path.isfile(self.startedfile):
83 return True
84 return False
85
86 def is_building(self):
87 '''Return True if the stack is building'''
88 if os.path.isfile(self.startedfile):
89 return True
90 return False
91
92 def is_enabled(self):
93 '''Return True if the stack is enabled for daily release'''
94 with open(self.stack_file_path, 'r') as f:
95 cfg = yaml.load(f)
96 try:
97 if not cfg['stack']['enabled']:
98 return False
99 except KeyError:
100 pass
101 return True
102
103 def get_direct_depending_stacks(self):
104 '''Get a list of direct depending stacks'''
105 if self._dependencies is not None:
106 return self._dependencies
107
108 with open(self.stack_file_path, 'r') as f:
109 cfg = yaml.load(f)
110 try:
111 deps_list = cfg['stack']['dependencies']
112 self._dependencies = []
113 if not deps_list:
114 return self._dependencies
115 for item in deps_list:
116 if isinstance(item, dict):
117 (stackname, release) = (item["name"], item["release"])
118 else:
119 (stackname, release) = (item, self.release)
120 self._dependencies.append(get_stack(release, stackname))
121 logging.info("{} ({}) dependency list is: {}".format(self.stack_name, self.release, ["{} ({})".format(stack.stack_name, stack.release) for stack in self._dependencies]))
122 return self._dependencies
123 except (TypeError, KeyError):
124 return []
125
126 def get_direct_rdepends_stack(self):
127 '''Get a list of direct rdepends'''
128 if self._rdependencies is not None:
129 return self._rdependencies
130
131 self._rdependencies = []
132 for stackfile in Stack.get_stacks_file_path(self.release):
133 path = stackfile.split(os.path.sep)
134 stack = get_stack(path[-2], path[-1].replace(".cfg", ""))
135 if self in stack.get_direct_depending_stacks():
136 self._rdependencies.append(stack)
137 return self._rdependencies
138
139 def generate_dep_status_message(self):
140 '''Return a list of potential problems from other stacks which should block the current publication'''
141
142 # TODO: get the first Stack object
143 # iterate over all stacks objects from dep chain
144 # call get_status on all of them
145
146 global_dep_status_info = []
147 for stack in self.get_direct_depending_stacks():
148 logging.info("Check status for {} ({})".format(stack.stack_name, stack.release))
149 status = stack.get_status()
150 message = None
151 # We should have a status for every stack
152 if status is None:
153 message = "Can't find status for {depstack} ({deprel}). This shouldn't happen unless the stack is currently running. If this is the case, it means that the current stack shouldn't be uploaded as the state is unknown.".format(depstack=stack, deprel=stack.release)
154 elif status == 1:
155 message = '''{depstack} ({deprel}) failed to publish. Possible causes are:
156 * the stack really didn't build/can't be prepared at all.
157 * the stack has integration tests not working with this previous stack.
158
159 What needs to be done:
160 Either:
161 * If we want to publish both stacks: retry the integration tests for {depstack} ({deprel}), including components from this stack (check with the whole PPA). If that works, both stacks should be published at the same time.
162 Or:
163 * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
164 elif status == 2:
165 message = '''{depstack} ({deprel}) is in manually publish mode. Possible causes are:
166 * Some part of the stack has packaging changes
167 * This stack is depending on another stack not being published
168
169 What needs to be done:
170 Either:
171 * If {depstack} ({deprel}) can be published, we should publish both stacks at the same time.
172 Or:
173 * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
174 elif status == 3 or status == -1:
175 message = '''{depstack} ({deprel}) has been manually aborted or failed for an unknown reason. Possible causes are:
176 * A job of this stack was stopped manually
177 * Jenkins had an internal error/shutdown
178
179 What needs to be done:
180 * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
181
182 if message:
183 logging.warning(message)
184 global_dep_status_info.append(message)
185 return global_dep_status_info
186
187 @staticmethod
188 def get_root_stacks_dir():
189 '''Get root stack dir'''
190 return os.environ.get('CONFIG_STACKS_DIR', DEFAULT_CONFIG_STACKS_DIR)
191
192 @staticmethod
193 def get_stacks_file_path(release):
194 '''Return an iterator with all path for every discovered stack files'''
195 for root, dirs, files in os.walk(os.path.join(Stack.get_root_stacks_dir(), release)):
196 for candidate in files:
197 if candidate.endswith('.cfg'):
198 yield os.path.join(root, candidate)
199
200 @staticmethod
201 def get_current_stack():
202 '''Return current stack object based on current path (release/stackname)'''
203 path = os.getcwd().split(os.path.sep)
204 return get_stack(path[-2], path[-1])
0205
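The heart of `Stack.generate_dep_status_message` above is a status sweep over direct dependencies, turning each non-zero or missing status into a blocking message. A stripped-down sketch of that aggregation (stack names and statuses are invented; the real code builds much longer messages and reads statuses from per-stack files):

```python
# Sketch of the dependency status aggregation in generate_dep_status_message:
# walk the direct dependencies, collect a warning for any whose status is
# missing or non-zero, using the same status codes as Stack.get_status.
def collect_dep_problems(dep_statuses):
    """dep_statuses: mapping of stack name -> status (None, 0, 1, 2, 3, -1)."""
    problems = []
    for name, status in sorted(dep_statuses.items()):
        if status is None:
            problems.append("Can't find status for {}".format(name))
        elif status == 1:
            problems.append("{} failed to publish".format(name))
        elif status == 2:
            problems.append("{} is in manual publish mode".format(name))
        elif status in (3, -1):
            problems.append("{} was aborted or failed for an unknown reason".format(name))
    return problems


problems = collect_dep_problems({'indicators': 0, 'unity': 2, 'qa': None})
```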
=== added file 'branch-source-builder/cupstream2distro/stacks.py'
--- branch-source-builder/cupstream2distro/stacks.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/stacks.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,89 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20import logging
21import os
22import yaml
23import subprocess
24
25from .settings import PACKAGE_LIST_RSYNC_FILENAME_PREFIX, RSYNC_PATTERN
26from .tools import get_packaging_diff_filename
27from .stack import Stack
28
29
30def _rsync_stack_files():
31 '''rsync all stack files'''
32 server = os.getenv('CU2D_RSYNCSVR')
33 if server == "none":
34 return
35 elif server:
36 remoteaddr = RSYNC_PATTERN.replace('RSYNCSVR', server)
37 else:
38 raise Exception('Please set environment variable CU2D_RSYNCSVR')
39
40 cmd = ["rsync", '--remove-source-files', '--timeout=60', remoteaddr, '.']
41 instance = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
42 (stdout, stderr) = instance.communicate()
43 if instance.returncode not in (0, 23):
44 raise Exception(stderr.decode("utf-8").strip())
45
46
47def get_stack_files_to_sync():
48 '''Return a list of tuple: (file, release)'''
49 _rsync_stack_files()
50 for file in os.listdir('.'):
51 if file.startswith(PACKAGE_LIST_RSYNC_FILENAME_PREFIX):
52 yield (file, file.split('-')[-1])
53
54
55def get_allowed_projects(release):
56 '''Get all projects allowed to be uploaded for this release'''
57
58 projects = []
59 for file_path in Stack.get_stacks_file_path(release):
60 with open(file_path, 'r') as f:
61 cfg = yaml.load(f)
62 try:
63 projects_list = cfg['stack']['projects']
64 except (TypeError, KeyError):
65 logging.warning("{} seems broken in not having stack or projects keys".format(file_path))
66 continue
67 if not projects_list:
68 logging.warning("{} don't have any project list".format(file_path))
69 continue
70 for project in projects_list:
71 if isinstance(project, dict):
72 projects.append(project.keys()[0])
73 else:
74 projects.append(project)
75 return set(projects)
76
77def get_stack_packaging_change_status(source_version_list):
78 '''Return global package change status list
79
80 # FIXME: added too many infos now, should only be: (source, version)
81 source_version_list is a list of couples (source, version, tip_rev, target_branch)'''
82
83 packaging_change_status = []
84 for (source, version, tip_rev, target_branch) in source_version_list:
85 if os.path.exists(get_packaging_diff_filename(source, version)):
86 message = "Packaging change for {} ({}).".format(source, version)
87 logging.warning(message)
88 packaging_change_status.append(message)
89 return packaging_change_status
090
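`get_allowed_projects` above accepts project entries that are either bare names or one-key dicts (name mapped to options). The same normalization on an in-memory config instead of the on-disk YAML files (written Python 3-style with `next(iter(...))` where the original, Python 2 code uses `project.keys()[0]`):

```python
# Sketch of the project-list normalization in get_allowed_projects:
# entries may be plain strings or single-key dicts; either way only the
# project name is kept, deduplicated into a set.
def extract_projects(cfg):
    projects = []
    for project in cfg.get('stack', {}).get('projects') or []:
        if isinstance(project, dict):
            projects.append(next(iter(project)))  # dict entry: take the key
        else:
            projects.append(project)
    return set(projects)


allowed = extract_projects(
    {'stack': {'projects': ['unity', {'nux': {'daily_release': False}}]}})
```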
=== added file 'branch-source-builder/cupstream2distro/tools.py'
--- branch-source-builder/cupstream2distro/tools.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/tools.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,94 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2012 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20import ConfigParser
21import glob
22import os
23import shutil
24from xml.sax.saxutils import quoteattr, escape
25
26from .settings import PROJECT_CONFIG_SUFFIX
27from .utils import ignored
28
29WRAPPER_STRING = '''<testsuite errors="0" failures="{}" name="" tests="1" time="0.1">
30 <testcase classname="MarkUnstable" name={} time="0.0">{}</testcase>
31</testsuite>'''
32
33
34def generate_xml_artefacts(test_name, details, filename):
35 '''Generate a fake test name xml result for marking the build as unstable'''
36 failure = ""
37 errnum = 0
38 for detail in details:
39 errnum = 1
40 failure += ' <failure type="exception">{}</failure>\n'.format(escape(detail))
41 if failure:
42 failure = '\n{}'.format(failure)
43
44 with open(filename, 'w') as f:
45 f.write(WRAPPER_STRING.format(errnum, quoteattr(test_name), failure))
46
47
48def get_previous_distro_version_from_config(source_package_name):
49 '''Get previous packaging version which was in bzr from the saved config'''
50 config = ConfigParser.RawConfigParser()
51 config.read("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX))
52 return config.get('Package', 'dest_current_version')
53
54
55def save_project_config(source_package_name, branch, revision, dest_current_version, current_packaging_version):
56 '''Save branch and package configuration'''
57 config = ConfigParser.RawConfigParser()
58 config.add_section('Branch')
59 config.set('Branch', 'branch', branch)
60 config.set('Branch', 'rev', revision)
61 config.add_section('Package')
62 config.set('Package', 'dest_current_version', dest_current_version)
63 config.set('Package', 'packaging_version', current_packaging_version)
64 with open("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX), 'wb') as configfile:
65 config.write(configfile)
66
67
68def get_packaging_diff_filename(source_package_name, packaging_version):
69 '''Return the packaging diff filename'''
70
71 return "packaging_changes_{}_{}.diff".format(source_package_name, packaging_version)
72
73
74def mark_project_as_published(source_package_name, packaging_version):
75 '''Rename .project and eventual diff files so that if we do a partial rebuild, we don't try to republish them'''
76 project_filename = "{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)
77 os.rename(project_filename, "{}_{}".format(project_filename, packaging_version))
78 diff_filename = get_packaging_diff_filename(source_package_name, packaging_version)
79 if os.path.isfile(diff_filename):
80 os.rename(diff_filename, "{}.published".format(diff_filename))
81
82
83def clean_source(source):
84 """clean all related source content from current silos"""
85 with ignored(OSError):
86 shutil.rmtree(source)
87 with ignored(OSError):
88 os.remove("{}.{}".format(source, PROJECT_CONFIG_SUFFIX))
89 with ignored(OSError):
90 shutil.rmtree("ubuntu/{}".format(source))
91 for filename in glob.glob("{}_*".format(source)):
92 os.remove(filename)
93 for filename in glob.glob("packaging_changes_{}_*diff".format(source)):
94 os.remove(filename)
095
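The `generate_xml_artefacts` helper above relies on a small trick: emit a JUnit-style result file with one synthetic testcase whose `<failure>` nodes carry the human-readable details, so Jenkins marks the build unstable. That trick, reproduced standalone with the same escaping:

```python
# The "fake test result" trick from tools.generate_xml_artefacts, as a
# pure function returning the XML string instead of writing a file.
from xml.sax.saxutils import escape, quoteattr

WRAPPER = ('<testsuite errors="0" failures="{}" name="" tests="1" time="0.1">\n'
           '  <testcase classname="MarkUnstable" name={} time="0.0">{}</testcase>\n'
           '</testsuite>')


def build_artefact(test_name, details):
    failure = ''
    errnum = 0
    for detail in details:
        errnum = 1
        # escape() keeps <, >, & in the detail text from breaking the XML
        failure += '    <failure type="exception">{}</failure>\n'.format(escape(detail))
    if failure:
        failure = '\n{}'.format(failure)
    return WRAPPER.format(errnum, quoteattr(test_name), failure)


xml = build_artefact('publisher', ['packaging change for foo <1.0>'])
```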
=== added file 'branch-source-builder/cupstream2distro/utils.py'
--- branch-source-builder/cupstream2distro/utils.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/cupstream2distro/utils.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,29 @@
1# -*- coding: utf-8 -*-
2# Copyright (C) 2013 Canonical
3#
4# Authors:
5# Didier Roche
6#
7# This program is free software; you can redistribute it and/or modify it under
8# the terms of the GNU General Public License as published by the Free Software
9# Foundation; version 3.
10#
11# This program is distributed in the hope that it will be useful, but WITHOUT
12# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
13# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
14# details.
15#
16# You should have received a copy of the GNU General Public License along with
17# this program; if not, write to the Free Software Foundation, Inc.,
18# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
19
20from contextlib import contextmanager
21
22
23# this is stolen from python 3.4 :)
24@contextmanager
25def ignored(*exceptions):
26 try:
27 yield
28 except exceptions:
29 pass
030
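The `ignored` helper above behaves like `contextlib.suppress` from Python 3.4: listed exception types are swallowed, anything else propagates. A usage sketch (re-declaring the helper so the block is self-contained):

```python
# Usage of the ignored() context manager: the listed exceptions are
# silently discarded, unlisted ones still propagate.
import os
from contextlib import contextmanager


@contextmanager
def ignored(*exceptions):
    try:
        yield
    except exceptions:
        pass


# Swallowed: removing a file that doesn't exist raises OSError.
with ignored(OSError):
    os.remove('/nonexistent/path/definitely-not-here')

# Not swallowed: a ValueError is not in the ignored list.
propagated = False
try:
    with ignored(OSError):
        raise ValueError('not ignored')
except ValueError:
    propagated = True
```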
=== modified file 'branch-source-builder/run_worker'
--- branch-source-builder/run_worker 2014-01-06 19:47:57 +0000
+++ branch-source-builder/run_worker 2014-02-11 20:59:14 +0000
@@ -18,7 +18,10 @@
 import logging
 import os
 import sys
+import time
 
+import upload_package
+import watch_ppa
 
 logging.basicConfig(level=logging.INFO)
 log = logging.getLogger(__name__)
@@ -27,23 +30,59 @@
 # and add it, so we can always safely import stuff
 sys.path.append(os.path.join(os.path.dirname(__file__), '../ci-utils'))
 from ci_utils import amqp_utils
-
+TIME_BETWEEN_CHECKS = 60
 
 def on_message(msg):
-    log.info('on_message: %s', msg.body)
+    log.info('on_message: {}'.format(msg.body))
     params = json.loads(msg.body)
     sources = params['source_packages']
     ppa = params['ppa']
+    log.info('The PPA is: {}'.format(ppa))
     trigger = params['progress_trigger']
+
+    # Setup the output data to send back to the caller
+    # TODO: sources_packages are just the passed in source files, this
+    # can be made to be more useful.
+    # TODO: artifacts will be populated with artifacts from this build.
+    out_data = {'source_packages': sources,
+                'ppa': ppa,
+                'artifacts': []}
     amqp_utils.progress_update(trigger, params)
 
-    # TODO build stuff
-
-    if amqp_utils.progress_completed(trigger):
-        log.error('Unable to notify progress-trigger completition of action')
-
-    # remove from queue so request becomes completed
-    msg.channel.basic_ack(msg.delivery_tag)
+    # TODO Replace these hardcoded parameters with params from the message
+    archive_ppa = 'ppa:ci-engineering-airline/ci-archive'
+    series = 'saucy'
+    try:
+        upload_list = upload_package.upload_source_packages(ppa, sources)
+        log.info('upload_list: {}'.format(upload_list))
+        start_time = time.time()
+        while True:
+            (ret, status) = watch_ppa.watch_ppa(start_time, series, ppa,
+                                                archive_ppa, None,
+                                                upload_list)
+            progress = {}
+            for key in status:
+                progress[key] = str(status[key])
+            log.info('progress: {}'.format(progress))
+            out_data['status'] = progress
+            amqp_utils.progress_update(trigger, out_data)
+            if ret == -1:
+                log.info('Going to sleep for {}'.format(TIME_BETWEEN_CHECKS))
+                time.sleep(TIME_BETWEEN_CHECKS)
+            else:
+                log.info('All done')
+                break
+
+        amqp_utils.progress_completed(trigger, out_data)
+    except Exception as e:
+        error_msg = 'Exception: {}'.format(e)
+        log.error(error_msg)
+        out_data['error_message'] = error_msg
+        amqp_utils.progress_failed(trigger, out_data)
+    finally:
+        # remove from queue so request becomes completed
+        log.info('Acking the request: {}'.format(msg.body))
+        msg.channel.basic_ack(msg.delivery_tag)
 
 
 if __name__ == '__main__':
 
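The worker's PPA wait in `run_worker` is a plain poll loop: call a checker, publish progress, sleep while it returns -1, stop otherwise. That pattern sketched with a stubbed checker and zero sleep so it runs instantly (the real code calls `watch_ppa.watch_ppa` and sleeps `TIME_BETWEEN_CHECKS` seconds between rounds):

```python
# Sketch of the run_worker poll loop: check -> publish progress -> sleep
# while the checker reports "still building" (-1), return otherwise.
import time


def poll_until_done(check, publish, interval=0):
    while True:
        ret, status = check()
        # mirror run_worker: stringify status values before publishing
        publish({k: str(v) for k, v in status.items()})
        if ret == -1:
            time.sleep(interval)
        else:
            return ret


# Stubbed checker: one "building" round, then done.
results = iter([(-1, {'foo': 'building'}), (0, {'foo': 'published'})])
updates = []
final = poll_until_done(lambda: next(results), updates.append)
```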
=== added file 'branch-source-builder/upload_package.py'
--- branch-source-builder/upload_package.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/upload_package.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,137 @@
1#!/usr/bin/env python
2# Ubuntu CI Engine
3# Copyright 2013 Canonical Ltd.
4
5# This program is free software: you can redistribute it and/or modify it
6# under the terms of the GNU Affero General Public License version 3, as
7# published by the Free Software Foundation.
8
9# This program is distributed in the hope that it will be useful, but
10# WITHOUT ANY WARRANTY; without even the implied warranties of
11# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
12# PURPOSE. See the GNU Affero General Public License for more details.
13
14# You should have received a copy of the GNU Affero General Public License
15# along with this program. If not, see <http://www.gnu.org/licenses/>.
16
17import argparse
18import atexit
19from dput.changes import parse_changes_file
20import json
21import logging
22import os
23import re
24import shutil
25import sys
26import tempfile
27import urllib2
28
29from cupstream2distro import packagemanager
30
31
32def parse_arguments():
33 parser = argparse.ArgumentParser(
34 description='Source package upload handler.')
35 parser.add_argument('-p', '--ppa',
36 required=True,
37 help='Target PPA for source package upload.')
38 parser.add_argument('-s', '--source',
39 required=True,
40 action='append',
41 help='A source package file to include in the upload.')
42 return parser.parse_args()
43
44
45def cleanup(directory):
46 if os.path.exists(directory) and os.path.isdir(directory):
47 logging.info('Removing temp directory: {}'.format(directory))
48 shutil.rmtree(directory)
49
50
51def create_temp_dir():
52 directory = tempfile.mkdtemp()
53 atexit.register(cleanup, directory)
54 return directory
55
56
57def get_url_contents(url):
58 try:
59 req = urllib2.Request(url)
60 f = urllib2.urlopen(req)
61 response = f.read()
62 f.close()
63 return response
64 except IOError as e:
65 raise EnvironmentError('Failed to open url [{}]: {}'.format(url, e))
66
67
68def get_source_files(source_files):
69 location = create_temp_dir()
70 local_files = []
71 for source in source_files:
72 # Co-locate all the files into a temp directory
73 file_name = os.path.basename(source)
74 file_path = os.path.join(location, file_name)
75 logging.info('Retrieving source file: {}'.format(file_name))
76 with open(file_path, 'w') as out_file:
77 out_file.write(get_url_contents(source))
78 local_files.append(file_name)
79 return (location, local_files)
80
81
82def parse_dsc_file(dsc_file):
83 regexp = re.compile("^Architecture: (.*)\n")
84 with open(dsc_file) as in_file:
85 for line in in_file:
86 arch_lists = regexp.findall(line)
87 if arch_lists:
88 return arch_lists[0]
89
90
91def parse_source_files(source_directory, source_files):
92 package_list = []
93 for source in source_files:
94 source_name = os.path.join(source_directory, source)
95 if source.endswith('changes'):
96 changes = parse_changes_file(filename=source_name,
97 directory=source_directory)
98 package = {}
99 package['name'] = changes.get('Source')
100 package['version'] = changes.get('Version')
101 package['files'] = []
102 package['files'].append(changes.get_changes_file())
103 for package_file in changes.get_files():
104 if os.path.exists(package_file):
105 if package_file.endswith('.dsc'):
106 package['architecture'] = parse_dsc_file(
107 package_file)
108 package['files'].append(package_file)
109 else:
110 raise EnvironmentError(
111 'Not found: {}'.format(package_file))
112 package_list.append(package)
113 return package_list
114
115
116def upload_source_packages(ppa, upload_files):
117 '''Attempts source file upload into the PPA.'''
118 logging.info('Upload to the ppa: {}'.format(ppa))
119 (source_directory, source_files) = get_source_files(upload_files)
120 source_packages = parse_source_files(source_directory, source_files)
121 for package in source_packages:
122 packagemanager.upload_package(package['name'], package['version'],
123 ppa, source_directory)
124 # TODO Return the data set of packages uploaded
125 return source_packages
126
127
128def main():
129 logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s "
130 "%(message)s")
131
132 args = parse_arguments()
133 upload_source_packages(args.ppa, args.source)
134
135
136if __name__ == '__main__':
137 sys.exit(main())
0138
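`parse_dsc_file` above pulls the Architecture field out of a .dsc with a line-anchored regex. An equivalent sketch on an in-memory string (using one MULTILINE search where the original scans the file line by line; the .dsc content is a made-up example):

```python
# Equivalent of upload_package.parse_dsc_file on a string: extract the
# Architecture field from Debian .dsc-style control text.
import re


def parse_dsc_text(text):
    regexp = re.compile(r"^Architecture: (.*)\n", re.MULTILINE)
    match = regexp.search(text)
    return match.group(1) if match else None


dsc = ("Format: 3.0 (quilt)\n"
       "Source: hello\n"
       "Architecture: any all\n"
       "Version: 2.8-1\n")
arch = parse_dsc_text(dsc)
```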
=== added file 'branch-source-builder/watch_ppa.py'
--- branch-source-builder/watch_ppa.py 1970-01-01 00:00:00 +0000
+++ branch-source-builder/watch_ppa.py 2014-02-11 20:59:14 +0000
@@ -0,0 +1,182 @@
1#!/usr/bin/python
2# -*- coding: utf-8 -*-
3# Copyright (C) 2012 Canonical
4#
5# Authors:
6# Didier Roche
7#
8# This program is free software; you can redistribute it and/or modify it under
9# the terms of the GNU General Public License as published by the Free Software
10# Foundation; version 3.
11#
12# This program is distributed in the hope that it will be useful, but WITHOUT
13# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
14# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
15# details.
16#
17# You should have received a copy of the GNU General Public License along with
18# this program; if not, write to the Free Software Foundation, Inc.,
19# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
20
21import argparse
22import logging
23import os
24import sys
25import time
26
27from cupstream2distro import (launchpadmanager, packageinppamanager)
28from cupstream2distro.packageinppa import PackageInPPA
29from cupstream2distro.packagemanager import list_packages_info_in_str
30from cupstream2distro.settings import (
31 TIME_BETWEEN_PPA_CHECKS, TIME_BEFORE_STOP_LOOKING_FOR_SOURCE_PUBLISH)
32
33
34def parse_arguments():
35 parser = argparse.ArgumentParser(
36        description="Watch for published packages in a PPA",
37        epilog="""series and ppa can also be set via environment
38        variables of the same name""")
39 parser.add_argument("-s", "--series",
40                        help="Series used to build the package")
41 parser.add_argument("-p", "--ppa",
42 help="""PPA to publish this package to (for instance:
43 'ubuntu-unity/daily-build')""")
44 parser.add_argument("-a", "--arch",
45 default=None,
46                        help="Only consider the provided architecture")
47 parser.add_argument("-d", "--destppa",
48 help="""Consider this destppa instead of
49 {series}-proposed""")
50 return parser.parse_args()
51
52
53def get_environment_variables(args):
54 series = args.series
55 ppa = args.ppa
56 if not series:
57 series = os.getenv("series")
58 if not ppa:
59 ppa = os.getenv("ppa")
60 return (series, ppa)
61
62
63def get_ppa(ppa):
64 return launchpadmanager.get_ppa(ppa)
65
66
67def watch_ppa(time_start, series, ppa, dest_ppa, arch, upload_list):
68 # Prepare launchpad connection:
69 lp_series = launchpadmanager.get_series(series)
70 monitored_ppa = launchpadmanager.get_ppa(ppa)
71 if dest_ppa:
72 dest_archive = get_ppa(dest_ppa)
73 else:
74 dest_archive = launchpadmanager.get_ubuntu_archive()
75 logging.info('Series: {}'.format(lp_series))
76 logging.info('Monitoring PPA: {}'.format(monitored_ppa))
77
78 logging.info('Destination Archive: {}'.format(dest_archive))
79
80 # Get archs available and archs to ignore
81 (available_archs_in_ppa,
82 arch_all_arch) = launchpadmanager.get_available_and_all_archs(
83 lp_series, monitored_ppa)
84 (archs_to_eventually_ignore,
85 archs_to_unconditionally_ignore) = launchpadmanager.get_ignored_archs()
86 logging.info('Arches available in ppa: {}'.format(available_archs_in_ppa))
87 logging.info('All arch in ppa: {}'.format(arch_all_arch))
88 logging.info('Arches to eventually ignore: {}'.format(
89 archs_to_eventually_ignore))
90 logging.info('Arches to unconditionally ignore: {}'.format(
91 archs_to_unconditionally_ignore))
92
93 # Collecting all packages that have been uploaded to the ppa
94 packages_not_in_ppa = set()
95 packages_building = set()
96 packages_failed = set()
97 for source_package in upload_list:
98 source = source_package['name']
99 version = source_package['version']
100 archs = source_package['architecture']
101 logging.info('Inspecting upload: {} - {}'.format(source, version))
102 packages_not_in_ppa.add(PackageInPPA(source, version, monitored_ppa,
103 dest_archive, lp_series,
104 available_archs_in_ppa,
105 arch_all_arch,
106 archs_to_eventually_ignore,
107 archs_to_unconditionally_ignore,
108 package_archs=archs))
109
110    # packages_not_in_ppa are packages that were uploaded and are expected
111 # to eventually appear in the ppa.
112 logging.info('Packages not in PPA: {}'.format(
113 list_packages_info_in_str(packages_not_in_ppa)))
114 logging.info('Packages building: {}'.format(packages_building))
115 logging.info('Packages failed: {}'.format(packages_failed))
116
117 # Check the status regularly on all packages
118 # TODO The following is the original check loop. This can be extracted
119 # and optimized.
120 logging.info("Checking the status for {}".format(
121 list_packages_info_in_str(
122 packages_not_in_ppa.union(packages_building))))
123 packageinppamanager.update_all_packages_status(
124 packages_not_in_ppa, packages_building, packages_failed, arch)
125
126 status = {'pending': packages_not_in_ppa,
127 'building': packages_building,
128 'failed': packages_failed}
129    # if packages are still missing from the ppa, none are building, and
130    # we have waited long enough for them to appear, give up
131 if (packages_not_in_ppa and not packages_building and
132 ((time.time() - time_start) >
133 TIME_BEFORE_STOP_LOOKING_FOR_SOURCE_PUBLISH)):
134 # TODO return error on the missing packages
135 logging.info(
136 "Some source packages were never published in the ppa: "
137 "{}".format(list_packages_info_in_str(packages_not_in_ppa)))
138 return (1, status)
139
140 # break out of status check loop if all packages have arrived in
141 # the ppa and have completed building
142 if not packages_not_in_ppa and not packages_building:
143 if packages_failed:
144 # TODO Return package failure info
145 logging.info(
146 "Some of the packages failed to build: {}".format(
147 list_packages_info_in_str(packages_failed)))
148 return (1, status)
149 return (0, status)
150
151 # -1 indicates to retry
152 # TODO return useful status about what is still in progress
153 return (-1, status)
154
155
156def main():
157 '''Provides usage through the command line.'''
158 logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s "
159 "%(message)s")
160
161 args = parse_arguments()
162 (series, ppa) = get_environment_variables(args)
163
164 if not series or not ppa:
165        logging.error("Missing required values (from arguments or "
166                      "environment): ppa: {}, series: {}".format(ppa, series))
167 return 1
168
169 # TODO UPLOAD LIST is not ready for use via the CLI yet.
170 # The original cu2d tools extracted this from files stored locally
171 time_start = time.time()
172 while True:
173 (ret, status) = watch_ppa(time_start, series, ppa,
174 args.destppa, args.arch, UPLOAD_LIST)
175 if ret == -1:
176 time.sleep(TIME_BETWEEN_PPA_CHECKS)
177 else:
178 return ret
179
180
181if __name__ == '__main__':
182 sys.exit(main())
0183
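The `main()`/`watch_ppa()` split above implements a simple polling contract: the worker function returns `-1` as a "retry later" sentinel and any other value as a final result, while the caller sleeps between attempts. A small sketch of that pattern in isolation (the function and variable names here are invented for illustration, not part of the branch):

```python
import time


def poll_until_done(check, interval=0.01, max_checks=100):
    """Re-run `check` until it stops returning the -1 retry sentinel.

    `check` returns (ret, status); ret == -1 means "not done, try again",
    mirroring the contract between main() and watch_ppa().
    """
    for _ in range(max_checks):
        ret, status = check()
        if ret != -1:
            return ret, status
        time.sleep(interval)
    # Give up with an error, rather than looping forever.
    return 1, {'error': 'timed out'}


# Fake check whose package "publishes" on the third poll.
attempts = []


def fake_check():
    attempts.append(1)
    if len(attempts) < 3:
        return -1, {'pending': {'hello'}}
    return 0, {'pending': set()}


print(poll_until_done(fake_check))  # -> (0, {'pending': set()})
```

Unlike this sketch, `watch_ppa.py` bounds the wait with `TIME_BEFORE_STOP_LOOKING_FOR_SOURCE_PUBLISH` inside the check itself, which only trips while sources are still missing from the PPA; once builds are in progress the loop waits indefinitely.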
=== modified file 'charms/precise/lander-jenkins/templates/jobs/lander_master.xml'
--- charms/precise/lander-jenkins/templates/jobs/lander_master.xml 2014-02-06 21:13:48 +0000
+++ charms/precise/lander-jenkins/templates/jobs/lander_master.xml 2014-02-11 20:59:14 +0000
@@ -168,13 +168,6 @@
 child_number=$TRIGGERED_BUILD_NUMBER_lander_branch_source_builder
 wget -O params.json "${JENKINS_URL}job/${LAST_TRIGGERED_JOB_NAME}/${child_number}/artifact/results/params.json"
 
-# TODO - temporary until the bsbuilder works properly
-cat > params.json <<EOF
-{
-  "ppa": "ppa:phablet-team/tools"
-}
-EOF
-
 /srv/lander_jenkins_sub/lander/bin/lander_merge_parameters.py --result-file params.json --service bsbuilder --output-file all.json --prior-file all.json
 
 # Convert to a format that can be passed to the child jobs
=== modified file 'juju-deployer/branch-source-builder.yaml'
--- juju-deployer/branch-source-builder.yaml 2014-01-31 09:15:34 +0000
+++ juju-deployer/branch-source-builder.yaml 2014-02-11 20:59:14 +0000
@@ -18,6 +18,9 @@
     options:
       branch: lp:ubuntu-ci-services-itself
       main: ./branch-source-builder/run_worker
+      unit-config: include-base64://configs/unit_config.yaml
+      packages: dput
+      pip-packages: dput
   rabbit:
     branch: lp:~canonical-ci-engineering/charms/precise/ubuntu-ci-services-itself/rabbitmq-server
     charm: rabbitmq
