Merge lp:~fginther/ubuntu-ci-services-itself/bsb-dput-3 into lp:ubuntu-ci-services-itself
Status: | Merged |
---|---|
Approved by: | Francis Ginther |
Approved revision: | 195 |
Merged at revision: | 233 |
Proposed branch: | lp:~fginther/ubuntu-ci-services-itself/bsb-dput-3 |
Merge into: | lp:ubuntu-ci-services-itself |
Diff against target: |
2431 lines (+2300/-16) 16 files modified
branch-source-builder/cupstream2distro/branchhandling.py (+283/-0)
branch-source-builder/cupstream2distro/launchpadmanager.py (+157/-0)
branch-source-builder/cupstream2distro/packageinppa.py (+225/-0)
branch-source-builder/cupstream2distro/packageinppamanager.py (+109/-0)
branch-source-builder/cupstream2distro/packagemanager.py (+504/-0)
branch-source-builder/cupstream2distro/settings.py (+121/-0)
branch-source-builder/cupstream2distro/silomanager.py (+115/-0)
branch-source-builder/cupstream2distro/stack.py (+204/-0)
branch-source-builder/cupstream2distro/stacks.py (+89/-0)
branch-source-builder/cupstream2distro/tools.py (+94/-0)
branch-source-builder/cupstream2distro/utils.py (+29/-0)
branch-source-builder/run_worker (+48/-9)
branch-source-builder/upload_package.py (+137/-0)
branch-source-builder/watch_ppa.py (+182/-0)
charms/precise/lander-jenkins/templates/jobs/lander_master.xml (+0/-7)
juju-deployer/branch-source-builder.yaml (+3/-0)
To merge this branch: | bzr merge lp:~fginther/ubuntu-ci-services-itself/bsb-dput-3 |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Andy Doan (community) | Approve | ||
PS Jenkins bot (community) | continuous-integration | Approve | |
Francis Ginther | Needs Resubmitting | ||
Review via email: mp+205851@code.launchpad.net |
Commit message
Adds the dput and ppa_watch functionality to the branch source builder. This is very raw; most of the functionality was pulled directly from lp:cupstream2distro, with small changes for the input parameters and the deployment environment.
Description of the change
Adds the dput and ppa_watch functionality to the branch source builder. This is very raw; most of the functionality was pulled directly from lp:cupstream2distro, with small changes for the input parameters and the deployment environment.
The series and archive PPA are currently hard coded. This will be removed by a subsequent MP.
You'll also notice no tests here. The cupstream2distro code also has no tests for these modules. I need to address this. For now, I'm banking on the fact that this code is already consumed by the daily-release process.
PS Jenkins bot (ps-jenkins) wrote:
PASSED: Continuous integration, rev:195
http://
Executed test runs:
Click here to trigger a rebuild:
http://
Andy Doan (doanac) wrote:
the most exciting thing i've seen today!
I say we merge this and then start testing it hard. I noticed one small thing:
2035 + archive_ppa = 'ppa:ci-
2036 + series = 'saucy'
once this is merged into trunk, I think you can pick up some work Chris added today:
http://
Francis Ginther (fginther) wrote:
> the most exciting thing i've seen today!
>
> I say we merge this and then start testing it hard. I noticed one small thing:
>
> 2035 + archive_ppa = 'ppa:ci-
> 2036 + series = 'saucy'
>
> once this is merged into trunk, I think you can pick up some work Chris added
> today:
>
> http://
> itself/
Right, I also noticed the ticket system change today, but didn't want to further delay this MP.
Andy Doan (doanac):
Preview Diff
1 | === added directory 'branch-source-builder/cupstream2distro' |
2 | === added file 'branch-source-builder/cupstream2distro/__init__.py' |
3 | === added file 'branch-source-builder/cupstream2distro/branchhandling.py' |
4 | --- branch-source-builder/cupstream2distro/branchhandling.py 1970-01-01 00:00:00 +0000 |
5 | +++ branch-source-builder/cupstream2distro/branchhandling.py 2014-02-11 20:59:14 +0000 |
6 | @@ -0,0 +1,283 @@ |
7 | +# -*- coding: utf-8 -*- |
8 | +# Copyright (C) 2012 Canonical |
9 | +# |
10 | +# Authors: |
11 | +# Didier Roche |
12 | +# |
13 | +# This program is free software; you can redistribute it and/or modify it under |
14 | +# the terms of the GNU General Public License as published by the Free Software |
15 | +# Foundation; version 3. |
16 | +# |
17 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
18 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
19 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
20 | +# details. |
21 | +# |
22 | +# You should have received a copy of the GNU General Public License along with |
23 | +# this program; if not, write to the Free Software Foundation, Inc., |
24 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
25 | + |
26 | +import ConfigParser |
27 | +from collections import defaultdict |
28 | +import logging |
29 | +import os |
30 | +import re |
31 | +import subprocess |
32 | + |
33 | +from .settings import BRANCH_URL, IGNORECHANGELOG_COMMIT, PACKAGING_MERGE_COMMIT_MESSAGE, PROJECT_CONFIG_SUFFIX, SILO_PACKAGING_RELEASE_COMMIT_MESSAGE |
34 | + |
35 | + |
36 | +def get_branch(branch_url, dest_dir): |
37 | + '''Grab a branch''' |
38 | + instance = subprocess.Popen(["bzr", "branch", branch_url, dest_dir], stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
39 | + (stdout, stderr) = instance.communicate() |
40 | + if instance.returncode != 0: |
41 | + raise Exception(stderr.decode("utf-8").strip()) |
42 | + |
43 | + |
44 | +def get_tip_bzr_revision(): |
45 | + '''Get latest revision in bzr''' |
46 | + instance = subprocess.Popen(["bzr", "log", "-c", "-1", "--line"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
47 | + (stdout, stderr) = instance.communicate() |
48 | + if instance.returncode != 0: |
49 | + raise Exception(stderr.decode("utf-8").strip()) |
50 | + return (int(stdout.split(':')[0])) |
51 | + |
52 | + |
53 | +def collect_author_commits(content_to_parse, bugs_to_skip, additional_stamp=""): |
54 | + '''return a tuple of a dict with authors and commits message from the content to parse |
55 | + |
56 | + bugs_to_skip is a set of bugs we need to skip |
57 | + |
58 | + Form: ({Author: [commit_message]}, set(bugs))''' |
59 | + |
60 | + author_commit = defaultdict(list) |
61 | + all_bugs = set() |
62 | + |
63 | + current_authors = set() |
64 | + current_commit = "" |
65 | + current_bugs = set() |
66 | + commit_message_stenza = False |
67 | + for line in content_to_parse.splitlines(): |
68 | + # new revision, collect what we have found |
69 | + if line.startswith("------------------------------------------------------------"): |
70 | + # try to decipher a special case: we have some commits which were already in bugs_to_skip, |
71 | + # so we eliminate them. |
72 | + # Also ignore when having IGNORECHANGELOG_COMMIT |
73 | + if (current_bugs and not (current_bugs - bugs_to_skip)) or IGNORECHANGELOG_COMMIT in current_commit: |
74 | + current_authors = set() |
75 | + current_commit = "" |
76 | + current_bugs = set() |
77 | + continue |
78 | + current_bugs -= bugs_to_skip |
79 | + commit_message = current_commit + _format_bugs(current_bugs) |
80 | + for author in current_authors: |
81 | + if not author.startswith("Launchpad "): |
82 | + author_commit[author].append(commit_message) |
83 | + all_bugs = all_bugs.union(current_bugs) |
84 | + current_authors = set() |
85 | + current_commit = "" |
86 | + current_bugs = set() |
87 | + |
88 | + # we ignore this commit if we have a changelog provided as part of the diff |
89 | + if line.startswith("=== modified file 'debian/changelog'"): |
90 | + current_authors = set() |
91 | + current_commit = "" |
92 | + current_bugs = set() |
93 | + |
94 | + if line.startswith("author: "): |
95 | + current_authors = _extract_authors(line[8:]) |
96 | + # if direct commit to trunk |
97 | + elif not current_authors and line.startswith("committer: "): |
98 | + current_authors = _extract_authors(line[11:]) |
99 | + # fill the commit message log |
100 | + elif commit_message_stenza: |
101 | + if line.startswith("diff:"): |
102 | + commit_message_stenza = False |
103 | + current_commit, current_bugs = _extract_commit_bugs(current_commit, additional_stamp) |
104 | + else: |
105 | + line = line[2:] # Dedent the message provided by bzr |
106 | + if line[0:2] in ('* ', '- '): # paragraph line. |
107 | + line = line[2:] # Remove bullet |
108 | + if line[-1] != '.': # Grammar nazi... |
109 | + line += '.' # ... or the lines will be merged. |
110 | + line = line + ' ' # Add a space to preserve lines |
111 | + current_commit += line |
112 | + # Maybe add something like that |
113 | + #for content in mp.commit_message.split('\n'): |
114 | + # content = content.strip() |
115 | + # if content.startswith('-') or content.startswith('*'): |
116 | + # content = content[1:] |
117 | + # description_content.append(content.strip()) |
118 | + elif line.startswith("message:"): |
119 | + commit_message_stenza = True |
120 | + elif line.startswith("fixes bug: "): |
121 | + current_bugs = current_bugs.union(_return_bugs(line[11:])) |
122 | + |
123 | + return (dict(author_commit), all_bugs) |
124 | + |
125 | + |
126 | +def _format_bugs(bugs): |
127 | + '''Format a list of launchpad bugs.''' |
128 | + if bugs: |
129 | + msg = ' (LP: {})'.format(', '.join(['#{}'.format(b) for b in bugs])) |
130 | + else: |
131 | + msg = '' |
132 | + return msg |
133 | + |
134 | + |
135 | +def _extract_commit_bugs(commit_message, additional_stamp=""): |
136 | + '''extract relevant commit message part and bugs number from a commit message''' |
137 | + |
138 | + current_bugs = _return_bugs(commit_message) |
139 | + changelog_content = " ".join(commit_message.rsplit('Fixes: ')[0].rsplit('Approved by ')[0].split()) |
140 | + if additional_stamp: |
141 | + changelog_content = changelog_content + " " + additional_stamp |
142 | + return (changelog_content, current_bugs) |
143 | + |
144 | + |
145 | +def _return_bugs(string): |
146 | + '''return a set of bugs from string''' |
147 | + |
148 | + # we are trying to match in the commit message: |
149 | + # bug #12345, bug#12345, bug12345, bug 12345 |
150 | + # lp: #12345, lp:#12345, lp:12345, lp: 12345 |
151 | + # lp #12345, lp#12345, lp12345, lp 12345, |
152 | + # Fix #12345, Fix 12345, Fix: 12345, Fix12345, Fix: #12345, |
153 | + # Fixes #12345, Fixes 12345, Fixes: 12345, Fixes:12345, Fixes: #12345 |
154 | + # *launchpad.net/bugs/1234567890 and |
155 | + # #12345 (but not 12345 for false positive) |
156 | + # Support multiple bugs per commit |
157 | + bug_numbers = set() |
158 | + bug_regexp = re.compile("((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})", re.IGNORECASE) |
159 | + for match in bug_regexp.findall(string): |
160 | + logging.debug("Bug regexp match: {}".format(match[-1])) |
161 | + bug_numbers.add(int(match[-1])) |
162 | + return bug_numbers |
163 | + |
164 | + |
165 | +def _extract_authors(string): |
166 | + '''return a set of authors from string, ignoring emails''' |
167 | + |
168 | + authors = set() |
169 | + for author_with_mail in string.split(", "): |
170 | + author = author_with_mail.rsplit(' <')[0] |
171 | + logging.debug("Found {} as author".format(author)) |
172 | + authors.add(author) |
173 | + return authors |
174 | + |
175 | + |
176 | +def return_log_diff(starting_rev): |
177 | + '''Return the relevant part of the bzr log since starting_rev''' |
178 | + |
179 | + instance = subprocess.Popen(["bzr", "log", "-r", "{}..".format(starting_rev), "--show-diff", "--forward"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
180 | + (stdout, stderr) = instance.communicate() |
181 | + if instance.returncode != 0: |
182 | + raise Exception(stderr.decode("utf-8").strip()) |
183 | + return stdout |
184 | + |
185 | + |
186 | +def return_log_diff_since_last_release(content_to_parse): |
187 | + '''From a bzr log content, return only the log diff since the latest release''' |
188 | + after_release = content_to_parse.split(SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(''))[-1] |
189 | + sep = '------------------------------------------------------------' |
190 | + sep_index = after_release.find(sep) |
191 | + if sep_index != -1: |
192 | + after_release = after_release[sep_index:] |
193 | + return after_release |
194 | + |
195 | + |
196 | +def commit_release(new_package_version, tip_bzr_rev=None): |
197 | + '''Commit latest release''' |
198 | + if not tip_bzr_rev: |
199 | + message = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(new_package_version) |
200 | + else: |
201 | + message = "Releasing {}, based on r{}".format(new_package_version, tip_bzr_rev) |
202 | + if subprocess.call(["bzr", "commit", "-m", message]) != 0: |
203 | + raise Exception("The above command returned an error.") |
204 | + |
205 | + |
206 | +def _get_parent_branch(source_package_name): |
207 | + '''Get parent branch from config''' |
208 | + config = ConfigParser.RawConfigParser() |
209 | + config.read("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)) |
210 | + return config.get('Branch', 'branch') |
211 | + |
212 | + |
213 | +def propose_branch_for_merging(source_package_name, version, tip_rev, branch): |
214 | + '''Propose and commit a branch upstream''' |
215 | + |
216 | + parent_branch = _get_parent_branch(source_package_name) |
217 | + # suppress browser opening |
218 | + env = os.environ.copy() |
219 | + env["BROWSER"] = "echo" |
220 | + env["BZR_EDITOR"] = "echo" |
221 | + |
222 | + os.chdir(source_package_name) |
223 | + if subprocess.call(["bzr", "push", BRANCH_URL.format(source_package_name, version.replace("~", "").replace(":", "")), "--overwrite"]) != 0: |
224 | + raise Exception("The push command returned an error.") |
225 | + mergeinstance = subprocess.Popen(["bzr", "lp-propose-merge", parent_branch, "-m", PACKAGING_MERGE_COMMIT_MESSAGE.format(version, tip_rev, branch), "--approve"], stdin=subprocess.PIPE, env=env) |
226 | + mergeinstance.communicate(input="y") |
227 | + if mergeinstance.returncode != 0: |
228 | + raise Exception("The lp-propose command returned an error.") |
229 | + os.chdir('..') |
230 | + |
231 | + |
232 | +def merge_branch_with_parent_into(local_branch_uri, lp_parent_branch, dest_uri, commit_message, revision): |
233 | + """Merge local branch into lp_parent_branch at revision""" |
234 | + success = False |
235 | + cur_dir = os.path.abspath('.') |
236 | + subprocess.call(["bzr", "branch", "-r", str(revision), lp_parent_branch, dest_uri]) |
237 | + os.chdir(dest_uri) |
238 | + if subprocess.call(["bzr", "merge", local_branch_uri]) == 0: |
239 | + subprocess.call(["bzr", "commit", "-m", commit_message]) |
240 | + success = True |
241 | + os.chdir(cur_dir) |
242 | + return success |
243 | + |
244 | + |
245 | +def merge_branch(uri_to_merge, lp_parent_branch, commit_message, authors=set()): |
246 | + """Resync with targeted branch if possible""" |
247 | + success = False |
248 | + cur_dir = os.path.abspath('.') |
249 | + os.chdir(uri_to_merge) |
250 | + lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:") |
251 | + if subprocess.call(["bzr", "merge", lp_parent_branch]) == 0: |
252 | + cmd = ["bzr", "commit", "-m", commit_message, "--unchanged"] |
253 | + for author in authors: |
254 | + cmd.extend(['--author', author]) |
255 | + subprocess.call(cmd) |
256 | + success = True |
257 | + os.chdir(cur_dir) |
258 | + return success |
259 | + |
260 | +def push_to_branch(source_uri, lp_parent_branch, overwrite=False): |
261 | + """Push source to parent branch""" |
262 | + success = False |
263 | + cur_dir = os.path.abspath('.') |
264 | + os.chdir(source_uri) |
265 | + lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:") |
266 | + command = ["bzr", "push", lp_parent_branch] |
267 | + if overwrite: |
268 | + command.append("--overwrite") |
269 | + if subprocess.call(command) == 0: |
270 | + success = True |
271 | + os.chdir(cur_dir) |
272 | + return success |
273 | + |
274 | +def grab_committers_compared_to(source_uri, lp_branch_to_scan): |
275 | + """Return unique list of committers for a given branch""" |
276 | + committers = set() |
277 | + cur_dir = os.path.abspath('.') |
278 | + os.chdir(source_uri) |
279 | + lp_branch_to_scan = lp_branch_to_scan.replace("https://code.launchpad.net/", "lp:") |
280 | + instance = subprocess.Popen(["bzr", "missing", lp_branch_to_scan, "--other"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
281 | + (stdout, stderr) = instance.communicate() |
282 | + if stderr != "": |
283 | + raise Exception("bzr missing on {} returned a failure: {}".format(lp_branch_to_scan, stderr.decode("utf-8").strip())) |
284 | + committer_regexp = re.compile("\ncommitter: (.*)\n") |
285 | + for match in committer_regexp.findall(stdout): |
286 | + for committer in match.split(', '): |
287 | + committers.add(committer) |
288 | + os.chdir(cur_dir) |
289 | + return committers |
290 | |
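The bug-number matching used by `_return_bugs` above can be exercised on its own. This is a minimal standalone sketch reproducing the same regular expression; the function name here is illustrative and is not part of the branch:

```python
import re

# Same pattern as _return_bugs in branchhandling.py: matches forms like
# "bug #12345", "lp: 12345", "Fixes: #12345", "launchpad.net/bugs/12345",
# or a bare "#12345". At least 5 digits are required to avoid false positives.
BUG_RE = re.compile(r"((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})",
                    re.IGNORECASE)

def extract_bug_numbers(text):
    """Return the set of bug numbers referenced in a commit message.

    findall() yields one tuple per match; the last group is the digits."""
    return set(int(match[-1]) for match in BUG_RE.findall(text))

bugs = extract_bug_numbers("Fixes: #12345, see also lp: 67890")
none = extract_bug_numbers("revision 4242 is unrelated")  # too few digits, no prefix
```

Note the deliberate `{5,}` quantifier: a bare 4-digit number like a bzr revision is never treated as a bug reference unless it carries an explicit `bug`/`lp`/`fix` prefix.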
291 | === added file 'branch-source-builder/cupstream2distro/launchpadmanager.py' |
292 | --- branch-source-builder/cupstream2distro/launchpadmanager.py 1970-01-01 00:00:00 +0000 |
293 | +++ branch-source-builder/cupstream2distro/launchpadmanager.py 2014-02-11 20:59:14 +0000 |
294 | @@ -0,0 +1,157 @@ |
295 | +# -*- coding: utf-8 -*- |
296 | +# Copyright (C) 2012 Canonical |
297 | +# |
298 | +# Authors: |
299 | +# Didier Roche |
300 | +# |
301 | +# This program is free software; you can redistribute it and/or modify it under |
302 | +# the terms of the GNU General Public License as published by the Free Software |
303 | +# Foundation; version 3. |
304 | +# |
305 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
306 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
307 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
308 | +# details. |
309 | +# |
310 | +# You should have received a copy of the GNU General Public License along with |
311 | +# this program; if not, write to the Free Software Foundation, Inc., |
312 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
313 | + |
314 | +from __future__ import unicode_literals |
315 | + |
316 | +from launchpadlib.launchpad import Launchpad |
317 | +import lazr |
318 | +import logging |
319 | +import os |
320 | +launchpad = None |
321 | + |
322 | +from .settings import ARCHS_TO_EVENTUALLY_IGNORE, ARCHS_TO_UNCONDITIONALLY_IGNORE, VIRTUALIZED_PPA_ARCH, CRED_FILE_PATH, COMMON_LAUNCHPAD_CACHE_DIR |
323 | + |
324 | + |
325 | +def get_launchpad(use_staging=False, use_cred_file=os.path.expanduser(CRED_FILE_PATH)): |
326 | + '''Get THE Launchpad''' |
327 | + global launchpad |
328 | + if not launchpad: |
329 | + if use_staging: |
330 | + server = 'staging' |
331 | + else: |
332 | + server = 'production' |
333 | + |
334 | + launchpadlib_dir = COMMON_LAUNCHPAD_CACHE_DIR |
335 | + if not os.path.exists(launchpadlib_dir): |
336 | + os.makedirs(launchpadlib_dir) |
337 | + |
338 | + if use_cred_file: |
339 | + launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"], |
340 | + version='devel', # devel because copyPackage is only available there |
341 | + credentials_file=use_cred_file, |
342 | + launchpadlib_dir=launchpadlib_dir) |
343 | + else: |
344 | + launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"], |
345 | + version='devel', # devel because copyPackage is only available there |
346 | + launchpadlib_dir=launchpadlib_dir) |
347 | + |
348 | + return launchpad |
349 | + |
350 | + |
351 | +def get_ubuntu(): |
352 | + '''Get the ubuntu distro''' |
353 | + lp = get_launchpad() |
354 | + return lp.distributions['ubuntu'] |
355 | + |
356 | + |
357 | +def get_ubuntu_archive(): |
358 | + '''Get the ubuntu main archive''' |
359 | + return get_ubuntu().main_archive |
360 | + |
361 | + |
362 | +def get_series(series_name): |
363 | + '''Return the launchpad object for the requested series''' |
364 | + return get_ubuntu().getSeries(name_or_version=series_name) |
365 | + |
366 | + |
367 | +def get_bugs_titles(author_bugs): |
368 | + lp = get_launchpad() |
369 | + author_bugs_with_title = author_bugs.copy() |
370 | + for author in author_bugs: |
371 | + bug_title_sets = set() |
372 | + for bug in author_bugs[author]: |
373 | + try: |
374 | + bug_title_sets.add("{} (LP: #{})".format(lp.bugs[bug].title, bug)) |
375 | + except KeyError: |
376 | + # still list the bug when it doesn't exist or launchpad times out |
377 | + bug_title_sets.add(u"Fix LP: #{}".format(bug)) |
378 | + author_bugs_with_title[author] = bug_title_sets |
379 | + |
380 | + return author_bugs_with_title |
381 | + |
382 | + |
383 | +def open_bugs_for_source(bugs_list, source_name, series_name): |
384 | + lp = get_launchpad() |
385 | + ubuntu = get_ubuntu() |
386 | + |
387 | + # don't nominate for current series |
388 | + if ubuntu.current_series.name == series_name: |
389 | + package = ubuntu.getSourcePackage(name=source_name) |
390 | + else: |
391 | + series = get_series(series_name) |
392 | + package = series.getSourcePackage(name=source_name) |
393 | + |
394 | + for bug_num in bugs_list: |
395 | + try: |
396 | + bug = lp.bugs[bug_num] |
397 | + bug.addTask(target=package) |
398 | + bug.lp_save() |
399 | + except (KeyError, lazr.restfulclient.errors.BadRequest, lazr.restfulclient.errors.ServerError): |
400 | + # ignore non-existing or unavailable bugs |
401 | + logging.info("Can't synchronize upstream/downstream bugs for bug #{}. Not blocking on that.".format(bug_num)) |
402 | + |
403 | + |
404 | +def get_available_and_all_archs(series, ppa=None): |
405 | + '''Return a set of available archs, and the all arch''' |
406 | + available_arch = set() |
407 | + if ppa and ppa.require_virtualized: |
408 | + available_arch = set(VIRTUALIZED_PPA_ARCH) |
409 | + arch_all_arch = VIRTUALIZED_PPA_ARCH[0] |
410 | + else: |
411 | + for arch in series.architectures: |
412 | + # HACK: filters armel as it's still seen as available on raring: https://launchpad.net/bugs/1077257 |
413 | + if arch.architecture_tag == "armel": |
414 | + continue |
415 | + available_arch.add(arch.architecture_tag) |
416 | + if arch.is_nominated_arch_indep: |
417 | + arch_all_arch = arch.architecture_tag |
418 | + |
419 | + return (available_arch, arch_all_arch) |
420 | + |
421 | + |
422 | +def get_ignored_archs(): |
423 | + '''The archs we can eventually ignore if nothing is published in dest''' |
424 | + return (ARCHS_TO_EVENTUALLY_IGNORE, ARCHS_TO_UNCONDITIONALLY_IGNORE) |
425 | + |
426 | + |
427 | +def get_ppa(ppa_name): |
428 | + '''Return a launchpad ppa''' |
429 | + if ppa_name.startswith("ppa:"): |
430 | + ppa_name = ppa_name[4:] |
431 | + ppa_dispatch = ppa_name.split("/") |
432 | + return get_launchpad().people[ppa_dispatch[0]].getPPAByName(name=ppa_dispatch[1]) |
433 | + |
434 | +def is_series_current(series_name): |
435 | + '''Return if series_name is the edge development version''' |
436 | + return get_ubuntu().current_series.name == series_name |
437 | + |
438 | +def get_resource_from_url(url): |
439 | + '''Return a lp resource from a launchpad url''' |
440 | + lp = get_launchpad() |
441 | + url = lp.resource_type_link.replace("/#service-root", "") + url.split("launchpad.net")[1] |
442 | + return lp.load(url) |
443 | + |
444 | +def get_resource_from_token(url): |
445 | + '''Return a lp resource from a launchpad token''' |
446 | + lp = get_launchpad() |
447 | + return lp.load(url) |
448 | + |
449 | +def is_dest_ubuntu_archive(series_link): |
450 | + '''return if series_link is the ubuntu archive''' |
451 | + return series_link == get_ubuntu_archive().self_link |
452 | |
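`get_ppa` above accepts either `ppa:owner/name` or `owner/name`. The parsing step can be sketched independently of launchpadlib; the function and PPA names below are hypothetical, and the actual Launchpad lookup (shown only as a comment) requires a live session:

```python
def parse_ppa_name(ppa_name):
    """Split a PPA spec such as 'ppa:some-team/some-archive' (the 'ppa:'
    prefix is optional) into an (owner, archive_name) pair."""
    if ppa_name.startswith("ppa:"):
        ppa_name = ppa_name[4:]  # drop the scheme prefix, as get_ppa does
    owner, name = ppa_name.split("/")
    return owner, name

# With a launchpadlib session, get_ppa then resolves the pair to an archive:
#   lp.people[owner].getPPAByName(name=name)
owner, name = parse_ppa_name("ppa:some-team/some-archive")
```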
453 | === added file 'branch-source-builder/cupstream2distro/packageinppa.py' |
454 | --- branch-source-builder/cupstream2distro/packageinppa.py 1970-01-01 00:00:00 +0000 |
455 | +++ branch-source-builder/cupstream2distro/packageinppa.py 2014-02-11 20:59:14 +0000 |
456 | @@ -0,0 +1,225 @@ |
457 | +# -*- coding: utf-8 -*- |
458 | +# Copyright (C) 2012 Canonical |
459 | +# |
460 | +# Authors: |
461 | +# Didier Roche |
462 | +# |
463 | +# This program is free software; you can redistribute it and/or modify it under |
464 | +# the terms of the GNU General Public License as published by the Free Software |
465 | +# Foundation; version 3. |
466 | +# |
467 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
468 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
469 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
470 | +# details. |
471 | +# |
472 | +# You should have received a copy of the GNU General Public License along with |
473 | +# this program; if not, write to the Free Software Foundation, Inc., |
474 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
475 | + |
476 | +import logging |
477 | +import os |
478 | +import re |
479 | + |
480 | + |
481 | +class PackageInPPA(): |
482 | + |
483 | + (BUILDING, FAILED, PUBLISHED) = range(3) |
484 | + |
485 | + def __init__(self, source_name, version, ppa, destarchive, series, |
486 | + available_archs_in_ppa, arch_all_arch, archs_to_eventually_ignore, |
487 | + archs_to_unconditionually_ignore, package_archs=None): |
488 | + self.source_name = source_name |
489 | + self.version = version |
490 | + self.series = series |
491 | + self.arch_all_arch = arch_all_arch |
492 | + self.ppa = ppa |
493 | + self.current_status = {} |
494 | + |
495 | + # Get archs we should look at |
496 | + version_for_source_file = version.split(':')[-1] |
497 | + if not package_archs: |
498 | + dsc_filename = "{}_{}.dsc".format(source_name, version_for_source_file) |
499 | + regexp = re.compile("^Architecture: (.*)\n") |
500 | + for line in open(dsc_filename): |
501 | + arch_lists = regexp.findall(line) |
502 | + if arch_lists: |
503 | + package_archs = arch_lists[0] |
504 | + break |
505 | + |
506 | + if not package_archs: |
507 | + raise EnvironmentError("Can't determine package Architecture.") |
508 | + if "any" in package_archs: |
509 | + self.archs = available_archs_in_ppa.copy() |
510 | + elif package_archs == "all": |
511 | + self.archs = set([self.arch_all_arch]) |
512 | + else: |
513 | + archs_supported_by_package = set() |
514 | + for arch in package_archs.split(): |
515 | + archs_supported_by_package.add(arch) |
516 | + self.archs = archs_supported_by_package.intersection(available_archs_in_ppa) |
517 | + # eventually ignore some archs if they don't exist in the latest published version in dest |
518 | + archs_to_eventually_ignore = archs_to_eventually_ignore.copy() |
519 | + if archs_to_eventually_ignore: |
520 | + try: |
521 | + previous_source = destarchive.getPublishedSources(exact_match=True, source_name=self.source_name, |
522 | + distro_series=self.series, status="Published")[0] |
523 | + for binary in previous_source.getPublishedBinaries(): |
524 | + if binary.architecture_specific and binary.distro_arch_series.architecture_tag in archs_to_eventually_ignore: |
525 | + archs_to_eventually_ignore -= set([binary.distro_arch_series.architecture_tag]) |
526 | + if not archs_to_eventually_ignore: |
527 | + break |
528 | + |
529 | + except IndexError: |
530 | + # no package in dest, don't wait on any archs_to_eventually_ignore |
531 | + pass |
532 | + # remove from the inspection remaining archs to ignore |
533 | + if archs_to_eventually_ignore: |
534 | + self.archs -= archs_to_eventually_ignore |
535 | + if archs_to_unconditionually_ignore: |
536 | + self.archs -= archs_to_unconditionually_ignore |
537 | + |
538 | + |
539 | + def __repr__(self): |
540 | + return '{} - {}'.format(self.source_name, self.version) |
541 | + |
542 | + |
543 | + def get_status(self, on_particular_arch=None): |
544 | + '''Look at the package status in the ppa |
545 | + |
546 | + Can scope to a particular arch to watch for''' |
547 | + |
548 | + self._refresh_status() |
549 | + if not self.current_status: |
550 | + return None |
551 | + |
552 | + current_package_building = False |
553 | + current_package_failed = False |
554 | + for arch in self.current_status: |
555 | + if on_particular_arch and arch != on_particular_arch: |
556 | + continue |
557 | + str_status = "published" |
558 | + if self.current_status[arch] == PackageInPPA.BUILDING: |
559 | + current_package_building = True |
560 | + str_status = "building" |
561 | + if self.current_status[arch] == PackageInPPA.FAILED: |
562 | + current_package_failed = True |
563 | + str_status = "failed" |
564 | + logging.info("arch: {}, status: {}".format(arch, str_status)) |
565 | + |
566 | + if current_package_building: |
567 | + return self.BUILDING |
568 | + # no more package is building, if one failed, time to signal it |
569 | + if current_package_failed: |
570 | + return self.FAILED |
571 | + # if it's not None, not BUILDING, nor FAILED, it's PUBLISHED |
572 | + return self.PUBLISHED |
573 | + |
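The aggregation rule in `get_status` (any arch still building wins, then any failure, otherwise published) can be isolated as a pure function. This is a sketch using the same three states, not code from the branch:

```python
# Same three states as PackageInPPA
BUILDING, FAILED, PUBLISHED = range(3)

def aggregate_status(per_arch_status):
    """Collapse an {arch: status} dict into one overall status:
    BUILDING while any arch is still building, then FAILED if any
    arch failed, otherwise PUBLISHED. Returns None when nothing is
    known yet (source not visible in the PPA)."""
    if not per_arch_status:
        return None
    statuses = per_arch_status.values()
    if any(s == BUILDING for s in statuses):
        return BUILDING
    if any(s == FAILED for s in statuses):
        return FAILED
    return PUBLISHED
```

The ordering matters: a failed arch is not reported until every arch has finished building, which mirrors the "no more package is building, if one failed, time to signal it" comment above.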
574 | + def _refresh_archs_skipped(self): |
575 | + '''Refresh archs that we should skip for this build''' |
576 | + |
577 | + for arch in self.archs.copy(): |
578 | + if os.path.isfile("{}.{}.ignore".format(self.source_name, arch)): |
579 | + logging.warning("Request to ignore {} on {}.".format(self.source_name, arch)) |
580 | + try: |
581 | + self.archs.remove(arch) |
582 | + except KeyError: |
583 | + logging.warning("Request to ignore {} on {} was processed, but that arch wasn't in the set we were monitoring.".format(self.source_name, arch)) |
584 | + try: |
585 | + self.current_status.pop(arch) |
586 | + except KeyError: |
587 | + pass |
588 | + |
589 | + def _refresh_status(self): |
590 | + '''Refresh status from the ppa''' |
591 | + |
592 | + self._refresh_archs_skipped() |
593 | + |
594 | + # first step, get the source published |
595 | + if not self.current_status: |
596 | + (self.current_status, self.source) = self._get_status_for_source_package_in_ppa() |
597 | + # check the binary status |
598 | + if self.current_status: |
599 | + self.current_status = self._get_status_for_binary_packages_in_ppa() |
600 | + |
601 | + def _get_status_for_source_package_in_ppa(self): |
602 | + '''Return current_status for source package in ppa. |
603 | + |
604 | + The status is dict (if not None) with {arch: status} and can be: |
605 | + - None -> not visible yet |
606 | + - BUILDING -> currently Building (or waiting to build) |
607 | + - FAILED -> Build failed for this arch or has been canceled |
608 | + - PUBLISHED -> All packages (including arch:all from other archs) published. |
609 | + |
610 | + Only the 2 first status are returned by this call. See _get_status_for_binary_packages_in_ppa |
611 | + for the others.''' |
612 | + |
613 | + try: |
614 | + source = self.ppa.getPublishedSources(exact_match=True, source_name=self.source_name, version=self.version, distro_series=self.series)[0] |
615 | + logging.info("Source available in ppa") |
616 | + current_status = {} |
617 | + for arch in self.archs: |
618 | + current_status[arch] = self.BUILDING |
619 | + return (current_status, source) |
620 | + except (KeyError, IndexError): |
621 | + logging.info("Source not yet available") |
622 | + return ({}, None) |
623 | + |
624 | + def _get_status_for_binary_packages_in_ppa(self): |
625 | + '''Return current status for package in ppa |
626 | + |
627 | + The status is dict (if not None) with {arch: status} and can be: |
628 | + - None -> not visible yet |
629 | + - BUILDING -> currently Building (or waiting to build) |
630 | + - FAILED -> Build failed for this arch or has been canceled |
631 | + - PUBLISHED -> All packages (including arch:all from other archs) published. |
632 | + |
633 | + Only the 3 last statuses are returned by this call. See _get_status_for_source_package_in_ppa |
634 | + for the other.''' |
635 | + |
636 | + # Try to see if all binaries available for this arch are built, including arch:all on other archs |
637 | + status = self.current_status |
638 | + at_least_one_published_binary = False |
639 | + for binary in self.source.getPublishedBinaries(): |
640 | + at_least_one_published_binary = True |
641 | + # all binaries for an arch are published at the same time |
642 | + # launchpad is misleading here: it reports archs not in the ppa as built (for arch:all), even unsupported archs! |
643 | + # for instance, self.arch_all_arch (arch:all) can be built before the others, and amd64 will be reported as built for it |
644 | + if binary.status == "Published" and (binary.distro_arch_series.architecture_tag == self.arch_all_arch or |
645 | + (binary.distro_arch_series.architecture_tag != self.arch_all_arch and binary.architecture_specific)): |
646 | + status[binary.distro_arch_series.architecture_tag] = self.PUBLISHED |
647 | + |
648 | + # Looking for builds on archs still BUILDING (just loop on builds once to avoid too many lp requests) |
649 | + needs_checking_build = False |
650 | + build_state_failed = ('Failed to build', 'Chroot problem', 'Failed to upload', 'Cancelled build', 'Build for superseded Source') |
651 | + for arch in self.archs: |
652 | + if self.current_status[arch] == self.BUILDING: |
653 | + needs_checking_build = True |
654 | + if needs_checking_build: |
655 | + for build in self.source.getBuilds(): |
656 | + # ignored archs |
657 | + if build.arch_tag not in self.current_status: |
658 | + continue |
659 | + if self.current_status[build.arch_tag] == self.BUILDING: |
660 | + if build.buildstate in build_state_failed: |
661 | + logging.error("{}: Build {} ({}) failed because of {}".format(build.arch_tag, build.title, |
662 | + build.web_link, build.buildstate)) |
663 | + status[build.arch_tag] = self.FAILED |
664 | + # Another launchpad trick: if a binary arch was published but is then superseded, getPublishedBinaries() won't list |
665 | + # those binaries anymore, so the arch is seen as BUILDING again. |
666 | + # If there is a successful build record for it and the source is superseded, it built fine at some point; |
667 | + # another arch will fail as superseded. |
668 | + # We don't just retain the old "PUBLISHED" state because the script may have been started in that situation already |
669 | + elif build.buildstate not in build_state_failed and self.source.status == "Superseded": |
670 | + status[build.arch_tag] = self.PUBLISHED |
671 | + |
672 | + # There is no way to know if there are any arch:all packages (they don't appear in getPublishedBinaries() for an arch until |
673 | + # they are built on arch_all_arch). So mark all archs as BUILDING if self.arch_all_arch is building, or FAILED if it failed. |
674 | + if self.arch_all_arch in status and status[self.arch_all_arch] != self.PUBLISHED: |
675 | + for arch in self.archs: |
676 | + if status[arch] == self.PUBLISHED: |
677 | + status[arch] = status[self.arch_all_arch] |
678 | + if arch != self.arch_all_arch and status[arch] == self.FAILED: |
679 | + logging.error("{} marked as FAILED because {} build FAILED and we may miss arch:all packages".format(arch, self.arch_all_arch)) |
680 | + |
681 | + return status |
682 | |
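The arch:all propagation rule at the end of `_get_status_for_binary_packages_in_ppa` is the subtlest part of the status tracking. A minimal standalone sketch of just that rule; the `propagate_arch_all` helper and the integer status constants are illustrative stand-ins, not part of the real `PackageInPPA` class:

```python
# Sketch of the arch:all propagation rule: if the arch that produces arch:all
# binaries is not PUBLISHED yet, no other arch can be considered fully
# published, because its arch:all packages may still be missing.
BUILDING, FAILED, PUBLISHED = range(3)  # mirrors the PackageInPPA constants

def propagate_arch_all(status, arch_all_arch):
    """Demote PUBLISHED archs to the arch:all arch's state when it lags."""
    if arch_all_arch in status and status[arch_all_arch] != PUBLISHED:
        for arch in status:
            if status[arch] == PUBLISHED:
                status[arch] = status[arch_all_arch]
    return status

# amd64 looks done, but i386 (which builds arch:all here) is still building,
# so amd64 is demoted back to BUILDING
status = propagate_arch_all({'amd64': PUBLISHED, 'i386': BUILDING}, 'i386')
```

Likewise, a FAILED arch:all arch drags every already-published arch down to FAILED, which is exactly the case the `logging.error` above warns about.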
683 | === added file 'branch-source-builder/cupstream2distro/packageinppamanager.py' |
684 | --- branch-source-builder/cupstream2distro/packageinppamanager.py 1970-01-01 00:00:00 +0000 |
685 | +++ branch-source-builder/cupstream2distro/packageinppamanager.py 2014-02-11 20:59:14 +0000 |
686 | @@ -0,0 +1,109 @@ |
687 | +# -*- coding: utf-8 -*- |
688 | +# Copyright (C) 2012 Canonical |
689 | +# |
690 | +# Authors: |
691 | +# Didier Roche |
692 | +# |
693 | +# This program is free software; you can redistribute it and/or modify it under |
694 | +# the terms of the GNU General Public License as published by the Free Software |
695 | +# Foundation; version 3. |
696 | +# |
697 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
698 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
699 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
700 | +# details. |
701 | +# |
702 | +# You should have received a copy of the GNU General Public License along with |
703 | +# this program; if not, write to the Free Software Foundation, Inc., |
704 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
705 | + |
706 | +import ConfigParser |
707 | +import logging |
708 | +import os |
709 | +import re |
710 | + |
711 | +from .branchhandling import _get_parent_branch |
712 | +from .packageinppa import PackageInPPA |
713 | +from .settings import PROJECT_CONFIG_SUFFIX |
714 | + |
715 | + |
716 | +def _ensure_removed_from_set(target_set, content_to_remove): |
717 | + '''Silently remove an element from an existing set''' |
718 | + try: |
719 | + target_set.remove(content_to_remove) |
720 | + except KeyError: |
721 | + pass # in case we missed the "build" step |
722 | + |
723 | + |
724 | +def get_all_packages_uploaded(): |
725 | + '''Get (package, version, rev, branch) of all packages uploaded''' |
726 | + |
727 | + # we do not rely on the .changes files but on the config files |
728 | + # because we need the exact version (which can have an epoch) |
729 | + result = set() |
730 | + source_package_regexp = re.compile("(.*).{}$".format(PROJECT_CONFIG_SUFFIX)) |
731 | + for file in os.listdir('.'): |
732 | + substract = source_package_regexp.findall(file) |
733 | + if substract: |
734 | + version = _get_current_packaging_version_from_config(substract[0]) |
735 | + rev = _get_current_rev_from_config(substract[0]) |
736 | + branch = _get_parent_branch(substract[0]) |
737 | + result.add((substract[0], version, rev, branch)) |
744 | + return result |
745 | + |
746 | + |
747 | +def get_packages_and_versions_uploaded(): |
748 | + '''Get (package, version) of all packages uploaded. There may be duplicates''' |
749 | + |
750 | + # we do not rely on the .changes files but on the config files |
751 | + # because we need the exact version (which can have an epoch) |
752 | + result = set() |
753 | + source_package_regexp = re.compile("(.*).{}.*$".format(PROJECT_CONFIG_SUFFIX)) |
754 | + for file in os.listdir('.'): |
755 | + substract = source_package_regexp.findall(file) |
756 | + if substract: |
757 | + config = ConfigParser.RawConfigParser() |
758 | + config.read(file) |
759 | + result.add((substract[0], config.get('Package', 'packaging_version'))) |
760 | + return result |
761 | + |
762 | + |
763 | +def update_all_packages_status(packages_not_in_ppa, packages_building, packages_failed, particular_arch=None): |
764 | + '''Update all packages status, checking in the ppa''' |
765 | + |
766 | + for current_package in (packages_not_in_ppa.union(packages_building)): |
767 | + logging.info("current_package: " + current_package.source_name + " " + current_package.version) |
768 | + package_status = current_package.get_status(particular_arch) |
769 | + if package_status is not None: # global package_status can be 0 (building), 1 (failed), 2 (published) |
770 | + # if one arch building, still considered as building |
771 | + if package_status == PackageInPPA.BUILDING: |
772 | + _ensure_removed_from_set(packages_not_in_ppa, current_package) # maybe already removed |
773 | + packages_building.add(current_package) |
774 | + # if one arch failed, considered as failed |
775 | + elif package_status == PackageInPPA.FAILED: |
776 | + _ensure_removed_from_set(packages_building, current_package) # in case we missed the "build" step |
777 | + _ensure_removed_from_set(packages_not_in_ppa, current_package) # in case we missed the "wait" step |
778 | + packages_failed.add(current_package) |
779 | + elif package_status == PackageInPPA.PUBLISHED: |
780 | + _ensure_removed_from_set(packages_building, current_package) # in case we missed the "build" step |
781 | + _ensure_removed_from_set(packages_not_in_ppa, current_package) # in case we missed the "wait" step |
782 | + |
783 | + |
784 | +def _get_current_packaging_version_from_config(source_package_name): |
785 | + '''Get current packaging version from the saved config''' |
786 | + config = ConfigParser.RawConfigParser() |
787 | + config.read("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)) |
788 | + return config.get('Package', 'packaging_version') |
789 | + |
790 | + |
791 | +def _get_current_rev_from_config(source_package_name): |
792 | + '''Get current tip revision from the saved config''' |
793 | + config = ConfigParser.RawConfigParser() |
794 | + config.read("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)) |
795 | + return config.get('Branch', 'rev') |
796 | |
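The `<source_package_name>.<PROJECT_CONFIG_SUFFIX>` files parsed by the helpers above use two sections, `Package` and `Branch`. A self-contained sketch of that layout; the concrete values are invented, only the section and option names come from the code, and the module itself uses Python 2's `ConfigParser` (spelled `configparser` on Python 3):

```python
import configparser  # 'ConfigParser' on the Python 2 used by these modules

# Invented example content; only the section/option names are taken from
# _get_current_packaging_version_from_config and _get_current_rev_from_config
SAMPLE = """\
[Package]
packaging_version = 1.0+14.04.20140211-0ubuntu1

[Branch]
rev = 42
"""

config = configparser.RawConfigParser()
config.read_string(SAMPLE)  # the real helpers use config.read(filename)

version = config.get('Package', 'packaging_version')
rev = config.get('Branch', 'rev')
```

Storing the exact version here (rather than deriving it from the `.changes` file name) is what lets the code keep an epoch like `1:` that never appears in file names.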
797 | === added file 'branch-source-builder/cupstream2distro/packagemanager.py' |
798 | --- branch-source-builder/cupstream2distro/packagemanager.py 1970-01-01 00:00:00 +0000 |
799 | +++ branch-source-builder/cupstream2distro/packagemanager.py 2014-02-11 20:59:14 +0000 |
800 | @@ -0,0 +1,504 @@ |
801 | +# -*- coding: utf-8 -*- |
802 | +# Copyright (C) 2012-2014 Canonical |
803 | +# |
804 | +# Authors: |
805 | +# Didier Roche |
806 | +# Rodney Dawes |
807 | +# |
808 | +# This program is free software; you can redistribute it and/or modify it under |
809 | +# the terms of the GNU General Public License as published by the Free Software |
810 | +# Foundation; version 3. |
811 | +# |
812 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
813 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
814 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
815 | +# details. |
816 | +# |
817 | +# You should have received a copy of the GNU General Public License along with |
818 | +# this program; if not, write to the Free Software Foundation, Inc., |
819 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
820 | + |
821 | +import datetime |
822 | +import fileinput |
823 | +import logging |
824 | +import os |
825 | +import re |
826 | +import shutil |
827 | +import sys |
828 | +import subprocess |
829 | +import urllib |
830 | + |
831 | +import launchpadmanager |
832 | +import settings |
833 | +from .utils import ignored |
834 | + |
835 | + |
836 | +def get_current_version_for_series(source_package_name, series_name, ppa_name=None, dest=None): |
837 | + '''Get current version for a package name in that series''' |
838 | + series = launchpadmanager.get_series(series_name) |
839 | + if not dest: |
840 | + if ppa_name: |
841 | + dest = launchpadmanager.get_ppa(ppa_name) |
842 | + else: |
843 | + dest = launchpadmanager.get_ubuntu_archive() |
844 | + source_collection = dest.getPublishedSources(exact_match=True, source_name=source_package_name, distro_series=series) |
845 | + try: |
846 | + # cjwatson said the list always has the most recently published entry first (even if removed) |
847 | + return source_collection[0].source_package_version |
848 | + except IndexError: |
849 | + # was never in the dest, return the lowest possible version |
850 | + return "0" |
851 | + |
852 | + |
853 | +def is_version_for_series_in_dest(source_package_name, version, series, dest, pocket="Release"): |
854 | + '''Return whether the version for a package name in that series is in dest''' |
855 | + return dest.getPublishedSources(exact_match=True, source_name=source_package_name, version=version, |
856 | + distro_series=series, pocket=pocket).total_size > 0 |
857 | + |
858 | +def is_version_in_queue(source_package_name, version, dest_serie, queue): |
859 | + '''Return whether the version for a package name in that series is in the given queue''' |
860 | + return dest_serie.getPackageUploads(exact_match=True, name=source_package_name, version=version, |
861 | + status=queue).total_size > 0 |
862 | + |
863 | + |
864 | +def is_version1_higher_than_version2(version1, version2): |
865 | + '''Return whether version1 is higher than version2''' |
866 | + return (subprocess.call(["dpkg", "--compare-versions", version1, 'gt', version2], stdout=subprocess.PIPE, stderr=subprocess.PIPE) == 0) |
867 | + |
868 | + |
869 | +def is_version_in_changelog(version, f): |
870 | + '''Return if the version is in the upstream changelog (released)''' |
871 | + |
872 | + if version == "0": |
873 | + return True |
874 | + |
875 | + desired_changelog_line = re.compile("\({}\) (?!UNRELEASED).*\; urgency=".format(version.replace('+', '\+'))) |
876 | + for line in f.readlines(): |
877 | + if desired_changelog_line.search(line): |
878 | + return True |
879 | + |
880 | + return False |
881 | + |
882 | + |
883 | +def get_latest_upstream_bzr_rev(f, dest_ppa=None): |
884 | + '''Report latest bzr rev in the file |
885 | + |
886 | + If dest_ppa is set, first try to fetch the dest ppa tag. Otherwise, fall back to the first distro version''' |
887 | + distro_regex = re.compile("{} (\d+)".format(settings.REV_STRING_FORMAT)) |
888 | + destppa_regexp = re.compile("{} (\d+) \(ppa:{}\)".format(settings.REV_STRING_FORMAT, dest_ppa)) |
889 | + distro_rev = None |
890 | + candidate_destppa_rev = None |
891 | + candidate_distro_rev = None |
892 | + destppa_element_found = distro_element_found = False |
893 | + |
894 | + # handle marker spread on two lines |
895 | + end_of_line_regexp = re.compile(" *(.*\))") |
896 | + previous_line = None |
897 | + |
898 | + for line in f: |
899 | + line = line[:-1] |
900 | + |
901 | + if previous_line: |
902 | + try: |
903 | + line = previous_line + end_of_line_regexp.findall(line)[0] |
904 | + except IndexError: |
905 | + pass |
906 | + previous_line = None |
907 | + |
908 | + if dest_ppa: |
909 | + try: |
910 | + candidate_destppa_rev = int(destppa_regexp.findall(line)[0]) |
911 | + destppa_element_found = True |
912 | + except IndexError: |
913 | + destppa_element_found = False |
914 | + if not distro_rev and not destppa_element_found and "(ppa:" not in line: |
915 | + try: |
916 | + candidate_distro_rev = int(distro_regex.findall(line)[0]) |
917 | + distro_element_found = True |
918 | + except IndexError: |
919 | + distro_element_found = False |
920 | + |
921 | + # try to catchup next line if we have a marker start without anything found |
922 | + if settings.REV_STRING_FORMAT in line and (dest_ppa and not destppa_element_found) and not distro_element_found: |
923 | + previous_line = line |
924 | + |
925 | + if line.startswith(" -- "): |
926 | + # first grab the dest ppa |
927 | + if candidate_destppa_rev: |
928 | + return candidate_destppa_rev |
929 | + if not distro_rev and candidate_distro_rev: |
930 | + distro_rev = candidate_distro_rev |
931 | + if not dest_ppa and distro_rev: |
932 | + return distro_rev |
933 | + |
934 | + # we didn't find any dest ppa result but there is a distro_rev one |
935 | + if dest_ppa and distro_rev: |
936 | + return distro_rev |
937 | + |
938 | + # we force a bootstrap commit for new components |
939 | + return 0 |
940 | + |
941 | + |
942 | +def list_packages_info_in_str(packages_set): |
943 | + '''Return the packages info in a string''' |
944 | + |
945 | + results = [] |
946 | + for package in packages_set: |
947 | + results.append("{} ({})".format(package.source_name, package.version)) |
948 | + return " ".join(results) |
949 | + |
950 | + |
951 | +def get_packaging_version(): |
952 | + '''Get current packaging rev''' |
953 | + instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
954 | + (stdout, stderr) = instance.communicate() |
955 | + if instance.returncode != 0: |
956 | + raise Exception(stderr.decode("utf-8").strip()) |
957 | + expr = re.compile("Version: (.*)") |
958 | + for line in stdout.splitlines(): |
959 | + packaging_version = expr.findall(line) |
960 | + if packaging_version: |
961 | + return packaging_version[0] |
962 | + |
963 | + raise Exception("Didn't find any Version in the package: {}".format(stdout)) |
964 | + |
965 | + |
966 | +def get_source_package_from_dest(source_package_name, dest_archive, dest_current_version, series_name): |
967 | + '''Download and return a path containing a checkout of the current dest version. |
968 | + |
969 | + None if this package was never published to dest archive''' |
970 | + |
971 | + if dest_current_version == "0": |
972 | + logging.info("This package was never released to the destination archive, don't return downloaded source") |
973 | + return None |
974 | + |
975 | + logging.info("Grab code for {} ({}) from {}".format(source_package_name, dest_current_version, series_name)) |
976 | + source_package_download_dir = os.path.join('ubuntu', source_package_name) |
977 | + series = launchpadmanager.get_series(series_name) |
978 | + with ignored(OSError): |
979 | + os.makedirs(source_package_download_dir) |
980 | + os.chdir(source_package_download_dir) |
981 | + |
982 | + try: |
983 | + sourcepkg = dest_archive.getPublishedSources(status="Published", exact_match=True, source_name=source_package_name, distro_series=series, version=dest_current_version)[0] |
984 | + except IndexError: |
985 | + raise Exception("Couldn't find the expected version in the destination archive") |
986 | + logging.info('Downloading %s version %s', source_package_name, dest_current_version) |
987 | + for url in sourcepkg.sourceFileUrls(): |
988 | + urllib.urlretrieve(url, urllib.unquote(url.split('/')[-1])) |
989 | + instance = subprocess.Popen("dpkg-source -x *dsc", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
990 | + (stdout, stderr) = instance.communicate() |
991 | + if instance.returncode != 0: |
992 | + raise Exception(stderr.decode("utf-8").strip()) |
993 | + |
994 | + # check the dir exists |
995 | + splitted_version = dest_current_version.split(':')[-1].split('-') # remove epoch if there is one |
996 | + # TODO: debian version (like -3) is not handled here. |
997 | + # We do handle 42ubuntu1 though (as splitted_version[0] can contain "ubuntu") |
998 | + if "ubuntu" in splitted_version[-1] and len(splitted_version) > 1: # don't remove last item for the case where we had a native version (-0.35.2) without ubuntu in it |
999 | + splitted_version = splitted_version[:-1] |
1000 | + version_for_source_file = '-'.join(splitted_version) |
1001 | + source_directory_name = "{}-{}".format(source_package_name, version_for_source_file) |
1002 | + if not os.path.isdir(source_directory_name): |
1003 | + raise Exception("We tried to download and check that the directory {} is present, but it's not the case".format(source_directory_name)) |
1004 | + os.chdir('../..') |
1005 | + return (os.path.join(source_package_download_dir, source_directory_name)) |
1006 | + |
1007 | + |
1008 | +def is_new_content_relevant_since_old_published_source(dest_version_source): |
1009 | + '''Return True if a new snapshot is needed |
1010 | + |
1011 | + dest_version_source can be None if no version was released before.''' |
1012 | + |
1013 | + # we always release something not yet in ubuntu, even if the criteria are not met. |
1014 | + if not dest_version_source: |
1015 | + return True |
1016 | + |
1017 | + # now check the relevance of the committed changes compared to the version in the repository (if any) |
1018 | + diffinstance = subprocess.Popen(['diff', '-Nrup', '.', dest_version_source], stdout=subprocess.PIPE) |
1019 | + filterinstance = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE) |
1020 | + lsdiffinstance = subprocess.Popen(['lsdiff'], stdin=filterinstance.stdout, stdout=subprocess.PIPE) |
1021 | + (relevant_changes, err) = subprocess.Popen(['grep', '-Ev', '.bzr|.pc'], stdin=lsdiffinstance.stdout, stdout=subprocess.PIPE).communicate() |
1022 | + |
1023 | + # detect if the only change is a Vcs-* target change (with or without a changelog edit); we won't release in that case |
1024 | + number_of_changed_files = relevant_changes.count("\n") |
1025 | + if ((number_of_changed_files == 1 and "debian/control" in relevant_changes) or |
1026 | + (number_of_changed_files == 2 and "debian/control" in relevant_changes and "debian/changelog" in relevant_changes)): |
1027 | + (results, err) = subprocess.Popen(['diff', os.path.join('debian', 'control'), os.path.join(dest_version_source, "debian", "control")], stdout=subprocess.PIPE).communicate() |
1028 | + for diff_line in results.split('\n'): |
1029 | + if diff_line.startswith("< ") or diff_line.startswith("> "): |
1030 | + if not diff_line[2:].startswith("Vcs-") and not diff_line[2:].startswith("#"): |
1031 | + return True |
1032 | + return False |
1033 | + |
1034 | + logging.debug("Relevant changes are:") |
1035 | + logging.debug(relevant_changes) |
1036 | + |
1037 | + return (relevant_changes != '') |
1038 | + |
1039 | + |
1040 | +def is_relevant_source_diff_from_previous_dest_version(newdsc_path, dest_version_source): |
1041 | + '''Extract and check if the generated source diff different from previous one''' |
1042 | + |
1043 | + with ignored(OSError): |
1044 | + os.makedirs("generated") |
1045 | + extracted_generated_source = os.path.join("generated", newdsc_path.split('_')[0]) |
1046 | + with ignored(OSError): |
1047 | + shutil.rmtree(extracted_generated_source) |
1048 | + |
1049 | + # extract the newly generated source package |
1050 | + if subprocess.call(["dpkg-source", "-x", newdsc_path, extracted_generated_source]) != 0: |
1051 | + raise Exception("dpkg-source command returned an error.") |
1052 | + |
1053 | + # now check the relevance of the committed changes compared to the version in the repository (if any) |
1054 | + diffinstance = subprocess.Popen(['diff', '-Nrup', extracted_generated_source, dest_version_source], stdout=subprocess.PIPE) |
1055 | + (diff, err) = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate() |
1056 | + |
1057 | + # there is no important diff if the diff contains at most 12 lines, corresponding to the "Automatic daily release" marker in debian/changelog |
1058 | + if (diff.count('\n') <= 12): |
1059 | + return False |
1060 | + return True |
1061 | + |
1062 | + |
1063 | +def _packaging_changes_between_dsc(oldsource_dsc, newsource_dsc): |
1064 | + '''Return if there has been a packaging change between two dsc files |
1065 | + |
1066 | + We ignore the changelog only changes''' |
1067 | + if not oldsource_dsc: |
1068 | + return True |
1069 | + if not os.path.isfile(oldsource_dsc) or not os.path.isfile(newsource_dsc): |
1070 | + raise Exception("{} or {} doesn't exist, can't create a diff".format(oldsource_dsc, newsource_dsc)) |
1071 | + diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE) |
1072 | + filterinstance = subprocess.Popen(['filterdiff', '--clean', '-i', '*debian/*', '-x', '*changelog'], stdin=diffinstance.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
1073 | + (change_in_debian, filter_err) = filterinstance.communicate() |
1074 | + # we can't rely on diffinstance returncode as the signature key may not be present and it will exit with 1 |
1075 | + if filterinstance.returncode != 0: |
1076 | + raise Exception("Error in diff: {}".format(filter_err.decode("utf-8").strip())) |
1077 | + return change_in_debian != "" |
1078 | + |
1079 | + |
1080 | +def generate_diff_between_dsc(diff_filepath, oldsource_dsc, newsource_dsc): |
1081 | + '''Generate a diff file in diff_filepath if there is a relevant packaging diff between 2 sources |
1082 | + |
1083 | + The diff contains autotools files and cmakeries''' |
1084 | + if _packaging_changes_between_dsc(oldsource_dsc, newsource_dsc): |
1085 | + with open(diff_filepath, "w") as f: |
1086 | + if not oldsource_dsc: |
1087 | + f.write("This source is a new package, if the destination is ubuntu, please ensure it has been preNEWed by an archive admin before publishing that stack.") |
1088 | + return |
1089 | + f.write("/!\ Remember that this diff only represents packaging changes and build tools diff, not the whole content diff!\n\n") |
1090 | + diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE) |
1091 | + (changes_to_publish, err) = subprocess.Popen(['filterdiff', '--remove-timestamps', '--clean', '-i', '*setup.py', |
1092 | + '-i', '*Makefile.am', '-i', '*configure.*', '-i', '*debian/*', |
1093 | + '-i', '*CMakeLists.txt'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate() |
1094 | + f.write(changes_to_publish) |
1095 | + |
1096 | + |
1097 | +def create_new_packaging_version(base_package_version, series_version, destppa=''): |
1098 | + '''Deliver a new packaging version, based on simple rules: |
1099 | + |
1100 | + Version would be <upstream_version>.<series>.<yyyymmdd(.minor)>-0ubuntu1 |
1101 | + if we already have something delivered today, it will be .minor, then, .minor+1… |
1102 | + |
1103 | + We append the destination ppa name if we target a dest ppa and not distro''' |
1104 | + # to keep track of whether the package is native or not |
1105 | + native_pkg = False |
1106 | + |
1107 | + today_version = datetime.date.today().strftime('%Y%m%d') |
1108 | + destppa = destppa.replace("-", '.').replace("_", ".").replace("/", ".") |
1109 | + # bootstrapping mode or direct upload or UNRELEASED for bumping to a new series |
1110 | + # TRANSITION |
1111 | + if not ("daily" in base_package_version or "+" in base_package_version): |
1112 | + # support both 42, 42-0ubuntu1 |
1113 | + upstream_version = base_package_version.split('-')[0] |
1114 | + # if we have 42ubuntu1 like a wrong native version |
1115 | + if "ubuntu" in upstream_version: |
1116 | + upstream_version = upstream_version.split('ubuntu')[0] |
1117 | + elif not "-" in base_package_version and "+" in base_package_version: |
1118 | + # extract the day of previous daily upload and bump if already uploaded |
1119 | + regexp = re.compile("(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*") |
1120 | + try: |
1121 | + previous_day = regexp.findall(base_package_version)[0] |
1122 | + upstream_version = previous_day[0] |
1123 | + native_pkg = True |
1124 | + if (previous_day[1] == series_version and |
1125 | + previous_day[2] == today_version): |
1126 | + minor = 1 |
1127 | + if previous_day[3]: # second upload of the day |
1128 | + minor = int(previous_day[3]) + 1 |
1129 | + today_version = "{}.{}".format(today_version, minor) |
1130 | + except IndexError: |
1131 | + raise Exception( |
1132 | + "Unable to get previous day from native version: %s" |
1133 | + % base_package_version) |
1134 | + else: |
1135 | + # extract the day of previous daily upload and bump if already uploaded today |
1136 | + regexp = re.compile("(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*-.*") |
1137 | + try: |
1138 | + previous_day = regexp.findall(base_package_version)[0] |
1139 | + except IndexError: |
1140 | + # TRANSITION FALLBACK |
1141 | + try: |
1142 | + regexp = re.compile("(.*)(daily)([\d\.]{8})\.?([\d]*).*-.*") |
1143 | + previous_day = regexp.findall(base_package_version)[0] |
1144 | + # make the version compatible with the new version |
1145 | + previous_day = (previous_day[0], previous_day[1], "20" + previous_day[2].replace(".", ""), previous_day[3]) |
1146 | + except IndexError: |
1147 | + raise Exception("Didn't find a correct versioning in the current package: {}".format(base_package_version)) |
1148 | + upstream_version = previous_day[0] |
1149 | + if previous_day[1] == series_version and previous_day[2] == today_version: |
1150 | + minor = 1 |
1151 | + if previous_day[3]: # second upload of the day |
1152 | + minor = int(previous_day[3]) + 1 |
1153 | + today_version = "{}.{}".format(today_version, minor) |
1154 | + |
1155 | + new_upstream_version = "{upstream}+{series}.{date}{destppa}".format( |
1156 | + upstream=upstream_version, series=series_version, |
1157 | + date=today_version, destppa=destppa) |
1158 | + if native_pkg is not True: |
1159 | + new_upstream_version = "{}-0ubuntu1".format(new_upstream_version) |
1160 | + |
1161 | + return new_upstream_version |
1162 | + |
1163 | + |
1164 | +def get_packaging_sourcename(): |
1165 | + '''Get current packaging source name''' |
1166 | + instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
1167 | + (stdout, stderr) = instance.communicate() |
1168 | + if instance.returncode != 0: |
1169 | + raise Exception(stderr.decode("utf-8").strip()) |
1170 | + expr = re.compile("Source: (.*)") |
1171 | + for line in stdout.splitlines(): |
1172 | + source_name = expr.findall(line) |
1173 | + if source_name: |
1174 | + return source_name[0] |
1175 | + |
1176 | + raise Exception("Didn't find any source name in the package: {}".format(stdout)) |
1177 | + |
1178 | + |
1179 | +def collect_bugs_in_changelog_until_latest_snapshot(f, source_package_name): |
1180 | + '''Collect all bugs in the changelog until latest snapshot''' |
1181 | + bugs = set() |
1182 | + # matching only bug format that launchpad accepts |
1183 | + group_bugs_regexp = re.compile("lp: ?(.*\d{5,})", re.IGNORECASE) |
1184 | + bug_decipher_regexp = re.compile("(#\d{5,})+") |
1185 | + new_upload_changelog_regexp = re.compile(settings.NEW_CHANGELOG_PATTERN.format(source_package_name)) |
1186 | + for line in f: |
1187 | + grouped_bugs_list = group_bugs_regexp.findall(line) |
1188 | + for grouped_bugs in grouped_bugs_list: |
1189 | + for bug in map(lambda bug_with_hash: bug_with_hash.replace('#', ''), bug_decipher_regexp.findall(grouped_bugs)): |
1190 | + bugs.add(bug) |
1191 | + # a released upload to distro (automated or manual); exit, as the bugs before were already covered |
1192 | + if new_upload_changelog_regexp.match(line): |
1193 | + return bugs |
1194 | + |
1195 | + return bugs |
1196 | + |
1197 | + |
1198 | +def update_changelog(new_package_version, series, tip_bzr_rev, authors_commits, dest_ppa=None): |
1199 | + '''Update the changelog for the incoming upload''' |
1200 | + |
1201 | + dch_env = os.environ.copy() |
1202 | + for author in authors_commits: |
1203 | + dch_env["DEBFULLNAME"] = author |
1204 | + for bug_desc in authors_commits[author]: |
1205 | + if bug_desc.startswith('-'): |
1206 | + # Remove leading '-' or dch thinks (rightly) that it's an option |
1207 | + bug_desc = bug_desc[1:] |
1208 | + if bug_desc.startswith(' '): |
1209 | + # Remove leading spaces; they are useless and the result is |
1210 | + # prettier without them anyway ;) |
1211 | + bug_desc = bug_desc.strip() |
1212 | + cmd = ["dch", "--multimaint-merge", "--release-heuristic", "changelog", |
1213 | + "-v{}".format(new_package_version), bug_desc] |
1214 | + subprocess.Popen(cmd, env=dch_env).communicate() |
1215 | + |
1216 | + if tip_bzr_rev is not None: |
1217 | + commit_message = "{} {}".format(settings.REV_STRING_FORMAT, tip_bzr_rev) |
1218 | + if dest_ppa: |
1219 | + commit_message += " ({})".format(dest_ppa) |
1220 | + else: |
1221 | + commit_message = "" |
1222 | + |
1223 | + dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME |
1224 | + dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL |
1225 | + instance = subprocess.Popen(["dch", "--release-heuristic", "changelog", |
1226 | + "-v{}".format(new_package_version), commit_message], |
1227 | + stderr=subprocess.PIPE, env=dch_env) |
1228 | + (stdout, stderr) = instance.communicate() |
1229 | + if instance.returncode != 0: |
1230 | + raise Exception(stderr.decode("utf-8").strip()) |
1231 | + subprocess.call(["dch", "-r", "--distribution", series, "--force-distribution", ""], env=dch_env) |
1232 | + |
1233 | + # in the case of no commit_message and no symbols file change, we have an additional [ DEBFULLNAME ] line followed by an empty line |
1234 | + # better to remove both lines |
1235 | + subprocess.call(["sed", "-i", "/ \[ " + settings.BOT_DEBFULLNAME + " \]/{$q; N; /\\n$/d;}", "debian/changelog"]) |
1236 | + |
1237 | + |
1238 | +def build_source_package(series, distro_version, ppa=None): |
1239 | + '''Build the source package using the internal helper |
1240 | + |
1241 | + Add the additional ppa inside the chroot if requested.''' |
1242 | + |
1243 | + chroot_tool_dir = os.path.join(settings.ROOT_CU2D, "chroot-tools") |
1244 | + buildsource = os.path.join(chroot_tool_dir, "buildsource-chroot") |
1245 | + branch_dir = os.path.abspath('.') |
1246 | + parent_dir = os.path.abspath(os.path.dirname(branch_dir)) |
1247 | + cowbuilder_env = os.environ.copy() |
1248 | + cowbuilder_env["HOME"] = chroot_tool_dir # take the internal .pbuilderrc |
1249 | + cowbuilder_env["DIST"] = series |
1250 | + cmd = ["sudo", "-E", "cowbuilder", "--execute", |
1251 | + "--bindmounts", parent_dir, |
1252 | + "--bindmounts", settings.GNUPG_DIR, |
1253 | + "--", buildsource, branch_dir, |
1254 | + "--gnupg-parentdir", settings.GNUPG_DIR, |
1255 | + "--uid", str(os.getuid()), "--gid", str(os.getgid()), |
1256 | + "--gnupg-keyid", settings.BOT_KEY, |
1257 | + "--distro-version", distro_version] |
1258 | + if ppa: |
1259 | + cmd.extend(["--ppa", ppa]) |
1260 | + instance = subprocess.Popen(cmd, env=cowbuilder_env) |
1261 | + instance.communicate() |
1262 | + if instance.returncode != 0: |
1263 | + raise Exception("%r returned: %s." % (cmd, instance.returncode)) |
1264 | + |
1265 | + |
1266 | +def upload_package(source, version, ppa, source_directory=None): |
1267 | + '''Upload the new package to a ppa''' |
1268 | + # remove the epoch if there is one |
1269 | + here = None |
1270 | + if source_directory: |
1271 | + here = os.getcwd() |
1272 | + os.chdir(source_directory) |
1273 | + version_for_source_file = version.split(':')[-1] |
1274 | + cmd = ["dput", ppa, |
1275 | + "{}_{}_source.changes".format(source, version_for_source_file)] |
1276 | + if subprocess.call(cmd) != 0: |
1277 | + if here: |
1278 | + os.chdir(here) |
1279 | + raise Exception("%r returned an error." % (cmd,)) |
1280 | + if here: |
1281 | + os.chdir(here) |
1282 | + |
1283 | + |
1284 | +def refresh_symbol_files(packaging_version): |
1285 | + '''Refresh the symbols files containing REPLACEME_TAG with the version of the day. |
1286 | + |
1287 | + Add a changelog entry if needed''' |
1288 | + |
1289 | + new_upstream_version = packaging_version.split("-")[0] |
1290 | + replacement_done = False |
1291 | + for filename in os.listdir("debian"): |
1292 | + if filename.endswith("symbols"): |
1293 | + for line in fileinput.input(os.path.join('debian', filename), inplace=1): |
1294 | + if settings.REPLACEME_TAG in line: |
1295 | + replacement_done = True |
1296 | + line = line.replace(settings.REPLACEME_TAG, new_upstream_version) |
1297 | + sys.stdout.write(line) |
1298 | + |
1299 | + if replacement_done: |
1300 | + dch_env = os.environ.copy() |
1301 | + dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME |
1302 | + dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL |
1303 | + subprocess.Popen(["dch", "debian/*symbols: auto-update new symbols to released version"], env=dch_env).communicate() |
1304 | + subprocess.call(["bzr", "commit", "-m", "Update symbols"]) |
1305 | |
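The `upload_package` helper above drops any epoch from the version before naming the `.changes` file, and manually restores the working directory on both the success and failure paths. A small sketch of the same contract, using try/finally so the cwd is always restored (`changes_filename` is an illustrative helper name, not part of the diff):

```python
import os
import subprocess

def changes_filename(source, version):
    # dput is handed a source.changes file whose name carries the version
    # with any epoch (the part before ':') stripped.
    return "{}_{}_source.changes".format(source, version.split(':')[-1])

def upload_package(source, version, ppa, source_directory=None):
    '''Upload the new package to a ppa (sketch; always restores the cwd).'''
    here = os.getcwd()
    try:
        if source_directory:
            os.chdir(source_directory)
        cmd = ["dput", ppa, changes_filename(source, version)]
        if subprocess.call(cmd) != 0:
            raise Exception("%r returned an error." % (cmd,))
    finally:
        os.chdir(here)
```

For example, `changes_filename("unity", "1:7.1.2-0ubuntu1")` yields `unity_7.1.2-0ubuntu1_source.changes`, matching what dput expects on disk.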
1306 | === added file 'branch-source-builder/cupstream2distro/settings.py' |
1307 | --- branch-source-builder/cupstream2distro/settings.py 1970-01-01 00:00:00 +0000 |
1308 | +++ branch-source-builder/cupstream2distro/settings.py 2014-02-11 20:59:14 +0000 |
1309 | @@ -0,0 +1,121 @@ |
1310 | +# -*- coding: utf-8 -*- |
1311 | +# Copyright (C) 2012 Canonical |
1312 | +# |
1313 | +# Authors: |
1314 | +# Didier Roche |
1315 | +# |
1316 | +# This program is free software; you can redistribute it and/or modify it under |
1317 | +# the terms of the GNU General Public License as published by the Free Software |
1318 | +# Foundation; version 3. |
1319 | +# |
1320 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
1321 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
1322 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
1323 | +# details. |
1324 | +# |
1325 | +# You should have received a copy of the GNU General Public License along with |
1326 | +# this program; if not, write to the Free Software Foundation, Inc., |
1327 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
1328 | + |
1329 | +import os |
1330 | +import yaml |
1331 | + |
1332 | +REV_STRING_FORMAT = "Automatic snapshot from revision" |
1333 | +NEW_CHANGELOG_PATTERN = "^{} \(.*\) (?!UNRELEASED)" |
1334 | +PACKAGING_MERGE_COMMIT_MESSAGE = "Releasing {} (revision {} from {})" |
1335 | +REPLACEME_TAG = "0replaceme" |
1336 | +BRANCH_URL = "lp:~ps-jenkins/{}/latestsnapshot-{}" |
1337 | + |
1338 | +IGNORECHANGELOG_COMMIT = "#nochangelog" |
1339 | + |
1340 | +PROJECT_CONFIG_SUFFIX = "project" |
1341 | + |
1342 | +BOT_DEBFULLNAME = "Ubuntu daily release" |
1343 | +BOT_DEBEMAIL = "ps-jenkins@lists.canonical.com" |
1344 | +home_dir = "/srv/bsb_worker" |
1345 | +CU2D_DIR = home_dir |
1346 | +GNUPG_DIR = CU2D_DIR |
1347 | +if not os.path.isdir(os.path.join(GNUPG_DIR, '.gnupg')): |
1348 | + GNUPG_DIR = home_dir |
1349 | +CRED_FILE_PATH = os.path.join("/tmp", "launchpad.credentials") |
1350 | + |
1351 | +# TODO refactor into a ci-utils module |
1352 | +def _unit_config(): |
1353 | + path = os.path.join(home_dir, 'unit_config') |
1354 | + config = {} |
1355 | + try: |
1356 | + with open(path) as f: |
1357 | + config = yaml.safe_load(f.read()) |
1358 | + except (IOError, yaml.YAMLError): |
1359 | + print('Unable to use unit_config(%s), defaulting values' % path) |
1360 | + return config |
1361 | +_cfg = _unit_config() |
1362 | +LAUNCHPAD_PPA_USER = _cfg.get('launchpad_user', None) |
1363 | +LAUNCHPAD_API_BASE = _cfg.get( |
1364 | + 'launchpad_api_base', 'https://api.launchpad.net/1.0') |
1365 | +OAUTH_CONSUMER_KEY = _cfg.get('oauth_consumer_key', None) |
1366 | +OAUTH_TOKEN = _cfg.get('oauth_token', None) |
1367 | +OAUTH_TOKEN_SECRET = _cfg.get('oauth_token_secret', None) |
1368 | +OAUTH_REALM = _cfg.get('oauth_realm', 'https://api.launchpad.net/') |
1369 | + |
1370 | +if not os.path.exists(CRED_FILE_PATH): |
1371 | + with open(CRED_FILE_PATH, 'w') as f: |
1372 | + f.write('[1]\n') |
1373 | + f.write('consumer_key = %s\n' % OAUTH_CONSUMER_KEY) |
1374 | + f.write('consumer_secret = \n') |
1375 | + f.write('access_token = %s\n' % OAUTH_TOKEN) |
1376 | + f.write('access_secret = %s\n' % OAUTH_TOKEN_SECRET) |
1377 | + |
1378 | +COMMON_LAUNCHPAD_CACHE_DIR = os.path.join("/tmp", "launchpad.cache") |
1379 | +if not os.path.isdir(COMMON_LAUNCHPAD_CACHE_DIR): |
1380 | + os.makedirs(COMMON_LAUNCHPAD_CACHE_DIR) |
1381 | +BOT_KEY = "B879A3E9" |
1382 | + |
1383 | +# selected archs for building arch:all packages |
1384 | +VIRTUALIZED_PPA_ARCH = ["i386", "amd64"] |
1385 | +# archs we will ignore for publication if the latest published version in dest doesn't build them |
1386 | +ARCHS_TO_EVENTUALLY_IGNORE = set(['powerpc', 'arm64', 'ppc64el']) |
1387 | +ARCHS_TO_UNCONDITIONALLY_IGNORE = set(['arm64', 'ppc64el']) |
1388 | +SRU_PPA = "ubuntu-unity/sru-staging" |
1389 | + |
1390 | +TIME_BETWEEN_PPA_CHECKS = 60 |
1391 | +TIME_BETWEEN_STACK_CHECKS = 60 |
1392 | +TIME_BEFORE_STOP_LOOKING_FOR_SOURCE_PUBLISH = 20 * 60 |
1393 | + |
1394 | +PUBLISHER_ARTEFACTS_FILENAME = 'publisher.xml' |
1395 | +PREPARE_ARTEFACTS_FILENAME_FORMAT = 'prepare_{}.xml' |
1396 | + |
1397 | +OLD_STACK_DIR = 'old' |
1398 | +PACKAGE_LIST_RSYNC_FILENAME_PREFIX = 'packagelist_rsync' |
1399 | +PACKAGE_LIST_RSYNC_FILENAME_FORMAT = PACKAGE_LIST_RSYNC_FILENAME_PREFIX + '_{}-{}' |
1400 | +RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*".format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX) |
1401 | + |
1402 | +ROOT_CU2D = os.path.join(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) |
1403 | +DEFAULT_CONFIG_STACKS_DIR = os.path.join(os.path.dirname(ROOT_CU2D), 'cupstream2distro-config', 'stacks') |
1404 | +STACK_STATUS_FILENAME = "stack.status" |
1405 | +STACK_STARTED_FILENAME = "stack.started" |
1406 | +STACK_BUILDING_FILENAME = "stack.building" |
1407 | + |
1408 | +STACK_RUNNING_DIR = "/iSCSI/jenkins/cu2d/work" |
1409 | +STACK_STATUS_PUBLISHING_DIR = "/iSCSI/jenkins/cu2d/result_publishing" |
1410 | + |
1411 | +# for citrain |
1412 | +SILO_NAME_LIST = [] |
1413 | +for i in xrange(1, 11): |
1414 | + SILO_NAME_LIST.append("landing-{:03d}".format(i)) |
1415 | +SILO_CONFIG_FILENAME = "config" |
1416 | +SILO_BUILDPPA_SCHEME = "ci-train-ppa-service/{}" |
1417 | +SILO_PACKAGING_RELEASE_COMMIT_MESSAGE = "Releasing {}" |
1418 | +SILOS_RAW_DIR = "~/silos" |
1419 | +SILOS_DIR = os.path.expanduser(SILOS_RAW_DIR) |
1420 | +SILO_RSYNCDIR = "~/out" |
1421 | +SILO_STATUS_RSYNCDIR = os.path.expanduser("~/status") |
1422 | +CITRAIN_BINDIR = "~/citrain/citrain" |
1423 | +(SILO_EMPTY, SILO_BUILTCHECKED, SILO_PUBLISHED, SILO_DONE) = range(4) |
1424 | + |
1425 | +SERIES_VERSION = { |
1426 | + 'precise': '12.04', |
1427 | + 'raring': '13.04', |
1428 | + 'saucy': '13.10', |
1429 | + 'trusty': '14.04' |
1430 | +} |
1431 | |
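settings.py resolves every deployment value through `_unit_config()`, which falls back to an empty dict when the unit_config file is unreadable, so each `_cfg.get(...)` call supplies its own default. A sketch of that pattern (the real code parses YAML with `yaml.safe_load`; json is used here only to keep the sketch stdlib-only, and the path is illustrative):

```python
import json

def load_unit_config(path):
    """Return the unit config as a dict, or {} when the file is unreadable."""
    try:
        with open(path) as f:
            return json.load(f) or {}
    except (IOError, OSError, ValueError):
        # Same "default everything" fallback as settings._unit_config.
        print('Unable to use unit_config(%s), defaulting values' % path)
        return {}

_cfg = load_unit_config('/nonexistent/unit_config')
LAUNCHPAD_PPA_USER = _cfg.get('launchpad_user', None)
LAUNCHPAD_API_BASE = _cfg.get('launchpad_api_base',
                              'https://api.launchpad.net/1.0')
```

The design choice is that a missing or broken unit_config never aborts import of the settings module; every consumer must therefore tolerate the defaults.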
1432 | === added file 'branch-source-builder/cupstream2distro/silomanager.py' |
1433 | --- branch-source-builder/cupstream2distro/silomanager.py 1970-01-01 00:00:00 +0000 |
1434 | +++ branch-source-builder/cupstream2distro/silomanager.py 2014-02-11 20:59:14 +0000 |
1435 | @@ -0,0 +1,115 @@ |
1436 | +# -*- coding: utf-8 -*- |
1437 | +# Copyright (C) 2014 Canonical |
1438 | +# |
1439 | +# Authors: |
1440 | +# Didier Roche |
1441 | +# |
1442 | +# This program is free software; you can redistribute it and/or modify it under |
1443 | +# the terms of the GNU General Public License as published by the Free Software |
1444 | +# Foundation; version 3. |
1445 | +# |
1446 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
1447 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
1448 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
1449 | +# details. |
1450 | +# |
1451 | +# You should have received a copy of the GNU General Public License along with |
1452 | +# this program; if not, write to the Free Software Foundation, Inc., |
1453 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
1454 | + |
1455 | +import json |
1456 | +import logging |
1457 | +import os |
1458 | +import shutil |
1459 | + |
1460 | +from cupstream2distro.settings import SILO_CONFIG_FILENAME, SILO_NAME_LIST, SILO_STATUS_RSYNCDIR |
1461 | +from cupstream2distro.utils import ignored |
1462 | + |
1463 | + |
1464 | +def save_config(config, uri=''): |
1465 | + """Save config in uri and copy to outdir""" |
1466 | + silo_config_path = os.path.abspath(os.path.join(uri, SILO_CONFIG_FILENAME)) |
1467 | + with ignored(OSError): |
1468 | + os.makedirs(uri) |
1469 | + try: |
1470 | + json.dump(config, open(silo_config_path, 'w')) |
1471 | + except TypeError as e: |
1472 | + logging.error("Can't save configuration: " + e.message) |
1473 | + os.remove(silo_config_path) |
1474 | + return False |
1475 | + # copy to outdir |
1476 | + with ignored(OSError): |
1477 | + os.makedirs(SILO_STATUS_RSYNCDIR) |
1478 | + silo_name = os.path.dirname(silo_config_path).split(os.path.sep)[-1] |
1479 | + dest = os.path.join(SILO_STATUS_RSYNCDIR, silo_name) |
1480 | + logging.debug("Copying configuration from {} to {}".format(silo_config_path, dest)) |
1481 | + shutil.copy2(silo_config_path, os.path.join(SILO_STATUS_RSYNCDIR, silo_name)) |
1482 | + return True |
1483 | + |
1484 | +def load_config(uri=None): |
1485 | + """return a loaded config |
1486 | + |
1487 | + If no uri, load in the current directory""" |
1488 | + if not uri: |
1489 | + uri = os.path.abspath('.') |
1490 | + logging.debug("Reading configuration in {}".format(uri)) |
1491 | + try: |
1492 | + return json.load(open(os.path.join(uri, SILO_CONFIG_FILENAME))) |
1493 | + # if silo isn't configured |
1494 | + except IOError: |
1495 | + pass |
1496 | + except ValueError as e: |
1497 | + logging.warning("Can't load configuration: " + e.message) |
1498 | + return None |
1499 | + |
1500 | +def remove_status_file(silo_name): |
1501 | + """Remove status file""" |
1502 | + os.remove(os.path.join(SILO_STATUS_RSYNCDIR, silo_name)) |
1503 | + |
1504 | + |
1505 | +def is_project_not_in_any_configs(project_name, series, dest, base_silo_uri, ignore_silo): |
1506 | + """Return true if the project for that serie in that dest is not in any configuration""" |
1507 | + logging.info("Checking if {} is already configured for {} ({}) in another silo".format(project_name, dest.name, series.name)) |
1508 | + for silo_name in SILO_NAME_LIST: |
1509 | + # we are reconfiguring current silo, ignoring it |
1510 | + if ignore_silo == silo_name: |
1511 | + continue |
1512 | + config = load_config(os.path.join(base_silo_uri, silo_name)) |
1513 | + if config: |
1514 | + if (config["global"]["dest"] == dest.self_link and config["global"]["series"] == series.self_link and |
1515 | + (project_name in config["mps"] or project_name in config["sources"])): |
1516 | + logging.error("{} is already prepared for the same serie and destination in {}".format(project_name, silo_name)) |
1517 | + return False |
1518 | + return True |
1519 | + |
1520 | + |
1521 | +def return_first_available_silo(base_silo_uri): |
1522 | + """Check which silos are free and return the first one""" |
1523 | + for silo_name in SILO_NAME_LIST: |
1524 | + if not os.path.isfile(os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME)): |
1525 | + return silo_name |
1526 | + return None |
1527 | + |
1528 | +def get_config_step(config): |
1529 | + """Get configuration step""" |
1530 | + return config["global"]["step"] |
1531 | + |
1532 | +def set_config_step(config, new_step, uri=''): |
1533 | + """Set configuration step to new_step""" |
1534 | + config["global"]["step"] = new_step |
1535 | + return save_config(config, uri) |
1536 | + |
1537 | +def set_config_status(config, status, uri='', add_url=True): |
1538 | + """Change status to reflect latest status""" |
1539 | + build_url = os.getenv('BUILD_URL') |
1540 | + if add_url and build_url: |
1541 | + status = "{} ({}console)".format(status , build_url) |
1542 | + config["global"]["status"] = status |
1543 | + return save_config(config, uri) |
1544 | + |
1545 | +def get_all_projects(config): |
1546 | + """Get a list of all projets""" |
1547 | + projects = [] |
1548 | + projects.extend(config["mps"]) |
1549 | + projects.extend(config["sources"]) |
1550 | + return projects |
1551 | |
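silomanager persists each silo's state as a JSON `config` file and treats a missing or unparsable file as "silo not configured". A minimal sketch of that save/load round trip (the status-rsync copy step and logging are omitted; paths here are illustrative):

```python
import json
import os
import shutil
import tempfile

SILO_CONFIG_FILENAME = "config"  # mirrors settings.SILO_CONFIG_FILENAME

def save_config(config, uri):
    """Write config as JSON under uri; return False if it isn't serializable."""
    if not os.path.isdir(uri):
        os.makedirs(uri)
    path = os.path.join(uri, SILO_CONFIG_FILENAME)
    try:
        with open(path, 'w') as f:
            json.dump(config, f)
    except TypeError:
        # Partial file from a failed dump must not look like a valid config.
        os.remove(path)
        return False
    return True

def load_config(uri):
    """Return the parsed config, or None when the silo isn't configured."""
    try:
        with open(os.path.join(uri, SILO_CONFIG_FILENAME)) as f:
            return json.load(f)
    except (IOError, OSError, ValueError):
        return None

silo = tempfile.mkdtemp()
try:
    assert save_config({"global": {"step": 0}}, silo)
    assert load_config(silo) == {"global": {"step": 0}}
    assert load_config(os.path.join(silo, "empty")) is None
finally:
    shutil.rmtree(silo)
```

Removing the half-written file on a failed dump is what lets `return_first_available_silo` safely use "config file exists" as the busy test.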
1552 | === added file 'branch-source-builder/cupstream2distro/stack.py' |
1553 | --- branch-source-builder/cupstream2distro/stack.py 1970-01-01 00:00:00 +0000 |
1554 | +++ branch-source-builder/cupstream2distro/stack.py 2014-02-11 20:59:14 +0000 |
1555 | @@ -0,0 +1,204 @@ |
1556 | +# -*- coding: utf-8 -*- |
1557 | +# Copyright (C) 2012 Canonical |
1558 | +# |
1559 | +# Authors: |
1560 | +# Didier Roche |
1561 | +# |
1562 | +# This program is free software; you can redistribute it and/or modify it under |
1563 | +# the terms of the GNU General Public License as published by the Free Software |
1564 | +# Foundation; version 3. |
1565 | +# |
1566 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
1567 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
1568 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
1569 | +# details. |
1570 | +# |
1571 | +# You should have received a copy of the GNU General Public License along with |
1572 | +# this program; if not, write to the Free Software Foundation, Inc., |
1573 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
1574 | + |
1575 | +import logging |
1576 | +import os |
1577 | +import yaml |
1578 | + |
1579 | +from .settings import DEFAULT_CONFIG_STACKS_DIR, STACK_STATUS_FILENAME, STACK_STARTED_FILENAME |
1580 | +from .utils import ignored |
1581 | + |
1582 | +_stacks_ref = {} |
1583 | + |
1584 | +# TODO: should be used by a metaclass |
1585 | +def get_stack(release, stack_name): |
1586 | + try: |
1587 | + return _stacks_ref[release][stack_name] |
1588 | + except KeyError: |
1589 | + return Stack(release, stack_name) |
1590 | + |
1591 | +class Stack(): |
1592 | + |
1593 | + def __init__(self, release, stack_name): |
1594 | + self.stack_name = stack_name |
1595 | + self.release = release |
1596 | + self.statusfile = os.path.join('..', '..', release, stack_name, STACK_STATUS_FILENAME) |
1597 | + self.startedfile = os.path.join('..', '..', release, stack_name, STACK_STARTED_FILENAME) |
1598 | + self.stack_file_path = None |
1599 | + self._dependencies = None |
1600 | + self._rdependencies = None |
1601 | + for stack_file_path in Stack.get_stacks_file_path(release): |
1602 | + if stack_file_path.split(os.path.sep)[-1] == "{}.cfg".format(stack_name): |
1603 | + self.stack_file_path = stack_file_path |
1604 | + break |
1605 | + if not self.stack_file_path: |
1606 | + raise Exception("{}.cfg for {} doesn't exist anywhere in {}".format(stack_name, release, self.get_root_stacks_dir())) |
1607 | + |
1608 | + with open(self.stack_file_path, 'r') as f: |
1609 | + cfg = yaml.load(f) |
1610 | + try: |
1611 | + self.forced_manualpublish = cfg['stack']['manualpublish'] |
1612 | + except (TypeError, KeyError): |
1613 | + self.forced_manualpublish = False |
1614 | + # register to the global dict |
1615 | + _stacks_ref.setdefault(release, {})[stack_name] = self |
1616 | + |
1617 | + def get_status(self): |
1618 | + '''Return a stack status |
1619 | + |
1620 | + 0 if everything is fine and published |
1621 | + 1 if the stack failed in a step |
1622 | + 2 if the stack succeeded, but needs manual publishing |
1623 | + |
1624 | + Return None if the status is not available yet''' |
1625 | + |
1626 | + cfg = yaml.load(open(self.stack_file_path)) |
1627 | + with ignored(KeyError): |
1628 | + if cfg['stack']['status_ignored']: |
1629 | + return 0 |
1630 | + if not os.path.isfile(self.statusfile): |
1631 | + return None |
1632 | + with open(self.statusfile, 'r') as f: |
1633 | + return(int(f.read())) |
1634 | + |
1635 | + def is_started(self): |
1636 | + '''Return True if the stack is started (dep-wait or building)''' |
1637 | + if os.path.isfile(self.startedfile): |
1638 | + return True |
1639 | + return False |
1640 | + |
1641 | + def is_building(self): |
1642 | + '''Return True if the stack is building''' |
1643 | + if os.path.isfile(self.startedfile): |
1644 | + return True |
1645 | + return False |
1646 | + |
1647 | + def is_enabled(self): |
1648 | + '''Return True if the stack is enabled for daily release''' |
1649 | + with open(self.stack_file_path, 'r') as f: |
1650 | + cfg = yaml.load(f) |
1651 | + try: |
1652 | + if not cfg['stack']['enabled']: |
1653 | + return False |
1654 | + except KeyError: |
1655 | + pass |
1656 | + return True |
1657 | + |
1658 | + def get_direct_depending_stacks(self): |
1659 | + '''Get a list of direct depending stacks''' |
1660 | + if self._dependencies is not None: |
1661 | + return self._dependencies |
1662 | + |
1663 | + with open(self.stack_file_path, 'r') as f: |
1664 | + cfg = yaml.load(f) |
1665 | + try: |
1666 | + deps_list = cfg['stack']['dependencies'] |
1667 | + self._dependencies = [] |
1668 | + if not deps_list: |
1669 | + return self._dependencies |
1670 | + for item in deps_list: |
1671 | + if isinstance(item, dict): |
1672 | + (stackname, release) = (item["name"], item["release"]) |
1673 | + else: |
1674 | + (stackname, release) = (item, self.release) |
1675 | + self._dependencies.append(get_stack(release, stackname)) |
1676 | + logging.info("{} ({}) dependency list is: {}".format(self.stack_name, self.release, ["{} ({})".format(stack.stack_name, stack.release) for stack in self._dependencies])) |
1677 | + return self._dependencies |
1678 | + except (TypeError, KeyError): |
1679 | + return [] |
1680 | + |
1681 | + def get_direct_rdepends_stack(self): |
1682 | + '''Get a list of direct rdepends''' |
1683 | + if self._rdependencies is not None: |
1684 | + return self._rdependencies |
1685 | + |
1686 | + self._rdependencies = [] |
1687 | + for stackfile in Stack.get_stacks_file_path(self.release): |
1688 | + path = stackfile.split(os.path.sep) |
1689 | + stack = get_stack(path[-2], path[-1].replace(".cfg", "")) |
1690 | + if self in stack.get_direct_depending_stacks(): |
1691 | + self._rdependencies.append(stack) |
1692 | + return self._rdependencies |
1693 | + |
1694 | + def generate_dep_status_message(self): |
1695 | + '''Return a list of potential problems from other stacks which should block the current publication''' |
1696 | + |
1697 | + # TODO: get the first Stack object |
1698 | + # iterate over all stacks objects from dep chain |
1699 | + # call get_status on all of them |
1700 | + |
1701 | + global_dep_status_info = [] |
1702 | + for stack in self.get_direct_depending_stacks(): |
1703 | + logging.info("Check status for {} ({})".format(stack.stack_name, stack.release)) |
1704 | + status = stack.get_status() |
1705 | + message = None |
1706 | + # We should have a status for every stack |
1707 | + if status is None: |
1708 | + message = "Can't find status for {depstack} ({deprel}). This shouldn't happen unless the stack is currently running. If this is the case, it means that the current stack shouldn't be uploaded as the state is unknown.".format(depstack=stack, deprel=stack.release) |
1709 | + elif status == 1: |
1710 | + message = '''{depstack} ({deprel}) failed to publish. Possible causes are: |
1711 | + * the stack really didn't build/can't be prepared at all. |
1712 | + * the stack has integration tests not working with this previous stack. |
1713 | + |
1714 | + What needs to be done: |
1715 | + Either: |
1716 | + * If we want to publish both stacks: retry the integration tests for {depstack} ({deprel}), including components from this stack (check with the whole PPA). If that works, both stacks should be published at the same time. |
1717 | + Or: |
1718 | + * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release) |
1719 | + elif status == 2: |
1720 | + message = '''{depstack} ({deprel}) is in manually publish mode. Possible causes are: |
1721 | + * Some part of the stack has packaging changes |
1722 | + * This stack is depending on another stack not being published |
1723 | + |
1724 | + What needs to be done: |
1725 | + Either: |
1726 | + * If {depstack} ({deprel}) can be published, we should publish both stacks at the same time. |
1727 | + Or: |
1728 | + * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release) |
1729 | + elif status == 3 or status == -1: |
1730 | + message = '''{depstack} ({deprel}) has been manually aborted or failed for an unknown reason. Possible causes are: |
1731 | + * A job of this stack was stopped manually |
1732 | + * Jenkins had an internal error/shutdown |
1733 | + |
1734 | + What needs to be done: |
1735 | + * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release) |
1736 | + |
1737 | + if message: |
1738 | + logging.warning(message) |
1739 | + global_dep_status_info.append(message) |
1740 | + return global_dep_status_info |
1741 | + |
1742 | + @staticmethod |
1743 | + def get_root_stacks_dir(): |
1744 | + '''Get root stack dir''' |
1745 | + return os.environ.get('CONFIG_STACKS_DIR', DEFAULT_CONFIG_STACKS_DIR) |
1746 | + |
1747 | + @staticmethod |
1748 | + def get_stacks_file_path(release): |
1749 | + '''Return an iterator with all paths for every discovered stack file''' |
1750 | + for root, dirs, files in os.walk(os.path.join(Stack.get_root_stacks_dir(), release)): |
1751 | + for candidate in files: |
1752 | + if candidate.endswith('.cfg'): |
1753 | + yield os.path.join(root, candidate) |
1754 | + |
1755 | + @staticmethod |
1756 | + def get_current_stack(): |
1757 | + '''Return current stack object based on current path (release/stackname)''' |
1758 | + path = os.getcwd().split(os.path.sep) |
1759 | + return get_stack(path[-2], path[-1]) |
1760 | |
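`get_status()` encodes stack state as small integers (0, 1, 2 per its docstring), `generate_dep_status_message` additionally treats 3 and -1 as a manual abort or unknown failure, and `None` means no status file yet. A tiny interpretation helper, as a sketch (labels are illustrative wording; the real messages live in `generate_dep_status_message`):

```python
STATUS_LABELS = {
    0: "published",
    1: "failed in a step",
    2: "succeeded, needs manual publishing",
}

def describe_status(status):
    """Map a stack status code to a short human-readable label."""
    if status is None:
        return "no status yet (stack may still be running)"
    if status in (3, -1):
        return "manually aborted or failed for an unknown reason"
    return STATUS_LABELS.get(status, "unexpected status {}".format(status))
```

Centralizing the mapping like this keeps the integer protocol between the status file and its consumers in one place.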
1761 | === added file 'branch-source-builder/cupstream2distro/stacks.py' |
1762 | --- branch-source-builder/cupstream2distro/stacks.py 1970-01-01 00:00:00 +0000 |
1763 | +++ branch-source-builder/cupstream2distro/stacks.py 2014-02-11 20:59:14 +0000 |
1764 | @@ -0,0 +1,89 @@ |
1765 | +# -*- coding: utf-8 -*- |
1766 | +# Copyright (C) 2012 Canonical |
1767 | +# |
1768 | +# Authors: |
1769 | +# Didier Roche |
1770 | +# |
1771 | +# This program is free software; you can redistribute it and/or modify it under |
1772 | +# the terms of the GNU General Public License as published by the Free Software |
1773 | +# Foundation; version 3. |
1774 | +# |
1775 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
1776 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
1777 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
1778 | +# details. |
1779 | +# |
1780 | +# You should have received a copy of the GNU General Public License along with |
1781 | +# this program; if not, write to the Free Software Foundation, Inc., |
1782 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
1783 | + |
1784 | +import logging |
1785 | +import os |
1786 | +import yaml |
1787 | +import subprocess |
1788 | + |
1789 | +from .settings import PACKAGE_LIST_RSYNC_FILENAME_PREFIX, RSYNC_PATTERN |
1790 | +from .tools import get_packaging_diff_filename |
1791 | +from .stack import Stack |
1792 | + |
1793 | + |
1794 | +def _rsync_stack_files(): |
1795 | + '''rsync all stack files''' |
1796 | + server = os.getenv('CU2D_RSYNCSVR') |
1797 | + if server == "none": |
1798 | + return |
1799 | + elif server: |
1800 | + remoteaddr = RSYNC_PATTERN.replace('RSYNCSVR', server) |
1801 | + else: |
1802 | + raise Exception('Please set environment variable CU2D_RSYNCSVR') |
1803 | + |
1804 | + cmd = ["rsync", '--remove-source-files', '--timeout=60', remoteaddr, '.'] |
1805 | + instance = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) |
1806 | + (stdout, stderr) = instance.communicate() |
1807 | + if instance.returncode not in (0, 23): |
1808 | + raise Exception(stderr.decode("utf-8").strip()) |
1809 | + |
1810 | + |
1811 | +def get_stack_files_to_sync(): |
1812 | + '''Yield tuples of (file, release)''' |
1813 | + _rsync_stack_files() |
1814 | + for file in os.listdir('.'): |
1815 | + if file.startswith(PACKAGE_LIST_RSYNC_FILENAME_PREFIX): |
1816 | + yield (file, file.split('-')[-1]) |
1817 | + |
1818 | + |
1819 | +def get_allowed_projects(release): |
1820 | + '''Get all projects allowed to be uploaded for this release''' |
1821 | + |
1822 | + projects = [] |
1823 | + for file_path in Stack.get_stacks_file_path(release): |
1824 | + with open(file_path, 'r') as f: |
1825 | + cfg = yaml.load(f) |
1826 | + try: |
1827 | + projects_list = cfg['stack']['projects'] |
1828 | + except (TypeError, KeyError): |
1829 | + logging.warning("{} seems broken in not having stack or projects keys".format(file_path)) |
1830 | + continue |
1831 | + if not projects_list: |
1832 | + logging.warning("{} don't have any project list".format(file_path)) |
1833 | + continue |
1834 | + for project in projects_list: |
1835 | + if isinstance(project, dict): |
1836 | + projects.append(project.keys()[0]) |
1837 | + else: |
1838 | + projects.append(project) |
1839 | + return set(projects) |
1840 | + |
1841 | +def get_stack_packaging_change_status(source_version_list): |
1842 | + '''Return global package change status list |
1843 | + |
1844 | + # FIXME: carries too much info now, should only be: (source, version) |
1845 | + source_version_list is a list of tuples (source, version, tip_rev, target_branch)''' |
1846 | + |
1847 | + packaging_change_status = [] |
1848 | + for (source, version, tip_rev, target_branch) in source_version_list: |
1849 | + if os.path.exists(get_packaging_diff_filename(source, version)): |
1850 | + message = "Packaging change for {} ({}).".format(source, version) |
1851 | + logging.warning(message) |
1852 | + packaging_change_status.append(message) |
1853 | + return packaging_change_status |
1854 | |
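`get_allowed_projects` accepts 'projects' entries that are either plain strings or single-key dicts; the original uses `project.keys()[0]`, which only works on Python 2 (the rest of the module is Python 2 as well). A version-portable sketch of that normalization step:

```python
def normalize_projects(projects_list):
    """Flatten a stack's 'projects' entries (plain names or one-key dicts)
    into a set of project names."""
    projects = []
    for project in projects_list:
        if isinstance(project, dict):
            # list() handles both Python 2 lists and Python 3 dict views.
            projects.append(list(project.keys())[0])
        else:
            projects.append(project)
    return set(projects)
```

For example, a config entry like `['compiz', {'unity': {'daily_release': True}}]` normalizes to `{'compiz', 'unity'}`.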
1855 | === added file 'branch-source-builder/cupstream2distro/tools.py' |
1856 | --- branch-source-builder/cupstream2distro/tools.py 1970-01-01 00:00:00 +0000 |
1857 | +++ branch-source-builder/cupstream2distro/tools.py 2014-02-11 20:59:14 +0000 |
1858 | @@ -0,0 +1,94 @@ |
1859 | +# -*- coding: utf-8 -*- |
1860 | +# Copyright (C) 2012 Canonical |
1861 | +# |
1862 | +# Authors: |
1863 | +# Didier Roche |
1864 | +# |
1865 | +# This program is free software; you can redistribute it and/or modify it under |
1866 | +# the terms of the GNU General Public License as published by the Free Software |
1867 | +# Foundation; version 3. |
1868 | +# |
1869 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
1870 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
1871 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
1872 | +# details. |
1873 | +# |
1874 | +# You should have received a copy of the GNU General Public License along with |
1875 | +# this program; if not, write to the Free Software Foundation, Inc., |
1876 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
1877 | + |
1878 | +import ConfigParser |
1879 | +import glob |
1880 | +import os |
1881 | +import shutil |
1882 | +from xml.sax.saxutils import quoteattr, escape |
1883 | + |
1884 | +from .settings import PROJECT_CONFIG_SUFFIX |
1885 | +from .utils import ignored |
1886 | + |
1887 | +WRAPPER_STRING = '''<testsuite errors="0" failures="{}" name="" tests="1" time="0.1"> |
1888 | + <testcase classname="MarkUnstable" name={} time="0.0">{}</testcase> |
1889 | +</testsuite>''' |
1890 | + |
1891 | + |
1892 | +def generate_xml_artefacts(test_name, details, filename): |
1893 | + '''Generate a fake test name xml result for marking the build as unstable''' |
1894 | + failure = "" |
1895 | + errnum = 0 |
1896 | + for detail in details: |
1897 | + errnum = 1 |
1898 | + failure += ' <failure type="exception">{}</failure>\n'.format(escape(detail)) |
1899 | + if failure: |
1900 | + failure = '\n{}'.format(failure) |
1901 | + |
1902 | + with open(filename, 'w') as f: |
1903 | + f.write(WRAPPER_STRING.format(errnum, quoteattr(test_name), failure)) |
1904 | + |
1905 | + |
1906 | +def get_previous_distro_version_from_config(source_package_name): |
1907 | + '''Get previous packaging version which was in bzr from the saved config''' |
1908 | + config = ConfigParser.RawConfigParser() |
1909 | + config.read("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)) |
1910 | + return config.get('Package', 'dest_current_version') |
1911 | + |
1912 | + |
1913 | +def save_project_config(source_package_name, branch, revision, dest_current_version, current_packaging_version): |
1914 | + '''Save branch and package configuration''' |
1915 | + config = ConfigParser.RawConfigParser() |
1916 | + config.add_section('Branch') |
1917 | + config.set('Branch', 'branch', branch) |
1918 | + config.set('Branch', 'rev', revision) |
1919 | + config.add_section('Package') |
1920 | + config.set('Package', 'dest_current_version', dest_current_version) |
1921 | + config.set('Package', 'packaging_version', current_packaging_version) |
1922 | + with open("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX), 'wb') as configfile: |
1923 | + config.write(configfile) |
1924 | + |
1925 | + |
1926 | +def get_packaging_diff_filename(source_package_name, packaging_version): |
1927 | + '''Return the packaging diff filename''' |
1928 | + |
1929 | + return "packaging_changes_{}_{}.diff".format(source_package_name, packaging_version) |
1930 | + |
1931 | + |
1932 | +def mark_project_as_published(source_package_name, packaging_version): |
1933 | + '''Rename .project and eventual diff files so that if we do a partial rebuild, we don't try to republish them''' |
1934 | + project_filename = "{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX) |
1935 | + os.rename(project_filename, "{}_{}".format(project_filename, packaging_version)) |
1936 | + diff_filename = get_packaging_diff_filename(source_package_name, packaging_version) |
1937 | + if os.path.isfile(diff_filename): |
1938 | + os.rename(diff_filename, "{}.published".format(diff_filename)) |
1939 | + |
1940 | + |
1941 | +def clean_source(source): |
1942 | + """clean all related source content from current silos""" |
1943 | + with ignored(OSError): |
1944 | + shutil.rmtree(source) |
1945 | + with ignored(OSError): |
1946 | + os.remove("{}.{}".format(source, PROJECT_CONFIG_SUFFIX)) |
1947 | + with ignored(OSError): |
1948 | + shutil.rmtree("ubuntu/{}".format(source)) |
1949 | + for filename in glob.glob("{}_*".format(source)): |
1950 | + os.remove(filename) |
1951 | + for filename in glob.glob("packaging_changes_{}_*diff".format(source)): |
1952 | + os.remove(filename) |
1953 | |
1954 | === added file 'branch-source-builder/cupstream2distro/utils.py' |
1955 | --- branch-source-builder/cupstream2distro/utils.py 1970-01-01 00:00:00 +0000 |
1956 | +++ branch-source-builder/cupstream2distro/utils.py 2014-02-11 20:59:14 +0000 |
1957 | @@ -0,0 +1,29 @@ |
1958 | +# -*- coding: utf-8 -*- |
1959 | +# Copyright (C) 2013 Canonical |
1960 | +# |
1961 | +# Authors: |
1962 | +# Didier Roche |
1963 | +# |
1964 | +# This program is free software; you can redistribute it and/or modify it under |
1965 | +# the terms of the GNU General Public License as published by the Free Software |
1966 | +# Foundation; version 3. |
1967 | +# |
1968 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
1969 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
1970 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
1971 | +# details. |
1972 | +# |
1973 | +# You should have received a copy of the GNU General Public License along with |
1974 | +# this program; if not, write to the Free Software Foundation, Inc., |
1975 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
1976 | + |
1977 | +from contextlib import contextmanager |
1978 | + |
1979 | + |
1980 | +# this is stolen from python 3.4 :) |
1981 | +@contextmanager |
1982 | +def ignored(*exceptions): |
1983 | + try: |
1984 | + yield |
1985 | + except exceptions: |
1986 | + pass |
1987 | |
1988 | === modified file 'branch-source-builder/run_worker' |
1989 | --- branch-source-builder/run_worker 2014-01-06 19:47:57 +0000 |
1990 | +++ branch-source-builder/run_worker 2014-02-11 20:59:14 +0000 |
1991 | @@ -18,7 +18,10 @@ |
1992 | import logging |
1993 | import os |
1994 | import sys |
1995 | +import time |
1996 | |
1997 | +import upload_package |
1998 | +import watch_ppa |
1999 | |
2000 | logging.basicConfig(level=logging.INFO) |
2001 | log = logging.getLogger(__name__) |
2002 | @@ -27,23 +30,59 @@ |
2003 | # and add it, so we can always safely import stuff |
2004 | sys.path.append(os.path.join(os.path.dirname(__file__), '../ci-utils')) |
2005 | from ci_utils import amqp_utils |
2006 | - |
2007 | +TIME_BETWEEN_CHECKS = 60 |
2008 | |
2009 | def on_message(msg): |
2010 | - log.info('on_message: %s', msg.body) |
2011 | + log.info('on_message: {}'.format(msg.body)) |
2012 | params = json.loads(msg.body) |
2013 | sources = params['source_packages'] |
2014 | ppa = params['ppa'] |
2015 | + log.info('The PPA is: {}'.format(ppa)) |
2016 | trigger = params['progress_trigger'] |
2017 | + |
2018 | + # Setup the output data to send back to the caller |
2019 | + # TODO: sources_packages are just the passed in source files, this |
2020 | + # can be made to be more useful. |
2021 | + # TODO: artifacts will be populated with artifacts from this build. |
2022 | + out_data = {'source_packages': sources, |
2023 | + 'ppa': ppa, |
2024 | + 'artifacts': []} |
2025 | amqp_utils.progress_update(trigger, params) |
2026 | |
2027 | - # TODO build stuff |
2028 | - |
2029 | - if amqp_utils.progress_completed(trigger): |
2030 | - log.error('Unable to notify progress-trigger completition of action') |
2031 | - |
2032 | - # remove from queue so request becomes completed |
2033 | - msg.channel.basic_ack(msg.delivery_tag) |
2034 | + # TODO Replace these hardcoded parameters with params from the message |
2035 | + archive_ppa = 'ppa:ci-engineering-airline/ci-archive' |
2036 | + series = 'saucy' |
2037 | + try: |
2038 | + upload_list = upload_package.upload_source_packages(ppa, sources) |
2039 | + log.info('upload_list: {}'.format(upload_list)) |
2040 | + start_time = time.time() |
2041 | + while True: |
2042 | + (ret, status) = watch_ppa.watch_ppa(start_time, series, ppa, |
2043 | + archive_ppa, None, |
2044 | + upload_list) |
2045 | + progress = {} |
2046 | + for key in status: |
2047 | + progress[key] = str(status[key]) |
2048 | + log.info('progress: {}'.format(progress)) |
2049 | + out_data['status'] = progress |
2050 | + amqp_utils.progress_update(trigger, out_data) |
2051 | + if ret == -1: |
2052 | + log.info('Going to sleep for {}'.format(TIME_BETWEEN_CHECKS)) |
2053 | + time.sleep(TIME_BETWEEN_CHECKS) |
2054 | + else: |
2055 | + log.info('All done') |
2056 | + break |
2057 | + |
2058 | + amqp_utils.progress_completed(trigger, out_data) |
2059 | + except Exception as e: |
2060 | + error_msg = 'Exception: {}'.format(e) |
2061 | + log.error(error_msg) |
2062 | + out_data['error_message'] = error_msg |
2063 | + amqp_utils.progress_failed(trigger, out_data) |
2064 | + finally: |
2065 | + # remove from queue so request becomes completed |
2066 | + log.info('Acking the request: {}'.format(msg.body)) |
2067 | + msg.channel.basic_ack(msg.delivery_tag) |
2068 | |
2069 | |
2070 | if __name__ == '__main__': |
2071 | |
2072 | === added file 'branch-source-builder/upload_package.py' |
2073 | --- branch-source-builder/upload_package.py 1970-01-01 00:00:00 +0000 |
2074 | +++ branch-source-builder/upload_package.py 2014-02-11 20:59:14 +0000 |
2075 | @@ -0,0 +1,137 @@ |
2076 | +#!/usr/bin/env python |
2077 | +# Ubuntu CI Engine |
2078 | +# Copyright 2013 Canonical Ltd. |
2079 | + |
2080 | +# This program is free software: you can redistribute it and/or modify it |
2081 | +# under the terms of the GNU Affero General Public License version 3, as |
2082 | +# published by the Free Software Foundation. |
2083 | + |
2084 | +# This program is distributed in the hope that it will be useful, but |
2085 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
2086 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
2087 | +# PURPOSE. See the GNU Affero General Public License for more details. |
2088 | + |
2089 | +# You should have received a copy of the GNU Affero General Public License |
2090 | +# along with this program. If not, see <http://www.gnu.org/licenses/>. |
2091 | + |
2092 | +import argparse |
2093 | +import atexit |
2094 | +from dput.changes import parse_changes_file |
2095 | +import json |
2096 | +import logging |
2097 | +import os |
2098 | +import re |
2099 | +import shutil |
2100 | +import sys |
2101 | +import tempfile |
2102 | +import urllib2 |
2103 | + |
2104 | +from cupstream2distro import packagemanager |
2105 | + |
2106 | + |
2107 | +def parse_arguments(): |
2108 | + parser = argparse.ArgumentParser( |
2109 | + description='Source package upload handler.') |
2110 | + parser.add_argument('-p', '--ppa', |
2111 | + required=True, |
2112 | + help='Target PPA for source package upload.') |
2113 | + parser.add_argument('-s', '--source', |
2114 | + required=True, |
2115 | + action='append', |
2116 | + help='A source package file to include in the upload.') |
2117 | + return parser.parse_args() |
2118 | + |
2119 | + |
2120 | +def cleanup(directory): |
2121 | + if os.path.exists(directory) and os.path.isdir(directory): |
2122 | + logging.info('Removing temp directory: {}'.format(directory)) |
2123 | + shutil.rmtree(directory) |
2124 | + |
2125 | + |
2126 | +def create_temp_dir(): |
2127 | + directory = tempfile.mkdtemp() |
2128 | + atexit.register(cleanup, directory) |
2129 | + return directory |
2130 | + |
2131 | + |
2132 | +def get_url_contents(url): |
2133 | + try: |
2134 | + req = urllib2.Request(url) |
2135 | + f = urllib2.urlopen(req) |
2136 | + response = f.read() |
2137 | + f.close() |
2138 | + return response |
2139 | + except IOError as e: |
2140 | + raise EnvironmentError('Failed to open url [{}]: {}'.format(url, e)) |
2141 | + |
2142 | + |
2143 | +def get_source_files(source_files): |
2144 | + location = create_temp_dir() |
2145 | + local_files = [] |
2146 | + for source in source_files: |
2147 | + # Co-locate all the files into a temp directory |
2148 | + file_name = os.path.basename(source) |
2149 | + file_path = os.path.join(location, file_name) |
2150 | + logging.info('Retrieving source file: {}'.format(file_name)) |
2151 | + with open(file_path, 'w') as out_file: |
2152 | + out_file.write(get_url_contents(source)) |
2153 | + local_files.append(file_name) |
2154 | + return (location, local_files) |
2155 | + |
2156 | + |
2157 | +def parse_dsc_file(dsc_file): |
2158 | + regexp = re.compile("^Architecture: (.*)\n") |
2159 | + with open(dsc_file) as in_file: |
2160 | + for line in in_file: |
2161 | + arch_lists = regexp.findall(line) |
2162 | + if arch_lists: |
2163 | + return arch_lists[0] |
2164 | + |
2165 | + |
2166 | +def parse_source_files(source_directory, source_files): |
2167 | + package_list = [] |
2168 | + for source in source_files: |
2169 | + source_name = os.path.join(source_directory, source) |
2170 | + if source.endswith('changes'): |
2171 | + changes = parse_changes_file(filename=source_name, |
2172 | + directory=source_directory) |
2173 | + package = {} |
2174 | + package['name'] = changes.get('Source') |
2175 | + package['version'] = changes.get('Version') |
2176 | + package['files'] = [] |
2177 | + package['files'].append(changes.get_changes_file()) |
2178 | + for package_file in changes.get_files(): |
2179 | + if os.path.exists(package_file): |
2180 | + if package_file.endswith('.dsc'): |
2181 | + package['architecture'] = parse_dsc_file( |
2182 | + package_file) |
2183 | + package['files'].append(package_file) |
2184 | + else: |
2185 | + raise EnvironmentError( |
2186 | + 'Not found: {}'.format(package_file)) |
2187 | + package_list.append(package) |
2188 | + return package_list |
2189 | + |
2190 | + |
2191 | +def upload_source_packages(ppa, upload_files): |
2192 | + '''Attempts source file upload into the PPA.''' |
2193 | + logging.info('Upload to the ppa: {}'.format(ppa)) |
2194 | + (source_directory, source_files) = get_source_files(upload_files) |
2195 | + source_packages = parse_source_files(source_directory, source_files) |
2196 | + for package in source_packages: |
2197 | + packagemanager.upload_package(package['name'], package['version'], |
2198 | + ppa, source_directory) |
2199 | + # TODO Return the data set of packages uploaded |
2200 | + return source_packages |
2201 | + |
2202 | + |
2203 | +def main(): |
2204 | + logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s " |
2205 | + "%(message)s") |
2206 | + |
2207 | + args = parse_arguments() |
2208 | + upload_source_packages(args.ppa, args.source) |
2209 | + |
2210 | + |
2211 | +if __name__ == '__main__': |
2212 | + sys.exit(main()) |
2213 | |
2214 | === added file 'branch-source-builder/watch_ppa.py' |
2215 | --- branch-source-builder/watch_ppa.py 1970-01-01 00:00:00 +0000 |
2216 | +++ branch-source-builder/watch_ppa.py 2014-02-11 20:59:14 +0000 |
2217 | @@ -0,0 +1,182 @@ |
2218 | +#!/usr/bin/python |
2219 | +# -*- coding: utf-8 -*- |
2220 | +# Copyright (C) 2012 Canonical |
2221 | +# |
2222 | +# Authors: |
2223 | +# Didier Roche |
2224 | +# |
2225 | +# This program is free software; you can redistribute it and/or modify it under |
2226 | +# the terms of the GNU General Public License as published by the Free Software |
2227 | +# Foundation; version 3. |
2228 | +# |
2229 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
2230 | +# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS |
2231 | +# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more |
2232 | +# details. |
2233 | +# |
2234 | +# You should have received a copy of the GNU General Public License along with |
2235 | +# this program; if not, write to the Free Software Foundation, Inc., |
2236 | +# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA |
2237 | + |
2238 | +import argparse |
2239 | +import logging |
2240 | +import os |
2241 | +import sys |
2242 | +import time |
2243 | + |
2244 | +from cupstream2distro import (launchpadmanager, packageinppamanager) |
2245 | +from cupstream2distro.packageinppa import PackageInPPA |
2246 | +from cupstream2distro.packagemanager import list_packages_info_in_str |
2247 | +from cupstream2distro.settings import ( |
2248 | + TIME_BETWEEN_PPA_CHECKS, TIME_BEFORE_STOP_LOOKING_FOR_SOURCE_PUBLISH) |
2249 | + |
2250 | + |
2251 | +def parse_arguments(): |
2252 | + parser = argparse.ArgumentParser( |
2253 | + description="Watch for published package in a ppa", |
2254 | + epilog="""series and ppa options can be set by the corresponding long |
2255 | + option name env variables as well""") |
2256 | + parser.add_argument("-s", "--series", |
2257 | +                        help="Series used to build the package") |
2258 | + parser.add_argument("-p", "--ppa", |
2259 | + help="""PPA to publish this package to (for instance: |
2260 | + 'ubuntu-unity/daily-build')""") |
2261 | + parser.add_argument("-a", "--arch", |
2262 | + default=None, |
2263 | + help="Only consider the provided target") |
2264 | + parser.add_argument("-d", "--destppa", |
2265 | + help="""Consider this destppa instead of |
2266 | + {series}-proposed""") |
2267 | + return parser.parse_args() |
2268 | + |
2269 | + |
2270 | +def get_environment_variables(args): |
2271 | + series = args.series |
2272 | + ppa = args.ppa |
2273 | + if not series: |
2274 | + series = os.getenv("series") |
2275 | + if not ppa: |
2276 | + ppa = os.getenv("ppa") |
2277 | + return (series, ppa) |
2278 | + |
2279 | + |
2280 | +def get_ppa(ppa): |
2281 | + return launchpadmanager.get_ppa(ppa) |
2282 | + |
2283 | + |
2284 | +def watch_ppa(time_start, series, ppa, dest_ppa, arch, upload_list): |
2285 | + # Prepare launchpad connection: |
2286 | + lp_series = launchpadmanager.get_series(series) |
2287 | + monitored_ppa = launchpadmanager.get_ppa(ppa) |
2288 | + if dest_ppa: |
2289 | + dest_archive = get_ppa(dest_ppa) |
2290 | + else: |
2291 | + dest_archive = launchpadmanager.get_ubuntu_archive() |
2292 | + logging.info('Series: {}'.format(lp_series)) |
2293 | + logging.info('Monitoring PPA: {}'.format(monitored_ppa)) |
2294 | + |
2295 | + logging.info('Destination Archive: {}'.format(dest_archive)) |
2296 | + |
2297 | + # Get archs available and archs to ignore |
2298 | + (available_archs_in_ppa, |
2299 | + arch_all_arch) = launchpadmanager.get_available_and_all_archs( |
2300 | + lp_series, monitored_ppa) |
2301 | + (archs_to_eventually_ignore, |
2302 | + archs_to_unconditionally_ignore) = launchpadmanager.get_ignored_archs() |
2303 | + logging.info('Arches available in ppa: {}'.format(available_archs_in_ppa)) |
2304 | + logging.info('All arch in ppa: {}'.format(arch_all_arch)) |
2305 | + logging.info('Arches to eventually ignore: {}'.format( |
2306 | + archs_to_eventually_ignore)) |
2307 | + logging.info('Arches to unconditionally ignore: {}'.format( |
2308 | + archs_to_unconditionally_ignore)) |
2309 | + |
2310 | + # Collecting all packages that have been uploaded to the ppa |
2311 | + packages_not_in_ppa = set() |
2312 | + packages_building = set() |
2313 | + packages_failed = set() |
2314 | + for source_package in upload_list: |
2315 | + source = source_package['name'] |
2316 | + version = source_package['version'] |
2317 | + archs = source_package['architecture'] |
2318 | + logging.info('Inspecting upload: {} - {}'.format(source, version)) |
2319 | + packages_not_in_ppa.add(PackageInPPA(source, version, monitored_ppa, |
2320 | + dest_archive, lp_series, |
2321 | + available_archs_in_ppa, |
2322 | + arch_all_arch, |
2323 | + archs_to_eventually_ignore, |
2324 | + archs_to_unconditionally_ignore, |
2325 | + package_archs=archs)) |
2326 | + |
2327 | +    # packages_not_in_ppa are packages that were uploaded and are expected |
2328 | + # to eventually appear in the ppa. |
2329 | + logging.info('Packages not in PPA: {}'.format( |
2330 | + list_packages_info_in_str(packages_not_in_ppa))) |
2331 | + logging.info('Packages building: {}'.format(packages_building)) |
2332 | + logging.info('Packages failed: {}'.format(packages_failed)) |
2333 | + |
2334 | + # Check the status regularly on all packages |
2335 | + # TODO The following is the original check loop. This can be extracted |
2336 | + # and optimized. |
2337 | + logging.info("Checking the status for {}".format( |
2338 | + list_packages_info_in_str( |
2339 | + packages_not_in_ppa.union(packages_building)))) |
2340 | + packageinppamanager.update_all_packages_status( |
2341 | + packages_not_in_ppa, packages_building, packages_failed, arch) |
2342 | + |
2343 | + status = {'pending': packages_not_in_ppa, |
2344 | + 'building': packages_building, |
2345 | + 'failed': packages_failed} |
2346 | + # if we have no package building or failing and have wait for |
2347 | + # long enough to have some package appearing in the ppa, exit |
2348 | + if (packages_not_in_ppa and not packages_building and |
2349 | + ((time.time() - time_start) > |
2350 | + TIME_BEFORE_STOP_LOOKING_FOR_SOURCE_PUBLISH)): |
2351 | + # TODO return error on the missing packages |
2352 | + logging.info( |
2353 | + "Some source packages were never published in the ppa: " |
2354 | + "{}".format(list_packages_info_in_str(packages_not_in_ppa))) |
2355 | + return (1, status) |
2356 | + |
2357 | + # break out of status check loop if all packages have arrived in |
2358 | + # the ppa and have completed building |
2359 | + if not packages_not_in_ppa and not packages_building: |
2360 | + if packages_failed: |
2361 | + # TODO Return package failure info |
2362 | + logging.info( |
2363 | + "Some of the packages failed to build: {}".format( |
2364 | + list_packages_info_in_str(packages_failed))) |
2365 | + return (1, status) |
2366 | + return (0, status) |
2367 | + |
2368 | + # -1 indicates to retry |
2369 | + # TODO return useful status about what is still in progress |
2370 | + return (-1, status) |
2371 | + |
2372 | + |
2373 | +def main(): |
2374 | + '''Provides usage through the command line.''' |
2375 | + logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s " |
2376 | + "%(message)s") |
2377 | + |
2378 | + args = parse_arguments() |
2379 | + (series, ppa) = get_environment_variables(args) |
2380 | + |
2381 | + if not series or not ppa: |
2382 | + logging.error("Missing compulsory environment variables (ppa, series) " |
2383 | + "watching: {}, series: {}".format(ppa, series)) |
2384 | + return 1 |
2385 | + |
2386 | + # TODO UPLOAD LIST is not ready for use via the CLI yet. |
2387 | + # The original cu2d tools extracted this from files stored locally |
2388 | + time_start = time.time() |
2389 | + while True: |
2390 | + (ret, status) = watch_ppa(time_start, series, ppa, |
2391 | + args.destppa, args.arch, UPLOAD_LIST) |
2392 | + if ret == -1: |
2393 | + time.sleep(TIME_BETWEEN_PPA_CHECKS) |
2394 | + else: |
2395 | + return ret |
2396 | + |
2397 | + |
2398 | +if __name__ == '__main__': |
2399 | + sys.exit(main()) |
2400 | |
2401 | === modified file 'charms/precise/lander-jenkins/templates/jobs/lander_master.xml' |
2402 | --- charms/precise/lander-jenkins/templates/jobs/lander_master.xml 2014-02-06 21:13:48 +0000 |
2403 | +++ charms/precise/lander-jenkins/templates/jobs/lander_master.xml 2014-02-11 20:59:14 +0000 |
2404 | @@ -168,13 +168,6 @@ |
2405 | child_number=$TRIGGERED_BUILD_NUMBER_lander_branch_source_builder |
2406 | wget -O params.json "${JENKINS_URL}job/${LAST_TRIGGERED_JOB_NAME}/${child_number}/artifact/results/params.json" |
2407 | |
2408 | -# TODO - temporary until the bsbuilder works properly |
2409 | -cat > params.json <<EOF |
2410 | -{ |
2411 | - "ppa": "ppa:phablet-team/tools" |
2412 | -} |
2413 | -EOF |
2414 | - |
2415 | /srv/lander_jenkins_sub/lander/bin/lander_merge_parameters.py --result-file params.json --service bsbuilder --output-file all.json --prior-file all.json |
2416 | |
2417 | # Convert to a format that can be passed to the child jobs |
2418 | |
2419 | === modified file 'juju-deployer/branch-source-builder.yaml' |
2420 | --- juju-deployer/branch-source-builder.yaml 2014-01-31 09:15:34 +0000 |
2421 | +++ juju-deployer/branch-source-builder.yaml 2014-02-11 20:59:14 +0000 |
2422 | @@ -18,6 +18,9 @@ |
2423 | options: |
2424 | branch: lp:ubuntu-ci-services-itself |
2425 | main: ./branch-source-builder/run_worker |
2426 | + unit-config: include-base64://configs/unit_config.yaml |
2427 | + packages: dput |
2428 | + pip-packages: dput |
2429 | rabbit: |
2430 | branch: lp:~canonical-ci-engineering/charms/precise/ubuntu-ci-services-itself/rabbitmq-server |
2431 | charm: rabbitmq |
This is a follow up to bsb-dput-2. I screwed up the branch and had to resubmit the whole thing.
> I'm skipping the cupstream2distro code since it's mostly copy/paste.
>
> = run-worker:
>
> 2050 + if amqp_utils.progress_completed(trigger):
> 2051 + log.error('Unable to notify progress-trigger completion of action')
>
> the progress_completed function already logs an error, so you really
> only need to check the return code for it if you are going to handle
> failure in some special way. Since we don't here, I'd delete the
> log.error line.
Fixed.
> You also need to supply a payload to the "progress_completed" call. The
> data returned from this function winds up being the data that will be
> passed to the image builder. I think it needs to contain:
>
> {
>     "ppa": "<foo>"
> }
>
> so we can remove the hack from:
>
> <http://bazaar.launchpad.net/~canonical-ci-engineering/ubuntu-ci-services-itself/trunk/view/head:/charms/precise/lander-jenkins/templates/jobs/lander_master.xml#L170>
Fixed, good catch on removing the hack.
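For reference, the completion payload that run_worker now hands to progress_completed can be sketched like this (a minimal sketch; `build_completion_payload` is a hypothetical helper, the keys mirror the `out_data` dict in the branch, and `artifacts` is still a placeholder):

```python
import json


def build_completion_payload(params):
    """Build the data passed to progress_completed so downstream
    services (e.g. the image builder) receive the PPA to use."""
    return {
        'source_packages': params['source_packages'],
        'ppa': params['ppa'],
        'artifacts': [],  # TODO: populated with build artifacts later
    }


payload = build_completion_payload(
    {'source_packages': ['foo_1.0.dsc'], 'ppa': 'ppa:phablet-team/tools'})
print(json.dumps(payload, sort_keys=True))
```

Because the "ppa" key travels in the payload itself, the hardcoded params.json written by lander_master.xml is no longer needed.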
> 2052 + except EnvironmentError as e:
>
> I think I'd just catch "Exception" here. That way nothing slips through
> in a way where we won't call amqp_utils.progress_failed.
Fixed
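The broad-catch pattern agreed on here can be sketched as follows (assumptions: `process`, `notify_failed`, and `ack` are hypothetical stand-ins for the worker body, amqp_utils.progress_failed, and the basic_ack call):

```python
def handle_message(body, process, notify_failed, ack):
    """Process one queued message; on ANY exception report failure,
    but always ack so the request is marked completed."""
    out_data = {}
    try:
        process(body, out_data)
    except Exception as e:  # broad catch: nothing slips past progress_failed
        out_data['error_message'] = 'Exception: {}'.format(e)
        notify_failed(out_data)
    finally:
        ack()  # remove from queue regardless of outcome
    return out_data
```

The finally clause mirrors run_worker: the message is acked even when the upload or PPA watch blows up, so a poisoned request cannot be redelivered forever.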
> = upload_package:
>
> 2141 + file_name = source.split('/')[-1]
>
> I think you can just use os.path.basename to make it more obvious what
> your intent is:
>
> os.path.basename(source)
Fixed, I also removed the hack to remove the container name from the filename (the data-store module has been updated to just use the filename).
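The two spellings are equivalent for forward-slash paths and URLs without a trailing slash; os.path.basename just states the intent (the URL below is a made-up example):

```python
import os.path

# Hypothetical source-file URL, as handed to get_source_files()
source = 'http://example.com/container/foo_1.0-1.dsc'

# Both yield the final path component; basename reads as what it means.
assert source.split('/')[-1] == os.path.basename(source)
print(os.path.basename(source))  # -> foo_1.0-1.dsc
```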
> 2186 + logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s "
> 2187 + "%(message)s")
>
> I think that should be moved to "main" instead. Otherwise you might just
> override the logging config of the process that's using this API.
Fixed, also fixed in watch_ppa.py
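The fix amounts to the usual split: module code only grabs a named logger, and the root logger is configured solely under the CLI entry point, e.g. (a sketch; `upload_source_packages` here is a trivial stand-in for the real function):

```python
import logging

# Safe at import time: getting a named logger configures nothing globally.
log = logging.getLogger(__name__)


def upload_source_packages(ppa, upload_files):
    # Library code logs through the module logger and inherits whatever
    # handlers the embedding process (e.g. run_worker) has configured.
    log.info('Upload to the ppa: %s', ppa)


def main():
    # Only the command-line entry point touches the root logger, so
    # importing this module never overrides the host app's logging setup.
    logging.basicConfig(
        level=logging.INFO, format='%(asctime)s %(levelname)s %(message)s')
    upload_source_packages('ppa:example/ppa', [])


if __name__ == '__main__':
    main()
```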