Merge lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage into lp:ubuntu-ci-services-itself

Proposed by Evan
Status: Needs review
Proposed branch: lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage
Merge into: lp:ubuntu-ci-services-itself
Diff against target: 1888 lines (+633/-308)
16 files modified
branch-source-builder/bsbuilder/tests/test_style.py (+8/-4)
branch-source-builder/cupstream2distro/branchhandling.py (+75/-40)
branch-source-builder/cupstream2distro/launchpadmanager.py (+38/-18)
branch-source-builder/cupstream2distro/packageinppa.py (+86/-49)
branch-source-builder/cupstream2distro/packageinppamanager.py (+24/-12)
branch-source-builder/cupstream2distro/packagemanager.py (+204/-102)
branch-source-builder/cupstream2distro/settings.py (+11/-5)
branch-source-builder/cupstream2distro/silomanager.py (+30/-11)
branch-source-builder/cupstream2distro/stack.py (+102/-40)
branch-source-builder/cupstream2distro/stacks.py (+9/-4)
branch-source-builder/cupstream2distro/tools.py (+20/-10)
charms/precise/python-django/hooks/hooks.py (+1/-2)
image-builder/imagebuilder/tests/test_style.py (+8/-4)
image-builder/run_worker (+1/-1)
lander/lander/tests/test_style.py (+8/-4)
test_runner/tstrun/tests/test_style.py (+8/-2)
To merge this branch: bzr merge lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage
Reviewer Review Type Date Requested Status
PS Jenkins bot (community) continuous-integration Needs Fixing
Canonical CI Engineering Pending
Review via email: mp+207801@code.launchpad.net

Commit message

Fix a large number of pep8 issues in cupstream2distro. Cover run_worker and other out-of-module scripts with the pep8 and pyflakes tests.

Description of the change

I noticed that pyflakes wasn't running against watch_ppa.py, which lives one level above the bsbuilder module.

While fixing this and running the tests, I was having a hard time pulling useful failures from the noise generated by cupstream2distro. So I fixed all of its pep8 and pyflakes issues.

We should be very careful in landing this branch. cupstream2distro is completely untested code and I made some logic changes to it.

You'll notice that there are symlinks to the run_worker script in a few modules. These exist for three reasons:
 1. pep8 and pyflakes are hardcoded in uci-tests to look for *.py files.
 2. Importing `run_worker` using imp creates a run_workerc file unless you tell Python to give the compiled code a different name (by providing an open file and path argument).
 3. Our test harness imports all the run_worker scripts under the same namespace, so you end up using whichever was imported last. Providing a uniquely named symlink fixes this.
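The unique-name import trick described in point 3 can be sketched as follows. This uses the modern importlib machinery rather than the Python 2 `imp` module the branch deals with; the stand-in extension-less `run_worker` file, its `SERVICE` variable, and the `bsbuilder_run_worker` alias are illustrative, not the project's actual code:

```python
import importlib.util
import os
import tempfile
from importlib.machinery import SourceFileLoader

# Create a stand-in, extension-less "run_worker" script for illustration.
workdir = tempfile.mkdtemp()
script = os.path.join(workdir, 'run_worker')
with open(script, 'w') as f:
    f.write("SERVICE = 'bsbuilder'\n")

# Load it under a unique module name so that several run_worker scripts
# can coexist instead of shadowing each other in sys.modules.
loader = SourceFileLoader('bsbuilder_run_worker', script)
spec = importlib.util.spec_from_loader('bsbuilder_run_worker', loader)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)
print(mod.SERVICE)  # -> bsbuilder
```

With Python 2's `imp`, the equivalent was `imp.load_module` with an explicit open file and path argument, which is what gives the compiled code a distinct name and avoids the stray `run_workerc` file mentioned in point 2.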

Note that there is still a pyflakes failure with this branch. That is bug 1283449.

Revision history for this message
PS Jenkins bot (ps-jenkins) wrote :

FAILED: Continuous integration, rev:280
No commit message was specified in the merge proposal. Click on the following link and set the commit message (if you want a jenkins rebuild you need to trigger it yourself):
https://code.launchpad.net/~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage/+merge/207801/+edit-commit-message

http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/217/
Executed test runs:

Click here to trigger a rebuild:
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/217/rebuild

review: Needs Fixing (continuous-integration)
281. By Evan

Expand the pyflakes/pep8 coverage for the test_runner as well.

Revision history for this message
PS Jenkins bot (ps-jenkins) wrote :

FAILED: Continuous integration, rev:281
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/219/
Executed test runs:

Click here to trigger a rebuild:
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/219/rebuild

review: Needs Fixing (continuous-integration)
Revision history for this message
Ursula Junque (ursinha) wrote :

I like how elegantly this handles the huge messages (the part that was worrying me most). I've been working on adding tests to this code before actually making changes like these: the modules are already part of upstream and "only" need to be ported (at least basic unit tests), and having tests in place would avoid headaches in case things started to fail because we accidentally changed the logic somewhere. If you agree to hold this for a bit, I think I can land the tests first; then you can land this branch safely.

Revision history for this message
Evan (ev) wrote :

Yup, I think that's entirely reasonable. Let's put this on hold until your branch lands.

Unmerged revisions

281. By Evan

Expand the pyflakes/pep8 coverage for the test_runner as well.

280. By Evan

Fix the style tests. Specifying more than one module under the same directory causes it to be tested twice by pep8 and pyflakes. Modules imported under the same name will override each other.

279. By Evan

Fix pep8 issues in cupstream2distro packageinppa module.

278. By Evan

Fix pep8 issues in cupstream2distro packageinppamanager module.

277. By Evan

Fix pep8 issues in cupstream2distro packagemanager module.

276. By Evan

Fix pep8 issues in cupstream2distro silomanager module.

275. By Evan

Fix pep8 issues in cupstream2distro settings module.

274. By Evan

Fix pep8 issues in cupstream2distro stacks module.

273. By Evan

Oops. One more in tools.py.

272. By Evan

Fix pep8 issues in cupstream2distro tools module.

Preview Diff

=== modified file 'branch-source-builder/bsbuilder/tests/test_style.py'
--- branch-source-builder/bsbuilder/tests/test_style.py 2014-02-15 12:06:40 +0000
+++ branch-source-builder/bsbuilder/tests/test_style.py 2014-02-23 16:37:01 +0000
@@ -14,15 +14,19 @@
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 
 from ucitests import styles
-
-import bsbuilder
+# If we just import run_worker, we'll end up testing whichever run_worker was
+# last imported, so import it under a different name.
+import bsbuilder_run_worker
 
 
 class TestPep8(styles.TestPep8):
 
-    packages = [bsbuilder]
+    # uci-tests will scan all subdirectories that this module is found in, so
+    # we do not need to explicitly provide bsbuilder. Doing so would cause the
+    # tests to be run twice.
+    packages = [bsbuilder_run_worker]
 
 
 class TestPyflakes(styles.TestPyflakes):
 
-    packages = [bsbuilder]
+    packages = [bsbuilder_run_worker]
 
=== added symlink 'branch-source-builder/bsbuilder_run_worker.py'
=== target is u'run_worker'
=== modified file 'branch-source-builder/cupstream2distro/branchhandling.py'
--- branch-source-builder/cupstream2distro/branchhandling.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/branchhandling.py 2014-02-23 16:37:01 +0000
@@ -23,13 +23,17 @@
 import os
 import re
 import subprocess
+from subprocess import PIPE
 
-from .settings import BRANCH_URL, IGNORECHANGELOG_COMMIT, PACKAGING_MERGE_COMMIT_MESSAGE, PROJECT_CONFIG_SUFFIX, SILO_PACKAGING_RELEASE_COMMIT_MESSAGE
+from .settings import (BRANCH_URL, IGNORECHANGELOG_COMMIT,
+                       PACKAGING_MERGE_COMMIT_MESSAGE, PROJECT_CONFIG_SUFFIX,
+                       SILO_PACKAGING_RELEASE_COMMIT_MESSAGE)
 
 
 def get_branch(branch_url, dest_dir):
     '''Grab a branch'''
-    instance = subprocess.Popen(["bzr", "branch", branch_url, dest_dir], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    cmd = ["bzr", "branch", branch_url, dest_dir]
+    instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
     (stdout, stderr) = instance.communicate()
     if instance.returncode != 0:
         raise Exception(stderr.decode("utf-8").strip())
@@ -37,15 +41,18 @@
 
 def get_tip_bzr_revision():
     '''Get latest revision in bzr'''
-    instance = subprocess.Popen(["bzr", "log", "-c", "-1", "--line"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    cmd = ["bzr", "log", "-c", "-1", "--line"]
+    instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
     (stdout, stderr) = instance.communicate()
     if instance.returncode != 0:
         raise Exception(stderr.decode("utf-8").strip())
     return (int(stdout.split(':')[0]))
 
 
-def collect_author_commits(content_to_parse, bugs_to_skip, additional_stamp=""):
-    '''return a tuple of a dict with authors and commits message from the content to parse
+def collect_author_commits(content_to_parse, bugs_to_skip,
+                           additional_stamp=""):
+    '''Return a tuple of a dict with authors and commits message from the
+    content to parse.
 
     bugs_to_skip is a set of bugs we need to skip
 
@@ -60,15 +67,17 @@
     commit_message_stenza = False
     for line in content_to_parse.splitlines():
         # new revision, collect what we have found
-        if line.startswith("------------------------------------------------------------"):
-            # try to decipher a special case: we have some commits which were already in bugs_to_skip,
-            # so we eliminate them.
+        dash = "------------------------------------------------------------"
+        if line.startswith(dash):
+            # try to decipher a special case: we have some commits which were
+            # already in bugs_to_skip, so we eliminate them.
             # Also ignore when having IGNORECHANGELOG_COMMIT
-            if (current_bugs and not (current_bugs - bugs_to_skip)) or IGNORECHANGELOG_COMMIT in current_commit:
-                current_authors = set()
-                current_commit = ""
-                current_bugs = set()
-                continue
+            if ((current_bugs and not (current_bugs - bugs_to_skip))
+                    or IGNORECHANGELOG_COMMIT in current_commit):
+                current_authors = set()
+                current_commit = ""
+                current_bugs = set()
+                continue
             current_bugs -= bugs_to_skip
             commit_message = current_commit + _format_bugs(current_bugs)
             for author in current_authors:
@@ -79,7 +88,8 @@
             current_commit = ""
             current_bugs = set()
 
-        # we ignore this commit if we have a changelog provided as part of the diff
+        # we ignore this commit if we have a changelog provided as part of the
+        # diff
         if line.startswith("=== modified file 'debian/changelog'"):
             current_authors = set()
             current_commit = ""
@@ -94,14 +104,15 @@
         elif commit_message_stenza:
             if line.startswith("diff:"):
                 commit_message_stenza = False
-                current_commit, current_bugs = _extract_commit_bugs(current_commit, additional_stamp)
+                args = (current_commit, additional_stamp)
+                current_commit, current_bugs = _extract_commit_bugs(*args)
             else:
                 line = line[2:]  # Dedent the message provided by bzr
                 if line[0:2] in ('* ', '- '):  # paragraph line.
                     line = line[2:]  # Remove bullet
                 if line[-1] != '.':  # Grammar nazi...
                     line += '.'  # ... or the lines will be merged.
                 line = line + ' '  # Add a space to preserve lines
                 current_commit += line
         # Maybe add something like that
         #for content in mp.commit_message.split('\n'):
@@ -127,10 +138,13 @@
 
 
 def _extract_commit_bugs(commit_message, additional_stamp=""):
-    '''extract relevant commit message part and bugs number from a commit message'''
+    '''extract relevant commit message part and bugs number from a commit
+    message'''
 
     current_bugs = _return_bugs(commit_message)
-    changelog_content = " ".join(commit_message.rsplit('Fixes: ')[0].rsplit('Approved by ')[0].split())
+    changelog_content = commit_message.rsplit('Fixes: ')[0]
+    changelog_content = changelog_content.rsplit('Approved by ')[0].split()
+    changelog_content = " ".join(changelog_content)
     if additional_stamp:
         changelog_content = changelog_content + " " + additional_stamp
     return (changelog_content, current_bugs)
@@ -149,7 +163,8 @@
     # #12345 (but not 12345 for false positive)
     # Support multiple bugs per commit
     bug_numbers = set()
-    bug_regexp = re.compile("((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})", re.IGNORECASE)
+    pattern = "((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})"
+    bug_regexp = re.compile(pattern, re.IGNORECASE)
     for match in bug_regexp.findall(string):
         logging.debug("Bug regexp match: {}".format(match[-1]))
         bug_numbers.add(int(match[-1]))
@@ -170,7 +185,9 @@
 def return_log_diff(starting_rev):
     '''Return the relevant part of the cvs log since starting_rev'''
 
-    instance = subprocess.Popen(["bzr", "log", "-r", "{}..".format(starting_rev), "--show-diff", "--forward"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    rev = "{}..".format(starting_rev)
+    cmd = ["bzr", "log", "-r", rev, "--show-diff", "--forward"]
+    instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
     (stdout, stderr) = instance.communicate()
     if instance.returncode != 0:
         raise Exception(stderr.decode("utf-8").strip())
@@ -178,8 +195,10 @@
 
 
 def return_log_diff_since_last_release(content_to_parse):
-    '''From a bzr log content, return only the log diff since the latest release'''
-    after_release = content_to_parse.split(SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(''))[-1]
+    '''From a bzr log content, return only the log diff since the latest
+    release'''
+    formatted = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format('')
+    after_release = content_to_parse.split(formatted)[-1]
     sep = '------------------------------------------------------------'
     sep_index = after_release.find(sep)
     if sep_index != 1:
@@ -190,10 +209,11 @@
 def commit_release(new_package_version, tip_bzr_rev=None):
     '''Commit latest release'''
     if not tip_bzr_rev:
-        message = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(new_package_version)
+        msg = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(new_package_version)
     else:
-        message = "Releasing {}, based on r{}".format(new_package_version, tip_bzr_rev)
-    if subprocess.call(["bzr", "commit", "-m", message]) != 0:
+        msg = "Releasing {}, based on r{}"
+        msg = msg.format(new_package_version, tip_bzr_rev)
+    if subprocess.call(["bzr", "commit", "-m", msg]) != 0:
         raise Exception("The above command returned an error.")
 
 
@@ -214,20 +234,26 @@
     env["BZR_EDITOR"] = "echo"
 
     os.chdir(source_package_name)
-    if subprocess.call(["bzr", "push", BRANCH_URL.format(source_package_name, version.replace("~", "").replace(":", "")), "--overwrite"]) != 0:
+    args = (source_package_name, version.replace("~", "").replace(":", ""))
+    loc = BRANCH_URL.format(*args)
+    if subprocess.call(["bzr", "push", loc, "--overwrite"]) != 0:
         raise Exception("The push command returned an error.")
-    mergeinstance = subprocess.Popen(["bzr", "lp-propose-merge", parent_branch, "-m", PACKAGING_MERGE_COMMIT_MESSAGE.format(version, tip_rev, branch), "--approve"], stdin=subprocess.PIPE, env=env)
+    msg = PACKAGING_MERGE_COMMIT_MESSAGE.format(version, tip_rev, branch)
+    cmd = ["bzr", "lp-propose-merge", parent_branch, "-m", msg, "--approve"]
+    mergeinstance = subprocess.Popen(cmd, stdin=PIPE, env=env)
     mergeinstance.communicate(input="y")
     if mergeinstance.returncode != 0:
         raise Exception("The lp-propose command returned an error.")
     os.chdir('..')
 
 
-def merge_branch_with_parent_into(local_branch_uri, lp_parent_branch, dest_uri, commit_message, revision):
+def merge_branch_with_parent_into(local_branch_uri, lp_parent_branch, dest_uri,
+                                  commit_message, revision):
     """Merge local branch into lp_parent_branch at revision"""
     success = False
     cur_dir = os.path.abspath('.')
-    subprocess.call(["bzr", "branch", "-r", str(revision), lp_parent_branch, dest_uri])
+    cmd = ["bzr", "branch", "-r", str(revision), lp_parent_branch, dest_uri]
+    subprocess.call(cmd)
     os.chdir(dest_uri)
     if subprocess.call(["bzr", "merge", local_branch_uri]) == 0:
         subprocess.call(["bzr", "commit", "-m", commit_message])
@@ -236,12 +262,14 @@
     return success
 
 
-def merge_branch(uri_to_merge, lp_parent_branch, commit_message, authors=set()):
+def merge_branch(uri_to_merge, lp_parent_branch, commit_message,
+                 authors=set()):
     """Resync with targeted branch if possible"""
     success = False
     cur_dir = os.path.abspath('.')
     os.chdir(uri_to_merge)
-    lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:")
+    args = ("https://code.launchpad.net/", "lp:")
+    lp_parent_branch = lp_parent_branch.replace(*args)
     if subprocess.call(["bzr", "merge", lp_parent_branch]) == 0:
         cmd = ["bzr", "commit", "-m", commit_message, "--unchanged"]
         for author in authors:
@@ -251,12 +279,14 @@
     os.chdir(cur_dir)
     return success
 
+
 def push_to_branch(source_uri, lp_parent_branch, overwrite=False):
     """Push source to parent branch"""
     success = False
     cur_dir = os.path.abspath('.')
     os.chdir(source_uri)
-    lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:")
+    args = ("https://code.launchpad.net/", "lp:")
+    lp_parent_branch = lp_parent_branch.replace(*args)
     command = ["bzr", "push", lp_parent_branch]
     if overwrite:
         command.append("--overwrite")
@@ -265,16 +295,21 @@
     os.chdir(cur_dir)
     return success
 
+
 def grab_committers_compared_to(source_uri, lp_branch_to_scan):
     """Return unique list of committers for a given branch"""
     committers = set()
     cur_dir = os.path.abspath('.')
     os.chdir(source_uri)
-    lp_branch_to_scan = lp_branch_to_scan.replace("https://code.launchpad.net/", "lp:")
-    instance = subprocess.Popen(["bzr", "missing", lp_branch_to_scan, "--other"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    args = ("https://code.launchpad.net/", "lp:")
+    lp_branch_to_scan = lp_branch_to_scan.replace(*args)
+    cmd = ["bzr", "missing", lp_branch_to_scan, "--other"]
+    instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
     (stdout, stderr) = instance.communicate()
     if stderr != "":
-        raise Exception("bzr missing on {} returned a failure: {}".format(lp_branch_to_scan, stderr.decode("utf-8").strip()))
+        msg = "bzr missing on {} returned a failure: {}"
+        msg = msg.format(lp_branch_to_scan, stderr.decode("utf-8").strip())
+        raise Exception(msg)
     committer_regexp = re.compile("\ncommitter: (.*)\n")
     for match in committer_regexp.findall(stdout):
         for committer in match.split(', '):
 
=== modified file 'branch-source-builder/cupstream2distro/launchpadmanager.py'
--- branch-source-builder/cupstream2distro/launchpadmanager.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/launchpadmanager.py 2014-02-23 16:37:01 +0000
@@ -25,10 +25,13 @@
 import os
 launchpad = None
 
-from .settings import ARCHS_TO_EVENTUALLY_IGNORE, ARCHS_TO_UNCONDITIONALLY_IGNORE, VIRTUALIZED_PPA_ARCH, CRED_FILE_PATH, COMMON_LAUNCHPAD_CACHE_DIR
-
-
-def get_launchpad(use_staging=False, use_cred_file=os.path.expanduser(CRED_FILE_PATH)):
+from .settings import (ARCHS_TO_EVENTUALLY_IGNORE,
+                       ARCHS_TO_UNCONDITIONALLY_IGNORE, VIRTUALIZED_PPA_ARCH,
+                       CRED_FILE_PATH, COMMON_LAUNCHPAD_CACHE_DIR)
+
+
+def get_launchpad(use_staging=False,
+                  use_cred_file=os.path.expanduser(CRED_FILE_PATH)):
     '''Get THE Launchpad'''
     global launchpad
     if not launchpad:
@@ -42,14 +45,20 @@
             os.makdedirs(launchpadlib_dir)
 
         if use_cred_file:
-            launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"],
-                                             version='devel',  # devel because copyPackage is only available there
-                                             credentials_file=use_cred_file,
-                                             launchpadlib_dir=launchpadlib_dir)
+            launchpad = Launchpad.login_with(
+                'cupstream2distro', server,
+                allow_access_levels=["WRITE_PRIVATE"],
+                # devel because copyPackage is only available there
+                version='devel',
+                credentials_file=use_cred_file,
+                launchpadlib_dir=launchpadlib_dir)
         else:
-            launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"],
-                                             version='devel',  # devel because copyPackage is only available there
-                                             launchpadlib_dir=launchpadlib_dir)
+            launchpad = Launchpad.login_with(
+                'cupstream2distro', server,
+                allow_access_levels=["WRITE_PRIVATE"],
+                # devel because copyPackage is only available there
+                version='devel',
+                launchpadlib_dir=launchpadlib_dir)
 
         return launchpad
 
@@ -77,7 +86,8 @@
         bug_title_sets = set()
         for bug in author_bugs[author]:
             try:
-                bug_title_sets.add("{} (LP: #{})".format(lp.bugs[bug].title, bug))
+                title = "{} (LP: #{})".format(lp.bugs[bug].title, bug)
+                bug_title_sets.add(title)
             except KeyError:
                 # still list non existing or if launchpad timeouts bugs
                 bug_title_sets.add(u"Fix LP: #{}".format(bug))
@@ -102,9 +112,12 @@
             bug = lp.bugs[bug_num]
             bug.addTask(target=package)
             bug.lp_save()
-        except (KeyError, lazr.restfulclient.errors.BadRequest, lazr.restfulclient.errors.ServerError):
-            # ignore non existing or available bugs
-            logging.info("Can't synchronize upstream/downstream bugs for bug #{}. Not blocking on that.".format(bug_num))
+        except (KeyError, lazr.restfulclient.errors.BadRequest,
+                lazr.restfulclient.errors.ServerError):
+            # ignore non existing or available bugs
+            m = ("Can't synchronize upstream/downstream bugs "
+                 "for bug #{}. Not blocking on that.".format(bug_num))
+            logging.info(m)
 
 
 def get_available_and_all_archs(series, ppa=None):
@@ -115,7 +128,8 @@
         arch_all_arch = VIRTUALIZED_PPA_ARCH[0]
     else:
         for arch in series.architectures:
-            # HACK: filters armel as it's still seen as available on raring: https://launchpad.net/bugs/1077257
+            # HACK: filters armel as it's still seen as available on raring:
+            # https://launchpad.net/bugs/1077257
             if arch.architecture_tag == "armel":
                 continue
             available_arch.add(arch.architecture_tag)
@@ -135,23 +149,29 @@
     if ppa_name.startswith("ppa:"):
         ppa_name = ppa_name[4:]
     ppa_dispatch = ppa_name.split("/")
-    return get_launchpad().people[ppa_dispatch[0]].getPPAByName(name=ppa_dispatch[1])
+    person = get_launchpad().people[ppa_dispatch[0]]
+    return person.getPPAByName(name=ppa_dispatch[1])
+
 
 def is_series_current(series_name):
     '''Return if series_name is the edge development version'''
     return get_ubuntu().current_series.name == series_name
 
+
 def get_resource_from_url(url):
     '''Return a lp resource from a launchpad url'''
     lp = get_launchpad()
-    url = lp.resource_type_link.replace("/#service-root", "") + url.split("launchpad.net")[1]
+    url = lp.resource_type_link.replace("/#service-root", "")
+    url = url + url.split("launchpad.net")[1]
     return lp.load(url)
 
+
 def get_resource_from_token(url):
     '''Return a lp resource from a launchpad token'''
     lp = get_launchpad()
     return lp.load(url)
 
+
 def is_dest_ubuntu_archive(series_link):
     '''return if series_link is the ubuntu archive'''
     return series_link == get_ubuntu_archive().self_link
 
=== modified file 'branch-source-builder/cupstream2distro/packageinppa.py'
--- branch-source-builder/cupstream2distro/packageinppa.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/packageinppa.py 2014-02-23 16:37:01 +0000
@@ -27,8 +27,9 @@
     (BUILDING, FAILED, PUBLISHED) = range(3)
 
     def __init__(self, source_name, version, ppa, destarchive, series,
-                 available_archs_in_ppa, arch_all_arch, archs_to_eventually_ignore,
-                 archs_to_unconditionually_ignore, package_archs=None):
+                 available_archs_in_ppa, arch_all_arch,
+                 archs_to_eventually_ignore, archs_to_unconditionually_ignore,
+                 package_archs=None):
         self.source_name = source_name
         self.version = version
         self.series = series
@@ -39,7 +40,8 @@
         # Get archs we should look at
        version_for_source_file = version.split(':')[-1]
        if not package_archs:
-            dsc_filename = "{}_{}.dsc".format(source_name, version_for_source_file)
+            tmpl = "{}_{}.dsc"
+            dsc_filename = tmpl.format(source_name, version_for_source_file)
             regexp = re.compile("^Architecture: (.*)\n")
             for line in open(dsc_filename):
                 arch_lists = regexp.findall(line)
@@ -57,21 +59,28 @@
             archs_supported_by_package = set()
             for arch in package_archs.split():
                 archs_supported_by_package.add(arch)
-            self.archs = archs_supported_by_package.intersection(available_archs_in_ppa)
-        # ignore some eventual archs if doesn't exist in latest published version in dest
+            intersection = archs_supported_by_package.intersection
+            self.archs = intersection(available_archs_in_ppa)
+        # ignore some eventual archs if doesn't exist in latest published
+        # version in dest
         archs_to_eventually_ignore = archs_to_eventually_ignore.copy()
         if archs_to_eventually_ignore:
             try:
-                previous_source = destarchive.getPublishedSources(exact_match=True, source_name=self.source_name,
-                                                                  distro_series=self.series, status="Published")[0]
+                kw = {'exact_match': True, 'source_name': self.source_name,
+                      'distro_series': self.series, 'status': "Published"}
+                previous_source = destarchive.getPublishedSources(**kw)[0]
                 for binary in previous_source.getPublishedBinaries():
-                    if binary.architecture_specific and binary.distro_arch_series.architecture_tag in archs_to_eventually_ignore:
-                        archs_to_eventually_ignore -= set([binary.distro_arch_series.architecture_tag])
+                    if (binary.architecture_specific and
+                            binary.distro_arch_series.architecture_tag in
+                            archs_to_eventually_ignore):
+                        rem = [binary.distro_arch_series.architecture_tag]
+                        archs_to_eventually_ignore -= set(rem)
                     if not archs_to_eventually_ignore:
                         break
 
             except IndexError:
-                # no package in dest, don't wait on any archs_to_eventually_ignore
+                # no package in dest, don't wait on any
+                # archs_to_eventually_ignore
                 pass
         # remove from the inspection remaining archs to ignore
         if archs_to_eventually_ignore:
@@ -79,11 +88,9 @@
         if archs_to_unconditionually_ignore:
             self.archs -= archs_to_unconditionually_ignore

-
     def __repr__(self):
         return '{} - {}'.format(self.source_name, self.version)

-
     def get_status(self, on_particular_arch=None):
         '''Look at the package status in the ppa

@@ -120,11 +127,15 @@

         for arch in self.archs.copy():
             if os.path.isfile("{}.{}.ignore".format(self.source_name, arch)):
-                logging.warning("Request to ignore {} on {}.".format(self.source_name, arch))
+                tmpl = "Request to ignore {} on {}."
+                logging.warning(tmpl.format(self.source_name, arch))
                 try:
                     self.archs.remove(arch)
                 except ValueError:
-                    logging.warning("Request to ignore {} on {} has been proceeded, but this one wasn't in the list we were monitor for.".format(self.source_name, arch))
+                    tmpl = ("Request to ignore {} on {} has been proceeded, "
+                            "but this one wasn't in the list we were monitor "
+                            "for.")
+                    logging.warning(tmpl.format(self.source_name, arch))
                 try:
                     self.current_status.pop(arch)
                 except KeyError:
@@ -137,10 +148,12 @@

         # first step, get the source published
         if not self.current_status:
-            (self.current_status, self.source) = self._get_status_for_source_package_in_ppa()
+            status = self._get_status_for_source_package_in_ppa()
+            (self.current_status, self.source) = status
         # check the binary status
         if self.current_status:
-            self.current_status = self._get_status_for_binary_packages_in_ppa()
+            status = self._get_status_for_binary_packages_in_ppa()
+            self.current_status = status

     def _get_status_for_source_package_in_ppa(self):
         '''Return current_status for source package in ppa.
@@ -149,13 +162,16 @@
         - None -> not visible yet
         - BUILDING -> currently Building (or waiting to build)
         - FAILED -> Build failed for this arch or has been canceled
-        - PUBLISHED -> All packages (including arch:all from other archs) published.
+        - PUBLISHED -> All packages (including arch:all from other archs)
+          published.

-        Only the 2 first status are returned by this call. See _get_status_for_binary_packages_in_ppa
-        for the others.'''
+        Only the 2 first status are returned by this call. See
+        _get_status_for_binary_packages_in_ppa for the others.'''

         try:
-            source = self.ppa.getPublishedSources(exact_match=True, source_name=self.source_name, version=self.version, distro_series=self.series)[0]
+            kw = {'exact_match': True, 'source_name': self.source_name,
+                  'version': self.version, 'distro_series': self.series}
+            source = self.ppa.getPublishedSources(**kw)[0]
             logging.info("Source available in ppa")
             current_status = {}
             for arch in self.archs:
@@ -172,26 +188,36 @@
         - None -> not visible yet
         - BUILDING -> currently Building (or waiting to build)
         - FAILED -> Build failed for this arch or has been canceled
-        - PUBLISHED -> All packages (including arch:all from other archs) published.
-
-        Only the 3 last statuses are returned by this call. See _get_status_for_source_package_in_ppa
-        for the other.'''
-
-        # Try to see if all binaries availables for this arch are built, including arch:all on other archs
+        - PUBLISHED -> All packages (including arch:all from other archs)
+          published.
+
+        Only the 3 last statuses are returned by this call. See
+        _get_status_for_source_package_in_ppa for the other.'''
+
+        # Try to see if all binaries availables for this arch are built,
+        # including arch:all on other archs
         status = self.current_status
-        at_least_one_published_binary = False
         for binary in self.source.getPublishedBinaries():
-            at_least_one_published_binary = True
             # all binaries for an arch are published at the same time
-            # launchpad is lying, it's telling that archs not in the ppa are built (for arch:all). Even for non supported arch!
-            # for instance, we can have the case of self.arch_all_arch (arch:all), built before the others and amd64 will be built for it
-            if binary.status == "Published" and (binary.distro_arch_series.architecture_tag == self.arch_all_arch or
-                                                 (binary.distro_arch_series.architecture_tag != self.arch_all_arch and binary.architecture_specific)):
-                status[binary.distro_arch_series.architecture_tag] = self.PUBLISHED
+            # launchpad is lying, it's telling that archs not in the ppa are
+            # built (for arch:all). Even for non supported arch!
+            # for instance, we can have the case of self.arch_all_arch
+            # (arch:all), built before the others and amd64 will be built for
+            # it
+            published = binary.status == "Published"
+            arch_tag = binary.distro_arch_series.architecture_tag
+            not_arch_all = arch_tag != self.arch_all_arch
+            arch_all = arch_tag == self.arch_all_arch
+            arch_specific = binary.architecture_specific
+            if published and (arch_all or (not_arch_all and arch_specific)):
+                status[arch_tag] = self.PUBLISHED

-        # Looking for builds on archs still BUILDING (just loop on builds once to avoid too many lp requests)
+        # Looking for builds on archs still BUILDING (just loop on builds once
+        # to avoid too many lp requests)
         needs_checking_build = False
-        build_state_failed = ('Failed to build', 'Chroot problem', 'Failed to upload', 'Cancelled build', 'Build for superseded Source')
+        build_state_failed = ('Failed to build', 'Chroot problem',
+                              'Failed to upload', 'Cancelled build',
+                              'Build for superseded Source')
         for arch in self.archs:
             if self.current_status[arch] == self.BUILDING:
                 needs_checking_build = True
@@ -202,24 +228,35 @@
                     continue
                 if self.current_status[build.arch_tag] == self.BUILDING:
                     if build.buildstate in build_state_failed:
-                        logging.error("{}: Build {} ({}) failed because of {}".format(build.arch_tag, build.title,
-                                                                                      build.web_link, build.buildstate))
+                        tmpl = "{}: Build {} ({}) failed because of {}"
+                        logging.error(tmpl.format(build.arch_tag, build.title,
+                                      build.web_link, build.buildstate))
                         status[build.arch_tag] = self.FAILED
-                    # Another launchpad trick: if a binary arch was published, but then is superseeded, getPublishedBinaries() won't list
-                    # those binaries anymore. So it's seen as BUILDING again.
-                    # If there is a successful build record of it and the source is superseded, it means that it built fine at some point,
-                    # Another arch will fail as superseeded.
-                    # We don't just retain the old state of "PUBLISHED" because maybe we started the script with that situation already
-                    elif build.buildstate not in build_state_failed and self.source.status == "Superseded":
-                        status[build.arch_tag] = self.PUBLISHED
+                    # Another launchpad trick: if a binary arch was published,
+                    # but then is superseeded, getPublishedBinaries() won't
+                    # list those binaries anymore. So it's seen as BUILDING
+                    # again. If there is a successful build record of it and
+                    # the source is superseded, it means that it built fine at
+                    # some point. Another arch will fail as superseeded.
+                    # We don't just retain the old state of "PUBLISHED" because
+                    # maybe we started the script with that situation already
+                    elif (build.buildstate not in build_state_failed
+                          and self.source.status == "Superseded"):
+                        status[build.arch_tag] = self.PUBLISHED

-        # There is no way to know if there are some arch:all packages (and there are not in publishedBinaries for this arch until
-        # it's built on arch_all_arch). So mark all arch to BUILDING if self.arch_all_arch is building or FAILED if it failed.
-        if self.arch_all_arch in status and status[self.arch_all_arch] != self.PUBLISHED:
+        # There is no way to know if there are some arch:all packages (and
+        # there are not in publishedBinaries for this arch until
+        # it's built on arch_all_arch). So mark all arch to BUILDING if
+        # self.arch_all_arch is building or FAILED if it failed.
+        not_published = status.get(self.arch_all_arch) != self.PUBLISHED
+        if self.arch_all_arch in status and not_published:
             for arch in self.archs:
                 if status[arch] == self.PUBLISHED:
                     status[arch] = status[self.arch_all_arch]
-                if arch != self.arch_all_arch and status[arch] == self.FAILED:
-                    logging.error("{} marked as FAILED because {} build FAILED and we may miss arch:all packages".format(arch, self.arch_all_arch))
+                failed = status[arch] == self.FAILED
+                if arch != self.arch_all_arch and failed:
+                    msg = ("{} marked as FAILED because {} build FAILED "
+                           "and we may miss arch:all packages")
+                    logging.error(msg.format(arch, self.arch_all_arch))

         return status

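The arch:all propagation rule in the hunk above is subtle, so here is a minimal, self-contained sketch of it (this is not the project's code; the names and integer status constants are illustrative): while the architecture that produces arch:all binaries is not yet PUBLISHED, no other architecture can be considered fully published either, because its arch:all packages may still be missing.

```python
# Hypothetical status constants, standing in for PackageInPPA's attributes.
BUILDING, FAILED, PUBLISHED = range(3)


def propagate_arch_all(status, arch_all_arch):
    """Return a copy of status with the arch:all rule applied."""
    status = dict(status)
    if arch_all_arch in status and status[arch_all_arch] != PUBLISHED:
        for arch in status:
            if status[arch] == PUBLISHED:
                # Demote: this arch's arch:all packages may still be missing.
                status[arch] = status[arch_all_arch]
    return status
```

A PUBLISHED i386 is thus demoted to BUILDING (or FAILED) until the arch:all architecture catches up.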
=== modified file 'branch-source-builder/cupstream2distro/packageinppamanager.py'
--- branch-source-builder/cupstream2distro/packageinppamanager.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/packageinppamanager.py 2014-02-23 16:37:01 +0000
@@ -41,7 +41,8 @@
     # we do not rely on the .changes files but in the config file
     # because we need the exact version (which can have an epoch)
     result = set()
-    source_package_regexp = re.compile("(.*).{}$".format(PROJECT_CONFIG_SUFFIX))
+    tmpl = "(.*).{}$"
+    source_package_regexp = re.compile(tmpl.format(PROJECT_CONFIG_SUFFIX))
     for file in os.listdir('.'):
         substract = source_package_regexp.findall(file)
         if substract:
@@ -59,40 +60,51 @@


 def get_packages_and_versions_uploaded():
-    '''Get (package, version) of all packages uploaded. We can have duplicates'''
+    '''Get (package, version) of all packages uploaded. We can have
+    duplicates'''

     # we do not rely on the .changes files but in the config file
     # because we need the exact version (which can have an epoch)
     result = set()
-    source_package_regexp = re.compile("(.*).{}.*$".format(PROJECT_CONFIG_SUFFIX))
+    tmpl = "(.*).{}.*$"
+    source_package_regexp = re.compile(tmpl.format(PROJECT_CONFIG_SUFFIX))
     for file in os.listdir('.'):
         substract = source_package_regexp.findall(file)
         if substract:
             config = ConfigParser.RawConfigParser()
             config.read(file)
-            result.add((substract[0], config.get('Package', 'packaging_version')))
+            pkg = config.get('Package', 'packaging_version')
+            result.add((substract[0], pkg))
     return result


-def update_all_packages_status(packages_not_in_ppa, packages_building, packages_failed, particular_arch=None):
+def update_all_packages_status(packages_not_in_ppa, packages_building,
+                               packages_failed, particular_arch=None):
     '''Update all packages status, checking in the ppa'''

     for current_package in (packages_not_in_ppa.union(packages_building)):
-        logging.info("current_package: " + current_package.source_name + " " + current_package.version)
+        logging.info("current_package: " + current_package.source_name + " " +
+                     current_package.version)
         package_status = current_package.get_status(particular_arch)
-        if package_status != None:  # global package_status can be 0 (building), 1 (failed), 2 (published)
+        # global package_status can be 0 (building), 1 (failed), 2 (published)
+        if package_status is not None:
             # if one arch building, still considered as building
             if package_status == PackageInPPA.BUILDING:
-                _ensure_removed_from_set(packages_not_in_ppa, current_package)  # maybe already removed
+                # maybe already removed
+                _ensure_removed_from_set(packages_not_in_ppa, current_package)
                 packages_building.add(current_package)
             # if one arch failed, considered as failed
             elif package_status == PackageInPPA.FAILED:
-                _ensure_removed_from_set(packages_building, current_package)  # in case we missed the "build" step
-                _ensure_removed_from_set(packages_not_in_ppa, current_package)  # in case we missed the "wait" step
+                # in case we missed the "build" step
+                _ensure_removed_from_set(packages_building, current_package)
+                # in case we missed the "wait" step
+                _ensure_removed_from_set(packages_not_in_ppa, current_package)
                 packages_failed.add(current_package)
             elif package_status == PackageInPPA.PUBLISHED:
-                _ensure_removed_from_set(packages_building, current_package)  # in case we missed the "build" step
-                _ensure_removed_from_set(packages_not_in_ppa, current_package)  # in case we missed the "wait" step
+                # in case we missed the "build" step
+                _ensure_removed_from_set(packages_building, current_package)
+                # in case we missed the "wait" step
+                _ensure_removed_from_set(packages_not_in_ppa, current_package)


 def _get_current_packaging_version_from_config(source_package_name):

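The set shuffling in update_all_packages_status() above can be hard to follow in diff form. A hypothetical, simplified model of it (the names here are illustrative, not the module's API): a package moves between the "not in ppa", "building" and "failed" sets according to the status reported for it, and a published package simply drops out of all three.

```python
# Hypothetical status constants, standing in for PackageInPPA's attributes.
BUILDING, FAILED, PUBLISHED = range(3)


def update_status(pkg, status, not_in_ppa, building, failed):
    if status is None:         # not visible in the PPA yet: nothing to do
        return
    not_in_ppa.discard(pkg)    # discard() never raises, unlike remove()
    if status == BUILDING:
        building.add(pkg)
    else:
        building.discard(pkg)  # in case we missed the "build" step
        if status == FAILED:
            failed.add(pkg)
```

Using set.discard() instead of the original's try/remove-style helper is one way to make the "maybe already removed" cases explicit.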
=== modified file 'branch-source-builder/cupstream2distro/packagemanager.py'
--- branch-source-builder/cupstream2distro/packagemanager.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/packagemanager.py 2014-02-23 16:37:01 +0000
@@ -31,9 +31,11 @@
 import launchpadmanager
 import settings
 from .utils import ignored
-
+from subprocess import Popen, PIPE

-def get_current_version_for_series(source_package_name, series_name, ppa_name=None, dest=None):
+
+def get_current_version_for_series(source_package_name, series_name,
+                                   ppa_name=None, dest=None):
     '''Get current version for a package name in that series'''
     series = launchpadmanager.get_series(series_name)
     if not dest:
@@ -41,29 +43,37 @@
             dest = launchpadmanager.get_ppa(ppa_name)
         else:
             dest = launchpadmanager.get_ubuntu_archive()
-    source_collection = dest.getPublishedSources(exact_match=True, source_name=source_package_name, distro_series=series)
+    kw = {'exact_match': True, 'source_name': source_package_name,
+          'distro_series': series}
+    source_collection = dest.getPublishedSources(**kw)
     try:
-        # cjwatson told that list always have the more recently published first (even if removed)
+        # cjwatson told that list always have the more recently published first
+        # (even if removed)
         return source_collection[0].source_package_version
     # was never in the dest, set the lowest possible version
     except IndexError:
         return "0"


-def is_version_for_series_in_dest(source_package_name, version, series, dest, pocket="Release"):
+def is_version_for_series_in_dest(source_package_name, version, series, dest,
+                                  pocket="Release"):
     '''Return if version for a package name in that series is in dest'''
-    return dest.getPublishedSources(exact_match=True, source_name=source_package_name, version=version,
-                                    distro_series=series, pocket=pocket).total_size > 0
+    kw = {'exact_match': True, 'source_name': source_package_name,
+          'version': version, 'distro_series': series, 'pocket': pocket}
+    return dest.getPublishedSources(**kw).total_size > 0
+

 def is_version_in_queue(source_package_name, version, dest_serie, queue):
     '''Return if version for a package name in that series is in dest'''
-    return dest_serie.getPackageUploads(exact_match=True, name=source_package_name, version=version,
-                                        status=queue).total_size > 0
+    kw = {'exact_match': True, 'name': source_package_name, 'version': version,
+          'status': queue}
+    return dest_serie.getPackageUploads(**kw).total_size > 0


 def is_version1_higher_than_version2(version1, version2):
     '''return if version1 is higher than version2'''
-    return (subprocess.call(["dpkg", "--compare-versions", version1, 'gt', version2], stdout=subprocess.PIPE, stderr=subprocess.PIPE) == 0)
+    cmd = ["dpkg", "--compare-versions", version1, 'gt', version2]
+    return (subprocess.call(cmd, stdout=PIPE, stderr=PIPE) == 0)


 def is_version_in_changelog(version, f):
@@ -72,7 +82,8 @@
     if version == "0":
         return True

-    desired_changelog_line = re.compile("\({}\) (?!UNRELEASED).*\; urgency=".format(version.replace('+', '\+')))
+    t = "\({}\) (?!UNRELEASED).*\; urgency="
+    desired_changelog_line = re.compile(t.format(version.replace('+', '\+')))
     for line in f.readlines():
         if desired_changelog_line.search(line):
             return True
@@ -83,9 +94,12 @@
 def get_latest_upstream_bzr_rev(f, dest_ppa=None):
     '''Report latest bzr rev in the file

-    If dest_ppa, first try to fetch the dest ppa tag. Otherwise, fallback to first distro version'''
+    If dest_ppa, first try to fetch the dest ppa tag. Otherwise, fallback to
+    first distro version'''
     distro_regex = re.compile("{} (\d+)".format(settings.REV_STRING_FORMAT))
-    destppa_regexp = re.compile("{} (\d+) \(ppa:{}\)".format(settings.REV_STRING_FORMAT, dest_ppa))
+    tmpl = "{} (\d+) \(ppa:{}\)"
+    destppa_regexp = re.compile(
+        tmpl.format(settings.REV_STRING_FORMAT, dest_ppa))
     distro_rev = None
     candidate_destppa_rev = None
     candidate_distro_rev = None
@@ -111,15 +125,19 @@
                 destppa_element_found = True
             except IndexError:
                 destppa_element_found = False
-        if not distro_rev and not destppa_element_found and not "(ppa:" in line:
+        ppa_line = "(ppa:" in line
+        if not distro_rev and not destppa_element_found and not ppa_line:
             try:
                 candidate_distro_rev = int(distro_regex.findall(line)[0])
                 distro_element_found = True
             except IndexError:
                 distro_element_found = False

-        # try to catchup next line if we have a marker start without anything found
-        if settings.REV_STRING_FORMAT in line and (dest_ppa and not destppa_element_found) and not distro_element_found:
+        # try to catchup next line if we have a marker start without anything
+        # found
+        rev_str_found = settings.REV_STRING_FORMAT in line
+        ppa_found = dest_ppa and not destppa_element_found
+        if rev_str_found and ppa_found and not distro_element_found:
             previous_line = line

         if line.startswith(" -- "):
@@ -150,7 +168,8 @@

 def get_packaging_version():
     '''Get current packaging rev'''
-    instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    instance = Popen(["dpkg-parsechangelog"],
+                     stdout=PIPE, stderr=PIPE)
     (stdout, stderr) = instance.communicate()
     if instance.returncode != 0:
         raise Exception(stderr.decode("utf-8").strip())
@@ -160,19 +179,24 @@
     if packaging_version:
         return packaging_version[0]

-    raise Exception("Didn't find any Version in the package: {}".format(stdout))
+    tmpl = "Didn't find any Version in the package: {}"
+    raise Exception(tmpl.format(stdout))


-def get_source_package_from_dest(source_package_name, dest_archive, dest_current_version, series_name):
-    '''Download and return a path containing a checkout of the current dest version.
+def get_source_package_from_dest(source_package_name, dest_archive,
+                                 dest_current_version, series_name):
+    '''Download and return a path containing a checkout of the current dest
+    version.

     None if this package was never published to dest archive'''

     if dest_current_version == "0":
-        logging.info("This package was never released to the destination archive, don't return downloaded source")
+        logging.info("This package was never released to the destination "
+                     "archive, don't return downloaded source")
         return None

-    logging.info("Grab code for {} ({}) from {}".format(source_package_name, dest_current_version, series_name))
+    args = (source_package_name, dest_current_version, series_name)
+    logging.info("Grab code for {} ({}) from {}".format(*args))
     source_package_download_dir = os.path.join('ubuntu', source_package_name)
     series = launchpadmanager.get_series(series_name)
     with ignored(OSError):
@@ -180,27 +204,39 @@
     os.chdir(source_package_download_dir)

     try:
-        sourcepkg = dest_archive.getPublishedSources(status="Published", exact_match=True, source_name=source_package_name, distro_series=series, version=dest_current_version)[0]
+        kw = {'status': "Published", 'exact_match': True,
+              'source_name': source_package_name, 'distro_series': series,
+              'version': dest_current_version}
+        sourcepkg = dest_archive.getPublishedSources(**kw)[0]
     except IndexError:
         raise Exception("Couldn't get in the destination the expected version")
-    logging.info('Downloading %s version %s', source_package_name, dest_current_version)
+    tmpl = 'Downloading %s version %s'
+    logging.info(tmpl, source_package_name, dest_current_version)
     for url in sourcepkg.sourceFileUrls():
         urllib.urlretrieve(url, urllib.unquote(url.split('/')[-1]))
-    instance = subprocess.Popen("dpkg-source -x *dsc", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    cmd = "dpkg-source -x *dsc"
+    instance = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE)
     (stdout, stderr) = instance.communicate()
     if instance.returncode != 0:
         raise Exception(stderr.decode("utf-8").strip())

     # check the dir exist
-    splitted_version = dest_current_version.split(':')[-1].split('-')  # remove epoch is there is one
+    # remove epoch is there is one
+    splitted_version = dest_current_version.split(':')[-1].split('-')
     # TODO: debian version (like -3) is not handled here.
-    # We do handle 42ubuntu1 though (as splitted_version[0] can contain "ubuntu")
-    if "ubuntu" in splitted_version[-1] and len(splitted_version) > 1:  # don't remove last item for the case where we had a native version (-0.35.2) without ubuntu in it
+    # We do handle 42ubuntu1 though (as splitted_version[0] can contain
+    # "ubuntu")
+    if "ubuntu" in splitted_version[-1] and len(splitted_version) > 1:
+        # don't remove last item for the case where we had a native version
+        # (-0.35.2) without ubuntu in it
         splitted_version = splitted_version[:-1]
     version_for_source_file = '-'.join(splitted_version)
-    source_directory_name = "{}-{}".format(source_package_name, version_for_source_file)
+    args = (source_package_name, version_for_source_file)
+    source_directory_name = "{}-{}".format(*args)
     if not os.path.isdir(source_directory_name):
-        raise Exception("We tried to download and check that the directory {} is present, but it's not the case".format(source_directory_name))
+        tmpl = ("We tried to download and check that the directory {} is "
+                "present, but it's not the case")
+        raise Exception(tmpl.format(source_directory_name))
     os.chdir('../..')
     return (os.path.join(source_package_download_dir, source_directory_name))

@@ -210,24 +246,40 @@

     dest_version_source can be None if no released version was done before.'''

-    # we always released something not yet in ubuntu, no matter criterias are not met.
+    # we always released something not yet in ubuntu, no matter criterias are
+    # not met.
     if not dest_version_source:
         return True

-    # now check the relevance of the committed changes compared to the version in the repository (if any)
-    diffinstance = subprocess.Popen(['diff', '-Nrup', '.', dest_version_source], stdout=subprocess.PIPE)
-    filterinstance = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE)
-    lsdiffinstance = subprocess.Popen(['lsdiff'], stdin=filterinstance.stdout, stdout=subprocess.PIPE)
-    (relevant_changes, err) = subprocess.Popen(['grep', '-Ev', '.bzr|.pc'], stdin=lsdiffinstance.stdout, stdout=subprocess.PIPE).communicate()
-
-    # detect if the only change is a Vcs* target changes (with or without changelog edit). We won't release in that case
+    # now check the relevance of the committed changes compared to the version
+    # in the repository (if any)
+    cmd = ['diff', '-Nrup', '.', dest_version_source]
+    diffinstance = Popen(cmd, stdout=PIPE)
+
+    cmd = ['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x',
+           '*local-options']
+    filterinstance = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE)
+    cmd = ['lsdiff']
+    lsdiffinstance = Popen(cmd, stdin=filterinstance.stdout, stdout=PIPE)
+    cmd = ['grep', '-Ev', '.bzr|.pc']
+    p = Popen(cmd, stdin=lsdiffinstance.stdout, stdout=PIPE)
+    (relevant_changes, err) = p.communicate()
+
+    # detect if the only change is a Vcs* target changes (with or without
+    # changelog edit). We won't release in that case
     number_of_changed_files = relevant_changes.count("\n")
-    if ((number_of_changed_files == 1 and "debian/control" in relevant_changes) or
-            (number_of_changed_files == 2 and "debian/control" in relevant_changes and "debian/changelog" in relevant_changes)):
-        (results, err) = subprocess.Popen(['diff', os.path.join('debian', 'control'), os.path.join(dest_version_source, "debian", "control")], stdout=subprocess.PIPE).communicate()
+
+    control_changed = "debian/control" in relevant_changes
+    if control_changed and number_of_changed_files in (1, 2):
+        cmd = ['diff', os.path.join('debian', 'control'),
+               os.path.join(dest_version_source, "debian", "control")]
+
+        (results, err) = Popen(cmd, stdout=PIPE).communicate()
         for diff_line in results.split('\n'):
             if diff_line.startswith("< ") or diff_line.startswith("> "):
-                if not diff_line[2:].startswith("Vcs-") and not diff_line[2:].startswith("#"):
+                vcs = diff_line[2:].startswith("Vcs-")
+                comment = diff_line[2:].startswith("#")
+                if not vcs and not comment:
                     return True
     return False

@@ -237,24 +289,34 @@
     return (relevant_changes != '')


-def is_relevant_source_diff_from_previous_dest_version(newdsc_path, dest_version_source):
-    '''Extract and check if the generated source diff different from previous one'''
+def is_relevant_source_diff_from_previous_dest_version(newdsc_path,
+                                                       dest_version_source):
+    '''Extract and check if the generated source diff different from previous
+    one'''

     with ignored(OSError):
         os.makedirs("generated")
-    extracted_generated_source = os.path.join("generated", newdsc_path.split('_')[0])
+    prefix = newdsc_path.split('_')[0]
+    extracted_generated_source = os.path.join("generated", prefix)
     with ignored(OSError):
         shutil.rmtree(extracted_generated_source)

     # remove epoch is there is one
-    if subprocess.call(["dpkg-source", "-x", newdsc_path, extracted_generated_source]) != 0:
+    cmd = ["dpkg-source", "-x", newdsc_path, extracted_generated_source]
+    if subprocess.call(cmd) != 0:
         raise Exception("dpkg-source command returned an error.")

-    # now check the relevance of the committed changes compared to the version in the repository (if any)
-    diffinstance = subprocess.Popen(['diff', '-Nrup', extracted_generated_source, dest_version_source], stdout=subprocess.PIPE)
-    (diff, err) = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate()
+    # now check the relevance of the committed changes compared to the version
+    # in the repository (if any)
+    cmd = ['diff', '-Nrup', extracted_generated_source, dest_version_source]
+    diffinstance = Popen(cmd, stdout=PIPE)
+    cmd = ['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x',
+           '*local-options']
+    p = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE)
+    (diff, err) = p.communicate()

-    # there is no important diff if the diff only contains 12 lines, corresponding to "Automatic daily release" marker in debian/changelog
+    # there is no important diff if the diff only contains 12 lines,
+    # corresponding to "Automatic daily release" marker in debian/changelog
     if (diff.count('\n') <= 12):
         return False
     return True
@@ -267,47 +329,65 @@
     if not oldsource_dsc:
         return True
     if not os.path.isfile(oldsource_dsc) or not os.path.isfile(newsource_dsc):
-        raise Exception("{} or {} doesn't not exist, can't create a diff".format(oldsource_dsc, newsource_dsc))
-    diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE)
-    filterinstance = subprocess.Popen(['filterdiff', '--clean', '-i', '*debian/*', '-x', '*changelog'], stdin=diffinstance.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+        tmpl = "{} or {} doesn't not exist, can't create a diff"
+        raise Exception(tmpl.format(oldsource_dsc, newsource_dsc))
+    diffinstance = Popen(['debdiff', oldsource_dsc, newsource_dsc],
+                         stdout=PIPE)
+    cmd = ['filterdiff', '--clean', '-i', '*debian/*', '-x', '*changelog']
+    filterinstance = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE,
+                           stderr=PIPE)
     (change_in_debian, filter_err) = filterinstance.communicate()
-    # we can't rely on diffinstance returncode as the signature key is maybe not present and it will exit with 1
+    # we can't rely on diffinstance returncode as the signature key is maybe
+    # not present and it will exit with 1
     if filterinstance.returncode != 0:
-        raise Exception("Error in diff: {}".format(filter_err.decode("utf-8").strip()))
+        tmpl = "Error in diff: {}"
+        raise Exception(tmpl.format(filter_err.decode("utf-8").strip()))
     return(change_in_debian != "")


 def generate_diff_between_dsc(diff_filepath, oldsource_dsc, newsource_dsc):
-    '''Generate a diff file in diff_filepath if there is a relevant packaging diff between 2 sources
+    '''Generate a diff file in diff_filepath if there is a relevant packaging
+    diff between 2 sources

     The diff contains autotools files and cmakeries'''
     if _packaging_changes_between_dsc(oldsource_dsc, newsource_dsc):
         with open(diff_filepath, "w") as f:
             if not oldsource_dsc:
-                f.writelines("This source is a new package, if the destination is ubuntu, please ensure it has been preNEWed by an archive admin before publishing that stack.")
+                f.writelines("This source is a new package, if the destination"
+                             " is ubuntu, please ensure it has been preNEWed"
+                             " by an archive admin before publishing that"
+                             " stack.")
                 return
-            f.write("/!\ Remember that this diff only represents packaging changes and build tools diff, not the whole content diff!\n\n")
-            diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE)
-            (changes_to_publish, err) = subprocess.Popen(['filterdiff', '--remove-timestamps', '--clean', '-i', '*setup.py',
-                                                          '-i', '*Makefile.am', '-i', '*configure.*', '-i', '*debian/*',
-                                                          '-i', '*CMakeLists.txt'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate()
+            f.write("/!\ Remember that this diff only represents packaging "
+                    "changes and build tools diff, not the whole content "
+                    "diff!\n\n")
+            diffinstance = Popen(['debdiff', oldsource_dsc, newsource_dsc],
+                                 stdout=PIPE)
+            cmd = ['filterdiff', '--remove-timestamps', '--clean', '-i',
+                   '*setup.py', '-i', '*Makefile.am', '-i', '*configure.*',
+                   '-i', '*debian/*', '-i', '*CMakeLists.txt']
+            (changes_to_publish, err) = Popen(cmd, stdin=diffinstance.stdout,
+                                              stdout=PIPE).communicate()
             f.write(changes_to_publish)


-def create_new_packaging_version(base_package_version, series_version, destppa=''):
+def create_new_packaging_version(base_package_version, series_version,
+                                 destppa=''):
     '''Deliver a new packaging version, based on simple rules:

-    Version would be <upstream_version>.<series>.<yyyymmdd(.minor)>-0ubuntu1
-    if we already have something delivered today, it will be .minor, then, .minor+1…
+    Version would be <upstream_version>.<series>.<yyyymmdd(.minor)>-0ubuntu1 if
+    we already have something delivered today, it will be .minor, then,
+    .minor+1…

-    We append the destination ppa name if we target a dest ppa and not distro'''
+    We append the destination ppa name if we target a dest ppa and not
+    distro'''
     # to keep track of whether the package is native or not
     native_pkg = False

     today_version = datetime.date.today().strftime('%Y%m%d')
     destppa = destppa.replace("-", '.').replace("_", ".").replace("/", ".")
-    # bootstrapping mode or direct upload or UNRELEASED for bumping to a new series
-    # TRANSITION
+    # bootstrapping mode or direct upload or UNRELEASED for bumping to a new
+    # series TRANSITION
     if not ("daily" in base_package_version or "+" in base_package_version):
         # support both 42, 42-0ubuntu1
         upstream_version = base_package_version.split('-')[0]
@@ -322,17 +402,18 @@
             upstream_version = previous_day[0]
             native_pkg = True
             if (previous_day[1] == series_version and
                     previous_day[2] == today_version):
                 minor = 1
                 if previous_day[3]:  # second upload of the day
                     minor = int(previous_day[3]) + 1
                 today_version = "{}.{}".format(today_version, minor)
         except IndexError:
             raise Exception(
                 "Unable to get previous day from native version: %s"
                 % base_package_version)
     else:
-        # extract the day of previous daily upload and bump if already uploaded today
+        # extract the day of previous daily upload and bump if already uploaded
+        # today
         regexp = re.compile("(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*-.*")
         try:
             previous_day = regexp.findall(base_package_version)[0]
@@ -342,11 +423,17 @@
             regexp = re.compile("(.*)(daily)([\d\.]{8})\.?([\d]*).*-.*")
             previous_day = regexp.findall(base_package_version)[0]
             # make the version compatible with the new version
-            previous_day = (previous_day[0], previous_day[1], "20" + previous_day[2].replace(".", ""), previous_day[3])
+            previous_day = (previous_day[0], previous_day[1], "20" +
+                            previous_day[2].replace(".", ""),
+                            previous_day[3])
         except IndexError:
-            raise Exception("Didn't find a correct versioning in the current package: {}".format(base_package_version))
+            tmpl = ("Didn't find a correct versioning in the current "
+                    "package: {}")
+            raise Exception(tmpl.format(base_package_version))
         upstream_version = previous_day[0]
-        if previous_day[1] == series_version and previous_day[2] == today_version:
+        is_series = previous_day[1] == series_version
+        is_today = previous_day[2] == today_version
+        if is_series and is_today:
             minor = 1
             if previous_day[3]:  # second upload of the day
                 minor = int(previous_day[3]) + 1
@@ -363,7 +450,7 @@

 def get_packaging_sourcename():
     '''Get current packaging source name'''
-    instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    instance = Popen(["dpkg-parsechangelog"], stdout=PIPE, stderr=PIPE)
     (stdout, stderr) = instance.communicate()
     if instance.returncode != 0:
         raise Exception(stderr.decode("utf-8").strip())
@@ -373,7 +460,8 @@
     if source_name:
         return source_name[0]

-    raise Exception("Didn't find any source name in the package: {}".format(stdout))
+    tmpl = "Didn't find any source name in the package: {}"
+    raise Exception(tmpl.format(stdout))


 def collect_bugs_in_changelog_until_latest_snapshot(f, source_package_name):
@@ -382,20 +470,24 @@
     # matching only bug format that launchpad accepts
     group_bugs_regexp = re.compile("lp: ?(.*\d{5,})", re.IGNORECASE)
     bug_decipher_regexp = re.compile("(#\d{5,})+")
-    new_upload_changelog_regexp = re.compile(settings.NEW_CHANGELOG_PATTERN.format(source_package_name))
+    pattern = settings.NEW_CHANGELOG_PATTERN.format(source_package_name)
+    new_upload_changelog_regexp = re.compile(pattern)
     for line in f:
         grouped_bugs_list = group_bugs_regexp.findall(line)
         for grouped_bugs in grouped_bugs_list:
-            for bug in map(lambda bug_with_hash: bug_with_hash.replace('#', ''), bug_decipher_regexp.findall(grouped_bugs)):
+            func = lambda bug_with_hash: bug_with_hash.replace('#', '')
+            for bug in map(func, bug_decipher_regexp.findall(grouped_bugs)):
                 bugs.add(bug)
-        # a released upload to distro (automated or manual)n exit as bugs before were already covered
+        # a released upload to distro (automated or manual)n exit as bugs
+        # before were already covered
         if new_upload_changelog_regexp.match(line):
             return bugs

     return bugs


-def update_changelog(new_package_version, series, tip_bzr_rev, authors_commits, dest_ppa=None):
+def update_changelog(new_package_version, series, tip_bzr_rev, authors_commits,
+                     dest_ppa=None):
     '''Update the changelog for the incoming upload'''

     dch_env = os.environ.copy()
@@ -403,18 +495,20 @@
         dch_env["DEBFULLNAME"] = author
         for bug_desc in authors_commits[author]:
             if bug_desc.startswith('-'):
-                # Remove leading '-' or dch thinks (rightly) that it's an option
+                # Remove leading '-' or dch thinks (rightly) that it's an
+                # option
                 bug_desc = bug_desc[1:]
             if bug_desc.startswith(' '):
                 # Remove leading spaces, there are useless and the result is
                 # prettier without them anyway ;)
                 bug_desc = bug_desc.strip()
-            cmd = ["dch", "--multimaint-merge", "--release-heuristic", "changelog",
-                   "-v{}".format(new_package_version), bug_desc]
-            subprocess.Popen(cmd, env=dch_env).communicate()
+            cmd = ["dch", "--multimaint-merge", "--release-heuristic",
+                   "changelog", "-v{}".format(new_package_version), bug_desc]
+            Popen(cmd, env=dch_env).communicate()

-    if tip_bzr_rev != None:
-        commit_message = "{} {}".format(settings.REV_STRING_FORMAT, tip_bzr_rev)
+    if tip_bzr_rev is None:
+        tmpl = "{} {}"
+        commit_message = tmpl.format(settings.REV_STRING_FORMAT, tip_bzr_rev)
         if dest_ppa:
             commit_message += " ({})".format(dest_ppa)
     else:
@@ -422,17 +516,21 @@

     dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME
     dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL
-    instance = subprocess.Popen(["dch", "--release-heuristic", "changelog",
-                                 "-v{}".format(new_package_version), commit_message],
-                                stderr=subprocess.PIPE, env=dch_env)
+    instance = Popen(["dch", "--release-heuristic", "changelog",
+                      "-v{}".format(new_package_version), commit_message],
+                     stderr=PIPE, env=dch_env)
     (stdout, stderr) = instance.communicate()
     if instance.returncode != 0:
         raise Exception(stderr.decode("utf-8").strip())
-    subprocess.call(["dch", "-r", "--distribution", series, "--force-distribution", ""], env=dch_env)
+    cmd = ["dch", "-r", "--distribution", series, "--force-distribution", ""]
+    subprocess.call(cmd, env=dch_env)

-    # in the case of no commit_message and no symbols file change, we have an addition [ DEBFULLNAME ] follow by an empty line
+    # in the case of no commit_message and no symbols file change, we have an
+    # addition [ DEBFULLNAME ] follow by an empty line
     # better to remove both lines
-    subprocess.call(["sed", "-i", "/ \[ " + settings.BOT_DEBFULLNAME + " \]/{$q; N; /\\n$/d;}", "debian/changelog"])
+    pattern = "/ \[ " + settings.BOT_DEBFULLNAME + " \]/{$q; N; /\\n$/d;}"
+    cmd = ["sed", "-i", pattern, "debian/changelog"]
+    subprocess.call(cmd)


 def build_source_package(series, distro_version, ppa=None):
@@ -457,7 +555,7 @@
            "--distro-version", distro_version]
     if ppa:
         cmd.extend(["--ppa", ppa])
-    instance = subprocess.Popen(cmd, env=cowbuilder_env)
+    instance = Popen(cmd, env=cowbuilder_env)
     instance.communicate()
     if instance.returncode != 0:
         raise Exception("%r returned: %s." % (cmd, instance.returncode))
@@ -490,15 +588,19 @@
     replacement_done = False
     for filename in os.listdir("debian"):
         if filename.endswith("symbols"):
-            for line in fileinput.input(os.path.join('debian', filename), inplace=1):
+            dfile = os.path.join('debian', filename)
+            for line in fileinput.input(dfile, inplace=1):
                 if settings.REPLACEME_TAG in line:
                     replacement_done = True
-                    line = line.replace(settings.REPLACEME_TAG, new_upstream_version)
+                    line = line.replace(settings.REPLACEME_TAG,
+                                        new_upstream_version)
                 sys.stdout.write(line)

     if replacement_done:
         dch_env = os.environ.copy()
         dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME
         dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL
-        subprocess.Popen(["dch", "debian/*symbols: auto-update new symbols to released version"], env=dch_env).communicate()
+        cmd = ["dch",
+               "debian/*symbols: auto-update new symbols to released version"]
+        Popen(cmd, env=dch_env).communicate()
         subprocess.call(["bzr", "commit", "-m", "Update symbols"])

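As a review aid for the version-bumping logic touched above, here is a minimal standalone sketch of how the daily-release regexp (new line 417 of packagemanager.py) decomposes a version string into upstream version, series, day and minor. The sample version string is invented for illustration, not taken from a real upload:

```python
import re

# Same pattern as in create_new_packaging_version (new line 417):
# captures upstream version, 5-char series, yyyymmdd day, optional minor.
DAILY_RE = re.compile(r"(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*-.*")

# Hypothetical <upstream>+<series>.<yyyymmdd>.<minor>-0ubuntu1 version
version = "42.0+14.04.20140223.1-0ubuntu1"
upstream, series, day, minor = DAILY_RE.findall(version)[0]
print(upstream, series, day, minor)  # 42.0 14.04 20140223 1
```

A second upload the same day would then bump `minor` to 2, per the `int(previous_day[3]) + 1` branch in the diff.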
=== modified file 'branch-source-builder/cupstream2distro/settings.py'
--- branch-source-builder/cupstream2distro/settings.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/settings.py 2014-02-23 16:37:01 +0000
@@ -39,6 +39,7 @@
     GNUPG_DIR = home_dir
 CRED_FILE_PATH = os.path.join("/tmp", "launchpad.credentials")

+
 # TODO refactor into a ci-utils module
 def _unit_config():
     path = os.path.join(home_dir, 'unit_config')
@@ -73,7 +74,8 @@

 # selected arch for building arch:all packages
 VIRTUALIZED_PPA_ARCH = ["i386", "amd64"]
-# an arch we will ignore for publication if latest published version in dest doesn't build it
+# an arch we will ignore for publication if latest published version in dest
+# doesn't build it
 ARCHS_TO_EVENTUALLY_IGNORE = set(['powerpc', 'arm64', 'ppc64el'])
 ARCHS_TO_UNCONDITIONALLY_IGNORE = set(['arm64', 'ppc64el'])
 SRU_PPA = "ubuntu-unity/sru-staging"
@@ -87,11 +89,15 @@

 OLD_STACK_DIR = 'old'
 PACKAGE_LIST_RSYNC_FILENAME_PREFIX = 'packagelist_rsync'
-PACKAGE_LIST_RSYNC_FILENAME_FORMAT = PACKAGE_LIST_RSYNC_FILENAME_PREFIX + '_{}-{}'
-RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*".format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX)
+PACKAGE_LIST_RSYNC_FILENAME_FORMAT = (PACKAGE_LIST_RSYNC_FILENAME_PREFIX
+                                      + '_{}-{}')
+RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*"
+RSYNC_PATTERN = RSYNC_PATTERN.format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX)

-ROOT_CU2D = os.path.join(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
-DEFAULT_CONFIG_STACKS_DIR = os.path.join(os.path.dirname(ROOT_CU2D), 'cupstream2distro-config', 'stacks')
+ROOT_CU2D = os.path.join(
+    os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
+DEFAULT_CONFIG_STACKS_DIR = os.path.join(
+    os.path.dirname(ROOT_CU2D), 'cupstream2distro-config', 'stacks')
 STACK_STATUS_FILENAME = "stack.status"
 STACK_STARTED_FILENAME = "stack.started"
 STACK_BUILDING_FILENAME = "stack.building"

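The settings.py changes above are pure pep8 re-wrapping; a quick sketch (reusing the constant values from the diff) confirming that the parenthesized continuation forms evaluate to the same strings as the original single-line forms:

```python
PACKAGE_LIST_RSYNC_FILENAME_PREFIX = 'packagelist_rsync'

# original single-line form
old_format = PACKAGE_LIST_RSYNC_FILENAME_PREFIX + '_{}-{}'

# pep8-wrapped form from the branch: parentheses give an implicit
# line continuation with no backslash needed
new_format = (PACKAGE_LIST_RSYNC_FILENAME_PREFIX
              + '_{}-{}')

old_pattern = "rsync://RSYNCSVR/cu2d_out/{}*".format(
    PACKAGE_LIST_RSYNC_FILENAME_PREFIX)
RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*"
RSYNC_PATTERN = RSYNC_PATTERN.format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX)

print(new_format == old_format, RSYNC_PATTERN == old_pattern)  # True True
```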
=== modified file 'branch-source-builder/cupstream2distro/silomanager.py'
--- branch-source-builder/cupstream2distro/silomanager.py 2014-01-31 17:06:36 +0000
+++ branch-source-builder/cupstream2distro/silomanager.py 2014-02-23 16:37:01 +0000
@@ -22,7 +22,8 @@
 import os
 import shutil

-from cupstream2distro.settings import SILO_CONFIG_FILENAME, SILO_NAME_LIST, SILO_STATUS_RSYNCDIR
+from cupstream2distro.settings import (SILO_CONFIG_FILENAME, SILO_NAME_LIST,
+                                       SILO_STATUS_RSYNCDIR)
 from cupstream2distro.utils import ignored


@@ -42,10 +43,13 @@
         os.makedirs(SILO_STATUS_RSYNCDIR)
     silo_name = os.path.dirname(silo_config_path).split(os.path.sep)[-1]
     dest = os.path.join(SILO_STATUS_RSYNCDIR, silo_name)
-    logging.debug("Copying configuration from {} to {}".format(silo_config_path, dest))
-    shutil.copy2(silo_config_path, os.path.join(SILO_STATUS_RSYNCDIR, silo_name))
+    tmpl = "Copying configuration from {} to {}"
+    logging.debug(tmpl.format(silo_config_path, dest))
+    dest = os.path.join(SILO_STATUS_RSYNCDIR, silo_name)
+    shutil.copy2(silo_config_path, dest)
     return True

+
 def load_config(uri=None):
     """return a loaded config

@@ -62,23 +66,33 @@
         logging.warning("Can't load configuration: " + e.message)
         return None

+
 def remove_status_file(silo_name):
     """Remove status file"""
     os.remove(os.path.join(SILO_STATUS_RSYNCDIR, silo_name))


-def is_project_not_in_any_configs(project_name, series, dest, base_silo_uri, ignore_silo):
-    """Return true if the project for that serie in that dest is not in any configuration"""
-    logging.info("Checking if {} is already configured for {} ({}) in another silo".format(project_name, dest.name, series.name))
+def is_project_not_in_any_configs(project_name, series, dest, base_silo_uri,
+                                  ignore_silo):
+    """Return true if the project for that serie in that dest is not in any
+    configuration"""
+    tmpl = "Checking if {} is already configured for {} ({}) in another silo"
+    logging.info(tmpl.format(project_name, dest.name, series.name))
     for silo_name in SILO_NAME_LIST:
         # we are reconfiguring current silo, ignoring it
         if ignore_silo == silo_name:
             continue
         config = load_config(os.path.join(base_silo_uri, silo_name))
         if config:
-            if (config["global"]["dest"] == dest.self_link and config["global"]["series"] == series.self_link and
-                    (project_name in config["mps"] or project_name in config["sources"])):
-                logging.error("{} is already prepared for the same serie and destination in {}".format(project_name, silo_name))
+            project_match = (project_name in config["mps"]
+                             or project_name in config["sources"])
+            series_match = config["global"]["series"] == series.self_link
+            dest_match = config["global"]["dest"] == dest.self_link
+
+            if (dest_match and series_match and project_match):
+                tmpl = ("{} is already prepared for the same series "
+                        "and destination in {}")
+                logging.error(tmpl.format(project_name, silo_name))
                 return False
     return True

@@ -86,27 +100,32 @@
 def return_first_available_silo(base_silo_uri):
     """Check which silos are free and return the first one"""
     for silo_name in SILO_NAME_LIST:
-        if not os.path.isfile(os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME)):
+        p = os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME)
+        if not os.path.isfile(p):
             return silo_name
     return None

+
 def get_config_step(config):
     """Get configuration step"""
     return config["global"]["step"]

+
 def set_config_step(config, new_step, uri=''):
     """Set configuration step to new_step"""
     config["global"]["step"] = new_step
     return save_config(config, uri)

+
 def set_config_status(config, status, uri='', add_url=True):
     """Change status to reflect latest status"""
     build_url = os.getenv('BUILD_URL')
     if add_url and build_url:
-        status = "{} ({}console)".format(status , build_url)
+        status = "{} ({}console)".format(status, build_url)
     config["global"]["status"] = status
     return save_config(config, uri)

+
 def get_all_projects(config):
     """Get a list of all projets"""
     projects = []

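The silomanager refactor above decomposes one long compound condition into three named booleans; since cupstream2distro is untested, a small self-contained sketch of the same logic with a made-up config dict may help reviewers convince themselves the rewrite is equivalent. The dict shape is assumed from the diff, not verified against real silo files, and `project_in_silo_config` is a hypothetical helper name:

```python
def project_in_silo_config(config, project_name, series_link, dest_link):
    # Mirrors the decomposed condition in is_project_not_in_any_configs.
    project_match = (project_name in config["mps"]
                     or project_name in config["sources"])
    series_match = config["global"]["series"] == series_link
    dest_match = config["global"]["dest"] == dest_link
    return dest_match and series_match and project_match


config = {
    "global": {"series": "series-link", "dest": "dest-link"},
    "mps": ["some-mp"],
    "sources": ["unity"],
}
hit = project_in_silo_config(config, "unity", "series-link", "dest-link")
miss = project_in_silo_config(config, "unity", "other-series", "dest-link")
print(hit, miss)  # True False
```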
=== modified file 'branch-source-builder/cupstream2distro/stack.py'
--- branch-source-builder/cupstream2distro/stack.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/stack.py 2014-02-23 16:37:01 +0000
@@ -21,11 +21,75 @@
 import os
 import yaml

-from .settings import DEFAULT_CONFIG_STACKS_DIR, STACK_STATUS_FILENAME, STACK_STARTED_FILENAME
+from .settings import (DEFAULT_CONFIG_STACKS_DIR, STACK_STATUS_FILENAME,
+                       STACK_STARTED_FILENAME)
 from .utils import ignored

 _stacks_ref = {}

+CANT_FIND_STATUS = '''Can't find status for {depstack} ({deprel}). This
+shouldn't happen unless the stack is currently running. If this is the case, it
+means that the current stack shouldn't be uploaded as the state is unknown.'''
+
+FAILED_TO_PUBLISH = '''{depstack} ({deprel}) failed to publish. Possible causes
+are:
+ * the stack really didn't build/can't be prepared at all.
+ * the stack has integration tests not working with this previous stack.
+
+ What needs to be done:
+ Either:
+  * If we want to publish both stacks: retry the integration tests for
+    {depstack} ({deprel}), including components from this stack (check
+    with the whole PPA). If that works, both stacks should be published
+    at the same time.
+ Or:
+  * If we only want to publish this stack: check that we can safely
+    publish it by itself (e.g. without the stacks it depends on). The
+    trick there is to make sure that the stack is not relying on, or
+    affected by, a change that happened in one of its dependencies.
+    Example: if the {depstack} ({deprel}) API changed in a way that
+    affects any component of the current stack, and both stacks got
+    updated in trunk, we need to make sure we don't land only one of the
+    two stacks which would result in a broken state. Also think about
+    potential ABI changes.'''
+
+MANUAL_PUBLISH = '''{depstack} ({deprel}) is in manually publish mode. Possible
+causes are:
+ * Some part of the stack has packaging changes
+ * This stack is depending on another stack not being published
+
+ What needs to be done:
+ Either:
+  * If {depstack} ({deprel}) can be published, we should publish both
+    stacks at the same time.
+ Or:
+  * If we only want to publish this stack: check that we can safely
+    publish it by itself (e.g. without the stacks it depends on). The
+    trick there is to make sure that the stack is not relying on, or
+    affected by, a change that happened in one of its dependencies.
+    Example: if the {depstack} ({deprel}) API changed in a way that
+    affects any component of the current stack, and both stacks got
+    updated in trunk, we need to make sure we don't land only one of the
+    two stacks which would result in a broken state. Also think about
+    potential ABI changes.'''
+
+MANUALLY_ABORTED = '''{depstack} ({deprel}) has been manually aborted or failed
+for an unknown reason. Possible causes are:
+ * A job of this stack was stopped manually
+ * Jenkins had an internal error/shutdown
+
+ What needs to be done:
+  * If we only want to publish this stack: check that we can safely
+    publish it by itself (e.g. without the stacks it depends on). The
+    trick there is to make sure that the stack is not relying on, or
+    affected by, a change that happened in one of its dependencies.
+    Example: if the {depstack} ({deprel}) API changed in a way that
+    affects any component of the current stack, and both stacks got
+    updated in trunk, we need to make sure we don't land only one of the
+    two stacks which would result in a broken state. Also think about
+    potential ABI changes.'''
+
+
 # TODO: should be used by a metaclass
 def get_stack(release, stack_name):
     try:
@@ -33,22 +97,28 @@
     except KeyError:
         return Stack(release, stack_name)

+
 class Stack():

     def __init__(self, release, stack_name):
         self.stack_name = stack_name
         self.release = release
-        self.statusfile = os.path.join('..', '..', release, stack_name, STACK_STATUS_FILENAME)
-        self.startedfile = os.path.join('..', '..', release, stack_name, STACK_STARTED_FILENAME)
+        args = ('..', '..', release, stack_name, STACK_STATUS_FILENAME)
+        self.statusfile = os.path.join(*args)
+        args = ('..', '..', release, stack_name, STACK_STARTED_FILENAME)
+        self.startedfile = os.path.join(*args)
         self.stack_file_path = None
         self._dependencies = None
         self._rdependencies = None
         for stack_file_path in Stack.get_stacks_file_path(release):
-            if stack_file_path.split(os.path.sep)[-1] == "{}.cfg".format(stack_name):
+            formatted = "{}.cfg".format(stack_name)
+            if stack_file_path.split(os.path.sep)[-1] == formatted:
                 self.stack_file_path = stack_file_path
                 break
         if not self.stack_file_path:
-            raise Exception("{}.cfg for {} doesn't exist anywhere in {}".format(stack_name, release, self.get_root_stacks_dir()))
+            msg = "{}.cfg for {} doesn't exist anywhere in {}"
+            msg = msg.format(stack_name, release, self.get_root_stacks_dir())
+            raise Exception(msg)

         with open(self.stack_file_path, 'r') as f:
             cfg = yaml.load(f)
@@ -118,7 +188,11 @@
118 else:188 else:
119 (stackname, release) = (item, self.release)189 (stackname, release) = (item, self.release)
120 self._dependencies.append(get_stack(release, stackname))190 self._dependencies.append(get_stack(release, stackname))
121 logging.info("{} ({}) dependency list is: {}".format(self.stack_name, self.release, ["{} ({})".format(stack.stack_name, stack.release) for stack in self._dependencies]))191 deps = ["{} ({})".format(stack.stack_name, stack.release)
192 for stack in self._dependencies]
193 msg = "{} ({}) dependency list is: {}"
194 msg = msg.format(self.stack_name, self.release, deps)
195 logging.info(msg)
122 return self._dependencies196 return self._dependencies
123 except (TypeError, KeyError):197 except (TypeError, KeyError):
124 return []198 return []
@@ -137,7 +211,8 @@
137 return self._rdependencies211 return self._rdependencies
138212
139 def generate_dep_status_message(self):213 def generate_dep_status_message(self):
140 '''Return a list of potential problems from others stack which should block current publication'''214 '''Return a list of potential problems from others stack which should
215 block current publication'''
141216
142 # TODO: get the first Stack object217 # TODO: get the first Stack object
143 # iterate over all stacks objects from dep chain218 # iterate over all stacks objects from dep chain
@@ -145,39 +220,24 @@
145220
146 global_dep_status_info = []221 global_dep_status_info = []
147 for stack in self.get_direct_depending_stacks():222 for stack in self.get_direct_depending_stacks():
148 logging.info("Check status for {} ({})".format(stack.stack_name, stack.release))223 msg = "Check status for {} ({})"
224 msg = msg.format(stack.stack_name, stack.release)
225 logging.info(msg)
149 status = stack.get_status()226 status = stack.get_status()
150 message = None227 message = None
151 # We should have a status for every stack228 # We should have a status for every stack
152 if status is None:229 if status is None:
153 message = "Can't find status for {depstack} ({deprel}). This shouldn't happen unless the stack is currently running. If this is the case, it means that the current stack shouldn't be uploaded as the state is unknown.".format(depstack=stack, deprel=stack.release)230 kw = {'depstack': stack, 'deprel': stack.release}
231 message = CANT_FIND_STATUS.format(**kw)
154 elif status == 1:232 elif status == 1:
155 message = '''{depstack} ({deprel}) failed to publish. Possible causes are:233 kw = {'depstack': stack.stack_name, 'deprel': stack.release}
156 * the stack really didn't build/can't be prepared at all.234 message = FAILED_TO_PUBLISH.format(**kw)
157 * the stack has integration tests not working with this previous stack.
158
159 What needs to be done:
160 Either:
161 * If we want to publish both stacks: retry the integration tests for {depstack} ({deprel}), including components from this stack (check with the whole PPA). If that works, both stacks should be published at the same time.
162 Or:
163 * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
164 elif status == 2:235 elif status == 2:
165 message = '''{depstack} ({deprel}) is in manually publish mode. Possible causes are:236 kw = {'depstack': stack.stack_name, 'deprel': stack.release}
166 * Some part of the stack has packaging changes237 message = MANUAL_PUBLISH.format(**kw)
167 * This stack is depending on another stack not being published
168
169 What needs to be done:
170 Either:
171 * If {depstack} ({deprel}) can be published, we should publish both stacks at the same time.
172 Or:
173 * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
174 elif status == 3 or status == -1:238 elif status == 3 or status == -1:
175 message = '''{depstack} ({deprel}) has been manually aborted or failed for an unknown reason. Possible causes are:239 kw = {'depstack': stack.stack_name, 'deprel': stack.release}
176 * A job of this stack was stopped manually240 message = MANUALLY_ABORTED.format(**kw)
177 * Jenkins had an internal error/shutdown
178
179 What needs to be done:
180 * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
181241
182 if message:242 if message:
183 logging.warning(message)243 logging.warning(message)
@@ -186,19 +246,21 @@
186246
187 @staticmethod247 @staticmethod
188 def get_root_stacks_dir():248 def get_root_stacks_dir():
189 '''Get root stack dir'''249 '''Get root stack dir'''
190 return os.environ.get('CONFIG_STACKS_DIR', DEFAULT_CONFIG_STACKS_DIR)250 return os.environ.get('CONFIG_STACKS_DIR', DEFAULT_CONFIG_STACKS_DIR)
191251
192 @staticmethod252 @staticmethod
193 def get_stacks_file_path(release):253 def get_stacks_file_path(release):
194 '''Return an iterator with all path for every discovered stack files'''254 '''Return an iterator with all path for every discovered stack files'''
195 for root, dirs, files in os.walk(os.path.join(Stack.get_root_stacks_dir(), release)):255 walking = os.walk(os.path.join(Stack.get_root_stacks_dir(), release))
256 for root, dirs, files in walking:
196 for candidate in files:257 for candidate in files:
197 if candidate.endswith('.cfg'):258 if candidate.endswith('.cfg'):
198 yield os.path.join(root, candidate)259 yield os.path.join(root, candidate)
199260
200 @staticmethod261 @staticmethod
201 def get_current_stack():262 def get_current_stack():
202 '''Return current stack object based on current path (release/stackname)'''263 '''Return current stack object based on current path
264 (release/stackname)'''
203 path = os.getcwd().split(os.path.sep)265 path = os.getcwd().split(os.path.sep)
204 return get_stack(path[-2], path[-1])266 return get_stack(path[-2], path[-1])
205267
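The hunks above replace the three inline wall-of-text messages with module-level template constants filled via `.format(**kw)`. A minimal sketch of that pattern, using abbreviated stand-in texts (the real `CANT_FIND_STATUS`, `FAILED_TO_PUBLISH`, `MANUAL_PUBLISH` and `MANUALLY_ABORTED` constants carry the full advice):

```python
# Abbreviated stand-ins for the module-level templates the branch introduces.
FAILED_TO_PUBLISH = "{depstack} ({deprel}) failed to publish."
MANUAL_PUBLISH = "{depstack} ({deprel}) is in manually publish mode."
MANUALLY_ABORTED = "{depstack} ({deprel}) has been manually aborted."


def status_message(depstack, deprel, status):
    """Map a numeric stack status to a warning message, or None."""
    kw = {'depstack': depstack, 'deprel': deprel}
    if status == 1:
        return FAILED_TO_PUBLISH.format(**kw)
    elif status == 2:
        return MANUAL_PUBLISH.format(**kw)
    elif status == 3 or status == -1:
        return MANUALLY_ABORTED.format(**kw)
    return None
```

Keeping the templates at module level keeps the per-status branches down to two short lines each, which is what lets the method fit under the pep8 line limit.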
=== modified file 'branch-source-builder/cupstream2distro/stacks.py'
--- branch-source-builder/cupstream2distro/stacks.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/stacks.py 2014-02-23 16:37:01 +0000
@@ -21,6 +21,7 @@
 import os
 import yaml
 import subprocess
+from subprocess import PIPE
 
 from .settings import PACKAGE_LIST_RSYNC_FILENAME_PREFIX, RSYNC_PATTERN
 from .tools import get_packaging_diff_filename
@@ -38,7 +39,7 @@
         raise Exception('Please set environment variable CU2D_RSYNCSVR')
 
     cmd = ["rsync", '--remove-source-files', '--timeout=60', remoteaddr, '.']
-    instance = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
     (stdout, stderr) = instance.communicate()
     if instance.returncode not in (0, 23):
         raise Exception(stderr.decode("utf-8").strip())
@@ -62,10 +63,12 @@
         try:
             projects_list = cfg['stack']['projects']
         except (TypeError, KeyError):
-            logging.warning("{} seems broken in not having stack or projects keys".format(file_path))
+            tmpl = "{} seems broken in not having stack or projects keys"
+            logging.warning(tmpl.format(file_path))
             continue
         if not projects_list:
-            logging.warning("{} don't have any project list".format(file_path))
+            tmpl = "{} don't have any project list"
+            logging.warning(tmpl.format(file_path))
             continue
         for project in projects_list:
             if isinstance(project, dict):
@@ -74,11 +77,13 @@
                 projects.append(project)
     return set(projects)
 
+
 def get_stack_packaging_change_status(source_version_list):
     '''Return global package change status list
 
     # FIXME: added too many infos now, should only be: (source, version)
-    source_version_list is a list of couples (source, version, tip_rev, target_branch)'''
+    source_version_list is a list of couples (source, version, tip_rev,
+    target_branch)'''
 
     packaging_change_status = []
     for (source, version, tip_rev, target_branch) in source_version_list:
 
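The rsync call in this file treats exit code 23 (partial transfer) as success alongside 0. A self-contained sketch of that "run, but tolerate known-benign exit codes" pattern; `run_tolerating` is an illustrative name, not a function from the branch:

```python
import subprocess
from subprocess import PIPE


def run_tolerating(cmd, ok_codes=(0, 23)):
    """Run cmd, raising the captured stderr unless the exit code is one we
    tolerate (rsync returns 23 for a partial transfer)."""
    instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
    (stdout, stderr) = instance.communicate()
    if instance.returncode not in ok_codes:
        raise Exception(stderr.decode("utf-8").strip())
    return stdout.decode("utf-8")
```

Capturing stderr and raising it as the exception message is what makes a failed transfer show a useful reason in the logs rather than a bare return code.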
=== modified file 'branch-source-builder/cupstream2distro/tools.py'
--- branch-source-builder/cupstream2distro/tools.py 2014-02-19 16:34:46 +0000
+++ branch-source-builder/cupstream2distro/tools.py 2014-02-23 16:37:01 +0000
@@ -26,18 +26,21 @@
 from .settings import PROJECT_CONFIG_SUFFIX
 from .utils import ignored
 
-WRAPPER_STRING = '''<testsuite errors="0" failures="{}" name="" tests="1" time="0.1">
+WRAPPER_STRING = '''<testsuite errors="0" failures="{}" name="" tests="1"
+time="0.1">
     <testcase classname="MarkUnstable" name={} time="0.0">{}</testcase>
 </testsuite>'''
 
 
 def generate_xml_artefacts(test_name, details, filename):
-    '''Generate a fake test name xml result for marking the build as unstable'''
+    '''Generate a fake test name xml result for marking the build as
+    unstable'''
     failure = ""
     errnum = 0
     for detail in details:
         errnum = 1
-        failure += '        <failure type="exception">{}</failure>\n'.format(escape(detail))
+        ex = escape(detail)
+        failure += '        <failure type="exception">{}</failure>\n'.format(ex)
     if failure:
         failure = '\n{}'.format(failure)
 
@@ -52,7 +55,8 @@
     return config.get('Package', 'dest_current_version')
 
 
-def save_project_config(source_package_name, branch, revision, dest_current_version, current_packaging_version):
+def save_project_config(source_package_name, branch, revision,
+                        dest_current_version, current_packaging_version):
     '''Save branch and package configuration'''
     config = ConfigParser.RawConfigParser()
     config.add_section('Branch')
@@ -61,21 +65,27 @@
     config.add_section('Package')
     config.set('Package', 'dest_current_version', dest_current_version)
     config.set('Package', 'packaging_version', current_packaging_version)
-    with open("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX), 'wb') as configfile:
+    path = "{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)
+    with open(path, 'wb') as configfile:
         config.write(configfile)
 
 
 def get_packaging_diff_filename(source_package_name, packaging_version):
     '''Return the packaging diff filename'''
 
-    return "packaging_changes_{}_{}.diff".format(source_package_name, packaging_version)
+    ret = "packaging_changes_{}_{}.diff"
+    return ret.format(source_package_name, packaging_version)
 
 
 def mark_project_as_published(source_package_name, packaging_version):
-    '''Rename .project and eventual diff files so that if we do a partial rebuild, we don't try to republish them'''
-    project_filename = "{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)
-    os.rename(project_filename, "{}_{}".format(project_filename, packaging_version))
-    diff_filename = get_packaging_diff_filename(source_package_name, packaging_version)
+    '''Rename .project and eventual diff files so that if we do a partial
+    rebuild, we don't try to republish them'''
+    tmpl = "{}.{}"
+    project_filename = tmpl.format(source_package_name, PROJECT_CONFIG_SUFFIX)
+    new_name = tmpl.format(project_filename, packaging_version)
+    os.rename(project_filename, new_name)
+    diff_filename = get_packaging_diff_filename(source_package_name,
                                                 packaging_version)
     if os.path.isfile(diff_filename):
         os.rename(diff_filename, "{}.published".format(diff_filename))
 
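The escaping step in `generate_xml_artefacts` can be exercised in isolation. This sketch returns the XML instead of writing it to `filename`, and quotes `name` itself; both are simplifications for illustration, and the template spacing is only approximate:

```python
from xml.sax.saxutils import escape

# Stand-in for the WRAPPER_STRING template in tools.py (same shape).
WRAPPER = '''<testsuite errors="0" failures="{}" name="" tests="1" time="0.1">
    <testcase classname="MarkUnstable" name={} time="0.0">{}</testcase>
</testsuite>'''


def render_xml_artefact(test_name, details):
    """Build the fake JUnit payload; unlike the real helper, return it
    instead of writing it to a file."""
    failure = ""
    errnum = 0
    for detail in details:
        errnum = 1
        ex = escape(detail)
        failure += '        <failure type="exception">{}</failure>\n'.format(ex)
    if failure:
        failure = '\n{}'.format(failure)
    return WRAPPER.format(errnum, '"{}"'.format(test_name), failure)
```

Escaping each detail matters because build failure messages routinely contain `&`, `<` and `>`, which would otherwise produce XML that Jenkins refuses to parse.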
=== modified file 'charms/precise/python-django/hooks/hooks.py'
--- charms/precise/python-django/hooks/hooks.py 2014-02-18 20:11:41 +0000
+++ charms/precise/python-django/hooks/hooks.py 2014-02-23 16:37:01 +0000
@@ -562,7 +562,6 @@
 
 def config_changed(config_data):
     os.environ['DJANGO_SETTINGS_MODULE'] = django_settings_modules
-    django_admin_cmd = find_django_admin_cmd()
 
     site_secret_key = config_data['site_secret_key']
     if not site_secret_key:
@@ -610,7 +609,7 @@
         else:
             run('git pull %s %s' % (repos_url, vcs_clone_dir))
     elif vcs == 'bzr' or vcs == 'bazaar':
-        run('cd %s; bzr pull %s %s' % (vcs_clonse_dir, repos_url))
+        run('cd %s; bzr pull %s %s' % (vcs_clone_dir, repos_url))
     elif vcs == 'svn' or vcs == 'subversion':
         run('svn up %s %s' % (repos_url, vcs_clone_dir))
     else:
 
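One detail worth flagging in the hunk above: the corrected bzr line fixes the `vcs_clonse_dir` typo but still feeds three `%s` placeholders with only two values, which raises `TypeError` as soon as the bzr branch is taken. A quick illustration of the pitfall (the paths and `render_pull_command` are made up for this example):

```python
def render_pull_command(vcs_clone_dir, repos_url):
    # A placeholder count that matches the argument count works:
    return 'cd %s; bzr pull %s' % (vcs_clone_dir, repos_url)


# Three %s placeholders fed two values, as in the hunk above:
try:
    'cd %s; bzr pull %s %s' % ('/srv/checkout', 'lp:project')
    arity_error = False
except TypeError:
    arity_error = True
```

Because the mismatch only surfaces at run time on the bzr code path, pep8/pyflakes passes will not catch it; it needs either a test or a manual fix.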
=== modified file 'image-builder/imagebuilder/tests/test_style.py'
--- image-builder/imagebuilder/tests/test_style.py 2014-02-15 12:06:40 +0000
+++ image-builder/imagebuilder/tests/test_style.py 2014-02-23 16:37:01 +0000
@@ -14,15 +14,19 @@
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 
 from ucitests import styles
-
-import imagebuilder
+# If we just import run_worker, we'll end up testing whichever run_worker was
+# last imported, so import it under a different name.
+import imagebuilder_run_worker
 
 
 class TestPep8(styles.TestPep8):
 
-    packages = [imagebuilder]
+    # uci-tests will scan all subdirectories that this module is found in, so
+    # we do not need to explicitly provide imagebuilder. Doing so would cause
+    # the tests to be run twice.
+    packages = [imagebuilder_run_worker]
 
 
 class TestPyflakes(styles.TestPyflakes):
 
-    packages = [imagebuilder]
+    packages = [imagebuilder_run_worker]
 
=== added symlink 'image-builder/imagebuilder_run_worker.py'
=== target is u'run_worker'
=== modified file 'image-builder/run_worker'
--- image-builder/run_worker 2014-01-28 13:50:43 +0000
+++ image-builder/run_worker 2014-02-23 16:37:01 +0000
@@ -50,7 +50,7 @@
         if amqp_utils.progress_completed(trigger, {'image_id': image_id}):
             log.error(
                 'Unable to notify progress-trigger completition of action')
-    except Exception as e:
+    except Exception:
         type, val, tb = sys.exc_info()
         amqp_utils.progress_failed(trigger, {'message': val.message})
         log.exception('Image build failed:')
 
=== modified file 'lander/lander/tests/test_style.py'
--- lander/lander/tests/test_style.py 2014-02-15 12:06:40 +0000
+++ lander/lander/tests/test_style.py 2014-02-23 16:37:01 +0000
@@ -14,15 +14,19 @@
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 
 from ucitests import styles
-
-import lander
+# If we just import run_worker, we'll end up testing whichever run_worker was
+# last imported, so import it under a different name.
+import lander_run_worker
 
 
 class TestPep8(styles.TestPep8):
 
-    packages = [lander]
+    # uci-tests will scan all subdirectories that this module is found in, so
+    # we do not need to explicitly provide lander. Doing so would cause the
+    # tests to be run twice.
+    packages = [lander_run_worker]
 
 
 class TestPyflakes(styles.TestPyflakes):
 
-    packages = [lander]
 
=== added symlink 'lander/lander_run_worker.py'
=== target is u'run_worker'
=== modified file 'test_runner/tstrun/tests/test_style.py'
--- test_runner/tstrun/tests/test_style.py 2014-02-15 12:06:40 +0000
+++ test_runner/tstrun/tests/test_style.py 2014-02-23 16:37:01 +0000
@@ -16,13 +16,19 @@
 from ucitests import styles
 
 import tstrun
+# If we just import run_worker, we'll end up testing whichever run_worker was
+# last imported, so import it under a different name.
+import tstrun_run_worker
 
 
 class TestPep8(styles.TestPep8):
 
-    packages = [tstrun]
+    # uci-tests will scan all subdirectories that this module is found in, so
+    # we do not need to explicitly provide tstrun. Doing so would cause
+    # the tests to be run twice.
+    packages = [tstrun_run_worker]
 
 
 class TestPyflakes(styles.TestPyflakes):
 
-    packages = [tstrun]
+    packages = [tstrun_run_worker]
 
=== added symlink 'test_runner/tstrun_run_worker.py'
=== target is u'run_worker'
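The `*_run_worker.py` symlinks added throughout this branch give each extension-less `run_worker` script a unique, importable module name, so the per-component style tests never collide on a shared `run_worker` name. A sketch of why that works (POSIX-only; the temporary script here is a stand-in for a real worker):

```python
import importlib
import os
import sys
import tempfile

# A stand-in for an extension-less worker script such as run_worker.
workdir = tempfile.mkdtemp()
script = os.path.join(workdir, 'run_worker')
with open(script, 'w') as f:
    f.write('ANSWER = 42\n')

# The symlink gives the script a unique, importable .py name, so each
# component's run_worker can be style-checked without clobbering another's.
os.symlink(script, os.path.join(workdir, 'lander_run_worker.py'))
sys.path.insert(0, workdir)
module = importlib.import_module('lander_run_worker')
```

Python follows the symlink when loading the module, so pep8/pyflakes see the real script source while `sys.modules` keeps one distinct entry per component.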
