Merge lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage into lp:ubuntu-ci-services-itself

Proposed by Evan
Status: Needs review
Proposed branch: lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage
Merge into: lp:ubuntu-ci-services-itself
Diff against target: 1888 lines (+633/-308)
16 files modified
branch-source-builder/bsbuilder/tests/test_style.py (+8/-4)
branch-source-builder/cupstream2distro/branchhandling.py (+75/-40)
branch-source-builder/cupstream2distro/launchpadmanager.py (+38/-18)
branch-source-builder/cupstream2distro/packageinppa.py (+86/-49)
branch-source-builder/cupstream2distro/packageinppamanager.py (+24/-12)
branch-source-builder/cupstream2distro/packagemanager.py (+204/-102)
branch-source-builder/cupstream2distro/settings.py (+11/-5)
branch-source-builder/cupstream2distro/silomanager.py (+30/-11)
branch-source-builder/cupstream2distro/stack.py (+102/-40)
branch-source-builder/cupstream2distro/stacks.py (+9/-4)
branch-source-builder/cupstream2distro/tools.py (+20/-10)
charms/precise/python-django/hooks/hooks.py (+1/-2)
image-builder/imagebuilder/tests/test_style.py (+8/-4)
image-builder/run_worker (+1/-1)
lander/lander/tests/test_style.py (+8/-4)
test_runner/tstrun/tests/test_style.py (+8/-2)
To merge this branch: bzr merge lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage
Reviewer Review Type Date Requested Status
PS Jenkins bot (community) continuous-integration Needs Fixing
Canonical CI Engineering Pending
Review via email: mp+207801@code.launchpad.net

Commit message

Fix a large number of pep8 issues in cupstream2distro. Cover run_worker and other out-of-module scripts with the pep8 and pyflakes tests.

Description of the change

I noticed that pyflakes wasn't running against watch_ppa.py, which exists a level above the bsbuilder module.

While fixing this and running the tests, I was having a hard time pulling useful failures from the noise generated by cupstream2distro. So I fixed all of its pep8 and pyflakes issues.

We should be very careful in landing this branch. cupstream2distro is completely untested code and I made some logic changes to it.

You'll notice that there are symlinks to the run_worker script in a few modules. These exist for three reasons:
 1. pep8 and pyflakes are hardcoded in uci-tests to look for *.py files.
 2. Importing `run_worker` using imp creates a compiled `run_workerc` file alongside the script unless you tell Python to give the compiled code a different name (by providing an open file and a path argument).
 3. Our test harness imports all the run_worker scripts under the same namespace, so you end up using whichever was imported last. Providing a uniquely named symlink fixes this.
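The namespace collision in point 3 can be illustrated with the modern importlib equivalent of the imp-based trick (a sketch only; the module name `bsbuilder_run_worker` mirrors the symlink added in this branch, and the script content here is hypothetical):

```python
import importlib.machinery
import importlib.util
import os
import sys
import tempfile

# Create a stand-in extensionless "run_worker" script (hypothetical content).
workdir = tempfile.mkdtemp()
script = os.path.join(workdir, "run_worker")
with open(script, "w") as f:
    f.write("WORKER_NAME = 'bsbuilder'\n")

# Load the script under a unique module name, so two run_worker scripts from
# different components never shadow each other in sys.modules. SourceFileLoader
# happily compiles a file with no .py extension.
loader = importlib.machinery.SourceFileLoader("bsbuilder_run_worker", script)
spec = importlib.util.spec_from_loader("bsbuilder_run_worker", loader)
module = importlib.util.module_from_spec(spec)
sys.modules["bsbuilder_run_worker"] = module
spec.loader.exec_module(module)

print(module.WORKER_NAME)  # prints 'bsbuilder'
```

A second script loaded the same way under a different name (say `lander_run_worker`) would coexist with this one, which is exactly what the uniquely named symlinks buy the test harness.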

Note that there is still a pyflakes failure with this branch. That is bug 1283449.

Revision history for this message
PS Jenkins bot (ps-jenkins) wrote :

FAILED: Continuous integration, rev:280
No commit message was specified in the merge proposal. Click on the following link and set the commit message (if you want a jenkins rebuild you need to trigger it yourself):
https://code.launchpad.net/~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage/+merge/207801/+edit-commit-message

http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/217/
Executed test runs:

Click here to trigger a rebuild:
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/217/rebuild

review: Needs Fixing (continuous-integration)
281. By Evan

Expand the pyflakes/pep8 coverage for the test_runner as well.

Revision history for this message
PS Jenkins bot (ps-jenkins) wrote :

FAILED: Continuous integration, rev:281
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/219/
Executed test runs:

Click here to trigger a rebuild:
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/219/rebuild

review: Needs Fixing (continuous-integration)
Revision history for this message
Ursula Junque (ursinha) wrote :

I like how elegantly this handles the huge messages (the part that was worrying me most). I've been working on adding tests to this code before actually making such changes, because they are already part of upstream and "only" need to be ported (at least basic unit tests), and that would avoid headaches in case things started to fail because we accidentally changed the logic somewhere. If you agree to hold this for a bit I think I can land tests first, then you can land this branch safely.

Revision history for this message
Evan (ev) wrote :

Yup, I think that's entirely reasonable. Let's put this on hold until your branch lands.

Unmerged revisions

281. By Evan

Expand the pyflakes/pep8 coverage for the test_runner as well.

280. By Evan

Fix the style tests. Specifying more than one module under the same directory causes it to be tested twice by pep8 and pyflakes. Modules imported with the same name will override each other.

279. By Evan

Fix pep8 issues in cupstream2distro packageinppa module.

278. By Evan

Fix pep8 issues in cupstream2distro packageinppamanager module.

277. By Evan

Fix pep8 issues in cupstream2distro packagemanager module.

276. By Evan

Fix pep8 issues in cupstream2distro silomanager module.

275. By Evan

Fix pep8 issues in cupstream2distro settings module.

274. By Evan

Fix pep8 issues in cupstream2distro stacks module.

273. By Evan

Oops. One more in tools.py.

272. By Evan

Fix pep8 issues in cupstream2distro tools module.

Preview Diff

1=== modified file 'branch-source-builder/bsbuilder/tests/test_style.py'
2--- branch-source-builder/bsbuilder/tests/test_style.py 2014-02-15 12:06:40 +0000
3+++ branch-source-builder/bsbuilder/tests/test_style.py 2014-02-23 16:37:01 +0000
4@@ -14,15 +14,19 @@
5 # along with this program. If not, see <http://www.gnu.org/licenses/>.
6
7 from ucitests import styles
8-
9-import bsbuilder
10+# If we just import run_worker, we'll end up testing whichever run_worker was
11+# last imported, so import it under a different name.
12+import bsbuilder_run_worker
13
14
15 class TestPep8(styles.TestPep8):
16
17- packages = [bsbuilder]
18+ # uci-tests will scan all subdirectories that this module is found in, so
19+ # we do not need to explicitly provide bsbuilder. Doing so would cause the
20+ # tests to be run twice.
21+ packages = [bsbuilder_run_worker]
22
23
24 class TestPyflakes(styles.TestPyflakes):
25
26- packages = [bsbuilder]
27+ packages = [bsbuilder_run_worker]
28
29=== added symlink 'branch-source-builder/bsbuilder_run_worker.py'
30=== target is u'run_worker'
31=== modified file 'branch-source-builder/cupstream2distro/branchhandling.py'
32--- branch-source-builder/cupstream2distro/branchhandling.py 2014-02-19 16:34:46 +0000
33+++ branch-source-builder/cupstream2distro/branchhandling.py 2014-02-23 16:37:01 +0000
34@@ -23,13 +23,17 @@
35 import os
36 import re
37 import subprocess
38+from subprocess import PIPE
39
40-from .settings import BRANCH_URL, IGNORECHANGELOG_COMMIT, PACKAGING_MERGE_COMMIT_MESSAGE, PROJECT_CONFIG_SUFFIX, SILO_PACKAGING_RELEASE_COMMIT_MESSAGE
41+from .settings import (BRANCH_URL, IGNORECHANGELOG_COMMIT,
42+ PACKAGING_MERGE_COMMIT_MESSAGE, PROJECT_CONFIG_SUFFIX,
43+ SILO_PACKAGING_RELEASE_COMMIT_MESSAGE)
44
45
46 def get_branch(branch_url, dest_dir):
47 '''Grab a branch'''
48- instance = subprocess.Popen(["bzr", "branch", branch_url, dest_dir], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
49+ cmd = ["bzr", "branch", branch_url, dest_dir]
50+ instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
51 (stdout, stderr) = instance.communicate()
52 if instance.returncode != 0:
53 raise Exception(stderr.decode("utf-8").strip())
54@@ -37,15 +41,18 @@
55
56 def get_tip_bzr_revision():
57 '''Get latest revision in bzr'''
58- instance = subprocess.Popen(["bzr", "log", "-c", "-1", "--line"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
59+ cmd = ["bzr", "log", "-c", "-1", "--line"]
60+ instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
61 (stdout, stderr) = instance.communicate()
62 if instance.returncode != 0:
63 raise Exception(stderr.decode("utf-8").strip())
64 return (int(stdout.split(':')[0]))
65
66
67-def collect_author_commits(content_to_parse, bugs_to_skip, additional_stamp=""):
68- '''return a tuple of a dict with authors and commits message from the content to parse
69+def collect_author_commits(content_to_parse, bugs_to_skip,
70+ additional_stamp=""):
71+ '''Return a tuple of a dict with authors and commits message from the
72+ content to parse.
73
74 bugs_to_skip is a set of bugs we need to skip
75
76@@ -60,15 +67,17 @@
77 commit_message_stenza = False
78 for line in content_to_parse.splitlines():
79 # new revision, collect what we have found
80- if line.startswith("------------------------------------------------------------"):
81- # try to decipher a special case: we have some commits which were already in bugs_to_skip,
82- # so we eliminate them.
83+ dash = "------------------------------------------------------------"
84+ if line.startswith(dash):
85+ # try to decipher a special case: we have some commits which were
86+ # already in bugs_to_skip, so we eliminate them.
87 # Also ignore when having IGNORECHANGELOG_COMMIT
88- if (current_bugs and not (current_bugs - bugs_to_skip)) or IGNORECHANGELOG_COMMIT in current_commit:
89- current_authors = set()
90- current_commit = ""
91- current_bugs = set()
92- continue
93+ if ((current_bugs and not (current_bugs - bugs_to_skip))
94+ or IGNORECHANGELOG_COMMIT in current_commit):
95+ current_authors = set()
96+ current_commit = ""
97+ current_bugs = set()
98+ continue
99 current_bugs -= bugs_to_skip
100 commit_message = current_commit + _format_bugs(current_bugs)
101 for author in current_authors:
102@@ -79,7 +88,8 @@
103 current_commit = ""
104 current_bugs = set()
105
106- # we ignore this commit if we have a changelog provided as part of the diff
107+ # we ignore this commit if we have a changelog provided as part of the
108+ # diff
109 if line.startswith("=== modified file 'debian/changelog'"):
110 current_authors = set()
111 current_commit = ""
112@@ -94,14 +104,15 @@
113 elif commit_message_stenza:
114 if line.startswith("diff:"):
115 commit_message_stenza = False
116- current_commit, current_bugs = _extract_commit_bugs(current_commit, additional_stamp)
117+ args = (current_commit, additional_stamp)
118+ current_commit, current_bugs = _extract_commit_bugs(*args)
119 else:
120- line = line[2:] # Dedent the message provided by bzr
121- if line[0:2] in ('* ', '- '): # paragraph line.
122- line = line[2:] # Remove bullet
123- if line[-1] != '.': # Grammar nazi...
124- line += '.' # ... or the lines will be merged.
125- line = line + ' ' # Add a space to preserve lines
126+ line = line[2:] # Dedent the message provided by bzr
127+ if line[0:2] in ('* ', '- '): # paragraph line.
128+ line = line[2:] # Remove bullet
129+ if line[-1] != '.': # Grammar nazi...
130+ line += '.' # ... or the lines will be merged.
131+ line = line + ' ' # Add a space to preserve lines
132 current_commit += line
133 # Maybe add something like that
134 #for content in mp.commit_message.split('\n'):
135@@ -127,10 +138,13 @@
136
137
138 def _extract_commit_bugs(commit_message, additional_stamp=""):
139- '''extract relevant commit message part and bugs number from a commit message'''
140+ '''extract relevant commit message part and bugs number from a commit
141+ message'''
142
143 current_bugs = _return_bugs(commit_message)
144- changelog_content = " ".join(commit_message.rsplit('Fixes: ')[0].rsplit('Approved by ')[0].split())
145+ changelog_content = commit_message.rsplit('Fixes: ')[0]
146+ changelog_content = changelog_content.rsplit('Approved by ')[0].split()
147+ changelog_content = " ".join(changelog_content)
148 if additional_stamp:
149 changelog_content = changelog_content + " " + additional_stamp
150 return (changelog_content, current_bugs)
151@@ -149,7 +163,8 @@
152 # #12345 (but not 12345 for false positive)
153 # Support multiple bugs per commit
154 bug_numbers = set()
155- bug_regexp = re.compile("((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})", re.IGNORECASE)
156+ pattern = "((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})"
157+ bug_regexp = re.compile(pattern, re.IGNORECASE)
158 for match in bug_regexp.findall(string):
159 logging.debug("Bug regexp match: {}".format(match[-1]))
160 bug_numbers.add(int(match[-1]))
161@@ -170,7 +185,9 @@
162 def return_log_diff(starting_rev):
163 '''Return the relevant part of the cvs log since starting_rev'''
164
165- instance = subprocess.Popen(["bzr", "log", "-r", "{}..".format(starting_rev), "--show-diff", "--forward"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
166+ rev = "{}..".format(starting_rev)
167+ cmd = ["bzr", "log", "-r", rev, "--show-diff", "--forward"]
168+ instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
169 (stdout, stderr) = instance.communicate()
170 if instance.returncode != 0:
171 raise Exception(stderr.decode("utf-8").strip())
172@@ -178,8 +195,10 @@
173
174
175 def return_log_diff_since_last_release(content_to_parse):
176- '''From a bzr log content, return only the log diff since the latest release'''
177- after_release = content_to_parse.split(SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(''))[-1]
178+ '''From a bzr log content, return only the log diff since the latest
179+ release'''
180+ formatted = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format('')
181+ after_release = content_to_parse.split(formatted)[-1]
182 sep = '------------------------------------------------------------'
183 sep_index = after_release.find(sep)
184 if sep_index != 1:
185@@ -190,10 +209,11 @@
186 def commit_release(new_package_version, tip_bzr_rev=None):
187 '''Commit latest release'''
188 if not tip_bzr_rev:
189- message = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(new_package_version)
190+ msg = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(new_package_version)
191 else:
192- message = "Releasing {}, based on r{}".format(new_package_version, tip_bzr_rev)
193- if subprocess.call(["bzr", "commit", "-m", message]) != 0:
194+ msg = "Releasing {}, based on r{}"
195+ msg = msg.format(new_package_version, tip_bzr_rev)
196+ if subprocess.call(["bzr", "commit", "-m", msg]) != 0:
197 raise Exception("The above command returned an error.")
198
199
200@@ -214,20 +234,26 @@
201 env["BZR_EDITOR"] = "echo"
202
203 os.chdir(source_package_name)
204- if subprocess.call(["bzr", "push", BRANCH_URL.format(source_package_name, version.replace("~", "").replace(":", "")), "--overwrite"]) != 0:
205+ args = (source_package_name, version.replace("~", "").replace(":", ""))
206+ loc = BRANCH_URL.format(*args)
207+ if subprocess.call(["bzr", "push", loc, "--overwrite"]) != 0:
208 raise Exception("The push command returned an error.")
209- mergeinstance = subprocess.Popen(["bzr", "lp-propose-merge", parent_branch, "-m", PACKAGING_MERGE_COMMIT_MESSAGE.format(version, tip_rev, branch), "--approve"], stdin=subprocess.PIPE, env=env)
210+ msg = PACKAGING_MERGE_COMMIT_MESSAGE.format(version, tip_rev, branch)
211+ cmd = ["bzr", "lp-propose-merge", parent_branch, "-m", msg, "--approve"]
212+ mergeinstance = subprocess.Popen(cmd, stdin=PIPE, env=env)
213 mergeinstance.communicate(input="y")
214 if mergeinstance.returncode != 0:
215 raise Exception("The lp-propose command returned an error.")
216 os.chdir('..')
217
218
219-def merge_branch_with_parent_into(local_branch_uri, lp_parent_branch, dest_uri, commit_message, revision):
220+def merge_branch_with_parent_into(local_branch_uri, lp_parent_branch, dest_uri,
221+ commit_message, revision):
222 """Merge local branch into lp_parent_branch at revision"""
223 success = False
224 cur_dir = os.path.abspath('.')
225- subprocess.call(["bzr", "branch", "-r", str(revision), lp_parent_branch, dest_uri])
226+ cmd = ["bzr", "branch", "-r", str(revision), lp_parent_branch, dest_uri]
227+ subprocess.call(cmd)
228 os.chdir(dest_uri)
229 if subprocess.call(["bzr", "merge", local_branch_uri]) == 0:
230 subprocess.call(["bzr", "commit", "-m", commit_message])
231@@ -236,12 +262,14 @@
232 return success
233
234
235-def merge_branch(uri_to_merge, lp_parent_branch, commit_message, authors=set()):
236+def merge_branch(uri_to_merge, lp_parent_branch, commit_message,
237+ authors=set()):
238 """Resync with targeted branch if possible"""
239 success = False
240 cur_dir = os.path.abspath('.')
241 os.chdir(uri_to_merge)
242- lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:")
243+ args = ("https://code.launchpad.net/", "lp:")
244+ lp_parent_branch = lp_parent_branch.replace(*args)
245 if subprocess.call(["bzr", "merge", lp_parent_branch]) == 0:
246 cmd = ["bzr", "commit", "-m", commit_message, "--unchanged"]
247 for author in authors:
248@@ -251,12 +279,14 @@
249 os.chdir(cur_dir)
250 return success
251
252+
253 def push_to_branch(source_uri, lp_parent_branch, overwrite=False):
254 """Push source to parent branch"""
255 success = False
256 cur_dir = os.path.abspath('.')
257 os.chdir(source_uri)
258- lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:")
259+ args = ("https://code.launchpad.net/", "lp:")
260+ lp_parent_branch = lp_parent_branch.replace(*args)
261 command = ["bzr", "push", lp_parent_branch]
262 if overwrite:
263 command.append("--overwrite")
264@@ -265,16 +295,21 @@
265 os.chdir(cur_dir)
266 return success
267
268+
269 def grab_committers_compared_to(source_uri, lp_branch_to_scan):
270 """Return unique list of committers for a given branch"""
271 committers = set()
272 cur_dir = os.path.abspath('.')
273 os.chdir(source_uri)
274- lp_branch_to_scan = lp_branch_to_scan.replace("https://code.launchpad.net/", "lp:")
275- instance = subprocess.Popen(["bzr", "missing", lp_branch_to_scan, "--other"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
276+ args = ("https://code.launchpad.net/", "lp:")
277+ lp_branch_to_scan = lp_branch_to_scan.replace(*args)
278+ cmd = ["bzr", "missing", lp_branch_to_scan, "--other"]
279+ instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
280 (stdout, stderr) = instance.communicate()
281 if stderr != "":
282- raise Exception("bzr missing on {} returned a failure: {}".format(lp_branch_to_scan, stderr.decode("utf-8").strip()))
283+ msg = "bzr missing on {} returned a failure: {}"
284+ msg = msg.format(lp_branch_to_scan, stderr.decode("utf-8").strip())
285+ raise Exception(msg)
286 committer_regexp = re.compile("\ncommitter: (.*)\n")
287 for match in committer_regexp.findall(stdout):
288 for committer in match.split(', '):
289
290=== modified file 'branch-source-builder/cupstream2distro/launchpadmanager.py'
291--- branch-source-builder/cupstream2distro/launchpadmanager.py 2014-02-19 16:34:46 +0000
292+++ branch-source-builder/cupstream2distro/launchpadmanager.py 2014-02-23 16:37:01 +0000
293@@ -25,10 +25,13 @@
294 import os
295 launchpad = None
296
297-from .settings import ARCHS_TO_EVENTUALLY_IGNORE, ARCHS_TO_UNCONDITIONALLY_IGNORE, VIRTUALIZED_PPA_ARCH, CRED_FILE_PATH, COMMON_LAUNCHPAD_CACHE_DIR
298-
299-
300-def get_launchpad(use_staging=False, use_cred_file=os.path.expanduser(CRED_FILE_PATH)):
301+from .settings import (ARCHS_TO_EVENTUALLY_IGNORE,
302+ ARCHS_TO_UNCONDITIONALLY_IGNORE, VIRTUALIZED_PPA_ARCH,
303+ CRED_FILE_PATH, COMMON_LAUNCHPAD_CACHE_DIR)
304+
305+
306+def get_launchpad(use_staging=False,
307+ use_cred_file=os.path.expanduser(CRED_FILE_PATH)):
308 '''Get THE Launchpad'''
309 global launchpad
310 if not launchpad:
311@@ -42,14 +45,20 @@
312 os.makdedirs(launchpadlib_dir)
313
314 if use_cred_file:
315- launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"],
316- version='devel', # devel because copyPackage is only available there
317- credentials_file=use_cred_file,
318- launchpadlib_dir=launchpadlib_dir)
319+ launchpad = Launchpad.login_with(
320+ 'cupstream2distro', server,
321+ allow_access_levels=["WRITE_PRIVATE"],
322+ # devel because copyPackage is only available there
323+ version='devel',
324+ credentials_file=use_cred_file,
325+ launchpadlib_dir=launchpadlib_dir)
326 else:
327- launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"],
328- version='devel', # devel because copyPackage is only available there
329- launchpadlib_dir=launchpadlib_dir)
330+ launchpad = Launchpad.login_with(
331+ 'cupstream2distro', server,
332+ allow_access_levels=["WRITE_PRIVATE"],
333+ # devel because copyPackage is only available there
334+ version='devel',
335+ launchpadlib_dir=launchpadlib_dir)
336
337 return launchpad
338
339@@ -77,7 +86,8 @@
340 bug_title_sets = set()
341 for bug in author_bugs[author]:
342 try:
343- bug_title_sets.add("{} (LP: #{})".format(lp.bugs[bug].title, bug))
344+ title = "{} (LP: #{})".format(lp.bugs[bug].title, bug)
345+ bug_title_sets.add(title)
346 except KeyError:
347 # still list non existing or if launchpad timeouts bugs
348 bug_title_sets.add(u"Fix LP: #{}".format(bug))
349@@ -102,9 +112,12 @@
350 bug = lp.bugs[bug_num]
351 bug.addTask(target=package)
352 bug.lp_save()
353- except (KeyError, lazr.restfulclient.errors.BadRequest, lazr.restfulclient.errors.ServerError):
354- # ignore non existing or available bugs
355- logging.info("Can't synchronize upstream/downstream bugs for bug #{}. Not blocking on that.".format(bug_num))
356+ except (KeyError, lazr.restfulclient.errors.BadRequest,
357+ lazr.restfulclient.errors.ServerError):
358+ # ignore non existing or available bugs
359+ m = ("Can't synchronize upstream/downstream bugs "
360+ "for bug #{}. Not blocking on that.".format(bug_num))
361+ logging.info(m)
362
363
364 def get_available_and_all_archs(series, ppa=None):
365@@ -115,7 +128,8 @@
366 arch_all_arch = VIRTUALIZED_PPA_ARCH[0]
367 else:
368 for arch in series.architectures:
369- # HACK: filters armel as it's still seen as available on raring: https://launchpad.net/bugs/1077257
370+ # HACK: filters armel as it's still seen as available on raring:
371+ # https://launchpad.net/bugs/1077257
372 if arch.architecture_tag == "armel":
373 continue
374 available_arch.add(arch.architecture_tag)
375@@ -135,23 +149,29 @@
376 if ppa_name.startswith("ppa:"):
377 ppa_name = ppa_name[4:]
378 ppa_dispatch = ppa_name.split("/")
379- return get_launchpad().people[ppa_dispatch[0]].getPPAByName(name=ppa_dispatch[1])
380+ person = get_launchpad().people[ppa_dispatch[0]]
381+ return person.getPPAByName(name=ppa_dispatch[1])
382+
383
384 def is_series_current(series_name):
385 '''Return if series_name is the edge development version'''
386 return get_ubuntu().current_series.name == series_name
387
388+
389 def get_resource_from_url(url):
390 '''Return a lp resource from a launchpad url'''
391 lp = get_launchpad()
392- url = lp.resource_type_link.replace("/#service-root", "") + url.split("launchpad.net")[1]
393+ url = lp.resource_type_link.replace("/#service-root", "")
394+ url = url + url.split("launchpad.net")[1]
395 return lp.load(url)
396
397+
398 def get_resource_from_token(url):
399 '''Return a lp resource from a launchpad token'''
400 lp = get_launchpad()
401 return lp.load(url)
402
403+
404 def is_dest_ubuntu_archive(series_link):
405 '''return if series_link is the ubuntu archive'''
406 return series_link == get_ubuntu_archive().self_link
407
408=== modified file 'branch-source-builder/cupstream2distro/packageinppa.py'
409--- branch-source-builder/cupstream2distro/packageinppa.py 2014-02-19 16:34:46 +0000
410+++ branch-source-builder/cupstream2distro/packageinppa.py 2014-02-23 16:37:01 +0000
411@@ -27,8 +27,9 @@
412 (BUILDING, FAILED, PUBLISHED) = range(3)
413
414 def __init__(self, source_name, version, ppa, destarchive, series,
415- available_archs_in_ppa, arch_all_arch, archs_to_eventually_ignore,
416- archs_to_unconditionually_ignore, package_archs=None):
417+ available_archs_in_ppa, arch_all_arch,
418+ archs_to_eventually_ignore, archs_to_unconditionually_ignore,
419+ package_archs=None):
420 self.source_name = source_name
421 self.version = version
422 self.series = series
423@@ -39,7 +40,8 @@
424 # Get archs we should look at
425 version_for_source_file = version.split(':')[-1]
426 if not package_archs:
427- dsc_filename = "{}_{}.dsc".format(source_name, version_for_source_file)
428+ tmpl = "{}_{}.dsc"
429+ dsc_filename = tmpl.format(source_name, version_for_source_file)
430 regexp = re.compile("^Architecture: (.*)\n")
431 for line in open(dsc_filename):
432 arch_lists = regexp.findall(line)
433@@ -57,21 +59,28 @@
434 archs_supported_by_package = set()
435 for arch in package_archs.split():
436 archs_supported_by_package.add(arch)
437- self.archs = archs_supported_by_package.intersection(available_archs_in_ppa)
438- # ignore some eventual archs if doesn't exist in latest published version in dest
439+ intersection = archs_supported_by_package.intersection
440+ self.archs = intersection(available_archs_in_ppa)
441+ # ignore some eventual archs if doesn't exist in latest published
442+ # version in dest
443 archs_to_eventually_ignore = archs_to_eventually_ignore.copy()
444 if archs_to_eventually_ignore:
445 try:
446- previous_source = destarchive.getPublishedSources(exact_match=True, source_name=self.source_name,
447- distro_series=self.series, status="Published")[0]
448+ kw = {'exact_match': True, 'source_name': self.source_name,
449+ 'distro_series': self.series, 'status': "Published"}
450+ previous_source = destarchive.getPublishedSources(**kw)[0]
451 for binary in previous_source.getPublishedBinaries():
452- if binary.architecture_specific and binary.distro_arch_series.architecture_tag in archs_to_eventually_ignore:
453- archs_to_eventually_ignore -= set([binary.distro_arch_series.architecture_tag])
454+ if (binary.architecture_specific and
455+ binary.distro_arch_series.architecture_tag in
456+ archs_to_eventually_ignore):
457+ rem = [binary.distro_arch_series.architecture_tag]
458+ archs_to_eventually_ignore -= set(rem)
459 if not archs_to_eventually_ignore:
460 break
461
462 except IndexError:
463- # no package in dest, don't wait on any archs_to_eventually_ignore
464+ # no package in dest, don't wait on any
465+ # archs_to_eventually_ignore
466 pass
467 # remove from the inspection remaining archs to ignore
468 if archs_to_eventually_ignore:
469@@ -79,11 +88,9 @@
470 if archs_to_unconditionually_ignore:
471 self.archs -= archs_to_unconditionually_ignore
472
473-
474 def __repr__(self):
475 return '{} - {}'.format(self.source_name, self.version)
476
477-
478 def get_status(self, on_particular_arch=None):
479 '''Look at the package status in the ppa
480
481@@ -120,11 +127,15 @@
482
483 for arch in self.archs.copy():
484 if os.path.isfile("{}.{}.ignore".format(self.source_name, arch)):
485- logging.warning("Request to ignore {} on {}.".format(self.source_name, arch))
486+ tmpl = "Request to ignore {} on {}."
487+ logging.warning(tmpl.format(self.source_name, arch))
488 try:
489 self.archs.remove(arch)
490 except ValueError:
491- logging.warning("Request to ignore {} on {} has been proceeded, but this one wasn't in the list we were monitor for.".format(self.source_name, arch))
492+ tmpl = ("Request to ignore {} on {} has been proceeded, "
493+ "but this one wasn't in the list we were monitor "
494+ "for.")
495+ logging.warning(tmpl.format(self.source_name, arch))
496 try:
497 self.current_status.pop(arch)
498 except KeyError:
499@@ -137,10 +148,12 @@
500
501 # first step, get the source published
502 if not self.current_status:
503- (self.current_status, self.source) = self._get_status_for_source_package_in_ppa()
504+ status = self._get_status_for_source_package_in_ppa()
505+ (self.current_status, self.source) = status
506 # check the binary status
507 if self.current_status:
508- self.current_status = self._get_status_for_binary_packages_in_ppa()
509+ status = self._get_status_for_binary_packages_in_ppa()
510+ self.current_status = status
511
512 def _get_status_for_source_package_in_ppa(self):
513 '''Return current_status for source package in ppa.
514@@ -149,13 +162,16 @@
515 - None -> not visible yet
516 - BUILDING -> currently Building (or waiting to build)
517 - FAILED -> Build failed for this arch or has been canceled
518- - PUBLISHED -> All packages (including arch:all from other archs) published.
519+ - PUBLISHED -> All packages (including arch:all from other archs)
520+ published.
521
522- Only the 2 first status are returned by this call. See _get_status_for_binary_packages_in_ppa
523- for the others.'''
524+ Only the 2 first status are returned by this call. See
525+ _get_status_for_binary_packages_in_ppa for the others.'''
526
527 try:
528- source = self.ppa.getPublishedSources(exact_match=True, source_name=self.source_name, version=self.version, distro_series=self.series)[0]
529+ kw = {'exact_match': True, 'source_name': self.source_name,
530+ 'version': self.version, 'distro_series': self.series}
531+ source = self.ppa.getPublishedSources(**kw)[0]
532 logging.info("Source available in ppa")
533 current_status = {}
534 for arch in self.archs:
535@@ -172,26 +188,36 @@
536 - None -> not visible yet
537 - BUILDING -> currently Building (or waiting to build)
538 - FAILED -> Build failed for this arch or has been canceled
539- - PUBLISHED -> All packages (including arch:all from other archs) published.
540-
541- Only the 3 last statuses are returned by this call. See _get_status_for_source_package_in_ppa
542- for the other.'''
543-
544- # Try to see if all binaries availables for this arch are built, including arch:all on other archs
545+ - PUBLISHED -> All packages (including arch:all from other archs)
546+ published.
547+
548+ Only the 3 last statuses are returned by this call. See
549+ _get_status_for_source_package_in_ppa for the other.'''
550+
551+ # Try to see if all binaries availables for this arch are built,
552+ # including arch:all on other archs
553 status = self.current_status
554- at_least_one_published_binary = False
555 for binary in self.source.getPublishedBinaries():
556- at_least_one_published_binary = True
557 # all binaries for an arch are published at the same time
558- # launchpad is lying, it's telling that archs not in the ppa are built (for arch:all). Even for non supported arch!
559- # for instance, we can have the case of self.arch_all_arch (arch:all), built before the others and amd64 will be built for it
560- if binary.status == "Published" and (binary.distro_arch_series.architecture_tag == self.arch_all_arch or
561- (binary.distro_arch_series.architecture_tag != self.arch_all_arch and binary.architecture_specific)):
562- status[binary.distro_arch_series.architecture_tag] = self.PUBLISHED
563+ # launchpad is lying, it's telling that archs not in the ppa are
564+ # built (for arch:all). Even for non supported arch!
565+ # for instance, we can have the case of self.arch_all_arch
566+ # (arch:all), built before the others and amd64 will be built for
567+ # it
568+ published = binary.status == "Published"
569+ arch_tag = binary.distro_arch_series.architecture_tag
570+ not_arch_all = arch_tag != self.arch_all_arch
571+ arch_all = arch_tag == self.arch_all_arch
572+ arch_specific = binary.architecture_specific
573+ if published and (arch_all or (not_arch_all and arch_specific)):
574+ status[arch_tag] = self.PUBLISHED
575
576- # Looking for builds on archs still BUILDING (just loop on builds once to avoid too many lp requests)
577+ # Looking for builds on archs still BUILDING (just loop on builds once
578+ # to avoid too many lp requests)
579 needs_checking_build = False
580- build_state_failed = ('Failed to build', 'Chroot problem', 'Failed to upload', 'Cancelled build', 'Build for superseded Source')
581+ build_state_failed = ('Failed to build', 'Chroot problem',
582+ 'Failed to upload', 'Cancelled build',
583+ 'Build for superseded Source')
584 for arch in self.archs:
585 if self.current_status[arch] == self.BUILDING:
586 needs_checking_build = True
587@@ -202,24 +228,35 @@
588 continue
589 if self.current_status[build.arch_tag] == self.BUILDING:
590 if build.buildstate in build_state_failed:
591- logging.error("{}: Build {} ({}) failed because of {}".format(build.arch_tag, build.title,
592- build.web_link, build.buildstate))
593+ tmpl = "{}: Build {} ({}) failed because of {}"
594+ logging.error(tmpl.format(build.arch_tag, build.title,
595+ build.web_link, build.buildstate))
596 status[build.arch_tag] = self.FAILED
597- # Another launchpad trick: if a binary arch was published, but then is superseeded, getPublishedBinaries() won't list
598- # those binaries anymore. So it's seen as BUILDING again.
599- # If there is a successful build record of it and the source is superseded, it means that it built fine at some point,
600- # Another arch will fail as superseeded.
601- # We don't just retain the old state of "PUBLISHED" because maybe we started the script with that situation already
602- elif build.buildstate not in build_state_failed and self.source.status == "Superseded":
603- status[build.arch_tag] = self.PUBLISHED
604+ # Another launchpad trick: if a binary arch was published,
605+                    # but then is superseded, getPublishedBinaries() won't
606+ # list those binaries anymore. So it's seen as BUILDING
607+ # again. If there is a successful build record of it and
608+ # the source is superseded, it means that it built fine at
609+                    # some point; another arch will fail as superseded.
610+ # We don't just retain the old state of "PUBLISHED" because
611+ # maybe we started the script with that situation already
612+ elif (build.buildstate not in build_state_failed
613+ and self.source.status == "Superseded"):
614+ status[build.arch_tag] = self.PUBLISHED
615
616- # There is no way to know if there are some arch:all packages (and there are not in publishedBinaries for this arch until
617- # it's built on arch_all_arch). So mark all arch to BUILDING if self.arch_all_arch is building or FAILED if it failed.
618- if self.arch_all_arch in status and status[self.arch_all_arch] != self.PUBLISHED:
619+ # There is no way to know if there are some arch:all packages (and
620+ # there are not in publishedBinaries for this arch until
621+ # it's built on arch_all_arch). So mark all arch to BUILDING if
622+ # self.arch_all_arch is building or FAILED if it failed.
623+        if (self.arch_all_arch in status
624+                and status[self.arch_all_arch] != self.PUBLISHED):
625 for arch in self.archs:
626 if status[arch] == self.PUBLISHED:
627 status[arch] = status[self.arch_all_arch]
628- if arch != self.arch_all_arch and status[arch] == self.FAILED:
629- logging.error("{} marked as FAILED because {} build FAILED and we may miss arch:all packages".format(arch, self.arch_all_arch))
630+ failed = status[arch] == self.FAILED
631+ if arch != self.arch_all_arch and failed:
632+ msg = ("{} marked as FAILED because {} build FAILED "
633+ "and we may miss arch:all packages")
634+ logging.error(msg.format(arch, self.arch_all_arch))
635
636 return status
637
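A note on the `arch_all_arch` hunk above: the refactor hoists the `status[self.arch_all_arch]` lookup into a temporary before the `in` membership test, which drops the short-circuit guard the original one-liner relied on and can raise `KeyError`. A minimal sketch of the pitfall, with hypothetical `status` contents not taken from this code:

```python
# Hypothetical status dict: "i386" is deliberately absent, mirroring an
# arch that never appeared in getPublishedBinaries().
status = {"amd64": "PUBLISHED"}

# Safe: the `in` test runs first, so short-circuiting guards the lookup.
safe = "i386" in status and status["i386"] != "PUBLISHED"

# Unsafe refactor: hoisting the lookup evaluates it unconditionally.
try:
    not_published = status["i386"] != "PUBLISHED"  # raises KeyError
    unsafe = "i386" in status and not_published
except KeyError:
    unsafe = None  # the hoisted form blows up before the `in` test runs
```

Keeping the lookup on the right-hand side of the `and` (or parenthesizing the condition across two lines) preserves the guard while staying within the line-length limit.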
638=== modified file 'branch-source-builder/cupstream2distro/packageinppamanager.py'
639--- branch-source-builder/cupstream2distro/packageinppamanager.py 2014-02-19 16:34:46 +0000
640+++ branch-source-builder/cupstream2distro/packageinppamanager.py 2014-02-23 16:37:01 +0000
641@@ -41,7 +41,8 @@
642 # we do not rely on the .changes files but in the config file
643 # because we need the exact version (which can have an epoch)
644 result = set()
645- source_package_regexp = re.compile("(.*).{}$".format(PROJECT_CONFIG_SUFFIX))
646+ tmpl = "(.*).{}$"
647+ source_package_regexp = re.compile(tmpl.format(PROJECT_CONFIG_SUFFIX))
648 for file in os.listdir('.'):
649 substract = source_package_regexp.findall(file)
650 if substract:
651@@ -59,40 +60,51 @@
652
653
654 def get_packages_and_versions_uploaded():
655- '''Get (package, version) of all packages uploaded. We can have duplicates'''
656+ '''Get (package, version) of all packages uploaded. We can have
657+ duplicates'''
658
659 # we do not rely on the .changes files but in the config file
660 # because we need the exact version (which can have an epoch)
661 result = set()
662- source_package_regexp = re.compile("(.*).{}.*$".format(PROJECT_CONFIG_SUFFIX))
663+ tmpl = "(.*).{}.*$"
664+ source_package_regexp = re.compile(tmpl.format(PROJECT_CONFIG_SUFFIX))
665 for file in os.listdir('.'):
666 substract = source_package_regexp.findall(file)
667 if substract:
668 config = ConfigParser.RawConfigParser()
669 config.read(file)
670- result.add((substract[0], config.get('Package', 'packaging_version')))
671+ pkg = config.get('Package', 'packaging_version')
672+ result.add((substract[0], pkg))
673 return result
674
675
676-def update_all_packages_status(packages_not_in_ppa, packages_building, packages_failed, particular_arch=None):
677+def update_all_packages_status(packages_not_in_ppa, packages_building,
678+ packages_failed, particular_arch=None):
679 '''Update all packages status, checking in the ppa'''
680
681 for current_package in (packages_not_in_ppa.union(packages_building)):
682- logging.info("current_package: " + current_package.source_name + " " + current_package.version)
683+ logging.info("current_package: " + current_package.source_name + " " +
684+ current_package.version)
685 package_status = current_package.get_status(particular_arch)
686- if package_status != None: # global package_status can be 0 (building), 1 (failed), 2 (published)
687+ # global package_status can be 0 (building), 1 (failed), 2 (published)
688+        if package_status is not None:
689 # if one arch building, still considered as building
690 if package_status == PackageInPPA.BUILDING:
691- _ensure_removed_from_set(packages_not_in_ppa, current_package) # maybe already removed
692+ # maybe already removed
693+ _ensure_removed_from_set(packages_not_in_ppa, current_package)
694 packages_building.add(current_package)
695 # if one arch failed, considered as failed
696 elif package_status == PackageInPPA.FAILED:
697- _ensure_removed_from_set(packages_building, current_package) # in case we missed the "build" step
698- _ensure_removed_from_set(packages_not_in_ppa, current_package) # in case we missed the "wait" step
699+ # in case we missed the "build" step
700+ _ensure_removed_from_set(packages_building, current_package)
701+ # in case we missed the "wait" step
702+ _ensure_removed_from_set(packages_not_in_ppa, current_package)
703 packages_failed.add(current_package)
704 elif package_status == PackageInPPA.PUBLISHED:
705- _ensure_removed_from_set(packages_building, current_package) # in case we missed the "build" step
706- _ensure_removed_from_set(packages_not_in_ppa, current_package) # in case we missed the "wait" step
707+ # in case we missed the "build" step
708+ _ensure_removed_from_set(packages_building, current_package)
709+ # in case we missed the "wait" step
710+ _ensure_removed_from_set(packages_not_in_ppa, current_package)
711
712
713 def _get_current_packaging_version_from_config(source_package_name):
714
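The `package_status` checks in the hunk above hinge on a real status being allowed to be the falsy value 0 (building), so only an explicit comparison against None separates "no status yet" from "building"; `is None` without the `not` inverts the original `!= None`. A small sketch, with constants assumed to mirror `PackageInPPA` (0/1/2, as the comment in the hunk states):

```python
# Assumed constants mirroring PackageInPPA per the hunk's comment:
# 0 (building), 1 (failed), 2 (published) -- note BUILDING is falsy.
BUILDING, FAILED, PUBLISHED = 0, 1, 2

def classify(package_status):
    # A bare `if package_status:` (or an inverted `is None`) would
    # misroute BUILDING (0); `is not None` keeps 0 as a real status.
    if package_status is not None:
        names = {BUILDING: "building", FAILED: "failed",
                 PUBLISHED: "published"}
        return names[package_status]
    return "unknown"
```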
715=== modified file 'branch-source-builder/cupstream2distro/packagemanager.py'
716--- branch-source-builder/cupstream2distro/packagemanager.py 2014-02-19 16:34:46 +0000
717+++ branch-source-builder/cupstream2distro/packagemanager.py 2014-02-23 16:37:01 +0000
718@@ -31,9 +31,11 @@
719 import launchpadmanager
720 import settings
721 from .utils import ignored
722-
723-
724-def get_current_version_for_series(source_package_name, series_name, ppa_name=None, dest=None):
725+from subprocess import Popen, PIPE
726+
727+
728+def get_current_version_for_series(source_package_name, series_name,
729+ ppa_name=None, dest=None):
730 '''Get current version for a package name in that series'''
731 series = launchpadmanager.get_series(series_name)
732 if not dest:
733@@ -41,29 +43,37 @@
734 dest = launchpadmanager.get_ppa(ppa_name)
735 else:
736 dest = launchpadmanager.get_ubuntu_archive()
737- source_collection = dest.getPublishedSources(exact_match=True, source_name=source_package_name, distro_series=series)
738+ kw = {'exact_match': True, 'source_name': source_package_name,
739+ 'distro_series': series}
740+ source_collection = dest.getPublishedSources(**kw)
741 try:
742- # cjwatson told that list always have the more recently published first (even if removed)
743+        # cjwatson said the list always has the most recently published first
744+ # (even if removed)
745 return source_collection[0].source_package_version
746 # was never in the dest, set the lowest possible version
747 except IndexError:
748 return "0"
749
750
751-def is_version_for_series_in_dest(source_package_name, version, series, dest, pocket="Release"):
752+def is_version_for_series_in_dest(source_package_name, version, series, dest,
753+ pocket="Release"):
754 '''Return if version for a package name in that series is in dest'''
755- return dest.getPublishedSources(exact_match=True, source_name=source_package_name, version=version,
756- distro_series=series, pocket=pocket).total_size > 0
757+ kw = {'exact_match': True, 'source_name': source_package_name,
758+ 'version': version, 'distro_series': series, 'pocket': pocket}
759+ return dest.getPublishedSources(**kw).total_size > 0
760+
761
762 def is_version_in_queue(source_package_name, version, dest_serie, queue):
763 '''Return if version for a package name in that series is in dest'''
764- return dest_serie.getPackageUploads(exact_match=True, name=source_package_name, version=version,
765- status=queue).total_size > 0
766+ kw = {'exact_match': True, 'name': source_package_name, 'version': version,
767+ 'status': queue}
768+ return dest_serie.getPackageUploads(**kw).total_size > 0
769
770
771 def is_version1_higher_than_version2(version1, version2):
772 '''return if version1 is higher than version2'''
773- return (subprocess.call(["dpkg", "--compare-versions", version1, 'gt', version2], stdout=subprocess.PIPE, stderr=subprocess.PIPE) == 0)
774+ cmd = ["dpkg", "--compare-versions", version1, 'gt', version2]
775+ return (subprocess.call(cmd, stdout=PIPE, stderr=PIPE) == 0)
776
777
778 def is_version_in_changelog(version, f):
779@@ -72,7 +82,8 @@
780 if version == "0":
781 return True
782
783- desired_changelog_line = re.compile("\({}\) (?!UNRELEASED).*\; urgency=".format(version.replace('+', '\+')))
784+ t = "\({}\) (?!UNRELEASED).*\; urgency="
785+ desired_changelog_line = re.compile(t.format(version.replace('+', '\+')))
786 for line in f.readlines():
787 if desired_changelog_line.search(line):
788 return True
789@@ -83,9 +94,12 @@
790 def get_latest_upstream_bzr_rev(f, dest_ppa=None):
791 '''Report latest bzr rev in the file
792
793- If dest_ppa, first try to fetch the dest ppa tag. Otherwise, fallback to first distro version'''
794+ If dest_ppa, first try to fetch the dest ppa tag. Otherwise, fallback to
795+ first distro version'''
796 distro_regex = re.compile("{} (\d+)".format(settings.REV_STRING_FORMAT))
797- destppa_regexp = re.compile("{} (\d+) \(ppa:{}\)".format(settings.REV_STRING_FORMAT, dest_ppa))
798+ tmpl = "{} (\d+) \(ppa:{}\)"
799+ destppa_regexp = re.compile(
800+ tmpl.format(settings.REV_STRING_FORMAT, dest_ppa))
801 distro_rev = None
802 candidate_destppa_rev = None
803 candidate_distro_rev = None
804@@ -111,15 +125,19 @@
805 destppa_element_found = True
806 except IndexError:
807 destppa_element_found = False
808- if not distro_rev and not destppa_element_found and not "(ppa:" in line:
809+ ppa_line = "(ppa:" in line
810+ if not distro_rev and not destppa_element_found and not ppa_line:
811 try:
812 candidate_distro_rev = int(distro_regex.findall(line)[0])
813 distro_element_found = True
814 except IndexError:
815 distro_element_found = False
816
817- # try to catchup next line if we have a marker start without anything found
818- if settings.REV_STRING_FORMAT in line and (dest_ppa and not destppa_element_found) and not distro_element_found:
819+        # try to catch up on the next line if we have a marker start without
820+        # anything found
821+ rev_str_found = settings.REV_STRING_FORMAT in line
822+ ppa_found = dest_ppa and not destppa_element_found
823+ if rev_str_found and ppa_found and not distro_element_found:
824 previous_line = line
825
826 if line.startswith(" -- "):
827@@ -150,7 +168,8 @@
828
829 def get_packaging_version():
830 '''Get current packaging rev'''
831- instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
832+ instance = Popen(["dpkg-parsechangelog"],
833+ stdout=PIPE, stderr=PIPE)
834 (stdout, stderr) = instance.communicate()
835 if instance.returncode != 0:
836 raise Exception(stderr.decode("utf-8").strip())
837@@ -160,19 +179,24 @@
838 if packaging_version:
839 return packaging_version[0]
840
841- raise Exception("Didn't find any Version in the package: {}".format(stdout))
842-
843-
844-def get_source_package_from_dest(source_package_name, dest_archive, dest_current_version, series_name):
845- '''Download and return a path containing a checkout of the current dest version.
846+ tmpl = "Didn't find any Version in the package: {}"
847+ raise Exception(tmpl.format(stdout))
848+
849+
850+def get_source_package_from_dest(source_package_name, dest_archive,
851+ dest_current_version, series_name):
852+ '''Download and return a path containing a checkout of the current dest
853+ version.
854
855 None if this package was never published to dest archive'''
856
857 if dest_current_version == "0":
858- logging.info("This package was never released to the destination archive, don't return downloaded source")
859+ logging.info("This package was never released to the destination "
860+ "archive, don't return downloaded source")
861 return None
862
863- logging.info("Grab code for {} ({}) from {}".format(source_package_name, dest_current_version, series_name))
864+ args = (source_package_name, dest_current_version, series_name)
865+ logging.info("Grab code for {} ({}) from {}".format(*args))
866 source_package_download_dir = os.path.join('ubuntu', source_package_name)
867 series = launchpadmanager.get_series(series_name)
868 with ignored(OSError):
869@@ -180,27 +204,39 @@
870 os.chdir(source_package_download_dir)
871
872 try:
873- sourcepkg = dest_archive.getPublishedSources(status="Published", exact_match=True, source_name=source_package_name, distro_series=series, version=dest_current_version)[0]
874+ kw = {'status': "Published", 'exact_match': True,
875+ 'source_name': source_package_name, 'distro_series': series,
876+ 'version': dest_current_version}
877+ sourcepkg = dest_archive.getPublishedSources(**kw)[0]
878 except IndexError:
879 raise Exception("Couldn't get in the destination the expected version")
880- logging.info('Downloading %s version %s', source_package_name, dest_current_version)
881+ tmpl = 'Downloading %s version %s'
882+ logging.info(tmpl, source_package_name, dest_current_version)
883 for url in sourcepkg.sourceFileUrls():
884 urllib.urlretrieve(url, urllib.unquote(url.split('/')[-1]))
885- instance = subprocess.Popen("dpkg-source -x *dsc", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
886+ cmd = "dpkg-source -x *dsc"
887+ instance = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE)
888 (stdout, stderr) = instance.communicate()
889 if instance.returncode != 0:
890 raise Exception(stderr.decode("utf-8").strip())
891
892 # check the dir exist
893- splitted_version = dest_current_version.split(':')[-1].split('-') # remove epoch is there is one
894+    # remove epoch if there is one
895+ splitted_version = dest_current_version.split(':')[-1].split('-')
896 # TODO: debian version (like -3) is not handled here.
897- # We do handle 42ubuntu1 though (as splitted_version[0] can contain "ubuntu")
898- if "ubuntu" in splitted_version[-1] and len(splitted_version) > 1: # don't remove last item for the case where we had a native version (-0.35.2) without ubuntu in it
899+ # We do handle 42ubuntu1 though (as splitted_version[0] can contain
900+ # "ubuntu")
901+ if "ubuntu" in splitted_version[-1] and len(splitted_version) > 1:
902+ # don't remove last item for the case where we had a native version
903+ # (-0.35.2) without ubuntu in it
904 splitted_version = splitted_version[:-1]
905 version_for_source_file = '-'.join(splitted_version)
906- source_directory_name = "{}-{}".format(source_package_name, version_for_source_file)
907+ args = (source_package_name, version_for_source_file)
908+ source_directory_name = "{}-{}".format(*args)
909 if not os.path.isdir(source_directory_name):
910- raise Exception("We tried to download and check that the directory {} is present, but it's not the case".format(source_directory_name))
911+ tmpl = ("We tried to download and check that the directory {} is "
912+ "present, but it's not the case")
913+ raise Exception(tmpl.format(source_directory_name))
914 os.chdir('../..')
915 return (os.path.join(source_package_download_dir, source_directory_name))
916
917@@ -210,24 +246,40 @@
918
919 dest_version_source can be None if no released version was done before.'''
920
921- # we always released something not yet in ubuntu, no matter criterias are not met.
922+    # we always release something not yet in ubuntu, even if the criteria
923+    # are not met.
924 if not dest_version_source:
925 return True
926
927- # now check the relevance of the committed changes compared to the version in the repository (if any)
928- diffinstance = subprocess.Popen(['diff', '-Nrup', '.', dest_version_source], stdout=subprocess.PIPE)
929- filterinstance = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE)
930- lsdiffinstance = subprocess.Popen(['lsdiff'], stdin=filterinstance.stdout, stdout=subprocess.PIPE)
931- (relevant_changes, err) = subprocess.Popen(['grep', '-Ev', '.bzr|.pc'], stdin=lsdiffinstance.stdout, stdout=subprocess.PIPE).communicate()
932-
933- # detect if the only change is a Vcs* target changes (with or without changelog edit). We won't release in that case
934+ # now check the relevance of the committed changes compared to the version
935+ # in the repository (if any)
936+ cmd = ['diff', '-Nrup', '.', dest_version_source]
937+ diffinstance = Popen(cmd, stdout=PIPE)
938+
939+ cmd = ['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x',
940+ '*local-options']
941+ filterinstance = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE)
942+ cmd = ['lsdiff']
943+ lsdiffinstance = Popen(cmd, stdin=filterinstance.stdout, stdout=PIPE)
944+ cmd = ['grep', '-Ev', '.bzr|.pc']
945+ p = Popen(cmd, stdin=lsdiffinstance.stdout, stdout=PIPE)
946+ (relevant_changes, err) = p.communicate()
947+
948+ # detect if the only change is a Vcs* target changes (with or without
949+ # changelog edit). We won't release in that case
950 number_of_changed_files = relevant_changes.count("\n")
951- if ((number_of_changed_files == 1 and "debian/control" in relevant_changes) or
952- (number_of_changed_files == 2 and "debian/control" in relevant_changes and "debian/changelog" in relevant_changes)):
953- (results, err) = subprocess.Popen(['diff', os.path.join('debian', 'control'), os.path.join(dest_version_source, "debian", "control")], stdout=subprocess.PIPE).communicate()
954+
955+    n = number_of_changed_files
956+    control = "debian/control" in relevant_changes
957+    changelog = "debian/changelog" in relevant_changes
958+    if control and (n == 1 or (n == 2 and changelog)):
957+ cmd = ['diff', os.path.join('debian', 'control'),
958+ os.path.join(dest_version_source, "debian", "control")]
959+
960+ (results, err) = Popen(cmd, stdout=PIPE).communicate()
961 for diff_line in results.split('\n'):
962 if diff_line.startswith("< ") or diff_line.startswith("> "):
963- if not diff_line[2:].startswith("Vcs-") and not diff_line[2:].startswith("#"):
964+ vcs = diff_line[2:].startswith("Vcs-")
965+ comment = diff_line[2:].startswith("#")
966+ if not vcs and not comment:
967 return True
968 return False
969
970@@ -237,24 +289,34 @@
971 return (relevant_changes != '')
972
973
974-def is_relevant_source_diff_from_previous_dest_version(newdsc_path, dest_version_source):
975- '''Extract and check if the generated source diff different from previous one'''
976+def is_relevant_source_diff_from_previous_dest_version(newdsc_path,
977+ dest_version_source):
978+    '''Extract and check if the generated source diff is different from the
979+    previous one'''
980
981 with ignored(OSError):
982 os.makedirs("generated")
983- extracted_generated_source = os.path.join("generated", newdsc_path.split('_')[0])
984+ prefix = newdsc_path.split('_')[0]
985+ extracted_generated_source = os.path.join("generated", prefix)
986 with ignored(OSError):
987 shutil.rmtree(extracted_generated_source)
988
989 # remove epoch is there is one
990- if subprocess.call(["dpkg-source", "-x", newdsc_path, extracted_generated_source]) != 0:
991+ cmd = ["dpkg-source", "-x", newdsc_path, extracted_generated_source]
992+ if subprocess.call(cmd) != 0:
993 raise Exception("dpkg-source command returned an error.")
994
995- # now check the relevance of the committed changes compared to the version in the repository (if any)
996- diffinstance = subprocess.Popen(['diff', '-Nrup', extracted_generated_source, dest_version_source], stdout=subprocess.PIPE)
997- (diff, err) = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate()
998+ # now check the relevance of the committed changes compared to the version
999+ # in the repository (if any)
1000+ cmd = ['diff', '-Nrup', extracted_generated_source, dest_version_source]
1001+ diffinstance = Popen(cmd, stdout=PIPE)
1002+ cmd = ['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x',
1003+ '*local-options']
1004+ p = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE)
1005+ (diff, err) = p.communicate()
1006
1007- # there is no important diff if the diff only contains 12 lines, corresponding to "Automatic daily release" marker in debian/changelog
1008+ # there is no important diff if the diff only contains 12 lines,
1009+ # corresponding to "Automatic daily release" marker in debian/changelog
1010 if (diff.count('\n') <= 12):
1011 return False
1012 return True
1013@@ -267,47 +329,65 @@
1014 if not oldsource_dsc:
1015 return True
1016 if not os.path.isfile(oldsource_dsc) or not os.path.isfile(newsource_dsc):
1017- raise Exception("{} or {} doesn't not exist, can't create a diff".format(oldsource_dsc, newsource_dsc))
1018- diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE)
1019- filterinstance = subprocess.Popen(['filterdiff', '--clean', '-i', '*debian/*', '-x', '*changelog'], stdin=diffinstance.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
1020+        tmpl = "{} or {} doesn't exist, can't create a diff"
1021+ raise Exception(tmpl.format(oldsource_dsc, newsource_dsc))
1022+ diffinstance = Popen(['debdiff', oldsource_dsc, newsource_dsc],
1023+ stdout=PIPE)
1024+ cmd = ['filterdiff', '--clean', '-i', '*debian/*', '-x', '*changelog']
1025+ filterinstance = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE,
1026+ stderr=PIPE)
1027 (change_in_debian, filter_err) = filterinstance.communicate()
1028- # we can't rely on diffinstance returncode as the signature key is maybe not present and it will exit with 1
1029+    # we can't rely on diffinstance returncode as the signature key may
1030+    # not be present and it will exit with 1
1031 if filterinstance.returncode != 0:
1032- raise Exception("Error in diff: {}".format(filter_err.decode("utf-8").strip()))
1033+ tmpl = "Error in diff: {}"
1034+ raise Exception(tmpl.format(filter_err.decode("utf-8").strip()))
1035 return(change_in_debian != "")
1036
1037
1038 def generate_diff_between_dsc(diff_filepath, oldsource_dsc, newsource_dsc):
1039- '''Generate a diff file in diff_filepath if there is a relevant packaging diff between 2 sources
1040+ '''Generate a diff file in diff_filepath if there is a relevant packaging
1041+ diff between 2 sources
1042
1043 The diff contains autotools files and cmakeries'''
1044 if _packaging_changes_between_dsc(oldsource_dsc, newsource_dsc):
1045 with open(diff_filepath, "w") as f:
1046 if not oldsource_dsc:
1047- f.writelines("This source is a new package, if the destination is ubuntu, please ensure it has been preNEWed by an archive admin before publishing that stack.")
1048+                f.writelines("This source is a new package; if the destination"
1049+ " is ubuntu, please ensure it has been preNEWed"
1050+ " by an archive admin before publishing that"
1051+ " stack.")
1052 return
1053- f.write("/!\ Remember that this diff only represents packaging changes and build tools diff, not the whole content diff!\n\n")
1054- diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE)
1055- (changes_to_publish, err) = subprocess.Popen(['filterdiff', '--remove-timestamps', '--clean', '-i', '*setup.py',
1056- '-i', '*Makefile.am', '-i', '*configure.*', '-i', '*debian/*',
1057- '-i', '*CMakeLists.txt'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate()
1058+ f.write("/!\ Remember that this diff only represents packaging "
1059+ "changes and build tools diff, not the whole content "
1060+ "diff!\n\n")
1061+ diffinstance = Popen(['debdiff', oldsource_dsc, newsource_dsc],
1062+ stdout=PIPE)
1063+ cmd = ['filterdiff', '--remove-timestamps', '--clean', '-i',
1064+ '*setup.py', '-i', '*Makefile.am', '-i', '*configure.*',
1065+ '-i', '*debian/*', '-i', '*CMakeLists.txt']
1066+ (changes_to_publish, err) = Popen(cmd, stdin=diffinstance.stdout,
1067+ stdout=PIPE).communicate()
1068 f.write(changes_to_publish)
1069
1070
1071-def create_new_packaging_version(base_package_version, series_version, destppa=''):
1072+def create_new_packaging_version(base_package_version, series_version,
1073+ destppa=''):
1074 '''Deliver a new packaging version, based on simple rules:
1075
1076- Version would be <upstream_version>.<series>.<yyyymmdd(.minor)>-0ubuntu1
1077- if we already have something delivered today, it will be .minor, then, .minor+1…
1078+    Version would be <upstream_version>.<series>.<yyyymmdd(.minor)>-0ubuntu1.
1079+    If we already have something delivered today, it will be .minor, then
1080+    .minor+1…
1081
1082- We append the destination ppa name if we target a dest ppa and not distro'''
1083+ We append the destination ppa name if we target a dest ppa and not
1084+ distro'''
1085 # to keep track of whether the package is native or not
1086 native_pkg = False
1087
1088 today_version = datetime.date.today().strftime('%Y%m%d')
1089 destppa = destppa.replace("-", '.').replace("_", ".").replace("/", ".")
1090- # bootstrapping mode or direct upload or UNRELEASED for bumping to a new series
1091- # TRANSITION
1092+ # bootstrapping mode or direct upload or UNRELEASED for bumping to a new
1093+ # series TRANSITION
1094 if not ("daily" in base_package_version or "+" in base_package_version):
1095 # support both 42, 42-0ubuntu1
1096 upstream_version = base_package_version.split('-')[0]
1097@@ -322,17 +402,18 @@
1098 upstream_version = previous_day[0]
1099 native_pkg = True
1100 if (previous_day[1] == series_version and
1101- previous_day[2] == today_version):
1102- minor = 1
1103- if previous_day[3]: # second upload of the day
1104- minor = int(previous_day[3]) + 1
1105- today_version = "{}.{}".format(today_version, minor)
1106+ previous_day[2] == today_version):
1107+ minor = 1
1108+ if previous_day[3]: # second upload of the day
1109+ minor = int(previous_day[3]) + 1
1110+ today_version = "{}.{}".format(today_version, minor)
1111 except IndexError:
1112 raise Exception(
1113 "Unable to get previous day from native version: %s"
1114 % base_package_version)
1115 else:
1116- # extract the day of previous daily upload and bump if already uploaded today
1117+ # extract the day of previous daily upload and bump if already uploaded
1118+ # today
1119 regexp = re.compile("(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*-.*")
1120 try:
1121 previous_day = regexp.findall(base_package_version)[0]
1122@@ -342,11 +423,17 @@
1123 regexp = re.compile("(.*)(daily)([\d\.]{8})\.?([\d]*).*-.*")
1124 previous_day = regexp.findall(base_package_version)[0]
1125 # make the version compatible with the new version
1126- previous_day = (previous_day[0], previous_day[1], "20" + previous_day[2].replace(".", ""), previous_day[3])
1127+ previous_day = (previous_day[0], previous_day[1], "20" +
1128+ previous_day[2].replace(".", ""),
1129+ previous_day[3])
1130 except IndexError:
1131- raise Exception("Didn't find a correct versioning in the current package: {}".format(base_package_version))
1132+ tmpl = ("Didn't find a correct versioning in the current "
1133+ "package: {}")
1134+ raise Exception(tmpl.format(base_package_version))
1135 upstream_version = previous_day[0]
1136- if previous_day[1] == series_version and previous_day[2] == today_version:
1137+ is_series = previous_day[1] == series_version
1138+ is_today = previous_day[2] == today_version
1139+ if is_series and is_today:
1140 minor = 1
1141 if previous_day[3]: # second upload of the day
1142 minor = int(previous_day[3]) + 1
1143@@ -363,7 +450,7 @@
1144
1145 def get_packaging_sourcename():
1146 '''Get current packaging source name'''
1147- instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
1148+ instance = Popen(["dpkg-parsechangelog"], stdout=PIPE, stderr=PIPE)
1149 (stdout, stderr) = instance.communicate()
1150 if instance.returncode != 0:
1151 raise Exception(stderr.decode("utf-8").strip())
1152@@ -373,7 +460,8 @@
1153 if source_name:
1154 return source_name[0]
1155
1156- raise Exception("Didn't find any source name in the package: {}".format(stdout))
1157+ tmpl = "Didn't find any source name in the package: {}"
1158+ raise Exception(tmpl.format(stdout))
1159
1160
1161 def collect_bugs_in_changelog_until_latest_snapshot(f, source_package_name):
1162@@ -382,20 +470,24 @@
1163 # matching only bug format that launchpad accepts
1164 group_bugs_regexp = re.compile("lp: ?(.*\d{5,})", re.IGNORECASE)
1165 bug_decipher_regexp = re.compile("(#\d{5,})+")
1166- new_upload_changelog_regexp = re.compile(settings.NEW_CHANGELOG_PATTERN.format(source_package_name))
1167+ pattern = settings.NEW_CHANGELOG_PATTERN.format(source_package_name)
1168+ new_upload_changelog_regexp = re.compile(pattern)
1169 for line in f:
1170 grouped_bugs_list = group_bugs_regexp.findall(line)
1171 for grouped_bugs in grouped_bugs_list:
1172- for bug in map(lambda bug_with_hash: bug_with_hash.replace('#', ''), bug_decipher_regexp.findall(grouped_bugs)):
1173+ func = lambda bug_with_hash: bug_with_hash.replace('#', '')
1174+ for bug in map(func, bug_decipher_regexp.findall(grouped_bugs)):
1175 bugs.add(bug)
1176- # a released upload to distro (automated or manual)n exit as bugs before were already covered
1177+            # a released upload to distro (automated or manual); exit, as
1178+            # bugs before were already covered
1179 if new_upload_changelog_regexp.match(line):
1180 return bugs
1181
1182 return bugs
1183
1184
1185-def update_changelog(new_package_version, series, tip_bzr_rev, authors_commits, dest_ppa=None):
1186+def update_changelog(new_package_version, series, tip_bzr_rev, authors_commits,
1187+ dest_ppa=None):
1188 '''Update the changelog for the incoming upload'''
1189
1190 dch_env = os.environ.copy()
1191@@ -403,18 +495,20 @@
1192 dch_env["DEBFULLNAME"] = author
1193 for bug_desc in authors_commits[author]:
1194 if bug_desc.startswith('-'):
1195- # Remove leading '-' or dch thinks (rightly) that it's an option
1196+ # Remove leading '-' or dch thinks (rightly) that it's an
1197+ # option
1198 bug_desc = bug_desc[1:]
1199 if bug_desc.startswith(' '):
1200 # Remove leading spaces, there are useless and the result is
1201 # prettier without them anyway ;)
1202 bug_desc = bug_desc.strip()
1203- cmd = ["dch", "--multimaint-merge", "--release-heuristic", "changelog",
1204- "-v{}".format(new_package_version), bug_desc]
1205- subprocess.Popen(cmd, env=dch_env).communicate()
1206+ cmd = ["dch", "--multimaint-merge", "--release-heuristic",
1207+ "changelog", "-v{}".format(new_package_version), bug_desc]
1208+ Popen(cmd, env=dch_env).communicate()
1209
1210- if tip_bzr_rev != None:
1211- commit_message = "{} {}".format(settings.REV_STRING_FORMAT, tip_bzr_rev)
1212+ if tip_bzr_rev is not None:
1213+ tmpl = "{} {}"
1214+ commit_message = tmpl.format(settings.REV_STRING_FORMAT, tip_bzr_rev)
1215 if dest_ppa:
1216 commit_message += " ({})".format(dest_ppa)
1217 else:
1218@@ -422,17 +516,21 @@
1219
1220 dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME
1221 dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL
1222- instance = subprocess.Popen(["dch", "--release-heuristic", "changelog",
1223- "-v{}".format(new_package_version), commit_message],
1224- stderr=subprocess.PIPE, env=dch_env)
1225+ instance = Popen(["dch", "--release-heuristic", "changelog",
1226+ "-v{}".format(new_package_version), commit_message],
1227+ stderr=PIPE, env=dch_env)
1228 (stdout, stderr) = instance.communicate()
1229 if instance.returncode != 0:
1230 raise Exception(stderr.decode("utf-8").strip())
1231- subprocess.call(["dch", "-r", "--distribution", series, "--force-distribution", ""], env=dch_env)
1232+ cmd = ["dch", "-r", "--distribution", series, "--force-distribution", ""]
1233+ subprocess.call(cmd, env=dch_env)
1234
1235- # in the case of no commit_message and no symbols file change, we have an addition [ DEBFULLNAME ] follow by an empty line
1236+ # in the case of no commit_message and no symbols file change, we have
1237+ # an additional [ DEBFULLNAME ] followed by an empty line
1238 # better to remove both lines
1239- subprocess.call(["sed", "-i", "/ \[ " + settings.BOT_DEBFULLNAME + " \]/{$q; N; /\\n$/d;}", "debian/changelog"])
1240+ pattern = "/ \[ " + settings.BOT_DEBFULLNAME + " \]/{$q; N; /\\n$/d;}"
1241+ cmd = ["sed", "-i", pattern, "debian/changelog"]
1242+ subprocess.call(cmd)
1243
1244
1245 def build_source_package(series, distro_version, ppa=None):
1246@@ -457,7 +555,7 @@
1247 "--distro-version", distro_version]
1248 if ppa:
1249 cmd.extend(["--ppa", ppa])
1250- instance = subprocess.Popen(cmd, env=cowbuilder_env)
1251+ instance = Popen(cmd, env=cowbuilder_env)
1252 instance.communicate()
1253 if instance.returncode != 0:
1254 raise Exception("%r returned: %s." % (cmd, instance.returncode))
1255@@ -490,15 +588,19 @@
1256 replacement_done = False
1257 for filename in os.listdir("debian"):
1258 if filename.endswith("symbols"):
1259- for line in fileinput.input(os.path.join('debian', filename), inplace=1):
1260+ dfile = os.path.join('debian', filename)
1261+ for line in fileinput.input(dfile, inplace=1):
1262 if settings.REPLACEME_TAG in line:
1263 replacement_done = True
1264- line = line.replace(settings.REPLACEME_TAG, new_upstream_version)
1265+ line = line.replace(settings.REPLACEME_TAG,
1266+ new_upstream_version)
1267 sys.stdout.write(line)
1268
1269 if replacement_done:
1270 dch_env = os.environ.copy()
1271 dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME
1272 dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL
1273- subprocess.Popen(["dch", "debian/*symbols: auto-update new symbols to released version"], env=dch_env).communicate()
1274+ cmd = ["dch",
1275+ "debian/*symbols: auto-update new symbols to released version"]
1276+ Popen(cmd, env=dch_env).communicate()
1277 subprocess.call(["bzr", "commit", "-m", "Update symbols"])
1278
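The `!= None` comparisons cleaned up in this file are what pep8 flags as E711: identity comparison with `is`/`is not` is preferred because `==`/`!=` dispatch to methods a class can override. A minimal standalone sketch of the difference (the class below is hypothetical, not from the branch):

```python
class AlwaysEqual(object):
    """Hypothetical object whose equality operators claim it equals
    everything, including None."""
    def __eq__(self, other):
        return True

    def __ne__(self, other):
        return False


obj = AlwaysEqual()
# "!= None" dispatches to __ne__ and is fooled:
assert (obj != None) is False
# "is not None" compares identity and cannot be overridden:
assert (obj is not None) is True
```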
1279=== modified file 'branch-source-builder/cupstream2distro/settings.py'
1280--- branch-source-builder/cupstream2distro/settings.py 2014-02-19 16:34:46 +0000
1281+++ branch-source-builder/cupstream2distro/settings.py 2014-02-23 16:37:01 +0000
1282@@ -39,6 +39,7 @@
1283 GNUPG_DIR = home_dir
1284 CRED_FILE_PATH = os.path.join("/tmp", "launchpad.credentials")
1285
1286+
1287 # TODO refactor into a ci-utils module
1288 def _unit_config():
1289 path = os.path.join(home_dir, 'unit_config')
1290@@ -73,7 +74,8 @@
1291
1292 # selected arch for building arch:all packages
1293 VIRTUALIZED_PPA_ARCH = ["i386", "amd64"]
1294-# an arch we will ignore for publication if latest published version in dest doesn't build it
1295+# an arch we will ignore for publication if latest published version in dest
1296+# doesn't build it
1297 ARCHS_TO_EVENTUALLY_IGNORE = set(['powerpc', 'arm64', 'ppc64el'])
1298 ARCHS_TO_UNCONDITIONALLY_IGNORE = set(['arm64', 'ppc64el'])
1299 SRU_PPA = "ubuntu-unity/sru-staging"
1300@@ -87,11 +89,15 @@
1301
1302 OLD_STACK_DIR = 'old'
1303 PACKAGE_LIST_RSYNC_FILENAME_PREFIX = 'packagelist_rsync'
1304-PACKAGE_LIST_RSYNC_FILENAME_FORMAT = PACKAGE_LIST_RSYNC_FILENAME_PREFIX + '_{}-{}'
1305-RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*".format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX)
1306+PACKAGE_LIST_RSYNC_FILENAME_FORMAT = (PACKAGE_LIST_RSYNC_FILENAME_PREFIX
1307+ + '_{}-{}')
1308+RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*"
1309+RSYNC_PATTERN = RSYNC_PATTERN.format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX)
1310
1311-ROOT_CU2D = os.path.join(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
1312-DEFAULT_CONFIG_STACKS_DIR = os.path.join(os.path.dirname(ROOT_CU2D), 'cupstream2distro-config', 'stacks')
1313+ROOT_CU2D = os.path.join(
1314+ os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
1315+DEFAULT_CONFIG_STACKS_DIR = os.path.join(
1316+ os.path.dirname(ROOT_CU2D), 'cupstream2distro-config', 'stacks')
1317 STACK_STATUS_FILENAME = "stack.status"
1318 STACK_STARTED_FILENAME = "stack.started"
1319 STACK_BUILDING_FILENAME = "stack.building"
1320
1321=== modified file 'branch-source-builder/cupstream2distro/silomanager.py'
1322--- branch-source-builder/cupstream2distro/silomanager.py 2014-01-31 17:06:36 +0000
1323+++ branch-source-builder/cupstream2distro/silomanager.py 2014-02-23 16:37:01 +0000
1324@@ -22,7 +22,8 @@
1325 import os
1326 import shutil
1327
1328-from cupstream2distro.settings import SILO_CONFIG_FILENAME, SILO_NAME_LIST, SILO_STATUS_RSYNCDIR
1329+from cupstream2distro.settings import (SILO_CONFIG_FILENAME, SILO_NAME_LIST,
1330+ SILO_STATUS_RSYNCDIR)
1331 from cupstream2distro.utils import ignored
1332
1333
1334@@ -42,10 +43,13 @@
1335 os.makedirs(SILO_STATUS_RSYNCDIR)
1336 silo_name = os.path.dirname(silo_config_path).split(os.path.sep)[-1]
1337 dest = os.path.join(SILO_STATUS_RSYNCDIR, silo_name)
1338- logging.debug("Copying configuration from {} to {}".format(silo_config_path, dest))
1339- shutil.copy2(silo_config_path, os.path.join(SILO_STATUS_RSYNCDIR, silo_name))
1340+ tmpl = "Copying configuration from {} to {}"
1341+ logging.debug(tmpl.format(silo_config_path, dest))
1343+ shutil.copy2(silo_config_path, dest)
1344 return True
1345
1346+
1347 def load_config(uri=None):
1348 """return a loaded config
1349
1350@@ -62,23 +66,33 @@
1351 logging.warning("Can't load configuration: " + e.message)
1352 return None
1353
1354+
1355 def remove_status_file(silo_name):
1356 """Remove status file"""
1357 os.remove(os.path.join(SILO_STATUS_RSYNCDIR, silo_name))
1358
1359
1360-def is_project_not_in_any_configs(project_name, series, dest, base_silo_uri, ignore_silo):
1361- """Return true if the project for that serie in that dest is not in any configuration"""
1362- logging.info("Checking if {} is already configured for {} ({}) in another silo".format(project_name, dest.name, series.name))
1363+def is_project_not_in_any_configs(project_name, series, dest, base_silo_uri,
1364+ ignore_silo):
1365+ """Return true if the project for that serie in that dest is not in any
1366+ configuration"""
1367+ tmpl = "Checking if {} is already configured for {} ({}) in another silo"
1368+ logging.info(tmpl.format(project_name, dest.name, series.name))
1369 for silo_name in SILO_NAME_LIST:
1370 # we are reconfiguring current silo, ignoring it
1371 if ignore_silo == silo_name:
1372 continue
1373 config = load_config(os.path.join(base_silo_uri, silo_name))
1374 if config:
1375- if (config["global"]["dest"] == dest.self_link and config["global"]["series"] == series.self_link and
1376- (project_name in config["mps"] or project_name in config["sources"])):
1377- logging.error("{} is already prepared for the same serie and destination in {}".format(project_name, silo_name))
1378+ project_match = (project_name in config["mps"]
1379+ or project_name in config["sources"])
1380+ series_match = config["global"]["series"] == series.self_link
1381+ dest_match = config["global"]["dest"] == dest.self_link
1382+
1383+ if (dest_match and series_match and project_match):
1384+ tmpl = ("{} is already prepared for the same series "
1385+ "and destination in {}")
1386+ logging.error(tmpl.format(project_name, silo_name))
1387 return False
1388 return True
1389
1390@@ -86,27 +100,32 @@
1391 def return_first_available_silo(base_silo_uri):
1392 """Check which silos are free and return the first one"""
1393 for silo_name in SILO_NAME_LIST:
1394- if not os.path.isfile(os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME)):
1395+ p = os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME)
1396+ if not os.path.isfile(p):
1397 return silo_name
1398 return None
1399
1400+
1401 def get_config_step(config):
1402 """Get configuration step"""
1403 return config["global"]["step"]
1404
1405+
1406 def set_config_step(config, new_step, uri=''):
1407 """Set configuration step to new_step"""
1408 config["global"]["step"] = new_step
1409 return save_config(config, uri)
1410
1411+
1412 def set_config_status(config, status, uri='', add_url=True):
1413 """Change status to reflect latest status"""
1414 build_url = os.getenv('BUILD_URL')
1415 if add_url and build_url:
1416- status = "{} ({}console)".format(status , build_url)
1417+ status = "{} ({}console)".format(status, build_url)
1418 config["global"]["status"] = status
1419 return save_config(config, uri)
1420
1421+
1422 def get_all_projects(config):
1423 """Get a list of all projets"""
1424 projects = []
1425
1426=== modified file 'branch-source-builder/cupstream2distro/stack.py'
1427--- branch-source-builder/cupstream2distro/stack.py 2014-02-19 16:34:46 +0000
1428+++ branch-source-builder/cupstream2distro/stack.py 2014-02-23 16:37:01 +0000
1429@@ -21,11 +21,75 @@
1430 import os
1431 import yaml
1432
1433-from .settings import DEFAULT_CONFIG_STACKS_DIR, STACK_STATUS_FILENAME, STACK_STARTED_FILENAME
1434+from .settings import (DEFAULT_CONFIG_STACKS_DIR, STACK_STATUS_FILENAME,
1435+ STACK_STARTED_FILENAME)
1436 from .utils import ignored
1437
1438 _stacks_ref = {}
1439
1440+CANT_FIND_STATUS = '''Can't find status for {depstack} ({deprel}). This
1441+shouldn't happen unless the stack is currently running. If this is the case, it
1442+means that the current stack shouldn't be uploaded as the state is unknown.'''
1443+
1444+FAILED_TO_PUBLISH = '''{depstack} ({deprel}) failed to publish. Possible causes
1445+are:
1446+ * the stack really didn't build/can't be prepared at all.
1447+ * the stack has integration tests not working with this previous stack.
1448+
1449+ What needs to be done:
1450+ Either:
1451+ * If we want to publish both stacks: retry the integration tests for
1452+ {depstack} ({deprel}), including components from this stack (check
1453+ with the whole PPA). If that works, both stacks should be published
1454+ at the same time.
1455+ Or:
1456+ * If we only want to publish this stack: check that we can safely
1457+ publish it by itself (e.g. without the stacks it depends on). The
1458+ trick there is to make sure that the stack is not relying on, or
1459+ affected by, a change that happened in one of its dependencies.
1460+ Example: if the {depstack} ({deprel}) API changed in a way that
1461+ affects any component of the current stack, and both stacks got
1462+ updated in trunk, we need to make sure we don't land only one of the
1463+ two stacks which would result in a broken state. Also think about
1464+ potential ABI changes.'''
1465+
1466+MANUAL_PUBLISH = '''{depstack} ({deprel}) is in manually publish mode. Possible
1467+causes are:
1468+ * Some part of the stack has packaging changes
1469+ * This stack is depending on another stack not being published
1470+
1471+ What needs to be done:
1472+ Either:
1473+ * If {depstack} ({deprel}) can be published, we should publish both
1474+ stacks at the same time.
1475+ Or:
1476+ * If we only want to publish this stack: check that we can safely
1477+ publish it by itself (e.g. without the stacks it depends on). The
1478+ trick there is to make sure that the stack is not relying on, or
1479+ affected by, a change that happened in one of its dependencies.
1480+ Example: if the {depstack} ({deprel}) API changed in a way that
1481+ affects any component of the current stack, and both stacks got
1482+ updated in trunk, we need to make sure we don't land only one of the
1483+ two stacks which would result in a broken state. Also think about
1484+ potential ABI changes.'''
1485+
1486+MANUALLY_ABORTED = '''{depstack} ({deprel}) has been manually aborted or failed
1487+for an unknown reason. Possible causes are:
1488+ * A job of this stack was stopped manually
1489+ * Jenkins had an internal error/shutdown
1490+
1491+ What needs to be done:
1492+ * If we only want to publish this stack: check that we can safely
1493+ publish it by itself (e.g. without the stacks it depends on). The
1494+ trick there is to make sure that the stack is not relying on, or
1495+ affected by, a change that happened in one of its dependencies.
1496+ Example: if the {depstack} ({deprel}) API changed in a way that
1497+ affects any component of the current stack, and both stacks got
1498+ updated in trunk, we need to make sure we don't land only one of the
1499+ two stacks which would result in a broken state. Also think about
1500+ potential ABI changes.'''
1501+
1502+
1503 # TODO: should be used by a metaclass
1504 def get_stack(release, stack_name):
1505 try:
1506@@ -33,22 +97,28 @@
1507 except KeyError:
1508 return Stack(release, stack_name)
1509
1510+
1511 class Stack():
1512
1513 def __init__(self, release, stack_name):
1514- self.stack_name = stack_name
1515- self.release = release
1516- self.statusfile = os.path.join('..', '..', release, stack_name, STACK_STATUS_FILENAME)
1517- self.startedfile = os.path.join('..', '..', release, stack_name, STACK_STARTED_FILENAME)
1518+ self.stack_name = stack_name
1519+ self.release = release
1520+ args = ('..', '..', release, stack_name, STACK_STATUS_FILENAME)
1521+ self.statusfile = os.path.join(*args)
1522+ args = ('..', '..', release, stack_name, STACK_STARTED_FILENAME)
1523+ self.startedfile = os.path.join(*args)
1524 self.stack_file_path = None
1525 self._dependencies = None
1526 self._rdependencies = None
1527- for stack_file_path in Stack.get_stacks_file_path(release):
1528- if stack_file_path.split(os.path.sep)[-1] == "{}.cfg".format(stack_name):
1529+ for stack_file_path in Stack.get_stacks_file_path(release):
1530+ formatted = "{}.cfg".format(stack_name)
1531+ if stack_file_path.split(os.path.sep)[-1] == formatted:
1532 self.stack_file_path = stack_file_path
1533 break
1534 if not self.stack_file_path:
1535- raise Exception("{}.cfg for {} doesn't exist anywhere in {}".format(stack_name, release, self.get_root_stacks_dir()))
1536+ msg = "{}.cfg for {} doesn't exist anywhere in {}"
1537+ msg = msg.format(stack_name, release, self.get_root_stacks_dir())
1538+ raise Exception(msg)
1539
1540 with open(self.stack_file_path, 'r') as f:
1541 cfg = yaml.load(f)
1542@@ -118,7 +188,11 @@
1543 else:
1544 (stackname, release) = (item, self.release)
1545 self._dependencies.append(get_stack(release, stackname))
1546- logging.info("{} ({}) dependency list is: {}".format(self.stack_name, self.release, ["{} ({})".format(stack.stack_name, stack.release) for stack in self._dependencies]))
1547+ deps = ["{} ({})".format(stack.stack_name, stack.release)
1548+ for stack in self._dependencies]
1549+ msg = "{} ({}) dependency list is: {}"
1550+ msg = msg.format(self.stack_name, self.release, deps)
1551+ logging.info(msg)
1552 return self._dependencies
1553 except (TypeError, KeyError):
1554 return []
1555@@ -137,7 +211,8 @@
1556 return self._rdependencies
1557
1558 def generate_dep_status_message(self):
1559- '''Return a list of potential problems from others stack which should block current publication'''
1560+ '''Return a list of potential problems from others stack which should
1561+ block current publication'''
1562
1563 # TODO: get the first Stack object
1564 # iterate over all stacks objects from dep chain
1565@@ -145,39 +220,24 @@
1566
1567 global_dep_status_info = []
1568 for stack in self.get_direct_depending_stacks():
1569- logging.info("Check status for {} ({})".format(stack.stack_name, stack.release))
1570+ msg = "Check status for {} ({})"
1571+ msg = msg.format(stack.stack_name, stack.release)
1572+ logging.info(msg)
1573 status = stack.get_status()
1574 message = None
1575 # We should have a status for every stack
1576 if status is None:
1577- message = "Can't find status for {depstack} ({deprel}). This shouldn't happen unless the stack is currently running. If this is the case, it means that the current stack shouldn't be uploaded as the state is unknown.".format(depstack=stack, deprel=stack.release)
1578+ kw = {'depstack': stack, 'deprel': stack.release}
1579+ message = CANT_FIND_STATUS.format(**kw)
1580 elif status == 1:
1581- message = '''{depstack} ({deprel}) failed to publish. Possible causes are:
1582- * the stack really didn't build/can't be prepared at all.
1583- * the stack has integration tests not working with this previous stack.
1584-
1585- What needs to be done:
1586- Either:
1587- * If we want to publish both stacks: retry the integration tests for {depstack} ({deprel}), including components from this stack (check with the whole PPA). If that works, both stacks should be published at the same time.
1588- Or:
1589- * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
1590+ kw = {'depstack': stack.stack_name, 'deprel': stack.release}
1591+ message = FAILED_TO_PUBLISH.format(**kw)
1592 elif status == 2:
1593- message = '''{depstack} ({deprel}) is in manually publish mode. Possible causes are:
1594- * Some part of the stack has packaging changes
1595- * This stack is depending on another stack not being published
1596-
1597- What needs to be done:
1598- Either:
1599- * If {depstack} ({deprel}) can be published, we should publish both stacks at the same time.
1600- Or:
1601- * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
1602+ kw = {'depstack': stack.stack_name, 'deprel': stack.release}
1603+ message = MANUAL_PUBLISH.format(**kw)
1604 elif status == 3 or status == -1:
1605- message = '''{depstack} ({deprel}) has been manually aborted or failed for an unknown reason. Possible causes are:
1606- * A job of this stack was stopped manually
1607- * Jenkins had an internal error/shutdown
1608-
1609- What needs to be done:
1610- * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release)
1611+ kw = {'depstack': stack.stack_name, 'deprel': stack.release}
1612+ message = MANUALLY_ABORTED.format(**kw)
1613
1614 if message:
1615 logging.warning(message)
1616@@ -186,19 +246,21 @@
1617
1618 @staticmethod
1619 def get_root_stacks_dir():
1620- '''Get root stack dir'''
1621- return os.environ.get('CONFIG_STACKS_DIR', DEFAULT_CONFIG_STACKS_DIR)
1622+ '''Get root stack dir'''
1623+ return os.environ.get('CONFIG_STACKS_DIR', DEFAULT_CONFIG_STACKS_DIR)
1624
1625 @staticmethod
1626 def get_stacks_file_path(release):
1627 '''Return an iterator with all path for every discovered stack files'''
1628- for root, dirs, files in os.walk(os.path.join(Stack.get_root_stacks_dir(), release)):
1629+ walking = os.walk(os.path.join(Stack.get_root_stacks_dir(), release))
1630+ for root, dirs, files in walking:
1631 for candidate in files:
1632 if candidate.endswith('.cfg'):
1633 yield os.path.join(root, candidate)
1634
1635 @staticmethod
1636 def get_current_stack():
1637- '''Return current stack object based on current path (release/stackname)'''
1638+ '''Return current stack object based on current path
1639+ (release/stackname)'''
1640 path = os.getcwd().split(os.path.sep)
1641 return get_stack(path[-2], path[-1])
1642
1643=== modified file 'branch-source-builder/cupstream2distro/stacks.py'
1644--- branch-source-builder/cupstream2distro/stacks.py 2014-02-19 16:34:46 +0000
1645+++ branch-source-builder/cupstream2distro/stacks.py 2014-02-23 16:37:01 +0000
1646@@ -21,6 +21,7 @@
1647 import os
1648 import yaml
1649 import subprocess
1650+from subprocess import PIPE
1651
1652 from .settings import PACKAGE_LIST_RSYNC_FILENAME_PREFIX, RSYNC_PATTERN
1653 from .tools import get_packaging_diff_filename
1654@@ -38,7 +39,7 @@
1655 raise Exception('Please set environment variable CU2D_RSYNCSVR')
1656
1657 cmd = ["rsync", '--remove-source-files', '--timeout=60', remoteaddr, '.']
1658- instance = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
1659+ instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
1660 (stdout, stderr) = instance.communicate()
1661 if instance.returncode not in (0, 23):
1662 raise Exception(stderr.decode("utf-8").strip())
1663@@ -62,10 +63,12 @@
1664 try:
1665 projects_list = cfg['stack']['projects']
1666 except (TypeError, KeyError):
1667- logging.warning("{} seems broken in not having stack or projects keys".format(file_path))
1668+ tmpl = "{} seems broken: missing stack or projects keys"
1669+ logging.warning(tmpl.format(file_path))
1670 continue
1671 if not projects_list:
1672- logging.warning("{} don't have any project list".format(file_path))
1673+ tmpl = "{} doesn't have any project list"
1674+ logging.warning(tmpl.format(file_path))
1675 continue
1676 for project in projects_list:
1677 if isinstance(project, dict):
1678@@ -74,11 +77,13 @@
1679 projects.append(project)
1680 return set(projects)
1681
1682+
1683 def get_stack_packaging_change_status(source_version_list):
1684 '''Return global package change status list
1685
1686 # FIXME: added too many infos now, should only be: (source, version)
1687- source_version_list is a list of couples (source, version, tip_rev, target_branch)'''
1688+ source_version_list is a list of couples (source, version, tip_rev,
1689+ target_branch)'''
1690
1691 packaging_change_status = []
1692 for (source, version, tip_rev, target_branch) in source_version_list:
1693
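The rsync hunk above keeps the existing behaviour of treating exit code 23 (partial transfer, e.g. nothing matched the pattern) as success alongside 0. A runnable sketch of that returncode-tolerance pattern, using `true` as a stand-in command (POSIX-only assumption, not the real rsync invocation):

```python
import subprocess

# Stand-in for the rsync call in stacks.py; "true" always exits 0.
cmd = ['true']
instance = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
stdout, stderr = instance.communicate()
# rsync exits 23 on a partial transfer (no files matched); the real code
# tolerates that alongside 0 rather than raising.
if instance.returncode not in (0, 23):
    raise Exception(stderr.decode('utf-8').strip())
```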
1694=== modified file 'branch-source-builder/cupstream2distro/tools.py'
1695--- branch-source-builder/cupstream2distro/tools.py 2014-02-19 16:34:46 +0000
1696+++ branch-source-builder/cupstream2distro/tools.py 2014-02-23 16:37:01 +0000
1697@@ -26,18 +26,21 @@
1698 from .settings import PROJECT_CONFIG_SUFFIX
1699 from .utils import ignored
1700
1701-WRAPPER_STRING = '''<testsuite errors="0" failures="{}" name="" tests="1" time="0.1">
1702+WRAPPER_STRING = '''<testsuite errors="0" failures="{}" name="" tests="1"
1703+time="0.1">
1704 <testcase classname="MarkUnstable" name={} time="0.0">{}</testcase>
1705 </testsuite>'''
1706
1707
1708 def generate_xml_artefacts(test_name, details, filename):
1709- '''Generate a fake test name xml result for marking the build as unstable'''
1710+ '''Generate a fake test name xml result for marking the build as
1711+ unstable'''
1712 failure = ""
1713 errnum = 0
1714 for detail in details:
1715 errnum = 1
1716- failure += ' <failure type="exception">{}</failure>\n'.format(escape(detail))
1717+ ex = escape(detail)
1718+ failure += ' <failure type="exception">{}</failure>\n'.format(ex)
1719 if failure:
1720 failure = '\n{}'.format(failure)
1721
1722@@ -52,7 +55,8 @@
1723 return config.get('Package', 'dest_current_version')
1724
1725
1726-def save_project_config(source_package_name, branch, revision, dest_current_version, current_packaging_version):
1727+def save_project_config(source_package_name, branch, revision,
1728+ dest_current_version, current_packaging_version):
1729 '''Save branch and package configuration'''
1730 config = ConfigParser.RawConfigParser()
1731 config.add_section('Branch')
1732@@ -61,21 +65,27 @@
1733 config.add_section('Package')
1734 config.set('Package', 'dest_current_version', dest_current_version)
1735 config.set('Package', 'packaging_version', current_packaging_version)
1736- with open("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX), 'wb') as configfile:
1737+ path = "{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)
1738+ with open(path, 'wb') as configfile:
1739 config.write(configfile)
1740
1741
1742 def get_packaging_diff_filename(source_package_name, packaging_version):
1743 '''Return the packaging diff filename'''
1744
1745- return "packaging_changes_{}_{}.diff".format(source_package_name, packaging_version)
1746+ ret = "packaging_changes_{}_{}.diff"
1747+ return ret.format(source_package_name, packaging_version)
1748
1749
1750 def mark_project_as_published(source_package_name, packaging_version):
1751- '''Rename .project and eventual diff files so that if we do a partial rebuild, we don't try to republish them'''
1752- project_filename = "{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX)
1753- os.rename(project_filename, "{}_{}".format(project_filename, packaging_version))
1754- diff_filename = get_packaging_diff_filename(source_package_name, packaging_version)
1755+ '''Rename .project and eventual diff files so that if we do a partial
1756+ rebuild, we don't try to republish them'''
1757+ tmpl = "{}.{}"
1758+ project_filename = tmpl.format(source_package_name, PROJECT_CONFIG_SUFFIX)
1759+ new_name = "{}_{}".format(project_filename, packaging_version)
1760+ os.rename(project_filename, new_name)
1761+ diff_filename = get_packaging_diff_filename(source_package_name,
1762+ packaging_version)
1763 if os.path.isfile(diff_filename):
1764 os.rename(diff_filename, "{}.published".format(diff_filename))
1765
1766
1767=== modified file 'charms/precise/python-django/hooks/hooks.py'
1768--- charms/precise/python-django/hooks/hooks.py 2014-02-18 20:11:41 +0000
1769+++ charms/precise/python-django/hooks/hooks.py 2014-02-23 16:37:01 +0000
1770@@ -562,7 +562,6 @@
1771
1772 def config_changed(config_data):
1773 os.environ['DJANGO_SETTINGS_MODULE'] = django_settings_modules
1774- django_admin_cmd = find_django_admin_cmd()
1775
1776 site_secret_key = config_data['site_secret_key']
1777 if not site_secret_key:
1778@@ -610,7 +609,7 @@
1779 else:
1780 run('git pull %s %s' % (repos_url, vcs_clone_dir))
1781 elif vcs == 'bzr' or vcs == 'bazaar':
1782- run('cd %s; bzr pull %s %s' % (vcs_clonse_dir, repos_url))
1783+ run('cd %s; bzr pull %s' % (vcs_clone_dir, repos_url))
1784 elif vcs == 'svn' or vcs == 'subversion':
1785 run('svn up %s %s' % (repos_url, vcs_clone_dir))
1786 else:
1787
1788=== modified file 'image-builder/imagebuilder/tests/test_style.py'
1789--- image-builder/imagebuilder/tests/test_style.py 2014-02-15 12:06:40 +0000
1790+++ image-builder/imagebuilder/tests/test_style.py 2014-02-23 16:37:01 +0000
1791@@ -14,15 +14,19 @@
1792 # along with this program. If not, see <http://www.gnu.org/licenses/>.
1793
1794 from ucitests import styles
1795-
1796-import imagebuilder
1797+# If we just import run_worker, we'll end up testing whichever run_worker was
1798+# last imported, so import it under a different name.
1799+import imagebuilder_run_worker
1800
1801
1802 class TestPep8(styles.TestPep8):
1803
1804- packages = [imagebuilder]
1805+ # uci-tests will scan all subdirectories that this module is found in, so
1806+ # we do not need to explicitly provide imagebuilder. Doing so would cause
1807+ # the tests to be run twice.
1808+ packages = [imagebuilder_run_worker]
1809
1810
1811 class TestPyflakes(styles.TestPyflakes):
1812
1813- packages = [imagebuilder]
1814+ packages = [imagebuilder_run_worker]
1815
1816=== added symlink 'image-builder/imagebuilder_run_worker.py'
1817=== target is u'run_worker'
1818=== modified file 'image-builder/run_worker'
1819--- image-builder/run_worker 2014-01-28 13:50:43 +0000
1820+++ image-builder/run_worker 2014-02-23 16:37:01 +0000
1821@@ -50,7 +50,7 @@
1822 if amqp_utils.progress_completed(trigger, {'image_id': image_id}):
1823 log.error(
1824 'Unable to notify progress-trigger completition of action')
1825- except Exception as e:
1826+ except Exception:
1827 type, val, tb = sys.exc_info()
1828 amqp_utils.progress_failed(trigger, {'message': val.message})
1829 log.exception('Image build failed:')
1830
1831=== modified file 'lander/lander/tests/test_style.py'
1832--- lander/lander/tests/test_style.py 2014-02-15 12:06:40 +0000
1833+++ lander/lander/tests/test_style.py 2014-02-23 16:37:01 +0000
1834@@ -14,15 +14,19 @@
1835 # along with this program. If not, see <http://www.gnu.org/licenses/>.
1836
1837 from ucitests import styles
1838-
1839-import lander
1840+# If we just import run_worker, we'll end up testing whichever run_worker was
1841+# last imported, so import it under a different name.
1842+import lander_run_worker
1843
1844
1845 class TestPep8(styles.TestPep8):
1846
1847- packages = [lander]
1848+ # uci-tests will scan all subdirectories that this module is found in, so
1849+ # we do not need to explicitly provide lander. Doing so would cause the
1850+ # tests to be run twice.
1851+ packages = [lander_run_worker]
1852
1853
1854 class TestPyflakes(styles.TestPyflakes):
1855
1856- packages = [lander]
1857+ packages = [lander_run_worker]
1858
1859=== added symlink 'lander/lander_run_worker.py'
1860=== target is u'run_worker'
1861=== modified file 'test_runner/tstrun/tests/test_style.py'
1862--- test_runner/tstrun/tests/test_style.py 2014-02-15 12:06:40 +0000
1863+++ test_runner/tstrun/tests/test_style.py 2014-02-23 16:37:01 +0000
1864@@ -16,13 +16,19 @@
1865 from ucitests import styles
1866
1867 import tstrun
1868+# If we just import run_worker, we'll end up testing whichever run_worker was
1869+# last imported, so import it under a different name.
1870+import tstrun_run_worker
1871
1872
1873 class TestPep8(styles.TestPep8):
1874
1875- packages = [tstrun]
1876+ # uci-tests will scan all subdirectories that this module is found in, so
1877+ # we do not need to explicitly provide tstrun. Doing so would cause
1878+ # the tests to be run twice.
1879+ packages = [tstrun_run_worker]
1880
1881
1882 class TestPyflakes(styles.TestPyflakes):
1883
1884- packages = [tstrun]
1885+ packages = [tstrun_run_worker]
1886
1887=== added symlink 'test_runner/tstrun_run_worker.py'
1888=== target is u'run_worker'
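The `*_run_worker.py` symlinks added above work around Python's module cache: `sys.modules` is keyed by module name, so a second plain `import run_worker` from another component silently reuses whichever `run_worker` loaded first. A simulation of the collision using `sys.modules` directly (the module objects are stand-ins, not the real scripts):

```python
import sys
import types

# Pretend image-builder's run_worker was imported first.
first = types.ModuleType('run_worker')
first.origin = 'image-builder'
sys.modules['run_worker'] = first

# A later "import run_worker" (say, from lander's tests) hits the cache
# and never reads lander's file at all.
import run_worker
assert run_worker.origin == 'image-builder'

# Distinct names (what the symlinks provide) get distinct cache slots.
second = types.ModuleType('lander_run_worker')
second.origin = 'lander'
sys.modules['lander_run_worker'] = second
import lander_run_worker
assert lander_run_worker.origin == 'lander'
```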
