Merge lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage into lp:ubuntu-ci-services-itself
Status: Needs review
Proposed branch: lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage
Merge into: lp:ubuntu-ci-services-itself
Diff against target:
1888 lines (+633/-308), 16 files modified:
- branch-source-builder/bsbuilder/tests/test_style.py (+8/-4)
- branch-source-builder/cupstream2distro/branchhandling.py (+75/-40)
- branch-source-builder/cupstream2distro/launchpadmanager.py (+38/-18)
- branch-source-builder/cupstream2distro/packageinppa.py (+86/-49)
- branch-source-builder/cupstream2distro/packageinppamanager.py (+24/-12)
- branch-source-builder/cupstream2distro/packagemanager.py (+204/-102)
- branch-source-builder/cupstream2distro/settings.py (+11/-5)
- branch-source-builder/cupstream2distro/silomanager.py (+30/-11)
- branch-source-builder/cupstream2distro/stack.py (+102/-40)
- branch-source-builder/cupstream2distro/stacks.py (+9/-4)
- branch-source-builder/cupstream2distro/tools.py (+20/-10)
- charms/precise/python-django/hooks/hooks.py (+1/-2)
- image-builder/imagebuilder/tests/test_style.py (+8/-4)
- image-builder/run_worker (+1/-1)
- lander/lander/tests/test_style.py (+8/-4)
- test_runner/tstrun/tests/test_style.py (+8/-2)
To merge this branch: bzr merge lp:~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage
Related bugs:
Reviewer | Review Type | Date Requested | Status
---|---|---|---
PS Jenkins bot (community) | continuous-integration | | Needs Fixing
Canonical CI Engineering | | | Pending
Review via email: mp+207801@code.launchpad.net
Commit message
Fix a large number of pep8 issues in cupstream2distro. Cover run_worker and other out-of-module scripts with the pep8 and pyflakes tests.
Description of the change
I noticed that pyflakes wasn't running against watch_ppa.py, which exists a level above the bsbuilder module.
While fixing this and running the tests, I was having a hard time pulling useful failures from the noise generated by cupstream2distro. So I fixed all of its pep8 and pyflakes issues.
We should be very careful in landing this branch. cupstream2distro is completely untested code and I made some logic changes to it.
You'll notice that there are symlinks to the run_worker script in a few modules. These exist for three reasons:
1. pep8 and pyflakes are hardcoded in uci-tests to look for *.py files.
2. Importing `run_worker` using imp creates a `run_workerc` bytecode file unless you tell Python to give the compiled code a different name (by providing an open file and path argument).
3. Our test harness imports all the run_worker scripts under the same namespace, so you end up using whichever was imported last. Providing a uniquely named symlink fixes this.
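The namespace collision in reason 3 can be sketched with a minimal example. This uses `importlib` rather than the `imp` module the harness uses, and the file name and contents are purely illustrative:

```python
import importlib.util
import os
import tempfile

# Create a stand-in for a run_worker script (contents are illustrative).
workdir = tempfile.mkdtemp()
script = os.path.join(workdir, "bsbuilder_run_worker.py")
with open(script, "w") as f:
    f.write("WORKER_NAME = 'bsbuilder'\n")

# Loading each script under a unique module name keeps multiple run_worker
# scripts from overwriting each other in sys.modules; loading them all as
# plain "run_worker" would leave only the last one imported visible.
spec = importlib.util.spec_from_file_location("bsbuilder_run_worker", script)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
print(module.WORKER_NAME)  # prints "bsbuilder"
```

A uniquely named symlink per module gives each run_worker exactly this kind of distinct import name, which is what the branch relies on.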
Note that there is still a pyflakes failure with this branch. That is bug 1283449.
PS Jenkins bot (ps-jenkins) wrote:
FAILED: Continuous integration, rev:281
http://
Executed test runs:
Ursula Junque (ursinha) wrote:
I like how elegantly this handles the huge messages (the part that was worrying me most). I've been working on adding tests to this code before actually making such changes, because they are already part of upstream and "only" need to be ported (at least basic unit tests); that would avoid headaches in case things start to fail because we accidentally changed the logic somewhere. If you agree to hold this for a bit, I think I can land tests first, then you can land this branch safely.
Unmerged revisions
- 281. By Evan: Expand the pyflakes/pep8 coverage for the test_runner as well.
- 280. By Evan: Fix the style tests. Specifying more than one module under the same directory causes it to be tested twice by pep8 and pyflakes. Modules imported with the same name will override each other.
- 279. By Evan: Fix pep8 issues in cupstream2distro packageinppa module.
- 278. By Evan: Fix pep8 issues in cupstream2distro packageinppamanager module.
- 277. By Evan: Fix pep8 issues in cupstream2distro packagemanager module.
- 276. By Evan: Fix pep8 issues in cupstream2distro silomanager module.
- 275. By Evan: Fix pep8 issues in cupstream2distro settings module.
- 274. By Evan: Fix pep8 issues in cupstream2distro stacks module.
- 273. By Evan: Oops. One more in tools.py.
- 272. By Evan: Fix pep8 issues in cupstream2distro tools module.
Preview Diff
1 | === modified file 'branch-source-builder/bsbuilder/tests/test_style.py' | |||
2 | --- branch-source-builder/bsbuilder/tests/test_style.py 2014-02-15 12:06:40 +0000 | |||
3 | +++ branch-source-builder/bsbuilder/tests/test_style.py 2014-02-23 16:37:01 +0000 | |||
4 | @@ -14,15 +14,19 @@ | |||
5 | 14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | 14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
6 | 15 | 15 | ||
7 | 16 | from ucitests import styles | 16 | from ucitests import styles |
10 | 17 | 17 | # If we just import run_worker, we'll end up testing whichever run_worker was | |
11 | 18 | import bsbuilder | 18 | # last imported, so import it under a different name. |
12 | 19 | import bsbuilder_run_worker | ||
13 | 19 | 20 | ||
14 | 20 | 21 | ||
15 | 21 | class TestPep8(styles.TestPep8): | 22 | class TestPep8(styles.TestPep8): |
16 | 22 | 23 | ||
18 | 23 | packages = [bsbuilder] | 24 | # uci-tests will scan all subdirectories that this module is found in, so |
19 | 25 | # we do not need to explicitly provide bsbuilder. Doing so would cause the | ||
20 | 26 | # tests to be run twice. | ||
21 | 27 | packages = [bsbuilder_run_worker] | ||
22 | 24 | 28 | ||
23 | 25 | 29 | ||
24 | 26 | class TestPyflakes(styles.TestPyflakes): | 30 | class TestPyflakes(styles.TestPyflakes): |
25 | 27 | 31 | ||
27 | 28 | packages = [bsbuilder] | 32 | packages = [bsbuilder_run_worker] |
28 | 29 | 33 | ||
29 | === added symlink 'branch-source-builder/bsbuilder_run_worker.py' | |||
30 | === target is u'run_worker' | |||
31 | === modified file 'branch-source-builder/cupstream2distro/branchhandling.py' | |||
32 | --- branch-source-builder/cupstream2distro/branchhandling.py 2014-02-19 16:34:46 +0000 | |||
33 | +++ branch-source-builder/cupstream2distro/branchhandling.py 2014-02-23 16:37:01 +0000 | |||
34 | @@ -23,13 +23,17 @@ | |||
35 | 23 | import os | 23 | import os |
36 | 24 | import re | 24 | import re |
37 | 25 | import subprocess | 25 | import subprocess |
38 | 26 | from subprocess import PIPE | ||
39 | 26 | 27 | ||
41 | 27 | from .settings import BRANCH_URL, IGNORECHANGELOG_COMMIT, PACKAGING_MERGE_COMMIT_MESSAGE, PROJECT_CONFIG_SUFFIX, SILO_PACKAGING_RELEASE_COMMIT_MESSAGE | 28 | from .settings import (BRANCH_URL, IGNORECHANGELOG_COMMIT, |
42 | 29 | PACKAGING_MERGE_COMMIT_MESSAGE, PROJECT_CONFIG_SUFFIX, | ||
43 | 30 | SILO_PACKAGING_RELEASE_COMMIT_MESSAGE) | ||
44 | 28 | 31 | ||
45 | 29 | 32 | ||
46 | 30 | def get_branch(branch_url, dest_dir): | 33 | def get_branch(branch_url, dest_dir): |
47 | 31 | '''Grab a branch''' | 34 | '''Grab a branch''' |
49 | 32 | instance = subprocess.Popen(["bzr", "branch", branch_url, dest_dir], stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 35 | cmd = ["bzr", "branch", branch_url, dest_dir] |
50 | 36 | instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE) | ||
51 | 33 | (stdout, stderr) = instance.communicate() | 37 | (stdout, stderr) = instance.communicate() |
52 | 34 | if instance.returncode != 0: | 38 | if instance.returncode != 0: |
53 | 35 | raise Exception(stderr.decode("utf-8").strip()) | 39 | raise Exception(stderr.decode("utf-8").strip()) |
54 | @@ -37,15 +41,18 @@ | |||
55 | 37 | 41 | ||
56 | 38 | def get_tip_bzr_revision(): | 42 | def get_tip_bzr_revision(): |
57 | 39 | '''Get latest revision in bzr''' | 43 | '''Get latest revision in bzr''' |
59 | 40 | instance = subprocess.Popen(["bzr", "log", "-c", "-1", "--line"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 44 | cmd = ["bzr", "log", "-c", "-1", "--line"] |
60 | 45 | instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE) | ||
61 | 41 | (stdout, stderr) = instance.communicate() | 46 | (stdout, stderr) = instance.communicate() |
62 | 42 | if instance.returncode != 0: | 47 | if instance.returncode != 0: |
63 | 43 | raise Exception(stderr.decode("utf-8").strip()) | 48 | raise Exception(stderr.decode("utf-8").strip()) |
64 | 44 | return (int(stdout.split(':')[0])) | 49 | return (int(stdout.split(':')[0])) |
65 | 45 | 50 | ||
66 | 46 | 51 | ||
69 | 47 | def collect_author_commits(content_to_parse, bugs_to_skip, additional_stamp=""): | 52 | def collect_author_commits(content_to_parse, bugs_to_skip, |
70 | 48 | '''return a tuple of a dict with authors and commits message from the content to parse | 53 | additional_stamp=""): |
71 | 54 | '''Return a tuple of a dict with authors and commits message from the | ||
72 | 55 | content to parse. | ||
73 | 49 | 56 | ||
74 | 50 | bugs_to_skip is a set of bugs we need to skip | 57 | bugs_to_skip is a set of bugs we need to skip |
75 | 51 | 58 | ||
76 | @@ -60,15 +67,17 @@ | |||
77 | 60 | commit_message_stenza = False | 67 | commit_message_stenza = False |
78 | 61 | for line in content_to_parse.splitlines(): | 68 | for line in content_to_parse.splitlines(): |
79 | 62 | # new revision, collect what we have found | 69 | # new revision, collect what we have found |
83 | 63 | if line.startswith("------------------------------------------------------------"): | 70 | dash = "------------------------------------------------------------" |
84 | 64 | # try to decipher a special case: we have some commits which were already in bugs_to_skip, | 71 | if line.startswith(dash): |
85 | 65 | # so we eliminate them. | 72 | # try to decipher a special case: we have some commits which were |
86 | 73 | # already in bugs_to_skip, so we eliminate them. | ||
87 | 66 | # Also ignore when having IGNORECHANGELOG_COMMIT | 74 | # Also ignore when having IGNORECHANGELOG_COMMIT |
93 | 67 | if (current_bugs and not (current_bugs - bugs_to_skip)) or IGNORECHANGELOG_COMMIT in current_commit: | 75 | if ((current_bugs and not (current_bugs - bugs_to_skip)) |
94 | 68 | current_authors = set() | 76 | or IGNORECHANGELOG_COMMIT in current_commit): |
95 | 69 | current_commit = "" | 77 | current_authors = set() |
96 | 70 | current_bugs = set() | 78 | current_commit = "" |
97 | 71 | continue | 79 | current_bugs = set() |
98 | 80 | continue | ||
99 | 72 | current_bugs -= bugs_to_skip | 81 | current_bugs -= bugs_to_skip |
100 | 73 | commit_message = current_commit + _format_bugs(current_bugs) | 82 | commit_message = current_commit + _format_bugs(current_bugs) |
101 | 74 | for author in current_authors: | 83 | for author in current_authors: |
102 | @@ -79,7 +88,8 @@ | |||
103 | 79 | current_commit = "" | 88 | current_commit = "" |
104 | 80 | current_bugs = set() | 89 | current_bugs = set() |
105 | 81 | 90 | ||
107 | 82 | # we ignore this commit if we have a changelog provided as part of the diff | 91 | # we ignore this commit if we have a changelog provided as part of the |
108 | 92 | # diff | ||
109 | 83 | if line.startswith("=== modified file 'debian/changelog'"): | 93 | if line.startswith("=== modified file 'debian/changelog'"): |
110 | 84 | current_authors = set() | 94 | current_authors = set() |
111 | 85 | current_commit = "" | 95 | current_commit = "" |
112 | @@ -94,14 +104,15 @@ | |||
113 | 94 | elif commit_message_stenza: | 104 | elif commit_message_stenza: |
114 | 95 | if line.startswith("diff:"): | 105 | if line.startswith("diff:"): |
115 | 96 | commit_message_stenza = False | 106 | commit_message_stenza = False |
117 | 97 | current_commit, current_bugs = _extract_commit_bugs(current_commit, additional_stamp) | 107 | args = (current_commit, additional_stamp) |
118 | 108 | current_commit, current_bugs = _extract_commit_bugs(*args) | ||
119 | 98 | else: | 109 | else: |
126 | 99 | line = line[2:] # Dedent the message provided by bzr | 110 | line = line[2:] # Dedent the message provided by bzr |
127 | 100 | if line[0:2] in ('* ', '- '): # paragraph line. | 111 | if line[0:2] in ('* ', '- '): # paragraph line. |
128 | 101 | line = line[2:] # Remove bullet | 112 | line = line[2:] # Remove bullet |
129 | 102 | if line[-1] != '.': # Grammar nazi... | 113 | if line[-1] != '.': # Grammar nazi... |
130 | 103 | line += '.' # ... or the lines will be merged. | 114 | line += '.' # ... or the lines will be merged. |
131 | 104 | line = line + ' ' # Add a space to preserve lines | 115 | line = line + ' ' # Add a space to preserve lines |
132 | 105 | current_commit += line | 116 | current_commit += line |
133 | 106 | # Maybe add something like that | 117 | # Maybe add something like that |
134 | 107 | #for content in mp.commit_message.split('\n'): | 118 | #for content in mp.commit_message.split('\n'): |
135 | @@ -127,10 +138,13 @@ | |||
136 | 127 | 138 | ||
137 | 128 | 139 | ||
138 | 129 | def _extract_commit_bugs(commit_message, additional_stamp=""): | 140 | def _extract_commit_bugs(commit_message, additional_stamp=""): |
140 | 130 | '''extract relevant commit message part and bugs number from a commit message''' | 141 | '''extract relevant commit message part and bugs number from a commit |
141 | 142 | message''' | ||
142 | 131 | 143 | ||
143 | 132 | current_bugs = _return_bugs(commit_message) | 144 | current_bugs = _return_bugs(commit_message) |
145 | 133 | changelog_content = " ".join(commit_message.rsplit('Fixes: ')[0].rsplit('Approved by ')[0].split()) | 145 | changelog_content = commit_message.rsplit('Fixes: ')[0] |
146 | 146 | changelog_content = changelog_content.rsplit('Approved by ')[0].split() | ||
147 | 147 | changelog_content = " ".join(changelog_content) | ||
148 | 134 | if additional_stamp: | 148 | if additional_stamp: |
149 | 135 | changelog_content = changelog_content + " " + additional_stamp | 149 | changelog_content = changelog_content + " " + additional_stamp |
150 | 136 | return (changelog_content, current_bugs) | 150 | return (changelog_content, current_bugs) |
151 | @@ -149,7 +163,8 @@ | |||
152 | 149 | # #12345 (but not 12345 for false positive) | 163 | # #12345 (but not 12345 for false positive) |
153 | 150 | # Support multiple bugs per commit | 164 | # Support multiple bugs per commit |
154 | 151 | bug_numbers = set() | 165 | bug_numbers = set() |
156 | 152 | bug_regexp = re.compile("((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})", re.IGNORECASE) | 166 | pattern = "((lp|bug|fix(es)?)[: #]*|#|launchpad.net/bugs/)(\d{5,})" |
157 | 167 | bug_regexp = re.compile(pattern, re.IGNORECASE) | ||
158 | 153 | for match in bug_regexp.findall(string): | 168 | for match in bug_regexp.findall(string): |
159 | 154 | logging.debug("Bug regexp match: {}".format(match[-1])) | 169 | logging.debug("Bug regexp match: {}".format(match[-1])) |
160 | 155 | bug_numbers.add(int(match[-1])) | 170 | bug_numbers.add(int(match[-1])) |
161 | @@ -170,7 +185,9 @@ | |||
162 | 170 | def return_log_diff(starting_rev): | 185 | def return_log_diff(starting_rev): |
163 | 171 | '''Return the relevant part of the cvs log since starting_rev''' | 186 | '''Return the relevant part of the cvs log since starting_rev''' |
164 | 172 | 187 | ||
166 | 173 | instance = subprocess.Popen(["bzr", "log", "-r", "{}..".format(starting_rev), "--show-diff", "--forward"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 188 | rev = "{}..".format(starting_rev) |
167 | 189 | cmd = ["bzr", "log", "-r", rev, "--show-diff", "--forward"] | ||
168 | 190 | instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE) | ||
169 | 174 | (stdout, stderr) = instance.communicate() | 191 | (stdout, stderr) = instance.communicate() |
170 | 175 | if instance.returncode != 0: | 192 | if instance.returncode != 0: |
171 | 176 | raise Exception(stderr.decode("utf-8").strip()) | 193 | raise Exception(stderr.decode("utf-8").strip()) |
172 | @@ -178,8 +195,10 @@ | |||
173 | 178 | 195 | ||
174 | 179 | 196 | ||
175 | 180 | def return_log_diff_since_last_release(content_to_parse): | 197 | def return_log_diff_since_last_release(content_to_parse): |
178 | 181 | '''From a bzr log content, return only the log diff since the latest release''' | 198 | '''From a bzr log content, return only the log diff since the latest |
179 | 182 | after_release = content_to_parse.split(SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(''))[-1] | 199 | release''' |
180 | 200 | formatted = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format('') | ||
181 | 201 | after_release = content_to_parse.split(formatted)[-1] | ||
182 | 183 | sep = '------------------------------------------------------------' | 202 | sep = '------------------------------------------------------------' |
183 | 184 | sep_index = after_release.find(sep) | 203 | sep_index = after_release.find(sep) |
184 | 185 | if sep_index != 1: | 204 | if sep_index != 1: |
185 | @@ -190,10 +209,11 @@ | |||
186 | 190 | def commit_release(new_package_version, tip_bzr_rev=None): | 209 | def commit_release(new_package_version, tip_bzr_rev=None): |
187 | 191 | '''Commit latest release''' | 210 | '''Commit latest release''' |
188 | 192 | if not tip_bzr_rev: | 211 | if not tip_bzr_rev: |
190 | 193 | message = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(new_package_version) | 212 | msg = SILO_PACKAGING_RELEASE_COMMIT_MESSAGE.format(new_package_version) |
191 | 194 | else: | 213 | else: |
194 | 195 | message = "Releasing {}, based on r{}".format(new_package_version, tip_bzr_rev) | 214 | msg = "Releasing {}, based on r{}" |
195 | 196 | if subprocess.call(["bzr", "commit", "-m", message]) != 0: | 215 | msg = msg.format(new_package_version, tip_bzr_rev) |
196 | 216 | if subprocess.call(["bzr", "commit", "-m", msg]) != 0: | ||
197 | 197 | raise Exception("The above command returned an error.") | 217 | raise Exception("The above command returned an error.") |
198 | 198 | 218 | ||
199 | 199 | 219 | ||
200 | @@ -214,20 +234,26 @@ | |||
201 | 214 | env["BZR_EDITOR"] = "echo" | 234 | env["BZR_EDITOR"] = "echo" |
202 | 215 | 235 | ||
203 | 216 | os.chdir(source_package_name) | 236 | os.chdir(source_package_name) |
205 | 217 | if subprocess.call(["bzr", "push", BRANCH_URL.format(source_package_name, version.replace("~", "").replace(":", "")), "--overwrite"]) != 0: | 237 | args = (source_package_name, version.replace("~", "").replace(":", "")) |
206 | 238 | loc = BRANCH_URL.format(*args) | ||
207 | 239 | if subprocess.call(["bzr", "push", loc, "--overwrite"]) != 0: | ||
208 | 218 | raise Exception("The push command returned an error.") | 240 | raise Exception("The push command returned an error.") |
210 | 219 | mergeinstance = subprocess.Popen(["bzr", "lp-propose-merge", parent_branch, "-m", PACKAGING_MERGE_COMMIT_MESSAGE.format(version, tip_rev, branch), "--approve"], stdin=subprocess.PIPE, env=env) | 241 | msg = PACKAGING_MERGE_COMMIT_MESSAGE.format(version, tip_rev, branch) |
211 | 242 | cmd = ["bzr", "lp-propose-merge", parent_branch, "-m", msg, "--approve"] | ||
212 | 243 | mergeinstance = subprocess.Popen(cmd, stdin=PIPE, env=env) | ||
213 | 220 | mergeinstance.communicate(input="y") | 244 | mergeinstance.communicate(input="y") |
214 | 221 | if mergeinstance.returncode != 0: | 245 | if mergeinstance.returncode != 0: |
215 | 222 | raise Exception("The lp-propose command returned an error.") | 246 | raise Exception("The lp-propose command returned an error.") |
216 | 223 | os.chdir('..') | 247 | os.chdir('..') |
217 | 224 | 248 | ||
218 | 225 | 249 | ||
220 | 226 | def merge_branch_with_parent_into(local_branch_uri, lp_parent_branch, dest_uri, commit_message, revision): | 250 | def merge_branch_with_parent_into(local_branch_uri, lp_parent_branch, dest_uri, |
221 | 251 | commit_message, revision): | ||
222 | 227 | """Merge local branch into lp_parent_branch at revision""" | 252 | """Merge local branch into lp_parent_branch at revision""" |
223 | 228 | success = False | 253 | success = False |
224 | 229 | cur_dir = os.path.abspath('.') | 254 | cur_dir = os.path.abspath('.') |
226 | 230 | subprocess.call(["bzr", "branch", "-r", str(revision), lp_parent_branch, dest_uri]) | 255 | cmd = ["bzr", "branch", "-r", str(revision), lp_parent_branch, dest_uri] |
227 | 256 | subprocess.call(cmd) | ||
228 | 231 | os.chdir(dest_uri) | 257 | os.chdir(dest_uri) |
229 | 232 | if subprocess.call(["bzr", "merge", local_branch_uri]) == 0: | 258 | if subprocess.call(["bzr", "merge", local_branch_uri]) == 0: |
230 | 233 | subprocess.call(["bzr", "commit", "-m", commit_message]) | 259 | subprocess.call(["bzr", "commit", "-m", commit_message]) |
231 | @@ -236,12 +262,14 @@ | |||
232 | 236 | return success | 262 | return success |
233 | 237 | 263 | ||
234 | 238 | 264 | ||
236 | 239 | def merge_branch(uri_to_merge, lp_parent_branch, commit_message, authors=set()): | 265 | def merge_branch(uri_to_merge, lp_parent_branch, commit_message, |
237 | 266 | authors=set()): | ||
238 | 240 | """Resync with targeted branch if possible""" | 267 | """Resync with targeted branch if possible""" |
239 | 241 | success = False | 268 | success = False |
240 | 242 | cur_dir = os.path.abspath('.') | 269 | cur_dir = os.path.abspath('.') |
241 | 243 | os.chdir(uri_to_merge) | 270 | os.chdir(uri_to_merge) |
243 | 244 | lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:") | 271 | args = ("https://code.launchpad.net/", "lp:") |
244 | 272 | lp_parent_branch = lp_parent_branch.replace(*args) | ||
245 | 245 | if subprocess.call(["bzr", "merge", lp_parent_branch]) == 0: | 273 | if subprocess.call(["bzr", "merge", lp_parent_branch]) == 0: |
246 | 246 | cmd = ["bzr", "commit", "-m", commit_message, "--unchanged"] | 274 | cmd = ["bzr", "commit", "-m", commit_message, "--unchanged"] |
247 | 247 | for author in authors: | 275 | for author in authors: |
248 | @@ -251,12 +279,14 @@ | |||
249 | 251 | os.chdir(cur_dir) | 279 | os.chdir(cur_dir) |
250 | 252 | return success | 280 | return success |
251 | 253 | 281 | ||
252 | 282 | |||
253 | 254 | def push_to_branch(source_uri, lp_parent_branch, overwrite=False): | 283 | def push_to_branch(source_uri, lp_parent_branch, overwrite=False): |
254 | 255 | """Push source to parent branch""" | 284 | """Push source to parent branch""" |
255 | 256 | success = False | 285 | success = False |
256 | 257 | cur_dir = os.path.abspath('.') | 286 | cur_dir = os.path.abspath('.') |
257 | 258 | os.chdir(source_uri) | 287 | os.chdir(source_uri) |
259 | 259 | lp_parent_branch = lp_parent_branch.replace("https://code.launchpad.net/", "lp:") | 288 | args = ("https://code.launchpad.net/", "lp:") |
260 | 289 | lp_parent_branch = lp_parent_branch.replace(*args) | ||
261 | 260 | command = ["bzr", "push", lp_parent_branch] | 290 | command = ["bzr", "push", lp_parent_branch] |
262 | 261 | if overwrite: | 291 | if overwrite: |
263 | 262 | command.append("--overwrite") | 292 | command.append("--overwrite") |
264 | @@ -265,16 +295,21 @@ | |||
265 | 265 | os.chdir(cur_dir) | 295 | os.chdir(cur_dir) |
266 | 266 | return success | 296 | return success |
267 | 267 | 297 | ||
268 | 298 | |||
269 | 268 | def grab_committers_compared_to(source_uri, lp_branch_to_scan): | 299 | def grab_committers_compared_to(source_uri, lp_branch_to_scan): |
270 | 269 | """Return unique list of committers for a given branch""" | 300 | """Return unique list of committers for a given branch""" |
271 | 270 | committers = set() | 301 | committers = set() |
272 | 271 | cur_dir = os.path.abspath('.') | 302 | cur_dir = os.path.abspath('.') |
273 | 272 | os.chdir(source_uri) | 303 | os.chdir(source_uri) |
276 | 273 | lp_branch_to_scan = lp_branch_to_scan.replace("https://code.launchpad.net/", "lp:") | 304 | args = ("https://code.launchpad.net/", "lp:") |
277 | 274 | instance = subprocess.Popen(["bzr", "missing", lp_branch_to_scan, "--other"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 305 | lp_branch_to_scan = lp_branch_to_scan.replace(*args) |
278 | 306 | cmd = ["bzr", "missing", lp_branch_to_scan, "--other"] | ||
279 | 307 | instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE) | ||
280 | 275 | (stdout, stderr) = instance.communicate() | 308 | (stdout, stderr) = instance.communicate() |
281 | 276 | if stderr != "": | 309 | if stderr != "": |
283 | 277 | raise Exception("bzr missing on {} returned a failure: {}".format(lp_branch_to_scan, stderr.decode("utf-8").strip())) | 310 | msg = "bzr missing on {} returned a failure: {}" |
284 | 311 | msg = msg.format(lp_branch_to_scan, stderr.decode("utf-8").strip()) | ||
285 | 312 | raise Exception(msg) | ||
286 | 278 | committer_regexp = re.compile("\ncommitter: (.*)\n") | 313 | committer_regexp = re.compile("\ncommitter: (.*)\n") |
287 | 279 | for match in committer_regexp.findall(stdout): | 314 | for match in committer_regexp.findall(stdout): |
288 | 280 | for committer in match.split(', '): | 315 | for committer in match.split(', '): |
289 | 281 | 316 | ||
290 | === modified file 'branch-source-builder/cupstream2distro/launchpadmanager.py' | |||
291 | --- branch-source-builder/cupstream2distro/launchpadmanager.py 2014-02-19 16:34:46 +0000 | |||
292 | +++ branch-source-builder/cupstream2distro/launchpadmanager.py 2014-02-23 16:37:01 +0000 | |||
293 | @@ -25,10 +25,13 @@ | |||
294 | 25 | import os | 25 | import os |
295 | 26 | launchpad = None | 26 | launchpad = None |
296 | 27 | 27 | ||
301 | 28 | from .settings import ARCHS_TO_EVENTUALLY_IGNORE, ARCHS_TO_UNCONDITIONALLY_IGNORE, VIRTUALIZED_PPA_ARCH, CRED_FILE_PATH, COMMON_LAUNCHPAD_CACHE_DIR | 28 | from .settings import (ARCHS_TO_EVENTUALLY_IGNORE, |
302 | 29 | 29 | ARCHS_TO_UNCONDITIONALLY_IGNORE, VIRTUALIZED_PPA_ARCH, | |
303 | 30 | 30 | CRED_FILE_PATH, COMMON_LAUNCHPAD_CACHE_DIR) | |
304 | 31 | def get_launchpad(use_staging=False, use_cred_file=os.path.expanduser(CRED_FILE_PATH)): | 31 | |
305 | 32 | |||
306 | 33 | def get_launchpad(use_staging=False, | ||
307 | 34 | use_cred_file=os.path.expanduser(CRED_FILE_PATH)): | ||
308 | 32 | '''Get THE Launchpad''' | 35 | '''Get THE Launchpad''' |
309 | 33 | global launchpad | 36 | global launchpad |
310 | 34 | if not launchpad: | 37 | if not launchpad: |
311 | @@ -42,14 +45,20 @@ | |||
312 | 42 | os.makdedirs(launchpadlib_dir) | 45 | os.makdedirs(launchpadlib_dir) |
313 | 43 | 46 | ||
314 | 44 | if use_cred_file: | 47 | if use_cred_file: |
319 | 45 | launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"], | 48 | launchpad = Launchpad.login_with( |
320 | 46 | version='devel', # devel because copyPackage is only available there | 49 | 'cupstream2distro', server, |
321 | 47 | credentials_file=use_cred_file, | 50 | allow_access_levels=["WRITE_PRIVATE"], |
322 | 48 | launchpadlib_dir=launchpadlib_dir) | 51 | # devel because copyPackage is only available there |
323 | 52 | version='devel', | ||
324 | 53 | credentials_file=use_cred_file, | ||
325 | 54 | launchpadlib_dir=launchpadlib_dir) | ||
326 | 49 | else: | 55 | else: |
330 | 50 | launchpad = Launchpad.login_with('cupstream2distro', server, allow_access_levels=["WRITE_PRIVATE"], | 56 | launchpad = Launchpad.login_with( |
331 | 51 | version='devel', # devel because copyPackage is only available there | 57 | 'cupstream2distro', server, |
332 | 52 | launchpadlib_dir=launchpadlib_dir) | 58 | allow_access_levels=["WRITE_PRIVATE"], |
333 | 59 | # devel because copyPackage is only available there | ||
334 | 60 | version='devel', | ||
335 | 61 | launchpadlib_dir=launchpadlib_dir) | ||
336 | 53 | 62 | ||
337 | 54 | return launchpad | 63 | return launchpad |
338 | 55 | 64 | ||
339 | @@ -77,7 +86,8 @@ | |||
340 | 77 | bug_title_sets = set() | 86 | bug_title_sets = set() |
341 | 78 | for bug in author_bugs[author]: | 87 | for bug in author_bugs[author]: |
342 | 79 | try: | 88 | try: |
344 | 80 | bug_title_sets.add("{} (LP: #{})".format(lp.bugs[bug].title, bug)) | 89 | title = "{} (LP: #{})".format(lp.bugs[bug].title, bug) |
345 | 90 | bug_title_sets.add(title) | ||
346 | 81 | except KeyError: | 91 | except KeyError: |
347 | 82 | # still list non existing or if launchpad timeouts bugs | 92 | # still list non existing or if launchpad timeouts bugs |
348 | 83 | bug_title_sets.add(u"Fix LP: #{}".format(bug)) | 93 | bug_title_sets.add(u"Fix LP: #{}".format(bug)) |
349 | @@ -102,9 +112,12 @@ | |||
350 | 102 | bug = lp.bugs[bug_num] | 112 | bug = lp.bugs[bug_num] |
351 | 103 | bug.addTask(target=package) | 113 | bug.addTask(target=package) |
352 | 104 | bug.lp_save() | 114 | bug.lp_save() |
356 | 105 | except (KeyError, lazr.restfulclient.errors.BadRequest, lazr.restfulclient.errors.ServerError): | 115 | except (KeyError, lazr.restfulclient.errors.BadRequest, |
357 | 106 | # ignore non existing or available bugs | 116 | lazr.restfulclient.errors.ServerError): |
358 | 107 | logging.info("Can't synchronize upstream/downstream bugs for bug #{}. Not blocking on that.".format(bug_num)) | 117 | # ignore non existing or available bugs |
359 | 118 | m = ("Can't synchronize upstream/downstream bugs " | ||
360 | 119 | "for bug #{}. Not blocking on that.".format(bug_num)) | ||
361 | 120 | logging.info(m) | ||
362 | 108 | 121 | ||
363 | 109 | 122 | ||
364 | 110 | def get_available_and_all_archs(series, ppa=None): | 123 | def get_available_and_all_archs(series, ppa=None): |
365 | @@ -115,7 +128,8 @@ | |||
366 | 115 | arch_all_arch = VIRTUALIZED_PPA_ARCH[0] | 128 | arch_all_arch = VIRTUALIZED_PPA_ARCH[0] |
367 | 116 | else: | 129 | else: |
368 | 117 | for arch in series.architectures: | 130 | for arch in series.architectures: |
370 | 118 | # HACK: filters armel as it's still seen as available on raring: https://launchpad.net/bugs/1077257 | 131 | # HACK: filters armel as it's still seen as available on raring: |
371 | 132 | # https://launchpad.net/bugs/1077257 | ||
372 | 119 | if arch.architecture_tag == "armel": | 133 | if arch.architecture_tag == "armel": |
373 | 120 | continue | 134 | continue |
374 | 121 | available_arch.add(arch.architecture_tag) | 135 | available_arch.add(arch.architecture_tag) |
375 | @@ -135,23 +149,29 @@ | |||
376 | 135 | if ppa_name.startswith("ppa:"): | 149 | if ppa_name.startswith("ppa:"): |
377 | 136 | ppa_name = ppa_name[4:] | 150 | ppa_name = ppa_name[4:] |
378 | 137 | ppa_dispatch = ppa_name.split("/") | 151 | ppa_dispatch = ppa_name.split("/") |
380 | 138 | return get_launchpad().people[ppa_dispatch[0]].getPPAByName(name=ppa_dispatch[1]) | 152 | person = get_launchpad().people[ppa_dispatch[0]] |
381 | 153 | return person.getPPAByName(name=ppa_dispatch[1]) | ||
382 | 154 | |||
383 | 139 | 155 | ||
384 | 140 | def is_series_current(series_name): | 156 | def is_series_current(series_name): |
385 | 141 | '''Return if series_name is the edge development version''' | 157 | '''Return if series_name is the edge development version''' |
386 | 142 | return get_ubuntu().current_series.name == series_name | 158 | return get_ubuntu().current_series.name == series_name |
387 | 143 | 159 | ||
388 | 160 | |||
389 | 144 | def get_resource_from_url(url): | 161 | def get_resource_from_url(url): |
390 | 145 | '''Return a lp resource from a launchpad url''' | 162 | '''Return a lp resource from a launchpad url''' |
391 | 146 | lp = get_launchpad() | 163 | lp = get_launchpad() |
393 | 147 | url = lp.resource_type_link.replace("/#service-root", "") + url.split("launchpad.net")[1] | 164 | url = lp.resource_type_link.replace("/#service-root", "") |
394 | 165 | url = url + url.split("launchpad.net")[1] | ||
395 | 148 | return lp.load(url) | 166 | return lp.load(url) |
396 | 149 | 167 | ||
397 | 168 | |||
398 | 150 | def get_resource_from_token(url): | 169 | def get_resource_from_token(url): |
399 | 151 | '''Return a lp resource from a launchpad token''' | 170 | '''Return a lp resource from a launchpad token''' |
400 | 152 | lp = get_launchpad() | 171 | lp = get_launchpad() |
401 | 153 | return lp.load(url) | 172 | return lp.load(url) |
402 | 154 | 173 | ||
403 | 174 | |||
404 | 155 | def is_dest_ubuntu_archive(series_link): | 175 | def is_dest_ubuntu_archive(series_link): |
405 | 156 | '''return if series_link is the ubuntu archive''' | 176 | '''return if series_link is the ubuntu archive''' |
406 | 157 | return series_link == get_ubuntu_archive().self_link | 177 | return series_link == get_ubuntu_archive().self_link |
407 | 158 | 178 | ||
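The `launchpadmanager` hunks above wrap the `get_ppa` lookup, which accepts either `ppa:owner/name` or bare `owner/name`. The Launchpad lookup itself needs an authenticated client, but the name-splitting step can be sketched standalone (function name `parse_ppa_name` is hypothetical, not from the module):

```python
def parse_ppa_name(ppa_name):
    """Split 'ppa:owner/name' (or bare 'owner/name') into (owner, name)."""
    if ppa_name.startswith("ppa:"):
        ppa_name = ppa_name[4:]
    # partition never raises, even if "/" is missing
    owner, _, name = ppa_name.partition("/")
    return owner, name
```

The two parts then feed `lp.people[owner].getPPAByName(name=name)`, as the refactored `get_ppa` does.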
408 | === modified file 'branch-source-builder/cupstream2distro/packageinppa.py' | |||
409 | --- branch-source-builder/cupstream2distro/packageinppa.py 2014-02-19 16:34:46 +0000 | |||
410 | +++ branch-source-builder/cupstream2distro/packageinppa.py 2014-02-23 16:37:01 +0000 | |||
411 | @@ -27,8 +27,9 @@ | |||
412 | 27 | (BUILDING, FAILED, PUBLISHED) = range(3) | 27 | (BUILDING, FAILED, PUBLISHED) = range(3) |
413 | 28 | 28 | ||
414 | 29 | def __init__(self, source_name, version, ppa, destarchive, series, | 29 | def __init__(self, source_name, version, ppa, destarchive, series, |
417 | 30 | available_archs_in_ppa, arch_all_arch, archs_to_eventually_ignore, | 30 | available_archs_in_ppa, arch_all_arch, |
418 | 31 | archs_to_unconditionually_ignore, package_archs=None): | 31 | archs_to_eventually_ignore, archs_to_unconditionually_ignore, |
419 | 32 | package_archs=None): | ||
420 | 32 | self.source_name = source_name | 33 | self.source_name = source_name |
421 | 33 | self.version = version | 34 | self.version = version |
422 | 34 | self.series = series | 35 | self.series = series |
423 | @@ -39,7 +40,8 @@ | |||
424 | 39 | # Get archs we should look at | 40 | # Get archs we should look at |
425 | 40 | version_for_source_file = version.split(':')[-1] | 41 | version_for_source_file = version.split(':')[-1] |
426 | 41 | if not package_archs: | 42 | if not package_archs: |
428 | 42 | dsc_filename = "{}_{}.dsc".format(source_name, version_for_source_file) | 43 | tmpl = "{}_{}.dsc" |
429 | 44 | dsc_filename = tmpl.format(source_name, version_for_source_file) | ||
430 | 43 | regexp = re.compile("^Architecture: (.*)\n") | 45 | regexp = re.compile("^Architecture: (.*)\n") |
431 | 44 | for line in open(dsc_filename): | 46 | for line in open(dsc_filename): |
432 | 45 | arch_lists = regexp.findall(line) | 47 | arch_lists = regexp.findall(line) |
433 | @@ -57,21 +59,28 @@ | |||
434 | 57 | archs_supported_by_package = set() | 59 | archs_supported_by_package = set() |
435 | 58 | for arch in package_archs.split(): | 60 | for arch in package_archs.split(): |
436 | 59 | archs_supported_by_package.add(arch) | 61 | archs_supported_by_package.add(arch) |
439 | 60 | self.archs = archs_supported_by_package.intersection(available_archs_in_ppa) | 62 | intersection = archs_supported_by_package.intersection |
440 | 61 | # ignore some eventual archs if doesn't exist in latest published version in dest | 63 | self.archs = intersection(available_archs_in_ppa) |
441 | 64 | # ignore some eventual archs if doesn't exist in latest published | ||
442 | 65 | # version in dest | ||
443 | 62 | archs_to_eventually_ignore = archs_to_eventually_ignore.copy() | 66 | archs_to_eventually_ignore = archs_to_eventually_ignore.copy() |
444 | 63 | if archs_to_eventually_ignore: | 67 | if archs_to_eventually_ignore: |
445 | 64 | try: | 68 | try: |
448 | 65 | previous_source = destarchive.getPublishedSources(exact_match=True, source_name=self.source_name, | 69 | kw = {'exact_match': True, 'source_name': self.source_name, |
449 | 66 | distro_series=self.series, status="Published")[0] | 70 | 'distro_series': self.series, 'status': "Published"} |
450 | 71 | previous_source = destarchive.getPublishedSources(**kw)[0] | ||
451 | 67 | for binary in previous_source.getPublishedBinaries(): | 72 | for binary in previous_source.getPublishedBinaries(): |
454 | 68 | if binary.architecture_specific and binary.distro_arch_series.architecture_tag in archs_to_eventually_ignore: | 73 | if (binary.architecture_specific and |
455 | 69 | archs_to_eventually_ignore -= set([binary.distro_arch_series.architecture_tag]) | 74 | binary.distro_arch_series.architecture_tag in |
456 | 75 | archs_to_eventually_ignore): | ||
457 | 76 | rem = [binary.distro_arch_series.architecture_tag] | ||
458 | 77 | archs_to_eventually_ignore -= set(rem) | ||
459 | 70 | if not archs_to_eventually_ignore: | 78 | if not archs_to_eventually_ignore: |
460 | 71 | break | 79 | break |
461 | 72 | 80 | ||
462 | 73 | except IndexError: | 81 | except IndexError: |
464 | 74 | # no package in dest, don't wait on any archs_to_eventually_ignore | 82 | # no package in dest, don't wait on any |
465 | 83 | # archs_to_eventually_ignore | ||
466 | 75 | pass | 84 | pass |
467 | 76 | # remove from the inspection remaining archs to ignore | 85 | # remove from the inspection remaining archs to ignore |
468 | 77 | if archs_to_eventually_ignore: | 86 | if archs_to_eventually_ignore: |
469 | @@ -79,11 +88,9 @@ | |||
470 | 79 | if archs_to_unconditionually_ignore: | 88 | if archs_to_unconditionually_ignore: |
471 | 80 | self.archs -= archs_to_unconditionually_ignore | 89 | self.archs -= archs_to_unconditionually_ignore |
472 | 81 | 90 | ||
473 | 82 | |||
474 | 83 | def __repr__(self): | 91 | def __repr__(self): |
475 | 84 | return '{} - {}'.format(self.source_name, self.version) | 92 | return '{} - {}'.format(self.source_name, self.version) |
476 | 85 | 93 | ||
477 | 86 | |||
478 | 87 | def get_status(self, on_particular_arch=None): | 94 | def get_status(self, on_particular_arch=None): |
479 | 88 | '''Look at the package status in the ppa | 95 | '''Look at the package status in the ppa |
480 | 89 | 96 | ||
481 | @@ -120,11 +127,15 @@ | |||
482 | 120 | 127 | ||
483 | 121 | for arch in self.archs.copy(): | 128 | for arch in self.archs.copy(): |
484 | 122 | if os.path.isfile("{}.{}.ignore".format(self.source_name, arch)): | 129 | if os.path.isfile("{}.{}.ignore".format(self.source_name, arch)): |
486 | 123 | logging.warning("Request to ignore {} on {}.".format(self.source_name, arch)) | 130 | tmpl = "Request to ignore {} on {}." |
487 | 131 | logging.warning(tmpl.format(self.source_name, arch)) | ||
488 | 124 | try: | 132 | try: |
489 | 125 | self.archs.remove(arch) | 133 | self.archs.remove(arch) |
490 | 126 | except ValueError: | 134 | except ValueError: |
492 | 127 | logging.warning("Request to ignore {} on {} has been proceeded, but this one wasn't in the list we were monitor for.".format(self.source_name, arch)) | 135 | tmpl = ("Request to ignore {} on {} has been proceeded, " |
493 | 136 | "but this one wasn't in the list we were monitor " | ||
494 | 137 | "for.") | ||
495 | 138 | logging.warning(tmpl.format(self.source_name, arch)) | ||
496 | 128 | try: | 139 | try: |
497 | 129 | self.current_status.pop(arch) | 140 | self.current_status.pop(arch) |
498 | 130 | except KeyError: | 141 | except KeyError: |
499 | @@ -137,10 +148,12 @@ | |||
500 | 137 | 148 | ||
501 | 138 | # first step, get the source published | 149 | # first step, get the source published |
502 | 139 | if not self.current_status: | 150 | if not self.current_status: |
504 | 140 | (self.current_status, self.source) = self._get_status_for_source_package_in_ppa() | 151 | status = self._get_status_for_source_package_in_ppa() |
505 | 152 | (self.current_status, self.source) = status | ||
506 | 141 | # check the binary status | 153 | # check the binary status |
507 | 142 | if self.current_status: | 154 | if self.current_status: |
509 | 143 | self.current_status = self._get_status_for_binary_packages_in_ppa() | 155 | status = self._get_status_for_binary_packages_in_ppa() |
510 | 156 | self.current_status = status | ||
511 | 144 | 157 | ||
512 | 145 | def _get_status_for_source_package_in_ppa(self): | 158 | def _get_status_for_source_package_in_ppa(self): |
513 | 146 | '''Return current_status for source package in ppa. | 159 | '''Return current_status for source package in ppa. |
514 | @@ -149,13 +162,16 @@ | |||
515 | 149 | - None -> not visible yet | 162 | - None -> not visible yet |
516 | 150 | - BUILDING -> currently Building (or waiting to build) | 163 | - BUILDING -> currently Building (or waiting to build) |
517 | 151 | - FAILED -> Build failed for this arch or has been canceled | 164 | - FAILED -> Build failed for this arch or has been canceled |
519 | 152 | - PUBLISHED -> All packages (including arch:all from other archs) published. | 165 | - PUBLISHED -> All packages (including arch:all from other archs) |
520 | 166 | published. | ||
521 | 153 | 167 | ||
524 | 154 | Only the 2 first status are returned by this call. See _get_status_for_binary_packages_in_ppa | 168 | Only the 2 first status are returned by this call. See |
525 | 155 | for the others.''' | 169 | _get_status_for_binary_packages_in_ppa for the others.''' |
526 | 156 | 170 | ||
527 | 157 | try: | 171 | try: |
529 | 158 | source = self.ppa.getPublishedSources(exact_match=True, source_name=self.source_name, version=self.version, distro_series=self.series)[0] | 172 | kw = {'exact_match': True, 'source_name': self.source_name, |
530 | 173 | 'version': self.version, 'distro_series': self.series} | ||
531 | 174 | source = self.ppa.getPublishedSources(**kw)[0] | ||
532 | 159 | logging.info("Source available in ppa") | 175 | logging.info("Source available in ppa") |
533 | 160 | current_status = {} | 176 | current_status = {} |
534 | 161 | for arch in self.archs: | 177 | for arch in self.archs: |
535 | @@ -172,26 +188,36 @@ | |||
536 | 172 | - None -> not visible yet | 188 | - None -> not visible yet |
537 | 173 | - BUILDING -> currently Building (or waiting to build) | 189 | - BUILDING -> currently Building (or waiting to build) |
538 | 174 | - FAILED -> Build failed for this arch or has been canceled | 190 | - FAILED -> Build failed for this arch or has been canceled |
545 | 175 | - PUBLISHED -> All packages (including arch:all from other archs) published. | 191 | - PUBLISHED -> All packages (including arch:all from other archs) |
546 | 176 | 192 | published. | |
547 | 177 | Only the 3 last statuses are returned by this call. See _get_status_for_source_package_in_ppa | 193 | |
548 | 178 | for the other.''' | 194 | Only the 3 last statuses are returned by this call. See |
549 | 179 | 195 | _get_status_for_source_package_in_ppa for the other.''' | |
550 | 180 | # Try to see if all binaries availables for this arch are built, including arch:all on other archs | 196 | |
551 | 197 | # Try to see if all binaries availables for this arch are built, | ||
552 | 198 | # including arch:all on other archs | ||
553 | 181 | status = self.current_status | 199 | status = self.current_status |
554 | 182 | at_least_one_published_binary = False | ||
555 | 183 | for binary in self.source.getPublishedBinaries(): | 200 | for binary in self.source.getPublishedBinaries(): |
556 | 184 | at_least_one_published_binary = True | ||
557 | 185 | # all binaries for an arch are published at the same time | 201 | # all binaries for an arch are published at the same time |
563 | 186 | # launchpad is lying, it's telling that archs not in the ppa are built (for arch:all). Even for non supported arch! | 202 | # launchpad is lying, it's telling that archs not in the ppa are |
564 | 187 | # for instance, we can have the case of self.arch_all_arch (arch:all), built before the others and amd64 will be built for it | 203 | # built (for arch:all). Even for non supported arch! |
565 | 188 | if binary.status == "Published" and (binary.distro_arch_series.architecture_tag == self.arch_all_arch or | 204 | # for instance, we can have the case of self.arch_all_arch |
566 | 189 | (binary.distro_arch_series.architecture_tag != self.arch_all_arch and binary.architecture_specific)): | 205 | # (arch:all), built before the others and amd64 will be built for |
567 | 190 | status[binary.distro_arch_series.architecture_tag] = self.PUBLISHED | 206 | # it |
568 | 207 | published = binary.status == "Published" | ||
569 | 208 | arch_tag = binary.distro_arch_series.architecture_tag | ||
570 | 209 | not_arch_all = arch_tag != self.arch_all_arch | ||
571 | 210 | arch_all = arch_tag == self.arch_all_arch | ||
572 | 211 | arch_specific = binary.architecture_specific | ||
573 | 212 | if published and (arch_all or (not_arch_all and arch_specific)): | ||
574 | 213 | status[arch_tag] = self.PUBLISHED | ||
575 | 191 | 214 | ||
577 | 192 | # Looking for builds on archs still BUILDING (just loop on builds once to avoid too many lp requests) | 215 | # Looking for builds on archs still BUILDING (just loop on builds once |
578 | 216 | # to avoid too many lp requests) | ||
579 | 193 | needs_checking_build = False | 217 | needs_checking_build = False |
581 | 194 | build_state_failed = ('Failed to build', 'Chroot problem', 'Failed to upload', 'Cancelled build', 'Build for superseded Source') | 218 | build_state_failed = ('Failed to build', 'Chroot problem', |
582 | 219 | 'Failed to upload', 'Cancelled build', | ||
583 | 220 | 'Build for superseded Source') | ||
584 | 195 | for arch in self.archs: | 221 | for arch in self.archs: |
585 | 196 | if self.current_status[arch] == self.BUILDING: | 222 | if self.current_status[arch] == self.BUILDING: |
586 | 197 | needs_checking_build = True | 223 | needs_checking_build = True |
587 | @@ -202,24 +228,35 @@ | |||
588 | 202 | continue | 228 | continue |
589 | 203 | if self.current_status[build.arch_tag] == self.BUILDING: | 229 | if self.current_status[build.arch_tag] == self.BUILDING: |
590 | 204 | if build.buildstate in build_state_failed: | 230 | if build.buildstate in build_state_failed: |
593 | 205 | logging.error("{}: Build {} ({}) failed because of {}".format(build.arch_tag, build.title, | 231 | tmpl = "{}: Build {} ({}) failed because of {}" |
594 | 206 | build.web_link, build.buildstate)) | 232 | logging.error(tmpl.format(build.arch_tag, build.title, |
595 | 233 | build.web_link, build.buildstate)) | ||
596 | 207 | status[build.arch_tag] = self.FAILED | 234 | status[build.arch_tag] = self.FAILED |
604 | 208 | # Another launchpad trick: if a binary arch was published, but then is superseeded, getPublishedBinaries() won't list | 235 | # Another launchpad trick: if a binary arch was published, |
605 | 209 | # those binaries anymore. So it's seen as BUILDING again. | 236 | # but then is superseeded, getPublishedBinaries() won't |
606 | 210 | # If there is a successful build record of it and the source is superseded, it means that it built fine at some point, | 237 | # list those binaries anymore. So it's seen as BUILDING |
607 | 211 | # Another arch will fail as superseeded. | 238 | # again. If there is a successful build record of it and |
608 | 212 | # We don't just retain the old state of "PUBLISHED" because maybe we started the script with that situation already | 239 | # the source is superseded, it means that it built fine at |
609 | 213 | elif build.buildstate not in build_state_failed and self.source.status == "Superseded": | 240 | # some point, Another arch will fail as superseeded. |
610 | 214 | status[build.arch_tag] = self.PUBLISHED | 241 | # We don't just retain the old state of "PUBLISHED" because |
611 | 242 | # maybe we started the script with that situation already | ||
612 | 243 | elif (build.buildstate not in build_state_failed | ||
613 | 244 | and self.source.status == "Superseded"): | ||
614 | 245 | status[build.arch_tag] = self.PUBLISHED | ||
615 | 215 | 246 | ||
619 | 216 | # There is no way to know if there are some arch:all packages (and there are not in publishedBinaries for this arch until | 247 | # There is no way to know if there are some arch:all packages (and |
620 | 217 | # it's built on arch_all_arch). So mark all arch to BUILDING if self.arch_all_arch is building or FAILED if it failed. | 248 | # there are not in publishedBinaries for this arch until |
621 | 218 | if self.arch_all_arch in status and status[self.arch_all_arch] != self.PUBLISHED: | 249 | # it's built on arch_all_arch). So mark all arch to BUILDING if |
622 | 250 | # self.arch_all_arch is building or FAILED if it failed. | ||
623 | 251 | if (self.arch_all_arch in status and | ||
624 | 252 | status[self.arch_all_arch] != self.PUBLISHED): |
625 | 219 | for arch in self.archs: | 253 | for arch in self.archs: |
626 | 220 | if status[arch] == self.PUBLISHED: | 254 | if status[arch] == self.PUBLISHED: |
627 | 221 | status[arch] = status[self.arch_all_arch] | 255 | status[arch] = status[self.arch_all_arch] |
630 | 222 | if arch != self.arch_all_arch and status[arch] == self.FAILED: | 256 | failed = status[arch] == self.FAILED |
631 | 223 | logging.error("{} marked as FAILED because {} build FAILED and we may miss arch:all packages".format(arch, self.arch_all_arch)) | 257 | if arch != self.arch_all_arch and failed: |
632 | 258 | msg = ("{} marked as FAILED because {} build FAILED " | ||
633 | 259 | "and we may miss arch:all packages") | ||
634 | 260 | logging.error(msg.format(arch, self.arch_all_arch)) | ||
635 | 224 | 261 | ||
636 | 225 | return status | 262 | return status |
637 | 226 | 263 | ||
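The `packageinppa.py` hunks above track a `{arch: status}` dict per source package, with the three states declared at the top of `PackageInPPA`. A simplified model of how such a dict collapses into one package-level answer (this aggregation function is an illustration, not the module's exact logic, which also handles arch:all propagation):

```python
# Status codes as declared at the top of PackageInPPA
(BUILDING, FAILED, PUBLISHED) = range(3)

def overall_status(per_arch_status):
    """Collapse a {arch: status} dict into one package-level status.

    Any failed arch fails the package; any still-building arch keeps it
    building; otherwise everything is published.
    """
    states = set(per_arch_status.values())
    if FAILED in states:
        return FAILED
    if BUILDING in states:
        return BUILDING
    return PUBLISHED
```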
638 | === modified file 'branch-source-builder/cupstream2distro/packageinppamanager.py' | |||
639 | --- branch-source-builder/cupstream2distro/packageinppamanager.py 2014-02-19 16:34:46 +0000 | |||
640 | +++ branch-source-builder/cupstream2distro/packageinppamanager.py 2014-02-23 16:37:01 +0000 | |||
641 | @@ -41,7 +41,8 @@ | |||
642 | 41 | # we do not rely on the .changes files but in the config file | 41 | # we do not rely on the .changes files but in the config file |
643 | 42 | # because we need the exact version (which can have an epoch) | 42 | # because we need the exact version (which can have an epoch) |
644 | 43 | result = set() | 43 | result = set() |
646 | 44 | source_package_regexp = re.compile("(.*).{}$".format(PROJECT_CONFIG_SUFFIX)) | 44 | tmpl = "(.*).{}$" |
647 | 45 | source_package_regexp = re.compile(tmpl.format(PROJECT_CONFIG_SUFFIX)) | ||
648 | 45 | for file in os.listdir('.'): | 46 | for file in os.listdir('.'): |
649 | 46 | substract = source_package_regexp.findall(file) | 47 | substract = source_package_regexp.findall(file) |
650 | 47 | if substract: | 48 | if substract: |
651 | @@ -59,40 +60,51 @@ | |||
652 | 59 | 60 | ||
653 | 60 | 61 | ||
654 | 61 | def get_packages_and_versions_uploaded(): | 62 | def get_packages_and_versions_uploaded(): |
656 | 62 | '''Get (package, version) of all packages uploaded. We can have duplicates''' | 63 | '''Get (package, version) of all packages uploaded. We can have |
657 | 64 | duplicates''' | ||
658 | 63 | 65 | ||
659 | 64 | # we do not rely on the .changes files but in the config file | 66 | # we do not rely on the .changes files but in the config file |
660 | 65 | # because we need the exact version (which can have an epoch) | 67 | # because we need the exact version (which can have an epoch) |
661 | 66 | result = set() | 68 | result = set() |
663 | 67 | source_package_regexp = re.compile("(.*).{}.*$".format(PROJECT_CONFIG_SUFFIX)) | 69 | tmpl = "(.*).{}.*$" |
664 | 70 | source_package_regexp = re.compile(tmpl.format(PROJECT_CONFIG_SUFFIX)) | ||
665 | 68 | for file in os.listdir('.'): | 71 | for file in os.listdir('.'): |
666 | 69 | substract = source_package_regexp.findall(file) | 72 | substract = source_package_regexp.findall(file) |
667 | 70 | if substract: | 73 | if substract: |
668 | 71 | config = ConfigParser.RawConfigParser() | 74 | config = ConfigParser.RawConfigParser() |
669 | 72 | config.read(file) | 75 | config.read(file) |
671 | 73 | result.add((substract[0], config.get('Package', 'packaging_version'))) | 76 | pkg = config.get('Package', 'packaging_version') |
672 | 77 | result.add((substract[0], pkg)) | ||
673 | 74 | return result | 78 | return result |
674 | 75 | 79 | ||
675 | 76 | 80 | ||
677 | 77 | def update_all_packages_status(packages_not_in_ppa, packages_building, packages_failed, particular_arch=None): | 81 | def update_all_packages_status(packages_not_in_ppa, packages_building, |
678 | 82 | packages_failed, particular_arch=None): | ||
679 | 78 | '''Update all packages status, checking in the ppa''' | 83 | '''Update all packages status, checking in the ppa''' |
680 | 79 | 84 | ||
681 | 80 | for current_package in (packages_not_in_ppa.union(packages_building)): | 85 | for current_package in (packages_not_in_ppa.union(packages_building)): |
683 | 81 | logging.info("current_package: " + current_package.source_name + " " + current_package.version) | 86 | logging.info("current_package: " + current_package.source_name + " " + |
684 | 87 | current_package.version) | ||
685 | 82 | package_status = current_package.get_status(particular_arch) | 88 | package_status = current_package.get_status(particular_arch) |
687 | 83 | if package_status != None: # global package_status can be 0 (building), 1 (failed), 2 (published) | 89 | # global package_status can be 0 (building), 1 (failed), 2 (published) |
688 | 90 | if package_status is None: | ||
689 | 84 | # if one arch building, still considered as building | 91 | # if one arch building, still considered as building |
690 | 85 | if package_status == PackageInPPA.BUILDING: | 92 | if package_status == PackageInPPA.BUILDING: |
692 | 86 | _ensure_removed_from_set(packages_not_in_ppa, current_package) # maybe already removed | 93 | # maybe already removed |
693 | 94 | _ensure_removed_from_set(packages_not_in_ppa, current_package) | ||
694 | 87 | packages_building.add(current_package) | 95 | packages_building.add(current_package) |
695 | 88 | # if one arch failed, considered as failed | 96 | # if one arch failed, considered as failed |
696 | 89 | elif package_status == PackageInPPA.FAILED: | 97 | elif package_status == PackageInPPA.FAILED: |
699 | 90 | _ensure_removed_from_set(packages_building, current_package) # in case we missed the "build" step | 98 | # in case we missed the "build" step |
700 | 91 | _ensure_removed_from_set(packages_not_in_ppa, current_package) # in case we missed the "wait" step | 99 | _ensure_removed_from_set(packages_building, current_package) |
701 | 100 | # in case we missed the "wait" step | ||
702 | 101 | _ensure_removed_from_set(packages_not_in_ppa, current_package) | ||
703 | 92 | packages_failed.add(current_package) | 102 | packages_failed.add(current_package) |
704 | 93 | elif package_status == PackageInPPA.PUBLISHED: | 103 | elif package_status == PackageInPPA.PUBLISHED: |
707 | 94 | _ensure_removed_from_set(packages_building, current_package) # in case we missed the "build" step | 104 | # in case we missed the "build" step |
708 | 95 | _ensure_removed_from_set(packages_not_in_ppa, current_package) # in case we missed the "wait" step | 105 | _ensure_removed_from_set(packages_building, current_package) |
709 | 106 | # in case we missed the "wait" step | ||
710 | 107 | _ensure_removed_from_set(packages_not_in_ppa, current_package) | ||
711 | 96 | 108 | ||
712 | 97 | 109 | ||
713 | 98 | def _get_current_packaging_version_from_config(source_package_name): | 110 | def _get_current_packaging_version_from_config(source_package_name): |
714 | 99 | 111 | ||
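`update_all_packages_status` above moves each package between the `not_in_ppa`, `building`, and `failed` sets, calling `_ensure_removed_from_set` because a package may already have left a set ("in case we missed the 'build' step"). The helper's body isn't shown in this hunk; a `set.discard`-based version is one plausible shape (an assumption, not the module's code):

```python
def _ensure_removed_from_set(target_set, element):
    # set.discard is a no-op when the element is absent,
    # so "maybe already removed" needs no try/except
    target_set.discard(element)

# A package transitioning from "building" to "failed":
packages_building = {"libfoo"}
packages_failed = set()
_ensure_removed_from_set(packages_building, "libfoo")
packages_failed.add("libfoo")
```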
715 | === modified file 'branch-source-builder/cupstream2distro/packagemanager.py' | |||
716 | --- branch-source-builder/cupstream2distro/packagemanager.py 2014-02-19 16:34:46 +0000 | |||
717 | +++ branch-source-builder/cupstream2distro/packagemanager.py 2014-02-23 16:37:01 +0000 | |||
718 | @@ -31,9 +31,11 @@ | |||
719 | 31 | import launchpadmanager | 31 | import launchpadmanager |
720 | 32 | import settings | 32 | import settings |
721 | 33 | from .utils import ignored | 33 | from .utils import ignored |
725 | 34 | 34 | from subprocess import Popen, PIPE | |
726 | 35 | 35 | ||
727 | 36 | def get_current_version_for_series(source_package_name, series_name, ppa_name=None, dest=None): | 36 | |
728 | 37 | def get_current_version_for_series(source_package_name, series_name, | ||
729 | 38 | ppa_name=None, dest=None): | ||
730 | 37 | '''Get current version for a package name in that series''' | 39 | '''Get current version for a package name in that series''' |
731 | 38 | series = launchpadmanager.get_series(series_name) | 40 | series = launchpadmanager.get_series(series_name) |
732 | 39 | if not dest: | 41 | if not dest: |
733 | @@ -41,29 +43,37 @@ | |||
734 | 41 | dest = launchpadmanager.get_ppa(ppa_name) | 43 | dest = launchpadmanager.get_ppa(ppa_name) |
735 | 42 | else: | 44 | else: |
736 | 43 | dest = launchpadmanager.get_ubuntu_archive() | 45 | dest = launchpadmanager.get_ubuntu_archive() |
738 | 44 | source_collection = dest.getPublishedSources(exact_match=True, source_name=source_package_name, distro_series=series) | 46 | kw = {'exact_match': True, 'source_name': source_package_name, |
739 | 47 | 'distro_series': series} | ||
740 | 48 | source_collection = dest.getPublishedSources(**kw) | ||
741 | 45 | try: | 49 | try: |
743 | 46 | # cjwatson told that list always have the more recently published first (even if removed) | 50 | # cjwatson told that list always have the more recently published first |
744 | 51 | # (even if removed) | ||
745 | 47 | return source_collection[0].source_package_version | 52 | return source_collection[0].source_package_version |
746 | 48 | # was never in the dest, set the lowest possible version | 53 | # was never in the dest, set the lowest possible version |
747 | 49 | except IndexError: | 54 | except IndexError: |
748 | 50 | return "0" | 55 | return "0" |
749 | 51 | 56 | ||
750 | 52 | 57 | ||
752 | 53 | def is_version_for_series_in_dest(source_package_name, version, series, dest, pocket="Release"): | 58 | def is_version_for_series_in_dest(source_package_name, version, series, dest, |
753 | 59 | pocket="Release"): | ||
754 | 54 | '''Return if version for a package name in that series is in dest''' | 60 | '''Return if version for a package name in that series is in dest''' |
757 | 55 | return dest.getPublishedSources(exact_match=True, source_name=source_package_name, version=version, | 61 | kw = {'exact_match': True, 'source_name': source_package_name, |
758 | 56 | distro_series=series, pocket=pocket).total_size > 0 | 62 | 'version': version, 'distro_series': series, 'pocket': pocket} |
759 | 63 | return dest.getPublishedSources(**kw).total_size > 0 | ||
760 | 64 | |||
761 | 57 | 65 | ||
762 | 58 | def is_version_in_queue(source_package_name, version, dest_serie, queue): | 66 | def is_version_in_queue(source_package_name, version, dest_serie, queue): |
763 | 59 | '''Return if version for a package name in that series is in dest''' | 67 | '''Return if version for a package name in that series is in dest''' |
766 | 60 | return dest_serie.getPackageUploads(exact_match=True, name=source_package_name, version=version, | 68 | kw = {'exact_match': True, 'name': source_package_name, 'version': version, |
767 | 61 | status=queue).total_size > 0 | 69 | 'status': queue} |
768 | 70 | return dest_serie.getPackageUploads(**kw).total_size > 0 | ||
769 | 62 | 71 | ||
770 | 63 | 72 | ||
771 | 64 | def is_version1_higher_than_version2(version1, version2): | 73 | def is_version1_higher_than_version2(version1, version2): |
772 | 65 | '''return if version1 is higher than version2''' | 74 | '''return if version1 is higher than version2''' |
774 | 66 | return (subprocess.call(["dpkg", "--compare-versions", version1, 'gt', version2], stdout=subprocess.PIPE, stderr=subprocess.PIPE) == 0) | 75 | cmd = ["dpkg", "--compare-versions", version1, 'gt', version2] |
775 | 76 | return (subprocess.call(cmd, stdout=PIPE, stderr=PIPE) == 0) | ||
776 | 67 | 77 | ||
777 | 68 | 78 | ||
778 | 69 | def is_version_in_changelog(version, f): | 79 | def is_version_in_changelog(version, f): |
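The `is_version1_higher_than_version2` rewrap above delegates Debian version ordering to `dpkg`, which exits 0 when the comparison holds. The wrapped call, restated as a self-contained function (requires `dpkg` on the system, so this only runs on Debian-family hosts):

```python
import subprocess

def is_version1_higher_than_version2(version1, version2):
    """True if version1 > version2 by Debian version ordering."""
    # dpkg exits 0 when the comparison holds, non-zero otherwise
    cmd = ["dpkg", "--compare-versions", version1, "gt", version2]
    return subprocess.call(cmd, stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE) == 0
```

Shelling out keeps epoch and `~`-suffix handling (`1:1.0`, `1.0~rc1`) consistent with the archive's own rules rather than reimplementing them.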
779 | @@ -72,7 +82,8 @@ | |||
780 | 72 | if version == "0": | 82 | if version == "0": |
781 | 73 | return True | 83 | return True |
782 | 74 | 84 | ||
784 | 75 | desired_changelog_line = re.compile("\({}\) (?!UNRELEASED).*\; urgency=".format(version.replace('+', '\+'))) | 85 | t = "\({}\) (?!UNRELEASED).*\; urgency=" |
785 | 86 | desired_changelog_line = re.compile(t.format(version.replace('+', '\+'))) | ||
786 | 76 | for line in f.readlines(): | 87 | for line in f.readlines(): |
787 | 77 | if desired_changelog_line.search(line): | 88 | if desired_changelog_line.search(line): |
788 | 78 | return True | 89 | return True |
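The hunk above splits the changelog regex of `is_version_in_changelog`, which matches a released entry while a negative lookahead rejects `UNRELEASED` ones (version `"0"` is special-cased to True earlier in the function). A standalone sketch of the same check; it substitutes `re.escape` for the module's manual `'+'` escaping, which is a small deliberate variation:

```python
import re

def is_version_in_changelog(version, lines):
    """Match a released (non-UNRELEASED) changelog entry for version."""
    pattern = re.compile(r"\({}\) (?!UNRELEASED).*; urgency=".format(
        re.escape(version)))
    return any(pattern.search(line) for line in lines)
```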
789 | @@ -83,9 +94,12 @@ | |||
790 | 83 | def get_latest_upstream_bzr_rev(f, dest_ppa=None): | 94 | def get_latest_upstream_bzr_rev(f, dest_ppa=None): |
791 | 84 | '''Report latest bzr rev in the file | 95 | '''Report latest bzr rev in the file |
792 | 85 | 96 | ||
794 | 86 | If dest_ppa, first try to fetch the dest ppa tag. Otherwise, fallback to first distro version''' | 97 | If dest_ppa, first try to fetch the dest ppa tag. Otherwise, fallback to |
795 | 98 | first distro version''' | ||
796 | 87 | distro_regex = re.compile("{} (\d+)".format(settings.REV_STRING_FORMAT)) | 99 | distro_regex = re.compile("{} (\d+)".format(settings.REV_STRING_FORMAT)) |
798 | 88 | destppa_regexp = re.compile("{} (\d+) \(ppa:{}\)".format(settings.REV_STRING_FORMAT, dest_ppa)) | 100 | tmpl = "{} (\d+) \(ppa:{}\)" |
799 | 101 | destppa_regexp = re.compile( | ||
800 | 102 | tmpl.format(settings.REV_STRING_FORMAT, dest_ppa)) | ||
801 | 89 | distro_rev = None | 103 | distro_rev = None |
802 | 90 | candidate_destppa_rev = None | 104 | candidate_destppa_rev = None |
803 | 91 | candidate_distro_rev = None | 105 | candidate_distro_rev = None |
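The two regexes wrapped above distinguish a plain distro snapshot marker from one tagged with a destination PPA. Their behavior can be shown with a stand-in marker string (the real `REV_STRING_FORMAT` lives in `cupstream2distro.settings` and is not quoted in this diff, so the value below is an illustrative assumption):

```python
import re

# Illustrative stand-in for settings.REV_STRING_FORMAT
REV_STRING_FORMAT = "Automatic snapshot from revision"

distro_regex = re.compile("{} (\\d+)".format(REV_STRING_FORMAT))
destppa_regexp = re.compile(
    "{} (\\d+) \\(ppa:{}\\)".format(REV_STRING_FORMAT, "owner/name"))

# The distro form matches a bare revision; the ppa form only matches
# when the "(ppa:...)" suffix names the destination PPA
distro_line = "  * Automatic snapshot from revision 1042"
ppa_line = "  * Automatic snapshot from revision 7 (ppa:owner/name)"
```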
804 | @@ -111,15 +125,19 @@ | |||
805 | 111 | destppa_element_found = True | 125 | destppa_element_found = True |
806 | 112 | except IndexError: | 126 | except IndexError: |
807 | 113 | destppa_element_found = False | 127 | destppa_element_found = False |
809 | 114 | if not distro_rev and not destppa_element_found and not "(ppa:" in line: | 128 | ppa_line = "(ppa:" in line |
810 | 129 | if not distro_rev and not destppa_element_found and not ppa_line: | ||
811 | 115 | try: | 130 | try: |
812 | 116 | candidate_distro_rev = int(distro_regex.findall(line)[0]) | 131 | candidate_distro_rev = int(distro_regex.findall(line)[0]) |
813 | 117 | distro_element_found = True | 132 | distro_element_found = True |
814 | 118 | except IndexError: | 133 | except IndexError: |
815 | 119 | distro_element_found = False | 134 | distro_element_found = False |
816 | 120 | 135 | ||
819 | 121 | # try to catchup next line if we have a marker start without anything found | 136 | # try to catchup next line if we have a marker start without anything |
820 | 122 | if settings.REV_STRING_FORMAT in line and (dest_ppa and not destppa_element_found) and not distro_element_found: | 137 | # found |
821 | 138 | rev_str_found = settings.REV_STRING_FORMAT in line | ||
822 | 139 | ppa_found = dest_ppa and not destppa_element_found | ||
823 | 140 | if rev_str_found and ppa_found and not distro_element_found: | ||
824 | 123 | previous_line = line | 141 | previous_line = line |
825 | 124 | 142 | ||
826 | 125 | if line.startswith(" -- "): | 143 | if line.startswith(" -- "): |
827 | @@ -150,7 +168,8 @@ | |||
828 | 150 | 168 | ||
829 | 151 | def get_packaging_version(): | 169 | def get_packaging_version(): |
830 | 152 | '''Get current packaging rev''' | 170 | '''Get current packaging rev''' |
832 | 153 | instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 171 | instance = Popen(["dpkg-parsechangelog"], |
833 | 172 | stdout=PIPE, stderr=PIPE) | ||
834 | 154 | (stdout, stderr) = instance.communicate() | 173 | (stdout, stderr) = instance.communicate() |
835 | 155 | if instance.returncode != 0: | 174 | if instance.returncode != 0: |
836 | 156 | raise Exception(stderr.decode("utf-8").strip()) | 175 | raise Exception(stderr.decode("utf-8").strip()) |
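`get_packaging_version` above runs `dpkg-parsechangelog` and then greps its output for the `Version:` field (the next hunk shows the failure path reusing the captured stdout). The field extraction, sketched without the subprocess; the exact regex used by the module isn't visible in this chunk, so a MULTILINE search is an assumed shape:

```python
import re

def extract_packaging_version(changelog_output):
    """Pull the Version field out of dpkg-parsechangelog output."""
    found = re.findall(r"^Version: (.*)$", changelog_output, re.MULTILINE)
    if found:
        return found[0]
    tmpl = "Didn't find any Version in the package: {}"
    raise Exception(tmpl.format(changelog_output))
```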
837 | @@ -160,19 +179,24 @@ | |||
838 | 160 | if packaging_version: | 179 | if packaging_version: |
839 | 161 | return packaging_version[0] | 180 | return packaging_version[0] |
840 | 162 | 181 | ||
846 | 163 | raise Exception("Didn't find any Version in the package: {}".format(stdout)) | 182 | tmpl = "Didn't find any Version in the package: {}" |
847 | 164 | 183 | raise Exception(tmpl.format(stdout)) | |
848 | 165 | 184 | ||
849 | 166 | def get_source_package_from_dest(source_package_name, dest_archive, dest_current_version, series_name): | 185 | |
850 | 167 | '''Download and return a path containing a checkout of the current dest version. | 186 | def get_source_package_from_dest(source_package_name, dest_archive, |
851 | 187 | dest_current_version, series_name): | ||
852 | 188 | '''Download and return a path containing a checkout of the current dest | ||
853 | 189 | version. | ||
854 | 168 | 190 | ||
855 | 169 | None if this package was never published to dest archive''' | 191 | None if this package was never published to dest archive''' |
856 | 170 | 192 | ||
857 | 171 | if dest_current_version == "0": | 193 | if dest_current_version == "0": |
859 | 172 | logging.info("This package was never released to the destination archive, don't return downloaded source") | 194 | logging.info("This package was never released to the destination " |
860 | 195 | "archive, don't return downloaded source") | ||
861 | 173 | return None | 196 | return None |
862 | 174 | 197 | ||
864 | 175 | logging.info("Grab code for {} ({}) from {}".format(source_package_name, dest_current_version, series_name)) | 198 | args = (source_package_name, dest_current_version, series_name) |
865 | 199 | logging.info("Grab code for {} ({}) from {}".format(*args)) | ||
866 | 176 | source_package_download_dir = os.path.join('ubuntu', source_package_name) | 200 | source_package_download_dir = os.path.join('ubuntu', source_package_name) |
867 | 177 | series = launchpadmanager.get_series(series_name) | 201 | series = launchpadmanager.get_series(series_name) |
868 | 178 | with ignored(OSError): | 202 | with ignored(OSError): |
869 | @@ -180,27 +204,39 @@ | |||
870 | 180 | os.chdir(source_package_download_dir) | 204 | os.chdir(source_package_download_dir) |
871 | 181 | 205 | ||
872 | 182 | try: | 206 | try: |
874 | 183 | sourcepkg = dest_archive.getPublishedSources(status="Published", exact_match=True, source_name=source_package_name, distro_series=series, version=dest_current_version)[0] | 207 | kw = {'status': "Published", 'exact_match': True, |
875 | 208 | 'source_name': source_package_name, 'distro_series': series, | ||
876 | 209 | 'version': dest_current_version} | ||
877 | 210 | sourcepkg = dest_archive.getPublishedSources(**kw)[0] | ||
878 | 184 | except IndexError: | 211 | except IndexError: |
879 | 185 | raise Exception("Couldn't get in the destination the expected version") | 212 | raise Exception("Couldn't get in the destination the expected version") |
881 | 186 | logging.info('Downloading %s version %s', source_package_name, dest_current_version) | 213 | tmpl = 'Downloading %s version %s' |
882 | 214 | logging.info(tmpl, source_package_name, dest_current_version) | ||
883 | 187 | for url in sourcepkg.sourceFileUrls(): | 215 | for url in sourcepkg.sourceFileUrls(): |
884 | 188 | urllib.urlretrieve(url, urllib.unquote(url.split('/')[-1])) | 216 | urllib.urlretrieve(url, urllib.unquote(url.split('/')[-1])) |
886 | 189 | instance = subprocess.Popen("dpkg-source -x *dsc", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 217 | cmd = "dpkg-source -x *dsc" |
887 | 218 | instance = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE) | ||
888 | 190 | (stdout, stderr) = instance.communicate() | 219 | (stdout, stderr) = instance.communicate() |
889 | 191 | if instance.returncode != 0: | 220 | if instance.returncode != 0: |
890 | 192 | raise Exception(stderr.decode("utf-8").strip()) | 221 | raise Exception(stderr.decode("utf-8").strip()) |
891 | 193 | 222 | ||
892 | 194 | # check the dir exists | 223 | # check the dir exists |
894 | 195 | splitted_version = dest_current_version.split(':')[-1].split('-') # remove epoch if there is one | 224 | # remove epoch if there is one |
895 | 225 | splitted_version = dest_current_version.split(':')[-1].split('-') | ||
896 | 196 | # TODO: debian version (like -3) is not handled here. | 226 | # TODO: debian version (like -3) is not handled here. |
899 | 197 | # We do handle 42ubuntu1 though (as splitted_version[0] can contain "ubuntu") | 227 | # We do handle 42ubuntu1 though (as splitted_version[0] can contain |
900 | 198 | if "ubuntu" in splitted_version[-1] and len(splitted_version) > 1: # don't remove last item for the case where we had a native version (-0.35.2) without ubuntu in it | 228 | # "ubuntu") |
901 | 229 | if "ubuntu" in splitted_version[-1] and len(splitted_version) > 1: | ||
902 | 230 | # don't remove last item for the case where we had a native version | ||
903 | 231 | # (-0.35.2) without ubuntu in it | ||
904 | 199 | splitted_version = splitted_version[:-1] | 232 | splitted_version = splitted_version[:-1] |
905 | 200 | version_for_source_file = '-'.join(splitted_version) | 233 | version_for_source_file = '-'.join(splitted_version) |
907 | 201 | source_directory_name = "{}-{}".format(source_package_name, version_for_source_file) | 234 | args = (source_package_name, version_for_source_file) |
908 | 235 | source_directory_name = "{}-{}".format(*args) | ||
909 | 202 | if not os.path.isdir(source_directory_name): | 236 | if not os.path.isdir(source_directory_name): |
911 | 203 | raise Exception("We tried to download and check that the directory {} is present, but it's not the case".format(source_directory_name)) | 237 | tmpl = ("We tried to download and check that the directory {} is " |
912 | 238 | "present, but it's not the case") | ||
913 | 239 | raise Exception(tmpl.format(source_directory_name)) | ||
914 | 204 | os.chdir('../..') | 240 | os.chdir('../..') |
915 | 205 | return (os.path.join(source_package_download_dir, source_directory_name)) | 241 | return (os.path.join(source_package_download_dir, source_directory_name)) |
916 | 206 | 242 | ||
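`get_source_package_from_dest` above wraps `os.makedirs` in `with ignored(OSError)` so an already-existing download directory is not an error. A minimal sketch of how such a helper can be written with `contextlib` — the name matches the one imported in the diff, but the body here is a guess at its implementation (it mirrors `contextlib.suppress`, available since Python 3.4):

```python
import contextlib
import os
import tempfile


@contextlib.contextmanager
def ignored(*exceptions):
    """Run the body, silently swallowing the listed exception types."""
    try:
        yield
    except exceptions:
        pass


target = os.path.join(tempfile.mkdtemp(), "ubuntu", "some-package")
with ignored(OSError):
    os.makedirs(target)
with ignored(OSError):
    os.makedirs(target)  # second call raises EEXIST, which is swallowed
assert os.path.isdir(target)
```

On Python 3.4+ the stdlib `contextlib.suppress(OSError)` does the same job without a hand-rolled helper.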
917 | @@ -210,24 +246,40 @@ | |||
918 | 210 | 246 | ||
919 | 211 | dest_version_source can be None if no released version was done before.''' | 247 | dest_version_source can be None if no released version was done before.''' |
920 | 212 | 248 | ||
922 | 213 | # we always released something not yet in ubuntu, even if the criteria are not met. | 249 | # we always released something not yet in ubuntu, even if the criteria are |
923 | 250 | # not met. | ||
924 | 214 | if not dest_version_source: | 251 | if not dest_version_source: |
925 | 215 | return True | 252 | return True |
926 | 216 | 253 | ||
934 | 217 | # now check the relevance of the committed changes compared to the version in the repository (if any) | 254 | # now check the relevance of the committed changes compared to the version |
935 | 218 | diffinstance = subprocess.Popen(['diff', '-Nrup', '.', dest_version_source], stdout=subprocess.PIPE) | 255 | # in the repository (if any) |
936 | 219 | filterinstance = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE) | 256 | cmd = ['diff', '-Nrup', '.', dest_version_source] |
937 | 220 | lsdiffinstance = subprocess.Popen(['lsdiff'], stdin=filterinstance.stdout, stdout=subprocess.PIPE) | 257 | diffinstance = Popen(cmd, stdout=PIPE) |
938 | 221 | (relevant_changes, err) = subprocess.Popen(['grep', '-Ev', '.bzr|.pc'], stdin=lsdiffinstance.stdout, stdout=subprocess.PIPE).communicate() | 258 | |
939 | 222 | 259 | cmd = ['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', | |
940 | 223 | # detect if the only change is a Vcs* target changes (with or without changelog edit). We won't release in that case | 260 | '*local-options'] |
941 | 261 | filterinstance = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE) | ||
942 | 262 | cmd = ['lsdiff'] | ||
943 | 263 | lsdiffinstance = Popen(cmd, stdin=filterinstance.stdout, stdout=PIPE) | ||
944 | 264 | cmd = ['grep', '-Ev', '.bzr|.pc'] | ||
945 | 265 | p = Popen(cmd, stdin=lsdiffinstance.stdout, stdout=PIPE) | ||
946 | 266 | (relevant_changes, err) = p.communicate() | ||
947 | 267 | |||
948 | 268 | # detect if the only change is a Vcs* target changes (with or without | ||
949 | 269 | # changelog edit). We won't release in that case | ||
950 | 224 | number_of_changed_files = relevant_changes.count("\n") | 270 | number_of_changed_files = relevant_changes.count("\n") |
954 | 225 | if ((number_of_changed_files == 1 and "debian/control" in relevant_changes) or | 271 | |
955 | 226 | (number_of_changed_files == 2 and "debian/control" in relevant_changes and "debian/changelog" in relevant_changes)): | 272 | control_changed = "debian/control" in relevant_changes |
956 | 227 | (results, err) = subprocess.Popen(['diff', os.path.join('debian', 'control'), os.path.join(dest_version_source, "debian", "control")], stdout=subprocess.PIPE).communicate() | 273 | if control_changed and number_of_changed_files in (1, 2): |
957 | 274 | cmd = ['diff', os.path.join('debian', 'control'), | ||
958 | 275 | os.path.join(dest_version_source, "debian", "control")] | ||
959 | 276 | |||
960 | 277 | (results, err) = Popen(cmd, stdout=PIPE).communicate() | ||
961 | 228 | for diff_line in results.split('\n'): | 278 | for diff_line in results.split('\n'): |
962 | 229 | if diff_line.startswith("< ") or diff_line.startswith("> "): | 279 | if diff_line.startswith("< ") or diff_line.startswith("> "): |
964 | 230 | if not diff_line[2:].startswith("Vcs-") and not diff_line[2:].startswith("#"): | 280 | vcs = diff_line[2:].startswith("Vcs-") |
965 | 281 | comment = diff_line[2:].startswith("#") | ||
966 | 282 | if not vcs and not comment: | ||
967 | 231 | return True | 283 | return True |
968 | 232 | return False | 284 | return False |
969 | 233 | 285 | ||
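The refactor above unrolls shell pipelines like `diff | filterdiff | lsdiff | grep` into chained `Popen` objects, each reading the previous stage's `stdout`. A minimal sketch of that chaining technique, using `sys.executable` stand-ins for the real tools so it runs anywhere:

```python
import sys
from subprocess import PIPE, Popen

# Stand-ins for the producer and filter stages of the pipeline.
producer = 'print("b"); print("a"); print("b")'
filter_b = ('import sys; '
            'sys.stdout.write("".join(l for l in sys.stdin if "b" in l))')

p1 = Popen([sys.executable, "-c", producer], stdout=PIPE)
p2 = Popen([sys.executable, "-c", filter_b], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()  # so p1 receives SIGPIPE if p2 exits early
out, _ = p2.communicate()
assert out == b"b\nb\n"
```

Closing the upstream `stdout` in the parent (as above) is the documented idiom for multi-stage `Popen` pipelines; the diff's code relies on the same structure, just with `diff`, `filterdiff`, `lsdiff`, and `grep` as the stages.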
970 | @@ -237,24 +289,34 @@ | |||
971 | 237 | return (relevant_changes != '') | 289 | return (relevant_changes != '') |
972 | 238 | 290 | ||
973 | 239 | 291 | ||
976 | 240 | def is_relevant_source_diff_from_previous_dest_version(newdsc_path, dest_version_source): | 292 | def is_relevant_source_diff_from_previous_dest_version(newdsc_path, |
977 | 241 | '''Extract and check if the generated source diff is different from previous one''' | 293 | dest_version_source): |
978 | 294 | '''Extract and check if the generated source diff is different from |
979 | 295 | previous one''' | ||
980 | 242 | 296 | ||
981 | 243 | with ignored(OSError): | 297 | with ignored(OSError): |
982 | 244 | os.makedirs("generated") | 298 | os.makedirs("generated") |
984 | 245 | extracted_generated_source = os.path.join("generated", newdsc_path.split('_')[0]) | 299 | prefix = newdsc_path.split('_')[0] |
985 | 300 | extracted_generated_source = os.path.join("generated", prefix) | ||
986 | 246 | with ignored(OSError): | 301 | with ignored(OSError): |
987 | 247 | shutil.rmtree(extracted_generated_source) | 302 | shutil.rmtree(extracted_generated_source) |
988 | 248 | 303 | ||
989 | 249 | # remove epoch if there is one | 304 | # remove epoch if there is one |
991 | 250 | if subprocess.call(["dpkg-source", "-x", newdsc_path, extracted_generated_source]) != 0: | 305 | cmd = ["dpkg-source", "-x", newdsc_path, extracted_generated_source] |
992 | 306 | if subprocess.call(cmd) != 0: | ||
993 | 251 | raise Exception("dpkg-source command returned an error.") | 307 | raise Exception("dpkg-source command returned an error.") |
994 | 252 | 308 | ||
998 | 253 | # now check the relevance of the committed changes compared to the version in the repository (if any) | 309 | # now check the relevance of the committed changes compared to the version |
999 | 254 | diffinstance = subprocess.Popen(['diff', '-Nrup', extracted_generated_source, dest_version_source], stdout=subprocess.PIPE) | 310 | # in the repository (if any) |
1000 | 255 | (diff, err) = subprocess.Popen(['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', '*local-options'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate() | 311 | cmd = ['diff', '-Nrup', extracted_generated_source, dest_version_source] |
1001 | 312 | diffinstance = Popen(cmd, stdout=PIPE) | ||
1002 | 313 | cmd = ['filterdiff', '--clean', '-x', '*po', '-x', '*pot', '-x', | ||
1003 | 314 | '*local-options'] | ||
1004 | 315 | p = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE) | ||
1005 | 316 | (diff, err) = p.communicate() | ||
1006 | 256 | 317 | ||
1008 | 257 | # there is no important diff if the diff only contains 12 lines, corresponding to "Automatic daily release" marker in debian/changelog | 318 | # there is no important diff if the diff only contains 12 lines, |
1009 | 319 | # corresponding to "Automatic daily release" marker in debian/changelog | ||
1010 | 258 | if (diff.count('\n') <= 12): | 320 | if (diff.count('\n') <= 12): |
1011 | 259 | return False | 321 | return False |
1012 | 260 | return True | 322 | return True |
1013 | @@ -267,47 +329,65 @@ | |||
1014 | 267 | if not oldsource_dsc: | 329 | if not oldsource_dsc: |
1015 | 268 | return True | 330 | return True |
1016 | 269 | if not os.path.isfile(oldsource_dsc) or not os.path.isfile(newsource_dsc): | 331 | if not os.path.isfile(oldsource_dsc) or not os.path.isfile(newsource_dsc): |
1020 | 270 | raise Exception("{} or {} doesn't exist, can't create a diff".format(oldsource_dsc, newsource_dsc)) | 332 | tmpl = "{} or {} doesn't exist, can't create a diff" |
1021 | 271 | diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE) | 333 | raise Exception(tmpl.format(oldsource_dsc, newsource_dsc)) |
1022 | 272 | filterinstance = subprocess.Popen(['filterdiff', '--clean', '-i', '*debian/*', '-x', '*changelog'], stdin=diffinstance.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 334 | diffinstance = Popen(['debdiff', oldsource_dsc, newsource_dsc], |
1023 | 335 | stdout=PIPE) | ||
1024 | 336 | cmd = ['filterdiff', '--clean', '-i', '*debian/*', '-x', '*changelog'] | ||
1025 | 337 | filterinstance = Popen(cmd, stdin=diffinstance.stdout, stdout=PIPE, | ||
1026 | 338 | stderr=PIPE) | ||
1027 | 273 | (change_in_debian, filter_err) = filterinstance.communicate() | 339 | (change_in_debian, filter_err) = filterinstance.communicate() |
1029 | 274 | # we can't rely on diffinstance returncode as the signature key may not be present and it will exit with 1 | 340 | # we can't rely on diffinstance returncode as the signature key may not |
1030 | 341 | # be present and it will exit with 1 | ||
1031 | 275 | if filterinstance.returncode != 0: | 342 | if filterinstance.returncode != 0: |
1033 | 276 | raise Exception("Error in diff: {}".format(filter_err.decode("utf-8").strip())) | 343 | tmpl = "Error in diff: {}" |
1034 | 344 | raise Exception(tmpl.format(filter_err.decode("utf-8").strip())) | ||
1035 | 277 | return(change_in_debian != "") | 345 | return(change_in_debian != "") |
1036 | 278 | 346 | ||
1037 | 279 | 347 | ||
1038 | 280 | def generate_diff_between_dsc(diff_filepath, oldsource_dsc, newsource_dsc): | 348 | def generate_diff_between_dsc(diff_filepath, oldsource_dsc, newsource_dsc): |
1040 | 281 | '''Generate a diff file in diff_filepath if there is a relevant packaging diff between 2 sources | 349 | '''Generate a diff file in diff_filepath if there is a relevant packaging |
1041 | 350 | diff between 2 sources | ||
1042 | 282 | 351 | ||
1043 | 283 | The diff contains autotools files and cmakeries''' | 352 | The diff contains autotools files and cmakeries''' |
1044 | 284 | if _packaging_changes_between_dsc(oldsource_dsc, newsource_dsc): | 353 | if _packaging_changes_between_dsc(oldsource_dsc, newsource_dsc): |
1045 | 285 | with open(diff_filepath, "w") as f: | 354 | with open(diff_filepath, "w") as f: |
1046 | 286 | if not oldsource_dsc: | 355 | if not oldsource_dsc: |
1048 | 287 | f.writelines("This source is a new package, if the destination is ubuntu, please ensure it has been preNEWed by an archive admin before publishing that stack.") | 356 | f.writelines("This source is a new package, if the destination" |
1049 | 357 | " is ubuntu, please ensure it has been preNEWed" | ||
1050 | 358 | " by an archive admin before publishing that" | ||
1051 | 359 | " stack.") | ||
1052 | 288 | return | 360 | return |
1058 | 289 | f.write("/!\ Remember that this diff only represents packaging changes and build tools diff, not the whole content diff!\n\n") | 361 | f.write("/!\ Remember that this diff only represents packaging " |
1059 | 290 | diffinstance = subprocess.Popen(['debdiff', oldsource_dsc, newsource_dsc], stdout=subprocess.PIPE) | 362 | "changes and build tools diff, not the whole content " |
1060 | 291 | (changes_to_publish, err) = subprocess.Popen(['filterdiff', '--remove-timestamps', '--clean', '-i', '*setup.py', | 363 | "diff!\n\n") |
1061 | 292 | '-i', '*Makefile.am', '-i', '*configure.*', '-i', '*debian/*', | 364 | diffinstance = Popen(['debdiff', oldsource_dsc, newsource_dsc], |
1062 | 293 | '-i', '*CMakeLists.txt'], stdin=diffinstance.stdout, stdout=subprocess.PIPE).communicate() | 365 | stdout=PIPE) |
1063 | 366 | cmd = ['filterdiff', '--remove-timestamps', '--clean', '-i', | ||
1064 | 367 | '*setup.py', '-i', '*Makefile.am', '-i', '*configure.*', | ||
1065 | 368 | '-i', '*debian/*', '-i', '*CMakeLists.txt'] | ||
1066 | 369 | (changes_to_publish, err) = Popen(cmd, stdin=diffinstance.stdout, | ||
1067 | 370 | stdout=PIPE).communicate() | ||
1068 | 294 | f.write(changes_to_publish) | 371 | f.write(changes_to_publish) |
1069 | 295 | 372 | ||
1070 | 296 | 373 | ||
1072 | 297 | def create_new_packaging_version(base_package_version, series_version, destppa=''): | 374 | def create_new_packaging_version(base_package_version, series_version, |
1073 | 375 | destppa=''): | ||
1074 | 298 | '''Deliver a new packaging version, based on simple rules: | 376 | '''Deliver a new packaging version, based on simple rules: |
1075 | 299 | 377 | ||
1078 | 300 | Version would be <upstream_version>.<series>.<yyyymmdd(.minor)>-0ubuntu1 | 378 | Version would be <upstream_version>.<series>.<yyyymmdd(.minor)>-0ubuntu1 if |
1079 | 301 | if we already have something delivered today, it will be .minor, then, .minor+1… | 379 | we already have something delivered today, it will be .minor, then, |
1080 | 380 | .minor+1… | ||
1081 | 302 | 381 | ||
1083 | 303 | We append the destination ppa name if we target a dest ppa and not distro''' | 382 | We append the destination ppa name if we target a dest ppa and not |
1084 | 383 | distro''' | ||
1085 | 304 | # to keep track of whether the package is native or not | 384 | # to keep track of whether the package is native or not |
1086 | 305 | native_pkg = False | 385 | native_pkg = False |
1087 | 306 | 386 | ||
1088 | 307 | today_version = datetime.date.today().strftime('%Y%m%d') | 387 | today_version = datetime.date.today().strftime('%Y%m%d') |
1089 | 308 | destppa = destppa.replace("-", '.').replace("_", ".").replace("/", ".") | 388 | destppa = destppa.replace("-", '.').replace("_", ".").replace("/", ".") |
1092 | 309 | # bootstrapping mode or direct upload or UNRELEASED for bumping to a new series | 389 | # bootstrapping mode or direct upload or UNRELEASED for bumping to a new |
1093 | 310 | # TRANSITION | 390 | # series TRANSITION |
1094 | 311 | if not ("daily" in base_package_version or "+" in base_package_version): | 391 | if not ("daily" in base_package_version or "+" in base_package_version): |
1095 | 312 | # support both 42, 42-0ubuntu1 | 392 | # support both 42, 42-0ubuntu1 |
1096 | 313 | upstream_version = base_package_version.split('-')[0] | 393 | upstream_version = base_package_version.split('-')[0] |
1097 | @@ -322,17 +402,18 @@ | |||
1098 | 322 | upstream_version = previous_day[0] | 402 | upstream_version = previous_day[0] |
1099 | 323 | native_pkg = True | 403 | native_pkg = True |
1100 | 324 | if (previous_day[1] == series_version and | 404 | if (previous_day[1] == series_version and |
1106 | 325 | previous_day[2] == today_version): | 405 | previous_day[2] == today_version): |
1107 | 326 | minor = 1 | 406 | minor = 1 |
1108 | 327 | if previous_day[3]: # second upload of the day | 407 | if previous_day[3]: # second upload of the day |
1109 | 328 | minor = int(previous_day[3]) + 1 | 408 | minor = int(previous_day[3]) + 1 |
1110 | 329 | today_version = "{}.{}".format(today_version, minor) | 409 | today_version = "{}.{}".format(today_version, minor) |
1111 | 330 | except IndexError: | 410 | except IndexError: |
1112 | 331 | raise Exception( | 411 | raise Exception( |
1113 | 332 | "Unable to get previous day from native version: %s" | 412 | "Unable to get previous day from native version: %s" |
1114 | 333 | % base_package_version) | 413 | % base_package_version) |
1115 | 334 | else: | 414 | else: |
1117 | 335 | # extract the day of previous daily upload and bump if already uploaded today | 415 | # extract the day of previous daily upload and bump if already uploaded |
1118 | 416 | # today | ||
1119 | 336 | regexp = re.compile("(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*-.*") | 417 | regexp = re.compile("(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*-.*") |
1120 | 337 | try: | 418 | try: |
1121 | 338 | previous_day = regexp.findall(base_package_version)[0] | 419 | previous_day = regexp.findall(base_package_version)[0] |
1122 | @@ -342,11 +423,17 @@ | |||
1123 | 342 | regexp = re.compile("(.*)(daily)([\d\.]{8})\.?([\d]*).*-.*") | 423 | regexp = re.compile("(.*)(daily)([\d\.]{8})\.?([\d]*).*-.*") |
1124 | 343 | previous_day = regexp.findall(base_package_version)[0] | 424 | previous_day = regexp.findall(base_package_version)[0] |
1125 | 344 | # make the version compatible with the new version | 425 | # make the version compatible with the new version |
1127 | 345 | previous_day = (previous_day[0], previous_day[1], "20" + previous_day[2].replace(".", ""), previous_day[3]) | 426 | previous_day = (previous_day[0], previous_day[1], "20" + |
1128 | 427 | previous_day[2].replace(".", ""), | ||
1129 | 428 | previous_day[3]) | ||
1130 | 346 | except IndexError: | 429 | except IndexError: |
1132 | 347 | raise Exception("Didn't find a correct versioning in the current package: {}".format(base_package_version)) | 430 | tmpl = ("Didn't find a correct versioning in the current " |
1133 | 431 | "package: {}") | ||
1134 | 432 | raise Exception(tmpl.format(base_package_version)) | ||
1135 | 348 | upstream_version = previous_day[0] | 433 | upstream_version = previous_day[0] |
1137 | 349 | if previous_day[1] == series_version and previous_day[2] == today_version: | 434 | is_series = previous_day[1] == series_version |
1138 | 435 | is_today = previous_day[2] == today_version | ||
1139 | 436 | if is_series and is_today: | ||
1140 | 350 | minor = 1 | 437 | minor = 1 |
1141 | 351 | if previous_day[3]: # second upload of the day | 438 | if previous_day[3]: # second upload of the day |
1142 | 352 | minor = int(previous_day[3]) + 1 | 439 | minor = int(previous_day[3]) + 1 |
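`create_new_packaging_version` recovers the previous daily release with the regex shown in the hunk above. A worked example of what the four capture groups hold (the version string itself is invented for illustration):

```python
import re

# Regex taken from the diff: upstream, series, yyyymmdd day, optional minor.
DAILY_RE = re.compile(r"(.*)\+([\d\.]{5})\.(\d{8})\.?([\d]*).*-.*")

# upstream 0.42, series 14.04, uploaded 20140223, second upload (.1) that day
upstream, series, day, minor = DAILY_RE.findall(
    "0.42+14.04.20140223.1-0ubuntu1")[0]
assert (upstream, series, day, minor) == ("0.42", "14.04", "20140223", "1")
```

When `minor` is empty the code treats the string as the first upload of its day; a repeated upload on the same day bumps it to `int(minor) + 1`.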
1143 | @@ -363,7 +450,7 @@ | |||
1144 | 363 | 450 | ||
1145 | 364 | def get_packaging_sourcename(): | 451 | def get_packaging_sourcename(): |
1146 | 365 | '''Get current packaging source name''' | 452 | '''Get current packaging source name''' |
1148 | 366 | instance = subprocess.Popen(["dpkg-parsechangelog"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 453 | instance = Popen(["dpkg-parsechangelog"], stdout=PIPE, stderr=PIPE) |
1149 | 367 | (stdout, stderr) = instance.communicate() | 454 | (stdout, stderr) = instance.communicate() |
1150 | 368 | if instance.returncode != 0: | 455 | if instance.returncode != 0: |
1151 | 369 | raise Exception(stderr.decode("utf-8").strip()) | 456 | raise Exception(stderr.decode("utf-8").strip()) |
1152 | @@ -373,7 +460,8 @@ | |||
1153 | 373 | if source_name: | 460 | if source_name: |
1154 | 374 | return source_name[0] | 461 | return source_name[0] |
1155 | 375 | 462 | ||
1157 | 376 | raise Exception("Didn't find any source name in the package: {}".format(stdout)) | 463 | tmpl = "Didn't find any source name in the package: {}" |
1158 | 464 | raise Exception(tmpl.format(stdout)) | ||
1159 | 377 | 465 | ||
1160 | 378 | 466 | ||
1161 | 379 | def collect_bugs_in_changelog_until_latest_snapshot(f, source_package_name): | 467 | def collect_bugs_in_changelog_until_latest_snapshot(f, source_package_name): |
1162 | @@ -382,20 +470,24 @@ | |||
1163 | 382 | # matching only bug format that launchpad accepts | 470 | # matching only bug format that launchpad accepts |
1164 | 383 | group_bugs_regexp = re.compile("lp: ?(.*\d{5,})", re.IGNORECASE) | 471 | group_bugs_regexp = re.compile("lp: ?(.*\d{5,})", re.IGNORECASE) |
1165 | 384 | bug_decipher_regexp = re.compile("(#\d{5,})+") | 472 | bug_decipher_regexp = re.compile("(#\d{5,})+") |
1167 | 385 | new_upload_changelog_regexp = re.compile(settings.NEW_CHANGELOG_PATTERN.format(source_package_name)) | 473 | pattern = settings.NEW_CHANGELOG_PATTERN.format(source_package_name) |
1168 | 474 | new_upload_changelog_regexp = re.compile(pattern) | ||
1169 | 386 | for line in f: | 475 | for line in f: |
1170 | 387 | grouped_bugs_list = group_bugs_regexp.findall(line) | 476 | grouped_bugs_list = group_bugs_regexp.findall(line) |
1171 | 388 | for grouped_bugs in grouped_bugs_list: | 477 | for grouped_bugs in grouped_bugs_list: |
1173 | 389 | for bug in map(lambda bug_with_hash: bug_with_hash.replace('#', ''), bug_decipher_regexp.findall(grouped_bugs)): | 478 | func = lambda bug_with_hash: bug_with_hash.replace('#', '') |
1174 | 479 | for bug in map(func, bug_decipher_regexp.findall(grouped_bugs)): | ||
1175 | 390 | bugs.add(bug) | 480 | bugs.add(bug) |
1177 | 391 | # a released upload to distro (automated or manual), then exit as bugs before were already covered | 481 | # a released upload to distro (automated or manual), then exit as bugs |
1178 | 482 | # before were already covered | ||
1179 | 392 | if new_upload_changelog_regexp.match(line): | 483 | if new_upload_changelog_regexp.match(line): |
1180 | 393 | return bugs | 484 | return bugs |
1181 | 394 | 485 | ||
1182 | 395 | return bugs | 486 | return bugs |
1183 | 396 | 487 | ||
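`collect_bugs_in_changelog_until_latest_snapshot` pairs two regexes: one finds `lp:` groups on a changelog line, the other deciphers the individual `#NNNNN` references inside each group. A self-contained sketch of that two-stage extraction, using the regexes from the diff on a made-up changelog line:

```python
import re

# matching only bug formats that launchpad accepts (regexes from the diff)
group_bugs_re = re.compile(r"lp: ?(.*\d{5,})", re.IGNORECASE)
bug_decipher_re = re.compile(r"(#\d{5,})+")

line = "  * Fix crash on startup (LP: #123456, #234567)"
bugs = set()
for grouped in group_bugs_re.findall(line):
    # strip the leading '#' from each individual bug reference
    for bug in (b.replace("#", "") for b in bug_decipher_re.findall(grouped)):
        bugs.add(bug)
assert bugs == {"123456", "234567"}
```

In the real function this loop runs over every changelog line until the `NEW_CHANGELOG_PATTERN` marker for an already-released upload is hit, at which point the accumulated set is returned.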
1184 | 397 | 488 | ||
1186 | 398 | def update_changelog(new_package_version, series, tip_bzr_rev, authors_commits, dest_ppa=None): | 489 | def update_changelog(new_package_version, series, tip_bzr_rev, authors_commits, |
1187 | 490 | dest_ppa=None): | ||
1188 | 399 | '''Update the changelog for the incoming upload''' | 491 | '''Update the changelog for the incoming upload''' |
1189 | 400 | 492 | ||
1190 | 401 | dch_env = os.environ.copy() | 493 | dch_env = os.environ.copy() |
1191 | @@ -403,18 +495,20 @@ | |||
1192 | 403 | dch_env["DEBFULLNAME"] = author | 495 | dch_env["DEBFULLNAME"] = author |
1193 | 404 | for bug_desc in authors_commits[author]: | 496 | for bug_desc in authors_commits[author]: |
1194 | 405 | if bug_desc.startswith('-'): | 497 | if bug_desc.startswith('-'): |
1196 | 406 | # Remove leading '-' or dch thinks (rightly) that it's an option | 498 | # Remove leading '-' or dch thinks (rightly) that it's an |
1197 | 499 | # option | ||
1198 | 407 | bug_desc = bug_desc[1:] | 500 | bug_desc = bug_desc[1:] |
1199 | 408 | if bug_desc.startswith(' '): | 501 | if bug_desc.startswith(' '): |
1200 | 409 | # Remove leading spaces, they are useless and the result is | 502 | # Remove leading spaces, they are useless and the result is |
1201 | 410 | # prettier without them anyway ;) | 503 | # prettier without them anyway ;) |
1202 | 411 | bug_desc = bug_desc.strip() | 504 | bug_desc = bug_desc.strip() |
1206 | 412 | cmd = ["dch", "--multimaint-merge", "--release-heuristic", "changelog", | 505 | cmd = ["dch", "--multimaint-merge", "--release-heuristic", |
1207 | 413 | "-v{}".format(new_package_version), bug_desc] | 506 | "changelog", "-v{}".format(new_package_version), bug_desc] |
1208 | 414 | subprocess.Popen(cmd, env=dch_env).communicate() | 507 | Popen(cmd, env=dch_env).communicate() |
1209 | 415 | 508 | ||
1212 | 416 | if tip_bzr_rev != None: | 509 | if tip_bzr_rev is not None: |
1213 | 417 | commit_message = "{} {}".format(settings.REV_STRING_FORMAT, tip_bzr_rev) | 510 | tmpl = "{} {}" |
1214 | 511 | commit_message = tmpl.format(settings.REV_STRING_FORMAT, tip_bzr_rev) | ||
1215 | 418 | if dest_ppa: | 512 | if dest_ppa: |
1216 | 419 | commit_message += " ({})".format(dest_ppa) | 513 | commit_message += " ({})".format(dest_ppa) |
1217 | 420 | else: | 514 | else: |
1218 | @@ -422,17 +516,21 @@ | |||
1219 | 422 | 516 | ||
1220 | 423 | dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME | 517 | dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME |
1221 | 424 | dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL | 518 | dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL |
1225 | 425 | instance = subprocess.Popen(["dch", "--release-heuristic", "changelog", | 519 | instance = Popen(["dch", "--release-heuristic", "changelog", |
1226 | 426 | "-v{}".format(new_package_version), commit_message], | 520 | "-v{}".format(new_package_version), commit_message], |
1227 | 427 | stderr=subprocess.PIPE, env=dch_env) | 521 | stderr=PIPE, env=dch_env) |
1228 | 428 | (stdout, stderr) = instance.communicate() | 522 | (stdout, stderr) = instance.communicate() |
1229 | 429 | if instance.returncode != 0: | 523 | if instance.returncode != 0: |
1230 | 430 | raise Exception(stderr.decode("utf-8").strip()) | 524 | raise Exception(stderr.decode("utf-8").strip()) |
1232 | 431 | subprocess.call(["dch", "-r", "--distribution", series, "--force-distribution", ""], env=dch_env) | 525 | cmd = ["dch", "-r", "--distribution", series, "--force-distribution", ""] |
1233 | 526 | subprocess.call(cmd, env=dch_env) | ||
1234 | 432 | 527 | ||
1236 | 433 | # in the case of no commit_message and no symbols file change, we have an additional [ DEBFULLNAME ] followed by an empty line | 528 | # in the case of no commit_message and no symbols file change, we have an |
1237 | 529 | # additional [ DEBFULLNAME ] followed by an empty line | ||
1238 | 434 | # better to remove both lines | 530 | # better to remove both lines |
1240 | 435 | subprocess.call(["sed", "-i", "/ \[ " + settings.BOT_DEBFULLNAME + " \]/{$q; N; /\\n$/d;}", "debian/changelog"]) | 531 | pattern = "/ \[ " + settings.BOT_DEBFULLNAME + " \]/{$q; N; /\\n$/d;}" |
1241 | 532 | cmd = ["sed", "-i", pattern, "debian/changelog"] | ||
1242 | 533 | subprocess.call(cmd) | ||
1243 | 436 | 534 | ||
1244 | 437 | 535 | ||
1245 | 438 | def build_source_package(series, distro_version, ppa=None): | 536 | def build_source_package(series, distro_version, ppa=None): |
1246 | @@ -457,7 +555,7 @@ | |||
1247 | 457 | "--distro-version", distro_version] | 555 | "--distro-version", distro_version] |
1248 | 458 | if ppa: | 556 | if ppa: |
1249 | 459 | cmd.extend(["--ppa", ppa]) | 557 | cmd.extend(["--ppa", ppa]) |
1251 | 460 | instance = subprocess.Popen(cmd, env=cowbuilder_env) | 558 | instance = Popen(cmd, env=cowbuilder_env) |
1252 | 461 | instance.communicate() | 559 | instance.communicate() |
1253 | 462 | if instance.returncode != 0: | 560 | if instance.returncode != 0: |
1254 | 463 | raise Exception("%r returned: %s." % (cmd, instance.returncode)) | 561 | raise Exception("%r returned: %s." % (cmd, instance.returncode)) |
1255 | @@ -490,15 +588,19 @@ | |||
1256 | 490 | replacement_done = False | 588 | replacement_done = False |
1257 | 491 | for filename in os.listdir("debian"): | 589 | for filename in os.listdir("debian"): |
1258 | 492 | if filename.endswith("symbols"): | 590 | if filename.endswith("symbols"): |
1260 | 493 | for line in fileinput.input(os.path.join('debian', filename), inplace=1): | 591 | dfile = os.path.join('debian', filename) |
1261 | 592 | for line in fileinput.input(dfile, inplace=1): | ||
1262 | 494 | if settings.REPLACEME_TAG in line: | 593 | if settings.REPLACEME_TAG in line: |
1263 | 495 | replacement_done = True | 594 | replacement_done = True |
1265 | 496 | line = line.replace(settings.REPLACEME_TAG, new_upstream_version) | 595 | line = line.replace(settings.REPLACEME_TAG, |
1266 | 596 | new_upstream_version) | ||
1267 | 497 | sys.stdout.write(line) | 597 | sys.stdout.write(line) |
1268 | 498 | 598 | ||
1269 | 499 | if replacement_done: | 599 | if replacement_done: |
1270 | 500 | dch_env = os.environ.copy() | 600 | dch_env = os.environ.copy() |
1271 | 501 | dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME | 601 | dch_env["DEBFULLNAME"] = settings.BOT_DEBFULLNAME |
1272 | 502 | dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL | 602 | dch_env["DEBEMAIL"] = settings.BOT_DEBEMAIL |
1274 | 503 | subprocess.Popen(["dch", "debian/*symbols: auto-update new symbols to released version"], env=dch_env).communicate() | 603 | cmd = ["dch", |
1275 | 604 | "debian/*symbols: auto-update new symbols to released version"] | ||
1276 | 605 | Popen(cmd, env=dch_env).communicate() | ||
1277 | 504 | subprocess.call(["bzr", "commit", "-m", "Update symbols"]) | 606 | subprocess.call(["bzr", "commit", "-m", "Update symbols"]) |
1278 | 505 | 607 | ||
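The symbols-file hunk above rewrites files in place with `fileinput` and substitutes a placeholder tag with the new upstream version. A minimal standalone sketch of that pattern — the tag value, file name, and version here are hypothetical stand-ins for `settings.REPLACEME_TAG` and the real `debian/*symbols` file:

```python
import fileinput
import sys

# Hypothetical stand-ins for settings.REPLACEME_TAG and the symbols file.
REPLACEME_TAG = "0replaceme"
new_upstream_version = "1.2.3"

# Write a small sample symbols file to operate on.
with open("demo.symbols", "w") as f:
    f.write("libdemo.so.1\n symbol@Base 0replaceme\n")

replacement_done = False
# inplace=1 redirects stdout into the file being iterated, so each
# line written replaces the original one.
for line in fileinput.input("demo.symbols", inplace=1):
    if REPLACEME_TAG in line:
        replacement_done = True
        line = line.replace(REPLACEME_TAG, new_upstream_version)
    sys.stdout.write(line)
```

Binding the joined path to `dfile` first, as the diff does, is what brings the `fileinput.input(...)` call under pep8's 79-column limit without changing behaviour.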
1279 | === modified file 'branch-source-builder/cupstream2distro/settings.py' | |||
1280 | --- branch-source-builder/cupstream2distro/settings.py 2014-02-19 16:34:46 +0000 | |||
1281 | +++ branch-source-builder/cupstream2distro/settings.py 2014-02-23 16:37:01 +0000 | |||
1282 | @@ -39,6 +39,7 @@ | |||
1283 | 39 | GNUPG_DIR = home_dir | 39 | GNUPG_DIR = home_dir |
1284 | 40 | CRED_FILE_PATH = os.path.join("/tmp", "launchpad.credentials") | 40 | CRED_FILE_PATH = os.path.join("/tmp", "launchpad.credentials") |
1285 | 41 | 41 | ||
1286 | 42 | |||
1287 | 42 | # TODO refactor into a ci-utils module | 43 | # TODO refactor into a ci-utils module |
1288 | 43 | def _unit_config(): | 44 | def _unit_config(): |
1289 | 44 | path = os.path.join(home_dir, 'unit_config') | 45 | path = os.path.join(home_dir, 'unit_config') |
1290 | @@ -73,7 +74,8 @@ | |||
1291 | 73 | 74 | ||
1292 | 74 | # selected arch for building arch:all packages | 75 | # selected arch for building arch:all packages |
1293 | 75 | VIRTUALIZED_PPA_ARCH = ["i386", "amd64"] | 76 | VIRTUALIZED_PPA_ARCH = ["i386", "amd64"] |
1295 | 76 | # an arch we will ignore for publication if latest published version in dest doesn't build it | 77 | # an arch we will ignore for publication if latest published version in dest |
1296 | 78 | # doesn't build it | ||
1297 | 77 | ARCHS_TO_EVENTUALLY_IGNORE = set(['powerpc', 'arm64', 'ppc64el']) | 79 | ARCHS_TO_EVENTUALLY_IGNORE = set(['powerpc', 'arm64', 'ppc64el']) |
1298 | 78 | ARCHS_TO_UNCONDITIONALLY_IGNORE = set(['arm64', 'ppc64el']) | 80 | ARCHS_TO_UNCONDITIONALLY_IGNORE = set(['arm64', 'ppc64el']) |
1299 | 79 | SRU_PPA = "ubuntu-unity/sru-staging" | 81 | SRU_PPA = "ubuntu-unity/sru-staging" |
1300 | @@ -87,11 +89,15 @@ | |||
1301 | 87 | 89 | ||
1302 | 88 | OLD_STACK_DIR = 'old' | 90 | OLD_STACK_DIR = 'old' |
1303 | 89 | PACKAGE_LIST_RSYNC_FILENAME_PREFIX = 'packagelist_rsync' | 91 | PACKAGE_LIST_RSYNC_FILENAME_PREFIX = 'packagelist_rsync' |
1306 | 90 | PACKAGE_LIST_RSYNC_FILENAME_FORMAT = PACKAGE_LIST_RSYNC_FILENAME_PREFIX + '_{}-{}' | 92 | PACKAGE_LIST_RSYNC_FILENAME_FORMAT = (PACKAGE_LIST_RSYNC_FILENAME_PREFIX |
1307 | 91 | RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*".format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX) | 93 | + '_{}-{}') |
1308 | 94 | RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*" | ||
1309 | 95 | RSYNC_PATTERN = RSYNC_PATTERN.format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX) | ||
1310 | 92 | 96 | ||
1313 | 93 | ROOT_CU2D = os.path.join(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) | 97 | ROOT_CU2D = os.path.join( |
1314 | 94 | DEFAULT_CONFIG_STACKS_DIR = os.path.join(os.path.dirname(ROOT_CU2D), 'cupstream2distro-config', 'stacks') | 98 | os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) |
1315 | 99 | DEFAULT_CONFIG_STACKS_DIR = os.path.join( | ||
1316 | 100 | os.path.dirname(ROOT_CU2D), 'cupstream2distro-config', 'stacks') | ||
1317 | 95 | STACK_STATUS_FILENAME = "stack.status" | 101 | STACK_STATUS_FILENAME = "stack.status" |
1318 | 96 | STACK_STARTED_FILENAME = "stack.started" | 102 | STACK_STARTED_FILENAME = "stack.started" |
1319 | 97 | STACK_BUILDING_FILENAME = "stack.building" | 103 | STACK_BUILDING_FILENAME = "stack.building" |
1320 | 98 | 104 | ||
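The settings.py hunks above fix long lines by wrapping expressions in parentheses (implicit continuation) or by splitting a `.format()` call into two statements, rather than using backslashes. A runnable sketch of both styles, using the constants from the diff:

```python
# Implicit line continuation inside parentheses: pep8-friendly, no backslash.
PACKAGE_LIST_RSYNC_FILENAME_PREFIX = 'packagelist_rsync'
PACKAGE_LIST_RSYNC_FILENAME_FORMAT = (PACKAGE_LIST_RSYNC_FILENAME_PREFIX
                                      + '_{}-{}')

# Splitting the .format() call across two statements keeps each line short.
RSYNC_PATTERN = "rsync://RSYNCSVR/cu2d_out/{}*"
RSYNC_PATTERN = RSYNC_PATTERN.format(PACKAGE_LIST_RSYNC_FILENAME_PREFIX)
```

Both forms produce the same values as the original one-liners; only the layout changes.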
1321 | === modified file 'branch-source-builder/cupstream2distro/silomanager.py' | |||
1322 | --- branch-source-builder/cupstream2distro/silomanager.py 2014-01-31 17:06:36 +0000 | |||
1323 | +++ branch-source-builder/cupstream2distro/silomanager.py 2014-02-23 16:37:01 +0000 | |||
1324 | @@ -22,7 +22,8 @@ | |||
1325 | 22 | import os | 22 | import os |
1326 | 23 | import shutil | 23 | import shutil |
1327 | 24 | 24 | ||
1329 | 25 | from cupstream2distro.settings import SILO_CONFIG_FILENAME, SILO_NAME_LIST, SILO_STATUS_RSYNCDIR | 25 | from cupstream2distro.settings import (SILO_CONFIG_FILENAME, SILO_NAME_LIST, |
1330 | 26 | SILO_STATUS_RSYNCDIR) | ||
1331 | 26 | from cupstream2distro.utils import ignored | 27 | from cupstream2distro.utils import ignored |
1332 | 27 | 28 | ||
1333 | 28 | 29 | ||
1334 | @@ -42,10 +43,13 @@ | |||
1335 | 42 | os.makedirs(SILO_STATUS_RSYNCDIR) | 43 | os.makedirs(SILO_STATUS_RSYNCDIR) |
1336 | 43 | silo_name = os.path.dirname(silo_config_path).split(os.path.sep)[-1] | 44 | silo_name = os.path.dirname(silo_config_path).split(os.path.sep)[-1] |
1337 | 44 | dest = os.path.join(SILO_STATUS_RSYNCDIR, silo_name) | 45 | dest = os.path.join(SILO_STATUS_RSYNCDIR, silo_name) |
1340 | 45 | logging.debug("Copying configuration from {} to {}".format(silo_config_path, dest)) | 46 | tmpl = "Copying configuration from {} to {}" |
1341 | 46 | shutil.copy2(silo_config_path, os.path.join(SILO_STATUS_RSYNCDIR, silo_name)) | 47 | logging.debug(tmpl.format(silo_config_path, dest)) |
1342 | 48 | dest = os.path.join(SILO_STATUS_RSYNCDIR, silo_name) | ||
1343 | 49 | shutil.copy2(silo_config_path, dest) | ||
1344 | 47 | return True | 50 | return True |
1345 | 48 | 51 | ||
1346 | 52 | |||
1347 | 49 | def load_config(uri=None): | 53 | def load_config(uri=None): |
1348 | 50 | """return a loaded config | 54 | """return a loaded config |
1349 | 51 | 55 | ||
1350 | @@ -62,23 +66,33 @@ | |||
1351 | 62 | logging.warning("Can't load configuration: " + e.message) | 66 | logging.warning("Can't load configuration: " + e.message) |
1352 | 63 | return None | 67 | return None |
1353 | 64 | 68 | ||
1354 | 69 | |||
1355 | 65 | def remove_status_file(silo_name): | 70 | def remove_status_file(silo_name): |
1356 | 66 | """Remove status file""" | 71 | """Remove status file""" |
1357 | 67 | os.remove(os.path.join(SILO_STATUS_RSYNCDIR, silo_name)) | 72 | os.remove(os.path.join(SILO_STATUS_RSYNCDIR, silo_name)) |
1358 | 68 | 73 | ||
1359 | 69 | 74 | ||
1363 | 70 | def is_project_not_in_any_configs(project_name, series, dest, base_silo_uri, ignore_silo): | 75 | def is_project_not_in_any_configs(project_name, series, dest, base_silo_uri, |
1364 | 71 | """Return true if the project for that serie in that dest is not in any configuration""" | 76 | ignore_silo): |
1365 | 72 | logging.info("Checking if {} is already configured for {} ({}) in another silo".format(project_name, dest.name, series.name)) | 77 | """Return true if the project for that serie in that dest is not in any |
1366 | 78 | configuration""" | ||
1367 | 79 | tmpl = "Checking if {} is already configured for {} ({}) in another silo" | ||
1368 | 80 | logging.info(tmpl.format(project_name, dest.name, series.name)) | ||
1369 | 73 | for silo_name in SILO_NAME_LIST: | 81 | for silo_name in SILO_NAME_LIST: |
1370 | 74 | # we are reconfiguring current silo, ignoring it | 82 | # we are reconfiguring current silo, ignoring it |
1371 | 75 | if ignore_silo == silo_name: | 83 | if ignore_silo == silo_name: |
1372 | 76 | continue | 84 | continue |
1373 | 77 | config = load_config(os.path.join(base_silo_uri, silo_name)) | 85 | config = load_config(os.path.join(base_silo_uri, silo_name)) |
1374 | 78 | if config: | 86 | if config: |
1378 | 79 | if (config["global"]["dest"] == dest.self_link and config["global"]["series"] == series.self_link and | 87 | project_match = (project_name in config["mps"] |
1379 | 80 | (project_name in config["mps"] or project_name in config["sources"])): | 88 | or project_name in config["sources"]) |
1380 | 81 | logging.error("{} is already prepared for the same serie and destination in {}".format(project_name, silo_name)) | 89 | series_match = config["global"]["series"] == series.self_link |
1381 | 90 | dest_match = config["global"]["dest"] == dest.self_link | ||
1382 | 91 | |||
1383 | 92 | if (dest_match and series_match and project_match): | ||
1384 | 93 | tmpl = ("{} is already prepared for the same series " | ||
1385 | 94 | "and destination in {}") | ||
1386 | 95 | logging.error(tmpl.format(project_name, silo_name)) | ||
1387 | 82 | return False | 96 | return False |
1388 | 83 | return True | 97 | return True |
1389 | 84 | 98 | ||
1390 | @@ -86,27 +100,32 @@ | |||
1391 | 86 | def return_first_available_silo(base_silo_uri): | 100 | def return_first_available_silo(base_silo_uri): |
1392 | 87 | """Check which silos are free and return the first one""" | 101 | """Check which silos are free and return the first one""" |
1393 | 88 | for silo_name in SILO_NAME_LIST: | 102 | for silo_name in SILO_NAME_LIST: |
1395 | 89 | if not os.path.isfile(os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME)): | 103 | p = os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME) |
1396 | 104 | if not os.path.isfile(p): | ||
1397 | 90 | return silo_name | 105 | return silo_name |
1398 | 91 | return None | 106 | return None |
1399 | 92 | 107 | ||
1400 | 108 | |||
1401 | 93 | def get_config_step(config): | 109 | def get_config_step(config): |
1402 | 94 | """Get configuration step""" | 110 | """Get configuration step""" |
1403 | 95 | return config["global"]["step"] | 111 | return config["global"]["step"] |
1404 | 96 | 112 | ||
1405 | 113 | |||
1406 | 97 | def set_config_step(config, new_step, uri=''): | 114 | def set_config_step(config, new_step, uri=''): |
1407 | 98 | """Set configuration step to new_step""" | 115 | """Set configuration step to new_step""" |
1408 | 99 | config["global"]["step"] = new_step | 116 | config["global"]["step"] = new_step |
1409 | 100 | return save_config(config, uri) | 117 | return save_config(config, uri) |
1410 | 101 | 118 | ||
1411 | 119 | |||
1412 | 102 | def set_config_status(config, status, uri='', add_url=True): | 120 | def set_config_status(config, status, uri='', add_url=True): |
1413 | 103 | """Change status to reflect latest status""" | 121 | """Change status to reflect latest status""" |
1414 | 104 | build_url = os.getenv('BUILD_URL') | 122 | build_url = os.getenv('BUILD_URL') |
1415 | 105 | if add_url and build_url: | 123 | if add_url and build_url: |
1417 | 106 | status = "{} ({}console)".format(status , build_url) | 124 | status = "{} ({}console)".format(status, build_url) |
1418 | 107 | config["global"]["status"] = status | 125 | config["global"]["status"] = status |
1419 | 108 | return save_config(config, uri) | 126 | return save_config(config, uri) |
1420 | 109 | 127 | ||
1421 | 128 | |||
1422 | 110 | def get_all_projects(config): | 129 | def get_all_projects(config): |
1423 | 111 | """Get a list of all projets""" | 130 | """Get a list of all projets""" |
1424 | 112 | projects = [] | 131 | projects = [] |
1425 | 113 | 132 | ||
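The `return_first_available_silo` refactor above binds the joined path to a short name before the `os.path.isfile` check. A self-contained sketch of the same scan — the silo names and directory layout here are hypothetical, not the production silo list:

```python
import os
import tempfile

SILO_CONFIG_FILENAME = "config"
SILO_NAME_LIST = ["landing-001", "landing-002", "landing-003"]


def return_first_available_silo(base_silo_uri):
    """Return the first silo with no config file, or None if all are busy."""
    for silo_name in SILO_NAME_LIST:
        p = os.path.join(base_silo_uri, silo_name, SILO_CONFIG_FILENAME)
        if not os.path.isfile(p):
            return silo_name
    return None


base = tempfile.mkdtemp()
# Mark landing-001 as occupied by creating its config file.
os.makedirs(os.path.join(base, "landing-001"))
open(os.path.join(base, "landing-001", SILO_CONFIG_FILENAME), "w").close()

print(return_first_available_silo(base))  # → landing-002
```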
1426 | === modified file 'branch-source-builder/cupstream2distro/stack.py' | |||
1427 | --- branch-source-builder/cupstream2distro/stack.py 2014-02-19 16:34:46 +0000 | |||
1428 | +++ branch-source-builder/cupstream2distro/stack.py 2014-02-23 16:37:01 +0000 | |||
1429 | @@ -21,11 +21,75 @@ | |||
1430 | 21 | import os | 21 | import os |
1431 | 22 | import yaml | 22 | import yaml |
1432 | 23 | 23 | ||
1434 | 24 | from .settings import DEFAULT_CONFIG_STACKS_DIR, STACK_STATUS_FILENAME, STACK_STARTED_FILENAME | 24 | from .settings import (DEFAULT_CONFIG_STACKS_DIR, STACK_STATUS_FILENAME, |
1435 | 25 | STACK_STARTED_FILENAME) | ||
1436 | 25 | from .utils import ignored | 26 | from .utils import ignored |
1437 | 26 | 27 | ||
1438 | 27 | _stacks_ref = {} | 28 | _stacks_ref = {} |
1439 | 28 | 29 | ||
1440 | 30 | CANT_FIND_STATUS = '''Can't find status for {depstack} ({deprel}). This | ||
1441 | 31 | shouldn't happen unless the stack is currently running. If this is the case, it | ||
1442 | 32 | means that the current stack shouldn't be uploaded as the state is unknown.''' | ||
1443 | 33 | |||
1444 | 34 | FAILED_TO_PUBLISH = '''{depstack} ({deprel}) failed to publish. Possible causes | ||
1445 | 35 | are: | ||
1446 | 36 | * the stack really didn't build/can't be prepared at all. | ||
1447 | 37 | * the stack has integration tests not working with this previous stack. | ||
1448 | 38 | |||
1449 | 39 | What needs to be done: | ||
1450 | 40 | Either: | ||
1451 | 41 | * If we want to publish both stacks: retry the integration tests for | ||
1452 | 42 | {depstack} ({deprel}), including components from this stack (check | ||
1453 | 43 | with the whole PPA). If that works, both stacks should be published | ||
1454 | 44 | at the same time. | ||
1455 | 45 | Or: | ||
1456 | 46 | * If we only want to publish this stack: check that we can safely | ||
1457 | 47 | publish it by itself (e.g. without the stacks it depends on). The | ||
1458 | 48 | trick there is to make sure that the stack is not relying on, or | ||
1459 | 49 | affected by, a change that happened in one of its dependencies. | ||
1460 | 50 | Example: if the {depstack} ({deprel}) API changed in a way that | ||
1461 | 51 | affects any component of the current stack, and both stacks got | ||
1462 | 52 | updated in trunk, we need to make sure we don't land only one of the | ||
1463 | 53 | two stacks which would result in a broken state. Also think about | ||
1464 | 54 | potential ABI changes.''' | ||
1465 | 55 | |||
1466 | 56 | MANUAL_PUBLISH = '''{depstack} ({deprel}) is in manually publish mode. Possible | ||
1467 | 57 | causes are: | ||
1468 | 58 | * Some part of the stack has packaging changes | ||
1469 | 59 | * This stack is depending on another stack not being published | ||
1470 | 60 | |||
1471 | 61 | What needs to be done: | ||
1472 | 62 | Either: | ||
1473 | 63 | * If {depstack} ({deprel}) can be published, we should publish both | ||
1474 | 64 | stacks at the same time. | ||
1475 | 65 | Or: | ||
1476 | 66 | * If we only want to publish this stack: check that we can safely | ||
1477 | 67 | publish it by itself (e.g. without the stacks it depends on). The | ||
1478 | 68 | trick there is to make sure that the stack is not relying on, or | ||
1479 | 69 | affected by, a change that happened in one of its dependencies. | ||
1480 | 70 | Example: if the {depstack} ({deprel}) API changed in a way that | ||
1481 | 71 | affects any component of the current stack, and both stacks got | ||
1482 | 72 | updated in trunk, we need to make sure we don't land only one of the | ||
1483 | 73 | two stacks which would result in a broken state. Also think about | ||
1484 | 74 | potential ABI changes.''' | ||
1485 | 75 | |||
1486 | 76 | MANUALLY_ABORTED = '''{depstack} ({deprel}) has been manually aborted or failed | ||
1487 | 77 | for an unknown reason. Possible causes are: | ||
1488 | 78 | * A job of this stack was stopped manually | ||
1489 | 79 | * Jenkins had an internal error/shutdown | ||
1490 | 80 | |||
1491 | 81 | What needs to be done: | ||
1492 | 82 | * If we only want to publish this stack: check that we can safely | ||
1493 | 83 | publish it by itself (e.g. without the stacks it depends on). The | ||
1494 | 84 | trick there is to make sure that the stack is not relying on, or | ||
1495 | 85 | affected by, a change that happened in one of its dependencies. | ||
1496 | 86 | Example: if the {depstack} ({deprel}) API changed in a way that | ||
1497 | 87 | affects any component of the current stack, and both stacks got | ||
1498 | 88 | updated in trunk, we need to make sure we don't land only one of the | ||
1499 | 89 | two stacks which would result in a broken state. Also think about | ||
1500 | 90 | potential ABI changes.''' | ||
1501 | 91 | |||
1502 | 92 | |||
1503 | 29 | # TODO: should be used by a metaclass | 93 | # TODO: should be used by a metaclass |
1504 | 30 | def get_stack(release, stack_name): | 94 | def get_stack(release, stack_name): |
1505 | 31 | try: | 95 | try: |
1506 | @@ -33,22 +97,28 @@ | |||
1507 | 33 | except KeyError: | 97 | except KeyError: |
1508 | 34 | return Stack(release, stack_name) | 98 | return Stack(release, stack_name) |
1509 | 35 | 99 | ||
1510 | 100 | |||
1511 | 36 | class Stack(): | 101 | class Stack(): |
1512 | 37 | 102 | ||
1513 | 38 | def __init__(self, release, stack_name): | 103 | def __init__(self, release, stack_name): |
1518 | 39 | self.stack_name = stack_name | 104 | self.stack_name = stack_name |
1519 | 40 | self.release = release | 105 | self.release = release |
1520 | 41 | self.statusfile = os.path.join('..', '..', release, stack_name, STACK_STATUS_FILENAME) | 106 | args = ('..', '..', release, stack_name, STACK_STATUS_FILENAME) |
1521 | 42 | self.startedfile = os.path.join('..', '..', release, stack_name, STACK_STARTED_FILENAME) | 107 | self.statusfile = os.path.join(*args) |
1522 | 108 | args = ('..', '..', release, stack_name, STACK_STARTED_FILENAME) | ||
1523 | 109 | self.startedfile = os.path.join(*args) | ||
1524 | 43 | self.stack_file_path = None | 110 | self.stack_file_path = None |
1525 | 44 | self._dependencies = None | 111 | self._dependencies = None |
1526 | 45 | self._rdependencies = None | 112 | self._rdependencies = None |
1529 | 46 | for stack_file_path in Stack.get_stacks_file_path(release): | 113 | for stack_file_path in Stack.get_stacks_file_path(release): |
1530 | 47 | if stack_file_path.split(os.path.sep)[-1] == "{}.cfg".format(stack_name): | 114 | formatted = "{}.cfg".format(stack_name) |
1531 | 115 | if stack_file_path.split(os.path.sep)[-1] == formatted: | ||
1532 | 48 | self.stack_file_path = stack_file_path | 116 | self.stack_file_path = stack_file_path |
1533 | 49 | break | 117 | break |
1534 | 50 | if not self.stack_file_path: | 118 | if not self.stack_file_path: |
1536 | 51 | raise Exception("{}.cfg for {} doesn't exist anywhere in {}".format(stack_name, release, self.get_root_stacks_dir())) | 119 | msg = "{}.cfg for {} doesn't exist anywhere in {}" |
1537 | 120 | msg = msg.format(stack_name, release, self.get_root_stacks_dir()) | ||
1538 | 121 | raise Exception(msg) | ||
1539 | 52 | 122 | ||
1540 | 53 | with open(self.stack_file_path, 'r') as f: | 123 | with open(self.stack_file_path, 'r') as f: |
1541 | 54 | cfg = yaml.load(f) | 124 | cfg = yaml.load(f) |
1542 | @@ -118,7 +188,11 @@ | |||
1543 | 118 | else: | 188 | else: |
1544 | 119 | (stackname, release) = (item, self.release) | 189 | (stackname, release) = (item, self.release) |
1545 | 120 | self._dependencies.append(get_stack(release, stackname)) | 190 | self._dependencies.append(get_stack(release, stackname)) |
1547 | 121 | logging.info("{} ({}) dependency list is: {}".format(self.stack_name, self.release, ["{} ({})".format(stack.stack_name, stack.release) for stack in self._dependencies])) | 191 | deps = ["{} ({})".format(stack.stack_name, stack.release) |
1548 | 192 | for stack in self._dependencies] | ||
1549 | 193 | msg = "{} ({}) dependency list is: {}" | ||
1550 | 194 | msg = msg.format(self.stack_name, self.release, deps) | ||
1551 | 195 | logging.info(msg) | ||
1552 | 122 | return self._dependencies | 196 | return self._dependencies |
1553 | 123 | except (TypeError, KeyError): | 197 | except (TypeError, KeyError): |
1554 | 124 | return [] | 198 | return [] |
1555 | @@ -137,7 +211,8 @@ | |||
1556 | 137 | return self._rdependencies | 211 | return self._rdependencies |
1557 | 138 | 212 | ||
1558 | 139 | def generate_dep_status_message(self): | 213 | def generate_dep_status_message(self): |
1560 | 140 | '''Return a list of potential problems from others stack which should block current publication''' | 214 | '''Return a list of potential problems from others stack which should |
1561 | 215 | block current publication''' | ||
1562 | 141 | 216 | ||
1563 | 142 | # TODO: get the first Stack object | 217 | # TODO: get the first Stack object |
1564 | 143 | # iterate over all stacks objects from dep chain | 218 | # iterate over all stacks objects from dep chain |
1565 | @@ -145,39 +220,24 @@ | |||
1566 | 145 | 220 | ||
1567 | 146 | global_dep_status_info = [] | 221 | global_dep_status_info = [] |
1568 | 147 | for stack in self.get_direct_depending_stacks(): | 222 | for stack in self.get_direct_depending_stacks(): |
1570 | 148 | logging.info("Check status for {} ({})".format(stack.stack_name, stack.release)) | 223 | msg = "Check status for {} ({})" |
1571 | 224 | msg = msg.format(stack.stack_name, stack.release) | ||
1572 | 225 | logging.info(msg) | ||
1573 | 149 | status = stack.get_status() | 226 | status = stack.get_status() |
1574 | 150 | message = None | 227 | message = None |
1575 | 151 | # We should have a status for every stack | 228 | # We should have a status for every stack |
1576 | 152 | if status is None: | 229 | if status is None: |
1578 | 153 | message = "Can't find status for {depstack} ({deprel}). This shouldn't happen unless the stack is currently running. If this is the case, it means that the current stack shouldn't be uploaded as the state is unknown.".format(depstack=stack, deprel=stack.release) | 230 | kw = {'depstack': stack, 'deprel': stack.release} |
1579 | 231 | message = CANT_FIND_STATUS.format(**kw) | ||
1580 | 154 | elif status == 1: | 232 | elif status == 1: |
1590 | 155 | message = '''{depstack} ({deprel}) failed to publish. Possible causes are: | 233 | kw = {'depstack': stack.stack_name, 'deprel': stack.release} |
1591 | 156 | * the stack really didn't build/can't be prepared at all. | 234 | message = FAILED_TO_PUBLISH.format(**kw) |
1583 | 157 | * the stack has integration tests not working with this previous stack. | ||
1584 | 158 | |||
1585 | 159 | What needs to be done: | ||
1586 | 160 | Either: | ||
1587 | 161 | * If we want to publish both stacks: retry the integration tests for {depstack} ({deprel}), including components from this stack (check with the whole PPA). If that works, both stacks should be published at the same time. | ||
1588 | 162 | Or: | ||
1589 | 163 | * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release) | ||
1592 | 164 | elif status == 2: | 235 | elif status == 2: |
1602 | 165 | message = '''{depstack} ({deprel}) is in manually publish mode. Possible causes are: | 236 | kw = {'depstack': stack.stack_name, 'deprel': stack.release} |
1603 | 166 | * Some part of the stack has packaging changes | 237 | message = MANUAL_PUBLISH.format(**kw) |
1595 | 167 | * This stack is depending on another stack not being published | ||
1596 | 168 | |||
1597 | 169 | What needs to be done: | ||
1598 | 170 | Either: | ||
1599 | 171 | * If {depstack} ({deprel}) can be published, we should publish both stacks at the same time. | ||
1600 | 172 | Or: | ||
1601 | 173 | * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release) | ||
1604 | 174 | elif status == 3 or status == -1: | 238 | elif status == 3 or status == -1: |
1611 | 175 | message = '''{depstack} ({deprel}) has been manually aborted or failed for an unknown reason. Possible causes are: | 239 | kw = {'depstack': stack.stack_name, 'deprel': stack.release} |
1612 | 176 | * A job of this stack was stopped manually | 240 | message = MANUALLY_ABORTED.format(**kw) |
1607 | 177 | * Jenkins had an internal error/shutdown | ||
1608 | 178 | |||
1609 | 179 | What needs to be done: | ||
1610 | 180 | * If we only want to publish this stack: check that we can safely publish it by itself (e.g. without the stacks it depends on). The trick there is to make sure that the stack is not relying on, or affected by, a change that happened in one of its dependencies. Example: if the {depstack} ({deprel}) API changed in a way that affects any component of the current stack, and both stacks got updated in trunk, we need to make sure we don't land only one of the two stacks which would result in a broken state. Also think about potential ABI changes.'''.format(depstack=stack.stack_name, deprel=stack.release) | ||
1613 | 181 | 241 | ||
1614 | 182 | if message: | 242 | if message: |
1615 | 183 | logging.warning(message) | 243 | logging.warning(message) |
1616 | @@ -186,19 +246,21 @@ | |||
1617 | 186 | 246 | ||
1618 | 187 | @staticmethod | 247 | @staticmethod |
1619 | 188 | def get_root_stacks_dir(): | 248 | def get_root_stacks_dir(): |
1622 | 189 | '''Get root stack dir''' | 249 | '''Get root stack dir''' |
1623 | 190 | return os.environ.get('CONFIG_STACKS_DIR', DEFAULT_CONFIG_STACKS_DIR) | 250 | return os.environ.get('CONFIG_STACKS_DIR', DEFAULT_CONFIG_STACKS_DIR) |
1624 | 191 | 251 | ||
1625 | 192 | @staticmethod | 252 | @staticmethod |
1626 | 193 | def get_stacks_file_path(release): | 253 | def get_stacks_file_path(release): |
1627 | 194 | '''Return an iterator with all path for every discovered stack files''' | 254 | '''Return an iterator with all path for every discovered stack files''' |
1629 | 195 | for root, dirs, files in os.walk(os.path.join(Stack.get_root_stacks_dir(), release)): | 255 | walking = os.walk(os.path.join(Stack.get_root_stacks_dir(), release)) |
1630 | 256 | for root, dirs, files in walking: | ||
1631 | 196 | for candidate in files: | 257 | for candidate in files: |
1632 | 197 | if candidate.endswith('.cfg'): | 258 | if candidate.endswith('.cfg'): |
1633 | 198 | yield os.path.join(root, candidate) | 259 | yield os.path.join(root, candidate) |
1634 | 199 | 260 | ||
1635 | 200 | @staticmethod | 261 | @staticmethod |
1636 | 201 | def get_current_stack(): | 262 | def get_current_stack(): |
1638 | 202 | '''Return current stack object based on current path (release/stackname)''' | 263 | '''Return current stack object based on current path |
1639 | 264 | (release/stackname)''' | ||
1640 | 203 | path = os.getcwd().split(os.path.sep) | 265 | path = os.getcwd().split(os.path.sep) |
1641 | 204 | return get_stack(path[-2], path[-1]) | 266 | return get_stack(path[-2], path[-1]) |
1642 | 205 | 267 | ||
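The stack.py change above hoists the long user-facing messages into module-level template constants and fills them per stack with `.format(**kw)`, which is what lets `generate_dep_status_message` stay within the line limit. A minimal sketch of that pattern, with the message text shortened for illustration:

```python
import logging

# Module-level template, filled per stack; text shortened from the diff.
CANT_FIND_STATUS = ("Can't find status for {depstack} ({deprel}). This "
                    "shouldn't happen unless the stack is currently running.")


def status_message(status, stack_name, release):
    """Map a stack status onto a warning message, or None if all is well."""
    message = None
    if status is None:
        kw = {'depstack': stack_name, 'deprel': release}
        message = CANT_FIND_STATUS.format(**kw)
    if message:
        logging.warning(message)
    return message


msg = status_message(None, 'unity', 'trusty')
```

Keeping the templates at module scope also makes them independently testable, instead of being buried in a 200-column string literal inside the branch.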
1643 | === modified file 'branch-source-builder/cupstream2distro/stacks.py' | |||
1644 | --- branch-source-builder/cupstream2distro/stacks.py 2014-02-19 16:34:46 +0000 | |||
1645 | +++ branch-source-builder/cupstream2distro/stacks.py 2014-02-23 16:37:01 +0000 | |||
1646 | @@ -21,6 +21,7 @@ | |||
1647 | 21 | import os | 21 | import os |
1648 | 22 | import yaml | 22 | import yaml |
1649 | 23 | import subprocess | 23 | import subprocess |
1650 | 24 | from subprocess import PIPE | ||
1651 | 24 | 25 | ||
1652 | 25 | from .settings import PACKAGE_LIST_RSYNC_FILENAME_PREFIX, RSYNC_PATTERN | 26 | from .settings import PACKAGE_LIST_RSYNC_FILENAME_PREFIX, RSYNC_PATTERN |
1653 | 26 | from .tools import get_packaging_diff_filename | 27 | from .tools import get_packaging_diff_filename |
1654 | @@ -38,7 +39,7 @@ | |||
1655 | 38 | raise Exception('Please set environment variable CU2D_RSYNCSVR') | 39 | raise Exception('Please set environment variable CU2D_RSYNCSVR') |
1656 | 39 | 40 | ||
1657 | 40 | cmd = ["rsync", '--remove-source-files', '--timeout=60', remoteaddr, '.'] | 41 | cmd = ["rsync", '--remove-source-files', '--timeout=60', remoteaddr, '.'] |
1659 | 41 | instance = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) | 42 | instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE) |
1660 | 42 | (stdout, stderr) = instance.communicate() | 43 | (stdout, stderr) = instance.communicate() |
1661 | 43 | if instance.returncode not in (0, 23): | 44 | if instance.returncode not in (0, 23): |
1662 | 44 | raise Exception(stderr.decode("utf-8").strip()) | 45 | raise Exception(stderr.decode("utf-8").strip()) |
1663 | @@ -62,10 +63,12 @@ | |||
1664 | 62 | try: | 63 | try: |
1665 | 63 | projects_list = cfg['stack']['projects'] | 64 | projects_list = cfg['stack']['projects'] |
1666 | 64 | except (TypeError, KeyError): | 65 | except (TypeError, KeyError): |
1668 | 65 | logging.warning("{} seems broken in not having stack or projects keys".format(file_path)) | 66 | tmpl = "{} seems broken in not having stack or projects keys" |
1669 | 67 | logging.warning(tmpl.format(file_path)) | ||
1670 | 66 | continue | 68 | continue |
1671 | 67 | if not projects_list: | 69 | if not projects_list: |
1673 | 68 | logging.warning("{} don't have any project list".format(file_path)) | 70 | tmpl = "{} don't have any project list" |
1674 | 71 | logging.warning(tmpl.format(file_path)) | ||
1675 | 69 | continue | 72 | continue |
1676 | 70 | for project in projects_list: | 73 | for project in projects_list: |
1677 | 71 | if isinstance(project, dict): | 74 | if isinstance(project, dict): |
1678 | @@ -74,11 +77,13 @@ | |||
1679 | 74 | projects.append(project) | 77 | projects.append(project) |
1680 | 75 | return set(projects) | 78 | return set(projects) |
1681 | 76 | 79 | ||
1682 | 80 | |||
1683 | 77 | def get_stack_packaging_change_status(source_version_list): | 81 | def get_stack_packaging_change_status(source_version_list): |
1684 | 78 | '''Return global package change status list | 82 | '''Return global package change status list |
1685 | 79 | 83 | ||
1686 | 80 | # FIXME: added too many infos now, should only be: (source, version) | 84 | # FIXME: added too many infos now, should only be: (source, version) |
1688 | 81 | source_version_list is a list of couples (source, version, tip_rev, target_branch)''' | 85 | source_version_list is a list of couples (source, version, tip_rev, |
1689 | 86 | target_branch)''' | ||
1690 | 82 | 87 | ||
1691 | 83 | packaging_change_status = [] | 88 | packaging_change_status = [] |
1692 | 84 | for (source, version, tip_rev, target_branch) in source_version_list: | 89 | for (source, version, tip_rev, target_branch) in source_version_list: |
1693 | 85 | 90 | ||
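The stacks.py hunk above imports `PIPE` directly so the `Popen` line fits in 79 columns. A runnable sketch of the same capture-and-check pattern, with a harmless `echo` standing in for the real rsync command:

```python
import subprocess
from subprocess import PIPE

# Stand-in command; the real code runs rsync with --remove-source-files.
cmd = ["echo", "hello"]
instance = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
(stdout, stderr) = instance.communicate()
# rsync treats 0 and 23 (partial transfer) as acceptable; mirror that check.
if instance.returncode not in (0, 23):
    raise Exception(stderr.decode("utf-8").strip())
```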
1694 | === modified file 'branch-source-builder/cupstream2distro/tools.py' | |||
1695 | --- branch-source-builder/cupstream2distro/tools.py 2014-02-19 16:34:46 +0000 | |||
1696 | +++ branch-source-builder/cupstream2distro/tools.py 2014-02-23 16:37:01 +0000 | |||
1697 | @@ -26,18 +26,21 @@ | |||
1698 | 26 | from .settings import PROJECT_CONFIG_SUFFIX | 26 | from .settings import PROJECT_CONFIG_SUFFIX |
1699 | 27 | from .utils import ignored | 27 | from .utils import ignored |
1700 | 28 | 28 | ||
1702 | 29 | WRAPPER_STRING = '''<testsuite errors="0" failures="{}" name="" tests="1" time="0.1"> | 29 | WRAPPER_STRING = '''<testsuite errors="0" failures="{}" name="" tests="1" |
1703 | 30 | time="0.1"> | ||
1704 | 30 | <testcase classname="MarkUnstable" name={} time="0.0">{}</testcase> | 31 | <testcase classname="MarkUnstable" name={} time="0.0">{}</testcase> |
1705 | 31 | </testsuite>''' | 32 | </testsuite>''' |
1706 | 32 | 33 | ||
1707 | 33 | 34 | ||
1708 | 34 | def generate_xml_artefacts(test_name, details, filename): | 35 | def generate_xml_artefacts(test_name, details, filename): |
1710 | 35 | '''Generate a fake test name xml result for marking the build as unstable''' | 36 | '''Generate a fake test name xml result for marking the build as |
1711 | 37 | unstable''' | ||
1712 | 36 | failure = "" | 38 | failure = "" |
1713 | 37 | errnum = 0 | 39 | errnum = 0 |
1714 | 38 | for detail in details: | 40 | for detail in details: |
1715 | 39 | errnum = 1 | 41 | errnum = 1 |
1717 | 40 | failure += ' <failure type="exception">{}</failure>\n'.format(escape(detail)) | 42 | ex = escape(detail) |
1718 | 43 | failure += ' <failure type="exception">{}</failure>\n'.format(ex) | ||
1719 | 41 | if failure: | 44 | if failure: |
1720 | 42 | failure = '\n{}'.format(failure) | 45 | failure = '\n{}'.format(failure) |
1721 | 43 | 46 | ||
1722 | @@ -52,7 +55,8 @@ | |||
1723 | 52 | return config.get('Package', 'dest_current_version') | 55 | return config.get('Package', 'dest_current_version') |
1724 | 53 | 56 | ||
1725 | 54 | 57 | ||
1727 | 55 | def save_project_config(source_package_name, branch, revision, dest_current_version, current_packaging_version): | 58 | def save_project_config(source_package_name, branch, revision, |
1728 | 59 | dest_current_version, current_packaging_version): | ||
1729 | 56 | '''Save branch and package configuration''' | 60 | '''Save branch and package configuration''' |
1730 | 57 | config = ConfigParser.RawConfigParser() | 61 | config = ConfigParser.RawConfigParser() |
1731 | 58 | config.add_section('Branch') | 62 | config.add_section('Branch') |
1732 | @@ -61,21 +65,27 @@ | |||
1733 | 61 | config.add_section('Package') | 65 | config.add_section('Package') |
1734 | 62 | config.set('Package', 'dest_current_version', dest_current_version) | 66 | config.set('Package', 'dest_current_version', dest_current_version) |
1735 | 63 | config.set('Package', 'packaging_version', current_packaging_version) | 67 | config.set('Package', 'packaging_version', current_packaging_version) |
1737 | 64 | with open("{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX), 'wb') as configfile: | 68 | path = "{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX) |
1738 | 69 | with open(path, 'wb') as configfile: | ||
1739 | 65 | config.write(configfile) | 70 | config.write(configfile) |
1740 | 66 | 71 | ||
1741 | 67 | 72 | ||
1742 | 68 | def get_packaging_diff_filename(source_package_name, packaging_version): | 73 | def get_packaging_diff_filename(source_package_name, packaging_version): |
1743 | 69 | '''Return the packaging diff filename''' | 74 | '''Return the packaging diff filename''' |
1744 | 70 | 75 | ||
1746 | 71 | return "packaging_changes_{}_{}.diff".format(source_package_name, packaging_version) | 76 | ret = "packaging_changes_{}_{}.diff" |
1747 | 77 | return ret.format(source_package_name, packaging_version) | ||
1748 | 72 | 78 | ||
1749 | 73 | 79 | ||
1750 | 74 | def mark_project_as_published(source_package_name, packaging_version): | 80 | def mark_project_as_published(source_package_name, packaging_version): |
1755 | 75 | '''Rename .project and eventual diff files so that if we do a partial rebuild, we don't try to republish them''' | 81 | '''Rename .project and eventual diff files so that if we do a partial |
1756 | 76 | project_filename = "{}.{}".format(source_package_name, PROJECT_CONFIG_SUFFIX) | 82 | rebuild, we don't try to republish them''' |
1757 | 77 | os.rename(project_filename, "{}_{}".format(project_filename, packaging_version)) | 83 | tmpl = "{}.{}" |
1758 | 78 | diff_filename = get_packaging_diff_filename(source_package_name, packaging_version) | 84 | project_filename = tmpl.format(source_package_name, PROJECT_CONFIG_SUFFIX) |
1759 | 85 | new_name = tmpl.format(project_filename, packaging_version) | ||
1760 | 86 | os.rename(project_filename, new_name) | ||
1761 | 87 | diff_filename = get_packaging_diff_filename(source_package_name, | ||
1762 | 88 | packaging_version) | ||
1763 | 79 | if os.path.isfile(diff_filename): | 89 | if os.path.isfile(diff_filename): |
1764 | 80 | os.rename(diff_filename, "{}.published".format(diff_filename)) | 90 | os.rename(diff_filename, "{}.published".format(diff_filename)) |
1765 | 81 | 91 | ||
1766 | 82 | 92 | ||
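The `save_project_config` / `get_current_version_for_series`-style helpers touched above persist branch and package metadata with `ConfigParser` (the original is Python 2, hence `'wb'`). A minimal Python 3 sketch of the same round-trip — the `[Branch]` keys shown here are assumptions, since that hunk is not in the diff:

```python
import configparser


def save_project_config(path, branch, revision, dest_version,
                        packaging_version):
    # Mirrors the shape of the helper in tools.py: one [Branch] and one
    # [Package] section written to a .project-style file.
    config = configparser.RawConfigParser()
    config.add_section('Branch')
    config.set('Branch', 'branch', branch)          # assumed key names
    config.set('Branch', 'revision', revision)
    config.add_section('Package')
    config.set('Package', 'dest_current_version', dest_version)
    config.set('Package', 'packaging_version', packaging_version)
    with open(path, 'w') as configfile:  # 'wb' in the Python 2 original
        config.write(configfile)


def get_dest_current_version(path):
    # Counterpart reader, as in the hunk returning
    # config.get('Package', 'dest_current_version').
    config = configparser.RawConfigParser()
    config.read(path)
    return config.get('Package', 'dest_current_version')
```

Pre-computing the path in a local (`path = "{}.{}".format(...)`) as the diff does is a common way to keep the `with open(...)` line under 79 characters without a backslash continuation.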
1767 | === modified file 'charms/precise/python-django/hooks/hooks.py' | |||
1768 | --- charms/precise/python-django/hooks/hooks.py 2014-02-18 20:11:41 +0000 | |||
1769 | +++ charms/precise/python-django/hooks/hooks.py 2014-02-23 16:37:01 +0000 | |||
1770 | @@ -562,7 +562,6 @@ | |||
1771 | 562 | 562 | ||
1772 | 563 | def config_changed(config_data): | 563 | def config_changed(config_data): |
1773 | 564 | os.environ['DJANGO_SETTINGS_MODULE'] = django_settings_modules | 564 | os.environ['DJANGO_SETTINGS_MODULE'] = django_settings_modules |
1774 | 565 | django_admin_cmd = find_django_admin_cmd() | ||
1775 | 566 | 565 | ||
1776 | 567 | site_secret_key = config_data['site_secret_key'] | 566 | site_secret_key = config_data['site_secret_key'] |
1777 | 568 | if not site_secret_key: | 567 | if not site_secret_key: |
1778 | @@ -610,7 +609,7 @@ | |||
1779 | 610 | else: | 609 | else: |
1780 | 611 | run('git pull %s %s' % (repos_url, vcs_clone_dir)) | 610 | run('git pull %s %s' % (repos_url, vcs_clone_dir)) |
1781 | 612 | elif vcs == 'bzr' or vcs == 'bazaar': | 611 | elif vcs == 'bzr' or vcs == 'bazaar': |
1783 | 613 | run('cd %s; bzr pull %s %s' % (vcs_clonse_dir, repos_url)) | 612 | run('cd %s; bzr pull %s %s' % (vcs_clone_dir, repos_url)) |
1784 | 614 | elif vcs == 'svn' or vcs == 'subversion': | 613 | elif vcs == 'svn' or vcs == 'subversion': |
1785 | 615 | run('svn up %s %s' % (repos_url, vcs_clone_dir)) | 614 | run('svn up %s %s' % (repos_url, vcs_clone_dir)) |
1786 | 616 | else: | 615 | else: |
1787 | 617 | 616 | ||
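The hooks.py hunk above fixes a genuine bug, not just style: `vcs_clonse_dir` is an undefined name, exactly the class of error wider pyflakes coverage is meant to catch before runtime. As a rough illustration of the idea, here is a crude, flat-scope undefined-name check built on the stdlib `ast` module — real pyflakes is scope-aware and far more thorough, so treat this purely as a sketch:

```python
import ast
import builtins


def undefined_names(source):
    """Flag loads of names that are never bound anywhere in the source.

    Collects every binding (assignments, function args, imports, function
    definitions) in one pass, then reports any Name load outside that set
    and the builtins. Unlike pyflakes, this ignores scoping entirely.
    """
    tree = ast.parse(source)
    bound = set(dir(builtins))
    for node in ast.walk(tree):
        if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
            bound.add(node.id)
        elif isinstance(node, ast.arg):
            bound.add(node.arg)
        elif isinstance(node, (ast.Import, ast.ImportFrom)):
            for alias in node.names:
                bound.add(alias.asname or alias.name.split('.')[0])
        elif isinstance(node, (ast.FunctionDef, ast.ClassDef)):
            bound.add(node.name)
    return {node.id for node in ast.walk(tree)
            if isinstance(node, ast.Name)
            and isinstance(node.ctx, ast.Load)
            and node.id not in bound}
```

Run against a snippet shaped like the buggy hunk, it flags `vcs_clonse_dir` while leaving the correctly spelled parameter alone.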
1788 | === modified file 'image-builder/imagebuilder/tests/test_style.py' | |||
1789 | --- image-builder/imagebuilder/tests/test_style.py 2014-02-15 12:06:40 +0000 | |||
1790 | +++ image-builder/imagebuilder/tests/test_style.py 2014-02-23 16:37:01 +0000 | |||
1791 | @@ -14,15 +14,19 @@ | |||
1792 | 14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | 14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
1793 | 15 | 15 | ||
1794 | 16 | from ucitests import styles | 16 | from ucitests import styles |
1797 | 17 | 17 | # If we just import run_worker, we'll end up testing whichever run_worker was | |
1798 | 18 | import imagebuilder | 18 | # last imported, so import it under a different name. |
1799 | 19 | import imagebuilder_run_worker | ||
1800 | 19 | 20 | ||
1801 | 20 | 21 | ||
1802 | 21 | class TestPep8(styles.TestPep8): | 22 | class TestPep8(styles.TestPep8): |
1803 | 22 | 23 | ||
1805 | 23 | packages = [imagebuilder] | 24 | # uci-tests will scan all subdirectories that this module is found in, so |
1806 | 25 | # we do not need to explicitly provide imagebuilder. Doing so would cause | ||
1807 | 26 | # the tests to be run twice. | ||
1808 | 27 | packages = [imagebuilder_run_worker] | ||
1809 | 24 | 28 | ||
1810 | 25 | 29 | ||
1811 | 26 | class TestPyflakes(styles.TestPyflakes): | 30 | class TestPyflakes(styles.TestPyflakes): |
1812 | 27 | 31 | ||
1814 | 28 | packages = [imagebuilder] | 32 | packages = [imagebuilder_run_worker] |
1815 | 29 | 33 | ||
1816 | === added symlink 'image-builder/imagebuilder_run_worker.py' | |||
1817 | === target is u'run_worker' | |||
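The symlink trick above (each component gets a uniquely named `*_run_worker.py` pointing at its extensionless `run_worker` script) exists because a plain `import run_worker` caches one module under that name in `sys.modules`, so every component's style test would end up checking whichever `run_worker` was imported last. The collision, and the fix of loading each file under an explicit module name, can be sketched with `importlib` (the two worker files here are hypothetical stand-ins):

```python
import importlib.util
import os
import tempfile


def load_as(module_name, path):
    # Load a source file under an explicit, unique module name, so two
    # scripts sharing the filename run_worker.py cannot shadow each other.
    spec = importlib.util.spec_from_file_location(module_name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module


# Two hypothetical components, each with its own run_worker.py.
workdir = tempfile.mkdtemp()
for component, body in [('imagebuilder', 'COMPONENT = "image-builder"\n'),
                        ('lander', 'COMPONENT = "lander"\n')]:
    os.makedirs(os.path.join(workdir, component))
    with open(os.path.join(workdir, component, 'run_worker.py'), 'w') as f:
        f.write(body)

ib = load_as('imagebuilder_run_worker',
             os.path.join(workdir, 'imagebuilder', 'run_worker.py'))
ld = load_as('lander_run_worker',
             os.path.join(workdir, 'lander', 'run_worker.py'))
```

The symlinks also add the `.py` suffix the import machinery needs, since the real `run_worker` scripts have no extension.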
1818 | === modified file 'image-builder/run_worker' | |||
1819 | --- image-builder/run_worker 2014-01-28 13:50:43 +0000 | |||
1820 | +++ image-builder/run_worker 2014-02-23 16:37:01 +0000 | |||
1821 | @@ -50,7 +50,7 @@ | |||
1822 | 50 | if amqp_utils.progress_completed(trigger, {'image_id': image_id}): | 50 | if amqp_utils.progress_completed(trigger, {'image_id': image_id}): |
1823 | 51 | log.error( | 51 | log.error( |
1824 | 52 | 'Unable to notify progress-trigger completition of action') | 52 | 'Unable to notify progress-trigger completition of action') |
1826 | 53 | except Exception as e: | 53 | except Exception: |
1827 | 54 | type, val, tb = sys.exc_info() | 54 | type, val, tb = sys.exc_info() |
1828 | 55 | amqp_utils.progress_failed(trigger, {'message': val.message}) | 55 | amqp_utils.progress_failed(trigger, {'message': val.message}) |
1829 | 56 | log.exception('Image build failed:') | 56 | log.exception('Image build failed:') |
1830 | 57 | 57 | ||
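The change above drops the `as e` binding because the handler never uses `e`: it pulls the exception details out of `sys.exc_info()` instead, so the unused name is pure pyflakes noise. A minimal Python 3 sketch of that pattern (the original is Python 2, where `val.message` still existed):

```python
import sys


def handle():
    try:
        raise RuntimeError('image build failed')
    except Exception:  # no 'as e': the binding would go unused
        # Inside an except block, sys.exc_info() returns the
        # (type, value, traceback) triple of the active exception.
        etype, value, tb = sys.exc_info()
        return etype.__name__, str(value)
```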
1831 | === modified file 'lander/lander/tests/test_style.py' | |||
1832 | --- lander/lander/tests/test_style.py 2014-02-15 12:06:40 +0000 | |||
1833 | +++ lander/lander/tests/test_style.py 2014-02-23 16:37:01 +0000 | |||
1834 | @@ -14,15 +14,19 @@ | |||
1835 | 14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | 14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
1836 | 15 | 15 | ||
1837 | 16 | from ucitests import styles | 16 | from ucitests import styles |
1840 | 17 | 17 | # If we just import run_worker, we'll end up testing whichever run_worker was | |
1841 | 18 | import lander | 18 | # last imported, so import it under a different name. |
1842 | 19 | import lander_run_worker | ||
1843 | 19 | 20 | ||
1844 | 20 | 21 | ||
1845 | 21 | class TestPep8(styles.TestPep8): | 22 | class TestPep8(styles.TestPep8): |
1846 | 22 | 23 | ||
1848 | 23 | packages = [lander] | 24 | # uci-tests will scan all subdirectories that this module is found in, so |
1849 | 25 | # we do not need to explicitly provide lander. Doing so would cause the | ||
1850 | 26 | # tests to be run twice. | ||
1851 | 27 | packages = [lander_run_worker] | ||
1852 | 24 | 28 | ||
1853 | 25 | 29 | ||
1854 | 26 | class TestPyflakes(styles.TestPyflakes): | 30 | class TestPyflakes(styles.TestPyflakes): |
1855 | 27 | 31 | ||
1857 | 28 | packages = [lander] | 32 | packages = [lander_run_worker] |
1858 | 29 | 33 | ||
1859 | === added symlink 'lander/lander_run_worker.py' | |||
1860 | === target is u'run_worker' | |||
1861 | === modified file 'test_runner/tstrun/tests/test_style.py' | |||
1862 | --- test_runner/tstrun/tests/test_style.py 2014-02-15 12:06:40 +0000 | |||
1863 | +++ test_runner/tstrun/tests/test_style.py 2014-02-23 16:37:01 +0000 | |||
1864 | @@ -16,13 +16,19 @@ | |||
1865 | 16 | from ucitests import styles | 16 | from ucitests import styles |
1866 | 17 | 17 | ||
1867 | 18 | import tstrun | 18 | import tstrun |
1868 | 19 | # If we just import run_worker, we'll end up testing whichever run_worker was | ||
1869 | 20 | # last imported, so import it under a different name. | ||
1870 | 21 | import tstrun_run_worker | ||
1871 | 19 | 22 | ||
1872 | 20 | 23 | ||
1873 | 21 | class TestPep8(styles.TestPep8): | 24 | class TestPep8(styles.TestPep8): |
1874 | 22 | 25 | ||
1876 | 23 | packages = [tstrun] | 26 | # uci-tests will scan all subdirectories that this module is found in, so |
1877 | 27 | # we do not need to explicitly provide tstrun. Doing so would cause | ||
1878 | 28 | # the tests to be run twice. | ||
1879 | 29 | packages = [tstrun_run_worker] | ||
1880 | 24 | 30 | ||
1881 | 25 | 31 | ||
1882 | 26 | class TestPyflakes(styles.TestPyflakes): | 32 | class TestPyflakes(styles.TestPyflakes): |
1883 | 27 | 33 | ||
1885 | 28 | packages = [tstrun] | 34 | packages = [tstrun_run_worker] |
1886 | 29 | 35 | ||
1887 | === added symlink 'test_runner/tstrun_run_worker.py' | |||
1888 | === target is u'run_worker' |
FAILED: Continuous integration, rev:280
No commit message was specified in the merge proposal. Click on the following link and set the commit message (if you want a jenkins rebuild you need to trigger it yourself):
https://code.launchpad.net/~ev/ubuntu-ci-services-itself/wider-pyflakes-coverage/+merge/207801/+edit-commit-message
Executed test runs:
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/217/
Click here to trigger a rebuild:
http://s-jenkins.ubuntu-ci:8080/job/uci-engine-ci/217/rebuild