Merge lp:~jtv/launchpad/bug-659769 into lp:launchpad
Status: Superseded
Proposed branch: lp:~jtv/launchpad/bug-659769
Merge into: lp:launchpad
Diff against target: 1363 lines (+668/-571), 9 files modified:
- lib/lp/archivepublisher/scripts/publish_ftpmaster.py (+23/-8)
- lib/lp/archivepublisher/tests/test_publish_ftpmaster.py (+24/-0)
- lib/lp/codehosting/safe_open.py (+0/-263)
- lib/lp/codehosting/tests/test_safe_open.py (+0/-267)
- lib/lp/registry/interfaces/distroseries.py (+25/-11)
- lib/lp/registry/model/distroseries.py (+26/-21)
- lib/lp/soyuz/scripts/custom_uploads_copier.py (+149/-0)
- lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py (+420/-0)
- lib/lp/soyuz/tests/test_publishing.py (+1/-1)

To merge this branch: bzr merge lp:~jtv/launchpad/bug-659769
Related bugs:

Reviewer | Review Type | Date Requested | Status
---|---|---|---
Brad Crittenden (community) | code | | Needs Information

Review via email: mp+71173@code.launchpad.net
This proposal has been superseded by a proposal from 2011-08-15.
Commit message
Copy custom uploads into new distro release.
Description of the change
= Summary =
Every time the Ubuntu people (particularly Colin) set up a new release, they need to copy some of the archive's custom-upload files to the new release. Custom uploads are almost entirely unmanaged, so they would probably do this by copying directly in the filesystem. They have asked for an easier, more integrated way to get it done.
== Proposed fix ==
As far as I can make out, the only types of custom upload that need to be copied are for the Debian installer and the dist upgrader. The Rosetta translations are shared within the database, and presumably both kinds of translations for a package will soon be re-uploaded anyway. So I focused on the installers and upgraders.
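To make the decision concrete, here is an illustrative filter only; the constants are stand-ins, not Launchpad's real `PackageUploadCustomFormat` enum values or the copier's actual code:

```python
# Hypothetical illustration: of all custom-upload types, only the
# installer and upgrader uploads are worth carrying into a new series.
COPYABLE_FORMATS = frozenset({'debian-installer', 'dist-upgrader'})


def wants_copy(upload_format):
    """True for the custom-upload formats worth copying to a new series."""
    return upload_format in COPYABLE_FORMATS
```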
Because custom uploads are so lightly managed (little or no metadata is kept, and we'd have to parse tarballs from the Librarian just to see what files came out of which upload), it's hard to figure out exactly what should or should not be copied. Some of the files may be obsolete and if we copy them along, they'll be with us forever.
The uploads contain tarballs with names in just one of two formats: <package>_<version>_<architecture>.tar.gz, or the same with the _<architecture> part left out.
Rather than get into the full hairy detail of version parsing (some of the versions look a little ad hoc), I'm just copying the latest custom upload for each (upload type, <package>, <architecture>) combination. The <architecture> defaults to "all" because that's what seems to be meant when it is omitted.
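As a minimal sketch of that selection rule (hypothetical helper names and data shapes, not the actual CustomUploadsCopier implementation):

```python
import re

# Tarball names come in one of two shapes:
#   <package>_<version>_<architecture>.tar.gz
#   <package>_<version>.tar.gz          (architecture implied as "all")
NAME_PATTERN = re.compile(
    r'^(?P<package>[^_]+)_(?P<version>[^_]+)(?:_(?P<architecture>[^_]+))?'
    r'\.tar\.gz$')


def extract_key(filename):
    """Reduce an upload filename to its (package, architecture) key.

    Returns None for names that don't follow the scheme; such uploads
    are simply not copied.
    """
    match = NAME_PATTERN.match(filename)
    if match is None:
        return None
    return (match.group('package'), match.group('architecture') or 'all')


def pick_latest(uploads):
    """Keep only the newest upload per (upload type, package, architecture).

    `uploads` is an iterable of (upload_type, filename, timestamp) tuples,
    a stand-in for the real custom-upload records.
    """
    latest = {}
    for upload_type, filename, timestamp in uploads:
        key = extract_key(filename)
        if key is None:
            continue
        full_key = (upload_type,) + key
        if full_key not in latest or timestamp > latest[full_key][1]:
            latest[full_key] = (filename, timestamp)
    return [filename for filename, _ in latest.values()]
```

Uploads whose names don't follow the scheme are skipped rather than copied, and no version comparison is attempted beyond "most recent upload wins".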
As far as I've seen, the scheme I implemented here would actually work just fine for the other upload types. It will also work for derived distributions, with two limitations:
1. The custom uploads are copied only between consecutive releases of the same distribution, not across inheritance lines.
2. Uploads have to follow this naming scheme in order to be copied. This is something the package defines.
Supporting this for derived distributions could save their maintainers manual work and the need for system access. We may want to add cross-distro inheritance of custom uploads later, but then we'll have to solve the conflict-resolution problem: should we copy the previous series' latest installer, or the one from the series in the parent distro that we're deriving from? And what if there are multiple parent distros or releases?
== Pre-implementation notes ==
Discussed with various people, but alas not with Colin at the time of writing. I'm going to hold off on landing until he confirms that this will give him what he needs. Watch this space for updates.
== Implementation details ==
== Tests ==
{{{
./bin/test -vvc lp.soyuz -t test_publishing -t test_custom_
./bin/test -vvc lp.archivepubli
}}}
== Demo and Q/A ==
We'll have to do this hand in hand with the distro gurus. Not only to validate the result, but also to help set up the test scenario!
= Launchpad lint =
Checking for conflicts and issues in changed files.
Linting changed files:
lib/lp/
lib/lp/
lib/lp/
lib/lp/
lib/lp/
lib/lp/
lib/lp/
Jeroen T. Vermeulen (jtv) wrote:
No idea what safe_open is or how it got into this MP. I'll have to resubmit.
Preview Diff
=== modified file 'lib/lp/archivepublisher/scripts/publish_ftpmaster.py'
--- lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-08-09 10:30:43 +0000
+++ lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-08-11 10:31:20 +0000
@@ -27,6 +27,7 @@
     )
 from lp.services.utils import file_exists
 from lp.soyuz.enums import ArchivePurpose
+from lp.soyuz.scripts.custom_uploads_copier import CustomUploadsCopier
 from lp.soyuz.scripts.ftpmaster import LpQueryDistro
 from lp.soyuz.scripts.processaccepted import ProcessAccepted
 from lp.soyuz.scripts.publishdistro import PublishDistro
@@ -539,6 +540,25 @@
             self.recoverWorkingDists()
             raise
 
+    def prepareFreshSeries(self):
+        """If there are any new distroseries, prepare them for publishing.
+
+        :return: True if a series did indeed still need some preparation,
+            or False for the normal case.
+        """
+        have_fresh_series = False
+        for series in self.distribution.series:
+            suites_needing_indexes = self.listSuitesNeedingIndexes(series)
+            if len(suites_needing_indexes) != 0:
+                # This is a fresh series.
+                have_fresh_series = True
+                if series.previous_series is not None:
+                    CustomUploadsCopier(series).copy(series.previous_series)
+                for suite in suites_needing_indexes:
+                    self.createIndexes(suite)
+
+        return have_fresh_series
+
     def setUp(self):
         """Process options, and set up internal state."""
         self.processOptions()
@@ -550,14 +570,9 @@
         self.setUp()
         self.recoverWorkingDists()
 
-        for series in self.distribution.series:
-            suites_needing_indexes = self.listSuitesNeedingIndexes(series)
-            if len(suites_needing_indexes) > 0:
-                for suite in suites_needing_indexes:
-                    self.createIndexes(suite)
-                # Don't try to do too much in one run.  Leave the rest
-                # of the work for next time.
-                return
+        if self.prepareFreshSeries():
+            # We've done enough.  Leave some room for others.
+            return
 
         self.processAccepted()
         self.setUpDirs()
 
=== modified file 'lib/lp/archivepublisher/tests/test_publish_ftpmaster.py'
--- lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-08-03 06:24:53 +0000
+++ lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-08-11 10:31:20 +0000
@@ -44,6 +44,7 @@
 from lp.soyuz.enums import (
     ArchivePurpose,
     PackagePublishingStatus,
+    PackageUploadCustomFormat,
     )
 from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
 from lp.testing import (
@@ -980,6 +981,29 @@
         self.assertEqual([suite], kwargs['suites'])
         self.assertThat(kwargs['suites'][0], StartsWith(series.name))
 
+    def test_prepareFreshSeries_copies_custom_uploads(self):
+        distro = self.makeDistroWithPublishDirectory()
+        old_series = self.factory.makeDistroSeries(
+            distribution=distro, status=SeriesStatus.CURRENT)
+        new_series = self.factory.makeDistroSeries(
+            distribution=distro, previous_series=old_series,
+            status=SeriesStatus.FROZEN)
+        custom_upload = self.factory.makeCustomPackageUpload(
+            distroseries=old_series,
+            custom_type=PackageUploadCustomFormat.DEBIAN_INSTALLER,
+            filename='debian-installer-images_1.0-20110805_i386.tar.gz')
+        script = self.makeScript(distro)
+        script.createIndexes = FakeMethod()
+        script.setUp()
+        have_fresh_series = script.prepareFreshSeries()
+        self.assertTrue(have_fresh_series)
+        [copied_upload] = new_series.getPackageUploads(
+            name=u'debian-installer-images', exact_match=False)
+        [copied_custom] = copied_upload.customfiles
+        self.assertEqual(
+            custom_upload.customfiles[0].libraryfilealias.filename,
+            copied_custom.libraryfilealias.filename)
+
     def test_script_creates_indexes(self):
         # End-to-end test: the script creates indexes for distroseries
         # that need them.
 
=== removed file 'lib/lp/codehosting/safe_open.py'
--- lib/lp/codehosting/safe_open.py 2011-08-09 14:59:08 +0000
+++ lib/lp/codehosting/safe_open.py 1970-01-01 00:00:00 +0000
@@ -1,263 +0,0 @@
-# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
-# GNU Affero General Public License version 3 (see the file LICENSE).
-
-"""Safe branch opening."""
-
-__metaclass__ = type
-
-from bzrlib import urlutils
-from bzrlib.branch import Branch
-from bzrlib.bzrdir import BzrDir
-
-from lazr.uri import URI
-
-__all__ = [
-    'AcceptAnythingPolicy',
-    'BadUrl',
-    'BlacklistPolicy',
-    'BranchLoopError',
-    'BranchOpenPolicy',
-    'BranchReferenceForbidden',
-    'SafeBranchOpener',
-    'WhitelistPolicy',
-    'safe_open',
-    ]
-
-
-# TODO JelmerVernooij 2011-08-06: This module is generic enough to be
-# in bzrlib, and may be of use to others.
-
-
-class BadUrl(Exception):
-    """Tried to access a branch from a bad URL."""
-
-
-class BranchReferenceForbidden(Exception):
-    """Trying to mirror a branch reference and the branch type does not allow
-    references.
-    """
-
-
-class BranchLoopError(Exception):
-    """Encountered a branch cycle.
-
-    A URL may point to a branch reference or it may point to a stacked branch.
-    In either case, it's possible for there to be a cycle in these references,
-    and this exception is raised when we detect such a cycle.
-    """
-
-
-class BranchOpenPolicy:
-    """Policy on how to open branches.
-
-    In particular, a policy determines which branches are safe to open by
-    checking their URLs and deciding whether or not to follow branch
-    references.
-    """
-
-    def shouldFollowReferences(self):
-        """Whether we traverse references when mirroring.
-
-        Subclasses must override this method.
-
-        If we encounter a branch reference and this returns false, an error is
-        raised.
-
-        :returns: A boolean to indicate whether to follow a branch reference.
-        """
-        raise NotImplementedError(self.shouldFollowReferences)
-
-    def transformFallbackLocation(self, branch, url):
-        """Validate, maybe modify, 'url' to be used as a stacked-on location.
-
-        :param branch: The branch that is being opened.
-        :param url: The URL that the branch provides for its stacked-on
-            location.
-        :return: (new_url, check) where 'new_url' is the URL of the branch to
-            actually open and 'check' is true if 'new_url' needs to be
-            validated by checkAndFollowBranchReference.
-        """
-        raise NotImplementedError(self.transformFallbackLocation)
-
-    def checkOneURL(self, url):
-        """Check the safety of the source URL.
-
-        Subclasses must override this method.
-
-        :param url: The source URL to check.
-        :raise BadUrl: subclasses are expected to raise this or a subclass
-            when it finds a URL it deems to be unsafe.
-        """
-        raise NotImplementedError(self.checkOneURL)
-
-
-class BlacklistPolicy(BranchOpenPolicy):
-    """Branch policy that forbids certain URLs."""
-
-    def __init__(self, should_follow_references, unsafe_urls=None):
-        if unsafe_urls is None:
-            unsafe_urls = set()
-        self._unsafe_urls = unsafe_urls
-        self._should_follow_references = should_follow_references
-
-    def shouldFollowReferences(self):
-        return self._should_follow_references
-
-    def checkOneURL(self, url):
-        if url in self._unsafe_urls:
-            raise BadUrl(url)
-
-    def transformFallbackLocation(self, branch, url):
-        """See `BranchOpenPolicy.transformFallbackLocation`.
-
-        This class is not used for testing our smarter stacking features so we
-        just do the simplest thing: return the URL that would be used anyway
-        and don't check it.
-        """
-        return urlutils.join(branch.base, url), False
-
-
-class AcceptAnythingPolicy(BlacklistPolicy):
-    """Accept anything, to make testing easier."""
-
-    def __init__(self):
-        super(AcceptAnythingPolicy, self).__init__(True, set())
-
-
-class WhitelistPolicy(BranchOpenPolicy):
-    """Branch policy that only allows certain URLs."""
-
-    def __init__(self, should_follow_references, allowed_urls=None,
-                 check=False):
-        if allowed_urls is None:
-            allowed_urls = []
-        self.allowed_urls = set(url.rstrip('/') for url in allowed_urls)
-        self.check = check
-
-    def shouldFollowReferences(self):
-        return self._should_follow_references
-
-    def checkOneURL(self, url):
-        if url.rstrip('/') not in self.allowed_urls:
-            raise BadUrl(url)
-
-    def transformFallbackLocation(self, branch, url):
-        """See `BranchOpenPolicy.transformFallbackLocation`.
-
-        Here we return the URL that would be used anyway and optionally check
-        it.
-        """
-        return urlutils.join(branch.base, url), self.check
-
-
-class SafeBranchOpener(object):
-    """Safe branch opener.
-
-    The policy object is expected to have the following methods:
-    * checkOneURL
-    * shouldFollowReferences
-    * transformFallbackLocation
-    """
-
-    def __init__(self, policy):
-        self.policy = policy
-        self._seen_urls = set()
-
-    def checkAndFollowBranchReference(self, url):
-        """Check URL (and possibly the referenced URL) for safety.
-
-        This method checks that `url` passes the policy's `checkOneURL`
-        method, and if `url` refers to a branch reference, it checks whether
-        references are allowed and whether the reference's URL passes muster
-        also -- recursively, until a real branch is found.
-
-        :raise BranchLoopError: If the branch references form a loop.
-        :raise BranchReferenceForbidden: If this opener forbids branch
-            references.
-        """
-        while True:
-            if url in self._seen_urls:
-                raise BranchLoopError()
-            self._seen_urls.add(url)
-            self.policy.checkOneURL(url)
-            next_url = self.followReference(url)
-            if next_url is None:
-                return url
-            url = next_url
-            if not self.policy.shouldFollowReferences():
-                raise BranchReferenceForbidden(url)
-
-    def transformFallbackLocationHook(self, branch, url):
-        """Installed as the 'transform_fallback_location' Branch hook.
-
-        This method calls `transformFallbackLocation` on the policy object and
-        either returns the url it provides or passes it back to
-        checkAndFollowBranchReference.
-        """
-        new_url, check = self.policy.transformFallbackLocation(branch, url)
-        if check:
-            return self.checkAndFollowBranchReference(new_url)
-        else:
-            return new_url
-
-    def runWithTransformFallbackLocationHookInstalled(
-            self, callable, *args, **kw):
-        Branch.hooks.install_named_hook(
-            'transform_fallback_location', self.transformFallbackLocationHook,
-            'SafeBranchOpener.transformFallbackLocationHook')
-        try:
-            return callable(*args, **kw)
-        finally:
-            # XXX 2008-11-24 MichaelHudson, bug=301472: This is the hacky way
-            # to remove a hook.  The linked bug report asks for an API to do
-            # it.
-            Branch.hooks['transform_fallback_location'].remove(
-                self.transformFallbackLocationHook)
-            # We reset _seen_urls here to avoid multiple calls to open giving
-            # spurious loop exceptions.
-            self._seen_urls = set()
-
-    def followReference(self, url):
-        """Get the branch-reference value at the specified url.
-
-        This exists as a separate method only to be overriden in unit tests.
-        """
-        bzrdir = BzrDir.open(url)
-        return bzrdir.get_branch_reference()
-
-    def open(self, url):
-        """Open the Bazaar branch at url, first checking for safety.
-
-        What safety means is defined by a subclasses `followReference` and
-        `checkOneURL` methods.
-        """
-        url = self.checkAndFollowBranchReference(url)
-        return self.runWithTransformFallbackLocationHookInstalled(
-            Branch.open, url)
-
-
-class URLChecker(BranchOpenPolicy):
-    """Branch open policy that rejects URLs not on the given scheme."""
-
-    def __init__(self, allowed_scheme):
-        self.allowed_scheme = allowed_scheme
-
-    def shouldFollowReferences(self):
-        return True
-
-    def transformFallbackLocation(self, branch, url):
-        return urlutils.join(branch.base, url), True
-
-    def checkOneURL(self, url):
-        """Check that `url` is safe to open."""
-        if URI(url).scheme != self.allowed_scheme:
-            raise BadUrl(url)
-
-
-def safe_open(allowed_scheme, url):
-    """Open the branch at `url`, only accessing URLs on `allowed_scheme`.
-
-    :raises BadUrl: An attempt was made to open a URL that was not on
-        `allowed_scheme`.
-    """
-    return SafeBranchOpener(URLChecker(allowed_scheme)).open(url)
367 | === removed file 'lib/lp/codehosting/tests/test_safe_open.py' | |||
368 | --- lib/lp/codehosting/tests/test_safe_open.py 2011-08-09 15:05:18 +0000 | |||
369 | +++ lib/lp/codehosting/tests/test_safe_open.py 1970-01-01 00:00:00 +0000 | |||
370 | @@ -1,267 +0,0 @@ | |||
371 | 1 | # Copyright 2009-2011 Canonical Ltd. This software is licensed under the | ||
372 | 2 | # GNU Affero General Public License version 3 (see the file LICENSE). | ||
373 | 3 | |||
374 | 4 | """Tests for the safe branch open code.""" | ||
375 | 5 | |||
376 | 6 | |||
377 | 7 | __metaclass__ = type | ||
378 | 8 | |||
379 | 9 | from lazr.uri import URI | ||
380 | 10 | |||
381 | 11 | from lp.codehosting.safe_open import ( | ||
382 | 12 | BadUrl, | ||
383 | 13 | BlacklistPolicy, | ||
384 | 14 | BranchLoopError, | ||
385 | 15 | BranchReferenceForbidden, | ||
386 | 16 | SafeBranchOpener, | ||
387 | 17 | WhitelistPolicy, | ||
388 | 18 | safe_open, | ||
389 | 19 | ) | ||
390 | 20 | |||
391 | 21 | from lp.testing import TestCase | ||
392 | 22 | |||
393 | 23 | from bzrlib.branch import ( | ||
394 | 24 | Branch, | ||
395 | 25 | BzrBranchFormat7, | ||
396 | 26 | ) | ||
397 | 27 | from bzrlib.bzrdir import ( | ||
398 | 28 | BzrDirMetaFormat1, | ||
399 | 29 | ) | ||
400 | 30 | from bzrlib.repofmt.pack_repo import RepositoryFormatKnitPack1 | ||
401 | 31 | from bzrlib.tests import ( | ||
402 | 32 | TestCaseWithTransport, | ||
403 | 33 | ) | ||
404 | 34 | from bzrlib.transport import chroot | ||
405 | 35 | |||
406 | 36 | |||
407 | 37 | class TestSafeBranchOpenerCheckAndFollowBranchReference(TestCase): | ||
408 | 38 | """Unit tests for `SafeBranchOpener.checkAndFollowBranchReference`.""" | ||
409 | 39 | |||
410 | 40 | class StubbedSafeBranchOpener(SafeBranchOpener): | ||
411 | 41 | """SafeBranchOpener that provides canned answers. | ||
412 | 42 | |||
413 | 43 | We implement the methods we need to to be able to control all the | ||
414 | 44 | inputs to the `BranchMirrorer.checkSource` method, which is what is | ||
415 | 45 | being tested in this class. | ||
416 | 46 | """ | ||
417 | 47 | |||
418 | 48 | def __init__(self, references, policy): | ||
419 | 49 | parent_cls = TestSafeBranchOpenerCheckAndFollowBranchReference | ||
420 | 50 | super(parent_cls.StubbedSafeBranchOpener, self).__init__(policy) | ||
421 | 51 | self._reference_values = {} | ||
422 | 52 | for i in range(len(references) - 1): | ||
423 | 53 | self._reference_values[references[i]] = references[i+1] | ||
424 | 54 | self.follow_reference_calls = [] | ||
425 | 55 | |||
426 | 56 | def followReference(self, url): | ||
427 | 57 | self.follow_reference_calls.append(url) | ||
428 | 58 | return self._reference_values[url] | ||
429 | 59 | |||
430 | 60 | def makeBranchOpener(self, should_follow_references, references, | ||
431 | 61 | unsafe_urls=None): | ||
432 | 62 | policy = BlacklistPolicy(should_follow_references, unsafe_urls) | ||
433 | 63 | opener = self.StubbedSafeBranchOpener(references, policy) | ||
434 | 64 | return opener | ||
435 | 65 | |||
436 | 66 | def testCheckInitialURL(self): | ||
437 | 67 | # checkSource rejects all URLs that are not allowed. | ||
438 | 68 | opener = self.makeBranchOpener(None, [], set(['a'])) | ||
439 | 69 | self.assertRaises(BadUrl, opener.checkAndFollowBranchReference, 'a') | ||
440 | 70 | |||
441 | 71 | def testNotReference(self): | ||
442 | 72 | # When branch references are forbidden, checkAndFollowBranchReference | ||
443 | 73 | # does not raise on non-references. | ||
444 | 74 | opener = self.makeBranchOpener(False, ['a', None]) | ||
445 | 75 | self.assertEquals('a', opener.checkAndFollowBranchReference('a')) | ||
446 | 76 | self.assertEquals(['a'], opener.follow_reference_calls) | ||
447 | 77 | |||
448 | 78 | def testBranchReferenceForbidden(self): | ||
449 | 79 | # checkAndFollowBranchReference raises BranchReferenceForbidden if | ||
450 | 80 | # branch references are forbidden and the source URL points to a | ||
451 | 81 | # branch reference. | ||
452 | 82 | opener = self.makeBranchOpener(False, ['a', 'b']) | ||
453 | 83 | self.assertRaises( | ||
454 | 84 | BranchReferenceForbidden, | ||
455 | 85 | opener.checkAndFollowBranchReference, 'a') | ||
456 | 86 | self.assertEquals(['a'], opener.follow_reference_calls) | ||
457 | 87 | |||
458 | 88 | def testAllowedReference(self): | ||
459 | 89 | # checkAndFollowBranchReference does not raise if following references | ||
460 | 90 | # is allowed and the source URL points to a branch reference to a | ||
461 | 91 | # permitted location. | ||
462 | 92 | opener = self.makeBranchOpener(True, ['a', 'b', None]) | ||
463 | 93 | self.assertEquals('b', opener.checkAndFollowBranchReference('a')) | ||
464 | 94 | self.assertEquals(['a', 'b'], opener.follow_reference_calls) | ||
465 | 95 | |||
466 | 96 | def testCheckReferencedURLs(self): | ||
467 | 97 | # checkAndFollowBranchReference checks if the URL a reference points | ||
468 | 98 | # to is safe. | ||
469 | 99 | opener = self.makeBranchOpener( | ||
470 | 100 | True, ['a', 'b', None], unsafe_urls=set('b')) | ||
471 | 101 | self.assertRaises(BadUrl, opener.checkAndFollowBranchReference, 'a') | ||
472 | 102 | self.assertEquals(['a'], opener.follow_reference_calls) | ||
473 | 103 | |||
474 | 104 | def testSelfReferencingBranch(self): | ||
475 | 105 | # checkAndFollowBranchReference raises BranchReferenceLoopError if | ||
476 | 106 | # following references is allowed and the source url points to a | ||
477 | 107 | # self-referencing branch reference. | ||
478 | 108 | opener = self.makeBranchOpener(True, ['a', 'a']) | ||
479 | 109 | self.assertRaises( | ||
480 | 110 | BranchLoopError, opener.checkAndFollowBranchReference, 'a') | ||
481 | 111 | self.assertEquals(['a'], opener.follow_reference_calls) | ||
482 | 112 | |||
483 | 113 | def testBranchReferenceLoop(self): | ||
484 | 114 | # checkAndFollowBranchReference raises BranchReferenceLoopError if | ||
485 | 115 | # following references is allowed and the source url points to a loop | ||
486 | 116 | # of branch references. | ||
487 | 117 | references = ['a', 'b', 'a'] | ||
488 | 118 | opener = self.makeBranchOpener(True, references) | ||
489 | 119 | self.assertRaises( | ||
490 | 120 | BranchLoopError, opener.checkAndFollowBranchReference, 'a') | ||
491 | 121 | self.assertEquals(['a', 'b'], opener.follow_reference_calls) | ||
492 | 122 | |||
493 | 123 | |||
494 | 124 | class TestSafeBranchOpenerStacking(TestCaseWithTransport): | ||
495 | 125 | |||
496 | 126 | def makeBranchOpener(self, allowed_urls): | ||
497 | 127 | policy = WhitelistPolicy(True, allowed_urls, True) | ||
498 | 128 | return SafeBranchOpener(policy) | ||
499 | 129 | |||
500 | 130 | def makeBranch(self, path, branch_format, repository_format): | ||
501 | 131 | """Make a Bazaar branch at 'path' with the given formats.""" | ||
502 | 132 | bzrdir_format = BzrDirMetaFormat1() | ||
503 | 133 | bzrdir_format.set_branch_format(branch_format) | ||
504 | 134 | bzrdir = self.make_bzrdir(path, format=bzrdir_format) | ||
505 | 135 | repository_format.initialize(bzrdir) | ||
506 | 136 | return bzrdir.create_branch() | ||
507 | 137 | |||
508 | 138 | def testAllowedURL(self): | ||
509 | 139 | # checkSource does not raise an exception for branches stacked on | ||
510 | 140 | # branches with allowed URLs. | ||
511 | 141 | stacked_on_branch = self.make_branch('base-branch', format='1.6') | ||
512 | 142 | stacked_branch = self.make_branch('stacked-branch', format='1.6') | ||
513 | 143 | stacked_branch.set_stacked_on_url(stacked_on_branch.base) | ||
514 | 144 | opener = self.makeBranchOpener( | ||
515 | 145 | [stacked_branch.base, stacked_on_branch.base]) | ||
516 | 146 | # This doesn't raise an exception. | ||
517 | 147 | opener.open(stacked_branch.base) | ||
518 | 148 | |||
519 | 149 | def testUnstackableRepository(self): | ||
520 | 150 | # checkSource treats branches with UnstackableRepositoryFormats as | ||
521 | 151 | # being not stacked. | ||
522 | 152 | branch = self.makeBranch( | ||
523 | 153 | 'unstacked', BzrBranchFormat7(), RepositoryFormatKnitPack1()) | ||
524 | 154 | opener = self.makeBranchOpener([branch.base]) | ||
525 | 155 | # This doesn't raise an exception. | ||
526 | 156 | opener.open(branch.base) | ||
527 | 157 | |||
528 | 158 | def testAllowedRelativeURL(self): | ||
529 | 159 | # checkSource passes on absolute urls to checkOneURL, even if the | ||
530 | 160 | # value of stacked_on_location in the config is set to a relative URL. | ||
531 | 161 | stacked_on_branch = self.make_branch('base-branch', format='1.6') | ||
532 | 162 | stacked_branch = self.make_branch('stacked-branch', format='1.6') | ||
533 | 163 | stacked_branch.set_stacked_on_url('../base-branch') | ||
534 | 164 | opener = self.makeBranchOpener( | ||
535 | 165 | [stacked_branch.base, stacked_on_branch.base]) | ||
536 | 166 | # Note that stacked_on_branch.base is not '../base-branch', it's an | ||
537 | 167 | # absolute URL. | ||
538 | 168 | self.assertNotEqual('../base-branch', stacked_on_branch.base) | ||
539 | 169 | # This doesn't raise an exception. | ||
540 | 170 | opener.open(stacked_branch.base) | ||
541 | 171 | |||
542 | 172 | def testAllowedRelativeNested(self): | ||
543 | 173 | # Relative URLs are resolved relative to the stacked branch. | ||
544 | 174 | self.get_transport().mkdir('subdir') | ||
545 | 175 | a = self.make_branch('subdir/a', format='1.6') | ||
546 | 176 | b = self.make_branch('b', format='1.6') | ||
547 | 177 | b.set_stacked_on_url('../subdir/a') | ||
548 | 178 | c = self.make_branch('subdir/c', format='1.6') | ||
549 | 179 | c.set_stacked_on_url('../../b') | ||
550 | 180 | opener = self.makeBranchOpener([c.base, b.base, a.base]) | ||
551 | 181 | # This doesn't raise an exception. | ||
552 | 182 | opener.open(c.base) | ||
553 | 183 | |||
554 | 184 | def testForbiddenURL(self): | ||
555 | 185 | # checkSource raises a BadUrl exception if a branch is stacked on a | ||
556 | 186 | # branch with a forbidden URL. | ||
557 | 187 | stacked_on_branch = self.make_branch('base-branch', format='1.6') | ||
558 | 188 | stacked_branch = self.make_branch('stacked-branch', format='1.6') | ||
559 | 189 | stacked_branch.set_stacked_on_url(stacked_on_branch.base) | ||
560 | 190 | opener = self.makeBranchOpener([stacked_branch.base]) | ||
561 | 191 | self.assertRaises(BadUrl, opener.open, stacked_branch.base) | ||
562 | 192 | |||
563 | 193 | def testForbiddenURLNested(self): | ||
564 | 194 | # checkSource raises a BadUrl exception if a branch is stacked on a | ||
565 | 195 | # branch that is in turn stacked on a branch with a forbidden URL. | ||
566 | 196 | a = self.make_branch('a', format='1.6') | ||
567 | 197 | b = self.make_branch('b', format='1.6') | ||
568 | 198 | b.set_stacked_on_url(a.base) | ||
569 | 199 | c = self.make_branch('c', format='1.6') | ||
570 | 200 | c.set_stacked_on_url(b.base) | ||
571 | 201 | opener = self.makeBranchOpener([c.base, b.base]) | ||
572 | 202 | self.assertRaises(BadUrl, opener.open, c.base) | ||
573 | 203 | |||
574 | 204 | def testSelfStackedBranch(self): | ||
575 | 205 | # checkSource raises BranchLoopError if a branch is stacked on | ||
576 | 206 | # itself. This avoids infinite recursion errors. | ||
577 | 207 | a = self.make_branch('a', format='1.6') | ||
578 | 208 | # Bazaar 1.17 and up make it harder to create branches like this. | ||
579 | 209 | # It's still worth testing that we don't blow up in the face of them, | ||
580 | 210 | # so we grovel around a bit to create one anyway. | ||
581 | 211 | a.get_config().set_user_option('stacked_on_location', a.base) | ||
582 | 212 | opener = self.makeBranchOpener([a.base]) | ||
583 | 213 | self.assertRaises(BranchLoopError, opener.open, a.base) | ||
584 | 214 | |||
585 | 215 | def testLoopStackedBranch(self): | ||
586 | 216 | # checkSource raises BranchLoopError if a branch is stacked in such | ||
587 | 217 | # a way so that it is ultimately stacked on itself. e.g. a stacked on | ||
588 | 218 | # b stacked on a. | ||
589 | 219 | a = self.make_branch('a', format='1.6') | ||
590 | 220 | b = self.make_branch('b', format='1.6') | ||
591 | 221 | a.set_stacked_on_url(b.base) | ||
592 | 222 | b.set_stacked_on_url(a.base) | ||
593 | 223 | opener = self.makeBranchOpener([a.base, b.base]) | ||
594 | 224 | self.assertRaises(BranchLoopError, opener.open, a.base) | ||
595 | 225 | self.assertRaises(BranchLoopError, opener.open, b.base) | ||
596 | 226 | |||
597 | 227 | |||
598 | 228 | class TestSafeOpen(TestCaseWithTransport): | ||
599 | 229 | """Tests for `safe_open`.""" | ||
600 | 230 | |||
601 | 231 | def get_chrooted_scheme(self, relpath): | ||
602 | 232 | """Create a server that is chrooted to `relpath`. | ||
603 | 233 | |||
604 | 234 | :return: ``(scheme, get_url)`` where ``scheme`` is the scheme of the | ||
605 | 235 | chroot server and ``get_url`` returns URLs on said server. | ||
606 | 236 | """ | ||
607 | 237 | transport = self.get_transport(relpath) | ||
608 | 238 | chroot_server = chroot.ChrootServer(transport) | ||
609 | 239 | chroot_server.start_server() | ||
610 | 240 | self.addCleanup(chroot_server.stop_server) | ||
611 | 241 | def get_url(relpath): | ||
612 | 242 | return chroot_server.get_url() + relpath | ||
613 | 243 | return URI(chroot_server.get_url()).scheme, get_url | ||
614 | 244 | |||
615 | 245 | def test_stacked_within_scheme(self): | ||
616 | 246 | # A branch that is stacked on a URL of the same scheme is safe to | ||
617 | 247 | # open. | ||
618 | 248 | self.get_transport().mkdir('inside') | ||
619 | 249 | self.make_branch('inside/stacked') | ||
620 | 250 | self.make_branch('inside/stacked-on') | ||
621 | 251 | scheme, get_chrooted_url = self.get_chrooted_scheme('inside') | ||
622 | 252 | Branch.open(get_chrooted_url('stacked')).set_stacked_on_url( | ||
623 | 253 | get_chrooted_url('stacked-on')) | ||
624 | 254 | safe_open(scheme, get_chrooted_url('stacked')) | ||
625 | 255 | |||
626 | 256 | def test_stacked_outside_scheme(self): | ||
627 | 257 | # A branch that is stacked on a URL that is not of the same scheme is | ||
628 | 258 | # not safe to open. | ||
629 | 259 | self.get_transport().mkdir('inside') | ||
630 | 260 | self.get_transport().mkdir('outside') | ||
631 | 261 | self.make_branch('inside/stacked') | ||
632 | 262 | self.make_branch('outside/stacked-on') | ||
633 | 263 | scheme, get_chrooted_url = self.get_chrooted_scheme('inside') | ||
634 | 264 | Branch.open(get_chrooted_url('stacked')).set_stacked_on_url( | ||
635 | 265 | self.get_url('outside/stacked-on')) | ||
636 | 266 | self.assertRaises( | ||
637 | 267 | BadUrl, safe_open, scheme, get_chrooted_url('stacked')) | ||
638 | 268 | 0 | ||
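The stacking tests above exercise two safety properties: stacked-on URLs must stay within the allowed scheme, and following stacked-on references must never loop. As a minimal sketch (not the actual safe_open implementation, and using a hypothetical `get_stacked_on` callback), loop detection amounts to walking the stacked-on chain with a seen-set:

```python
# Sketch of stacking-loop detection: follow each branch's stacked-on
# reference and fail as soon as a URL repeats. This mirrors what the
# testSelfStackedBranch and testLoopStackedBranch cases above expect.

class BranchLoopError(Exception):
    """Raised when a branch is (transitively) stacked on itself."""


def check_stacking(url, get_stacked_on):
    """Walk the stacked-on chain starting at `url`.

    `get_stacked_on` maps a branch URL to its stacked-on URL, or None
    if the branch is unstacked. Raises BranchLoopError on a cycle.
    """
    seen = set()
    while url is not None:
        if url in seen:
            raise BranchLoopError(url)
        seen.add(url)
        url = get_stacked_on(url)


# a stacked on b, b stacked on a: a loop, so opening either must fail.
chain = {'a': 'b', 'b': 'a'}
try:
    check_stacking('a', chain.get)
except BranchLoopError:
    pass  # loop detected, as the tests above require
```

A seen-set rather than a depth limit means legitimate deep stacking chains still open, while self-stacked and mutually stacked branches are rejected deterministically.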
639 | === modified file 'lib/lp/registry/interfaces/distroseries.py' | |||
640 | --- lib/lp/registry/interfaces/distroseries.py 2011-08-05 03:58:16 +0000 | |||
641 | +++ lib/lp/registry/interfaces/distroseries.py 2011-08-11 10:31:20 +0000 | |||
642 | @@ -786,20 +786,34 @@ | |||
643 | 786 | DistroSeriesBinaryPackage objects that match the given text. | 786 | DistroSeriesBinaryPackage objects that match the given text. |
644 | 787 | """ | 787 | """ |
645 | 788 | 788 | ||
647 | 789 | def createQueueEntry(pocket, archive, changesfilename, changesfilecontent, | 789 | def createQueueEntry(pocket, archive, changesfilename=None, |
648 | 790 | changesfilecontent=None, changes_file_alias=None, | ||
649 | 790 | signingkey=None, package_copy_job=None): | 791 | signingkey=None, package_copy_job=None): |
650 | 791 | """Create a queue item attached to this distroseries. | 792 | """Create a queue item attached to this distroseries. |
651 | 792 | 793 | ||
662 | 793 | Create a new records respecting the given pocket and archive. | 794 | Create a new `PackageUpload` to the given pocket and archive. |
663 | 794 | 795 | ||
664 | 795 | The default state is NEW, sorted sqlobject declaration, any | 796 | The default state is NEW. Any further state changes go through |
665 | 796 | modification should be performed via Queue state-machine. | 797 | the Queue state-machine. |
666 | 797 | 798 | ||
667 | 798 | The changesfile argument should be the text of the .changes for this | 799 | :param pocket: The `PackagePublishingPocket` to upload to. |
668 | 799 | upload. The contents of this may be used later. | 800 | :param archive: The `Archive` to upload to. Must be for the same |
669 | 800 | 801 | `Distribution` as this series. | |
670 | 801 | 'signingkey' is the IGPGKey used to sign the changesfile or None if | 802 | :param changesfilename: Name for the upload's .changes file. You may |
671 | 802 | the changesfile is unsigned. | 803 | specify a changes file by passing both `changesfilename` and |
672 | 804 | `changesfilecontent`, or by passing `changes_file_alias`. | ||
673 | 805 | :param changesfilecontent: Text for the changes file. It will be | ||
674 | 806 | signed and stored in the Librarian. Must be passed together with | ||
675 | 807 | `changesfilename`; alternatively, you may provide a | ||
676 | 808 | `changes_file_alias` to replace both of these. | ||
677 | 809 | :param changes_file_alias: A `LibraryFileAlias` containing the | ||
678 | 810 | .changes file. Security warning: unless the file has already | ||
679 | 811 | been checked, this may open us up to replay attacks as per bugs | ||
680 | 812 | 159304 and 451396. Use `changes_file_alias` only if you know | ||
681 | 813 | this can't happen. | ||
682 | 814 | :param signingkey: `IGPGKey` used to sign the changes file, or None if | ||
683 | 815 | it is unsigned. | ||
684 | 816 | :return: A new `PackageUpload`. | ||
685 | 803 | """ | 817 | """ |
686 | 804 | 818 | ||
687 | 805 | def newArch(architecturetag, processorfamily, official, owner, | 819 | def newArch(architecturetag, processorfamily, official, owner, |
688 | 806 | 820 | ||
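The docstring above describes two mutually exclusive ways to supply a changes file (a name/content pair, or a pre-existing Librarian alias), with neither required when a package copy job is present. Restated as a standalone function, following the checks the model code in this branch performs:

```python
# Parameter rules for createQueueEntry, extracted for illustration:
#  - changesfilename and changesfilecontent must be passed together;
#  - changes_file_alias replaces that pair and conflicts with it;
#  - without a package_copy_job, some form of changes file is required.

def validate_changes_args(changesfilename=None, changesfilecontent=None,
                          changes_file_alias=None, package_copy_job=None):
    if (changesfilename is None) != (changesfilecontent is None):
        raise AssertionError(
            "Pass both changesfilename and changesfilecontent, or neither.")
    if changes_file_alias is not None and changesfilename is not None:
        raise AssertionError(
            "Both changesfilename and changes_file_alias were given.")
    have_changes_file = (
        changesfilename is not None or changes_file_alias is not None)
    if package_copy_job is None and not have_changes_file:
        raise AssertionError(
            "A changes file is required when there is no package_copy_job.")
</test>

Expressing the first rule as an inequality of `is None` tests is a compact way to say "both or neither", and keeps the error cases from silently half-succeeding.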
689 | === modified file 'lib/lp/registry/model/distroseries.py' | |||
690 | --- lib/lp/registry/model/distroseries.py 2011-08-05 03:58:16 +0000 | |||
691 | +++ lib/lp/registry/model/distroseries.py 2011-08-11 10:31:20 +0000 | |||
692 | @@ -86,9 +86,7 @@ | |||
693 | 86 | ISeriesBugTarget, | 86 | ISeriesBugTarget, |
694 | 87 | ) | 87 | ) |
695 | 88 | from lp.bugs.interfaces.bugtaskfilter import OrderedBugTask | 88 | from lp.bugs.interfaces.bugtaskfilter import OrderedBugTask |
699 | 89 | from lp.bugs.model.bug import ( | 89 | from lp.bugs.model.bug import get_bug_tags |
697 | 90 | get_bug_tags, | ||
698 | 91 | ) | ||
700 | 92 | from lp.bugs.model.bugtarget import ( | 90 | from lp.bugs.model.bugtarget import ( |
701 | 93 | BugTargetBase, | 91 | BugTargetBase, |
702 | 94 | HasBugHeatMixin, | 92 | HasBugHeatMixin, |
703 | @@ -1587,25 +1585,34 @@ | |||
704 | 1587 | get_property_cache(spph).newer_distroseries_version = version | 1585 | get_property_cache(spph).newer_distroseries_version = version |
705 | 1588 | 1586 | ||
706 | 1589 | def createQueueEntry(self, pocket, archive, changesfilename=None, | 1587 | def createQueueEntry(self, pocket, archive, changesfilename=None, |
709 | 1590 | changesfilecontent=None, signing_key=None, | 1588 | changesfilecontent=None, changes_file_alias=None, |
710 | 1591 | package_copy_job=None): | 1589 | signing_key=None, package_copy_job=None): |
711 | 1592 | """See `IDistroSeries`.""" | 1590 | """See `IDistroSeries`.""" |
723 | 1593 | # We store the changes file in the librarian to avoid having to | 1591 | if (changesfilename is None) != (changesfilecontent is None): |
724 | 1594 | # deal with broken encodings in these files; this will allow us | 1592 | raise AssertionError( |
725 | 1595 | # to regenerate these files as necessary. | 1593 | "Inconsistent changesfilename and changesfilecontent. " |
726 | 1596 | # | 1594 | "Pass either both, or neither.") |
727 | 1597 | # The use of StringIO here should be safe: we do not encoding of | 1595 | if changes_file_alias is not None and changesfilename is not None: |
728 | 1598 | # the content in the changes file (as doing so would be guessing | 1596 | raise AssertionError( |
729 | 1599 | # at best, causing unpredictable corruption), and simply pass it | 1597 | "Conflicting options: " |
730 | 1600 | # off to the librarian. | 1598 | "Both changesfilename and changes_file_alias were given.") |
731 | 1601 | 1599 | have_changes_file = not ( | |
732 | 1602 | if package_copy_job is None and ( | 1600 | changesfilename is None and changes_file_alias is None) |
733 | 1603 | changesfilename is None or changesfilecontent is None): | 1601 | if package_copy_job is None and not have_changes_file: |
734 | 1604 | raise AssertionError( | 1602 | raise AssertionError( |
735 | 1605 | "changesfilename and changesfilecontent must be supplied " | 1603 | "changesfilename and changesfilecontent must be supplied " |
736 | 1606 | "if there is no package_copy_job") | 1604 | "if there is no package_copy_job") |
737 | 1607 | 1605 | ||
739 | 1608 | if package_copy_job is None: | 1606 | if changesfilename is not None: |
740 | 1607 | # We store the changes file in the librarian to avoid having to | ||
741 | 1608 | # deal with broken encodings in these files; this will allow us | ||
742 | 1609 | # to regenerate these files as necessary. | ||
743 | 1610 | # | ||
744 | 1611 | # The use of StringIO here should be safe: we do no encoding of | ||
745 | 1612 | # the content in the changes file (as doing so would be guessing | ||
746 | 1613 | # at best, causing unpredictable corruption), and simply pass it | ||
747 | 1614 | # off to the librarian. | ||
748 | 1615 | |||
749 | 1609 | # The PGP signature is stripped from all changesfiles | 1616 | # The PGP signature is stripped from all changesfiles |
750 | 1610 | # to avoid replay attacks (see bugs 159304 and 451396). | 1617 | # to avoid replay attacks (see bugs 159304 and 451396). |
751 | 1611 | signed_message = signed_message_from_string(changesfilecontent) | 1618 | signed_message = signed_message_from_string(changesfilecontent) |
752 | @@ -1616,17 +1623,15 @@ | |||
753 | 1616 | if new_content is not None: | 1623 | if new_content is not None: |
754 | 1617 | changesfilecontent = signed_message.signedContent | 1624 | changesfilecontent = signed_message.signedContent |
755 | 1618 | 1625 | ||
757 | 1619 | changes_file = getUtility(ILibraryFileAliasSet).create( | 1626 | changes_file_alias = getUtility(ILibraryFileAliasSet).create( |
758 | 1620 | changesfilename, len(changesfilecontent), | 1627 | changesfilename, len(changesfilecontent), |
759 | 1621 | StringIO(changesfilecontent), 'text/plain', | 1628 | StringIO(changesfilecontent), 'text/plain', |
760 | 1622 | restricted=archive.private) | 1629 | restricted=archive.private) |
761 | 1623 | else: | ||
762 | 1624 | changes_file = None | ||
763 | 1625 | 1630 | ||
764 | 1626 | return PackageUpload( | 1631 | return PackageUpload( |
765 | 1627 | distroseries=self, status=PackageUploadStatus.NEW, | 1632 | distroseries=self, status=PackageUploadStatus.NEW, |
766 | 1628 | pocket=pocket, archive=archive, | 1633 | pocket=pocket, archive=archive, |
768 | 1629 | changesfile=changes_file, signing_key=signing_key, | 1634 | changesfile=changes_file_alias, signing_key=signing_key, |
769 | 1630 | package_copy_job=package_copy_job) | 1635 | package_copy_job=package_copy_job) |
770 | 1631 | 1636 | ||
771 | 1632 | def getPackageUploadQueue(self, state): | 1637 | def getPackageUploadQueue(self, state): |
772 | 1633 | 1638 | ||
773 | === added file 'lib/lp/soyuz/scripts/custom_uploads_copier.py' | |||
774 | --- lib/lp/soyuz/scripts/custom_uploads_copier.py 1970-01-01 00:00:00 +0000 | |||
775 | +++ lib/lp/soyuz/scripts/custom_uploads_copier.py 2011-08-11 10:31:20 +0000 | |||
776 | @@ -0,0 +1,149 @@ | |||
777 | 1 | # Copyright 2011 Canonical Ltd. This software is licensed under the | ||
778 | 2 | # GNU Affero General Public License version 3 (see the file LICENSE). | ||
779 | 3 | |||
780 | 4 | """Copy latest custom uploads into a distribution release series. | ||
781 | 5 | |||
782 | 6 | Use this when initializing the installer and dist upgrader for a new release | ||
783 | 7 | series based on the latest uploads from its preceding series. | ||
784 | 8 | """ | ||
785 | 9 | |||
786 | 10 | __metaclass__ = type | ||
787 | 11 | __all__ = [ | ||
788 | 12 | 'CustomUploadsCopier', | ||
789 | 13 | ] | ||
790 | 14 | |||
791 | 15 | from operator import attrgetter | ||
792 | 16 | import re | ||
793 | 17 | |||
794 | 18 | from zope.component import getUtility | ||
795 | 19 | |||
796 | 20 | #from canonical.launchpad.database.librarian import LibraryFileAlias | ||
797 | 21 | from lp.services.database.bulk import load_referencing | ||
798 | 22 | from lp.soyuz.enums import PackageUploadCustomFormat | ||
799 | 23 | from lp.soyuz.interfaces.archive import ( | ||
800 | 24 | IArchiveSet, | ||
801 | 25 | MAIN_ARCHIVE_PURPOSES, | ||
802 | 26 | ) | ||
803 | 27 | from lp.soyuz.model.queue import PackageUploadCustom | ||
804 | 28 | |||
805 | 29 | |||
806 | 30 | class CustomUploadsCopier: | ||
807 | 31 | """Copy `PackageUploadCustom` objects into a new `DistroSeries`.""" | ||
808 | 32 | |||
809 | 33 | copyable_types = [ | ||
810 | 34 | PackageUploadCustomFormat.DEBIAN_INSTALLER, | ||
811 | 35 | PackageUploadCustomFormat.DIST_UPGRADER, | ||
812 | 36 | ] | ||
813 | 37 | |||
814 | 38 | def __init__(self, target_series): | ||
815 | 39 | self.target_series = target_series | ||
816 | 40 | |||
817 | 41 | def isCopyable(self, upload): | ||
818 | 42 | """Is `upload` the kind of `PackageUploadCustom` that we can copy?""" | ||
819 | 43 | return upload.customformat in self.copyable_types | ||
820 | 44 | |||
821 | 45 | def getCandidateUploads(self, source_series): | ||
822 | 46 | """Find custom uploads that may need copying.""" | ||
823 | 47 | uploads = source_series.getPackageUploads( | ||
824 | 48 | custom_type=self.copyable_types) | ||
825 | 49 | load_referencing(PackageUploadCustom, uploads, ['packageuploadID']) | ||
826 | 50 | customs = sum([list(upload.customfiles) for upload in uploads], []) | ||
827 | 51 | customs = filter(self.isCopyable, customs) | ||
828 | 52 | customs.sort(key=attrgetter('id'), reverse=True) | ||
829 | 53 | return customs | ||
830 | 54 | |||
831 | 55 | def extractNameFields(self, filename): | ||
832 | 56 | """Get the relevant fields out of `filename`. | ||
833 | 57 | |||
834 | 58 | Scans filenames of any of these forms: | ||
835 | 59 | |||
836 | 60 | <package>_<version>_<architecture>.tar.<compression_suffix> | ||
837 | 61 | <package>_<version>.tar[.<compression_suffix>] | ||
838 | 62 | |||
839 | 63 | Versions may contain dots, dashes etc. but no underscores. | ||
840 | 64 | |||
841 | 65 | :return: A tuple of (<package>, <architecture>), or None if the | ||
842 | 66 | filename does not match the expected pattern. If no | ||
843 | 67 | architecture is found in the filename, it defaults to 'all'. | ||
844 | 68 | """ | ||
845 | 69 | regex_parts = { | ||
846 | 70 | 'package': "[^_]+", | ||
847 | 71 | 'version': "[^_]+", | ||
848 | 72 | 'arch': "[^._]+", | ||
849 | 73 | } | ||
850 | 74 | filename_regex = ( | ||
851 | 75 | "(%(package)s)_%(version)s(?:_(%(arch)s))?.tar" % regex_parts) | ||
852 | 76 | match = re.match(filename_regex, filename) | ||
853 | 77 | if match is None: | ||
854 | 78 | return None | ||
855 | 79 | default_arch = 'all' | ||
856 | 80 | fields = match.groups(default_arch) | ||
857 | 81 | if len(fields) != 2: | ||
858 | 82 | return None | ||
859 | 83 | return fields | ||
860 | 84 | |||
861 | 85 | def getKey(self, upload): | ||
862 | 86 | """Get an indexing key for `upload`.""" | ||
863 | 87 | custom_format = (upload.customformat, ) | ||
864 | 88 | name_fields = self.extractNameFields(upload.libraryfilealias.filename) | ||
865 | 89 | if name_fields is None: | ||
866 | 90 | return None | ||
867 | 91 | else: | ||
868 | 92 | return custom_format + name_fields | ||
869 | 93 | |||
870 | 94 | def getLatestUploads(self, source_series): | ||
871 | 95 | """Find the latest uploads. | ||
872 | 96 | |||
873 | 97 | :param source_series: The `DistroSeries` whose uploads to get. | ||
874 | 98 | :return: A dict containing the latest uploads, indexed by keys as | ||
875 | 99 | returned by `getKey`. | ||
876 | 100 | """ | ||
877 | 101 | latest_uploads = {} | ||
878 | 102 | for upload in self.getCandidateUploads(source_series): | ||
879 | 103 | key = self.getKey(upload) | ||
880 | 104 | if key is not None: | ||
881 | 105 | latest_uploads.setdefault(key, upload) | ||
882 | 106 | return latest_uploads | ||
883 | 107 | |||
884 | 108 | def getTargetArchive(self, original_archive): | ||
885 | 109 | """Find counterpart of `original_archive` in `self.target_series`. | ||
886 | 110 | |||
887 | 111 | :param original_archive: The `Archive` that the original upload went | ||
888 | 112 | into. If this is not a primary, partner, or debug archive, | ||
889 | 113 | None is returned. | ||
890 | 114 | :return: The `Archive` of the same purpose for `self.target_series`. | ||
891 | 115 | """ | ||
892 | 116 | if original_archive.purpose not in MAIN_ARCHIVE_PURPOSES: | ||
893 | 117 | return None | ||
894 | 118 | return getUtility(IArchiveSet).getByDistroPurpose( | ||
895 | 119 | self.target_series.distribution, original_archive.purpose) | ||
896 | 120 | |||
897 | 121 | def isObsolete(self, upload, target_uploads): | ||
898 | 122 | """Is `upload` superseded by one that the target series already has? | ||
899 | 123 | |||
900 | 124 | :param upload: A `PackageUploadCustom` from the source series. | ||
901 | 125 | :param target_uploads: Latest target uploads from `getLatestUploads`. | ||
902 | 126 | """ | ||
903 | 127 | existing_upload = target_uploads.get(self.getKey(upload)) | ||
904 | 128 | return existing_upload is not None and existing_upload.id >= upload.id | ||
905 | 129 | |||
906 | 130 | def copyUpload(self, original_upload): | ||
907 | 131 | """Copy `original_upload` into `self.target_series`.""" | ||
908 | 132 | target_archive = self.getTargetArchive( | ||
909 | 133 | original_upload.packageupload.archive) | ||
910 | 134 | if target_archive is None: | ||
911 | 135 | return None | ||
912 | 136 | package_upload = self.target_series.createQueueEntry( | ||
913 | 137 | original_upload.packageupload.pocket, target_archive, | ||
914 | 138 | changes_file_alias=original_upload.packageupload.changesfile) | ||
915 | 139 | custom = package_upload.addCustom( | ||
916 | 140 | original_upload.libraryfilealias, original_upload.customformat) | ||
917 | 141 | package_upload.setAccepted() | ||
918 | 142 | return custom | ||
919 | 143 | |||
920 | 144 | def copy(self, source_series): | ||
921 | 145 | """Copy uploads from `source_series`.""" | ||
922 | 146 | target_uploads = self.getLatestUploads(self.target_series) | ||
923 | 147 | for upload in self.getLatestUploads(source_series).itervalues(): | ||
924 | 148 | if not self.isObsolete(upload, target_uploads): | ||
925 | 149 | self.copyUpload(upload) | ||
926 | 0 | 150 | ||
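The filename convention that `extractNameFields` relies on can be restated outside the class. This is a close standalone rendering of the regex built in the new file above (with the trailing dot escaped; the arch default of 'all' matches `match.groups(default_arch)` in the original):

```python
import re

# Package and version are underscore-free fields; the architecture is
# optional and defaults to 'all' when absent. Only (package, arch) are
# captured, as in CustomUploadsCopier.extractNameFields.
FILENAME_REGEX = re.compile(r"([^_]+)_[^_]+(?:_([^._]+))?\.tar")


def extract_name_fields(filename):
    """Return (package, architecture) for `filename`, or None."""
    match = FILENAME_REGEX.match(filename)
    if match is None:
        return None
    package, arch = match.groups()
    return (package, arch if arch is not None else 'all')


extract_name_fields('debian-installer_20101020ubuntu1_i386.tar.gz')
# -> ('debian-installer', 'i386')
extract_name_fields('dist-upgrader_1.2.3.tar.gz')
# -> ('dist-upgrader', 'all')
extract_name_fields('argh_1.0.jpg')
# -> None
```

Because the version field cannot contain underscores, a name with extra underscore-separated fields (the nasty case the tests below probe) simply fails to match and yields None.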
927 | === added file 'lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py' | |||
928 | --- lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py 1970-01-01 00:00:00 +0000 | |||
929 | +++ lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py 2011-08-11 10:31:20 +0000 | |||
930 | @@ -0,0 +1,420 @@ | |||
931 | 1 | # Copyright 2011 Canonical Ltd. This software is licensed under the | ||
932 | 2 | # GNU Affero General Public License version 3 (see the file LICENSE). | ||
933 | 3 | |||
934 | 4 | """Test copying of custom package uploads for a new `DistroSeries`.""" | ||
935 | 5 | |||
936 | 6 | __metaclass__ = type | ||
937 | 7 | |||
938 | 8 | from canonical.testing.layers import ( | ||
939 | 9 | LaunchpadZopelessLayer, | ||
940 | 10 | ZopelessLayer, | ||
941 | 11 | ) | ||
942 | 12 | from lp.soyuz.enums import ( | ||
943 | 13 | ArchivePurpose, | ||
944 | 14 | PackageUploadCustomFormat, | ||
945 | 15 | PackageUploadStatus, | ||
946 | 16 | ) | ||
947 | 17 | from lp.soyuz.interfaces.archive import MAIN_ARCHIVE_PURPOSES | ||
948 | 18 | from lp.soyuz.scripts.custom_uploads_copier import CustomUploadsCopier | ||
949 | 19 | from lp.testing import TestCaseWithFactory | ||
950 | 20 | from lp.testing.fakemethod import FakeMethod | ||
951 | 21 | |||
952 | 22 | |||
953 | 23 | def list_custom_uploads(distroseries): | ||
954 | 24 | """Return a list of all `PackageUploadCustom`s for `distroseries`.""" | ||
955 | 25 | return sum( | ||
956 | 26 | [ | ||
957 | 27 | list(upload.customfiles) | ||
958 | 28 | for upload in distroseries.getPackageUploads()], | ||
959 | 29 | []) | ||
960 | 30 | |||
961 | 31 | |||
962 | 32 | class FakeDistroSeries: | ||
963 | 33 | """Fake `DistroSeries` for test copiers that don't really need one.""" | ||
964 | 34 | |||
965 | 35 | |||
966 | 36 | class FakeLibraryFileAlias: | ||
967 | 37 | def __init__(self, filename): | ||
968 | 38 | self.filename = filename | ||
969 | 39 | |||
970 | 40 | |||
971 | 41 | class FakeUpload: | ||
972 | 42 | def __init__(self, customformat, filename): | ||
973 | 43 | self.customformat = customformat | ||
974 | 44 | self.libraryfilealias = FakeLibraryFileAlias(filename) | ||
975 | 45 | |||
976 | 46 | |||
977 | 47 | class CommonTestHelpers: | ||
978 | 48 | """Helper(s) for these tests.""" | ||
979 | 49 | def makeVersion(self): | ||
980 | 50 | """Create a fake version string.""" | ||
981 | 51 | return "%d.%d-%s" % ( | ||
982 | 52 | self.factory.getUniqueInteger(), | ||
983 | 53 | self.factory.getUniqueInteger(), | ||
984 | 54 | self.factory.getUniqueString()) | ||
985 | 55 | |||
986 | 56 | |||
987 | 57 | class TestCustomUploadsCopierLite(TestCaseWithFactory, CommonTestHelpers): | ||
988 | 58 | """Light-weight low-level tests for `CustomUploadsCopier`.""" | ||
989 | 59 | |||
990 | 60 | layer = ZopelessLayer | ||
991 | 61 | |||
992 | 62 | def test_isCopyable_matches_copyable_types(self): | ||
993 | 63 | # isCopyable checks a custom upload's customformat field to | ||
994 | 64 | # determine whether the upload is a candidate for copying. It | ||
995 | 65 | # approves only those whose customformats are in copyable_types. | ||
996 | 66 | class FakePackageUploadCustom: | ||
997 | 67 | def __init__(self, customformat): | ||
998 | 68 | self.customformat = customformat | ||
999 | 69 | |||
1000 | 70 | uploads = [ | ||
1001 | 71 | FakePackageUploadCustom(custom_type) | ||
1002 | 72 | for custom_type in PackageUploadCustomFormat.items] | ||
1003 | 73 | |||
1004 | 74 | copier = CustomUploadsCopier(FakeDistroSeries()) | ||
1005 | 75 | copied_uploads = filter(copier.isCopyable, uploads) | ||
1006 | 76 | self.assertContentEqual( | ||
1007 | 77 | CustomUploadsCopier.copyable_types, | ||
1008 | 78 | [upload.customformat for upload in copied_uploads]) | ||
1009 | 79 | |||
1010 | 80 | def test_extractNameFields_extracts_package_name_and_architecture(self): | ||
1011 | 81 | # extractNameFields picks up the package name and architecture | ||
1012 | 82 | # out of an upload's filename field. | ||
1013 | 83 | package_name = self.factory.getUniqueString('package') | ||
1014 | 84 | version = self.makeVersion() | ||
1015 | 85 | architecture = self.factory.getUniqueString('arch') | ||
1016 | 86 | filename = '%s_%s_%s.tar.gz' % (package_name, version, architecture) | ||
1017 | 87 | copier = CustomUploadsCopier(FakeDistroSeries()) | ||
1018 | 88 | self.assertEqual( | ||
1019 | 89 | (package_name, architecture), copier.extractNameFields(filename)) | ||
1020 | 90 | |||
1021 | 91 | def test_extractNameFields_does_not_require_architecture(self): | ||
1022 | 92 | # When extractNameFields does not see an architecture, it | ||
1023 | 93 | # defaults to 'all'. | ||
1024 | 94 | package_name = self.factory.getUniqueString('package') | ||
1025 | 95 | filename = '%s_%s.tar.gz' % (package_name, self.makeVersion()) | ||
1026 | 96 | copier = CustomUploadsCopier(FakeDistroSeries()) | ||
1027 | 97 | self.assertEqual( | ||
1028 | 98 | (package_name, 'all'), copier.extractNameFields(filename)) | ||
1029 | 99 | |||
1030 | 100 | def test_extractNameFields_returns_None_on_mismatch(self): | ||
1031 | 101 | # If the filename does not match the expected pattern, | ||
1032 | 102 | # extractNameFields returns None. | ||
1033 | 103 | copier = CustomUploadsCopier(FakeDistroSeries()) | ||
1034 | 104 | self.assertIs(None, copier.extractNameFields('argh_1.0.jpg')) | ||
1035 | 105 | |||
1036 | 106 | def test_extractNameFields_ignores_names_with_too_many_fields(self): | ||
1037 | 107 | # As one particularly nasty case that might break | ||
1038 | 108 | # extractNameFields, a name with more underscore-separated fields | ||
1039 | 109 | # than the search pattern allows for is sensibly rejected. | ||
1040 | 110 | copier = CustomUploadsCopier(FakeDistroSeries()) | ||
1041 | 111 | self.assertIs( | ||
1042 | 112 | None, copier.extractNameFields('one_two_three_four_5.tar.gz')) | ||
1043 | 113 | |||
1044 | 114 | def test_getKey_returns_None_on_name_mismatch(self): | ||
1045 | 115 | # If extractNameFields returns None, getKey also returns None. | ||
1046 | 116 | copier = CustomUploadsCopier(FakeDistroSeries()) | ||
1047 | 117 | copier.extractNameFields = FakeMethod() | ||
1048 | 118 | self.assertIs( | ||
1049 | 119 | None, | ||
1050 | 120 | copier.getKey(FakeUpload( | ||
1051 | 121 | PackageUploadCustomFormat.DEBIAN_INSTALLER, | ||
1052 | 122 | "bad-filename.tar"))) | ||
1053 | 123 | |||
1054 | 124 | |||
1055 | 125 | class TestCustomUploadsCopier(TestCaseWithFactory, CommonTestHelpers): | ||
1056 | 126 | """Heavyweight `CustomUploadsCopier` tests.""" | ||
1057 | 127 | |||
1058 | 128 | # Alas, PackageUploadCustom relies on the Librarian. | ||
1059 | 129 | layer = LaunchpadZopelessLayer | ||
1060 | 130 | |||
1061 | 131 | def makeUpload(self, distroseries=None, | ||
1062 | 132 | custom_type=PackageUploadCustomFormat.DEBIAN_INSTALLER, | ||
1063 | 133 | package_name=None, version=None, arch=None): | ||
1064 | 134 | """Create a `PackageUploadCustom`.""" | ||
1065 | 135 | if distroseries is None: | ||
1066 | 136 | distroseries = self.factory.makeDistroSeries() | ||
1067 | 137 | if package_name is None: | ||
1068 | 138 | package_name = self.factory.getUniqueString("package") | ||
1069 | 139 | if version is None: | ||
1070 | 140 | version = self.makeVersion() | ||
1071 | 141 | filename = "%s.tar.gz" % '_'.join( | ||
1072 | 142 | filter(None, [package_name, version, arch])) | ||
1073 | 143 | package_upload = self.factory.makeCustomPackageUpload( | ||
1074 | 144 | distroseries=distroseries, custom_type=custom_type, | ||
1075 | 145 | filename=filename) | ||
1076 | 146 | return package_upload.customfiles[0] | ||
1077 | 147 | |||
1078 | 148 | def test_copies_custom_upload(self): | ||
1079 | 149 | # CustomUploadsCopier copies custom uploads from one series to | ||
1080 | 150 | # another. | ||
1081 | 151 | current_series = self.factory.makeDistroSeries() | ||
1082 | 152 | original_upload = self.makeUpload(current_series) | ||
1083 | 153 | new_series = self.factory.makeDistroSeries( | ||
1084 | 154 | distribution=current_series.distribution, | ||
1085 | 155 | previous_series=current_series) | ||
1086 | 156 | |||
1087 | 157 | CustomUploadsCopier(new_series).copy(current_series) | ||
1088 | 158 | |||
1089 | 159 | [copied_upload] = list_custom_uploads(new_series) | ||
1090 | 160 | self.assertEqual( | ||
1091 | 161 | original_upload.libraryfilealias, copied_upload.libraryfilealias) | ||
1092 | 162 | |||
1093 | 163 | def test_is_idempotent(self): | ||
1094 | 164 | # It's safe to perform the same copy more than once; the uploads | ||
1095 | 165 | # get copied only once. | ||
1096 | 166 | current_series = self.factory.makeDistroSeries() | ||
1097 | 167 | self.makeUpload(current_series) | ||
1098 | 168 | new_series = self.factory.makeDistroSeries( | ||
1099 | 169 | distribution=current_series.distribution, | ||
1100 | 170 | previous_series=current_series) | ||
1101 | 171 | |||
1102 | 172 | copier = CustomUploadsCopier(new_series) | ||
1103 | 173 | copier.copy(current_series) | ||
1104 | 174 | uploads_after_first_copy = list_custom_uploads(new_series) | ||
1105 | 175 | copier.copy(current_series) | ||
1106 | 176 | uploads_after_redundant_copy = list_custom_uploads(new_series) | ||
1107 | 177 | |||
1108 | 178 | self.assertEqual( | ||
1109 | 179 | uploads_after_first_copy, uploads_after_redundant_copy) | ||
1110 | 180 | |||
    def test_getCandidateUploads_filters_by_distroseries(self):
        # getCandidateUploads ignores uploads for other distroseries.
        source_series = self.factory.makeDistroSeries()
        matching_upload = self.makeUpload(source_series)
        nonmatching_upload = self.makeUpload()
        copier = CustomUploadsCopier(FakeDistroSeries())
        candidate_uploads = copier.getCandidateUploads(source_series)
        self.assertContentEqual([matching_upload], candidate_uploads)
        self.assertNotIn(nonmatching_upload, candidate_uploads)

    def test_getCandidateUploads_filters_upload_types(self):
        # getCandidateUploads returns only uploads of the types listed
        # in copyable_types; other types of upload are ignored.
        source_series = self.factory.makeDistroSeries()
        for custom_format in PackageUploadCustomFormat.items:
            self.makeUpload(source_series, custom_type=custom_format)

        copier = CustomUploadsCopier(FakeDistroSeries())
        candidate_uploads = copier.getCandidateUploads(source_series)
        copied_types = [upload.customformat for upload in candidate_uploads]
        self.assertContentEqual(
            CustomUploadsCopier.copyable_types, copied_types)

    def test_getCandidateUploads_ignores_other_attachments(self):
        # A PackageUpload can have multiple PackageUploadCustoms
        # attached, potentially of different types.  getCandidateUploads
        # ignores PackageUploadCustoms of types that aren't supposed to
        # be copied, even if they are attached to PackageUploads that
        # also have PackageUploadCustoms that do need to be copied.
        source_series = self.factory.makeDistroSeries()
        package_upload = self.factory.makePackageUpload(
            distroseries=source_series, archive=source_series.main_archive)
        library_file = self.factory.makeLibraryFileAlias()
        matching_upload = package_upload.addCustom(
            library_file, PackageUploadCustomFormat.DEBIAN_INSTALLER)
        nonmatching_upload = package_upload.addCustom(
            library_file, PackageUploadCustomFormat.ROSETTA_TRANSLATIONS)
        copier = CustomUploadsCopier(FakeDistroSeries())
        candidates = copier.getCandidateUploads(source_series)
        self.assertContentEqual([matching_upload], candidates)
        self.assertNotIn(nonmatching_upload, candidates)

    def test_getCandidateUploads_orders_newest_to_oldest(self):
        # getCandidateUploads returns its PackageUploadCustoms ordered
        # from newest to oldest.
        source_series = self.factory.makeDistroSeries()
        for counter in xrange(5):
            self.makeUpload(source_series)
        copier = CustomUploadsCopier(FakeDistroSeries())
        candidate_ids = [
            upload.id for upload in copier.getCandidateUploads(source_series)]
        self.assertEqual(sorted(candidate_ids, reverse=True), candidate_ids)

    def test_getKey_includes_format_package_and_architecture(self):
        # The key returned by getKey consists of custom upload type,
        # package name, and architecture.
        source_series = self.factory.makeDistroSeries()
        upload = self.makeUpload(
            source_series, PackageUploadCustomFormat.DIST_UPGRADER,
            package_name='upgrader', arch='mips')
        copier = CustomUploadsCopier(FakeDistroSeries())
        expected_key = (
            PackageUploadCustomFormat.DIST_UPGRADER,
            'upgrader',
            'mips',
            )
        self.assertEqual(expected_key, copier.getKey(upload))

    def test_getLatestUploads_indexes_uploads_by_key(self):
        # getLatestUploads returns a dict of uploads, indexed by keys
        # returned by getKey.
        source_series = self.factory.makeDistroSeries()
        upload = self.makeUpload(source_series)
        copier = CustomUploadsCopier(FakeDistroSeries())
        self.assertEqual(
            {copier.getKey(upload): upload},
            copier.getLatestUploads(source_series))

    def test_getLatestUploads_filters_superseded_uploads(self):
        # getLatestUploads returns only the latest upload for a given
        # distroseries, type, package, and architecture.  Any older
        # uploads with the same distroseries, type, package name, and
        # architecture are ignored.
        source_series = self.factory.makeDistroSeries()
        uploads = [
            self.makeUpload(
                source_series, package_name='installer', version='1.0.0',
                arch='ppc')
            for counter in xrange(3)]

        copier = CustomUploadsCopier(FakeDistroSeries())
        self.assertContentEqual(
            uploads[-1:], copier.getLatestUploads(source_series).values())

    def test_getLatestUploads_bundles_versions(self):
        # getLatestUploads sees an upload as superseding an older one
        # for the same distroseries, type, package name, and
        # architecture even if they have different versions.
        source_series = self.factory.makeDistroSeries()
        uploads = [
            self.makeUpload(source_series, package_name='foo', arch='i386')
            for counter in xrange(2)]
        copier = CustomUploadsCopier(FakeDistroSeries())
        self.assertContentEqual(
            uploads[-1:], copier.getLatestUploads(source_series).values())

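The newest-wins indexing that these getLatestUploads tests exercise can be sketched independently of Launchpad. This is a hypothetical standalone illustration, not the branch's actual implementation; `latest_by_key` and the dict-based uploads are names invented here:

```python
def latest_by_key(uploads, key):
    """Index uploads by key, keeping only the newest upload per key.

    Assumes `uploads` is ordered newest to oldest, as the
    getCandidateUploads ordering test guarantees, so the first upload
    seen for a given key supersedes all later (older) ones.
    """
    latest = {}
    for upload in uploads:
        # setdefault keeps the first (newest) upload for each key.
        latest.setdefault(key(upload), upload)
    return latest


# Newest-first list; all three uploads share the same key.
uploads = [
    {'id': 3, 'package': 'installer', 'arch': 'ppc'},
    {'id': 2, 'package': 'installer', 'arch': 'ppc'},
    {'id': 1, 'package': 'installer', 'arch': 'ppc'},
]
result = latest_by_key(uploads, key=lambda u: (u['package'], u['arch']))
```

Only the newest upload (id 3) survives for the shared key, mirroring test_getLatestUploads_filters_superseded_uploads.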
    def test_getTargetArchive_on_same_distro_is_same_archive(self):
        # When copying within the same distribution, getTargetArchive
        # always returns the same archive you feed it.
        distro = self.factory.makeDistribution()
        archives = [
            self.factory.makeArchive(distribution=distro, purpose=purpose)
            for purpose in MAIN_ARCHIVE_PURPOSES]
        copier = CustomUploadsCopier(self.factory.makeDistroSeries(distro))
        self.assertEqual(
            archives,
            [copier.getTargetArchive(archive) for archive in archives])

    def test_getTargetArchive_returns_None_if_not_distribution_archive(self):
        # getTargetArchive returns None for any archive that is not a
        # distribution archive, regardless of whether the target series
        # has an equivalent.
        distro = self.factory.makeDistribution()
        archives = [
            self.factory.makeArchive(distribution=distro, purpose=purpose)
            for purpose in ArchivePurpose.items
            if purpose not in MAIN_ARCHIVE_PURPOSES]
        copier = CustomUploadsCopier(self.factory.makeDistroSeries(distro))
        self.assertEqual(
            [None] * len(archives),
            [copier.getTargetArchive(archive) for archive in archives])

    def test_getTargetArchive_finds_matching_archive(self):
        # When copying across archives, getTargetArchive looks for an
        # archive for the target series with the same purpose as the
        # original archive.
        source_series = self.factory.makeDistroSeries()
        source_archive = self.factory.makeArchive(
            distribution=source_series.distribution,
            purpose=ArchivePurpose.PARTNER)
        target_series = self.factory.makeDistroSeries()
        target_archive = self.factory.makeArchive(
            distribution=target_series.distribution,
            purpose=ArchivePurpose.PARTNER)

        copier = CustomUploadsCopier(target_series)
        self.assertEqual(
            target_archive, copier.getTargetArchive(source_archive))

    def test_getTargetArchive_returns_None_if_no_archive_matches(self):
        # If the target series has no archive to match the archive that
        # the original upload was for, getTargetArchive returns None.
        source_series = self.factory.makeDistroSeries()
        source_archive = self.factory.makeArchive(
            distribution=source_series.distribution,
            purpose=ArchivePurpose.PARTNER)
        target_series = self.factory.makeDistroSeries()
        copier = CustomUploadsCopier(target_series)
        self.assertIs(None, copier.getTargetArchive(source_archive))

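Taken together, the getTargetArchive tests pin down a purpose-based lookup. A minimal sketch of that behaviour, with illustrative names rather than Launchpad's real API (plain dicts stand in for archive objects, and the purpose list is an invented subset):

```python
MAIN_ARCHIVE_PURPOSES = ('PRIMARY', 'PARTNER')  # illustrative subset

def get_target_archive(source_archive, same_distribution, target_archives):
    """Find the target-series archive matching source_archive's purpose.

    Returns the source archive itself when copying within a single
    distribution, and None when the source archive is not a
    distribution archive or the target has no archive of that purpose.
    """
    if source_archive['purpose'] not in MAIN_ARCHIVE_PURPOSES:
        return None  # PPAs and the like are never copy targets.
    if same_distribution:
        return source_archive
    for archive in target_archives:
        if archive['purpose'] == source_archive['purpose']:
            return archive
    return None


partner = {'purpose': 'PARTNER'}
ppa = {'purpose': 'PPA'}
target_archives = [{'purpose': 'PARTNER'}]
found = get_target_archive(partner, False, target_archives)
missing = get_target_archive(partner, False, [])
skipped = get_target_archive(ppa, False, target_archives)
```

`found` matches the target's PARTNER archive; `missing` and `skipped` come back None, matching the two None-returning tests above.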
    def test_isObsolete_returns_False_if_no_equivalent_in_target(self):
        # isObsolete returns False if the upload in question has no
        # equivalent in the target series.
        source_series = self.factory.makeDistroSeries()
        upload = self.makeUpload(source_series)
        target_series = self.factory.makeDistroSeries()
        copier = CustomUploadsCopier(target_series)
        self.assertFalse(
            copier.isObsolete(upload, copier.getLatestUploads(target_series)))

    def test_isObsolete_returns_False_if_target_has_older_equivalent(self):
        # isObsolete returns False if the target has an equivalent of
        # the upload in question, but it's older than the version the
        # source series has.
        source_series = self.factory.makeDistroSeries()
        target_series = self.factory.makeDistroSeries()
        self.makeUpload(
            target_series, package_name='installer', arch='ppc64')
        source_upload = self.makeUpload(
            source_series, package_name='installer', arch='ppc64')
        copier = CustomUploadsCopier(target_series)
        self.assertFalse(
            copier.isObsolete(
                source_upload, copier.getLatestUploads(target_series)))

    def test_isObsolete_returns_True_if_target_has_newer_equivalent(self):
        # isObsolete returns True if the target series already has a
        # newer equivalent of the upload in question (as would be the
        # case, for instance, if the upload had already been copied).
        source_series = self.factory.makeDistroSeries()
        source_upload = self.makeUpload(
            source_series, package_name='installer', arch='alpha')
        target_series = self.factory.makeDistroSeries()
        self.makeUpload(
            target_series, package_name='installer', arch='alpha')
        copier = CustomUploadsCopier(target_series)
        self.assertTrue(
            copier.isObsolete(
                source_upload, copier.getLatestUploads(target_series)))

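The three isObsolete tests above compare a source upload against the target's latest equivalent for the same key. One way to sketch that check, assuming ids increase with creation time (an assumption made here for illustration; the branch's actual comparison criterion is not shown in this excerpt):

```python
def is_obsolete(upload, target_latest, key):
    """True if the target already has a newer equivalent of `upload`.

    `target_latest` maps keys to the target series' newest uploads,
    as latest-uploads indexing would produce.  With no equivalent, or
    only an older one, the upload still needs copying; a newer
    equivalent means it was already copied or replaced.
    """
    equivalent = target_latest.get(key(upload))
    return equivalent is not None and equivalent['id'] > upload['id']


key = lambda u: (u['package'], u['arch'])
source = {'id': 5, 'package': 'installer', 'arch': 'alpha'}
older = {'id': 3, 'package': 'installer', 'arch': 'alpha'}
newer = {'id': 9, 'package': 'installer', 'arch': 'alpha'}

no_match = is_obsolete(source, {}, key)
has_older = is_obsolete(source, {key(older): older}, key)
has_newer = is_obsolete(source, {key(newer): newer}, key)
```

The three results line up with the three tests: no equivalent and an older equivalent are not obsolete, a newer equivalent is.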
    def test_copyUpload_creates_upload(self):
        # copyUpload creates a new upload that's very similar to the
        # original, but for the target series.
        original_upload = self.makeUpload()
        target_series = self.factory.makeDistroSeries()
        copier = CustomUploadsCopier(target_series)
        copied_upload = copier.copyUpload(original_upload)
        self.assertEqual([copied_upload], list_custom_uploads(target_series))
        self.assertNotEqual(
            original_upload.packageupload, copied_upload.packageupload)
        self.assertEqual(
            original_upload.customformat, copied_upload.customformat)
        self.assertEqual(
            original_upload.libraryfilealias, copied_upload.libraryfilealias)
        self.assertEqual(
            original_upload.packageupload.changesfile,
            copied_upload.packageupload.changesfile)
        self.assertEqual(
            original_upload.packageupload.pocket,
            copied_upload.packageupload.pocket)

    def test_copyUpload_accepts_upload(self):
        # Uploads created by copyUpload are automatically accepted.
        original_upload = self.makeUpload()
        target_series = self.factory.makeDistroSeries()
        copier = CustomUploadsCopier(target_series)
        copied_upload = copier.copyUpload(original_upload)
        self.assertEqual(
            PackageUploadStatus.ACCEPTED, copied_upload.packageupload.status)

    def test_copyUpload_does_not_copy_if_no_archive_matches(self):
        # If getTargetArchive does not find an appropriate target
        # archive, copyUpload does nothing.
        source_series = self.factory.makeDistroSeries()
        upload = self.makeUpload(distroseries=source_series)
        target_series = self.factory.makeDistroSeries()
        copier = CustomUploadsCopier(target_series)
        copier.getTargetArchive = FakeMethod(result=None)
        self.assertIs(None, copier.copyUpload(upload))
        self.assertEqual([], list_custom_uploads(target_series))

=== modified file 'lib/lp/soyuz/tests/test_publishing.py'
--- lib/lp/soyuz/tests/test_publishing.py	2011-08-03 11:00:11 +0000
+++ lib/lp/soyuz/tests/test_publishing.py	2011-08-11 10:31:20 +0000
@@ -159,7 +159,7 @@
         signing_key = self.person.gpg_keys[0]
         package_upload = distroseries.createQueueEntry(
             pocket, archive, changes_file_name, changes_file_content,
-            signing_key)
+            signing_key=signing_key)

         status_to_method = {
             PackageUploadStatus.DONE: 'setDone',
Hi Jeroen,
I tried to run your tests, but there are failures due to the removal of 'safe_open', which is still referenced in many places throughout the code base. Was its removal an accident? Did you move it to another file that you forgot to add to version control?
Brad