Merge lp:~jtv/launchpad/bug-659769 into lp:launchpad

Proposed by Jeroen T. Vermeulen
Status: Superseded
Proposed branch: lp:~jtv/launchpad/bug-659769
Merge into: lp:launchpad
Diff against target: 1363 lines (+668/-571)
9 files modified
lib/lp/archivepublisher/scripts/publish_ftpmaster.py (+23/-8)
lib/lp/archivepublisher/tests/test_publish_ftpmaster.py (+24/-0)
lib/lp/codehosting/safe_open.py (+0/-263)
lib/lp/codehosting/tests/test_safe_open.py (+0/-267)
lib/lp/registry/interfaces/distroseries.py (+25/-11)
lib/lp/registry/model/distroseries.py (+26/-21)
lib/lp/soyuz/scripts/custom_uploads_copier.py (+149/-0)
lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py (+420/-0)
lib/lp/soyuz/tests/test_publishing.py (+1/-1)
To merge this branch: bzr merge lp:~jtv/launchpad/bug-659769
Reviewer: Brad Crittenden (community), review type: code, status: Needs Information
Review via email: mp+71173@code.launchpad.net

This proposal has been superseded by a proposal from 2011-08-15.

Commit message

Copy custom uploads into new distro release.

Description of the change

= Summary =

Every time the Ubuntu people (particularly Colin) set up a new release, they need to copy some of the archive's custom-upload files to the new release. Custom uploads are almost entirely unmanaged, so they would probably do this by copying the files directly in the filesystem. They have asked for an easier, more integrated way to get it done.

== Proposed fix ==

As far as I can make out, the only types of custom upload that need to be copied are for the Debian installer and the dist upgrader. The Rosetta translations are shared within the database, and presumably both kinds of translations for a package will soon be re-uploaded anyway. So I focused on the installers and upgraders.

Because custom uploads are so lightly managed (little or no metadata is kept, and we'd have to parse tarballs from the Librarian just to see what files came out of which upload), it's hard to figure out exactly what should or should not be copied. Some of the files may be obsolete, and if we copy them along, they'll be with us forever.

The uploads contain tarballs with names in just one of two formats: <package>_<version>_<architecture>.tar.gz and <package>_<version>.tar.gz. A tarball for a given installer version, say, should contain all files that that version needs; there's no finicky merging of individual files that may each be current or obsolete. We can just identify the upload with the current tarball, and copy that upload into the new series.

Rather than get into the full hairy detail of version parsing (some of the versions look a little ad-hoc), I'm just copying the latest custom uploads for each (upload type, <package>, <architecture>). The <architecture> defaults to "all" because that's what seems to be meant when it is omitted.
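The selection rule above can be sketched roughly as follows. This is a hypothetical illustration, not the actual CustomUploadsCopier code; the helper names `copy_key` and `latest_uploads` are made up for the example.

```python
import re


def copy_key(filename):
    """Derive a (package, architecture) key from a custom-upload tarball name.

    Accepts both <package>_<version>_<architecture>.tar.gz and
    <package>_<version>.tar.gz; the architecture defaults to 'all' when
    omitted.  Returns None for names outside the scheme, which are
    simply not copied.
    """
    match = re.match(r'^([^_]+)_([^_]+?)(?:_([^_]+))?\.tar\.gz$', filename)
    if match is None:
        return None
    package, _version, architecture = match.groups()
    return (package, architecture or 'all')


def latest_uploads(uploads):
    """Map each (package, architecture) key to its newest upload.

    `uploads` is an iterable of (filename, upload) pairs assumed to be
    ordered oldest to newest, so later entries overwrite earlier ones.
    """
    latest = {}
    for filename, upload in uploads:
        key = copy_key(filename)
        if key is not None:
            latest[key] = upload
    return latest
```

For example, `debian-installer-images_1.0-20110805_i386.tar.gz` maps to the key `('debian-installer-images', 'i386')`, while `dist-upgrader_1.0.tar.gz` maps to `('dist-upgrader', 'all')`, so this sidesteps version comparison entirely.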

As far as I've seen, the scheme I implemented here would actually work just fine for the other upload types. It will also work for derived distributions, with two limitations:

1. The custom uploads are copied only between consecutive releases of the same distribution, not across inheritance lines.

2. Uploads have to follow this naming scheme in order to be copied. This is something the package defines.

Having this supported for derived distributions could help save manual maintenance and system access needs for derived distros. We may want to add cross-distro inheritance of custom uploads later, but then we'll have to solve the conflict-resolution problem: should we copy the previous series' latest installer, or the one from the parent distro in the series that we're deriving from? What if there are multiple parent distros or releases?

== Pre-implementation notes ==

Discussed with various people, but alas not with Colin at the time of writing. I'm going to hold off on landing until he confirms that this will give him what he needs. Watch this space for updates.

== Implementation details ==

== Tests ==

{{{
./bin/test -vvc lp.soyuz -t test_publishing -t test_custom_uploads_copier
./bin/test -vvc lp.archivepublisher.tests.test_publish_ftpmaster
}}}

== Demo and Q/A ==

We'll have to do this hand in hand with the distro gurus, not only to validate the result but also to help set up the test scenario!

= Launchpad lint =

Checking for conflicts and issues in changed files.

Linting changed files:
  lib/lp/soyuz/tests/test_publishing.py
  lib/lp/archivepublisher/scripts/publish_ftpmaster.py
  lib/lp/registry/interfaces/distroseries.py
  lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py
  lib/lp/soyuz/scripts/custom_uploads_copier.py
  lib/lp/archivepublisher/tests/test_publish_ftpmaster.py
  lib/lp/registry/model/distroseries.py

Revision history for this message
Brad Crittenden (bac) wrote :

Hi Jeroen,

I tried to run your tests but there are failures due to the removal of 'safe_open' which is still referenced in many places throughout the code base. Was its removal an accident? Did you move it to another file that you forgot to add to version control?

Brad

review: Needs Information (code)
Revision history for this message
Jeroen T. Vermeulen (jtv) wrote :

No idea what safe_open is or how it got into this MP. I'll have to resubmit.

Preview Diff

1=== modified file 'lib/lp/archivepublisher/scripts/publish_ftpmaster.py'
2--- lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-08-09 10:30:43 +0000
3+++ lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-08-11 10:31:20 +0000
4@@ -27,6 +27,7 @@
5 )
6 from lp.services.utils import file_exists
7 from lp.soyuz.enums import ArchivePurpose
8+from lp.soyuz.scripts.custom_uploads_copier import CustomUploadsCopier
9 from lp.soyuz.scripts.ftpmaster import LpQueryDistro
10 from lp.soyuz.scripts.processaccepted import ProcessAccepted
11 from lp.soyuz.scripts.publishdistro import PublishDistro
12@@ -539,6 +540,25 @@
13 self.recoverWorkingDists()
14 raise
15
16+ def prepareFreshSeries(self):
17+ """If there are any new distroseries, prepare them for publishing.
18+
19+ :return: True if a series did indeed still need some preparation,
20+ or False for the normal case.
21+ """
22+ have_fresh_series = False
23+ for series in self.distribution.series:
24+ suites_needing_indexes = self.listSuitesNeedingIndexes(series)
25+ if len(suites_needing_indexes) != 0:
26+ # This is a fresh series.
27+ have_fresh_series = True
28+ if series.previous_series is not None:
29+ CustomUploadsCopier(series).copy(series.previous_series)
30+ for suite in suites_needing_indexes:
31+ self.createIndexes(suite)
32+
33+ return have_fresh_series
34+
35 def setUp(self):
36 """Process options, and set up internal state."""
37 self.processOptions()
38@@ -550,14 +570,9 @@
39 self.setUp()
40 self.recoverWorkingDists()
41
42- for series in self.distribution.series:
43- suites_needing_indexes = self.listSuitesNeedingIndexes(series)
44- if len(suites_needing_indexes) > 0:
45- for suite in suites_needing_indexes:
46- self.createIndexes(suite)
47- # Don't try to do too much in one run. Leave the rest
48- # of the work for next time.
49- return
50+ if self.prepareFreshSeries():
51+ # We've done enough. Leave some room for others.
52+ return
53
54 self.processAccepted()
55 self.setUpDirs()
56
57=== modified file 'lib/lp/archivepublisher/tests/test_publish_ftpmaster.py'
58--- lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-08-03 06:24:53 +0000
59+++ lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-08-11 10:31:20 +0000
60@@ -44,6 +44,7 @@
61 from lp.soyuz.enums import (
62 ArchivePurpose,
63 PackagePublishingStatus,
64+ PackageUploadCustomFormat,
65 )
66 from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
67 from lp.testing import (
68@@ -980,6 +981,29 @@
69 self.assertEqual([suite], kwargs['suites'])
70 self.assertThat(kwargs['suites'][0], StartsWith(series.name))
71
72+ def test_prepareFreshSeries_copies_custom_uploads(self):
73+ distro = self.makeDistroWithPublishDirectory()
74+ old_series = self.factory.makeDistroSeries(
75+ distribution=distro, status=SeriesStatus.CURRENT)
76+ new_series = self.factory.makeDistroSeries(
77+ distribution=distro, previous_series=old_series,
78+ status=SeriesStatus.FROZEN)
79+ custom_upload = self.factory.makeCustomPackageUpload(
80+ distroseries=old_series,
81+ custom_type=PackageUploadCustomFormat.DEBIAN_INSTALLER,
82+ filename='debian-installer-images_1.0-20110805_i386.tar.gz')
83+ script = self.makeScript(distro)
84+ script.createIndexes = FakeMethod()
85+ script.setUp()
86+ have_fresh_series = script.prepareFreshSeries()
87+ self.assertTrue(have_fresh_series)
88+ [copied_upload] = new_series.getPackageUploads(
89+ name=u'debian-installer-images', exact_match=False)
90+ [copied_custom] = copied_upload.customfiles
91+ self.assertEqual(
92+ custom_upload.customfiles[0].libraryfilealias.filename,
93+ copied_custom.libraryfilealias.filename)
94+
95 def test_script_creates_indexes(self):
96 # End-to-end test: the script creates indexes for distroseries
97 # that need them.
98
99=== removed file 'lib/lp/codehosting/safe_open.py'
100--- lib/lp/codehosting/safe_open.py 2011-08-09 14:59:08 +0000
101+++ lib/lp/codehosting/safe_open.py 1970-01-01 00:00:00 +0000
102@@ -1,263 +0,0 @@
103-# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
104-# GNU Affero General Public License version 3 (see the file LICENSE).
105-
106-"""Safe branch opening."""
107-
108-__metaclass__ = type
109-
110-from bzrlib import urlutils
111-from bzrlib.branch import Branch
112-from bzrlib.bzrdir import BzrDir
113-
114-from lazr.uri import URI
115-
116-__all__ = [
117- 'AcceptAnythingPolicy',
118- 'BadUrl',
119- 'BlacklistPolicy',
120- 'BranchLoopError',
121- 'BranchOpenPolicy',
122- 'BranchReferenceForbidden',
123- 'SafeBranchOpener',
124- 'WhitelistPolicy',
125- 'safe_open',
126- ]
127-
128-
129-# TODO JelmerVernooij 2011-08-06: This module is generic enough to be
130-# in bzrlib, and may be of use to others.
131-
132-
133-class BadUrl(Exception):
134- """Tried to access a branch from a bad URL."""
135-
136-
137-class BranchReferenceForbidden(Exception):
138- """Trying to mirror a branch reference and the branch type does not allow
139- references.
140- """
141-
142-
143-class BranchLoopError(Exception):
144- """Encountered a branch cycle.
145-
146- A URL may point to a branch reference or it may point to a stacked branch.
147- In either case, it's possible for there to be a cycle in these references,
148- and this exception is raised when we detect such a cycle.
149- """
150-
151-
152-class BranchOpenPolicy:
153- """Policy on how to open branches.
154-
155- In particular, a policy determines which branches are safe to open by
156- checking their URLs and deciding whether or not to follow branch
157- references.
158- """
159-
160- def shouldFollowReferences(self):
161- """Whether we traverse references when mirroring.
162-
163- Subclasses must override this method.
164-
165- If we encounter a branch reference and this returns false, an error is
166- raised.
167-
168- :returns: A boolean to indicate whether to follow a branch reference.
169- """
170- raise NotImplementedError(self.shouldFollowReferences)
171-
172- def transformFallbackLocation(self, branch, url):
173- """Validate, maybe modify, 'url' to be used as a stacked-on location.
174-
175- :param branch: The branch that is being opened.
176- :param url: The URL that the branch provides for its stacked-on
177- location.
178- :return: (new_url, check) where 'new_url' is the URL of the branch to
179- actually open and 'check' is true if 'new_url' needs to be
180- validated by checkAndFollowBranchReference.
181- """
182- raise NotImplementedError(self.transformFallbackLocation)
183-
184- def checkOneURL(self, url):
185- """Check the safety of the source URL.
186-
187- Subclasses must override this method.
188-
189- :param url: The source URL to check.
190- :raise BadUrl: subclasses are expected to raise this or a subclass
191- when it finds a URL it deems to be unsafe.
192- """
193- raise NotImplementedError(self.checkOneURL)
194-
195-
196-class BlacklistPolicy(BranchOpenPolicy):
197- """Branch policy that forbids certain URLs."""
198-
199- def __init__(self, should_follow_references, unsafe_urls=None):
200- if unsafe_urls is None:
201- unsafe_urls = set()
202- self._unsafe_urls = unsafe_urls
203- self._should_follow_references = should_follow_references
204-
205- def shouldFollowReferences(self):
206- return self._should_follow_references
207-
208- def checkOneURL(self, url):
209- if url in self._unsafe_urls:
210- raise BadUrl(url)
211-
212- def transformFallbackLocation(self, branch, url):
213- """See `BranchOpenPolicy.transformFallbackLocation`.
214-
215- This class is not used for testing our smarter stacking features so we
216- just do the simplest thing: return the URL that would be used anyway
217- and don't check it.
218- """
219- return urlutils.join(branch.base, url), False
220-
221-
222-class AcceptAnythingPolicy(BlacklistPolicy):
223- """Accept anything, to make testing easier."""
224-
225- def __init__(self):
226- super(AcceptAnythingPolicy, self).__init__(True, set())
227-
228-
229-class WhitelistPolicy(BranchOpenPolicy):
230- """Branch policy that only allows certain URLs."""
231-
232- def __init__(self, should_follow_references, allowed_urls=None,
233- check=False):
234- if allowed_urls is None:
235- allowed_urls = []
236- self.allowed_urls = set(url.rstrip('/') for url in allowed_urls)
237- self.check = check
238-
239- def shouldFollowReferences(self):
240- return self._should_follow_references
241-
242- def checkOneURL(self, url):
243- if url.rstrip('/') not in self.allowed_urls:
244- raise BadUrl(url)
245-
246- def transformFallbackLocation(self, branch, url):
247- """See `BranchOpenPolicy.transformFallbackLocation`.
248-
249- Here we return the URL that would be used anyway and optionally check
250- it.
251- """
252- return urlutils.join(branch.base, url), self.check
253-
254-
255-class SafeBranchOpener(object):
256- """Safe branch opener.
257-
258- The policy object is expected to have the following methods:
259- * checkOneURL
260- * shouldFollowReferences
261- * transformFallbackLocation
262- """
263-
264- def __init__(self, policy):
265- self.policy = policy
266- self._seen_urls = set()
267-
268- def checkAndFollowBranchReference(self, url):
269- """Check URL (and possibly the referenced URL) for safety.
270-
271- This method checks that `url` passes the policy's `checkOneURL`
272- method, and if `url` refers to a branch reference, it checks whether
273- references are allowed and whether the reference's URL passes muster
274- also -- recursively, until a real branch is found.
275-
276- :raise BranchLoopError: If the branch references form a loop.
277- :raise BranchReferenceForbidden: If this opener forbids branch
278- references.
279- """
280- while True:
281- if url in self._seen_urls:
282- raise BranchLoopError()
283- self._seen_urls.add(url)
284- self.policy.checkOneURL(url)
285- next_url = self.followReference(url)
286- if next_url is None:
287- return url
288- url = next_url
289- if not self.policy.shouldFollowReferences():
290- raise BranchReferenceForbidden(url)
291-
292- def transformFallbackLocationHook(self, branch, url):
293- """Installed as the 'transform_fallback_location' Branch hook.
294-
295- This method calls `transformFallbackLocation` on the policy object and
296- either returns the url it provides or passes it back to
297- checkAndFollowBranchReference.
298- """
299- new_url, check = self.policy.transformFallbackLocation(branch, url)
300- if check:
301- return self.checkAndFollowBranchReference(new_url)
302- else:
303- return new_url
304-
305- def runWithTransformFallbackLocationHookInstalled(
306- self, callable, *args, **kw):
307- Branch.hooks.install_named_hook(
308- 'transform_fallback_location', self.transformFallbackLocationHook,
309- 'SafeBranchOpener.transformFallbackLocationHook')
310- try:
311- return callable(*args, **kw)
312- finally:
313- # XXX 2008-11-24 MichaelHudson, bug=301472: This is the hacky way
314- # to remove a hook. The linked bug report asks for an API to do
315- # it.
316- Branch.hooks['transform_fallback_location'].remove(
317- self.transformFallbackLocationHook)
318- # We reset _seen_urls here to avoid multiple calls to open giving
319- # spurious loop exceptions.
320- self._seen_urls = set()
321-
322- def followReference(self, url):
323- """Get the branch-reference value at the specified url.
324-
325- This exists as a separate method only to be overriden in unit tests.
326- """
327- bzrdir = BzrDir.open(url)
328- return bzrdir.get_branch_reference()
329-
330- def open(self, url):
331- """Open the Bazaar branch at url, first checking for safety.
332-
333- What safety means is defined by a subclasses `followReference` and
334- `checkOneURL` methods.
335- """
336- url = self.checkAndFollowBranchReference(url)
337- return self.runWithTransformFallbackLocationHookInstalled(
338- Branch.open, url)
339-
340-
341-class URLChecker(BranchOpenPolicy):
342- """Branch open policy that rejects URLs not on the given scheme."""
343-
344- def __init__(self, allowed_scheme):
345- self.allowed_scheme = allowed_scheme
346-
347- def shouldFollowReferences(self):
348- return True
349-
350- def transformFallbackLocation(self, branch, url):
351- return urlutils.join(branch.base, url), True
352-
353- def checkOneURL(self, url):
354- """Check that `url` is safe to open."""
355- if URI(url).scheme != self.allowed_scheme:
356- raise BadUrl(url)
357-
358-
359-def safe_open(allowed_scheme, url):
360- """Open the branch at `url`, only accessing URLs on `allowed_scheme`.
361-
362- :raises BadUrl: An attempt was made to open a URL that was not on
363- `allowed_scheme`.
364- """
365- return SafeBranchOpener(URLChecker(allowed_scheme)).open(url)
366
367=== removed file 'lib/lp/codehosting/tests/test_safe_open.py'
368--- lib/lp/codehosting/tests/test_safe_open.py 2011-08-09 15:05:18 +0000
369+++ lib/lp/codehosting/tests/test_safe_open.py 1970-01-01 00:00:00 +0000
370@@ -1,267 +0,0 @@
371-# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
372-# GNU Affero General Public License version 3 (see the file LICENSE).
373-
374-"""Tests for the safe branch open code."""
375-
376-
377-__metaclass__ = type
378-
379-from lazr.uri import URI
380-
381-from lp.codehosting.safe_open import (
382- BadUrl,
383- BlacklistPolicy,
384- BranchLoopError,
385- BranchReferenceForbidden,
386- SafeBranchOpener,
387- WhitelistPolicy,
388- safe_open,
389- )
390-
391-from lp.testing import TestCase
392-
393-from bzrlib.branch import (
394- Branch,
395- BzrBranchFormat7,
396- )
397-from bzrlib.bzrdir import (
398- BzrDirMetaFormat1,
399- )
400-from bzrlib.repofmt.pack_repo import RepositoryFormatKnitPack1
401-from bzrlib.tests import (
402- TestCaseWithTransport,
403- )
404-from bzrlib.transport import chroot
405-
406-
407-class TestSafeBranchOpenerCheckAndFollowBranchReference(TestCase):
408- """Unit tests for `SafeBranchOpener.checkAndFollowBranchReference`."""
409-
410- class StubbedSafeBranchOpener(SafeBranchOpener):
411- """SafeBranchOpener that provides canned answers.
412-
413- We implement the methods we need to to be able to control all the
414- inputs to the `BranchMirrorer.checkSource` method, which is what is
415- being tested in this class.
416- """
417-
418- def __init__(self, references, policy):
419- parent_cls = TestSafeBranchOpenerCheckAndFollowBranchReference
420- super(parent_cls.StubbedSafeBranchOpener, self).__init__(policy)
421- self._reference_values = {}
422- for i in range(len(references) - 1):
423- self._reference_values[references[i]] = references[i+1]
424- self.follow_reference_calls = []
425-
426- def followReference(self, url):
427- self.follow_reference_calls.append(url)
428- return self._reference_values[url]
429-
430- def makeBranchOpener(self, should_follow_references, references,
431- unsafe_urls=None):
432- policy = BlacklistPolicy(should_follow_references, unsafe_urls)
433- opener = self.StubbedSafeBranchOpener(references, policy)
434- return opener
435-
436- def testCheckInitialURL(self):
437- # checkSource rejects all URLs that are not allowed.
438- opener = self.makeBranchOpener(None, [], set(['a']))
439- self.assertRaises(BadUrl, opener.checkAndFollowBranchReference, 'a')
440-
441- def testNotReference(self):
442- # When branch references are forbidden, checkAndFollowBranchReference
443- # does not raise on non-references.
444- opener = self.makeBranchOpener(False, ['a', None])
445- self.assertEquals('a', opener.checkAndFollowBranchReference('a'))
446- self.assertEquals(['a'], opener.follow_reference_calls)
447-
448- def testBranchReferenceForbidden(self):
449- # checkAndFollowBranchReference raises BranchReferenceForbidden if
450- # branch references are forbidden and the source URL points to a
451- # branch reference.
452- opener = self.makeBranchOpener(False, ['a', 'b'])
453- self.assertRaises(
454- BranchReferenceForbidden,
455- opener.checkAndFollowBranchReference, 'a')
456- self.assertEquals(['a'], opener.follow_reference_calls)
457-
458- def testAllowedReference(self):
459- # checkAndFollowBranchReference does not raise if following references
460- # is allowed and the source URL points to a branch reference to a
461- # permitted location.
462- opener = self.makeBranchOpener(True, ['a', 'b', None])
463- self.assertEquals('b', opener.checkAndFollowBranchReference('a'))
464- self.assertEquals(['a', 'b'], opener.follow_reference_calls)
465-
466- def testCheckReferencedURLs(self):
467- # checkAndFollowBranchReference checks if the URL a reference points
468- # to is safe.
469- opener = self.makeBranchOpener(
470- True, ['a', 'b', None], unsafe_urls=set('b'))
471- self.assertRaises(BadUrl, opener.checkAndFollowBranchReference, 'a')
472- self.assertEquals(['a'], opener.follow_reference_calls)
473-
474- def testSelfReferencingBranch(self):
475- # checkAndFollowBranchReference raises BranchReferenceLoopError if
476- # following references is allowed and the source url points to a
477- # self-referencing branch reference.
478- opener = self.makeBranchOpener(True, ['a', 'a'])
479- self.assertRaises(
480- BranchLoopError, opener.checkAndFollowBranchReference, 'a')
481- self.assertEquals(['a'], opener.follow_reference_calls)
482-
483- def testBranchReferenceLoop(self):
484- # checkAndFollowBranchReference raises BranchReferenceLoopError if
485- # following references is allowed and the source url points to a loop
486- # of branch references.
487- references = ['a', 'b', 'a']
488- opener = self.makeBranchOpener(True, references)
489- self.assertRaises(
490- BranchLoopError, opener.checkAndFollowBranchReference, 'a')
491- self.assertEquals(['a', 'b'], opener.follow_reference_calls)
492-
493-
494-class TestSafeBranchOpenerStacking(TestCaseWithTransport):
495-
496- def makeBranchOpener(self, allowed_urls):
497- policy = WhitelistPolicy(True, allowed_urls, True)
498- return SafeBranchOpener(policy)
499-
500- def makeBranch(self, path, branch_format, repository_format):
501- """Make a Bazaar branch at 'path' with the given formats."""
502- bzrdir_format = BzrDirMetaFormat1()
503- bzrdir_format.set_branch_format(branch_format)
504- bzrdir = self.make_bzrdir(path, format=bzrdir_format)
505- repository_format.initialize(bzrdir)
506- return bzrdir.create_branch()
507-
508- def testAllowedURL(self):
509- # checkSource does not raise an exception for branches stacked on
510- # branches with allowed URLs.
511- stacked_on_branch = self.make_branch('base-branch', format='1.6')
512- stacked_branch = self.make_branch('stacked-branch', format='1.6')
513- stacked_branch.set_stacked_on_url(stacked_on_branch.base)
514- opener = self.makeBranchOpener(
515- [stacked_branch.base, stacked_on_branch.base])
516- # This doesn't raise an exception.
517- opener.open(stacked_branch.base)
518-
519- def testUnstackableRepository(self):
520- # checkSource treats branches with UnstackableRepositoryFormats as
521- # being not stacked.
522- branch = self.makeBranch(
523- 'unstacked', BzrBranchFormat7(), RepositoryFormatKnitPack1())
524- opener = self.makeBranchOpener([branch.base])
525- # This doesn't raise an exception.
526- opener.open(branch.base)
527-
528- def testAllowedRelativeURL(self):
529- # checkSource passes on absolute urls to checkOneURL, even if the
530- # value of stacked_on_location in the config is set to a relative URL.
531- stacked_on_branch = self.make_branch('base-branch', format='1.6')
532- stacked_branch = self.make_branch('stacked-branch', format='1.6')
533- stacked_branch.set_stacked_on_url('../base-branch')
534- opener = self.makeBranchOpener(
535- [stacked_branch.base, stacked_on_branch.base])
536- # Note that stacked_on_branch.base is not '../base-branch', it's an
537- # absolute URL.
538- self.assertNotEqual('../base-branch', stacked_on_branch.base)
539- # This doesn't raise an exception.
540- opener.open(stacked_branch.base)
541-
542- def testAllowedRelativeNested(self):
543- # Relative URLs are resolved relative to the stacked branch.
544- self.get_transport().mkdir('subdir')
545- a = self.make_branch('subdir/a', format='1.6')
546- b = self.make_branch('b', format='1.6')
547- b.set_stacked_on_url('../subdir/a')
548- c = self.make_branch('subdir/c', format='1.6')
549- c.set_stacked_on_url('../../b')
550- opener = self.makeBranchOpener([c.base, b.base, a.base])
551- # This doesn't raise an exception.
552- opener.open(c.base)
553-
554- def testForbiddenURL(self):
555- # checkSource raises a BadUrl exception if a branch is stacked on a
556- # branch with a forbidden URL.
557- stacked_on_branch = self.make_branch('base-branch', format='1.6')
558- stacked_branch = self.make_branch('stacked-branch', format='1.6')
559- stacked_branch.set_stacked_on_url(stacked_on_branch.base)
560- opener = self.makeBranchOpener([stacked_branch.base])
561- self.assertRaises(BadUrl, opener.open, stacked_branch.base)
562-
563- def testForbiddenURLNested(self):
564- # checkSource raises a BadUrl exception if a branch is stacked on a
565- # branch that is in turn stacked on a branch with a forbidden URL.
566- a = self.make_branch('a', format='1.6')
567- b = self.make_branch('b', format='1.6')
568- b.set_stacked_on_url(a.base)
569- c = self.make_branch('c', format='1.6')
570- c.set_stacked_on_url(b.base)
571- opener = self.makeBranchOpener([c.base, b.base])
572- self.assertRaises(BadUrl, opener.open, c.base)
573-
574- def testSelfStackedBranch(self):
575- # checkSource raises StackingLoopError if a branch is stacked on
576- # itself. This avoids infinite recursion errors.
577- a = self.make_branch('a', format='1.6')
578- # Bazaar 1.17 and up make it harder to create branches like this.
579- # It's still worth testing that we don't blow up in the face of them,
580- # so we grovel around a bit to create one anyway.
581- a.get_config().set_user_option('stacked_on_location', a.base)
582- opener = self.makeBranchOpener([a.base])
583- self.assertRaises(BranchLoopError, opener.open, a.base)
584-
585- def testLoopStackedBranch(self):
586- # checkSource raises StackingLoopError if a branch is stacked in such
587- # a way so that it is ultimately stacked on itself. e.g. a stacked on
588- # b stacked on a.
589- a = self.make_branch('a', format='1.6')
590- b = self.make_branch('b', format='1.6')
591- a.set_stacked_on_url(b.base)
592- b.set_stacked_on_url(a.base)
593- opener = self.makeBranchOpener([a.base, b.base])
594- self.assertRaises(BranchLoopError, opener.open, a.base)
595- self.assertRaises(BranchLoopError, opener.open, b.base)
596-
597-
598-class TestSafeOpen(TestCaseWithTransport):
599- """Tests for `safe_open`."""
600-
601- def get_chrooted_scheme(self, relpath):
602- """Create a server that is chrooted to `relpath`.
603-
604- :return: ``(scheme, get_url)`` where ``scheme`` is the scheme of the
605- chroot server and ``get_url`` returns URLs on said server.
606- """
607- transport = self.get_transport(relpath)
608- chroot_server = chroot.ChrootServer(transport)
609- chroot_server.start_server()
610- self.addCleanup(chroot_server.stop_server)
611- def get_url(relpath):
612- return chroot_server.get_url() + relpath
613- return URI(chroot_server.get_url()).scheme, get_url
614-
615- def test_stacked_within_scheme(self):
616- # A branch that is stacked on a URL of the same scheme is safe to
617- # open.
618- self.get_transport().mkdir('inside')
619- self.make_branch('inside/stacked')
620- self.make_branch('inside/stacked-on')
621- scheme, get_chrooted_url = self.get_chrooted_scheme('inside')
622- Branch.open(get_chrooted_url('stacked')).set_stacked_on_url(
623- get_chrooted_url('stacked-on'))
624- safe_open(scheme, get_chrooted_url('stacked'))
625-
626- def test_stacked_outside_scheme(self):
627- # A branch that is stacked on a URL that is not of the same scheme is
628- # not safe to open.
629- self.get_transport().mkdir('inside')
630- self.get_transport().mkdir('outside')
631- self.make_branch('inside/stacked')
632- self.make_branch('outside/stacked-on')
633- scheme, get_chrooted_url = self.get_chrooted_scheme('inside')
634- Branch.open(get_chrooted_url('stacked')).set_stacked_on_url(
635- self.get_url('outside/stacked-on'))
636- self.assertRaises(
637- BadUrl, safe_open, scheme, get_chrooted_url('stacked'))
638
639=== modified file 'lib/lp/registry/interfaces/distroseries.py'
640--- lib/lp/registry/interfaces/distroseries.py 2011-08-05 03:58:16 +0000
641+++ lib/lp/registry/interfaces/distroseries.py 2011-08-11 10:31:20 +0000
642@@ -786,20 +786,34 @@
643 DistroSeriesBinaryPackage objects that match the given text.
644 """
645
646- def createQueueEntry(pocket, archive, changesfilename, changesfilecontent,
647+ def createQueueEntry(pocket, archive, changesfilename=None,
648+ changesfilecontent=None, changes_file_alias=None,
649 signingkey=None, package_copy_job=None):
650 """Create a queue item attached to this distroseries.
651
652- Create a new records respecting the given pocket and archive.
653-
654- The default state is NEW, sorted sqlobject declaration, any
655- modification should be performed via Queue state-machine.
656-
657- The changesfile argument should be the text of the .changes for this
658- upload. The contents of this may be used later.
659-
660- 'signingkey' is the IGPGKey used to sign the changesfile or None if
661- the changesfile is unsigned.
662+ Create a new `PackageUpload` to the given pocket and archive.
663+
664+ The default state is NEW. Any further state changes go through
665+ the Queue state-machine.
666+
667+ :param pocket: The `PackagePublishingPocket` to upload to.
668+ :param archive: The `Archive` to upload to. Must be for the same
669+ `Distribution` as this series.
670+ :param changesfilename: Name for the upload's .changes file. You may
671+ specify a changes file by passing both `changesfilename` and
672+ `changesfilecontent`, or by passing `changes_file_alias`.
673+ :param changesfilecontent: Text for the changes file. It will be
674+ signed and stored in the Librarian. Must be passed together with
675+ `changesfilename`; alternatively, you may provide a
676+ `changes_file_alias` to replace both of these.
677+ :param changes_file_alias: A `LibraryFileAlias` containing the
678+ .changes file. Security warning: unless the file has already
679+ been checked, this may open us up to replay attacks as per bugs
680+ 159304 and 451396. Use `changes_file_alias` only if you know
681+ this can't happen.
682+ :param signingkey: `IGPGKey` used to sign the changes file, or None if
683+ it is unsigned.
684+ :return: A new `PackageUpload`.
685 """
686
687 def newArch(architecturetag, processorfamily, official, owner,
688
689=== modified file 'lib/lp/registry/model/distroseries.py'
690--- lib/lp/registry/model/distroseries.py 2011-08-05 03:58:16 +0000
691+++ lib/lp/registry/model/distroseries.py 2011-08-11 10:31:20 +0000
692@@ -86,9 +86,7 @@
693 ISeriesBugTarget,
694 )
695 from lp.bugs.interfaces.bugtaskfilter import OrderedBugTask
696-from lp.bugs.model.bug import (
697- get_bug_tags,
698- )
699+from lp.bugs.model.bug import get_bug_tags
700 from lp.bugs.model.bugtarget import (
701 BugTargetBase,
702 HasBugHeatMixin,
703@@ -1587,25 +1585,34 @@
704 get_property_cache(spph).newer_distroseries_version = version
705
706 def createQueueEntry(self, pocket, archive, changesfilename=None,
707- changesfilecontent=None, signing_key=None,
708- package_copy_job=None):
709+ changesfilecontent=None, changes_file_alias=None,
710+ signing_key=None, package_copy_job=None):
711 """See `IDistroSeries`."""
712- # We store the changes file in the librarian to avoid having to
713- # deal with broken encodings in these files; this will allow us
714- # to regenerate these files as necessary.
715- #
716- # The use of StringIO here should be safe: we do not encoding of
717- # the content in the changes file (as doing so would be guessing
718- # at best, causing unpredictable corruption), and simply pass it
719- # off to the librarian.
720-
721- if package_copy_job is None and (
722- changesfilename is None or changesfilecontent is None):
723+ if (changesfilename is None) != (changesfilecontent is None):
724+ raise AssertionError(
725+ "Inconsistent changesfilename and changesfilecontent. "
726+ "Pass either both, or neither.")
727+ if changes_file_alias is not None and changesfilename is not None:
728+ raise AssertionError(
729+ "Conflicting options: "
730+ "Both changesfilename and changes_file_alias were given.")
731+ have_changes_file = not (
732+ changesfilename is None and changes_file_alias is None)
733+ if package_copy_job is None and not have_changes_file:
734 raise AssertionError(
735+ "If there is no package_copy_job, a changes file or "
736+ "changes_file_alias must be supplied.")
737
738- if package_copy_job is None:
739+ if changesfilename is not None:
740+ # We store the changes file in the librarian to avoid having to
741+ # deal with broken encodings in these files; this will allow us
742+ # to regenerate these files as necessary.
743+ #
744+ # The use of StringIO here should be safe: we do no encoding of
745+ # the content in the changes file (as doing so would be guessing
746+ # at best, causing unpredictable corruption), and simply pass it
747+ # off to the librarian.
748+
749 # The PGP signature is stripped from all changesfiles
750 # to avoid replay attacks (see bugs 159304 and 451396).
751 signed_message = signed_message_from_string(changesfilecontent)
752@@ -1616,17 +1623,15 @@
753 if new_content is not None:
754 changesfilecontent = signed_message.signedContent
755
756- changes_file = getUtility(ILibraryFileAliasSet).create(
757+ changes_file_alias = getUtility(ILibraryFileAliasSet).create(
758 changesfilename, len(changesfilecontent),
759 StringIO(changesfilecontent), 'text/plain',
760 restricted=archive.private)
761- else:
762- changes_file = None
763
764 return PackageUpload(
765 distroseries=self, status=PackageUploadStatus.NEW,
766 pocket=pocket, archive=archive,
767- changesfile=changes_file, signing_key=signing_key,
768+ changesfile=changes_file_alias, signing_key=signing_key,
769 package_copy_job=package_copy_job)
770
771 def getPackageUploadQueue(self, state):
772
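The argument-consistency rules that `createQueueEntry` now enforces can be sketched as a standalone function (a hypothetical helper for illustration only; the real method raises these errors inline in the model class):

```python
def check_changes_args(changesfilename=None, changesfilecontent=None,
                       changes_file_alias=None, package_copy_job=None):
    """Mirror the changes-file argument checks in createQueueEntry."""
    # changesfilename and changesfilecontent travel together.
    if (changesfilename is None) != (changesfilecontent is None):
        raise AssertionError(
            "Pass changesfilename and changesfilecontent together, "
            "or neither.")
    # A pre-stored Librarian alias replaces the name/content pair.
    if changes_file_alias is not None and changesfilename is not None:
        raise AssertionError(
            "changesfilename and changes_file_alias are mutually exclusive.")
    # Without a package_copy_job, some form of changes file is required.
    have_changes_file = (
        changesfilename is not None or changes_file_alias is not None)
    if package_copy_job is None and not have_changes_file:
        raise AssertionError(
            "A changes file is required when there is no package_copy_job.")
```

Note the three cases: name/content pair, alias alone, or no changes file at all when a `package_copy_job` is present.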
773=== added file 'lib/lp/soyuz/scripts/custom_uploads_copier.py'
774--- lib/lp/soyuz/scripts/custom_uploads_copier.py 1970-01-01 00:00:00 +0000
775+++ lib/lp/soyuz/scripts/custom_uploads_copier.py 2011-08-11 10:31:20 +0000
776@@ -0,0 +1,149 @@
777+# Copyright 2011 Canonical Ltd. This software is licensed under the
778+# GNU Affero General Public License version 3 (see the file LICENSE).
779+
780+"""Copy latest custom uploads into a distribution release series.
781+
782+Use this when initializing the installer and dist upgrader for a new release
783+series based on the latest uploads from its preceding series.
784+"""
785+
786+__metaclass__ = type
787+__all__ = [
788+ 'CustomUploadsCopier',
789+ ]
790+
791+from operator import attrgetter
792+import re
793+
794+from zope.component import getUtility
795+
797+from lp.services.database.bulk import load_referencing
798+from lp.soyuz.enums import PackageUploadCustomFormat
799+from lp.soyuz.interfaces.archive import (
800+ IArchiveSet,
801+ MAIN_ARCHIVE_PURPOSES,
802+ )
803+from lp.soyuz.model.queue import PackageUploadCustom
804+
805+
806+class CustomUploadsCopier:
807+ """Copy `PackageUploadCustom` objects into a new `DistroSeries`."""
808+
809+ copyable_types = [
810+ PackageUploadCustomFormat.DEBIAN_INSTALLER,
811+ PackageUploadCustomFormat.DIST_UPGRADER,
812+ ]
813+
814+ def __init__(self, target_series):
815+ self.target_series = target_series
816+
817+ def isCopyable(self, upload):
818+ """Is `upload` the kind of `PackageUploadCustom` that we can copy?"""
819+ return upload.customformat in self.copyable_types
820+
821+ def getCandidateUploads(self, source_series):
822+ """Find custom uploads that may need copying."""
823+ uploads = source_series.getPackageUploads(
824+ custom_type=self.copyable_types)
825+ load_referencing(PackageUploadCustom, uploads, ['packageuploadID'])
826+ customs = sum([list(upload.customfiles) for upload in uploads], [])
827+ customs = filter(self.isCopyable, customs)
828+ customs.sort(key=attrgetter('id'), reverse=True)
829+ return customs
830+
831+ def extractNameFields(self, filename):
832+ """Get the relevant fields out of `filename`.
833+
834+ Scans filenames of any of these forms:
835+
836+ <package>_<version>_<architecture>.tar.<compression_suffix>
837+ <package>_<version>.tar[.<compression_suffix>]
838+
839+ Versions may contain dots, dashes, etc., but no underscores.
840+
841+ :return: A tuple of (<package>, <architecture>), or None if the
842+ filename does not match the expected pattern. If no
843+ architecture is found in the filename, it defaults to 'all'.
844+ """
845+ regex_parts = {
846+ 'package': "[^_]+",
847+ 'version': "[^_]+",
848+ 'arch': "[^._]+",
849+ }
850+ filename_regex = (
851+ "(%(package)s)_%(version)s(?:_(%(arch)s))?\.tar" % regex_parts)
852+ match = re.match(filename_regex, filename)
853+ if match is None:
854+ return None
855+ default_arch = 'all'
856+ fields = match.groups(default_arch)
857+ if len(fields) != 2:
858+ return None
859+ return fields
860+
861+ def getKey(self, upload):
862+ """Get an indexing key for `upload`."""
863+ custom_format = (upload.customformat, )
864+ name_fields = self.extractNameFields(upload.libraryfilealias.filename)
865+ if name_fields is None:
866+ return None
867+ else:
868+ return custom_format + name_fields
869+
870+ def getLatestUploads(self, source_series):
871+ """Find the latest uploads.
872+
873+ :param source_series: The `DistroSeries` whose uploads to get.
874+ :return: A dict containing the latest uploads, indexed by keys as
875+ returned by `getKey`.
876+ """
877+ latest_uploads = {}
878+ for upload in self.getCandidateUploads(source_series):
879+ key = self.getKey(upload)
880+ if key is not None:
881+ latest_uploads.setdefault(key, upload)
882+ return latest_uploads
883+
884+ def getTargetArchive(self, original_archive):
885+ """Find counterpart of `original_archive` in `self.target_series`.
886+
887+ :param original_archive: The `Archive` that the original upload went
888+ into. If this is not a primary, partner, or debug archive,
889+ None is returned.
890+ :return: The `Archive` of the same purpose for `self.target_series`.
891+ """
892+ if original_archive.purpose not in MAIN_ARCHIVE_PURPOSES:
893+ return None
894+ return getUtility(IArchiveSet).getByDistroPurpose(
895+ self.target_series.distribution, original_archive.purpose)
896+
897+ def isObsolete(self, upload, target_uploads):
898+ """Is `upload` superseded by one that the target series already has?
899+
900+ :param upload: A `PackageUploadCustom` from the source series.
901+ :param target_uploads: The target series' latest uploads, as
902+ returned by `getLatestUploads`.
902+ """
903+ existing_upload = target_uploads.get(self.getKey(upload))
904+ return existing_upload is not None and existing_upload.id >= upload.id
905+
906+ def copyUpload(self, original_upload):
907+ """Copy `original_upload` into `self.target_series`."""
908+ target_archive = self.getTargetArchive(
909+ original_upload.packageupload.archive)
910+ if target_archive is None:
911+ return None
912+ package_upload = self.target_series.createQueueEntry(
913+ original_upload.packageupload.pocket, target_archive,
914+ changes_file_alias=original_upload.packageupload.changesfile)
915+ custom = package_upload.addCustom(
916+ original_upload.libraryfilealias, original_upload.customformat)
917+ package_upload.setAccepted()
918+ return custom
919+
920+ def copy(self, source_series):
921+ """Copy uploads from `source_series`."""
922+ target_uploads = self.getLatestUploads(self.target_series)
923+ for upload in self.getLatestUploads(source_series).itervalues():
924+ if not self.isObsolete(upload, target_uploads):
925+ self.copyUpload(upload)
926
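The "latest upload wins" bookkeeping shared by `getLatestUploads` and `isObsolete` above can be illustrated with a minimal standalone model (the `Upload` tuple and function names here are invented for the sketch; they are not the Launchpad classes):

```python
from collections import namedtuple

# Stand-in for PackageUploadCustom: a higher id means a newer upload.
Upload = namedtuple('Upload', ['id', 'custom_format', 'package', 'arch'])

def get_key(upload):
    """Index key: custom upload type, package name, and architecture."""
    return (upload.custom_format, upload.package, upload.arch)

def get_latest_uploads(uploads):
    """Map each key to its newest upload, as getLatestUploads does."""
    latest = {}
    # Iterate newest-first so setdefault keeps the newest upload per key.
    for upload in sorted(uploads, key=lambda u: u.id, reverse=True):
        latest.setdefault(get_key(upload), upload)
    return latest

def is_obsolete(upload, target_uploads):
    """True if the target already has this upload or a newer equivalent."""
    existing = target_uploads.get(get_key(upload))
    return existing is not None and existing.id >= upload.id
```

This is why the copy is idempotent: after a first run, the target holds equivalents with ids at least as high as the source's, so every candidate is obsolete on the second run.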
927=== added file 'lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py'
928--- lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py 1970-01-01 00:00:00 +0000
929+++ lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py 2011-08-11 10:31:20 +0000
930@@ -0,0 +1,420 @@
931+# Copyright 2011 Canonical Ltd. This software is licensed under the
932+# GNU Affero General Public License version 3 (see the file LICENSE).
933+
934+"""Test copying of custom package uploads for a new `DistroSeries`."""
935+
936+__metaclass__ = type
937+
938+from canonical.testing.layers import (
939+ LaunchpadZopelessLayer,
940+ ZopelessLayer,
941+ )
942+from lp.soyuz.enums import (
943+ ArchivePurpose,
944+ PackageUploadCustomFormat,
945+ PackageUploadStatus,
946+ )
947+from lp.soyuz.interfaces.archive import MAIN_ARCHIVE_PURPOSES
948+from lp.soyuz.scripts.custom_uploads_copier import CustomUploadsCopier
949+from lp.testing import TestCaseWithFactory
950+from lp.testing.fakemethod import FakeMethod
951+
952+
953+def list_custom_uploads(distroseries):
954+ """Return a list of all `PackageUploadCustom`s for `distroseries`."""
955+ return sum(
956+ [
957+ list(upload.customfiles)
958+ for upload in distroseries.getPackageUploads()],
959+ [])
960+
961+
962+class FakeDistroSeries:
963+ """Fake `DistroSeries` for test copiers that don't really need one."""
964+
965+
966+class FakeLibraryFileAlias:
967+ def __init__(self, filename):
968+ self.filename = filename
969+
970+
971+class FakeUpload:
972+ def __init__(self, customformat, filename):
973+ self.customformat = customformat
974+ self.libraryfilealias = FakeLibraryFileAlias(filename)
975+
976+
977+class CommonTestHelpers:
978+ """Helper(s) for these tests."""
979+ def makeVersion(self):
980+ """Create a fake version string."""
981+ return "%d.%d-%s" % (
982+ self.factory.getUniqueInteger(),
983+ self.factory.getUniqueInteger(),
984+ self.factory.getUniqueString())
985+
986+
987+class TestCustomUploadsCopierLite(TestCaseWithFactory, CommonTestHelpers):
988+ """Light-weight low-level tests for `CustomUploadsCopier`."""
989+
990+ layer = ZopelessLayer
991+
992+ def test_isCopyable_matches_copyable_types(self):
993+ # isCopyable checks a custom upload's customformat field to
994+ # determine whether the upload is a candidate for copying. It
995+ # approves only those whose customformats are in copyable_types.
996+ class FakePackageUploadCustom:
997+ def __init__(self, customformat):
998+ self.customformat = customformat
999+
1000+ uploads = [
1001+ FakePackageUploadCustom(custom_type)
1002+ for custom_type in PackageUploadCustomFormat.items]
1003+
1004+ copier = CustomUploadsCopier(FakeDistroSeries())
1005+ copied_uploads = filter(copier.isCopyable, uploads)
1006+ self.assertContentEqual(
1007+ CustomUploadsCopier.copyable_types,
1008+ [upload.customformat for upload in copied_uploads])
1009+
1010+ def test_extractNameFields_extracts_package_name_and_architecture(self):
1011+ # extractNameFields picks up the package name and architecture
1012+ # out of an upload's filename field.
1013+ package_name = self.factory.getUniqueString('package')
1014+ version = self.makeVersion()
1015+ architecture = self.factory.getUniqueString('arch')
1016+ filename = '%s_%s_%s.tar.gz' % (package_name, version, architecture)
1017+ copier = CustomUploadsCopier(FakeDistroSeries())
1018+ self.assertEqual(
1019+ (package_name, architecture), copier.extractNameFields(filename))
1020+
1021+ def test_extractNameFields_does_not_require_architecture(self):
1022+ # When extractNameFields does not see an architecture, it
1023+ # defaults to 'all'.
1024+ package_name = self.factory.getUniqueString('package')
1025+ filename = '%s_%s.tar.gz' % (package_name, self.makeVersion())
1026+ copier = CustomUploadsCopier(FakeDistroSeries())
1027+ self.assertEqual(
1028+ (package_name, 'all'), copier.extractNameFields(filename))
1029+
1030+ def test_extractNameFields_returns_None_on_mismatch(self):
1031+ # If the filename does not match the expected pattern,
1032+ # extractNameFields returns None.
1033+ copier = CustomUploadsCopier(FakeDistroSeries())
1034+ self.assertIs(None, copier.extractNameFields('argh_1.0.jpg'))
1035+
1036+ def test_extractNameFields_ignores_names_with_too_many_fields(self):
1037+ # As one particularly nasty case that might break
1038+ # extractNameFields, a name with more underscore-separated fields
1039+ # than the search pattern allows for is sensibly rejected.
1040+ copier = CustomUploadsCopier(FakeDistroSeries())
1041+ self.assertIs(
1042+ None, copier.extractNameFields('one_two_three_four_5.tar.gz'))
1043+
1044+ def test_getKey_returns_None_on_name_mismatch(self):
1045+ # If extractNameFields returns None, getKey also returns None.
1046+ copier = CustomUploadsCopier(FakeDistroSeries())
1047+ copier.extractNameFields = FakeMethod()
1048+ self.assertIs(
1049+ None,
1050+ copier.getKey(FakeUpload(
1051+ PackageUploadCustomFormat.DEBIAN_INSTALLER,
1052+ "bad-filename.tar")))
1053+
1054+
1055+class TestCustomUploadsCopier(TestCaseWithFactory, CommonTestHelpers):
1056+ """Heavyweight `CustomUploadsCopier` tests."""
1057+
1058+ # Alas, PackageUploadCustom relies on the Librarian.
1059+ layer = LaunchpadZopelessLayer
1060+
1061+ def makeUpload(self, distroseries=None,
1062+ custom_type=PackageUploadCustomFormat.DEBIAN_INSTALLER,
1063+ package_name=None, version=None, arch=None):
1064+ """Create a `PackageUploadCustom`."""
1065+ if distroseries is None:
1066+ distroseries = self.factory.makeDistroSeries()
1067+ if package_name is None:
1068+ package_name = self.factory.getUniqueString("package")
1069+ if version is None:
1070+ version = self.makeVersion()
1071+ filename = "%s.tar.gz" % '_'.join(
1072+ filter(None, [package_name, version, arch]))
1073+ package_upload = self.factory.makeCustomPackageUpload(
1074+ distroseries=distroseries, custom_type=custom_type,
1075+ filename=filename)
1076+ return package_upload.customfiles[0]
1077+
1078+ def test_copies_custom_upload(self):
1079+ # CustomUploadsCopier copies custom uploads from one series to
1080+ # another.
1081+ current_series = self.factory.makeDistroSeries()
1082+ original_upload = self.makeUpload(current_series)
1083+ new_series = self.factory.makeDistroSeries(
1084+ distribution=current_series.distribution,
1085+ previous_series=current_series)
1086+
1087+ CustomUploadsCopier(new_series).copy(current_series)
1088+
1089+ [copied_upload] = list_custom_uploads(new_series)
1090+ self.assertEqual(
1091+ original_upload.libraryfilealias, copied_upload.libraryfilealias)
1092+
1093+ def test_is_idempotent(self):
1094+ # It's safe to perform the same copy more than once; the uploads
1095+ # get copied only once.
1096+ current_series = self.factory.makeDistroSeries()
1097+ self.makeUpload(current_series)
1098+ new_series = self.factory.makeDistroSeries(
1099+ distribution=current_series.distribution,
1100+ previous_series=current_series)
1101+
1102+ copier = CustomUploadsCopier(new_series)
1103+ copier.copy(current_series)
1104+ uploads_after_first_copy = list_custom_uploads(new_series)
1105+ copier.copy(current_series)
1106+ uploads_after_redundant_copy = list_custom_uploads(new_series)
1107+
1108+ self.assertEqual(
1109+ uploads_after_first_copy, uploads_after_redundant_copy)
1110+
1111+ def test_getCandidateUploads_filters_by_distroseries(self):
1112+ # getCandidateUploads ignores uploads for other distroseries.
1113+ source_series = self.factory.makeDistroSeries()
1114+ matching_upload = self.makeUpload(source_series)
1115+ nonmatching_upload = self.makeUpload()
1116+ copier = CustomUploadsCopier(FakeDistroSeries())
1117+ candidate_uploads = copier.getCandidateUploads(source_series)
1118+ self.assertContentEqual([matching_upload], candidate_uploads)
1119+ self.assertNotIn(nonmatching_upload, candidate_uploads)
1120+
1121+ def test_getCandidateUploads_filters_upload_types(self):
1122+ # getCandidateUploads returns only uploads of the types listed
1123+ # in copyable_types; other types of upload are ignored.
1124+ source_series = self.factory.makeDistroSeries()
1125+ for custom_format in PackageUploadCustomFormat.items:
1126+ self.makeUpload(source_series, custom_type=custom_format)
1127+
1128+ copier = CustomUploadsCopier(FakeDistroSeries())
1129+ candidate_uploads = copier.getCandidateUploads(source_series)
1130+ copied_types = [upload.customformat for upload in candidate_uploads]
1131+ self.assertContentEqual(
1132+ CustomUploadsCopier.copyable_types, copied_types)
1133+
1134+ def test_getCandidateUploads_ignores_other_attachments(self):
1135+ # A PackageUpload can have multiple PackageUploadCustoms
1136+ # attached, potentially of different types. getCandidateUploads
1137+ # ignores PackageUploadCustoms of types that aren't supposed to
1138+ # be copied, even if they are attached to PackageUploads that
1139+ # also have PackageUploadCustoms that do need to be copied.
1140+ source_series = self.factory.makeDistroSeries()
1141+ package_upload = self.factory.makePackageUpload(
1142+ distroseries=source_series, archive=source_series.main_archive)
1143+ library_file = self.factory.makeLibraryFileAlias()
1144+ matching_upload = package_upload.addCustom(
1145+ library_file, PackageUploadCustomFormat.DEBIAN_INSTALLER)
1146+ nonmatching_upload = package_upload.addCustom(
1147+ library_file, PackageUploadCustomFormat.ROSETTA_TRANSLATIONS)
1148+ copier = CustomUploadsCopier(FakeDistroSeries())
1149+ candidates = copier.getCandidateUploads(source_series)
1150+ self.assertContentEqual([matching_upload], candidates)
1151+ self.assertNotIn(nonmatching_upload, candidates)
1152+
1153+ def test_getCandidateUploads_orders_newest_to_oldest(self):
1154+ # getCandidateUploads returns its PackageUploadCustoms ordered
1155+ # from newest to oldest.
1156+ source_series = self.factory.makeDistroSeries()
1157+ for counter in xrange(5):
1158+ self.makeUpload(source_series)
1159+ copier = CustomUploadsCopier(FakeDistroSeries())
1160+ candidate_ids = [
1161+ upload.id for upload in copier.getCandidateUploads(source_series)]
1162+ self.assertEqual(sorted(candidate_ids, reverse=True), candidate_ids)
1163+
1164+ def test_getKey_includes_format_package_and_architecture(self):
1165+ # The key returned by getKey consists of custom upload type,
1166+ # package name, and architecture.
1167+ source_series = self.factory.makeDistroSeries()
1168+ upload = self.makeUpload(
1169+ source_series, PackageUploadCustomFormat.DIST_UPGRADER,
1170+ package_name='upgrader', arch='mips')
1171+ copier = CustomUploadsCopier(FakeDistroSeries())
1172+ expected_key = (
1173+ PackageUploadCustomFormat.DIST_UPGRADER,
1174+ 'upgrader',
1175+ 'mips',
1176+ )
1177+ self.assertEqual(expected_key, copier.getKey(upload))
1178+
1179+ def test_getLatestUploads_indexes_uploads_by_key(self):
1180+ # getLatestUploads returns a dict of uploads, indexed by keys
1181+ # returned by getKey.
1182+ source_series = self.factory.makeDistroSeries()
1183+ upload = self.makeUpload(source_series)
1184+ copier = CustomUploadsCopier(FakeDistroSeries())
1185+ self.assertEqual(
1186+ {copier.getKey(upload): upload},
1187+ copier.getLatestUploads(source_series))
1188+
1189+ def test_getLatestUploads_filters_superseded_uploads(self):
1190+ # getLatestUploads returns only the latest upload for a given
1191+ # distroseries, type, package, and architecture. Any older
1192+ # uploads with the same distroseries, type, package name, and
1193+ # architecture are ignored.
1194+ source_series = self.factory.makeDistroSeries()
1195+ uploads = [
1196+ self.makeUpload(
1197+ source_series, package_name='installer', version='1.0.0',
1198+ arch='ppc')
1199+ for counter in xrange(3)]
1200+
1201+ copier = CustomUploadsCopier(FakeDistroSeries())
1202+ self.assertContentEqual(
1203+ uploads[-1:], copier.getLatestUploads(source_series).values())
1204+
1205+ def test_getLatestUploads_bundles_versions(self):
1206+ # getLatestUploads sees an upload as superseding an older one
1207+ # for the same distroseries, type, package name, and
1208+ # architecture even if they have different versions.
1209+ source_series = self.factory.makeDistroSeries()
1210+ uploads = [
1211+ self.makeUpload(source_series, package_name='foo', arch='i386')
1212+ for counter in xrange(2)]
1213+ copier = CustomUploadsCopier(FakeDistroSeries())
1214+ self.assertContentEqual(
1215+ uploads[-1:], copier.getLatestUploads(source_series).values())
1216+
1217+ def test_getTargetArchive_on_same_distro_is_same_archive(self):
1218+ # When copying within the same distribution, getTargetArchive
1219+ # always returns the same archive you feed it.
1220+ distro = self.factory.makeDistribution()
1221+ archives = [
1222+ self.factory.makeArchive(distribution=distro, purpose=purpose)
1223+ for purpose in MAIN_ARCHIVE_PURPOSES]
1224+ copier = CustomUploadsCopier(self.factory.makeDistroSeries(distro))
1225+ self.assertEqual(
1226+ archives,
1227+ [copier.getTargetArchive(archive) for archive in archives])
1228+
1229+ def test_getTargetArchive_returns_None_if_not_distribution_archive(self):
1230+ # getTargetArchive returns None for any archive that is not a
1231+ # distribution archive, regardless of whether the target series
1232+ # has an equivalent.
1233+ distro = self.factory.makeDistribution()
1234+ archives = [
1235+ self.factory.makeArchive(distribution=distro, purpose=purpose)
1236+ for purpose in ArchivePurpose.items
1237+ if purpose not in MAIN_ARCHIVE_PURPOSES]
1238+ copier = CustomUploadsCopier(self.factory.makeDistroSeries(distro))
1239+ self.assertEqual(
1240+ [None] * len(archives),
1241+ [copier.getTargetArchive(archive) for archive in archives])
1242+
1243+ def test_getTargetArchive_finds_matching_archive(self):
1244+ # When copying across archives, getTargetArchive looks for an
1245+ # archive for the target series with the same purpose as the
1246+ # original archive.
1247+ source_series = self.factory.makeDistroSeries()
1248+ source_archive = self.factory.makeArchive(
1249+ distribution=source_series.distribution,
1250+ purpose=ArchivePurpose.PARTNER)
1251+ target_series = self.factory.makeDistroSeries()
1252+ target_archive = self.factory.makeArchive(
1253+ distribution=target_series.distribution,
1254+ purpose=ArchivePurpose.PARTNER)
1255+
1256+ copier = CustomUploadsCopier(target_series)
1257+ self.assertEqual(
1258+ target_archive, copier.getTargetArchive(source_archive))
1259+
1260+ def test_getTargetArchive_returns_None_if_no_archive_matches(self):
1261+ # If the target series has no archive to match the archive that
1262+ # the original upload was for, it returns None.
1263+ source_series = self.factory.makeDistroSeries()
1264+ source_archive = self.factory.makeArchive(
1265+ distribution=source_series.distribution,
1266+ purpose=ArchivePurpose.PARTNER)
1267+ target_series = self.factory.makeDistroSeries()
1268+ copier = CustomUploadsCopier(target_series)
1269+ self.assertIs(None, copier.getTargetArchive(source_archive))
1270+
1271+ def test_isObsolete_returns_False_if_no_equivalent_in_target(self):
1272+ # isObsolete returns False if the upload in question has no
1273+ # equivalent in the target series.
1274+ source_series = self.factory.makeDistroSeries()
1275+ upload = self.makeUpload(source_series)
1276+ target_series = self.factory.makeDistroSeries()
1277+ copier = CustomUploadsCopier(target_series)
1278+ self.assertFalse(
1279+ copier.isObsolete(upload, copier.getLatestUploads(target_series)))
1280+
1281+ def test_isObsolete_returns_False_if_target_has_older_equivalent(self):
1282+ # isObsolete returns False if the target has an equivalent of
1283+ # the upload in question, but it's older than the version the
1284+ # source series has.
1285+ source_series = self.factory.makeDistroSeries()
1286+ target_series = self.factory.makeDistroSeries()
1287+ self.makeUpload(
1288+ target_series, package_name='installer', arch='ppc64')
1289+ source_upload = self.makeUpload(
1290+ source_series, package_name='installer', arch='ppc64')
1291+ copier = CustomUploadsCopier(target_series)
1292+ self.assertFalse(
1293+ copier.isObsolete(
1294+ source_upload, copier.getLatestUploads(target_series)))
1295+
1296+ def test_isObsolete_returns_True_if_target_has_newer_equivalent(self):
1297+ # isObsolete returns True if the target series already has a
1298+ # newer equivalent of the upload in question (as would be the
1299+ # case, for instance, if the upload had already been copied).
1300+ source_series = self.factory.makeDistroSeries()
1301+ source_upload = self.makeUpload(
1302+ source_series, package_name='installer', arch='alpha')
1303+ target_series = self.factory.makeDistroSeries()
1304+ self.makeUpload(
1305+ target_series, package_name='installer', arch='alpha')
1306+ copier = CustomUploadsCopier(target_series)
1307+ self.assertTrue(
1308+ copier.isObsolete(
1309+ source_upload, copier.getLatestUploads(target_series)))
1310+
1311+ def test_copyUpload_creates_upload(self):
1312+ # copyUpload creates a new upload that's very similar to the
1313+ # original, but for the target series.
1314+ original_upload = self.makeUpload()
1315+ target_series = self.factory.makeDistroSeries()
1316+ copier = CustomUploadsCopier(target_series)
1317+ copied_upload = copier.copyUpload(original_upload)
1318+ self.assertEqual([copied_upload], list_custom_uploads(target_series))
1319+ self.assertNotEqual(
1320+ original_upload.packageupload, copied_upload.packageupload)
1321+ self.assertEqual(
1322+ original_upload.customformat, copied_upload.customformat)
1323+ self.assertEqual(
1324+ original_upload.libraryfilealias, copied_upload.libraryfilealias)
1325+ self.assertEqual(
1326+ original_upload.packageupload.changesfile,
1327+ copied_upload.packageupload.changesfile)
1328+ self.assertEqual(
1329+ original_upload.packageupload.pocket,
1330+ copied_upload.packageupload.pocket)
1331+
1332+ def test_copyUpload_accepts_upload(self):
1333+ # Uploads created by copyUpload are automatically accepted.
1334+ original_upload = self.makeUpload()
1335+ target_series = self.factory.makeDistroSeries()
1336+ copier = CustomUploadsCopier(target_series)
1337+ copied_upload = copier.copyUpload(original_upload)
1338+ self.assertEqual(
1339+ PackageUploadStatus.ACCEPTED, copied_upload.packageupload.status)
1340+
1341+ def test_copyUpload_does_not_copy_if_no_archive_matches(self):
1342+ # If getTargetArchive does not find an appropriate target
1343+ # archive, copyUpload does nothing.
1344+ source_series = self.factory.makeDistroSeries()
1345+ upload = self.makeUpload(distroseries=source_series)
1346+ target_series = self.factory.makeDistroSeries()
1347+ copier = CustomUploadsCopier(target_series)
1348+ copier.getTargetArchive = FakeMethod(result=None)
1349+ self.assertIs(None, copier.copyUpload(upload))
1350+ self.assertEqual([], list_custom_uploads(target_series))
1351
1352=== modified file 'lib/lp/soyuz/tests/test_publishing.py'
1353--- lib/lp/soyuz/tests/test_publishing.py 2011-08-03 11:00:11 +0000
1354+++ lib/lp/soyuz/tests/test_publishing.py 2011-08-11 10:31:20 +0000
1355@@ -159,7 +159,7 @@
1356 signing_key = self.person.gpg_keys[0]
1357 package_upload = distroseries.createQueueEntry(
1358 pocket, archive, changes_file_name, changes_file_content,
1359- signing_key)
1360+ signing_key=signing_key)
1361
1362 status_to_method = {
1363 PackageUploadStatus.DONE: 'setDone',
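The filename pattern that the new copier's `extractNameFields` matches can be exercised in isolation. Below is a minimal re-implementation for illustration (not the Launchpad code itself): it accepts `<package>_<version>[_<arch>].tar[.<compression>]`, where the version may contain dots and dashes but no underscores.

```python
import re

# Package and version are underscore-free; the architecture, when
# present, also excludes dots. The trailing ".tar" dot is escaped so it
# matches a literal dot.
FILENAME_REGEX = re.compile(r"([^_]+)_[^_]+(?:_([^._]+))?\.tar")

def extract_name_fields(filename):
    """Return (package, architecture), defaulting arch to 'all', or None."""
    match = FILENAME_REGEX.match(filename)
    if match is None:
        return None
    package, arch = match.groups()
    return (package, arch or 'all')
```

Names with extra underscore-separated fields fail to match, since no combination of the underscore-free groups can reach the `.tar` suffix.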