Merge lp:~jtv/launchpad/bug-659769 into lp:launchpad

Proposed by Jeroen T. Vermeulen
Status: Superseded
Proposed branch: lp:~jtv/launchpad/bug-659769
Merge into: lp:launchpad
Diff against target: 1363 lines (+668/-571)
9 files modified
lib/lp/archivepublisher/scripts/publish_ftpmaster.py (+23/-8)
lib/lp/archivepublisher/tests/test_publish_ftpmaster.py (+24/-0)
lib/lp/codehosting/safe_open.py (+0/-263)
lib/lp/codehosting/tests/test_safe_open.py (+0/-267)
lib/lp/registry/interfaces/distroseries.py (+25/-11)
lib/lp/registry/model/distroseries.py (+26/-21)
lib/lp/soyuz/scripts/custom_uploads_copier.py (+149/-0)
lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py (+420/-0)
lib/lp/soyuz/tests/test_publishing.py (+1/-1)
To merge this branch: bzr merge lp:~jtv/launchpad/bug-659769
Reviewer Review Type Date Requested Status
Brad Crittenden (community) code Needs Information
Review via email: mp+71173@code.launchpad.net

This proposal has been superseded by a proposal from 2011-08-15.

Commit message

Copy custom uploads into new distro release.

Description of the change

= Summary =

Every time the Ubuntu people (particularly Colin) set up a new release, they need to copy some of the archive's custom-upload files to the new release. Custom uploads are almost entirely unmanaged, so they would probably do this by copying directly in the filesystem. They have asked for an easier, more integrated way to get it done.

== Proposed fix ==

As far as I can make out, the only types of custom upload that need to be copied are for the Debian installer and the dist upgrader. The Rosetta translations are shared within the database, and presumably both kinds of translations for a package will soon be re-uploaded anyway. So I focused on the installers and upgraders.

Because custom uploads are so lightly managed (little or no metadata is kept, and we'd have to parse tarballs from the Librarian just to see what files came out of which upload), it's hard to figure out exactly what should or should not be copied. Some of the files may be obsolete, and if we copy them along, they'll be with us forever.

The uploads contain tarballs with names in just one of two formats: <package>_<version>_<architecture>.tar.gz and <package>_<version>.tar.gz. A tarball for a given installer version, say, should contain all files that that version needs; there's no finicky merging of individual files that may each be current or obsolete. We can just identify the upload with the current tarball, and copy that upload into the new series.

Rather than get into the full hairy detail of version parsing (some of the versions look a little ad-hoc), I'm just copying the latest custom uploads for each (upload type, <package>, <architecture>). The <architecture> defaults to "all" because that's what seems to be meant when it is omitted.
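To make the naming scheme concrete, here is a minimal sketch of how tarball names could be reduced to a (package, architecture) key, with the architecture defaulting to "all" when omitted, and only the latest upload kept per key. The regex and function names are illustrative only; this is not the actual CustomUploadsCopier implementation.

```python
import re

# Matches <package>_<version>.tar.gz and <package>_<version>_<architecture>.tar.gz.
CUSTOM_NAME = re.compile(
    r'^(?P<package>[^_]+)_(?P<version>[^_]+)(?:_(?P<arch>[^_]+))?\.tar\.gz$')


def extract_key(filename):
    """Return (package, architecture) for a custom-upload tarball name.

    Returns None for names that do not follow the convention.  A missing
    architecture defaults to 'all', which appears to be what an omitted
    architecture means in practice.
    """
    match = CUSTOM_NAME.match(filename)
    if match is None:
        return None
    return (match.group('package'), match.group('arch') or 'all')


def pick_latest(filenames):
    """Keep only the newest tarball per (package, architecture) key.

    'Newest' here is simply the last one seen, mirroring an iteration over
    uploads from oldest to newest; no full version parsing is attempted,
    just as the branch itself avoids it.
    """
    latest = {}
    for name in filenames:
        key = extract_key(name)
        if key is not None:
            latest[key] = name
    return latest
```

For example, `extract_key('debian-installer-images_1.0-20110805_i386.tar.gz')` yields `('debian-installer-images', 'i386')`, while a name with no architecture part maps to `'all'`.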

As far as I've seen, the scheme I implemented here would actually work just fine for the other upload types. It will also work for derived distributions, with two limitations:

1. The custom uploads are copied only between consecutive releases of the same distribution, not across inheritance lines.

2. Uploads have to follow this naming scheme in order to be copied. The naming scheme is defined by the package itself.

Having this supported for derived distributions could help save manual maintenance and system access needs for derived distros. We may want to add cross-distro inheritance of custom uploads later, but then we'll have to solve the conflict-resolution problem: should we copy the previous series' latest installer, or the one from the parent distro in the series that we're deriving from? What if there are multiple parent distros or releases?

== Pre-implementation notes ==

Discussed with various people, but alas not with Colin at the time of writing. I'm going to hold off on landing until he confirms that this will give him what he needs. Watch this space for updates.

== Implementation details ==

== Tests ==

{{{
./bin/test -vvc lp.soyuz -t test_publishing -t test_custom_uploads_copier
./bin/test -vvc lp.archivepublisher.tests.test_publish_ftpmaster
}}}

== Demo and Q/A ==

We'll have to do this hand in hand with the distro gurus, not only to validate the result but also to help set up the test scenario!

= Launchpad lint =

Checking for conflicts and issues in changed files.

Linting changed files:
  lib/lp/soyuz/tests/test_publishing.py
  lib/lp/archivepublisher/scripts/publish_ftpmaster.py
  lib/lp/registry/interfaces/distroseries.py
  lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py
  lib/lp/soyuz/scripts/custom_uploads_copier.py
  lib/lp/archivepublisher/tests/test_publish_ftpmaster.py
  lib/lp/registry/model/distroseries.py

Revision history for this message
Brad Crittenden (bac) wrote :

Hi Jeroen,

I tried to run your tests but there are failures due to the removal of 'safe_open' which is still referenced in many places throughout the code base. Was its removal an accident? Did you move it to another file that you forgot to add to version control?

Brad

review: Needs Information (code)
Revision history for this message
Jeroen T. Vermeulen (jtv) wrote :

No idea what safe_open is or how it got into this MP. I'll have to resubmit.

Preview Diff

=== modified file 'lib/lp/archivepublisher/scripts/publish_ftpmaster.py'
--- lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-08-09 10:30:43 +0000
+++ lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-08-11 10:31:20 +0000
@@ -27,6 +27,7 @@
     )
 from lp.services.utils import file_exists
 from lp.soyuz.enums import ArchivePurpose
+from lp.soyuz.scripts.custom_uploads_copier import CustomUploadsCopier
 from lp.soyuz.scripts.ftpmaster import LpQueryDistro
 from lp.soyuz.scripts.processaccepted import ProcessAccepted
 from lp.soyuz.scripts.publishdistro import PublishDistro
@@ -539,6 +540,25 @@
             self.recoverWorkingDists()
             raise

+    def prepareFreshSeries(self):
+        """If there are any new distroseries, prepare them for publishing.
+
+        :return: True if a series did indeed still need some preparation,
+            or False for the normal case.
+        """
+        have_fresh_series = False
+        for series in self.distribution.series:
+            suites_needing_indexes = self.listSuitesNeedingIndexes(series)
+            if len(suites_needing_indexes) != 0:
+                # This is a fresh series.
+                have_fresh_series = True
+                if series.previous_series is not None:
+                    CustomUploadsCopier(series).copy(series.previous_series)
+                for suite in suites_needing_indexes:
+                    self.createIndexes(suite)
+
+        return have_fresh_series
+
     def setUp(self):
         """Process options, and set up internal state."""
         self.processOptions()
@@ -550,14 +570,9 @@
         self.setUp()
         self.recoverWorkingDists()

-        for series in self.distribution.series:
-            suites_needing_indexes = self.listSuitesNeedingIndexes(series)
-            if len(suites_needing_indexes) > 0:
-                for suite in suites_needing_indexes:
-                    self.createIndexes(suite)
-                # Don't try to do too much in one run. Leave the rest
-                # of the work for next time.
-                return
+        if self.prepareFreshSeries():
+            # We've done enough. Leave some room for others.
+            return

         self.processAccepted()
         self.setUpDirs()

=== modified file 'lib/lp/archivepublisher/tests/test_publish_ftpmaster.py'
--- lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-08-03 06:24:53 +0000
+++ lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-08-11 10:31:20 +0000
@@ -44,6 +44,7 @@
 from lp.soyuz.enums import (
     ArchivePurpose,
     PackagePublishingStatus,
+    PackageUploadCustomFormat,
     )
 from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
 from lp.testing import (
@@ -980,6 +981,29 @@
         self.assertEqual([suite], kwargs['suites'])
         self.assertThat(kwargs['suites'][0], StartsWith(series.name))

+    def test_prepareFreshSeries_copies_custom_uploads(self):
+        distro = self.makeDistroWithPublishDirectory()
+        old_series = self.factory.makeDistroSeries(
+            distribution=distro, status=SeriesStatus.CURRENT)
+        new_series = self.factory.makeDistroSeries(
+            distribution=distro, previous_series=old_series,
+            status=SeriesStatus.FROZEN)
+        custom_upload = self.factory.makeCustomPackageUpload(
+            distroseries=old_series,
+            custom_type=PackageUploadCustomFormat.DEBIAN_INSTALLER,
+            filename='debian-installer-images_1.0-20110805_i386.tar.gz')
+        script = self.makeScript(distro)
+        script.createIndexes = FakeMethod()
+        script.setUp()
+        have_fresh_series = script.prepareFreshSeries()
+        self.assertTrue(have_fresh_series)
+        [copied_upload] = new_series.getPackageUploads(
+            name=u'debian-installer-images', exact_match=False)
+        [copied_custom] = copied_upload.customfiles
+        self.assertEqual(
+            custom_upload.customfiles[0].libraryfilealias.filename,
+            copied_custom.libraryfilealias.filename)
+
     def test_script_creates_indexes(self):
         # End-to-end test: the script creates indexes for distroseries
         # that need them.

=== removed file 'lib/lp/codehosting/safe_open.py'
--- lib/lp/codehosting/safe_open.py 2011-08-09 14:59:08 +0000
+++ lib/lp/codehosting/safe_open.py 1970-01-01 00:00:00 +0000
@@ -1,263 +0,0 @@
-# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
-# GNU Affero General Public License version 3 (see the file LICENSE).
-
-"""Safe branch opening."""
-
-__metaclass__ = type
-
-from bzrlib import urlutils
-from bzrlib.branch import Branch
-from bzrlib.bzrdir import BzrDir
-
-from lazr.uri import URI
-
-__all__ = [
-    'AcceptAnythingPolicy',
-    'BadUrl',
-    'BlacklistPolicy',
-    'BranchLoopError',
-    'BranchOpenPolicy',
-    'BranchReferenceForbidden',
-    'SafeBranchOpener',
-    'WhitelistPolicy',
-    'safe_open',
-    ]
-
-
-# TODO JelmerVernooij 2011-08-06: This module is generic enough to be
-# in bzrlib, and may be of use to others.
-
-
-class BadUrl(Exception):
-    """Tried to access a branch from a bad URL."""
-
-
-class BranchReferenceForbidden(Exception):
-    """Trying to mirror a branch reference and the branch type does not allow
-    references.
-    """
-
-
-class BranchLoopError(Exception):
-    """Encountered a branch cycle.
-
-    A URL may point to a branch reference or it may point to a stacked branch.
-    In either case, it's possible for there to be a cycle in these references,
-    and this exception is raised when we detect such a cycle.
-    """
-
-
-class BranchOpenPolicy:
-    """Policy on how to open branches.
-
-    In particular, a policy determines which branches are safe to open by
-    checking their URLs and deciding whether or not to follow branch
-    references.
-    """
-
-    def shouldFollowReferences(self):
-        """Whether we traverse references when mirroring.
-
-        Subclasses must override this method.
-
-        If we encounter a branch reference and this returns false, an error is
-        raised.
-
-        :returns: A boolean to indicate whether to follow a branch reference.
-        """
-        raise NotImplementedError(self.shouldFollowReferences)
-
-    def transformFallbackLocation(self, branch, url):
-        """Validate, maybe modify, 'url' to be used as a stacked-on location.
-
-        :param branch: The branch that is being opened.
-        :param url: The URL that the branch provides for its stacked-on
-            location.
-        :return: (new_url, check) where 'new_url' is the URL of the branch to
-            actually open and 'check' is true if 'new_url' needs to be
-            validated by checkAndFollowBranchReference.
-        """
-        raise NotImplementedError(self.transformFallbackLocation)
-
-    def checkOneURL(self, url):
-        """Check the safety of the source URL.
-
-        Subclasses must override this method.
-
-        :param url: The source URL to check.
-        :raise BadUrl: subclasses are expected to raise this or a subclass
-            when it finds a URL it deems to be unsafe.
-        """
-        raise NotImplementedError(self.checkOneURL)
-
-
-class BlacklistPolicy(BranchOpenPolicy):
-    """Branch policy that forbids certain URLs."""
-
-    def __init__(self, should_follow_references, unsafe_urls=None):
-        if unsafe_urls is None:
-            unsafe_urls = set()
-        self._unsafe_urls = unsafe_urls
-        self._should_follow_references = should_follow_references
-
-    def shouldFollowReferences(self):
-        return self._should_follow_references
-
-    def checkOneURL(self, url):
-        if url in self._unsafe_urls:
-            raise BadUrl(url)
-
-    def transformFallbackLocation(self, branch, url):
-        """See `BranchOpenPolicy.transformFallbackLocation`.
-
-        This class is not used for testing our smarter stacking features so we
-        just do the simplest thing: return the URL that would be used anyway
-        and don't check it.
-        """
-        return urlutils.join(branch.base, url), False
-
-
-class AcceptAnythingPolicy(BlacklistPolicy):
-    """Accept anything, to make testing easier."""
-
-    def __init__(self):
-        super(AcceptAnythingPolicy, self).__init__(True, set())
-
-
-class WhitelistPolicy(BranchOpenPolicy):
-    """Branch policy that only allows certain URLs."""
-
-    def __init__(self, should_follow_references, allowed_urls=None,
-                 check=False):
-        if allowed_urls is None:
-            allowed_urls = []
-        self.allowed_urls = set(url.rstrip('/') for url in allowed_urls)
-        self.check = check
-
-    def shouldFollowReferences(self):
-        return self._should_follow_references
-
-    def checkOneURL(self, url):
-        if url.rstrip('/') not in self.allowed_urls:
-            raise BadUrl(url)
-
-    def transformFallbackLocation(self, branch, url):
-        """See `BranchOpenPolicy.transformFallbackLocation`.
-
-        Here we return the URL that would be used anyway and optionally check
-        it.
-        """
-        return urlutils.join(branch.base, url), self.check
-
-
-class SafeBranchOpener(object):
-    """Safe branch opener.
-
-    The policy object is expected to have the following methods:
-    * checkOneURL
-    * shouldFollowReferences
-    * transformFallbackLocation
-    """
-
-    def __init__(self, policy):
-        self.policy = policy
-        self._seen_urls = set()
-
-    def checkAndFollowBranchReference(self, url):
-        """Check URL (and possibly the referenced URL) for safety.
-
-        This method checks that `url` passes the policy's `checkOneURL`
-        method, and if `url` refers to a branch reference, it checks whether
-        references are allowed and whether the reference's URL passes muster
-        also -- recursively, until a real branch is found.
-
-        :raise BranchLoopError: If the branch references form a loop.
-        :raise BranchReferenceForbidden: If this opener forbids branch
-            references.
-        """
-        while True:
-            if url in self._seen_urls:
-                raise BranchLoopError()
-            self._seen_urls.add(url)
-            self.policy.checkOneURL(url)
-            next_url = self.followReference(url)
-            if next_url is None:
-                return url
-            url = next_url
-            if not self.policy.shouldFollowReferences():
-                raise BranchReferenceForbidden(url)
-
-    def transformFallbackLocationHook(self, branch, url):
-        """Installed as the 'transform_fallback_location' Branch hook.
-
-        This method calls `transformFallbackLocation` on the policy object and
-        either returns the url it provides or passes it back to
-        checkAndFollowBranchReference.
-        """
-        new_url, check = self.policy.transformFallbackLocation(branch, url)
-        if check:
-            return self.checkAndFollowBranchReference(new_url)
-        else:
-            return new_url
-
-    def runWithTransformFallbackLocationHookInstalled(
-            self, callable, *args, **kw):
-        Branch.hooks.install_named_hook(
-            'transform_fallback_location', self.transformFallbackLocationHook,
-            'SafeBranchOpener.transformFallbackLocationHook')
-        try:
-            return callable(*args, **kw)
-        finally:
-            # XXX 2008-11-24 MichaelHudson, bug=301472: This is the hacky way
-            # to remove a hook. The linked bug report asks for an API to do
-            # it.
-            Branch.hooks['transform_fallback_location'].remove(
-                self.transformFallbackLocationHook)
-            # We reset _seen_urls here to avoid multiple calls to open giving
-            # spurious loop exceptions.
-            self._seen_urls = set()
-
-    def followReference(self, url):
-        """Get the branch-reference value at the specified url.
-
-        This exists as a separate method only to be overriden in unit tests.
-        """
-        bzrdir = BzrDir.open(url)
-        return bzrdir.get_branch_reference()
-
-    def open(self, url):
-        """Open the Bazaar branch at url, first checking for safety.
-
-        What safety means is defined by a subclasses `followReference` and
-        `checkOneURL` methods.
-        """
-        url = self.checkAndFollowBranchReference(url)
-        return self.runWithTransformFallbackLocationHookInstalled(
-            Branch.open, url)
-
-
-class URLChecker(BranchOpenPolicy):
-    """Branch open policy that rejects URLs not on the given scheme."""
-
-    def __init__(self, allowed_scheme):
-        self.allowed_scheme = allowed_scheme
-
-    def shouldFollowReferences(self):
-        return True
-
-    def transformFallbackLocation(self, branch, url):
-        return urlutils.join(branch.base, url), True
-
-    def checkOneURL(self, url):
-        """Check that `url` is safe to open."""
-        if URI(url).scheme != self.allowed_scheme:
-            raise BadUrl(url)
-
-
-def safe_open(allowed_scheme, url):
-    """Open the branch at `url`, only accessing URLs on `allowed_scheme`.
-
-    :raises BadUrl: An attempt was made to open a URL that was not on
-        `allowed_scheme`.
-    """
-    return SafeBranchOpener(URLChecker(allowed_scheme)).open(url)
=== removed file 'lib/lp/codehosting/tests/test_safe_open.py'
--- lib/lp/codehosting/tests/test_safe_open.py 2011-08-09 15:05:18 +0000
+++ lib/lp/codehosting/tests/test_safe_open.py 1970-01-01 00:00:00 +0000
@@ -1,267 +0,0 @@
-# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
-# GNU Affero General Public License version 3 (see the file LICENSE).
-
-"""Tests for the safe branch open code."""
-
-
-__metaclass__ = type
-
-from lazr.uri import URI
-
-from lp.codehosting.safe_open import (
-    BadUrl,
-    BlacklistPolicy,
-    BranchLoopError,
-    BranchReferenceForbidden,
-    SafeBranchOpener,
-    WhitelistPolicy,
-    safe_open,
-    )
-
-from lp.testing import TestCase
-
-from bzrlib.branch import (
-    Branch,
-    BzrBranchFormat7,
-    )
-from bzrlib.bzrdir import (
-    BzrDirMetaFormat1,
-    )
-from bzrlib.repofmt.pack_repo import RepositoryFormatKnitPack1
-from bzrlib.tests import (
-    TestCaseWithTransport,
-    )
-from bzrlib.transport import chroot
-
-
-class TestSafeBranchOpenerCheckAndFollowBranchReference(TestCase):
-    """Unit tests for `SafeBranchOpener.checkAndFollowBranchReference`."""
-
-    class StubbedSafeBranchOpener(SafeBranchOpener):
-        """SafeBranchOpener that provides canned answers.
-
-        We implement the methods we need to to be able to control all the
-        inputs to the `BranchMirrorer.checkSource` method, which is what is
-        being tested in this class.
-        """
-
-        def __init__(self, references, policy):
-            parent_cls = TestSafeBranchOpenerCheckAndFollowBranchReference
-            super(parent_cls.StubbedSafeBranchOpener, self).__init__(policy)
-            self._reference_values = {}
-            for i in range(len(references) - 1):
-                self._reference_values[references[i]] = references[i+1]
-            self.follow_reference_calls = []
-
-        def followReference(self, url):
-            self.follow_reference_calls.append(url)
-            return self._reference_values[url]
-
-    def makeBranchOpener(self, should_follow_references, references,
-                         unsafe_urls=None):
-        policy = BlacklistPolicy(should_follow_references, unsafe_urls)
-        opener = self.StubbedSafeBranchOpener(references, policy)
-        return opener
-
-    def testCheckInitialURL(self):
-        # checkSource rejects all URLs that are not allowed.
-        opener = self.makeBranchOpener(None, [], set(['a']))
-        self.assertRaises(BadUrl, opener.checkAndFollowBranchReference, 'a')
-
-    def testNotReference(self):
-        # When branch references are forbidden, checkAndFollowBranchReference
-        # does not raise on non-references.
-        opener = self.makeBranchOpener(False, ['a', None])
-        self.assertEquals('a', opener.checkAndFollowBranchReference('a'))
-        self.assertEquals(['a'], opener.follow_reference_calls)
-
-    def testBranchReferenceForbidden(self):
-        # checkAndFollowBranchReference raises BranchReferenceForbidden if
-        # branch references are forbidden and the source URL points to a
-        # branch reference.
-        opener = self.makeBranchOpener(False, ['a', 'b'])
-        self.assertRaises(
-            BranchReferenceForbidden,
-            opener.checkAndFollowBranchReference, 'a')
-        self.assertEquals(['a'], opener.follow_reference_calls)
-
-    def testAllowedReference(self):
-        # checkAndFollowBranchReference does not raise if following references
-        # is allowed and the source URL points to a branch reference to a
-        # permitted location.
-        opener = self.makeBranchOpener(True, ['a', 'b', None])
-        self.assertEquals('b', opener.checkAndFollowBranchReference('a'))
-        self.assertEquals(['a', 'b'], opener.follow_reference_calls)
-
-    def testCheckReferencedURLs(self):
-        # checkAndFollowBranchReference checks if the URL a reference points
-        # to is safe.
-        opener = self.makeBranchOpener(
-            True, ['a', 'b', None], unsafe_urls=set('b'))
-        self.assertRaises(BadUrl, opener.checkAndFollowBranchReference, 'a')
-        self.assertEquals(['a'], opener.follow_reference_calls)
-
-    def testSelfReferencingBranch(self):
-        # checkAndFollowBranchReference raises BranchReferenceLoopError if
-        # following references is allowed and the source url points to a
-        # self-referencing branch reference.
-        opener = self.makeBranchOpener(True, ['a', 'a'])
-        self.assertRaises(
-            BranchLoopError, opener.checkAndFollowBranchReference, 'a')
-        self.assertEquals(['a'], opener.follow_reference_calls)
-
-    def testBranchReferenceLoop(self):
-        # checkAndFollowBranchReference raises BranchReferenceLoopError if
-        # following references is allowed and the source url points to a loop
-        # of branch references.
-        references = ['a', 'b', 'a']
-        opener = self.makeBranchOpener(True, references)
-        self.assertRaises(
-            BranchLoopError, opener.checkAndFollowBranchReference, 'a')
-        self.assertEquals(['a', 'b'], opener.follow_reference_calls)
-
-
-class TestSafeBranchOpenerStacking(TestCaseWithTransport):
-
-    def makeBranchOpener(self, allowed_urls):
-        policy = WhitelistPolicy(True, allowed_urls, True)
-        return SafeBranchOpener(policy)
-
-    def makeBranch(self, path, branch_format, repository_format):
-        """Make a Bazaar branch at 'path' with the given formats."""
-        bzrdir_format = BzrDirMetaFormat1()
-        bzrdir_format.set_branch_format(branch_format)
-        bzrdir = self.make_bzrdir(path, format=bzrdir_format)
-        repository_format.initialize(bzrdir)
-        return bzrdir.create_branch()
-
-    def testAllowedURL(self):
-        # checkSource does not raise an exception for branches stacked on
-        # branches with allowed URLs.
-        stacked_on_branch = self.make_branch('base-branch', format='1.6')
-        stacked_branch = self.make_branch('stacked-branch', format='1.6')
-        stacked_branch.set_stacked_on_url(stacked_on_branch.base)
-        opener = self.makeBranchOpener(
-            [stacked_branch.base, stacked_on_branch.base])
-        # This doesn't raise an exception.
-        opener.open(stacked_branch.base)
-
-    def testUnstackableRepository(self):
-        # checkSource treats branches with UnstackableRepositoryFormats as
-        # being not stacked.
-        branch = self.makeBranch(
-            'unstacked', BzrBranchFormat7(), RepositoryFormatKnitPack1())
-        opener = self.makeBranchOpener([branch.base])
-        # This doesn't raise an exception.
-        opener.open(branch.base)
-
-    def testAllowedRelativeURL(self):
-        # checkSource passes on absolute urls to checkOneURL, even if the
-        # value of stacked_on_location in the config is set to a relative URL.
-        stacked_on_branch = self.make_branch('base-branch', format='1.6')
-        stacked_branch = self.make_branch('stacked-branch', format='1.6')
-        stacked_branch.set_stacked_on_url('../base-branch')
-        opener = self.makeBranchOpener(
-            [stacked_branch.base, stacked_on_branch.base])
-        # Note that stacked_on_branch.base is not '../base-branch', it's an
-        # absolute URL.
-        self.assertNotEqual('../base-branch', stacked_on_branch.base)
-        # This doesn't raise an exception.
-        opener.open(stacked_branch.base)
-
-    def testAllowedRelativeNested(self):
-        # Relative URLs are resolved relative to the stacked branch.
-        self.get_transport().mkdir('subdir')
-        a = self.make_branch('subdir/a', format='1.6')
-        b = self.make_branch('b', format='1.6')
-        b.set_stacked_on_url('../subdir/a')
-        c = self.make_branch('subdir/c', format='1.6')
-        c.set_stacked_on_url('../../b')
-        opener = self.makeBranchOpener([c.base, b.base, a.base])
-        # This doesn't raise an exception.
-        opener.open(c.base)
-
-    def testForbiddenURL(self):
-        # checkSource raises a BadUrl exception if a branch is stacked on a
-        # branch with a forbidden URL.
-        stacked_on_branch = self.make_branch('base-branch', format='1.6')
-        stacked_branch = self.make_branch('stacked-branch', format='1.6')
-        stacked_branch.set_stacked_on_url(stacked_on_branch.base)
-        opener = self.makeBranchOpener([stacked_branch.base])
-        self.assertRaises(BadUrl, opener.open, stacked_branch.base)
-
-    def testForbiddenURLNested(self):
-        # checkSource raises a BadUrl exception if a branch is stacked on a
-        # branch that is in turn stacked on a branch with a forbidden URL.
-        a = self.make_branch('a', format='1.6')
-        b = self.make_branch('b', format='1.6')
-        b.set_stacked_on_url(a.base)
-        c = self.make_branch('c', format='1.6')
-        c.set_stacked_on_url(b.base)
-        opener = self.makeBranchOpener([c.base, b.base])
-        self.assertRaises(BadUrl, opener.open, c.base)
-
-    def testSelfStackedBranch(self):
-        # checkSource raises StackingLoopError if a branch is stacked on
-        # itself. This avoids infinite recursion errors.
-        a = self.make_branch('a', format='1.6')
-        # Bazaar 1.17 and up make it harder to create branches like this.
-        # It's still worth testing that we don't blow up in the face of them,
-        # so we grovel around a bit to create one anyway.
-        a.get_config().set_user_option('stacked_on_location', a.base)
-        opener = self.makeBranchOpener([a.base])
-        self.assertRaises(BranchLoopError, opener.open, a.base)
-
-    def testLoopStackedBranch(self):
-        # checkSource raises StackingLoopError if a branch is stacked in such
-        # a way so that it is ultimately stacked on itself. e.g. a stacked on
-        # b stacked on a.
-        a = self.make_branch('a', format='1.6')
-        b = self.make_branch('b', format='1.6')
-        a.set_stacked_on_url(b.base)
-        b.set_stacked_on_url(a.base)
-        opener = self.makeBranchOpener([a.base, b.base])
-        self.assertRaises(BranchLoopError, opener.open, a.base)
-        self.assertRaises(BranchLoopError, opener.open, b.base)
-
-
-class TestSafeOpen(TestCaseWithTransport):
-    """Tests for `safe_open`."""
-
-    def get_chrooted_scheme(self, relpath):
-        """Create a server that is chrooted to `relpath`.
-
-        :return: ``(scheme, get_url)`` where ``scheme`` is the scheme of the
-            chroot server and ``get_url`` returns URLs on said server.
-        """
-        transport = self.get_transport(relpath)
-        chroot_server = chroot.ChrootServer(transport)
-        chroot_server.start_server()
-        self.addCleanup(chroot_server.stop_server)
-        def get_url(relpath):
-            return chroot_server.get_url() + relpath
-        return URI(chroot_server.get_url()).scheme, get_url
-
-    def test_stacked_within_scheme(self):
-        # A branch that is stacked on a URL of the same scheme is safe to
-        # open.
-        self.get_transport().mkdir('inside')
-        self.make_branch('inside/stacked')
-        self.make_branch('inside/stacked-on')
-        scheme, get_chrooted_url = self.get_chrooted_scheme('inside')
-        Branch.open(get_chrooted_url('stacked')).set_stacked_on_url(
-            get_chrooted_url('stacked-on'))
-        safe_open(scheme, get_chrooted_url('stacked'))
-
-    def test_stacked_outside_scheme(self):
-        # A branch that is stacked on a URL that is not of the same scheme is
-        # not safe to open.
-        self.get_transport().mkdir('inside')
-        self.get_transport().mkdir('outside')
-        self.make_branch('inside/stacked')
-        self.make_branch('outside/stacked-on')
-        scheme, get_chrooted_url = self.get_chrooted_scheme('inside')
-        Branch.open(get_chrooted_url('stacked')).set_stacked_on_url(
-            self.get_url('outside/stacked-on'))
-        self.assertRaises(
-            BadUrl, safe_open, scheme, get_chrooted_url('stacked'))
=== modified file 'lib/lp/registry/interfaces/distroseries.py'
--- lib/lp/registry/interfaces/distroseries.py 2011-08-05 03:58:16 +0000
+++ lib/lp/registry/interfaces/distroseries.py 2011-08-11 10:31:20 +0000
@@ -786,20 +786,34 @@
786 DistroSeriesBinaryPackage objects that match the given text.786 DistroSeriesBinaryPackage objects that match the given text.
787 """787 """
788788
-    def createQueueEntry(pocket, archive, changesfilename, changesfilecontent,
+    def createQueueEntry(pocket, archive, changesfilename=None,
+                         changesfilecontent=None, changes_file_alias=None,
                          signingkey=None, package_copy_job=None):
         """Create a queue item attached to this distroseries.
 
-        Create a new records respecting the given pocket and archive.
+        Create a new `PackageUpload` to the given pocket and archive.
 
-        The default state is NEW, sorted sqlobject declaration, any
-        modification should be performed via Queue state-machine.
+        The default state is NEW. Any further state changes go through
+        the Queue state-machine.
 
-        The changesfile argument should be the text of the .changes for this
-        upload. The contents of this may be used later.
-
-        'signingkey' is the IGPGKey used to sign the changesfile or None if
-        the changesfile is unsigned.
+        :param pocket: The `PackagePublishingPocket` to upload to.
+        :param archive: The `Archive` to upload to. Must be for the same
+            `Distribution` as this series.
+        :param changesfilename: Name for the upload's .changes file. You may
+            specify a changes file by passing both `changesfilename` and
+            `changesfilecontent`, or by passing `changes_file_alias`.
+        :param changesfilecontent: Text for the changes file. It will be
+            signed and stored in the Librarian. Must be passed together with
+            `changesfilename`; alternatively, you may provide a
+            `changes_file_alias` to replace both of these.
+        :param changes_file_alias: A `LibraryFileAlias` containing the
+            .changes file. Security warning: unless the file has already
+            been checked, this may open us up to replay attacks as per bugs
+            159304 and 451396. Use `changes_file_alias` only if you know
+            this can't happen.
+        :param signingkey: `IGPGKey` used to sign the changes file, or None
+            if it is unsigned.
+        :return: A new `PackageUpload`.
         """
 
     def newArch(architecturetag, processorfamily, official, owner,
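The parameter contract above amounts to two mutually exclusive ways of supplying a changes file: the `changesfilename`/`changesfilecontent` pair, or a pre-existing `changes_file_alias`. A minimal standalone sketch of that validation logic (the function name is illustrative; it is not the Launchpad API, though the `AssertionError`s mirror the model implementation):

```python
def check_changes_file_args(changesfilename=None, changesfilecontent=None,
                            changes_file_alias=None, package_copy_job=None):
    """Validate createQueueEntry-style changes-file arguments.

    Returns True if some form of changes file was supplied.
    """
    # The name/content pair must be passed together, or not at all.
    if (changesfilename is None) != (changesfilecontent is None):
        raise AssertionError(
            "Pass both changesfilename and changesfilecontent, or neither.")
    # A pre-existing Librarian alias replaces the name/content pair.
    if changes_file_alias is not None and changesfilename is not None:
        raise AssertionError(
            "changesfilename conflicts with changes_file_alias.")
    have_changes_file = not (
        changesfilename is None and changes_file_alias is None)
    # Without a package copy job, some changes file is mandatory.
    if package_copy_job is None and not have_changes_file:
        raise AssertionError(
            "A changes file is required when there is no package_copy_job.")
    return have_changes_file
```

This keeps the security-sensitive `changes_file_alias` path (which skips re-signing) syntactically distinct from the normal name/content path.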
=== modified file 'lib/lp/registry/model/distroseries.py'
--- lib/lp/registry/model/distroseries.py 2011-08-05 03:58:16 +0000
+++ lib/lp/registry/model/distroseries.py 2011-08-11 10:31:20 +0000
@@ -86,9 +86,7 @@
     ISeriesBugTarget,
     )
 from lp.bugs.interfaces.bugtaskfilter import OrderedBugTask
-from lp.bugs.model.bug import (
-    get_bug_tags,
-    )
+from lp.bugs.model.bug import get_bug_tags
 from lp.bugs.model.bugtarget import (
     BugTargetBase,
     HasBugHeatMixin,
@@ -1587,25 +1585,34 @@
             get_property_cache(spph).newer_distroseries_version = version
 
     def createQueueEntry(self, pocket, archive, changesfilename=None,
-                         changesfilecontent=None, signing_key=None,
-                         package_copy_job=None):
+                         changesfilecontent=None, changes_file_alias=None,
+                         signing_key=None, package_copy_job=None):
         """See `IDistroSeries`."""
-        # We store the changes file in the librarian to avoid having to
-        # deal with broken encodings in these files; this will allow us
-        # to regenerate these files as necessary.
-        #
-        # The use of StringIO here should be safe: we do not encoding of
-        # the content in the changes file (as doing so would be guessing
-        # at best, causing unpredictable corruption), and simply pass it
-        # off to the librarian.
-
-        if package_copy_job is None and (
-            changesfilename is None or changesfilecontent is None):
+        if (changesfilename is None) != (changesfilecontent is None):
+            raise AssertionError(
+                "Inconsistent changesfilename and changesfilecontent. "
+                "Pass either both, or neither.")
+        if changes_file_alias is not None and changesfilename is not None:
+            raise AssertionError(
+                "Conflicting options: "
+                "Both changesfilename and changes_file_alias were given.")
+        have_changes_file = not (
+            changesfilename is None and changes_file_alias is None)
+        if package_copy_job is None and not have_changes_file:
             raise AssertionError(
                 "changesfilename and changesfilecontent must be supplied "
                 "if there is no package_copy_job")
 
-        if package_copy_job is None:
+        if changesfilename is not None:
+            # We store the changes file in the librarian to avoid having to
+            # deal with broken encodings in these files; this will allow us
+            # to regenerate these files as necessary.
+            #
+            # The use of StringIO here should be safe: we do no encoding of
+            # the content in the changes file (as doing so would be guessing
+            # at best, causing unpredictable corruption), and simply pass it
+            # off to the librarian.
+
             # The PGP signature is stripped from all changesfiles
             # to avoid replay attacks (see bugs 159304 and 451396).
             signed_message = signed_message_from_string(changesfilecontent)
@@ -1616,17 +1623,15 @@
             if new_content is not None:
                 changesfilecontent = signed_message.signedContent
 
-            changes_file = getUtility(ILibraryFileAliasSet).create(
+            changes_file_alias = getUtility(ILibraryFileAliasSet).create(
                 changesfilename, len(changesfilecontent),
                 StringIO(changesfilecontent), 'text/plain',
                 restricted=archive.private)
-        else:
-            changes_file = None
 
         return PackageUpload(
             distroseries=self, status=PackageUploadStatus.NEW,
             pocket=pocket, archive=archive,
-            changesfile=changes_file, signing_key=signing_key,
+            changesfile=changes_file_alias, signing_key=signing_key,
             package_copy_job=package_copy_job)
 
     def getPackageUploadQueue(self, state):
 
=== added file 'lib/lp/soyuz/scripts/custom_uploads_copier.py'
--- lib/lp/soyuz/scripts/custom_uploads_copier.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/scripts/custom_uploads_copier.py 2011-08-11 10:31:20 +0000
@@ -0,0 +1,149 @@
1# Copyright 2011 Canonical Ltd. This software is licensed under the
2# GNU Affero General Public License version 3 (see the file LICENSE).
3
4"""Copy latest custom uploads into a distribution release series.
5
6Use this when initializing the installer and dist upgrader for a new release
7series based on the latest uploads from its preceding series.
8"""
9
10__metaclass__ = type
11__all__ = [
12 'CustomUploadsCopier',
13 ]
14
15from operator import attrgetter
16import re
17
18from zope.component import getUtility
19
20#from canonical.launchpad.database.librarian import LibraryFileAlias
21from lp.services.database.bulk import load_referencing
22from lp.soyuz.enums import PackageUploadCustomFormat
23from lp.soyuz.interfaces.archive import (
24 IArchiveSet,
25 MAIN_ARCHIVE_PURPOSES,
26 )
27from lp.soyuz.model.queue import PackageUploadCustom
28
29
30class CustomUploadsCopier:
31 """Copy `PackageUploadCustom` objects into a new `DistroSeries`."""
32
33 copyable_types = [
34 PackageUploadCustomFormat.DEBIAN_INSTALLER,
35 PackageUploadCustomFormat.DIST_UPGRADER,
36 ]
37
38 def __init__(self, target_series):
39 self.target_series = target_series
40
41 def isCopyable(self, upload):
42 """Is `upload` the kind of `PackageUploadCustom` that we can copy?"""
43 return upload.customformat in self.copyable_types
44
45 def getCandidateUploads(self, source_series):
46 """Find custom uploads that may need copying."""
47 uploads = source_series.getPackageUploads(
48 custom_type=self.copyable_types)
49 load_referencing(PackageUploadCustom, uploads, ['packageuploadID'])
50 customs = sum([list(upload.customfiles) for upload in uploads], [])
51 customs = filter(self.isCopyable, customs)
52 customs.sort(key=attrgetter('id'), reverse=True)
53 return customs
54
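The flatten-filter-sort pattern in getCandidateUploads can be shown on plain objects; a sketch under hypothetical names (`FakeCustom` stands in for `PackageUploadCustom`, `candidate_customs` for the method):

```python
from operator import attrgetter

class FakeCustom:
    def __init__(self, id, customformat):
        self.id = id
        self.customformat = customformat

def candidate_customs(uploads_customfiles, copyable_types):
    """Flatten per-upload custom files, keep copyable ones, newest first."""
    # sum() with an empty-list start concatenates the per-upload lists.
    customs = sum([list(files) for files in uploads_customfiles], [])
    customs = [c for c in customs if c.customformat in copyable_types]
    # Highest (newest) id first, so a later setdefault-based dedup
    # naturally keeps the latest upload per key.
    customs.sort(key=attrgetter('id'), reverse=True)
    return customs
```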
55 def extractNameFields(self, filename):
56 """Get the relevant fields out of `filename`.
57
58 Scans filenames of any of these forms:
59
60 <package>_<version>_<architecture>.tar.<compression_suffix>
61 <package>_<version>.tar[.<compression_suffix>]
62
63 Versions may contain dots, dashes etc. but no underscores.
64
65 :return: A tuple of (<package>, <architecture>), or None if the
66 filename does not match the expected pattern. If no
67 architecture is found in the filename, it defaults to 'all'.
68 """
69 regex_parts = {
70 'package': "[^_]+",
71 'version': "[^_]+",
72 'arch': "[^._]+",
73 }
74 filename_regex = (
 75 "(%(package)s)_%(version)s(?:_(%(arch)s))?\.tar" % regex_parts)
76 match = re.match(filename_regex, filename)
77 if match is None:
78 return None
79 default_arch = 'all'
80 fields = match.groups(default_arch)
81 if len(fields) != 2:
82 return None
83 return fields
84
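The filename convention above can be exercised in isolation. A minimal re-implementation of the pattern (assuming the same `<package>_<version>[_<arch>].tar.*` convention; the function name is hypothetical):

```python
import re

def extract_name_fields(filename):
    """Return (package, architecture) from a custom-upload tarball name."""
    # Package and version may not contain underscores; the architecture
    # part is optional, may not contain dots, and defaults to 'all'.
    pattern = r"([^_]+)_[^_]+(?:_([^._]+))?\.tar"
    match = re.match(pattern, filename)
    if match is None:
        return None
    package, arch = match.groups()
    return (package, arch if arch is not None else 'all')
```

Note that versions may contain dots (`1.2.3`), so the regex relies on the literal `\.tar` to find where the version ends; an extra underscore-separated field makes the match fail, which is the desired rejection behaviour.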
85 def getKey(self, upload):
86 """Get an indexing key for `upload`."""
87 custom_format = (upload.customformat, )
88 name_fields = self.extractNameFields(upload.libraryfilealias.filename)
89 if name_fields is None:
90 return None
91 else:
92 return custom_format + name_fields
93
94 def getLatestUploads(self, source_series):
95 """Find the latest uploads.
96
97 :param source_series: The `DistroSeries` whose uploads to get.
98 :return: A dict containing the latest uploads, indexed by keys as
99 returned by `getKey`.
100 """
101 latest_uploads = {}
102 for upload in self.getCandidateUploads(source_series):
103 key = self.getKey(upload)
104 if key is not None:
105 latest_uploads.setdefault(key, upload)
106 return latest_uploads
107
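getLatestUploads relies on getCandidateUploads returning uploads newest-first: `dict.setdefault` then keeps the first (newest) upload seen for each key and silently drops older ones. The idiom in isolation (names and data hypothetical):

```python
def latest_by_key(candidates_newest_first, key_func):
    """Keep only the newest candidate per key; skip unkeyable candidates."""
    latest = {}
    for candidate in candidates_newest_first:
        key = key_func(candidate)
        if key is not None:
            # First hit wins; the input iterable is sorted newest-first.
            latest.setdefault(key, candidate)
    return latest
```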
108 def getTargetArchive(self, original_archive):
109 """Find counterpart of `original_archive` in `self.target_series`.
110
111 :param original_archive: The `Archive` that the original upload went
112 into. If this is not a primary, partner, or debug archive,
113 None is returned.
114 :return: The `Archive` of the same purpose for `self.target_series`.
115 """
116 if original_archive.purpose not in MAIN_ARCHIVE_PURPOSES:
117 return None
118 return getUtility(IArchiveSet).getByDistroPurpose(
119 self.target_series.distribution, original_archive.purpose)
120
121 def isObsolete(self, upload, target_uploads):
122 """Is `upload` superseded by one that the target series already has?
123
124 :param upload: A `PackageUploadCustom` from the source series.
125 :param target_uploads:
126 """
127 existing_upload = target_uploads.get(self.getKey(upload))
128 return existing_upload is not None and existing_upload.id >= upload.id
129
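Because the keys carry no version information, isObsolete falls back on database ids: a target upload whose id is at or above the source upload's means the copy (or something newer) already exists. The comparison in isolation (hypothetical minimal objects):

```python
class Upload:
    def __init__(self, id):
        self.id = id

def is_obsolete(source_upload, key, target_uploads):
    """True if the target already holds this upload, or a newer one."""
    existing = target_uploads.get(key)
    return existing is not None and existing.id >= source_upload.id
```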
130 def copyUpload(self, original_upload):
131 """Copy `original_upload` into `self.target_series`."""
132 target_archive = self.getTargetArchive(
133 original_upload.packageupload.archive)
134 if target_archive is None:
135 return None
136 package_upload = self.target_series.createQueueEntry(
137 original_upload.packageupload.pocket, target_archive,
138 changes_file_alias=original_upload.packageupload.changesfile)
139 custom = package_upload.addCustom(
140 original_upload.libraryfilealias, original_upload.customformat)
141 package_upload.setAccepted()
142 return custom
143
144 def copy(self, source_series):
145 """Copy uploads from `source_series`."""
146 target_uploads = self.getLatestUploads(self.target_series)
147 for upload in self.getLatestUploads(source_series).itervalues():
148 if not self.isObsolete(upload, target_uploads):
149 self.copyUpload(upload)
=== added file 'lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py'
--- lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py 2011-08-11 10:31:20 +0000
@@ -0,0 +1,420 @@
1# Copyright 2011 Canonical Ltd. This software is licensed under the
2# GNU Affero General Public License version 3 (see the file LICENSE).
3
4"""Test copying of custom package uploads for a new `DistroSeries`."""
5
6__metaclass__ = type
7
8from canonical.testing.layers import (
9 LaunchpadZopelessLayer,
10 ZopelessLayer,
11 )
12from lp.soyuz.enums import (
13 ArchivePurpose,
14 PackageUploadCustomFormat,
15 PackageUploadStatus,
16 )
17from lp.soyuz.interfaces.archive import MAIN_ARCHIVE_PURPOSES
18from lp.soyuz.scripts.custom_uploads_copier import CustomUploadsCopier
19from lp.testing import TestCaseWithFactory
20from lp.testing.fakemethod import FakeMethod
21
22
23def list_custom_uploads(distroseries):
24 """Return a list of all `PackageUploadCustom`s for `distroseries`."""
25 return sum(
26 [
27 list(upload.customfiles)
28 for upload in distroseries.getPackageUploads()],
29 [])
30
31
32class FakeDistroSeries:
33 """Fake `DistroSeries` for test copiers that don't really need one."""
34
35
36class FakeLibraryFileAlias:
37 def __init__(self, filename):
38 self.filename = filename
39
40
41class FakeUpload:
42 def __init__(self, customformat, filename):
43 self.customformat = customformat
44 self.libraryfilealias = FakeLibraryFileAlias(filename)
45
46
47class CommonTestHelpers:
48 """Helper(s) for these tests."""
49 def makeVersion(self):
50 """Create a fake version string."""
51 return "%d.%d-%s" % (
52 self.factory.getUniqueInteger(),
53 self.factory.getUniqueInteger(),
54 self.factory.getUniqueString())
55
56
57class TestCustomUploadsCopierLite(TestCaseWithFactory, CommonTestHelpers):
58 """Light-weight low-level tests for `CustomUploadsCopier`."""
59
60 layer = ZopelessLayer
61
62 def test_isCopyable_matches_copyable_types(self):
63 # isCopyable checks a custom upload's customformat field to
64 # determine whether the upload is a candidate for copying. It
65 # approves only those whose customformats are in copyable_types.
66 class FakePackageUploadCustom:
67 def __init__(self, customformat):
68 self.customformat = customformat
69
70 uploads = [
71 FakePackageUploadCustom(custom_type)
72 for custom_type in PackageUploadCustomFormat.items]
73
74 copier = CustomUploadsCopier(FakeDistroSeries())
75 copied_uploads = filter(copier.isCopyable, uploads)
76 self.assertContentEqual(
77 CustomUploadsCopier.copyable_types,
78 [upload.customformat for upload in copied_uploads])
79
80 def test_extractNameFields_extracts_package_name_and_architecture(self):
81 # extractNameFields picks up the package name and architecture
82 # out of an upload's filename field.
83 package_name = self.factory.getUniqueString('package')
84 version = self.makeVersion()
85 architecture = self.factory.getUniqueString('arch')
86 filename = '%s_%s_%s.tar.gz' % (package_name, version, architecture)
87 copier = CustomUploadsCopier(FakeDistroSeries())
88 self.assertEqual(
89 (package_name, architecture), copier.extractNameFields(filename))
90
91 def test_extractNameFields_does_not_require_architecture(self):
92 # When extractNameFields does not see an architecture, it
93 # defaults to 'all'.
94 package_name = self.factory.getUniqueString('package')
95 filename = '%s_%s.tar.gz' % (package_name, self.makeVersion())
96 copier = CustomUploadsCopier(FakeDistroSeries())
97 self.assertEqual(
98 (package_name, 'all'), copier.extractNameFields(filename))
99
100 def test_extractNameFields_returns_None_on_mismatch(self):
101 # If the filename does not match the expected pattern,
102 # extractNameFields returns None.
103 copier = CustomUploadsCopier(FakeDistroSeries())
104 self.assertIs(None, copier.extractNameFields('argh_1.0.jpg'))
105
106 def test_extractNameFields_ignores_names_with_too_many_fields(self):
107 # As one particularly nasty case that might break
 108 # extractNameFields, a name with more underscore-separated fields
109 # than the search pattern allows for is sensibly rejected.
110 copier = CustomUploadsCopier(FakeDistroSeries())
111 self.assertIs(
112 None, copier.extractNameFields('one_two_three_four_5.tar.gz'))
113
114 def test_getKey_returns_None_on_name_mismatch(self):
115 # If extractNameFields returns None, getKey also returns None.
116 copier = CustomUploadsCopier(FakeDistroSeries())
117 copier.extractNameFields = FakeMethod()
118 self.assertIs(
119 None,
120 copier.getKey(FakeUpload(
121 PackageUploadCustomFormat.DEBIAN_INSTALLER,
122 "bad-filename.tar")))
123
124
125class TestCustomUploadsCopier(TestCaseWithFactory, CommonTestHelpers):
126 """Heavyweight `CustomUploadsCopier` tests."""
127
128 # Alas, PackageUploadCustom relies on the Librarian.
129 layer = LaunchpadZopelessLayer
130
131 def makeUpload(self, distroseries=None,
132 custom_type=PackageUploadCustomFormat.DEBIAN_INSTALLER,
133 package_name=None, version=None, arch=None):
134 """Create a `PackageUploadCustom`."""
135 if distroseries is None:
136 distroseries = self.factory.makeDistroSeries()
137 if package_name is None:
138 package_name = self.factory.getUniqueString("package")
139 if version is None:
140 version = self.makeVersion()
141 filename = "%s.tar.gz" % '_'.join(
142 filter(None, [package_name, version, arch]))
143 package_upload = self.factory.makeCustomPackageUpload(
144 distroseries=distroseries, custom_type=custom_type,
145 filename=filename)
146 return package_upload.customfiles[0]
147
148 def test_copies_custom_upload(self):
149 # CustomUploadsCopier copies custom uploads from one series to
150 # another.
151 current_series = self.factory.makeDistroSeries()
152 original_upload = self.makeUpload(current_series)
153 new_series = self.factory.makeDistroSeries(
154 distribution=current_series.distribution,
155 previous_series=current_series)
156
157 CustomUploadsCopier(new_series).copy(current_series)
158
159 [copied_upload] = list_custom_uploads(new_series)
160 self.assertEqual(
161 original_upload.libraryfilealias, copied_upload.libraryfilealias)
162
163 def test_is_idempotent(self):
164 # It's safe to perform the same copy more than once; the uploads
165 # get copied only once.
166 current_series = self.factory.makeDistroSeries()
167 self.makeUpload(current_series)
168 new_series = self.factory.makeDistroSeries(
169 distribution=current_series.distribution,
170 previous_series=current_series)
171
172 copier = CustomUploadsCopier(new_series)
173 copier.copy(current_series)
174 uploads_after_first_copy = list_custom_uploads(new_series)
175 copier.copy(current_series)
176 uploads_after_redundant_copy = list_custom_uploads(new_series)
177
178 self.assertEqual(
179 uploads_after_first_copy, uploads_after_redundant_copy)
180
181 def test_getCandidateUploads_filters_by_distroseries(self):
182 # getCandidateUploads ignores uploads for other distroseries.
183 source_series = self.factory.makeDistroSeries()
184 matching_upload = self.makeUpload(source_series)
185 nonmatching_upload = self.makeUpload()
186 copier = CustomUploadsCopier(FakeDistroSeries())
187 candidate_uploads = copier.getCandidateUploads(source_series)
188 self.assertContentEqual([matching_upload], candidate_uploads)
189 self.assertNotIn(nonmatching_upload, candidate_uploads)
190
191 def test_getCandidateUploads_filters_upload_types(self):
192 # getCandidateUploads returns only uploads of the types listed
193 # in copyable_types; other types of upload are ignored.
194 source_series = self.factory.makeDistroSeries()
195 for custom_format in PackageUploadCustomFormat.items:
196 self.makeUpload(source_series, custom_type=custom_format)
197
198 copier = CustomUploadsCopier(FakeDistroSeries())
199 candidate_uploads = copier.getCandidateUploads(source_series)
200 copied_types = [upload.customformat for upload in candidate_uploads]
201 self.assertContentEqual(
202 CustomUploadsCopier.copyable_types, copied_types)
203
204 def test_getCandidateUploads_ignores_other_attachments(self):
205 # A PackageUpload can have multiple PackageUploadCustoms
206 # attached, potentially of different types. getCandidateUploads
207 # ignores PackageUploadCustoms of types that aren't supposed to
208 # be copied, even if they are attached to PackageUploads that
209 # also have PackageUploadCustoms that do need to be copied.
210 source_series = self.factory.makeDistroSeries()
211 package_upload = self.factory.makePackageUpload(
212 distroseries=source_series, archive=source_series.main_archive)
213 library_file = self.factory.makeLibraryFileAlias()
214 matching_upload = package_upload.addCustom(
215 library_file, PackageUploadCustomFormat.DEBIAN_INSTALLER)
216 nonmatching_upload = package_upload.addCustom(
217 library_file, PackageUploadCustomFormat.ROSETTA_TRANSLATIONS)
218 copier = CustomUploadsCopier(FakeDistroSeries())
219 candidates = copier.getCandidateUploads(source_series)
220 self.assertContentEqual([matching_upload], candidates)
221 self.assertNotIn(nonmatching_upload, candidates)
222
223 def test_getCandidateUploads_orders_newest_to_oldest(self):
224 # getCandidateUploads returns its PackageUploadCustoms ordered
225 # from newest to oldest.
226 source_series = self.factory.makeDistroSeries()
227 for counter in xrange(5):
228 self.makeUpload(source_series)
229 copier = CustomUploadsCopier(FakeDistroSeries())
230 candidate_ids = [
231 upload.id for upload in copier.getCandidateUploads(source_series)]
232 self.assertEqual(sorted(candidate_ids, reverse=True), candidate_ids)
233
234 def test_getKey_includes_format_package_and_architecture(self):
235 # The key returned by getKey consists of custom upload type,
236 # package name, and architecture.
237 source_series = self.factory.makeDistroSeries()
238 upload = self.makeUpload(
239 source_series, PackageUploadCustomFormat.DIST_UPGRADER,
240 package_name='upgrader', arch='mips')
241 copier = CustomUploadsCopier(FakeDistroSeries())
242 expected_key = (
243 PackageUploadCustomFormat.DIST_UPGRADER,
244 'upgrader',
245 'mips',
246 )
247 self.assertEqual(expected_key, copier.getKey(upload))
248
249 def test_getLatestUploads_indexes_uploads_by_key(self):
250 # getLatestUploads returns a dict of uploads, indexed by keys
251 # returned by getKey.
252 source_series = self.factory.makeDistroSeries()
253 upload = self.makeUpload(source_series)
254 copier = CustomUploadsCopier(FakeDistroSeries())
255 self.assertEqual(
256 {copier.getKey(upload): upload},
257 copier.getLatestUploads(source_series))
258
259 def test_getLatestUploads_filters_superseded_uploads(self):
260 # getLatestUploads returns only the latest upload for a given
261 # distroseries, type, package, and architecture. Any older
262 # uploads with the same distroseries, type, package name, and
263 # architecture are ignored.
264 source_series = self.factory.makeDistroSeries()
265 uploads = [
266 self.makeUpload(
267 source_series, package_name='installer', version='1.0.0',
268 arch='ppc')
269 for counter in xrange(3)]
270
271 copier = CustomUploadsCopier(FakeDistroSeries())
272 self.assertContentEqual(
273 uploads[-1:], copier.getLatestUploads(source_series).values())
274
275 def test_getLatestUploads_bundles_versions(self):
276 # getLatestUploads sees an upload as superseding an older one
277 # for the same distroseries, type, package name, and
278 # architecture even if they have different versions.
279 source_series = self.factory.makeDistroSeries()
280 uploads = [
281 self.makeUpload(source_series, package_name='foo', arch='i386')
282 for counter in xrange(2)]
283 copier = CustomUploadsCopier(FakeDistroSeries())
284 self.assertContentEqual(
285 uploads[-1:], copier.getLatestUploads(source_series).values())
286
287 def test_getTargetArchive_on_same_distro_is_same_archive(self):
288 # When copying within the same distribution, getTargetArchive
289 # always returns the same archive you feed it.
290 distro = self.factory.makeDistribution()
291 archives = [
292 self.factory.makeArchive(distribution=distro, purpose=purpose)
293 for purpose in MAIN_ARCHIVE_PURPOSES]
294 copier = CustomUploadsCopier(self.factory.makeDistroSeries(distro))
295 self.assertEqual(
296 archives,
297 [copier.getTargetArchive(archive) for archive in archives])
298
299 def test_getTargetArchive_returns_None_if_not_distribution_archive(self):
300 # getTargetArchive returns None for any archive that is not a
301 # distribution archive, regardless of whether the target series
302 # has an equivalent.
303 distro = self.factory.makeDistribution()
304 archives = [
305 self.factory.makeArchive(distribution=distro, purpose=purpose)
306 for purpose in ArchivePurpose.items
307 if purpose not in MAIN_ARCHIVE_PURPOSES]
308 copier = CustomUploadsCopier(self.factory.makeDistroSeries(distro))
309 self.assertEqual(
310 [None] * len(archives),
311 [copier.getTargetArchive(archive) for archive in archives])
312
313 def test_getTargetArchive_finds_matching_archive(self):
314 # When copying across archives, getTargetArchive looks for an
315 # archive for the target series with the same purpose as the
316 # original archive.
317 source_series = self.factory.makeDistroSeries()
318 source_archive = self.factory.makeArchive(
319 distribution=source_series.distribution,
320 purpose=ArchivePurpose.PARTNER)
321 target_series = self.factory.makeDistroSeries()
322 target_archive = self.factory.makeArchive(
323 distribution=target_series.distribution,
324 purpose=ArchivePurpose.PARTNER)
325
326 copier = CustomUploadsCopier(target_series)
327 self.assertEqual(
328 target_archive, copier.getTargetArchive(source_archive))
329
330 def test_getTargetArchive_returns_None_if_no_archive_matches(self):
331 # If the target series has no archive to match the archive that
 332 # the original upload was for, it returns None.
333 source_series = self.factory.makeDistroSeries()
334 source_archive = self.factory.makeArchive(
335 distribution=source_series.distribution,
336 purpose=ArchivePurpose.PARTNER)
337 target_series = self.factory.makeDistroSeries()
338 copier = CustomUploadsCopier(target_series)
339 self.assertIs(None, copier.getTargetArchive(source_archive))
340
341 def test_isObsolete_returns_False_if_no_equivalent_in_target(self):
342 # isObsolete returns False if the upload in question has no
343 # equivalent in the target series.
344 source_series = self.factory.makeDistroSeries()
345 upload = self.makeUpload(source_series)
346 target_series = self.factory.makeDistroSeries()
347 copier = CustomUploadsCopier(target_series)
348 self.assertFalse(
349 copier.isObsolete(upload, copier.getLatestUploads(target_series)))
350
351 def test_isObsolete_returns_False_if_target_has_older_equivalent(self):
 352 # isObsolete returns False if the target has an equivalent of
353 # the upload in question, but it's older than the version the
354 # source series has.
355 source_series = self.factory.makeDistroSeries()
356 target_series = self.factory.makeDistroSeries()
357 self.makeUpload(
358 target_series, package_name='installer', arch='ppc64')
359 source_upload = self.makeUpload(
360 source_series, package_name='installer', arch='ppc64')
361 copier = CustomUploadsCopier(target_series)
362 self.assertFalse(
363 copier.isObsolete(
364 source_upload, copier.getLatestUploads(target_series)))
365
366 def test_isObsolete_returns_True_if_target_has_newer_equivalent(self):
 367 # isObsolete returns True if the target series already has a
368 # newer equivalent of the upload in question (as would be the
369 # case, for instance, if the upload had already been copied).
370 source_series = self.factory.makeDistroSeries()
371 source_upload = self.makeUpload(
372 source_series, package_name='installer', arch='alpha')
373 target_series = self.factory.makeDistroSeries()
374 self.makeUpload(
375 target_series, package_name='installer', arch='alpha')
376 copier = CustomUploadsCopier(target_series)
377 self.assertTrue(
378 copier.isObsolete(
379 source_upload, copier.getLatestUploads(target_series)))
380
381 def test_copyUpload_creates_upload(self):
382 # copyUpload creates a new upload that's very similar to the
383 # original, but for the target series.
384 original_upload = self.makeUpload()
385 target_series = self.factory.makeDistroSeries()
386 copier = CustomUploadsCopier(target_series)
387 copied_upload = copier.copyUpload(original_upload)
388 self.assertEqual([copied_upload], list_custom_uploads(target_series))
389 self.assertNotEqual(
390 original_upload.packageupload, copied_upload.packageupload)
391 self.assertEqual(
392 original_upload.customformat, copied_upload.customformat)
393 self.assertEqual(
394 original_upload.libraryfilealias, copied_upload.libraryfilealias)
395 self.assertEqual(
396 original_upload.packageupload.changesfile,
397 copied_upload.packageupload.changesfile)
398 self.assertEqual(
399 original_upload.packageupload.pocket,
400 copied_upload.packageupload.pocket)
401
402 def test_copyUpload_accepts_upload(self):
403 # Uploads created by copyUpload are automatically accepted.
404 original_upload = self.makeUpload()
405 target_series = self.factory.makeDistroSeries()
406 copier = CustomUploadsCopier(target_series)
407 copied_upload = copier.copyUpload(original_upload)
408 self.assertEqual(
409 PackageUploadStatus.ACCEPTED, copied_upload.packageupload.status)
410
411 def test_copyUpload_does_not_copy_if_no_archive_matches(self):
412 # If getTargetArchive does not find an appropriate target
413 # archive, copyUpload does nothing.
414 source_series = self.factory.makeDistroSeries()
415 upload = self.makeUpload(distroseries=source_series)
416 target_series = self.factory.makeDistroSeries()
417 copier = CustomUploadsCopier(target_series)
418 copier.getTargetArchive = FakeMethod(result=None)
419 self.assertIs(None, copier.copyUpload(upload))
420 self.assertEqual([], list_custom_uploads(target_series))
=== modified file 'lib/lp/soyuz/tests/test_publishing.py'
--- lib/lp/soyuz/tests/test_publishing.py 2011-08-03 11:00:11 +0000
+++ lib/lp/soyuz/tests/test_publishing.py 2011-08-11 10:31:20 +0000
@@ -159,7 +159,7 @@
         signing_key = self.person.gpg_keys[0]
         package_upload = distroseries.createQueueEntry(
             pocket, archive, changes_file_name, changes_file_content,
-            signing_key)
+            signing_key=signing_key)
 
         status_to_method = {
             PackageUploadStatus.DONE: 'setDone',