Merge lp:~cjwatson/launchpad/copy-auto-approve into lp:launchpad

Proposed by Colin Watson
Status: Merged
Approved by: Colin Watson
Approved revision: no longer in the source branch.
Merged at revision: 15722
Proposed branch: lp:~cjwatson/launchpad/copy-auto-approve
Merge into: lp:launchpad
Prerequisite: lp:~cjwatson/launchpad/copy-allows-queue-admin
Diff against target: 379 lines (+148/-19)
5 files modified
lib/lp/soyuz/interfaces/archive.py (+18/-2)
lib/lp/soyuz/interfaces/packagecopyjob.py (+13/-2)
lib/lp/soyuz/model/archive.py (+6/-4)
lib/lp/soyuz/model/packagecopyjob.py (+26/-11)
lib/lp/soyuz/tests/test_packagecopyjob.py (+85/-0)
To merge this branch: bzr merge lp:~cjwatson/launchpad/copy-auto-approve
Reviewer Review Type Date Requested Status
Benji York (community) code Approve
Review via email: mp+117299@code.launchpad.net

Commit message

Add an auto_approve parameter to Archive.copyPackage and Archive.copyPackages, allowing users with queue admin permissions to have their copies accepted immediately rather than held in the unapproved queue.

Description of the change

== Summary ==

Bug 1006871: in some cases it is cumbersome to have to approve package copies as a separate step, particularly given the delay in processing PackageCopyJobs. There should be a way for queue admins to copy a package and approve the copy in a single step. The bug report lists some concrete use cases.

== Proposed fix ==

Add an auto_approve parameter to Archive.copyPackage and Archive.copyPackages, defaulting to False. When True, copies are automatically accepted even if they would normally have required approval, although only if the requester has appropriate queue admin permissions.
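For illustration, here is a minimal sketch of how a queue admin might use this over the web service with launchpadlib once it lands on the devel API; the PPA, package name, version, and series below are hypothetical:

    # Illustrative only: the PPA, package, version, and series are made up.
    from launchpadlib.launchpad import Launchpad

    lp = Launchpad.login_with("copy-example", "production", version="devel")
    ubuntu = lp.distributions["ubuntu"]
    primary = ubuntu.main_archive
    ppa = lp.people["kernel-ppa"].getPPAByName(name="ppa")

    primary.copyPackage(
        source_name="linux", version="3.2.0-27.43", from_archive=ppa,
        to_pocket="Proposed", to_series="precise", include_binaries=True,
        # Accepted immediately if the caller has queue admin rights on the
        # target; silently ignored otherwise (see below).
        auto_approve=True)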

== Pre-implementation notes ==

Iain Lane suggested that this should be explicit rather than (my initial proposal) implicit. On reflection, this is obviously correct. If I run syncpackage to copy a package from Debian, and Ubuntu happens to be frozen at the time, then it should go through UNAPPROVED even though I'm an Ubuntu archive administrator and thus have queue admin permissions. On the other hand, if (acting as an archive admin) I run sru-release to copy a verified stable release update from -proposed to -updates, I'm always going to accept it immediately afterwards so it makes sense to let me skip that extra step.

== Implementation details ==

There's a little duplication in PlainPackageCopyJob._checkPolicies (the calls to canAdministerQueue in both if branches), but that's necessary: the permission check uses the component from the source override, so it can only run after the override handling in each branch.

I arranged to silently ignore auto_approve=True from non-queue-admins because it's harmless to do so, and it makes it easier to write tools that work for people both with and without queue admin permissions. (For instance, the kernel team sometimes copy packages from their PPA to the -proposed pocket in the primary archive and ask us to accept them; since the copy-proposed-kernel tool is often run by archive admins, it would make sense for it to use auto_approve=True.)

== LOC Rationale ==

+129. I have 1833 lines of credit at the moment, and, as mentioned in https://code.launchpad.net/~cjwatson/launchpad/copy-allows-queue-admin/+merge/116731, this is the last known piece needed to remove the copy-package.py script.

== Tests ==

bin/test -vvct test_packagecopyjob

== Demo and Q/A ==

Add auto_approve=True to the Archive.copyPackage calls in sru-release in lp:ubuntu-archive-tools, and use this to copy a package from -proposed to -updates on dogfood (with queue admin permissions). It should be automatically accepted. Without that parameter, the copy should land in the unapproved queue.
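For concreteness, the sru-release change would amount to something like this (variable names are illustrative; the actual call site in lp:ubuntu-archive-tools may differ):

    # Existing copyPackage call in sru-release, with the new argument added.
    archive.copyPackage(
        source_name=package, version=version, from_archive=archive,
        to_pocket="Updates", to_series=series, include_binaries=True,
        auto_approve=True)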

== Lint ==

Just a pre-existing false positive due to pocketlint not understanding property setters:

./lib/lp/soyuz/model/archive.py
     346: redefinition of function 'private' from line 342

Benji York (benji) wrote:

Looks good.

review: Approve (code)

Preview Diff

1=== modified file 'lib/lp/soyuz/interfaces/archive.py'
2--- lib/lp/soyuz/interfaces/archive.py 2012-07-30 17:05:29 +0000
3+++ lib/lp/soyuz/interfaces/archive.py 2012-07-30 17:05:29 +0000
4@@ -1292,12 +1292,17 @@
5 title=_("Sponsored Person"),
6 description=_("The person who is being sponsored for this copy.")),
7 unembargo=Bool(title=_("Unembargo restricted files")),
8+ auto_approve=Bool(
9+ title=_("Automatic approval"),
10+ description=_("Automatically approve this copy (queue admins "
11+ "only)."),
12+ required=False),
13 )
14 @export_write_operation()
15 @operation_for_version('devel')
16 def copyPackage(source_name, version, from_archive, to_pocket,
17 person, to_series=None, include_binaries=False,
18- sponsored=None, unembargo=False):
19+ sponsored=None, unembargo=False, auto_approve=False):
20 """Copy a single named source into this archive.
21
22 Asynchronously copy a specific version of a named source to the
23@@ -1321,6 +1326,9 @@
24 :param unembargo: if True, allow copying restricted files from a
25 private archive to a public archive, and re-upload them to the
26 public librarian when doing so.
27+ :param auto_approve: if True and the `IPerson` requesting the sync
28+ has queue admin permissions on the target, accept the copy
29+ immediately rather than setting it to unapproved.
30
31 :raises NoSuchSourcePackageName: if the source name is invalid
32 :raises PocketNotFound: if the pocket name is invalid
33@@ -1354,12 +1362,17 @@
34 title=_("Sponsored Person"),
35 description=_("The person who is being sponsored for this copy.")),
36 unembargo=Bool(title=_("Unembargo restricted files")),
37+ auto_approve=Bool(
38+ title=_("Automatic approval"),
39+ description=_("Automatically approve this copy (queue admins "
40+ "only)."),
41+ required=False),
42 )
43 @export_write_operation()
44 @operation_for_version('devel')
45 def copyPackages(source_names, from_archive, to_pocket, person,
46 to_series=None, from_series=None, include_binaries=False,
47- sponsored=None, unembargo=False):
48+ sponsored=None, unembargo=False, auto_approve=False):
49 """Copy multiple named sources into this archive from another.
50
51 Asynchronously copy the most recent PUBLISHED versions of the named
52@@ -1386,6 +1399,9 @@
53 :param unembargo: if True, allow copying restricted files from a
54 private archive to a public archive, and re-upload them to the
55 public librarian when doing so.
56+ :param auto_approve: if True and the `IPerson` requesting the sync
57+ has queue admin permissions on the target, accept the copies
58+ immediately rather than setting it to unapproved.
59
60 :raises NoSuchSourcePackageName: if the source name is invalid
61 :raises PocketNotFound: if the pocket name is invalid
62
63=== modified file 'lib/lp/soyuz/interfaces/packagecopyjob.py'
64--- lib/lp/soyuz/interfaces/packagecopyjob.py 2012-07-12 08:32:15 +0000
65+++ lib/lp/soyuz/interfaces/packagecopyjob.py 2012-07-30 17:05:29 +0000
66@@ -128,7 +128,7 @@
67 target_archive, target_distroseries, target_pocket,
68 include_binaries=False, package_version=None,
69 copy_policy=PackageCopyPolicy.INSECURE, requester=None,
70- sponsored=None, unembargo=False):
71+ sponsored=None, unembargo=False, auto_approve=False):
72 """Create a new `IPlainPackageCopyJob`.
73
74 :param package_name: The name of the source package to copy.
75@@ -147,11 +147,15 @@
76 :param sponsored: The user who is being sponsored to make the copy.
77 The person who is making this request then becomes the sponsor.
78 :param unembargo: See `do_copy`.
79+ :param auto_approve: if True and the user requesting the sync has
80+ queue admin permissions on the target, accept the copy
81+ immediately rather than setting it to unapproved.
82 """
83
84 def createMultiple(target_distroseries, copy_tasks, requester,
85 copy_policy=PackageCopyPolicy.INSECURE,
86- include_binaries=False, unembargo=False):
87+ include_binaries=False, unembargo=False,
88+ auto_approve=False):
89 """Create multiple new `IPlainPackageCopyJob`s at once.
90
91 :param target_distroseries: The `IDistroSeries` to which to copy the
92@@ -164,6 +168,9 @@
93 :param include_binaries: As in `do_copy`.
94 :param unembargo: As in `do_copy`.
95 :return: An iterable of `PackageCopyJob` ids.
96+ :param auto_approve: if True and the user requesting the sync has
97+ queue admin permissions on the target, accept the copy
98+ immediately rather than setting it to unapproved.
99 """
100
101 def getActiveJobs(target_archive):
102@@ -216,6 +223,10 @@
103 title=_("Unembargo restricted files"),
104 required=False, readonly=True)
105
106+ auto_approve = Bool(
107+ title=_("Automatic approval"),
108+ required=False, readonly=True)
109+
110 def addSourceOverride(override):
111 """Add an `ISourceOverride` to the metadata."""
112
113
114=== modified file 'lib/lp/soyuz/model/archive.py'
115--- lib/lp/soyuz/model/archive.py 2012-07-30 17:05:29 +0000
116+++ lib/lp/soyuz/model/archive.py 2012-07-30 17:05:29 +0000
117@@ -1638,7 +1638,7 @@
118
119 def copyPackage(self, source_name, version, from_archive, to_pocket,
120 person, to_series=None, include_binaries=False,
121- sponsored=None, unembargo=False):
122+ sponsored=None, unembargo=False, auto_approve=False):
123 """See `IArchive`."""
124 self._checkCopyPackageFeatureFlags()
125
126@@ -1660,11 +1660,13 @@
127 target_pocket=pocket,
128 package_version=version, include_binaries=include_binaries,
129 copy_policy=PackageCopyPolicy.INSECURE, requester=person,
130- sponsored=sponsored, unembargo=unembargo)
131+ sponsored=sponsored, unembargo=unembargo,
132+ auto_approve=auto_approve)
133
134 def copyPackages(self, source_names, from_archive, to_pocket,
135 person, to_series=None, from_series=None,
136- include_binaries=None, sponsored=None, unembargo=False):
137+ include_binaries=None, sponsored=None, unembargo=False,
138+ auto_approve=False):
139 """See `IArchive`."""
140 self._checkCopyPackageFeatureFlags()
141
142@@ -1696,7 +1698,7 @@
143 job_source.createMultiple(
144 copy_tasks, person, copy_policy=PackageCopyPolicy.MASS_SYNC,
145 include_binaries=include_binaries, sponsored=sponsored,
146- unembargo=unembargo)
147+ unembargo=unembargo, auto_approve=auto_approve)
148
149 def _collectLatestPublishedSources(self, from_archive, from_series,
150 source_names):
151
152=== modified file 'lib/lp/soyuz/model/packagecopyjob.py'
153--- lib/lp/soyuz/model/packagecopyjob.py 2012-07-12 08:32:15 +0000
154+++ lib/lp/soyuz/model/packagecopyjob.py 2012-07-30 17:05:29 +0000
155@@ -256,7 +256,8 @@
156
157 @classmethod
158 def _makeMetadata(cls, target_pocket, package_version,
159- include_binaries, sponsored=None, unembargo=False):
160+ include_binaries, sponsored=None, unembargo=False,
161+ auto_approve=False):
162 """Produce a metadata dict for this job."""
163 if sponsored:
164 sponsored_name = sponsored.name
165@@ -268,6 +269,7 @@
166 'include_binaries': bool(include_binaries),
167 'sponsored': sponsored_name,
168 'unembargo': unembargo,
169+ 'auto_approve': auto_approve,
170 }
171
172 @classmethod
173@@ -275,13 +277,13 @@
174 target_archive, target_distroseries, target_pocket,
175 include_binaries=False, package_version=None,
176 copy_policy=PackageCopyPolicy.INSECURE, requester=None,
177- sponsored=None, unembargo=False):
178+ sponsored=None, unembargo=False, auto_approve=False):
179 """See `IPlainPackageCopyJobSource`."""
180 assert package_version is not None, "No package version specified."
181 assert requester is not None, "No requester specified."
182 metadata = cls._makeMetadata(
183 target_pocket, package_version, include_binaries, sponsored,
184- unembargo)
185+ unembargo, auto_approve)
186 job = PackageCopyJob(
187 job_type=cls.class_job_type,
188 source_archive=source_archive,
189@@ -298,7 +300,8 @@
190
191 @classmethod
192 def _composeJobInsertionTuple(cls, copy_policy, include_binaries, job_id,
193- copy_task, sponsored, unembargo):
194+ copy_task, sponsored, unembargo,
195+ auto_approve):
196 """Create an SQL fragment for inserting a job into the database.
197
198 :return: A string representing an SQL tuple containing initializers
199@@ -315,7 +318,7 @@
200 ) = copy_task
201 metadata = cls._makeMetadata(
202 target_pocket, package_version, include_binaries, sponsored,
203- unembargo)
204+ unembargo, auto_approve)
205 data = (
206 cls.class_job_type, target_distroseries, copy_policy,
207 source_archive, target_archive, package_name, job_id,
208@@ -326,14 +329,14 @@
209 def createMultiple(cls, copy_tasks, requester,
210 copy_policy=PackageCopyPolicy.INSECURE,
211 include_binaries=False, sponsored=None,
212- unembargo=False):
213+ unembargo=False, auto_approve=False):
214 """See `IPlainPackageCopyJobSource`."""
215 store = IMasterStore(Job)
216 job_ids = Job.createMultiple(store, len(copy_tasks), requester)
217 job_contents = [
218 cls._composeJobInsertionTuple(
219 copy_policy, include_binaries, job_id, task, sponsored,
220- unembargo)
221+ unembargo, auto_approve)
222 for job_id, task in zip(job_ids, copy_tasks)]
223 return bulk.create(
224 (PackageCopyJob.job_type, PackageCopyJob.target_distroseries,
225@@ -415,6 +418,10 @@
226 def unembargo(self):
227 return self.metadata.get('unembargo', False)
228
229+ @property
230+ def auto_approve(self):
231+ return self.metadata.get('auto_approve', False)
232+
233 def _createPackageUpload(self, unapproved=False):
234 pu = self.target_distroseries.createQueueEntry(
235 pocket=self.target_pocket, archive=self.target_archive,
236@@ -452,7 +459,8 @@
237
238 return SourceOverride(source_package_name, component, section)
239
240- def _checkPolicies(self, source_name, source_component=None):
241+ def _checkPolicies(self, source_name, source_component=None,
242+ auto_approve=False):
243 # This helper will only return if it's safe to carry on with the
244 # copy, otherwise it raises SuspendJobException to tell the job
245 # runner to suspend the job.
246@@ -470,8 +478,11 @@
247 self.target_archive, self.target_distroseries,
248 self.target_pocket, [source_name], source_component)
249 self.addSourceOverride(defaults[0])
250+ if auto_approve:
251+ auto_approve = self.target_archive.canAdministerQueue(
252+ self.requester, self.getSourceOverride().component)
253
254- approve_new = copy_policy.autoApproveNew(
255+ approve_new = auto_approve or copy_policy.autoApproveNew(
256 self.target_archive, self.target_distroseries,
257 self.target_pocket)
258
259@@ -483,10 +494,13 @@
260 else:
261 # Put the existing override in the metadata.
262 self.addSourceOverride(ancestry[0])
263+ if auto_approve:
264+ auto_approve = self.target_archive.canAdministerQueue(
265+ self.requester, self.getSourceOverride().component)
266
267 # The package is not new (it has ancestry) so check the copy
268 # policy for existing packages.
269- approve_existing = copy_policy.autoApprove(
270+ approve_existing = auto_approve or copy_policy.autoApprove(
271 self.target_archive, self.target_distroseries, self.target_pocket)
272 if not approve_existing:
273 self._createPackageUpload(unapproved=True)
274@@ -565,7 +579,8 @@
275 [self.context.id]).any()
276 if pu is None:
277 self._checkPolicies(
278- source_name, source_package.sourcepackagerelease.component)
279+ source_name, source_package.sourcepackagerelease.component,
280+ self.auto_approve)
281
282 # The package is free to go right in, so just copy it now.
283 ancestry = self.target_archive.getPublishedSources(
284
285=== modified file 'lib/lp/soyuz/tests/test_packagecopyjob.py'
286--- lib/lp/soyuz/tests/test_packagecopyjob.py 2012-07-25 18:56:53 +0000
287+++ lib/lp/soyuz/tests/test_packagecopyjob.py 2012-07-30 17:05:29 +0000
288@@ -925,6 +925,91 @@
289 # the target archive.
290 self.assertEqual('main', pcj.metadata['component_override'])
291
292+ def createAutoApproveEnvironment(self, create_ancestry, component_names):
293+ """Create an environment for testing the auto_approve flag."""
294+ if create_ancestry:
295+ self.distroseries.status = SeriesStatus.FROZEN
296+ target_archive = self.factory.makeArchive(
297+ self.distroseries.distribution, purpose=ArchivePurpose.PRIMARY)
298+ source_archive = self.factory.makeArchive()
299+ requester = self.factory.makePerson()
300+ with person_logged_in(target_archive.owner):
301+ for component_name in component_names:
302+ target_archive.newQueueAdmin(requester, component_name)
303+ spph = self.publisher.getPubSource(
304+ distroseries=self.distroseries,
305+ status=PackagePublishingStatus.PUBLISHED, archive=source_archive)
306+ spr = spph.sourcepackagerelease
307+ if create_ancestry:
308+ self.publisher.getPubSource(
309+ distroseries=self.distroseries, sourcename=spr.name,
310+ version="%s~" % spr.version,
311+ status=PackagePublishingStatus.PUBLISHED,
312+ archive=target_archive)
313+ return target_archive, source_archive, requester, spph, spr
314+
315+ def test_auto_approve(self):
316+ # The auto_approve flag causes the job to be processed immediately,
317+ # even if it would normally have required manual approval.
318+ (target_archive, source_archive, requester,
319+ spph, spr) = self.createAutoApproveEnvironment(True, ["main"])
320+ job = self.createCopyJobForSPPH(
321+ spph, source_archive, target_archive, requester=requester,
322+ auto_approve=True)
323+ self.runJob(job)
324+ self.assertEqual(JobStatus.COMPLETED, job.status)
325+ new_publication = target_archive.getPublishedSources(
326+ name=spr.name, version=spr.version, exact_match=True).one()
327+ self.assertEqual(target_archive, new_publication.archive)
328+
329+ def test_auto_approve_non_queue_admin(self):
330+ # The auto_approve flag is ignored for people without queue admin
331+ # permissions.
332+ (target_archive, source_archive, requester,
333+ spph, spr) = self.createAutoApproveEnvironment(True, [])
334+ job = self.createCopyJobForSPPH(
335+ spph, source_archive, target_archive, requester=requester,
336+ auto_approve=True)
337+ self.runJob(job)
338+ self.assertEqual(JobStatus.SUSPENDED, job.status)
339+
340+ def test_auto_approve_new(self):
341+ # The auto_approve flag causes copies to bypass the NEW queue.
342+ (target_archive, source_archive, requester,
343+ spph, spr) = self.createAutoApproveEnvironment(False, ["universe"])
344+
345+ # Without auto_approve, this job would be suspended and the upload
346+ # moved to the NEW queue.
347+ job = self.createCopyJobForSPPH(
348+ spph, source_archive, target_archive, requester=requester)
349+ self.runJob(job)
350+ self.assertEqual(JobStatus.SUSPENDED, job.status)
351+ switch_dbuser("launchpad_main")
352+ pu = getUtility(IPackageUploadSet).getByPackageCopyJobIDs(
353+ [removeSecurityProxy(job).context.id]).one()
354+ self.assertEqual(PackageUploadStatus.NEW, pu.status)
355+
356+ # With auto_approve, the job completes immediately.
357+ job = self.createCopyJobForSPPH(
358+ spph, source_archive, target_archive, requester=requester,
359+ auto_approve=True)
360+ self.runJob(job)
361+ self.assertEqual(JobStatus.COMPLETED, job.status)
362+ new_publication = target_archive.getPublishedSources(
363+ name=spr.name, version=spr.version, exact_match=True).one()
364+ self.assertEqual(target_archive, new_publication.archive)
365+
366+ def test_auto_approve_new_non_queue_admin(self):
367+ # For NEW packages, the auto_approve flag is ignored for people
368+ # without queue admin permissions.
369+ (target_archive, source_archive, requester,
370+ spph, spr) = self.createAutoApproveEnvironment(False, [])
371+ job = self.createCopyJobForSPPH(
372+ spph, source_archive, target_archive, requester=requester,
373+ auto_approve=True)
374+ self.runJob(job)
375+ self.assertEqual(JobStatus.SUSPENDED, job.status)
376+
377 def test_copying_after_job_released(self):
378 # The first pass of the job may have created a PackageUpload and
379 # suspended the job. Here we test the second run to make sure