Merge lp:~salgado/launchpad/layer-specific-navigation into lp:launchpad/db-devel

Proposed by Guilherme Salgado
Status: Rejected
Rejected by: Paul Hummer
Proposed branch: lp:~salgado/launchpad/layer-specific-navigation
Merge into: lp:launchpad/db-devel
Diff against target: 1625 lines (+491/-212)
26 files modified
.bzrignore (+1/-1)
README (+106/-4)
lib/canonical/buildd/debian/changelog (+11/-1)
lib/canonical/buildd/debian/compat (+1/-0)
lib/canonical/buildd/debian/control (+4/-4)
lib/canonical/buildd/debian/launchpad-buildd.examples (+1/-0)
lib/canonical/buildd/debian/launchpad-buildd.init (+10/-0)
lib/canonical/buildd/debian/rules (+4/-3)
lib/canonical/buildd/debian/source/format (+1/-0)
lib/canonical/launchpad/doc/navigation.txt (+9/-1)
lib/canonical/launchpad/webapp/metazcml.py (+5/-9)
lib/canonical/launchpad/webapp/tests/test_navigation.py (+107/-0)
lib/lp/archivepublisher/config.py (+19/-14)
lib/lp/archivepublisher/ftparchive.py (+11/-3)
lib/lp/archivepublisher/tests/test_config.py (+19/-9)
lib/lp/archivepublisher/tests/test_ftparchive.py (+18/-2)
lib/lp/archiveuploader/tests/__init__.py (+9/-3)
lib/lp/archiveuploader/tests/test_buildduploads.py (+2/-3)
lib/lp/archiveuploader/tests/test_ppauploadprocessor.py (+2/-5)
lib/lp/archiveuploader/tests/test_recipeuploads.py (+2/-3)
lib/lp/archiveuploader/tests/test_securityuploads.py (+3/-7)
lib/lp/archiveuploader/tests/test_uploadprocessor.py (+78/-78)
lib/lp/archiveuploader/uploadprocessor.py (+38/-28)
lib/lp/buildmaster/model/buildfarmjob.py (+21/-32)
lib/lp/soyuz/scripts/soyuz_process_upload.py (+8/-2)
utilities/sourcedeps.conf (+1/-0)
To merge this branch: bzr merge lp:~salgado/launchpad/layer-specific-navigation
Reviewer: Launchpad code reviewers
Status: Pending
Review via email: mp+32339@code.launchpad.net

Description of the change

This branch makes it possible to register navigation classes for a
specific layer.

This allows us to use a different navigation class for, say,
IDistribution on the vostok vhost.

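For illustration, here is a rough sketch of how the layer-specific
registration is exercised, adapted from the test added in this branch
(lib/canonical/launchpad/webapp/tests/test_navigation.py). The Thing,
OtherLayer and IOtherLayer names are that test's fixtures, not vostok code:

    # Sketch adapted from the branch's new test_navigation.py: register a
    # navigation class for a specific layer via the browser:navigation
    # directive's new 'layer' attribute, then look it up as the
    # IBrowserPublisher multi-adapter on that layer.
    from zope.component import getMultiAdapter
    from zope.configuration import xmlconfig
    from zope.publisher.interfaces.browser import IBrowserPublisher

    this = "canonical.launchpad.webapp.tests.test_navigation"
    xmlconfig.string("""
        <configure xmlns:browser="http://namespaces.zope.org/browser">
          <include package="canonical.launchpad.webapp" file="meta.zcml" />
          <browser:navigation
              module="%(this)s" classes="OtherThingNavigation"
              layer="%(this)s.IOtherLayer" />
        </configure>
        """ % dict(this=this))

    # OtherThingNavigation is now only available on IOtherLayer requests;
    # looking it up on the default browser layer raises ComponentLookupError.
    from canonical.launchpad.webapp.tests.test_navigation import (
        OtherLayer, Thing)
    navigation = getMultiAdapter(
        (Thing(), OtherLayer()), IBrowserPublisher, name='')

When no layer attribute is given, the directive still registers for
IDefaultBrowserLayer, as the test_default_layer test in the same file shows.
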
One issue I've encountered is that the navigation directive
unconditionally registers the nav classes for IXMLRPCRequest (the layer
of the xmlrpc vhost) as well, which causes a ConfigurationConflict when
more than one navigation class is registered for a given context. One
solution is to add yet another parameter to the navigation directive
(used_for_xmlrpc=True) and use it to decide whether to register the nav
class for IXMLRPCRequest.

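As a minimal sketch of that suggestion (not implemented in this branch),
the handler in lib/canonical/launchpad/webapp/metazcml.py could grow the
extra flag. The view() registration calls below are the ones the existing
handler already makes; the used_for_xmlrpc parameter and the
_navigation_factories helper are hypothetical:

    # Hypothetical sketch only -- this branch does NOT implement it. It shows
    # the shape the suggested 'used_for_xmlrpc' flag could take in the
    # navigation directive handler; view, PublicPermission, IBrowserPublisher,
    # IXMLRPCRequest and IDefaultBrowserLayer are names the existing
    # metazcml.py already imports.
    def navigation(_context, module, classes, layer=IDefaultBrowserLayer,
                   used_for_xmlrpc=True):
        """Handler for the `INavigationDirective` (sketch)."""
        # _navigation_factories is a stand-in for the existing code that maps
        # each class name in 'classes' to a factory and its usedfor interface.
        for factory, for_ in _navigation_factories(module, classes):
            # Register the navigation as the traversal component on the
            # requested layer (IDefaultBrowserLayer unless overridden).
            view(_context, factory, layer, '', for_,
                 permission=PublicPermission, provides=IBrowserPublisher,
                 allowed_interface=[IBrowserPublisher])
            # Register on the xmlrpc vhost's layer only when asked to, so a
            # second, layer-specific navigation for the same context no
            # longer causes a ConfigurationConflict there.
            if used_for_xmlrpc:
                view(_context, factory, IXMLRPCRequest, '', for_,
                     permission=PublicPermission, provides=IBrowserPublisher)
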
I'm not particularly happy with that solution, as it doesn't allow
vostok to override the navigation used on the xmlrpc vhost, but that
may not be a problem in practice: we only need custom navigation
classes because we want to remove traversals from the existing ones.

Paul Hummer (rockstar) wrote:

I approved this in the devel proposal

Preview Diff

1=== modified file '.bzrignore'
2--- .bzrignore 2010-07-15 15:57:40 +0000
3+++ .bzrignore 2010-08-12 19:18:45 +0000
4@@ -56,7 +56,6 @@
5 .bazaar
6 .cache
7 .subversion
8-lib/canonical/buildd/launchpad-files
9 .testrepository
10 .memcache.pid
11 ./pipes
12@@ -77,3 +76,4 @@
13 lib/canonical/launchpad-buildd_*_source.build
14 lib/canonical/launchpad-buildd_*_source.changes
15 lib/canonical/buildd/debian/*
16+lib/canonical/buildd/launchpad-files/*
17
18=== modified file 'README'
19--- README 2010-04-06 20:07:30 +0000
20+++ README 2010-08-12 19:18:45 +0000
21@@ -1,4 +1,106 @@
22-This is the top level project, that supplies the infrastructure for testing,
23-and running launchpad.
24-
25-Documentation is in the doc directory or on the wiki.
26+====================
27+README for Launchpad
28+====================
29+
30+Launchpad is an open source suite of tools that help people and teams to work
31+together on software projects. Unlike many open source projects, Launchpad
32+isn't something you install and run yourself (although you are welcome to do
33+so), instead, contributors help make <https://launchpad.net> better.
34+
35+Launchpad is a project of Canonical <http://www.canonical.com> and has
36+received many contributions from many wonderful people
37+<https://dev.launchpad.net/Contributions>.
38+
39+If you want help using Launchpad, then please visit our help wiki at:
40+
41+ https://help.launchpad.net
42+
43+If you'd like to contribute to Launchpad, have a look at:
44+
45+ https://dev.launchpad.net
46+
47+Alternatively, have a poke around in the code, which you probably already know
48+how to get if you are reading this file.
49+
50+
51+Getting started
52+===============
53+
54+There's a full guide for getting up-and-running with a development Launchpad
55+environment at <https://dev.launchpad.net/Getting>. When you are ready to
56+submit a patch, please consult <https://dev.launchpad.net/PatchSubmission>.
57+
58+Our bug tracker is at <https://bugs.launchpad.net/launchpad/> and you can get
59+the source code any time by doing:
60+
61+ $ bzr branch lp:launchpad
62+
63+
64+Navigating the tree
65+-------------------
66+
67+The Launchpad tree is big, messy and changing. Sorry about that. Don't panic
68+though, it can sense fear. Keep a firm grip on `grep` and pay attention to
69+these important top-level folders:
70+
71+ bin/, utilities/
72+ Where you will find scripts intended for developers and admins. There's
73+ no rhyme or reason to what goes in bin/ and what goes in utilities/, so
74+ take a look in both. bin/ will be empty in a fresh checkout, the actual
75+ content lives in 'buildout-templates'.
76+
77+ configs/
78+ Configuration files for various kinds of Launchpad instances.
79+ 'development' and 'testrunner' are of particular interest to developers.
80+
81+ cronscripts/
82+ Scripts that are run on actual production instances of Launchpad as
83+ cronjobs.
84+
85+ daemons/
86+ Entry points for various daemons that form part of Launchpad
87+
88+ database/
89+ Our database schema, our sample data, and some other stuff that causes
90+ fear.
91+
92+ doc/
93+ General system-wide documentation. You can also find documentation on
94+ <https://dev.launchpad.net>, in docstrings and in doctests.
95+
96+ lib/
97+ Where the vast majority of the code lives, along with our templates, tests
98+ and the bits of our documentation that are written as doctests. 'lp' and
99+ 'canonical' are the two most interesting packages. Note that 'canonical'
100+ is deprecated in favour of 'lp'. To learn more about how the 'lp' package
101+ is laid out, take a look at its docstring.
102+
103+ Makefile
104+ Ahh, bliss. The Makefile has all sorts of goodies. If you spend any
105+ length of time hacking on Launchpad, you'll use it often. The most
106+ important targets are 'make clean', 'make compile', 'make schema', 'make
107+ run' and 'make run_all'.
108+
109+ scripts/
110+ Scripts that are run on actual production instances of Launchpad,
111+ generally triggered by some automatic process.
112+
113+
114+You can spend years hacking on Launchpad full-time and not know what all of
115+the files in the top-level directory are for. However, here's a guide to some
116+of the ones that come up from time to time.
117+
118+ buildout-templates/
119+ Templates that are generated into actual files, normally bin/ scripts,
120+ when buildout is run. If you want to change the behaviour of bin/test,
121+ look here.
122+
123+ bzrplugins/, optionalbzrplugins/
124+ Bazaar plugins used in running Launchpad.
125+
126+ sourcecode/
127+ A directory into which we symlink branches of some of Launchpad's
128+ dependencies. Don't ask.
129+
130+You never have to care about 'benchmarks', 'override-includes' or
131+'package-includes'.
132
133=== renamed file 'daemons/buildd-slave-example.conf' => 'lib/canonical/buildd/buildd-slave-example.conf'
134=== modified file 'lib/canonical/buildd/debian/changelog'
135--- lib/canonical/buildd/debian/changelog 2010-08-05 21:12:36 +0000
136+++ lib/canonical/buildd/debian/changelog 2010-08-12 19:18:45 +0000
137@@ -1,9 +1,19 @@
138 launchpad-buildd (68) UNRELEASED; urgency=low
139
140+ [ William Grant ]
141 * Take an 'arch_tag' argument, so the master can override the slave
142 architecture.
143
144- -- William Grant <wgrant@ubuntu.com> Sun, 01 Aug 2010 22:00:32 +1000
145+ [ Jelmer Vernooij ]
146+
147+ * Explicitly use source format 1.0.
148+ * Add LSB information to init script.
149+ * Use debhelper >= 5 (available in dapper, not yet deprecated in
150+ maverick).
151+ * Fix spelling in description.
152+ * Install example buildd configuration.
153+
154+ -- Jelmer Vernooij <jelmer@canonical.com> Thu, 12 Aug 2010 17:04:14 +0200
155
156 launchpad-buildd (67) hardy-cat; urgency=low
157
158
159=== added file 'lib/canonical/buildd/debian/compat'
160--- lib/canonical/buildd/debian/compat 1970-01-01 00:00:00 +0000
161+++ lib/canonical/buildd/debian/compat 2010-08-12 19:18:45 +0000
162@@ -0,0 +1,1 @@
163+5
164
165=== modified file 'lib/canonical/buildd/debian/control'
166--- lib/canonical/buildd/debian/control 2010-05-19 15:50:27 +0000
167+++ lib/canonical/buildd/debian/control 2010-08-12 19:18:45 +0000
168@@ -3,15 +3,15 @@
169 Priority: extra
170 Maintainer: Adam Conrad <adconrad@ubuntu.com>
171 Standards-Version: 3.5.9
172-Build-Depends-Indep: debhelper (>= 4)
173+Build-Depends-Indep: debhelper (>= 5)
174
175 Package: launchpad-buildd
176 Section: misc
177 Architecture: all
178-Depends: python-twisted-core, python-twisted-web, debootstrap, dpkg-dev, linux32, file, bzip2, sudo, ntpdate, adduser, apt-transport-https, lsb-release, apache2
179+Depends: python-twisted-core, python-twisted-web, debootstrap, dpkg-dev, linux32, file, bzip2, sudo, ntpdate, adduser, apt-transport-https, lsb-release, apache2, ${misc:Depends}
180 Description: Launchpad buildd slave
181 This is the launchpad buildd slave package. It contains everything needed to
182 get a launchpad buildd going apart from the database manipulation required to
183 tell launchpad about the slave instance. If you are creating more than one
184- slave instance on the same computer, be sure to give them independant configs
185- and independant filecaches etc.
186+ slave instance on the same computer, be sure to give them independent configs
187+ and independent filecaches etc.
188
189=== added file 'lib/canonical/buildd/debian/launchpad-buildd.examples'
190--- lib/canonical/buildd/debian/launchpad-buildd.examples 1970-01-01 00:00:00 +0000
191+++ lib/canonical/buildd/debian/launchpad-buildd.examples 2010-08-12 19:18:45 +0000
192@@ -0,0 +1,1 @@
193+buildd-slave-example.conf
194
195=== modified file 'lib/canonical/buildd/debian/launchpad-buildd.init'
196--- lib/canonical/buildd/debian/launchpad-buildd.init 2010-03-31 17:10:21 +0000
197+++ lib/canonical/buildd/debian/launchpad-buildd.init 2010-08-12 19:18:45 +0000
198@@ -8,6 +8,16 @@
199 #
200 # Author: Daniel Silverstone <daniel.silverstone@canonical.com>
201
202+### BEGIN INIT INFO
203+# Provides: launchpad_buildd
204+# Required-Start: $local_fs $network $syslog $time
205+# Required-Stop: $local_fs $network $syslog $time
206+# Default-Start: 2 3 4 5
207+# Default-Stop: 0 1 6
208+# X-Interactive: false
209+# Short-Description: Start/stop launchpad buildds
210+### END INIT INFO
211+
212 set -e
213
214 PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
215
216=== modified file 'lib/canonical/buildd/debian/rules'
217--- lib/canonical/buildd/debian/rules 2010-07-18 08:49:02 +0000
218+++ lib/canonical/buildd/debian/rules 2010-08-12 19:18:45 +0000
219@@ -3,7 +3,6 @@
220 # Copyright 2009, 2010 Canonical Ltd. This software is licensed under the
221 # GNU Affero General Public License version 3 (see the file LICENSE).
222
223-export DH_COMPAT=4
224 export DH_OPTIONS
225
226 # This is an incomplete debian rules file for making the launchpad-buildd deb
227@@ -41,6 +40,7 @@
228 etc/launchpad-buildd \
229 usr/share/launchpad-buildd/canonical/launchpad/daemons \
230 usr/share/doc/launchpad-buildd
231+ dh_installexamples
232
233 # Do installs here
234 touch $(pytarget)/../launchpad/__init__.py
235@@ -89,10 +89,11 @@
236 clean:
237 dh_clean
238
239-package:
240- mkdir -p launchpad-files
241+prepare:
242 install -m644 $(daemons)/buildd-slave.tac launchpad-files/buildd-slave.tac
243 cp ../launchpad/daemons/tachandler.py launchpad-files/tachandler.py
244+
245+package: prepare
246 debuild -uc -us -S
247
248 build:
249
250=== added directory 'lib/canonical/buildd/debian/source'
251=== added file 'lib/canonical/buildd/debian/source/format'
252--- lib/canonical/buildd/debian/source/format 1970-01-01 00:00:00 +0000
253+++ lib/canonical/buildd/debian/source/format 2010-08-12 19:18:45 +0000
254@@ -0,0 +1,1 @@
255+1.0
256
257=== added directory 'lib/canonical/buildd/launchpad-files'
258=== modified file 'lib/canonical/launchpad/doc/navigation.txt'
259--- lib/canonical/launchpad/doc/navigation.txt 2010-08-02 02:13:52 +0000
260+++ lib/canonical/launchpad/doc/navigation.txt 2010-08-12 19:18:45 +0000
261@@ -234,7 +234,7 @@
262
263 == ZCML for browser:navigation ==
264
265-The zcml processor `browser:navigation` processes navigation classes.
266+The zcml processor `browser:navigation` registers navigation classes.
267
268 >>> class ThingSetView:
269 ...
270@@ -266,6 +266,14 @@
271 ... </configure>
272 ... """)
273
274+Once registered, we can look the navigation up using getMultiAdapter().
275+
276+ >>> from zope.component import getMultiAdapter
277+ >>> from canonical.launchpad.webapp.servers import LaunchpadTestRequest
278+ >>> from zope.publisher.interfaces.browser import IBrowserPublisher
279+ >>> navigation = getMultiAdapter(
280+ ... (thingset, LaunchpadTestRequest()), IBrowserPublisher, name='')
281+
282 This time, we get the view object for the page that was registered.
283
284 >>> navigation.publishTraverse(request, 'thingview')
285
286=== modified file 'lib/canonical/launchpad/webapp/metazcml.py'
287--- lib/canonical/launchpad/webapp/metazcml.py 2010-08-05 21:53:44 +0000
288+++ lib/canonical/launchpad/webapp/metazcml.py 2010-08-12 19:18:45 +0000
289@@ -193,6 +193,10 @@
290 class INavigationDirective(IGlueDirective):
291 """Hook up traversal etc."""
292
293+ layer = GlobalInterface(
294+ title=u"The layer where this navigation is going to be available.",
295+ required=False)
296+
297
298 class IFeedsDirective(IGlueDirective):
299 """Hook up feeds."""
300@@ -267,7 +271,7 @@
301 layer=layer, class_=feedclass)
302
303
304-def navigation(_context, module, classes):
305+def navigation(_context, module, classes, layer=IDefaultBrowserLayer):
306 """Handler for the `INavigationDirective`."""
307 if not inspect.ismodule(module):
308 raise TypeError("module attribute must be a module: %s, %s" %
309@@ -281,19 +285,11 @@
310 for_ = [navclass.usedfor]
311
312 # Register the navigation as the traversal component.
313- layer = IDefaultBrowserLayer
314 provides = IBrowserPublisher
315 name = ''
316 view(_context, factory, layer, name, for_,
317 permission=PublicPermission, provides=provides,
318 allowed_interface=[IBrowserPublisher])
319- #view(_context, factory, layer, name, for_,
320- # permission=PublicPermission, provides=provides)
321-
322- # Also register the navigation as a traversal component for XMLRPC.
323- xmlrpc_layer = IXMLRPCRequest
324- view(_context, factory, xmlrpc_layer, name, for_,
325- permission=PublicPermission, provides=provides)
326
327
328 class InterfaceInstanceDispatcher:
329
330=== added file 'lib/canonical/launchpad/webapp/tests/test_navigation.py'
331--- lib/canonical/launchpad/webapp/tests/test_navigation.py 1970-01-01 00:00:00 +0000
332+++ lib/canonical/launchpad/webapp/tests/test_navigation.py 2010-08-12 19:18:45 +0000
333@@ -0,0 +1,107 @@
334+# Copyright 2010 Canonical Ltd. This software is licensed under the
335+# GNU Affero General Public License version 3 (see the file LICENSE).
336+
337+__metaclass__ = type
338+
339+from zope.component import ComponentLookupError, getMultiAdapter
340+from zope.configuration import xmlconfig
341+from zope.interface import implements, Interface
342+from zope.publisher.interfaces.browser import (
343+ IBrowserPublisher, IDefaultBrowserLayer)
344+from zope.testing.cleanup import cleanUp
345+
346+from canonical.launchpad.webapp import Navigation
347+
348+from lp.testing import TestCase
349+
350+
351+class TestNavigationDirective(TestCase):
352+
353+ def test_default_layer(self):
354+ # By default all navigation classes are registered for
355+ # IDefaultBrowserLayer.
356+ directive = """
357+ <browser:navigation
358+ module="%(this)s" classes="ThingNavigation"/>
359+ """ % dict(this=this)
360+ xmlconfig.string(zcml_configure % directive)
361+ navigation = getMultiAdapter(
362+ (Thing(), DefaultBrowserLayer()), IBrowserPublisher, name='')
363+ self.assertIsInstance(navigation, ThingNavigation)
364+
365+ def test_specific_layer(self):
366+ # If we specify a layer when registering a navigation class, it will
367+ # only be available on that layer.
368+ directive = """
369+ <browser:navigation
370+ module="%(this)s" classes="OtherThingNavigation"
371+ layer="%(this)s.IOtherLayer" />
372+ """ % dict(this=this)
373+ xmlconfig.string(zcml_configure % directive)
374+ self.assertRaises(
375+ ComponentLookupError,
376+ getMultiAdapter,
377+ (Thing(), DefaultBrowserLayer()), IBrowserPublisher, name='')
378+
379+ navigation = getMultiAdapter(
380+ (Thing(), OtherLayer()), IBrowserPublisher, name='')
381+ self.assertIsInstance(navigation, OtherThingNavigation)
382+
383+ def test_multiple_navigations_for_single_context(self):
384+ # It is possible to have multiple navigation classes for a given
385+ # context class as long as they are registered for different layers.
386+ directive = """
387+ <browser:navigation
388+ module="%(this)s" classes="ThingNavigation"/>
389+ <browser:navigation
390+ module="%(this)s" classes="OtherThingNavigation"
391+ layer="%(this)s.IOtherLayer" />
392+ """ % dict(this=this)
393+ xmlconfig.string(zcml_configure % directive)
394+
395+ navigation = getMultiAdapter(
396+ (Thing(), DefaultBrowserLayer()), IBrowserPublisher, name='')
397+ other_navigation = getMultiAdapter(
398+ (Thing(), OtherLayer()), IBrowserPublisher, name='')
399+ self.assertNotEqual(navigation, other_navigation)
400+
401+ def tearDown(self):
402+ TestCase.tearDown(self)
403+ cleanUp()
404+
405+
406+class DefaultBrowserLayer:
407+ implements(IDefaultBrowserLayer)
408+
409+
410+class IThing(Interface):
411+ pass
412+
413+
414+class Thing(object):
415+ implements(IThing)
416+
417+
418+class ThingNavigation(Navigation):
419+ usedfor = IThing
420+
421+
422+class OtherThingNavigation(Navigation):
423+ usedfor = IThing
424+
425+
426+class IOtherLayer(Interface):
427+ pass
428+
429+
430+class OtherLayer:
431+ implements(IOtherLayer)
432+
433+
434+this = "canonical.launchpad.webapp.tests.test_navigation"
435+zcml_configure = """
436+ <configure xmlns:browser="http://namespaces.zope.org/browser">
437+ <include package="canonical.launchpad.webapp" file="meta.zcml" />
438+ %s
439+ </configure>
440+ """
441
442=== added symlink 'lib/deb822.py'
443=== target is u'../sourcecode/python-debian/lib/deb822.py'
444=== added symlink 'lib/debian'
445=== target is u'../sourcecode/python-debian/lib/debian'
446=== modified file 'lib/lp/archivepublisher/config.py'
447--- lib/lp/archivepublisher/config.py 2010-07-07 06:28:03 +0000
448+++ lib/lp/archivepublisher/config.py 2010-08-12 19:18:45 +0000
449@@ -120,18 +120,15 @@
450 config_segment["archtags"].append(
451 dar.architecturetag.encode('utf-8'))
452
453- if not dr.lucilleconfig:
454- raise LucilleConfigError(
455- 'No Lucille configuration section for %s' % dr.name)
456-
457- strio = StringIO(dr.lucilleconfig.encode('utf-8'))
458- config_segment["config"] = ConfigParser()
459- config_segment["config"].readfp(strio)
460- strio.close()
461- config_segment["components"] = config_segment["config"].get(
462- "publishing", "components").split(" ")
463-
464- self._distroseries[distroseries_name] = config_segment
465+ if dr.lucilleconfig:
466+ strio = StringIO(dr.lucilleconfig.encode('utf-8'))
467+ config_segment["config"] = ConfigParser()
468+ config_segment["config"].readfp(strio)
469+ strio.close()
470+ config_segment["components"] = config_segment["config"].get(
471+ "publishing", "components").split(" ")
472+
473+ self._distroseries[distroseries_name] = config_segment
474
475 strio = StringIO(distribution.lucilleconfig.encode('utf-8'))
476 self._distroconfig = ConfigParser()
477@@ -144,11 +141,19 @@
478 # Because dicts iterate for keys only; this works to get dr names
479 return self._distroseries.keys()
480
481+ def series(self, dr):
482+ try:
483+ return self._distroseries[dr]
484+ except KeyError:
485+ raise LucilleConfigError(
486+ 'No Lucille config section for %s in %s' %
487+ (dr, self.distroName))
488+
489 def archTagsForSeries(self, dr):
490- return self._distroseries[dr]["archtags"]
491+ return self.series(dr)["archtags"]
492
493 def componentsForSeries(self, dr):
494- return self._distroseries[dr]["components"]
495+ return self.series(dr)["components"]
496
497 def _extractConfigInfo(self):
498 """Extract configuration information into the attributes we use"""
499
500=== modified file 'lib/lp/archivepublisher/ftparchive.py'
501--- lib/lp/archivepublisher/ftparchive.py 2009-10-26 18:40:04 +0000
502+++ lib/lp/archivepublisher/ftparchive.py 2010-08-12 19:18:45 +0000
503@@ -138,6 +138,14 @@
504 self._config = config
505 self._diskpool = diskpool
506 self.distro = distro
507+ self.distroseries = []
508+ for distroseries in self.distro.series:
509+ if not distroseries.name in self._config.distroSeriesNames():
510+ self.log.warning("Distroseries %s in %s doesn't have "
511+ "a lucille configuration.", distroseries.name,
512+ self.distro.name)
513+ else:
514+ self.distroseries.append(distroseries)
515 self.publisher = publisher
516 self.release_files_needed = {}
517
518@@ -185,7 +193,7 @@
519 # iterate over the pockets, and do the suffix check inside
520 # createEmptyPocketRequest; that would also allow us to replace
521 # the == "" check we do there by a RELEASE match
522- for distroseries in self.distro:
523+ for distroseries in self.distroseries:
524 components = self._config.componentsForSeries(distroseries.name)
525 for pocket, suffix in pocketsuffix.items():
526 if not fullpublish:
527@@ -366,7 +374,7 @@
528
529 def generateOverrides(self, fullpublish=False):
530 """Collect packages that need overrides, and generate them."""
531- for distroseries in self.distro.series:
532+ for distroseries in self.distroseries:
533 for pocket in PackagePublishingPocket.items:
534 if not fullpublish:
535 if not self.publisher.isDirty(distroseries, pocket):
536@@ -629,7 +637,7 @@
537
538 def generateFileLists(self, fullpublish=False):
539 """Collect currently published FilePublishings and write filelists."""
540- for distroseries in self.distro.series:
541+ for distroseries in self.distroseries:
542 for pocket in pocketsuffix:
543 if not fullpublish:
544 if not self.publisher.isDirty(distroseries, pocket):
545
546=== modified file 'lib/lp/archivepublisher/tests/test_config.py'
547--- lib/lp/archivepublisher/tests/test_config.py 2010-07-18 00:24:06 +0000
548+++ lib/lp/archivepublisher/tests/test_config.py 2010-08-12 19:18:45 +0000
549@@ -5,36 +5,48 @@
550
551 __metaclass__ = type
552
553-import unittest
554-
555 from zope.component import getUtility
556
557 from canonical.config import config
558 from canonical.launchpad.interfaces import IDistributionSet
559 from canonical.testing import LaunchpadZopelessLayer
560
561-
562-class TestConfig(unittest.TestCase):
563+from lp.archivepublisher.config import Config, LucilleConfigError
564+from lp.testing import TestCaseWithFactory
565+
566+
567+class TestConfig(TestCaseWithFactory):
568 layer = LaunchpadZopelessLayer
569
570 def setUp(self):
571+ super(TestConfig, self).setUp()
572 self.layer.switchDbUser(config.archivepublisher.dbuser)
573 self.ubuntutest = getUtility(IDistributionSet)['ubuntutest']
574
575+ def testMissingDistroSeries(self):
576+ distroseries = self.factory.makeDistroSeries(
577+ distribution=self.ubuntutest, name="somename")
578+ d = Config(self.ubuntutest)
579+ dsns = d.distroSeriesNames()
580+ self.assertEquals(len(dsns), 2)
581+ self.assertEquals(dsns[0], "breezy-autotest")
582+ self.assertEquals(dsns[1], "hoary-test")
583+ self.assertRaises(LucilleConfigError,
584+ d.archTagsForSeries, "somename")
585+ self.assertRaises(LucilleConfigError,
586+ d.archTagsForSeries, "unknown")
587+
588 def testInstantiate(self):
589 """Config should instantiate"""
590- from lp.archivepublisher.config import Config
591 d = Config(self.ubuntutest)
592
593 def testDistroName(self):
594 """Config should be able to return the distroName"""
595- from lp.archivepublisher.config import Config
596 d = Config(self.ubuntutest)
597 self.assertEqual(d.distroName, "ubuntutest")
598
599 def testDistroSeriesNames(self):
600 """Config should return two distroseries names"""
601- from lp.archivepublisher.config import Config
602 d = Config(self.ubuntutest)
603 dsns = d.distroSeriesNames()
604 self.assertEquals(len(dsns), 2)
605@@ -43,14 +55,12 @@
606
607 def testArchTagsForSeries(self):
608 """Config should have the arch tags for the drs"""
609- from lp.archivepublisher.config import Config
610 d = Config(self.ubuntutest)
611 archs = d.archTagsForSeries("hoary-test")
612 self.assertEquals(len(archs), 2)
613
614 def testDistroConfig(self):
615 """Config should have parsed a distro config"""
616- from lp.archivepublisher.config import Config
617 d = Config(self.ubuntutest)
618 # NOTE: Add checks here when you add stuff in util.py
619 self.assertEquals(d.stayofexecution, 5)
620
621=== modified file 'lib/lp/archivepublisher/tests/test_ftparchive.py'
622--- lib/lp/archivepublisher/tests/test_ftparchive.py 2010-07-18 00:24:06 +0000
623+++ lib/lp/archivepublisher/tests/test_ftparchive.py 2010-08-12 19:18:45 +0000
624@@ -15,7 +15,7 @@
625 from zope.component import getUtility
626
627 from canonical.config import config
628-from canonical.launchpad.scripts.logger import QuietFakeLogger
629+from canonical.launchpad.scripts.logger import BufferLogger, QuietFakeLogger
630 from canonical.testing import LaunchpadZopelessLayer
631 from lp.archivepublisher.config import Config
632 from lp.archivepublisher.diskpool import DiskPool
633@@ -23,6 +23,7 @@
634 from lp.archivepublisher.publishing import Publisher
635 from lp.registry.interfaces.distribution import IDistributionSet
636 from lp.registry.interfaces.pocket import PackagePublishingPocket
637+from lp.testing import TestCaseWithFactory
638
639
640 def sanitize_apt_ftparchive_Sources_output(text):
641@@ -55,10 +56,11 @@
642 return self._result[i:j]
643
644
645-class TestFTPArchive(unittest.TestCase):
646+class TestFTPArchive(TestCaseWithFactory):
647 layer = LaunchpadZopelessLayer
648
649 def setUp(self):
650+ super(TestFTPArchive, self).setUp()
651 self.layer.switchDbUser(config.archivepublisher.dbuser)
652
653 self._distribution = getUtility(IDistributionSet)['ubuntutest']
654@@ -79,6 +81,7 @@
655 self._publisher = SamplePublisher(self._archive)
656
657 def tearDown(self):
658+ super(TestFTPArchive, self).tearDown()
659 shutil.rmtree(self._config.distroroot)
660
661 def _verifyFile(self, filename, directory, output_filter=None):
662@@ -116,6 +119,19 @@
663 self._publisher)
664 return fa
665
666+ def test_NoLucilleConfig(self):
667+ # Distroseries without a lucille configuration get ignored
668+ # and trigger a warning, they don't break the publisher
669+ logger = BufferLogger()
670+ publisher = Publisher(
671+ logger, self._config, self._dp, self._archive)
672+ self.factory.makeDistroSeries(self._distribution, name="somename")
673+ fa = FTPArchiveHandler(logger, self._config, self._dp,
674+ self._distribution, publisher)
675+ fa.createEmptyPocketRequests(fullpublish=True)
676+ self.assertEquals("WARNING: Distroseries somename in ubuntutest doesn't "
677+ "have a lucille configuration.\n", logger.buffer.getvalue())
678+
679 def test_getSourcesForOverrides(self):
680 # getSourcesForOverrides returns a list of tuples containing:
681 # (sourcename, suite, component, section)
682
683=== modified file 'lib/lp/archiveuploader/tests/__init__.py'
684--- lib/lp/archiveuploader/tests/__init__.py 2010-05-04 15:38:08 +0000
685+++ lib/lp/archiveuploader/tests/__init__.py 2010-08-12 19:18:45 +0000
686@@ -1,6 +1,10 @@
687 # Copyright 2009 Canonical Ltd. This software is licensed under the
688 # GNU Affero General Public License version 3 (see the file LICENSE).
689
690+"""Tests for the archive uploader."""
691+
692+from __future__ import with_statement
693+
694 __metaclass__ = type
695
696 __all__ = ['datadir', 'getPolicy', 'insertFakeChangesFile',
697@@ -25,6 +29,7 @@
698 raise ValueError("Path is not relative: %s" % path)
699 return os.path.join(here, 'data', path)
700
701+
702 def insertFakeChangesFile(fileID, path=None):
703 """Insert a fake changes file into the librarian.
704
705@@ -34,11 +39,11 @@
706 """
707 if path is None:
708 path = datadir("ed-0.2-21/ed_0.2-21_source.changes")
709- changes_file_obj = open(path, 'r')
710- test_changes_file = changes_file_obj.read()
711- changes_file_obj.close()
712+ with open(path, 'r') as changes_file_obj:
713+ test_changes_file = changes_file_obj.read()
714 fillLibrarianFile(fileID, content=test_changes_file)
715
716+
717 def insertFakeChangesFileForAllPackageUploads():
718 """Ensure all the PackageUpload records point to a valid changes file."""
719 for id in set(pu.changesfile.id for pu in PackageUploadSet()):
720@@ -53,6 +58,7 @@
721 self.distroseries = distroseries
722 self.buildid = buildid
723
724+
725 def getPolicy(name='anything', distro='ubuntu', distroseries=None,
726 buildid=None):
727 """Build and return an Upload Policy for the given context."""
728
729=== modified file 'lib/lp/archiveuploader/tests/test_buildduploads.py'
730--- lib/lp/archiveuploader/tests/test_buildduploads.py 2010-07-18 00:26:33 +0000
731+++ lib/lp/archiveuploader/tests/test_buildduploads.py 2010-08-12 19:18:45 +0000
732@@ -7,7 +7,6 @@
733
734 from lp.archiveuploader.tests.test_securityuploads import (
735 TestStagedBinaryUploadBase)
736-from lp.archiveuploader.uploadprocessor import UploadProcessor
737 from lp.registry.interfaces.pocket import PackagePublishingPocket
738 from canonical.database.constants import UTC_NOW
739 from canonical.launchpad.interfaces import PackagePublishingStatus
740@@ -84,8 +83,8 @@
741 """Setup an UploadProcessor instance for a given buildd context."""
742 self.options.context = self.policy
743 self.options.buildid = str(build_candidate.id)
744- self.uploadprocessor = UploadProcessor(
745- self.options, self.layer.txn, self.log)
746+ self.uploadprocessor = self.getUploadProcessor(
747+ self.layer.txn)
748
749 def testDelayedBinaryUpload(self):
750 """Check if Soyuz copes with delayed binary uploads.
751
752=== modified file 'lib/lp/archiveuploader/tests/test_ppauploadprocessor.py'
753--- lib/lp/archiveuploader/tests/test_ppauploadprocessor.py 2010-08-02 02:13:52 +0000
754+++ lib/lp/archiveuploader/tests/test_ppauploadprocessor.py 2010-08-12 19:18:45 +0000
755@@ -18,7 +18,6 @@
756 from zope.security.proxy import removeSecurityProxy
757
758 from lp.app.errors import NotFoundError
759-from lp.archiveuploader.uploadprocessor import UploadProcessor
760 from lp.archiveuploader.tests.test_uploadprocessor import (
761 TestUploadProcessorBase)
762 from canonical.config import config
763@@ -74,8 +73,7 @@
764
765 # Set up the uploadprocessor with appropriate options and logger
766 self.options.context = 'insecure'
767- self.uploadprocessor = UploadProcessor(
768- self.options, self.layer.txn, self.log)
769+ self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
770
771 def assertEmail(self, contents=None, recipients=None,
772 ppa_header='name16'):
773@@ -1224,8 +1222,7 @@
774
775 # Re-initialize uploadprocessor since it depends on the new
776 # transaction reset by switchDbUser.
777- self.uploadprocessor = UploadProcessor(
778- self.options, self.layer.txn, self.log)
779+ self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
780
781 def testPPASizeQuotaSourceRejection(self):
782 """Verify the size quota check for PPA uploads.
783
784=== modified file 'lib/lp/archiveuploader/tests/test_recipeuploads.py'
785--- lib/lp/archiveuploader/tests/test_recipeuploads.py 2010-07-22 15:24:02 +0000
786+++ lib/lp/archiveuploader/tests/test_recipeuploads.py 2010-08-12 19:18:45 +0000
787@@ -12,7 +12,6 @@
788
789 from lp.archiveuploader.tests.test_uploadprocessor import (
790 TestUploadProcessorBase)
791-from lp.archiveuploader.uploadprocessor import UploadProcessor
792 from lp.buildmaster.interfaces.buildbase import BuildStatus
793 from lp.code.interfaces.sourcepackagerecipebuild import (
794 ISourcePackageRecipeBuildSource)
795@@ -42,8 +41,8 @@
796 self.options.context = 'recipe'
797 self.options.buildid = self.build.id
798
799- self.uploadprocessor = UploadProcessor(
800- self.options, self.layer.txn, self.log)
801+ self.uploadprocessor = self.getUploadProcessor(
802+ self.layer.txn)
803
804 def testSetsBuildAndState(self):
805 # Ensure that the upload processor correctly links the SPR to
806
807=== modified file 'lib/lp/archiveuploader/tests/test_securityuploads.py'
808--- lib/lp/archiveuploader/tests/test_securityuploads.py 2010-07-18 00:26:33 +0000
809+++ lib/lp/archiveuploader/tests/test_securityuploads.py 2010-08-12 19:18:45 +0000
810@@ -11,7 +11,6 @@
811
812 from lp.archiveuploader.tests.test_uploadprocessor import (
813 TestUploadProcessorBase)
814-from lp.archiveuploader.uploadprocessor import UploadProcessor
815 from lp.registry.interfaces.pocket import PackagePublishingPocket
816 from lp.soyuz.model.binarypackagebuild import BinaryPackageBuild
817 from lp.soyuz.model.processor import ProcessorFamily
818@@ -70,8 +69,7 @@
819 self.options.context = self.policy
820 self.options.nomails = self.no_mails
821 # Set up the uploadprocessor with appropriate options and logger
822- self.uploadprocessor = UploadProcessor(
823- self.options, self.layer.txn, self.log)
824+ self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
825 self.builds_before_upload = BinaryPackageBuild.select().count()
826 self.source_queue = None
827 self._uploadSource()
828@@ -232,8 +230,7 @@
829 """
830 build_candidate = self._createBuild('i386')
831 self.options.buildid = str(build_candidate.id)
832- self.uploadprocessor = UploadProcessor(
833- self.options, self.layer.txn, self.log)
834+ self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
835
836 build_used = self._uploadBinary('i386')
837
838@@ -254,8 +251,7 @@
839 """
840 build_candidate = self._createBuild('hppa')
841 self.options.buildid = str(build_candidate.id)
842- self.uploadprocessor = UploadProcessor(
843- self.options, self.layer.txn, self.log)
844+ self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
845
846 self.assertRaises(AssertionError, self._uploadBinary, 'i386')
847
848
849=== modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py'
850--- lib/lp/archiveuploader/tests/test_uploadprocessor.py 2010-08-02 02:13:52 +0000
851+++ lib/lp/archiveuploader/tests/test_uploadprocessor.py 2010-08-12 19:18:45 +0000
852@@ -23,8 +23,10 @@
853 from zope.security.proxy import removeSecurityProxy
854
855 from lp.app.errors import NotFoundError
856-from lp.archiveuploader.uploadpolicy import AbstractUploadPolicy
857+from lp.archiveuploader.uploadpolicy import (AbstractUploadPolicy,
858+ findPolicyByOptions)
859 from lp.archiveuploader.uploadprocessor import UploadProcessor
860+from lp.buildmaster.interfaces.buildbase import BuildStatus
861 from canonical.config import config
862 from canonical.database.constants import UTC_NOW
863 from lp.soyuz.model.archivepermission import ArchivePermission
864@@ -59,7 +61,7 @@
865 ISourcePackageNameSet)
866 from lp.services.mail import stub
867 from canonical.launchpad.testing.fakepackager import FakePackager
868-from lp.testing import TestCaseWithFactory
869+from lp.testing import TestCase, TestCaseWithFactory
870 from lp.testing.mail_helpers import pop_notifications
871 from canonical.launchpad.webapp.errorlog import ErrorReportingUtility
872 from canonical.testing import LaunchpadZopelessLayer
873@@ -113,7 +115,8 @@
874 super(TestUploadProcessorBase, self).setUp()
875
876 self.queue_folder = tempfile.mkdtemp()
877- os.makedirs(os.path.join(self.queue_folder, "incoming"))
878+ self.incoming_folder = os.path.join(self.queue_folder, "incoming")
879+ os.makedirs(self.incoming_folder)
880
881 self.test_files_dir = os.path.join(config.root,
882 "lib/lp/archiveuploader/tests/data/suite")
883@@ -139,6 +142,30 @@
884 shutil.rmtree(self.queue_folder)
885 super(TestUploadProcessorBase, self).tearDown()
886
887+ def getUploadProcessor(self, txn):
888+ def getPolicy(distro):
889+ self.options.distro = distro.name
890+ return findPolicyByOptions(self.options)
891+ return UploadProcessor(
892+ self.options.base_fsroot, self.options.dryrun,
893+ self.options.nomails,
894+ self.options.keep, getPolicy, txn, self.log)
895+
896+ def publishPackage(self, packagename, version, source=True,
897+ archive=None):
898+ """Publish a single package that is currently NEW in the queue."""
899+ queue_items = self.breezy.getQueueItems(
900+ status=PackageUploadStatus.NEW, name=packagename,
901+ version=version, exact_match=True, archive=archive)
902+ self.assertEqual(queue_items.count(), 1)
903+ queue_item = queue_items[0]
904+ queue_item.setAccepted()
905+ if source:
906+ pubrec = queue_item.sources[0].publish(self.log)
907+ else:
908+ pubrec = queue_item.builds[0].publish(self.log)
909+ return pubrec
910+
911 def assertLogContains(self, line):
912 """Assert if a given line is present in the log messages."""
913 self.assertTrue(line in self.log.lines,
914@@ -208,25 +235,29 @@
915 filename, len(content), StringIO(content),
916 'application/x-gtar')
917
918- def queueUpload(self, upload_name, relative_path="", test_files_dir=None):
919+ def queueUpload(self, upload_name, relative_path="", test_files_dir=None,
920+ queue_entry=None):
921 """Queue one of our test uploads.
922
923- upload_name is the name of the test upload directory. It is also
924+ upload_name is the name of the test upload directory. If there
925+ is no explicit queue entry name specified, it is also
926 the name of the queue entry directory we create.
927 relative_path is the path to create inside the upload, eg
928 ubuntu/~malcc/default. If not specified, defaults to "".
929
930 Return the path to the upload queue entry directory created.
931 """
932+ if queue_entry is None:
933+ queue_entry = upload_name
934 target_path = os.path.join(
935- self.queue_folder, "incoming", upload_name, relative_path)
936+ self.incoming_folder, queue_entry, relative_path)
937 if test_files_dir is None:
938 test_files_dir = self.test_files_dir
939 upload_dir = os.path.join(test_files_dir, upload_name)
940 if relative_path:
941 os.makedirs(os.path.dirname(target_path))
942 shutil.copytree(upload_dir, target_path)
943- return os.path.join(self.queue_folder, "incoming", upload_name)
944+ return os.path.join(self.incoming_folder, queue_entry)
945
946 def processUpload(self, processor, upload_dir):
947 """Process an upload queue entry directory.
948@@ -248,8 +279,7 @@
949 self.layer.txn.commit()
950 if policy is not None:
951 self.options.context = policy
952- return UploadProcessor(
953- self.options, self.layer.txn, self.log)
954+ return self.getUploadProcessor(self.layer.txn)
955
956 def assertEmail(self, contents=None, recipients=None):
957 """Check last email content and recipients.
958@@ -341,24 +371,9 @@
959 "Expected acceptance email not rejection. Actually Got:\n%s"
960 % raw_msg)
961
962- def _publishPackage(self, packagename, version, source=True,
963- archive=None):
964- """Publish a single package that is currently NEW in the queue."""
965- queue_items = self.breezy.getQueueItems(
966- status=PackageUploadStatus.NEW, name=packagename,
967- version=version, exact_match=True, archive=archive)
968- self.assertEqual(queue_items.count(), 1)
969- queue_item = queue_items[0]
970- queue_item.setAccepted()
971- if source:
972- pubrec = queue_item.sources[0].publish(self.log)
973- else:
974- pubrec = queue_item.builds[0].publish(self.log)
975- return pubrec
976-
977 def testInstantiate(self):
978 """UploadProcessor should instantiate"""
979- up = UploadProcessor(self.options, None, self.log)
980+ up = self.getUploadProcessor(None)
981
982 def testLocateDirectories(self):
983 """Return a sorted list of subdirs in a directory.
984@@ -372,7 +387,7 @@
985 os.mkdir("%s/dir1" % testdir)
986 os.mkdir("%s/dir2" % testdir)
987
988- up = UploadProcessor(self.options, None, self.log)
989+ up = self.getUploadProcessor(None)
990 located_dirs = up.locateDirectories(testdir)
991 self.assertEqual(located_dirs, ['dir1', 'dir2', 'dir3'])
992 finally:
993@@ -390,7 +405,7 @@
994 open("%s/2_source.changes" % testdir, "w").close()
995 open("%s/3.not_changes" % testdir, "w").close()
996
997- up = UploadProcessor(self.options, None, self.log)
998+ up = self.getUploadProcessor(None)
999 located_files = up.locateChangesFiles(testdir)
1000 self.assertEqual(
1001 located_files, ["2_source.changes", "1.changes"])
1002@@ -418,7 +433,7 @@
1003
1004 # Move it
1005 self.options.base_fsroot = testdir
1006- up = UploadProcessor(self.options, None, self.log)
1007+ up = self.getUploadProcessor(None)
1008 up.moveUpload(upload, target_name)
1009
1010 # Check it moved
1011@@ -439,7 +454,7 @@
1012
1013 # Remove it
1014 self.options.base_fsroot = testdir
1015- up = UploadProcessor(self.options, None, self.log)
1016+ up = self.getUploadProcessor(None)
1017 up.moveProcessedUpload(upload, "accepted")
1018
1019 # Check it was removed, not moved
1020@@ -462,7 +477,7 @@
1021
1022 # Move it
1023 self.options.base_fsroot = testdir
1024- up = UploadProcessor(self.options, None, self.log)
1025+ up = self.getUploadProcessor(None)
1026 up.moveProcessedUpload(upload, "rejected")
1027
1028 # Check it moved
1029@@ -485,7 +500,7 @@
1030
1031 # Remove it
1032 self.options.base_fsroot = testdir
1033- up = UploadProcessor(self.options, None, self.log)
1034+ up = self.getUploadProcessor(None)
1035 up.removeUpload(upload)
1036
1037 # Check it was removed, not moved
1038@@ -498,7 +513,7 @@
1039
1040 def testOrderFilenames(self):
1041 """orderFilenames sorts _source.changes ahead of other files."""
1042- up = UploadProcessor(self.options, None, self.log)
1043+ up = self.getUploadProcessor(None)
1044
1045 self.assertEqual(["d_source.changes", "a", "b", "c"],
1046 up.orderFilenames(["b", "a", "d_source.changes", "c"]))
1047@@ -522,8 +537,7 @@
1048 # Register our broken upload policy
1049 AbstractUploadPolicy._registerPolicy(BrokenUploadPolicy)
1050 self.options.context = 'broken'
1051- uploadprocessor = UploadProcessor(
1052- self.options, self.layer.txn, self.log)
1053+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1054
1055 # Upload a package to Breezy.
1056 upload_dir = self.queueUpload("baz_1.0-1")
1057@@ -634,7 +648,7 @@
1058 # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
1059 upload_dir = self.queueUpload("bar_1.0-1")
1060 self.processUpload(uploadprocessor, upload_dir)
1061- bar_source_pub = self._publishPackage('bar', '1.0-1')
1062+ bar_source_pub = self.publishPackage('bar', '1.0-1')
1063 [bar_original_build] = bar_source_pub.createMissingBuilds()
1064
1065 # Move the source from the accepted queue.
1066@@ -653,7 +667,7 @@
1067 self.processUpload(uploadprocessor, upload_dir)
1068 self.assertEqual(
1069 uploadprocessor.last_processed_upload.is_rejected, False)
1070- bar_bin_pubs = self._publishPackage('bar', '1.0-1', source=False)
1071+ bar_bin_pubs = self.publishPackage('bar', '1.0-1', source=False)
1072 # Mangle its publishing component to "restricted" so we can check
1073 # the copy archive ancestry override later.
1074 restricted = getUtility(IComponentSet)["restricted"]
1075@@ -746,14 +760,14 @@
1076 # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
1077 upload_dir = self.queueUpload("bar_1.0-1")
1078 self.processUpload(uploadprocessor, upload_dir)
1079- bar_source_pub = self._publishPackage('bar', '1.0-1')
1080+ bar_source_pub = self.publishPackage('bar', '1.0-1')
1081 [bar_original_build] = bar_source_pub.createMissingBuilds()
1082
1083 self.options.context = 'buildd'
1084 self.options.buildid = bar_original_build.id
1085 upload_dir = self.queueUpload("bar_1.0-1_binary")
1086 self.processUpload(uploadprocessor, upload_dir)
1087- [bar_binary_pub] = self._publishPackage("bar", "1.0-1", source=False)
1088+ [bar_binary_pub] = self.publishPackage("bar", "1.0-1", source=False)
1089
1090 # Prepare ubuntu/breezy-autotest to build sources in i386.
1091 breezy_autotest = self.ubuntu['breezy-autotest']
1092@@ -803,7 +817,7 @@
1093 # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
1094 upload_dir = self.queueUpload("bar_1.0-1")
1095 self.processUpload(uploadprocessor, upload_dir)
1096- bar_source_old = self._publishPackage('bar', '1.0-1')
1097+ bar_source_old = self.publishPackage('bar', '1.0-1')
1098
1099 # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
1100 upload_dir = self.queueUpload("bar_1.0-2")
1101@@ -816,7 +830,7 @@
1102 self.options.buildid = bar_original_build.id
1103 upload_dir = self.queueUpload("bar_1.0-2_binary")
1104 self.processUpload(uploadprocessor, upload_dir)
1105- [bar_binary_pub] = self._publishPackage("bar", "1.0-2", source=False)
1106+ [bar_binary_pub] = self.publishPackage("bar", "1.0-2", source=False)
1107
1108 # Create a COPY archive for building in non-virtual builds.
1109 uploader = getUtility(IPersonSet).getByName('name16')
1110@@ -971,7 +985,7 @@
1111 partner_archive = getUtility(IArchiveSet).getByDistroPurpose(
1112 self.ubuntu, ArchivePurpose.PARTNER)
1113 self.assertTrue(partner_archive)
1114- self._publishPackage("foocomm", "1.0-1", archive=partner_archive)
1115+ self.publishPackage("foocomm", "1.0-1", archive=partner_archive)
1116
1117 # Check the publishing record's archive and component.
1118 foocomm_spph = SourcePackagePublishingHistory.selectOneBy(
1119@@ -1015,7 +1029,7 @@
1120 self.assertEqual(foocomm_bpr.component.name, 'partner')
1121
1122 # Publish the upload so we can check the publishing record.
1123- self._publishPackage("foocomm", "1.0-1", source=False)
1124+ self.publishPackage("foocomm", "1.0-1", source=False)
1125
1126 # Check the publishing record's archive and component.
1127 foocomm_bpph = BinaryPackagePublishingHistory.selectOneBy(
1128@@ -1054,14 +1068,14 @@
1129 # Accept and publish the upload.
1130 partner_archive = getUtility(IArchiveSet).getByDistroPurpose(
1131 self.ubuntu, ArchivePurpose.PARTNER)
1132- self._publishPackage("foocomm", "1.0-1", archive=partner_archive)
1133+ self.publishPackage("foocomm", "1.0-1", archive=partner_archive)
1134
1135 # Now do the same thing with a binary package.
1136 upload_dir = self.queueUpload("foocomm_1.0-1_binary")
1137 self.processUpload(uploadprocessor, upload_dir)
1138
1139 # Accept and publish the upload.
1140- self._publishPackage("foocomm", "1.0-1", source=False,
1141+ self.publishPackage("foocomm", "1.0-1", source=False,
1142 archive=partner_archive)
1143
1144 # Upload the next source version of the package.
1145@@ -1105,8 +1119,7 @@
1146 self.breezy.status = SeriesStatus.CURRENT
1147 self.layer.txn.commit()
1148 self.options.context = 'insecure'
1149- uploadprocessor = UploadProcessor(
1150- self.options, self.layer.txn, self.log)
1151+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1152
1153 # Upload a package for Breezy.
1154 upload_dir = self.queueUpload("foocomm_1.0-1_proposed")
1155@@ -1124,8 +1137,7 @@
1156 self.breezy.status = SeriesStatus.CURRENT
1157 self.layer.txn.commit()
1158 self.options.context = 'insecure'
1159- uploadprocessor = UploadProcessor(
1160- self.options, self.layer.txn, self.log)
1161+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1162
1163 # Upload a package for Breezy.
1164 upload_dir = self.queueUpload("foocomm_1.0-1")
1165@@ -1140,8 +1152,7 @@
1166 pocket and ensure it fails."""
1167 # Set up the uploadprocessor with appropriate options and logger.
1168 self.options.context = 'insecure'
1169- uploadprocessor = UploadProcessor(
1170- self.options, self.layer.txn, self.log)
1171+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1172
1173 # Upload a package for Breezy.
1174 upload_dir = self.queueUpload("foocomm_1.0-1_updates")
1175@@ -1302,8 +1313,7 @@
1176 used.
1177 That exception will then initiate the creation of an OOPS report.
1178 """
1179- processor = UploadProcessor(
1180- self.options, self.layer.txn, self.log)
1181+ processor = self.getUploadProcessor(self.layer.txn)
1182
1183 upload_dir = self.queueUpload("foocomm_1.0-1_proposed")
1184 bogus_changesfile_data = '''
1185@@ -1346,8 +1356,7 @@
1186 self.setupBreezy()
1187 self.layer.txn.commit()
1188 self.options.context = 'absolutely-anything'
1189- uploadprocessor = UploadProcessor(
1190- self.options, self.layer.txn, self.log)
1191+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1192
1193 # Upload the source first to enable the binary later:
1194 upload_dir = self.queueUpload("bar_1.0-1_lzma")
1195@@ -1357,7 +1366,7 @@
1196 self.assertTrue(
1197 "rejected" not in raw_msg,
1198 "Failed to upload bar source:\n%s" % raw_msg)
1199- self._publishPackage("bar", "1.0-1")
1200+ self.publishPackage("bar", "1.0-1")
1201 # Clear out emails generated during upload.
1202 ignore = pop_notifications()
1203
1204@@ -1456,15 +1465,14 @@
1205 permission=ArchivePermissionType.UPLOAD, person=uploader,
1206 component=restricted)
1207
1208- uploadprocessor = UploadProcessor(
1209- self.options, self.layer.txn, self.log)
1210+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1211
1212 # Upload the first version and accept it to make it known in
1213 # Ubuntu. The uploader has rights to upload NEW packages to
1214 # components that he does not have direct rights to.
1215 upload_dir = self.queueUpload("bar_1.0-1")
1216 self.processUpload(uploadprocessor, upload_dir)
1217- bar_source_pub = self._publishPackage('bar', '1.0-1')
1218+ bar_source_pub = self.publishPackage('bar', '1.0-1')
1219 # Clear out emails generated during upload.
1220 ignore = pop_notifications()
1221
1222@@ -1509,15 +1517,14 @@
1223 permission=ArchivePermissionType.UPLOAD, person=uploader,
1224 component=restricted)
1225
1226- uploadprocessor = UploadProcessor(
1227- self.options, self.layer.txn, self.log)
1228+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1229
1230 # Upload the first version and accept it to make it known in
1231 # Ubuntu. The uploader has rights to upload NEW packages to
1232 # components that he does not have direct rights to.
1233 upload_dir = self.queueUpload("bar_1.0-1")
1234 self.processUpload(uploadprocessor, upload_dir)
1235- bar_source_pub = self._publishPackage('bar', '1.0-1')
1236+ bar_source_pub = self.publishPackage('bar', '1.0-1')
1237 # Clear out emails generated during upload.
1238 ignore = pop_notifications()
1239
1240@@ -1590,8 +1597,7 @@
1241 # with pointer to the Soyuz questions in Launchpad and the
1242 # reason why the message was sent to the current recipients.
1243 self.setupBreezy()
1244- uploadprocessor = UploadProcessor(
1245- self.options, self.layer.txn, self.log)
1246+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1247
1248 upload_dir = self.queueUpload("bar_1.0-1", "boing")
1249 self.processUpload(uploadprocessor, upload_dir)
1250@@ -1636,8 +1642,7 @@
1251 self.setupBreezy()
1252 self.layer.txn.commit()
1253 self.options.context = 'absolutely-anything'
1254- uploadprocessor = UploadProcessor(
1255- self.options, self.layer.txn, self.log)
1256+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1257
1258 # Upload the source.
1259 upload_dir = self.queueUpload("bar_1.0-1_3.0-quilt")
1260@@ -1655,8 +1660,7 @@
1261 permitted_formats=[SourcePackageFormat.FORMAT_3_0_QUILT])
1262 self.layer.txn.commit()
1263 self.options.context = 'absolutely-anything'
1264- uploadprocessor = UploadProcessor(
1265- self.options, self.layer.txn, self.log)
1266+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1267
1268 # Upload the source.
1269 upload_dir = self.queueUpload("bar_1.0-1_3.0-quilt")
1270@@ -1666,7 +1670,7 @@
1271 self.assertTrue(
1272 "rejected" not in raw_msg,
1273 "Failed to upload bar source:\n%s" % raw_msg)
1274- spph = self._publishPackage("bar", "1.0-1")
1275+ spph = self.publishPackage("bar", "1.0-1")
1276
1277 self.assertEquals(
1278 sorted((sprf.libraryfile.filename, sprf.filetype)
1279@@ -1689,8 +1693,7 @@
1280 permitted_formats=[SourcePackageFormat.FORMAT_3_0_QUILT])
1281 self.layer.txn.commit()
1282 self.options.context = 'absolutely-anything'
1283- uploadprocessor = UploadProcessor(
1284- self.options, self.layer.txn, self.log)
1285+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1286
1287 # Upload the first source.
1288 upload_dir = self.queueUpload("bar_1.0-1_3.0-quilt")
1289@@ -1700,7 +1703,7 @@
1290 self.assertTrue(
1291 "rejected" not in raw_msg,
1292 "Failed to upload bar source:\n%s" % raw_msg)
1293- spph = self._publishPackage("bar", "1.0-1")
1294+ spph = self.publishPackage("bar", "1.0-1")
1295
1296 # Upload another source sharing the same (component) orig.
1297 upload_dir = self.queueUpload("bar_1.0-2_3.0-quilt_without_orig")
1298@@ -1728,8 +1731,7 @@
1299 permitted_formats=[SourcePackageFormat.FORMAT_3_0_NATIVE])
1300 self.layer.txn.commit()
1301 self.options.context = 'absolutely-anything'
1302- uploadprocessor = UploadProcessor(
1303- self.options, self.layer.txn, self.log)
1304+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1305
1306 # Upload the source.
1307 upload_dir = self.queueUpload("bar_1.0_3.0-native")
1308@@ -1739,7 +1741,7 @@
1309 self.assertTrue(
1310 "rejected" not in raw_msg,
1311 "Failed to upload bar source:\n%s" % raw_msg)
1312- spph = self._publishPackage("bar", "1.0")
1313+ spph = self.publishPackage("bar", "1.0")
1314
1315 self.assertEquals(
1316 sorted((sprf.libraryfile.filename, sprf.filetype)
1317@@ -1754,8 +1756,7 @@
1318 self.setupBreezy()
1319 self.layer.txn.commit()
1320 self.options.context = 'absolutely-anything'
1321- uploadprocessor = UploadProcessor(
1322- self.options, self.layer.txn, self.log)
1323+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1324
1325 # Upload the source.
1326 upload_dir = self.queueUpload("bar_1.0-1_1.0-bzip2")
1327@@ -1772,8 +1773,7 @@
1328 self.setupBreezy()
1329 breezy = self.ubuntu['breezy']
1330 breezy.status = SeriesStatus.CURRENT
1331- uploadprocessor = UploadProcessor(
1332- self.options, self.layer.txn, self.log)
1333+ uploadprocessor = self.getUploadProcessor(self.layer.txn)
1334
1335 upload_dir = self.queueUpload("bar_1.0-1")
1336 self.processUpload(uploadprocessor, upload_dir)
1337
1338=== modified file 'lib/lp/archiveuploader/uploadprocessor.py'
1339--- lib/lp/archiveuploader/uploadprocessor.py 2010-08-04 00:30:56 +0000
1340+++ lib/lp/archiveuploader/uploadprocessor.py 2010-08-12 19:18:45 +0000
1341@@ -60,7 +60,7 @@
1342 from lp.archiveuploader.nascentupload import (
1343 NascentUpload, FatalUploadError, EarlyReturnUploadError)
1344 from lp.archiveuploader.uploadpolicy import (
1345- findPolicyByOptions, UploadPolicyError)
1346+ UploadPolicyError)
1347 from lp.soyuz.interfaces.archive import IArchiveSet, NoSuchPPA
1348 from lp.registry.interfaces.distribution import IDistributionSet
1349 from lp.registry.interfaces.person import IPersonSet
1350@@ -108,16 +108,33 @@
1351 class UploadProcessor:
1352 """Responsible for processing uploads. See module docstring."""
1353
1354- def __init__(self, options, ztm, log):
1355- self.options = options
1356+ def __init__(self, base_fsroot, dry_run, no_mails, keep, policy_for_distro,
1357+ ztm, log):
1358+ """Create a new upload processor.
1359+
1360+ :param base_fsroot: Root path for queue to use
1361+ :param dry_run: Run but don't commit changes to database
1362+ :param no_mails: Don't send out any emails
1363+ :param builds: Interpret leaf names as build ids
1364+ :param keep: Leave the files in place, don't move them away
1365+ :param policy_for_distro: callback to obtain Policy object for a
1366+ distribution
1367+ :param ztm: Database transaction to use
1368+ :param log: Logger to use for reporting
1369+ """
1370+ self.base_fsroot = base_fsroot
1371+ self.dry_run = dry_run
1372+ self.keep = keep
1373+ self.last_processed_upload = None
1374+ self.log = log
1375+ self.no_mails = no_mails
1376+ self._getPolicyForDistro = policy_for_distro
1377 self.ztm = ztm
1378- self.log = log
1379- self.last_processed_upload = None
1380
1381- def processUploadQueue(self):
1382+ def processUploadQueue(self, leaf_name=None):
1383 """Search for uploads, and process them.
1384
1385- Uploads are searched for in the 'incoming' directory inside the
1386+ Uploads are searched for in the 'incoming' directory inside the
1387 base_fsroot.
1388
1389 This method also creates the 'incoming', 'accepted', 'rejected', and
1390@@ -127,19 +144,22 @@
1391 self.log.debug("Beginning processing")
1392
1393 for subdir in ["incoming", "accepted", "rejected", "failed"]:
1394- full_subdir = os.path.join(self.options.base_fsroot, subdir)
1395+ full_subdir = os.path.join(self.base_fsroot, subdir)
1396 if not os.path.exists(full_subdir):
1397 self.log.debug("Creating directory %s" % full_subdir)
1398 os.mkdir(full_subdir)
1399
1400- fsroot = os.path.join(self.options.base_fsroot, "incoming")
1401+ fsroot = os.path.join(self.base_fsroot, "incoming")
1402 uploads_to_process = self.locateDirectories(fsroot)
1403 self.log.debug("Checked in %s, found %s"
1404 % (fsroot, uploads_to_process))
1405 for upload in uploads_to_process:
1406 self.log.debug("Considering upload %s" % upload)
1407+ if leaf_name is not None and upload != leaf_name:
1408+ self.log.debug("Skipping %s -- does not match %s" % (
1409+ upload, leaf_name))
1410+ continue
1411 self.processUpload(fsroot, upload)
1412-
1413 finally:
1414 self.log.debug("Rolling back any remaining transactions.")
1415 self.ztm.abort()
1416@@ -152,16 +172,7 @@
1417 is 'failed', otherwise it is the worst of the results from the
1418 individual changes files, in order 'failed', 'rejected', 'accepted'.
1419
1420- If the leafname option is set but its value is not the same as the
1421- name of the upload directory, skip it entirely.
1422-
1423 """
1424- if (self.options.leafname is not None and
1425- upload != self.options.leafname):
1426- self.log.debug("Skipping %s -- does not match %s" % (
1427- upload, self.options.leafname))
1428- return
1429-
1430 upload_path = os.path.join(fsroot, upload)
1431 changes_files = self.locateChangesFiles(upload_path)
1432
1433@@ -242,7 +253,7 @@
1434 # Skip lockfile deletion, see similar code in lp.poppy.hooks.
1435 fsroot_lock.release(skip_delete=True)
1436
1437- sorted_dir_names = sorted(
1438+ sorted_dir_names = sorted(
1439 dir_name
1440 for dir_name in dir_names
1441 if os.path.isdir(os.path.join(fsroot, dir_name)))
1442@@ -321,8 +332,7 @@
1443 "https://help.launchpad.net/Packaging/PPA#Uploading "
1444 "and update your configuration.")))
1445 self.log.debug("Finding fresh policy")
1446- self.options.distro = distribution.name
1447- policy = findPolicyByOptions(self.options)
1448+ policy = self._getPolicyForDistro(distribution)
1449 policy.archive = archive
1450
1451 # DistroSeries overriding respect the following precedence:
1452@@ -396,7 +406,7 @@
1453 # when transaction is committed) this will cause any emails sent
1454 # sent by do_reject to be lost.
1455 notify = True
1456- if self.options.dryrun or self.options.nomails:
1457+ if self.dry_run or self.no_mails:
1458 notify = False
1459 if upload.is_rejected:
1460 result = UploadStatusEnum.REJECTED
1461@@ -415,7 +425,7 @@
1462 for msg in upload.rejections:
1463 self.log.warn("\t%s" % msg)
1464
1465- if self.options.dryrun:
1466+ if self.dry_run:
1467 self.log.info("Dry run, aborting transaction.")
1468 self.ztm.abort()
1469 else:
1470@@ -434,7 +444,7 @@
1471 This includes moving the given upload directory and moving the
1472 matching .distro file, if it exists.
1473 """
1474- if self.options.keep or self.options.dryrun:
1475+ if self.keep or self.dry_run:
1476 self.log.debug("Keeping contents untouched")
1477 return
1478
1479@@ -462,21 +472,21 @@
1480 This includes moving the given upload directory and moving the
1481 matching .distro file, if it exists.
1482 """
1483- if self.options.keep or self.options.dryrun:
1484+ if self.keep or self.dry_run:
1485 self.log.debug("Keeping contents untouched")
1486 return
1487
1488 pathname = os.path.basename(upload)
1489
1490 target_path = os.path.join(
1491- self.options.base_fsroot, subdir_name, pathname)
1492+ self.base_fsroot, subdir_name, pathname)
1493 self.log.debug("Moving upload directory %s to %s" %
1494 (upload, target_path))
1495 shutil.move(upload, target_path)
1496
1497 distro_filename = upload + ".distro"
1498 if os.path.isfile(distro_filename):
1499- target_path = os.path.join(self.options.base_fsroot, subdir_name,
1500+ target_path = os.path.join(self.base_fsroot, subdir_name,
1501 os.path.basename(distro_filename))
1502 self.log.debug("Moving distro file %s to %s" % (distro_filename,
1503 target_path))
1504
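
In uploadprocessor.py the constructor no longer takes the script's options object; it now receives base_fsroot, dry_run, no_mails, keep, a policy_for_distro callback, the transaction and the logger explicitly, and the leaf-name filter that used to live in processUpload() has moved into processUploadQueue(). (Note that the new docstring still documents a "builds" parameter that is not part of the new signature.) A minimal sketch of driving the refactored class directly, with assumed example values for the path, callback, transaction and logger:

    # Minimal sketch with assumed values; policy_for_distro is any callable
    # that takes a distribution and returns an upload policy.
    processor = UploadProcessor(
        "/srv/launchpad/upload-queue",   # base_fsroot (example path)
        False,                           # dry_run
        False,                           # no_mails
        False,                           # keep
        policy_for_distro, txn, logger)
    # Only the upload directory named "upload-1234" is processed, if present;
    # passing None (the default) processes every directory under incoming/.
    processor.processUploadQueue(leaf_name="upload-1234")
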
1505=== modified file 'lib/lp/buildmaster/model/buildfarmjob.py'
1506--- lib/lp/buildmaster/model/buildfarmjob.py 2010-06-16 16:01:08 +0000
1507+++ lib/lp/buildmaster/model/buildfarmjob.py 2010-08-12 19:18:45 +0000
1508@@ -16,7 +16,7 @@
1509
1510 import pytz
1511
1512-from storm.expr import And, Coalesce, Desc, Join, LeftJoin, Select
1513+from storm.expr import Coalesce, Desc, LeftJoin, Or
1514 from storm.locals import Bool, DateTime, Int, Reference, Storm
1515 from storm.store import Store
1516
1517@@ -365,49 +365,38 @@
1518 # Currently only package builds can be private (via their
1519 # related archive), but not all build farm jobs will have a
1520 # related package build - hence the left join.
1521- left_join_pkg_builds = LeftJoin(
1522- BuildFarmJob,
1523- Join(
1524+ origin = [BuildFarmJob]
1525+ left_join_archive = [
1526+ LeftJoin(
1527 PackageBuild,
1528- Archive,
1529- And(PackageBuild.archive == Archive.id)),
1530- PackageBuild.build_farm_job == BuildFarmJob.id)
1531-
1532- filtered_builds = IStore(BuildFarmJob).using(
1533- left_join_pkg_builds).find(BuildFarmJob, *extra_clauses)
1534+ PackageBuild.build_farm_job == BuildFarmJob.id),
1535+ LeftJoin(
1536+ Archive, PackageBuild.archive == Archive.id),
1537+ ]
1538
1539 if user is None:
1540 # Anonymous requests don't get to see private builds at all.
1541- filtered_builds = filtered_builds.find(
1542- Coalesce(Archive.private, False) == False)
1543+ origin.extend(left_join_archive)
1544+ extra_clauses.append(Coalesce(Archive.private, False) == False)
1545
1546 elif user.inTeam(getUtility(ILaunchpadCelebrities).admin):
1547 # Admins get to see everything.
1548 pass
1549 else:
1550- # Everyone else sees a union of all public builds and the
1551+ # Everyone else sees all public builds and the
1552 # specific private builds to which they have access.
1553- filtered_builds = filtered_builds.find(
1554- Coalesce(Archive.private, False) == False)
1555-
1556- user_teams_subselect = Select(
1557- TeamParticipation.teamID,
1558- where=And(
1559- TeamParticipation.personID == user.id,
1560- TeamParticipation.teamID == Archive.ownerID))
1561- private_builds_for_user = IStore(BuildFarmJob).find(
1562- BuildFarmJob,
1563- PackageBuild.build_farm_job == BuildFarmJob.id,
1564- PackageBuild.archive == Archive.id,
1565- Archive.private == True,
1566- Archive.ownerID.is_in(user_teams_subselect),
1567- *extra_clauses)
1568-
1569- filtered_builds = filtered_builds.union(
1570- private_builds_for_user)
1571+ origin.extend(left_join_archive)
1572+ origin.append(LeftJoin(
1573+ TeamParticipation, TeamParticipation.teamID == Archive.ownerID))
1574+ extra_clauses.append(
1575+ Or(Coalesce(Archive.private, False) == False,
1576+ TeamParticipation.person == user))
1577+
1578+ filtered_builds = IStore(BuildFarmJob).using(*origin).find(
1579+ BuildFarmJob, *extra_clauses)
1580
1581 filtered_builds.order_by(
1582 Desc(BuildFarmJob.date_finished), BuildFarmJob.id)
1583+ filtered_builds.config(distinct=True)
1584
1585 return filtered_builds
1586-
1587
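
The buildfarmjob.py change collapses what used to be a UNION of two result sets (all public builds, plus a subselect of private builds owned by the user's teams) into a single query: left joins to PackageBuild, Archive and TeamParticipation, an OR clause over Archive.private and TeamParticipation.person, and distinct=True so that a user who participates in several owning teams does not see duplicate rows. Condensed, the non-admin authenticated branch of the new query has roughly this shape (names as in the module above; illustrative only):

    # Condensed view of the query built above for an authenticated,
    # non-admin user.
    origin = [
        BuildFarmJob,
        LeftJoin(PackageBuild,
                 PackageBuild.build_farm_job == BuildFarmJob.id),
        LeftJoin(Archive, PackageBuild.archive == Archive.id),
        LeftJoin(TeamParticipation,
                 TeamParticipation.teamID == Archive.ownerID),
        ]
    filtered_builds = IStore(BuildFarmJob).using(*origin).find(
        BuildFarmJob,
        Or(Coalesce(Archive.private, False) == False,
           TeamParticipation.person == user),
        *extra_clauses)
    filtered_builds.config(distinct=True)
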
1588=== modified file 'lib/lp/soyuz/scripts/soyuz_process_upload.py'
1589--- lib/lp/soyuz/scripts/soyuz_process_upload.py 2010-05-04 15:38:08 +0000
1590+++ lib/lp/soyuz/scripts/soyuz_process_upload.py 2010-08-12 19:18:45 +0000
1591@@ -8,6 +8,7 @@
1592
1593 import os
1594
1595+from lp.archiveuploader.uploadpolicy import findPolicyByOptions
1596 from lp.archiveuploader.uploadprocessor import UploadProcessor
1597 from lp.services.scripts.base import (
1598 LaunchpadCronScript, LaunchpadScriptFailure)
1599@@ -74,8 +75,13 @@
1600 "%s is not a directory" % self.options.base_fsroot)
1601
1602 self.logger.debug("Initialising connection.")
1603- UploadProcessor(
1604- self.options, self.txn, self.logger).processUploadQueue()
1605+ def getPolicy(distro):
1606+ self.options.distro = distro.name
1607+ return findPolicyByOptions(self.options)
1608+ processor = UploadProcessor(self.options.base_fsroot,
1609+ self.options.dryrun, self.options.nomails, self.options.keep,
1610+ getPolicy, self.txn, self.logger)
1611+ processor.processUploadQueue(self.options.leafname)
1612
1613 @property
1614 def lockfilename(self):
1615
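
On the script side, process-upload now builds the policy lazily: the getPolicy closure captures the script's options, fills in options.distro with the distribution the processor derived from the upload, and only then calls findPolicyByOptions. Because the processor treats policy_for_distro as an opaque callable, other callers (tests included) can substitute their own lookup; a hedged sketch of such a substitute that records which distributions were requested:

    # Illustrative stub only -- any callable with this shape will do.
    seen_distributions = []

    def recording_policy_for_distro(distribution):
        """Look up the policy the same way the script does, but keep a
        record of the distributions the processor asked about."""
        seen_distributions.append(distribution.name)
        options.distro = distribution.name
        return findPolicyByOptions(options)

    processor = UploadProcessor(
        options.base_fsroot, options.dryrun, options.nomails,
        options.keep, recording_policy_for_distro, txn, logger)
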
1616=== modified file 'utilities/sourcedeps.conf'
1617--- utilities/sourcedeps.conf 2010-08-03 14:59:22 +0000
1618+++ utilities/sourcedeps.conf 2010-08-12 19:18:45 +0000
1619@@ -12,5 +12,6 @@
1620 pygettextpo lp:~launchpad-pqm/pygettextpo/trunk;revno=24
1621 pygpgme lp:~launchpad-pqm/pygpgme/devel;revno=49
1622 subvertpy lp:~launchpad-pqm/subvertpy/trunk;revno=2042
1623+python-debian lp:~launchpad-pqm/python-debian/devel;revno=185
1624 testresources lp:~launchpad-pqm/testresources/dev;revno=16
1625 shipit lp:~launchpad-pqm/shipit/trunk;revno=8909 optional
