Merge lp:~salgado/launchpad/layer-specific-navigation into lp:launchpad/db-devel

Proposed by Guilherme Salgado
Status: Rejected
Rejected by: Paul Hummer
Proposed branch: lp:~salgado/launchpad/layer-specific-navigation
Merge into: lp:launchpad/db-devel
Diff against target: 1625 lines (+491/-212)
26 files modified
.bzrignore (+1/-1)
README (+106/-4)
lib/canonical/buildd/debian/changelog (+11/-1)
lib/canonical/buildd/debian/compat (+1/-0)
lib/canonical/buildd/debian/control (+4/-4)
lib/canonical/buildd/debian/launchpad-buildd.examples (+1/-0)
lib/canonical/buildd/debian/launchpad-buildd.init (+10/-0)
lib/canonical/buildd/debian/rules (+4/-3)
lib/canonical/buildd/debian/source/format (+1/-0)
lib/canonical/launchpad/doc/navigation.txt (+9/-1)
lib/canonical/launchpad/webapp/metazcml.py (+5/-9)
lib/canonical/launchpad/webapp/tests/test_navigation.py (+107/-0)
lib/lp/archivepublisher/config.py (+19/-14)
lib/lp/archivepublisher/ftparchive.py (+11/-3)
lib/lp/archivepublisher/tests/test_config.py (+19/-9)
lib/lp/archivepublisher/tests/test_ftparchive.py (+18/-2)
lib/lp/archiveuploader/tests/__init__.py (+9/-3)
lib/lp/archiveuploader/tests/test_buildduploads.py (+2/-3)
lib/lp/archiveuploader/tests/test_ppauploadprocessor.py (+2/-5)
lib/lp/archiveuploader/tests/test_recipeuploads.py (+2/-3)
lib/lp/archiveuploader/tests/test_securityuploads.py (+3/-7)
lib/lp/archiveuploader/tests/test_uploadprocessor.py (+78/-78)
lib/lp/archiveuploader/uploadprocessor.py (+38/-28)
lib/lp/buildmaster/model/buildfarmjob.py (+21/-32)
lib/lp/soyuz/scripts/soyuz_process_upload.py (+8/-2)
utilities/sourcedeps.conf (+1/-0)
To merge this branch: bzr merge lp:~salgado/launchpad/layer-specific-navigation
Reviewer: Launchpad code reviewers
Status: Pending
Review via email: mp+32339@code.launchpad.net

Description of the change

This branch makes it possible to register navigation classes for a
specific layer.

This allows us to use a different navigation class for, say,
IDistribution on the vostok vhost.
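
For illustration, a layer-specific registration would look roughly like the
sketch below. The layer attribute is what this branch adds; the module, class
and layer names here are placeholders rather than real registrations from the
tree:

    <browser:navigation
        module="some.browser.module"
        classes="DistributionNavigation"
        layer="some.layers.IVostokLayer" />

Without a layer attribute the directive keeps its old behaviour and registers
the classes for IDefaultBrowserLayer.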

One issue I've encountered is that the navigation directive unconditionally
registers the nav classes for IXMLRPCRequest (the layer of the xmlrpc vhost)
as well, and that causes a ConfigurationConflictError when more than one
navigation class is registered for a given context. One solution would be to
add yet another parameter to the navigation directive (used_for_xmlrpc=True)
and use that to decide whether or not to register the nav class for
IXMLRPCRequest.

I'm not particularly happy with that solution, as it doesn't make it possible
for vostok to override the navigation used on the xmlrpc vhost, but that may
not be a problem in practice: we only need custom navigation classes because
we want to remove traversals from the existing ones.
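
As a rough sketch of that alternative (not implemented in this branch), the
registration step of the directive handler could be factored along these
lines. register_navigation is a hypothetical helper, and it assumes the names
already used in metazcml.py (view, PublicPermission, IBrowserPublisher,
IXMLRPCRequest) are in scope:

    def register_navigation(_context, factory, for_, layer,
                            used_for_xmlrpc=True):
        # Register the navigation as the traversal component on the
        # requested layer (this part is what the branch already does).
        view(_context, factory, layer, '', for_,
            permission=PublicPermission, provides=IBrowserPublisher,
            allowed_interface=[IBrowserPublisher])
        if used_for_xmlrpc:
            # Optionally keep the old registration on the xmlrpc vhost, so
            # that only one navigation class per context claims that layer.
            view(_context, factory, IXMLRPCRequest, '', for_,
                permission=PublicPermission, provides=IBrowserPublisher)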

Paul Hummer (rockstar) wrote:

I approved this in the devel proposal

Preview Diff

=== modified file '.bzrignore'
--- .bzrignore 2010-07-15 15:57:40 +0000
+++ .bzrignore 2010-08-12 19:18:45 +0000
@@ -56,7 +56,6 @@
 .bazaar
 .cache
 .subversion
-lib/canonical/buildd/launchpad-files
 .testrepository
 .memcache.pid
 ./pipes
@@ -77,3 +76,4 @@
 lib/canonical/launchpad-buildd_*_source.build
 lib/canonical/launchpad-buildd_*_source.changes
 lib/canonical/buildd/debian/*
+lib/canonical/buildd/launchpad-files/*
=== modified file 'README'
--- README 2010-04-06 20:07:30 +0000
+++ README 2010-08-12 19:18:45 +0000
@@ -1,4 +1,106 @@
-This is the top level project, that supplies the infrastructure for testing,
-and running launchpad.
-
-Documentation is in the doc directory or on the wiki.
+====================
+README for Launchpad
+====================
+
+Launchpad is an open source suite of tools that help people and teams to work
+together on software projects. Unlike many open source projects, Launchpad
+isn't something you install and run yourself (although you are welcome to do
+so), instead, contributors help make <https://launchpad.net> better.
+
+Launchpad is a project of Canonical <http://www.canonical.com> and has
+received many contributions from many wonderful people
+<https://dev.launchpad.net/Contributions>.
+
+If you want help using Launchpad, then please visit our help wiki at:
+
+ https://help.launchpad.net
+
+If you'd like to contribute to Launchpad, have a look at:
+
+ https://dev.launchpad.net
+
+Alternatively, have a poke around in the code, which you probably already know
+how to get if you are reading this file.
+
+
+Getting started
+===============
+
+There's a full guide for getting up-and-running with a development Launchpad
+environment at <https://dev.launchpad.net/Getting>. When you are ready to
+submit a patch, please consult <https://dev.launchpad.net/PatchSubmission>.
+
+Our bug tracker is at <https://bugs.launchpad.net/launchpad/> and you can get
+the source code any time by doing:
+
+ $ bzr branch lp:launchpad
+
+
+Navigating the tree
+-------------------
+
+The Launchpad tree is big, messy and changing. Sorry about that. Don't panic
+though, it can sense fear. Keep a firm grip on `grep` and pay attention to
+these important top-level folders:
+
+ bin/, utilities/
+ Where you will find scripts intended for developers and admins. There's
+ no rhyme or reason to what goes in bin/ and what goes in utilities/, so
+ take a look in both. bin/ will be empty in a fresh checkout, the actual
+ content lives in 'buildout-templates'.
+
+ configs/
+ Configuration files for various kinds of Launchpad instances.
+ 'development' and 'testrunner' are of particular interest to developers.
+
+ cronscripts/
+ Scripts that are run on actual production instances of Launchpad as
+ cronjobs.
+
+ daemons/
+ Entry points for various daemons that form part of Launchpad
+
+ database/
+ Our database schema, our sample data, and some other stuff that causes
+ fear.
+
+ doc/
+ General system-wide documentation. You can also find documentation on
+ <https://dev.launchpad.net>, in docstrings and in doctests.
+
+ lib/
+ Where the vast majority of the code lives, along with our templates, tests
+ and the bits of our documentation that are written as doctests. 'lp' and
+ 'canonical' are the two most interesting packages. Note that 'canonical'
+ is deprecated in favour of 'lp'. To learn more about how the 'lp' package
+ is laid out, take a look at its docstring.
+
+ Makefile
+ Ahh, bliss. The Makefile has all sorts of goodies. If you spend any
+ length of time hacking on Launchpad, you'll use it often. The most
+ important targets are 'make clean', 'make compile', 'make schema', 'make
+ run' and 'make run_all'.
+
+ scripts/
+ Scripts that are run on actual production instances of Launchpad,
+ generally triggered by some automatic process.
+
+
+You can spend years hacking on Launchpad full-time and not know what all of
+the files in the top-level directory are for. However, here's a guide to some
+of the ones that come up from time to time.
+
+ buildout-templates/
+ Templates that are generated into actual files, normally bin/ scripts,
+ when buildout is run. If you want to change the behaviour of bin/test,
+ look here.
+
+ bzrplugins/, optionalbzrplugins/
+ Bazaar plugins used in running Launchpad.
+
+ sourcecode/
+ A directory into which we symlink branches of some of Launchpad's
+ dependencies. Don't ask.
+
+You never have to care about 'benchmarks', 'override-includes' or
+'package-includes'.
=== renamed file 'daemons/buildd-slave-example.conf' => 'lib/canonical/buildd/buildd-slave-example.conf'
=== modified file 'lib/canonical/buildd/debian/changelog'
--- lib/canonical/buildd/debian/changelog 2010-08-05 21:12:36 +0000
+++ lib/canonical/buildd/debian/changelog 2010-08-12 19:18:45 +0000
@@ -1,9 +1,19 @@
 launchpad-buildd (68) UNRELEASED; urgency=low
 
+  [ William Grant ]
   * Take an 'arch_tag' argument, so the master can override the slave
     architecture.
 
- -- William Grant <wgrant@ubuntu.com>  Sun, 01 Aug 2010 22:00:32 +1000
+  [ Jelmer Vernooij ]
+
+  * Explicitly use source format 1.0.
+  * Add LSB information to init script.
+  * Use debhelper >= 5 (available in dapper, not yet deprecated in
+    maverick).
+  * Fix spelling in description.
+  * Install example buildd configuration.
+
+ -- Jelmer Vernooij <jelmer@canonical.com>  Thu, 12 Aug 2010 17:04:14 +0200
 
 launchpad-buildd (67) hardy-cat; urgency=low
 
=== added file 'lib/canonical/buildd/debian/compat'
--- lib/canonical/buildd/debian/compat 1970-01-01 00:00:00 +0000
+++ lib/canonical/buildd/debian/compat 2010-08-12 19:18:45 +0000
@@ -0,0 +1,1 @@
+5
=== modified file 'lib/canonical/buildd/debian/control'
--- lib/canonical/buildd/debian/control 2010-05-19 15:50:27 +0000
+++ lib/canonical/buildd/debian/control 2010-08-12 19:18:45 +0000
@@ -3,15 +3,15 @@
 Priority: extra
 Maintainer: Adam Conrad <adconrad@ubuntu.com>
 Standards-Version: 3.5.9
-Build-Depends-Indep: debhelper (>= 4)
+Build-Depends-Indep: debhelper (>= 5)
 
 Package: launchpad-buildd
 Section: misc
 Architecture: all
-Depends: python-twisted-core, python-twisted-web, debootstrap, dpkg-dev, linux32, file, bzip2, sudo, ntpdate, adduser, apt-transport-https, lsb-release, apache2
+Depends: python-twisted-core, python-twisted-web, debootstrap, dpkg-dev, linux32, file, bzip2, sudo, ntpdate, adduser, apt-transport-https, lsb-release, apache2, ${misc:Depends}
 Description: Launchpad buildd slave
  This is the launchpad buildd slave package. It contains everything needed to
  get a launchpad buildd going apart from the database manipulation required to
  tell launchpad about the slave instance. If you are creating more than one
- slave instance on the same computer, be sure to give them independant configs
- and independant filecaches etc.
+ slave instance on the same computer, be sure to give them independent configs
+ and independent filecaches etc.
=== added file 'lib/canonical/buildd/debian/launchpad-buildd.examples'
--- lib/canonical/buildd/debian/launchpad-buildd.examples 1970-01-01 00:00:00 +0000
+++ lib/canonical/buildd/debian/launchpad-buildd.examples 2010-08-12 19:18:45 +0000
@@ -0,0 +1,1 @@
+buildd-slave-example.conf
=== modified file 'lib/canonical/buildd/debian/launchpad-buildd.init'
--- lib/canonical/buildd/debian/launchpad-buildd.init 2010-03-31 17:10:21 +0000
+++ lib/canonical/buildd/debian/launchpad-buildd.init 2010-08-12 19:18:45 +0000
@@ -8,6 +8,16 @@
 #
 # Author: Daniel Silverstone <daniel.silverstone@canonical.com>
 
+### BEGIN INIT INFO
+# Provides: launchpad_buildd
+# Required-Start: $local_fs $network $syslog $time
+# Required-Stop: $local_fs $network $syslog $time
+# Default-Start: 2 3 4 5
+# Default-Stop: 0 1 6
+# X-Interactive: false
+# Short-Description: Start/stop launchpad buildds
+### END INIT INFO
+
 set -e
 
 PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
=== modified file 'lib/canonical/buildd/debian/rules'
--- lib/canonical/buildd/debian/rules 2010-07-18 08:49:02 +0000
+++ lib/canonical/buildd/debian/rules 2010-08-12 19:18:45 +0000
@@ -3,7 +3,6 @@
 # Copyright 2009, 2010 Canonical Ltd. This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
-export DH_COMPAT=4
 export DH_OPTIONS
 
 # This is an incomplete debian rules file for making the launchpad-buildd deb
@@ -41,6 +40,7 @@
 	etc/launchpad-buildd \
 	usr/share/launchpad-buildd/canonical/launchpad/daemons \
 	usr/share/doc/launchpad-buildd
+	dh_installexamples
 
 	# Do installs here
 	touch $(pytarget)/../launchpad/__init__.py
@@ -89,10 +89,11 @@
 clean:
 	dh_clean
 
-package:
+prepare:
+	mkdir -p launchpad-files
 	install -m644 $(daemons)/buildd-slave.tac launchpad-files/buildd-slave.tac
 	cp ../launchpad/daemons/tachandler.py launchpad-files/tachandler.py
+
+package: prepare
 	debuild -uc -us -S
 
 build:
 
=== added directory 'lib/canonical/buildd/debian/source'
=== added file 'lib/canonical/buildd/debian/source/format'
--- lib/canonical/buildd/debian/source/format 1970-01-01 00:00:00 +0000
+++ lib/canonical/buildd/debian/source/format 2010-08-12 19:18:45 +0000
@@ -0,0 +1,1 @@
+1.0
=== added directory 'lib/canonical/buildd/launchpad-files'
=== modified file 'lib/canonical/launchpad/doc/navigation.txt'
--- lib/canonical/launchpad/doc/navigation.txt 2010-08-02 02:13:52 +0000
+++ lib/canonical/launchpad/doc/navigation.txt 2010-08-12 19:18:45 +0000
@@ -234,7 +234,7 @@
 
 == ZCML for browser:navigation ==
 
-The zcml processor `browser:navigation` processes navigation classes.
+The zcml processor `browser:navigation` registers navigation classes.
 
     >>> class ThingSetView:
     ...
@@ -266,6 +266,14 @@
     ... </configure>
     ... """)
 
+Once registered, we can look the navigation up using getMultiAdapter().
+
+    >>> from zope.component import getMultiAdapter
+    >>> from canonical.launchpad.webapp.servers import LaunchpadTestRequest
+    >>> from zope.publisher.interfaces.browser import IBrowserPublisher
+    >>> navigation = getMultiAdapter(
+    ...     (thingset, LaunchpadTestRequest()), IBrowserPublisher, name='')
+
 This time, we get the view object for the page that was registered.
 
     >>> navigation.publishTraverse(request, 'thingview')
=== modified file 'lib/canonical/launchpad/webapp/metazcml.py'
--- lib/canonical/launchpad/webapp/metazcml.py 2010-08-05 21:53:44 +0000
+++ lib/canonical/launchpad/webapp/metazcml.py 2010-08-12 19:18:45 +0000
@@ -193,6 +193,10 @@
 class INavigationDirective(IGlueDirective):
     """Hook up traversal etc."""
 
+    layer = GlobalInterface(
+        title=u"The layer where this navigation is going to be available.",
+        required=False)
+
 
 class IFeedsDirective(IGlueDirective):
     """Hook up feeds."""
@@ -267,7 +271,7 @@
         layer=layer, class_=feedclass)
 
 
-def navigation(_context, module, classes):
+def navigation(_context, module, classes, layer=IDefaultBrowserLayer):
     """Handler for the `INavigationDirective`."""
     if not inspect.ismodule(module):
         raise TypeError("module attribute must be a module: %s, %s" %
@@ -281,19 +285,11 @@
         for_ = [navclass.usedfor]
 
         # Register the navigation as the traversal component.
-        layer = IDefaultBrowserLayer
         provides = IBrowserPublisher
         name = ''
         view(_context, factory, layer, name, for_,
             permission=PublicPermission, provides=provides,
             allowed_interface=[IBrowserPublisher])
-        #view(_context, factory, layer, name, for_,
-        #    permission=PublicPermission, provides=provides)
-
-        # Also register the navigation as a traversal component for XMLRPC.
-        xmlrpc_layer = IXMLRPCRequest
-        view(_context, factory, xmlrpc_layer, name, for_,
-            permission=PublicPermission, provides=provides)
 
 
 class InterfaceInstanceDispatcher:
=== added file 'lib/canonical/launchpad/webapp/tests/test_navigation.py'
--- lib/canonical/launchpad/webapp/tests/test_navigation.py 1970-01-01 00:00:00 +0000
+++ lib/canonical/launchpad/webapp/tests/test_navigation.py 2010-08-12 19:18:45 +0000
@@ -0,0 +1,107 @@
+# Copyright 2010 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+__metaclass__ = type
+
+from zope.component import ComponentLookupError, getMultiAdapter
+from zope.configuration import xmlconfig
+from zope.interface import implements, Interface
+from zope.publisher.interfaces.browser import (
+    IBrowserPublisher, IDefaultBrowserLayer)
+from zope.testing.cleanup import cleanUp
+
+from canonical.launchpad.webapp import Navigation
+
+from lp.testing import TestCase
+
+
+class TestNavigationDirective(TestCase):
+
+    def test_default_layer(self):
+        # By default all navigation classes are registered for
+        # IDefaultBrowserLayer.
+        directive = """
+            <browser:navigation
+                module="%(this)s" classes="ThingNavigation"/>
+            """ % dict(this=this)
+        xmlconfig.string(zcml_configure % directive)
+        navigation = getMultiAdapter(
+            (Thing(), DefaultBrowserLayer()), IBrowserPublisher, name='')
+        self.assertIsInstance(navigation, ThingNavigation)
+
+    def test_specific_layer(self):
+        # If we specify a layer when registering a navigation class, it will
+        # only be available on that layer.
+        directive = """
+            <browser:navigation
+                module="%(this)s" classes="OtherThingNavigation"
+                layer="%(this)s.IOtherLayer" />
+            """ % dict(this=this)
+        xmlconfig.string(zcml_configure % directive)
+        self.assertRaises(
+            ComponentLookupError,
+            getMultiAdapter,
+            (Thing(), DefaultBrowserLayer()), IBrowserPublisher, name='')
+
+        navigation = getMultiAdapter(
+            (Thing(), OtherLayer()), IBrowserPublisher, name='')
+        self.assertIsInstance(navigation, OtherThingNavigation)
+
+    def test_multiple_navigations_for_single_context(self):
+        # It is possible to have multiple navigation classes for a given
+        # context class as long as they are registered for different layers.
+        directive = """
+            <browser:navigation
+                module="%(this)s" classes="ThingNavigation"/>
+            <browser:navigation
+                module="%(this)s" classes="OtherThingNavigation"
+                layer="%(this)s.IOtherLayer" />
+            """ % dict(this=this)
+        xmlconfig.string(zcml_configure % directive)
+
+        navigation = getMultiAdapter(
+            (Thing(), DefaultBrowserLayer()), IBrowserPublisher, name='')
+        other_navigation = getMultiAdapter(
+            (Thing(), OtherLayer()), IBrowserPublisher, name='')
+        self.assertNotEqual(navigation, other_navigation)
+
+    def tearDown(self):
+        TestCase.tearDown(self)
+        cleanUp()
+
+
+class DefaultBrowserLayer:
+    implements(IDefaultBrowserLayer)
+
+
+class IThing(Interface):
+    pass
+
+
+class Thing(object):
+    implements(IThing)
+
+
+class ThingNavigation(Navigation):
+    usedfor = IThing
+
+
+class OtherThingNavigation(Navigation):
+    usedfor = IThing
+
+
+class IOtherLayer(Interface):
+    pass
+
+
+class OtherLayer:
+    implements(IOtherLayer)
+
+
+this = "canonical.launchpad.webapp.tests.test_navigation"
+zcml_configure = """
+    <configure xmlns:browser="http://namespaces.zope.org/browser">
+      <include package="canonical.launchpad.webapp" file="meta.zcml" />
+      %s
+    </configure>
+    """
=== added symlink 'lib/deb822.py'
=== target is u'../sourcecode/python-debian/lib/deb822.py'
=== added symlink 'lib/debian'
=== target is u'../sourcecode/python-debian/lib/debian'
=== modified file 'lib/lp/archivepublisher/config.py'
--- lib/lp/archivepublisher/config.py 2010-07-07 06:28:03 +0000
+++ lib/lp/archivepublisher/config.py 2010-08-12 19:18:45 +0000
@@ -120,18 +120,15 @@
                 config_segment["archtags"].append(
                     dar.architecturetag.encode('utf-8'))
 
-            if not dr.lucilleconfig:
-                raise LucilleConfigError(
-                    'No Lucille configuration section for %s' % dr.name)
-
-            strio = StringIO(dr.lucilleconfig.encode('utf-8'))
-            config_segment["config"] = ConfigParser()
-            config_segment["config"].readfp(strio)
-            strio.close()
-            config_segment["components"] = config_segment["config"].get(
-                "publishing", "components").split(" ")
-
-            self._distroseries[distroseries_name] = config_segment
+            if dr.lucilleconfig:
+                strio = StringIO(dr.lucilleconfig.encode('utf-8'))
+                config_segment["config"] = ConfigParser()
+                config_segment["config"].readfp(strio)
+                strio.close()
+                config_segment["components"] = config_segment["config"].get(
+                    "publishing", "components").split(" ")
+
+                self._distroseries[distroseries_name] = config_segment
 
         strio = StringIO(distribution.lucilleconfig.encode('utf-8'))
         self._distroconfig = ConfigParser()
@@ -144,11 +141,19 @@
         # Because dicts iterate for keys only; this works to get dr names
         return self._distroseries.keys()
 
+    def series(self, dr):
+        try:
+            return self._distroseries[dr]
+        except KeyError:
+            raise LucilleConfigError(
+                'No Lucille config section for %s in %s' %
+                (dr, self.distroName))
+
     def archTagsForSeries(self, dr):
-        return self._distroseries[dr]["archtags"]
+        return self.series(dr)["archtags"]
 
     def componentsForSeries(self, dr):
-        return self._distroseries[dr]["components"]
+        return self.series(dr)["components"]
 
     def _extractConfigInfo(self):
         """Extract configuration information into the attributes we use"""
 
=== modified file 'lib/lp/archivepublisher/ftparchive.py'
--- lib/lp/archivepublisher/ftparchive.py 2009-10-26 18:40:04 +0000
+++ lib/lp/archivepublisher/ftparchive.py 2010-08-12 19:18:45 +0000
@@ -138,6 +138,14 @@
138 self._config = config138 self._config = config
139 self._diskpool = diskpool139 self._diskpool = diskpool
140 self.distro = distro140 self.distro = distro
141 self.distroseries = []
142 for distroseries in self.distro.series:
143 if not distroseries.name in self._config.distroSeriesNames():
144 self.log.warning("Distroseries %s in %s doesn't have "
145 "a lucille configuration.", distroseries.name,
146 self.distro.name)
147 else:
148 self.distroseries.append(distroseries)
141 self.publisher = publisher149 self.publisher = publisher
142 self.release_files_needed = {}150 self.release_files_needed = {}
143151
@@ -185,7 +193,7 @@
185 # iterate over the pockets, and do the suffix check inside193 # iterate over the pockets, and do the suffix check inside
186 # createEmptyPocketRequest; that would also allow us to replace194 # createEmptyPocketRequest; that would also allow us to replace
187 # the == "" check we do there by a RELEASE match195 # the == "" check we do there by a RELEASE match
188 for distroseries in self.distro:196 for distroseries in self.distroseries:
189 components = self._config.componentsForSeries(distroseries.name)197 components = self._config.componentsForSeries(distroseries.name)
190 for pocket, suffix in pocketsuffix.items():198 for pocket, suffix in pocketsuffix.items():
191 if not fullpublish:199 if not fullpublish:
@@ -366,7 +374,7 @@
366374
367 def generateOverrides(self, fullpublish=False):375 def generateOverrides(self, fullpublish=False):
368 """Collect packages that need overrides, and generate them."""376 """Collect packages that need overrides, and generate them."""
369 for distroseries in self.distro.series:377 for distroseries in self.distroseries:
370 for pocket in PackagePublishingPocket.items:378 for pocket in PackagePublishingPocket.items:
371 if not fullpublish:379 if not fullpublish:
372 if not self.publisher.isDirty(distroseries, pocket):380 if not self.publisher.isDirty(distroseries, pocket):
@@ -629,7 +637,7 @@
629637
630 def generateFileLists(self, fullpublish=False):638 def generateFileLists(self, fullpublish=False):
631 """Collect currently published FilePublishings and write filelists."""639 """Collect currently published FilePublishings and write filelists."""
632 for distroseries in self.distro.series:640 for distroseries in self.distroseries:
633 for pocket in pocketsuffix:641 for pocket in pocketsuffix:
634 if not fullpublish:642 if not fullpublish:
635 if not self.publisher.isDirty(distroseries, pocket):643 if not self.publisher.isDirty(distroseries, pocket):
636644
=== modified file 'lib/lp/archivepublisher/tests/test_config.py'
--- lib/lp/archivepublisher/tests/test_config.py 2010-07-18 00:24:06 +0000
+++ lib/lp/archivepublisher/tests/test_config.py 2010-08-12 19:18:45 +0000
@@ -5,36 +5,48 @@
 
 __metaclass__ = type
 
-import unittest
-
 from zope.component import getUtility
 
 from canonical.config import config
 from canonical.launchpad.interfaces import IDistributionSet
 from canonical.testing import LaunchpadZopelessLayer
 
-
-class TestConfig(unittest.TestCase):
+from lp.archivepublisher.config import Config, LucilleConfigError
+from lp.testing import TestCaseWithFactory
+
+
+class TestConfig(TestCaseWithFactory):
     layer = LaunchpadZopelessLayer
 
     def setUp(self):
+        super(TestConfig, self).setUp()
         self.layer.switchDbUser(config.archivepublisher.dbuser)
         self.ubuntutest = getUtility(IDistributionSet)['ubuntutest']
 
+    def testMissingDistroSeries(self):
+        distroseries = self.factory.makeDistroSeries(
+            distribution=self.ubuntutest, name="somename")
+        d = Config(self.ubuntutest)
+        dsns = d.distroSeriesNames()
+        self.assertEquals(len(dsns), 2)
+        self.assertEquals(dsns[0], "breezy-autotest")
+        self.assertEquals(dsns[1], "hoary-test")
+        self.assertRaises(LucilleConfigError,
+            d.archTagsForSeries, "somename")
+        self.assertRaises(LucilleConfigError,
+            d.archTagsForSeries, "unknown")
+
     def testInstantiate(self):
         """Config should instantiate"""
-        from lp.archivepublisher.config import Config
         d = Config(self.ubuntutest)
 
     def testDistroName(self):
         """Config should be able to return the distroName"""
-        from lp.archivepublisher.config import Config
         d = Config(self.ubuntutest)
         self.assertEqual(d.distroName, "ubuntutest")
 
     def testDistroSeriesNames(self):
         """Config should return two distroseries names"""
-        from lp.archivepublisher.config import Config
         d = Config(self.ubuntutest)
         dsns = d.distroSeriesNames()
         self.assertEquals(len(dsns), 2)
@@ -43,14 +55,12 @@
 
     def testArchTagsForSeries(self):
         """Config should have the arch tags for the drs"""
-        from lp.archivepublisher.config import Config
         d = Config(self.ubuntutest)
         archs = d.archTagsForSeries("hoary-test")
         self.assertEquals(len(archs), 2)
 
     def testDistroConfig(self):
         """Config should have parsed a distro config"""
-        from lp.archivepublisher.config import Config
         d = Config(self.ubuntutest)
         # NOTE: Add checks here when you add stuff in util.py
         self.assertEquals(d.stayofexecution, 5)
=== modified file 'lib/lp/archivepublisher/tests/test_ftparchive.py'
--- lib/lp/archivepublisher/tests/test_ftparchive.py 2010-07-18 00:24:06 +0000
+++ lib/lp/archivepublisher/tests/test_ftparchive.py 2010-08-12 19:18:45 +0000
@@ -15,7 +15,7 @@
15from zope.component import getUtility15from zope.component import getUtility
1616
17from canonical.config import config17from canonical.config import config
18from canonical.launchpad.scripts.logger import QuietFakeLogger18from canonical.launchpad.scripts.logger import BufferLogger, QuietFakeLogger
19from canonical.testing import LaunchpadZopelessLayer19from canonical.testing import LaunchpadZopelessLayer
20from lp.archivepublisher.config import Config20from lp.archivepublisher.config import Config
21from lp.archivepublisher.diskpool import DiskPool21from lp.archivepublisher.diskpool import DiskPool
@@ -23,6 +23,7 @@
23from lp.archivepublisher.publishing import Publisher23from lp.archivepublisher.publishing import Publisher
24from lp.registry.interfaces.distribution import IDistributionSet24from lp.registry.interfaces.distribution import IDistributionSet
25from lp.registry.interfaces.pocket import PackagePublishingPocket25from lp.registry.interfaces.pocket import PackagePublishingPocket
26from lp.testing import TestCaseWithFactory
2627
2728
28def sanitize_apt_ftparchive_Sources_output(text):29def sanitize_apt_ftparchive_Sources_output(text):
@@ -55,10 +56,11 @@
55 return self._result[i:j]56 return self._result[i:j]
5657
5758
58class TestFTPArchive(unittest.TestCase):59class TestFTPArchive(TestCaseWithFactory):
59 layer = LaunchpadZopelessLayer60 layer = LaunchpadZopelessLayer
6061
61 def setUp(self):62 def setUp(self):
63 super(TestFTPArchive, self).setUp()
62 self.layer.switchDbUser(config.archivepublisher.dbuser)64 self.layer.switchDbUser(config.archivepublisher.dbuser)
6365
64 self._distribution = getUtility(IDistributionSet)['ubuntutest']66 self._distribution = getUtility(IDistributionSet)['ubuntutest']
@@ -79,6 +81,7 @@
79 self._publisher = SamplePublisher(self._archive)81 self._publisher = SamplePublisher(self._archive)
8082
81 def tearDown(self):83 def tearDown(self):
84 super(TestFTPArchive, self).tearDown()
82 shutil.rmtree(self._config.distroroot)85 shutil.rmtree(self._config.distroroot)
8386
84 def _verifyFile(self, filename, directory, output_filter=None):87 def _verifyFile(self, filename, directory, output_filter=None):
@@ -116,6 +119,19 @@
116 self._publisher)119 self._publisher)
117 return fa120 return fa
118121
122 def test_NoLucilleConfig(self):
123 # Distroseries without a lucille configuration get ignored
124 # and trigger a warning, they don't break the publisher
125 logger = BufferLogger()
126 publisher = Publisher(
127 logger, self._config, self._dp, self._archive)
128 self.factory.makeDistroSeries(self._distribution, name="somename")
129 fa = FTPArchiveHandler(logger, self._config, self._dp,
130 self._distribution, publisher)
131 fa.createEmptyPocketRequests(fullpublish=True)
132 self.assertEquals("WARNING: Distroseries somename in ubuntutest doesn't "
133 "have a lucille configuration.\n", logger.buffer.getvalue())
134
119 def test_getSourcesForOverrides(self):135 def test_getSourcesForOverrides(self):
120 # getSourcesForOverrides returns a list of tuples containing:136 # getSourcesForOverrides returns a list of tuples containing:
121 # (sourcename, suite, component, section)137 # (sourcename, suite, component, section)
122138
=== modified file 'lib/lp/archiveuploader/tests/__init__.py'
--- lib/lp/archiveuploader/tests/__init__.py 2010-05-04 15:38:08 +0000
+++ lib/lp/archiveuploader/tests/__init__.py 2010-08-12 19:18:45 +0000
@@ -1,6 +1,10 @@
1# Copyright 2009 Canonical Ltd. This software is licensed under the1# Copyright 2009 Canonical Ltd. This software is licensed under the
2# GNU Affero General Public License version 3 (see the file LICENSE).2# GNU Affero General Public License version 3 (see the file LICENSE).
33
4"""Tests for the archive uploader."""
5
6from __future__ import with_statement
7
4__metaclass__ = type8__metaclass__ = type
59
6__all__ = ['datadir', 'getPolicy', 'insertFakeChangesFile',10__all__ = ['datadir', 'getPolicy', 'insertFakeChangesFile',
@@ -25,6 +29,7 @@
25 raise ValueError("Path is not relative: %s" % path)29 raise ValueError("Path is not relative: %s" % path)
26 return os.path.join(here, 'data', path)30 return os.path.join(here, 'data', path)
2731
32
28def insertFakeChangesFile(fileID, path=None):33def insertFakeChangesFile(fileID, path=None):
29 """Insert a fake changes file into the librarian.34 """Insert a fake changes file into the librarian.
3035
@@ -34,11 +39,11 @@
34 """39 """
35 if path is None:40 if path is None:
36 path = datadir("ed-0.2-21/ed_0.2-21_source.changes")41 path = datadir("ed-0.2-21/ed_0.2-21_source.changes")
37 changes_file_obj = open(path, 'r')42 with open(path, 'r') as changes_file_obj:
38 test_changes_file = changes_file_obj.read()43 test_changes_file = changes_file_obj.read()
39 changes_file_obj.close()
40 fillLibrarianFile(fileID, content=test_changes_file)44 fillLibrarianFile(fileID, content=test_changes_file)
4145
46
42def insertFakeChangesFileForAllPackageUploads():47def insertFakeChangesFileForAllPackageUploads():
43 """Ensure all the PackageUpload records point to a valid changes file."""48 """Ensure all the PackageUpload records point to a valid changes file."""
44 for id in set(pu.changesfile.id for pu in PackageUploadSet()):49 for id in set(pu.changesfile.id for pu in PackageUploadSet()):
@@ -53,6 +58,7 @@
53 self.distroseries = distroseries58 self.distroseries = distroseries
54 self.buildid = buildid59 self.buildid = buildid
5560
61
56def getPolicy(name='anything', distro='ubuntu', distroseries=None,62def getPolicy(name='anything', distro='ubuntu', distroseries=None,
57 buildid=None):63 buildid=None):
58 """Build and return an Upload Policy for the given context."""64 """Build and return an Upload Policy for the given context."""
5965
=== modified file 'lib/lp/archiveuploader/tests/test_buildduploads.py'
--- lib/lp/archiveuploader/tests/test_buildduploads.py 2010-07-18 00:26:33 +0000
+++ lib/lp/archiveuploader/tests/test_buildduploads.py 2010-08-12 19:18:45 +0000
@@ -7,7 +7,6 @@
77
8from lp.archiveuploader.tests.test_securityuploads import (8from lp.archiveuploader.tests.test_securityuploads import (
9 TestStagedBinaryUploadBase)9 TestStagedBinaryUploadBase)
10from lp.archiveuploader.uploadprocessor import UploadProcessor
11from lp.registry.interfaces.pocket import PackagePublishingPocket10from lp.registry.interfaces.pocket import PackagePublishingPocket
12from canonical.database.constants import UTC_NOW11from canonical.database.constants import UTC_NOW
13from canonical.launchpad.interfaces import PackagePublishingStatus12from canonical.launchpad.interfaces import PackagePublishingStatus
@@ -84,8 +83,8 @@
84 """Setup an UploadProcessor instance for a given buildd context."""83 """Setup an UploadProcessor instance for a given buildd context."""
85 self.options.context = self.policy84 self.options.context = self.policy
86 self.options.buildid = str(build_candidate.id)85 self.options.buildid = str(build_candidate.id)
87 self.uploadprocessor = UploadProcessor(86 self.uploadprocessor = self.getUploadProcessor(
88 self.options, self.layer.txn, self.log)87 self.layer.txn)
8988
90 def testDelayedBinaryUpload(self):89 def testDelayedBinaryUpload(self):
91 """Check if Soyuz copes with delayed binary uploads.90 """Check if Soyuz copes with delayed binary uploads.
9291
=== modified file 'lib/lp/archiveuploader/tests/test_ppauploadprocessor.py'
--- lib/lp/archiveuploader/tests/test_ppauploadprocessor.py 2010-08-02 02:13:52 +0000
+++ lib/lp/archiveuploader/tests/test_ppauploadprocessor.py 2010-08-12 19:18:45 +0000
@@ -18,7 +18,6 @@
18from zope.security.proxy import removeSecurityProxy18from zope.security.proxy import removeSecurityProxy
1919
20from lp.app.errors import NotFoundError20from lp.app.errors import NotFoundError
21from lp.archiveuploader.uploadprocessor import UploadProcessor
22from lp.archiveuploader.tests.test_uploadprocessor import (21from lp.archiveuploader.tests.test_uploadprocessor import (
23 TestUploadProcessorBase)22 TestUploadProcessorBase)
24from canonical.config import config23from canonical.config import config
@@ -74,8 +73,7 @@
7473
75 # Set up the uploadprocessor with appropriate options and logger74 # Set up the uploadprocessor with appropriate options and logger
76 self.options.context = 'insecure'75 self.options.context = 'insecure'
77 self.uploadprocessor = UploadProcessor(76 self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
78 self.options, self.layer.txn, self.log)
7977
80 def assertEmail(self, contents=None, recipients=None,78 def assertEmail(self, contents=None, recipients=None,
81 ppa_header='name16'):79 ppa_header='name16'):
@@ -1224,8 +1222,7 @@
12241222
1225 # Re-initialize uploadprocessor since it depends on the new1223 # Re-initialize uploadprocessor since it depends on the new
1226 # transaction reset by switchDbUser.1224 # transaction reset by switchDbUser.
1227 self.uploadprocessor = UploadProcessor(1225 self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
1228 self.options, self.layer.txn, self.log)
12291226
1230 def testPPASizeQuotaSourceRejection(self):1227 def testPPASizeQuotaSourceRejection(self):
1231 """Verify the size quota check for PPA uploads.1228 """Verify the size quota check for PPA uploads.
12321229
=== modified file 'lib/lp/archiveuploader/tests/test_recipeuploads.py'
--- lib/lp/archiveuploader/tests/test_recipeuploads.py 2010-07-22 15:24:02 +0000
+++ lib/lp/archiveuploader/tests/test_recipeuploads.py 2010-08-12 19:18:45 +0000
@@ -12,7 +12,6 @@
1212
13from lp.archiveuploader.tests.test_uploadprocessor import (13from lp.archiveuploader.tests.test_uploadprocessor import (
14 TestUploadProcessorBase)14 TestUploadProcessorBase)
15from lp.archiveuploader.uploadprocessor import UploadProcessor
16from lp.buildmaster.interfaces.buildbase import BuildStatus15from lp.buildmaster.interfaces.buildbase import BuildStatus
17from lp.code.interfaces.sourcepackagerecipebuild import (16from lp.code.interfaces.sourcepackagerecipebuild import (
18 ISourcePackageRecipeBuildSource)17 ISourcePackageRecipeBuildSource)
@@ -42,8 +41,8 @@
42 self.options.context = 'recipe'41 self.options.context = 'recipe'
43 self.options.buildid = self.build.id42 self.options.buildid = self.build.id
4443
45 self.uploadprocessor = UploadProcessor(44 self.uploadprocessor = self.getUploadProcessor(
46 self.options, self.layer.txn, self.log)45 self.layer.txn)
4746
48 def testSetsBuildAndState(self):47 def testSetsBuildAndState(self):
49 # Ensure that the upload processor correctly links the SPR to48 # Ensure that the upload processor correctly links the SPR to
5049
=== modified file 'lib/lp/archiveuploader/tests/test_securityuploads.py'
--- lib/lp/archiveuploader/tests/test_securityuploads.py 2010-07-18 00:26:33 +0000
+++ lib/lp/archiveuploader/tests/test_securityuploads.py 2010-08-12 19:18:45 +0000
@@ -11,7 +11,6 @@
1111
12from lp.archiveuploader.tests.test_uploadprocessor import (12from lp.archiveuploader.tests.test_uploadprocessor import (
13 TestUploadProcessorBase)13 TestUploadProcessorBase)
14from lp.archiveuploader.uploadprocessor import UploadProcessor
15from lp.registry.interfaces.pocket import PackagePublishingPocket14from lp.registry.interfaces.pocket import PackagePublishingPocket
16from lp.soyuz.model.binarypackagebuild import BinaryPackageBuild15from lp.soyuz.model.binarypackagebuild import BinaryPackageBuild
17from lp.soyuz.model.processor import ProcessorFamily16from lp.soyuz.model.processor import ProcessorFamily
@@ -70,8 +69,7 @@
70 self.options.context = self.policy69 self.options.context = self.policy
71 self.options.nomails = self.no_mails70 self.options.nomails = self.no_mails
72 # Set up the uploadprocessor with appropriate options and logger71 # Set up the uploadprocessor with appropriate options and logger
73 self.uploadprocessor = UploadProcessor(72 self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
74 self.options, self.layer.txn, self.log)
75 self.builds_before_upload = BinaryPackageBuild.select().count()73 self.builds_before_upload = BinaryPackageBuild.select().count()
76 self.source_queue = None74 self.source_queue = None
77 self._uploadSource()75 self._uploadSource()
@@ -232,8 +230,7 @@
232 """230 """
233 build_candidate = self._createBuild('i386')231 build_candidate = self._createBuild('i386')
234 self.options.buildid = str(build_candidate.id)232 self.options.buildid = str(build_candidate.id)
235 self.uploadprocessor = UploadProcessor(233 self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
236 self.options, self.layer.txn, self.log)
237234
238 build_used = self._uploadBinary('i386')235 build_used = self._uploadBinary('i386')
239236
@@ -254,8 +251,7 @@
254 """251 """
255 build_candidate = self._createBuild('hppa')252 build_candidate = self._createBuild('hppa')
256 self.options.buildid = str(build_candidate.id)253 self.options.buildid = str(build_candidate.id)
257 self.uploadprocessor = UploadProcessor(254 self.uploadprocessor = self.getUploadProcessor(self.layer.txn)
258 self.options, self.layer.txn, self.log)
259255
260 self.assertRaises(AssertionError, self._uploadBinary, 'i386')256 self.assertRaises(AssertionError, self._uploadBinary, 'i386')
261257
262258
=== modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py'
--- lib/lp/archiveuploader/tests/test_uploadprocessor.py 2010-08-02 02:13:52 +0000
+++ lib/lp/archiveuploader/tests/test_uploadprocessor.py 2010-08-12 19:18:45 +0000
@@ -23,8 +23,10 @@
23from zope.security.proxy import removeSecurityProxy23from zope.security.proxy import removeSecurityProxy
2424
25from lp.app.errors import NotFoundError25from lp.app.errors import NotFoundError
26from lp.archiveuploader.uploadpolicy import AbstractUploadPolicy26from lp.archiveuploader.uploadpolicy import (AbstractUploadPolicy,
27 findPolicyByOptions)
27from lp.archiveuploader.uploadprocessor import UploadProcessor28from lp.archiveuploader.uploadprocessor import UploadProcessor
29from lp.buildmaster.interfaces.buildbase import BuildStatus
28from canonical.config import config30from canonical.config import config
29from canonical.database.constants import UTC_NOW31from canonical.database.constants import UTC_NOW
30from lp.soyuz.model.archivepermission import ArchivePermission32from lp.soyuz.model.archivepermission import ArchivePermission
@@ -59,7 +61,7 @@
59 ISourcePackageNameSet)61 ISourcePackageNameSet)
60from lp.services.mail import stub62from lp.services.mail import stub
61from canonical.launchpad.testing.fakepackager import FakePackager63from canonical.launchpad.testing.fakepackager import FakePackager
62from lp.testing import TestCaseWithFactory64from lp.testing import TestCase, TestCaseWithFactory
63from lp.testing.mail_helpers import pop_notifications65from lp.testing.mail_helpers import pop_notifications
64from canonical.launchpad.webapp.errorlog import ErrorReportingUtility66from canonical.launchpad.webapp.errorlog import ErrorReportingUtility
65from canonical.testing import LaunchpadZopelessLayer67from canonical.testing import LaunchpadZopelessLayer
@@ -113,7 +115,8 @@
113 super(TestUploadProcessorBase, self).setUp()115 super(TestUploadProcessorBase, self).setUp()
114116
115 self.queue_folder = tempfile.mkdtemp()117 self.queue_folder = tempfile.mkdtemp()
116 os.makedirs(os.path.join(self.queue_folder, "incoming"))118 self.incoming_folder = os.path.join(self.queue_folder, "incoming")
119 os.makedirs(self.incoming_folder)
117120
118 self.test_files_dir = os.path.join(config.root,121 self.test_files_dir = os.path.join(config.root,
119 "lib/lp/archiveuploader/tests/data/suite")122 "lib/lp/archiveuploader/tests/data/suite")
@@ -139,6 +142,30 @@
139 shutil.rmtree(self.queue_folder)142 shutil.rmtree(self.queue_folder)
140 super(TestUploadProcessorBase, self).tearDown()143 super(TestUploadProcessorBase, self).tearDown()
141144
145 def getUploadProcessor(self, txn):
146 def getPolicy(distro):
147 self.options.distro = distro.name
148 return findPolicyByOptions(self.options)
149 return UploadProcessor(
150 self.options.base_fsroot, self.options.dryrun,
151 self.options.nomails,
152 self.options.keep, getPolicy, txn, self.log)
153
154 def publishPackage(self, packagename, version, source=True,
155 archive=None):
156 """Publish a single package that is currently NEW in the queue."""
157 queue_items = self.breezy.getQueueItems(
158 status=PackageUploadStatus.NEW, name=packagename,
159 version=version, exact_match=True, archive=archive)
160 self.assertEqual(queue_items.count(), 1)
161 queue_item = queue_items[0]
162 queue_item.setAccepted()
163 if source:
164 pubrec = queue_item.sources[0].publish(self.log)
165 else:
166 pubrec = queue_item.builds[0].publish(self.log)
167 return pubrec
168
142 def assertLogContains(self, line):169 def assertLogContains(self, line):
143 """Assert if a given line is present in the log messages."""170 """Assert if a given line is present in the log messages."""
144 self.assertTrue(line in self.log.lines,171 self.assertTrue(line in self.log.lines,
@@ -208,25 +235,29 @@
208 filename, len(content), StringIO(content),235 filename, len(content), StringIO(content),
209 'application/x-gtar')236 'application/x-gtar')
210237
211 def queueUpload(self, upload_name, relative_path="", test_files_dir=None):238 def queueUpload(self, upload_name, relative_path="", test_files_dir=None,
239 queue_entry=None):
212 """Queue one of our test uploads.240 """Queue one of our test uploads.
213241
214 upload_name is the name of the test upload directory. It is also242 upload_name is the name of the test upload directory. If there
243 is no explicit queue entry name specified, it is also
215 the name of the queue entry directory we create.244 the name of the queue entry directory we create.
216 relative_path is the path to create inside the upload, eg245 relative_path is the path to create inside the upload, eg
217 ubuntu/~malcc/default. If not specified, defaults to "".246 ubuntu/~malcc/default. If not specified, defaults to "".
218247
219 Return the path to the upload queue entry directory created.248 Return the path to the upload queue entry directory created.
220 """249 """
250 if queue_entry is None:
251 queue_entry = upload_name
221 target_path = os.path.join(252 target_path = os.path.join(
222 self.queue_folder, "incoming", upload_name, relative_path)253 self.incoming_folder, queue_entry, relative_path)
223 if test_files_dir is None:254 if test_files_dir is None:
224 test_files_dir = self.test_files_dir255 test_files_dir = self.test_files_dir
225 upload_dir = os.path.join(test_files_dir, upload_name)256 upload_dir = os.path.join(test_files_dir, upload_name)
226 if relative_path:257 if relative_path:
227 os.makedirs(os.path.dirname(target_path))258 os.makedirs(os.path.dirname(target_path))
228 shutil.copytree(upload_dir, target_path)259 shutil.copytree(upload_dir, target_path)
229 return os.path.join(self.queue_folder, "incoming", upload_name)260 return os.path.join(self.incoming_folder, queue_entry)
230261
231 def processUpload(self, processor, upload_dir):262 def processUpload(self, processor, upload_dir):
232 """Process an upload queue entry directory.263 """Process an upload queue entry directory.
@@ -248,8 +279,7 @@
248 self.layer.txn.commit()279 self.layer.txn.commit()
249 if policy is not None:280 if policy is not None:
250 self.options.context = policy281 self.options.context = policy
251 return UploadProcessor(282 return self.getUploadProcessor(self.layer.txn)
252 self.options, self.layer.txn, self.log)
253283
254 def assertEmail(self, contents=None, recipients=None):284 def assertEmail(self, contents=None, recipients=None):
255 """Check last email content and recipients.285 """Check last email content and recipients.
@@ -341,24 +371,9 @@
341 "Expected acceptance email not rejection. Actually Got:\n%s"371 "Expected acceptance email not rejection. Actually Got:\n%s"
342 % raw_msg)372 % raw_msg)
343373
344 def _publishPackage(self, packagename, version, source=True,
345 archive=None):
346 """Publish a single package that is currently NEW in the queue."""
347 queue_items = self.breezy.getQueueItems(
348 status=PackageUploadStatus.NEW, name=packagename,
349 version=version, exact_match=True, archive=archive)
350 self.assertEqual(queue_items.count(), 1)
351 queue_item = queue_items[0]
352 queue_item.setAccepted()
353 if source:
354 pubrec = queue_item.sources[0].publish(self.log)
355 else:
356 pubrec = queue_item.builds[0].publish(self.log)
357 return pubrec
358
359 def testInstantiate(self):374 def testInstantiate(self):
360 """UploadProcessor should instantiate"""375 """UploadProcessor should instantiate"""
361 up = UploadProcessor(self.options, None, self.log)376 up = self.getUploadProcessor(None)
362377
363 def testLocateDirectories(self):378 def testLocateDirectories(self):
364 """Return a sorted list of subdirs in a directory.379 """Return a sorted list of subdirs in a directory.
@@ -372,7 +387,7 @@
372 os.mkdir("%s/dir1" % testdir)387 os.mkdir("%s/dir1" % testdir)
373 os.mkdir("%s/dir2" % testdir)388 os.mkdir("%s/dir2" % testdir)
374389
375 up = UploadProcessor(self.options, None, self.log)390 up = self.getUploadProcessor(None)
376 located_dirs = up.locateDirectories(testdir)391 located_dirs = up.locateDirectories(testdir)
377 self.assertEqual(located_dirs, ['dir1', 'dir2', 'dir3'])392 self.assertEqual(located_dirs, ['dir1', 'dir2', 'dir3'])
378 finally:393 finally:
@@ -390,7 +405,7 @@
390 open("%s/2_source.changes" % testdir, "w").close()405 open("%s/2_source.changes" % testdir, "w").close()
391 open("%s/3.not_changes" % testdir, "w").close()406 open("%s/3.not_changes" % testdir, "w").close()
392407
393 up = UploadProcessor(self.options, None, self.log)408 up = self.getUploadProcessor(None)
394 located_files = up.locateChangesFiles(testdir)409 located_files = up.locateChangesFiles(testdir)
395 self.assertEqual(410 self.assertEqual(
396 located_files, ["2_source.changes", "1.changes"])411 located_files, ["2_source.changes", "1.changes"])
@@ -418,7 +433,7 @@
418433
419 # Move it434 # Move it
420 self.options.base_fsroot = testdir435 self.options.base_fsroot = testdir
421 up = UploadProcessor(self.options, None, self.log)436 up = self.getUploadProcessor(None)
422 up.moveUpload(upload, target_name)437 up.moveUpload(upload, target_name)
423438
424 # Check it moved439 # Check it moved
@@ -439,7 +454,7 @@
439454
440 # Remove it455 # Remove it
441 self.options.base_fsroot = testdir456 self.options.base_fsroot = testdir
442 up = UploadProcessor(self.options, None, self.log)457 up = self.getUploadProcessor(None)
443 up.moveProcessedUpload(upload, "accepted")458 up.moveProcessedUpload(upload, "accepted")
444459
445 # Check it was removed, not moved460 # Check it was removed, not moved
@@ -462,7 +477,7 @@
462477
463 # Move it478 # Move it
464 self.options.base_fsroot = testdir479 self.options.base_fsroot = testdir
465 up = UploadProcessor(self.options, None, self.log)480 up = self.getUploadProcessor(None)
466 up.moveProcessedUpload(upload, "rejected")481 up.moveProcessedUpload(upload, "rejected")
467482
468 # Check it moved483 # Check it moved
@@ -485,7 +500,7 @@
485500
486 # Remove it501 # Remove it
         self.options.base_fsroot = testdir
-        up = UploadProcessor(self.options, None, self.log)
+        up = self.getUploadProcessor(None)
         up.removeUpload(upload)

         # Check it was removed, not moved
@@ -498,7 +513,7 @@

     def testOrderFilenames(self):
         """orderFilenames sorts _source.changes ahead of other files."""
-        up = UploadProcessor(self.options, None, self.log)
+        up = self.getUploadProcessor(None)

         self.assertEqual(["d_source.changes", "a", "b", "c"],
             up.orderFilenames(["b", "a", "d_source.changes", "c"]))
@@ -522,8 +537,7 @@
         # Register our broken upload policy
         AbstractUploadPolicy._registerPolicy(BrokenUploadPolicy)
         self.options.context = 'broken'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload a package to Breezy.
         upload_dir = self.queueUpload("baz_1.0-1")
@@ -634,7 +648,7 @@
         # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
         upload_dir = self.queueUpload("bar_1.0-1")
         self.processUpload(uploadprocessor, upload_dir)
-        bar_source_pub = self._publishPackage('bar', '1.0-1')
+        bar_source_pub = self.publishPackage('bar', '1.0-1')
         [bar_original_build] = bar_source_pub.createMissingBuilds()

         # Move the source from the accepted queue.
@@ -653,7 +667,7 @@
         self.processUpload(uploadprocessor, upload_dir)
         self.assertEqual(
             uploadprocessor.last_processed_upload.is_rejected, False)
-        bar_bin_pubs = self._publishPackage('bar', '1.0-1', source=False)
+        bar_bin_pubs = self.publishPackage('bar', '1.0-1', source=False)
         # Mangle its publishing component to "restricted" so we can check
         # the copy archive ancestry override later.
         restricted = getUtility(IComponentSet)["restricted"]
@@ -746,14 +760,14 @@
         # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
         upload_dir = self.queueUpload("bar_1.0-1")
         self.processUpload(uploadprocessor, upload_dir)
-        bar_source_pub = self._publishPackage('bar', '1.0-1')
+        bar_source_pub = self.publishPackage('bar', '1.0-1')
         [bar_original_build] = bar_source_pub.createMissingBuilds()

         self.options.context = 'buildd'
         self.options.buildid = bar_original_build.id
         upload_dir = self.queueUpload("bar_1.0-1_binary")
         self.processUpload(uploadprocessor, upload_dir)
-        [bar_binary_pub] = self._publishPackage("bar", "1.0-1", source=False)
+        [bar_binary_pub] = self.publishPackage("bar", "1.0-1", source=False)

         # Prepare ubuntu/breezy-autotest to build sources in i386.
         breezy_autotest = self.ubuntu['breezy-autotest']
@@ -803,7 +817,7 @@
         # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
         upload_dir = self.queueUpload("bar_1.0-1")
         self.processUpload(uploadprocessor, upload_dir)
-        bar_source_old = self._publishPackage('bar', '1.0-1')
+        bar_source_old = self.publishPackage('bar', '1.0-1')

         # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
         upload_dir = self.queueUpload("bar_1.0-2")
@@ -816,7 +830,7 @@
         self.options.buildid = bar_original_build.id
         upload_dir = self.queueUpload("bar_1.0-2_binary")
         self.processUpload(uploadprocessor, upload_dir)
-        [bar_binary_pub] = self._publishPackage("bar", "1.0-2", source=False)
+        [bar_binary_pub] = self.publishPackage("bar", "1.0-2", source=False)

         # Create a COPY archive for building in non-virtual builds.
         uploader = getUtility(IPersonSet).getByName('name16')
@@ -971,7 +985,7 @@
         partner_archive = getUtility(IArchiveSet).getByDistroPurpose(
             self.ubuntu, ArchivePurpose.PARTNER)
         self.assertTrue(partner_archive)
-        self._publishPackage("foocomm", "1.0-1", archive=partner_archive)
+        self.publishPackage("foocomm", "1.0-1", archive=partner_archive)

         # Check the publishing record's archive and component.
         foocomm_spph = SourcePackagePublishingHistory.selectOneBy(
@@ -1015,7 +1029,7 @@
         self.assertEqual(foocomm_bpr.component.name, 'partner')

         # Publish the upload so we can check the publishing record.
-        self._publishPackage("foocomm", "1.0-1", source=False)
+        self.publishPackage("foocomm", "1.0-1", source=False)

         # Check the publishing record's archive and component.
         foocomm_bpph = BinaryPackagePublishingHistory.selectOneBy(
@@ -1054,14 +1068,14 @@
         # Accept and publish the upload.
         partner_archive = getUtility(IArchiveSet).getByDistroPurpose(
             self.ubuntu, ArchivePurpose.PARTNER)
-        self._publishPackage("foocomm", "1.0-1", archive=partner_archive)
+        self.publishPackage("foocomm", "1.0-1", archive=partner_archive)

         # Now do the same thing with a binary package.
         upload_dir = self.queueUpload("foocomm_1.0-1_binary")
         self.processUpload(uploadprocessor, upload_dir)

         # Accept and publish the upload.
-        self._publishPackage("foocomm", "1.0-1", source=False,
+        self.publishPackage("foocomm", "1.0-1", source=False,
             archive=partner_archive)

         # Upload the next source version of the package.
@@ -1105,8 +1119,7 @@
         self.breezy.status = SeriesStatus.CURRENT
         self.layer.txn.commit()
         self.options.context = 'insecure'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload a package for Breezy.
         upload_dir = self.queueUpload("foocomm_1.0-1_proposed")
@@ -1124,8 +1137,7 @@
         self.breezy.status = SeriesStatus.CURRENT
         self.layer.txn.commit()
         self.options.context = 'insecure'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload a package for Breezy.
         upload_dir = self.queueUpload("foocomm_1.0-1")
@@ -1140,8 +1152,7 @@
         pocket and ensure it fails."""
         # Set up the uploadprocessor with appropriate options and logger.
         self.options.context = 'insecure'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload a package for Breezy.
         upload_dir = self.queueUpload("foocomm_1.0-1_updates")
@@ -1302,8 +1313,7 @@
        used.
        That exception will then initiate the creation of an OOPS report.
        """
-        processor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        processor = self.getUploadProcessor(self.layer.txn)

         upload_dir = self.queueUpload("foocomm_1.0-1_proposed")
         bogus_changesfile_data = '''
@@ -1346,8 +1356,7 @@
         self.setupBreezy()
         self.layer.txn.commit()
         self.options.context = 'absolutely-anything'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload the source first to enable the binary later:
         upload_dir = self.queueUpload("bar_1.0-1_lzma")
@@ -1357,7 +1366,7 @@
         self.assertTrue(
             "rejected" not in raw_msg,
             "Failed to upload bar source:\n%s" % raw_msg)
-        self._publishPackage("bar", "1.0-1")
+        self.publishPackage("bar", "1.0-1")
         # Clear out emails generated during upload.
         ignore = pop_notifications()

@@ -1456,15 +1465,14 @@
             permission=ArchivePermissionType.UPLOAD, person=uploader,
             component=restricted)

-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload the first version and accept it to make it known in
         # Ubuntu. The uploader has rights to upload NEW packages to
         # components that he does not have direct rights to.
         upload_dir = self.queueUpload("bar_1.0-1")
         self.processUpload(uploadprocessor, upload_dir)
-        bar_source_pub = self._publishPackage('bar', '1.0-1')
+        bar_source_pub = self.publishPackage('bar', '1.0-1')
         # Clear out emails generated during upload.
         ignore = pop_notifications()

@@ -1509,15 +1517,14 @@
             permission=ArchivePermissionType.UPLOAD, person=uploader,
             component=restricted)

-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload the first version and accept it to make it known in
         # Ubuntu. The uploader has rights to upload NEW packages to
         # components that he does not have direct rights to.
         upload_dir = self.queueUpload("bar_1.0-1")
         self.processUpload(uploadprocessor, upload_dir)
-        bar_source_pub = self._publishPackage('bar', '1.0-1')
+        bar_source_pub = self.publishPackage('bar', '1.0-1')
         # Clear out emails generated during upload.
         ignore = pop_notifications()

@@ -1590,8 +1597,7 @@
         # with pointer to the Soyuz questions in Launchpad and the
         # reason why the message was sent to the current recipients.
         self.setupBreezy()
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         upload_dir = self.queueUpload("bar_1.0-1", "boing")
         self.processUpload(uploadprocessor, upload_dir)
@@ -1636,8 +1642,7 @@
         self.setupBreezy()
         self.layer.txn.commit()
         self.options.context = 'absolutely-anything'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload the source.
         upload_dir = self.queueUpload("bar_1.0-1_3.0-quilt")
@@ -1655,8 +1660,7 @@
             permitted_formats=[SourcePackageFormat.FORMAT_3_0_QUILT])
         self.layer.txn.commit()
         self.options.context = 'absolutely-anything'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload the source.
         upload_dir = self.queueUpload("bar_1.0-1_3.0-quilt")
@@ -1666,7 +1670,7 @@
         self.assertTrue(
             "rejected" not in raw_msg,
             "Failed to upload bar source:\n%s" % raw_msg)
-        spph = self._publishPackage("bar", "1.0-1")
+        spph = self.publishPackage("bar", "1.0-1")

         self.assertEquals(
             sorted((sprf.libraryfile.filename, sprf.filetype)
@@ -1689,8 +1693,7 @@
             permitted_formats=[SourcePackageFormat.FORMAT_3_0_QUILT])
         self.layer.txn.commit()
         self.options.context = 'absolutely-anything'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload the first source.
         upload_dir = self.queueUpload("bar_1.0-1_3.0-quilt")
@@ -1700,7 +1703,7 @@
         self.assertTrue(
             "rejected" not in raw_msg,
             "Failed to upload bar source:\n%s" % raw_msg)
-        spph = self._publishPackage("bar", "1.0-1")
+        spph = self.publishPackage("bar", "1.0-1")

         # Upload another source sharing the same (component) orig.
         upload_dir = self.queueUpload("bar_1.0-2_3.0-quilt_without_orig")
@@ -1728,8 +1731,7 @@
             permitted_formats=[SourcePackageFormat.FORMAT_3_0_NATIVE])
         self.layer.txn.commit()
         self.options.context = 'absolutely-anything'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload the source.
         upload_dir = self.queueUpload("bar_1.0_3.0-native")
@@ -1739,7 +1741,7 @@
         self.assertTrue(
             "rejected" not in raw_msg,
             "Failed to upload bar source:\n%s" % raw_msg)
-        spph = self._publishPackage("bar", "1.0")
+        spph = self.publishPackage("bar", "1.0")

         self.assertEquals(
             sorted((sprf.libraryfile.filename, sprf.filetype)
@@ -1754,8 +1756,7 @@
         self.setupBreezy()
         self.layer.txn.commit()
         self.options.context = 'absolutely-anything'
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         # Upload the source.
         upload_dir = self.queueUpload("bar_1.0-1_1.0-bzip2")
@@ -1772,8 +1773,7 @@
         self.setupBreezy()
         breezy = self.ubuntu['breezy']
         breezy.status = SeriesStatus.CURRENT
-        uploadprocessor = UploadProcessor(
-            self.options, self.layer.txn, self.log)
+        uploadprocessor = self.getUploadProcessor(self.layer.txn)

         upload_dir = self.queueUpload("bar_1.0-1")
         self.processUpload(uploadprocessor, upload_dir)
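
Throughout the test changes above, direct UploadProcessor(...) construction is replaced by a getUploadProcessor() helper. The helper itself is presumably defined in an earlier part of this branch and is not shown in this portion of the diff; a plausible shape for it, sketched only for orientation and assuming findPolicyByOptions is imported from lp.archiveuploader.uploadpolicy, would be:

    # Hypothetical sketch of the test helper; the real definition lives
    # elsewhere in the branch and may differ.
    def getUploadProcessor(self, txn):
        def getPolicy(distro):
            # Bridge the new per-distribution callback back to the
            # options-driven policy lookup the tests already configure.
            self.options.distro = distro.name
            return findPolicyByOptions(self.options)
        return UploadProcessor(
            self.options.base_fsroot, self.options.dryrun,
            self.options.nomails, self.options.keep, getPolicy,
            txn, self.log)
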
=== modified file 'lib/lp/archiveuploader/uploadprocessor.py'
--- lib/lp/archiveuploader/uploadprocessor.py 2010-08-04 00:30:56 +0000
+++ lib/lp/archiveuploader/uploadprocessor.py 2010-08-12 19:18:45 +0000
@@ -60,7 +60,7 @@
 from lp.archiveuploader.nascentupload import (
     NascentUpload, FatalUploadError, EarlyReturnUploadError)
 from lp.archiveuploader.uploadpolicy import (
-    findPolicyByOptions, UploadPolicyError)
+    UploadPolicyError)
 from lp.soyuz.interfaces.archive import IArchiveSet, NoSuchPPA
 from lp.registry.interfaces.distribution import IDistributionSet
 from lp.registry.interfaces.person import IPersonSet
@@ -108,16 +108,33 @@
 class UploadProcessor:
     """Responsible for processing uploads. See module docstring."""

-    def __init__(self, options, ztm, log):
-        self.options = options
+    def __init__(self, base_fsroot, dry_run, no_mails, keep, policy_for_distro,
+        ztm, log):
+        """Create a new upload processor.
+
+        :param base_fsroot: Root path for queue to use
+        :param dry_run: Run but don't commit changes to database
+        :param no_mails: Don't send out any emails
+        :param builds: Interpret leaf names as build ids
+        :param keep: Leave the files in place, don't move them away
+        :param policy_for_distro: callback to obtain Policy object for a
+            distribution
+        :param ztm: Database transaction to use
+        :param log: Logger to use for reporting
+        """
+        self.base_fsroot = base_fsroot
+        self.dry_run = dry_run
+        self.keep = keep
+        self.last_processed_upload = None
+        self.log = log
+        self.no_mails = no_mails
+        self._getPolicyForDistro = policy_for_distro
         self.ztm = ztm
-        self.log = log
-        self.last_processed_upload = None

-    def processUploadQueue(self):
+    def processUploadQueue(self, leaf_name=None):
         """Search for uploads, and process them.

         Uploads are searched for in the 'incoming' directory inside the
         base_fsroot.

         This method also creates the 'incoming', 'accepted', 'rejected', and
@@ -127,19 +144,22 @@
             self.log.debug("Beginning processing")

             for subdir in ["incoming", "accepted", "rejected", "failed"]:
-                full_subdir = os.path.join(self.options.base_fsroot, subdir)
+                full_subdir = os.path.join(self.base_fsroot, subdir)
                 if not os.path.exists(full_subdir):
                     self.log.debug("Creating directory %s" % full_subdir)
                     os.mkdir(full_subdir)

-            fsroot = os.path.join(self.options.base_fsroot, "incoming")
+            fsroot = os.path.join(self.base_fsroot, "incoming")
             uploads_to_process = self.locateDirectories(fsroot)
             self.log.debug("Checked in %s, found %s"
                 % (fsroot, uploads_to_process))
             for upload in uploads_to_process:
                 self.log.debug("Considering upload %s" % upload)
+                if leaf_name is not None and upload != leaf_name:
+                    self.log.debug("Skipping %s -- does not match %s" % (
+                        upload, leaf_name))
+                    continue
                 self.processUpload(fsroot, upload)
-
         finally:
             self.log.debug("Rolling back any remaining transactions.")
             self.ztm.abort()
@@ -152,16 +172,7 @@
         is 'failed', otherwise it is the worst of the results from the
         individual changes files, in order 'failed', 'rejected', 'accepted'.

-        If the leafname option is set but its value is not the same as the
-        name of the upload directory, skip it entirely.
-
         """
-        if (self.options.leafname is not None and
-            upload != self.options.leafname):
-            self.log.debug("Skipping %s -- does not match %s" % (
-                upload, self.options.leafname))
-            return
-
         upload_path = os.path.join(fsroot, upload)
         changes_files = self.locateChangesFiles(upload_path)

@@ -242,7 +253,7 @@
         # Skip lockfile deletion, see similar code in lp.poppy.hooks.
         fsroot_lock.release(skip_delete=True)

-        sorted_dir_names = sorted(
+        sorted_dir_names = sorted(
             dir_name
             for dir_name in dir_names
             if os.path.isdir(os.path.join(fsroot, dir_name)))
@@ -321,8 +332,7 @@
                 "https://help.launchpad.net/Packaging/PPA#Uploading "
                 "and update your configuration.")))
         self.log.debug("Finding fresh policy")
-        self.options.distro = distribution.name
-        policy = findPolicyByOptions(self.options)
+        policy = self._getPolicyForDistro(distribution)
         policy.archive = archive

         # DistroSeries overriding respect the following precedence:
@@ -396,7 +406,7 @@
         # when transaction is committed) this will cause any emails sent
         # sent by do_reject to be lost.
         notify = True
-        if self.options.dryrun or self.options.nomails:
+        if self.dry_run or self.no_mails:
             notify = False
         if upload.is_rejected:
             result = UploadStatusEnum.REJECTED
@@ -415,7 +425,7 @@
             for msg in upload.rejections:
                 self.log.warn("\t%s" % msg)

-        if self.options.dryrun:
+        if self.dry_run:
             self.log.info("Dry run, aborting transaction.")
             self.ztm.abort()
         else:
@@ -434,7 +444,7 @@
         This includes moving the given upload directory and moving the
         matching .distro file, if it exists.
         """
-        if self.options.keep or self.options.dryrun:
+        if self.keep or self.dry_run:
             self.log.debug("Keeping contents untouched")
             return

@@ -462,21 +472,21 @@
         This includes moving the given upload directory and moving the
         matching .distro file, if it exists.
         """
-        if self.options.keep or self.options.dryrun:
+        if self.keep or self.dry_run:
             self.log.debug("Keeping contents untouched")
             return

         pathname = os.path.basename(upload)

         target_path = os.path.join(
-            self.options.base_fsroot, subdir_name, pathname)
+            self.base_fsroot, subdir_name, pathname)
         self.log.debug("Moving upload directory %s to %s" %
             (upload, target_path))
         shutil.move(upload, target_path)

         distro_filename = upload + ".distro"
         if os.path.isfile(distro_filename):
-            target_path = os.path.join(self.options.base_fsroot, subdir_name,
+            target_path = os.path.join(self.base_fsroot, subdir_name,
                 os.path.basename(distro_filename))
             self.log.debug("Moving distro file %s to %s" % (distro_filename,
                 target_path))
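
With the constructor change above, UploadProcessor no longer reads a cron-script options object: callers hand it plain values plus a policy_for_distro callback, and processUploadQueue() now takes the optional leaf name itself. A minimal wiring sketch follows; it is an illustration only, assumes a Launchpad tree on the import path, and the queue path, leaf name, stub transaction and stub policy factory are all invented for the example:

    import logging

    from lp.archiveuploader.uploadprocessor import UploadProcessor

    class FakeTransaction:
        """Stand-in for the Zope transaction manager (ztm)."""
        def commit(self):
            pass
        def abort(self):
            pass

    def stub_policy_for_distro(distribution):
        # Return whatever upload policy the caller wants for this
        # distribution; the real cron script builds one with
        # findPolicyByOptions (see soyuz_process_upload.py below).
        return make_policy(distribution)  # hypothetical factory

    processor = UploadProcessor(
        "/tmp/upload-queue",         # base_fsroot
        True,                        # dry_run: roll back instead of commit
        True,                        # no_mails
        False,                       # keep
        stub_policy_for_distro,      # policy_for_distro
        FakeTransaction(),           # ztm
        logging.getLogger("uploads"))
    processor.processUploadQueue()              # process the whole queue
    processor.processUploadQueue("20100812")    # or one leaf directory (made-up name)
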
=== modified file 'lib/lp/buildmaster/model/buildfarmjob.py'
--- lib/lp/buildmaster/model/buildfarmjob.py 2010-06-16 16:01:08 +0000
+++ lib/lp/buildmaster/model/buildfarmjob.py 2010-08-12 19:18:45 +0000
@@ -16,7 +16,7 @@

 import pytz

-from storm.expr import And, Coalesce, Desc, Join, LeftJoin, Select
+from storm.expr import Coalesce, Desc, LeftJoin, Or
 from storm.locals import Bool, DateTime, Int, Reference, Storm
 from storm.store import Store

@@ -365,49 +365,38 @@
         # Currently only package builds can be private (via their
         # related archive), but not all build farm jobs will have a
         # related package build - hence the left join.
-        left_join_pkg_builds = LeftJoin(
-            BuildFarmJob,
-            Join(
-                PackageBuild,
-                Archive,
-                And(PackageBuild.archive == Archive.id)),
-            PackageBuild.build_farm_job == BuildFarmJob.id)
-
-        filtered_builds = IStore(BuildFarmJob).using(
-            left_join_pkg_builds).find(BuildFarmJob, *extra_clauses)
+        origin = [BuildFarmJob]
+        left_join_archive = [
+            LeftJoin(
+                PackageBuild,
+                PackageBuild.build_farm_job == BuildFarmJob.id),
+            LeftJoin(
+                Archive, PackageBuild.archive == Archive.id),
+            ]

         if user is None:
             # Anonymous requests don't get to see private builds at all.
-            filtered_builds = filtered_builds.find(
-                Coalesce(Archive.private, False) == False)
+            origin.extend(left_join_archive)
+            extra_clauses.append(Coalesce(Archive.private, False) == False)

         elif user.inTeam(getUtility(ILaunchpadCelebrities).admin):
             # Admins get to see everything.
             pass
         else:
-            # Everyone else sees a union of all public builds and the
+            # Everyone else sees all public builds and the
             # specific private builds to which they have access.
-            filtered_builds = filtered_builds.find(
-                Coalesce(Archive.private, False) == False)
-
-            user_teams_subselect = Select(
-                TeamParticipation.teamID,
-                where=And(
-                    TeamParticipation.personID == user.id,
-                    TeamParticipation.teamID == Archive.ownerID))
-            private_builds_for_user = IStore(BuildFarmJob).find(
-                BuildFarmJob,
-                PackageBuild.build_farm_job == BuildFarmJob.id,
-                PackageBuild.archive == Archive.id,
-                Archive.private == True,
-                Archive.ownerID.is_in(user_teams_subselect),
-                *extra_clauses)
-
-            filtered_builds = filtered_builds.union(
-                private_builds_for_user)
+            origin.extend(left_join_archive)
+            origin.append(LeftJoin(
+                TeamParticipation, TeamParticipation.teamID == Archive.ownerID))
+            extra_clauses.append(
+                Or(Coalesce(Archive.private, False) == False,
+                    TeamParticipation.person == user))
+
+        filtered_builds = IStore(BuildFarmJob).using(*origin).find(
+            BuildFarmJob, *extra_clauses)

         filtered_builds.order_by(
             Desc(BuildFarmJob.date_finished), BuildFarmJob.id)
+        filtered_builds.config(distinct=True)

         return filtered_builds
-
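
The refactoring above replaces a UNION of two result sets (all public builds, plus private builds owned by one of the user's teams) with a single query over LEFT JOINed optional tables, an OR filter, and a distinct result set. A toy illustration of that query shape, using only sqlite3 from the standard library with simplified stand-in tables rather than the Launchpad schema:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE buildfarmjob (id INTEGER PRIMARY KEY);
        CREATE TABLE archive (id INTEGER PRIMARY KEY, owner INTEGER, private INTEGER);
        CREATE TABLE packagebuild (
            build_farm_job INTEGER REFERENCES buildfarmjob,
            archive INTEGER REFERENCES archive);
        CREATE TABLE teamparticipation (team INTEGER, person INTEGER);

        INSERT INTO buildfarmjob VALUES (1), (2), (3);
        INSERT INTO archive VALUES (10, 100, 0), (11, 101, 1);
        INSERT INTO packagebuild VALUES (1, 10), (2, 11);
        -- person 42 participates in team 101, which owns the private archive.
        INSERT INTO teamparticipation VALUES (101, 42);
    """)

    rows = db.execute("""
        SELECT DISTINCT buildfarmjob.id
        FROM buildfarmjob
        LEFT JOIN packagebuild ON packagebuild.build_farm_job = buildfarmjob.id
        LEFT JOIN archive ON archive.id = packagebuild.archive
        LEFT JOIN teamparticipation ON teamparticipation.team = archive.owner
        WHERE COALESCE(archive.private, 0) = 0
           OR teamparticipation.person = ?
    """, (42,)).fetchall()

    # Public build, team-visible private build, and a job with no archive
    # at all (hence the COALESCE over the left join) are all returned once.
    print(sorted(row[0] for row in rows))  # [1, 2, 3]
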
=== modified file 'lib/lp/soyuz/scripts/soyuz_process_upload.py'
--- lib/lp/soyuz/scripts/soyuz_process_upload.py 2010-05-04 15:38:08 +0000
+++ lib/lp/soyuz/scripts/soyuz_process_upload.py 2010-08-12 19:18:45 +0000
@@ -8,6 +8,7 @@

 import os

+from lp.archiveuploader.uploadpolicy import findPolicyByOptions
 from lp.archiveuploader.uploadprocessor import UploadProcessor
 from lp.services.scripts.base import (
     LaunchpadCronScript, LaunchpadScriptFailure)
@@ -74,8 +75,13 @@
                 "%s is not a directory" % self.options.base_fsroot)

         self.logger.debug("Initialising connection.")
-        UploadProcessor(
-            self.options, self.txn, self.logger).processUploadQueue()
+        def getPolicy(distro):
+            self.options.distro = distro.name
+            return findPolicyByOptions(self.options)
+        processor = UploadProcessor(self.options.base_fsroot,
+            self.options.dryrun, self.options.nomails, self.options.keep,
+            getPolicy, self.txn, self.logger)
+        processor.processUploadQueue(self.options.leafname)

     @property
     def lockfilename(self):
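
The getPolicy closure above keeps the old options-driven findPolicyByOptions lookup confined to this script; UploadProcessor itself only requires some callable that maps a distribution to a policy. As a purely hypothetical illustration (every name below is invented), another entry point could satisfy the same contract with a lookup table of pre-built policy objects:

    # Hypothetical sketch: pre-built policy objects keyed by distribution
    # name; 'ubuntu_policy' and 'ubuntutest_policy' are assumed to exist.
    POLICIES = {
        'ubuntu': ubuntu_policy,
        'ubuntutest': ubuntutest_policy,
    }

    def policy_for_distro(distribution):
        # Same contract as getPolicy() above: take a distribution object,
        # return an upload policy for it.
        try:
            return POLICIES[distribution.name]
        except KeyError:
            raise ValueError(
                "No upload policy configured for %s" % distribution.name)
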
=== modified file 'utilities/sourcedeps.conf'
--- utilities/sourcedeps.conf 2010-08-03 14:59:22 +0000
+++ utilities/sourcedeps.conf 2010-08-12 19:18:45 +0000
@@ -12,5 +12,6 @@
 pygettextpo lp:~launchpad-pqm/pygettextpo/trunk;revno=24
 pygpgme lp:~launchpad-pqm/pygpgme/devel;revno=49
 subvertpy lp:~launchpad-pqm/subvertpy/trunk;revno=2042
+python-debian lp:~launchpad-pqm/python-debian/devel;revno=185
 testresources lp:~launchpad-pqm/testresources/dev;revno=16
 shipit lp:~launchpad-pqm/shipit/trunk;revno=8909 optional
