Merge lp:~jtv/launchpad/db-bug-793382 into lp:launchpad/db-devel

Proposed by Jeroen T. Vermeulen
Status: Merged
Approved by: Jeroen T. Vermeulen
Approved revision: no longer in the source branch.
Merged at revision: 10661
Proposed branch: lp:~jtv/launchpad/db-bug-793382
Merge into: lp:launchpad/db-devel
Diff against target: 2150 lines (+1395/-402) (has conflicts)
14 files modified
lib/canonical/launchpad/doc/vocabularies.txt (+7/-7)
lib/canonical/launchpad/pagetests/standalone/xx-opstats.txt (+38/-16)
lib/canonical/launchpad/webapp/publication.py (+11/-1)
lib/canonical/launchpad/webapp/tests/test_haproxy.py (+9/-1)
lib/lp/registry/browser/distroseries.py (+56/-0)
lib/lp/registry/browser/tests/test_distroseries.py (+134/-0)
lib/lp/registry/stories/distroseries/xx-distroseries-index.txt (+48/-0)
lib/lp/registry/templates/distroseries-details.pt (+33/-0)
lib/lp/services/job/model/job.py (+21/-2)
lib/lp/services/job/tests/test_job.py (+18/-8)
lib/lp/soyuz/interfaces/packagecopyjob.py (+168/-0)
lib/lp/soyuz/model/binaryandsourcepackagename.py (+1/-11)
lib/lp/soyuz/model/packagecopyjob.py (+401/-156)
lib/lp/soyuz/tests/test_packagecopyjob.py (+450/-200)
Text conflict in lib/lp/registry/browser/distroseries.py
Text conflict in lib/lp/registry/browser/tests/test_distroseries.py
Text conflict in lib/lp/registry/stories/distroseries/xx-distroseries-index.txt
Text conflict in lib/lp/registry/templates/distroseries-details.pt
Text conflict in lib/lp/soyuz/interfaces/packagecopyjob.py
Text conflict in lib/lp/soyuz/model/packagecopyjob.py
Text conflict in lib/lp/soyuz/tests/test_packagecopyjob.py
To merge this branch: bzr merge lp:~jtv/launchpad/db-bug-793382
Reviewer: Henning Eggers (community)
Status: Approve
Review via email: mp+63545@code.launchpad.net

Commit message

[r=henninge][bug=793382] Bring requestUpgrades back to constant query count.

Description of the change

= Summary =

Now that we no longer bundle multiple packages into one PackageCopyJob, the DistroSeries:+localpackagediffs view method requestUpgrades was issuing extra queries in proportion to the number of copies requested: creating a Job cost one extra query per package, and the associated PackageCopyJob cost another. This request needs to scale, so the query count should stay constant regardless of batch size.

== Proposed fix ==

Provide interfaces for creating multiple Jobs and PlainPackageCopyJobs with a single INSERT query each. That makes it difficult to get the objects in memory, but we don't need them: all we need (and even that only for Job, really) is the ids they have been assigned.
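
In essence, the Job half of this is a multi-row INSERT with a RETURNING clause. A minimal sketch, condensed from the new Job.createMultiple in the preview diff below:

{{{
    @classmethod
    def createMultiple(cls, store, num_jobs):
        """Create num_jobs WAITING Jobs in one INSERT; return their ids."""
        job_contents = ["(%s)" % quote(JobStatus.WAITING)] * num_jobs
        result = store.execute("""
            INSERT INTO Job (status)
            VALUES %s
            RETURNING id
            """ % ", ".join(job_contents))
        return [job_id for job_id, in result]
}}}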

In principle Storm itself should be able to bundle the INSERTs this way, but that would take quite some work to get right, and it could become fragile if one INSERT were flushed to the database before the next object could be added to the store.

The IPlainPackageCopyJobSource interface takes two kinds of data when creating multiple jobs at once: some parameters are specified once and apply to every job in the batch, while others must be specified for each job separately. The latter are passed as a list of tuples, as in the sketch below.
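
For instance, requestUpgrades now assembles its batch roughly like this (condensed from the browser view in the preview diff; the per-copy data goes in the tuples, the shared parameters in the createMultiple call):

{{{
    copies = [
        (dsd.source_package_name.name,
         dsd.parent_source_version,
         dsd.parent_series.main_archive,
         target_distroseries.main_archive,
         PackagePublishingPocket.RELEASE)
        for dsd in self.getUpgrades()]
    getUtility(IPlainPackageCopyJobSource).createMultiple(
        target_distroseries, copies,
        copy_policy=PackageCopyPolicy.MASS_SYNC)
}}}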

== Pre-implementation notes ==

Gavin came up with the idea of putting a "create multiple jobs" method on a utility class. No other noteworthy alternatives emerged when the idea was discussed on IRC.

== Implementation details ==

The Job multi-create method went onto the Job class itself, as a classmethod. There is an IJobSource interface, but it was not suitable here for several reasons. First, it is implemented in many places that may never need a multi-create; requiring this method would impose unneeded hardship. Second, IJobSource is not a utility for Jobs; it is more of a generic base interface for classes built around Job, which may or may not live in separate tables, so it would be ambiguous what kind of ids the method should return. Finally, the job-specific creation methods have their own signatures, which would not work well if they all had to come from the same interface definition.

You'll notice in the performance test that even though the query count no longer increases by 2 per copy, the count for the base case went up by 2. I haven't pinned down where those 2 queries come from; it may have been an accounting glitch where the two INSERTs used to be deferred past the point where the query count is collected. That deferral is not possible with the new code, which would explain the difference.
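
The test pins this down by comparing recorded query counts, roughly as follows (condensed from test_requestUpgrades_is_efficient in the preview diff):

{{{
    with StormStatementRecorder() as recorder1:
        self.makeView(derived_series).requestUpgrades()
    self.assertThat(recorder1, HasQueryCount(LessThan(12)))

    # Three more pending upgrades must not cost any extra queries.
    for index in xrange(3):
        self.makePackageUpgrade(derived_series=derived_series)
    flush_database_caches()
    with StormStatementRecorder() as recorder2:
        self.makeView(derived_series).requestUpgrades()
    self.assertThat(
        recorder2, HasQueryCount(Equals(recorder1.count)))
}}}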

== Tests ==

From low-level to high-level:
{{{
./bin/test -vvc lp.services.job.tests.test_job -t createMultiple
./bin/test -vvc lp.soyuz.tests.test_packagecopyjob -t createMultiple
./bin/test -vvc lp.registry.browser.tests.test_distroseries -t requestUpgrades_is_efficient
}}}

== Demo and Q/A ==

I'm not sure the "Upgrade Packages" button is visible on DistroSeries:+localpackagediffs yet. But if it is, pressing it should make upgradable entries on that page (and on any other batches for the same distroseries) show up as "synchronizing..."

Upgradable entries are ones where the checkbox on the left is active and the parent version is greater than the version in the distroseries you're looking at.

= Launchpad lint =

Checking for conflicts and issues in changed files.

Linting changed files:
  lib/lp/soyuz/model/packagecopyjob.py
  lib/lp/soyuz/interfaces/packagecopyjob.py
  lib/lp/soyuz/tests/test_packagecopyjob.py
  lib/lp/registry/browser/tests/test_distroseries.py
  lib/lp/services/job/tests/test_job.py
  lib/lp/services/job/model/job.py
  lib/lp/registry/browser/distroseries.py

Revision history for this message
Henning Eggers (henninge) wrote:

I am sorry but I cannot find anything wrong with this proposal. :-P

review: Approve

Preview Diff

=== modified file 'lib/canonical/launchpad/doc/vocabularies.txt'
--- lib/canonical/launchpad/doc/vocabularies.txt 2011-05-27 19:53:20 +0000
+++ lib/canonical/launchpad/doc/vocabularies.txt 2011-06-09 10:59:00 +0000
@@ -300,8 +300,8 @@
     >>> package_name_terms.count()
     2
     >>> [(term.token, term.title) for term in package_name_terms]
-    [('mozilla-firefox', u'iceweasel huh ?'),
-     ('mozilla-firefox-data', u'Mozilla Firefox Data is .....')]
+    [('mozilla-firefox', u'mozilla-firefox'),
+     ('mozilla-firefox-data', u'mozilla-firefox-data')]
 
 Searching for "mozilla" should return the binary package name above, and
 the source package named "mozilla".
@@ -310,9 +310,9 @@
     >>> package_name_terms.count()
     3
     >>> [(term.token, term.title) for term in package_name_terms]
-    [('mozilla', 'Not uploaded'),
-     ('mozilla-firefox', u'iceweasel huh ?'),
-     ('mozilla-firefox-data', u'Mozilla Firefox Data is .....')]
+    [('mozilla', u'mozilla'),
+     ('mozilla-firefox', u'mozilla-firefox'),
+     ('mozilla-firefox-data', u'mozilla-firefox-data')]
 
 The search does a case-insensitive, substring match.
 
@@ -320,8 +320,8 @@
     >>> package_name_terms.count()
     2
     >>> [(term.token, term.title) for term in package_name_terms]
-    [('linux-2.6.12', u'this kernel is like the crystal method: a temple...'),
-     ('linux-source-2.6.15', u'Source of: linux-2.6.12')]
+    [('linux-2.6.12', u'linux-2.6.12'),
+     ('linux-source-2.6.15', u'linux-source-2.6.15')]
 
 
 BinaryPackageNameVocabulary
 
=== modified file 'lib/canonical/launchpad/pagetests/standalone/xx-opstats.txt'
--- lib/canonical/launchpad/pagetests/standalone/xx-opstats.txt 2011-02-25 04:19:04 +0000
+++ lib/canonical/launchpad/pagetests/standalone/xx-opstats.txt 2011-06-09 10:59:00 +0000
@@ -1,4 +1,5 @@
-= Operational Statistics and Metrics =
+Operational Statistics and Metrics
+==================================
 
 We make Zope 3 give us real time statistics about Launchpad's operation.
 We can access them via XML-RPC:
@@ -25,7 +26,8 @@
     ...     reset()
     ...
 
-== Number of requests and XML-RPC requests ==
+Number of requests and XML-RPC requests
+---------------------------------------
 
 Even though XML-RPC requests are technically HTTP requests, we do not
 count them as such. Note that the call to obtain statistics will increment
@@ -59,7 +61,8 @@
     requests: 1
     xml-rpc requests: 1
 
-== Number of HTTP requests and success codes ==
+Number of HTTP requests and success codes
+-----------------------------------------
 
     >>> output = http("GET / HTTP/1.1\nHost: bugs.launchpad.dev\n")
     >>> output.getStatus()
@@ -69,7 +72,8 @@
     http requests: 1
     requests: 1
 
-== Number of 404s ==
+Number of 404s
+--------------
 
 Note that retries is incremented too. As per the standard Launchpad
 database policy, this request first uses the slave DB. The requested
@@ -86,7 +90,8 @@
     requests: 1
     retries: 1
 
-== Number of 500 Internal Server Errors (unhandled exceptions) ==
+Number of 500 Internal Server Errors (unhandled exceptions)
+-----------------------------------------------------------
 
 This is normally the number of OOPS pages displayed to the user, but
 may also include the odd case where the OOPS system has failed and a
@@ -129,7 +134,8 @@
     http requests: 1
     requests: 1
 
-== Number of XML-RPC Faults ==
+Number of XML-RPC Faults
+------------------------
 
     >>> try:
     ...     opstats = lp_xmlrpc.invalid() # XXX: Need a HTTP test too
@@ -142,7 +148,8 @@
     xml-rpc requests: 1
 
 
-== Number of soft timeouts ==
+Number of soft timeouts
+-----------------------
 
     >>> from canonical.config import config
     >>> test_data = dedent("""
@@ -162,7 +169,8 @@
     requests: 1
     soft timeouts: 1
 
-== Number of Timeouts ==
+Number of Timeouts
+------------------
 
 We can't reliably track this using the 503 response code as other
 Launchpad code may well return this status and an XML-RPC request may
@@ -190,7 +198,8 @@
     timeouts: 1
 
 
-== HTTP access for Cricket ==
+HTTP access for Cricket
+-----------------------
 
 Stats can also be retrieved via HTTP in cricket-graph format:
 
@@ -219,13 +228,13 @@
     xmlrpc_requests:0@...
     <BLANKLINE>
 
-== No DB access required ==
+No DB access required
+---------------------
 
-Accessing the opstats page as an anonymous user will make no database
-queries. This is important to make it as reliable as possible since we
-use this page for monitoring. Because of this property, the load
-balancers also use this page to determine if a Launchpad instance is
-responsive.
+Accessing the opstats page will make no database queries. This is important to
+make it as reliable as possible since we use this page for monitoring. Because
+of this property, the load balancers also use this page to determine if a
+Launchpad instance is responsive.
 
 To confirm this, we first point all our database connection information
 to somewhere that doesn't exist.
@@ -257,6 +266,20 @@
     <BLANKLINE>
     1XXs:0@...
 
+This is also true if we are provide authentication.
+
+    >>> print http(r"""
+    ... GET /+opstats HTTP/1.1
+    ... Host: launchpad.dev
+    ... Authorization: Basic Zm9vLmJhckBjYW5vbmljYWwuY29tOnRlc3Q=
+    ... """)
+    HTTP/1.1 200 Ok
+    ...
+    Content-Type: text/plain; charset=US-ASCII
+    ...
+    <BLANKLINE>
+    1XXs:0@...
+
 But our database connections are broken.
 
     >>> from canonical.launchpad.interfaces.lpstorm import IStore
@@ -271,4 +294,3 @@
 
     >>> IStore(Person).find(Person, name='janitor').one().name
     u'janitor'
-
=== modified file 'lib/canonical/launchpad/webapp/publication.py'
--- lib/canonical/launchpad/webapp/publication.py 2011-05-27 21:12:25 +0000
+++ lib/canonical/launchpad/webapp/publication.py 2011-06-09 10:59:00 +0000
@@ -330,7 +330,17 @@
         personless account, return the unauthenticated principal.
         """
         auth_utility = getUtility(IPlacelessAuthUtility)
-        principal = auth_utility.authenticate(request)
+        principal = None
+        # +opstats and +haproxy are status URLs that must not query the DB at
+        # all. This is enforced (see
+        # lib/canonical/launchpad/webapp/dbpolicy.py). If the request is for
+        # one of those two pages, don't even try to authenticate, because we
+        # may fail. We haven't traversed yet, so we have to sniff the request
+        # this way. Even though PATH_INFO is always present in real requests,
+        # we need to tread carefully (``get``) because of test requests in our
+        # automated tests.
+        if request.get('PATH_INFO') not in [u'/+opstats', u'/+haproxy']:
+            principal = auth_utility.authenticate(request)
         if principal is None or principal.person is None:
             # This is either an unauthenticated user or a user who
             # authenticated on our OpenID server using a personless account.
=== modified file 'lib/canonical/launchpad/webapp/tests/test_haproxy.py'
--- lib/canonical/launchpad/webapp/tests/test_haproxy.py 2011-02-18 18:12:43 +0000
+++ lib/canonical/launchpad/webapp/tests/test_haproxy.py 2011-06-09 10:59:00 +0000
@@ -35,6 +35,15 @@
         result = self.http(u'GET /+haproxy HTTP/1.0', handle_errors=False)
         self.assertEquals(200, result.getStatus())
 
+    def test_authenticated_HAProxyStatusView_works(self):
+        # We don't use authenticated requests, but this keeps us from
+        # generating oopses.
+        result = self.http(
+            u'GET /+haproxy HTTP/1.0\n'
+            u'Authorization: Basic Zm9vLmJhckBjYW5vbmljYWwuY29tOnRlc3Q=\n',
+            handle_errors=False)
+        self.assertEquals(200, result.getStatus())
+
     def test_HAProxyStatusView_going_down_returns_500(self):
         haproxy.set_going_down_flag(True)
         result = self.http(u'GET /+haproxy HTTP/1.0', handle_errors=False)
@@ -61,4 +70,3 @@
         haproxy.set_going_down_flag(True)
         result = self.http(u'GET /+haproxy HTTP/1.0', handle_errors=False)
         self.assertEquals(499, result.getStatus())
-
=== modified file 'lib/lp/registry/browser/distroseries.py'
--- lib/lp/registry/browser/distroseries.py 2011-06-08 21:03:29 +0000
+++ lib/lp/registry/browser/distroseries.py 2011-06-09 10:59:00 +0000
@@ -1052,6 +1052,7 @@
     def sync_sources(self, action, data):
         self._sync_sources(action, data)
 
+<<<<<<< TREE
     def getUpgrades(self):
         """Find straightforward package upgrades.
 
@@ -1103,6 +1104,61 @@
         queue = PackageUploadQueue(self.context, None)
         return check_permission("launchpad.Edit", queue)
 
+=======
+    def getUpgrades(self):
+        """Find straightforward package upgrades.
+
+        These are updates for packages that this distroseries shares
+        with a parent series, for which there have been updates in the
+        parent, and which do not have any changes in this series that
+        might complicate a sync.
+
+        :return: A result set of `DistroSeriesDifference`s.
+        """
+        return getUtility(IDistroSeriesDifferenceSource).getSimpleUpgrades(
+            self.context)
+
+    @action(_("Upgrade Packages"), name="upgrade", condition='canUpgrade')
+    def upgrade(self, action, data):
+        """Request synchronization of straightforward package upgrades."""
+        self.requestUpgrades()
+
+    def requestUpgrades(self):
+        """Request sync of packages that can be easily upgraded."""
+        target_distroseries = self.context
+        copies = [
+            (
+                dsd.source_package_name.name,
+                dsd.parent_source_version,
+                dsd.parent_series.main_archive,
+                target_distroseries.main_archive,
+                PackagePublishingPocket.RELEASE,
+            )
+            for dsd in self.getUpgrades()]
+        getUtility(IPlainPackageCopyJobSource).createMultiple(
+            target_distroseries, copies,
+            copy_policy=PackageCopyPolicy.MASS_SYNC)
+
+        self.request.response.addInfoNotification(
+            (u"Upgrades of {context.displayname} packages have been "
+             u"requested. Please give Launchpad some time to complete "
+             u"these.").format(context=self.context))
+
+    def canUpgrade(self, action=None):
+        """Should the form offer a packages upgrade?"""
+        if getFeatureFlag("soyuz.derived_series_sync.enabled") is None:
+            return False
+        elif self.context.status not in UPGRADABLE_SERIES_STATUSES:
+            # A feature freeze precludes blanket updates.
+            return False
+        elif self.getUpgrades().is_empty():
+            # There are no simple updates to perform.
+            return False
+        else:
+            queue = PackageUploadQueue(self.context, None)
+            return check_permission("launchpad.Edit", queue)
+
+>>>>>>> MERGE-SOURCE
 
 class DistroSeriesMissingPackagesView(DistroSeriesDifferenceBaseView,
                                       LaunchpadFormView):
=== modified file 'lib/lp/registry/browser/tests/test_distroseries.py'
--- lib/lp/registry/browser/tests/test_distroseries.py 2011-06-08 21:03:29 +0000
+++ lib/lp/registry/browser/tests/test_distroseries.py 2011-06-09 10:59:00 +0000
@@ -996,6 +996,7 @@
         self.assertEqual(versions['derived'], derived_span[0].string.strip())
         self.assertEqual(versions['parent'], parent_span[0].string.strip())
 
+<<<<<<< TREE
     def test_getUpgrades_shows_updates_in_parent(self):
         # The view's getUpgrades methods lists packages that can be
         # trivially upgraded: changed in the parent, not changed in the
@@ -1128,6 +1129,139 @@
 
 class TestDistroSeriesLocalDifferencesFunctional(TestCaseWithFactory,
                                                  DistroSeriesDifferenceMixin):
+=======
+    def test_getUpgrades_shows_updates_in_parent(self):
+        # The view's getUpgrades methods lists packages that can be
+        # trivially upgraded: changed in the parent, not changed in the
+        # derived series, but present in both.
+        dsd = self.makePackageUpgrade()
+        view = self.makeView(dsd.derived_series)
+        self.assertContentEqual([dsd], view.getUpgrades())
+
+    def enableDerivedSeriesSyncFeature(self):
+        self.useFixture(
+            FeatureFixture(
+                {u'soyuz.derived_series_sync.enabled': u'on'}))
+
+    @with_celebrity_logged_in("admin")
+    def test_upgrades_offered_only_with_feature_flag(self):
+        # The "Upgrade Packages" button will only be shown when a specific
+        # feature flag is enabled.
+        view = self.makeView()
+        self.makePackageUpgrade(view.context)
+        self.assertFalse(view.canUpgrade())
+        self.enableDerivedSeriesSyncFeature()
+        self.assertTrue(view.canUpgrade())
+
+    def test_upgrades_are_offered_if_appropriate(self):
+        # The "Upgrade Packages" button will only be shown to privileged
+        # users.
+        self.enableDerivedSeriesSyncFeature()
+        dsd = self.makePackageUpgrade()
+        view = self.makeView(dsd.derived_series)
+        with celebrity_logged_in("admin"):
+            self.assertTrue(view.canUpgrade())
+        with person_logged_in(self.factory.makePerson()):
+            self.assertFalse(view.canUpgrade())
+        with anonymous_logged_in():
+            self.assertFalse(view.canUpgrade())
+
+    @with_celebrity_logged_in("admin")
+    def test_upgrades_offered_only_if_available(self):
+        # If there are no upgrades, the "Upgrade Packages" button won't
+        # be shown.
+        self.enableDerivedSeriesSyncFeature()
+        view = self.makeView()
+        self.assertFalse(view.canUpgrade())
+        self.makePackageUpgrade(view.context)
+        self.assertTrue(view.canUpgrade())
+
+    @with_celebrity_logged_in("admin")
+    def test_upgrades_not_offered_after_feature_freeze(self):
+        # There won't be an "Upgrade Packages" button once feature
+        # freeze has occurred. Mass updates would not make sense after
+        # that point.
+        self.enableDerivedSeriesSyncFeature()
+        upgradeable = {}
+        for status in SeriesStatus.items:
+            dsd = self.makePackageUpgrade()
+            dsd.derived_series.status = status
+            view = self.makeView(dsd.derived_series)
+            upgradeable[status] = view.canUpgrade()
+        expected = {
+            SeriesStatus.FUTURE: True,
+            SeriesStatus.EXPERIMENTAL: True,
+            SeriesStatus.DEVELOPMENT: True,
+            SeriesStatus.FROZEN: False,
+            SeriesStatus.CURRENT: False,
+            SeriesStatus.SUPPORTED: False,
+            SeriesStatus.OBSOLETE: False,
+            }
+        self.assertEqual(expected, upgradeable)
+
+    def test_upgrade_creates_sync_jobs(self):
+        # requestUpgrades generates PackageCopyJobs for the upgrades
+        # that need doing.
+        dsd = self.makePackageUpgrade()
+        series = dsd.derived_series
+        with celebrity_logged_in('admin'):
+            series.status = SeriesStatus.DEVELOPMENT
+            series.datereleased = UTC_NOW
+        view = self.makeView(series)
+        view.requestUpgrades()
+        job_source = getUtility(IPlainPackageCopyJobSource)
+        jobs = list(
+            job_source.getActiveJobs(series.distribution.main_archive))
+        self.assertEquals(1, len(jobs))
+        job = jobs[0]
+        self.assertEquals(series, job.target_distroseries)
+        self.assertEqual(dsd.source_package_name.name, job.package_name)
+        self.assertEqual(dsd.parent_source_version, job.package_version)
+        self.assertEqual(PackagePublishingPocket.RELEASE, job.target_pocket)
+
+    def test_upgrade_gives_feedback(self):
+        # requestUpgrades doesn't instantly perform package upgrades,
+        # but it shows the user a notice that the upgrades have been
+        # requested.
+        dsd = self.makePackageUpgrade()
+        view = self.makeView(dsd.derived_series)
+        view.requestUpgrades()
+        expected = {
+            "level": BrowserNotificationLevel.INFO,
+            "message":
+                ("Upgrades of {0.displayname} packages have been "
+                 "requested. Please give Launchpad some time to "
+                 "complete these.").format(dsd.derived_series),
+            }
+        observed = map(vars, view.request.response.notifications)
+        self.assertEqual([expected], observed)
+
+    def test_requestUpgrades_is_efficient(self):
+        # A single web request may need to schedule large numbers of
+        # package upgrades. It must do so without issuing large numbers
+        # of database queries.
+        derived_series, parent_series = self._createChildAndParent()
+        # Take a baseline measure of queries.
+        self.makePackageUpgrade(derived_series=derived_series)
+        flush_database_caches()
+        with StormStatementRecorder() as recorder1:
+            self.makeView(derived_series).requestUpgrades()
+        self.assertThat(recorder1, HasQueryCount(LessThan(12)))
+
+        # The query count does not increase with the number of upgrades.
+        for index in xrange(3):
+            self.makePackageUpgrade(derived_series=derived_series)
+        flush_database_caches()
+        with StormStatementRecorder() as recorder2:
+            self.makeView(derived_series).requestUpgrades()
+        self.assertThat(
+            recorder2,
+            HasQueryCount(Equals(recorder1.count)))
+
+
+class TestDistroSeriesLocalDifferencesFunctional(TestCaseWithFactory,
+                                                 DistroSeriesDifferenceMixin):
+>>>>>>> MERGE-SOURCE
 
     layer = LaunchpadFunctionalLayer
 
=== modified file 'lib/lp/registry/stories/distroseries/xx-distroseries-index.txt'
--- lib/lp/registry/stories/distroseries/xx-distroseries-index.txt 2011-05-24 10:08:33 +0000
+++ lib/lp/registry/stories/distroseries/xx-distroseries-index.txt 2011-06-09 10:59:00 +0000
@@ -51,13 +51,56 @@
     Release manager: None
     Status: Current Stable Release
     Derives from: Warty (4.10) is not derived from another series.
+<<<<<<< TREE
     Derived series:
+=======
+    Derived series: No derived series.
+>>>>>>> MERGE-SOURCE
     Source packages: 3
     Binary packages: 4
 
 On series that have no source or binary packages, the portlet will
 change its text slightly to annouce this:
 
+    >>> anon_browser.open('http://launchpad.dev/debian/sarge')
+    >>> print extract_text(
+    ...     find_portlet(anon_browser.contents, 'Series information'))
+    Series information
+    Distribution: Debian
+    Series: Sarge (3.1)
+    Project drivers: Jeff Waugh, Mark Shuttleworth
+    Release manager: Jeff Waugh
+    Status: Pre-release Freeze
+    Derives from: Woody (3.0)
+    Source packages: No sources imported or published.
+    Binary packages: No binaries imported or published.
+
+The series' derivation parents - rather than the previous series - are
+shown when derivation is enabled, as are the series derived from this
+series:
+
+    >>> from lp.registry.interfaces.distribution import IDistributionSet
+    >>> from lp.testing import celebrity_logged_in
+    >>> from zope.component import getUtility
+
+    >>> with celebrity_logged_in("admin"):
+    ...     debian = getUtility(IDistributionSet).getByName(u"debian")
+    ...     sarge = debian.getSeries(u"sarge")
+    ...     parents = [
+    ...         factory.makeDistroSeries(name=u"dobby"),
+    ...         factory.makeDistroSeries(name=u"knobby")]
+    ...     distro_series_parents = [
+    ...         factory.makeDistroSeriesParent(
+    ...             derived_series=sarge, parent_series=parent)
+    ...         for parent in parents]
+    ...     children = [
+    ...         factory.makeDistroSeries(name=u"bobby"),
+    ...         factory.makeDistroSeries(name=u"tables")]
+    ...     distro_series_children = [
+    ...         factory.makeDistroSeriesParent(
+    ...             derived_series=child, parent_series=sarge)
+    ...         for child in children]
+
     >>> with derivation_enabled:
     ...     anon_browser.open('http://launchpad.dev/debian/sarge')
     >>> print extract_text(
@@ -68,8 +111,13 @@
     Project drivers: Jeff Waugh, Mark Shuttleworth
     Release manager: Jeff Waugh
     Status: Pre-release Freeze
+<<<<<<< TREE
     Derives from: Woody (3.0)
     Derived series:
+=======
+    Derives from: Dobby (...), Knobby (...)
+    Derived series: Bobby (...), Tables (...)
+>>>>>>> MERGE-SOURCE
     Source packages: No sources imported or published.
     Binary packages: No binaries imported or published.
 
=== modified file 'lib/lp/registry/templates/distroseries-details.pt'
--- lib/lp/registry/templates/distroseries-details.pt 2011-05-24 10:08:33 +0000
+++ lib/lp/registry/templates/distroseries-details.pt 2011-06-09 10:59:00 +0000
@@ -46,6 +46,9 @@
       </dd>
     </dl>
 
+    <tal:derivation-not-enabled
+      tal:condition="not:request/features/soyuz.derived_series_ui.enabled">
+
     <dl>
       <dt>Derives from:</dt>
       <dd>
@@ -60,8 +63,32 @@
       </dd>
     </dl>
 
+<<<<<<< TREE
     <dl tal:condition="request/features/soyuz.derived_series_ui.enabled"
         tal:define="all_child_series context/getDerivedSeries">
+=======
+    </tal:derivation-not-enabled>
+
+    <tal:derivation-enabled
+      tal:condition="request/features/soyuz.derived_series_ui.enabled">
+
+    <dl tal:define="parents context/getParentSeries">
+      <dt>Derives from:</dt>
+      <dd tal:condition="parents">
+        <tal:parents repeat="parent parents">
+          <a tal:attributes="href parent/fmt:url"
+             tal:content="parent/named_version">
+          </a><tal:comma condition="not:repeat/parent/end">, </tal:comma>
+        </tal:parents>
+      </dd>
+      <dd tal:condition="not:parents">
+        <tal:name replace="context/named_version"/> is not derived from
+        another series.
+      </dd>
+    </dl>
+
+    <dl tal:define="all_child_series context/getDerivedSeries">
+>>>>>>> MERGE-SOURCE
       <dt>Derived series:</dt>
       <dd>
         <tal:per_child_series repeat="child_series all_child_series">
@@ -70,12 +97,18 @@
           tal:content="child_series/named_version" /><tal:comma
             condition="not: repeat/child_series/end">,</tal:comma>
         </tal:per_child_series>
+<<<<<<< TREE
         <tal:none condition="all_child_series">
+=======
+        <tal:none condition="not:all_child_series">
+>>>>>>> MERGE-SOURCE
           No derived series.
         </tal:none>
       </dd>
     </dl>
 
+    </tal:derivation-enabled>
+
     <dl tal:define="sourcecount context/sourcecount">
      <dt>Source packages:</dt>
      <dd
=== modified file 'lib/lp/services/job/model/job.py'
--- lib/lp/services/job/model/job.py 2011-05-26 14:29:34 +0000
+++ lib/lp/services/job/model/job.py 2011-06-09 10:59:00 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd. This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """ORM object representing jobs."""
@@ -26,7 +26,10 @@
 from canonical.database.constants import UTC_NOW
 from canonical.database.datetimecol import UtcDateTimeCol
 from canonical.database.enumcol import EnumCol
-from canonical.database.sqlbase import SQLBase
+from canonical.database.sqlbase import (
+    quote,
+    SQLBase,
+    )
 from lp.services.job.interfaces.job import (
     IJob,
     JobStatus,
@@ -99,6 +102,22 @@
 
     status = property(lambda x: x._status)
 
+    @classmethod
+    def createMultiple(self, store, num_jobs):
+        """Create multiple `Job`s at once.
+
+        :param store: `Store` to ceate the jobs in.
+        :param num_jobs: Number of `Job`s to create.
+        :return: An iterable of `Job.id` values for the new jobs.
+        """
+        job_contents = ["(%s)" % quote(JobStatus.WAITING)] * num_jobs
+        result = store.execute("""
+            INSERT INTO Job (status)
+            VALUES %s
+            RETURNING id
+            """ % ", ".join(job_contents))
+        return [job_id for job_id, in result]
+
     def acquireLease(self, duration=300):
         """See `IJob`."""
         if (self.lease_expires is not None
105124
=== modified file 'lib/lp/services/job/tests/test_job.py'
--- lib/lp/services/job/tests/test_job.py 2011-05-27 08:18:07 +0000
+++ lib/lp/services/job/tests/test_job.py 2011-06-09 10:59:00 +0000
@@ -8,14 +8,9 @@
 
 import pytz
 from storm.locals import Store
-from zope.component import getUtility
 
 from canonical.database.constants import UTC_NOW
-from canonical.launchpad.webapp.interfaces import (
-    DEFAULT_FLAVOR,
-    IStoreSelector,
-    MAIN_STORE,
-    )
+from canonical.launchpad.interfaces.lpstorm import IStore
 from canonical.launchpad.webapp.testing import verifyObject
 from canonical.testing.layers import ZopelessDatabaseLayer
 from lp.services.job.interfaces.job import (
@@ -44,6 +39,22 @@
         job = Job()
         self.assertEqual(job.status, JobStatus.WAITING)
 
+    def test_createMultiple_creates_requested_number_of_jobs(self):
+        job_ids = list(Job.createMultiple(IStore(Job), 3))
+        self.assertEqual(3, len(job_ids))
+        self.assertEqual(3, len(set(job_ids)))
+
+    def test_createMultiple_returns_valid_job_ids(self):
+        job_ids = list(Job.createMultiple(IStore(Job), 3))
+        store = IStore(Job)
+        for job_id in job_ids:
+            self.assertIsNot(None, store.get(Job, job_id))
+
+    def test_createMultiple_sets_status_to_WAITING(self):
+        store = IStore(Job)
+        job = store.get(Job, Job.createMultiple(store, 1)[0])
+        self.assertEqual(JobStatus.WAITING, job.status)
+
     def test_start(self):
         """Job.start should update the object appropriately.
 
@@ -214,8 +225,7 @@
     layer = ZopelessDatabaseLayer
 
     def _sampleData(self):
-        store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR)
-        return list(store.execute(Job.ready_jobs))
+        return list(IStore(Job).execute(Job.ready_jobs))
 
     def test_ready_jobs(self):
         """Job.ready_jobs should include new jobs."""
=== modified file 'lib/lp/soyuz/interfaces/packagecopyjob.py'
--- lib/lp/soyuz/interfaces/packagecopyjob.py 2011-06-03 08:53:14 +0000
+++ lib/lp/soyuz/interfaces/packagecopyjob.py 2011-06-09 10:59:00 +0000
@@ -1,3 +1,4 @@
+<<<<<<< TREE
 # Copyright 2010-2011 Canonical Ltd. This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
@@ -148,3 +149,170 @@
     copy_policy = Choice(
         title=_("Applicable copy policy"),
         values=PackageCopyPolicy, required=True, readonly=True)
+=======
+# Copyright 2010-2011 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+__metaclass__ = type
+
+__all__ = [
+    "IPackageCopyJob",
+    "IPlainPackageCopyJob",
+    "IPlainPackageCopyJobSource",
+    "PackageCopyJobType",
+    ]
+
+from lazr.enum import (
+    DBEnumeratedType,
+    DBItem,
+    )
+from lazr.restful.fields import Reference
+from zope.interface import (
+    Attribute,
+    Interface,
+    )
+from zope.schema import (
+    Bool,
+    Choice,
+    Int,
+    TextLine,
+    )
+
+from canonical.launchpad import _
+from lp.registry.interfaces.distroseries import IDistroSeries
+from lp.services.job.interfaces.job import (
+    IJob,
+    IJobSource,
+    IRunnableJob,
+    )
+from lp.soyuz.enums import PackageCopyPolicy
+from lp.soyuz.interfaces.archive import IArchive
+
+
+class IPackageCopyJob(Interface):
+    """A job that copies packages between `IArchive`s."""
+
+    id = Int(
+        title=_('DB ID'), required=True, readonly=True,
+        description=_("The tracking number for this job."))
+
+    source_archive_id = Int(
+        title=_('Source Archive ID'),
+        required=True, readonly=True)
+
+    source_archive = Reference(
+        schema=IArchive, title=_('Source Archive'),
+        required=True, readonly=True)
+
+    target_archive_id = Int(
+        title=_('Target Archive ID'),
+        required=True, readonly=True)
+
+    target_archive = Reference(
+        schema=IArchive, title=_('Target Archive'),
+        required=True, readonly=True)
+
+    target_distroseries = Reference(
+        schema=IDistroSeries, title=_('Target DistroSeries.'),
+        required=True, readonly=True)
+
+    package_name = TextLine(
+        title=_("Package name"), required=True, readonly=True)
+
+    job = Reference(
+        schema=IJob, title=_('The common Job attributes'),
+        required=True, readonly=True)
+
+    metadata = Attribute('A dict of data about the job.')
+
+
+class PackageCopyJobType(DBEnumeratedType):
+
+    PLAIN = DBItem(1, """
+        Copy packages between archives.
+
+        This job copies one or more packages, optionally including binaries.
+        """)
+
+
+class IPlainPackageCopyJobSource(IJobSource):
+    """An interface for acquiring `IPackageCopyJobs`."""
+
+    def create(package_name, source_archive,
+               target_archive, target_distroseries, target_pocket,
+               include_binaries=False, package_version=None,
+               copy_policy=PackageCopyPolicy.INSECURE):
+        """Create a new `IPlainPackageCopyJob`.
+
+        :param package_name: The name of the source package to copy.
+        :param source_archive: The `IArchive` in which `source_packages` are
+            found.
+        :param target_archive: The `IArchive` to which to copy the packages.
+        :param target_distroseries: The `IDistroSeries` to which to copy the
+            packages.
+        :param target_pocket: The pocket into which to copy the packages. Must
+            be a member of `PackagePublishingPocket`.
+        :param include_binaries: See `do_copy`.
+        :param package_version: The version string for the package version
+            that is to be copied.
+        :param copy_policy: Applicable `PackageCopyPolicy`.
+        """
+
+    def createMultiple(target_distroseries, copy_tasks,
+                       copy_policy=PackageCopyPolicy.INSECURE,
+                       include_binaries=False):
+        """Create multiple new `IPlainPackageCopyJob`s at once.
+
+        :param target_distroseries: The `IDistroSeries` to which to copy the
+            packages.
+        :param copy_tasks: A list of tuples describing the copies to be
+            performed: (package name, package version, source archive,
+            target archive, target pocket).
+        :param copy_policy: Applicable `PackageCopyPolicy`.
+        :param include_binaries: As in `do_copy`.
+        :return: An iterable of `PackageCopyJob` ids.
+        """
+
+    def getActiveJobs(target_archive):
+        """Retrieve all active sync jobs for an archive."""
+
+    def getPendingJobsPerPackage(target_series):
+        """Find pending jobs for each package in `target_series`.
+
+        This is meant for finding jobs that will resolve specific
+        `DistroSeriesDifference`s.
+
+        :param target_series: Target `DistroSeries`; this corresponds to
+            `DistroSeriesDifference.derived_series`.
+        :return: A dict containing as keys the (name, version) tuples for
+            each `DistroSeriesDifference` that has a resolving
+            `PlainPackageCopyJob` pending. Each of these DSDs maps to its
+            oldest pending job. The `version` corresponds to
+            `DistroSeriesDifference.parent_source_version`.
+        """
+
+
+class IPlainPackageCopyJob(IRunnableJob):
+    """A no-frills job to copy packages between `IArchive`s."""
+
+    target_pocket = Int(
+        title=_("Target package publishing pocket"), required=True,
+        readonly=True)
+
+    package_version = TextLine(
+        title=_("Package version"), required=True, readonly=True)
+
+    include_binaries = Bool(
+        title=_("Copy binaries"),
+        required=False, readonly=True)
+
+    def addSourceOverride(override):
+        """Add an `ISourceOverride` to the metadata."""
+
+    def getSourceOverride():
+        """Get an `ISourceOverride` from the metadata."""
+
+    copy_policy = Choice(
+        title=_("Applicable copy policy"),
+        values=PackageCopyPolicy, required=True, readonly=True)
+>>>>>>> MERGE-SOURCE
=== modified file 'lib/lp/soyuz/model/binaryandsourcepackagename.py'
--- lib/lp/soyuz/model/binaryandsourcepackagename.py 2010-12-22 02:48:42 +0000
+++ lib/lp/soyuz/model/binaryandsourcepackagename.py 2011-06-09 10:59:00 +0000
@@ -18,9 +18,7 @@
     BatchedCountableIterator,
     NamedSQLObjectHugeVocabulary,
     )
-from lp.registry.model.sourcepackagename import getSourcePackageDescriptions
 from lp.soyuz.interfaces.binarypackagename import IBinaryAndSourcePackageName
-from lp.soyuz.model.binarypackagename import getBinaryPackageDescriptions
 
 
 class BinaryAndSourcePackageName(SQLBase):
@@ -45,15 +43,7 @@
     """
 
     def getTermsWithDescriptions(self, results):
-        # Note that we grab first source package descriptions and then
-        # binary package descriptions, giving preference to the latter,
-        # via the update() call.
-        descriptions = getSourcePackageDescriptions(results, use_names=True)
-        binary_descriptions = getBinaryPackageDescriptions(results,
-            use_names=True)
-        descriptions.update(binary_descriptions)
-        return [SimpleTerm(obj, obj.name,
-            descriptions.get(obj.name, "Not uploaded"))
+        return [SimpleTerm(obj, obj.name, obj.name)
                 for obj in results]
 
 
=== modified file 'lib/lp/soyuz/model/packagecopyjob.py'
--- lib/lp/soyuz/model/packagecopyjob.py 2011-06-03 09:18:34 +0000
+++ lib/lp/soyuz/model/packagecopyjob.py 2011-06-09 10:59:00 +0000
@@ -23,10 +23,18 @@
23 implements,23 implements,
24 )24 )
2525
26from canonical.database.enumcol import EnumCol26<<<<<<< TREE
27from canonical.launchpad.components.decoratedresultset import (27from canonical.database.enumcol import EnumCol
28 DecoratedResultSet,28from canonical.launchpad.components.decoratedresultset import (
29 )29 DecoratedResultSet,
30 )
31=======
32from canonical.database.enumcol import EnumCol
33from canonical.database.sqlbase import sqlvalues
34from canonical.launchpad.components.decoratedresultset import (
35 DecoratedResultSet,
36 )
37>>>>>>> MERGE-SOURCE
30from canonical.launchpad.interfaces.lpstorm import (38from canonical.launchpad.interfaces.lpstorm import (
31 IMasterStore,39 IMasterStore,
32 IStore,40 IStore,
@@ -67,125 +75,256 @@
6775
68 implements(IPackageCopyJob)76 implements(IPackageCopyJob)
6977
70 __storm_table__ = 'PackageCopyJob'78<<<<<<< TREE
7179 __storm_table__ = 'PackageCopyJob'
72 id = Int(primary=True)80
7381 id = Int(primary=True)
74 job_id = Int(name='job')82
75 job = Reference(job_id, Job.id)83 job_id = Int(name='job')
7684 job = Reference(job_id, Job.id)
77 source_archive_id = Int(name='source_archive')85
78 source_archive = Reference(source_archive_id, Archive.id)86 source_archive_id = Int(name='source_archive')
7987 source_archive = Reference(source_archive_id, Archive.id)
80 target_archive_id = Int(name='target_archive')88
81 target_archive = Reference(target_archive_id, Archive.id)89 target_archive_id = Int(name='target_archive')
8290 target_archive = Reference(target_archive_id, Archive.id)
83 target_distroseries_id = Int(name='target_distroseries')91
84 target_distroseries = Reference(target_distroseries_id, DistroSeries.id)92 target_distroseries_id = Int(name='target_distroseries')
8593 target_distroseries = Reference(target_distroseries_id, DistroSeries.id)
86 package_name = Unicode('package_name')94
87 copy_policy = EnumCol(enum=PackageCopyPolicy)95 package_name = Unicode('package_name')
8896 copy_policy = EnumCol(enum=PackageCopyPolicy)
89 job_type = EnumCol(enum=PackageCopyJobType, notNull=True)97
9098 job_type = EnumCol(enum=PackageCopyJobType, notNull=True)
91 _json_data = Unicode('json_data')99
92100 _json_data = Unicode('json_data')
93 def __init__(self, source_archive, target_archive, target_distroseries,101
94 job_type, metadata, package_name=None, copy_policy=None):102 def __init__(self, source_archive, target_archive, target_distroseries,
95 super(PackageCopyJob, self).__init__()103 job_type, metadata, package_name=None, copy_policy=None):
96 self.job = Job()104 super(PackageCopyJob, self).__init__()
97 self.job_type = job_type105 self.job = Job()
98 self.source_archive = source_archive106 self.job_type = job_type
99 self.target_archive = target_archive107 self.source_archive = source_archive
100 self.target_distroseries = target_distroseries108 self.target_archive = target_archive
101 self.package_name = unicode(package_name)109 self.target_distroseries = target_distroseries
102 self.copy_policy = copy_policy110 self.package_name = unicode(package_name)
103 self._json_data = self.serializeMetadata(metadata)111 self.copy_policy = copy_policy
104112 self._json_data = self.serializeMetadata(metadata)
105 @classmethod113
106 def serializeMetadata(cls, metadata_dict):114 @classmethod
107 """Serialize a dict of metadata into a unicode string."""115 def serializeMetadata(cls, metadata_dict):
108 return simplejson.dumps(metadata_dict).decode('utf-8')116 """Serialize a dict of metadata into a unicode string."""
109117 return simplejson.dumps(metadata_dict).decode('utf-8')
110 @property118
111 def metadata(self):119 @property
112 return simplejson.loads(self._json_data)120 def metadata(self):
113121 return simplejson.loads(self._json_data)
114 def extendMetadata(self, metadata_dict):122
115 """Add metadata_dict to the existing metadata."""123 def extendMetadata(self, metadata_dict):
116 existing = self.metadata124 """Add metadata_dict to the existing metadata."""
117 existing.update(metadata_dict)125 existing = self.metadata
118 self._json_data = self.serializeMetadata(existing)126 existing.update(metadata_dict)
119127 self._json_data = self.serializeMetadata(existing)
120128
121class PackageCopyJobDerived(BaseRunnableJob):129
122 """Abstract class for deriving from PackageCopyJob."""130class PackageCopyJobDerived(BaseRunnableJob):
123131 """Abstract class for deriving from PackageCopyJob."""
124 delegates(IPackageCopyJob)132
125133 delegates(IPackageCopyJob)
126 def __init__(self, job):134
127 self.context = job135 def __init__(self, job):
128136 self.context = job
129 @classmethod137
130 def get(cls, job_id):138 @classmethod
131 """Get a job by id.139 def get(cls, job_id):
132140 """Get a job by id.
133 :return: the PackageCopyJob with the specified id, as the current141
134 PackageCopyJobDerived subclass.142 :return: the PackageCopyJob with the specified id, as the current
135 :raises: NotFoundError if there is no job with the specified id, or143 PackageCopyJobDerived subclass.
136 its job_type does not match the desired subclass.144 :raises: NotFoundError if there is no job with the specified id, or
137 """145 its job_type does not match the desired subclass.
138 job = IStore(PackageCopyJob).get(PackageCopyJob, job_id)146 """
139 if job.job_type != cls.class_job_type:147 job = IStore(PackageCopyJob).get(PackageCopyJob, job_id)
140 raise NotFoundError(148 if job.job_type != cls.class_job_type:
141 'No object found with id %d and type %s' % (job_id,149 raise NotFoundError(
142 cls.class_job_type.title))150 'No object found with id %d and type %s' % (job_id,
143 return cls(job)151 cls.class_job_type.title))
144152 return cls(job)
145 @classmethod153
146 def iterReady(cls):154 @classmethod
147 """Iterate through all ready PackageCopyJobs."""155 def iterReady(cls):
148 jobs = IStore(PackageCopyJob).find(156 """Iterate through all ready PackageCopyJobs."""
149 PackageCopyJob,157 jobs = IStore(PackageCopyJob).find(
150 And(PackageCopyJob.job_type == cls.class_job_type,158 PackageCopyJob,
151 PackageCopyJob.job == Job.id,159 And(PackageCopyJob.job_type == cls.class_job_type,
152 Job.id.is_in(Job.ready_jobs)))160 PackageCopyJob.job == Job.id,
153 return (cls(job) for job in jobs)161 Job.id.is_in(Job.ready_jobs)))
154162 return (cls(job) for job in jobs)
155 def getOopsVars(self):163
156 """See `IRunnableJob`."""164 def getOopsVars(self):
157 vars = super(PackageCopyJobDerived, self).getOopsVars()165 """See `IRunnableJob`."""
158 vars.extend([166 vars = super(PackageCopyJobDerived, self).getOopsVars()
159 ('source_archive_id', self.context.source_archive_id),167 vars.extend([
160 ('target_archive_id', self.context.target_archive_id),168 ('source_archive_id', self.context.source_archive_id),
161 ('target_distroseries_id', self.context.target_distroseries_id),169 ('target_archive_id', self.context.target_archive_id),
162 ('package_copy_job_id', self.context.id),170 ('target_distroseries_id', self.context.target_distroseries_id),
163 ('package_copy_job_type', self.context.job_type.title),171 ('package_copy_job_id', self.context.id),
164 ])172 ('package_copy_job_type', self.context.job_type.title),
165 return vars173 ])
166174 return vars
167 @property175
168 def copy_policy(self):176 @property
169 """See `PlainPackageCopyJob`."""177 def copy_policy(self):
170 return self.context.copy_policy178 """See `PlainPackageCopyJob`."""
171179 return self.context.copy_policy
172180
173class PlainPackageCopyJob(PackageCopyJobDerived):181
174 """Job that copies a package from one archive to another."""182class PlainPackageCopyJob(PackageCopyJobDerived):
175 # This job type serves in different places: it supports copying183 """Job that copies a package from one archive to another."""
176 # packages between archives, but also the syncing of packages from184 # This job type serves in different places: it supports copying
177 # parents into a derived distroseries. We may split these into185 # packages between archives, but also the syncing of packages from
178 # separate types at some point, but for now we (allenap, bigjools,186 # parents into a derived distroseries. We may split these into
179 # jtv) chose to keep it as one.187 # separate types at some point, but for now we (allenap, bigjools,
180188 # jtv) chose to keep it as one.
181 implements(IPlainPackageCopyJob)189
182190 implements(IPlainPackageCopyJob)
183 class_job_type = PackageCopyJobType.PLAIN191
184 classProvides(IPlainPackageCopyJobSource)192 class_job_type = PackageCopyJobType.PLAIN
185193 classProvides(IPlainPackageCopyJobSource)
186 @classmethod194
187 def create(cls, package_name, source_archive,195 @classmethod
196 def create(cls, package_name, source_archive,
197=======
198 __storm_table__ = 'PackageCopyJob'
199
200 id = Int(primary=True)
201
202 job_id = Int(name='job')
203 job = Reference(job_id, Job.id)
204
205 source_archive_id = Int(name='source_archive')
206 source_archive = Reference(source_archive_id, Archive.id)
207
208 target_archive_id = Int(name='target_archive')
209 target_archive = Reference(target_archive_id, Archive.id)
210
211 target_distroseries_id = Int(name='target_distroseries')
212 target_distroseries = Reference(target_distroseries_id, DistroSeries.id)
213
214 package_name = Unicode('package_name')
215 copy_policy = EnumCol(enum=PackageCopyPolicy)
216
217 job_type = EnumCol(enum=PackageCopyJobType, notNull=True)
218
219 _json_data = Unicode('json_data')
220
221 def __init__(self, source_archive, target_archive, target_distroseries,
222 job_type, metadata, package_name=None, copy_policy=None):
223 super(PackageCopyJob, self).__init__()
224 self.job = Job()
225 self.job_type = job_type
226 self.source_archive = source_archive
227 self.target_archive = target_archive
228 self.target_distroseries = target_distroseries
229 self.package_name = unicode(package_name)
230 self.copy_policy = copy_policy
231 self._json_data = self.serializeMetadata(metadata)
232
233 @classmethod
234 def serializeMetadata(cls, metadata_dict):
235 """Serialize a dict of metadata into a unicode string."""
236 return simplejson.dumps(metadata_dict).decode('utf-8')
237
238 @property
239 def metadata(self):
240 return simplejson.loads(self._json_data)
241
242 def extendMetadata(self, metadata_dict):
243 """Add metadata_dict to the existing metadata."""
244 existing = self.metadata
245 existing.update(metadata_dict)
246 self._json_data = self.serializeMetadata(existing)
247
248
249class PackageCopyJobDerived(BaseRunnableJob):
250 """Abstract class for deriving from PackageCopyJob."""
251
252 delegates(IPackageCopyJob)
253
254 def __init__(self, job):
255 self.context = job
256
257 @classmethod
258 def get(cls, job_id):
259 """Get a job by id.
260
261 :return: the PackageCopyJob with the specified id, as the current
262 PackageCopyJobDerived subclass.
263 :raises: NotFoundError if there is no job with the specified id, or
264 its job_type does not match the desired subclass.
265 """
266 job = IStore(PackageCopyJob).get(PackageCopyJob, job_id)
267 if job.job_type != cls.class_job_type:
268 raise NotFoundError(
269 'No object found with id %d and type %s' % (job_id,
270 cls.class_job_type.title))
271 return cls(job)
272
273 @classmethod
274 def iterReady(cls):
275 """Iterate through all ready PackageCopyJobs."""
276 jobs = IStore(PackageCopyJob).find(
277 PackageCopyJob,
278 And(PackageCopyJob.job_type == cls.class_job_type,
279 PackageCopyJob.job == Job.id,
280 Job.id.is_in(Job.ready_jobs)))
281 return (cls(job) for job in jobs)
282
283 def getOopsVars(self):
284 """See `IRunnableJob`."""
285 vars = super(PackageCopyJobDerived, self).getOopsVars()
286 vars.extend([
287 ('source_archive_id', self.context.source_archive_id),
288 ('target_archive_id', self.context.target_archive_id),
289 ('target_distroseries_id', self.context.target_distroseries_id),
290 ('package_copy_job_id', self.context.id),
291 ('package_copy_job_type', self.context.job_type.title),
292 ])
293 return vars
294
295 @property
296 def copy_policy(self):
297 """See `PlainPackageCopyJob`."""
298 return self.context.copy_policy
299
300
301class PlainPackageCopyJob(PackageCopyJobDerived):
302 """Job that copies a package from one archive to another."""
303 # This job type serves in different places: it supports copying
304 # packages between archives, but also the syncing of packages from
305 # parents into a derived distroseries. We may split these into
306 # separate types at some point, but for now we (allenap, bigjools,
307 # jtv) chose to keep it as one.
308
309 implements(IPlainPackageCopyJob)
310
311 class_job_type = PackageCopyJobType.PLAIN
312 classProvides(IPlainPackageCopyJobSource)
313
314 @classmethod
315 def _makeMetadata(cls, target_pocket, package_version, include_binaries):
316 """."""
317 return {
318 'target_pocket': target_pocket.value,
319 'package_version': package_version,
320 'include_binaries': bool(include_binaries),
321 }
322
323 @classmethod
324 def create(cls, package_name, source_archive,
325>>>>>>> MERGE-SOURCE
326                target_archive, target_distroseries, target_pocket,
327<<<<<<< TREE
328                include_binaries=False, package_version=None,
329                copy_policy=PackageCopyPolicy.INSECURE):
330        """See `IPlainPackageCopyJobSource`."""
@@ -204,43 +343,149 @@
343            copy_policy=copy_policy,
344            metadata=metadata)
345        IMasterStore(PackageCopyJob).add(job)
346=======
347 include_binaries=False, package_version=None,
348 copy_policy=PackageCopyPolicy.INSECURE):
349 """See `IPlainPackageCopyJobSource`."""
350 assert package_version is not None, "No package version specified."
351 metadata = cls._makeMetadata(
352 target_pocket, package_version, include_binaries)
353 job = PackageCopyJob(
354 job_type=cls.class_job_type,
355 source_archive=source_archive,
356 target_archive=target_archive,
357 target_distroseries=target_distroseries,
358 package_name=package_name,
359 copy_policy=copy_policy,
360 metadata=metadata)
361 IMasterStore(PackageCopyJob).add(job)
362>>>>>>> MERGE-SOURCE
363        return cls(job)
364
365    @classmethod
366<<<<<<< TREE
367    def getActiveJobs(cls, target_archive):
368        """See `IPlainPackageCopyJobSource`."""
369        jobs = IStore(PackageCopyJob).find(
370            PackageCopyJob,
371            PackageCopyJob.job_type == cls.class_job_type,
372            PackageCopyJob.target_archive == target_archive,
373            Job.id == PackageCopyJob.job_id,
374            Job._status == JobStatus.WAITING)
375        jobs = jobs.order_by(PackageCopyJob.id)
376        return DecoratedResultSet(jobs, cls)
377
378    @classmethod
379    def getPendingJobsForTargetSeries(cls, target_series):
380        """Get upcoming jobs for `target_series`, ordered by age."""
381        raw_jobs = IStore(PackageCopyJob).find(
382            PackageCopyJob,
383            Job.id == PackageCopyJob.job_id,
384            PackageCopyJob.job_type == cls.class_job_type,
385            PackageCopyJob.target_distroseries == target_series,
386            Job._status.is_in(Job.PENDING_STATUSES))
387        raw_jobs = raw_jobs.order_by(PackageCopyJob.id)
388        return DecoratedResultSet(raw_jobs, cls)
389
390    @classmethod
391    def getPendingJobsPerPackage(cls, target_series):
392        """See `IPlainPackageCopyJobSource`."""
393        result = {}
394        # Go through jobs in-order, picking the first matching job for
395        # any (package, version) tuple. Because of how
396        # getPendingJobsForTargetSeries orders its results, the first
397        # will be the oldest and thus presumably the first to finish.
398        for job in cls.getPendingJobsForTargetSeries(target_series):
399            result.setdefault(job.package_name, job)
400 return result
401=======
402 def _composeJobInsertionTuple(cls, target_distroseries, copy_policy,
403 include_binaries, job_id, copy_task):
404 """Create an SQL fragment for inserting a job into the database.
405
406 :return: A string representing an SQL tuple containing initializers
407 for a `PackageCopyJob` in the database (minus `id`, which is
408 assigned automatically). Contents are escaped for use in SQL.
409 """
410 (
411 package_name,
412 package_version,
413 source_archive,
414 target_archive,
415 target_pocket,
416 ) = copy_task
417 metadata = cls._makeMetadata(
418 target_pocket, package_version, include_binaries)
419 data = (
420 cls.class_job_type, target_distroseries, copy_policy,
421 source_archive, target_archive, package_name, job_id,
422 PackageCopyJob.serializeMetadata(metadata))
423 format_string = "(%s)" % ", ".join(["%s"] * len(data))
424 return format_string % sqlvalues(*data)
425
426 @classmethod
427 def createMultiple(cls, target_distroseries, copy_tasks,
428 copy_policy=PackageCopyPolicy.INSECURE,
429 include_binaries=False):
430 """See `IPlainPackageCopyJobSource`."""
431 store = IMasterStore(Job)
432 job_ids = Job.createMultiple(store, len(copy_tasks))
433 job_contents = [
434 cls._composeJobInsertionTuple(
435 target_distroseries, copy_policy, include_binaries, job_id,
436 task)
437 for job_id, task in zip(job_ids, copy_tasks)]
438 result = store.execute("""
439 INSERT INTO PackageCopyJob (
440 job_type,
441 target_distroseries,
442 copy_policy,
443 source_archive,
444 target_archive,
445 package_name,
446 job,
447 json_data)
448 VALUES %s
449 RETURNING id
450 """ % ", ".join(job_contents))
451 return [job_id for job_id, in result]
452
453 @classmethod
454 def getActiveJobs(cls, target_archive):
455 """See `IPlainPackageCopyJobSource`."""
456 jobs = IStore(PackageCopyJob).find(
457 PackageCopyJob,
458 PackageCopyJob.job_type == cls.class_job_type,
459 PackageCopyJob.target_archive == target_archive,
460 Job.id == PackageCopyJob.job_id,
461 Job._status == JobStatus.WAITING)
462 jobs = jobs.order_by(PackageCopyJob.id)
463 return DecoratedResultSet(jobs, cls)
464
465 @classmethod
466 def getPendingJobsForTargetSeries(cls, target_series):
467 """Get upcoming jobs for `target_series`, ordered by age."""
468 raw_jobs = IStore(PackageCopyJob).find(
469 PackageCopyJob,
470 Job.id == PackageCopyJob.job_id,
471 PackageCopyJob.job_type == cls.class_job_type,
472 PackageCopyJob.target_distroseries == target_series,
473 Job._status.is_in(Job.PENDING_STATUSES))
474 raw_jobs = raw_jobs.order_by(PackageCopyJob.id)
475 return DecoratedResultSet(raw_jobs, cls)
476
477 @classmethod
478 def getPendingJobsPerPackage(cls, target_series):
479 """See `IPlainPackageCopyJobSource`."""
480 result = {}
481 # Go through jobs in-order, picking the first matching job for
482 # any (package, version) tuple. Because of how
483 # getPendingJobsForTargetSeries orders its results, the first
484 # will be the oldest and thus presumably the first to finish.
485 for job in cls.getPendingJobsForTargetSeries(target_series):
486 result.setdefault(job.package_name, job)
487 return result
488>>>>>>> MERGE-SOURCE
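
For review context, a sketch (not part of the diff) of roughly the one statement that createMultiple() above composes for two invented copy tasks. The column list and RETURNING clause are verbatim from the code; the literal values are made-up stand-ins, and in the real method sqlvalues() escapes every field inside each _composeJobInsertionTuple() fragment.

# Sketch only: the approximate SQL createMultiple() sends for two
# hypothetical tasks.  All values shown here are invented.
SKETCHED_STATEMENT = """
    INSERT INTO PackageCopyJob (
        job_type,
        target_distroseries,
        copy_policy,
        source_archive,
        target_archive,
        package_name,
        job,
        json_data)
    VALUES
        (1, 5, 1, 11, 12, 'pkg-one', 101,
         '{"target_pocket": 0, "package_version": "1.0-1",
           "include_binaries": false}'),
        (1, 5, 1, 13, 12, 'pkg-two', 102,
         '{"target_pocket": 0, "package_version": "2.0-1",
           "include_binaries": false}')
    RETURNING id
"""
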
489
490    @property
491    def target_pocket(self):
492
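
And a minimal usage sketch of the new bulk interface, assuming hypothetical parent_archive, derived_archive, and derived_series objects; the tuple layout follows _composeJobInsertionTuple() above and the new test below.

# Hypothetical call: one PackageCopyJob row per task, one INSERT total.
from zope.component import getUtility

from lp.registry.interfaces.pocket import PackagePublishingPocket
from lp.soyuz.enums import PackageCopyPolicy
from lp.soyuz.interfaces.packagecopyjob import IPlainPackageCopyJobSource

job_source = getUtility(IPlainPackageCopyJobSource)
copy_tasks = [
    # (package name, version, source archive, target archive, pocket)
    ("pkg-one", "1.0-1", parent_archive, derived_archive,
     PackagePublishingPocket.RELEASE),
    ("pkg-two", "2.0-1", parent_archive, derived_archive,
     PackagePublishingPocket.UPDATES),
    ]
job_ids = job_source.createMultiple(
    derived_series, copy_tasks, copy_policy=PackageCopyPolicy.MASS_SYNC)
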
=== modified file 'lib/lp/soyuz/tests/test_packagecopyjob.py'
--- lib/lp/soyuz/tests/test_packagecopyjob.py 2011-06-03 08:53:14 +0000
+++ lib/lp/soyuz/tests/test_packagecopyjob.py 2011-06-09 10:59:00 +0000
@@ -116,7 +116,55 @@
116        self.assertEqual("foo", job.package_name)
117        self.assertEqual("1.0-1", job.package_version)
118        self.assertEquals(False, job.include_binaries)
119<<<<<<< TREE
120 self.assertEquals(PackageCopyPolicy.MASS_SYNC, job.copy_policy)
121=======
122 self.assertEquals(PackageCopyPolicy.MASS_SYNC, job.copy_policy)
123
124 def test_createMultiple_creates_one_job_per_copy(self):
125 mother = self.factory.makeDistroSeriesParent()
126 derived_series = mother.derived_series
127 father = self.factory.makeDistroSeriesParent(
128 derived_series=derived_series)
129 mother_package = self.factory.makeSourcePackageName()
130 father_package = self.factory.makeSourcePackageName()
131 job_source = getUtility(IPlainPackageCopyJobSource)
132 copy_tasks = [
133 (
134 mother_package.name,
135 "1.5mother1",
136 mother.parent_series.main_archive,
137 derived_series.main_archive,
138 PackagePublishingPocket.RELEASE,
139 ),
140 (
141 father_package.name,
142 "0.9father1",
143 father.parent_series.main_archive,
144 derived_series.main_archive,
145 PackagePublishingPocket.UPDATES,
146 ),
147 ]
148 job_ids = list(
149 job_source.createMultiple(mother.derived_series, copy_tasks))
150 jobs = list(job_source.getActiveJobs(derived_series.main_archive))
151 self.assertContentEqual(job_ids, [job.id for job in jobs])
152 self.assertEqual(len(copy_tasks), len(set([job.job for job in jobs])))
153 # Get jobs into the same order as copy_tasks, for ease of
154 # comparison.
155 if jobs[0].package_name != mother_package.name:
156 jobs = reversed(jobs)
157 requested_copies = [
158 (
159 job.package_name,
160 job.package_version,
161 job.source_archive,
162 job.target_archive,
163 job.target_pocket,
164 )
165 for job in jobs]
166 self.assertEqual(copy_tasks, requested_copies)
167>>>>>>> MERGE-SOURCE
168
169    def test_getActiveJobs(self):
170        # getActiveJobs() can retrieve all active jobs for an archive.
@@ -320,202 +368,404 @@
368        copied_source_package = archive2.getPublishedSources(
369            name="libc", version="2.8-1", exact_match=True).first()
370        self.assertIsNot(copied_source_package, None)
371<<<<<<< TREE
372
373    def test___repr__(self):
374        distroseries = self.factory.makeDistroSeries()
375        archive1 = self.factory.makeArchive(distroseries.distribution)
376        archive2 = self.factory.makeArchive(distroseries.distribution)
377        source = getUtility(IPlainPackageCopyJobSource)
378        job = source.create(
379            package_name="foo", source_archive=archive1,
380            target_archive=archive2, target_distroseries=distroseries,
381            target_pocket=PackagePublishingPocket.RELEASE,
382            package_version="1.0-1", include_binaries=True)
383        self.assertEqual(
384            ("<PlainPackageCopyJob to copy package foo from "
385             "{distroseries.distribution.name}/{archive1.name} to "
386             "{distroseries.distribution.name}/{archive2.name}, "
387             "RELEASE pocket, in {distroseries.distribution.name} "
388             "{distroseries.name}, including binaries>").format(
389                distroseries=distroseries, archive1=archive1,
390                archive2=archive2),
391            repr(job))
392
393    def test_getPendingJobsPerPackage_finds_jobs(self):
394        # getPendingJobsPerPackage finds jobs, and the packages they
395        # belong to.
396        dsd = self.factory.makeDistroSeriesDifference()
397        job = self.makeJob(dsd)
398        job_source = getUtility(IPlainPackageCopyJobSource)
399        self.assertEqual(
400            {dsd.source_package_name.name: job},
401            job_source.getPendingJobsPerPackage(dsd.derived_series))
402
403    def test_getPendingJobsPerPackage_ignores_other_distroseries(self):
404        # getPendingJobsPerPackage only looks for jobs on the indicated
405        # distroseries.
406        self.makeJob()
407        other_series = self.factory.makeDistroSeries()
408        job_source = getUtility(IPlainPackageCopyJobSource)
409        self.assertEqual(
410            {}, job_source.getPendingJobsPerPackage(other_series))
411
412    def test_getPendingJobsPerPackage_only_returns_pending_jobs(self):
413        # getPendingJobsPerPackage ignores jobs that have already been
414        # run.
415        dsd = self.factory.makeDistroSeriesDifference()
416        job = self.makeJob(dsd)
417        job_source = getUtility(IPlainPackageCopyJobSource)
418        found_by_state = {}
419        for status in JobStatus.items:
420            removeSecurityProxy(job).job._status = status
421            result = job_source.getPendingJobsPerPackage(dsd.derived_series)
422            if len(result) > 0:
423                found_by_state[status] = result[dsd.source_package_name.name]
424        expected = {
425            JobStatus.WAITING: job,
426            JobStatus.RUNNING: job,
427            JobStatus.SUSPENDED: job,
428            }
429        self.assertEqual(expected, found_by_state)
430
431    def test_getPendingJobsPerPackage_distinguishes_jobs(self):
432        # getPendingJobsPerPackage associates the right job with the
433        # right package.
434        derived_series = self.factory.makeDistroSeries()
435        dsds = [
436            self.factory.makeDistroSeriesDifference(
437                derived_series=derived_series)
438            for counter in xrange(2)]
439        jobs = map(self.makeJob, dsds)
440        job_source = getUtility(IPlainPackageCopyJobSource)
441        self.assertEqual(
442            dict(zip([dsd.source_package_name.name for dsd in dsds], jobs)),
443            job_source.getPendingJobsPerPackage(derived_series))
444
445    def test_getPendingJobsPerPackage_picks_oldest_job_for_dsd(self):
446        # If there are multiple jobs for one package,
447        # getPendingJobsPerPackage picks the oldest.
448        dsd = self.factory.makeDistroSeriesDifference()
449        jobs = [self.makeJob(dsd) for counter in xrange(2)]
450        job_source = getUtility(IPlainPackageCopyJobSource)
451        self.assertEqual(
452            {dsd.source_package_name.name: jobs[0]},
453            job_source.getPendingJobsPerPackage(dsd.derived_series))
454
455    def test_getPendingJobsPerPackage_ignores_dsds_without_jobs(self):
456        # getPendingJobsPerPackage produces no dict entry for packages
457        # that have no pending jobs, even if they do have DSDs.
458        dsd = self.factory.makeDistroSeriesDifference()
459        job_source = getUtility(IPlainPackageCopyJobSource)
460        self.assertEqual(
461            {}, job_source.getPendingJobsPerPackage(dsd.derived_series))
462
463    def test_findMatchingDSDs_matches_all_DSDs_for_job(self):
464        # findMatchingDSDs finds matching DSDs for any of the packages
465        # in the job.
466        dsd = self.factory.makeDistroSeriesDifference()
467        naked_job = removeSecurityProxy(self.makeJob(dsd))
468        self.assertContentEqual([dsd], naked_job.findMatchingDSDs())
469
470    def test_findMatchingDSDs_ignores_other_source_series(self):
471        # findMatchingDSDs tries to ignore DSDs that are for different
472        # parent series than the job's source series. (This can't be
473        # done with perfect precision because the job doesn't keep track
474        # of source distroseries, but in practice it should be good
475        # enough).
476        dsd = self.factory.makeDistroSeriesDifference()
477        naked_job = removeSecurityProxy(self.makeJob(dsd))
478
479        # If the dsd differs only in parent series, that's enough to
480        # make it a non-match.
481        removeSecurityProxy(dsd).parent_series = (
482            self.factory.makeDistroSeries())
483
484        self.assertContentEqual([], naked_job.findMatchingDSDs())
485
486    def test_findMatchingDSDs_ignores_other_packages(self):
487        # findMatchingDSDs does not return DSDs that are similar to the
488        # information in the job, but are for different packages.
489        dsd = self.factory.makeDistroSeriesDifference()
490        self.factory.makeDistroSeriesDifference(
491            derived_series=dsd.derived_series,
492            parent_series=dsd.parent_series)
493        naked_job = removeSecurityProxy(self.makeJob(dsd))
494        self.assertContentEqual([dsd], naked_job.findMatchingDSDs())
495
496    def test_addSourceOverride(self):
497        # Test the addOverride method which adds an ISourceOverride to the
498        # metadata.
499        name = self.factory.makeSourcePackageName()
500        component = self.factory.makeComponent()
501        section=self.factory.makeSection()
502        pcj = self.factory.makePlainPackageCopyJob()
503        self.layer.txn.commit()
504        self.layer.switchDbUser('sync_packages')
505
506        override = SourceOverride(
507            source_package_name=name,
508            component=component,
509            section=section)
510        pcj.addSourceOverride(override)
511
512        metadata_component = getUtility(
513            IComponentSet)[pcj.metadata["component_override"]]
514        metadata_section = getUtility(
515            ISectionSet)[pcj.metadata["section_override"]]
516        matcher = MatchesStructure(
517            component=Equals(metadata_component),
518            section=Equals(metadata_section))
519        self.assertThat(override, matcher)
520
521    def test_getSourceOverride(self):
522        # Test the getSourceOverride which gets an ISourceOverride from
523        # the metadata.
524        name = self.factory.makeSourcePackageName()
525        component = self.factory.makeComponent()
526        section=self.factory.makeSection()
527        pcj = self.factory.makePlainPackageCopyJob(
528            package_name=name.name, package_version="1.0")
529        self.layer.txn.commit()
530        self.layer.switchDbUser('sync_packages')
531
532        override = SourceOverride(
533            source_package_name=name,
534            component=component,
535            section=section)
536        pcj.addSourceOverride(override)
537
538        self.assertEqual(override, pcj.getSourceOverride())
539
540    def test_getPolicyImplementation_returns_policy(self):
541        # getPolicyImplementation returns the ICopyPolicy that was
542        # chosen for the job.
543        dsd = self.factory.makeDistroSeriesDifference()
544        for policy in PackageCopyPolicy.items:
545            naked_job = removeSecurityProxy(
546                self.makeJob(dsd, copy_policy=policy))
547            self.assertEqual(
548                policy, naked_job.getPolicyImplementation().enum_value)
549
550
551class TestPlainPackageCopyJobPrivileges(TestCaseWithFactory, LocalTestHelper):
552    """Test that `PlainPackageCopyJob` has the privileges it needs.
553
554    This test looks for errors, not failures. It's here only to see that
555    these operations don't run into any privilege limitations.
556    """
557
558    layer = LaunchpadZopelessLayer
559
560    def test_findMatchingDSDs(self):
561        job = self.makeJob()
562        transaction.commit()
563        self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser)
564        removeSecurityProxy(job).findMatchingDSDs()
565
566    def test_reportFailure(self):
567        job = self.makeJob()
568        transaction.commit()
569        self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser)
570 removeSecurityProxy(job).reportFailure(CannotCopy("Mommy it hurts"))
571=======
572
573 def test___repr__(self):
574 distroseries = self.factory.makeDistroSeries()
575 archive1 = self.factory.makeArchive(distroseries.distribution)
576 archive2 = self.factory.makeArchive(distroseries.distribution)
577 source = getUtility(IPlainPackageCopyJobSource)
578 job = source.create(
579 package_name="foo", source_archive=archive1,
580 target_archive=archive2, target_distroseries=distroseries,
581 target_pocket=PackagePublishingPocket.RELEASE,
582 package_version="1.0-1", include_binaries=True)
583 self.assertEqual(
584 ("<PlainPackageCopyJob to copy package foo from "
585 "{distroseries.distribution.name}/{archive1.name} to "
586 "{distroseries.distribution.name}/{archive2.name}, "
587 "RELEASE pocket, in {distroseries.distribution.name} "
588 "{distroseries.name}, including binaries>").format(
589 distroseries=distroseries, archive1=archive1,
590 archive2=archive2),
591 repr(job))
592
593 def test_getPendingJobsPerPackage_finds_jobs(self):
594 # getPendingJobsPerPackage finds jobs, and the packages they
595 # belong to.
596 dsd = self.factory.makeDistroSeriesDifference()
597 job = self.makeJob(dsd)
598 job_source = getUtility(IPlainPackageCopyJobSource)
599 self.assertEqual(
600 {dsd.source_package_name.name: job},
601 job_source.getPendingJobsPerPackage(dsd.derived_series))
602
603 def test_getPendingJobsPerPackage_ignores_other_distroseries(self):
604 # getPendingJobsPerPackage only looks for jobs on the indicated
605 # distroseries.
606 self.makeJob()
607 other_series = self.factory.makeDistroSeries()
608 job_source = getUtility(IPlainPackageCopyJobSource)
609 self.assertEqual(
610 {}, job_source.getPendingJobsPerPackage(other_series))
611
612 def test_getPendingJobsPerPackage_only_returns_pending_jobs(self):
613 # getPendingJobsPerPackage ignores jobs that have already been
614 # run.
615 dsd = self.factory.makeDistroSeriesDifference()
616 job = self.makeJob(dsd)
617 job_source = getUtility(IPlainPackageCopyJobSource)
618 found_by_state = {}
619 for status in JobStatus.items:
620 removeSecurityProxy(job).job._status = status
621 result = job_source.getPendingJobsPerPackage(dsd.derived_series)
622 if len(result) > 0:
623 found_by_state[status] = result[dsd.source_package_name.name]
624 expected = {
625 JobStatus.WAITING: job,
626 JobStatus.RUNNING: job,
627 JobStatus.SUSPENDED: job,
628 }
629 self.assertEqual(expected, found_by_state)
630
631 def test_getPendingJobsPerPackage_distinguishes_jobs(self):
632 # getPendingJobsPerPackage associates the right job with the
633 # right package.
634 derived_series = self.factory.makeDistroSeries()
635 dsds = [
636 self.factory.makeDistroSeriesDifference(
637 derived_series=derived_series)
638 for counter in xrange(2)]
639 jobs = map(self.makeJob, dsds)
640 job_source = getUtility(IPlainPackageCopyJobSource)
641 self.assertEqual(
642 dict(zip([dsd.source_package_name.name for dsd in dsds], jobs)),
643 job_source.getPendingJobsPerPackage(derived_series))
644
645 def test_getPendingJobsPerPackage_picks_oldest_job_for_dsd(self):
646 # If there are multiple jobs for one package,
647 # getPendingJobsPerPackage picks the oldest.
648 dsd = self.factory.makeDistroSeriesDifference()
649 jobs = [self.makeJob(dsd) for counter in xrange(2)]
650 job_source = getUtility(IPlainPackageCopyJobSource)
651 self.assertEqual(
652 {dsd.source_package_name.name: jobs[0]},
653 job_source.getPendingJobsPerPackage(dsd.derived_series))
654
655 def test_getPendingJobsPerPackage_ignores_dsds_without_jobs(self):
656 # getPendingJobsPerPackage produces no dict entry for packages
657 # that have no pending jobs, even if they do have DSDs.
658 dsd = self.factory.makeDistroSeriesDifference()
659 job_source = getUtility(IPlainPackageCopyJobSource)
660 self.assertEqual(
661 {}, job_source.getPendingJobsPerPackage(dsd.derived_series))
662
663 def test_findMatchingDSDs_matches_all_DSDs_for_job(self):
664 # findMatchingDSDs finds matching DSDs for any of the packages
665 # in the job.
666 dsd = self.factory.makeDistroSeriesDifference()
667 naked_job = removeSecurityProxy(self.makeJob(dsd))
668 self.assertContentEqual([dsd], naked_job.findMatchingDSDs())
669
670 def test_findMatchingDSDs_ignores_other_source_series(self):
671 # findMatchingDSDs tries to ignore DSDs that are for different
672 # parent series than the job's source series. (This can't be
673 # done with perfect precision because the job doesn't keep track
674 # of source distroseries, but in practice it should be good
675 # enough).
676 dsd = self.factory.makeDistroSeriesDifference()
677 naked_job = removeSecurityProxy(self.makeJob(dsd))
678
679 # If the dsd differs only in parent series, that's enough to
680 # make it a non-match.
681 removeSecurityProxy(dsd).parent_series = (
682 self.factory.makeDistroSeries())
683
684 self.assertContentEqual([], naked_job.findMatchingDSDs())
685
686 def test_findMatchingDSDs_ignores_other_packages(self):
687 # findMatchingDSDs does not return DSDs that are similar to the
688 # information in the job, but are for different packages.
689 dsd = self.factory.makeDistroSeriesDifference()
690 self.factory.makeDistroSeriesDifference(
691 derived_series=dsd.derived_series,
692 parent_series=dsd.parent_series)
693 naked_job = removeSecurityProxy(self.makeJob(dsd))
694 self.assertContentEqual([dsd], naked_job.findMatchingDSDs())
695
696 def test_addSourceOverride(self):
697 # Test the addOverride method which adds an ISourceOverride to the
698 # metadata.
699 name = self.factory.makeSourcePackageName()
700 component = self.factory.makeComponent()
701 section = self.factory.makeSection()
702 pcj = self.factory.makePlainPackageCopyJob()
703 self.layer.txn.commit()
704 self.layer.switchDbUser('sync_packages')
705
706 override = SourceOverride(
707 source_package_name=name,
708 component=component,
709 section=section)
710 pcj.addSourceOverride(override)
711
712 metadata_component = getUtility(
713 IComponentSet)[pcj.metadata["component_override"]]
714 metadata_section = getUtility(
715 ISectionSet)[pcj.metadata["section_override"]]
716 matcher = MatchesStructure(
717 component=Equals(metadata_component),
718 section=Equals(metadata_section))
719 self.assertThat(override, matcher)
720
721 def test_getSourceOverride(self):
722 # Test the getSourceOverride which gets an ISourceOverride from
723 # the metadata.
724 name = self.factory.makeSourcePackageName()
725 component = self.factory.makeComponent()
726 section = self.factory.makeSection()
727 pcj = self.factory.makePlainPackageCopyJob(
728 package_name=name.name, package_version="1.0")
729 self.layer.txn.commit()
730 self.layer.switchDbUser('sync_packages')
731
732 override = SourceOverride(
733 source_package_name=name,
734 component=component,
735 section=section)
736 pcj.addSourceOverride(override)
737
738 self.assertEqual(override, pcj.getSourceOverride())
739
740 def test_getPolicyImplementation_returns_policy(self):
741 # getPolicyImplementation returns the ICopyPolicy that was
742 # chosen for the job.
743 dsd = self.factory.makeDistroSeriesDifference()
744 for policy in PackageCopyPolicy.items:
745 naked_job = removeSecurityProxy(
746 self.makeJob(dsd, copy_policy=policy))
747 self.assertEqual(
748 policy, naked_job.getPolicyImplementation().enum_value)
749
750
751class TestPlainPackageCopyJobPrivileges(TestCaseWithFactory, LocalTestHelper):
752 """Test that `PlainPackageCopyJob` has the privileges it needs.
753
754 This test looks for errors, not failures. It's here only to see that
755 these operations don't run into any privilege limitations.
756 """
757
758 layer = LaunchpadZopelessLayer
759
760 def test_findMatchingDSDs(self):
761 job = self.makeJob()
762 transaction.commit()
763 self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser)
764 removeSecurityProxy(job).findMatchingDSDs()
765
766 def test_reportFailure(self):
767 job = self.makeJob()
768 transaction.commit()
769 self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser)
770 removeSecurityProxy(job).reportFailure(CannotCopy("Mommy it hurts"))
771>>>>>>> MERGE-SOURCE
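
Lastly, an illustration (invented names, no Launchpad objects) of the dict.setdefault() idiom that getPendingJobsPerPackage() relies on in both branches of the conflict: the jobs iterate oldest-first, so the first job recorded for a package wins and later ones for the same package are ignored.

# Illustration of the setdefault() pattern in getPendingJobsPerPackage().
result = {}
pending = [("foo", "job-1"), ("foo", "job-2"), ("bar", "job-3")]
for package_name, job in pending:
    # Only the first (oldest) job per package is kept.
    result.setdefault(package_name, job)
assert result == {"foo": "job-1", "bar": "job-3"}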
