Merge lp:~jtv/launchpad/db-bug-793382 into lp:launchpad/db-devel
Status: Merged
Approved by: Jeroen T. Vermeulen
Approved revision: no longer in the source branch.
Merged at revision: 10661
Proposed branch: lp:~jtv/launchpad/db-bug-793382
Merge into: lp:launchpad/db-devel
Diff against target: 2150 lines (+1395/-402) (has conflicts), 14 files modified
  lib/canonical/launchpad/doc/vocabularies.txt (+7/-7)
  lib/canonical/launchpad/pagetests/standalone/xx-opstats.txt (+38/-16)
  lib/canonical/launchpad/webapp/publication.py (+11/-1)
  lib/canonical/launchpad/webapp/tests/test_haproxy.py (+9/-1)
  lib/lp/registry/browser/distroseries.py (+56/-0)
  lib/lp/registry/browser/tests/test_distroseries.py (+134/-0)
  lib/lp/registry/stories/distroseries/xx-distroseries-index.txt (+48/-0)
  lib/lp/registry/templates/distroseries-details.pt (+33/-0)
  lib/lp/services/job/model/job.py (+21/-2)
  lib/lp/services/job/tests/test_job.py (+18/-8)
  lib/lp/soyuz/interfaces/packagecopyjob.py (+168/-0)
  lib/lp/soyuz/model/binaryandsourcepackagename.py (+1/-11)
  lib/lp/soyuz/model/packagecopyjob.py (+401/-156)
  lib/lp/soyuz/tests/test_packagecopyjob.py (+450/-200)
  Text conflict in lib/lp/registry/browser/distroseries.py
  Text conflict in lib/lp/registry/browser/tests/test_distroseries.py
  Text conflict in lib/lp/registry/stories/distroseries/xx-distroseries-index.txt
  Text conflict in lib/lp/registry/templates/distroseries-details.pt
  Text conflict in lib/lp/soyuz/interfaces/packagecopyjob.py
  Text conflict in lib/lp/soyuz/model/packagecopyjob.py
  Text conflict in lib/lp/soyuz/tests/test_packagecopyjob.py
To merge this branch: bzr merge lp:~jtv/launchpad/db-bug-793382
Related bugs:
Reviewer: Henning Eggers (community), status: Approve
Review via email: mp+63545@code.launchpad.net
Commit message
[r=henninge][bug=793382] Bring requestUpgrades back to constant query count.
Description of the change
= Summary =
Now that we no longer bundle multiple packages into one PackageCopyJob, the DistroSeries upgrade request creates a separate job for every package being copied, so the number of database queries grew with the number of upgrades (bug 793382). This branch brings requestUpgrades back to a constant query count.
== Proposed fix ==
Provide interfaces for creating multiple Jobs and PlainPackageCopyJobs in bulk, so that scheduling any number of copies takes a fixed number of INSERT statements rather than two per package.
Actually it should be possible for Storm to bundle the INSERTs in this way, but that would take quite some work to get right, and it might become fragile if one INSERT ends up being flushed to the database before the next object can be added to the store.
In the IPlainPackageCopyJobSource utility, createMultiple takes a list of copy tasks and creates all the corresponding Jobs and PackageCopyJobs in bulk, returning the ids of the new jobs.
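
As a minimal sketch of the Job-level half (it mirrors the Job.createMultiple classmethod in the diff below; the standalone helper name is purely illustrative), all the Job rows come from a single multi-row INSERT ... RETURNING:

{{{
# Sketch of the bulk Job creation, mirroring Job.createMultiple in the diff
# below; the helper name create_jobs_in_bulk is illustrative only.
from canonical.database.sqlbase import quote
from lp.services.job.interfaces.job import JobStatus


def create_jobs_in_bulk(store, num_jobs):
    """Create num_jobs WAITING Job rows with one INSERT, returning their ids."""
    values = ", ".join(["(%s)" % quote(JobStatus.WAITING)] * num_jobs)
    result = store.execute(
        "INSERT INTO Job (status) VALUES %s RETURNING id" % values)
    return [job_id for (job_id,) in result]
}}}

A single statement also sidesteps the flush-ordering fragility mentioned above.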
== Pre-implementation notes ==
Gavin came up with the idea of having a "create multiple jobs" interface in a utility class. No other interesting ideas came up when discussing the idea on IRC.
== Implementation details ==
The Job multi-create method went onto the Job class itself, as a classmethod. There is an IJobSource interface, but that was not suitable here for several reasons. First, it's implemented in many places that may not have any need for a multi-create; requiring this method would cause unneeded hardship. Second, IJobSource is not a utility for Jobs. Instead it's more the generic base interface for classes built around Job, which may or may not live in separate tables, so it would be ambiguous what kind of ids the method would return. Finally, the job-specific creation methods will have their own signatures, which wouldn't work very well with having them all come from the same interface definition.
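
The job-specific entry point therefore keeps its own signature. For example, the browser view builds a list of copy tuples and hands them all to IPlainPackageCopyJobSource.createMultiple in a single call; this excerpt is taken from requestUpgrades in the diff below:

{{{
# Excerpt from requestUpgrades in lib/lp/registry/browser/distroseries.py
# (see the diff below); self is the DistroSeries differences view.
copies = [
    (
        dsd.source_package_name.name,
        dsd.parent_source_version,
        dsd.parent_series.main_archive,
        target_distroseries.main_archive,
        PackagePublishingPocket.RELEASE,
    )
    for dsd in self.getUpgrades()]
getUtility(IPlainPackageCopyJobSource).createMultiple(
    target_distroseries, copies,
    copy_policy=PackageCopyPolicy.MASS_SYNC)
}}}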
You'll notice in the performance test that even though the query count no longer increases by 2 per copy, the query count for the base case went up by 2. I haven't tracked down where those 2 queries come from; it may simply be an accounting glitch where the two INSERTs used to be deferred beyond the point where the query count is collected. That deferral is not possible with the new code, which would explain the difference.
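
The performance test pins this down with a query-count recorder; condensed from test_requestUpgrades_is_efficient in the diff below:

{{{
# Condensed from test_requestUpgrades_is_efficient (see the diff below).
self.makePackageUpgrade(derived_series=derived_series)
flush_database_caches()
with StormStatementRecorder() as recorder1:
    self.makeView(derived_series).requestUpgrades()
self.assertThat(recorder1, HasQueryCount(LessThan(12)))

# Adding more upgrades must not add queries.
for index in xrange(3):
    self.makePackageUpgrade(derived_series=derived_series)
flush_database_caches()
with StormStatementRecorder() as recorder2:
    self.makeView(derived_series).requestUpgrades()
self.assertThat(recorder2, HasQueryCount(Equals(recorder1.count)))
}}}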
== Tests ==
From low-level to high-level:
{{{
./bin/test -vvc lp.services.
./bin/test -vvc lp.soyuz.
./bin/test -vvc lp.registry.
}}}
== Demo and Q/A ==
I'm not sure the "upgrade packages" button is visible anywhere yet, since it is hidden behind the soyuz.derived_series_sync.enabled feature flag.
Upgradeable entries are ones where the checkbox on the left is active, and the parent version is greater than the version in the distroseries you're looking at.
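
For reference during Q/A, these are the conditions under which the button is offered (canUpgrade, copied from the diff below):

{{{
def canUpgrade(self, action=None):
    """Should the form offer a packages upgrade?"""
    if getFeatureFlag("soyuz.derived_series_sync.enabled") is None:
        # Only offered behind the feature flag.
        return False
    elif self.context.status not in UPGRADABLE_SERIES_STATUSES:
        # A feature freeze precludes blanket updates.
        return False
    elif self.getUpgrades().is_empty():
        # There are no simple updates to perform.
        return False
    else:
        # Finally, the user needs edit rights on the upload queue.
        queue = PackageUploadQueue(self.context, None)
        return check_permission("launchpad.Edit", queue)
}}}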
= Launchpad lint =
Checking for conflicts and issues in changed files.
Linting changed files:
lib/lp/
lib/lp/
lib/lp/
lib/lp/
lib/lp/
lib/lp/
lib/lp/
Preview Diff
1 | === modified file 'lib/canonical/launchpad/doc/vocabularies.txt' |
2 | --- lib/canonical/launchpad/doc/vocabularies.txt 2011-05-27 19:53:20 +0000 |
3 | +++ lib/canonical/launchpad/doc/vocabularies.txt 2011-06-09 10:59:00 +0000 |
4 | @@ -300,8 +300,8 @@ |
5 | >>> package_name_terms.count() |
6 | 2 |
7 | >>> [(term.token, term.title) for term in package_name_terms] |
8 | - [('mozilla-firefox', u'iceweasel huh ?'), |
9 | - ('mozilla-firefox-data', u'Mozilla Firefox Data is .....')] |
10 | + [('mozilla-firefox', u'mozilla-firefox'), |
11 | + ('mozilla-firefox-data', u'mozilla-firefox-data')] |
12 | |
13 | Searching for "mozilla" should return the binary package name above, and |
14 | the source package named "mozilla". |
15 | @@ -310,9 +310,9 @@ |
16 | >>> package_name_terms.count() |
17 | 3 |
18 | >>> [(term.token, term.title) for term in package_name_terms] |
19 | - [('mozilla', 'Not uploaded'), |
20 | - ('mozilla-firefox', u'iceweasel huh ?'), |
21 | - ('mozilla-firefox-data', u'Mozilla Firefox Data is .....')] |
22 | + [('mozilla', u'mozilla'), |
23 | + ('mozilla-firefox', u'mozilla-firefox'), |
24 | + ('mozilla-firefox-data', u'mozilla-firefox-data')] |
25 | |
26 | The search does a case-insensitive, substring match. |
27 | |
28 | @@ -320,8 +320,8 @@ |
29 | >>> package_name_terms.count() |
30 | 2 |
31 | >>> [(term.token, term.title) for term in package_name_terms] |
32 | - [('linux-2.6.12', u'this kernel is like the crystal method: a temple...'), |
33 | - ('linux-source-2.6.15', u'Source of: linux-2.6.12')] |
34 | + [('linux-2.6.12', u'linux-2.6.12'), |
35 | + ('linux-source-2.6.15', u'linux-source-2.6.15')] |
36 | |
37 | |
38 | BinaryPackageNameVocabulary |
39 | |
40 | === modified file 'lib/canonical/launchpad/pagetests/standalone/xx-opstats.txt' |
41 | --- lib/canonical/launchpad/pagetests/standalone/xx-opstats.txt 2011-02-25 04:19:04 +0000 |
42 | +++ lib/canonical/launchpad/pagetests/standalone/xx-opstats.txt 2011-06-09 10:59:00 +0000 |
43 | @@ -1,4 +1,5 @@ |
44 | -= Operational Statistics and Metrics = |
45 | +Operational Statistics and Metrics |
46 | +================================== |
47 | |
48 | We make Zope 3 give us real time statistics about Launchpad's operation. |
49 | We can access them via XML-RPC: |
50 | @@ -25,7 +26,8 @@ |
51 | ... reset() |
52 | ... |
53 | |
54 | -== Number of requests and XML-RPC requests == |
55 | +Number of requests and XML-RPC requests |
56 | +--------------------------------------- |
57 | |
58 | Even though XML-RPC requests are technically HTTP requests, we do not |
59 | count them as such. Note that the call to obtain statistics will increment |
60 | @@ -59,7 +61,8 @@ |
61 | requests: 1 |
62 | xml-rpc requests: 1 |
63 | |
64 | -== Number of HTTP requests and success codes == |
65 | +Number of HTTP requests and success codes |
66 | +----------------------------------------- |
67 | |
68 | >>> output = http("GET / HTTP/1.1\nHost: bugs.launchpad.dev\n") |
69 | >>> output.getStatus() |
70 | @@ -69,7 +72,8 @@ |
71 | http requests: 1 |
72 | requests: 1 |
73 | |
74 | -== Number of 404s == |
75 | +Number of 404s |
76 | +-------------- |
77 | |
78 | Note that retries is incremented too. As per the standard Launchpad |
79 | database policy, this request first uses the slave DB. The requested |
80 | @@ -86,7 +90,8 @@ |
81 | requests: 1 |
82 | retries: 1 |
83 | |
84 | -== Number of 500 Internal Server Errors (unhandled exceptions) == |
85 | +Number of 500 Internal Server Errors (unhandled exceptions) |
86 | +----------------------------------------------------------- |
87 | |
88 | This is normally the number of OOPS pages displayed to the user, but |
89 | may also include the odd case where the OOPS system has failed and a |
90 | @@ -129,7 +134,8 @@ |
91 | http requests: 1 |
92 | requests: 1 |
93 | |
94 | -== Number of XML-RPC Faults == |
95 | +Number of XML-RPC Faults |
96 | +------------------------ |
97 | |
98 | >>> try: |
99 | ... opstats = lp_xmlrpc.invalid() # XXX: Need a HTTP test too |
100 | @@ -142,7 +148,8 @@ |
101 | xml-rpc requests: 1 |
102 | |
103 | |
104 | -== Number of soft timeouts == |
105 | +Number of soft timeouts |
106 | +----------------------- |
107 | |
108 | >>> from canonical.config import config |
109 | >>> test_data = dedent(""" |
110 | @@ -162,7 +169,8 @@ |
111 | requests: 1 |
112 | soft timeouts: 1 |
113 | |
114 | -== Number of Timeouts == |
115 | +Number of Timeouts |
116 | +------------------ |
117 | |
118 | We can't reliably track this using the 503 response code as other |
119 | Launchpad code may well return this status and an XML-RPC request may |
120 | @@ -190,7 +198,8 @@ |
121 | timeouts: 1 |
122 | |
123 | |
124 | -== HTTP access for Cricket == |
125 | +HTTP access for Cricket |
126 | +----------------------- |
127 | |
128 | Stats can also be retrieved via HTTP in cricket-graph format: |
129 | |
130 | @@ -219,13 +228,13 @@ |
131 | xmlrpc_requests:0@... |
132 | <BLANKLINE> |
133 | |
134 | -== No DB access required == |
135 | +No DB access required |
136 | +--------------------- |
137 | |
138 | -Accessing the opstats page as an anonymous user will make no database |
139 | -queries. This is important to make it as reliable as possible since we |
140 | -use this page for monitoring. Because of this property, the load |
141 | -balancers also use this page to determine if a Launchpad instance is |
142 | -responsive. |
143 | +Accessing the opstats page will make no database queries. This is important to |
144 | +make it as reliable as possible since we use this page for monitoring. Because |
145 | +of this property, the load balancers also use this page to determine if a |
146 | +Launchpad instance is responsive. |
147 | |
148 | To confirm this, we first point all our database connection information |
149 | to somewhere that doesn't exist. |
150 | @@ -257,6 +266,20 @@ |
151 | <BLANKLINE> |
152 | 1XXs:0@... |
153 | |
154 | +This is also true if we are provide authentication. |
155 | + |
156 | + >>> print http(r""" |
157 | + ... GET /+opstats HTTP/1.1 |
158 | + ... Host: launchpad.dev |
159 | + ... Authorization: Basic Zm9vLmJhckBjYW5vbmljYWwuY29tOnRlc3Q= |
160 | + ... """) |
161 | + HTTP/1.1 200 Ok |
162 | + ... |
163 | + Content-Type: text/plain; charset=US-ASCII |
164 | + ... |
165 | + <BLANKLINE> |
166 | + 1XXs:0@... |
167 | + |
168 | But our database connections are broken. |
169 | |
170 | >>> from canonical.launchpad.interfaces.lpstorm import IStore |
171 | @@ -271,4 +294,3 @@ |
172 | |
173 | >>> IStore(Person).find(Person, name='janitor').one().name |
174 | u'janitor' |
175 | - |
176 | |
177 | === modified file 'lib/canonical/launchpad/webapp/publication.py' |
178 | --- lib/canonical/launchpad/webapp/publication.py 2011-05-27 21:12:25 +0000 |
179 | +++ lib/canonical/launchpad/webapp/publication.py 2011-06-09 10:59:00 +0000 |
180 | @@ -330,7 +330,17 @@ |
181 | personless account, return the unauthenticated principal. |
182 | """ |
183 | auth_utility = getUtility(IPlacelessAuthUtility) |
184 | - principal = auth_utility.authenticate(request) |
185 | + principal = None |
186 | + # +opstats and +haproxy are status URLs that must not query the DB at |
187 | + # all. This is enforced (see |
188 | + # lib/canonical/launchpad/webapp/dbpolicy.py). If the request is for |
189 | + # one of those two pages, don't even try to authenticate, because we |
190 | + # may fail. We haven't traversed yet, so we have to sniff the request |
191 | + # this way. Even though PATH_INFO is always present in real requests, |
192 | + # we need to tread carefully (``get``) because of test requests in our |
193 | + # automated tests. |
194 | + if request.get('PATH_INFO') not in [u'/+opstats', u'/+haproxy']: |
195 | + principal = auth_utility.authenticate(request) |
196 | if principal is None or principal.person is None: |
197 | # This is either an unauthenticated user or a user who |
198 | # authenticated on our OpenID server using a personless account. |
199 | |
200 | === modified file 'lib/canonical/launchpad/webapp/tests/test_haproxy.py' |
201 | --- lib/canonical/launchpad/webapp/tests/test_haproxy.py 2011-02-18 18:12:43 +0000 |
202 | +++ lib/canonical/launchpad/webapp/tests/test_haproxy.py 2011-06-09 10:59:00 +0000 |
203 | @@ -35,6 +35,15 @@ |
204 | result = self.http(u'GET /+haproxy HTTP/1.0', handle_errors=False) |
205 | self.assertEquals(200, result.getStatus()) |
206 | |
207 | + def test_authenticated_HAProxyStatusView_works(self): |
208 | + # We don't use authenticated requests, but this keeps us from |
209 | + # generating oopses. |
210 | + result = self.http( |
211 | + u'GET /+haproxy HTTP/1.0\n' |
212 | + u'Authorization: Basic Zm9vLmJhckBjYW5vbmljYWwuY29tOnRlc3Q=\n', |
213 | + handle_errors=False) |
214 | + self.assertEquals(200, result.getStatus()) |
215 | + |
216 | def test_HAProxyStatusView_going_down_returns_500(self): |
217 | haproxy.set_going_down_flag(True) |
218 | result = self.http(u'GET /+haproxy HTTP/1.0', handle_errors=False) |
219 | @@ -61,4 +70,3 @@ |
220 | haproxy.set_going_down_flag(True) |
221 | result = self.http(u'GET /+haproxy HTTP/1.0', handle_errors=False) |
222 | self.assertEquals(499, result.getStatus()) |
223 | - |
224 | |
225 | === modified file 'lib/lp/registry/browser/distroseries.py' |
226 | --- lib/lp/registry/browser/distroseries.py 2011-06-08 21:03:29 +0000 |
227 | +++ lib/lp/registry/browser/distroseries.py 2011-06-09 10:59:00 +0000 |
228 | @@ -1052,6 +1052,7 @@ |
229 | def sync_sources(self, action, data): |
230 | self._sync_sources(action, data) |
231 | |
232 | +<<<<<<< TREE |
233 | def getUpgrades(self): |
234 | """Find straightforward package upgrades. |
235 | |
236 | @@ -1103,6 +1104,61 @@ |
237 | queue = PackageUploadQueue(self.context, None) |
238 | return check_permission("launchpad.Edit", queue) |
239 | |
240 | +======= |
241 | + def getUpgrades(self): |
242 | + """Find straightforward package upgrades. |
243 | + |
244 | + These are updates for packages that this distroseries shares |
245 | + with a parent series, for which there have been updates in the |
246 | + parent, and which do not have any changes in this series that |
247 | + might complicate a sync. |
248 | + |
249 | + :return: A result set of `DistroSeriesDifference`s. |
250 | + """ |
251 | + return getUtility(IDistroSeriesDifferenceSource).getSimpleUpgrades( |
252 | + self.context) |
253 | + |
254 | + @action(_("Upgrade Packages"), name="upgrade", condition='canUpgrade') |
255 | + def upgrade(self, action, data): |
256 | + """Request synchronization of straightforward package upgrades.""" |
257 | + self.requestUpgrades() |
258 | + |
259 | + def requestUpgrades(self): |
260 | + """Request sync of packages that can be easily upgraded.""" |
261 | + target_distroseries = self.context |
262 | + copies = [ |
263 | + ( |
264 | + dsd.source_package_name.name, |
265 | + dsd.parent_source_version, |
266 | + dsd.parent_series.main_archive, |
267 | + target_distroseries.main_archive, |
268 | + PackagePublishingPocket.RELEASE, |
269 | + ) |
270 | + for dsd in self.getUpgrades()] |
271 | + getUtility(IPlainPackageCopyJobSource).createMultiple( |
272 | + target_distroseries, copies, |
273 | + copy_policy=PackageCopyPolicy.MASS_SYNC) |
274 | + |
275 | + self.request.response.addInfoNotification( |
276 | + (u"Upgrades of {context.displayname} packages have been " |
277 | + u"requested. Please give Launchpad some time to complete " |
278 | + u"these.").format(context=self.context)) |
279 | + |
280 | + def canUpgrade(self, action=None): |
281 | + """Should the form offer a packages upgrade?""" |
282 | + if getFeatureFlag("soyuz.derived_series_sync.enabled") is None: |
283 | + return False |
284 | + elif self.context.status not in UPGRADABLE_SERIES_STATUSES: |
285 | + # A feature freeze precludes blanket updates. |
286 | + return False |
287 | + elif self.getUpgrades().is_empty(): |
288 | + # There are no simple updates to perform. |
289 | + return False |
290 | + else: |
291 | + queue = PackageUploadQueue(self.context, None) |
292 | + return check_permission("launchpad.Edit", queue) |
293 | + |
294 | +>>>>>>> MERGE-SOURCE |
295 | |
296 | class DistroSeriesMissingPackagesView(DistroSeriesDifferenceBaseView, |
297 | LaunchpadFormView): |
298 | |
299 | === modified file 'lib/lp/registry/browser/tests/test_distroseries.py' |
300 | --- lib/lp/registry/browser/tests/test_distroseries.py 2011-06-08 21:03:29 +0000 |
301 | +++ lib/lp/registry/browser/tests/test_distroseries.py 2011-06-09 10:59:00 +0000 |
302 | @@ -996,6 +996,7 @@ |
303 | self.assertEqual(versions['derived'], derived_span[0].string.strip()) |
304 | self.assertEqual(versions['parent'], parent_span[0].string.strip()) |
305 | |
306 | +<<<<<<< TREE |
307 | def test_getUpgrades_shows_updates_in_parent(self): |
308 | # The view's getUpgrades methods lists packages that can be |
309 | # trivially upgraded: changed in the parent, not changed in the |
310 | @@ -1128,6 +1129,139 @@ |
311 | |
312 | class TestDistroSeriesLocalDifferencesFunctional(TestCaseWithFactory, |
313 | DistroSeriesDifferenceMixin): |
314 | +======= |
315 | + def test_getUpgrades_shows_updates_in_parent(self): |
316 | + # The view's getUpgrades methods lists packages that can be |
317 | + # trivially upgraded: changed in the parent, not changed in the |
318 | + # derived series, but present in both. |
319 | + dsd = self.makePackageUpgrade() |
320 | + view = self.makeView(dsd.derived_series) |
321 | + self.assertContentEqual([dsd], view.getUpgrades()) |
322 | + |
323 | + def enableDerivedSeriesSyncFeature(self): |
324 | + self.useFixture( |
325 | + FeatureFixture( |
326 | + {u'soyuz.derived_series_sync.enabled': u'on'})) |
327 | + |
328 | + @with_celebrity_logged_in("admin") |
329 | + def test_upgrades_offered_only_with_feature_flag(self): |
330 | + # The "Upgrade Packages" button will only be shown when a specific |
331 | + # feature flag is enabled. |
332 | + view = self.makeView() |
333 | + self.makePackageUpgrade(view.context) |
334 | + self.assertFalse(view.canUpgrade()) |
335 | + self.enableDerivedSeriesSyncFeature() |
336 | + self.assertTrue(view.canUpgrade()) |
337 | + |
338 | + def test_upgrades_are_offered_if_appropriate(self): |
339 | + # The "Upgrade Packages" button will only be shown to privileged |
340 | + # users. |
341 | + self.enableDerivedSeriesSyncFeature() |
342 | + dsd = self.makePackageUpgrade() |
343 | + view = self.makeView(dsd.derived_series) |
344 | + with celebrity_logged_in("admin"): |
345 | + self.assertTrue(view.canUpgrade()) |
346 | + with person_logged_in(self.factory.makePerson()): |
347 | + self.assertFalse(view.canUpgrade()) |
348 | + with anonymous_logged_in(): |
349 | + self.assertFalse(view.canUpgrade()) |
350 | + |
351 | + @with_celebrity_logged_in("admin") |
352 | + def test_upgrades_offered_only_if_available(self): |
353 | + # If there are no upgrades, the "Upgrade Packages" button won't |
354 | + # be shown. |
355 | + self.enableDerivedSeriesSyncFeature() |
356 | + view = self.makeView() |
357 | + self.assertFalse(view.canUpgrade()) |
358 | + self.makePackageUpgrade(view.context) |
359 | + self.assertTrue(view.canUpgrade()) |
360 | + |
361 | + @with_celebrity_logged_in("admin") |
362 | + def test_upgrades_not_offered_after_feature_freeze(self): |
363 | + # There won't be an "Upgrade Packages" button once feature |
364 | + # freeze has occurred. Mass updates would not make sense after |
365 | + # that point. |
366 | + self.enableDerivedSeriesSyncFeature() |
367 | + upgradeable = {} |
368 | + for status in SeriesStatus.items: |
369 | + dsd = self.makePackageUpgrade() |
370 | + dsd.derived_series.status = status |
371 | + view = self.makeView(dsd.derived_series) |
372 | + upgradeable[status] = view.canUpgrade() |
373 | + expected = { |
374 | + SeriesStatus.FUTURE: True, |
375 | + SeriesStatus.EXPERIMENTAL: True, |
376 | + SeriesStatus.DEVELOPMENT: True, |
377 | + SeriesStatus.FROZEN: False, |
378 | + SeriesStatus.CURRENT: False, |
379 | + SeriesStatus.SUPPORTED: False, |
380 | + SeriesStatus.OBSOLETE: False, |
381 | + } |
382 | + self.assertEqual(expected, upgradeable) |
383 | + |
384 | + def test_upgrade_creates_sync_jobs(self): |
385 | + # requestUpgrades generates PackageCopyJobs for the upgrades |
386 | + # that need doing. |
387 | + dsd = self.makePackageUpgrade() |
388 | + series = dsd.derived_series |
389 | + with celebrity_logged_in('admin'): |
390 | + series.status = SeriesStatus.DEVELOPMENT |
391 | + series.datereleased = UTC_NOW |
392 | + view = self.makeView(series) |
393 | + view.requestUpgrades() |
394 | + job_source = getUtility(IPlainPackageCopyJobSource) |
395 | + jobs = list( |
396 | + job_source.getActiveJobs(series.distribution.main_archive)) |
397 | + self.assertEquals(1, len(jobs)) |
398 | + job = jobs[0] |
399 | + self.assertEquals(series, job.target_distroseries) |
400 | + self.assertEqual(dsd.source_package_name.name, job.package_name) |
401 | + self.assertEqual(dsd.parent_source_version, job.package_version) |
402 | + self.assertEqual(PackagePublishingPocket.RELEASE, job.target_pocket) |
403 | + |
404 | + def test_upgrade_gives_feedback(self): |
405 | + # requestUpgrades doesn't instantly perform package upgrades, |
406 | + # but it shows the user a notice that the upgrades have been |
407 | + # requested. |
408 | + dsd = self.makePackageUpgrade() |
409 | + view = self.makeView(dsd.derived_series) |
410 | + view.requestUpgrades() |
411 | + expected = { |
412 | + "level": BrowserNotificationLevel.INFO, |
413 | + "message": |
414 | + ("Upgrades of {0.displayname} packages have been " |
415 | + "requested. Please give Launchpad some time to " |
416 | + "complete these.").format(dsd.derived_series), |
417 | + } |
418 | + observed = map(vars, view.request.response.notifications) |
419 | + self.assertEqual([expected], observed) |
420 | + |
421 | + def test_requestUpgrades_is_efficient(self): |
422 | + # A single web request may need to schedule large numbers of |
423 | + # package upgrades. It must do so without issuing large numbers |
424 | + # of database queries. |
425 | + derived_series, parent_series = self._createChildAndParent() |
426 | + # Take a baseline measure of queries. |
427 | + self.makePackageUpgrade(derived_series=derived_series) |
428 | + flush_database_caches() |
429 | + with StormStatementRecorder() as recorder1: |
430 | + self.makeView(derived_series).requestUpgrades() |
431 | + self.assertThat(recorder1, HasQueryCount(LessThan(12))) |
432 | + |
433 | + # The query count does not increase with the number of upgrades. |
434 | + for index in xrange(3): |
435 | + self.makePackageUpgrade(derived_series=derived_series) |
436 | + flush_database_caches() |
437 | + with StormStatementRecorder() as recorder2: |
438 | + self.makeView(derived_series).requestUpgrades() |
439 | + self.assertThat( |
440 | + recorder2, |
441 | + HasQueryCount(Equals(recorder1.count))) |
442 | + |
443 | + |
444 | +class TestDistroSeriesLocalDifferencesFunctional(TestCaseWithFactory, |
445 | + DistroSeriesDifferenceMixin): |
446 | +>>>>>>> MERGE-SOURCE |
447 | |
448 | layer = LaunchpadFunctionalLayer |
449 | |
450 | |
451 | === modified file 'lib/lp/registry/stories/distroseries/xx-distroseries-index.txt' |
452 | --- lib/lp/registry/stories/distroseries/xx-distroseries-index.txt 2011-05-24 10:08:33 +0000 |
453 | +++ lib/lp/registry/stories/distroseries/xx-distroseries-index.txt 2011-06-09 10:59:00 +0000 |
454 | @@ -51,13 +51,56 @@ |
455 | Release manager: None |
456 | Status: Current Stable Release |
457 | Derives from: Warty (4.10) is not derived from another series. |
458 | +<<<<<<< TREE |
459 | Derived series: |
460 | +======= |
461 | + Derived series: No derived series. |
462 | +>>>>>>> MERGE-SOURCE |
463 | Source packages: 3 |
464 | Binary packages: 4 |
465 | |
466 | On series that have no source or binary packages, the portlet will |
467 | change its text slightly to annouce this: |
468 | |
469 | + >>> anon_browser.open('http://launchpad.dev/debian/sarge') |
470 | + >>> print extract_text( |
471 | + ... find_portlet(anon_browser.contents, 'Series information')) |
472 | + Series information |
473 | + Distribution: Debian |
474 | + Series: Sarge (3.1) |
475 | + Project drivers: Jeff Waugh, Mark Shuttleworth |
476 | + Release manager: Jeff Waugh |
477 | + Status: Pre-release Freeze |
478 | + Derives from: Woody (3.0) |
479 | + Source packages: No sources imported or published. |
480 | + Binary packages: No binaries imported or published. |
481 | + |
482 | +The series' derivation parents - rather than the previous series - are |
483 | +shown when derivation is enabled, as are the series derived from this |
484 | +series: |
485 | + |
486 | + >>> from lp.registry.interfaces.distribution import IDistributionSet |
487 | + >>> from lp.testing import celebrity_logged_in |
488 | + >>> from zope.component import getUtility |
489 | + |
490 | + >>> with celebrity_logged_in("admin"): |
491 | + ... debian = getUtility(IDistributionSet).getByName(u"debian") |
492 | + ... sarge = debian.getSeries(u"sarge") |
493 | + ... parents = [ |
494 | + ... factory.makeDistroSeries(name=u"dobby"), |
495 | + ... factory.makeDistroSeries(name=u"knobby")] |
496 | + ... distro_series_parents = [ |
497 | + ... factory.makeDistroSeriesParent( |
498 | + ... derived_series=sarge, parent_series=parent) |
499 | + ... for parent in parents] |
500 | + ... children = [ |
501 | + ... factory.makeDistroSeries(name=u"bobby"), |
502 | + ... factory.makeDistroSeries(name=u"tables")] |
503 | + ... distro_series_children = [ |
504 | + ... factory.makeDistroSeriesParent( |
505 | + ... derived_series=child, parent_series=sarge) |
506 | + ... for child in children] |
507 | + |
508 | >>> with derivation_enabled: |
509 | ... anon_browser.open('http://launchpad.dev/debian/sarge') |
510 | >>> print extract_text( |
511 | @@ -68,8 +111,13 @@ |
512 | Project drivers: Jeff Waugh, Mark Shuttleworth |
513 | Release manager: Jeff Waugh |
514 | Status: Pre-release Freeze |
515 | +<<<<<<< TREE |
516 | Derives from: Woody (3.0) |
517 | Derived series: |
518 | +======= |
519 | + Derives from: Dobby (...), Knobby (...) |
520 | + Derived series: Bobby (...), Tables (...) |
521 | +>>>>>>> MERGE-SOURCE |
522 | Source packages: No sources imported or published. |
523 | Binary packages: No binaries imported or published. |
524 | |
525 | |
526 | === modified file 'lib/lp/registry/templates/distroseries-details.pt' |
527 | --- lib/lp/registry/templates/distroseries-details.pt 2011-05-24 10:08:33 +0000 |
528 | +++ lib/lp/registry/templates/distroseries-details.pt 2011-06-09 10:59:00 +0000 |
529 | @@ -46,6 +46,9 @@ |
530 | </dd> |
531 | </dl> |
532 | |
533 | + <tal:derivation-not-enabled |
534 | + tal:condition="not:request/features/soyuz.derived_series_ui.enabled"> |
535 | + |
536 | <dl> |
537 | <dt>Derives from:</dt> |
538 | <dd> |
539 | @@ -60,8 +63,32 @@ |
540 | </dd> |
541 | </dl> |
542 | |
543 | +<<<<<<< TREE |
544 | <dl tal:condition="request/features/soyuz.derived_series_ui.enabled" |
545 | tal:define="all_child_series context/getDerivedSeries"> |
546 | +======= |
547 | + </tal:derivation-not-enabled> |
548 | + |
549 | + <tal:derivation-enabled |
550 | + tal:condition="request/features/soyuz.derived_series_ui.enabled"> |
551 | + |
552 | + <dl tal:define="parents context/getParentSeries"> |
553 | + <dt>Derives from:</dt> |
554 | + <dd tal:condition="parents"> |
555 | + <tal:parents repeat="parent parents"> |
556 | + <a tal:attributes="href parent/fmt:url" |
557 | + tal:content="parent/named_version"> |
558 | + </a><tal:comma condition="not:repeat/parent/end">, </tal:comma> |
559 | + </tal:parents> |
560 | + </dd> |
561 | + <dd tal:condition="not:parents"> |
562 | + <tal:name replace="context/named_version"/> is not derived from |
563 | + another series. |
564 | + </dd> |
565 | + </dl> |
566 | + |
567 | + <dl tal:define="all_child_series context/getDerivedSeries"> |
568 | +>>>>>>> MERGE-SOURCE |
569 | <dt>Derived series:</dt> |
570 | <dd> |
571 | <tal:per_child_series repeat="child_series all_child_series"> |
572 | @@ -70,12 +97,18 @@ |
573 | tal:content="child_series/named_version" /><tal:comma |
574 | condition="not: repeat/child_series/end">,</tal:comma> |
575 | </tal:per_child_series> |
576 | +<<<<<<< TREE |
577 | <tal:none condition="all_child_series"> |
578 | +======= |
579 | + <tal:none condition="not:all_child_series"> |
580 | +>>>>>>> MERGE-SOURCE |
581 | No derived series. |
582 | </tal:none> |
583 | </dd> |
584 | </dl> |
585 | |
586 | + </tal:derivation-enabled> |
587 | + |
588 | <dl tal:define="sourcecount context/sourcecount"> |
589 | <dt>Source packages:</dt> |
590 | <dd |
591 | |
592 | === modified file 'lib/lp/services/job/model/job.py' |
593 | --- lib/lp/services/job/model/job.py 2011-05-26 14:29:34 +0000 |
594 | +++ lib/lp/services/job/model/job.py 2011-06-09 10:59:00 +0000 |
595 | @@ -1,4 +1,4 @@ |
596 | -# Copyright 2009 Canonical Ltd. This software is licensed under the |
597 | +# Copyright 2009-2011 Canonical Ltd. This software is licensed under the |
598 | # GNU Affero General Public License version 3 (see the file LICENSE). |
599 | |
600 | """ORM object representing jobs.""" |
601 | @@ -26,7 +26,10 @@ |
602 | from canonical.database.constants import UTC_NOW |
603 | from canonical.database.datetimecol import UtcDateTimeCol |
604 | from canonical.database.enumcol import EnumCol |
605 | -from canonical.database.sqlbase import SQLBase |
606 | +from canonical.database.sqlbase import ( |
607 | + quote, |
608 | + SQLBase, |
609 | + ) |
610 | from lp.services.job.interfaces.job import ( |
611 | IJob, |
612 | JobStatus, |
613 | @@ -99,6 +102,22 @@ |
614 | |
615 | status = property(lambda x: x._status) |
616 | |
617 | + @classmethod |
618 | + def createMultiple(self, store, num_jobs): |
619 | + """Create multiple `Job`s at once. |
620 | + |
621 | + :param store: `Store` to ceate the jobs in. |
622 | + :param num_jobs: Number of `Job`s to create. |
623 | + :return: An iterable of `Job.id` values for the new jobs. |
624 | + """ |
625 | + job_contents = ["(%s)" % quote(JobStatus.WAITING)] * num_jobs |
626 | + result = store.execute(""" |
627 | + INSERT INTO Job (status) |
628 | + VALUES %s |
629 | + RETURNING id |
630 | + """ % ", ".join(job_contents)) |
631 | + return [job_id for job_id, in result] |
632 | + |
633 | def acquireLease(self, duration=300): |
634 | """See `IJob`.""" |
635 | if (self.lease_expires is not None |
636 | |
637 | === modified file 'lib/lp/services/job/tests/test_job.py' |
638 | --- lib/lp/services/job/tests/test_job.py 2011-05-27 08:18:07 +0000 |
639 | +++ lib/lp/services/job/tests/test_job.py 2011-06-09 10:59:00 +0000 |
640 | @@ -8,14 +8,9 @@ |
641 | |
642 | import pytz |
643 | from storm.locals import Store |
644 | -from zope.component import getUtility |
645 | |
646 | from canonical.database.constants import UTC_NOW |
647 | -from canonical.launchpad.webapp.interfaces import ( |
648 | - DEFAULT_FLAVOR, |
649 | - IStoreSelector, |
650 | - MAIN_STORE, |
651 | - ) |
652 | +from canonical.launchpad.interfaces.lpstorm import IStore |
653 | from canonical.launchpad.webapp.testing import verifyObject |
654 | from canonical.testing.layers import ZopelessDatabaseLayer |
655 | from lp.services.job.interfaces.job import ( |
656 | @@ -44,6 +39,22 @@ |
657 | job = Job() |
658 | self.assertEqual(job.status, JobStatus.WAITING) |
659 | |
660 | + def test_createMultiple_creates_requested_number_of_jobs(self): |
661 | + job_ids = list(Job.createMultiple(IStore(Job), 3)) |
662 | + self.assertEqual(3, len(job_ids)) |
663 | + self.assertEqual(3, len(set(job_ids))) |
664 | + |
665 | + def test_createMultiple_returns_valid_job_ids(self): |
666 | + job_ids = list(Job.createMultiple(IStore(Job), 3)) |
667 | + store = IStore(Job) |
668 | + for job_id in job_ids: |
669 | + self.assertIsNot(None, store.get(Job, job_id)) |
670 | + |
671 | + def test_createMultiple_sets_status_to_WAITING(self): |
672 | + store = IStore(Job) |
673 | + job = store.get(Job, Job.createMultiple(store, 1)[0]) |
674 | + self.assertEqual(JobStatus.WAITING, job.status) |
675 | + |
676 | def test_start(self): |
677 | """Job.start should update the object appropriately. |
678 | |
679 | @@ -214,8 +225,7 @@ |
680 | layer = ZopelessDatabaseLayer |
681 | |
682 | def _sampleData(self): |
683 | - store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR) |
684 | - return list(store.execute(Job.ready_jobs)) |
685 | + return list(IStore(Job).execute(Job.ready_jobs)) |
686 | |
687 | def test_ready_jobs(self): |
688 | """Job.ready_jobs should include new jobs.""" |
689 | |
690 | === modified file 'lib/lp/soyuz/interfaces/packagecopyjob.py' |
691 | --- lib/lp/soyuz/interfaces/packagecopyjob.py 2011-06-03 08:53:14 +0000 |
692 | +++ lib/lp/soyuz/interfaces/packagecopyjob.py 2011-06-09 10:59:00 +0000 |
693 | @@ -1,3 +1,4 @@ |
694 | +<<<<<<< TREE |
695 | # Copyright 2010-2011 Canonical Ltd. This software is licensed under the |
696 | # GNU Affero General Public License version 3 (see the file LICENSE). |
697 | |
698 | @@ -148,3 +149,170 @@ |
699 | copy_policy = Choice( |
700 | title=_("Applicable copy policy"), |
701 | values=PackageCopyPolicy, required=True, readonly=True) |
702 | +======= |
703 | +# Copyright 2010-2011 Canonical Ltd. This software is licensed under the |
704 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
705 | + |
706 | +__metaclass__ = type |
707 | + |
708 | +__all__ = [ |
709 | + "IPackageCopyJob", |
710 | + "IPlainPackageCopyJob", |
711 | + "IPlainPackageCopyJobSource", |
712 | + "PackageCopyJobType", |
713 | + ] |
714 | + |
715 | +from lazr.enum import ( |
716 | + DBEnumeratedType, |
717 | + DBItem, |
718 | + ) |
719 | +from lazr.restful.fields import Reference |
720 | +from zope.interface import ( |
721 | + Attribute, |
722 | + Interface, |
723 | + ) |
724 | +from zope.schema import ( |
725 | + Bool, |
726 | + Choice, |
727 | + Int, |
728 | + TextLine, |
729 | + ) |
730 | + |
731 | +from canonical.launchpad import _ |
732 | +from lp.registry.interfaces.distroseries import IDistroSeries |
733 | +from lp.services.job.interfaces.job import ( |
734 | + IJob, |
735 | + IJobSource, |
736 | + IRunnableJob, |
737 | + ) |
738 | +from lp.soyuz.enums import PackageCopyPolicy |
739 | +from lp.soyuz.interfaces.archive import IArchive |
740 | + |
741 | + |
742 | +class IPackageCopyJob(Interface): |
743 | + """A job that copies packages between `IArchive`s.""" |
744 | + |
745 | + id = Int( |
746 | + title=_('DB ID'), required=True, readonly=True, |
747 | + description=_("The tracking number for this job.")) |
748 | + |
749 | + source_archive_id = Int( |
750 | + title=_('Source Archive ID'), |
751 | + required=True, readonly=True) |
752 | + |
753 | + source_archive = Reference( |
754 | + schema=IArchive, title=_('Source Archive'), |
755 | + required=True, readonly=True) |
756 | + |
757 | + target_archive_id = Int( |
758 | + title=_('Target Archive ID'), |
759 | + required=True, readonly=True) |
760 | + |
761 | + target_archive = Reference( |
762 | + schema=IArchive, title=_('Target Archive'), |
763 | + required=True, readonly=True) |
764 | + |
765 | + target_distroseries = Reference( |
766 | + schema=IDistroSeries, title=_('Target DistroSeries.'), |
767 | + required=True, readonly=True) |
768 | + |
769 | + package_name = TextLine( |
770 | + title=_("Package name"), required=True, readonly=True) |
771 | + |
772 | + job = Reference( |
773 | + schema=IJob, title=_('The common Job attributes'), |
774 | + required=True, readonly=True) |
775 | + |
776 | + metadata = Attribute('A dict of data about the job.') |
777 | + |
778 | + |
779 | +class PackageCopyJobType(DBEnumeratedType): |
780 | + |
781 | + PLAIN = DBItem(1, """ |
782 | + Copy packages between archives. |
783 | + |
784 | + This job copies one or more packages, optionally including binaries. |
785 | + """) |
786 | + |
787 | + |
788 | +class IPlainPackageCopyJobSource(IJobSource): |
789 | + """An interface for acquiring `IPackageCopyJobs`.""" |
790 | + |
791 | + def create(package_name, source_archive, |
792 | + target_archive, target_distroseries, target_pocket, |
793 | + include_binaries=False, package_version=None, |
794 | + copy_policy=PackageCopyPolicy.INSECURE): |
795 | + """Create a new `IPlainPackageCopyJob`. |
796 | + |
797 | + :param package_name: The name of the source package to copy. |
798 | + :param source_archive: The `IArchive` in which `source_packages` are |
799 | + found. |
800 | + :param target_archive: The `IArchive` to which to copy the packages. |
801 | + :param target_distroseries: The `IDistroSeries` to which to copy the |
802 | + packages. |
803 | + :param target_pocket: The pocket into which to copy the packages. Must |
804 | + be a member of `PackagePublishingPocket`. |
805 | + :param include_binaries: See `do_copy`. |
806 | + :param package_version: The version string for the package version |
807 | + that is to be copied. |
808 | + :param copy_policy: Applicable `PackageCopyPolicy`. |
809 | + """ |
810 | + |
811 | + def createMultiple(target_distroseries, copy_tasks, |
812 | + copy_policy=PackageCopyPolicy.INSECURE, |
813 | + include_binaries=False): |
814 | + """Create multiple new `IPlainPackageCopyJob`s at once. |
815 | + |
816 | + :param target_distroseries: The `IDistroSeries` to which to copy the |
817 | + packages. |
818 | + :param copy_tasks: A list of tuples describing the copies to be |
819 | + performed: (package name, package version, source archive, |
820 | + target archive, target pocket). |
821 | + :param copy_policy: Applicable `PackageCopyPolicy`. |
822 | + :param include_binaries: As in `do_copy`. |
823 | + :return: An iterable of `PackageCopyJob` ids. |
824 | + """ |
825 | + |
826 | + def getActiveJobs(target_archive): |
827 | + """Retrieve all active sync jobs for an archive.""" |
828 | + |
829 | + def getPendingJobsPerPackage(target_series): |
830 | + """Find pending jobs for each package in `target_series`. |
831 | + |
832 | + This is meant for finding jobs that will resolve specific |
833 | + `DistroSeriesDifference`s. |
834 | + |
835 | + :param target_series: Target `DistroSeries`; this corresponds to |
836 | + `DistroSeriesDifference.derived_series`. |
837 | + :return: A dict containing as keys the (name, version) tuples for |
838 | + each `DistroSeriesDifference` that has a resolving |
839 | + `PlainPackageCopyJob` pending. Each of these DSDs maps to its |
840 | + oldest pending job. The `version` corresponds to |
841 | + `DistroSeriesDifference.parent_source_version`. |
842 | + """ |
843 | + |
844 | + |
845 | +class IPlainPackageCopyJob(IRunnableJob): |
846 | + """A no-frills job to copy packages between `IArchive`s.""" |
847 | + |
848 | + target_pocket = Int( |
849 | + title=_("Target package publishing pocket"), required=True, |
850 | + readonly=True) |
851 | + |
852 | + package_version = TextLine( |
853 | + title=_("Package version"), required=True, readonly=True) |
854 | + |
855 | + include_binaries = Bool( |
856 | + title=_("Copy binaries"), |
857 | + required=False, readonly=True) |
858 | + |
859 | + def addSourceOverride(override): |
860 | + """Add an `ISourceOverride` to the metadata.""" |
861 | + |
862 | + def getSourceOverride(): |
863 | + """Get an `ISourceOverride` from the metadata.""" |
864 | + |
865 | + copy_policy = Choice( |
866 | + title=_("Applicable copy policy"), |
867 | + values=PackageCopyPolicy, required=True, readonly=True) |
868 | +>>>>>>> MERGE-SOURCE |
869 | |
870 | === modified file 'lib/lp/soyuz/model/binaryandsourcepackagename.py' |
871 | --- lib/lp/soyuz/model/binaryandsourcepackagename.py 2010-12-22 02:48:42 +0000 |
872 | +++ lib/lp/soyuz/model/binaryandsourcepackagename.py 2011-06-09 10:59:00 +0000 |
873 | @@ -18,9 +18,7 @@ |
874 | BatchedCountableIterator, |
875 | NamedSQLObjectHugeVocabulary, |
876 | ) |
877 | -from lp.registry.model.sourcepackagename import getSourcePackageDescriptions |
878 | from lp.soyuz.interfaces.binarypackagename import IBinaryAndSourcePackageName |
879 | -from lp.soyuz.model.binarypackagename import getBinaryPackageDescriptions |
880 | |
881 | |
882 | class BinaryAndSourcePackageName(SQLBase): |
883 | @@ -45,15 +43,7 @@ |
884 | """ |
885 | |
886 | def getTermsWithDescriptions(self, results): |
887 | - # Note that we grab first source package descriptions and then |
888 | - # binary package descriptions, giving preference to the latter, |
889 | - # via the update() call. |
890 | - descriptions = getSourcePackageDescriptions(results, use_names=True) |
891 | - binary_descriptions = getBinaryPackageDescriptions(results, |
892 | - use_names=True) |
893 | - descriptions.update(binary_descriptions) |
894 | - return [SimpleTerm(obj, obj.name, |
895 | - descriptions.get(obj.name, "Not uploaded")) |
896 | + return [SimpleTerm(obj, obj.name, obj.name) |
897 | for obj in results] |
898 | |
899 | |
900 | |
901 | === modified file 'lib/lp/soyuz/model/packagecopyjob.py' |
902 | --- lib/lp/soyuz/model/packagecopyjob.py 2011-06-03 09:18:34 +0000 |
903 | +++ lib/lp/soyuz/model/packagecopyjob.py 2011-06-09 10:59:00 +0000 |
904 | @@ -23,10 +23,18 @@ |
905 | implements, |
906 | ) |
907 | |
908 | -from canonical.database.enumcol import EnumCol |
909 | -from canonical.launchpad.components.decoratedresultset import ( |
910 | - DecoratedResultSet, |
911 | - ) |
912 | +<<<<<<< TREE |
913 | +from canonical.database.enumcol import EnumCol |
914 | +from canonical.launchpad.components.decoratedresultset import ( |
915 | + DecoratedResultSet, |
916 | + ) |
917 | +======= |
918 | +from canonical.database.enumcol import EnumCol |
919 | +from canonical.database.sqlbase import sqlvalues |
920 | +from canonical.launchpad.components.decoratedresultset import ( |
921 | + DecoratedResultSet, |
922 | + ) |
923 | +>>>>>>> MERGE-SOURCE |
924 | from canonical.launchpad.interfaces.lpstorm import ( |
925 | IMasterStore, |
926 | IStore, |
927 | @@ -67,125 +75,256 @@ |
928 | |
929 | implements(IPackageCopyJob) |
930 | |
931 | - __storm_table__ = 'PackageCopyJob' |
932 | - |
933 | - id = Int(primary=True) |
934 | - |
935 | - job_id = Int(name='job') |
936 | - job = Reference(job_id, Job.id) |
937 | - |
938 | - source_archive_id = Int(name='source_archive') |
939 | - source_archive = Reference(source_archive_id, Archive.id) |
940 | - |
941 | - target_archive_id = Int(name='target_archive') |
942 | - target_archive = Reference(target_archive_id, Archive.id) |
943 | - |
944 | - target_distroseries_id = Int(name='target_distroseries') |
945 | - target_distroseries = Reference(target_distroseries_id, DistroSeries.id) |
946 | - |
947 | - package_name = Unicode('package_name') |
948 | - copy_policy = EnumCol(enum=PackageCopyPolicy) |
949 | - |
950 | - job_type = EnumCol(enum=PackageCopyJobType, notNull=True) |
951 | - |
952 | - _json_data = Unicode('json_data') |
953 | - |
954 | - def __init__(self, source_archive, target_archive, target_distroseries, |
955 | - job_type, metadata, package_name=None, copy_policy=None): |
956 | - super(PackageCopyJob, self).__init__() |
957 | - self.job = Job() |
958 | - self.job_type = job_type |
959 | - self.source_archive = source_archive |
960 | - self.target_archive = target_archive |
961 | - self.target_distroseries = target_distroseries |
962 | - self.package_name = unicode(package_name) |
963 | - self.copy_policy = copy_policy |
964 | - self._json_data = self.serializeMetadata(metadata) |
965 | - |
966 | - @classmethod |
967 | - def serializeMetadata(cls, metadata_dict): |
968 | - """Serialize a dict of metadata into a unicode string.""" |
969 | - return simplejson.dumps(metadata_dict).decode('utf-8') |
970 | - |
971 | - @property |
972 | - def metadata(self): |
973 | - return simplejson.loads(self._json_data) |
974 | - |
975 | - def extendMetadata(self, metadata_dict): |
976 | - """Add metadata_dict to the existing metadata.""" |
977 | - existing = self.metadata |
978 | - existing.update(metadata_dict) |
979 | - self._json_data = self.serializeMetadata(existing) |
980 | - |
981 | - |
982 | -class PackageCopyJobDerived(BaseRunnableJob): |
983 | - """Abstract class for deriving from PackageCopyJob.""" |
984 | - |
985 | - delegates(IPackageCopyJob) |
986 | - |
987 | - def __init__(self, job): |
988 | - self.context = job |
989 | - |
990 | - @classmethod |
991 | - def get(cls, job_id): |
992 | - """Get a job by id. |
993 | - |
994 | - :return: the PackageCopyJob with the specified id, as the current |
995 | - PackageCopyJobDerived subclass. |
996 | - :raises: NotFoundError if there is no job with the specified id, or |
997 | - its job_type does not match the desired subclass. |
998 | - """ |
999 | - job = IStore(PackageCopyJob).get(PackageCopyJob, job_id) |
1000 | - if job.job_type != cls.class_job_type: |
1001 | - raise NotFoundError( |
1002 | - 'No object found with id %d and type %s' % (job_id, |
1003 | - cls.class_job_type.title)) |
1004 | - return cls(job) |
1005 | - |
1006 | - @classmethod |
1007 | - def iterReady(cls): |
1008 | - """Iterate through all ready PackageCopyJobs.""" |
1009 | - jobs = IStore(PackageCopyJob).find( |
1010 | - PackageCopyJob, |
1011 | - And(PackageCopyJob.job_type == cls.class_job_type, |
1012 | - PackageCopyJob.job == Job.id, |
1013 | - Job.id.is_in(Job.ready_jobs))) |
1014 | - return (cls(job) for job in jobs) |
1015 | - |
1016 | - def getOopsVars(self): |
1017 | - """See `IRunnableJob`.""" |
1018 | - vars = super(PackageCopyJobDerived, self).getOopsVars() |
1019 | - vars.extend([ |
1020 | - ('source_archive_id', self.context.source_archive_id), |
1021 | - ('target_archive_id', self.context.target_archive_id), |
1022 | - ('target_distroseries_id', self.context.target_distroseries_id), |
1023 | - ('package_copy_job_id', self.context.id), |
1024 | - ('package_copy_job_type', self.context.job_type.title), |
1025 | - ]) |
1026 | - return vars |
1027 | - |
1028 | - @property |
1029 | - def copy_policy(self): |
1030 | - """See `PlainPackageCopyJob`.""" |
1031 | - return self.context.copy_policy |
1032 | - |
1033 | - |
1034 | -class PlainPackageCopyJob(PackageCopyJobDerived): |
1035 | - """Job that copies a package from one archive to another.""" |
1036 | - # This job type serves in different places: it supports copying |
1037 | - # packages between archives, but also the syncing of packages from |
1038 | - # parents into a derived distroseries. We may split these into |
1039 | - # separate types at some point, but for now we (allenap, bigjools, |
1040 | - # jtv) chose to keep it as one. |
1041 | - |
1042 | - implements(IPlainPackageCopyJob) |
1043 | - |
1044 | - class_job_type = PackageCopyJobType.PLAIN |
1045 | - classProvides(IPlainPackageCopyJobSource) |
1046 | - |
1047 | - @classmethod |
1048 | - def create(cls, package_name, source_archive, |
1049 | +<<<<<<< TREE |
1050 | + __storm_table__ = 'PackageCopyJob' |
1051 | + |
1052 | + id = Int(primary=True) |
1053 | + |
1054 | + job_id = Int(name='job') |
1055 | + job = Reference(job_id, Job.id) |
1056 | + |
1057 | + source_archive_id = Int(name='source_archive') |
1058 | + source_archive = Reference(source_archive_id, Archive.id) |
1059 | + |
1060 | + target_archive_id = Int(name='target_archive') |
1061 | + target_archive = Reference(target_archive_id, Archive.id) |
1062 | + |
1063 | + target_distroseries_id = Int(name='target_distroseries') |
1064 | + target_distroseries = Reference(target_distroseries_id, DistroSeries.id) |
1065 | + |
1066 | + package_name = Unicode('package_name') |
1067 | + copy_policy = EnumCol(enum=PackageCopyPolicy) |
1068 | + |
1069 | + job_type = EnumCol(enum=PackageCopyJobType, notNull=True) |
1070 | + |
1071 | + _json_data = Unicode('json_data') |
1072 | + |
1073 | + def __init__(self, source_archive, target_archive, target_distroseries, |
1074 | + job_type, metadata, package_name=None, copy_policy=None): |
1075 | + super(PackageCopyJob, self).__init__() |
1076 | + self.job = Job() |
1077 | + self.job_type = job_type |
1078 | + self.source_archive = source_archive |
1079 | + self.target_archive = target_archive |
1080 | + self.target_distroseries = target_distroseries |
1081 | + self.package_name = unicode(package_name) |
1082 | + self.copy_policy = copy_policy |
1083 | + self._json_data = self.serializeMetadata(metadata) |
1084 | + |
1085 | + @classmethod |
1086 | + def serializeMetadata(cls, metadata_dict): |
1087 | + """Serialize a dict of metadata into a unicode string.""" |
1088 | + return simplejson.dumps(metadata_dict).decode('utf-8') |
1089 | + |
1090 | + @property |
1091 | + def metadata(self): |
1092 | + return simplejson.loads(self._json_data) |
1093 | + |
1094 | + def extendMetadata(self, metadata_dict): |
1095 | + """Add metadata_dict to the existing metadata.""" |
1096 | + existing = self.metadata |
1097 | + existing.update(metadata_dict) |
1098 | + self._json_data = self.serializeMetadata(existing) |
1099 | + |
1100 | + |
1101 | +class PackageCopyJobDerived(BaseRunnableJob): |
1102 | + """Abstract class for deriving from PackageCopyJob.""" |
1103 | + |
1104 | + delegates(IPackageCopyJob) |
1105 | + |
1106 | + def __init__(self, job): |
1107 | + self.context = job |
1108 | + |
1109 | + @classmethod |
1110 | + def get(cls, job_id): |
1111 | + """Get a job by id. |
1112 | + |
1113 | + :return: the PackageCopyJob with the specified id, as the current |
1114 | + PackageCopyJobDerived subclass. |
1115 | + :raises: NotFoundError if there is no job with the specified id, or |
1116 | + its job_type does not match the desired subclass. |
1117 | + """ |
1118 | + job = IStore(PackageCopyJob).get(PackageCopyJob, job_id) |
1119 | + if job.job_type != cls.class_job_type: |
1120 | + raise NotFoundError( |
1121 | + 'No object found with id %d and type %s' % (job_id, |
1122 | + cls.class_job_type.title)) |
1123 | + return cls(job) |
1124 | + |
1125 | + @classmethod |
1126 | + def iterReady(cls): |
1127 | + """Iterate through all ready PackageCopyJobs.""" |
1128 | + jobs = IStore(PackageCopyJob).find( |
1129 | + PackageCopyJob, |
1130 | + And(PackageCopyJob.job_type == cls.class_job_type, |
1131 | + PackageCopyJob.job == Job.id, |
1132 | + Job.id.is_in(Job.ready_jobs))) |
1133 | + return (cls(job) for job in jobs) |
1134 | + |
1135 | + def getOopsVars(self): |
1136 | + """See `IRunnableJob`.""" |
1137 | + vars = super(PackageCopyJobDerived, self).getOopsVars() |
1138 | + vars.extend([ |
1139 | + ('source_archive_id', self.context.source_archive_id), |
1140 | + ('target_archive_id', self.context.target_archive_id), |
1141 | + ('target_distroseries_id', self.context.target_distroseries_id), |
1142 | + ('package_copy_job_id', self.context.id), |
1143 | + ('package_copy_job_type', self.context.job_type.title), |
1144 | + ]) |
1145 | + return vars |
1146 | + |
1147 | + @property |
1148 | + def copy_policy(self): |
1149 | + """See `PlainPackageCopyJob`.""" |
1150 | + return self.context.copy_policy |
1151 | + |
1152 | + |
1153 | +class PlainPackageCopyJob(PackageCopyJobDerived): |
1154 | + """Job that copies a package from one archive to another.""" |
1155 | + # This job type serves in different places: it supports copying |
1156 | + # packages between archives, but also the syncing of packages from |
1157 | + # parents into a derived distroseries. We may split these into |
1158 | + # separate types at some point, but for now we (allenap, bigjools, |
1159 | + # jtv) chose to keep it as one. |
1160 | + |
1161 | + implements(IPlainPackageCopyJob) |
1162 | + |
1163 | + class_job_type = PackageCopyJobType.PLAIN |
1164 | + classProvides(IPlainPackageCopyJobSource) |
1165 | + |
1166 | + @classmethod |
1167 | + def create(cls, package_name, source_archive, |
1168 | +======= |
1169 | + __storm_table__ = 'PackageCopyJob' |
1170 | + |
1171 | + id = Int(primary=True) |
1172 | + |
1173 | + job_id = Int(name='job') |
1174 | + job = Reference(job_id, Job.id) |
1175 | + |
1176 | + source_archive_id = Int(name='source_archive') |
1177 | + source_archive = Reference(source_archive_id, Archive.id) |
1178 | + |
1179 | + target_archive_id = Int(name='target_archive') |
1180 | + target_archive = Reference(target_archive_id, Archive.id) |
1181 | + |
1182 | + target_distroseries_id = Int(name='target_distroseries') |
1183 | + target_distroseries = Reference(target_distroseries_id, DistroSeries.id) |
1184 | + |
1185 | + package_name = Unicode('package_name') |
1186 | + copy_policy = EnumCol(enum=PackageCopyPolicy) |
1187 | + |
1188 | + job_type = EnumCol(enum=PackageCopyJobType, notNull=True) |
1189 | + |
1190 | + _json_data = Unicode('json_data') |
1191 | + |
1192 | + def __init__(self, source_archive, target_archive, target_distroseries, |
1193 | + job_type, metadata, package_name=None, copy_policy=None): |
1194 | + super(PackageCopyJob, self).__init__() |
1195 | + self.job = Job() |
1196 | + self.job_type = job_type |
1197 | + self.source_archive = source_archive |
1198 | + self.target_archive = target_archive |
1199 | + self.target_distroseries = target_distroseries |
1200 | + self.package_name = unicode(package_name) |
1201 | + self.copy_policy = copy_policy |
1202 | + self._json_data = self.serializeMetadata(metadata) |
1203 | + |
1204 | + @classmethod |
1205 | + def serializeMetadata(cls, metadata_dict): |
1206 | + """Serialize a dict of metadata into a unicode string.""" |
1207 | + return simplejson.dumps(metadata_dict).decode('utf-8') |
1208 | + |
1209 | + @property |
1210 | + def metadata(self): |
1211 | + return simplejson.loads(self._json_data) |
1212 | + |
1213 | + def extendMetadata(self, metadata_dict): |
1214 | + """Add metadata_dict to the existing metadata.""" |
1215 | + existing = self.metadata |
1216 | + existing.update(metadata_dict) |
1217 | + self._json_data = self.serializeMetadata(existing) |
1218 | + |
1219 | + |
1220 | +class PackageCopyJobDerived(BaseRunnableJob): |
1221 | + """Abstract class for deriving from PackageCopyJob.""" |
1222 | + |
1223 | + delegates(IPackageCopyJob) |
1224 | + |
1225 | + def __init__(self, job): |
1226 | + self.context = job |
1227 | + |
1228 | + @classmethod |
1229 | + def get(cls, job_id): |
1230 | + """Get a job by id. |
1231 | + |
1232 | + :return: the PackageCopyJob with the specified id, as the current |
1233 | + PackageCopyJobDerived subclass. |
1234 | + :raises: NotFoundError if there is no job with the specified id, or |
1235 | + its job_type does not match the desired subclass. |
1236 | + """ |
1237 | + job = IStore(PackageCopyJob).get(PackageCopyJob, job_id) |
1238 | + if job.job_type != cls.class_job_type: |
1239 | + raise NotFoundError( |
1240 | + 'No object found with id %d and type %s' % (job_id, |
1241 | + cls.class_job_type.title)) |
1242 | + return cls(job) |
1243 | + |
1244 | + @classmethod |
1245 | + def iterReady(cls): |
1246 | + """Iterate through all ready PackageCopyJobs.""" |
1247 | + jobs = IStore(PackageCopyJob).find( |
1248 | + PackageCopyJob, |
1249 | + And(PackageCopyJob.job_type == cls.class_job_type, |
1250 | + PackageCopyJob.job == Job.id, |
1251 | + Job.id.is_in(Job.ready_jobs))) |
1252 | + return (cls(job) for job in jobs) |
1253 | + |
1254 | + def getOopsVars(self): |
1255 | + """See `IRunnableJob`.""" |
1256 | + vars = super(PackageCopyJobDerived, self).getOopsVars() |
1257 | + vars.extend([ |
1258 | + ('source_archive_id', self.context.source_archive_id), |
1259 | + ('target_archive_id', self.context.target_archive_id), |
1260 | + ('target_distroseries_id', self.context.target_distroseries_id), |
1261 | + ('package_copy_job_id', self.context.id), |
1262 | + ('package_copy_job_type', self.context.job_type.title), |
1263 | + ]) |
1264 | + return vars |
1265 | + |
1266 | + @property |
1267 | + def copy_policy(self): |
1268 | + """See `PlainPackageCopyJob`.""" |
1269 | + return self.context.copy_policy |
1270 | + |
1271 | + |
1272 | +class PlainPackageCopyJob(PackageCopyJobDerived): |
1273 | + """Job that copies a package from one archive to another.""" |
1274 | + # This job type serves in different places: it supports copying |
1275 | + # packages between archives, but also the syncing of packages from |
1276 | + # parents into a derived distroseries. We may split these into |
1277 | + # separate types at some point, but for now we (allenap, bigjools, |
1278 | + # jtv) chose to keep it as one. |
1279 | + |
1280 | + implements(IPlainPackageCopyJob) |
1281 | + |
1282 | + class_job_type = PackageCopyJobType.PLAIN |
1283 | + classProvides(IPlainPackageCopyJobSource) |
1284 | + |
1285 | + @classmethod |
1286 | + def _makeMetadata(cls, target_pocket, package_version, include_binaries): |
1287 | +        """Compose the metadata dict for a copy job.""" |
1288 | + return { |
1289 | + 'target_pocket': target_pocket.value, |
1290 | + 'package_version': package_version, |
1291 | + 'include_binaries': bool(include_binaries), |
1292 | + } |
1293 | + |
1294 | + @classmethod |
1295 | + def create(cls, package_name, source_archive, |
1296 | +>>>>>>> MERGE-SOURCE |
1297 | target_archive, target_distroseries, target_pocket, |
1298 | +<<<<<<< TREE |
1299 | include_binaries=False, package_version=None, |
1300 | copy_policy=PackageCopyPolicy.INSECURE): |
1301 | """See `IPlainPackageCopyJobSource`.""" |
1302 | @@ -204,43 +343,149 @@ |
1303 | copy_policy=copy_policy, |
1304 | metadata=metadata) |
1305 | IMasterStore(PackageCopyJob).add(job) |
1306 | +======= |
1307 | + include_binaries=False, package_version=None, |
1308 | + copy_policy=PackageCopyPolicy.INSECURE): |
1309 | + """See `IPlainPackageCopyJobSource`.""" |
1310 | + assert package_version is not None, "No package version specified." |
1311 | + metadata = cls._makeMetadata( |
1312 | + target_pocket, package_version, include_binaries) |
1313 | + job = PackageCopyJob( |
1314 | + job_type=cls.class_job_type, |
1315 | + source_archive=source_archive, |
1316 | + target_archive=target_archive, |
1317 | + target_distroseries=target_distroseries, |
1318 | + package_name=package_name, |
1319 | + copy_policy=copy_policy, |
1320 | + metadata=metadata) |
1321 | + IMasterStore(PackageCopyJob).add(job) |
1322 | +>>>>>>> MERGE-SOURCE |
1323 | return cls(job) |
1324 | |
1325 | @classmethod |
1326 | - def getActiveJobs(cls, target_archive): |
1327 | - """See `IPlainPackageCopyJobSource`.""" |
1328 | - jobs = IStore(PackageCopyJob).find( |
1329 | - PackageCopyJob, |
1330 | - PackageCopyJob.job_type == cls.class_job_type, |
1331 | - PackageCopyJob.target_archive == target_archive, |
1332 | - Job.id == PackageCopyJob.job_id, |
1333 | - Job._status == JobStatus.WAITING) |
1334 | - jobs = jobs.order_by(PackageCopyJob.id) |
1335 | - return DecoratedResultSet(jobs, cls) |
1336 | - |
1337 | - @classmethod |
1338 | - def getPendingJobsForTargetSeries(cls, target_series): |
1339 | - """Get upcoming jobs for `target_series`, ordered by age.""" |
1340 | - raw_jobs = IStore(PackageCopyJob).find( |
1341 | - PackageCopyJob, |
1342 | - Job.id == PackageCopyJob.job_id, |
1343 | - PackageCopyJob.job_type == cls.class_job_type, |
1344 | - PackageCopyJob.target_distroseries == target_series, |
1345 | - Job._status.is_in(Job.PENDING_STATUSES)) |
1346 | - raw_jobs = raw_jobs.order_by(PackageCopyJob.id) |
1347 | - return DecoratedResultSet(raw_jobs, cls) |
1348 | - |
1349 | - @classmethod |
1350 | - def getPendingJobsPerPackage(cls, target_series): |
1351 | - """See `IPlainPackageCopyJobSource`.""" |
1352 | - result = {} |
1353 | - # Go through jobs in-order, picking the first matching job for |
1354 | - # any (package, version) tuple. Because of how |
1355 | - # getPendingJobsForTargetSeries orders its results, the first |
1356 | - # will be the oldest and thus presumably the first to finish. |
1357 | - for job in cls.getPendingJobsForTargetSeries(target_series): |
1358 | - result.setdefault(job.package_name, job) |
1359 | - return result |
1360 | +<<<<<<< TREE |
1361 | + def getActiveJobs(cls, target_archive): |
1362 | + """See `IPlainPackageCopyJobSource`.""" |
1363 | + jobs = IStore(PackageCopyJob).find( |
1364 | + PackageCopyJob, |
1365 | + PackageCopyJob.job_type == cls.class_job_type, |
1366 | + PackageCopyJob.target_archive == target_archive, |
1367 | + Job.id == PackageCopyJob.job_id, |
1368 | + Job._status == JobStatus.WAITING) |
1369 | + jobs = jobs.order_by(PackageCopyJob.id) |
1370 | + return DecoratedResultSet(jobs, cls) |
1371 | + |
1372 | + @classmethod |
1373 | + def getPendingJobsForTargetSeries(cls, target_series): |
1374 | + """Get upcoming jobs for `target_series`, ordered by age.""" |
1375 | + raw_jobs = IStore(PackageCopyJob).find( |
1376 | + PackageCopyJob, |
1377 | + Job.id == PackageCopyJob.job_id, |
1378 | + PackageCopyJob.job_type == cls.class_job_type, |
1379 | + PackageCopyJob.target_distroseries == target_series, |
1380 | + Job._status.is_in(Job.PENDING_STATUSES)) |
1381 | + raw_jobs = raw_jobs.order_by(PackageCopyJob.id) |
1382 | + return DecoratedResultSet(raw_jobs, cls) |
1383 | + |
1384 | + @classmethod |
1385 | + def getPendingJobsPerPackage(cls, target_series): |
1386 | + """See `IPlainPackageCopyJobSource`.""" |
1387 | + result = {} |
1388 | + # Go through jobs in-order, picking the first matching job for |
1389 | + # any (package, version) tuple. Because of how |
1390 | + # getPendingJobsForTargetSeries orders its results, the first |
1391 | + # will be the oldest and thus presumably the first to finish. |
1392 | + for job in cls.getPendingJobsForTargetSeries(target_series): |
1393 | + result.setdefault(job.package_name, job) |
1394 | + return result |
1395 | +======= |
1396 | + def _composeJobInsertionTuple(cls, target_distroseries, copy_policy, |
1397 | + include_binaries, job_id, copy_task): |
1398 | + """Create an SQL fragment for inserting a job into the database. |
1399 | + |
1400 | + :return: A string representing an SQL tuple containing initializers |
1401 | + for a `PackageCopyJob` in the database (minus `id`, which is |
1402 | + assigned automatically). Contents are escaped for use in SQL. |
1403 | + """ |
1404 | + ( |
1405 | + package_name, |
1406 | + package_version, |
1407 | + source_archive, |
1408 | + target_archive, |
1409 | + target_pocket, |
1410 | + ) = copy_task |
1411 | + metadata = cls._makeMetadata( |
1412 | + target_pocket, package_version, include_binaries) |
1413 | + data = ( |
1414 | + cls.class_job_type, target_distroseries, copy_policy, |
1415 | + source_archive, target_archive, package_name, job_id, |
1416 | + PackageCopyJob.serializeMetadata(metadata)) |
1417 | + format_string = "(%s)" % ", ".join(["%s"] * len(data)) |
1418 | + return format_string % sqlvalues(*data) |
1419 | + |
1420 | + @classmethod |
1421 | + def createMultiple(cls, target_distroseries, copy_tasks, |
1422 | + copy_policy=PackageCopyPolicy.INSECURE, |
1423 | + include_binaries=False): |
1424 | + """See `IPlainPackageCopyJobSource`.""" |
1425 | + store = IMasterStore(Job) |
1426 | + job_ids = Job.createMultiple(store, len(copy_tasks)) |
1427 | + job_contents = [ |
1428 | + cls._composeJobInsertionTuple( |
1429 | + target_distroseries, copy_policy, include_binaries, job_id, |
1430 | + task) |
1431 | + for job_id, task in zip(job_ids, copy_tasks)] |
1432 | + result = store.execute(""" |
1433 | + INSERT INTO PackageCopyJob ( |
1434 | + job_type, |
1435 | + target_distroseries, |
1436 | + copy_policy, |
1437 | + source_archive, |
1438 | + target_archive, |
1439 | + package_name, |
1440 | + job, |
1441 | + json_data) |
1442 | + VALUES %s |
1443 | + RETURNING id |
1444 | + """ % ", ".join(job_contents)) |
1445 | + return [job_id for job_id, in result] |
1446 | + |
1447 | + @classmethod |
1448 | + def getActiveJobs(cls, target_archive): |
1449 | + """See `IPlainPackageCopyJobSource`.""" |
1450 | + jobs = IStore(PackageCopyJob).find( |
1451 | + PackageCopyJob, |
1452 | + PackageCopyJob.job_type == cls.class_job_type, |
1453 | + PackageCopyJob.target_archive == target_archive, |
1454 | + Job.id == PackageCopyJob.job_id, |
1455 | + Job._status == JobStatus.WAITING) |
1456 | + jobs = jobs.order_by(PackageCopyJob.id) |
1457 | + return DecoratedResultSet(jobs, cls) |
1458 | + |
1459 | + @classmethod |
1460 | + def getPendingJobsForTargetSeries(cls, target_series): |
1461 | + """Get upcoming jobs for `target_series`, ordered by age.""" |
1462 | + raw_jobs = IStore(PackageCopyJob).find( |
1463 | + PackageCopyJob, |
1464 | + Job.id == PackageCopyJob.job_id, |
1465 | + PackageCopyJob.job_type == cls.class_job_type, |
1466 | + PackageCopyJob.target_distroseries == target_series, |
1467 | + Job._status.is_in(Job.PENDING_STATUSES)) |
1468 | + raw_jobs = raw_jobs.order_by(PackageCopyJob.id) |
1469 | + return DecoratedResultSet(raw_jobs, cls) |
1470 | + |
1471 | + @classmethod |
1472 | + def getPendingJobsPerPackage(cls, target_series): |
1473 | + """See `IPlainPackageCopyJobSource`.""" |
1474 | + result = {} |
1475 | + # Go through jobs in-order, picking the first matching job for |
1476 | + # any (package, version) tuple. Because of how |
1477 | + # getPendingJobsForTargetSeries orders its results, the first |
1478 | + # will be the oldest and thus presumably the first to finish. |
1479 | + for job in cls.getPendingJobsForTargetSeries(target_series): |
1480 | + result.setdefault(job.package_name, job) |
1481 | + return result |
1482 | +>>>>>>> MERGE-SOURCE |
1483 | |
1484 | @property |
1485 | def target_pocket(self): |
1486 | |
1487 | === modified file 'lib/lp/soyuz/tests/test_packagecopyjob.py' |
1488 | --- lib/lp/soyuz/tests/test_packagecopyjob.py 2011-06-03 08:53:14 +0000 |
1489 | +++ lib/lp/soyuz/tests/test_packagecopyjob.py 2011-06-09 10:59:00 +0000 |
1490 | @@ -116,7 +116,55 @@ |
1491 | self.assertEqual("foo", job.package_name) |
1492 | self.assertEqual("1.0-1", job.package_version) |
1493 | self.assertEquals(False, job.include_binaries) |
1494 | - self.assertEquals(PackageCopyPolicy.MASS_SYNC, job.copy_policy) |
1495 | +<<<<<<< TREE |
1496 | + self.assertEquals(PackageCopyPolicy.MASS_SYNC, job.copy_policy) |
1497 | +======= |
1498 | + self.assertEquals(PackageCopyPolicy.MASS_SYNC, job.copy_policy) |
1499 | + |
1500 | + def test_createMultiple_creates_one_job_per_copy(self): |
1501 | + mother = self.factory.makeDistroSeriesParent() |
1502 | + derived_series = mother.derived_series |
1503 | + father = self.factory.makeDistroSeriesParent( |
1504 | + derived_series=derived_series) |
1505 | + mother_package = self.factory.makeSourcePackageName() |
1506 | + father_package = self.factory.makeSourcePackageName() |
1507 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1508 | + copy_tasks = [ |
1509 | + ( |
1510 | + mother_package.name, |
1511 | + "1.5mother1", |
1512 | + mother.parent_series.main_archive, |
1513 | + derived_series.main_archive, |
1514 | + PackagePublishingPocket.RELEASE, |
1515 | + ), |
1516 | + ( |
1517 | + father_package.name, |
1518 | + "0.9father1", |
1519 | + father.parent_series.main_archive, |
1520 | + derived_series.main_archive, |
1521 | + PackagePublishingPocket.UPDATES, |
1522 | + ), |
1523 | + ] |
1524 | + job_ids = list( |
1525 | + job_source.createMultiple(mother.derived_series, copy_tasks)) |
1526 | + jobs = list(job_source.getActiveJobs(derived_series.main_archive)) |
1527 | + self.assertContentEqual(job_ids, [job.id for job in jobs]) |
1528 | + self.assertEqual(len(copy_tasks), len(set([job.job for job in jobs]))) |
1529 | + # Get jobs into the same order as copy_tasks, for ease of |
1530 | + # comparison. |
1531 | + if jobs[0].package_name != mother_package.name: |
1532 | + jobs = reversed(jobs) |
1533 | + requested_copies = [ |
1534 | + ( |
1535 | + job.package_name, |
1536 | + job.package_version, |
1537 | + job.source_archive, |
1538 | + job.target_archive, |
1539 | + job.target_pocket, |
1540 | + ) |
1541 | + for job in jobs] |
1542 | + self.assertEqual(copy_tasks, requested_copies) |
1543 | +>>>>>>> MERGE-SOURCE |
1544 | |
1545 | def test_getActiveJobs(self): |
1546 | # getActiveJobs() can retrieve all active jobs for an archive. |
1547 | @@ -320,202 +368,404 @@ |
1548 | copied_source_package = archive2.getPublishedSources( |
1549 | name="libc", version="2.8-1", exact_match=True).first() |
1550 | self.assertIsNot(copied_source_package, None) |
1551 | - |
1552 | - def test___repr__(self): |
1553 | - distroseries = self.factory.makeDistroSeries() |
1554 | - archive1 = self.factory.makeArchive(distroseries.distribution) |
1555 | - archive2 = self.factory.makeArchive(distroseries.distribution) |
1556 | - source = getUtility(IPlainPackageCopyJobSource) |
1557 | - job = source.create( |
1558 | - package_name="foo", source_archive=archive1, |
1559 | - target_archive=archive2, target_distroseries=distroseries, |
1560 | - target_pocket=PackagePublishingPocket.RELEASE, |
1561 | - package_version="1.0-1", include_binaries=True) |
1562 | - self.assertEqual( |
1563 | - ("<PlainPackageCopyJob to copy package foo from " |
1564 | - "{distroseries.distribution.name}/{archive1.name} to " |
1565 | - "{distroseries.distribution.name}/{archive2.name}, " |
1566 | - "RELEASE pocket, in {distroseries.distribution.name} " |
1567 | - "{distroseries.name}, including binaries>").format( |
1568 | - distroseries=distroseries, archive1=archive1, |
1569 | - archive2=archive2), |
1570 | - repr(job)) |
1571 | - |
1572 | - def test_getPendingJobsPerPackage_finds_jobs(self): |
1573 | - # getPendingJobsPerPackage finds jobs, and the packages they |
1574 | - # belong to. |
1575 | - dsd = self.factory.makeDistroSeriesDifference() |
1576 | - job = self.makeJob(dsd) |
1577 | - job_source = getUtility(IPlainPackageCopyJobSource) |
1578 | - self.assertEqual( |
1579 | - {dsd.source_package_name.name: job}, |
1580 | - job_source.getPendingJobsPerPackage(dsd.derived_series)) |
1581 | - |
1582 | - def test_getPendingJobsPerPackage_ignores_other_distroseries(self): |
1583 | - # getPendingJobsPerPackage only looks for jobs on the indicated |
1584 | - # distroseries. |
1585 | - self.makeJob() |
1586 | - other_series = self.factory.makeDistroSeries() |
1587 | - job_source = getUtility(IPlainPackageCopyJobSource) |
1588 | - self.assertEqual( |
1589 | - {}, job_source.getPendingJobsPerPackage(other_series)) |
1590 | - |
1591 | - def test_getPendingJobsPerPackage_only_returns_pending_jobs(self): |
1592 | - # getPendingJobsPerPackage ignores jobs that have already been |
1593 | - # run. |
1594 | - dsd = self.factory.makeDistroSeriesDifference() |
1595 | - job = self.makeJob(dsd) |
1596 | - job_source = getUtility(IPlainPackageCopyJobSource) |
1597 | - found_by_state = {} |
1598 | - for status in JobStatus.items: |
1599 | - removeSecurityProxy(job).job._status = status |
1600 | - result = job_source.getPendingJobsPerPackage(dsd.derived_series) |
1601 | - if len(result) > 0: |
1602 | - found_by_state[status] = result[dsd.source_package_name.name] |
1603 | - expected = { |
1604 | - JobStatus.WAITING: job, |
1605 | - JobStatus.RUNNING: job, |
1606 | - JobStatus.SUSPENDED: job, |
1607 | - } |
1608 | - self.assertEqual(expected, found_by_state) |
1609 | - |
1610 | - def test_getPendingJobsPerPackage_distinguishes_jobs(self): |
1611 | - # getPendingJobsPerPackage associates the right job with the |
1612 | - # right package. |
1613 | - derived_series = self.factory.makeDistroSeries() |
1614 | - dsds = [ |
1615 | - self.factory.makeDistroSeriesDifference( |
1616 | - derived_series=derived_series) |
1617 | - for counter in xrange(2)] |
1618 | - jobs = map(self.makeJob, dsds) |
1619 | - job_source = getUtility(IPlainPackageCopyJobSource) |
1620 | - self.assertEqual( |
1621 | - dict(zip([dsd.source_package_name.name for dsd in dsds], jobs)), |
1622 | - job_source.getPendingJobsPerPackage(derived_series)) |
1623 | - |
1624 | - def test_getPendingJobsPerPackage_picks_oldest_job_for_dsd(self): |
1625 | - # If there are multiple jobs for one package, |
1626 | - # getPendingJobsPerPackage picks the oldest. |
1627 | - dsd = self.factory.makeDistroSeriesDifference() |
1628 | - jobs = [self.makeJob(dsd) for counter in xrange(2)] |
1629 | - job_source = getUtility(IPlainPackageCopyJobSource) |
1630 | - self.assertEqual( |
1631 | - {dsd.source_package_name.name: jobs[0]}, |
1632 | - job_source.getPendingJobsPerPackage(dsd.derived_series)) |
1633 | - |
1634 | - def test_getPendingJobsPerPackage_ignores_dsds_without_jobs(self): |
1635 | - # getPendingJobsPerPackage produces no dict entry for packages |
1636 | - # that have no pending jobs, even if they do have DSDs. |
1637 | - dsd = self.factory.makeDistroSeriesDifference() |
1638 | - job_source = getUtility(IPlainPackageCopyJobSource) |
1639 | - self.assertEqual( |
1640 | - {}, job_source.getPendingJobsPerPackage(dsd.derived_series)) |
1641 | - |
1642 | - def test_findMatchingDSDs_matches_all_DSDs_for_job(self): |
1643 | - # findMatchingDSDs finds matching DSDs for any of the packages |
1644 | - # in the job. |
1645 | - dsd = self.factory.makeDistroSeriesDifference() |
1646 | - naked_job = removeSecurityProxy(self.makeJob(dsd)) |
1647 | - self.assertContentEqual([dsd], naked_job.findMatchingDSDs()) |
1648 | - |
1649 | - def test_findMatchingDSDs_ignores_other_source_series(self): |
1650 | - # findMatchingDSDs tries to ignore DSDs that are for different |
1651 | - # parent series than the job's source series. (This can't be |
1652 | - # done with perfect precision because the job doesn't keep track |
1653 | - # of source distroseries, but in practice it should be good |
1654 | - # enough). |
1655 | - dsd = self.factory.makeDistroSeriesDifference() |
1656 | - naked_job = removeSecurityProxy(self.makeJob(dsd)) |
1657 | - |
1658 | - # If the dsd differs only in parent series, that's enough to |
1659 | - # make it a non-match. |
1660 | - removeSecurityProxy(dsd).parent_series = ( |
1661 | - self.factory.makeDistroSeries()) |
1662 | - |
1663 | - self.assertContentEqual([], naked_job.findMatchingDSDs()) |
1664 | - |
1665 | - def test_findMatchingDSDs_ignores_other_packages(self): |
1666 | - # findMatchingDSDs does not return DSDs that are similar to the |
1667 | - # information in the job, but are for different packages. |
1668 | - dsd = self.factory.makeDistroSeriesDifference() |
1669 | - self.factory.makeDistroSeriesDifference( |
1670 | - derived_series=dsd.derived_series, |
1671 | - parent_series=dsd.parent_series) |
1672 | - naked_job = removeSecurityProxy(self.makeJob(dsd)) |
1673 | - self.assertContentEqual([dsd], naked_job.findMatchingDSDs()) |
1674 | - |
1675 | - def test_addSourceOverride(self): |
1676 | - # Test the addOverride method which adds an ISourceOverride to the |
1677 | - # metadata. |
1678 | - name = self.factory.makeSourcePackageName() |
1679 | - component = self.factory.makeComponent() |
1680 | - section=self.factory.makeSection() |
1681 | - pcj = self.factory.makePlainPackageCopyJob() |
1682 | - self.layer.txn.commit() |
1683 | - self.layer.switchDbUser('sync_packages') |
1684 | - |
1685 | - override = SourceOverride( |
1686 | - source_package_name=name, |
1687 | - component=component, |
1688 | - section=section) |
1689 | - pcj.addSourceOverride(override) |
1690 | - |
1691 | - metadata_component = getUtility( |
1692 | - IComponentSet)[pcj.metadata["component_override"]] |
1693 | - metadata_section = getUtility( |
1694 | - ISectionSet)[pcj.metadata["section_override"]] |
1695 | - matcher = MatchesStructure( |
1696 | - component=Equals(metadata_component), |
1697 | - section=Equals(metadata_section)) |
1698 | - self.assertThat(override, matcher) |
1699 | - |
1700 | - def test_getSourceOverride(self): |
1701 | - # Test the getSourceOverride which gets an ISourceOverride from |
1702 | - # the metadata. |
1703 | - name = self.factory.makeSourcePackageName() |
1704 | - component = self.factory.makeComponent() |
1705 | - section=self.factory.makeSection() |
1706 | - pcj = self.factory.makePlainPackageCopyJob( |
1707 | - package_name=name.name, package_version="1.0") |
1708 | - self.layer.txn.commit() |
1709 | - self.layer.switchDbUser('sync_packages') |
1710 | - |
1711 | - override = SourceOverride( |
1712 | - source_package_name=name, |
1713 | - component=component, |
1714 | - section=section) |
1715 | - pcj.addSourceOverride(override) |
1716 | - |
1717 | - self.assertEqual(override, pcj.getSourceOverride()) |
1718 | - |
1719 | - def test_getPolicyImplementation_returns_policy(self): |
1720 | - # getPolicyImplementation returns the ICopyPolicy that was |
1721 | - # chosen for the job. |
1722 | - dsd = self.factory.makeDistroSeriesDifference() |
1723 | - for policy in PackageCopyPolicy.items: |
1724 | - naked_job = removeSecurityProxy( |
1725 | - self.makeJob(dsd, copy_policy=policy)) |
1726 | - self.assertEqual( |
1727 | - policy, naked_job.getPolicyImplementation().enum_value) |
1728 | - |
1729 | - |
1730 | -class TestPlainPackageCopyJobPrivileges(TestCaseWithFactory, LocalTestHelper): |
1731 | - """Test that `PlainPackageCopyJob` has the privileges it needs. |
1732 | - |
1733 | - This test looks for errors, not failures. It's here only to see that |
1734 | - these operations don't run into any privilege limitations. |
1735 | - """ |
1736 | - |
1737 | - layer = LaunchpadZopelessLayer |
1738 | - |
1739 | - def test_findMatchingDSDs(self): |
1740 | - job = self.makeJob() |
1741 | - transaction.commit() |
1742 | - self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser) |
1743 | - removeSecurityProxy(job).findMatchingDSDs() |
1744 | - |
1745 | - def test_reportFailure(self): |
1746 | - job = self.makeJob() |
1747 | - transaction.commit() |
1748 | - self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser) |
1749 | - removeSecurityProxy(job).reportFailure(CannotCopy("Mommy it hurts")) |
1750 | +<<<<<<< TREE |
1751 | + |
1752 | + def test___repr__(self): |
1753 | + distroseries = self.factory.makeDistroSeries() |
1754 | + archive1 = self.factory.makeArchive(distroseries.distribution) |
1755 | + archive2 = self.factory.makeArchive(distroseries.distribution) |
1756 | + source = getUtility(IPlainPackageCopyJobSource) |
1757 | + job = source.create( |
1758 | + package_name="foo", source_archive=archive1, |
1759 | + target_archive=archive2, target_distroseries=distroseries, |
1760 | + target_pocket=PackagePublishingPocket.RELEASE, |
1761 | + package_version="1.0-1", include_binaries=True) |
1762 | + self.assertEqual( |
1763 | + ("<PlainPackageCopyJob to copy package foo from " |
1764 | + "{distroseries.distribution.name}/{archive1.name} to " |
1765 | + "{distroseries.distribution.name}/{archive2.name}, " |
1766 | + "RELEASE pocket, in {distroseries.distribution.name} " |
1767 | + "{distroseries.name}, including binaries>").format( |
1768 | + distroseries=distroseries, archive1=archive1, |
1769 | + archive2=archive2), |
1770 | + repr(job)) |
1771 | + |
1772 | + def test_getPendingJobsPerPackage_finds_jobs(self): |
1773 | + # getPendingJobsPerPackage finds jobs, and the packages they |
1774 | + # belong to. |
1775 | + dsd = self.factory.makeDistroSeriesDifference() |
1776 | + job = self.makeJob(dsd) |
1777 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1778 | + self.assertEqual( |
1779 | + {dsd.source_package_name.name: job}, |
1780 | + job_source.getPendingJobsPerPackage(dsd.derived_series)) |
1781 | + |
1782 | + def test_getPendingJobsPerPackage_ignores_other_distroseries(self): |
1783 | + # getPendingJobsPerPackage only looks for jobs on the indicated |
1784 | + # distroseries. |
1785 | + self.makeJob() |
1786 | + other_series = self.factory.makeDistroSeries() |
1787 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1788 | + self.assertEqual( |
1789 | + {}, job_source.getPendingJobsPerPackage(other_series)) |
1790 | + |
1791 | + def test_getPendingJobsPerPackage_only_returns_pending_jobs(self): |
1792 | + # getPendingJobsPerPackage ignores jobs that have already been |
1793 | + # run. |
1794 | + dsd = self.factory.makeDistroSeriesDifference() |
1795 | + job = self.makeJob(dsd) |
1796 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1797 | + found_by_state = {} |
1798 | + for status in JobStatus.items: |
1799 | + removeSecurityProxy(job).job._status = status |
1800 | + result = job_source.getPendingJobsPerPackage(dsd.derived_series) |
1801 | + if len(result) > 0: |
1802 | + found_by_state[status] = result[dsd.source_package_name.name] |
1803 | + expected = { |
1804 | + JobStatus.WAITING: job, |
1805 | + JobStatus.RUNNING: job, |
1806 | + JobStatus.SUSPENDED: job, |
1807 | + } |
1808 | + self.assertEqual(expected, found_by_state) |
1809 | + |
1810 | + def test_getPendingJobsPerPackage_distinguishes_jobs(self): |
1811 | + # getPendingJobsPerPackage associates the right job with the |
1812 | + # right package. |
1813 | + derived_series = self.factory.makeDistroSeries() |
1814 | + dsds = [ |
1815 | + self.factory.makeDistroSeriesDifference( |
1816 | + derived_series=derived_series) |
1817 | + for counter in xrange(2)] |
1818 | + jobs = map(self.makeJob, dsds) |
1819 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1820 | + self.assertEqual( |
1821 | + dict(zip([dsd.source_package_name.name for dsd in dsds], jobs)), |
1822 | + job_source.getPendingJobsPerPackage(derived_series)) |
1823 | + |
1824 | + def test_getPendingJobsPerPackage_picks_oldest_job_for_dsd(self): |
1825 | + # If there are multiple jobs for one package, |
1826 | + # getPendingJobsPerPackage picks the oldest. |
1827 | + dsd = self.factory.makeDistroSeriesDifference() |
1828 | + jobs = [self.makeJob(dsd) for counter in xrange(2)] |
1829 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1830 | + self.assertEqual( |
1831 | + {dsd.source_package_name.name: jobs[0]}, |
1832 | + job_source.getPendingJobsPerPackage(dsd.derived_series)) |
1833 | + |
1834 | + def test_getPendingJobsPerPackage_ignores_dsds_without_jobs(self): |
1835 | + # getPendingJobsPerPackage produces no dict entry for packages |
1836 | + # that have no pending jobs, even if they do have DSDs. |
1837 | + dsd = self.factory.makeDistroSeriesDifference() |
1838 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1839 | + self.assertEqual( |
1840 | + {}, job_source.getPendingJobsPerPackage(dsd.derived_series)) |
1841 | + |
1842 | + def test_findMatchingDSDs_matches_all_DSDs_for_job(self): |
1843 | + # findMatchingDSDs finds matching DSDs for any of the packages |
1844 | + # in the job. |
1845 | + dsd = self.factory.makeDistroSeriesDifference() |
1846 | + naked_job = removeSecurityProxy(self.makeJob(dsd)) |
1847 | + self.assertContentEqual([dsd], naked_job.findMatchingDSDs()) |
1848 | + |
1849 | + def test_findMatchingDSDs_ignores_other_source_series(self): |
1850 | + # findMatchingDSDs tries to ignore DSDs that are for different |
1851 | + # parent series than the job's source series. (This can't be |
1852 | + # done with perfect precision because the job doesn't keep track |
1853 | + # of source distroseries, but in practice it should be good |
1854 | + # enough). |
1855 | + dsd = self.factory.makeDistroSeriesDifference() |
1856 | + naked_job = removeSecurityProxy(self.makeJob(dsd)) |
1857 | + |
1858 | + # If the dsd differs only in parent series, that's enough to |
1859 | + # make it a non-match. |
1860 | + removeSecurityProxy(dsd).parent_series = ( |
1861 | + self.factory.makeDistroSeries()) |
1862 | + |
1863 | + self.assertContentEqual([], naked_job.findMatchingDSDs()) |
1864 | + |
1865 | + def test_findMatchingDSDs_ignores_other_packages(self): |
1866 | + # findMatchingDSDs does not return DSDs that are similar to the |
1867 | + # information in the job, but are for different packages. |
1868 | + dsd = self.factory.makeDistroSeriesDifference() |
1869 | + self.factory.makeDistroSeriesDifference( |
1870 | + derived_series=dsd.derived_series, |
1871 | + parent_series=dsd.parent_series) |
1872 | + naked_job = removeSecurityProxy(self.makeJob(dsd)) |
1873 | + self.assertContentEqual([dsd], naked_job.findMatchingDSDs()) |
1874 | + |
1875 | + def test_addSourceOverride(self): |
1876 | + # Test the addOverride method which adds an ISourceOverride to the |
1877 | + # metadata. |
1878 | + name = self.factory.makeSourcePackageName() |
1879 | + component = self.factory.makeComponent() |
1880 | + section=self.factory.makeSection() |
1881 | + pcj = self.factory.makePlainPackageCopyJob() |
1882 | + self.layer.txn.commit() |
1883 | + self.layer.switchDbUser('sync_packages') |
1884 | + |
1885 | + override = SourceOverride( |
1886 | + source_package_name=name, |
1887 | + component=component, |
1888 | + section=section) |
1889 | + pcj.addSourceOverride(override) |
1890 | + |
1891 | + metadata_component = getUtility( |
1892 | + IComponentSet)[pcj.metadata["component_override"]] |
1893 | + metadata_section = getUtility( |
1894 | + ISectionSet)[pcj.metadata["section_override"]] |
1895 | + matcher = MatchesStructure( |
1896 | + component=Equals(metadata_component), |
1897 | + section=Equals(metadata_section)) |
1898 | + self.assertThat(override, matcher) |
1899 | + |
1900 | + def test_getSourceOverride(self): |
1901 | + # Test the getSourceOverride which gets an ISourceOverride from |
1902 | + # the metadata. |
1903 | + name = self.factory.makeSourcePackageName() |
1904 | + component = self.factory.makeComponent() |
1905 | + section=self.factory.makeSection() |
1906 | + pcj = self.factory.makePlainPackageCopyJob( |
1907 | + package_name=name.name, package_version="1.0") |
1908 | + self.layer.txn.commit() |
1909 | + self.layer.switchDbUser('sync_packages') |
1910 | + |
1911 | + override = SourceOverride( |
1912 | + source_package_name=name, |
1913 | + component=component, |
1914 | + section=section) |
1915 | + pcj.addSourceOverride(override) |
1916 | + |
1917 | + self.assertEqual(override, pcj.getSourceOverride()) |
1918 | + |
1919 | + def test_getPolicyImplementation_returns_policy(self): |
1920 | + # getPolicyImplementation returns the ICopyPolicy that was |
1921 | + # chosen for the job. |
1922 | + dsd = self.factory.makeDistroSeriesDifference() |
1923 | + for policy in PackageCopyPolicy.items: |
1924 | + naked_job = removeSecurityProxy( |
1925 | + self.makeJob(dsd, copy_policy=policy)) |
1926 | + self.assertEqual( |
1927 | + policy, naked_job.getPolicyImplementation().enum_value) |
1928 | + |
1929 | + |
1930 | +class TestPlainPackageCopyJobPrivileges(TestCaseWithFactory, LocalTestHelper): |
1931 | + """Test that `PlainPackageCopyJob` has the privileges it needs. |
1932 | + |
1933 | + This test looks for errors, not failures. It's here only to see that |
1934 | + these operations don't run into any privilege limitations. |
1935 | + """ |
1936 | + |
1937 | + layer = LaunchpadZopelessLayer |
1938 | + |
1939 | + def test_findMatchingDSDs(self): |
1940 | + job = self.makeJob() |
1941 | + transaction.commit() |
1942 | + self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser) |
1943 | + removeSecurityProxy(job).findMatchingDSDs() |
1944 | + |
1945 | + def test_reportFailure(self): |
1946 | + job = self.makeJob() |
1947 | + transaction.commit() |
1948 | + self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser) |
1949 | + removeSecurityProxy(job).reportFailure(CannotCopy("Mommy it hurts")) |
1950 | +======= |
1951 | + |
1952 | + def test___repr__(self): |
1953 | + distroseries = self.factory.makeDistroSeries() |
1954 | + archive1 = self.factory.makeArchive(distroseries.distribution) |
1955 | + archive2 = self.factory.makeArchive(distroseries.distribution) |
1956 | + source = getUtility(IPlainPackageCopyJobSource) |
1957 | + job = source.create( |
1958 | + package_name="foo", source_archive=archive1, |
1959 | + target_archive=archive2, target_distroseries=distroseries, |
1960 | + target_pocket=PackagePublishingPocket.RELEASE, |
1961 | + package_version="1.0-1", include_binaries=True) |
1962 | + self.assertEqual( |
1963 | + ("<PlainPackageCopyJob to copy package foo from " |
1964 | + "{distroseries.distribution.name}/{archive1.name} to " |
1965 | + "{distroseries.distribution.name}/{archive2.name}, " |
1966 | + "RELEASE pocket, in {distroseries.distribution.name} " |
1967 | + "{distroseries.name}, including binaries>").format( |
1968 | + distroseries=distroseries, archive1=archive1, |
1969 | + archive2=archive2), |
1970 | + repr(job)) |
1971 | + |
1972 | + def test_getPendingJobsPerPackage_finds_jobs(self): |
1973 | + # getPendingJobsPerPackage finds jobs, and the packages they |
1974 | + # belong to. |
1975 | + dsd = self.factory.makeDistroSeriesDifference() |
1976 | + job = self.makeJob(dsd) |
1977 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1978 | + self.assertEqual( |
1979 | + {dsd.source_package_name.name: job}, |
1980 | + job_source.getPendingJobsPerPackage(dsd.derived_series)) |
1981 | + |
1982 | + def test_getPendingJobsPerPackage_ignores_other_distroseries(self): |
1983 | + # getPendingJobsPerPackage only looks for jobs on the indicated |
1984 | + # distroseries. |
1985 | + self.makeJob() |
1986 | + other_series = self.factory.makeDistroSeries() |
1987 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1988 | + self.assertEqual( |
1989 | + {}, job_source.getPendingJobsPerPackage(other_series)) |
1990 | + |
1991 | + def test_getPendingJobsPerPackage_only_returns_pending_jobs(self): |
1992 | + # getPendingJobsPerPackage ignores jobs that have already been |
1993 | + # run. |
1994 | + dsd = self.factory.makeDistroSeriesDifference() |
1995 | + job = self.makeJob(dsd) |
1996 | + job_source = getUtility(IPlainPackageCopyJobSource) |
1997 | + found_by_state = {} |
1998 | + for status in JobStatus.items: |
1999 | + removeSecurityProxy(job).job._status = status |
2000 | + result = job_source.getPendingJobsPerPackage(dsd.derived_series) |
2001 | + if len(result) > 0: |
2002 | + found_by_state[status] = result[dsd.source_package_name.name] |
2003 | + expected = { |
2004 | + JobStatus.WAITING: job, |
2005 | + JobStatus.RUNNING: job, |
2006 | + JobStatus.SUSPENDED: job, |
2007 | + } |
2008 | + self.assertEqual(expected, found_by_state) |
2009 | + |
2010 | + def test_getPendingJobsPerPackage_distinguishes_jobs(self): |
2011 | + # getPendingJobsPerPackage associates the right job with the |
2012 | + # right package. |
2013 | + derived_series = self.factory.makeDistroSeries() |
2014 | + dsds = [ |
2015 | + self.factory.makeDistroSeriesDifference( |
2016 | + derived_series=derived_series) |
2017 | + for counter in xrange(2)] |
2018 | + jobs = map(self.makeJob, dsds) |
2019 | + job_source = getUtility(IPlainPackageCopyJobSource) |
2020 | + self.assertEqual( |
2021 | + dict(zip([dsd.source_package_name.name for dsd in dsds], jobs)), |
2022 | + job_source.getPendingJobsPerPackage(derived_series)) |
2023 | + |
2024 | + def test_getPendingJobsPerPackage_picks_oldest_job_for_dsd(self): |
2025 | + # If there are multiple jobs for one package, |
2026 | + # getPendingJobsPerPackage picks the oldest. |
2027 | + dsd = self.factory.makeDistroSeriesDifference() |
2028 | + jobs = [self.makeJob(dsd) for counter in xrange(2)] |
2029 | + job_source = getUtility(IPlainPackageCopyJobSource) |
2030 | + self.assertEqual( |
2031 | + {dsd.source_package_name.name: jobs[0]}, |
2032 | + job_source.getPendingJobsPerPackage(dsd.derived_series)) |
2033 | + |
2034 | + def test_getPendingJobsPerPackage_ignores_dsds_without_jobs(self): |
2035 | + # getPendingJobsPerPackage produces no dict entry for packages |
2036 | + # that have no pending jobs, even if they do have DSDs. |
2037 | + dsd = self.factory.makeDistroSeriesDifference() |
2038 | + job_source = getUtility(IPlainPackageCopyJobSource) |
2039 | + self.assertEqual( |
2040 | + {}, job_source.getPendingJobsPerPackage(dsd.derived_series)) |
2041 | + |
2042 | + def test_findMatchingDSDs_matches_all_DSDs_for_job(self): |
2043 | + # findMatchingDSDs finds matching DSDs for any of the packages |
2044 | + # in the job. |
2045 | + dsd = self.factory.makeDistroSeriesDifference() |
2046 | + naked_job = removeSecurityProxy(self.makeJob(dsd)) |
2047 | + self.assertContentEqual([dsd], naked_job.findMatchingDSDs()) |
2048 | + |
2049 | + def test_findMatchingDSDs_ignores_other_source_series(self): |
2050 | + # findMatchingDSDs tries to ignore DSDs that are for different |
2051 | + # parent series than the job's source series. (This can't be |
2052 | + # done with perfect precision because the job doesn't keep track |
2053 | + # of source distroseries, but in practice it should be good |
2054 | + # enough). |
2055 | + dsd = self.factory.makeDistroSeriesDifference() |
2056 | + naked_job = removeSecurityProxy(self.makeJob(dsd)) |
2057 | + |
2058 | + # If the dsd differs only in parent series, that's enough to |
2059 | + # make it a non-match. |
2060 | + removeSecurityProxy(dsd).parent_series = ( |
2061 | + self.factory.makeDistroSeries()) |
2062 | + |
2063 | + self.assertContentEqual([], naked_job.findMatchingDSDs()) |
2064 | + |
2065 | + def test_findMatchingDSDs_ignores_other_packages(self): |
2066 | + # findMatchingDSDs does not return DSDs that are similar to the |
2067 | + # information in the job, but are for different packages. |
2068 | + dsd = self.factory.makeDistroSeriesDifference() |
2069 | + self.factory.makeDistroSeriesDifference( |
2070 | + derived_series=dsd.derived_series, |
2071 | + parent_series=dsd.parent_series) |
2072 | + naked_job = removeSecurityProxy(self.makeJob(dsd)) |
2073 | + self.assertContentEqual([dsd], naked_job.findMatchingDSDs()) |
2074 | + |
2075 | + def test_addSourceOverride(self): |
2076 | + # Test the addOverride method which adds an ISourceOverride to the |
2077 | + # metadata. |
2078 | + name = self.factory.makeSourcePackageName() |
2079 | + component = self.factory.makeComponent() |
2080 | + section = self.factory.makeSection() |
2081 | + pcj = self.factory.makePlainPackageCopyJob() |
2082 | + self.layer.txn.commit() |
2083 | + self.layer.switchDbUser('sync_packages') |
2084 | + |
2085 | + override = SourceOverride( |
2086 | + source_package_name=name, |
2087 | + component=component, |
2088 | + section=section) |
2089 | + pcj.addSourceOverride(override) |
2090 | + |
2091 | + metadata_component = getUtility( |
2092 | + IComponentSet)[pcj.metadata["component_override"]] |
2093 | + metadata_section = getUtility( |
2094 | + ISectionSet)[pcj.metadata["section_override"]] |
2095 | + matcher = MatchesStructure( |
2096 | + component=Equals(metadata_component), |
2097 | + section=Equals(metadata_section)) |
2098 | + self.assertThat(override, matcher) |
2099 | + |
2100 | + def test_getSourceOverride(self): |
2101 | + # Test the getSourceOverride which gets an ISourceOverride from |
2102 | + # the metadata. |
2103 | + name = self.factory.makeSourcePackageName() |
2104 | + component = self.factory.makeComponent() |
2105 | + section = self.factory.makeSection() |
2106 | + pcj = self.factory.makePlainPackageCopyJob( |
2107 | + package_name=name.name, package_version="1.0") |
2108 | + self.layer.txn.commit() |
2109 | + self.layer.switchDbUser('sync_packages') |
2110 | + |
2111 | + override = SourceOverride( |
2112 | + source_package_name=name, |
2113 | + component=component, |
2114 | + section=section) |
2115 | + pcj.addSourceOverride(override) |
2116 | + |
2117 | + self.assertEqual(override, pcj.getSourceOverride()) |
2118 | + |
2119 | + def test_getPolicyImplementation_returns_policy(self): |
2120 | + # getPolicyImplementation returns the ICopyPolicy that was |
2121 | + # chosen for the job. |
2122 | + dsd = self.factory.makeDistroSeriesDifference() |
2123 | + for policy in PackageCopyPolicy.items: |
2124 | + naked_job = removeSecurityProxy( |
2125 | + self.makeJob(dsd, copy_policy=policy)) |
2126 | + self.assertEqual( |
2127 | + policy, naked_job.getPolicyImplementation().enum_value) |
2128 | + |
2129 | + |
2130 | +class TestPlainPackageCopyJobPrivileges(TestCaseWithFactory, LocalTestHelper): |
2131 | + """Test that `PlainPackageCopyJob` has the privileges it needs. |
2132 | + |
2133 | + This test looks for errors, not failures. It's here only to see that |
2134 | + these operations don't run into any privilege limitations. |
2135 | + """ |
2136 | + |
2137 | + layer = LaunchpadZopelessLayer |
2138 | + |
2139 | + def test_findMatchingDSDs(self): |
2140 | + job = self.makeJob() |
2141 | + transaction.commit() |
2142 | + self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser) |
2143 | + removeSecurityProxy(job).findMatchingDSDs() |
2144 | + |
2145 | + def test_reportFailure(self): |
2146 | + job = self.makeJob() |
2147 | + transaction.commit() |
2148 | + self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser) |
2149 | + removeSecurityProxy(job).reportFailure(CannotCopy("Mommy it hurts")) |
2150 | +>>>>>>> MERGE-SOURCE |
I am sorry but I cannot find anything wrong with this proposal. :-P
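For anyone skimming the diff: here is a minimal sketch (not part of the branch) of how the new bulk-creation interface is driven, following the call pattern in test_createMultiple_creates_one_job_per_copy above. The series and archive objects are placeholders, and the import paths are approximations of what the branch uses elsewhere.

    from zope.component import getUtility

    from lp.registry.interfaces.pocket import PackagePublishingPocket
    from lp.soyuz.interfaces.packagecopyjob import IPlainPackageCopyJobSource

    # Each copy task is a tuple:
    # (package name, version, source archive, target archive, pocket).
    copy_tasks = [
        ("foo", "1.0-1", parent_archive, derived_archive,
         PackagePublishingPocket.RELEASE),
        ("bar", "2.3-4", parent_archive, derived_archive,
         PackagePublishingPocket.UPDATES),
    ]

    job_source = getUtility(IPlainPackageCopyJobSource)
    # Creates one Job row and one PackageCopyJob row per task using two
    # bulk INSERTs (constant query count), and returns the new
    # PackageCopyJob ids.
    job_ids = job_source.createMultiple(derived_series, copy_tasks)

The jobs can then be picked up as usual, e.g. via job_source.getActiveJobs(derived_archive), as the new test does.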