Merge lp:~stevenk/launchpad/bpb-currentcomponent-assertion-part-2 into lp:launchpad
Proposed by: Steve Kowalik
Status: Merged
Approved by: Steve Kowalik
Approved revision: no longer in the source branch
Merged at revision: 12253
Proposed branch: lp:~stevenk/launchpad/bpb-currentcomponent-assertion-part-2
Merge into: lp:launchpad
Prerequisite: lp:~stevenk/launchpad/bpb-currentcomponent-assertion
Diff against target: 1160 lines (+509/-589), 9 files modified
  lib/lp/soyuz/browser/tests/test_build_views.py (+11/-0)
  lib/lp/soyuz/doc/binarypackagebuild.txt (+0/-397)
  lib/lp/soyuz/doc/build-estimated-dispatch-time.txt (+0/-178)
  lib/lp/soyuz/stories/soyuz/xx-build-redirect.txt (+0/-8)
  lib/lp/soyuz/tests/test_build.py (+255/-0)
  lib/lp/soyuz/tests/test_build_set.py (+154/-0)
  lib/lp/soyuz/tests/test_build_start_estimation.py (+88/-0)
  lib/lp/soyuz/tests/test_doc.py (+0/-5)
  lib/lp/testing/factory.py (+1/-1)
To merge this branch: bzr merge lp:~stevenk/launchpad/bpb-currentcomponent-assertion-part-2
Related bugs:
Reviewer | Review Type | Date Requested | Status
---|---|---|---
Henning Eggers (community) | code | | Approve
Review via email: mp+45693@code.launchpad.net
Commit message
Description of the change
This branch does more of the preparation work that was started in https:/
Also, allow me to apologise in advance for how large this branch's diff is.
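For reviewers unfamiliar with the migration pattern: this branch replaces narrative doctests (which assert against printed output) with unit tests that build their own fixtures and assert directly. A minimal sketch of the shape of that conversion, using a toy stand-in `Build` class rather than Launchpad's real model code (the class and its constructor arguments here are purely illustrative):

```python
import unittest


class Build:
    """Toy stand-in for a build record; NOT Launchpad's real class."""

    def __init__(self, arch_tag, name, version, series):
        self.arch_tag = arch_tag
        self.name = name
        self.version = version
        self.series = series

    @property
    def title(self):
        # Mirrors the title format the branch's test_title exercises.
        return "%s build of %s %s in %s RELEASE" % (
            self.arch_tag, self.name, self.version, self.series)


class TestBuildTitle(unittest.TestCase):
    # Doctest style was:  >>> print firefox_build.title
    # Unit-test style is: build the fixture, then assert on the attribute.
    def test_title(self):
        build = Build(
            "i386", "mozilla-firefox", "0.9", "ubuntutest breezy-autotest")
        self.assertEqual(
            "i386 build of mozilla-firefox 0.9 in "
            "ubuntutest breezy-autotest RELEASE",
            build.title)
```

The gain is the same as in the real diff below: each test constructs exactly the state it needs instead of inheriting it from a long doctest narrative, so tests can be read and run in isolation.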
Revision history for this message
Henning Eggers (henninge) wrote:
Thank you for applying most of my suggestions. I understand that breaking up the tests is probably too much work atm and the current partitioning works, too. ;-)
review:
Approve
(code)
Preview Diff
1 | === modified file 'lib/lp/soyuz/browser/tests/test_build_views.py' |
2 | --- lib/lp/soyuz/browser/tests/test_build_views.py 2011-01-16 22:48:57 +0000 |
3 | +++ lib/lp/soyuz/browser/tests/test_build_views.py 2011-01-16 22:48:58 +0000 |
4 | @@ -281,3 +281,14 @@ |
5 | job.suspend() |
6 | self.assertEquals(job.status, JobStatus.SUSPENDED) |
7 | self.assertFalse(view.dispatch_time_estimate_available) |
8 | + |
9 | + def test_old_url_redirection(self): |
10 | + # When users go to the old build URLs, they are redirected to the |
11 | + # equivalent new URLs. |
12 | + build = self.factory.makeBinaryPackageBuild() |
13 | + build.queueBuild() |
14 | + url = "http://launchpad.dev/+builds/+build/%s" % build.id |
15 | + expected_url = canonical_url(build) |
16 | + browser = self.getUserBrowser(url) |
17 | + self.assertEquals(expected_url, browser.url) |
18 | + |
19 | |
20 | === modified file 'lib/lp/soyuz/doc/binarypackagebuild.txt' |
21 | --- lib/lp/soyuz/doc/binarypackagebuild.txt 2011-01-12 23:07:40 +0000 |
22 | +++ lib/lp/soyuz/doc/binarypackagebuild.txt 2011-01-16 22:48:58 +0000 |
23 | @@ -1,400 +1,3 @@ |
24 | -= The Build table = |
25 | - |
26 | -The build table contains the information pertaining to a given build |
27 | -of a sourcepackagerelease on a distroarchseries. |
28 | - |
29 | -The build record may have many BinaryPackageRelease records pointing |
30 | -at it and it may reference a build log if the build was done on a |
31 | -launchpad build daemon. |
32 | - |
33 | - # Create a 'mozilla-firefox' build in ubuntutest/breezy-autotest/i386. |
34 | - >>> from zope.security.proxy import removeSecurityProxy |
35 | - >>> from lp.soyuz.tests.test_publishing import SoyuzTestPublisher |
36 | - >>> login('foo.bar@canonical.com') |
37 | - >>> test_publisher = SoyuzTestPublisher() |
38 | - >>> test_publisher.prepareBreezyAutotest() |
39 | - >>> source = test_publisher.getPubSource( |
40 | - ... sourcename='mozilla-firefox', version='0.9') |
41 | - >>> binaries = test_publisher.getPubBinaries( |
42 | - ... binaryname='firefox', pub_source=source) |
43 | - >>> [firefox_build] = source.getBuilds() |
44 | - >>> job = firefox_build.buildqueue_record.job |
45 | - >>> removeSecurityProxy(job).start() |
46 | - >>> removeSecurityProxy(job).complete() |
47 | - >>> login(ANONYMOUS) |
48 | - |
49 | -A build has a title which describes the context source version and in |
50 | -which series and architecture it is targeted for. |
51 | - |
52 | - >>> print firefox_build.title |
53 | - i386 build of mozilla-firefox 0.9 in ubuntutest breezy-autotest RELEASE |
54 | - |
55 | -A build directly links to the archive, distribution, distroseries, |
56 | -distroarchseries, pocket in its context and also the source version |
57 | -that generated it. |
58 | - |
59 | - >>> print firefox_build.archive.displayname |
60 | - Primary Archive for Ubuntu Test |
61 | - |
62 | - >>> print firefox_build.distribution.displayname |
63 | - ubuntutest |
64 | - |
65 | - >>> print firefox_build.distro_series.displayname |
66 | - Breezy Badger Autotest |
67 | - |
68 | - >>> print firefox_build.distro_arch_series.displayname |
69 | - ubuntutest Breezy Badger Autotest i386 |
70 | - |
71 | - >>> firefox_build.pocket |
72 | - <DBItem PackagePublishingPocket.RELEASE, (0) Release> |
73 | - |
74 | - >>> print firefox_build.arch_tag |
75 | - i386 |
76 | - |
77 | - >>> firefox_build.is_virtualized |
78 | - False |
79 | - |
80 | - >>> print firefox_build.source_package_release.title |
81 | - mozilla-firefox - 0.9 |
82 | - |
83 | -A build has a state that represents in which stage it is in a |
84 | -life-cycle that goes from PENDING to BUILDING until FULLYBUILT or one |
85 | -of the intermediate failed states (FAILEDTOBUILD, MANUALDEPWAIT, |
86 | -CHROOTWAIT, SUPERSEDED and FAILEDTOUPLOAD). |
87 | - |
88 | - >>> firefox_build.status |
89 | - <DBItem BuildStatus.FULLYBUILT, (1) Successfully built> |
90 | - |
91 | -Builds which were already processed also offer additional information |
92 | -about their processing, such as when they were started and finished, |
93 | -and their 'log' and 'upload_changesfile' as librarian files. |
94 | - |
95 | - >>> firefox_build.was_built |
96 | - True |
97 | - |
98 | - >>> firefox_build.date_started |
99 | - datetime.datetime(...) |
100 | - |
101 | - >>> firefox_build.date_finished |
102 | - datetime.datetime(...) |
103 | - |
104 | - >>> firefox_build.duration |
105 | - datetime.timedelta(...) |
106 | - |
107 | - >>> print firefox_build.log.filename |
108 | - buildlog_ubuntutest-breezy-autotest-i386.mozilla-firefox_0.9_FULLYBUILT.txt.gz |
109 | - |
110 | - >>> print firefox_build.log_url |
111 | - http://launchpad.dev/ubuntutest/+source/mozilla-firefox/0.9/+buildjob/.../+files/buildlog_ubuntutest-breezy-autotest-i386.mozilla-firefox_0.9_FULLYBUILT.txt.gz |
112 | - |
113 | - >>> print firefox_build.upload_changesfile.filename |
114 | - firefox_0.9_i386.changes |
115 | - |
116 | - >>> print firefox_build.changesfile_url |
117 | - http://launchpad.dev/ubuntutest/+source/mozilla-firefox/0.9/+buildjob/.../+files/firefox_0.9_i386.changes |
118 | - |
119 | -The 'firefox_build' is already finished and requesting the estimated build |
120 | -start time makes no sense. Hence an exception is raised. |
121 | - |
122 | - >>> firefox_build.buildqueue_record.getEstimatedJobStartTime() |
123 | - Traceback (most recent call last): |
124 | - ... |
125 | - AssertionError: The start time is only estimated for pending jobs. |
126 | - |
127 | -On a build job in state `NEEDSBUILD` we can ask for its estimated |
128 | -build start time. |
129 | - |
130 | - # Create a brand new pending build. |
131 | - >>> login('foo.bar@canonical.com') |
132 | - >>> source = test_publisher.getPubSource(sourcename='pending-source') |
133 | - >>> [pending_build] = source.createMissingBuilds() |
134 | - >>> login(ANONYMOUS) |
135 | - |
136 | - >>> pending_build.buildqueue_record.getEstimatedJobStartTime() |
137 | - datetime.datetime(...) |
138 | - |
139 | -The currently published component is provided via the 'current_component' |
140 | -property. It looks over the publishing records and finds the current |
141 | -publication of the source in question. |
142 | - |
143 | - >>> print firefox_build.current_component.name |
144 | - main |
145 | - |
146 | -It is not necessarily the same as: |
147 | - |
148 | - >>> print firefox_build.source_package_release.component.name |
149 | - main |
150 | - |
151 | -which is the component the source was originally uploaded to, before |
152 | -any overriding action. |
153 | - |
154 | -The build can report any corresponding uploads using the package_upload |
155 | -property: |
156 | - |
157 | - >>> firefox_build.package_upload |
158 | - <PackageUpload ...> |
159 | - |
160 | - >>> firefox_build.package_upload.status |
161 | - <DBItem PackageUploadStatus.DONE, (3) Done> |
162 | - |
163 | -If the build does not have any uploads, None is returned: |
164 | - |
165 | - >>> from lp.buildmaster.enums import BuildStatus |
166 | - >>> from lp.soyuz.interfaces.binarypackagebuild import ( |
167 | - ... IBinaryPackageBuildSet) |
168 | - >>> at_build = getUtility(IBinaryPackageBuildSet).getByBuildID(15) |
169 | - >>> print at_build.package_upload |
170 | - None |
171 | - |
172 | -Test "retry" functionality: |
173 | - |
174 | - >>> firefox_build.can_be_retried |
175 | - False |
176 | - |
177 | - >>> frozen_build = getUtility(IBinaryPackageBuildSet).getByBuildID(9) |
178 | - >>> frozen_build.title |
179 | - u'i386 build of pmount 0.1-1 in ubuntu warty RELEASE' |
180 | - >>> frozen_build.status.title |
181 | - 'Failed to build' |
182 | - >>> frozen_build.can_be_retried |
183 | - False |
184 | - |
185 | -See section 'AssertionErrors in IBinaryPackageBuild' for further |
186 | -documentation about consequences of a denied 'retry' action. |
187 | - |
188 | -Let's retrieve a build record that can be retried. |
189 | - |
190 | - >>> active_build = getUtility(IBinaryPackageBuildSet).getByBuildID(9) |
191 | - |
192 | - >>> print active_build.title |
193 | - i386 build of pmount 0.1-1 in ubuntu warty RELEASE |
194 | - |
195 | - >>> print active_build.status.name |
196 | - FAILEDTOBUILD |
197 | - |
198 | - >>> print active_build.builder.name |
199 | - bob |
200 | - |
201 | - >>> print active_build.log.filename |
202 | - netapplet-1.0.0.tar.gz |
203 | - |
204 | -At this point, it's also convenient to test if any content can be |
205 | -stored as 'upload_log' using storeUploadLog(). |
206 | - |
207 | -This method will upload a file to librarian with the given content and |
208 | -update the context `upload_log` reference. |
209 | - |
210 | -We store such information persistently to allow users to revisit it, |
211 | -and potentially fix any issue, after the build has been processed. |
212 | - |
213 | -We continue to send the upload information with the |
214 | -build-failure-notification for FAILEDTOUPLOAD builds, see |
215 | -build-failedtoupload-workflow.txt for further information. |
216 | - |
217 | - >>> print active_build.upload_log |
218 | - None |
219 | - |
220 | - >>> print active_build.upload_log_url |
221 | - None |
222 | - |
223 | - >>> active_build.storeUploadLog('sample upload log.') |
224 | - >>> print active_build.upload_log.filename |
225 | - upload_9_log.txt |
226 | - |
227 | - >>> print active_build.upload_log_url |
228 | - http://launchpad.dev/ubuntu/+source/pmount/0.1-1/+buildjob/9/+files/upload_9_log.txt |
229 | - |
230 | -Once the transaction is committed, the file is available in the |
231 | -librarian, and we can retrieve its contents. |
232 | - |
233 | - >>> transaction.commit() |
234 | - >>> active_build.upload_log.open() |
235 | - >>> print active_build.upload_log.read() |
236 | - sample upload log. |
237 | - |
238 | -The 'upload_log' library file privacy is set according to the build |
239 | -target archive. For an example, see lp.buildmaster.tests.test_packagebuild |
240 | - |
241 | -Since ubuntu/warty is already released the failed build can't be |
242 | -retried. |
243 | - |
244 | - >>> active_build.can_be_retried |
245 | - False |
246 | - |
247 | -We will reactivate ubuntu/warty allowing the pmount build to be |
248 | -retried. |
249 | - |
250 | - >>> from lp.registry.interfaces.distribution import IDistributionSet |
251 | - >>> from lp.registry.interfaces.series import SeriesStatus |
252 | - >>> login('foo.bar@canonical.com') |
253 | - >>> ubuntu = getUtility(IDistributionSet)['ubuntu'] |
254 | - >>> warty = ubuntu.getSeries('warty') |
255 | - >>> warty.status = SeriesStatus.DEVELOPMENT |
256 | - >>> flush_database_updates() |
257 | - >>> login(ANONYMOUS) |
258 | - |
259 | - >>> active_build.can_be_retried |
260 | - True |
261 | - |
262 | -Before we actually retry the build on hand let's set its start time. |
263 | -This will allow us to observe the fact that a build retry does not |
264 | -change the start time if it was set already. |
265 | - |
266 | - >>> from datetime import datetime, timedelta |
267 | - >>> import pytz |
268 | - >>> UTC = pytz.timezone('UTC') |
269 | - >>> time_now = datetime.now(UTC) |
270 | - >>> unsecured_build = removeSecurityProxy(active_build) |
271 | - >>> unsecured_build.date_first_dispatched = time_now |
272 | - >>> active_build.date_first_dispatched == time_now |
273 | - True |
274 | - |
275 | -Re-trying builds requires the user to be logged in as an admin (including |
276 | -buildd admin) to gain launchpad.Edit on the build record. As an anonymous |
277 | -user, retrying will fail: |
278 | - |
279 | - >>> active_build.retry() |
280 | - Traceback (most recent call last): |
281 | - ... |
282 | - Unauthorized:... |
283 | - |
284 | -Login as an admin and retry the Build record in question: |
285 | - |
286 | - >>> login('foo.bar@canonical.com') |
287 | - >>> active_build.retry() |
288 | - |
289 | -The build was retried but its start time remains the same. |
290 | - |
291 | - >>> active_build.date_first_dispatched == time_now |
292 | - True |
293 | - |
294 | -Build record has no history and is NEEDSBUILD and a corresponding |
295 | -BuildQueue record was created. |
296 | - |
297 | - >>> print active_build.builder |
298 | - None |
299 | - |
300 | - >>> print active_build.status.name |
301 | - NEEDSBUILD |
302 | - |
303 | - >>> print active_build.buildqueue_record |
304 | - <...BuildQueue...> |
305 | - |
306 | -'log' and 'upload_log' librarian references were removed when the |
307 | -build was retried. They will be garbage-collected later by |
308 | -'librariangc' and replaced by new ones when the build re-attempt |
309 | -finishes. |
310 | - |
311 | - >>> print active_build.log |
312 | - None |
313 | - |
314 | - >>> print active_build.upload_log |
315 | - None |
316 | - |
317 | -We will restore ubuntu/warty's previous status, SUPPORTED, so |
318 | -it won't interfere with the next tests. |
319 | - |
320 | - >>> login('foo.bar@canonical.com') |
321 | - >>> warty.status = SeriesStatus.SUPPORTED |
322 | - >>> flush_database_updates() |
323 | - >>> login(ANONYMOUS) |
324 | - |
325 | -Initialize all the required arguments to create a binary package for a |
326 | -given build record entry. |
327 | - |
328 | - >>> from lp.soyuz.interfaces.binarypackagename import IBinaryPackageNameSet |
329 | - >>> binarypackagename = getUtility(IBinaryPackageNameSet).ensure('demo').id |
330 | - >>> version = '0.0.1-demo' |
331 | - >>> summary = 'Summmmmmmmary' |
332 | - >>> description = 'Descripppppppption' |
333 | - >>> from lp.soyuz.enums import BinaryPackageFormat |
334 | - >>> binpackageformat = BinaryPackageFormat.DEB |
335 | - >>> component = firefox_build.source_package_release.component.id |
336 | - >>> section = firefox_build.source_package_release.section.id |
337 | - >>> from lp.soyuz.enums import PackagePublishingPriority |
338 | - >>> priority = PackagePublishingPriority.STANDARD |
339 | - >>> installedsize = 0 |
340 | - >>> architecturespecific = False |
341 | - |
342 | -Invoke createBinaryPackageRelease with all required arguments. |
343 | - |
344 | - # Load a build from the sampledata for creating binaries. |
345 | - >>> pmount_build = getUtility(IBinaryPackageBuildSet).getByBuildID(19) |
346 | - |
347 | - >>> bin = pmount_build.createBinaryPackageRelease( |
348 | - ... binarypackagename=binarypackagename, version=version, |
349 | - ... summary=summary, description=description, |
350 | - ... binpackageformat=binpackageformat, component=component, |
351 | - ... section=section, priority=priority, installedsize=installedsize, |
352 | - ... architecturespecific=architecturespecific) |
353 | - |
354 | - >>> from canonical.launchpad.webapp.testing import verifyObject |
355 | - >>> from lp.soyuz.interfaces.binarypackagerelease import IBinaryPackageRelease |
356 | - >>> verifyObject(IBinaryPackageRelease, bin) |
357 | - True |
358 | - |
359 | -Commit previous transaction, data we want to preserve: |
360 | - |
361 | -XXX: flush_database_updates() shouldn't be needed. This seems to be |
362 | -Bug 3989 -- StuartBishop 20060713 |
363 | - |
364 | - >>> flush_database_updates() |
365 | - >>> transaction.commit() |
366 | - |
367 | -Check binarypackages property: |
368 | - |
369 | - >>> for b in pmount_build.binarypackages: |
370 | - ... b.version |
371 | - u'0.0.1-demo' |
372 | - u'0.1-1' |
373 | - |
374 | -Emulate a huge list of binaries for 'pmount': |
375 | - |
376 | - >>> bpnameset = getUtility(IBinaryPackageNameSet) |
377 | - >>> for i in range(15): |
378 | - ... version = "%d" % i |
379 | - ... binarypackagename = bpnameset.ensure("test-%d" % i).id |
380 | - ... b = pmount_build.createBinaryPackageRelease( |
381 | - ... binarypackagename=binarypackagename, version=version, |
382 | - ... summary=summary, description=description, |
383 | - ... binpackageformat=binpackageformat, component=component, |
384 | - ... section=section, priority=priority, |
385 | - ... installedsize=installedsize, |
386 | - ... architecturespecific=architecturespecific) |
387 | - |
388 | - |
389 | -Check if the property is still working: |
390 | - |
391 | - >>> pmount_build.binarypackages.count() |
392 | - 17 |
393 | - |
394 | -Ensure the list is ordered by 'name' |
395 | - |
396 | - >>> for b in pmount_build.binarypackages: |
397 | - ... b.name, b.version |
398 | - (u'demo', u'0.0.1-demo') |
399 | - (u'pmount', u'0.1-1') |
400 | - (u'test-0', u'0') |
401 | - (u'test-1', u'1') |
402 | - (u'test-10', u'10') |
403 | - (u'test-11', u'11') |
404 | - (u'test-12', u'12') |
405 | - (u'test-13', u'13') |
406 | - (u'test-14', u'14') |
407 | - (u'test-2', u'2') |
408 | - (u'test-3', u'3') |
409 | - (u'test-4', u'4') |
410 | - (u'test-5', u'5') |
411 | - (u'test-6', u'6') |
412 | - (u'test-7', u'7') |
413 | - (u'test-8', u'8') |
414 | - (u'test-9', u'9') |
415 | - |
416 | -Roll back the transaction so as not to disturb the other tests: |
417 | - |
418 | - >>> transaction.abort() |
419 | - |
420 | - |
421 | == The BuildSet Class == |
422 | |
423 | The BuildSet class gives us some useful ways to consider the |
424 | |
425 | === removed file 'lib/lp/soyuz/doc/build-estimated-dispatch-time.txt' |
426 | --- lib/lp/soyuz/doc/build-estimated-dispatch-time.txt 2010-10-09 16:36:22 +0000 |
427 | +++ lib/lp/soyuz/doc/build-estimated-dispatch-time.txt 1970-01-01 00:00:00 +0000 |
428 | @@ -1,178 +0,0 @@ |
429 | -In order to exercise the estimation of build job start times a setup |
430 | -with one job building and another job pending/waiting is to be created. |
431 | - |
432 | -Activate the builders present in sampledata; we need to be logged in |
433 | -as a member of launchpad-buildd-admin: |
434 | - |
435 | - >>> from canonical.launchpad.ftests import login |
436 | - >>> login('celso.providelo@canonical.com') |
437 | - >>> from lp.buildmaster.interfaces.builder import IBuilderSet |
438 | - >>> builder_set = getUtility(IBuilderSet) |
439 | - |
440 | -Do we have two builders? |
441 | - |
442 | - >>> builder_set.count() |
443 | - 2 |
444 | - |
445 | -These are the builders available. |
446 | - |
447 | - >>> from canonical.launchpad.ftests import syncUpdate |
448 | - >>> for b in builder_set: |
449 | - ... b.builderok = True |
450 | - ... print "builder: name='%s', id=%d" % (b.name, b.id) |
451 | - ... syncUpdate(b) |
452 | - builder: name='bob', id=1 |
453 | - builder: name='frog', id=2 |
454 | - |
455 | -The 'alsa-utils' package is the one to be built (in the ubuntu/hoary |
456 | -distroseries). |
457 | - |
458 | - >>> from lp.registry.interfaces.distribution import IDistributionSet |
459 | - >>> ubuntu = getUtility(IDistributionSet)['ubuntu'] |
460 | - >>> hoary = ubuntu['hoary'] |
461 | - >>> hoary.main_archive.require_virtualized |
462 | - False |
463 | - |
464 | - >>> from lp.registry.interfaces.pocket import ( |
465 | - ... PackagePublishingPocket) |
466 | - >>> alsa_hoary = hoary.getSourcePackage('alsa-utils') |
467 | - >>> alsa_spr = alsa_hoary['1.0.9a-4'].sourcepackagerelease |
468 | - >>> print alsa_spr.title |
469 | - alsa-utils - 1.0.9a-4 |
470 | - |
471 | -Create new Build and BuildQueue instances (in ubuntu/hoary/i386) for |
472 | -the pending job. |
473 | - |
474 | - >>> from datetime import timedelta |
475 | - >>> from lp.buildmaster.enums import BuildStatus |
476 | - >>> alsa_build = alsa_spr.createBuild( |
477 | - ... hoary['i386'], PackagePublishingPocket.RELEASE, |
478 | - ... hoary.main_archive) |
479 | - >>> alsa_bqueue = alsa_build.queueBuild() |
480 | - >>> alsa_bqueue.lastscore = 500 |
481 | - >>> alsa_build.status = BuildStatus.NEEDSBUILD |
482 | - |
483 | -Access the currently building job via the builder. |
484 | - |
485 | - >>> from datetime import datetime |
486 | - >>> import pytz |
487 | - >>> bob_the_builder = builder_set.get(1) |
488 | - >>> cur_bqueue = bob_the_builder.currentjob |
489 | - >>> from lp.soyuz.interfaces.binarypackagebuild import ( |
490 | - ... IBinaryPackageBuildSet) |
491 | - >>> cur_build = getUtility(IBinaryPackageBuildSet).getByQueueEntry(cur_bqueue) |
492 | - |
493 | -Make sure the job at hand is currently being built. |
494 | - |
495 | - >>> cur_build.status == BuildStatus.BUILDING |
496 | - True |
497 | - |
498 | -The start time estimation mechanism for a pending job N depends on |
499 | -proper "build start time" and "estimated build duration" values for |
500 | -other jobs that are either currently building or pending but ahead |
501 | -of job N in the build queue. These values will now be set for the job |
502 | -that is currently building. |
503 | - |
504 | - >>> from zope.security.proxy import removeSecurityProxy |
505 | - >>> cur_bqueue.lastscore = 1111 |
506 | - >>> cur_bqueue.setDateStarted( |
507 | - ... datetime(2008, 4, 1, 10, 45, 39, tzinfo=pytz.UTC)) |
508 | - >>> print cur_bqueue.date_started |
509 | - 2008-04-01 10:45:39+00:00 |
510 | - |
511 | -Please note that the "estimated build duration" is an internal property |
512 | -and not meant to be viewed or modified by an end user. |
513 | - |
514 | - >>> removeSecurityProxy(cur_bqueue).estimated_duration = ( |
515 | - ... timedelta(minutes=56)) |
516 | - |
517 | -The estimated start time for the pending job is either now or lies |
518 | -in the future. |
519 | - |
520 | - >>> now = datetime.now(pytz.UTC) |
521 | - >>> def job_start_estimate(build): |
522 | - ... return build.buildqueue_record.getEstimatedJobStartTime() |
523 | - >>> estimate = job_start_estimate(alsa_build) |
524 | - >>> estimate > now |
525 | - True |
526 | - |
527 | -The estimated build start time may only be requested for jobs that are |
528 | -pending. |
529 | - |
530 | - >>> job_start_estimate(cur_build) |
531 | - Traceback (most recent call last): |
532 | - ... |
533 | - AssertionError: The start time is only estimated for pending jobs. |
534 | - |
535 | -Now let's add two PPA packages to the mix in order to show how builds |
536 | -associated with disabled archives get ignored when it comes to the calculation |
537 | -of estimated dispatch times. |
538 | - |
539 | -We first add a build for the 'pmount' source package to cprov's PPA. |
540 | - |
541 | - >>> from lp.registry.interfaces.person import IPersonSet |
542 | - >>> cprov = getUtility(IPersonSet).getByName('cprov') |
543 | - >>> [pmount_source] = cprov.archive.getPublishedSources( |
544 | - ... name='pmount', version='0.1-1') |
545 | - >>> pmount_spr = pmount_source.sourcepackagerelease |
546 | - >>> print pmount_spr.title |
547 | - pmount - 0.1-1 |
548 | - |
549 | - >>> pmount_build = pmount_spr.createBuild( |
550 | - ... hoary['i386'], PackagePublishingPocket.RELEASE, cprov.archive) |
551 | - >>> pmount_bqueue = pmount_build.queueBuild() |
552 | - >>> pmount_bqueue.lastscore = 66 |
553 | - >>> removeSecurityProxy(pmount_bqueue).estimated_duration = ( |
554 | - ... timedelta(minutes=12)) |
555 | - >>> pmount_build.status = BuildStatus.NEEDSBUILD |
556 | - |
557 | -Followed by another build for the 'iceweasel' source package that is added |
558 | -to mark's PPA. |
559 | - |
560 | - >>> mark = getUtility(IPersonSet).getByName('mark') |
561 | - >>> [iceweasel_source] = cprov.archive.getPublishedSources( |
562 | - ... name='iceweasel', version='1.0') |
563 | - >>> iceweasel_spr = iceweasel_source.sourcepackagerelease |
564 | - >>> print iceweasel_spr.title |
565 | - iceweasel - 1.0 |
566 | - |
567 | - >>> iceweasel_build = iceweasel_spr.createBuild( |
568 | - ... hoary['i386'], PackagePublishingPocket.RELEASE, mark.archive) |
569 | - >>> iceweasel_bqueue = iceweasel_build.queueBuild() |
570 | - >>> removeSecurityProxy(iceweasel_bqueue).estimated_duration = ( |
571 | - ... timedelta(minutes=48)) |
572 | - >>> iceweasel_bqueue.lastscore = 666 |
573 | - >>> iceweasel_build.status = BuildStatus.NEEDSBUILD |
574 | - |
575 | -Since the 'iceweasel' build has a higher score (666) than the 'pmount' |
576 | -build (66) its estimated dispatch time is essentially "now". |
577 | - |
578 | - >>> now = datetime.now(pytz.UTC) |
579 | - >>> estimate = job_start_estimate(iceweasel_build) |
580 | - >>> estimate > now |
581 | - True |
582 | - >>> estimate - now |
583 | - datetime.timedelta(0, 5, ...) |
584 | - |
585 | -The 'pmount' build comes next in the queue and its estimated dispatch |
586 | -time is the estimated build time of the 'iceweasel' package i.e. 2880 |
587 | -seconds (48 minutes * 60). |
588 | - |
589 | - >>> estimate = job_start_estimate(pmount_build) |
590 | - >>> estimate > now |
591 | - True |
592 | - >>> estimate - now |
593 | - datetime.timedelta(0, 2880, ...) |
594 | - |
595 | -Now mark's PPA will be disabled. This has the effect that all builds |
596 | -associated with it (i.e. the 'iceweasel' build) are ignored while |
597 | -calculating the estimated dispatch time and the latter becomes effectively |
598 | -"now" for the 'pmount' build. |
599 | - |
600 | - >>> mark.archive.disable() |
601 | - >>> syncUpdate(mark.archive) |
602 | - >>> estimate = job_start_estimate(pmount_build) |
603 | - >>> estimate > now |
604 | - True |
605 | - >>> estimate - now |
606 | - datetime.timedelta(0, 5, ...) |
607 | |
608 | === removed file 'lib/lp/soyuz/stories/soyuz/xx-build-redirect.txt' |
609 | --- lib/lp/soyuz/stories/soyuz/xx-build-redirect.txt 2011-01-12 23:07:40 +0000 |
610 | +++ lib/lp/soyuz/stories/soyuz/xx-build-redirect.txt 1970-01-01 00:00:00 +0000 |
611 | @@ -1,8 +0,0 @@ |
612 | -= URL redirection for build pages = |
613 | - |
614 | -When users go to the old build URLs, they are redirected to the equivalent |
615 | -new URLs. |
616 | - |
617 | - >>> anon_browser.open("http://launchpad.dev/+builds/+build/18") |
618 | - >>> anon_browser.url |
619 | - 'http://launchpad.dev/ubuntu/+source/mozilla-firefox/0.9/+buildjob/18' |
620 | |
621 | === added file 'lib/lp/soyuz/tests/test_build.py' |
622 | --- lib/lp/soyuz/tests/test_build.py 1970-01-01 00:00:00 +0000 |
623 | +++ lib/lp/soyuz/tests/test_build.py 2011-01-16 22:48:58 +0000 |
624 | @@ -0,0 +1,255 @@ |
625 | +# Copyright 2011 Canonical Ltd. This software is licensed under the |
626 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
627 | + |
628 | +__metaclass__ = type |
629 | + |
630 | +from datetime import ( |
631 | + datetime, |
632 | + timedelta, |
633 | + ) |
634 | +import pytz |
635 | +from zope.component import getUtility |
636 | +from zope.security.proxy import removeSecurityProxy |
637 | + |
638 | +from canonical.testing.layers import LaunchpadFunctionalLayer |
639 | +from lp.buildmaster.enums import BuildStatus |
640 | +from lp.registry.interfaces.person import IPersonSet |
641 | +from lp.registry.interfaces.pocket import PackagePublishingPocket |
642 | +from lp.registry.interfaces.series import SeriesStatus |
643 | +from lp.soyuz.enums import ( |
644 | + BinaryPackageFormat, |
645 | + PackagePublishingPriority, |
646 | + PackageUploadStatus, |
647 | + ) |
648 | +from lp.soyuz.interfaces.publishing import PackagePublishingStatus |
649 | +from lp.soyuz.tests.test_publishing import SoyuzTestPublisher |
650 | +from lp.testing import ( |
651 | + person_logged_in, |
652 | + TestCaseWithFactory, |
653 | + ) |
654 | +from lp.testing.sampledata import ADMIN_EMAIL |
655 | + |
656 | + |
657 | +class TestBuild(TestCaseWithFactory): |
658 | + |
659 | + layer = LaunchpadFunctionalLayer |
660 | + |
661 | + def setUp(self): |
662 | + super(TestBuild, self).setUp() |
663 | + self.admin = getUtility(IPersonSet).getByEmail(ADMIN_EMAIL) |
664 | + self.pf = self.factory.makeProcessorFamily() |
665 | + pf_proc = self.pf.addProcessor(self.factory.getUniqueString(), '', '') |
666 | + self.distroseries = self.factory.makeDistroSeries() |
667 | + self.das = self.factory.makeDistroArchSeries( |
668 | + distroseries=self.distroseries, processorfamily=self.pf, |
669 | + supports_virtualized=True) |
670 | + with person_logged_in(self.admin): |
671 | + self.publisher = SoyuzTestPublisher() |
672 | + self.publisher.prepareBreezyAutotest() |
673 | + self.distroseries.nominatedarchindep = self.das |
674 | + self.publisher.addFakeChroots(distroseries=self.distroseries) |
675 | + self.builder = self.factory.makeBuilder(processor=pf_proc) |
676 | + |
677 | + def test_title(self): |
678 | + # A build has a title which describes the context source version and |
679 | + # in which series and architecture it is targeted for. |
680 | + spph = self.publisher.getPubSource( |
681 | + sourcename=self.factory.getUniqueString(), |
682 | + version="%s.1" % self.factory.getUniqueInteger(), |
683 | + distroseries=self.distroseries) |
684 | + [build] = spph.createMissingBuilds() |
685 | + expected_title = '%s build of %s %s in %s %s RELEASE' % ( |
686 | + self.das.architecturetag, spph.source_package_name, |
687 | + spph.source_package_version, self.distroseries.distribution.name, |
688 | + self.distroseries.name) |
689 | + self.assertEquals(expected_title, build.title) |
690 | + |
691 | + def test_linking(self): |
692 | + # A build directly links to the archive, distribution, distroseries, |
693 | + # distroarchseries, pocket in its context and also the source version |
694 | + # that generated it. |
695 | + spph = self.publisher.getPubSource( |
696 | + sourcename=self.factory.getUniqueString(), |
697 | + version="%s.1" % self.factory.getUniqueInteger(), |
698 | + distroseries=self.distroseries) |
699 | + [build] = spph.createMissingBuilds() |
700 | + self.assertEquals(self.distroseries.main_archive, build.archive) |
701 | + self.assertEquals(self.distroseries.distribution, build.distribution) |
702 | + self.assertEquals(self.distroseries, build.distro_series) |
703 | + self.assertEquals(self.das, build.distro_arch_series) |
704 | + self.assertEquals(PackagePublishingPocket.RELEASE, build.pocket) |
705 | + self.assertEquals(self.das.architecturetag, build.arch_tag) |
706 | + self.assertTrue(build.is_virtualized) |
707 | + self.assertEquals( |
708 | + '%s - %s' % (spph.source_package_name, |
709 | + spph.source_package_version), |
710 | + build.source_package_release.title) |
711 | + |
712 | + def test_processed_builds(self): |
713 | + # Builds which were already processed also offer additional |
714 | + # information about their processing, such as when they started and |
715 | + # finished, and their 'log' and 'upload_changesfile' as librarian files. |
716 | + spn = self.factory.getUniqueString() |
717 | + version = "%s.1" % self.factory.getUniqueInteger() |
718 | + spph = self.publisher.getPubSource( |
719 | + sourcename=spn, version=version, |
720 | + distroseries=self.distroseries, |
721 | + status=PackagePublishingStatus.PUBLISHED) |
722 | + with person_logged_in(self.admin): |
723 | + binary = self.publisher.getPubBinaries(binaryname=spn, |
724 | + distroseries=self.distroseries, pub_source=spph, |
725 | + version=version, builder=self.builder) |
726 | + build = binary[0].binarypackagerelease.build |
727 | + self.assertTrue(build.was_built) |
728 | + self.assertEquals( |
729 | + PackageUploadStatus.DONE, build.package_upload.status) |
730 | + self.assertEquals( |
731 | + datetime(2008, 01, 01, 0, 0, 0, tzinfo=pytz.UTC), |
732 | + build.date_started) |
733 | + self.assertEquals( |
734 | + datetime(2008, 01, 01, 0, 5, 0, tzinfo=pytz.UTC), |
735 | + build.date_finished) |
736 | + self.assertEquals(timedelta(minutes=5), build.duration) |
737 | + expected_buildlog = 'buildlog_%s-%s-%s.%s_%s_FULLYBUILT.txt.gz' % ( |
738 | + self.distroseries.distribution.name, self.distroseries.name, |
739 | + self.das.architecturetag, spn, version) |
740 | + self.assertEquals(expected_buildlog, build.log.filename) |
741 | + url_start = ( |
742 | + 'http://launchpad.dev/%s/+source/%s/%s/+buildjob/%s/+files' % ( |
743 | + self.distroseries.distribution.name, spn, version, build.id)) |
744 | + expected_buildlog_url = '%s/%s' % (url_start, expected_buildlog) |
745 | + self.assertEquals(expected_buildlog_url, build.log_url) |
746 | + expected_changesfile = '%s_%s_%s.changes' % ( |
747 | + spn, version, self.das.architecturetag) |
748 | + self.assertEquals( |
749 | + expected_changesfile, build.upload_changesfile.filename) |
750 | + expected_changesfile_url = '%s/%s' % (url_start, expected_changesfile) |
751 | + self.assertEquals(expected_changesfile_url, build.changesfile_url) |
752 | + # Since this build was successful, it cannot be retried
753 | + self.assertFalse(build.can_be_retried) |
754 | + |
755 | + def test_current_component(self): |
756 | + # The currently published component is provided via the |
757 | + # 'current_component' property. It looks over the publishing records |
758 | + # and finds the current publication of the source in question. |
759 | + spph = self.publisher.getPubSource( |
760 | + sourcename=self.factory.getUniqueString(), |
761 | + version="%s.1" % self.factory.getUniqueInteger(), |
762 | + distroseries=self.distroseries) |
763 | + [build] = spph.createMissingBuilds() |
764 | + self.assertEquals('main', build.current_component.name) |
765 | + # It may not be the same as the source package release's component.
766 | + self.assertEquals('main', build.source_package_release.component.name) |
767 | + # If the package has no uploads, its package_upload is None |
768 | + self.assertEquals(None, build.package_upload) |
769 | + |
770 | + def test_retry_for_released_series(self): |
771 | + # Builds cannot be retried for a released distroseries
772 | + distroseries = self.factory.makeDistroSeries() |
773 | + das = self.factory.makeDistroArchSeries( |
774 | + distroseries=distroseries, processorfamily=self.pf, |
775 | + supports_virtualized=True) |
776 | + with person_logged_in(self.admin): |
777 | + distroseries.nominatedarchindep = das |
778 | + distroseries.status = SeriesStatus.OBSOLETE |
779 | + self.publisher.addFakeChroots(distroseries=distroseries) |
780 | + spph = self.publisher.getPubSource( |
781 | + sourcename=self.factory.getUniqueString(), |
782 | + version="%s.1" % self.factory.getUniqueInteger(), |
783 | + distroseries=distroseries) |
784 | + [build] = spph.createMissingBuilds() |
785 | + self.assertFalse(build.can_be_retried) |
786 | + |
787 | + def test_retry(self): |
788 | + # A failed build can be retried
789 | + spph = self.publisher.getPubSource( |
790 | + sourcename=self.factory.getUniqueString(), |
791 | + version="%s.1" % self.factory.getUniqueInteger(), |
792 | + distroseries=self.distroseries) |
793 | + [build] = spph.createMissingBuilds() |
794 | + with person_logged_in(self.admin): |
795 | + build.status = BuildStatus.FAILEDTOBUILD |
796 | + self.assertTrue(build.can_be_retried) |
797 | + |
798 | + def test_uploadlog(self): |
799 | + # The upload log can be attached to a build |
800 | + spph = self.publisher.getPubSource( |
801 | + sourcename=self.factory.getUniqueString(), |
802 | + version="%s.1" % self.factory.getUniqueInteger(), |
803 | + distroseries=self.distroseries) |
804 | + [build] = spph.createMissingBuilds() |
805 | + self.assertEquals(None, build.upload_log) |
806 | + self.assertEquals(None, build.upload_log_url) |
807 | + build.storeUploadLog('sample upload log') |
808 | + expected_filename = 'upload_%s_log.txt' % build.id |
809 | + self.assertEquals(expected_filename, build.upload_log.filename) |
810 | + url_start = ( |
811 | + 'http://launchpad.dev/%s/+source/%s/%s/+buildjob/%s/+files' % ( |
812 | + self.distroseries.distribution.name, spph.source_package_name, |
813 | + spph.source_package_version, build.id)) |
814 | + expected_url = '%s/%s' % (url_start, expected_filename) |
815 | + self.assertEquals(expected_url, build.upload_log_url) |
816 | + |
817 | + def test_retry_does_not_modify_first_dispatch(self): |
818 | + # Retrying a build does not modify the first dispatch time of the |
819 | + # build |
820 | + spph = self.publisher.getPubSource( |
821 | + sourcename=self.factory.getUniqueString(), |
822 | + version="%s.1" % self.factory.getUniqueInteger(), |
823 | + distroseries=self.distroseries) |
824 | + [build] = spph.createMissingBuilds() |
825 | + now = datetime.now(pytz.UTC) |
826 | + with person_logged_in(self.admin): |
827 | + build.status = BuildStatus.FAILEDTOBUILD |
828 | + # The build can't be queued if we're going to retry it |
829 | + build.buildqueue_record.destroySelf() |
830 | + removeSecurityProxy(build).date_first_dispatched = now |
831 | + with person_logged_in(self.admin): |
832 | + build.retry() |
833 | + self.assertEquals(BuildStatus.NEEDSBUILD, build.status) |
834 | + self.assertEquals(now, build.date_first_dispatched) |
835 | + self.assertEquals(None, build.log) |
836 | + self.assertEquals(None, build.upload_log) |
837 | + |
838 | + def test_create_bpr(self): |
839 | + # A BinaryPackageRelease can be created from a given build.
840 | + spn = self.factory.getUniqueString() |
841 | + version = "%s.1" % self.factory.getUniqueInteger() |
842 | + bpn = self.factory.makeBinaryPackageName(name=spn) |
843 | + spph = self.publisher.getPubSource( |
844 | + sourcename=spn, version=version, distroseries=self.distroseries) |
845 | + [build] = spph.createMissingBuilds() |
846 | + binary = build.createBinaryPackageRelease( |
847 | + binarypackagename=bpn, version=version, summary='', |
848 | + description='', binpackageformat=BinaryPackageFormat.DEB, |
849 | + component=spph.sourcepackagerelease.component.id, |
850 | + section=spph.sourcepackagerelease.section.id, |
851 | + priority=PackagePublishingPriority.STANDARD, installedsize=0, |
852 | + architecturespecific=False) |
853 | + self.assertEquals(1, build.binarypackages.count()) |
854 | + self.assertEquals([binary], list(build.binarypackages)) |
855 | + |
856 | + def test_multiple_create_bpr(self): |
857 | + # We can create multiple BPRs from a build |
858 | + spn = self.factory.getUniqueString() |
859 | + version = "%s.1" % self.factory.getUniqueInteger() |
860 | + spph = self.publisher.getPubSource( |
861 | + sourcename=spn, version=version, distroseries=self.distroseries) |
862 | + [build] = spph.createMissingBuilds() |
863 | + expected_names = [] |
864 | + for i in range(15): |
865 | + bpn_name = '%s-%s' % (spn, i) |
866 | + bpn = self.factory.makeBinaryPackageName(bpn_name) |
867 | + expected_names.append(bpn_name) |
868 | + binary = build.createBinaryPackageRelease( |
869 | + binarypackagename=bpn, version=str(i), summary='', |
870 | + description='', binpackageformat=BinaryPackageFormat.DEB, |
871 | + component=spph.sourcepackagerelease.component.id, |
872 | + section=spph.sourcepackagerelease.section.id, |
873 | + priority=PackagePublishingPriority.STANDARD, installedsize=0, |
874 | + architecturespecific=False) |
875 | + self.assertEquals(15, build.binarypackages.count()) |
876 | + bin_names = [b.name for b in build.binarypackages] |
877 | + # Verify that .binarypackages returns its results sorted by name
878 | + expected_names.sort() |
879 | + self.assertEquals(expected_names, bin_names) |
880 | |
881 | === added file 'lib/lp/soyuz/tests/test_build_set.py' |
882 | --- lib/lp/soyuz/tests/test_build_set.py 1970-01-01 00:00:00 +0000 |
883 | +++ lib/lp/soyuz/tests/test_build_set.py 2011-01-16 22:48:58 +0000 |
884 | @@ -0,0 +1,154 @@ |
885 | +# Copyright 2011 Canonical Ltd. This software is licensed under the |
886 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
887 | + |
888 | +__metaclass__ = type |
889 | + |
890 | +from datetime import ( |
891 | + datetime, |
892 | + timedelta, |
893 | + ) |
894 | +import pytz |
895 | +from zope.component import getUtility |
896 | +from zope.security.proxy import removeSecurityProxy |
897 | +from storm.store import EmptyResultSet |
898 | + |
899 | +from canonical.testing.layers import LaunchpadFunctionalLayer |
900 | +from lp.buildmaster.enums import BuildStatus |
901 | +from lp.registry.interfaces.person import IPersonSet |
902 | +from lp.registry.interfaces.pocket import PackagePublishingPocket |
903 | +from lp.soyuz.enums import ArchivePurpose |
904 | +from lp.soyuz.interfaces.binarypackagebuild import ( |
905 | + BuildSetStatus, |
906 | + IBinaryPackageBuildSet, |
907 | + ) |
908 | +from lp.soyuz.tests.test_publishing import SoyuzTestPublisher |
909 | +from lp.testing import ( |
910 | + person_logged_in, |
911 | + TestCaseWithFactory, |
912 | + ) |
913 | +from lp.testing.sampledata import ADMIN_EMAIL |
914 | + |
915 | +class TestBuildSet(TestCaseWithFactory): |
916 | + |
917 | + layer = LaunchpadFunctionalLayer |
918 | + |
919 | + def setUp(self): |
920 | + super(TestBuildSet, self).setUp() |
921 | + self.admin = getUtility(IPersonSet).getByEmail(ADMIN_EMAIL) |
922 | + self.pf_one = self.factory.makeProcessorFamily() |
923 | + pf_proc_1 = self.pf_one.addProcessor( |
924 | + self.factory.getUniqueString(), '', '') |
925 | + self.pf_two = self.factory.makeProcessorFamily() |
926 | + pf_proc_2 = self.pf_two.addProcessor( |
927 | + self.factory.getUniqueString(), '', '') |
928 | + self.distroseries = self.factory.makeDistroSeries() |
929 | + self.distribution = self.distroseries.distribution |
930 | + self.das_one = self.factory.makeDistroArchSeries( |
931 | + distroseries=self.distroseries, processorfamily=self.pf_one, |
932 | + supports_virtualized=True) |
933 | + self.das_two = self.factory.makeDistroArchSeries( |
934 | + distroseries=self.distroseries, processorfamily=self.pf_two, |
935 | + supports_virtualized=True) |
936 | + self.archive = self.factory.makeArchive( |
937 | + distribution=self.distroseries.distribution, |
938 | + purpose=ArchivePurpose.PRIMARY) |
939 | + self.arch_ids = [arch.id for arch in self.distroseries.architectures] |
940 | + with person_logged_in(self.admin): |
941 | + self.publisher = SoyuzTestPublisher() |
942 | + self.publisher.prepareBreezyAutotest() |
943 | + self.distroseries.nominatedarchindep = self.das_one |
944 | + self.publisher.addFakeChroots(distroseries=self.distroseries) |
945 | + self.builder_one = self.factory.makeBuilder(processor=pf_proc_1) |
946 | + self.builder_two = self.factory.makeBuilder(processor=pf_proc_2) |
947 | + self.builds = [] |
948 | + |
949 | + def setUpBuilds(self): |
950 | + for i in range(5): |
951 | + # Create some test builds |
952 | + spph = self.publisher.getPubSource( |
953 | + sourcename=self.factory.getUniqueString(), |
954 | + version="%s.%s" % (self.factory.getUniqueInteger(), i), |
955 | + distroseries=self.distroseries, architecturehintlist='any') |
956 | + builds = spph.createMissingBuilds() |
957 | + with person_logged_in(self.admin): |
958 | + for b in builds: |
959 | + if i == 4: |
960 | + b.status = BuildStatus.FAILEDTOBUILD |
961 | + else: |
962 | + b.status = BuildStatus.FULLYBUILT |
963 | + b.buildqueue_record.destroySelf() |
964 | + b.date_started = datetime.now(pytz.UTC) |
965 | + b.date_finished = b.date_started + timedelta(minutes=5) |
966 | + self.builds += builds |
967 | + |
968 | + def test_get_by_spr(self): |
969 | + # Test fetching build records via the SPR |
970 | + self.setUpBuilds() |
971 | + spr = self.builds[0].source_package_release.id |
972 | + set = getUtility(IBinaryPackageBuildSet).getBuildBySRAndArchtag( |
973 | + spr, self.das_one.architecturetag) |
974 | + self.assertEquals(set.count(), 1) |
975 | + self.assertEquals(set[0], self.builds[0]) |
976 | + |
977 | + def test_get_by_arch_ids(self): |
978 | + # Test fetching builds via a list of architecture IDs
979 | + self.setUpBuilds() |
980 | + set = getUtility(IBinaryPackageBuildSet).getBuildsByArchIds( |
981 | + self.distribution, self.arch_ids) |
982 | + self.assertEquals(set.count(), 10) |
983 | + |
984 | + def test_get_by_no_arch_ids(self): |
985 | + # .getBuildsByArchIds still works if the list given is empty or None
986 | + set = getUtility(IBinaryPackageBuildSet).getBuildsByArchIds( |
987 | + self.distribution, []) |
988 | + self.assertIsInstance(set, EmptyResultSet) |
989 | + set = getUtility(IBinaryPackageBuildSet).getBuildsByArchIds( |
990 | + self.distribution, None) |
991 | + self.assertIsInstance(set, EmptyResultSet) |
992 | + |
993 | + def test_get_by_arch_ids_filter_build_status(self): |
994 | + # The result can be filtered based on the build status |
995 | + self.setUpBuilds() |
996 | + set = getUtility(IBinaryPackageBuildSet).getBuildsByArchIds( |
997 | + self.distribution, self.arch_ids, status=BuildStatus.FULLYBUILT) |
998 | + self.assertEquals(set.count(), 8) |
999 | + |
1000 | + def test_get_by_arch_ids_filter_name(self): |
1001 | + # The result can be filtered based on the name |
1002 | + self.setUpBuilds() |
1003 | + spn = self.builds[2].source_package_release.sourcepackagename.name |
1004 | + set = getUtility(IBinaryPackageBuildSet).getBuildsByArchIds( |
1005 | + self.distribution, self.arch_ids, name=spn) |
1006 | + self.assertEquals(set.count(), 2) |
1007 | + |
1008 | + def test_get_by_arch_ids_filter_pocket(self): |
1009 | + # The result can be filtered based on the pocket of the build |
1010 | + self.setUpBuilds() |
1011 | + set = getUtility(IBinaryPackageBuildSet).getBuildsByArchIds( |
1012 | + self.distribution, self.arch_ids, |
1013 | + pocket=PackagePublishingPocket.RELEASE) |
1014 | + self.assertEquals(set.count(), 10) |
1015 | + set = getUtility(IBinaryPackageBuildSet).getBuildsByArchIds( |
1016 | + self.distribution, self.arch_ids, |
1017 | + pocket=PackagePublishingPocket.UPDATES) |
1018 | + self.assertEquals(set.count(), 0) |
1019 | + |
1020 | + def test_get_status_summary_for_builds(self): |
1021 | + # We can query for the status summary of a number of builds |
1022 | + self.setUpBuilds() |
1023 | + relevant_builds = [self.builds[0], self.builds[2], self.builds[-2]] |
1024 | + summary = getUtility( |
1025 | + IBinaryPackageBuildSet).getStatusSummaryForBuilds( |
1026 | + relevant_builds) |
1027 | + self.assertEquals(summary['status'], BuildSetStatus.FAILEDTOBUILD) |
1028 | + self.assertEquals(summary['builds'], [self.builds[-2]]) |
1029 | + |
1030 | + def test_preload_data(self): |
1031 | + # The BuildSet class allows build data to be prefetched.
1032 | + # Note that _prefetchBuildData is an internal method, so we have
1033 | + # to push past the security proxy.
1034 | + self.setUpBuilds() |
1035 | + build_ids = [self.builds[i] for i in (0, 1, 2, 3)] |
1036 | + rset = removeSecurityProxy( |
1037 | + getUtility(IBinaryPackageBuildSet))._prefetchBuildData(build_ids) |
1038 | + self.assertEquals(len(rset), 4) |
1039 | |
1040 | === added file 'lib/lp/soyuz/tests/test_build_start_estimation.py' |
1041 | --- lib/lp/soyuz/tests/test_build_start_estimation.py 1970-01-01 00:00:00 +0000 |
1042 | +++ lib/lp/soyuz/tests/test_build_start_estimation.py 2011-01-16 22:48:58 +0000 |
1043 | @@ -0,0 +1,88 @@ |
1044 | +# Copyright 2011 Canonical Ltd. This software is licensed under the |
1045 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
1046 | + |
1047 | +__metaclass__ = type |
1048 | + |
1049 | +from datetime import ( |
1050 | + datetime, |
1051 | + timedelta, |
1052 | + ) |
1053 | +import pytz |
1054 | +from zope.component import getUtility |
1055 | +from zope.security.proxy import removeSecurityProxy |
1056 | + |
1057 | +from canonical.testing.layers import LaunchpadFunctionalLayer |
1058 | +from lp.buildmaster.interfaces.builder import IBuilderSet |
1059 | +from lp.registry.interfaces.person import IPersonSet |
1060 | +from lp.soyuz.tests.test_publishing import SoyuzTestPublisher |
1061 | +from lp.testing import ( |
1062 | + person_logged_in, |
1063 | + TestCaseWithFactory, |
1064 | + ) |
1065 | +from lp.testing.sampledata import ( |
1066 | + ADMIN_EMAIL, |
1067 | + BOB_THE_BUILDER_NAME, |
1068 | + ) |
1069 | + |
1070 | + |
1071 | +class TestBuildStartEstimation(TestCaseWithFactory): |
1072 | + |
1073 | + layer = LaunchpadFunctionalLayer |
1074 | + |
1075 | + def setUp(self): |
1076 | + super(TestBuildStartEstimation, self).setUp() |
1077 | + self.admin = getUtility(IPersonSet).getByEmail(ADMIN_EMAIL) |
1078 | + with person_logged_in(self.admin): |
1079 | + self.publisher = SoyuzTestPublisher() |
1080 | + self.publisher.prepareBreezyAutotest() |
1081 | + for buildd in getUtility(IBuilderSet): |
1082 | + buildd.builderok = True |
1083 | + self.distroseries = self.factory.makeDistroSeries() |
1084 | + self.bob = getUtility(IBuilderSet).getByName(BOB_THE_BUILDER_NAME) |
1085 | + das = self.factory.makeDistroArchSeries( |
1086 | + distroseries=self.distroseries, |
1087 | + processorfamily=self.bob.processor.id, |
1088 | + architecturetag='i386', supports_virtualized=True) |
1089 | + with person_logged_in(self.admin): |
1090 | + self.distroseries.nominatedarchindep = das |
1091 | + self.publisher.addFakeChroots(distroseries=self.distroseries) |
1092 | + |
1093 | + def job_start_estimate(self, build): |
1094 | + return build.buildqueue_record.getEstimatedJobStartTime() |
1095 | + |
1096 | + def test_estimation(self): |
1097 | + pkg = self.publisher.getPubSource( |
1098 | + sourcename=self.factory.getUniqueString(), |
1099 | + distroseries=self.distroseries) |
1100 | + build = pkg.createMissingBuilds()[0] |
1101 | + now = datetime.now(pytz.UTC) |
1102 | + estimate = self.job_start_estimate(build) |
1103 | + self.assertTrue(estimate > now) |
1104 | + |
1105 | + def test_disabled_archives(self): |
1106 | + pkg1 = self.publisher.getPubSource( |
1107 | + sourcename=self.factory.getUniqueString(), |
1108 | + distroseries=self.distroseries) |
1109 | + [build1] = pkg1.createMissingBuilds() |
1110 | + build1.buildqueue_record.lastscore = 1000 |
1111 | + # No user-serviceable parts inside |
1112 | + removeSecurityProxy(build1.buildqueue_record).estimated_duration = ( |
1113 | + timedelta(minutes=10)) |
1114 | + pkg2 = self.publisher.getPubSource( |
1115 | + sourcename=self.factory.getUniqueString(), |
1116 | + distroseries=self.distroseries) |
1117 | + [build2] = pkg2.createMissingBuilds() |
1118 | + build2.buildqueue_record.lastscore = 100 |
1119 | + now = datetime.now(pytz.UTC) |
1120 | + # Since build1 is higher priority, its estimated dispatch time is now
1121 | + estimate = self.job_start_estimate(build1) |
1122 | + self.assertEquals(5, (estimate - now).seconds) |
1123 | + # And build2 is next, so it must take build1's duration into account
1124 | + estimate = self.job_start_estimate(build2) |
1125 | + self.assertEquals(600, (estimate - now).seconds) |
1126 | + # If we disable build1's archive, build2 is next |
1127 | + with person_logged_in(self.admin): |
1128 | + build1.archive.disable() |
1129 | + estimate = self.job_start_estimate(build2) |
1130 | + self.assertEquals(5, (estimate - now).seconds) |
1131 | + |
1132 | |
1133 | === modified file 'lib/lp/soyuz/tests/test_doc.py' |
1134 | --- lib/lp/soyuz/tests/test_doc.py 2010-11-06 12:50:22 +0000 |
1135 | +++ lib/lp/soyuz/tests/test_doc.py 2011-01-16 22:48:58 +0000 |
1136 | @@ -196,11 +196,6 @@ |
1137 | setUp=manageChrootSetup, |
1138 | layer=LaunchpadZopelessLayer, |
1139 | ), |
1140 | - 'build-estimated-dispatch-time.txt': LayeredDocFileSuite( |
1141 | - '../doc/build-estimated-dispatch-time.txt', |
1142 | - setUp=builddmasterSetUp, |
1143 | - layer=LaunchpadZopelessLayer, |
1144 | - ), |
1145 | 'package-arch-specific.txt': LayeredDocFileSuite( |
1146 | '../doc/package-arch-specific.txt', |
1147 | setUp=builddmasterSetUp, |
1148 | |
1149 | === modified file 'lib/lp/testing/factory.py' |
1150 | --- lib/lp/testing/factory.py 2011-01-11 22:05:11 +0000 |
1151 | +++ lib/lp/testing/factory.py 2011-01-16 22:48:58 +0000 |
1152 | @@ -2127,7 +2127,7 @@ |
1153 | processorfamily = self.makeProcessorFamily() |
1154 | if owner is None: |
1155 | owner = self.makePerson() |
1156 | - # XXX: architecturetag & processerfamily are tightly coupled. It's |
1157 | + # XXX: architecturetag & processorfamily are tightly coupled. It's |
1158 | # wrong to just make a fresh architecture tag without also making a |
1159 | # processor family to go with it (ideally with processors!) |
1160 | if architecturetag is None: |
Hi Steven,
thank you for this nice clean-up. I am very much for any replacement of doctests with unit tests. Good job!
I have a few issues that I'd like you to consider, though. I am aware that they probably arise from the very fact that these tests are converted doctests, but since they are newly created by you now, we should take care that they conform to our test style.
1. Please do not begin comments in tests with "Test that ..." or similar. It is rather obvious that test methods are testing stuff, checking conditions, making sure things work.
2. We have agreed to use the "expected, observed" order in test assertions. Old tests won't be converted but new tests should certainly follow that convention.
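For what it's worth, the "expected, observed" convention can be sketched like this (a hypothetical, self-contained test, not code from this branch):

```python
import unittest


class ArgumentOrderExample(unittest.TestCase):
    # Illustrates the "expected, observed" argument order: the value we
    # expect is the first argument to the assertion, and the value
    # produced by the code under test is the second.

    def test_expected_comes_first(self):
        observed = sorted(['breezy', 'autotest'])
        self.assertEqual(['autotest', 'breezy'], observed)


# Run the example so the sketch is self-checking.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(ArgumentOrderExample))
```

Keeping the order consistent matters because failure messages typically label the first argument as the expected value, so a swapped order produces misleading output.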
3. Something that easily happens in these conversions is that the test methods get very long and mutate test data between assertions. Ideally, each unit test is short and concise, with one assertion per test. I don't insist on the "one assertion per test" paradigm, although I find it useful, but I ask you to please split up your tests into much smaller pieces. If you feel that too much setup code is wasted on too small a test, put that code into a "_make..." or "_setUp..." method. It might also be a good idea to only use those kinds of factories/fixtures and do without a global "setUp" method. I just lately found how hard it is to read unit tests if you need to go back to setUp to remind yourself of what the test data looks like.
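The per-test factory style suggested here could look roughly like the sketch below. All the names are hypothetical, and plain dicts stand in for real objects; an actual Launchpad test would build its data with self.factory or SoyuzTestPublisher instead.

```python
import unittest


class MakeHelperExample(unittest.TestCase):
    """Sketch of per-test factory helpers replacing a large shared setUp."""

    def _makeBuild(self, status='NEEDSBUILD'):
        # Stand-in helper: each test constructs exactly the data it
        # needs, so readers never have to scroll back to a shared setUp
        # to see what the test data looks like.
        return {'status': status, 'log': None}

    def test_new_build_needs_building(self):
        build = self._makeBuild()
        self.assertEqual('NEEDSBUILD', build['status'])

    def test_failed_build_reports_failure(self):
        build = self._makeBuild(status='FAILEDTOBUILD')
        self.assertEqual('FAILEDTOBUILD', build['status'])


# Run the example so the sketch is self-checking.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(MakeHelperExample))
```

Each test stays a few lines long, and the helper's keyword arguments document exactly which piece of state a given test cares about.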
Cheers,
Henning