Merge lp:~michael.nelson/launchpad/567922-binarypackagebuild-new-table-4 into lp:launchpad/db-devel
| Status: | Merged |
|---|---|
| Approved by: | Edwin Grubbs on 2010-05-12 |
| Approved revision: | no longer in the source branch. |
| Merged at revision: | 9405 |
| Proposed branch: | lp:~michael.nelson/launchpad/567922-binarypackagebuild-new-table-4 |
| Merge into: | lp:launchpad/db-devel |
| Prerequisite: | lp:~michael.nelson/launchpad/567922-binarypackagebuild-new-table-3 |
| To merge this branch: | bzr merge lp:~michael.nelson/launchpad/567922-binarypackagebuild-new-table-4 |
| Related bugs: | |

Diff against target: 785 lines (+163/-116) (has conflicts), 21 files modified:

- lib/lp/archiveuploader/nascentupload.py (+2/-2)
- lib/lp/archiveuploader/nascentuploadfile.py (+4/-4)
- lib/lp/buildmaster/browser/configure.zcml (+2/-1)
- lib/lp/buildmaster/browser/packagebuild.py (+0/-42)
- lib/lp/buildmaster/interfaces/buildfarmjob.py (+6/-1)
- lib/lp/buildmaster/model/buildfarmjob.py (+7/-0)
- lib/lp/buildmaster/tests/test_buildfarmjob.py (+24/-0)
- lib/lp/soyuz/adapters/archivedependencies.py (+2/-2)
- lib/lp/soyuz/doc/archive-dependencies.txt (+9/-8)
- lib/lp/soyuz/doc/archive-files.txt (+1/-1)
- lib/lp/soyuz/doc/archive.txt (+2/-2)
- lib/lp/soyuz/doc/build-failedtoupload-workflow.txt (+4/-4)
- lib/lp/soyuz/model/archive.py (+9/-6)
- lib/lp/soyuz/model/binarypackagebuild.py (+33/-20)
- lib/lp/soyuz/model/binarypackagerelease.py (+3/-3)
- lib/lp/soyuz/model/distributionsourcepackagerelease.py (+10/-6)
- lib/lp/soyuz/model/publishing.py (+6/-5)
- lib/lp/soyuz/model/queue.py (+2/-2)
- lib/lp/soyuz/model/sourcepackagerelease.py (+7/-5)
- lib/lp/soyuz/tests/test_binarypackagebuild.py (+28/-0)
- lib/lp/soyuz/tests/test_packageupload.py (+2/-2)

Text conflict in lib/lp/buildmaster/interfaces/buildbase.py
Text conflict in lib/lp/buildmaster/model/buildbase.py
Text conflict in lib/lp/buildmaster/tests/test_buildbase.py
Text conflict in lib/lp/code/model/sourcepackagerecipebuild.py
| Reviewer | Review Type | Date Requested | Status |
|---|---|---|---|
| Edwin Grubbs (community) | code | 2010-05-12 | Approve on 2010-05-12 |
Review via email:
Commit Message
Description of the Change
This branch is part of a pipeline for
https:/
https:/
**Note**: If possible, please ignore the conflicts with db-devel. They are due to a reversion of some work that was in db-devel and that I had already pumped through the pipeline; I'm waiting for that work to land again on db-devel before re-merging and pumping.
The actual diff of this branch from the previous is:
http://
Overview
========
This branch continues the work to switch our BinaryPackageBuild class to the new binarypackagebuild table (using the delegated PackageBuild/
It (finally) gets all the soyuz unit tests passing.
Details
=======
This branch just gets the remaining soyuz unit tests passing and starts on the soyuz doctests.
This branch is dependent on the pending schema patch in a previous branch.
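As background for reviewers, the delegation this pipeline moves toward can be sketched roughly as follows. This is a minimal, hypothetical illustration: the attribute names (`status`, `archive`, `package_build`, `build_farm_job`) follow the diff, but the classes themselves are simplified stand-ins, not Launchpad's actual Storm models.

```python
from dataclasses import dataclass


@dataclass
class BuildFarmJob:
    """Stand-in for the generic buildfarmjob row."""
    status: str = "NEEDSBUILD"


@dataclass
class PackageBuild:
    """Stand-in for the packagebuild row linking a build farm
    job to archive-specific data."""
    build_farm_job: BuildFarmJob
    archive: str


class BinaryPackageBuild:
    """Stand-in for the new binarypackagebuild table: generic build
    state is reached by delegating through PackageBuild."""

    def __init__(self, package_build: PackageBuild):
        self.package_build = package_build

    @property
    def status(self) -> str:
        # Generic status lives on BuildFarmJob; this replaces the old
        # direct `buildstate` column on the Build table.
        return self.package_build.build_farm_job.status

    @property
    def archive(self) -> str:
        return self.package_build.archive


build = BinaryPackageBuild(PackageBuild(BuildFarmJob(), archive="ppa"))
print(build.status)   # NEEDSBUILD
print(build.archive)  # ppa
```

This is why the queries in the diff (e.g. in `archive.py`) now join through `BinaryPackageBuild.package_build == PackageBuild.id` and `PackageBuild.build_farm_job == BuildFarmJob.id` instead of filtering on columns of the old `Build` table.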
To test
=======
First update the test db schema (required as the db patch still needs to be updated to remove the old build table):
```
psql launchpad_
bin/py database/
```
And then:
```
bin/test -vv -t test_packageupload -t doc/archive-
```
The next branch will continue getting the soyuz doctests passing with the new model.
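For reviewers skimming the diff below: the new `duration` property added to `buildfarmjob.py` simply subtracts `date_started` from `date_finished`, and is None when either endpoint is unset. Its behaviour can be checked standalone (the free function here is an illustrative mirror of the property, not Launchpad code):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional


def duration(date_started: Optional[datetime],
             date_finished: Optional[datetime]) -> Optional[timedelta]:
    """Mirror the duration property: defined only when both the
    start and finish timestamps have been recorded."""
    if date_started is None or date_finished is None:
        return None
    return date_finished - date_started


now = datetime.now(timezone.utc)
print(duration(now, None))                # None (job aborted or running)
print(duration(now, now + timedelta(1)))  # 1 day, 0:00:00
```

This matches the two unit tests in the diff: `test_duration_none` (either endpoint missing yields None) and `test_duration_set` (both set yields the interval).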
Preview Diff
| 1 | === modified file 'lib/lp/archiveuploader/nascentupload.py' |
| 2 | --- lib/lp/archiveuploader/nascentupload.py 2010-02-26 16:52:46 +0000 |
| 3 | +++ lib/lp/archiveuploader/nascentupload.py 2010-05-12 11:01:39 +0000 |
| 4 | @@ -780,7 +780,7 @@ |
| 5 | # fine. |
| 6 | ancestry = self.getBinaryAncestry( |
| 7 | uploaded_file, try_other_archs=False) |
| 8 | - if (ancestry is not None and |
| 9 | + if (ancestry is not None and |
| 10 | not self.policy.archive.is_copy): |
| 11 | # Ignore version checks for copy archives |
| 12 | # because the ancestry comes from the primary |
| 13 | @@ -962,7 +962,7 @@ |
| 14 | for build in self.queue_root.builds] |
| 15 | if considered_build.id in attached_builds: |
| 16 | continue |
| 17 | - assert (considered_build.sourcepackagerelease.id == |
| 18 | + assert (considered_build.source_package_release.id == |
| 19 | sourcepackagerelease.id), ( |
| 20 | "Upload contains binaries of different sources.") |
| 21 | self.queue_root.addBuild(considered_build) |
| 22 | |
| 23 | === modified file 'lib/lp/archiveuploader/nascentuploadfile.py' |
| 24 | --- lib/lp/archiveuploader/nascentuploadfile.py 2010-04-09 15:46:09 +0000 |
| 25 | +++ lib/lp/archiveuploader/nascentuploadfile.py 2010-05-12 11:01:39 +0000 |
| 26 | @@ -806,7 +806,7 @@ |
| 27 | build = sourcepackagerelease.getBuildByArch( |
| 28 | dar, self.policy.archive) |
| 29 | if build is not None: |
| 30 | - build.buildstate = BuildStatus.FULLYBUILT |
| 31 | + build.status = BuildStatus.FULLYBUILT |
| 32 | self.logger.debug("Updating build for %s: %s" % ( |
| 33 | dar.architecturetag, build.id)) |
| 34 | else: |
| 35 | @@ -822,7 +822,7 @@ |
| 36 | # Ensure gathered binary is related to a FULLYBUILT build |
| 37 | # record. It will be check in slave-scanner procedure to |
| 38 | # certify that the build was processed correctly. |
| 39 | - build.buildstate = BuildStatus.FULLYBUILT |
| 40 | + build.status = BuildStatus.FULLYBUILT |
| 41 | # Also purge any previous failed upload_log stored, so its |
| 42 | # content can be garbage-collected since it's not useful |
| 43 | # anymore. |
| 44 | @@ -831,9 +831,9 @@ |
| 45 | # Sanity check; raise an error if the build we've been |
| 46 | # told to link to makes no sense (ie. is not for the right |
| 47 | # source package). |
| 48 | - if (build.sourcepackagerelease != sourcepackagerelease or |
| 49 | + if (build.source_package_release != sourcepackagerelease or |
| 50 | build.pocket != self.policy.pocket or |
| 51 | - build.distroarchseries != dar or |
| 52 | + build.distro_arch_series != dar or |
| 53 | build.archive != self.policy.archive): |
| 54 | raise UploadError( |
| 55 | "Attempt to upload binaries specifying " |
| 56 | |
| 57 | === modified file 'lib/lp/buildmaster/browser/configure.zcml' |
| 58 | --- lib/lp/buildmaster/browser/configure.zcml 2010-05-12 11:01:28 +0000 |
| 59 | +++ lib/lp/buildmaster/browser/configure.zcml 2010-05-12 11:01:39 +0000 |
| 60 | @@ -10,5 +10,6 @@ |
| 61 | i18n_domain="launchpad"> |
| 62 | <browser:url |
| 63 | for="lp.buildmaster.interfaces.packagebuild.IPackageBuild" |
| 64 | - urldata="lp.buildmaster.browser.packagebuild.PackageBuildUrl"/> |
| 65 | + path_expression="string:+build/${build_farm_job/id}" |
| 66 | + attribute_to_parent="archive"/> |
| 67 | </configure> |
| 68 | |
| 69 | === removed file 'lib/lp/buildmaster/browser/packagebuild.py' |
| 70 | --- lib/lp/buildmaster/browser/packagebuild.py 2010-05-12 11:01:28 +0000 |
| 71 | +++ lib/lp/buildmaster/browser/packagebuild.py 1970-01-01 00:00:00 +0000 |
| 72 | @@ -1,42 +0,0 @@ |
| 73 | -# Copyright 2010 Canonical Ltd. This software is licensed under the |
| 74 | -# GNU Affero General Public License version 3 (see the file LICENSE). |
| 75 | - |
| 76 | -"""URLs for PackageBuild classes.""" |
| 77 | - |
| 78 | -from zope.interface import implements |
| 79 | - |
| 80 | -from canonical.launchpad.webapp.interfaces import ICanonicalUrlData |
| 81 | - |
| 82 | - |
| 83 | -class PackageBuildUrl: |
| 84 | - """Dynamic URL declaration for IPackageBuild classes. |
| 85 | - |
| 86 | - When dealing with distribution builds we want to present them |
| 87 | - under IDistributionSourcePackageRelease url: |
| 88 | - |
| 89 | - /ubuntu/+source/foo/1.0/+build/1234 |
| 90 | - |
| 91 | - On the other hand, PPA builds will be presented under the PPA page: |
| 92 | - |
| 93 | - /~cprov/+archive/+build/1235 |
| 94 | - |
| 95 | - Copy archives will be presented under the archives page: |
| 96 | - /ubuntu/+archive/my-special-archive/+build/1234 |
| 97 | - """ |
| 98 | - implements(ICanonicalUrlData) |
| 99 | - rootsite = None |
| 100 | - |
| 101 | - def __init__(self, context): |
| 102 | - self.context = context |
| 103 | - |
| 104 | - @property |
| 105 | - def inside(self): |
| 106 | - if self.context.archive.is_ppa or self.context.archive.is_copy: |
| 107 | - return self.context.archive |
| 108 | - else: |
| 109 | - return self.context.distributionsourcepackagerelease |
| 110 | - |
| 111 | - @property |
| 112 | - def path(self): |
| 113 | - return u"+build/%d" % self.context.build_farm_job.id |
| 114 | - |
| 115 | |
| 116 | === modified file 'lib/lp/buildmaster/interfaces/buildfarmjob.py' |
| 117 | --- lib/lp/buildmaster/interfaces/buildfarmjob.py 2010-05-12 11:01:28 +0000 |
| 118 | +++ lib/lp/buildmaster/interfaces/buildfarmjob.py 2010-05-12 11:01:39 +0000 |
| 119 | @@ -15,7 +15,7 @@ |
| 120 | ] |
| 121 | |
| 122 | from zope.interface import Interface, Attribute |
| 123 | -from zope.schema import Bool, Choice, Datetime, TextLine |
| 124 | +from zope.schema import Bool, Choice, Datetime, TextLine, Timedelta |
| 125 | from lazr.enum import DBEnumeratedType, DBItem |
| 126 | from lazr.restful.declarations import exported |
| 127 | from lazr.restful.fields import Reference |
| 128 | @@ -194,6 +194,11 @@ |
| 129 | "The timestamp when the build farm job was finished.")), |
| 130 | ("1.0", dict(exported=True, exported_as="datebuilt"))) |
| 131 | |
| 132 | + duration = Timedelta( |
| 133 | + title=_("Duration"), required=False, |
| 134 | + description=_("Duration interval, calculated when the " |
| 135 | + "result gets collected.")) |
| 136 | + |
| 137 | date_first_dispatched = exported( |
| 138 | Datetime( |
| 139 | title=_("Date finished"), required=False, readonly=True, |
| 140 | |
| 141 | === modified file 'lib/lp/buildmaster/model/buildfarmjob.py' |
| 142 | --- lib/lp/buildmaster/model/buildfarmjob.py 2010-05-12 11:01:28 +0000 |
| 143 | +++ lib/lp/buildmaster/model/buildfarmjob.py 2010-05-12 11:01:39 +0000 |
| 144 | @@ -220,6 +220,13 @@ |
| 145 | """See `IBuildFarmJob`.""" |
| 146 | return self.job_type.title |
| 147 | |
| 148 | + @property |
| 149 | + def duration(self): |
| 150 | + """See `IBuildFarmJob`.""" |
| 151 | + if self.date_started is None or self.date_finished is None: |
| 152 | + return None |
| 153 | + return self.date_finished - self.date_started |
| 154 | + |
| 155 | def makeJob(self): |
| 156 | """See `IBuildFarmJob`.""" |
| 157 | raise NotImplementedError |
| 158 | |
| 159 | === modified file 'lib/lp/buildmaster/tests/test_buildfarmjob.py' |
| 160 | --- lib/lp/buildmaster/tests/test_buildfarmjob.py 2010-05-12 11:01:28 +0000 |
| 161 | +++ lib/lp/buildmaster/tests/test_buildfarmjob.py 2010-05-12 11:01:39 +0000 |
| 162 | @@ -5,10 +5,13 @@ |
| 163 | |
| 164 | __metaclass__ = type |
| 165 | |
| 166 | +from datetime import datetime, timedelta |
| 167 | +import pytz |
| 168 | import unittest |
| 169 | |
| 170 | from storm.store import Store |
| 171 | from zope.component import getUtility |
| 172 | +from zope.security.proxy import removeSecurityProxy |
| 173 | |
| 174 | from canonical.database.sqlbase import flush_database_updates |
| 175 | from canonical.testing.layers import DatabaseFunctionalLayer |
| 176 | @@ -112,6 +115,27 @@ |
| 177 | self.build_farm_job.job_type.title, |
| 178 | self.build_farm_job.title) |
| 179 | |
| 180 | + def test_duration_none(self): |
| 181 | + # If either start or finished is none, the duration will be |
| 182 | + # none. |
| 183 | + self.build_farm_job.jobStarted() |
| 184 | + self.failUnlessEqual(None, self.build_farm_job.duration) |
| 185 | + |
| 186 | + self.build_farm_job.jobAborted() |
| 187 | + removeSecurityProxy(self.build_farm_job).date_finished = ( |
| 188 | + datetime.now(pytz.UTC)) |
| 189 | + self.failUnlessEqual(None, self.build_farm_job.duration) |
| 190 | + |
| 191 | + def test_duration_set(self): |
| 192 | + # If both start and finished are defined, the duration will be |
| 193 | + # returned. |
| 194 | + now = datetime.now(pytz.UTC) |
| 195 | + duration = timedelta(1) |
| 196 | + naked_bfj = removeSecurityProxy(self.build_farm_job) |
| 197 | + naked_bfj.date_started = now |
| 198 | + naked_bfj.date_finished = now + duration |
| 199 | + self.failUnlessEqual(duration, self.build_farm_job.duration) |
| 200 | + |
| 201 | |
| 202 | def test_suite(): |
| 203 | return unittest.TestLoader().loadTestsFromName(__name__) |
| 204 | |
| 205 | === modified file 'lib/lp/soyuz/adapters/archivedependencies.py' |
| 206 | --- lib/lp/soyuz/adapters/archivedependencies.py 2010-02-26 13:42:51 +0000 |
| 207 | +++ lib/lp/soyuz/adapters/archivedependencies.py 2010-05-12 11:01:39 +0000 |
| 208 | @@ -168,7 +168,7 @@ |
| 209 | |
| 210 | # Consider user-selected archive dependencies. |
| 211 | primary_component = get_primary_current_component( |
| 212 | - build.archive, build.distroseries, sourcepackagename) |
| 213 | + build.archive, build.distro_series, sourcepackagename) |
| 214 | for archive_dependency in build.archive.dependencies: |
| 215 | # When the dependency component is undefined, we should use |
| 216 | # the component where the source is published in the primary |
| 217 | @@ -272,7 +272,7 @@ |
| 218 | primary_dependencies = [] |
| 219 | for pocket in primary_pockets: |
| 220 | primary_dependencies.append( |
| 221 | - (build.distroseries.distribution.main_archive, pocket, |
| 222 | + (build.distro_series.distribution.main_archive, pocket, |
| 223 | primary_components)) |
| 224 | |
| 225 | return primary_dependencies |
| 226 | |
| 227 | === modified file 'lib/lp/soyuz/doc/archive-dependencies.txt' |
| 228 | --- lib/lp/soyuz/doc/archive-dependencies.txt 2010-03-11 01:39:25 +0000 |
| 229 | +++ lib/lp/soyuz/doc/archive-dependencies.txt 2010-05-12 11:01:39 +0000 |
| 230 | @@ -205,8 +205,9 @@ |
| 231 | ... get_sources_list_for_building) |
| 232 | |
| 233 | >>> def print_building_sources_list(candidate): |
| 234 | - ... sources_list = get_sources_list_for_building(candidate, |
| 235 | - ... candidate.distroarchseries, candidate.sourcepackagerelease.name) |
| 236 | + ... sources_list = get_sources_list_for_building( |
| 237 | + ... candidate, candidate.distro_arch_series, |
| 238 | + ... candidate.source_package_release.name) |
| 239 | ... for line in sources_list: |
| 240 | ... print line |
| 241 | |
| 242 | @@ -219,7 +220,7 @@ |
| 243 | ... PackagePublishingStatus) |
| 244 | |
| 245 | >>> cprov.archive.getAllPublishedBinaries( |
| 246 | - ... distroarchseries=a_build.distroarchseries, |
| 247 | + ... distroarchseries=a_build.distro_arch_series, |
| 248 | ... status=PackagePublishingStatus.PUBLISHED).count() |
| 249 | 0 |
| 250 | |
| 251 | @@ -323,9 +324,9 @@ |
| 252 | deb http://ppa.launchpad.dev/cprov/ppa/ubuntu hoary main |
| 253 | deb http://ftpmaster.internal/ubuntu hoary |
| 254 | main restricted universe multiverse |
| 255 | - deb http://ftpmaster.internal/ubuntu hoary-security |
| 256 | + deb http://ftpmaster.internal/ubuntu hoary-security |
| 257 | main restricted universe multiverse |
| 258 | - deb http://ftpmaster.internal/ubuntu hoary-updates |
| 259 | + deb http://ftpmaster.internal/ubuntu hoary-updates |
| 260 | main restricted universe multiverse |
| 261 | |
| 262 | However, in order to avoid the problem going forward (and to allow the PPA |
| 263 | @@ -363,9 +364,9 @@ |
| 264 | deb http://ppa.launchpad.dev/cprov/ppa/ubuntu hoary main |
| 265 | deb http://ftpmaster.internal/ubuntu hoary |
| 266 | main restricted universe multiverse |
| 267 | - deb http://ftpmaster.internal/ubuntu hoary-security |
| 268 | + deb http://ftpmaster.internal/ubuntu hoary-security |
| 269 | main restricted universe multiverse |
| 270 | - deb http://ftpmaster.internal/ubuntu hoary-updates |
| 271 | + deb http://ftpmaster.internal/ubuntu hoary-updates |
| 272 | main restricted universe multiverse |
| 273 | |
| 274 | However, in order to avoid the problem going forward (and to allow the PPA |
| 275 | @@ -434,7 +435,7 @@ |
| 276 | ... get_primary_current_component) |
| 277 | |
| 278 | >>> print get_primary_current_component(a_build.archive, |
| 279 | - ... a_build.distroseries, a_build.sourcepackagerelease.name) |
| 280 | + ... a_build.distro_series, a_build.source_package_release.name) |
| 281 | universe |
| 282 | |
| 283 | >>> print_building_sources_list(a_build) |
| 284 | |
| 285 | === modified file 'lib/lp/soyuz/doc/archive-files.txt' |
| 286 | --- lib/lp/soyuz/doc/archive-files.txt 2009-08-28 07:34:44 +0000 |
| 287 | +++ lib/lp/soyuz/doc/archive-files.txt 2010-05-12 11:01:39 +0000 |
| 288 | @@ -191,7 +191,7 @@ |
| 289 | ... 'buildlog_ubuntu-breezy-autotest-i386.' |
| 290 | ... 'test-pkg_1.0_FULLYBUILT.txt.gz') |
| 291 | >>> buildlog = test_publisher.addMockFile(buildlog_name) |
| 292 | - >>> build.buildlog = buildlog |
| 293 | + >>> build.log = buildlog |
| 294 | |
| 295 | >>> buildlog == build.getFileByName(buildlog_name) |
| 296 | True |
| 297 | |
| 298 | === modified file 'lib/lp/soyuz/doc/archive.txt' |
| 299 | --- lib/lp/soyuz/doc/archive.txt 2010-05-06 10:05:49 +0000 |
| 300 | +++ lib/lp/soyuz/doc/archive.txt 2010-05-12 11:01:39 +0000 |
| 301 | @@ -1047,13 +1047,13 @@ |
| 302 | >>> cd_lookup = cprov_archive.getBuildRecords(name='cd') |
| 303 | >>> cd_lookup.count() |
| 304 | 1 |
| 305 | - >>> cd_lookup[0].sourcepackagerelease.name |
| 306 | + >>> cd_lookup[0].source_package_release.name |
| 307 | u'cdrkit' |
| 308 | |
| 309 | >>> ice_lookup = cprov_archive.getBuildRecords(name='ice') |
| 310 | >>> ice_lookup.count() |
| 311 | 1 |
| 312 | - >>> ice_lookup[0].sourcepackagerelease.name |
| 313 | + >>> ice_lookup[0].source_package_release.name |
| 314 | u'iceweasel' |
| 315 | |
| 316 | >>> cprov_archive.getBuildRecords(name='foo').count() |
| 317 | |
| 318 | === modified file 'lib/lp/soyuz/doc/build-failedtoupload-workflow.txt' |
| 319 | --- lib/lp/soyuz/doc/build-failedtoupload-workflow.txt 2010-04-14 17:34:35 +0000 |
| 320 | +++ lib/lp/soyuz/doc/build-failedtoupload-workflow.txt 2010-05-12 11:01:39 +0000 |
| 321 | @@ -28,7 +28,7 @@ |
| 322 | >>> print failedtoupload_candidate.title |
| 323 | i386 build of cdrkit 1.0 in ubuntu breezy-autotest RELEASE |
| 324 | |
| 325 | - >>> print failedtoupload_candidate.buildstate.name |
| 326 | + >>> print failedtoupload_candidate.status.name |
| 327 | FAILEDTOUPLOAD |
| 328 | |
| 329 | >>> print failedtoupload_candidate.upload_log.filename |
| 330 | @@ -110,7 +110,7 @@ |
| 331 | Let's emulate the procedure of rescuing an FAILEDTOUPLOAD build. |
| 332 | A FAILEDTOUPLOAD build obviously has no binaries: |
| 333 | |
| 334 | - >>> print failedtoupload_candidate.buildstate.name |
| 335 | + >>> print failedtoupload_candidate.status.name |
| 336 | FAILEDTOUPLOAD |
| 337 | |
| 338 | >>> failedtoupload_candidate.binarypackages.count() |
| 339 | @@ -162,7 +162,7 @@ |
| 340 | >>> buildd_policy = getPolicy( |
| 341 | ... name='buildd', |
| 342 | ... distro=failedtoupload_candidate.distribution.name, |
| 343 | - ... distroseries=failedtoupload_candidate.distroseries.name, |
| 344 | + ... distroseries=failedtoupload_candidate.distro_series.name, |
| 345 | ... buildid=failedtoupload_candidate.id) |
| 346 | |
| 347 | >>> cdrkit_bin_upload = NascentUpload( |
| 348 | @@ -180,7 +180,7 @@ |
| 349 | previously stored upload_log is dereferenced (they are both updated |
| 350 | during the upload processing time): |
| 351 | |
| 352 | - >>> print failedtoupload_candidate.buildstate.name |
| 353 | + >>> print failedtoupload_candidate.status.name |
| 354 | FULLYBUILT |
| 355 | |
| 356 | >>> print failedtoupload_candidate.upload_log |
| 357 | |
| 358 | === modified file 'lib/lp/soyuz/model/archive.py' |
| 359 | --- lib/lp/soyuz/model/archive.py 2010-05-12 11:01:28 +0000 |
| 360 | +++ lib/lp/soyuz/model/archive.py 2010-05-12 11:01:39 +0000 |
| 361 | @@ -882,18 +882,21 @@ |
| 362 | extra_exprs = [] |
| 363 | if not include_needsbuild: |
| 364 | extra_exprs.append( |
| 365 | - BinaryPackageBuild.buildstate != BuildStatus.NEEDSBUILD) |
| 366 | + BuildFarmJob.status != BuildStatus.NEEDSBUILD) |
| 367 | |
| 368 | find_spec = ( |
| 369 | - BinaryPackageBuild.buildstate, |
| 370 | + BuildFarmJob.status, |
| 371 | Count(BinaryPackageBuild.id) |
| 372 | ) |
| 373 | - result = store.using(BinaryPackageBuild).find( |
| 374 | + result = store.using( |
| 375 | + BinaryPackageBuild, PackageBuild, BuildFarmJob).find( |
| 376 | find_spec, |
| 377 | - BinaryPackageBuild.archive == self, |
| 378 | + BinaryPackageBuild.package_build == PackageBuild.id, |
| 379 | + PackageBuild.archive == self, |
| 380 | + PackageBuild.build_farm_job == BuildFarmJob.id, |
| 381 | *extra_exprs |
| 382 | - ).group_by(BinaryPackageBuild.buildstate).order_by( |
| 383 | - BinaryPackageBuild.buildstate) |
| 384 | + ).group_by(BuildFarmJob.status).order_by( |
| 385 | + BuildFarmJob.status) |
| 386 | |
| 387 | # Create a map for each count summary to a number of buildstates: |
| 388 | count_map = { |
| 389 | |
| 390 | === modified file 'lib/lp/soyuz/model/binarypackagebuild.py' |
| 391 | --- lib/lp/soyuz/model/binarypackagebuild.py 2010-05-12 11:01:28 +0000 |
| 392 | +++ lib/lp/soyuz/model/binarypackagebuild.py 2010-05-12 11:01:39 +0000 |
| 393 | @@ -24,6 +24,8 @@ |
| 394 | |
| 395 | from canonical.config import config |
| 396 | from canonical.database.sqlbase import quote_like, SQLBase, sqlvalues |
| 397 | +from canonical.launchpad.browser.librarian import ( |
| 398 | + ProxiedLibraryFileAlias) |
| 399 | from canonical.launchpad.components.decoratedresultset import ( |
| 400 | DecoratedResultSet) |
| 401 | from canonical.launchpad.database.librarian import ( |
| 402 | @@ -209,6 +211,17 @@ |
| 403 | return self.distro_arch_series.architecturetag |
| 404 | |
| 405 | @property |
| 406 | + def log_url(self): |
| 407 | + """See `IBuildFarmJob`. |
| 408 | + |
| 409 | + Overridden here for the case of builds for distro archives, |
| 410 | + currently only supported for binary package builds. |
| 411 | + """ |
| 412 | + if self.log is None: |
| 413 | + return None |
| 414 | + return ProxiedLibraryFileAlias(self.log, self).http_url |
| 415 | + |
| 416 | + @property |
| 417 | def distributionsourcepackagerelease(self): |
| 418 | """See `IBuild`.""" |
| 419 | from lp.soyuz.model.distributionsourcepackagerelease \ |
| 420 | @@ -270,7 +283,7 @@ |
| 421 | def retry(self): |
| 422 | """See `IBuild`.""" |
| 423 | assert self.can_be_retried, "Build %s cannot be retried" % self.id |
| 424 | - self.buildstate = BuildStatus.NEEDSBUILD |
| 425 | + self.status = BuildStatus.NEEDSBUILD |
| 426 | self.datebuilt = None |
| 427 | self.buildduration = None |
| 428 | self.builder = None |
| 429 | @@ -522,15 +535,15 @@ |
| 430 | config.builddmaster.default_sender_address) |
| 431 | |
| 432 | extra_headers = { |
| 433 | - 'X-Launchpad-Build-State': self.buildstate.name, |
| 434 | + 'X-Launchpad-Build-State': self.status.name, |
| 435 | 'X-Launchpad-Build-Component' : self.current_component.name, |
| 436 | - 'X-Launchpad-Build-Arch' : self.distroarchseries.architecturetag, |
| 437 | + 'X-Launchpad-Build-Arch' : self.distro_arch_series.architecturetag, |
| 438 | } |
| 439 | |
| 440 | # XXX cprov 2006-10-27: Temporary extra debug info about the |
| 441 | # SPR.creator in context, to be used during the service quarantine, |
| 442 | # notify_owner will be disabled to avoid *spamming* Debian people. |
| 443 | - creator = self.sourcepackagerelease.creator |
| 444 | + creator = self.source_package_release.creator |
| 445 | extra_headers['X-Creator-Recipient'] = ",".join( |
| 446 | get_contact_email_addresses(creator)) |
| 447 | |
| 448 | @@ -545,7 +558,7 @@ |
| 449 | # * the package build (failure) occurred in the original |
| 450 | # archive. |
| 451 | package_was_not_copied = ( |
| 452 | - self.archive == self.sourcepackagerelease.upload_archive) |
| 453 | + self.archive == self.source_package_release.upload_archive) |
| 454 | |
| 455 | if package_was_not_copied and config.builddmaster.notify_owner: |
| 456 | if (self.archive.is_ppa and creator.inTeam(self.archive.owner) |
| 457 | @@ -557,7 +570,7 @@ |
| 458 | # Non-PPA notifications inform the creator regardless. |
| 459 | recipients = recipients.union( |
| 460 | get_contact_email_addresses(creator)) |
| 461 | - dsc_key = self.sourcepackagerelease.dscsigningkey |
| 462 | + dsc_key = self.source_package_release.dscsigningkey |
| 463 | if dsc_key: |
| 464 | recipients = recipients.union( |
| 465 | get_contact_email_addresses(dsc_key.owner)) |
| 466 | @@ -596,13 +609,13 @@ |
| 467 | # with the state in the build worflow, maybe by having an |
| 468 | # IBuild.statusReport property, which could also be used in the |
| 469 | # respective page template. |
| 470 | - if self.buildstate in [ |
| 471 | + if self.status in [ |
| 472 | BuildStatus.NEEDSBUILD, BuildStatus.SUPERSEDED]: |
| 473 | # untouched builds |
| 474 | buildduration = 'not available' |
| 475 | buildlog_url = 'not available' |
| 476 | builder_url = 'not available' |
| 477 | - elif self.buildstate == BuildStatus.BUILDING: |
| 478 | + elif self.status == BuildStatus.BUILDING: |
| 479 | # build in process |
| 480 | buildduration = 'not finished' |
| 481 | buildlog_url = 'see builder page' |
| 482 | @@ -610,11 +623,11 @@ |
| 483 | else: |
| 484 | # completed states (success and failure) |
| 485 | buildduration = DurationFormatterAPI( |
| 486 | - self.buildduration).approximateduration() |
| 487 | - buildlog_url = self.build_log_url |
| 488 | + self.date_finished - self.date_started).approximateduration() |
| 489 | + buildlog_url = self.log_url |
| 490 | builder_url = canonical_url(self.builder) |
| 491 | |
| 492 | - if self.buildstate == BuildStatus.FAILEDTOUPLOAD: |
| 493 | + if self.status == BuildStatus.FAILEDTOUPLOAD: |
| 494 | assert extra_info is not None, ( |
| 495 | 'Extra information is required for FAILEDTOUPLOAD ' |
| 496 | 'notifications.') |
| 497 | @@ -624,10 +637,10 @@ |
| 498 | |
| 499 | template = get_email_template('build-notification.txt') |
| 500 | replacements = { |
| 501 | - 'source_name': self.sourcepackagerelease.name, |
| 502 | - 'source_version': self.sourcepackagerelease.version, |
| 503 | - 'architecturetag': self.distroarchseries.architecturetag, |
| 504 | - 'build_state': self.buildstate.title, |
| 505 | + 'source_name': self.source_package_release.name, |
| 506 | + 'source_version': self.source_package_release.version, |
| 507 | + 'architecturetag': self.distro_arch_series.architecturetag, |
| 508 | + 'build_state': self.status.title, |
| 509 | 'build_duration': buildduration, |
| 510 | 'buildlog_url': buildlog_url, |
| 511 | 'builder_url': builder_url, |
| 512 | @@ -661,7 +674,7 @@ |
| 513 | if filename.endswith('.changes'): |
| 514 | file_object = self.upload_changesfile |
| 515 | elif filename.endswith('.txt.gz'): |
| 516 | - file_object = self.buildlog |
| 517 | + file_object = self.log |
| 518 | elif filename.endswith('_log.txt'): |
| 519 | file_object = self.upload_log |
| 520 | elif filename.endswith('deb'): |
| 521 | @@ -868,7 +881,7 @@ |
| 522 | # and share it with ISourcePackage.getBuildRecords() |
| 523 | |
| 524 | # exclude gina-generated and security (dak-made) builds |
| 525 | - # buildstate == FULLYBUILT && datebuilt == null |
| 526 | + # status == FULLYBUILT && datebuilt == null |
| 527 | if status == BuildStatus.FULLYBUILT: |
| 528 | condition_clauses.append("BuildFarmJob.date_finished IS NOT NULL") |
| 529 | else: |
| 530 | @@ -927,8 +940,8 @@ |
| 531 | |
| 532 | # Get the MANUALDEPWAIT records for all archives. |
| 533 | candidates = BinaryPackageBuild.selectBy( |
| 534 | - buildstate=BuildStatus.MANUALDEPWAIT, |
| 535 | - distroarchseries=distroarchseries) |
| 536 | + status=BuildStatus.MANUALDEPWAIT, |
| 537 | + distro_arch_series=distroarchseries) |
| 538 | |
| 539 | candidates_count = candidates.count() |
| 540 | if candidates_count == 0: |
| 541 | @@ -978,7 +991,7 @@ |
| 542 | wanted = [] |
| 543 | for state in states: |
| 544 | candidates = [build for build in builds |
| 545 | - if build.buildstate == state] |
| 546 | + if build.status == state] |
| 547 | wanted.extend(candidates) |
| 548 | return wanted |
| 549 | |
| 550 | |
| 551 | === modified file 'lib/lp/soyuz/model/binarypackagerelease.py' |
| 552 | --- lib/lp/soyuz/model/binarypackagerelease.py 2010-04-12 11:37:48 +0000 |
| 553 | +++ lib/lp/soyuz/model/binarypackagerelease.py 2010-05-12 11:01:39 +0000 |
| 554 | @@ -84,17 +84,17 @@ |
| 555 | import DistributionSourcePackageRelease |
| 556 | return DistributionSourcePackageRelease( |
| 557 | distribution=self.build.distribution, |
| 558 | - sourcepackagerelease=self.build.sourcepackagerelease) |
| 559 | + sourcepackagerelease=self.build.source_package_release) |
| 560 | |
| 561 | @property |
| 562 | def sourcepackagename(self): |
| 563 | """See `IBinaryPackageRelease`.""" |
| 564 | - return self.build.sourcepackagerelease.sourcepackagename.name |
| 565 | + return self.build.source_package_release.sourcepackagename.name |
| 566 | |
| 567 | @property |
| 568 | def is_new(self): |
| 569 | """See `IBinaryPackageRelease`.""" |
| 570 | - distroarchseries = self.build.distroarchseries |
| 571 | + distroarchseries = self.build.distro_arch_series |
| 572 | distroarchseries_binary_package = distroarchseries.getBinaryPackage( |
| 573 | self.binarypackagename) |
| 574 | return distroarchseries_binary_package.currentrelease is None |
| 575 | |
| 576 | === modified file 'lib/lp/soyuz/model/distributionsourcepackagerelease.py' |
| 577 | --- lib/lp/soyuz/model/distributionsourcepackagerelease.py 2010-04-12 08:29:02 +0000 |
| 578 | +++ lib/lp/soyuz/model/distributionsourcepackagerelease.py 2010-05-12 11:01:39 +0000 |
| 579 | @@ -16,11 +16,13 @@ |
| 580 | |
| 581 | from storm.expr import Desc |
| 582 | |
| 583 | +from canonical.database.sqlbase import sqlvalues |
| 584 | + |
| 585 | +from lp.buildmaster.model.buildfarmjob import BuildFarmJob |
| 586 | +from lp.buildmaster.model.packagebuild import PackageBuild |
| 587 | from lp.soyuz.interfaces.distributionsourcepackagerelease import ( |
| 588 | IDistributionSourcePackageRelease) |
| 589 | from lp.soyuz.interfaces.sourcepackagerelease import ISourcePackageRelease |
| 590 | -from canonical.database.sqlbase import sqlvalues |
| 591 | - |
| 592 | from lp.soyuz.model.archive import Archive |
| 593 | from lp.soyuz.model.binarypackagename import BinaryPackageName |
| 594 | from lp.soyuz.model.binarypackagerelease import ( |
| 595 | @@ -100,11 +102,13 @@ |
| 596 | # distribution that were built for a PPA but have been published |
| 597 | # in a main archive. |
| 598 | builds_for_distro_exprs = ( |
| 599 | - (BinaryPackageBuild.sourcepackagerelease == |
| 600 | + (BinaryPackageBuild.source_package_release == |
| 601 | self.sourcepackagerelease), |
| 602 | - BinaryPackageBuild.distroarchseries == DistroArchSeries.id, |
| 603 | + BinaryPackageBuild.distro_arch_series == DistroArchSeries.id, |
| 604 | DistroArchSeries.distroseries == DistroSeries.id, |
| 605 | DistroSeries.distribution == self.distribution, |
| 606 | + BinaryPackageBuild.package_build == PackageBuild.id, |
| 607 | + PackageBuild.build_farm_job == BuildFarmJob.id |
| 608 | ) |
| 609 | |
| 610 | # First, get all the builds built in a main archive (this will |
| 611 | @@ -112,7 +116,7 @@ |
| 612 | builds_built_in_main_archives = store.find( |
| 613 | BinaryPackageBuild, |
| 614 | builds_for_distro_exprs, |
| 615 | - BinaryPackageBuild.archive == Archive.id, |
| 616 | + PackageBuild.archive == Archive.id, |
| 617 | Archive.purpose.is_in(MAIN_ARCHIVE_PURPOSES)) |
| 618 | |
| 619 | # Next get all the builds that have a binary published in the |
| 620 | @@ -132,7 +136,7 @@ |
| 621 | return builds_built_in_main_archives.union( |
| 622 | builds_published_in_main_archives).order_by( |
| 623 | Desc( |
| 624 | - BinaryPackageBuild.datecreated), Desc(BinaryPackageBuild.id)) |
| 625 | + BuildFarmJob.date_created), Desc(BinaryPackageBuild.id)) |
| 626 | |
| 627 | @property |
| 628 | def binary_package_names(self): |
| 629 | |
| 630 | === modified file 'lib/lp/soyuz/model/publishing.py' |
| 631 | --- lib/lp/soyuz/model/publishing.py 2010-05-12 11:01:28 +0000 |
| 632 | +++ lib/lp/soyuz/model/publishing.py 2010-05-12 11:01:39 +0000 |
| 633 | @@ -439,15 +439,16 @@ |
| 634 | BinaryPackageRelease.id AND |
| 635 | BinaryPackagePublishingHistory.distroarchseries= |
| 636 | DistroArchSeries.id AND |
| 637 | - BinaryPackageRelease.build=Build.id AND |
| 638 | - Build.sourcepackagerelease=%s AND |
| 639 | + BinaryPackageRelease.build=BinaryPackageBuild.id AND |
| 640 | + BinaryPackageBuild.source_package_release=%s AND |
| 641 | DistroArchSeries.distroseries=%s AND |
| 642 | BinaryPackagePublishingHistory.archive=%s AND |
| 643 | BinaryPackagePublishingHistory.pocket=%s |
| 644 | """ % sqlvalues(self.sourcepackagerelease, self.distroseries, |
| 645 | self.archive, self.pocket) |
| 646 | |
| 647 | - clauseTables = ['Build', 'BinaryPackageRelease', 'DistroArchSeries'] |
| 648 | + clauseTables = [ |
| 649 | + 'BinaryPackageBuild', 'BinaryPackageRelease', 'DistroArchSeries'] |
| 650 | orderBy = ['-BinaryPackagePublishingHistory.id'] |
| 651 | preJoins = ['binarypackagerelease'] |
| 652 | |
| 653 | @@ -565,7 +566,7 @@ |
| 654 | # Check DistroArchSeries database IDs because the object belongs |
| 655 | # to different transactions (architecture_available is cached). |
| 656 | if (build_candidate is not None and |
| 657 | - (build_candidate.distroarchseries.id == arch.id or |
| 658 | + (build_candidate.distro_arch_series.id == arch.id or |
| 659 | build_candidate.buildstate == BuildStatus.FULLYBUILT)): |
| 660 | return None |
| 661 | |
| 662 | @@ -869,7 +870,7 @@ |
| 663 | def buildIndexStanzaFields(self): |
| 664 | """See `IPublishing`.""" |
| 665 | bpr = self.binarypackagerelease |
| 666 | - spr = bpr.build.sourcepackagerelease |
| 667 | + spr = bpr.build.source_package_release |
| 668 | |
| 669 | # binaries have only one file, the DEB |
| 670 | bin_file = bpr.files[0] |
| 671 | |
| 672 | === modified file 'lib/lp/soyuz/model/queue.py' |
| 673 | --- lib/lp/soyuz/model/queue.py 2010-04-22 16:39:05 +0000 |
| 674 | +++ lib/lp/soyuz/model/queue.py 2010-05-12 11:01:39 +0000 |
| 675 | @@ -509,7 +509,7 @@ |
| 676 | for queue_source in self.sources: |
| 677 | names.append(queue_source.sourcepackagerelease.name) |
| 678 | for queue_build in self.builds: |
| 679 | - names.append(queue_build.build.sourcepackagerelease.name) |
| 680 | + names.append(queue_build.build.source_package_release.name) |
| 681 | for queue_custom in self.customfiles: |
| 682 | names.append(queue_custom.libraryfilealias.filename) |
| 683 | # Make sure the list items have a whitespace separator so |
| 684 | @@ -1413,7 +1413,7 @@ |
| 685 | def publish(self, logger=None): |
| 686 | """See `IPackageUploadBuild`.""" |
| 687 | # Determine the build's architecturetag |
| 688 | - build_archtag = self.build.distroarchseries.architecturetag |
| 689 | + build_archtag = self.build.distro_arch_series.architecturetag |
| 690 | # Determine the target arch series. |
| 691 | # This will raise NotFoundError if anything odd happens. |
| 692 | target_dar = self.packageupload.distroseries[build_archtag] |
| 693 | |
| 694 | === modified file 'lib/lp/soyuz/model/sourcepackagerelease.py' |
| 695 | --- lib/lp/soyuz/model/sourcepackagerelease.py 2010-05-12 11:01:28 +0000 |
| 696 | +++ lib/lp/soyuz/model/sourcepackagerelease.py 2010-05-12 11:01:39 +0000 |
| 697 | @@ -146,12 +146,14 @@ |
| 698 | # a build may well have a different archive to the corresponding |
| 699 | # sourcepackagerelease. |
| 700 | return BinaryPackageBuild.select(""" |
| 701 | - sourcepackagerelease = %s AND |
| 702 | - archive.id = build.archive AND |
| 703 | + source_package_release = %s AND |
| 704 | + package_build = packagebuild.id AND |
| 705 | + archive.id = packagebuild.archive AND |
| 706 | + packagebuild.build_farm_job = buildfarmjob.id AND |
| 707 | archive.purpose IN %s |
| 708 | """ % sqlvalues(self.id, MAIN_ARCHIVE_PURPOSES), |
| 709 | - orderBy=['-datecreated', 'id'], |
| 710 | - clauseTables=['Archive']) |
| 711 | + orderBy=['-buildfarmjob.date_created', 'id'], |
| 712 | + clauseTables=['Archive', 'PackageBuild', 'BuildFarmJob']) |
| 713 | |
| 714 | @property |
| 715 | def age(self): |
| 716 | @@ -173,7 +175,7 @@ |
| 717 | @property |
| 718 | def needs_building(self): |
| 719 | for build in self._cached_builds: |
| 720 | - if build.buildstate in [BuildStatus.NEEDSBUILD, |
| 721 | + if build.status in [BuildStatus.NEEDSBUILD, |
| 722 | BuildStatus.MANUALDEPWAIT, |
| 723 | BuildStatus.CHROOTWAIT]: |
| 724 | return True |
| 725 | |
| 726 | === modified file 'lib/lp/soyuz/tests/test_binarypackagebuild.py' |
| 727 | --- lib/lp/soyuz/tests/test_binarypackagebuild.py 2010-05-12 11:01:28 +0000 |
| 728 | +++ lib/lp/soyuz/tests/test_binarypackagebuild.py 2010-05-12 11:01:39 +0000 |
| 729 | @@ -60,6 +60,34 @@ |
| 730 | self.failIfEqual(None, bq.processor) |
| 731 | self.failUnless(bq, self.build.buildqueue_record) |
| 732 | |
| 733 | + def addFakeBuildLog(self): |
| 734 | + lfa = self.factory.makeLibraryFileAlias('mybuildlog.txt') |
| 735 | + removeSecurityProxy(self.build).log = lfa |
| 736 | + |
| 737 | + def test_log_url(self): |
| 738 | + # The log URL for a binary package build will use |
| 739 | + # the distribution source package release when the context |
| 740 | + # is not a PPA or a copy archive. |
| 741 | + self.addFakeBuildLog() |
| 742 | + self.failUnlessEqual( |
| 743 | + 'http://launchpad.dev/ubuntutest/+source/' |
| 744 | + 'gedit/666/+build/%d/+files/mybuildlog.txt' % ( |
| 745 | + self.build.package_build.build_farm_job.id), |
| 746 | + self.build.log_url) |
| 747 | + |
| 748 | + def test_log_url_ppa(self): |
| 749 | + # On the other hand, ppa or copy builds will have a url in the |
| 750 | + # context of the archive. |
| 751 | + self.addFakeBuildLog() |
| 752 | + ppa_owner = self.factory.makePerson(name="joe") |
| 753 | + removeSecurityProxy(self.build).archive = self.factory.makeArchive( |
| 754 | + owner=ppa_owner, name="myppa") |
| 755 | + self.failUnlessEqual( |
| 756 | + 'http://launchpad.dev/~joe/' |
| 757 | + '+archive/myppa/+build/%d/+files/mybuildlog.txt' % ( |
| 758 | + self.build.build_farm_job.id), |
| 759 | + self.build.log_url) |
| 760 | + |
| 761 | |
| 762 | class TestBuildUpdateDependencies(TestCaseWithFactory): |
| 763 | |
| 764 | |
| 765 | === modified file 'lib/lp/soyuz/tests/test_packageupload.py' |
| 766 | --- lib/lp/soyuz/tests/test_packageupload.py 2010-03-24 11:58:07 +0000 |
| 767 | +++ lib/lp/soyuz/tests/test_packageupload.py 2010-05-12 11:01:39 +0000 |
| 768 | @@ -273,7 +273,7 @@ |
| 769 | 'main/dist-upgrader-all') |
| 770 | self.assertEquals( |
| 771 | ['20060302.0120', 'current'], sorted(os.listdir(custom_path))) |
| 772 | - |
| 773 | + |
| 774 | # The custom files were also copied to the public librarian |
| 775 | for customfile in delayed_copy.customfiles: |
| 776 | self.assertFalse(customfile.libraryfilealias.restricted) |
| 777 | @@ -300,7 +300,7 @@ |
| 778 | [pub_record] = pub_records |
| 779 | [build] = pub_record.getBuilds() |
| 780 | self.assertEquals( |
| 781 | - BuildStatus.NEEDSBUILD, build.buildstate) |
| 782 | + BuildStatus.NEEDSBUILD, build.status) |
| 783 | |
| 784 | def test_realiseUpload_for_overridden_component_archive(self): |
| 785 | # If the component of an upload is overridden to 'Partner' for |

Looks good.
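For reviewers tracking the rename, the pattern this branch (and its prerequisites) applies throughout the diff can be sketched in plain Python. These classes are simplified stand-ins, not the real Storm models: attributes that used to live directly on `Build` (such as `datecreated`, `buildstate`, and `archive`) are now reached through the `package_build` and `build_farm_job` delegation chain, with the snake_case names (`date_created`, `status`, `source_package_release`, `distro_arch_series`) replacing the old run-together ones.

```python
# Minimal sketch of the BinaryPackageBuild -> PackageBuild -> BuildFarmJob
# delegation chain this branch migrates callers onto. The classes and
# attribute values are illustrative stand-ins, not the real Launchpad
# Storm models or database rows.

class BuildFarmJob:
    """Holds the generic job attributes (formerly on Build directly)."""

    def __init__(self, date_created, status):
        self.date_created = date_created  # replaces Build.datecreated
        self.status = status              # replaces Build.buildstate


class PackageBuild:
    """Holds package-build attributes such as the target archive."""

    def __init__(self, build_farm_job, archive):
        self.build_farm_job = build_farm_job
        self.archive = archive            # replaces Build.archive


class BinaryPackageBuild:
    """The concrete build; delegates up the chain via properties."""

    def __init__(self, package_build, source_package_release,
                 distro_arch_series):
        self.package_build = package_build
        self.source_package_release = source_package_release  # was sourcepackagerelease
        self.distro_arch_series = distro_arch_series          # was distroarchseries

    @property
    def build_farm_job(self):
        return self.package_build.build_farm_job

    @property
    def status(self):
        # Callers previously read build.buildstate.
        return self.build_farm_job.status

    @property
    def date_created(self):
        # Callers previously read build.datecreated.
        return self.build_farm_job.date_created
```

This is also why the SQL in the diff now joins through `PackageBuild` and `BuildFarmJob` (e.g. `package_build = packagebuild.id AND packagebuild.build_farm_job = buildfarmjob.id`) and orders by `buildfarmjob.date_created` instead of a column on the build table itself.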