Merge lp:~jelmer/launchpad/506256-remove-popen into lp:launchpad

Proposed by Jelmer Vernooij
Status: Merged
Approved by: Graham Binns
Approved revision: no longer in the source branch.
Merged at revision: 11566
Proposed branch: lp:~jelmer/launchpad/506256-remove-popen
Merge into: lp:launchpad
Diff against target: 1176 lines (+222/-405)
17 files modified
lib/canonical/launchpad/webapp/tales.py (+1/-0)
lib/lp/archiveuploader/tests/test_uploadprocessor.py (+53/-24)
lib/lp/archiveuploader/uploadprocessor.py (+25/-9)
lib/lp/buildmaster/enums.py (+7/-3)
lib/lp/buildmaster/interfaces/buildfarmjob.py (+4/-0)
lib/lp/buildmaster/interfaces/packagebuild.py (+2/-20)
lib/lp/buildmaster/model/buildfarmjob.py (+10/-0)
lib/lp/buildmaster/model/packagebuild.py (+35/-153)
lib/lp/buildmaster/tests/test_buildfarmjob.py (+15/-0)
lib/lp/buildmaster/tests/test_packagebuild.py (+29/-67)
lib/lp/code/browser/sourcepackagerecipebuild.py (+1/-0)
lib/lp/code/model/tests/test_sourcepackagerecipebuild.py (+6/-6)
lib/lp/registry/model/sourcepackage.py (+6/-2)
lib/lp/soyuz/browser/archive.py (+1/-0)
lib/lp/soyuz/doc/buildd-slavescanner.txt (+12/-116)
lib/lp/soyuz/model/archive.py (+3/-2)
lib/lp/soyuz/model/binarypackagebuild.py (+12/-3)
To merge this branch: bzr merge lp:~jelmer/launchpad/506256-remove-popen
Reviewer Review Type Date Requested Status
Graham Binns (community) release-critical Approve
Brad Crittenden (community) code Approve
Julian Edwards (community) Approve
Review via email: mp+34549@code.launchpad.net

Commit message

Builddmaster now moves build binaries aside for later processing rather than invoking process-upload on them directly.

Description of the change

buildd-manager currently invokes the uploadprocessor on binaries that it has fetched from the buildd slaves.

Because this process is synchronous, it blocks the buildd manager from doing other things at the same time, such as scheduling new builds, and the buildd slaves sit idle in the meantime.

This branch changes the buildd manager to move build results out of the way into a queue that can be processed independently by process-upload (the extensions to process-upload needed to support this landed earlier).
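As a rough illustration of how the queue directories are named and how process-upload finds the build again, here is a small sketch based on getBuildCookie(), getUploadDirLeaf() and parse_build_upload_leaf_name() from the diff below; the standalone function names and the concrete ids are illustrative only, not the real model code.

# Sketch only: mirrors getBuildCookie()/getUploadDirLeaf() on PackageBuild and
# parse_build_upload_leaf_name() in the upload processor.
from datetime import datetime

def make_build_cookie(package_build_id, job_type_name, build_farm_job_id):
    # "<PackageBuild id>-<job type name>-<BuildFarmJob id>", e.g. "42-PACKAGEBUILD-60"
    return '%s-%s-%s' % (package_build_id, job_type_name, build_farm_job_id)

def make_upload_dir_leaf(build_cookie, now=None):
    # The queue directory leaf prefixes the cookie with a timestamp.
    if now is None:
        now = datetime.now()
    return '%s-%s' % (now.strftime("%Y%m%d-%H%M%S"), build_cookie)

def parse_build_upload_leaf_name(name):
    # process-upload only needs the trailing BuildFarmJob id.
    try:
        return int(name.split("-")[-1])
    except ValueError:
        raise ValueError("invalid build upload leaf name: %s" % name)

leaf = make_upload_dir_leaf(make_build_cookie(42, 'PACKAGEBUILD', 60))
assert parse_build_upload_leaf_name(leaf) == 60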

Tests:
./bin/test lp.archiveuploader
./bin/test lp.buildmaster

There is still a single test, testNoFiles(), in the archiveuploader that is itself buggy. I'm looking into it at the moment, but wanted to submit the branch for review early because of the upcoming PQM closure. The fix for this test shouldn't involve any changes outside of the test itself.

QA:

This branch has been running on dogfood for the past week or so in its current form and has been working well. We've tested with multiple buildds and thrown several hundred builds at it.

Deployment:

A cron job needs to be set up to run the following command regularly:

LPCURRENT/scripts/process-upload.py -C buildd --builds -v $QUEUE
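A hypothetical crontab entry, keeping the LPCURRENT and $QUEUE placeholders from above and assuming a five-minute interval, would look something like this:

# Example only: the interval is an assumption, and LPCURRENT/$QUEUE are
# placeholders for the deployed tree and the upload queue directory.
*/5 * * * * LPCURRENT/scripts/process-upload.py -C buildd --builds -v $QUEUE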

Revision history for this message
Brad Crittenden (bac) wrote :

Jelmer, this branch looks OK. As I mentioned on IRC, I'm getting test failures. I'm re-running them now and will paste the results when they are done.

Other than that, I only found this one typo that needs fixing.

typo: s/nonexisting/nonexistent

Keeping the branch unapproved until the tests are sorted out.

review: Needs Information (code)
Revision history for this message
Jelmer Vernooij (jelmer) wrote :

I need to change the lookups in the archiveuploader to be on binarypackagebuilds or sourcepackagebuilds rather than on packagebuilds in general, since otherwise we don't have a sensible verifySuccessBuild() method to call.

Revision history for this message
Graham Binns (gmb) wrote :

Abstaining until the tests are fixed.

review: Abstain (release-critical)
Revision history for this message
Jelmer Vernooij (jelmer) wrote :

This turned out to be quite a complex issue; I'm fixing the last test now.

Revision history for this message
Julian Edwards (julian-edwards) wrote :

My two pence, as I mentioned on Mumble:

 1. Please change the added __getitem__ to a getByID() method. We're deprecating __getitem__
 2. Typo in the new enum (s/process/processed/)

Otherwise, ROCK!

review: Approve
Revision history for this message
Brad Crittenden (bac) wrote :

Thanks for fixing the tests, Jelmer. They all passed locally for me.

review: Approve (code)
Revision history for this message
Graham Binns (gmb) wrote :

RC=me on this. Please land this on db-devel rather than devel (I believe there's some argument you can pass to utils/ec2 in order to achieve this). We want to close devel for landings sooner rather than later, so all traffic should go to db-devel unless absolutely necessary.

review: Approve (release-critical)

Preview Diff

=== modified file 'lib/canonical/launchpad/webapp/tales.py'
--- lib/canonical/launchpad/webapp/tales.py 2010-08-27 11:19:54 +0000
+++ lib/canonical/launchpad/webapp/tales.py 2010-09-16 00:48:58 +0000
@@ -995,6 +995,7 @@
995 BuildStatus.CHROOTWAIT: {'src': "/@@/build-chrootwait"},995 BuildStatus.CHROOTWAIT: {'src': "/@@/build-chrootwait"},
996 BuildStatus.SUPERSEDED: {'src': "/@@/build-superseded"},996 BuildStatus.SUPERSEDED: {'src': "/@@/build-superseded"},
997 BuildStatus.BUILDING: {'src': "/@@/processing"},997 BuildStatus.BUILDING: {'src': "/@@/processing"},
998 BuildStatus.UPLOADING: {'src': "/@@/processing"},
998 BuildStatus.FAILEDTOUPLOAD: {'src': "/@@/build-failedtoupload"},999 BuildStatus.FAILEDTOUPLOAD: {'src': "/@@/build-failedtoupload"},
999 }1000 }
10001001
10011002
=== modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py'
--- lib/lp/archiveuploader/tests/test_uploadprocessor.py 2010-09-02 16:28:50 +0000
+++ lib/lp/archiveuploader/tests/test_uploadprocessor.py 2010-09-16 00:48:58 +0000
@@ -31,6 +31,7 @@
31from canonical.launchpad.testing.fakepackager import FakePackager31from canonical.launchpad.testing.fakepackager import FakePackager
32from canonical.launchpad.webapp.errorlog import ErrorReportingUtility32from canonical.launchpad.webapp.errorlog import ErrorReportingUtility
33from canonical.testing import LaunchpadZopelessLayer33from canonical.testing import LaunchpadZopelessLayer
34
34from lp.app.errors import NotFoundError35from lp.app.errors import NotFoundError
35from lp.archiveuploader.uploadpolicy import (36from lp.archiveuploader.uploadpolicy import (
36 AbstractUploadPolicy,37 AbstractUploadPolicy,
@@ -41,7 +42,10 @@
41 parse_build_upload_leaf_name,42 parse_build_upload_leaf_name,
42 UploadProcessor,43 UploadProcessor,
43 )44 )
44from lp.buildmaster.enums import BuildStatus45from lp.buildmaster.enums import (
46 BuildFarmJobType,
47 BuildStatus,
48 )
45from lp.registry.interfaces.distribution import IDistributionSet49from lp.registry.interfaces.distribution import IDistributionSet
46from lp.registry.interfaces.person import IPersonSet50from lp.registry.interfaces.person import IPersonSet
47from lp.registry.interfaces.pocket import PackagePublishingPocket51from lp.registry.interfaces.pocket import PackagePublishingPocket
@@ -1861,17 +1865,26 @@
1861 self.uploadprocessor = self.setupBreezyAndGetUploadProcessor()1865 self.uploadprocessor = self.setupBreezyAndGetUploadProcessor()
18621866
1863 def testInvalidLeafName(self):1867 def testInvalidLeafName(self):
1864 upload_dir = self.queueUpload("bar_1.0-1")1868 # Directories with invalid leaf names should be skipped,
1865 self.uploadprocessor.processBuildUpload(upload_dir, "bar_1.0-1")1869 # and a warning logged.
1870 upload_dir = self.queueUpload("bar_1.0-1", queue_entry="bar")
1871 self.uploadprocessor.processBuildUpload(upload_dir, "bar")
1866 self.assertLogContains('Unable to extract build id from leaf '1872 self.assertLogContains('Unable to extract build id from leaf '
1867 'name bar_1.0-1, skipping.')1873 'name bar, skipping.')
18681874
1869 def testNoBuildEntry(self):1875 def testNoBuildEntry(self):
1870 upload_dir = self.queueUpload("bar_1.0-1", queue_entry="42-60")1876 # Directories with that refer to a nonexistent build
1871 self.assertRaises(NotFoundError, self.uploadprocessor.processBuildUpload,1877 # should be skipped and a warning logged.
1872 upload_dir, "42-60")1878 cookie = "%s-%d" % (BuildFarmJobType.PACKAGEBUILD.name, 42)
1879 upload_dir = self.queueUpload("bar_1.0-1", queue_entry=cookie)
1880 self.uploadprocessor.processBuildUpload(upload_dir, cookie)
1881 self.assertLogContains(
1882 "Unable to find package build job with id 42. Skipping.")
18731883
1874 def testNoFiles(self):1884 def testNoFiles(self):
1885 # If the upload directory is empty, the upload
1886 # will fail.
1887
1875 # Upload a source package1888 # Upload a source package
1876 upload_dir = self.queueUpload("bar_1.0-1")1889 upload_dir = self.queueUpload("bar_1.0-1")
1877 self.processUpload(self.uploadprocessor, upload_dir)1890 self.processUpload(self.uploadprocessor, upload_dir)
@@ -1884,19 +1897,28 @@
1884 version="1.0-1", name="bar")1897 version="1.0-1", name="bar")
1885 queue_item.setDone()1898 queue_item.setDone()
18861899
1887 build.jobStarted()1900 builder = self.factory.makeBuilder()
1888 build.builder = self.factory.makeBuilder()1901 build.buildqueue_record.markAsBuilding(builder)
1902 build.builder = build.buildqueue_record.builder
1903
1904 build.status = BuildStatus.UPLOADING
18891905
1890 # Upload and accept a binary for the primary archive source.1906 # Upload and accept a binary for the primary archive source.
1891 shutil.rmtree(upload_dir)1907 shutil.rmtree(upload_dir)
1892 self.layer.txn.commit()1908 self.layer.txn.commit()
1893 leaf_name = "%d-%d" % (build.id, 60)1909 leaf_name = build.getUploadDirLeaf(build.getBuildCookie())
1894 os.mkdir(os.path.join(self.incoming_folder, leaf_name))1910 os.mkdir(os.path.join(self.incoming_folder, leaf_name))
1895 self.options.context = 'buildd'1911 self.options.context = 'buildd'
1896 self.options.builds = True1912 self.options.builds = True
1897 self.uploadprocessor.processBuildUpload(self.incoming_folder, leaf_name)1913 self.uploadprocessor.processBuildUpload(
1914 self.incoming_folder, leaf_name)
1915 self.assertEquals(1, len(self.oopses))
1898 self.layer.txn.commit()1916 self.layer.txn.commit()
1899 self.assertEquals(BuildStatus.FAILEDTOUPLOAD, build.status)1917 self.assertEquals(
1918 BuildStatus.FAILEDTOUPLOAD, build.status)
1919 self.assertEquals(builder, build.builder)
1920 self.assertIsNot(None, build.date_finished)
1921 self.assertIsNot(None, build.duration)
1900 log_contents = build.upload_log.read()1922 log_contents = build.upload_log.read()
1901 self.assertTrue('ERROR: Exception while processing upload '1923 self.assertTrue('ERROR: Exception while processing upload '
1902 in log_contents)1924 in log_contents)
@@ -1904,6 +1926,8 @@
1904 in log_contents)1926 in log_contents)
19051927
1906 def testSuccess(self):1928 def testSuccess(self):
1929 # Properly uploaded binaries should result in the
1930 # build status changing to FULLYBUILT.
1907 # Upload a source package1931 # Upload a source package
1908 upload_dir = self.queueUpload("bar_1.0-1")1932 upload_dir = self.queueUpload("bar_1.0-1")
1909 self.processUpload(self.uploadprocessor, upload_dir)1933 self.processUpload(self.uploadprocessor, upload_dir)
@@ -1916,21 +1940,27 @@
1916 version="1.0-1", name="bar")1940 version="1.0-1", name="bar")
1917 queue_item.setDone()1941 queue_item.setDone()
19181942
1919 build.jobStarted()1943 build.buildqueue_record.markAsBuilding(self.factory.makeBuilder())
1920 build.builder = self.factory.makeBuilder()1944
1945 build.status = BuildStatus.UPLOADING
19211946
1922 # Upload and accept a binary for the primary archive source.1947 # Upload and accept a binary for the primary archive source.
1923 shutil.rmtree(upload_dir)1948 shutil.rmtree(upload_dir)
1924 self.layer.txn.commit()1949 self.layer.txn.commit()
1925 leaf_name = "%d-%d" % (build.id, 60)1950 leaf_name = build.getUploadDirLeaf(build.getBuildCookie())
1926 upload_dir = self.queueUpload("bar_1.0-1_binary",1951 upload_dir = self.queueUpload("bar_1.0-1_binary",
1927 queue_entry=leaf_name)1952 queue_entry=leaf_name)
1928 self.options.context = 'buildd'1953 self.options.context = 'buildd'
1929 self.options.builds = True1954 self.options.builds = True
1930 self.uploadprocessor.processBuildUpload(self.incoming_folder, leaf_name)1955 last_stub_mail_count = len(stub.test_emails)
1956 self.uploadprocessor.processBuildUpload(
1957 self.incoming_folder, leaf_name)
1931 self.layer.txn.commit()1958 self.layer.txn.commit()
1959 # No emails are sent on success
1960 self.assertEquals(len(stub.test_emails), last_stub_mail_count)
1932 self.assertEquals(BuildStatus.FULLYBUILT, build.status)1961 self.assertEquals(BuildStatus.FULLYBUILT, build.status)
1933 log_lines = build.upload_log.read().splitlines()1962 log_contents = build.upload_log.read()
1963 log_lines = log_contents.splitlines()
1934 self.assertTrue(1964 self.assertTrue(
1935 'INFO: Processing upload bar_1.0-1_i386.changes' in log_lines)1965 'INFO: Processing upload bar_1.0-1_i386.changes' in log_lines)
1936 self.assertTrue(1966 self.assertTrue(
@@ -1942,10 +1972,9 @@
1942 """Tests for parse_build_upload_leaf_name."""1972 """Tests for parse_build_upload_leaf_name."""
19431973
1944 def test_valid(self):1974 def test_valid(self):
1945 self.assertEquals((42, 60), parse_build_upload_leaf_name("42-60"))1975 self.assertEquals(
19461976 60, parse_build_upload_leaf_name("20100812-42-PACKAGEBUILD-60"))
1947 def test_invalid_chars(self):1977
1948 self.assertRaises(ValueError, parse_build_upload_leaf_name, "a42-460")1978 def test_invalid_jobid(self):
19491979 self.assertRaises(
1950 def test_no_dash(self):1980 ValueError, parse_build_upload_leaf_name, "aaba-a42-PACKAGEBUILD-abc")
1951 self.assertRaises(ValueError, parse_build_upload_leaf_name, "32")
19521981
=== modified file 'lib/lp/archiveuploader/uploadprocessor.py'
--- lib/lp/archiveuploader/uploadprocessor.py 2010-08-27 11:19:54 +0000
+++ lib/lp/archiveuploader/uploadprocessor.py 2010-09-16 00:48:58 +0000
@@ -74,14 +74,16 @@
74 SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME,74 SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME,
75 UploadPolicyError,75 UploadPolicyError,
76 )76 )
77from lp.buildmaster.enums import BuildStatus77from lp.buildmaster.enums import (
78 BuildStatus,
79 )
80from lp.buildmaster.interfaces.buildfarmjob import IBuildFarmJobSet
78from lp.registry.interfaces.distribution import IDistributionSet81from lp.registry.interfaces.distribution import IDistributionSet
79from lp.registry.interfaces.person import IPersonSet82from lp.registry.interfaces.person import IPersonSet
80from lp.soyuz.interfaces.archive import (83from lp.soyuz.interfaces.archive import (
81 IArchiveSet,84 IArchiveSet,
82 NoSuchPPA,85 NoSuchPPA,
83 )86 )
84from lp.soyuz.interfaces.binarypackagebuild import IBinaryPackageBuildSet
8587
8688
87__all__ = [89__all__ = [
@@ -104,11 +106,11 @@
104 """Parse the leaf directory name of a build upload.106 """Parse the leaf directory name of a build upload.
105107
106 :param name: Directory name.108 :param name: Directory name.
107 :return: Tuple with build id and build queue record id.109 :return: Tuple with build farm job id.
108 """110 """
109 (build_id_str, queue_record_str) = name.split("-", 1)111 (job_id_str,) = name.split("-")[-1:]
110 try:112 try:
111 return int(build_id_str), int(queue_record_str)113 return int(job_id_str)
112 except TypeError:114 except TypeError:
113 raise ValueError115 raise ValueError
114116
@@ -191,7 +193,7 @@
191 continue193 continue
192 if self.builds:194 if self.builds:
193 # Upload directories contain build results,195 # Upload directories contain build results,
194 # directories are named after build ids.196 # directories are named after job ids.
195 self.processBuildUpload(fsroot, upload)197 self.processBuildUpload(fsroot, upload)
196 else:198 else:
197 self.processUpload(fsroot, upload)199 self.processUpload(fsroot, upload)
@@ -206,13 +208,24 @@
206 Build uploads always contain a single package per leaf.208 Build uploads always contain a single package per leaf.
207 """209 """
208 try:210 try:
209 (build_id, build_queue_record_id) = parse_build_upload_leaf_name(211 job_id = parse_build_upload_leaf_name(upload)
210 upload)
211 except ValueError:212 except ValueError:
212 self.log.warn("Unable to extract build id from leaf name %s,"213 self.log.warn("Unable to extract build id from leaf name %s,"
213 " skipping." % upload)214 " skipping." % upload)
214 return215 return
215 build = getUtility(IBinaryPackageBuildSet).getByBuildID(int(build_id))216 try:
217 buildfarm_job = getUtility(IBuildFarmJobSet).getByID(job_id)
218 except NotFoundError:
219 self.log.warn(
220 "Unable to find package build job with id %d. Skipping." %
221 job_id)
222 return
223 build = buildfarm_job.getSpecificJob()
224 if build.status != BuildStatus.UPLOADING:
225 self.log.warn(
226 "Expected build status to be 'UPLOADING', was %s. Skipping.",
227 build.status.name)
228 return
216 self.log.debug("Build %s found" % build.id)229 self.log.debug("Build %s found" % build.id)
217 logger = BufferLogger()230 logger = BufferLogger()
218 upload_path = os.path.join(fsroot, upload)231 upload_path = os.path.join(fsroot, upload)
@@ -246,6 +259,9 @@
246 build.notify(extra_info="Uploading build %s failed." % upload)259 build.notify(extra_info="Uploading build %s failed." % upload)
247 build.storeUploadLog(logger.buffer.getvalue())260 build.storeUploadLog(logger.buffer.getvalue())
248261
262 # Remove BuildQueue record.
263 build.buildqueue_record.destroySelf()
264
249 def processUpload(self, fsroot, upload):265 def processUpload(self, fsroot, upload):
250 """Process an upload's changes files, and move it to a new directory.266 """Process an upload's changes files, and move it to a new directory.
251267
252268
=== modified file 'lib/lp/buildmaster/enums.py'
--- lib/lp/buildmaster/enums.py 2010-08-27 15:03:18 +0000
+++ lib/lp/buildmaster/enums.py 2010-09-16 00:48:58 +0000
@@ -97,6 +97,13 @@
97 will be notified via process-upload about the reason of the rejection.97 will be notified via process-upload about the reason of the rejection.
98 """)98 """)
9999
100 UPLOADING = DBItem(8, """
101 Uploading build
102
103 The build has completed and is waiting to be processed by the
104 upload processor.
105 """)
106
100107
101class BuildFarmJobType(DBEnumeratedType):108class BuildFarmJobType(DBEnumeratedType):
102 """Soyuz build farm job type.109 """Soyuz build farm job type.
@@ -128,6 +135,3 @@
128135
129 Generate translation templates from a bazaar branch.136 Generate translation templates from a bazaar branch.
130 """)137 """)
131
132
133
134138
=== modified file 'lib/lp/buildmaster/interfaces/buildfarmjob.py'
--- lib/lp/buildmaster/interfaces/buildfarmjob.py 2010-08-30 15:00:23 +0000
+++ lib/lp/buildmaster/interfaces/buildfarmjob.py 2010-09-16 00:48:58 +0000
@@ -292,3 +292,7 @@
292 that should be included.292 that should be included.
293 :return: a `ResultSet` representing the requested builds.293 :return: a `ResultSet` representing the requested builds.
294 """294 """
295
296 def getByID(job_id):
297 """Look up a `IBuildFarmJob` record by id.
298 """
295299
=== modified file 'lib/lp/buildmaster/interfaces/packagebuild.py'
--- lib/lp/buildmaster/interfaces/packagebuild.py 2010-08-27 11:19:54 +0000
+++ lib/lp/buildmaster/interfaces/packagebuild.py 2010-09-16 00:48:58 +0000
@@ -91,13 +91,6 @@
91 title=_("Distribution series"), required=True,91 title=_("Distribution series"), required=True,
92 description=_("Shortcut for its distribution series.")))92 description=_("Shortcut for its distribution series.")))
9393
94 def getUploaderCommand(package_build, upload_leaf, uploader_logfilename):
95 """Get the command to run as the uploader.
96
97 :return: A list of command line arguments, beginning with the
98 executable.
99 """
100
101 def getUploadDirLeaf(build_cookie, now=None):94 def getUploadDirLeaf(build_cookie, now=None):
102 """Return the directory-leaf where files to be uploaded are stored.95 """Return the directory-leaf where files to be uploaded are stored.
10396
@@ -106,24 +99,13 @@
106 directory name. If not provided, defaults to now.99 directory name. If not provided, defaults to now.
107 """100 """
108101
109 def getUploadDir(upload_leaf):102 def getBuildCookie():
110 """Return the full directory where files to be uploaded are stored.103 """Return the build cookie (build id and build queue record id).
111
112 :param upload_leaf: The leaf directory name where things will be
113 stored.
114 """104 """
115105
116 def getLogFromSlave(build):106 def getLogFromSlave(build):
117 """Get last buildlog from slave. """107 """Get last buildlog from slave. """
118108
119 def getUploadLogContent(root, leaf):
120 """Retrieve the upload log contents.
121
122 :param root: Root directory for the uploads
123 :param leaf: Leaf for this particular upload
124 :return: Contents of log file or message saying no log file was found.
125 """
126
127 def estimateDuration():109 def estimateDuration():
128 """Estimate the build duration."""110 """Estimate the build duration."""
129111
130112
=== modified file 'lib/lp/buildmaster/model/buildfarmjob.py'
--- lib/lp/buildmaster/model/buildfarmjob.py 2010-08-30 15:00:23 +0000
+++ lib/lp/buildmaster/model/buildfarmjob.py 2010-09-16 00:48:58 +0000
@@ -52,6 +52,7 @@
52 IStoreSelector,52 IStoreSelector,
53 MAIN_STORE,53 MAIN_STORE,
54 )54 )
55from lp.app.errors import NotFoundError
55from lp.buildmaster.enums import BuildStatus56from lp.buildmaster.enums import BuildStatus
56from lp.buildmaster.enums import BuildFarmJobType57from lp.buildmaster.enums import BuildFarmJobType
57from lp.buildmaster.interfaces.buildfarmjob import (58from lp.buildmaster.interfaces.buildfarmjob import (
@@ -339,6 +340,7 @@
339 """See `IBuild`"""340 """See `IBuild`"""
340 return self.status not in [BuildStatus.NEEDSBUILD,341 return self.status not in [BuildStatus.NEEDSBUILD,
341 BuildStatus.BUILDING,342 BuildStatus.BUILDING,
343 BuildStatus.UPLOADING,
342 BuildStatus.SUPERSEDED]344 BuildStatus.SUPERSEDED]
343345
344 def getSpecificJob(self):346 def getSpecificJob(self):
@@ -431,3 +433,11 @@
431 filtered_builds.config(distinct=True)433 filtered_builds.config(distinct=True)
432434
433 return filtered_builds435 return filtered_builds
436
437 def getByID(self, job_id):
438 """See `IBuildfarmJobSet`."""
439 job = IStore(BuildFarmJob).find(BuildFarmJob,
440 BuildFarmJob.id == job_id).one()
441 if job is None:
442 raise NotFoundError(job_id)
443 return job
434444
=== modified file 'lib/lp/buildmaster/model/packagebuild.py'
--- lib/lp/buildmaster/model/packagebuild.py 2010-09-09 17:02:33 +0000
+++ lib/lp/buildmaster/model/packagebuild.py 2010-09-16 00:48:58 +0000
@@ -14,7 +14,6 @@
14import datetime14import datetime
15import logging15import logging
16import os.path16import os.path
17import subprocess
1817
19from cStringIO import StringIO18from cStringIO import StringIO
20from lazr.delegates import delegates19from lazr.delegates import delegates
@@ -36,11 +35,6 @@
3635
37from canonical.config import config36from canonical.config import config
38from canonical.database.enumcol import DBEnum37from canonical.database.enumcol import DBEnum
39from canonical.database.sqlbase import (
40 clear_current_connection_cache,
41 cursor,
42 flush_database_updates,
43 )
44from canonical.launchpad.browser.librarian import ProxiedLibraryFileAlias38from canonical.launchpad.browser.librarian import ProxiedLibraryFileAlias
45from canonical.launchpad.helpers import filenameToContentType39from canonical.launchpad.helpers import filenameToContentType
46from canonical.launchpad.interfaces.librarian import ILibraryFileAliasSet40from canonical.launchpad.interfaces.librarian import ILibraryFileAliasSet
@@ -73,7 +67,6 @@
7367
7468
75SLAVE_LOG_FILENAME = 'buildlog'69SLAVE_LOG_FILENAME = 'buildlog'
76UPLOAD_LOG_FILENAME = 'uploader.log'
7770
7871
79class PackageBuild(BuildFarmJobDerived, Storm):72class PackageBuild(BuildFarmJobDerived, Storm):
@@ -164,30 +157,11 @@
164 timestamp = now.strftime("%Y%m%d-%H%M%S")157 timestamp = now.strftime("%Y%m%d-%H%M%S")
165 return '%s-%s' % (timestamp, build_cookie)158 return '%s-%s' % (timestamp, build_cookie)
166159
167 def getUploadDir(self, upload_leaf):160 def getBuildCookie(self):
168 """See `IPackageBuild`."""161 """See `IPackageBuild`."""
169 return os.path.join(config.builddmaster.root, 'incoming', upload_leaf)162 return '%s-%s-%s' % (
170163 self.id, self.build_farm_job.job_type.name,
171 @staticmethod164 self.build_farm_job.id)
172 def getUploaderCommand(package_build, upload_leaf, upload_logfilename):
173 """See `IPackageBuild`."""
174 root = os.path.abspath(config.builddmaster.root)
175 uploader_command = list(config.builddmaster.uploader.split())
176
177 # Add extra arguments for processing a package upload.
178 extra_args = [
179 "--log-file", "%s" % upload_logfilename,
180 "-d", "%s" % package_build.distribution.name,
181 "-s", "%s" % (
182 package_build.distro_series.getSuite(package_build.pocket)),
183 "-b", "%s" % package_build.id,
184 "-J", "%s" % upload_leaf,
185 '--context=%s' % package_build.policy_name,
186 "%s" % root,
187 ]
188
189 uploader_command.extend(extra_args)
190 return uploader_command
191165
192 @staticmethod166 @staticmethod
193 def getLogFromSlave(package_build):167 def getLogFromSlave(package_build):
@@ -198,26 +172,6 @@
198 package_build.buildqueue_record.getLogFileName(),172 package_build.buildqueue_record.getLogFileName(),
199 package_build.is_private)173 package_build.is_private)
200174
201 @staticmethod
202 def getUploadLogContent(root, leaf):
203 """Retrieve the upload log contents.
204
205 :param root: Root directory for the uploads
206 :param leaf: Leaf for this particular upload
207 :return: Contents of log file or message saying no log file was found.
208 """
209 # Retrieve log file content.
210 possible_locations = (
211 'failed', 'failed-to-move', 'rejected', 'accepted')
212 for location_dir in possible_locations:
213 log_filepath = os.path.join(root, location_dir, leaf,
214 UPLOAD_LOG_FILENAME)
215 if os.path.exists(log_filepath):
216 with open(log_filepath, 'r') as uploader_log_file:
217 return uploader_log_file.read()
218 else:
219 return 'Could not find upload log file'
220
221 def estimateDuration(self):175 def estimateDuration(self):
222 """See `IPackageBuild`."""176 """See `IPackageBuild`."""
223 raise NotImplementedError177 raise NotImplementedError
@@ -346,19 +300,16 @@
346 root = os.path.abspath(config.builddmaster.root)300 root = os.path.abspath(config.builddmaster.root)
347301
348 # Create a single directory to store build result files.302 # Create a single directory to store build result files.
349 upload_leaf = self.getUploadDirLeaf(303 upload_leaf = self.getUploadDirLeaf(self.getBuildCookie())
350 '%s-%s' % (self.id, self.buildqueue_record.id))304 grab_dir = os.path.join(root, "grabbing", upload_leaf)
351 upload_dir = self.getUploadDir(upload_leaf)305 logger.debug("Storing build result at '%s'" % grab_dir)
352 logger.debug("Storing build result at '%s'" % upload_dir)
353306
354 # Build the right UPLOAD_PATH so the distribution and archive307 # Build the right UPLOAD_PATH so the distribution and archive
355 # can be correctly found during the upload:308 # can be correctly found during the upload:
356 # <archive_id>/distribution_name309 # <archive_id>/distribution_name
357 # for all destination archive types.310 # for all destination archive types.
358 archive = self.archive311 upload_path = os.path.join(
359 distribution_name = self.distribution.name312 grab_dir, str(self.archive.id), self.distribution.name)
360 target_path = '%s/%s' % (archive.id, distribution_name)
361 upload_path = os.path.join(upload_dir, target_path)
362 os.makedirs(upload_path)313 os.makedirs(upload_path)
363314
364 slave = removeSecurityProxy(self.buildqueue_record.builder.slave)315 slave = removeSecurityProxy(self.buildqueue_record.builder.slave)
@@ -379,106 +330,35 @@
379 slave_file = slave.getFile(filemap[filename])330 slave_file = slave.getFile(filemap[filename])
380 copy_and_close(slave_file, out_file)331 copy_and_close(slave_file, out_file)
381332
333 # Store build information, build record was already updated during
334 # the binary upload.
335 self.storeBuildInfo(self, librarian, slave_status)
336
382 # We only attempt the upload if we successfully copied all the337 # We only attempt the upload if we successfully copied all the
383 # files from the slave.338 # files from the slave.
384 if successful_copy_from_slave:339 if successful_copy_from_slave:
385 uploader_logfilename = os.path.join(340 logger.info(
386 upload_dir, UPLOAD_LOG_FILENAME)341 "Gathered %s %d completely. Moving %s to uploader queue." % (
387 uploader_command = self.getUploaderCommand(342 self.__class__.__name__, self.id, upload_leaf))
388 self, upload_leaf, uploader_logfilename)343 target_dir = os.path.join(root, "incoming")
389 logger.debug("Saving uploader log at '%s'" % uploader_logfilename)344 self.status = BuildStatus.UPLOADING
390345 else:
391 logger.info("Invoking uploader on %s" % root)346 logger.warning(
392 logger.info("%s" % uploader_command)347 "Copy from slave for build %s was unsuccessful.", self.id)
393
394 uploader_process = subprocess.Popen(
395 uploader_command, stdout=subprocess.PIPE,
396 stderr=subprocess.PIPE)
397
398 # Nothing should be written to the stdout/stderr.
399 upload_stdout, upload_stderr = uploader_process.communicate()
400
401 # XXX cprov 2007-04-17: we do not check uploader_result_code
402 # anywhere. We need to find out what will be best strategy
403 # when it failed HARD (there is a huge effort in process-upload
404 # to not return error, it only happen when the code is broken).
405 uploader_result_code = uploader_process.returncode
406 logger.info("Uploader returned %d" % uploader_result_code)
407
408 # Quick and dirty hack to carry on on process-upload failures
409 if os.path.exists(upload_dir):
410 logger.warning("The upload directory did not get moved.")
411 failed_dir = os.path.join(root, "failed-to-move")
412 if not os.path.exists(failed_dir):
413 os.mkdir(failed_dir)
414 os.rename(upload_dir, os.path.join(failed_dir, upload_leaf))
415
416 # The famous 'flush_updates + clear_cache' will make visible
417 # the DB changes done in process-upload, considering that the
418 # transaction was set with ISOLATION_LEVEL_READ_COMMITED
419 # isolation level.
420 cur = cursor()
421 cur.execute('SHOW transaction_isolation')
422 isolation_str = cur.fetchone()[0]
423 assert isolation_str == 'read committed', (
424 'BuildMaster/BuilderGroup transaction isolation should be '
425 'ISOLATION_LEVEL_READ_COMMITTED (not "%s")' % isolation_str)
426
427 original_slave = self.buildqueue_record.builder.slave
428
429 # XXX Robert Collins, Celso Providelo 2007-05-26 bug=506256:
430 # 'Refreshing' objects procedure is forced on us by using a
431 # different process to do the upload, but as that process runs
432 # in the same unix account, it is simply double handling and we
433 # would be better off to do it within this process.
434 flush_database_updates()
435 clear_current_connection_cache()
436
437 # XXX cprov 2007-06-15: Re-issuing removeSecurityProxy is forced on
438 # us by sqlobject refreshing the builder object during the
439 # transaction cache clearing. Once we sort the previous problem
440 # this step should probably not be required anymore.
441 self.buildqueue_record.builder.setSlaveForTesting(
442 removeSecurityProxy(original_slave))
443
444 # Store build information, build record was already updated during
445 # the binary upload.
446 self.storeBuildInfo(self, librarian, slave_status)
447
448 # Retrive the up-to-date build record and perform consistency
449 # checks. The build record should be updated during the binary
450 # upload processing, if it wasn't something is broken and needs
451 # admins attention. Even when we have a FULLYBUILT build record,
452 # if it is not related with at least one binary, there is also
453 # a problem.
454 # For both situations we will mark the builder as FAILEDTOUPLOAD
455 # and the and update the build details (datebuilt, duration,
456 # buildlog, builder) in LP. A build-failure-notification will be
457 # sent to the lp-build-admin celebrity and to the sourcepackagerelease
458 # uploader about this occurrence. The failure notification will
459 # also contain the information required to manually reprocess the
460 # binary upload when it was the case.
461 if (self.status != BuildStatus.FULLYBUILT or
462 not successful_copy_from_slave or
463 not self.verifySuccessfulUpload()):
464 logger.warning("Build %s upload failed." % self.id)
465 self.status = BuildStatus.FAILEDTOUPLOAD348 self.status = BuildStatus.FAILEDTOUPLOAD
466 uploader_log_content = self.getUploadLogContent(root,349 self.notify(extra_info='Copy from slave was unsuccessful.')
467 upload_leaf)350 target_dir = os.path.join(root, "failed")
468 # Store the upload_log_contents in librarian so it can be351
469 # accessed by anyone with permission to see the build.352 if not os.path.exists(target_dir):
470 self.storeUploadLog(uploader_log_content)353 os.mkdir(target_dir)
471 # Notify the build failure.354
472 self.notify(extra_info=uploader_log_content)355 # Move the directory used to grab the binaries into
473 else:356 # the incoming directory so the upload processor never
474 logger.info(357 # sees half-finished uploads.
475 "Gathered %s %d completely" % (358 os.rename(grab_dir, os.path.join(target_dir, upload_leaf))
476 self.__class__.__name__, self.id))
477359
478 # Release the builder for another job.360 # Release the builder for another job.
479 self.buildqueue_record.builder.cleanSlave()361 self.buildqueue_record.builder.cleanSlave()
480 # Remove BuildQueue record.
481 self.buildqueue_record.destroySelf()
482362
483 def _handleStatus_PACKAGEFAIL(self, librarian, slave_status, logger):363 def _handleStatus_PACKAGEFAIL(self, librarian, slave_status, logger):
484 """Handle a package that had failed to build.364 """Handle a package that had failed to build.
@@ -583,7 +463,9 @@
583 unfinished_states = [463 unfinished_states = [
584 BuildStatus.NEEDSBUILD,464 BuildStatus.NEEDSBUILD,
585 BuildStatus.BUILDING,465 BuildStatus.BUILDING,
586 BuildStatus.SUPERSEDED]466 BuildStatus.UPLOADING,
467 BuildStatus.SUPERSEDED,
468 ]
587 if status is None or status in unfinished_states:469 if status is None or status in unfinished_states:
588 result_set.order_by(470 result_set.order_by(
589 Desc(BuildFarmJob.date_created), BuildFarmJob.id)471 Desc(BuildFarmJob.date_created), BuildFarmJob.id)
590472
=== modified file 'lib/lp/buildmaster/tests/test_buildfarmjob.py'
--- lib/lp/buildmaster/tests/test_buildfarmjob.py 2010-08-30 15:00:23 +0000
+++ lib/lp/buildmaster/tests/test_buildfarmjob.py 2010-09-16 00:48:58 +0000
@@ -22,6 +22,7 @@
22 DatabaseFunctionalLayer,22 DatabaseFunctionalLayer,
23 LaunchpadFunctionalLayer,23 LaunchpadFunctionalLayer,
24 )24 )
25from lp.app.errors import NotFoundError
25from lp.buildmaster.enums import (26from lp.buildmaster.enums import (
26 BuildFarmJobType,27 BuildFarmJobType,
27 BuildStatus,28 BuildStatus,
@@ -317,3 +318,17 @@
317 result = self.build_farm_job_set.getBuildsForBuilder(self.builder)318 result = self.build_farm_job_set.getBuildsForBuilder(self.builder)
318319
319 self.assertEqual([build_1, build_2], list(result))320 self.assertEqual([build_1, build_2], list(result))
321
322 def test_getByID(self):
323 # getByID returns a job by id.
324 build_1 = self.makeBuildFarmJob(
325 builder=self.builder,
326 date_finished=datetime(2008, 10, 10, tzinfo=pytz.UTC))
327 flush_database_updates()
328 self.assertEquals(
329 build_1, self.build_farm_job_set.getByID(build_1.id))
330
331 def test_getByID_nonexistant(self):
332 # getByID raises NotFoundError for unknown job ids.
333 self.assertRaises(NotFoundError,
334 self.build_farm_job_set.getByID, 423432432432)
320335
=== modified file 'lib/lp/buildmaster/tests/test_packagebuild.py'
--- lib/lp/buildmaster/tests/test_packagebuild.py 2010-09-09 17:02:33 +0000
+++ lib/lp/buildmaster/tests/test_packagebuild.py 2010-09-16 00:48:58 +0000
@@ -9,7 +9,7 @@
99
10from datetime import datetime10from datetime import datetime
11import hashlib11import hashlib
12import os.path12import os
1313
14from storm.store import Store14from storm.store import Store
15from zope.component import getUtility15from zope.component import getUtility
@@ -22,6 +22,9 @@
22 LaunchpadFunctionalLayer,22 LaunchpadFunctionalLayer,
23 LaunchpadZopelessLayer,23 LaunchpadZopelessLayer,
24 )24 )
25from lp.archiveuploader.uploadprocessor import (
26 parse_build_upload_leaf_name,
27 )
25from lp.buildmaster.enums import (28from lp.buildmaster.enums import (
26 BuildFarmJobType,29 BuildFarmJobType,
27 BuildStatus,30 BuildStatus,
@@ -34,7 +37,6 @@
34from lp.buildmaster.model.packagebuild import PackageBuild37from lp.buildmaster.model.packagebuild import PackageBuild
35from lp.registry.interfaces.pocket import (38from lp.registry.interfaces.pocket import (
36 PackagePublishingPocket,39 PackagePublishingPocket,
37 pocketsuffix,
38 )40 )
39from lp.soyuz.tests.soyuzbuilddhelpers import WaitingSlave41from lp.soyuz.tests.soyuzbuilddhelpers import WaitingSlave
40from lp.testing import (42from lp.testing import (
@@ -192,15 +194,14 @@
192 '%s-%s' % (now.strftime("%Y%m%d-%H%M%S"), build_cookie),194 '%s-%s' % (now.strftime("%Y%m%d-%H%M%S"), build_cookie),
193 upload_leaf)195 upload_leaf)
194196
195 def test_getUploadDir(self):197 def test_getBuildCookie(self):
196 # getUploadDir is the absolute path to the directory in which things198 # A build cookie is made up of the package build id and record id.
197 # are uploaded to.199 # The uploadprocessor relies on this format.
198 build_cookie = self.factory.getUniqueInteger()200 Store.of(self.package_build).flush()
199 upload_leaf = self.package_build.getUploadDirLeaf(build_cookie)201 cookie = self.package_build.getBuildCookie()
200 upload_dir = self.package_build.getUploadDir(upload_leaf)202 expected_cookie = "%d-PACKAGEBUILD-%d" % (
201 self.assertEqual(203 self.package_build.id, self.package_build.build_farm_job.id)
202 os.path.join(config.builddmaster.root, 'incoming', upload_leaf),204 self.assertEquals(expected_cookie, cookie)
203 upload_dir)
204205
205206
206class TestPackageBuildSet(TestPackageBuildBase):207class TestPackageBuildSet(TestPackageBuildBase):
@@ -257,57 +258,18 @@
257 super(TestGetUploadMethodsMixin, self).setUp()258 super(TestGetUploadMethodsMixin, self).setUp()
258 self.build = self.makeBuild()259 self.build = self.makeBuild()
259260
260 def test_getUploadLogContent_nolog(self):261 def test_getUploadDirLeafCookie_parseable(self):
261 """If there is no log file there, a string explanation is returned.262 # getUploadDirLeaf should return a directory name
262 """263 # that is parseable by the upload processor.
263 self.useTempDir()264 upload_leaf = self.build.getUploadDirLeaf(
264 self.assertEquals(265 self.build.getBuildCookie())
265 'Could not find upload log file',266 job_id = parse_build_upload_leaf_name(upload_leaf)
266 self.build.getUploadLogContent(os.getcwd(), "myleaf"))267 self.assertEqual(job_id, self.build.build_farm_job.id)
267
268 def test_getUploadLogContent_only_dir(self):
269 """If there is a directory but no log file, expect the error string,
270 not an exception."""
271 self.useTempDir()
272 os.makedirs("accepted/myleaf")
273 self.assertEquals(
274 'Could not find upload log file',
275 self.build.getUploadLogContent(os.getcwd(), "myleaf"))
276
277 def test_getUploadLogContent_readsfile(self):
278 """If there is a log file, return its contents."""
279 self.useTempDir()
280 os.makedirs("accepted/myleaf")
281 with open('accepted/myleaf/uploader.log', 'w') as f:
282 f.write('foo')
283 self.assertEquals(
284 'foo', self.build.getUploadLogContent(os.getcwd(), "myleaf"))
285
286 def test_getUploaderCommand(self):
287 upload_leaf = self.factory.getUniqueString('upload-leaf')
288 config_args = list(config.builddmaster.uploader.split())
289 log_file = self.factory.getUniqueString('logfile')
290 config_args.extend(
291 ['--log-file', log_file,
292 '-d', self.build.distribution.name,
293 '-s', (self.build.distro_series.name
294 + pocketsuffix[self.build.pocket]),
295 '-b', str(self.build.id),
296 '-J', upload_leaf,
297 '--context=%s' % self.build.policy_name,
298 os.path.abspath(config.builddmaster.root),
299 ])
300 uploader_command = self.build.getUploaderCommand(
301 self.build, upload_leaf, log_file)
302 self.assertEqual(config_args, uploader_command)
303268
304269
305class TestHandleStatusMixin:270class TestHandleStatusMixin:
306 """Tests for `IPackageBuild`s handleStatus method.271 """Tests for `IPackageBuild`s handleStatus method.
307272
308 Note: these tests do *not* test the updating of the build
309 status to FULLYBUILT as this happens during the upload which
310 is stubbed out by a mock function.
311 """273 """
312274
313 layer = LaunchpadZopelessLayer275 layer = LaunchpadZopelessLayer
@@ -329,23 +291,23 @@
329 builder.setSlaveForTesting(self.slave)291 builder.setSlaveForTesting(self.slave)
330292
331 # We overwrite the buildmaster root to use a temp directory.293 # We overwrite the buildmaster root to use a temp directory.
332 tmp_dir = self.makeTemporaryDirectory()294 self.upload_root = self.makeTemporaryDirectory()
333 tmp_builddmaster_root = """295 tmp_builddmaster_root = """
334 [builddmaster]296 [builddmaster]
335 root: %s297 root: %s
336 """ % tmp_dir298 """ % self.upload_root
337 config.push('tmp_builddmaster_root', tmp_builddmaster_root)299 config.push('tmp_builddmaster_root', tmp_builddmaster_root)
338300
339 # We stub out our builds getUploaderCommand() method so301 # We stub out our builds getUploaderCommand() method so
340 # we can check whether it was called as well as302 # we can check whether it was called as well as
341 # verifySuccessfulUpload().303 # verifySuccessfulUpload().
342 self.fake_getUploaderCommand = FakeMethod(
343 result=['echo', 'noop'])
344 removeSecurityProxy(self.build).getUploaderCommand = (
345 self.fake_getUploaderCommand)
346 removeSecurityProxy(self.build).verifySuccessfulUpload = FakeMethod(304 removeSecurityProxy(self.build).verifySuccessfulUpload = FakeMethod(
347 result=True)305 result=True)
348306
307 def assertResultCount(self, count, result):
308 self.assertEquals(
309 1, len(os.listdir(os.path.join(self.upload_root, result))))
310
349 def test_handleStatus_OK_normal_file(self):311 def test_handleStatus_OK_normal_file(self):
350 # A filemap with plain filenames should not cause a problem.312 # A filemap with plain filenames should not cause a problem.
351 # The call to handleStatus will attempt to get the file from313 # The call to handleStatus will attempt to get the file from
@@ -354,8 +316,8 @@
354 'filemap': {'myfile.py': 'test_file_hash'},316 'filemap': {'myfile.py': 'test_file_hash'},
355 })317 })
356318
357 self.assertEqual(BuildStatus.FULLYBUILT, self.build.status)319 self.assertEqual(BuildStatus.UPLOADING, self.build.status)
358 self.assertEqual(1, self.fake_getUploaderCommand.call_count)320 self.assertResultCount(1, "incoming")
359321
360 def test_handleStatus_OK_absolute_filepath(self):322 def test_handleStatus_OK_absolute_filepath(self):
361 # A filemap that tries to write to files outside of323 # A filemap that tries to write to files outside of
@@ -364,7 +326,7 @@
364 'filemap': {'/tmp/myfile.py': 'test_file_hash'},326 'filemap': {'/tmp/myfile.py': 'test_file_hash'},
365 })327 })
366 self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status)328 self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status)
367 self.assertEqual(0, self.fake_getUploaderCommand.call_count)329 self.assertResultCount(0, "failed")
368330
369 def test_handleStatus_OK_relative_filepath(self):331 def test_handleStatus_OK_relative_filepath(self):
370 # A filemap that tries to write to files outside of332 # A filemap that tries to write to files outside of
@@ -373,7 +335,7 @@
373 'filemap': {'../myfile.py': 'test_file_hash'},335 'filemap': {'../myfile.py': 'test_file_hash'},
374 })336 })
375 self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status)337 self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status)
376 self.assertEqual(0, self.fake_getUploaderCommand.call_count)338 self.assertResultCount(0, "failed")
377339
378 def test_handleStatus_OK_sets_build_log(self):340 def test_handleStatus_OK_sets_build_log(self):
379 # The build log is set during handleStatus.341 # The build log is set during handleStatus.
380342
=== modified file 'lib/lp/code/browser/sourcepackagerecipebuild.py'
--- lib/lp/code/browser/sourcepackagerecipebuild.py 2010-08-27 11:19:54 +0000
+++ lib/lp/code/browser/sourcepackagerecipebuild.py 2010-09-16 00:48:58 +0000
@@ -82,6 +82,7 @@
82 return 'No suitable builders'82 return 'No suitable builders'
83 return {83 return {
84 BuildStatus.NEEDSBUILD: 'Pending build',84 BuildStatus.NEEDSBUILD: 'Pending build',
85 BuildStatus.UPLOADING: 'Build uploading',
85 BuildStatus.FULLYBUILT: 'Successful build',86 BuildStatus.FULLYBUILT: 'Successful build',
86 BuildStatus.MANUALDEPWAIT: (87 BuildStatus.MANUALDEPWAIT: (
87 'Could not build because of missing dependencies'),88 'Could not build because of missing dependencies'),
8889
=== modified file 'lib/lp/code/model/tests/test_sourcepackagerecipebuild.py'
--- lib/lp/code/model/tests/test_sourcepackagerecipebuild.py 2010-09-09 17:02:33 +0000
+++ lib/lp/code/model/tests/test_sourcepackagerecipebuild.py 2010-09-16 00:48:58 +0000
@@ -354,15 +354,15 @@
354 queue_record.builder.setSlaveForTesting(slave)354 queue_record.builder.setSlaveForTesting(slave)
355 return build355 return build
356356
357 def assertNotifyOnce(status, build):357 def assertNotifyCount(status, build, count):
358 build.handleStatus(status, None, {'filemap': {}})358 build.handleStatus(status, None, {'filemap': {}})
359 self.assertEqual(1, len(pop_notifications()))359 self.assertEqual(count, len(pop_notifications()))
360 for status in ['PACKAGEFAIL', 'OK']:360 assertNotifyCount("PACKAGEFAIL", prepare_build(), 1)
361 assertNotifyOnce(status, prepare_build())361 assertNotifyCount("OK", prepare_build(), 0)
362 build = prepare_build()362 build = prepare_build()
363 removeSecurityProxy(build).verifySuccessfulUpload = FakeMethod(363 removeSecurityProxy(build).verifySuccessfulUpload = FakeMethod(
364 result=True)364 result=True)
365 assertNotifyOnce('OK', prepare_build())365 assertNotifyCount("OK", prepare_build(), 0)
366366
367367
368class MakeSPRecipeBuildMixin:368class MakeSPRecipeBuildMixin:
369369
=== modified file 'lib/lp/registry/model/sourcepackage.py'
--- lib/lp/registry/model/sourcepackage.py 2010-08-27 11:19:54 +0000
+++ lib/lp/registry/model/sourcepackage.py 2010-09-16 00:48:58 +0000
@@ -597,11 +597,15 @@
597 % sqlvalues(BuildStatus.FULLYBUILT))597 % sqlvalues(BuildStatus.FULLYBUILT))
598598
599 # Ordering according status599 # Ordering according status
600 # * NEEDSBUILD & BUILDING by -lastscore600 # * NEEDSBUILD, BUILDING & UPLOADING by -lastscore
601 # * SUPERSEDED by -datecreated601 # * SUPERSEDED by -datecreated
602 # * FULLYBUILT & FAILURES by -datebuilt602 # * FULLYBUILT & FAILURES by -datebuilt
603 # It should present the builds in a more natural order.603 # It should present the builds in a more natural order.
604 if build_state in [BuildStatus.NEEDSBUILD, BuildStatus.BUILDING]:604 if build_state in [
605 BuildStatus.NEEDSBUILD,
606 BuildStatus.BUILDING,
607 BuildStatus.UPLOADING,
608 ]:
605 orderBy = ["-BuildQueue.lastscore"]609 orderBy = ["-BuildQueue.lastscore"]
606 clauseTables.append('BuildPackageJob')610 clauseTables.append('BuildPackageJob')
607 condition_clauses.append(611 condition_clauses.append(
608612
=== modified file 'lib/lp/soyuz/browser/archive.py'
--- lib/lp/soyuz/browser/archive.py 2010-08-31 11:31:04 +0000
+++ lib/lp/soyuz/browser/archive.py 2010-09-16 00:48:58 +0000
@@ -947,6 +947,7 @@
947 'NEEDSBUILD': 'Waiting to build',947 'NEEDSBUILD': 'Waiting to build',
948 'FAILEDTOBUILD': 'Failed to build:',948 'FAILEDTOBUILD': 'Failed to build:',
949 'BUILDING': 'Currently building',949 'BUILDING': 'Currently building',
950 'UPLOADING': 'Currently uploading',
950 }951 }
951952
952 now = datetime.now(tz=pytz.UTC)953 now = datetime.now(tz=pytz.UTC)
953954
=== modified file 'lib/lp/soyuz/doc/buildd-slavescanner.txt'
--- lib/lp/soyuz/doc/buildd-slavescanner.txt 2010-08-30 02:07:38 +0000
+++ lib/lp/soyuz/doc/buildd-slavescanner.txt 2010-09-16 00:48:58 +0000
@@ -319,87 +319,27 @@
319This situation happens when the builder has finished the job and is319This situation happens when the builder has finished the job and is
320waiting for the master to collect its results.320waiting for the master to collect its results.
321321
322The build record in question can end up in the following states:322The build record in question will end up as UPLOADING.
323323
324 * FULLYBUILT: when binaries were collected and uploaded correctly;324=== Uploading (UPLOADING) ===
325 * FAILEDTOUPLOAD: binaries were collected but the upload was
326 rejected/failed.
327
328
329=== Failed to Upload (FAILEDTOUPLOAD) ===
330325
331 >>> bqItem10 = a_build.queueBuild()326 >>> bqItem10 = a_build.queueBuild()
332 >>> setupBuildQueue(bqItem10, a_builder)327 >>> setupBuildQueue(bqItem10, a_builder)
333 >>> last_stub_mail_count = len(stub.test_emails)
334328
335Create a mock slave so the builder gets the right responses for this test.329Create a mock slave so the builder gets the right responses for this test.
336330
337 >>> bqItem10.builder.setSlaveForTesting(331 >>> bqItem10.builder.setSlaveForTesting(
338 ... WaitingSlave('BuildStatus.OK'))332 ... WaitingSlave('BuildStatus.OK'))
339333
340If the build record wasn't updated before/during the updateBuild334The build will progress to the UPLOADING state if the status from
341(precisely on binary upload time), the build will be considered335the builder was OK:
342FAILEDTOUPLOAD:
343336
344 >>> build = getUtility(IBinaryPackageBuildSet).getByQueueEntry(bqItem10)337 >>> build = getUtility(IBinaryPackageBuildSet).getByQueueEntry(bqItem10)
345 >>> a_builder.updateBuild(bqItem10)338 >>> a_builder.updateBuild(bqItem10)
346 WARNING:slave-scanner:Build ... upload failed.
347 >>> build.builder is not None
348 True
349 >>> build.date_finished is not None
350 True
351 >>> build.duration is not None
352 True
353 >>> build.log is not None
354 True
355 >>> check_mail_sent(last_stub_mail_count)
356 True
357 >>> build.status.title339 >>> build.status.title
358 'Failed to upload'340 'Uploading build'
359341
360Let's check the emails generated by this 'failure'342 >>> bqItem10.destroySelf()
361(see build-failedtoupload-workflow.txt for more information):
362
363 >>> from operator import itemgetter
364 >>> local_test_emails = stub.test_emails[last_stub_mail_count:]
365 >>> local_test_emails.sort(key=itemgetter(1), reverse=True)
366 >>> for from_addr, to_addrs, raw_msg in local_test_emails:
367 ... print to_addrs
368 ['mark@example.com']
369 ['foo.bar@canonical.com']
370 ['celso.providelo@canonical.com']
371
372Note that a real failed-to-upload notification contains the respective
373upload log information:
374
375 >>> one_email = stub.test_emails.pop()
376 >>> from_addr, to_addrs, raw_msg = one_email
377 >>> print raw_msg
378 Content-Type: text/plain; charset="utf-8"
379 ...
380 X-Launchpad-Build-State: FAILEDTOUPLOAD
381 ...
382 * Build Log: http://.../...i386.mozilla-firefox_0.9_BUILDING.txt.gz
383 ...
384 Upload log:
385 DEBUG ...
386 DEBUG Initialising connection.
387 ...
388 DEBUG Removing lock file: /var/lock/process-upload-buildd.lock
389 ...
390
391When a failure in processing the generated binaries occurs, the log
392output is both emailed in an immediate notification, and stored in the
393librarian for future reference.
394
395 >>> build.upload_log is not None
396 True
397
398What we can clearly notice is that the log is still containing
399the old build state (BUILDING) in its name. This is a minor problem
400that can be sorted by modifying the execution order of procedures
401inside Buildergroup.buildStatus_OK method.
402
403343
404=== Successfully collected and uploaded (FULLYBUILT) ===344=== Successfully collected and uploaded (FULLYBUILT) ===
405345
@@ -426,36 +366,14 @@
426366
427 >>> bqItem10.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))367 >>> bqItem10.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))
428368
429Now in order to emulate a successfully binary upload we will update
430the build record to FULLYBUILT, as the process-upload would do:
431
432 >>> from lp.buildmaster.enums import BuildStatus
433 >>> build.status = BuildStatus.FULLYBUILT
434
435Now the updateBuild should recognize this build record as a
436Successfully built and uploaded procedure, not sending any
437notification and updating the build information:
438
439 >>> a_builder.updateBuild(bqItem10)
440 >>> build.builder is not None
441 True
442 >>> build.date_finished is not None
443 True
444 >>> build.duration is not None
445 True
446 >>> build.log is not None
447 True
448 >>> build.status.title
449 'Successfully built'
450 >>> check_mail_sent(last_stub_mail_count)
451 False
452
453We do not store any build log information when the binary upload369We do not store any build log information when the binary upload
454processing succeeded.370processing succeeded.
455371
456 >>> build.upload_log is None372 >>> build.upload_log is None
457 True373 True
458374
375 >>> bqItem10.destroySelf()
376
459WAITING -> GIVENBACK - slave requested build record to be rescheduled.377WAITING -> GIVENBACK - slave requested build record to be rescheduled.
460378
461 >>> bqItem11 = a_build.queueBuild()379 >>> bqItem11 = a_build.queueBuild()
@@ -523,6 +441,7 @@
523 ... 6).queueBuild()441 ... 6).queueBuild()
524 >>> setupBuildQueue(bqItem10, a_builder)442 >>> setupBuildQueue(bqItem10, a_builder)
525 >>> build = bqItem10.specific_job.build443 >>> build = bqItem10.specific_job.build
444 >>> from lp.buildmaster.enums import BuildStatus
526 >>> build.status = BuildStatus.FULLYBUILT445 >>> build.status = BuildStatus.FULLYBUILT
527 >>> bqItem10.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))446 >>> bqItem10.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))
528447
@@ -613,29 +532,6 @@
613 >>> print headers['content-type']532 >>> print headers['content-type']
614 text/plain533 text/plain
615534
616Check the log from the uploader run has made it into the upload directory:
617
618 >>> failed_dir = os.path.join(config.builddmaster.root, 'failed')
619 >>> failed_uploads = sorted(os.listdir(failed_dir))
620 >>> len(failed_uploads)
621 2
622
623 >>> failed_upload = failed_uploads[0]
624 >>> uploader_log = open(os.path.join(failed_dir, failed_upload,
625 ... 'uploader.log'))
626
627 >>> print uploader_log.read()
628 DEBUG ...
629 DEBUG Initialising connection.
630 DEBUG Beginning processing
631 DEBUG Creating directory /var/tmp/builddmaster/accepted
632 DEBUG Creating directory /var/tmp/builddmaster/rejected
633 DEBUG Creating directory /var/tmp/builddmaster/failed
634 ...
635 DEBUG Rolling back any remaining transactions.
636 DEBUG Removing lock file: /var/lock/process-upload-buildd.lock
637 <BLANKLINE>
638
639Remove build upload results root535Remove build upload results root
640536
641 >>> shutil.rmtree(config.builddmaster.root)537 >>> shutil.rmtree(config.builddmaster.root)
@@ -1156,7 +1052,6 @@
1156 >>> build.upload_log = None1052 >>> build.upload_log = None
1157 >>> candidate.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))1053 >>> candidate.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))
1158 >>> a_builder.updateBuild(candidate)1054 >>> a_builder.updateBuild(candidate)
1159 WARNING:slave-scanner:Build ... upload failed.
1160 >>> local_transaction.commit()1055 >>> local_transaction.commit()
11611056
1162 >>> build.archive.private1057 >>> build.archive.private
@@ -1167,6 +1062,7 @@
1167 True1062 True
1168 >>> print lfa.filename1063 >>> print lfa.filename
1169 buildlog_ubuntu-hoary-i386.mozilla-firefox_0.9_BUILDING.txt.gz1064 buildlog_ubuntu-hoary-i386.mozilla-firefox_0.9_BUILDING.txt.gz
1065 >>> candidate.destroySelf()
11701066
1171The attempt to fetch the buildlog from the common librarian will fail1067The attempt to fetch the buildlog from the common librarian will fail
1172since this is a build in a private archive and the buildlog was thus1068since this is a build in a private archive and the buildlog was thus
11731069
=== modified file 'lib/lp/soyuz/model/archive.py'
--- lib/lp/soyuz/model/archive.py 2010-08-31 11:31:04 +0000
+++ lib/lp/soyuz/model/archive.py 2010-09-16 00:48:58 +0000
@@ -1004,10 +1004,9 @@
1004 BuildStatus.FAILEDTOUPLOAD,1004 BuildStatus.FAILEDTOUPLOAD,
1005 BuildStatus.MANUALDEPWAIT,1005 BuildStatus.MANUALDEPWAIT,
1006 ),1006 ),
1007 # The 'pending' count is a list because we may append to it
1008 # later.
1009 'pending': [1007 'pending': [
1010 BuildStatus.BUILDING,1008 BuildStatus.BUILDING,
1009 BuildStatus.UPLOADING,
1011 ],1010 ],
1012 'succeeded': (1011 'succeeded': (
1013 BuildStatus.FULLYBUILT,1012 BuildStatus.FULLYBUILT,
@@ -1023,6 +1022,7 @@
1023 BuildStatus.FAILEDTOUPLOAD,1022 BuildStatus.FAILEDTOUPLOAD,
1024 BuildStatus.MANUALDEPWAIT,1023 BuildStatus.MANUALDEPWAIT,
1025 BuildStatus.BUILDING,1024 BuildStatus.BUILDING,
1025 BuildStatus.UPLOADING,
1026 BuildStatus.FULLYBUILT,1026 BuildStatus.FULLYBUILT,
1027 BuildStatus.SUPERSEDED,1027 BuildStatus.SUPERSEDED,
1028 ],1028 ],
@@ -2007,6 +2007,7 @@
2007 ),2007 ),
2008 'pending': (2008 'pending': (
2009 BuildStatus.BUILDING,2009 BuildStatus.BUILDING,
2010 BuildStatus.UPLOADING,
2010 BuildStatus.NEEDSBUILD,2011 BuildStatus.NEEDSBUILD,
2011 ),2012 ),
2012 'succeeded': (2013 'succeeded': (
20132014
=== modified file 'lib/lp/soyuz/model/binarypackagebuild.py'
--- lib/lp/soyuz/model/binarypackagebuild.py 2010-09-09 17:02:33 +0000
+++ lib/lp/soyuz/model/binarypackagebuild.py 2010-09-16 00:48:58 +0000
@@ -251,6 +251,7 @@
251 """See `IBuild`"""251 """See `IBuild`"""
252 return self.status not in [BuildStatus.NEEDSBUILD,252 return self.status not in [BuildStatus.NEEDSBUILD,
253 BuildStatus.BUILDING,253 BuildStatus.BUILDING,
254 BuildStatus.UPLOADING,
254 BuildStatus.SUPERSEDED]255 BuildStatus.SUPERSEDED]
255256
256 @property257 @property
@@ -671,6 +672,10 @@
671 buildduration = 'not available'672 buildduration = 'not available'
672 buildlog_url = 'not available'673 buildlog_url = 'not available'
673 builder_url = 'not available'674 builder_url = 'not available'
675 elif self.status == BuildStatus.UPLOADING:
676 buildduration = 'uploading'
677 buildlog_url = 'see builder page'
678 builder_url = 'not available'
674 elif self.status == BuildStatus.BUILDING:679 elif self.status == BuildStatus.BUILDING:
675 # build in process680 # build in process
676 buildduration = 'not finished'681 buildduration = 'not finished'
@@ -959,11 +964,14 @@
959 % sqlvalues(BuildStatus.FULLYBUILT))964 % sqlvalues(BuildStatus.FULLYBUILT))
960965
961 # Ordering according status966 # Ordering according status
962 # * NEEDSBUILD & BUILDING by -lastscore967 # * NEEDSBUILD, BUILDING & UPLOADING by -lastscore
963 # * SUPERSEDED & All by -datecreated968 # * SUPERSEDED & All by -datecreated
964 # * FULLYBUILT & FAILURES by -datebuilt969 # * FULLYBUILT & FAILURES by -datebuilt
965 # It should present the builds in a more natural order.970 # It should present the builds in a more natural order.
966 if status in [BuildStatus.NEEDSBUILD, BuildStatus.BUILDING]:971 if status in [
972 BuildStatus.NEEDSBUILD,
973 BuildStatus.BUILDING,
974 BuildStatus.UPLOADING]:
967 orderBy = ["-BuildQueue.lastscore", "BinaryPackageBuild.id"]975 orderBy = ["-BuildQueue.lastscore", "BinaryPackageBuild.id"]
968 clauseTables.append('BuildQueue')976 clauseTables.append('BuildQueue')
969 clauseTables.append('BuildPackageJob')977 clauseTables.append('BuildPackageJob')
@@ -1079,7 +1087,8 @@
1079 BuildStatus.CHROOTWAIT,1087 BuildStatus.CHROOTWAIT,
1080 BuildStatus.FAILEDTOUPLOAD)1088 BuildStatus.FAILEDTOUPLOAD)
1081 needsbuild = collect_builds(BuildStatus.NEEDSBUILD)1089 needsbuild = collect_builds(BuildStatus.NEEDSBUILD)
1082 building = collect_builds(BuildStatus.BUILDING)1090 building = collect_builds(BuildStatus.BUILDING,
1091 BuildStatus.UPLOADING)
1083 successful = collect_builds(BuildStatus.FULLYBUILT)1092 successful = collect_builds(BuildStatus.FULLYBUILT)
10841093
1085 # Note: the BuildStatus DBItems are used here to summarize the1094 # Note: the BuildStatus DBItems are used here to summarize the