Merge lp:~jelmer/launchpad/506256-remove-popen into lp:launchpad
Status: Merged
Approved by: Graham Binns
Approved revision: no longer in the source branch.
Merged at revision: 11566
Proposed branch: lp:~jelmer/launchpad/506256-remove-popen
Merge into: lp:launchpad
Diff against target: 1176 lines (+222/-405), 17 files modified:
  lib/canonical/launchpad/webapp/tales.py (+1/-0)
  lib/lp/archiveuploader/tests/test_uploadprocessor.py (+53/-24)
  lib/lp/archiveuploader/uploadprocessor.py (+25/-9)
  lib/lp/buildmaster/enums.py (+7/-3)
  lib/lp/buildmaster/interfaces/buildfarmjob.py (+4/-0)
  lib/lp/buildmaster/interfaces/packagebuild.py (+2/-20)
  lib/lp/buildmaster/model/buildfarmjob.py (+10/-0)
  lib/lp/buildmaster/model/packagebuild.py (+35/-153)
  lib/lp/buildmaster/tests/test_buildfarmjob.py (+15/-0)
  lib/lp/buildmaster/tests/test_packagebuild.py (+29/-67)
  lib/lp/code/browser/sourcepackagerecipebuild.py (+1/-0)
  lib/lp/code/model/tests/test_sourcepackagerecipebuild.py (+6/-6)
  lib/lp/registry/model/sourcepackage.py (+6/-2)
  lib/lp/soyuz/browser/archive.py (+1/-0)
  lib/lp/soyuz/doc/buildd-slavescanner.txt (+12/-116)
  lib/lp/soyuz/model/archive.py (+3/-2)
  lib/lp/soyuz/model/binarypackagebuild.py (+12/-3)
To merge this branch: bzr merge lp:~jelmer/launchpad/506256-remove-popen
Related bugs:

Reviewer | Review Type | Date Requested | Status
---|---|---|---
Graham Binns (community) | release-critical | | Approve
Brad Crittenden (community) | code | | Approve
Julian Edwards (community) | | | Approve

Review via email: mp+34549@code.launchpad.net
Commit message
Builddmaster now moves build binaries away for later processing rather than invoking process-upload on them directly.
Description of the change
buildd-manager currently invokes the uploadprocessor on binaries that it has fetched from the buildd slaves.
Because this process is synchronous, it blocks the buildd manager from doing other things at the same time, such as scheduling new builds, leaving the buildd slaves idle in the meantime.
This branch changes the buildd manager to move build results out of the way into a queue that can independently be processed by process-upload (extensions for process-upload to support this were landed earlier).
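To illustrate how the two halves of the new flow find each other: the queue directory names are derived from a "build cookie" instead of the old "<build id>-<queue record id>" pair. Below is a minimal, self-contained sketch of the naming and parsing scheme, reconstructed from `getBuildCookie`, `getUploadDirLeaf`, and `parse_build_upload_leaf_name` in the preview diff; the standalone helper functions here are illustrative, not the actual Launchpad code.

```python
import datetime


def make_build_cookie(build_id, job_type_name, build_farm_job_id):
    # Mirrors PackageBuild.getBuildCookie():
    # "<package build id>-<job type name>-<build farm job id>".
    return '%s-%s-%s' % (build_id, job_type_name, build_farm_job_id)


def make_upload_dir_leaf(build_cookie, now=None):
    # Mirrors getUploadDirLeaf(): a timestamp prefix plus the cookie.
    if now is None:
        now = datetime.datetime.now()
    return '%s-%s' % (now.strftime("%Y%m%d-%H%M%S"), build_cookie)


def parse_build_upload_leaf_name(name):
    # Mirrors the new parser: the build farm job id is the last
    # dash-separated component of the leaf directory name.
    job_id_str = name.split("-")[-1]
    try:
        return int(job_id_str)
    except ValueError:
        raise ValueError(name)
```

A leaf such as `20100812-100000-42-PACKAGEBUILD-60` therefore parses to build farm job id 60, which process-upload can resolve back to the build via `IBuildFarmJobSet.getByID()`.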
Tests:
./bin/test lp.archiveuploader
./bin/test lp.buildmaster
There is still a single test testNoFiles() in the archiveuploader that is itself buggy. I'm looking into this at the moment, but wanted to submit the branch for review earlier because of the upcoming PQM closure. The fix for this test shouldn't involve any changes outside of the test itself.
QA:
This branch has been running on dogfood for the past week or so in its current form and has been working well. We've tested with multiple buildds and thrown several hundred builds at it.
Deployment:
A cron job needs to be set up to run the following command regularly:
LPCURRENT/
Jelmer Vernooij (jelmer) wrote:
I need to change the lookups in the archiveuploader to be of binarypackagebuilds or sourcepackagebuilds rather than on packagebuilds in general, since otherwise we don't have a sensible verifySuccessBu
Graham Binns (gmb) wrote:
Abstaining until the tests are fixed.
Jelmer Vernooij (jelmer) wrote:
This turned out to be quite a complex issue; I'm fixing the last test now.
Julian Edwards (julian-edwards) wrote:
My two pence, as I mentioned on Mumble:
1. Please change the added __getitem__ to a getByID() method. We're deprecating __getitem__.
2. Typo in the new enum (s/process/
Otherwise, ROCK!
Brad Crittenden (bac) wrote:
Thanks for fixing the tests Jelmer. They all ran locally for me.
Graham Binns (gmb) wrote:
RC=me on this. Please land this on db-devel rather than devel (I believe there's some argument you can pass to utils/ec2 in order to achieve this). We want to close devel for landings sooner rather than later, so all traffic should go to db-devel unless absolutely necessary.
Preview Diff
=== modified file 'lib/canonical/launchpad/webapp/tales.py'
--- lib/canonical/launchpad/webapp/tales.py	2010-08-27 11:19:54 +0000
+++ lib/canonical/launchpad/webapp/tales.py	2010-09-16 00:48:58 +0000
@@ -995,6 +995,7 @@
         BuildStatus.CHROOTWAIT: {'src': "/@@/build-chrootwait"},
         BuildStatus.SUPERSEDED: {'src': "/@@/build-superseded"},
         BuildStatus.BUILDING: {'src': "/@@/processing"},
+        BuildStatus.UPLOADING: {'src': "/@@/processing"},
         BuildStatus.FAILEDTOUPLOAD: {'src': "/@@/build-failedtoupload"},
         }
 
 
=== modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py'
--- lib/lp/archiveuploader/tests/test_uploadprocessor.py	2010-09-02 16:28:50 +0000
+++ lib/lp/archiveuploader/tests/test_uploadprocessor.py	2010-09-16 00:48:58 +0000
@@ -31,6 +31,7 @@
 from canonical.launchpad.testing.fakepackager import FakePackager
 from canonical.launchpad.webapp.errorlog import ErrorReportingUtility
 from canonical.testing import LaunchpadZopelessLayer
+
 from lp.app.errors import NotFoundError
 from lp.archiveuploader.uploadpolicy import (
     AbstractUploadPolicy,
@@ -41,7 +42,10 @@
     parse_build_upload_leaf_name,
     UploadProcessor,
     )
-from lp.buildmaster.enums import BuildStatus
+from lp.buildmaster.enums import (
+    BuildFarmJobType,
+    BuildStatus,
+    )
 from lp.registry.interfaces.distribution import IDistributionSet
 from lp.registry.interfaces.person import IPersonSet
 from lp.registry.interfaces.pocket import PackagePublishingPocket
@@ -1861,17 +1865,26 @@
         self.uploadprocessor = self.setupBreezyAndGetUploadProcessor()
 
     def testInvalidLeafName(self):
-        upload_dir = self.queueUpload("bar_1.0-1")
-        self.uploadprocessor.processBuildUpload(upload_dir, "bar_1.0-1")
+        # Directories with invalid leaf names should be skipped,
+        # and a warning logged.
+        upload_dir = self.queueUpload("bar_1.0-1", queue_entry="bar")
+        self.uploadprocessor.processBuildUpload(upload_dir, "bar")
         self.assertLogContains('Unable to extract build id from leaf '
-            'name bar_1.0-1, skipping.')
+            'name bar, skipping.')
 
     def testNoBuildEntry(self):
-        upload_dir = self.queueUpload("bar_1.0-1", queue_entry="42-60")
-        self.assertRaises(NotFoundError, self.uploadprocessor.processBuildUpload,
-            upload_dir, "42-60")
+        # Directories with that refer to a nonexistent build
+        # should be skipped and a warning logged.
+        cookie = "%s-%d" % (BuildFarmJobType.PACKAGEBUILD.name, 42)
+        upload_dir = self.queueUpload("bar_1.0-1", queue_entry=cookie)
+        self.uploadprocessor.processBuildUpload(upload_dir, cookie)
+        self.assertLogContains(
+            "Unable to find package build job with id 42. Skipping.")
 
     def testNoFiles(self):
+        # If the upload directory is empty, the upload
+        # will fail.
+
         # Upload a source package
         upload_dir = self.queueUpload("bar_1.0-1")
         self.processUpload(self.uploadprocessor, upload_dir)
@@ -1884,19 +1897,28 @@
             version="1.0-1", name="bar")
         queue_item.setDone()
 
-        build.jobStarted()
-        build.builder = self.factory.makeBuilder()
+        builder = self.factory.makeBuilder()
+        build.buildqueue_record.markAsBuilding(builder)
+        build.builder = build.buildqueue_record.builder
+
+        build.status = BuildStatus.UPLOADING
 
         # Upload and accept a binary for the primary archive source.
         shutil.rmtree(upload_dir)
         self.layer.txn.commit()
-        leaf_name = "%d-%d" % (build.id, 60)
+        leaf_name = build.getUploadDirLeaf(build.getBuildCookie())
         os.mkdir(os.path.join(self.incoming_folder, leaf_name))
         self.options.context = 'buildd'
         self.options.builds = True
-        self.uploadprocessor.processBuildUpload(self.incoming_folder, leaf_name)
+        self.uploadprocessor.processBuildUpload(
+            self.incoming_folder, leaf_name)
+        self.assertEquals(1, len(self.oopses))
         self.layer.txn.commit()
-        self.assertEquals(BuildStatus.FAILEDTOUPLOAD, build.status)
+        self.assertEquals(
+            BuildStatus.FAILEDTOUPLOAD, build.status)
+        self.assertEquals(builder, build.builder)
+        self.assertIsNot(None, build.date_finished)
+        self.assertIsNot(None, build.duration)
         log_contents = build.upload_log.read()
         self.assertTrue('ERROR: Exception while processing upload '
             in log_contents)
@@ -1904,6 +1926,8 @@
             in log_contents)
 
     def testSuccess(self):
+        # Properly uploaded binaries should result in the
+        # build status changing to FULLYBUILT.
         # Upload a source package
         upload_dir = self.queueUpload("bar_1.0-1")
         self.processUpload(self.uploadprocessor, upload_dir)
@@ -1916,21 +1940,27 @@
             version="1.0-1", name="bar")
         queue_item.setDone()
 
-        build.jobStarted()
-        build.builder = self.factory.makeBuilder()
+        build.buildqueue_record.markAsBuilding(self.factory.makeBuilder())
+
+        build.status = BuildStatus.UPLOADING
 
         # Upload and accept a binary for the primary archive source.
         shutil.rmtree(upload_dir)
         self.layer.txn.commit()
-        leaf_name = "%d-%d" % (build.id, 60)
+        leaf_name = build.getUploadDirLeaf(build.getBuildCookie())
         upload_dir = self.queueUpload("bar_1.0-1_binary",
             queue_entry=leaf_name)
         self.options.context = 'buildd'
         self.options.builds = True
-        self.uploadprocessor.processBuildUpload(self.incoming_folder, leaf_name)
+        last_stub_mail_count = len(stub.test_emails)
+        self.uploadprocessor.processBuildUpload(
+            self.incoming_folder, leaf_name)
         self.layer.txn.commit()
+        # No emails are sent on success
+        self.assertEquals(len(stub.test_emails), last_stub_mail_count)
         self.assertEquals(BuildStatus.FULLYBUILT, build.status)
-        log_lines = build.upload_log.read().splitlines()
+        log_contents = build.upload_log.read()
+        log_lines = log_contents.splitlines()
         self.assertTrue(
             'INFO: Processing upload bar_1.0-1_i386.changes' in log_lines)
         self.assertTrue(
@@ -1942,10 +1972,9 @@
     """Tests for parse_build_upload_leaf_name."""
 
     def test_valid(self):
-        self.assertEquals((42, 60), parse_build_upload_leaf_name("42-60"))
-
-    def test_invalid_chars(self):
-        self.assertRaises(ValueError, parse_build_upload_leaf_name, "a42-460")
-
-    def test_no_dash(self):
-        self.assertRaises(ValueError, parse_build_upload_leaf_name, "32")
+        self.assertEquals(
+            60, parse_build_upload_leaf_name("20100812-42-PACKAGEBUILD-60"))
+
+    def test_invalid_jobid(self):
+        self.assertRaises(
+            ValueError, parse_build_upload_leaf_name, "aaba-a42-PACKAGEBUILD-abc")
 
=== modified file 'lib/lp/archiveuploader/uploadprocessor.py'
--- lib/lp/archiveuploader/uploadprocessor.py	2010-08-27 11:19:54 +0000
+++ lib/lp/archiveuploader/uploadprocessor.py	2010-09-16 00:48:58 +0000
@@ -74,14 +74,16 @@
     SOURCE_PACKAGE_RECIPE_UPLOAD_POLICY_NAME,
     UploadPolicyError,
     )
-from lp.buildmaster.enums import BuildStatus
+from lp.buildmaster.enums import (
+    BuildStatus,
+    )
+from lp.buildmaster.interfaces.buildfarmjob import IBuildFarmJobSet
 from lp.registry.interfaces.distribution import IDistributionSet
 from lp.registry.interfaces.person import IPersonSet
 from lp.soyuz.interfaces.archive import (
     IArchiveSet,
     NoSuchPPA,
     )
-from lp.soyuz.interfaces.binarypackagebuild import IBinaryPackageBuildSet
 
 
 __all__ = [
@@ -104,11 +106,11 @@
     """Parse the leaf directory name of a build upload.
 
     :param name: Directory name.
-    :return: Tuple with build id and build queue record id.
+    :return: Tuple with build farm job id.
     """
-    (build_id_str, queue_record_str) = name.split("-", 1)
+    (job_id_str,) = name.split("-")[-1:]
     try:
-        return int(build_id_str), int(queue_record_str)
+        return int(job_id_str)
     except TypeError:
         raise ValueError
 
@@ -191,7 +193,7 @@
                 continue
             if self.builds:
                 # Upload directories contain build results,
-                # directories are named after build ids.
+                # directories are named after job ids.
                 self.processBuildUpload(fsroot, upload)
             else:
                 self.processUpload(fsroot, upload)
@@ -206,13 +208,24 @@
         Build uploads always contain a single package per leaf.
         """
         try:
-            (build_id, build_queue_record_id) = parse_build_upload_leaf_name(
-                upload)
+            job_id = parse_build_upload_leaf_name(upload)
         except ValueError:
            self.log.warn("Unable to extract build id from leaf name %s,"
                " skipping." % upload)
            return
-        build = getUtility(IBinaryPackageBuildSet).getByBuildID(int(build_id))
+        try:
+            buildfarm_job = getUtility(IBuildFarmJobSet).getByID(job_id)
+        except NotFoundError:
+            self.log.warn(
+                "Unable to find package build job with id %d. Skipping." %
+                job_id)
+            return
+        build = buildfarm_job.getSpecificJob()
+        if build.status != BuildStatus.UPLOADING:
+            self.log.warn(
+                "Expected build status to be 'UPLOADING', was %s. Skipping.",
+                build.status.name)
+            return
         self.log.debug("Build %s found" % build.id)
         logger = BufferLogger()
         upload_path = os.path.join(fsroot, upload)
@@ -246,6 +259,9 @@
             build.notify(extra_info="Uploading build %s failed." % upload)
             build.storeUploadLog(logger.buffer.getvalue())
 
+        # Remove BuildQueue record.
+        build.buildqueue_record.destroySelf()
+
     def processUpload(self, fsroot, upload):
         """Process an upload's changes files, and move it to a new directory.
 
=== modified file 'lib/lp/buildmaster/enums.py'
--- lib/lp/buildmaster/enums.py	2010-08-27 15:03:18 +0000
+++ lib/lp/buildmaster/enums.py	2010-09-16 00:48:58 +0000
@@ -97,6 +97,13 @@
         will be notified via process-upload about the reason of the rejection.
         """)
 
+    UPLOADING = DBItem(8, """
+        Uploading build
+
+        The build has completed and is waiting to be processed by the
+        upload processor.
+        """)
+
 
 class BuildFarmJobType(DBEnumeratedType):
     """Soyuz build farm job type.
@@ -128,6 +135,3 @@
 
     Generate translation templates from a bazaar branch.
     """)
-
-
-
 
=== modified file 'lib/lp/buildmaster/interfaces/buildfarmjob.py'
--- lib/lp/buildmaster/interfaces/buildfarmjob.py	2010-08-30 15:00:23 +0000
+++ lib/lp/buildmaster/interfaces/buildfarmjob.py	2010-09-16 00:48:58 +0000
@@ -292,3 +292,7 @@
         that should be included.
         :return: a `ResultSet` representing the requested builds.
         """
+
+    def getByID(job_id):
+        """Look up a `IBuildFarmJob` record by id.
+        """
 
=== modified file 'lib/lp/buildmaster/interfaces/packagebuild.py'
--- lib/lp/buildmaster/interfaces/packagebuild.py	2010-08-27 11:19:54 +0000
+++ lib/lp/buildmaster/interfaces/packagebuild.py	2010-09-16 00:48:58 +0000
@@ -91,13 +91,6 @@
             title=_("Distribution series"), required=True,
             description=_("Shortcut for its distribution series.")))
 
-    def getUploaderCommand(package_build, upload_leaf, uploader_logfilename):
-        """Get the command to run as the uploader.
-
-        :return: A list of command line arguments, beginning with the
-            executable.
-        """
-
     def getUploadDirLeaf(build_cookie, now=None):
         """Return the directory-leaf where files to be uploaded are stored.
 
@@ -106,24 +99,13 @@
             directory name. If not provided, defaults to now.
         """
 
-    def getUploadDir(upload_leaf):
-        """Return the full directory where files to be uploaded are stored.
-
-        :param upload_leaf: The leaf directory name where things will be
-            stored.
+    def getBuildCookie():
+        """Return the build cookie (build id and build queue record id).
         """
 
     def getLogFromSlave(build):
         """Get last buildlog from slave. """
 
-    def getUploadLogContent(root, leaf):
-        """Retrieve the upload log contents.
-
-        :param root: Root directory for the uploads
-        :param leaf: Leaf for this particular upload
-        :return: Contents of log file or message saying no log file was found.
-        """
-
     def estimateDuration():
         """Estimate the build duration."""
 
=== modified file 'lib/lp/buildmaster/model/buildfarmjob.py'
--- lib/lp/buildmaster/model/buildfarmjob.py	2010-08-30 15:00:23 +0000
+++ lib/lp/buildmaster/model/buildfarmjob.py	2010-09-16 00:48:58 +0000
@@ -52,6 +52,7 @@
     IStoreSelector,
     MAIN_STORE,
     )
+from lp.app.errors import NotFoundError
 from lp.buildmaster.enums import BuildStatus
 from lp.buildmaster.enums import BuildFarmJobType
 from lp.buildmaster.interfaces.buildfarmjob import (
@@ -339,6 +340,7 @@
         """See `IBuild`"""
         return self.status not in [BuildStatus.NEEDSBUILD,
                                    BuildStatus.BUILDING,
+                                   BuildStatus.UPLOADING,
                                    BuildStatus.SUPERSEDED]
 
     def getSpecificJob(self):
@@ -431,3 +433,11 @@
         filtered_builds.config(distinct=True)
 
         return filtered_builds
+
+    def getByID(self, job_id):
+        """See `IBuildfarmJobSet`."""
+        job = IStore(BuildFarmJob).find(BuildFarmJob,
+            BuildFarmJob.id == job_id).one()
+        if job is None:
+            raise NotFoundError(job_id)
+        return job
 
=== modified file 'lib/lp/buildmaster/model/packagebuild.py'
--- lib/lp/buildmaster/model/packagebuild.py	2010-09-09 17:02:33 +0000
+++ lib/lp/buildmaster/model/packagebuild.py	2010-09-16 00:48:58 +0000
@@ -14,7 +14,6 @@
 import datetime
 import logging
 import os.path
-import subprocess
 
 from cStringIO import StringIO
 from lazr.delegates import delegates
@@ -36,11 +35,6 @@
 
 from canonical.config import config
 from canonical.database.enumcol import DBEnum
-from canonical.database.sqlbase import (
-    clear_current_connection_cache,
-    cursor,
-    flush_database_updates,
-    )
 from canonical.launchpad.browser.librarian import ProxiedLibraryFileAlias
 from canonical.launchpad.helpers import filenameToContentType
 from canonical.launchpad.interfaces.librarian import ILibraryFileAliasSet
@@ -73,7 +67,6 @@
 
 
 SLAVE_LOG_FILENAME = 'buildlog'
-UPLOAD_LOG_FILENAME = 'uploader.log'
 
 
 class PackageBuild(BuildFarmJobDerived, Storm):
@@ -164,30 +157,11 @@
         timestamp = now.strftime("%Y%m%d-%H%M%S")
         return '%s-%s' % (timestamp, build_cookie)
 
-    def getUploadDir(self, upload_leaf):
-        """See `IPackageBuild`."""
-        return os.path.join(config.builddmaster.root, 'incoming', upload_leaf)
-
-    @staticmethod
-    def getUploaderCommand(package_build, upload_leaf, upload_logfilename):
-        """See `IPackageBuild`."""
-        root = os.path.abspath(config.builddmaster.root)
-        uploader_command = list(config.builddmaster.uploader.split())
-
-        # Add extra arguments for processing a package upload.
-        extra_args = [
-            "--log-file", "%s" % upload_logfilename,
-            "-d", "%s" % package_build.distribution.name,
-            "-s", "%s" % (
-                package_build.distro_series.getSuite(package_build.pocket)),
-            "-b", "%s" % package_build.id,
-            "-J", "%s" % upload_leaf,
-            '--context=%s' % package_build.policy_name,
-            "%s" % root,
-            ]
-
-        uploader_command.extend(extra_args)
-        return uploader_command
+    def getBuildCookie(self):
+        """See `IPackageBuild`."""
+        return '%s-%s-%s' % (
+            self.id, self.build_farm_job.job_type.name,
+            self.build_farm_job.id)
 
     @staticmethod
     def getLogFromSlave(package_build):
@@ -198,26 +172,6 @@
             package_build.buildqueue_record.getLogFileName(),
             package_build.is_private)
 
-    @staticmethod
-    def getUploadLogContent(root, leaf):
-        """Retrieve the upload log contents.
-
-        :param root: Root directory for the uploads
-        :param leaf: Leaf for this particular upload
-        :return: Contents of log file or message saying no log file was found.
-        """
-        # Retrieve log file content.
-        possible_locations = (
-            'failed', 'failed-to-move', 'rejected', 'accepted')
-        for location_dir in possible_locations:
-            log_filepath = os.path.join(root, location_dir, leaf,
-                UPLOAD_LOG_FILENAME)
-            if os.path.exists(log_filepath):
-                with open(log_filepath, 'r') as uploader_log_file:
-                    return uploader_log_file.read()
-        else:
-            return 'Could not find upload log file'
-
     def estimateDuration(self):
         """See `IPackageBuild`."""
         raise NotImplementedError
@@ -346,19 +300,16 @@
         root = os.path.abspath(config.builddmaster.root)
 
         # Create a single directory to store build result files.
-        upload_leaf = self.getUploadDirLeaf(
-            '%s-%s' % (self.id, self.buildqueue_record.id))
-        upload_dir = self.getUploadDir(upload_leaf)
-        logger.debug("Storing build result at '%s'" % upload_dir)
+        upload_leaf = self.getUploadDirLeaf(self.getBuildCookie())
+        grab_dir = os.path.join(root, "grabbing", upload_leaf)
+        logger.debug("Storing build result at '%s'" % grab_dir)
 
         # Build the right UPLOAD_PATH so the distribution and archive
        # can be correctly found during the upload:
        #     <archive_id>/distribution_name
        # for all destination archive types.
-        archive = self.archive
-        distribution_name = self.distribution.name
-        target_path = '%s/%s' % (archive.id, distribution_name)
-        upload_path = os.path.join(upload_dir, target_path)
+        upload_path = os.path.join(
+            grab_dir, str(self.archive.id), self.distribution.name)
         os.makedirs(upload_path)
 
         slave = removeSecurityProxy(self.buildqueue_record.builder.slave)
@@ -379,106 +330,35 @@
             slave_file = slave.getFile(filemap[filename])
             copy_and_close(slave_file, out_file)
 
+        # Store build information, build record was already updated during
+        # the binary upload.
+        self.storeBuildInfo(self, librarian, slave_status)
+
         # We only attempt the upload if we successfully copied all the
         # files from the slave.
         if successful_copy_from_slave:
572 | 385 | uploader_logfilename = os.path.join( | 340 | logger.info( |
573 | 386 | upload_dir, UPLOAD_LOG_FILENAME) | 341 | "Gathered %s %d completely. Moving %s to uploader queue." % ( |
574 | 387 | uploader_command = self.getUploaderCommand( | 342 | self.__class__.__name__, self.id, upload_leaf)) |
575 | 388 | self, upload_leaf, uploader_logfilename) | 343 | target_dir = os.path.join(root, "incoming") |
576 | 389 | logger.debug("Saving uploader log at '%s'" % uploader_logfilename) | 344 | self.status = BuildStatus.UPLOADING |
577 | 390 | 345 | else: | |
578 | 391 | logger.info("Invoking uploader on %s" % root) | 346 | logger.warning( |
579 | 392 | logger.info("%s" % uploader_command) | 347 | "Copy from slave for build %s was unsuccessful.", self.id) |
500 | 393 | |||
501 | 394 | uploader_process = subprocess.Popen( | ||
502 | 395 | uploader_command, stdout=subprocess.PIPE, | ||
503 | 396 | stderr=subprocess.PIPE) | ||
504 | 397 | |||
505 | 398 | # Nothing should be written to the stdout/stderr. | ||
506 | 399 | upload_stdout, upload_stderr = uploader_process.communicate() | ||
507 | 400 | |||
508 | 401 | # XXX cprov 2007-04-17: we do not check uploader_result_code | ||
509 | 402 | # anywhere. We need to find out what will be best strategy | ||
510 | 403 | # when it failed HARD (there is a huge effort in process-upload | ||
511 | 404 | # to not return error, it only happen when the code is broken). | ||
512 | 405 | uploader_result_code = uploader_process.returncode | ||
513 | 406 | logger.info("Uploader returned %d" % uploader_result_code) | ||
514 | 407 | |||
515 | 408 | # Quick and dirty hack to carry on on process-upload failures | ||
516 | 409 | if os.path.exists(upload_dir): | ||
517 | 410 | logger.warning("The upload directory did not get moved.") | ||
518 | 411 | failed_dir = os.path.join(root, "failed-to-move") | ||
519 | 412 | if not os.path.exists(failed_dir): | ||
520 | 413 | os.mkdir(failed_dir) | ||
521 | 414 | os.rename(upload_dir, os.path.join(failed_dir, upload_leaf)) | ||
522 | 415 | |||
523 | 416 | # The famous 'flush_updates + clear_cache' will make visible | ||
524 | 417 | # the DB changes done in process-upload, considering that the | ||
525 | 418 | # transaction was set with ISOLATION_LEVEL_READ_COMMITED | ||
526 | 419 | # isolation level. | ||
527 | 420 | cur = cursor() | ||
528 | 421 | cur.execute('SHOW transaction_isolation') | ||
529 | 422 | isolation_str = cur.fetchone()[0] | ||
530 | 423 | assert isolation_str == 'read committed', ( | ||
531 | 424 | 'BuildMaster/BuilderGroup transaction isolation should be ' | ||
532 | 425 | 'ISOLATION_LEVEL_READ_COMMITTED (not "%s")' % isolation_str) | ||
533 | 426 | |||
534 | 427 | original_slave = self.buildqueue_record.builder.slave | ||
535 | 428 | |||
536 | 429 | # XXX Robert Collins, Celso Providelo 2007-05-26 bug=506256: | ||
537 | 430 | # 'Refreshing' objects procedure is forced on us by using a | ||
538 | 431 | # different process to do the upload, but as that process runs | ||
539 | 432 | # in the same unix account, it is simply double handling and we | ||
540 | 433 | # would be better off to do it within this process. | ||
541 | 434 | flush_database_updates() | ||
542 | 435 | clear_current_connection_cache() | ||
543 | 436 | |||
544 | 437 | # XXX cprov 2007-06-15: Re-issuing removeSecurityProxy is forced on | ||
545 | 438 | # us by sqlobject refreshing the builder object during the | ||
546 | 439 | # transaction cache clearing. Once we sort the previous problem | ||
547 | 440 | # this step should probably not be required anymore. | ||
548 | 441 | self.buildqueue_record.builder.setSlaveForTesting( | ||
549 | 442 | removeSecurityProxy(original_slave)) | ||
550 | 443 | |||
551 | 444 | # Store build information, build record was already updated during | ||
552 | 445 | # the binary upload. | ||
553 | 446 | self.storeBuildInfo(self, librarian, slave_status) | ||
554 | 447 | |||
555 | 448 | # Retrive the up-to-date build record and perform consistency | ||
556 | 449 | # checks. The build record should be updated during the binary | ||
557 | 450 | # upload processing, if it wasn't something is broken and needs | ||
558 | 451 | # admins attention. Even when we have a FULLYBUILT build record, | ||
559 | 452 | # if it is not related with at least one binary, there is also | ||
560 | 453 | # a problem. | ||
561 | 454 | # For both situations we will mark the builder as FAILEDTOUPLOAD | ||
562 | 455 | # and the and update the build details (datebuilt, duration, | ||
563 | 456 | # buildlog, builder) in LP. A build-failure-notification will be | ||
564 | 457 | # sent to the lp-build-admin celebrity and to the sourcepackagerelease | ||
565 | 458 | # uploader about this occurrence. The failure notification will | ||
566 | 459 | # also contain the information required to manually reprocess the | ||
567 | 460 | # binary upload when it was the case. | ||
568 | 461 | if (self.status != BuildStatus.FULLYBUILT or | ||
569 | 462 | not successful_copy_from_slave or | ||
570 | 463 | not self.verifySuccessfulUpload()): | ||
571 | 464 | logger.warning("Build %s upload failed." % self.id) | ||
580 | 465 | self.status = BuildStatus.FAILEDTOUPLOAD | 348 | self.status = BuildStatus.FAILEDTOUPLOAD |
592 | 466 | uploader_log_content = self.getUploadLogContent(root, | 349 | self.notify(extra_info='Copy from slave was unsuccessful.') |
593 | 467 | upload_leaf) | 350 | target_dir = os.path.join(root, "failed") |
594 | 468 | # Store the upload_log_contents in librarian so it can be | 351 | |
595 | 469 | # accessed by anyone with permission to see the build. | 352 | if not os.path.exists(target_dir): |
596 | 470 | self.storeUploadLog(uploader_log_content) | 353 | os.mkdir(target_dir) |
597 | 471 | # Notify the build failure. | 354 | |
598 | 472 | self.notify(extra_info=uploader_log_content) | 355 | # Move the directory used to grab the binaries into |
599 | 473 | else: | 356 | # the incoming directory so the upload processor never |
600 | 474 | logger.info( | 357 | # sees half-finished uploads. |
601 | 475 | "Gathered %s %d completely" % ( | 358 | os.rename(grab_dir, os.path.join(target_dir, upload_leaf)) |
591 | 476 | self.__class__.__name__, self.id)) | ||
602 | 477 | 359 | ||
603 | 478 | # Release the builder for another job. | 360 | # Release the builder for another job. |
604 | 479 | self.buildqueue_record.builder.cleanSlave() | 361 | self.buildqueue_record.builder.cleanSlave() |
605 | 480 | # Remove BuildQueue record. | ||
606 | 481 | self.buildqueue_record.destroySelf() | ||
607 | 482 | 362 | ||
608 | 483 | def _handleStatus_PACKAGEFAIL(self, librarian, slave_status, logger): | 363 | def _handleStatus_PACKAGEFAIL(self, librarian, slave_status, logger): |
609 | 484 | """Handle a package that had failed to build. | 364 | """Handle a package that had failed to build. |
610 | @@ -583,7 +463,9 @@ | |||
611 | 583 | unfinished_states = [ | 463 | unfinished_states = [ |
612 | 584 | BuildStatus.NEEDSBUILD, | 464 | BuildStatus.NEEDSBUILD, |
613 | 585 | BuildStatus.BUILDING, | 465 | BuildStatus.BUILDING, |
615 | 586 | BuildStatus.SUPERSEDED] | 466 | BuildStatus.UPLOADING, |
616 | 467 | BuildStatus.SUPERSEDED, | ||
617 | 468 | ] | ||
618 | 587 | if status is None or status in unfinished_states: | 469 | if status is None or status in unfinished_states: |
619 | 588 | result_set.order_by( | 470 | result_set.order_by( |
620 | 589 | Desc(BuildFarmJob.date_created), BuildFarmJob.id) | 471 | Desc(BuildFarmJob.date_created), BuildFarmJob.id) |
621 | 590 | 472 | ||
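The hunks above capture the heart of this branch: the synchronous `subprocess.Popen` invocation of process-upload (and all the transaction-cache juggling it forced) is replaced by gathering files into a `grabbing/` directory and renaming the whole directory into an `incoming/` (or `failed/`) queue for the upload processor to pick up later. A minimal sketch of that pattern, with hypothetical helper names rather than the actual Launchpad code:

```python
import os


def queue_build_result(root, upload_leaf, copy_ok):
    """Stage a build result for asynchronous processing.

    Files are first gathered under 'grabbing/<leaf>' and only renamed
    into 'incoming/' once complete, so the upload processor never sees
    a half-finished upload; os.rename is atomic on one filesystem.
    Returns the final path of the queued (or failed) upload directory.
    """
    grab_dir = os.path.join(root, "grabbing", upload_leaf)
    os.makedirs(grab_dir)
    # ...binaries would be copied from the slave into grab_dir here...

    # Complete uploads are queued in 'incoming'; failures go to 'failed'.
    target_dir = os.path.join(root, "incoming" if copy_ok else "failed")
    if not os.path.exists(target_dir):
        os.mkdir(target_dir)
    final_path = os.path.join(target_dir, upload_leaf)
    os.rename(grab_dir, final_path)
    return final_path
```

The rename-into-place step is what lets the buildd-manager release the builder immediately instead of blocking on process-upload.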
622 | === modified file 'lib/lp/buildmaster/tests/test_buildfarmjob.py' | |||
623 | --- lib/lp/buildmaster/tests/test_buildfarmjob.py 2010-08-30 15:00:23 +0000 | |||
624 | +++ lib/lp/buildmaster/tests/test_buildfarmjob.py 2010-09-16 00:48:58 +0000 | |||
625 | @@ -22,6 +22,7 @@ | |||
626 | 22 | DatabaseFunctionalLayer, | 22 | DatabaseFunctionalLayer, |
627 | 23 | LaunchpadFunctionalLayer, | 23 | LaunchpadFunctionalLayer, |
628 | 24 | ) | 24 | ) |
629 | 25 | from lp.app.errors import NotFoundError | ||
630 | 25 | from lp.buildmaster.enums import ( | 26 | from lp.buildmaster.enums import ( |
631 | 26 | BuildFarmJobType, | 27 | BuildFarmJobType, |
632 | 27 | BuildStatus, | 28 | BuildStatus, |
633 | @@ -317,3 +318,17 @@ | |||
634 | 317 | result = self.build_farm_job_set.getBuildsForBuilder(self.builder) | 318 | result = self.build_farm_job_set.getBuildsForBuilder(self.builder) |
635 | 318 | 319 | ||
636 | 319 | self.assertEqual([build_1, build_2], list(result)) | 320 | self.assertEqual([build_1, build_2], list(result)) |
637 | 321 | |||
638 | 322 | def test_getByID(self): | ||
639 | 323 | # getByID returns a job by id. | ||
640 | 324 | build_1 = self.makeBuildFarmJob( | ||
641 | 325 | builder=self.builder, | ||
642 | 326 | date_finished=datetime(2008, 10, 10, tzinfo=pytz.UTC)) | ||
643 | 327 | flush_database_updates() | ||
644 | 328 | self.assertEquals( | ||
645 | 329 | build_1, self.build_farm_job_set.getByID(build_1.id)) | ||
646 | 330 | |||
647 | 331 | def test_getByID_nonexistant(self): | ||
648 | 332 | # getByID raises NotFoundError for unknown job ids. | ||
649 | 333 | self.assertRaises(NotFoundError, | ||
650 | 334 | self.build_farm_job_set.getByID, 423432432432) | ||
651 | 320 | 335 | ||
652 | === modified file 'lib/lp/buildmaster/tests/test_packagebuild.py' | |||
653 | --- lib/lp/buildmaster/tests/test_packagebuild.py 2010-09-09 17:02:33 +0000 | |||
654 | +++ lib/lp/buildmaster/tests/test_packagebuild.py 2010-09-16 00:48:58 +0000 | |||
655 | @@ -9,7 +9,7 @@ | |||
656 | 9 | 9 | ||
657 | 10 | from datetime import datetime | 10 | from datetime import datetime |
658 | 11 | import hashlib | 11 | import hashlib |
660 | 12 | import os.path | 12 | import os |
661 | 13 | 13 | ||
662 | 14 | from storm.store import Store | 14 | from storm.store import Store |
663 | 15 | from zope.component import getUtility | 15 | from zope.component import getUtility |
664 | @@ -22,6 +22,9 @@ | |||
665 | 22 | LaunchpadFunctionalLayer, | 22 | LaunchpadFunctionalLayer, |
666 | 23 | LaunchpadZopelessLayer, | 23 | LaunchpadZopelessLayer, |
667 | 24 | ) | 24 | ) |
668 | 25 | from lp.archiveuploader.uploadprocessor import ( | ||
669 | 26 | parse_build_upload_leaf_name, | ||
670 | 27 | ) | ||
671 | 25 | from lp.buildmaster.enums import ( | 28 | from lp.buildmaster.enums import ( |
672 | 26 | BuildFarmJobType, | 29 | BuildFarmJobType, |
673 | 27 | BuildStatus, | 30 | BuildStatus, |
674 | @@ -34,7 +37,6 @@ | |||
675 | 34 | from lp.buildmaster.model.packagebuild import PackageBuild | 37 | from lp.buildmaster.model.packagebuild import PackageBuild |
676 | 35 | from lp.registry.interfaces.pocket import ( | 38 | from lp.registry.interfaces.pocket import ( |
677 | 36 | PackagePublishingPocket, | 39 | PackagePublishingPocket, |
678 | 37 | pocketsuffix, | ||
679 | 38 | ) | 40 | ) |
680 | 39 | from lp.soyuz.tests.soyuzbuilddhelpers import WaitingSlave | 41 | from lp.soyuz.tests.soyuzbuilddhelpers import WaitingSlave |
681 | 40 | from lp.testing import ( | 42 | from lp.testing import ( |
682 | @@ -192,15 +194,14 @@ | |||
683 | 192 | '%s-%s' % (now.strftime("%Y%m%d-%H%M%S"), build_cookie), | 194 | '%s-%s' % (now.strftime("%Y%m%d-%H%M%S"), build_cookie), |
684 | 193 | upload_leaf) | 195 | upload_leaf) |
685 | 194 | 196 | ||
695 | 195 | def test_getUploadDir(self): | 197 | def test_getBuildCookie(self): |
696 | 196 | # getUploadDir is the absolute path to the directory in which things | 198 | # A build cookie is made up of the package build id and record id. |
697 | 197 | # are uploaded to. | 199 | # The uploadprocessor relies on this format. |
698 | 198 | build_cookie = self.factory.getUniqueInteger() | 200 | Store.of(self.package_build).flush() |
699 | 199 | upload_leaf = self.package_build.getUploadDirLeaf(build_cookie) | 201 | cookie = self.package_build.getBuildCookie() |
700 | 200 | upload_dir = self.package_build.getUploadDir(upload_leaf) | 202 | expected_cookie = "%d-PACKAGEBUILD-%d" % ( |
701 | 201 | self.assertEqual( | 203 | self.package_build.id, self.package_build.build_farm_job.id) |
702 | 202 | os.path.join(config.builddmaster.root, 'incoming', upload_leaf), | 204 | self.assertEquals(expected_cookie, cookie) |
694 | 203 | upload_dir) | ||
703 | 204 | 205 | ||
704 | 205 | 206 | ||
705 | 206 | class TestPackageBuildSet(TestPackageBuildBase): | 207 | class TestPackageBuildSet(TestPackageBuildBase): |
706 | @@ -257,57 +258,18 @@ | |||
707 | 257 | super(TestGetUploadMethodsMixin, self).setUp() | 258 | super(TestGetUploadMethodsMixin, self).setUp() |
708 | 258 | self.build = self.makeBuild() | 259 | self.build = self.makeBuild() |
709 | 259 | 260 | ||
753 | 260 | def test_getUploadLogContent_nolog(self): | 261 | def test_getUploadDirLeafCookie_parseable(self): |
754 | 261 | """If there is no log file there, a string explanation is returned. | 262 | # getUploadDirLeaf should return a directory name |
755 | 262 | """ | 263 | # that is parseable by the upload processor. |
756 | 263 | self.useTempDir() | 264 | upload_leaf = self.build.getUploadDirLeaf( |
757 | 264 | self.assertEquals( | 265 | self.build.getBuildCookie()) |
758 | 265 | 'Could not find upload log file', | 266 | job_id = parse_build_upload_leaf_name(upload_leaf) |
759 | 266 | self.build.getUploadLogContent(os.getcwd(), "myleaf")) | 267 | self.assertEqual(job_id, self.build.build_farm_job.id) |
717 | 267 | |||
718 | 268 | def test_getUploadLogContent_only_dir(self): | ||
719 | 269 | """If there is a directory but no log file, expect the error string, | ||
720 | 270 | not an exception.""" | ||
721 | 271 | self.useTempDir() | ||
722 | 272 | os.makedirs("accepted/myleaf") | ||
723 | 273 | self.assertEquals( | ||
724 | 274 | 'Could not find upload log file', | ||
725 | 275 | self.build.getUploadLogContent(os.getcwd(), "myleaf")) | ||
726 | 276 | |||
727 | 277 | def test_getUploadLogContent_readsfile(self): | ||
728 | 278 | """If there is a log file, return its contents.""" | ||
729 | 279 | self.useTempDir() | ||
730 | 280 | os.makedirs("accepted/myleaf") | ||
731 | 281 | with open('accepted/myleaf/uploader.log', 'w') as f: | ||
732 | 282 | f.write('foo') | ||
733 | 283 | self.assertEquals( | ||
734 | 284 | 'foo', self.build.getUploadLogContent(os.getcwd(), "myleaf")) | ||
735 | 285 | |||
736 | 286 | def test_getUploaderCommand(self): | ||
737 | 287 | upload_leaf = self.factory.getUniqueString('upload-leaf') | ||
738 | 288 | config_args = list(config.builddmaster.uploader.split()) | ||
739 | 289 | log_file = self.factory.getUniqueString('logfile') | ||
740 | 290 | config_args.extend( | ||
741 | 291 | ['--log-file', log_file, | ||
742 | 292 | '-d', self.build.distribution.name, | ||
743 | 293 | '-s', (self.build.distro_series.name | ||
744 | 294 | + pocketsuffix[self.build.pocket]), | ||
745 | 295 | '-b', str(self.build.id), | ||
746 | 296 | '-J', upload_leaf, | ||
747 | 297 | '--context=%s' % self.build.policy_name, | ||
748 | 298 | os.path.abspath(config.builddmaster.root), | ||
749 | 299 | ]) | ||
750 | 300 | uploader_command = self.build.getUploaderCommand( | ||
751 | 301 | self.build, upload_leaf, log_file) | ||
752 | 302 | self.assertEqual(config_args, uploader_command) | ||
760 | 303 | 268 | ||
761 | 304 | 269 | ||
762 | 305 | class TestHandleStatusMixin: | 270 | class TestHandleStatusMixin: |
763 | 306 | """Tests for `IPackageBuild`s handleStatus method. | 271 | """Tests for `IPackageBuild`s handleStatus method. |
764 | 307 | 272 | ||
765 | 308 | Note: these tests do *not* test the updating of the build | ||
766 | 309 | status to FULLYBUILT as this happens during the upload which | ||
767 | 310 | is stubbed out by a mock function. | ||
768 | 311 | """ | 273 | """ |
769 | 312 | 274 | ||
770 | 313 | layer = LaunchpadZopelessLayer | 275 | layer = LaunchpadZopelessLayer |
771 | @@ -329,23 +291,23 @@ | |||
772 | 329 | builder.setSlaveForTesting(self.slave) | 291 | builder.setSlaveForTesting(self.slave) |
773 | 330 | 292 | ||
774 | 331 | # We overwrite the buildmaster root to use a temp directory. | 293 | # We overwrite the buildmaster root to use a temp directory. |
776 | 332 | tmp_dir = self.makeTemporaryDirectory() | 294 | self.upload_root = self.makeTemporaryDirectory() |
777 | 333 | tmp_builddmaster_root = """ | 295 | tmp_builddmaster_root = """ |
778 | 334 | [builddmaster] | 296 | [builddmaster] |
779 | 335 | root: %s | 297 | root: %s |
781 | 336 | """ % tmp_dir | 298 | """ % self.upload_root |
782 | 337 | config.push('tmp_builddmaster_root', tmp_builddmaster_root) | 299 | config.push('tmp_builddmaster_root', tmp_builddmaster_root) |
783 | 338 | 300 | ||
784 | 339 | # We stub out our builds getUploaderCommand() method so | 301 | # We stub out our builds getUploaderCommand() method so |
785 | 340 | # we can check whether it was called as well as | 302 | # we can check whether it was called as well as |
786 | 341 | # verifySuccessfulUpload(). | 303 | # verifySuccessfulUpload(). |
787 | 342 | self.fake_getUploaderCommand = FakeMethod( | ||
788 | 343 | result=['echo', 'noop']) | ||
789 | 344 | removeSecurityProxy(self.build).getUploaderCommand = ( | ||
790 | 345 | self.fake_getUploaderCommand) | ||
791 | 346 | removeSecurityProxy(self.build).verifySuccessfulUpload = FakeMethod( | 304 | removeSecurityProxy(self.build).verifySuccessfulUpload = FakeMethod( |
792 | 347 | result=True) | 305 | result=True) |
793 | 348 | 306 | ||
794 | 307 | def assertResultCount(self, count, result): | ||
795 | 308 | self.assertEquals( | ||
796 | 309 | 1, len(os.listdir(os.path.join(self.upload_root, result)))) | ||
797 | 310 | |||
798 | 349 | def test_handleStatus_OK_normal_file(self): | 311 | def test_handleStatus_OK_normal_file(self): |
799 | 350 | # A filemap with plain filenames should not cause a problem. | 312 | # A filemap with plain filenames should not cause a problem. |
800 | 351 | # The call to handleStatus will attempt to get the file from | 313 | # The call to handleStatus will attempt to get the file from |
801 | @@ -354,8 +316,8 @@ | |||
802 | 354 | 'filemap': {'myfile.py': 'test_file_hash'}, | 316 | 'filemap': {'myfile.py': 'test_file_hash'}, |
803 | 355 | }) | 317 | }) |
804 | 356 | 318 | ||
807 | 357 | self.assertEqual(BuildStatus.FULLYBUILT, self.build.status) | 319 | self.assertEqual(BuildStatus.UPLOADING, self.build.status) |
808 | 358 | self.assertEqual(1, self.fake_getUploaderCommand.call_count) | 320 | self.assertResultCount(1, "incoming") |
809 | 359 | 321 | ||
810 | 360 | def test_handleStatus_OK_absolute_filepath(self): | 322 | def test_handleStatus_OK_absolute_filepath(self): |
811 | 361 | # A filemap that tries to write to files outside of | 323 | # A filemap that tries to write to files outside of |
812 | @@ -364,7 +326,7 @@ | |||
813 | 364 | 'filemap': {'/tmp/myfile.py': 'test_file_hash'}, | 326 | 'filemap': {'/tmp/myfile.py': 'test_file_hash'}, |
814 | 365 | }) | 327 | }) |
815 | 366 | self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status) | 328 | self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status) |
817 | 367 | self.assertEqual(0, self.fake_getUploaderCommand.call_count) | 329 | self.assertResultCount(0, "failed") |
818 | 368 | 330 | ||
819 | 369 | def test_handleStatus_OK_relative_filepath(self): | 331 | def test_handleStatus_OK_relative_filepath(self): |
820 | 370 | # A filemap that tries to write to files outside of | 332 | # A filemap that tries to write to files outside of |
821 | @@ -373,7 +335,7 @@ | |||
822 | 373 | 'filemap': {'../myfile.py': 'test_file_hash'}, | 335 | 'filemap': {'../myfile.py': 'test_file_hash'}, |
823 | 374 | }) | 336 | }) |
824 | 375 | self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status) | 337 | self.assertEqual(BuildStatus.FAILEDTOUPLOAD, self.build.status) |
826 | 376 | self.assertEqual(0, self.fake_getUploaderCommand.call_count) | 338 | self.assertResultCount(0, "failed") |
827 | 377 | 339 | ||
828 | 378 | def test_handleStatus_OK_sets_build_log(self): | 340 | def test_handleStatus_OK_sets_build_log(self): |
829 | 379 | # The build log is set during handleStatus. | 341 | # The build log is set during handleStatus. |
830 | 380 | 342 | ||
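The tests above pin down the contract between builddmaster and the upload processor: the build cookie is `<packagebuild.id>-PACKAGEBUILD-<buildfarmjob.id>`, the upload leaf prefixes it with a `YYYYMMDD-HHMMSS` timestamp, and `parse_build_upload_leaf_name` must recover the build farm job id from the leaf. One plausible sketch of that round trip (an assumption for illustration; the real parser lives in `lp.archiveuploader.uploadprocessor`):

```python
def make_upload_leaf(timestamp, cookie):
    """Combine a YYYYMMDD-HHMMSS timestamp with a build cookie."""
    return "%s-%s" % (timestamp, cookie)


def parse_build_upload_leaf_name(leaf):
    """Extract the build farm job id from an upload leaf name.

    The cookie ends in '-PACKAGEBUILD-<job_id>', so the job id is
    the final dash-separated component of the leaf.
    """
    return int(leaf.rsplit("-", 1)[1])
```

Because the cookie embeds the id directly, the upload processor no longer needs `-b`/`-J` command-line arguments handed to it by a child process.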
831 | === modified file 'lib/lp/code/browser/sourcepackagerecipebuild.py' | |||
832 | --- lib/lp/code/browser/sourcepackagerecipebuild.py 2010-08-27 11:19:54 +0000 | |||
833 | +++ lib/lp/code/browser/sourcepackagerecipebuild.py 2010-09-16 00:48:58 +0000 | |||
834 | @@ -82,6 +82,7 @@ | |||
835 | 82 | return 'No suitable builders' | 82 | return 'No suitable builders' |
836 | 83 | return { | 83 | return { |
837 | 84 | BuildStatus.NEEDSBUILD: 'Pending build', | 84 | BuildStatus.NEEDSBUILD: 'Pending build', |
838 | 85 | BuildStatus.UPLOADING: 'Build uploading', | ||
839 | 85 | BuildStatus.FULLYBUILT: 'Successful build', | 86 | BuildStatus.FULLYBUILT: 'Successful build', |
840 | 86 | BuildStatus.MANUALDEPWAIT: ( | 87 | BuildStatus.MANUALDEPWAIT: ( |
841 | 87 | 'Could not build because of missing dependencies'), | 88 | 'Could not build because of missing dependencies'), |
842 | 88 | 89 | ||
843 | === modified file 'lib/lp/code/model/tests/test_sourcepackagerecipebuild.py' | |||
844 | --- lib/lp/code/model/tests/test_sourcepackagerecipebuild.py 2010-09-09 17:02:33 +0000 | |||
845 | +++ lib/lp/code/model/tests/test_sourcepackagerecipebuild.py 2010-09-16 00:48:58 +0000 | |||
846 | @@ -354,15 +354,15 @@ | |||
847 | 354 | queue_record.builder.setSlaveForTesting(slave) | 354 | queue_record.builder.setSlaveForTesting(slave) |
848 | 355 | return build | 355 | return build |
849 | 356 | 356 | ||
851 | 357 | def assertNotifyOnce(status, build): | 357 | def assertNotifyCount(status, build, count): |
852 | 358 | build.handleStatus(status, None, {'filemap': {}}) | 358 | build.handleStatus(status, None, {'filemap': {}}) |
856 | 359 | self.assertEqual(1, len(pop_notifications())) | 359 | self.assertEqual(count, len(pop_notifications())) |
857 | 360 | for status in ['PACKAGEFAIL', 'OK']: | 360 | assertNotifyCount("PACKAGEFAIL", prepare_build(), 1) |
858 | 361 | assertNotifyOnce(status, prepare_build()) | 361 | assertNotifyCount("OK", prepare_build(), 0) |
859 | 362 | build = prepare_build() | 362 | build = prepare_build() |
860 | 363 | removeSecurityProxy(build).verifySuccessfulUpload = FakeMethod( | 363 | removeSecurityProxy(build).verifySuccessfulUpload = FakeMethod( |
863 | 364 | result=True) | 364 | result=True) |
864 | 365 | assertNotifyOnce('OK', prepare_build()) | 365 | assertNotifyCount("OK", prepare_build(), 0) |
865 | 366 | 366 | ||
866 | 367 | 367 | ||
867 | 368 | class MakeSPRecipeBuildMixin: | 368 | class MakeSPRecipeBuildMixin: |
868 | 369 | 369 | ||
869 | === modified file 'lib/lp/registry/model/sourcepackage.py' | |||
870 | --- lib/lp/registry/model/sourcepackage.py 2010-08-27 11:19:54 +0000 | |||
871 | +++ lib/lp/registry/model/sourcepackage.py 2010-09-16 00:48:58 +0000 | |||
872 | @@ -597,11 +597,15 @@ | |||
873 | 597 | % sqlvalues(BuildStatus.FULLYBUILT)) | 597 | % sqlvalues(BuildStatus.FULLYBUILT)) |
874 | 598 | 598 | ||
875 | 599 | # Ordering according status | 599 | # Ordering according status |
877 | 600 | # * NEEDSBUILD & BUILDING by -lastscore | 600 | # * NEEDSBUILD, BUILDING & UPLOADING by -lastscore |
878 | 601 | # * SUPERSEDED by -datecreated | 601 | # * SUPERSEDED by -datecreated |
879 | 602 | # * FULLYBUILT & FAILURES by -datebuilt | 602 | # * FULLYBUILT & FAILURES by -datebuilt |
880 | 603 | # It should present the builds in a more natural order. | 603 | # It should present the builds in a more natural order. |
882 | 604 | if build_state in [BuildStatus.NEEDSBUILD, BuildStatus.BUILDING]: | 604 | if build_state in [ |
883 | 605 | BuildStatus.NEEDSBUILD, | ||
884 | 606 | BuildStatus.BUILDING, | ||
885 | 607 | BuildStatus.UPLOADING, | ||
886 | 608 | ]: | ||
887 | 605 | orderBy = ["-BuildQueue.lastscore"] | 609 | orderBy = ["-BuildQueue.lastscore"] |
888 | 606 | clauseTables.append('BuildPackageJob') | 610 | clauseTables.append('BuildPackageJob') |
889 | 607 | condition_clauses.append( | 611 | condition_clauses.append( |
890 | 608 | 612 | ||
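The comment in this hunk describes three ordering groups, with UPLOADING now joining the queue-scored group. As a plain sketch of that rule (hypothetical names; the real code assembles SQL orderBy clauses and extra clauseTables):

```python
# Unfinished states still carry a BuildQueue record with a score.
UNFINISHED = ("NEEDSBUILD", "BUILDING", "UPLOADING")


def order_by_for_status(build_state):
    """Return the sort key used when listing builds in a given state."""
    if build_state in UNFINISHED:
        return ["-BuildQueue.lastscore"]
    elif build_state == "SUPERSEDED":
        return ["-datecreated"]
    else:
        # FULLYBUILT and the failure states sort by completion time.
        return ["-datebuilt"]
```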
891 | === modified file 'lib/lp/soyuz/browser/archive.py' | |||
892 | --- lib/lp/soyuz/browser/archive.py 2010-08-31 11:31:04 +0000 | |||
893 | +++ lib/lp/soyuz/browser/archive.py 2010-09-16 00:48:58 +0000 | |||
894 | @@ -947,6 +947,7 @@ | |||
895 | 947 | 'NEEDSBUILD': 'Waiting to build', | 947 | 'NEEDSBUILD': 'Waiting to build', |
896 | 948 | 'FAILEDTOBUILD': 'Failed to build:', | 948 | 'FAILEDTOBUILD': 'Failed to build:', |
897 | 949 | 'BUILDING': 'Currently building', | 949 | 'BUILDING': 'Currently building', |
898 | 950 | 'UPLOADING': 'Currently uploading', | ||
899 | 950 | } | 951 | } |
900 | 951 | 952 | ||
901 | 952 | now = datetime.now(tz=pytz.UTC) | 953 | now = datetime.now(tz=pytz.UTC) |
902 | 953 | 954 | ||
903 | === modified file 'lib/lp/soyuz/doc/buildd-slavescanner.txt' | |||
904 | --- lib/lp/soyuz/doc/buildd-slavescanner.txt 2010-08-30 02:07:38 +0000 | |||
905 | +++ lib/lp/soyuz/doc/buildd-slavescanner.txt 2010-09-16 00:48:58 +0000 | |||
906 | @@ -319,87 +319,27 @@ | |||
907 | 319 | This situation happens when the builder has finished the job and is | 319 | This situation happens when the builder has finished the job and is |
908 | 320 | waiting for the master to collect its results. | 320 | waiting for the master to collect its results. |
909 | 321 | 321 | ||
918 | 322 | The build record in question can end up in the following states: | 322 | The build record in question will end up as UPLOADING. |
919 | 323 | 323 | ||
920 | 324 | * FULLYBUILT: when binaries were collected and uploaded correctly; | 324 | === Uploading (UPLOADING) === |
913 | 325 | * FAILEDTOUPLOAD: binaries were collected but the upload was | ||
914 | 326 | rejected/failed. | ||
915 | 327 | |||
916 | 328 | |||
917 | 329 | === Failed to Upload (FAILEDTOUPLOAD) === | ||
921 | 330 | 325 | ||
922 | 331 | >>> bqItem10 = a_build.queueBuild() | 326 | >>> bqItem10 = a_build.queueBuild() |
923 | 332 | >>> setupBuildQueue(bqItem10, a_builder) | 327 | >>> setupBuildQueue(bqItem10, a_builder) |
924 | 333 | >>> last_stub_mail_count = len(stub.test_emails) | ||
925 | 334 | 328 | ||
926 | 335 | Create a mock slave so the builder gets the right responses for this test. | 329 | Create a mock slave so the builder gets the right responses for this test. |
927 | 336 | 330 | ||
928 | 337 | >>> bqItem10.builder.setSlaveForTesting( | 331 | >>> bqItem10.builder.setSlaveForTesting( |
929 | 338 | ... WaitingSlave('BuildStatus.OK')) | 332 | ... WaitingSlave('BuildStatus.OK')) |
930 | 339 | 333 | ||
934 | 340 | If the build record wasn't updated before/during the updateBuild | 334 | The build will progress to the UPLOADING state if the status from |
935 | 341 | (precisely on binary upload time), the build will be considered | 335 | the builder was OK: |
933 | 342 | FAILEDTOUPLOAD: | ||
936 | 343 | 336 | ||
937 | 344 | >>> build = getUtility(IBinaryPackageBuildSet).getByQueueEntry(bqItem10) | 337 | >>> build = getUtility(IBinaryPackageBuildSet).getByQueueEntry(bqItem10) |
938 | 345 | >>> a_builder.updateBuild(bqItem10) | 338 | >>> a_builder.updateBuild(bqItem10) |
939 | 346 | WARNING:slave-scanner:Build ... upload failed. | ||
940 | 347 | >>> build.builder is not None | ||
941 | 348 | True | ||
942 | 349 | >>> build.date_finished is not None | ||
943 | 350 | True | ||
944 | 351 | >>> build.duration is not None | ||
945 | 352 | True | ||
946 | 353 | >>> build.log is not None | ||
947 | 354 | True | ||
948 | 355 | >>> check_mail_sent(last_stub_mail_count) | ||
949 | 356 | True | ||
950 | 357 | >>> build.status.title | 339 | >>> build.status.title |
996 | 358 | 'Failed to upload' | 340 | 'Uploading build' |
997 | 359 | 341 | ||
998 | 360 | Let's check the emails generated by this 'failure' | 342 | >>> bqItem10.destroySelf() |
954 | 361 | (see build-failedtoupload-workflow.txt for more information): | ||
955 | 362 | |||
956 | 363 | >>> from operator import itemgetter | ||
957 | 364 | >>> local_test_emails = stub.test_emails[last_stub_mail_count:] | ||
-    >>> local_test_emails.sort(key=itemgetter(1), reverse=True)
-    >>> for from_addr, to_addrs, raw_msg in local_test_emails:
-    ...     print to_addrs
-    ['mark@example.com']
-    ['foo.bar@canonical.com']
-    ['celso.providelo@canonical.com']
-
-Note that a real failed-to-upload notification contains the respective
-upload log information:
-
-    >>> one_email = stub.test_emails.pop()
-    >>> from_addr, to_addrs, raw_msg = one_email
-    >>> print raw_msg
-    Content-Type: text/plain; charset="utf-8"
-    ...
-    X-Launchpad-Build-State: FAILEDTOUPLOAD
-    ...
-    * Build Log: http://.../...i386.mozilla-firefox_0.9_BUILDING.txt.gz
-    ...
-    Upload log:
-    DEBUG ...
-    DEBUG Initialising connection.
-    ...
-    DEBUG Removing lock file: /var/lock/process-upload-buildd.lock
-    ...
-
-When a failure in processing the generated binaries occurs, the log
-output is both emailed in an immediate notification, and stored in the
-librarian for future reference.
-
-    >>> build.upload_log is not None
-    True
-
-What we can clearly notice is that the log is still containing
-the old build state (BUILDING) in its name. This is a minor problem
-that can be sorted by modifying the execution order of procedures
-inside Buildergroup.buildStatus_OK method.
-
 
 === Successfully collected and uploaded (FULLYBUILT) ===
 
@@ -426,36 +366,14 @@
 
     >>> bqItem10.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))
 
-Now in order to emulate a successfully binary upload we will update
-the build record to FULLYBUILT, as the process-upload would do:
-
-    >>> from lp.buildmaster.enums import BuildStatus
-    >>> build.status = BuildStatus.FULLYBUILT
-
-Now the updateBuild should recognize this build record as a
-Successfully built and uploaded procedure, not sending any
-notification and updating the build information:
-
-    >>> a_builder.updateBuild(bqItem10)
-    >>> build.builder is not None
-    True
-    >>> build.date_finished is not None
-    True
-    >>> build.duration is not None
-    True
-    >>> build.log is not None
-    True
-    >>> build.status.title
-    'Successfully built'
-    >>> check_mail_sent(last_stub_mail_count)
-    False
-
 We do not store any build log information when the binary upload
 processing succeeded.
 
     >>> build.upload_log is None
     True
 
+    >>> bqItem10.destroySelf()
+
 WAITING -> GIVENBACK - slave requested build record to be rescheduled.
 
     >>> bqItem11 = a_build.queueBuild()
@@ -523,6 +441,7 @@
     ...     6).queueBuild()
     >>> setupBuildQueue(bqItem10, a_builder)
     >>> build = bqItem10.specific_job.build
+    >>> from lp.buildmaster.enums import BuildStatus
     >>> build.status = BuildStatus.FULLYBUILT
     >>> bqItem10.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))
 
@@ -613,29 +532,6 @@
     >>> print headers['content-type']
     text/plain
 
-Check the log from the uploader run has made it into the upload directory:
-
-    >>> failed_dir = os.path.join(config.builddmaster.root, 'failed')
-    >>> failed_uploads = sorted(os.listdir(failed_dir))
-    >>> len(failed_uploads)
-    2
-
-    >>> failed_upload = failed_uploads[0]
-    >>> uploader_log = open(os.path.join(failed_dir, failed_upload,
-    ...     'uploader.log'))
-
-    >>> print uploader_log.read()
-    DEBUG ...
-    DEBUG Initialising connection.
-    DEBUG Beginning processing
-    DEBUG Creating directory /var/tmp/builddmaster/accepted
-    DEBUG Creating directory /var/tmp/builddmaster/rejected
-    DEBUG Creating directory /var/tmp/builddmaster/failed
-    ...
-    DEBUG Rolling back any remaining transactions.
-    DEBUG Removing lock file: /var/lock/process-upload-buildd.lock
-    <BLANKLINE>
-
 Remove build upload results root
 
     >>> shutil.rmtree(config.builddmaster.root)
@@ -1156,7 +1052,6 @@
     >>> build.upload_log = None
     >>> candidate.builder.setSlaveForTesting(WaitingSlave('BuildStatus.OK'))
    >>> a_builder.updateBuild(candidate)
-    WARNING:slave-scanner:Build ... upload failed.
     >>> local_transaction.commit()
 
     >>> build.archive.private
@@ -1167,6 +1062,7 @@
     True
     >>> print lfa.filename
     buildlog_ubuntu-hoary-i386.mozilla-firefox_0.9_BUILDING.txt.gz
+    >>> candidate.destroySelf()
 
 The attempt to fetch the buildlog from the common librarian will fail
 since this is a build in a private archive and the buildlog was thus
 
=== modified file 'lib/lp/soyuz/model/archive.py'
--- lib/lp/soyuz/model/archive.py	2010-08-31 11:31:04 +0000
+++ lib/lp/soyuz/model/archive.py	2010-09-16 00:48:58 +0000
@@ -1004,10 +1004,9 @@
                 BuildStatus.FAILEDTOUPLOAD,
                 BuildStatus.MANUALDEPWAIT,
                 ),
-            # The 'pending' count is a list because we may append to it
-            # later.
             'pending': [
                 BuildStatus.BUILDING,
+                BuildStatus.UPLOADING,
                 ],
             'succeeded': (
                 BuildStatus.FULLYBUILT,
@@ -1023,6 +1022,7 @@
             BuildStatus.FAILEDTOUPLOAD,
             BuildStatus.MANUALDEPWAIT,
             BuildStatus.BUILDING,
+            BuildStatus.UPLOADING,
             BuildStatus.FULLYBUILT,
             BuildStatus.SUPERSEDED,
             ],
@@ -2007,6 +2007,7 @@
                 ),
             'pending': (
                 BuildStatus.BUILDING,
+                BuildStatus.UPLOADING,
                 BuildStatus.NEEDSBUILD,
                 ),
             'succeeded': (
 
=== modified file 'lib/lp/soyuz/model/binarypackagebuild.py'
--- lib/lp/soyuz/model/binarypackagebuild.py	2010-09-09 17:02:33 +0000
+++ lib/lp/soyuz/model/binarypackagebuild.py	2010-09-16 00:48:58 +0000
@@ -251,6 +251,7 @@
         """See `IBuild`"""
         return self.status not in [BuildStatus.NEEDSBUILD,
                                    BuildStatus.BUILDING,
+                                   BuildStatus.UPLOADING,
                                    BuildStatus.SUPERSEDED]
 
     @property
@@ -671,6 +672,10 @@
             buildduration = 'not available'
             buildlog_url = 'not available'
             builder_url = 'not available'
+        elif self.status == BuildStatus.UPLOADING:
+            buildduration = 'uploading'
+            buildlog_url = 'see builder page'
+            builder_url = 'not available'
         elif self.status == BuildStatus.BUILDING:
             # build in process
             buildduration = 'not finished'
@@ -959,11 +964,14 @@
             % sqlvalues(BuildStatus.FULLYBUILT))
 
         # Ordering according status
-        # * NEEDSBUILD & BUILDING by -lastscore
+        # * NEEDSBUILD, BUILDING & UPLOADING by -lastscore
         # * SUPERSEDED & All by -datecreated
         # * FULLYBUILT & FAILURES by -datebuilt
         # It should present the builds in a more natural order.
-        if status in [BuildStatus.NEEDSBUILD, BuildStatus.BUILDING]:
+        if status in [
+            BuildStatus.NEEDSBUILD,
+            BuildStatus.BUILDING,
+            BuildStatus.UPLOADING]:
             orderBy = ["-BuildQueue.lastscore", "BinaryPackageBuild.id"]
             clauseTables.append('BuildQueue')
             clauseTables.append('BuildPackageJob')
@@ -1079,7 +1087,8 @@
             BuildStatus.CHROOTWAIT,
             BuildStatus.FAILEDTOUPLOAD)
         needsbuild = collect_builds(BuildStatus.NEEDSBUILD)
-        building = collect_builds(BuildStatus.BUILDING)
+        building = collect_builds(BuildStatus.BUILDING,
+            BuildStatus.UPLOADING)
         successful = collect_builds(BuildStatus.FULLYBUILT)
 
         # Note: the BuildStatus DBItems are used here to summarize the
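Taken together, the hunks above introduce the new UPLOADING build state and fold it into the 'pending' bucket everywhere build statuses are summarized: the buildd-manager now only moves the binaries aside, and a later uploader run finishes the build. A minimal, self-contained sketch of that grouping logic (the enum members mirror the diff, but `STATUS_GROUPS` and `summarize` are illustrative helpers, not Launchpad's actual API):

```python
from enum import Enum


class BuildStatus(Enum):
    # Status names taken from the diff above.
    NEEDSBUILD = "needs build"
    BUILDING = "building"
    UPLOADING = "uploading"  # the new state added by this branch
    FULLYBUILT = "fully built"
    FAILEDTOUPLOAD = "failed to upload"
    SUPERSEDED = "superseded"


# UPLOADING counts as 'pending' alongside NEEDSBUILD and BUILDING:
# the build is not finished until the separate uploader run processes
# the binaries the buildd-manager moved aside.
STATUS_GROUPS = {
    "pending": {BuildStatus.NEEDSBUILD, BuildStatus.BUILDING,
                BuildStatus.UPLOADING},
    "succeeded": {BuildStatus.FULLYBUILT},
    "failed": {BuildStatus.FAILEDTOUPLOAD, BuildStatus.SUPERSEDED},
}


def summarize(statuses):
    """Count builds per summary group, as the archive pages do."""
    summary = {name: 0 for name in STATUS_GROUPS}
    for status in statuses:
        for name, members in STATUS_GROUPS.items():
            if status in members:
                summary[name] += 1
    return summary
```

With this grouping, a build sitting in UPLOADING still shows up as pending on archive pages rather than as finished, which matches the intent of the archive.py hunks.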
Jelmer, this branch looks OK. As I mentioned on IRC, I'm getting test failures. I'm re-running them now and will paste the results when they're done.
Other than that, I found only one typo that needs fixing.
typo: s/nonexisting/nonexistent/
Keeping the branch unapproved until the tests are sorted out.