Merge lp:~twom/launchpad-buildd/initial-docker-build-support into lp:launchpad-buildd
Status: Merged
Merged at revision: 413
Proposed branch: lp:~twom/launchpad-buildd/initial-docker-build-support
Merge into: lp:launchpad-buildd
Diff against target: 1478 lines (+1200/-54), 14 files modified
- debian/changelog (+10/-4)
- lpbuildd/buildd-slave.tac (+2/-0)
- lpbuildd/oci.py (+222/-0)
- lpbuildd/snap.py (+32/-29)
- lpbuildd/target/build_oci.py (+126/-0)
- lpbuildd/target/build_snap.py (+7/-15)
- lpbuildd/target/cli.py (+2/-0)
- lpbuildd/target/lxd.py (+6/-2)
- lpbuildd/target/snapbuildproxy.py (+41/-0)
- lpbuildd/target/tests/test_build_oci.py (+407/-0)
- lpbuildd/target/tests/test_build_snap.py (+1/-1)
- lpbuildd/target/tests/test_lxd.py (+4/-3)
- lpbuildd/tests/oci_tarball.py (+60/-0)
- lpbuildd/tests/test_oci.py (+280/-0)
To merge this branch: bzr merge lp:~twom/launchpad-buildd/initial-docker-build-support
Related bugs: (none)
Reviewers:
- Tom Wardill (community): Approve
- Colin Watson (community): Approve
Review via email: mp+369775@code.launchpad.net
Commit message
Add initial docker build support
Description of the change
Add a builder for Docker, creating an image from a supplied Dockerfile.
Save the image, extract it, and then tar each component layer individually so that the layers can be returned to and cached in Launchpad.
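The save/extract/repack flow described above can be sketched in isolation. This is a minimal, standalone version of the streaming split that the branch's `gatherResults` performs on the output of `docker save`; the function name and framing are illustrative, and the real code additionally copies out digest metadata and registers the produced files with the builder:

```python
import io
import os
import tarfile
import tempfile


def split_saved_image(stream, extract_path):
    """Split a 'docker save'-style tar stream into one .tar.gz per
    layer directory, extracting top-level files (manifest.json, image
    config) directly into extract_path.
    """
    # mode="r|" treats the input as a non-seekable stream, so members
    # must be processed strictly in order.
    tar = tarfile.open(fileobj=stream, mode="r|")
    current_dir = ''
    directory_tar = None
    for member in tar:
        if member.isdir():
            # Directories are just nodes; remember which layer
            # directory we are in and start a fresh gzipped tarball.
            current_dir = member.name
            if directory_tar:
                directory_tar.close()
            directory_tar = tarfile.open(
                os.path.join(extract_path,
                             '{}.tar.gz'.format(member.name)), 'w|gz')
        elif current_dir and member.name.endswith('layer.tar'):
            # The actual layer data: strip the directory prefix and
            # repack it into the per-layer gzip.
            member.name = member.name.split('/')[1]
            directory_tar.addfile(member, tar.extractfile(member))
        elif current_dir and member.name.startswith(current_dir):
            # Other files inside layer directories are not needed.
            continue
        else:
            # Top-level files are extracted as-is.
            tar.extract(member, extract_path)
    if directory_tar:
        directory_tar.close()
```

Splitting per layer (rather than shipping one large `docker save` tarball) is what makes per-layer caching and reuse on the Launchpad side possible.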
- 393. By Tom Wardill: Use shell_escape, compress output images
- 394. By Tom Wardill: Collect .tar.gz
- 395. By Tom Wardill: Use a python script to save docker image to layers
- 396. By Tom Wardill: Move image extraction to gatherResults
- 397. By Tom Wardill: Refactor joint proxy handling
- 398. By Tom Wardill: Fix changelog diff noise
- 399. By Tom Wardill: Do more preprocessing, create digests file
- 400. By Tom Wardill: Include all diffs in digests and waiting files
- 401. By Tom Wardill: Add layer_id to digests now we're using every layer
- 402. By <email address hidden>: Use docker from apt, rather than from snap
- 403. By <email address hidden>: Rename Docker to OCI
- 404. By <email address hidden>: Merge upstream
- 405. By <email address hidden>: Remove conflict marker
- 406. By <email address hidden>: Use proxy, extract sha from right location
- 407. By <email address hidden>: Refactor proxy handling into better named mixin
- 408. By <email address hidden>: Refactor out adding docker proxy
- 409. By <email address hidden>: Include snap build proxy code
- 410. By <email address hidden>: Move to an on-the-fly tarball creation
- 411. By <email address hidden>: Argument is build_file, not file
Tom Wardill (twom) wrote:
Tested with the latest buildbehaviour branch; this should now be functional.
Preview Diff
1 | === modified file 'debian/changelog' | |||
2 | --- debian/changelog 2020-01-06 15:03:55 +0000 | |||
3 | +++ debian/changelog 2020-02-26 10:52:47 +0000 | |||
4 | @@ -1,3 +1,9 @@ | |||
5 | 1 | launchpad-buildd (187) UNRELEASED; urgency=medium | ||
6 | 2 | |||
7 | 3 | * Prototype Docker image building support. | ||
8 | 4 | |||
9 | 5 | -- Colin Watson <cjwatson@ubuntu.com> Wed, 05 Jun 2019 15:06:54 +0100 | ||
10 | 6 | |||
11 | 1 | launchpad-buildd (186) xenial; urgency=medium | 7 | launchpad-buildd (186) xenial; urgency=medium |
12 | 2 | 8 | ||
13 | 3 | * Fix sbuildrc compatibility with xenial's sbuild. | 9 | * Fix sbuildrc compatibility with xenial's sbuild. |
14 | @@ -32,7 +38,7 @@ | |||
15 | 32 | 38 | ||
16 | 33 | [ Michael Hudson-Doyle ] | 39 | [ Michael Hudson-Doyle ] |
17 | 34 | * Do not make assumptions about what device major number the device mapper | 40 | * Do not make assumptions about what device major number the device mapper |
19 | 35 | is using. (LP: #1852518) | 41 | is using. (LP: #1852518) |
20 | 36 | 42 | ||
21 | 37 | -- Colin Watson <cjwatson@ubuntu.com> Tue, 26 Nov 2019 12:22:37 +0000 | 43 | -- Colin Watson <cjwatson@ubuntu.com> Tue, 26 Nov 2019 12:22:37 +0000 |
22 | 38 | 44 | ||
23 | @@ -801,7 +807,7 @@ | |||
24 | 801 | memory at once (LP: #1227086). | 807 | memory at once (LP: #1227086). |
25 | 802 | 808 | ||
26 | 803 | [ Adam Conrad ] | 809 | [ Adam Conrad ] |
28 | 804 | * Tidy up log formatting of the "Already reaped..." message. | 810 | * Tidy up log formatting of the "Already reaped..." message. |
29 | 805 | 811 | ||
30 | 806 | -- Colin Watson <cjwatson@ubuntu.com> Fri, 27 Sep 2013 13:08:59 +0100 | 812 | -- Colin Watson <cjwatson@ubuntu.com> Fri, 27 Sep 2013 13:08:59 +0100 |
31 | 807 | 813 | ||
32 | @@ -986,7 +992,7 @@ | |||
33 | 986 | launchpad-buildd (98) hardy; urgency=low | 992 | launchpad-buildd (98) hardy; urgency=low |
34 | 987 | 993 | ||
35 | 988 | * Add launchpad-buildd dependency on python-apt, as an accomodation for it | 994 | * Add launchpad-buildd dependency on python-apt, as an accomodation for it |
37 | 989 | being only a Recommends but actually required by python-debian. | 995 | being only a Recommends but actually required by python-debian. |
38 | 990 | LP: #890834 | 996 | LP: #890834 |
39 | 991 | 997 | ||
40 | 992 | -- Martin Pool <mbp@canonical.com> Wed, 16 Nov 2011 10:28:48 +1100 | 998 | -- Martin Pool <mbp@canonical.com> Wed, 16 Nov 2011 10:28:48 +1100 |
41 | @@ -1040,7 +1046,7 @@ | |||
42 | 1040 | 1046 | ||
43 | 1041 | launchpad-buildd (90) hardy; urgency=low | 1047 | launchpad-buildd (90) hardy; urgency=low |
44 | 1042 | 1048 | ||
46 | 1043 | * debhelper is a Build-Depends because it is needed to run 'clean'. | 1049 | * debhelper is a Build-Depends because it is needed to run 'clean'. |
47 | 1044 | * python-lpbuildd conflicts with launchpad-buildd << 88. | 1050 | * python-lpbuildd conflicts with launchpad-buildd << 88. |
48 | 1045 | * Add and adjust build-arch, binary-arch, build-indep to match policy. | 1051 | * Add and adjust build-arch, binary-arch, build-indep to match policy. |
49 | 1046 | * Complies with stardards version 3.9.2. | 1052 | * Complies with stardards version 3.9.2. |
50 | 1047 | 1053 | ||
51 | === modified file 'lpbuildd/buildd-slave.tac' | |||
52 | --- lpbuildd/buildd-slave.tac 2019-02-12 10:35:12 +0000 | |||
53 | +++ lpbuildd/buildd-slave.tac 2020-02-26 10:52:47 +0000 | |||
54 | @@ -23,6 +23,7 @@ | |||
55 | 23 | 23 | ||
56 | 24 | from lpbuildd.binarypackage import BinaryPackageBuildManager | 24 | from lpbuildd.binarypackage import BinaryPackageBuildManager |
57 | 25 | from lpbuildd.builder import XMLRPCBuilder | 25 | from lpbuildd.builder import XMLRPCBuilder |
58 | 26 | from lpbuildd.oci import OCIBuildManager | ||
59 | 26 | from lpbuildd.livefs import LiveFilesystemBuildManager | 27 | from lpbuildd.livefs import LiveFilesystemBuildManager |
60 | 27 | from lpbuildd.log import RotatableFileLogObserver | 28 | from lpbuildd.log import RotatableFileLogObserver |
61 | 28 | from lpbuildd.snap import SnapBuildManager | 29 | from lpbuildd.snap import SnapBuildManager |
62 | @@ -45,6 +46,7 @@ | |||
63 | 45 | TranslationTemplatesBuildManager, 'translation-templates') | 46 | TranslationTemplatesBuildManager, 'translation-templates') |
64 | 46 | builder.registerManager(LiveFilesystemBuildManager, "livefs") | 47 | builder.registerManager(LiveFilesystemBuildManager, "livefs") |
65 | 47 | builder.registerManager(SnapBuildManager, "snap") | 48 | builder.registerManager(SnapBuildManager, "snap") |
66 | 49 | builder.registerManager(OCIBuildManager, "oci") | ||
67 | 48 | 50 | ||
68 | 49 | application = service.Application('Builder') | 51 | application = service.Application('Builder') |
69 | 50 | application.addComponent( | 52 | application.addComponent( |
70 | 51 | 53 | ||
71 | === added file 'lpbuildd/oci.py' | |||
72 | --- lpbuildd/oci.py 1970-01-01 00:00:00 +0000 | |||
73 | +++ lpbuildd/oci.py 2020-02-26 10:52:47 +0000 | |||
74 | @@ -0,0 +1,222 @@ | |||
75 | 1 | # Copyright 2019 Canonical Ltd. This software is licensed under the | ||
76 | 2 | # GNU Affero General Public License version 3 (see the file LICENSE). | ||
77 | 3 | |||
78 | 4 | from __future__ import print_function | ||
79 | 5 | |||
80 | 6 | __metaclass__ = type | ||
81 | 7 | |||
82 | 8 | import hashlib | ||
83 | 9 | import json | ||
84 | 10 | import os | ||
85 | 11 | import tarfile | ||
86 | 12 | import tempfile | ||
87 | 13 | |||
88 | 14 | from six.moves.configparser import ( | ||
89 | 15 | NoOptionError, | ||
90 | 16 | NoSectionError, | ||
91 | 17 | ) | ||
92 | 18 | |||
93 | 19 | from lpbuildd.debian import ( | ||
94 | 20 | DebianBuildManager, | ||
95 | 21 | DebianBuildState, | ||
96 | 22 | ) | ||
97 | 23 | from lpbuildd.snap import SnapBuildProxyMixin | ||
98 | 24 | |||
99 | 25 | |||
100 | 26 | RETCODE_SUCCESS = 0 | ||
101 | 27 | RETCODE_FAILURE_INSTALL = 200 | ||
102 | 28 | RETCODE_FAILURE_BUILD = 201 | ||
103 | 29 | |||
104 | 30 | |||
105 | 31 | class OCIBuildState(DebianBuildState): | ||
106 | 32 | BUILD_OCI = "BUILD_OCI" | ||
107 | 33 | |||
108 | 34 | |||
109 | 35 | class OCIBuildManager(SnapBuildProxyMixin, DebianBuildManager): | ||
110 | 36 | """Build an OCI Image.""" | ||
111 | 37 | |||
112 | 38 | backend_name = "lxd" | ||
113 | 39 | initial_build_state = OCIBuildState.BUILD_OCI | ||
114 | 40 | |||
115 | 41 | @property | ||
116 | 42 | def needs_sanitized_logs(self): | ||
117 | 43 | return True | ||
118 | 44 | |||
119 | 45 | def initiate(self, files, chroot, extra_args): | ||
120 | 46 | """Initiate a build with a given set of files and chroot.""" | ||
121 | 47 | self.name = extra_args["name"] | ||
122 | 48 | self.branch = extra_args.get("branch") | ||
123 | 49 | self.git_repository = extra_args.get("git_repository") | ||
124 | 50 | self.git_path = extra_args.get("git_path") | ||
125 | 51 | self.build_file = extra_args.get("build_file") | ||
126 | 52 | self.proxy_url = extra_args.get("proxy_url") | ||
127 | 53 | self.revocation_endpoint = extra_args.get("revocation_endpoint") | ||
128 | 54 | self.proxy_service = None | ||
129 | 55 | |||
130 | 56 | super(OCIBuildManager, self).initiate(files, chroot, extra_args) | ||
131 | 57 | |||
132 | 58 | def doRunBuild(self): | ||
133 | 59 | """Run the process to build the OCI image.""" | ||
134 | 60 | args = [] | ||
135 | 61 | args.extend(self.startProxy()) | ||
136 | 62 | if self.revocation_endpoint: | ||
137 | 63 | args.extend(["--revocation-endpoint", self.revocation_endpoint]) | ||
138 | 64 | if self.branch is not None: | ||
139 | 65 | args.extend(["--branch", self.branch]) | ||
140 | 66 | if self.git_repository is not None: | ||
141 | 67 | args.extend(["--git-repository", self.git_repository]) | ||
142 | 68 | if self.git_path is not None: | ||
143 | 69 | args.extend(["--git-path", self.git_path]) | ||
144 | 70 | if self.build_file is not None: | ||
145 | 71 | args.extend(["--build-file", self.build_file]) | ||
146 | 72 | try: | ||
147 | 73 | snap_store_proxy_url = self._builder._config.get( | ||
148 | 74 | "proxy", "snapstore") | ||
149 | 75 | args.extend(["--snap-store-proxy-url", snap_store_proxy_url]) | ||
150 | 76 | except (NoSectionError, NoOptionError): | ||
151 | 77 | pass | ||
152 | 78 | args.append(self.name) | ||
153 | 79 | self.runTargetSubProcess("build-oci", *args) | ||
154 | 80 | |||
155 | 81 | def iterate_BUILD_OCI(self, retcode): | ||
156 | 82 | """Finished building the OCI image.""" | ||
157 | 83 | self.stopProxy() | ||
158 | 84 | self.revokeProxyToken() | ||
159 | 85 | if retcode == RETCODE_SUCCESS: | ||
160 | 86 | print("Returning build status: OK") | ||
161 | 87 | return self.deferGatherResults() | ||
162 | 88 | elif (retcode >= RETCODE_FAILURE_INSTALL and | ||
163 | 89 | retcode <= RETCODE_FAILURE_BUILD): | ||
164 | 90 | if not self.alreadyfailed: | ||
165 | 91 | self._builder.buildFail() | ||
166 | 92 | print("Returning build status: Build failed.") | ||
167 | 93 | self.alreadyfailed = True | ||
168 | 94 | else: | ||
169 | 95 | if not self.alreadyfailed: | ||
170 | 96 | self._builder.builderFail() | ||
171 | 97 | print("Returning build status: Builder failed.") | ||
172 | 98 | self.alreadyfailed = True | ||
173 | 99 | self.doReapProcesses(self._state) | ||
174 | 100 | |||
175 | 101 | def iterateReap_BUILD_OCI(self, retcode): | ||
176 | 102 | """Finished reaping after building the OCI image.""" | ||
177 | 103 | self._state = DebianBuildState.UMOUNT | ||
178 | 104 | self.doUnmounting() | ||
179 | 105 | |||
180 | 106 | def _calculateLayerSha(self, layer_path): | ||
181 | 107 | with open(layer_path, 'rb') as layer_tar: | ||
182 | 108 | sha256_hash = hashlib.sha256() | ||
183 | 109 | for byte_block in iter(lambda: layer_tar.read(4096), b""): | ||
184 | 110 | sha256_hash.update(byte_block) | ||
185 | 111 | digest = sha256_hash.hexdigest() | ||
186 | 112 | return digest | ||
187 | 113 | |||
188 | 114 | def _gatherManifestSection(self, section, extract_path, sha_directory): | ||
189 | 115 | config_file_path = os.path.join(extract_path, section["Config"]) | ||
190 | 116 | self._builder.addWaitingFile(config_file_path) | ||
191 | 117 | with open(config_file_path, 'r') as config_fp: | ||
192 | 118 | config = json.load(config_fp) | ||
193 | 119 | diff_ids = config["rootfs"]["diff_ids"] | ||
194 | 120 | digest_diff_map = {} | ||
195 | 121 | for diff_id, layer_id in zip(diff_ids, section['Layers']): | ||
196 | 122 | layer_id = layer_id.split('/')[0] | ||
197 | 123 | diff_file = os.path.join(sha_directory, diff_id.split(':')[1]) | ||
198 | 124 | layer_path = os.path.join( | ||
199 | 125 | extract_path, "{}.tar.gz".format(layer_id)) | ||
200 | 126 | self._builder.addWaitingFile(layer_path) | ||
201 | 127 | # If we have a mapping between diff and existing digest, | ||
202 | 128 | # this means this layer has been pulled from a remote. | ||
203 | 129 | # We should maintain the same digest to achieve layer reuse | ||
204 | 130 | if os.path.exists(diff_file): | ||
205 | 131 | with open(diff_file, 'r') as diff_fp: | ||
206 | 132 | diff = json.load(diff_fp) | ||
207 | 133 | # We should be able to just take the first occurrence, | ||
208 | 134 | # as that will be the 'most parent' image | ||
209 | 135 | digest = diff[0]["Digest"] | ||
210 | 136 | source = diff[0]["SourceRepository"] | ||
211 | 137 | # If the layer has been built locally, we need to generate the | ||
212 | 138 | # digest and then set the source to empty | ||
213 | 139 | else: | ||
214 | 140 | source = "" | ||
215 | 141 | digest = self._calculateLayerSha(layer_path) | ||
216 | 142 | digest_diff_map[diff_id] = { | ||
217 | 143 | "digest": digest, | ||
218 | 144 | "source": source, | ||
219 | 145 | "layer_id": layer_id | ||
220 | 146 | } | ||
221 | 147 | |||
222 | 148 | return digest_diff_map | ||
223 | 149 | |||
224 | 150 | def gatherResults(self): | ||
225 | 151 | """Gather the results of the build and add them to the file cache.""" | ||
226 | 152 | extract_path = tempfile.mkdtemp(prefix=self.name) | ||
227 | 153 | proc = self.backend.run( | ||
228 | 154 | ['docker', 'save', self.name], | ||
229 | 155 | get_output=True, universal_newlines=False, return_process=True) | ||
230 | 156 | try: | ||
231 | 157 | tar = tarfile.open(fileobj=proc.stdout, mode="r|") | ||
232 | 158 | except Exception as e: | ||
233 | 159 | print(e) | ||
234 | 160 | |||
235 | 161 | current_dir = '' | ||
236 | 162 | directory_tar = None | ||
237 | 163 | try: | ||
238 | 164 | # The tarfile is a stream and must be processed in order | ||
239 | 165 | for file in tar: | ||
240 | 166 | # Directories are just nodes, you can't extract the children | ||
241 | 167 | # directly, so keep track of what dir we're in. | ||
242 | 168 | if file.isdir(): | ||
243 | 169 | current_dir = file.name | ||
244 | 170 | if directory_tar: | ||
245 | 171 | # Close the old directory if we have one | ||
246 | 172 | directory_tar.close() | ||
247 | 173 | # We're going to add the layer.tar to a gzip | ||
248 | 174 | directory_tar = tarfile.open( | ||
249 | 175 | os.path.join( | ||
250 | 176 | extract_path, '{}.tar.gz'.format(file.name)), | ||
251 | 177 | 'w|gz') | ||
252 | 178 | if current_dir and file.name.endswith('layer.tar'): | ||
253 | 179 | # This is the actual layer data, we want to add it to | ||
254 | 180 | # the directory gzip | ||
255 | 181 | file.name = file.name.split('/')[1] | ||
256 | 182 | directory_tar.addfile(file, tar.extractfile(file)) | ||
257 | 183 | elif current_dir and file.name.startswith(current_dir): | ||
258 | 184 | # Other files that are in the layer directories, | ||
259 | 185 | # we don't care about | ||
260 | 186 | continue | ||
261 | 187 | else: | ||
262 | 188 | # If it's not in a directory, we need that | ||
263 | 189 | tar.extract(file, extract_path) | ||
264 | 190 | except Exception as e: | ||
265 | 191 | print(e) | ||
266 | 192 | |||
267 | 193 | # We need these mapping files | ||
268 | 194 | sha_directory = tempfile.mkdtemp() | ||
269 | 195 | sha_path = ('/var/lib/docker/image/' | ||
270 | 196 | 'vfs/distribution/v2metadata-by-diffid/sha256/') | ||
271 | 197 | sha_files = [x for x in self.backend.listdir(sha_path) | ||
272 | 198 | if not x.startswith('.')] | ||
273 | 199 | for file in sha_files: | ||
274 | 200 | self.backend.copy_out( | ||
275 | 201 | os.path.join(sha_path, file), | ||
276 | 202 | os.path.join(sha_directory, file) | ||
277 | 203 | ) | ||
278 | 204 | |||
279 | 205 | # Parse the manifest for the other files we need | ||
280 | 206 | manifest_path = os.path.join(extract_path, 'manifest.json') | ||
281 | 207 | self._builder.addWaitingFile(manifest_path) | ||
282 | 208 | with open(manifest_path) as manifest_fp: | ||
283 | 209 | manifest = json.load(manifest_fp) | ||
284 | 210 | |||
285 | 211 | digest_maps = [] | ||
286 | 212 | try: | ||
287 | 213 | for section in manifest: | ||
288 | 214 | digest_maps.append( | ||
289 | 215 | self._gatherManifestSection(section, extract_path, | ||
290 | 216 | sha_directory)) | ||
291 | 217 | digest_map_file = os.path.join(extract_path, 'digests.json') | ||
292 | 218 | with open(digest_map_file, 'w') as digest_map_fp: | ||
293 | 219 | json.dump(digest_maps, digest_map_fp) | ||
294 | 220 | self._builder.addWaitingFile(digest_map_file) | ||
295 | 221 | except Exception as e: | ||
296 | 222 | print(e) | ||
297 | 0 | 223 | ||
298 | === modified file 'lpbuildd/snap.py' | |||
299 | --- lpbuildd/snap.py 2019-10-30 12:31:53 +0000 | |||
300 | +++ lpbuildd/snap.py 2020-02-26 10:52:47 +0000 | |||
301 | @@ -239,35 +239,7 @@ | |||
302 | 239 | BUILD_SNAP = "BUILD_SNAP" | 239 | BUILD_SNAP = "BUILD_SNAP" |
303 | 240 | 240 | ||
304 | 241 | 241 | ||
334 | 242 | class SnapBuildManager(DebianBuildManager): | 242 | class SnapBuildProxyMixin(): |
306 | 243 | """Build a snap.""" | ||
307 | 244 | |||
308 | 245 | backend_name = "lxd" | ||
309 | 246 | initial_build_state = SnapBuildState.BUILD_SNAP | ||
310 | 247 | |||
311 | 248 | @property | ||
312 | 249 | def needs_sanitized_logs(self): | ||
313 | 250 | return True | ||
314 | 251 | |||
315 | 252 | def initiate(self, files, chroot, extra_args): | ||
316 | 253 | """Initiate a build with a given set of files and chroot.""" | ||
317 | 254 | self.name = extra_args["name"] | ||
318 | 255 | self.channels = extra_args.get("channels", {}) | ||
319 | 256 | self.build_request_id = extra_args.get("build_request_id") | ||
320 | 257 | self.build_request_timestamp = extra_args.get( | ||
321 | 258 | "build_request_timestamp") | ||
322 | 259 | self.build_url = extra_args.get("build_url") | ||
323 | 260 | self.branch = extra_args.get("branch") | ||
324 | 261 | self.git_repository = extra_args.get("git_repository") | ||
325 | 262 | self.git_path = extra_args.get("git_path") | ||
326 | 263 | self.proxy_url = extra_args.get("proxy_url") | ||
327 | 264 | self.revocation_endpoint = extra_args.get("revocation_endpoint") | ||
328 | 265 | self.build_source_tarball = extra_args.get( | ||
329 | 266 | "build_source_tarball", False) | ||
330 | 267 | self.private = extra_args.get("private", False) | ||
331 | 268 | self.proxy_service = None | ||
332 | 269 | |||
333 | 270 | super(SnapBuildManager, self).initiate(files, chroot, extra_args) | ||
335 | 271 | 243 | ||
336 | 272 | def startProxy(self): | 244 | def startProxy(self): |
337 | 273 | """Start the local snap proxy, if necessary.""" | 245 | """Start the local snap proxy, if necessary.""" |
338 | @@ -308,6 +280,37 @@ | |||
339 | 308 | self._builder.log( | 280 | self._builder.log( |
340 | 309 | "Unable to revoke token for %s: %s" % (url.username, e)) | 281 | "Unable to revoke token for %s: %s" % (url.username, e)) |
341 | 310 | 282 | ||
342 | 283 | |||
343 | 284 | class SnapBuildManager(SnapBuildProxyMixin, DebianBuildManager): | ||
344 | 285 | """Build a snap.""" | ||
345 | 286 | |||
346 | 287 | backend_name = "lxd" | ||
347 | 288 | initial_build_state = SnapBuildState.BUILD_SNAP | ||
348 | 289 | |||
349 | 290 | @property | ||
350 | 291 | def needs_sanitized_logs(self): | ||
351 | 292 | return True | ||
352 | 293 | |||
353 | 294 | def initiate(self, files, chroot, extra_args): | ||
354 | 295 | """Initiate a build with a given set of files and chroot.""" | ||
355 | 296 | self.name = extra_args["name"] | ||
356 | 297 | self.channels = extra_args.get("channels", {}) | ||
357 | 298 | self.build_request_id = extra_args.get("build_request_id") | ||
358 | 299 | self.build_request_timestamp = extra_args.get( | ||
359 | 300 | "build_request_timestamp") | ||
360 | 301 | self.build_url = extra_args.get("build_url") | ||
361 | 302 | self.branch = extra_args.get("branch") | ||
362 | 303 | self.git_repository = extra_args.get("git_repository") | ||
363 | 304 | self.git_path = extra_args.get("git_path") | ||
364 | 305 | self.proxy_url = extra_args.get("proxy_url") | ||
365 | 306 | self.revocation_endpoint = extra_args.get("revocation_endpoint") | ||
366 | 307 | self.build_source_tarball = extra_args.get( | ||
367 | 308 | "build_source_tarball", False) | ||
368 | 309 | self.private = extra_args.get("private", False) | ||
369 | 310 | self.proxy_service = None | ||
370 | 311 | |||
371 | 312 | super(SnapBuildManager, self).initiate(files, chroot, extra_args) | ||
372 | 313 | |||
373 | 311 | def status(self): | 314 | def status(self): |
374 | 312 | status_path = get_build_path(self.home, self._buildid, "status") | 315 | status_path = get_build_path(self.home, self._buildid, "status") |
375 | 313 | try: | 316 | try: |
376 | 314 | 317 | ||
377 | === added file 'lpbuildd/target/build_oci.py' | |||
378 | --- lpbuildd/target/build_oci.py 1970-01-01 00:00:00 +0000 | |||
379 | +++ lpbuildd/target/build_oci.py 2020-02-26 10:52:47 +0000 | |||
380 | @@ -0,0 +1,126 @@ | |||
381 | 1 | # Copyright 2019 Canonical Ltd. This software is licensed under the | ||
382 | 2 | # GNU Affero General Public License version 3 (see the file LICENSE). | ||
383 | 3 | |||
384 | 4 | from __future__ import print_function | ||
385 | 5 | |||
386 | 6 | __metaclass__ = type | ||
387 | 7 | |||
388 | 8 | from collections import OrderedDict | ||
389 | 9 | import logging | ||
390 | 10 | import os.path | ||
391 | 11 | import sys | ||
392 | 12 | import tempfile | ||
393 | 13 | from textwrap import dedent | ||
394 | 14 | |||
395 | 15 | from lpbuildd.target.operation import Operation | ||
396 | 16 | from lpbuildd.target.snapbuildproxy import SnapBuildProxyOperationMixin | ||
397 | 17 | from lpbuildd.target.snapstore import SnapStoreOperationMixin | ||
398 | 18 | from lpbuildd.target.vcs import VCSOperationMixin | ||
399 | 19 | |||
400 | 20 | |||
401 | 21 | RETCODE_FAILURE_INSTALL = 200 | ||
402 | 22 | RETCODE_FAILURE_BUILD = 201 | ||
403 | 23 | |||
404 | 24 | |||
405 | 25 | logger = logging.getLogger(__name__) | ||
406 | 26 | |||
407 | 27 | |||
408 | 28 | class BuildOCI(SnapBuildProxyOperationMixin, VCSOperationMixin, | ||
409 | 29 | SnapStoreOperationMixin, Operation): | ||
410 | 30 | |||
411 | 31 | description = "Build an OCI image." | ||
412 | 32 | |||
413 | 33 | @classmethod | ||
414 | 34 | def add_arguments(cls, parser): | ||
415 | 35 | super(BuildOCI, cls).add_arguments(parser) | ||
416 | 36 | parser.add_argument( | ||
417 | 37 | "--build-file", help="path to Dockerfile in branch") | ||
418 | 38 | parser.add_argument("name", help="name of image to build") | ||
419 | 39 | |||
420 | 40 | def __init__(self, args, parser): | ||
421 | 41 | super(BuildOCI, self).__init__(args, parser) | ||
422 | 42 | self.bin = os.path.dirname(sys.argv[0]) | ||
423 | 43 | |||
424 | 44 | def _add_docker_engine_proxy_settings(self): | ||
425 | 45 | """Add systemd file for docker proxy settings.""" | ||
426 | 46 | # Create containing directory for systemd overrides | ||
427 | 47 | self.backend.run( | ||
428 | 48 | ["mkdir", "-p", "/etc/systemd/system/docker.service.d"]) | ||
429 | 49 | # we need both http_proxy and https_proxy. The contents of the files | ||
430 | 50 | # are otherwise identical | ||
431 | 51 | for setting in ['http_proxy', 'https_proxy']: | ||
432 | 52 | contents = dedent("""[Service] | ||
433 | 53 | Environment="{}={}" | ||
434 | 54 | """.format(setting.upper(), self.args.proxy_url)) | ||
435 | 55 | file_path = "/etc/systemd/system/docker.service.d/{}.conf".format( | ||
436 | 56 | setting) | ||
437 | 57 | with tempfile.NamedTemporaryFile(mode="w+") as systemd_file: | ||
438 | 58 | systemd_file.write(contents) | ||
439 | 59 | systemd_file.flush() | ||
440 | 60 | self.backend.copy_in(systemd_file.name, file_path) | ||
441 | 61 | |||
442 | 62 | def run_build_command(self, args, env=None, **kwargs): | ||
443 | 63 | """Run a build command in the target. | ||
444 | 64 | |||
445 | 65 | :param args: the command and arguments to run. | ||
446 | 66 | :param env: dictionary of additional environment variables to set. | ||
447 | 67 | :param kwargs: any other keyword arguments to pass to Backend.run. | ||
448 | 68 | """ | ||
449 | 69 | full_env = OrderedDict() | ||
450 | 70 | full_env["LANG"] = "C.UTF-8" | ||
451 | 71 | full_env["SHELL"] = "/bin/sh" | ||
452 | 72 | if env: | ||
453 | 73 | full_env.update(env) | ||
454 | 74 | return self.backend.run(args, env=full_env, **kwargs) | ||
455 | 75 | |||
456 | 76 | def install(self): | ||
457 | 77 | logger.info("Running install phase...") | ||
458 | 78 | deps = [] | ||
459 | 79 | if self.args.proxy_url: | ||
460 | 80 | deps.extend(self.proxy_deps) | ||
461 | 81 | self.install_git_proxy() | ||
462 | 82 | # Add any proxy settings that are needed | ||
463 | 83 | self._add_docker_engine_proxy_settings() | ||
464 | 84 | deps.extend(self.vcs_deps) | ||
465 | 85 | deps.extend(["docker.io"]) | ||
466 | 86 | self.backend.run(["apt-get", "-y", "install"] + deps) | ||
467 | 87 | if self.args.backend in ("lxd", "fake"): | ||
468 | 88 | self.snap_store_set_proxy() | ||
469 | 89 | self.backend.run(["systemctl", "restart", "docker"]) | ||
470 | 90 | # The docker snap can't see /build, so we have to do our work under | ||
471 | 91 | # /home/buildd instead. Make sure it exists. | ||
472 | 92 | self.backend.run(["mkdir", "-p", "/home/buildd"]) | ||
473 | 93 | |||
474 | 94 | def repo(self): | ||
475 | 95 | """Collect git or bzr branch.""" | ||
476 | 96 | logger.info("Running repo phase...") | ||
477 | 97 | env = self.build_proxy_environment(proxy_url=self.args.proxy_url) | ||
478 | 98 | self.vcs_fetch(self.args.name, cwd="/home/buildd", env=env) | ||
479 | 99 | |||
480 | 100 | def build(self): | ||
481 | 101 | logger.info("Running build phase...") | ||
482 | 102 | args = ["docker", "build", "--no-cache"] | ||
483 | 103 | if self.args.proxy_url: | ||
484 | 104 | for var in ("http_proxy", "https_proxy"): | ||
485 | 105 | args.extend( | ||
486 | 106 | ["--build-arg", "{}={}".format(var, self.args.proxy_url)]) | ||
487 | 107 | args.extend(["--tag", self.args.name]) | ||
488 | 108 | if self.args.build_file is not None: | ||
489 | 109 | args.extend(["--file", self.args.build_file]) | ||
490 | 110 | buildd_path = os.path.join("/home/buildd", self.args.name) | ||
491 | 111 | args.append(buildd_path) | ||
492 | 112 | self.run_build_command(args) | ||
493 | 113 | |||
494 | 114 | def run(self): | ||
495 | 115 | try: | ||
496 | 116 | self.install() | ||
497 | 117 | except Exception: | ||
498 | 118 | logger.exception('Install failed') | ||
499 | 119 | return RETCODE_FAILURE_INSTALL | ||
500 | 120 | try: | ||
501 | 121 | self.repo() | ||
502 | 122 | self.build() | ||
503 | 123 | except Exception: | ||
504 | 124 | logger.exception('Build failed') | ||
505 | 125 | return RETCODE_FAILURE_BUILD | ||
506 | 126 | return 0 | ||
507 | 0 | 127 | ||
508 | === modified file 'lpbuildd/target/build_snap.py' | |||
509 | --- lpbuildd/target/build_snap.py 2019-10-30 12:31:53 +0000 | |||
510 | +++ lpbuildd/target/build_snap.py 2020-02-26 10:52:47 +0000 | |||
511 | @@ -17,6 +17,7 @@ | |||
512 | 17 | from six.moves.urllib.parse import urlparse | 17 | from six.moves.urllib.parse import urlparse |
513 | 18 | 18 | ||
514 | 19 | from lpbuildd.target.operation import Operation | 19 | from lpbuildd.target.operation import Operation |
515 | 20 | from lpbuildd.target.snapbuildproxy import SnapBuildProxyOperationMixin | ||
516 | 20 | from lpbuildd.target.snapstore import SnapStoreOperationMixin | 21 | from lpbuildd.target.snapstore import SnapStoreOperationMixin |
517 | 21 | from lpbuildd.target.vcs import VCSOperationMixin | 22 | from lpbuildd.target.vcs import VCSOperationMixin |
518 | 22 | 23 | ||
519 | @@ -46,7 +47,8 @@ | |||
520 | 46 | getattr(namespace, self.dest)[snap] = channel | 47 | getattr(namespace, self.dest)[snap] = channel |
521 | 47 | 48 | ||
522 | 48 | 49 | ||
524 | 49 | class BuildSnap(VCSOperationMixin, SnapStoreOperationMixin, Operation): | 50 | class BuildSnap(SnapBuildProxyOperationMixin, VCSOperationMixin, |
525 | 51 | SnapStoreOperationMixin, Operation): | ||
526 | 50 | 52 | ||
527 | 51 | description = "Build a snap." | 53 | description = "Build a snap." |
528 | 52 | 54 | ||
529 | @@ -69,10 +71,6 @@ | |||
530 | 69 | help="RFC3339 timestamp of the Launchpad build request") | 71 | help="RFC3339 timestamp of the Launchpad build request") |
531 | 70 | parser.add_argument( | 72 | parser.add_argument( |
532 | 71 | "--build-url", help="URL of this build on Launchpad") | 73 | "--build-url", help="URL of this build on Launchpad") |
533 | 72 | parser.add_argument("--proxy-url", help="builder proxy url") | ||
534 | 73 | parser.add_argument( | ||
535 | 74 | "--revocation-endpoint", | ||
536 | 75 | help="builder proxy token revocation endpoint") | ||
537 | 76 | parser.add_argument( | 74 | parser.add_argument( |
538 | 77 | "--build-source-tarball", default=False, action="store_true", | 75 | "--build-source-tarball", default=False, action="store_true", |
539 | 78 | help=( | 76 | help=( |
540 | @@ -137,6 +135,9 @@ | |||
541 | 137 | def install(self): | 135 | def install(self): |
542 | 138 | logger.info("Running install phase...") | 136 | logger.info("Running install phase...") |
543 | 139 | deps = [] | 137 | deps = [] |
544 | 138 | if self.args.proxy_url: | ||
545 | 139 | deps.extend(self.proxy_deps) | ||
546 | 140 | self.install_git_proxy() | ||
547 | 140 | if self.args.backend == "lxd": | 141 | if self.args.backend == "lxd": |
548 | 141 | # udev is installed explicitly to work around | 142 | # udev is installed explicitly to work around |
549 | 142 | # https://bugs.launchpad.net/snapd/+bug/1731519. | 143 | # https://bugs.launchpad.net/snapd/+bug/1731519. |
550 | @@ -144,8 +145,6 @@ | |||
551 | 144 | if self.backend.is_package_available(dep): | 145 | if self.backend.is_package_available(dep): |
552 | 145 | deps.append(dep) | 146 | deps.append(dep) |
553 | 146 | deps.extend(self.vcs_deps) | 147 | deps.extend(self.vcs_deps) |
554 | 147 | if self.args.proxy_url: | ||
555 | 148 | deps.extend(["python3", "socat"]) | ||
556 | 149 | if "snapcraft" in self.args.channels: | 148 | if "snapcraft" in self.args.channels: |
557 | 150 | # snapcraft requires sudo in lots of places, but can't depend on | 149 | # snapcraft requires sudo in lots of places, but can't depend on |
558 | 151 | # it when installed as a snap. | 150 | # it when installed as a snap. |
559 | @@ -167,19 +166,12 @@ | |||
560 | 167 | "--channel=%s" % self.args.channels["snapcraft"], | 166 | "--channel=%s" % self.args.channels["snapcraft"], |
561 | 168 | "snapcraft"]) | 167 | "snapcraft"]) |
562 | 169 | if self.args.proxy_url: | 168 | if self.args.proxy_url: |
563 | 170 | self.backend.copy_in( | ||
564 | 171 | os.path.join(self.bin, "snap-git-proxy"), | ||
565 | 172 | "/usr/local/bin/snap-git-proxy") | ||
566 | 173 | self.install_svn_servers() | 169 | self.install_svn_servers() |
567 | 174 | 170 | ||
568 | 175 | def repo(self): | 171 | def repo(self): |
569 | 176 | """Collect git or bzr branch.""" | 172 | """Collect git or bzr branch.""" |
570 | 177 | logger.info("Running repo phase...") | 173 | logger.info("Running repo phase...") |
576 | 178 | env = OrderedDict() | 174 | env = self.build_proxy_environment(proxy_url=self.args.proxy_url) |
572 | 179 | if self.args.proxy_url: | ||
573 | 180 | env["http_proxy"] = self.args.proxy_url | ||
574 | 181 | env["https_proxy"] = self.args.proxy_url | ||
575 | 182 | env["GIT_PROXY_COMMAND"] = "/usr/local/bin/snap-git-proxy" | ||
577 | 183 | self.vcs_fetch(self.args.name, cwd="/build", env=env) | 175 | self.vcs_fetch(self.args.name, cwd="/build", env=env) |
578 | 184 | status = {} | 176 | status = {} |
579 | 185 | if self.args.branch is not None: | 177 | if self.args.branch is not None: |
580 | 186 | 178 | ||
581 | === modified file 'lpbuildd/target/cli.py' | |||
582 | --- lpbuildd/target/cli.py 2017-09-08 15:57:18 +0000 | |||
583 | +++ lpbuildd/target/cli.py 2020-02-26 10:52:47 +0000 | |||
584 | @@ -14,6 +14,7 @@ | |||
585 | 14 | OverrideSourcesList, | 14 | OverrideSourcesList, |
586 | 15 | Update, | 15 | Update, |
587 | 16 | ) | 16 | ) |
588 | 17 | from lpbuildd.target.build_oci import BuildOCI | ||
589 | 17 | from lpbuildd.target.build_livefs import BuildLiveFS | 18 | from lpbuildd.target.build_livefs import BuildLiveFS |
590 | 18 | from lpbuildd.target.build_snap import BuildSnap | 19 | from lpbuildd.target.build_snap import BuildSnap |
591 | 19 | from lpbuildd.target.generate_translation_templates import ( | 20 | from lpbuildd.target.generate_translation_templates import ( |
592 | @@ -49,6 +50,7 @@ | |||
593 | 49 | 50 | ||
594 | 50 | operations = { | 51 | operations = { |
595 | 51 | "add-trusted-keys": AddTrustedKeys, | 52 | "add-trusted-keys": AddTrustedKeys, |
596 | 53 | "build-oci": BuildOCI, | ||
597 | 52 | "buildlivefs": BuildLiveFS, | 54 | "buildlivefs": BuildLiveFS, |
598 | 53 | "buildsnap": BuildSnap, | 55 | "buildsnap": BuildSnap, |
599 | 54 | "generate-translation-templates": GenerateTranslationTemplates, | 56 | "generate-translation-templates": GenerateTranslationTemplates, |
600 | 55 | 57 | ||
601 | === modified file 'lpbuildd/target/lxd.py' | |||
602 | --- lpbuildd/target/lxd.py 2019-12-09 11:17:14 +0000 | |||
603 | +++ lpbuildd/target/lxd.py 2020-02-26 10:52:47 +0000 | |||
604 | @@ -504,7 +504,8 @@ | |||
605 | 504 | "/etc/systemd/system/snapd.refresh.timer"]) | 504 | "/etc/systemd/system/snapd.refresh.timer"]) |
606 | 505 | 505 | ||
607 | 506 | def run(self, args, cwd=None, env=None, input_text=None, get_output=False, | 506 | def run(self, args, cwd=None, env=None, input_text=None, get_output=False, |
609 | 507 | echo=False, **kwargs): | 507 | echo=False, return_process=False, universal_newlines=True, |
610 | 508 | **kwargs): | ||
611 | 508 | """See `Backend`.""" | 509 | """See `Backend`.""" |
612 | 509 | env_params = [] | 510 | env_params = [] |
613 | 510 | if env: | 511 | if env: |
614 | @@ -538,7 +539,10 @@ | |||
615 | 538 | if get_output: | 539 | if get_output: |
616 | 539 | kwargs["stdout"] = subprocess.PIPE | 540 | kwargs["stdout"] = subprocess.PIPE |
617 | 540 | proc = subprocess.Popen( | 541 | proc = subprocess.Popen( |
619 | 541 | cmd, stdin=subprocess.PIPE, universal_newlines=True, **kwargs) | 542 | cmd, stdin=subprocess.PIPE, |
620 | 543 | universal_newlines=universal_newlines, **kwargs) | ||
621 | 544 | if return_process: | ||
622 | 545 | return proc | ||
623 | 542 | output, _ = proc.communicate(input_text) | 546 | output, _ = proc.communicate(input_text) |
624 | 543 | if proc.returncode: | 547 | if proc.returncode: |
625 | 544 | raise subprocess.CalledProcessError(proc.returncode, cmd) | 548 | raise subprocess.CalledProcessError(proc.returncode, cmd) |
626 | 545 | 549 | ||
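The new `return_process` and `universal_newlines` parameters on `LXD.run` follow the standard `subprocess` pattern: a caller that needs to stream output itself (for example a binary `docker save`) gets the live `Popen` object back before `communicate()` is called. A minimal standalone sketch of that pattern, outside lpbuildd (the function name and defaults here are illustrative, not from the branch):

```python
import subprocess

def run(cmd, input_text=None, get_output=False,
        return_process=False, universal_newlines=True, **kwargs):
    """Run cmd; optionally hand back the live process instead of waiting."""
    if get_output:
        kwargs["stdout"] = subprocess.PIPE
    proc = subprocess.Popen(
        cmd, stdin=subprocess.PIPE,
        universal_newlines=universal_newlines, **kwargs)
    if return_process:
        # Caller is now responsible for reading stdout and reaping the process.
        return proc
    output, _ = proc.communicate(input_text)
    if proc.returncode:
        raise subprocess.CalledProcessError(proc.returncode, cmd)
    if get_output:
        return output
```

With `universal_newlines=False` the returned process exposes raw byte streams, which is what the OCI image extraction needs when piping the saved image tarball.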
627 | === added file 'lpbuildd/target/snapbuildproxy.py' | |||
628 | --- lpbuildd/target/snapbuildproxy.py 1970-01-01 00:00:00 +0000 | |||
629 | +++ lpbuildd/target/snapbuildproxy.py 2020-02-26 10:52:47 +0000 | |||
630 | @@ -0,0 +1,41 @@ | |||
631 | 1 | # Copyright 2019-2020 Canonical Ltd. This software is licensed under the | ||
632 | 2 | # GNU Affero General Public License version 3 (see the file LICENSE). | ||
633 | 3 | |||
634 | 4 | from __future__ import print_function | ||
635 | 5 | |||
636 | 6 | __metaclass__ = type | ||
637 | 7 | |||
638 | 8 | from collections import OrderedDict | ||
639 | 9 | import os | ||
640 | 10 | |||
641 | 11 | |||
642 | 12 | class SnapBuildProxyOperationMixin: | ||
643 | 13 | """Methods supporting the build time HTTP proxy for snap and OCI builds.""" | ||
644 | 14 | |||
645 | 15 | @classmethod | ||
646 | 16 | def add_arguments(cls, parser): | ||
647 | 17 | super(SnapBuildProxyOperationMixin, cls).add_arguments(parser) | ||
648 | 18 | parser.add_argument("--proxy-url", help="builder proxy url") | ||
649 | 19 | parser.add_argument( | ||
650 | 20 | "--revocation-endpoint", | ||
651 | 21 | help="builder proxy token revocation endpoint") | ||
652 | 22 | |||
653 | 23 | @property | ||
654 | 24 | def proxy_deps(self): | ||
655 | 25 | return ["python3", "socat"] | ||
656 | 26 | |||
657 | 27 | def install_git_proxy(self): | ||
658 | 28 | self.backend.copy_in( | ||
659 | 29 | os.path.join(self.bin, "snap-git-proxy"), | ||
660 | 30 | "/usr/local/bin/snap-git-proxy") | ||
661 | 31 | |||
662 | 32 | def build_proxy_environment(self, proxy_url=None, env=None): | ||
663 | 33 | """Extend a command environment to include http proxy variables.""" | ||
664 | 34 | full_env = OrderedDict() | ||
665 | 35 | if env: | ||
666 | 36 | full_env.update(env) | ||
667 | 37 | if proxy_url: | ||
668 | 38 | full_env["http_proxy"] = self.args.proxy_url | ||
669 | 39 | full_env["https_proxy"] = self.args.proxy_url | ||
670 | 40 | full_env["GIT_PROXY_COMMAND"] = "/usr/local/bin/snap-git-proxy" | ||
671 | 41 | return full_env | ||
672 | 0 | 42 | ||
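The mixin centralises the proxy environment handling that `snap.py` previously built inline. A standalone sketch of what `build_proxy_environment` produces (as a free function rather than a mixin method; note that the real method reads `self.args.proxy_url` rather than its `proxy_url` parameter, while this sketch uses the parameter directly):

```python
from collections import OrderedDict

def build_proxy_environment(proxy_url=None, env=None):
    """Extend a command environment to include http proxy variables."""
    full_env = OrderedDict()
    if env:
        full_env.update(env)
    if proxy_url:
        full_env["http_proxy"] = proxy_url
        full_env["https_proxy"] = proxy_url
        full_env["GIT_PROXY_COMMAND"] = "/usr/local/bin/snap-git-proxy"
    return full_env
```

An empty call returns an empty mapping, so callers can pass the result to `run_build_command` unconditionally whether or not a proxy is configured.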
673 | === added file 'lpbuildd/target/tests/test_build_oci.py' | |||
674 | --- lpbuildd/target/tests/test_build_oci.py 1970-01-01 00:00:00 +0000 | |||
675 | +++ lpbuildd/target/tests/test_build_oci.py 2020-02-26 10:52:47 +0000 | |||
676 | @@ -0,0 +1,407 @@ | |||
677 | 1 | # Copyright 2019 Canonical Ltd. This software is licensed under the | ||
678 | 2 | # GNU Affero General Public License version 3 (see the file LICENSE). | ||
679 | 3 | |||
680 | 4 | __metaclass__ = type | ||
681 | 5 | |||
682 | 6 | import os.path | ||
683 | 7 | import stat | ||
684 | 8 | import subprocess | ||
685 | 9 | from textwrap import dedent | ||
686 | 10 | |||
687 | 11 | from fixtures import ( | ||
688 | 12 | FakeLogger, | ||
689 | 13 | TempDir, | ||
690 | 14 | ) | ||
691 | 15 | import responses | ||
692 | 16 | from systemfixtures import FakeFilesystem | ||
693 | 17 | from testtools import TestCase | ||
694 | 18 | from testtools.matchers import ( | ||
695 | 19 | AnyMatch, | ||
696 | 20 | Equals, | ||
697 | 21 | Is, | ||
698 | 22 | MatchesAll, | ||
699 | 23 | MatchesDict, | ||
700 | 24 | MatchesListwise, | ||
701 | 25 | ) | ||
702 | 26 | |||
703 | 27 | from lpbuildd.target.build_oci import ( | ||
704 | 28 | RETCODE_FAILURE_BUILD, | ||
705 | 29 | RETCODE_FAILURE_INSTALL, | ||
706 | 30 | ) | ||
707 | 31 | from lpbuildd.target.cli import parse_args | ||
708 | 32 | from lpbuildd.tests.fakebuilder import FakeMethod | ||
709 | 33 | |||
710 | 34 | |||
711 | 35 | class RanCommand(MatchesListwise): | ||
712 | 36 | |||
713 | 37 | def __init__(self, args, echo=None, cwd=None, input_text=None, | ||
714 | 38 | get_output=None, **env): | ||
715 | 39 | kwargs_matcher = {} | ||
716 | 40 | if echo is not None: | ||
717 | 41 | kwargs_matcher["echo"] = Is(echo) | ||
718 | 42 | if cwd: | ||
719 | 43 | kwargs_matcher["cwd"] = Equals(cwd) | ||
720 | 44 | if input_text: | ||
721 | 45 | kwargs_matcher["input_text"] = Equals(input_text) | ||
722 | 46 | if get_output is not None: | ||
723 | 47 | kwargs_matcher["get_output"] = Is(get_output) | ||
724 | 48 | if env: | ||
725 | 49 | kwargs_matcher["env"] = MatchesDict( | ||
726 | 50 | {key: Equals(value) for key, value in env.items()}) | ||
727 | 51 | super(RanCommand, self).__init__( | ||
728 | 52 | [Equals((args,)), MatchesDict(kwargs_matcher)]) | ||
729 | 53 | |||
730 | 54 | |||
731 | 55 | class RanAptGet(RanCommand): | ||
732 | 56 | |||
733 | 57 | def __init__(self, *args): | ||
734 | 58 | super(RanAptGet, self).__init__(["apt-get", "-y"] + list(args)) | ||
735 | 59 | |||
736 | 60 | |||
737 | 61 | class RanSnap(RanCommand): | ||
738 | 62 | |||
739 | 63 | def __init__(self, *args, **kwargs): | ||
740 | 64 | super(RanSnap, self).__init__(["snap"] + list(args), **kwargs) | ||
741 | 65 | |||
742 | 66 | |||
743 | 67 | class RanBuildCommand(RanCommand): | ||
744 | 68 | |||
745 | 69 | def __init__(self, args, **kwargs): | ||
746 | 70 | kwargs.setdefault("LANG", "C.UTF-8") | ||
747 | 71 | kwargs.setdefault("SHELL", "/bin/sh") | ||
748 | 72 | super(RanBuildCommand, self).__init__(args, **kwargs) | ||
749 | 73 | |||
750 | 74 | |||
751 | 75 | class TestBuildOCI(TestCase): | ||
752 | 76 | |||
753 | 77 | def test_run_build_command_no_env(self): | ||
754 | 78 | args = [ | ||
755 | 79 | "build-oci", | ||
756 | 80 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
757 | 81 | "--branch", "lp:foo", "test-image", | ||
758 | 82 | ] | ||
759 | 83 | build_oci = parse_args(args=args).operation | ||
760 | 84 | build_oci.run_build_command(["echo", "hello world"]) | ||
761 | 85 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
762 | 86 | RanBuildCommand(["echo", "hello world"]), | ||
763 | 87 | ])) | ||
764 | 88 | |||
765 | 89 | def test_run_build_command_env(self): | ||
766 | 90 | args = [ | ||
767 | 91 | "build-oci", | ||
768 | 92 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
769 | 93 | "--branch", "lp:foo", "test-image", | ||
770 | 94 | ] | ||
771 | 95 | build_oci = parse_args(args=args).operation | ||
772 | 96 | build_oci.run_build_command( | ||
773 | 97 | ["echo", "hello world"], env={"FOO": "bar baz"}) | ||
774 | 98 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
775 | 99 | RanBuildCommand(["echo", "hello world"], FOO="bar baz"), | ||
776 | 100 | ])) | ||
777 | 101 | |||
778 | 102 | def test_install_bzr(self): | ||
779 | 103 | args = [ | ||
780 | 104 | "build-oci", | ||
781 | 105 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
782 | 106 | "--branch", "lp:foo", "test-image" | ||
783 | 107 | ] | ||
784 | 108 | build_oci = parse_args(args=args).operation | ||
785 | 109 | build_oci.install() | ||
786 | 110 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
787 | 111 | RanAptGet("install", "bzr", "docker.io"), | ||
788 | 112 | RanCommand(["systemctl", "restart", "docker"]), | ||
789 | 113 | RanCommand(["mkdir", "-p", "/home/buildd"]), | ||
790 | 114 | ])) | ||
791 | 115 | |||
792 | 116 | def test_install_git(self): | ||
793 | 117 | args = [ | ||
794 | 118 | "build-oci", | ||
795 | 119 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
796 | 120 | "--git-repository", "lp:foo", "test-image" | ||
797 | 121 | ] | ||
798 | 122 | build_oci = parse_args(args=args).operation | ||
799 | 123 | build_oci.install() | ||
800 | 124 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
801 | 125 | RanAptGet("install", "git", "docker.io"), | ||
802 | 126 | RanCommand(["systemctl", "restart", "docker"]), | ||
803 | 127 | RanCommand(["mkdir", "-p", "/home/buildd"]), | ||
804 | 128 | ])) | ||
805 | 129 | |||
806 | 130 | @responses.activate | ||
807 | 131 | def test_install_snap_store_proxy(self): | ||
808 | 132 | store_assertion = dedent("""\ | ||
809 | 133 | type: store | ||
810 | 134 | store: store-id | ||
811 | 135 | url: http://snap-store-proxy.example | ||
812 | 136 | |||
813 | 137 | body | ||
814 | 138 | """) | ||
815 | 139 | |||
816 | 140 | def respond(request): | ||
817 | 141 | return 200, {"X-Assertion-Store-Id": "store-id"}, store_assertion | ||
818 | 142 | |||
819 | 143 | responses.add_callback( | ||
820 | 144 | "GET", "http://snap-store-proxy.example/v2/auth/store/assertions", | ||
821 | 145 | callback=respond) | ||
822 | 146 | args = [ | ||
823 | 147 | "buildsnap", | ||
824 | 148 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
825 | 149 | "--git-repository", "lp:foo", | ||
826 | 150 | "--snap-store-proxy-url", "http://snap-store-proxy.example/", | ||
827 | 151 | "test-snap", | ||
828 | 152 | ] | ||
829 | 153 | build_snap = parse_args(args=args).operation | ||
830 | 154 | build_snap.install() | ||
831 | 155 | self.assertThat(build_snap.backend.run.calls, MatchesListwise([ | ||
832 | 156 | RanAptGet("install", "git", "snapcraft"), | ||
833 | 157 | RanCommand( | ||
834 | 158 | ["snap", "ack", "/dev/stdin"], input_text=store_assertion), | ||
835 | 159 | RanCommand(["snap", "set", "core", "proxy.store=store-id"]), | ||
836 | 160 | ])) | ||
837 | 161 | |||
838 | 162 | def test_install_proxy(self): | ||
839 | 163 | args = [ | ||
840 | 164 | "build-oci", | ||
841 | 165 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
842 | 166 | "--git-repository", "lp:foo", | ||
843 | 167 | "--proxy-url", "http://proxy.example:3128/", | ||
844 | 168 | "test-image", | ||
845 | 169 | ] | ||
846 | 170 | build_oci = parse_args(args=args).operation | ||
847 | 171 | build_oci.bin = "/builderbin" | ||
848 | 172 | self.useFixture(FakeFilesystem()).add("/builderbin") | ||
849 | 173 | os.mkdir("/builderbin") | ||
850 | 174 | with open("/builderbin/snap-git-proxy", "w") as proxy_script: | ||
851 | 175 | proxy_script.write("proxy script\n") | ||
852 | 176 | os.fchmod(proxy_script.fileno(), 0o755) | ||
853 | 177 | build_oci.install() | ||
854 | 178 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
855 | 179 | RanCommand( | ||
856 | 180 | ["mkdir", "-p", "/etc/systemd/system/docker.service.d"]), | ||
857 | 181 | RanAptGet("install", "python3", "socat", "git", "docker.io"), | ||
858 | 182 | RanCommand(["systemctl", "restart", "docker"]), | ||
859 | 183 | RanCommand(["mkdir", "-p", "/home/buildd"]), | ||
860 | 184 | ])) | ||
861 | 185 | self.assertEqual( | ||
862 | 186 | (b"proxy script\n", stat.S_IFREG | 0o755), | ||
863 | 187 | build_oci.backend.backend_fs["/usr/local/bin/snap-git-proxy"]) | ||
864 | 188 | |||
865 | 189 | def test_repo_bzr(self): | ||
866 | 190 | args = [ | ||
867 | 191 | "build-oci", | ||
868 | 192 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
869 | 193 | "--branch", "lp:foo", "test-image", | ||
870 | 194 | ] | ||
871 | 195 | build_oci = parse_args(args=args).operation | ||
872 | 196 | build_oci.backend.build_path = self.useFixture(TempDir()).path | ||
873 | 197 | build_oci.backend.run = FakeMethod() | ||
874 | 198 | build_oci.repo() | ||
875 | 199 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
876 | 200 | RanBuildCommand( | ||
877 | 201 | ["bzr", "branch", "lp:foo", "test-image"], cwd="/home/buildd"), | ||
878 | 202 | ])) | ||
879 | 203 | |||
880 | 204 | def test_repo_git(self): | ||
881 | 205 | args = [ | ||
882 | 206 | "build-oci", | ||
883 | 207 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
884 | 208 | "--git-repository", "lp:foo", "test-image", | ||
885 | 209 | ] | ||
886 | 210 | build_oci = parse_args(args=args).operation | ||
887 | 211 | build_oci.backend.build_path = self.useFixture(TempDir()).path | ||
888 | 212 | build_oci.backend.run = FakeMethod() | ||
889 | 213 | build_oci.repo() | ||
890 | 214 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
891 | 215 | RanBuildCommand( | ||
892 | 216 | ["git", "clone", "lp:foo", "test-image"], cwd="/home/buildd"), | ||
893 | 217 | RanBuildCommand( | ||
894 | 218 | ["git", "submodule", "update", "--init", "--recursive"], | ||
895 | 219 | cwd="/home/buildd/test-image"), | ||
896 | 220 | ])) | ||
897 | 221 | |||
898 | 222 | def test_repo_git_with_path(self): | ||
899 | 223 | args = [ | ||
900 | 224 | "build-oci", | ||
901 | 225 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
902 | 226 | "--git-repository", "lp:foo", "--git-path", "next", "test-image", | ||
903 | 227 | ] | ||
904 | 228 | build_oci = parse_args(args=args).operation | ||
905 | 229 | build_oci.backend.build_path = self.useFixture(TempDir()).path | ||
906 | 230 | build_oci.backend.run = FakeMethod() | ||
907 | 231 | build_oci.repo() | ||
908 | 232 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
909 | 233 | RanBuildCommand( | ||
910 | 234 | ["git", "clone", "-b", "next", "lp:foo", "test-image"], | ||
911 | 235 | cwd="/home/buildd"), | ||
912 | 236 | RanBuildCommand( | ||
913 | 237 | ["git", "submodule", "update", "--init", "--recursive"], | ||
914 | 238 | cwd="/home/buildd/test-image"), | ||
915 | 239 | ])) | ||
916 | 240 | |||
917 | 241 | def test_repo_git_with_tag_path(self): | ||
918 | 242 | args = [ | ||
919 | 243 | "build-oci", | ||
920 | 244 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
921 | 245 | "--git-repository", "lp:foo", "--git-path", "refs/tags/1.0", | ||
922 | 246 | "test-image", | ||
923 | 247 | ] | ||
924 | 248 | build_oci = parse_args(args=args).operation | ||
925 | 249 | build_oci.backend.build_path = self.useFixture(TempDir()).path | ||
926 | 250 | build_oci.backend.run = FakeMethod() | ||
927 | 251 | build_oci.repo() | ||
928 | 252 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
929 | 253 | RanBuildCommand( | ||
930 | 254 | ["git", "clone", "-b", "1.0", "lp:foo", "test-image"], | ||
931 | 255 | cwd="/home/buildd"), | ||
932 | 256 | RanBuildCommand( | ||
933 | 257 | ["git", "submodule", "update", "--init", "--recursive"], | ||
934 | 258 | cwd="/home/buildd/test-image"), | ||
935 | 259 | ])) | ||
936 | 260 | |||
937 | 261 | def test_repo_proxy(self): | ||
938 | 262 | args = [ | ||
939 | 263 | "build-oci", | ||
940 | 264 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
941 | 265 | "--git-repository", "lp:foo", | ||
942 | 266 | "--proxy-url", "http://proxy.example:3128/", | ||
943 | 267 | "test-image", | ||
944 | 268 | ] | ||
945 | 269 | build_oci = parse_args(args=args).operation | ||
946 | 270 | build_oci.backend.build_path = self.useFixture(TempDir()).path | ||
947 | 271 | build_oci.backend.run = FakeMethod() | ||
948 | 272 | build_oci.repo() | ||
949 | 273 | env = { | ||
950 | 274 | "http_proxy": "http://proxy.example:3128/", | ||
951 | 275 | "https_proxy": "http://proxy.example:3128/", | ||
952 | 276 | "GIT_PROXY_COMMAND": "/usr/local/bin/snap-git-proxy", | ||
953 | 277 | } | ||
954 | 278 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
955 | 279 | RanBuildCommand( | ||
956 | 280 | ["git", "clone", "lp:foo", "test-image"], | ||
957 | 281 | cwd="/home/buildd", **env), | ||
958 | 282 | RanBuildCommand( | ||
959 | 283 | ["git", "submodule", "update", "--init", "--recursive"], | ||
960 | 284 | cwd="/home/buildd/test-image", **env), | ||
961 | 285 | ])) | ||
962 | 286 | |||
963 | 287 | def test_build(self): | ||
964 | 288 | args = [ | ||
965 | 289 | "build-oci", | ||
966 | 290 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
967 | 291 | "--branch", "lp:foo", "test-image", | ||
968 | 292 | ] | ||
969 | 293 | build_oci = parse_args(args=args).operation | ||
970 | 294 | build_oci.backend.add_dir('/build/test-directory') | ||
971 | 295 | build_oci.build() | ||
972 | 296 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
973 | 297 | RanBuildCommand( | ||
974 | 298 | ["docker", "build", "--no-cache", "--tag", "test-image", | ||
975 | 299 | "/home/buildd/test-image"]), | ||
976 | 300 | ])) | ||
977 | 301 | |||
978 | 302 | def test_build_with_file(self): | ||
979 | 303 | args = [ | ||
980 | 304 | "build-oci", | ||
981 | 305 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
982 | 306 | "--branch", "lp:foo", "--build-file", "build-aux/Dockerfile", | ||
983 | 307 | "test-image", | ||
984 | 308 | ] | ||
985 | 309 | build_oci = parse_args(args=args).operation | ||
986 | 310 | build_oci.backend.add_dir('/build/test-directory') | ||
987 | 311 | build_oci.build() | ||
988 | 312 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
989 | 313 | RanBuildCommand( | ||
990 | 314 | ["docker", "build", "--no-cache", "--tag", "test-image", | ||
991 | 315 | "--file", "build-aux/Dockerfile", | ||
992 | 316 | "/home/buildd/test-image"]), | ||
993 | 317 | ])) | ||
994 | 318 | |||
995 | 319 | def test_build_proxy(self): | ||
996 | 320 | args = [ | ||
997 | 321 | "build-oci", | ||
998 | 322 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
999 | 323 | "--branch", "lp:foo", "--proxy-url", "http://proxy.example:3128/", | ||
1000 | 324 | "test-image", | ||
1001 | 325 | ] | ||
1002 | 326 | build_oci = parse_args(args=args).operation | ||
1003 | 327 | build_oci.backend.add_dir('/build/test-directory') | ||
1004 | 328 | build_oci.build() | ||
1005 | 329 | self.assertThat(build_oci.backend.run.calls, MatchesListwise([ | ||
1006 | 330 | RanBuildCommand( | ||
1007 | 331 | ["docker", "build", "--no-cache", | ||
1008 | 332 | "--build-arg", "http_proxy=http://proxy.example:3128/", | ||
1009 | 333 | "--build-arg", "https_proxy=http://proxy.example:3128/", | ||
1010 | 334 | "--tag", "test-image", "/home/buildd/test-image"]), | ||
1011 | 335 | ])) | ||
1012 | 336 | |||
1013 | 337 | def test_run_succeeds(self): | ||
1014 | 338 | args = [ | ||
1015 | 339 | "build-oci", | ||
1016 | 340 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
1017 | 341 | "--branch", "lp:foo", "test-image", | ||
1018 | 342 | ] | ||
1019 | 343 | build_oci = parse_args(args=args).operation | ||
1020 | 344 | build_oci.backend.build_path = self.useFixture(TempDir()).path | ||
1021 | 345 | build_oci.backend.run = FakeMethod() | ||
1022 | 346 | self.assertEqual(0, build_oci.run()) | ||
1023 | 347 | self.assertThat(build_oci.backend.run.calls, MatchesAll( | ||
1024 | 348 | AnyMatch(RanAptGet("install", "bzr", "docker.io")), | ||
1025 | 349 | AnyMatch(RanBuildCommand( | ||
1026 | 350 | ["bzr", "branch", "lp:foo", "test-image"], | ||
1027 | 351 | cwd="/home/buildd")), | ||
1028 | 352 | AnyMatch(RanBuildCommand( | ||
1029 | 353 | ["docker", "build", "--no-cache", "--tag", "test-image", | ||
1030 | 354 | "/home/buildd/test-image"])), | ||
1031 | 355 | )) | ||
1032 | 356 | |||
1033 | 357 | def test_run_install_fails(self): | ||
1034 | 358 | class FailInstall(FakeMethod): | ||
1035 | 359 | def __call__(self, run_args, *args, **kwargs): | ||
1036 | 360 | super(FailInstall, self).__call__(run_args, *args, **kwargs) | ||
1037 | 361 | if run_args[0] == "apt-get": | ||
1038 | 362 | raise subprocess.CalledProcessError(1, run_args) | ||
1039 | 363 | |||
1040 | 364 | self.useFixture(FakeLogger()) | ||
1041 | 365 | args = [ | ||
1042 | 366 | "build-oci", | ||
1043 | 367 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
1044 | 368 | "--branch", "lp:foo", "test-image", | ||
1045 | 369 | ] | ||
1046 | 370 | build_oci = parse_args(args=args).operation | ||
1047 | 371 | build_oci.backend.run = FailInstall() | ||
1048 | 372 | self.assertEqual(RETCODE_FAILURE_INSTALL, build_oci.run()) | ||
1049 | 373 | |||
1050 | 374 | def test_run_repo_fails(self): | ||
1051 | 375 | class FailRepo(FakeMethod): | ||
1052 | 376 | def __call__(self, run_args, *args, **kwargs): | ||
1053 | 377 | super(FailRepo, self).__call__(run_args, *args, **kwargs) | ||
1054 | 378 | if run_args[:2] == ["bzr", "branch"]: | ||
1055 | 379 | raise subprocess.CalledProcessError(1, run_args) | ||
1056 | 380 | |||
1057 | 381 | self.useFixture(FakeLogger()) | ||
1058 | 382 | args = [ | ||
1059 | 383 | "build-oci", | ||
1060 | 384 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
1061 | 385 | "--branch", "lp:foo", "test-image", | ||
1062 | 386 | ] | ||
1063 | 387 | build_oci = parse_args(args=args).operation | ||
1064 | 388 | build_oci.backend.run = FailRepo() | ||
1065 | 389 | self.assertEqual(RETCODE_FAILURE_BUILD, build_oci.run()) | ||
1066 | 390 | |||
1067 | 391 | def test_run_build_fails(self): | ||
1068 | 392 | class FailBuild(FakeMethod): | ||
1069 | 393 | def __call__(self, run_args, *args, **kwargs): | ||
1070 | 394 | super(FailBuild, self).__call__(run_args, *args, **kwargs) | ||
1071 | 395 | if run_args[0] == "docker": | ||
1072 | 396 | raise subprocess.CalledProcessError(1, run_args) | ||
1073 | 397 | |||
1074 | 398 | self.useFixture(FakeLogger()) | ||
1075 | 399 | args = [ | ||
1076 | 400 | "build-oci", | ||
1077 | 401 | "--backend=fake", "--series=xenial", "--arch=amd64", "1", | ||
1078 | 402 | "--branch", "lp:foo", "test-image", | ||
1079 | 403 | ] | ||
1080 | 404 | build_oci = parse_args(args=args).operation | ||
1081 | 405 | build_oci.backend.build_path = self.useFixture(TempDir()).path | ||
1082 | 406 | build_oci.backend.run = FailBuild() | ||
1083 | 407 | self.assertEqual(RETCODE_FAILURE_BUILD, build_oci.run()) | ||
1084 | 0 | 408 | ||
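The failure tests above rely on a call-recording test double: each `Fail*` subclass records every invocation and raises only when the command it targets is run, so `run()` can be driven to the install or build failure path deterministically. A minimal sketch of that double (this `FakeMethod` is a hypothetical standalone reimplementation, not the one in `lpbuildd.tests.fakebuilder`):

```python
import subprocess

class FakeMethod:
    """Record every call; subclasses add selective failures."""

    def __init__(self):
        self.calls = []

    def __call__(self, *args, **kwargs):
        self.calls.append((args, kwargs))

class FailInstall(FakeMethod):
    """Succeed for everything except apt-get, mimicking an install failure."""

    def __call__(self, run_args, *args, **kwargs):
        super(FailInstall, self).__call__(run_args, *args, **kwargs)
        if run_args[0] == "apt-get":
            raise subprocess.CalledProcessError(1, run_args)
```

Because the base class still records the failing call, a test can assert both that the failure happened and what was attempted before it.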
1085 | === modified file 'lpbuildd/target/tests/test_build_snap.py' | |||
1086 | --- lpbuildd/target/tests/test_build_snap.py 2019-06-05 14:10:20 +0000 | |||
1087 | +++ lpbuildd/target/tests/test_build_snap.py 2020-02-26 10:52:47 +0000 | |||
1088 | @@ -186,7 +186,7 @@ | |||
1089 | 186 | os.fchmod(proxy_script.fileno(), 0o755) | 186 | os.fchmod(proxy_script.fileno(), 0o755) |
1090 | 187 | build_snap.install() | 187 | build_snap.install() |
1091 | 188 | self.assertThat(build_snap.backend.run.calls, MatchesListwise([ | 188 | self.assertThat(build_snap.backend.run.calls, MatchesListwise([ |
1093 | 189 | RanAptGet("install", "git", "python3", "socat", "snapcraft"), | 189 | RanAptGet("install", "python3", "socat", "git", "snapcraft"), |
1094 | 190 | RanCommand(["mkdir", "-p", "/root/.subversion"]), | 190 | RanCommand(["mkdir", "-p", "/root/.subversion"]), |
1095 | 191 | ])) | 191 | ])) |
1096 | 192 | self.assertEqual( | 192 | self.assertEqual( |
1097 | 193 | 193 | ||
1098 | === modified file 'lpbuildd/target/tests/test_lxd.py' | |||
1099 | --- lpbuildd/target/tests/test_lxd.py 2019-12-09 11:17:14 +0000 | |||
1100 | +++ lpbuildd/target/tests/test_lxd.py 2020-02-26 10:52:47 +0000 | |||
1101 | @@ -276,7 +276,7 @@ | |||
1102 | 276 | "lp-xenial-amd64", "lp-xenial-amd64") | 276 | "lp-xenial-amd64", "lp-xenial-amd64") |
1103 | 277 | 277 | ||
1104 | 278 | def assert_correct_profile(self, extra_raw_lxc_config=None, | 278 | def assert_correct_profile(self, extra_raw_lxc_config=None, |
1106 | 279 | driver_version="2.0"): | 279 | driver_version="2.0"): |
1107 | 280 | if extra_raw_lxc_config is None: | 280 | if extra_raw_lxc_config is None: |
1108 | 281 | extra_raw_lxc_config = [] | 281 | extra_raw_lxc_config = [] |
1109 | 282 | 282 | ||
1110 | @@ -356,7 +356,7 @@ | |||
1111 | 356 | } | 356 | } |
1112 | 357 | LXD("1", "xenial", "powerpc").create_profile() | 357 | LXD("1", "xenial", "powerpc").create_profile() |
1113 | 358 | self.assert_correct_profile( | 358 | self.assert_correct_profile( |
1115 | 359 | extra_raw_lxc_config=[("lxc.seccomp", ""),], | 359 | extra_raw_lxc_config=[("lxc.seccomp", ""), ], |
1116 | 360 | driver_version=driver_version or "3.0" | 360 | driver_version=driver_version or "3.0" |
1117 | 361 | ) | 361 | ) |
1118 | 362 | 362 | ||
1119 | @@ -463,7 +463,8 @@ | |||
1120 | 463 | "b", "7", str(minor)])) | 463 | "b", "7", str(minor)])) |
1121 | 464 | if not with_dm0: | 464 | if not with_dm0: |
1122 | 465 | expected_args.extend([ | 465 | expected_args.extend([ |
1124 | 466 | Equals(["sudo", "dmsetup", "create", "tmpdevice", "--notable"]), | 466 | Equals( |
1125 | 467 | ["sudo", "dmsetup", "create", "tmpdevice", "--notable"]), | ||
1126 | 467 | Equals(["sudo", "dmsetup", "remove", "tmpdevice"]), | 468 | Equals(["sudo", "dmsetup", "remove", "tmpdevice"]), |
1127 | 468 | ]) | 469 | ]) |
1128 | 469 | for minor in range(8): | 470 | for minor in range(8): |
1129 | 470 | 471 | ||
1130 | === added file 'lpbuildd/tests/oci_tarball.py' | |||
1131 | --- lpbuildd/tests/oci_tarball.py 1970-01-01 00:00:00 +0000 | |||
1132 | +++ lpbuildd/tests/oci_tarball.py 2020-02-26 10:52:47 +0000 | |||
1133 | @@ -0,0 +1,60 @@ | |||
1134 | 1 | import json | ||
1135 | 2 | import os | ||
1136 | 3 | import StringIO | ||
1137 | 4 | import tempfile | ||
1138 | 5 | import tarfile | ||
1139 | 6 | |||
1140 | 7 | |||
1141 | 8 | class OCITarball: | ||
1142 | 9 | """Create a tarball for use in tests with OCI.""" | ||
1143 | 10 | |||
1144 | 11 | def _makeFile(self, contents, name): | ||
1145 | 12 | json_contents = json.dumps(contents) | ||
1146 | 13 | tarinfo = tarfile.TarInfo(name) | ||
1147 | 14 | tarinfo.size = len(json_contents) | ||
1148 | 15 | return tarinfo, StringIO.StringIO(json_contents) | ||
1149 | 16 | |||
1150 | 17 | @property | ||
1151 | 18 | def config(self): | ||
1152 | 19 | return self._makeFile( | ||
1153 | 20 | {"rootfs": {"diff_ids": ["sha256:diff1", "sha256:diff2"]}}, | ||
1154 | 21 | 'config.json') | ||
1155 | 22 | |||
1156 | 23 | @property | ||
1157 | 24 | def manifest(self): | ||
1158 | 25 | return self._makeFile( | ||
1159 | 26 | [{"Config": "config.json", | ||
1160 | 27 | "Layers": ["layer-1/layer.tar", "layer-2/layer.tar"]}], | ||
1161 | 28 | 'manifest.json') | ||
1162 | 29 | |||
1163 | 30 | @property | ||
1164 | 31 | def repositories(self): | ||
1165 | 32 | return self._makeFile([], 'repositories') | ||
1166 | 33 | |||
1167 | 34 | def layer_file(self, directory, layer_name): | ||
1168 | 35 | contents = "{}-contents".format(layer_name) | ||
1169 | 36 | tarinfo = tarfile.TarInfo(contents) | ||
1170 | 37 | tarinfo.size = len(contents) | ||
1171 | 38 | layer_contents = StringIO.StringIO(contents) | ||
1172 | 39 | layer_tar_path = os.path.join( | ||
1173 | 40 | directory, '{}.tar.gz'.format(layer_name)) | ||
1174 | 41 | layer_tar = tarfile.open(layer_tar_path, 'w:gz') | ||
1175 | 42 | layer_tar.addfile(tarinfo, layer_contents) | ||
1176 | 43 | layer_tar.close() | ||
1177 | 44 | return layer_tar_path | ||
1178 | 45 | |||
1179 | 46 | def build_tar_file(self): | ||
1180 | 47 | tar_directory = tempfile.mkdtemp() | ||
1181 | 48 | tar_path = os.path.join(tar_directory, 'test-oci-image.tar') | ||
1182 | 49 | tar = tarfile.open(tar_path, 'w') | ||
1183 | 50 | tar.addfile(*self.config) | ||
1184 | 51 | tar.addfile(*self.manifest) | ||
1185 | 52 | tar.addfile(*self.repositories) | ||
1186 | 53 | |||
1187 | 54 | for layer_name in ['layer-1', 'layer-2']: | ||
1188 | 55 | layer = self.layer_file(tar_directory, layer_name) | ||
1189 | 56 | tar.add(layer, arcname='{}.tar.gz'.format(layer_name)) | ||
1190 | 57 | |||
1191 | 58 | tar.close() | ||
1192 | 59 | |||
1193 | 60 | return tar_path | ||
1194 | 0 | 61 | ||
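`OCITarball` builds a `docker save`-style archive in memory for the tests: a `config.json` with layer diff IDs, a `manifest.json` mapping the config to its layer paths, and the layer tarballs themselves. The helper as written uses the Python 2 `StringIO` module; a Python 3 sketch of the same in-memory-member technique would use `io.BytesIO` with byte lengths (names and layer contents here are illustrative):

```python
import io
import json
import os
import tarfile
import tempfile

def make_json_member(contents, name):
    """Build a (TarInfo, fileobj) pair for an in-memory JSON tar member."""
    data = json.dumps(contents).encode("utf-8")
    tarinfo = tarfile.TarInfo(name)
    tarinfo.size = len(data)
    return tarinfo, io.BytesIO(data)

def build_minimal_image_tar():
    """Write a docker-save-style tarball with a config and a manifest."""
    path = os.path.join(tempfile.mkdtemp(), "test-oci-image.tar")
    with tarfile.open(path, "w") as tar:
        tar.addfile(*make_json_member(
            {"rootfs": {"diff_ids": ["sha256:diff1"]}}, "config.json"))
        tar.addfile(*make_json_member(
            [{"Config": "config.json", "Layers": ["layer-1/layer.tar"]}],
            "manifest.json"))
    return path
```

Setting `TarInfo.size` to the encoded byte length is the important detail: `tarfile.addfile` reads exactly that many bytes from the file object.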
1195 | === added file 'lpbuildd/tests/test_oci.py' | |||
1196 | --- lpbuildd/tests/test_oci.py 1970-01-01 00:00:00 +0000 | |||
1197 | +++ lpbuildd/tests/test_oci.py 2020-02-26 10:52:47 +0000 | |||
1198 | @@ -0,0 +1,280 @@ | |||
# Copyright 2019 Canonical Ltd.  This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).

__metaclass__ = type

import io
import json
import os
try:
    from unittest import mock
except ImportError:
    import mock

from fixtures import (
    EnvironmentVariable,
    MockPatch,
    TempDir,
    )
from testtools import TestCase
from testtools.matchers import Contains
from testtools.deferredruntest import AsynchronousDeferredRunTest
from twisted.internet import defer

from lpbuildd.oci import (
    OCIBuildManager,
    OCIBuildState,
    )
from lpbuildd.tests.fakebuilder import FakeBuilder
from lpbuildd.tests.oci_tarball import OCITarball


class MockBuildManager(OCIBuildManager):
    def __init__(self, *args, **kwargs):
        super(MockBuildManager, self).__init__(*args, **kwargs)
        self.commands = []
        self.iterators = []

    def runSubProcess(self, path, command, iterate=None, env=None):
        self.commands.append([path] + command)
        if iterate is None:
            iterate = self.iterate
        self.iterators.append(iterate)
        return 0


class MockOCITarSave():
    @property
    def stdout(self):
        return io.open(OCITarball().build_tar_file(), 'rb')


class TestOCIBuildManagerIteration(TestCase):
    """Run OCIBuildManager through its iteration steps."""

    run_tests_with = AsynchronousDeferredRunTest.make_factory(timeout=5)

    def setUp(self):
        super(TestOCIBuildManagerIteration, self).setUp()
        self.working_dir = self.useFixture(TempDir()).path
        builder_dir = os.path.join(self.working_dir, "builder")
        home_dir = os.path.join(self.working_dir, "home")
        for dir in (builder_dir, home_dir):
            os.mkdir(dir)
        self.useFixture(EnvironmentVariable("HOME", home_dir))
        self.builder = FakeBuilder(builder_dir)
        self.buildid = "123"
        self.buildmanager = MockBuildManager(self.builder, self.buildid)
        self.buildmanager._cachepath = self.builder._cachepath

    def getState(self):
        """Retrieve build manager's state."""
        return self.buildmanager._state

    @defer.inlineCallbacks
    def startBuild(self, args=None, options=None):
        # The build manager's iterate() kicks off the consecutive states
        # after INIT.
        extra_args = {
            "series": "xenial",
            "arch_tag": "i386",
            "name": "test-image",
            }
        if args is not None:
            extra_args.update(args)
        original_backend_name = self.buildmanager.backend_name
        self.buildmanager.backend_name = "fake"
        self.buildmanager.initiate({}, "chroot.tar.gz", extra_args)
        self.buildmanager.backend_name = original_backend_name

        # Skip states that are done in DebianBuildManager to the state
        # directly before BUILD_OCI.
        self.buildmanager._state = OCIBuildState.UPDATE

        # BUILD_OCI: Run the builder's payload to build the OCI image.
        yield self.buildmanager.iterate(0)
        self.assertEqual(OCIBuildState.BUILD_OCI, self.getState())
        expected_command = [
            "sharepath/bin/in-target", "in-target", "build-oci",
            "--backend=lxd", "--series=xenial", "--arch=i386", self.buildid,
            ]
        if options is not None:
            expected_command.extend(options)
        expected_command.append("test-image")
        self.assertEqual(expected_command, self.buildmanager.commands[-1])
        self.assertEqual(
            self.buildmanager.iterate, self.buildmanager.iterators[-1])
        self.assertFalse(self.builder.wasCalled("chrootFail"))

    @defer.inlineCallbacks
    def test_iterate(self):
        # This sha would change as it includes file attributes in the
        # tar file. Fix it so we can test against a known value.
        sha_mock = self.useFixture(
            MockPatch('lpbuildd.oci.OCIBuildManager._calculateLayerSha'))
        sha_mock.mock.return_value = "testsha"
        # The build manager iterates a normal build from start to finish.
        args = {
            "git_repository": "https://git.launchpad.dev/~example/+git/snap",
            "git_path": "master",
            }
        expected_options = [
            "--git-repository", "https://git.launchpad.dev/~example/+git/snap",
            "--git-path", "master",
            ]
        yield self.startBuild(args, expected_options)

        log_path = os.path.join(self.buildmanager._cachepath, "buildlog")
        with open(log_path, "w") as log:
            log.write("I am a build log.")

        self.buildmanager.backend.run.result = MockOCITarSave()

        self.buildmanager.backend.add_file(
            '/var/lib/docker/image/'
            'vfs/distribution/v2metadata-by-diffid/sha256/diff1',
            b"""[{"Digest": "test_digest", "SourceRepository": "test"}]"""
        )

        # After building the package, reap processes.
        yield self.buildmanager.iterate(0)
        expected_command = [
            "sharepath/bin/in-target", "in-target", "scan-for-processes",
            "--backend=lxd", "--series=xenial", "--arch=i386", self.buildid,
            ]
        self.assertEqual(OCIBuildState.BUILD_OCI, self.getState())
        self.assertEqual(expected_command, self.buildmanager.commands[-1])
        self.assertNotEqual(
            self.buildmanager.iterate, self.buildmanager.iterators[-1])
        self.assertFalse(self.builder.wasCalled("buildFail"))
        expected_files = [
            'manifest.json',
            'layer-1.tar.gz',
            'layer-2.tar.gz',
            'digests.json',
            'config.json',
            ]
        for expected in expected_files:
            self.assertThat(self.builder.waitingfiles, Contains(expected))

        cache_path = self.builder.cachePath(
            self.builder.waitingfiles['digests.json'])
        with open(cache_path, "rb") as f:
            digests_contents = f.read()
        digests_expected = [{
            "sha256:diff1": {
                "source": "test",
                "digest": "test_digest",
                "layer_id": "layer-1"
            },
            "sha256:diff2": {
                "source": "",
                "digest": "testsha",
                "layer_id": "layer-2"
            }
        }]
        self.assertEqual(digests_contents, json.dumps(digests_expected))
        # Control returns to the DebianBuildManager in the UMOUNT state.
        self.buildmanager.iterateReap(self.getState(), 0)
        expected_command = [
            "sharepath/bin/in-target", "in-target", "umount-chroot",
            "--backend=lxd", "--series=xenial", "--arch=i386", self.buildid,
            ]
        self.assertEqual(OCIBuildState.UMOUNT, self.getState())
        self.assertEqual(expected_command, self.buildmanager.commands[-1])
        self.assertEqual(
            self.buildmanager.iterate, self.buildmanager.iterators[-1])
        self.assertFalse(self.builder.wasCalled("buildFail"))

    @defer.inlineCallbacks
    def test_iterate_with_file(self):
        # This sha would change as it includes file attributes in the
        # tar file. Fix it so we can test against a known value.
        sha_mock = self.useFixture(
            MockPatch('lpbuildd.oci.OCIBuildManager._calculateLayerSha'))
        sha_mock.mock.return_value = "testsha"
        # The build manager iterates a build that specifies a non-default
        # Dockerfile location from start to finish.
        args = {
            "git_repository": "https://git.launchpad.dev/~example/+git/snap",
            "git_path": "master",
            "build_file": "build-aux/Dockerfile",
            }
        expected_options = [
            "--git-repository", "https://git.launchpad.dev/~example/+git/snap",
            "--git-path", "master",
            "--build-file", "build-aux/Dockerfile",
            ]
        yield self.startBuild(args, expected_options)

        log_path = os.path.join(self.buildmanager._cachepath, "buildlog")
        with open(log_path, "w") as log:
            log.write("I am a build log.")

        self.buildmanager.backend.run.result = MockOCITarSave()

        self.buildmanager.backend.add_file(
            '/var/lib/docker/image/'
            'vfs/distribution/v2metadata-by-diffid/sha256/diff1',
            b"""[{"Digest": "test_digest", "SourceRepository": "test"}]"""
        )

        # After building the package, reap processes.
        yield self.buildmanager.iterate(0)
        expected_command = [
            "sharepath/bin/in-target", "in-target", "scan-for-processes",
            "--backend=lxd", "--series=xenial", "--arch=i386", self.buildid,
            ]
        self.assertEqual(OCIBuildState.BUILD_OCI, self.getState())
        self.assertEqual(expected_command, self.buildmanager.commands[-1])
        self.assertNotEqual(
            self.buildmanager.iterate, self.buildmanager.iterators[-1])
        self.assertFalse(self.builder.wasCalled("buildFail"))
        expected_files = [
            'manifest.json',
            'layer-1.tar.gz',
            'layer-2.tar.gz',
            'digests.json',
            'config.json',
            ]
        for expected in expected_files:
            self.assertThat(self.builder.waitingfiles, Contains(expected))

        cache_path = self.builder.cachePath(
            self.builder.waitingfiles['digests.json'])
        with open(cache_path, "rb") as f:
            digests_contents = f.read()
        digests_expected = [{
            "sha256:diff1": {
                "source": "test",
                "digest": "test_digest",
                "layer_id": "layer-1"
            },
            "sha256:diff2": {
                "source": "",
                "digest": "testsha",
                "layer_id": "layer-2"
            }
        }]
        self.assertEqual(digests_contents, json.dumps(digests_expected))

        # Control returns to the DebianBuildManager in the UMOUNT state.
        self.buildmanager.iterateReap(self.getState(), 0)
        expected_command = [
            "sharepath/bin/in-target", "in-target", "umount-chroot",
            "--backend=lxd", "--series=xenial", "--arch=i386", self.buildid,
            ]
        self.assertEqual(OCIBuildState.UMOUNT, self.getState())
        self.assertEqual(expected_command, self.buildmanager.commands[-1])
        self.assertEqual(
            self.buildmanager.iterate, self.buildmanager.iterators[-1])
        self.assertFalse(self.builder.wasCalled("buildFail"))

    @defer.inlineCallbacks
    def test_iterate_snap_store_proxy(self):
        # The build manager can be told to use a snap store proxy.
        self.builder._config.set(
            "proxy", "snapstore", "http://snap-store-proxy.example/")
        expected_options = [
            "--snap-store-proxy-url", "http://snap-store-proxy.example/"]
        yield self.startBuild(options=expected_options)
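The digests.json these tests assert against is a one-element list wrapping a dict keyed by layer diff ID; each entry records the layer's tarball name (`layer_id`), its digest (the upstream digest when docker already knows the layer's source repository, otherwise a locally computed sha), and the `source` repository (empty for locally built layers). A hedged sketch of consuming that file, with a hypothetical helper name:

```python
import json
import os
import tempfile


def layers_by_diff_id(digests_path):
    # digests.json is a one-element list wrapping {diff_id: entry},
    # matching the structure asserted in the tests above.
    with open(digests_path) as f:
        [digest_map] = json.load(f)
    return {
        diff_id: (entry['layer_id'], entry['digest'], entry['source'])
        for diff_id, entry in digest_map.items()
    }
```

A consumer on the Launchpad side could use the `layer_id` to locate the corresponding `<layer_id>.tar.gz` in the waiting files, and the digest to decide whether the layer is already cached.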
While looking at the Launchpad buildbehaviour side of this, I realised that the MP doesn't support the build file location.
It should do that.