Merge lp:~ltrager/maas-images/bootloaders into lp:maas-images

Proposed by Lee Trager
Status: Merged
Merged at revision: 306
Proposed branch: lp:~ltrager/maas-images/bootloaders
Merge into: lp:maas-images
Diff against target: 513 lines (+389/-47)
4 files modified
conf/bootloaders.yaml (+45/-0)
meph2/commands/dpkg.py (+183/-0)
meph2/commands/meph2_util.py (+160/-46)
meph2/netinst.py (+1/-1)
To merge this branch: bzr merge lp:~ltrager/maas-images/bootloaders
Reviewer: maintainers of maas images (status: Pending)
Review via email: mp+300402@code.launchpad.net

Commit message

Import and publish bootloaders from a yaml file.

This allows the import subcommand to accept a YAML file containing a list of bootloaders. The latest version of each bootloader package is downloaded from the Ubuntu archive. Files can either be copied out of the bootloader packages directly or be processed with grub-mkimage.
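The grub-mkimage path in this branch builds its module list by globbing `*.mod` files out of the extracted package directory. A minimal, self-contained sketch of that step (the directory contents here are invented for the demo):

```python
import glob
import os
import tempfile

def list_grub_modules(modules_path):
    """Collect GRUB module names (*.mod files, minus the extension)
    from a single extracted-package directory."""
    modules = []
    for module_path in glob.glob(os.path.join(modules_path, '*.mod')):
        name, _ = os.path.splitext(os.path.basename(module_path))
        modules.append(name)
    return sorted(modules)

# Throwaway directory standing in for an extracted bootloader package.
demo = tempfile.mkdtemp(prefix='maas-images-demo-')
for mod in ('net', 'tftp', 'http'):
    open(os.path.join(demo, mod + '.mod'), 'w').close()
print(list_grub_modules(demo))  # → ['http', 'net', 'tftp']
```

The resulting names are what the branch passes to `grub-mkimage` so only modules from one directory end up in the generated image.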

This also fixes a bug in the import command: if any version of a product already existed, no newer version would be imported. With the fix, more up-to-date CentOS images can be published.
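The corrected skip logic can be sketched as follows; the structure mirrors the product tree in the diff, but the product and version IDs here are invented. The key change is that only an already-published version is skipped, not the whole product:

```python
def should_skip(product_tree, product_id, version):
    """Skip only when this exact version is already published,
    not whenever the product exists at all (the old buggy behaviour)."""
    product = product_tree['products'].get(product_id)
    return product is not None and version in product['versions']

# Invented example tree: one CentOS product with one published version.
tree = {'products': {'com.example:centos70:amd64': {'versions': {'20160701': {}}}}}
print(should_skip(tree, 'com.example:centos70:amd64', '20160701'))  # → True
print(should_skip(tree, 'com.example:centos70:amd64', '20160715'))  # → False
```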

Description of the change

This allows bootloaders to be published in the MAAS SimpleStreams data. Because the new products currently break the MAAS import process (due to missing fields), bootloaders will only be added to http://images.maas.io/ephemeral-v3/

Example product listing: http://paste.ubuntu.com/20006681/

lp:~ltrager/maas-images/bootloaders updated
309. By Lee Trager

Fix PEP8 errors

310. By Lee Trager

We don't need a boot_loader entry, we already have the ftype

Revision history for this message
Blake Rouse (blake-rouse) wrote :

See my comment about the release for the bootloaders. I have not looked closely at the code yet.

lp:~ltrager/maas-images/bootloaders updated
311. By Lee Trager

blake_r corrections

Revision history for this message
Lee Trager (ltrager) wrote :

The bootloader import process now iterates through &lt;release&gt;-updates, &lt;release&gt;-security, and &lt;release&gt; to find the latest version of each package. This ensures we always publish the latest version available in the Ubuntu archive.

I also added the ability for an individual bootloader to override the global release. This allows us to pull bootloaders from different releases. Right now they all pull from Xenial.
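The pocket-priority search described above can be sketched as below. The index lookup and the version comparator are injected so the example is self-contained; the branch uses its own GPG-verified `get_packages()` and `apt_pkg.version_compare()`, for which a naive lexicographic compare stands in here:

```python
def find_latest(archive, release, pkg_name, arch, get_packages, compare):
    """Check <release>-updates, <release>-security, then <release>,
    keeping the highest version seen across the three pockets."""
    best = None
    for dist in ('%s-updates' % release, '%s-security' % release, release):
        packages = get_packages('%s/dists/%s' % (archive, dist), arch, pkg_name)
        candidate = packages.get(pkg_name)
        if candidate is None:
            continue
        if best is None or compare(candidate['Version'], best['Version']) > 0:
            best = candidate
    return best

# Toy archive index; the real code downloads and GPG-verifies Packages.xz.
fake_index = {
    'http://archive.example/dists/xenial-updates': {
        'shim-signed': {'Version': '1.19~16.04.1'}},
    'http://archive.example/dists/xenial-security': {},
    'http://archive.example/dists/xenial': {
        'shim-signed': {'Version': '1.18'}},
}
lookup = lambda url, arch, name: fake_index[url]
cmp_ver = lambda a, b: (a > b) - (a < b)  # stand-in for apt_pkg.version_compare
print(find_latest('http://archive.example', 'xenial', 'shim-signed', 'amd64',
                  lookup, cmp_ver)['Version'])  # → 1.19~16.04.1
```

Note the naive comparator is only safe for this demo; real Debian version ordering (epochs, `~` sorting before everything) needs `apt_pkg.version_compare`.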

lp:~ltrager/maas-images/bootloaders updated
312. By Lee Trager

Make each bootloader specify a release

Revision history for this message
Lee Trager (ltrager) wrote :

After thinking about this, we should just make each bootloader specify a release. We already require each one to specify an archive.

Preview Diff

1=== added file 'conf/bootloaders.yaml'
2--- conf/bootloaders.yaml 1970-01-01 00:00:00 +0000
3+++ conf/bootloaders.yaml 2016-07-21 22:02:46 +0000
4@@ -0,0 +1,45 @@
5+product_id: "com.ubuntu.maas.daily:bootloaders-bases:{bootloader}"
6+content_id: "com.ubuntu.maas:daily:bootloaders-bases-download"
7+
8+bootloaders:
9+ - bootloader: pxe
10+ packages:
11+ - pxelinux
12+ - syslinux-common
13+ arch: amd64
14+ archive: http://archive.ubuntu.com/ubuntu
15+ release: xenial
16+ files:
17+ - /usr/lib/PXELINUX/pxelinux.0
18+ - /usr/lib/syslinux/modules/bios/*.c32
19+ - bootloader: uefi
20+ packages:
21+ - shim-signed
22+ - grub-efi-amd64-signed
23+ arch: amd64
24+ archive: http://archive.ubuntu.com/ubuntu
25+ release: xenial
26+ files:
27+ - /usr/lib/shim/shim.efi.signed
28+ - /usr/lib/grub/x86_64-efi-signed/grubnetx64.efi.signed
29+ - /usr/lib/grub/x86_64-efi-signed/grubx64.efi.signed
30+ - bootloader: uefi-arm64
31+ packages:
32+ - grub-efi-arm64-bin
33+ arch: arm64
34+ archive: http://ports.ubuntu.com
35+ release: xenial
36+ files:
37+ - /usr/lib/grub/arm64-efi/
38+ grub_format: arm64-efi
39+ grub_output: grubaa64.efi
40+ - bootloader: powerkvm
41+ packages:
42+ - grub-ieee1275-bin
43+ arch: ppc64el
44+ archive: http://ports.ubuntu.com
45+ release: xenial
46+ files:
47+ - /usr/lib/grub/powerpc-ieee1275/
48+ grub_format: powerpc-ieee1275
49+ grub_output: bootppc64.bin
50
51=== added file 'meph2/commands/dpkg.py'
52--- meph2/commands/dpkg.py 1970-01-01 00:00:00 +0000
53+++ meph2/commands/dpkg.py 2016-07-21 22:02:46 +0000
54@@ -0,0 +1,183 @@
55+from platform import linux_distribution
56+import apt_pkg
57+import shutil
58+import subprocess
59+import hashlib
60+import io
61+import lzma
62+import os
63+import re
64+import sys
65+import tempfile
66+import urllib.request
67+import glob
68+
69+
70+def get_distro_release():
71+ """Returns the release name for the running distro."""
72+ distname, version, codename = linux_distribution()
73+ return codename
74+
75+
76+def get_file(url):
77+ """Downloads the file from the given URL into memory.
78+
79+ :param url: URL to download
80+ :return: File data, or None
81+ """
82+ # Build a new opener so that the environment is checked for proxy
83+ # URLs. Using urllib.request.urlopen() directly would only pick up
84+ # the proxies defined when urlopen() was called the first time.
85+ try:
86+ response = urllib.request.build_opener().open(url)
87+ return response.read()
88+ except urllib.error.URLError as e:
89+ sys.stderr.write("Unable to download %s: %s" % (url, str(e.reason)))
90+ sys.exit(1)
91+ except BaseException as e:
92+ sys.stderr.write("Unable to download %s: %s" % (url, str(e)))
93+ sys.exit(1)
94+
95+
96+def get_sha256(data):
97+ """Returns the SHA256SUM for the provided data."""
98+ sha256 = hashlib.sha256()
99+ sha256.update(data)
100+ return sha256.hexdigest()
101+
102+
103+def gpg_verify_data(signature, data_file):
104+ """Verify's data using the signature."""
105+ tmp = tempfile.mkdtemp(prefix='maas-images')
106+ sig_out = os.path.join(tmp, 'verify.gpg')
107+ with open(sig_out, 'wb') as stream:
108+ stream.write(signature)
109+
110+ data_out = os.path.join(tmp, 'verify')
111+ with open(data_out, 'wb') as stream:
112+ stream.write(data_file)
113+
114+ subprocess.check_output(
115+ ['gpgv', '--keyring', '/etc/apt/trusted.gpg', sig_out, data_out],
116+ stderr=subprocess.STDOUT)
117+
118+ shutil.rmtree(tmp, ignore_errors=True)
119+
120+
121+# Cache packages
122+_packages = {}
123+
124+
125+def get_packages(base_url, architecture, pkg_name):
126+ """Gets the package list from the archive verified."""
127+ global _packages
128+ release_url = '%s/%s' % (base_url, 'Release')
129+ path = 'main/binary-%s/Packages.xz' % architecture
130+ packages_url = '%s/%s' % (base_url, path)
131+ if packages_url in _packages:
132+ return _packages[packages_url]
133+ release_file = get_file(release_url)
134+ release_file_gpg = get_file('%s.gpg' % release_url)
135+ gpg_verify_data(release_file_gpg, release_file)
136+
137+ # Download the packages file and verify the SHA256SUM
138+ pkg_data = get_file(packages_url)
139+ regex_path = re.escape(path)
140+ sha256sum = re.search(
141+ ("^\s*?([a-fA-F0-9]{64})\s*[0-9]+\s+%s$" % regex_path).encode('utf-8'),
142+ release_file,
143+ re.MULTILINE).group(1)
144+ if get_sha256(pkg_data).encode('utf-8') != sha256sum:
145+ sys.stderr.write("Unable to verify %s" % packages_url)
146+ sys.exit(1)
147+
148+ _packages[packages_url] = {}
149+ compressed = io.BytesIO(pkg_data)
150+ with lzma.LZMAFile(compressed) as uncompressed:
151+ pkg_name = None
152+ package = {}
153+ for line in uncompressed:
154+ line = line.decode('utf-8')
155+ if line == '\n':
156+ _packages[packages_url][pkg_name] = package
157+ pkg_name = None
158+ package = {}
159+ continue
160+ key, value = line.split(': ', 1)
161+ value = value.strip()
162+ if key == 'Package':
163+ pkg_name = value
164+ else:
165+ package[key] = value
166+
167+ return _packages[packages_url]
168+
169+
170+def get_package(archive, pkg_name, architecture, release=None, dest=None):
171+ """Look through the archives for package metadata. If a dest is given
172+ download the package.
173+
174+ :return: A dictionary containing the packages meta info or if a dest is
175+ given the path of the downloaded file."""
176+ global _packages
177+ release = get_distro_release() if release is None else release
178+ package = None
179+ apt_pkg.init()
180+ # Find the latest version of the package
181+ for dist in ('%s-updates' % release, '%s-security' % release, release):
182+ base_url = '%s/dists/%s' % (archive, dist)
183+ packages = get_packages(base_url, architecture, pkg_name)
184+ if pkg_name in packages:
185+ if package is None or apt_pkg.version_compare(
186+ packages[pkg_name]['Version'], package['Version']) > 0:
187+ package = packages[pkg_name]
188+ # Download it if it was found and a dest was set
189+ if package is not None and dest is not None:
190+ pkg_data = get_file('%s/%s' % (archive, package['Filename']))
191+ if package['SHA256'] != get_sha256(pkg_data):
192+ sys.stderr.write(
193+ 'SHA256 mismatch on %s from %s' % (pkg_name, base_url))
194+ sys.exit(1)
195+ pkg_path = os.path.join(dest, os.path.basename(package['Filename']))
196+ with open(pkg_path, 'wb') as stream:
197+ stream.write(pkg_data)
198+ return package
199+
200+
201+def extract_files_from_packages(
202+ archive, packages, architecture, files, release, dest,
203+ grub_format=None):
204+ tmp = tempfile.mkdtemp(prefix='maas-images-')
205+ for pkg_name in packages:
206+ package = get_package(archive, pkg_name, architecture, release, tmp)
207+ if package is None:
208+ sys.stderr.write('%s not found in archives!' % pkg_name)
209+ sys.exit(1)
210+ pkg_path = os.path.join(tmp, os.path.basename(package['Filename']))
211+ subprocess.check_output(['dpkg', '-x', pkg_path, tmp])
212+
213+ if grub_format is None:
214+ for i in files:
215+ src = "%s/%s" % (tmp, i)
216+ if '*' in src or '?' in src:
217+ unglobbed_files = glob.glob(src)
218+ else:
219+ unglobbed_files = [src]
220+ for f in unglobbed_files:
221+ dest_file = "%s/%s" % (dest, os.path.basename(f))
222+ shutil.copyfile(f, dest_file)
223+ else:
224+ # You can only tell grub to use modules from one directory
225+ modules_path = "%s/%s" % (tmp, files[0])
226+ modules = []
227+ for module_path in glob.glob("%s/*.mod" % modules_path):
228+ module_filename = os.path.basename(module_path)
229+ module_name, _ = os.path.splitext(module_filename)
230+ modules.append(module_name)
231+ subprocess.check_output(
232+ ['grub-mkimage',
233+ '-o', dest,
234+ '-O', grub_format,
235+ '-d', modules_path,
236+ '-c', dest] + modules)
237+ shutil.rmtree(tmp)
238
239=== modified file 'meph2/commands/meph2_util.py'
240--- meph2/commands/meph2_util.py 2016-06-06 15:49:10 +0000
241+++ meph2/commands/meph2_util.py 2016-07-21 22:02:46 +0000
242@@ -1,6 +1,7 @@
243 #!/usr/bin/python3
244
245 import argparse
246+import glob
247 import copy
248 import os
249 from functools import partial
250@@ -13,6 +14,10 @@
251
252 from meph2 import util
253 from meph2.url_helper import geturl_text
254+from meph2.commands.dpkg import (
255+ get_package,
256+ extract_files_from_packages,
257+)
258
259 from simplestreams import (
260 contentsource,
261@@ -406,10 +411,12 @@
262 def load_products(path, product_streams):
263 products = {}
264 for product_stream in product_streams:
265- with contentsource.UrlContentSource(
266- os.path.join(path, product_stream)) as tcs:
267- product_listing = sutil.load_content(tcs.read())
268- products.update(product_listing['products'])
269+ product_stream_path = os.path.join(path, product_stream)
270+ if os.path.exists(product_stream_path):
271+ with contentsource.UrlContentSource(
272+ product_stream_path) as tcs:
273+ product_listing = sutil.load_content(tcs.read())
274+ products.update(product_listing['products'])
275 return products
276
277
278@@ -455,25 +462,7 @@
279 return 0
280
281
282-def main_import(args):
283- cfg_path = os.path.join(
284- os.path.dirname(__file__), "..", "..", "conf", args.import_cfg)
285- if not os.path.exists(cfg_path):
286- if os.path.exists(args.import_cfg):
287- cfg_path = args.import_cfg
288- else:
289- print("Error: Unable to find config file %s" % args.import_cfg)
290- os.exit(1)
291-
292- target_product_streams = load_product_streams(args.target)
293- target_products = load_products(args.target, target_product_streams)
294-
295- with open(cfg_path) as fp:
296- cfgdata = yaml.load(fp)
297-
298- product_tree = util.empty_iid_products(cfgdata['content_id'])
299- product_tree['updated'] = sutil.timestamp()
300- product_tree['datatype'] = 'image-downloads'
301+def import_sha256(args, product_tree, cfgdata):
302 for (release, release_info) in cfgdata['versions'].items():
303 if 'arch' in release_info:
304 arch = release_info['arch']
305@@ -483,34 +472,40 @@
306 os_name = release_info['os']
307 else:
308 os_name = cfgdata['os']
309-
310- product_id = cfgdata['product_id'].format(
311- version=release_info['version'], arch=arch)
312-
313- # If the product already exists don't regenerate the image, just copy
314- # its metadata
315- if product_id in target_products:
316- product_tree['products'][product_id] = target_products[product_id]
317- continue
318-
319- product_tree['products'][product_id] = {
320- 'subarches': 'generic',
321- 'label': 'daily',
322- 'subarch': 'generic',
323- 'arch': arch,
324- 'os': os_name,
325- 'version': release_info['version'],
326- 'release': release,
327- 'versions': {},
328- }
329 if 'path_version' in release_info:
330 path_version = release_info['path_version']
331 else:
332 path_version = release_info['version']
333+ product_id = cfgdata['product_id'].format(
334+ version=release_info['version'], arch=arch)
335 url = cfgdata['sha256_meta_data_path'].format(version=path_version)
336+ images = get_sha256_meta_images(url)
337 base_url = os.path.dirname(url)
338- images = get_sha256_meta_images(url)
339+
340+ if product_tree['products'].get(product_id) is None:
341+ print("Creating new product %s" % product_id)
342+ product_tree['products'][product_id] = {
343+ 'subarches': 'generic',
344+ 'label': 'daily',
345+ 'subarch': 'generic',
346+ 'arch': arch,
347+ 'os': os_name,
348+ 'version': release_info['version'],
349+ 'release': release,
350+ 'versions': {},
351+ }
352+
353 for (image, image_info) in images.items():
354+ if (
355+ product_id in product_tree['products'] and
356+ image in product_tree['products'][product_id]['versions']):
357+ print(
358+ "Product %s at version %s exists, skipping" % (
359+ product_id, image))
360+ continue
361+ print(
362+ "Downloading and creating %s version %s" % (
363+ (product_id, image)))
364 image_path = '/'.join([release, arch, image, 'root-tgz'])
365 real_image_path = os.path.join(
366 os.path.realpath(args.target), image_path)
367@@ -528,12 +523,131 @@
368 }
369 }
370 }
371+
372+
373+def get_file_info(f):
374+ size = 0
375+ sha256 = hashlib.sha256()
376+ with open(f, 'rb') as f:
377+ for chunk in iter(lambda: f.read(2**15), b''):
378+ sha256.update(chunk)
379+ size += len(chunk)
380+ return sha256.hexdigest(), size
381+
382+
383+def import_bootloaders(args, product_tree, cfgdata):
384+ for bootloader in cfgdata['bootloaders']:
385+ product_id = cfgdata['product_id'].format(
386+ bootloader=bootloader['bootloader'])
387+ package = get_package(
388+ bootloader['archive'], bootloader['packages'][0],
389+ bootloader['arch'], bootloader['release'])
390+
391+ if (
392+ product_id in product_tree['products'] and
393+ package['Version'] in product_tree['products'][product_id][
394+ 'versions']):
395+ print(
396+ "Product %s at version %s exists, skipping" % (
397+ product_id, package['Version']))
398+ continue
399+ if product_tree['products'].get(product_id) is None:
400+ print("Creating new product %s" % product_id)
401+ product_tree['products'][product_id] = {
402+ 'label': 'daily',
403+ 'arch': bootloader['arch'],
404+ 'subarch': 'generic',
405+ 'subarches': 'generic',
406+ 'os': 'bootloader',
407+ 'release': bootloader['bootloader'],
408+ 'versions': {},
409+ }
410+ path = os.path.join(
411+ 'bootloaders', bootloader['bootloader'], bootloader['arch'],
412+ package['Version'])
413+ dest = os.path.join(args.target, path)
414+ os.makedirs(dest)
415+ grub_format = bootloader.get('grub_format')
416+ if grub_format is not None:
417+ dest = os.path.join(dest, bootloader['grub_output'])
418+ print(
419+ "Downloading and creating %s version %s" % (
420+ product_id, package['Version']))
421+ extract_files_from_packages(
422+ bootloader['archive'], bootloader['packages'],
423+ bootloader['arch'], bootloader['files'], bootloader['release'],
424+ dest, grub_format)
425+ if grub_format is not None:
426+ sha256, size = get_file_info(dest)
427+ product_tree['products'][product_id]['versions'][
428+ package['Version']] = {
429+ 'items': {
430+ bootloader['grub_output']: {
431+ 'ftype': 'bootloader',
432+ 'sha256': sha256,
433+ 'path': os.path.join(
434+ path, bootloader['grub_output']),
435+ 'size': size,
436+ }
437+ }
438+ }
439+ else:
440+ items = {}
441+ for i in bootloader['files']:
442+ basename = os.path.basename(i)
443+ dest_file = os.path.join(dest, basename)
444+ if '*' in dest_file or '?' in dest_file:
445+ unglobbed_files = glob.glob(dest_file)
446+ else:
447+ unglobbed_files = [dest_file]
448+ for f in unglobbed_files:
449+ basename = os.path.basename(f)
450+ sha256, size = get_file_info(f)
451+ items[basename] = {
452+ 'ftype': 'bootloader',
453+ 'sha256': sha256,
454+ 'path': os.path.join(path, basename),
455+ 'size': size,
456+ }
457+ product_tree['products'][product_id]['versions'][
458+ package['Version']] = {'items': items}
459+
460+
461+def main_import(args):
462+ cfg_path = os.path.join(
463+ os.path.dirname(__file__), "..", "..", "conf", args.import_cfg)
464+ if not os.path.exists(cfg_path):
465+ if os.path.exists(args.import_cfg):
466+ cfg_path = args.import_cfg
467+ else:
468+ print("Error: Unable to find config file %s" % args.import_cfg)
469+ sys.exit(1)
470+
471+ with open(cfg_path) as fp:
472+ cfgdata = yaml.load(fp)
473+
474+ target_product_stream = os.path.join(
475+ 'streams', 'v1', cfgdata['content_id'] + '.json')
476+
477+ product_tree = util.empty_iid_products(cfgdata['content_id'])
478+ product_tree['products'] = load_products(
479+ args.target, [target_product_stream])
480+ product_tree['updated'] = sutil.timestamp()
481+ product_tree['datatype'] = 'image-downloads'
482+
483+ if cfgdata.get('sha256_meta_data_path', None) is not None:
484+ import_sha256(args, product_tree, cfgdata)
485+ elif cfgdata.get('bootloaders', None) is not None:
486+ import_bootloaders(args, product_tree, cfgdata)
487+ else:
488+ sys.stderr.write('Unsupported import yaml!')
489+ sys.exit(1)
490+
491 md_d = os.path.join(args.target, 'streams', 'v1')
492 if not os.path.exists(md_d):
493 os.makedirs(md_d)
494
495- product_tree_fn = cfgdata['content_id'] + '.json'
496- with open(os.path.join(md_d, product_tree_fn), 'wb') as fp:
497+ with open(os.path.join(args.target, target_product_stream), 'wb') as fp:
498 fp.write(util.dump_data(product_tree))
499
500 gen_index_and_sign(args.target, not args.no_sign)
501
502=== modified file 'meph2/netinst.py'
503--- meph2/netinst.py 2016-06-07 17:30:20 +0000
504+++ meph2/netinst.py 2016-07-21 22:02:46 +0000
505@@ -113,7 +113,7 @@
506 "dtb": re.compile(r".dtb$").search,
507 }
508
509-IGNORED_INITRD_FLAVORS = ('xen', 'cdrom', 'gtk', 'hd-media')
510+IGNORED_INITRD_FLAVORS = ('xen', 'cdrom', 'gtk', 'hd-media')
511
512 # #
513 # # Under a path like: MIRROR/precise-updates/main/installer-i386/
