Merge ~ltrager/maas-images:diff_patch into maas-images:master

Proposed by Lee Trager
Status: Merged
Merge reported by: Lee Trager
Merged at revision: 6a8501a0848641c04647140a7efac8609cdaeb31
Proposed branch: ~ltrager/maas-images:diff_patch
Merge into: maas-images:master
Prerequisite: ~ltrager/maas-images:daily_to_candidate
Diff against target: 724 lines (+569/-12)
10 files modified
README (+37/-0)
conf/cpc-patch.yaml (+2/-0)
doc/example-patches/delete-products.yaml (+9/-0)
doc/example-patches/delete-version.yaml (+13/-0)
doc/example-patches/modify-metadata.yaml (+19/-0)
doc/example-patches/promote-product.yaml (+11/-0)
doc/example-patches/promote-version.yaml (+21/-0)
meph2/commands/flags.py (+53/-0)
meph2/commands/meph2_util.py (+388/-7)
meph2/util.py (+16/-5)
Reviewer: Dougal Matthews (community), status: Needs Information
Review via email: mp+387911@code.launchpad.net

Commit message

Add the diff and patch subcommands to meph2-util.

The diff and patch commands are used to modify the MAAS images stream in
a structured way. While SimpleStreams supports multiple labels per stream,
these commands operate under the assumption that each stream consistently
uses the same label.

The diff command finds the differences between two MAAS streams. The
streams may be either local or remote. The resulting YAML file may be
edited and passed to patch to modify the stream.

The patch command accepts a YAML file describing how the stream should
be modified. This command allows you to modify metadata, add or remove
products, and add or remove product versions. Products may be defined
using a Python regular expression.
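
As a rough sketch of the round trip described above (the file name, product,
and version below are hypothetical; the structure follows the examples in
doc/example-patches), the YAML produced by diff can be edited in place and
then handed back to patch:

import yaml

# Load the diff output, mark one version for promotion to 'stable',
# and write the file back so it can be passed to 'meph2-util patch'.
with open('cpc-patch.yaml') as f:
    patch = yaml.safe_load(f)

product = patch['com.ubuntu.maas:v3:download.json'][
    'com.ubuntu.maas:v3:boot:20.04:amd64:ga-20.04']
product['versions']['20200722.1']['labels'].append('stable')

with open('cpc-patch.yaml', 'w') as f:
    yaml.safe_dump(patch, f)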

Revision history for this message
Dougal Matthews (d0ugal) wrote :

Two small comments. Otherwise the change looks good but I find it quite hard to review. I wonder if we can start to introduce some unit tests for the Python code?

review: Needs Information
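
For illustration, a minimal pytest-style sketch of the kind of unit test
suggested above could exercise one of the pure helpers introduced in this
branch, for example get_product_name_without_label (the product name below
is a hypothetical example):

from meph2.commands.meph2_util import get_product_name_without_label


def test_get_product_name_without_label_strips_label():
    product = "com.ubuntu.maas:candidate:v3:boot:20.04:amd64:ga-20.04"
    # The label component should be dropped, everything else kept as-is.
    assert get_product_name_without_label(product, "candidate") == (
        "com.ubuntu.maas:v3:boot:20.04:amd64:ga-20.04")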
~ltrager/maas-images:diff_patch updated
bae0560... by Lee Trager

Merge branch 'master' into daily_to_candidate

0e68521... by Lee Trager

Merge branch 'master' into diff_patch

35f0383... by Lee Trager

Merge branch 'master' into daily_to_candidate

6a8501a... by Lee Trager

Merge branch 'daily_to_candidate' into diff_patch

Preview Diff

1diff --git a/README b/README
2index 050bdcf..95fce17 100644
3--- a/README
4+++ b/README
5@@ -225,6 +225,43 @@ its usage is fairly simple.
6
7 The default is '3d'.
8
9+=== diff ===
10+It is often useful to compare the differences between two MAAS streams. While
11+SimpleStreams supports multiple labels in the same stream, this tool is written
12+for the MAAS streams; as such, it assumes each stream consistently uses its own
13+single label. The tool will identify missing product streams, missing
14+products, differences in the value of keys, and missing versions. Stream and
15+product names have their label removed. The streams may be local or remote.
16+Output defaults to stdout but may be written to a file with '-o'.
17+'--new-versions-only' may be used to only output new versions found in the
18+source stream. '--latest-only' may be used to only show the latest version
19+missing. The YAML output may be modified and passed to the 'patch' command.
20+
21+Example usage:
22+
23+$ meph2-util diff --latest-only -o conf/cpc-patch.yaml \
24+ http://images.maas.io/ephemeral-v3/candidate \
25+ http://images.maas.io/ephemeral-v3/stable
26+
27+=== patch ===
28+Modifying a stream can be done using a structured YAML file. A patch may
29+modify metadata, add and remove products, and add or remove product versions.
30+Products may be specified as a regular expression. When modifying or removing,
31+only a target stream is required. When adding a product or version, a source
32+stream must also be given. The source stream may be local or remote over
33+HTTP(S). Streams are resigned after being modified unless the '--no-sign'
34+flag is passed. '--dry-run' may be used to test the patch without actually
35+modifying the stream. The YAML file references streams and products without
36+a label in the name; the label is added automatically based on the stream.
37+
38+See doc/example-patches
39+
40+Example usage:
41+
42+$ meph2-util patch -i conf/cpc-patch.yaml \
43+ http://images.maas.io/ephemeral-v3/candidate \
44+ /path/to/stable
45+
46 === Known callers ===
47 This is just a list of known callers to reference when changing things.
48
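
The README section above notes that stream and product names have their label
removed. As a rough illustration of that stripping, mirroring the
get_product_name_without_label helper added further down in this diff (the
product name is a hypothetical example):

import re

label = 'candidate'
product = 'com.ubuntu.maas:candidate:v3:boot:20.04:amd64:ga-20.04'
# Same expression as the helper: drop the label component that follows
# the FQDN, keeping the rest of the product name intact.
m = re.search(r"^(?P<fqdn>.*)[\.:]%s(?P<product>:.*)$" % label, product)
print(''.join(m.groups()))
# -> com.ubuntu.maas:v3:boot:20.04:amd64:ga-20.04
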
49diff --git a/conf/cpc-patch.yaml b/conf/cpc-patch.yaml
50new file mode 100644
51index 0000000..c488aa7
52--- /dev/null
53+++ b/conf/cpc-patch.yaml
54@@ -0,0 +1,2 @@
55+# This patch is run nightly by CPC.
56+
57diff --git a/doc/example-patches/delete-products.yaml b/doc/example-patches/delete-products.yaml
58new file mode 100644
59index 0000000..a750968
60--- /dev/null
61+++ b/doc/example-patches/delete-products.yaml
62@@ -0,0 +1,9 @@
63+# Products can be entirely removed. The logic works similarly to removing a
64+# version. If the stream label isn't found in labels, it is removed.
65+#
66+# Note: Patch only modifies the stream metadata when deleting. To clean up
67+# left over files, use the find-orphans and reap-orphans commands.
68+
69+com.ubuntu.maas:v3:download.json:
70+ com.ubuntu.maas:v3:boot:(16\.10|17|18\.10|19).*:
71+ labels: {}
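
A small sketch of how the regular expression above selects products: the patch
command anchors the pattern ('^...$') and tests it against product names with
the label already stripped. The product names below are hypothetical:

import re

pattern = re.compile(r"^com.ubuntu.maas:v3:boot:(16\.10|17|18\.10|19).*$")
for name in ("com.ubuntu.maas:v3:boot:18.10:amd64:ga-18.10",
             "com.ubuntu.maas:v3:boot:20.04:amd64:ga-20.04"):
    # The first name matches and would be deleted; the second is kept.
    print(name, bool(pattern.search(name)))
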
72diff --git a/doc/example-patches/delete-version.yaml b/doc/example-patches/delete-version.yaml
73new file mode 100644
74index 0000000..79304d8
75--- /dev/null
76+++ b/doc/example-patches/delete-version.yaml
77@@ -0,0 +1,13 @@
78+# Deleting a version can be done by defining the product and version without
79+# a label matching the stream. The patch command supports Python regular
80+# expressions so you don't have to define each product individually. The
81+# patch below removes 20200721 from all 20.04 products. Because versions
82+# aren't being added, this patch may be used without defining a source.
83+#
84+# Note: Patch only modifies the stream metadata when deleting. To clean up
85+# left over files, use the find-orphans and reap-orphans commands.
86+
87+com.ubuntu.maas:v3:download.json:
88+ com.ubuntu.maas:v3:boot:20.04:.*:
89+ versions:
90+ '20200721': {}
91diff --git a/doc/example-patches/modify-metadata.yaml b/doc/example-patches/modify-metadata.yaml
92new file mode 100644
93index 0000000..8d932f0
94--- /dev/null
95+++ b/doc/example-patches/modify-metadata.yaml
96@@ -0,0 +1,19 @@
97+# Patch can be used to modify metadata. The patch command will use the
98+# value associated with the label of the stream. Let's say Ubuntu 20.10
99+# becomes an LTS release. By using a regular expression, you can easily
100+# patch all 20.10 products.
101+#
102+# Note: The dates are quoted; otherwise yaml.safe_load returns them as
103+# date objects.
104+
105+com.ubuntu.maas:v3:download.json:
106+ com.ubuntu.maas:v3:boot:20.10:.*:
107+ support_eol:
108+ candidate: "2030-07-22"
109+ stable: "2030-07-22"
110+ support_esm_eol:
111+ candidate: "2035-07-22"
112+ stable: "2035-07-22"
113+ release_title:
114+ candidate: 20.10 LTS
115+ stable: 20.10 LTS
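
A quick illustration of the quoting note above: yaml.safe_load parses bare ISO
dates into date objects, while quoted dates stay strings, which is what the
patch expects to write into the stream metadata:

import yaml

print(yaml.safe_load("support_eol: 2030-07-22"))
# -> {'support_eol': datetime.date(2030, 7, 22)}
print(yaml.safe_load('support_eol: "2030-07-22"'))
# -> {'support_eol': '2030-07-22'}
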
116diff --git a/doc/example-patches/promote-product.yaml b/doc/example-patches/promote-product.yaml
117new file mode 100644
118index 0000000..8d10e83
119--- /dev/null
120+++ b/doc/example-patches/promote-product.yaml
121@@ -0,0 +1,11 @@
122+# A product may be promoted like a version. Simply list the labels the product
123+# should belong to, and patch will add it to any stream whose label matches.
124+#
125+# Note: The patch command will require a source to pull the product from.
126+# The source may be local or remote over HTTP(S).
127+
128+com.ubuntu.maas:v3:download.json:
129+ com.ubuntu.maas:v3:boot:20.10:.*:
130+ labels:
131+ - candidate
132+ - stable
133diff --git a/doc/example-patches/promote-version.yaml b/doc/example-patches/promote-version.yaml
134new file mode 100644
135index 0000000..c72d6b5
136--- /dev/null
137+++ b/doc/example-patches/promote-version.yaml
138@@ -0,0 +1,21 @@
139+# The YAML below was generated by meph2-util diff and modified. meph2-util
140+# detected that the candidate stream has 20200722.1 while it is missing
141+# from stable. By adding 'stable' to the list of labels, 20200722.1 will be
142+# inserted into stable when the patch command is run.
143+#
144+# Note: The patch command will require a source to pull the version from.
145+# The source may be local or remote over HTTP(S).
146+
147+com.ubuntu.maas:v3:download.json:
148+ com.ubuntu.maas:v3:boot:20.04:amd64:ga-20.04:
149+ versions:
150+ '20200722.1':
151+ labels:
152+ - candidate
153+ - stable
154+ com.ubuntu.maas:v3:boot:20.04:amd64:ga-20.04-lowlatency:
155+ versions:
156+ '20200722.1':
157+ labels:
158+ - candidate
159+ - stable
160diff --git a/meph2/commands/flags.py b/meph2/commands/flags.py
161index f85060d..0fe76d0 100644
162--- a/meph2/commands/flags.py
163+++ b/meph2/commands/flags.py
164@@ -73,6 +73,59 @@ SUBCOMMANDS = {
165 COMMON_FLAGS['version'], COMMON_FLAGS['filters'],
166 ]
167 },
168+ 'diff': {
169+ 'help': (
170+ 'Creates a diff, represented by a YAML file, which shows how two '
171+ 'streams differ. This assumes each stream uses a separate label '
172+ 'consistently.'
173+ ),
174+ 'opts': [
175+ (
176+ ('-o', '--output'),
177+ {'help': 'Specify file to output to, defaults to STDOUT'}
178+ ),
179+ (
180+ ('--new-versions-only'),
181+ {
182+ 'help': (
183+ 'Only show new versions from the source in diff. '
184+ 'Do not include old versions from the target.'
185+ ),
186+ 'action': 'store_true',
187+ 'default': False,
188+ },
189+ ),
190+ (
191+ ('--latest-only'),
192+ {
193+ 'help': 'Only include the latest missing version in diff.',
194+ 'action': 'store_true',
195+ 'default': False,
196+ },
197+ ), COMMON_FLAGS['src'], COMMON_FLAGS['target'],
198+ ],
199+ },
200+ 'patch': {
201+ 'help': (
202+ 'Apply a patch, represented by a YAML file generated with diff, '
203+ 'which describes how a stream should be edited.'
204+ ),
205+ 'opts': [
206+ (
207+ ('-i', '--input'),
208+ {'help': 'Specify the patch YAML to apply, defaults to STDIN'}
209+ ), COMMON_FLAGS['dry-run'], COMMON_FLAGS['no-sign'],
210+ (
211+ ('streams', ), {
212+ 'action': 'append',
213+ 'nargs': '+',
214+ 'help': (
215+ 'The stream to apply the patch YAML to. Multiple '
216+ 'streams must be given to insert new versions'
217+ )
218+ }),
219+ ],
220+ },
221 'clean-md': {
222 'help': 'clean streams metadata only to keep "max" items',
223 'opts': [
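
A side note on the 'streams' option defined above: with action='append' and
nargs='+', argparse wraps the positional arguments in a nested list, which is
why main_patch below starts with args.streams[0]. A minimal sketch, with
hypothetical paths:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('streams', action='append', nargs='+')
args = parser.parse_args(['/path/to/candidate', '/path/to/stable'])
print(args.streams)
# -> [['/path/to/candidate', '/path/to/stable']]
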
224diff --git a/meph2/commands/meph2_util.py b/meph2/commands/meph2_util.py
225index f33b4be..9ca40bf 100755
226--- a/meph2/commands/meph2_util.py
227+++ b/meph2/commands/meph2_util.py
228@@ -3,9 +3,17 @@
229 import argparse
230 import copy
231 import os
232+import re
233 from functools import partial
234 import shutil
235 import sys
236+import yaml
237+
238+try:
239+ from urllib import request as urllib_request
240+except ImportError:
241+ # python2
242+ import urllib2 as urllib_request
243
244 from meph2 import util
245 from meph2.commands.flags import COMMON_ARGS, SUBCOMMANDS
246@@ -155,7 +163,8 @@ class ReleasePromoteMirror(InsertBareMirrorWriter):
247 del ptree['products'][oname]
248
249 def fixed_content_id(self, content_id):
250- # when promoting from candidate, our content ids get ':candidate' removed
251+ # when promoting from candidate, our content ids get ':candidate'
252+ # removed
253 # com.ubuntu.maas:candidate:v2:download => com.ubuntu.maas:v2:download
254 return(content_id.replace(":candidate", ""))
255
256@@ -217,16 +226,13 @@ class DryRunMirrorWriter(mirrors.DryRunMirrorWriter):
257 def main_insert(args):
258 (src_url, src_path) = sutil.path_from_mirror_url(args.src, None)
259 filter_list = filters.get_filters(args.filters)
260-
261 mirror_config = {'max_items': 20, 'keep_items': True,
262 'filters': filter_list}
263-
264 policy = partial(util.endswith_policy, src_path, args.keyring)
265 smirror = mirrors.UrlMirrorReader(src_url, policy=policy)
266+ tstore = objectstores.FileStore(args.target)
267
268 if args.dry_run:
269- smirror = mirrors.UrlMirrorReader(src_url, policy=policy)
270- tstore = objectstores.FileStore(args.target)
271 drmirror = DryRunMirrorWriter(config=mirror_config, objectstore=tstore)
272 drmirror.sync(smirror, src_path)
273 for (pedigree, path, size) in drmirror.downloading:
274@@ -235,8 +241,6 @@ def main_insert(args):
275 fmt.format(pedigree='/'.join(pedigree), path=path) + "\n")
276 return 0
277
278- smirror = mirrors.UrlMirrorReader(src_url, policy=policy)
279- tstore = objectstores.FileStore(args.target)
280 tmirror = InsertBareMirrorWriter(config=mirror_config, objectstore=tstore)
281 tmirror.sync(smirror, src_path)
282
283@@ -483,6 +487,383 @@ def main_import(args):
284 return(mimport.main_import(args))
285
286
287+def get_stream_label(product_streams):
288+ """Returns the label for the stream.
289+
290+ This assumes the stream uses a consistent label which is identified in
291+ the stream's filename using the format FQDN:label:[version]?:name"""
292+ stream_label = None
293+ for product_stream in product_streams:
294+ stream = os.path.basename(product_stream).split(':')
295+ label = stream[1]
296+ if stream_label:
297+ assert label == stream_label, "All labels must be identical!"
298+ else:
299+ stream_label = label
300+ return stream_label
301+
302+
303+def get_stream_name_without_label(product_stream):
304+ stream_name = os.path.basename(product_stream).split(':')
305+ del stream_name[1]
306+ return ':'.join(stream_name)
307+
308+
309+def get_product_name_without_label(product_name, label):
310+ m = re.search(
311+ r"^(?P<fqdn>.*)[\.:]%s(?P<product>:.*)$" % label, product_name)
312+ assert m, "Unable to find label %s in product %s!" % (label, product_name)
313+ return ''.join(m.groups())
314+
315+
316+def main_diff(args):
317+ src_product_streams = util.load_product_streams(args.src, True)
318+ src_label = get_stream_label(src_product_streams)
319+ target_product_streams = util.load_product_streams(args.target, True)
320+ target_label = get_stream_label(target_product_streams)
321+ diff = {}
322+
323+ # Iterate over both streams to make sure we capture anything
324+ # missing.
325+ for product_stream in src_product_streams + target_product_streams:
326+ diff_stream_name = get_stream_name_without_label(product_stream)
327+ if src_label in product_stream:
328+ src_product_stream = product_stream
329+ target_product_stream = product_stream.replace(
330+ src_label, target_label)
331+ else:
332+ src_product_stream = product_stream.replace(
333+ target_label, src_label)
334+ target_product_stream = product_stream
335+
336+ src_product_stream_path = os.path.join(args.src, src_product_stream)
337+ target_product_stream_path = os.path.join(
338+ args.target, target_product_stream)
339+
340+ src_stream_missing = False
341+ target_stream_missing = False
342+ if src_label in product_stream:
343+ try:
344+ content = util.load_content(src_product_stream_path, True)
345+ except OSError:
346+ src_stream_missing = True
347+ try:
348+ other_content = util.load_content(
349+ target_product_stream_path, True)
350+ except OSError:
351+ target_stream_missing = True
352+ else:
353+ try:
354+ content = util.load_content(target_product_stream_path, True)
355+ except OSError:
356+ target_stream_missing = True
357+ try:
358+ other_content = util.load_content(
359+ src_product_stream_path, True)
360+ except OSError:
361+ src_stream_missing = True
362+
363+ # Verify the product stream exists in both streams.
364+ if src_stream_missing or target_stream_missing:
365+ if diff_stream_name not in diff:
366+ diff[diff_stream_name] = {
367+ 'not_merged': (
368+ src_label if src_stream_missing else target_label)
369+ }
370+ else:
371+ diff[diff_stream_name]['not_merged'] = (
372+ src_label if src_stream_missing else target_label)
373+ continue
374+
375+ for product, data in content['products'].items():
376+ if src_label in product:
377+ label = src_label
378+ other_label = target_label
379+ else:
380+ label = target_label
381+ other_label = src_label
382+ other_product = product.replace(label, other_label)
383+ diff_product_name = get_product_name_without_label(product, label)
384+ # Verify the product is in both streams.
385+ if other_product not in other_content['products']:
386+ if diff_stream_name not in diff:
387+ diff[diff_stream_name] = {}
388+ if diff_product_name not in diff[diff_stream_name]:
389+ diff[diff_stream_name][diff_product_name] = {}
390+ diff[diff_stream_name][diff_product_name]['labels'] = [
391+ label]
392+ continue
393+ else:
394+ other_data = other_content['products'][other_product]
395+ for key, value in data.items():
396+ if key == 'versions':
397+ if args.new_versions_only and target_label in product:
398+ continue
399+ for version, version_data in value.items():
400+ if version in other_data.get('versions', {}):
401+ assert version_data == other_data[
402+ 'versions'][version], (
403+ "%s %s exists in both streams but data "
404+ " does not match!" % (product, version))
405+ else:
406+ if diff_stream_name not in diff:
407+ diff[diff_stream_name] = {}
408+ if diff_product_name not in diff[diff_stream_name]:
409+ diff[diff_stream_name][diff_product_name] = {}
410+ if 'versions' not in diff[
411+ diff_stream_name][diff_product_name]:
412+ diff[diff_stream_name][diff_product_name][
413+ 'versions'] = {}
414+ if version not in diff[
415+ diff_stream_name][diff_product_name][
416+ 'versions']:
417+ diff[diff_stream_name][diff_product_name][
418+ 'versions'][version] = {}
419+ diff[diff_stream_name][diff_product_name][
420+ 'versions'][version]['labels'] = [label]
421+ if args.latest_only:
422+ latest = version
423+ for diff_version in [
424+ i for i in diff[diff_stream_name][
425+ diff_product_name][
426+ 'versions'].keys()
427+ if i != latest]:
428+ if latest > diff_version:
429+ del diff[diff_stream_name][
430+ diff_product_name]['versions'][
431+ diff_version]
432+ else:
433+ del diff[diff_stream_name][
434+ diff_product_name]['versions'][
435+ latest]
436+ latest = diff_version
437+ elif key == 'label':
438+ # Label is expected to be different
439+ continue
440+ elif value != other_data.get(key):
441+ if diff_stream_name not in diff:
442+ diff[diff_stream_name] = {}
443+ if diff_product_name not in diff[diff_stream_name]:
444+ diff[diff_stream_name][diff_product_name] = {}
445+ # Keep dictionary order consistent
446+ if label == src_label:
447+ diff[diff_stream_name][diff_product_name][key] = {
448+ src_label: value,
449+ target_label: other_data.get(key),
450+ }
451+ else:
452+ diff[diff_stream_name][diff_product_name][key] = {
453+ src_label: other_data.get(key),
454+ target_label: value,
455+ }
456+
457+ def output(buff):
458+ buff.write("# Generated by %s-%s\n" % (
459+ os.path.basename(sys.argv[0]), util.get_version()))
460+ buff.write("# Generated on %s\n" % sutil.timestamp())
461+ buff.write("# Source: %s\n" % args.src)
462+ buff.write("# Target: %s\n" % args.target)
463+ buff.write("# new-versions-only: %s\n" % args.new_versions_only)
464+ buff.write("# latest-only: %s\n\n" % args.latest_only)
465+ yaml.safe_dump(diff, buff)
466+
467+ if args.output:
468+ if os.path.exists(args.output):
469+ os.remove(args.output)
470+ with open(args.output, 'w') as f:
471+ output(f)
472+ else:
473+ output(sys.stdout)
474+ return 0
475+
476+
477+def find_stream(diff_product_stream, product_streams):
478+ found = False
479+ for product_stream in product_streams:
480+ if diff_product_stream == get_stream_name_without_label(
481+ product_stream):
482+ found = True
483+ break
484+ # New product streams should be merged in.
485+ assert found, "Target stream %s not found!" % diff_product_stream
486+ return product_stream
487+
488+
489+def copy_items(version_data, src_path, target_path):
490+ for item in version_data['items'].values():
491+ src_item_path = os.path.join(src_path, item['path'])
492+ target_item_path = os.path.join(target_path, item['path'])
493+ if os.path.exists(target_item_path):
494+ # Items in a product may be referenced multiple times.
495+ # e.g. all kernel versions of the same arch use the same SquashFS.
496+ continue
497+ os.makedirs(os.path.dirname(target_item_path), exist_ok=True)
498+ if os.path.exists(src_item_path):
499+ print("INFO: Copying %s to %s" % (src_item_path, target_item_path))
500+ # Attempt to use a hard link when both streams are on the same
501+ # filesystem to save space. Will fall back to a copy.
502+ try:
503+ os.link(src_item_path, target_item_path)
504+ except OSError:
505+ shutil.copy2(src_item_path, target_item_path)
506+ else:
507+ print("INFO: Downloading %s to %s" % (
508+ src_item_path, target_item_path))
509+ urllib_request.urlretrieve(src_item_path, target_item_path)
510+ assert util.get_file_info(target_item_path)['sha256'] == item[
511+ 'sha256'], ("Target file %s hash %s does not match!" % (
512+ target_item_path, item['sha256']))
513+
514+
515+def patch_versions(
516+ value, args, target_label, target_product, target_data, target_path,
517+ src_content, src_product_stream_path, src_label, src_path):
518+ write_product_stream = False
519+ for version, version_data in value.items():
520+ if version in target_data['versions']:
521+ if target_label in version_data.get('labels', []):
522+ # If the version already exists in the target stream skip
523+ # adding it. This allows CPC to run a nightly cron job.
524+ print("INFO: Skipping, version %s already exists!" % version)
525+ else:
526+ print("INFO: Deleting version %s" % version)
527+ del target_data['versions'][version]
528+ write_product_stream = True
529+ elif target_label in version_data.get('labels', []):
530+ print("INFO: Adding version %s to %s" % (version, target_product))
531+ assert src_product_stream_path, (
532+ "A source must be given when adding a version to a product!")
533+ write_product_stream = True
534+ if not src_content:
535+ src_content = util.load_content(src_product_stream_path, True)
536+ src_product = target_product.replace(target_label, src_label)
537+ src_data = src_content['products'][src_product]
538+ target_data['versions'][version] = src_data['versions'][version]
539+ if not args.dry_run:
540+ copy_items(
541+ target_data['versions'][version], src_path, target_path)
542+ return write_product_stream
543+
544+
545+def main_patch(args):
546+ streams = args.streams[0]
547+ regenerate_index = False
548+ if len(streams) == 1:
549+ target_path = streams[0]
550+ target_product_streams = util.load_product_streams(streams[0])
551+ target_label = get_stream_label(target_product_streams)
552+ src_path = None
553+ src_product_streams = []
554+ src_label = None
555+ elif len(streams) == 2:
556+ target_path = streams[1]
557+ target_product_streams = util.load_product_streams(streams[1])
558+ target_label = get_stream_label(target_product_streams)
559+ src_path = streams[0]
560+ src_product_streams = util.load_product_streams(streams[0], True)
561+ src_label = get_stream_label(src_product_streams)
562+ else:
563+ raise AssertionError("A max of 2 streams can be given!")
564+
565+ if args.input:
566+ with open(args.input, 'r') as f:
567+ diff = yaml.safe_load(f)
568+ else:
569+ diff = yaml.safe_load(sys.stdin)
570+
571+ if diff is None:
572+ print("WARNING: No diff defined!")
573+ return 0
574+
575+ for product_stream, stream_data in diff.items():
576+ write_product_stream = False
577+ target_stream = find_stream(product_stream, target_product_streams)
578+ target_product_stream_path = os.path.join(target_path, target_stream)
579+ if src_product_streams:
580+ src_stream = find_stream(product_stream, src_product_streams)
581+ src_product_stream_path = os.path.join(src_path, src_stream)
582+ else:
583+ src_product_stream_path = None
584+
585+ for product, product_data in stream_data.items():
586+ found_product = False
587+ product_regex = re.compile(r"^%s$" % product)
588+ target_content = util.load_content(target_product_stream_path)
589+ # Only load source content when promoting a version. This allows
590+ # users to create a patch to modify the values or remove versions
591+ # without needing a source.
592+ src_content = None
593+ for target_product, target_data in list(
594+ target_content['products'].items()):
595+ if product_regex.search(get_product_name_without_label(
596+ target_product, target_label)):
597+ found_product = True
598+ print(
599+ "INFO: Found matching product in target for %s, %s" % (
600+ product, target_product))
601+ for key, value in product_data.items():
602+ if key == 'labels':
603+ if target_label not in value:
604+ print(
605+ "INFO: Deleting product %s" %
606+ target_product)
607+ del target_content['products'][target_product]
608+ regenerate_index = write_product_stream = True
609+ break
610+ elif key == 'versions':
611+ ret = patch_versions(
612+ value, args, target_label, target_product,
613+ target_data, target_path, src_content,
614+ src_product_stream_path, src_label,
615+ src_path)
616+ regenerate_index |= ret
617+ write_product_stream |= ret
618+ elif (
619+ target_label in value and
620+ target_data[key] != value[target_label]):
621+ print(
622+ "INFO: Updating key %s %s -> %s" % (
623+ key, target_data[key],
624+ value[target_label]))
625+ regenerate_index = write_product_stream = True
626+ target_data[key] = value[target_label]
627+ if not found_product:
628+ assert src_product_stream_path, (
629+ "A source must be given when adding a new product!")
630+ src_content = util.load_content(
631+ src_product_stream_path, True)
632+ for src_product, src_data in src_content['products'].items():
633+ if (
634+ product_regex.search(
635+ get_product_name_without_label(
636+ src_product, src_label))
637+ and target_label in product_data.get('labels', [])
638+ ):
639+ write_product_stream = found_product = True
640+ print("INFO: Promoting %s into %s" % (
641+ product, target_stream))
642+ new_product = src_product.replace(
643+ src_label, target_label)
644+ target_content['products'][new_product] = src_data
645+ target_content['products'][new_product][
646+ 'label'] = target_label
647+ if not args.dry_run:
648+ for version_data in src_data['versions'].values():
649+ copy_items(version_data, src_path, target_path)
650+ if write_product_stream and not args.dry_run:
651+ print("INFO: Writing %s" % target_product_stream_path)
652+ os.remove(target_product_stream_path)
653+ with open(target_product_stream_path, 'wb') as f:
654+ f.write(util.dump_data(target_content).strip())
655+ else:
656+ # Validate the modified stream is still valid during
657+ # a dry run.
658+ util.dump_data(target_content).strip()
659+ if regenerate_index and not args.dry_run:
660+ util.gen_index_and_sign(target_path, sign=not args.no_sign)
661+ return 0
662+
663+
664 def main():
665 parser = argparse.ArgumentParser()
666
667diff --git a/meph2/util.py b/meph2/util.py
668index 69e49ef..22eccbc 100644
669--- a/meph2/util.py
670+++ b/meph2/util.py
671@@ -1,6 +1,7 @@
672-# Copyright (C) 2013 Canonical Ltd.
673+# Copyright (C) 2013-2020 Canonical Ltd.
674 #
675 # Author: Scott Moser <scott.moser@canonical.com>
676+# Lee Trager <lee.trager@canonical.com>
677 #
678 # Simplestreams is free software: you can redistribute it and/or modify it
679 # under the terms of the GNU Affero General Public License as published by
680@@ -26,6 +27,7 @@ import hashlib
681 import json
682 import os
683 import re
684+import subprocess
685 import sys
686 import tempfile
687
688@@ -264,8 +266,8 @@ def dump_data(data, end_cr=True):
689 return bytestr
690
691
692-def load_content(path):
693- if not os.path.exists(path):
694+def load_content(path, allow_url=False):
695+ if not allow_url and not os.path.exists(path):
696 return {}
697 with scontentsource.UrlContentSource(path) as tcs:
698 return sutil.load_content(tcs.read())
699@@ -280,9 +282,9 @@ def load_products(path, product_streams):
700 return products
701
702
703-def load_product_streams(src):
704+def load_product_streams(src, allow_url=False):
705 index_path = os.path.join(src, STREAMS_D, "index.json")
706- if not os.path.exists(index_path):
707+ if not allow_url and not os.path.exists(index_path):
708 return []
709 with scontentsource.UrlContentSource(index_path) as tcs:
710 index = sutil.load_content(tcs.read())
711@@ -311,4 +313,13 @@ def ensure_product_entry(tree):
712 return
713
714
715+def get_version():
716+ try:
717+ return subprocess.check_output(
718+ ['git', 'describe', '--always', '--dirty'],
719+ cwd=os.path.dirname(__file__)).decode().strip()
720+ except subprocess.CalledProcessError:
721+ return 'unknown'
722+
723+
724 # vi: ts=4 expandtab syntax=python
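
One more note on the util.py changes above: the new allow_url flag skips the
local-existence check so that content can also be read over HTTP(S). A minimal
sketch, with hypothetical paths (the streams/v1 layout is assumed):

from meph2 import util

# Local streams behave as before; a missing path still yields {}.
local = util.load_content('/path/to/stable/streams/v1/index.json')

# Remote streams need allow_url=True; otherwise the os.path.exists()
# check on the URL returns False and an empty dict comes back.
remote = util.load_content(
    'http://images.maas.io/ephemeral-v3/candidate/streams/v1/index.json',
    allow_url=True)
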

Subscribers

People subscribed via source and target branches