Merge ~paride/simplestreams:apply-black into simplestreams:master

Proposed by Paride Legovini
Status: Work in progress
Proposed branch: ~paride/simplestreams:apply-black
Merge into: simplestreams:master
Diff against target: 10555 lines (+3956/-2434)
47 files modified
.git-blame-ignore-revs (+4/-0)
.launchpad.yaml (+38/-0)
.pre-commit-config.yaml (+15/-0)
bin/json2streams (+2/-2)
bin/sstream-mirror (+118/-66)
bin/sstream-mirror-glance (+197/-114)
bin/sstream-query (+84/-47)
bin/sstream-sync (+111/-63)
pyproject.toml (+6/-0)
setup.py (+17/-14)
simplestreams/checksum_util.py (+33/-16)
simplestreams/contentsource.py (+33/-26)
simplestreams/filters.py (+10/-6)
simplestreams/generate_simplestreams.py (+53/-38)
simplestreams/json2streams.py (+34/-26)
simplestreams/log.py (+13/-9)
simplestreams/mirrors/__init__.py (+153/-100)
simplestreams/mirrors/command_hook.py (+69/-50)
simplestreams/mirrors/glance.py (+305/-206)
simplestreams/objectstores/__init__.py (+49/-26)
simplestreams/objectstores/s3.py (+7/-7)
simplestreams/objectstores/swift.py (+48/-37)
simplestreams/openstack.py (+126/-66)
simplestreams/util.py (+100/-79)
tests/httpserver.py (+24/-13)
tests/testutil.py (+5/-4)
tests/unittests/test_badmirrors.py (+72/-33)
tests/unittests/test_command_hook_mirror.py (+17/-12)
tests/unittests/test_contentsource.py (+62/-51)
tests/unittests/test_generate_simplestreams.py (+322/-179)
tests/unittests/test_glancemirror.py (+702/-390)
tests/unittests/test_json2streams.py (+126/-95)
tests/unittests/test_mirrorreaders.py (+24/-14)
tests/unittests/test_mirrorwriters.py (+6/-6)
tests/unittests/test_openstack.py (+184/-151)
tests/unittests/test_resolvework.py (+90/-38)
tests/unittests/test_signed_data.py (+7/-7)
tests/unittests/test_util.py (+232/-132)
tests/unittests/tests_filestore.py (+19/-13)
tools/install-deps (+1/-1)
tools/js2signed (+4/-3)
tools/make-test-data (+240/-185)
tools/sign_helper.py (+8/-4)
tools/tab2streams (+31/-15)
tools/toolutil.py (+66/-43)
tools/ubuntu_versions.py (+71/-43)
tox.ini (+18/-4)
Reviewer            Review Type             Date Requested  Status
Server Team CI bot  continuous-integration                  Approve
simplestreams-dev                                           Pending
Review via email: mp+440053@code.launchpad.net

Commit message

- Apply black and isort
- Add black and isort pre-commit hooks
- Run (read-only) pre-commit in tox environment
- Run tox in lpci
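
For reviewers, the moving parts fit together roughly like this (commands as
wired up in the .launchpad.yaml and .pre-commit-config.yaml hunks in the diff
below):

    tox -e flake8,pre-commit    # lint job: flake8 plus read-only black/isort hooks
    tox -e py3                  # unit tests, as run by the unit-* jobs
    pre-commit autoupdate       # refresh the pinned hook versions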

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Needs Fixing (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Needs Fixing (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Needs Fixing (continuous-integration)
Revision history for this message
Paride Legovini (paride) wrote :
~paride/simplestreams:apply-black updated
246f30e... by Paride Legovini

lpci: don't run the pre-commit environment

Apparently the lpci workers do not have access to github.com,
so pre-commit can't download the hooks.
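
(pre-commit installs each hook by cloning the repository named in its repo:
URL on first use, so on a worker with no route to github.com even a read-only
invocation such as

    pre-commit run --all-files

fails at the hook-installation step, before any hook can run.)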

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Needs Fixing (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Needs Fixing (continuous-integration)
~paride/simplestreams:apply-black updated
274817d... by Paride Legovini

install-deps: install pkg-config on non-amd64 archs

a4d7a3f... by Paride Legovini

lpci: expand the test matrix

Cover: (amd64, arm64, ppc64el, s390x) * (focal, jammy, devel)

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
~paride/simplestreams:apply-black updated
39c2e39... by Paride Legovini

lpci: add test dependency: python3-dev

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
~paride/simplestreams:apply-black updated
19608c6... by Paride Legovini

mm

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Needs Fixing (continuous-integration)
~paride/simplestreams:apply-black updated
b405e84... by Paride Legovini

tox: pass the lowercase *_proxy variables

Workaround for https://github.com/tox-dev/tox/pull/2378/.
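
(A sketch of the failure mode this works around: tox passes the upper-case
HTTP_PROXY/HTTPS_PROXY/NO_PROXY variables through to test environments by
default, but not their lower-case spellings, so on a proxied builder a run
like

    http_proxy=http://squid.internal:3128 tox -e py3    # illustrative proxy URL

starts with the lower-case variable stripped unless it is listed in passenv
explicitly.)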

363d3b4... by Paride Legovini

tox: skip_install in the pre_commit environment

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
~paride/simplestreams:apply-black updated
75441dd... by Paride Legovini

ci: deal with pre-commit modifying files

We don't need to jump through hoops to avoid pre-commit modifying files
during the tox run: if that happens, the CI run will fail anyway.

5449069... by Paride Legovini

prefer bin

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Needs Fixing (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
~paride/simplestreams:apply-black updated
4c88164... by Paride Legovini

use-install-deps

The tool requires sudo to install dependencies.

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Needs Fixing (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
Revision history for this message
Adam Collard (adam-collard) wrote :

I wouldn't mix a large reformatting like this (applying black across the codebase) with anything else.

I suggest landing the blacken commit, then separately isort, then separately again tox / lp-ci changes.

You'll want to ignore the black commits for e.g. git blame, but the tox and lp-ci config changes should be reviewed in their own right
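
(For reference, the .git-blame-ignore-revs file added in this MP is consumed
with

    git config blame.ignoreRevsFile .git-blame-ignore-revs

after which git blame skips the listed reformatting commits.)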

Revision history for this message
Server Team CI bot (server-team-bot) wrote :
review: Approve (continuous-integration)
Revision history for this message
Paride Legovini (paride) wrote :

Hi Adam, thanks for chiming in. Yeah I started this with the idea of quickly fixing CI (currently failing in master), then I started experimenting with integrating pre-commit, tox and lpci and the changeset grew.

I'll definitely split it up and resubmit it in smaller chunks.

Revision history for this message
Paride Legovini (paride) wrote :

CI is stuck (running for 17 hours). I'll do a no-op force push to retrigger.

~paride/simplestreams:apply-black updated
6da3cc1... by Paride Legovini

ci: append 127.0.0.1 to no_proxy
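
(The run stanzas in .launchpad.yaml use shell parameter expansion so the comma
is added only when no_proxy is already set:

    no_proxy="${no_proxy:+$no_proxy,}127.0.0.1" tox -e py3

yielding either "127.0.0.1" or "<existing>,127.0.0.1"; presumably this keeps
requests to the local test HTTP server from going through the proxy.)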

Revision history for this message
Paride Legovini (paride) wrote :

CI passed! Next steps:

- Clean up git history
- Split this MP into smaller ones.

I'll get back to this after 2023-06-13.

Revision history for this message
Adam Collard (adam-collard) wrote :

@Paride - don't forget about this one!

Revision history for this message
Paride Legovini (paride) wrote :

heh, knowing that other people are waiting for it will help. thanks!

Unmerged commits

6da3cc1... by Paride Legovini

ci: append 127.0.0.1 to no_proxy

Succeeded
[SUCCEEDED] lint:0 (build)
[SUCCEEDED] unit-jammy:0 (build)
[SUCCEEDED] unit-devel:0 (build)
[SUCCEEDED] lint:0 (build)
[SUCCEEDED] unit-jammy:0 (build)
[SUCCEEDED] unit-devel:0 (build)
[SUCCEEDED] lint:0 (build)
[SUCCEEDED] unit-jammy:0 (build)
[SUCCEEDED] unit-devel:0 (build)
[SUCCEEDED] lint:0 (build)
[SUCCEEDED] unit-jammy:0 (build)
[SUCCEEDED] unit-devel:0 (build)
12 of 12 results
4c88164... by Paride Legovini

use-install-deps

The tool requires sudo to install dependencies.

Failed
[WAITING] lint:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] lint:0 (build)
[FAILED] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] lint:0 (build)
[FAILED] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[WAITING] lint:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
12 of 12 results
5449069... by Paride Legovini

prefer bin

19608c6... by Paride Legovini

mm

Failed
[SUCCEEDED] lint:0 (build)
[FAILED] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] lint:0 (build)
[FAILED] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] lint:0 (build)
[FAILED] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[WAITING] lint:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
12 of 12 results
75441dd... by Paride Legovini

ci: deal with pre-commit modifying files

We don't need to jump through hoops to avoid pre-commit modifying files
during the tox run: if that happens, the CI run will fail anyway.

363d3b4... by Paride Legovini

tox: skip_install in the pre_commit environment

b405e84... by Paride Legovini

tox: pass the lowercase *_proxy variables

Workaround for https://github.com/tox-dev/tox/pull/2378/.

39c2e39... by Paride Legovini

lpci: add test dependency: python3-dev

Failed
[SUCCEEDED] linters:0 (build)
[SUCCEEDED] unit-focal:0 (build)
[SUCCEEDED] unit-jammy:0 (build)
[FAILED] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[SUCCEEDED] unit-focal:0 (build)
[SUCCEEDED] unit-jammy:0 (build)
[FAILED] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[SUCCEEDED] unit-focal:0 (build)
[SUCCEEDED] unit-jammy:0 (build)
[FAILED] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[SUCCEEDED] unit-focal:0 (build)
[SUCCEEDED] unit-jammy:0 (build)
[FAILED] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
32 of 32 results
a4d7a3f... by Paride Legovini

lpci: expand the test matrix

Cover: (amd64, arm64, ppc64el, s390x) * (focal, jammy, devel)

Failed
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[SUCCEEDED] unit-focal:0 (build)
[FAILED] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[SUCCEEDED] unit-focal:0 (build)
[FAILED] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
[SUCCEEDED] linters:0 (build)
[FAILED] unit-focal:0 (build)
[WAITING] unit-jammy:0 (build)
[WAITING] unit-devel:0 (build)
32 of 32 results
274817d... by Paride Legovini

install-deps: install pkg-config on non-amd64 archs

Preview Diff

diff --git a/.git-blame-ignore-revs b/.git-blame-ignore-revs
new file mode 100644
index 0000000..da1123d
--- /dev/null
+++ b/.git-blame-ignore-revs
@@ -0,0 +1,4 @@
+# Automatically apply to git blame with `git config blame.ignorerevsfile .git-blame-ignore-revs`
+
+# Apply black and isort formatting
+0d4060f7fa2de1e5b8c8b263100b1ed3a2c479bb
diff --git a/.launchpad.yaml b/.launchpad.yaml
new file mode 100644
index 0000000..20a3afb
--- /dev/null
+++ b/.launchpad.yaml
@@ -0,0 +1,38 @@
+pipeline:
+  - lint
+  - unit-jammy
+  - unit-devel
+
+jobs:
+  lint:
+    series: jammy
+    architectures:
+      - amd64
+    packages:
+      - git
+      - tox
+    run: tox -e flake8,pre-commit
+  unit-jammy:
+    series: jammy
+    architectures:
+      - amd64
+      - arm64
+      - ppc64el
+      - s390x
+    packages:
+      - sudo
+    run: |
+      tools/install-deps tox
+      no_proxy="${no_proxy:+$no_proxy,}127.0.0.1" tox -e py3
+  unit-devel:
+    series: devel
+    architectures:
+      - amd64
+      - arm64
+      - ppc64el
+      - s390x
+    packages:
+      - sudo
+    run: |
+      tools/install-deps tox
+      no_proxy="${no_proxy:+$no_proxy,}127.0.0.1" tox -e py3
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
new file mode 100644
index 0000000..0d08ba7
--- /dev/null
+++ b/.pre-commit-config.yaml
@@ -0,0 +1,15 @@
+# Hooks called in the "manual" stage are meant to be read-only. We'll trigger
+# them from tox, and we don't want the tox pre-commit environment to modify
+# code, as this may interfere with other environments.
+#
+# To update the pinned versions run: pre-commit autoupdate
+
+repos:
+  - repo: https://github.com/ambv/black
+    rev: 23.3.0
+    hooks:
+      - id: black
+  - repo: https://github.com/pycqa/isort
+    rev: 5.12.0
+    hooks:
+      - id: isort
diff --git a/bin/json2streams b/bin/json2streams
index bca6a7b..7dfd873 100755
--- a/bin/json2streams
+++ b/bin/json2streams
@@ -2,8 +2,8 @@
 # Copyright (C) 2013, 2015 Canonical Ltd.

 import sys
-from simplestreams.json2streams import main

+from simplestreams.json2streams import main

-if __name__ == '__main__':
+if __name__ == "__main__":
     sys.exit(main())
diff --git a/bin/sstream-mirror b/bin/sstream-mirror
index 08b71a5..fbe86da 100755
--- a/bin/sstream-mirror
+++ b/bin/sstream-mirror
@@ -18,11 +18,7 @@
 import argparse
 import sys

-from simplestreams import filters
-from simplestreams import log
-from simplestreams import mirrors
-from simplestreams import objectstores
-from simplestreams import util
+from simplestreams import filters, log, mirrors, objectstores, util


 class DotProgress(object):
@@ -39,9 +35,10 @@ class DotProgress(object):
             self.curpath = path
             status = ""
             if self.expected:
-                status = (" %02s%%" %
-                          (int(self.bytes_read * 100 / self.expected)))
+                status = " %02s%%" % (
+                    int(self.bytes_read * 100 / self.expected)
+                )
-            sys.stderr.write('=> %s [%s]%s\n' % (path, total, status))
+            sys.stderr.write("=> %s [%s]%s\n" % (path, total, status))

         if cur == total:
             sys.stderr.write("\n")
@@ -52,7 +49,7 @@ class DotProgress(object):
         toprint = int(cur * self.columns / total) - self.printed
         if toprint <= 0:
             return
-        sys.stderr.write('.' * toprint)
+        sys.stderr.write("." * toprint)
         sys.stderr.flush()
         self.printed += toprint

@@ -60,81 +57,135 @@
 def main():
     parser = argparse.ArgumentParser()

-    parser.add_argument('--keep', action='store_true', default=False,
-                        help='keep items in target up to MAX items '
-                             'even after they have fallen out of the source')
-    parser.add_argument('--max', type=int, default=None,
-                        help='store at most MAX items in the target')
-    parser.add_argument('--path', default=None,
-                        help='sync from index or products file in mirror')
-    parser.add_argument('--no-item-download', action='store_true',
-                        default=False,
-                        help='do not download items with a "path"')
-    parser.add_argument('--dry-run', action='store_true', default=False,
-                        help='only report what would be done')
-    parser.add_argument('--progress', action='store_true', default=False,
-                        help='show progress for downloading files')
-    parser.add_argument('--mirror', action='append', default=[],
-                        dest="mirrors",
-                        help='additional mirrors to find referenced files')
-
-    parser.add_argument('--verbose', '-v', action='count', default=0)
-    parser.add_argument('--log-file', default=sys.stderr,
-                        type=argparse.FileType('w'))
-
-    parser.add_argument('--keyring', action='store', default=None,
-                        help='keyring to be specified to gpg via --keyring')
-    parser.add_argument('--no-verify', '-U', action='store_false',
-                        dest='verify', default=True,
-                        help="do not gpg check signed json files")
-    parser.add_argument('--no-checksumming-reader', action='store_false',
-                        dest='checksumming_reader', default=True,
-                        help=("do not call 'insert_item' with a reader"
-                              " that does checksumming."))
-
-    parser.add_argument('source_mirror')
-    parser.add_argument('output_d')
-    parser.add_argument('filters', nargs='*', default=[])
+    parser.add_argument(
+        "--keep",
+        action="store_true",
+        default=False,
+        help="keep items in target up to MAX items "
+        "even after they have fallen out of the source",
+    )
+    parser.add_argument(
+        "--max",
+        type=int,
+        default=None,
+        help="store at most MAX items in the target",
+    )
+    parser.add_argument(
+        "--path",
+        default=None,
+        help="sync from index or products file in mirror",
+    )
+    parser.add_argument(
+        "--no-item-download",
+        action="store_true",
+        default=False,
+        help='do not download items with a "path"',
+    )
+    parser.add_argument(
+        "--dry-run",
+        action="store_true",
+        default=False,
+        help="only report what would be done",
+    )
+    parser.add_argument(
+        "--progress",
+        action="store_true",
+        default=False,
+        help="show progress for downloading files",
+    )
+    parser.add_argument(
+        "--mirror",
+        action="append",
+        default=[],
+        dest="mirrors",
+        help="additional mirrors to find referenced files",
+    )
+
+    parser.add_argument("--verbose", "-v", action="count", default=0)
+    parser.add_argument(
+        "--log-file", default=sys.stderr, type=argparse.FileType("w")
+    )
+
+    parser.add_argument(
+        "--keyring",
+        action="store",
+        default=None,
+        help="keyring to be specified to gpg via --keyring",
+    )
+    parser.add_argument(
+        "--no-verify",
+        "-U",
+        action="store_false",
+        dest="verify",
+        default=True,
+        help="do not gpg check signed json files",
+    )
+    parser.add_argument(
+        "--no-checksumming-reader",
+        action="store_false",
+        dest="checksumming_reader",
+        default=True,
+        help=(
+            "do not call 'insert_item' with a reader"
+            " that does checksumming."
+        ),
+    )
+
+    parser.add_argument("source_mirror")
+    parser.add_argument("output_d")
+    parser.add_argument("filters", nargs="*", default=[])

     args = parser.parse_args()

-    (mirror_url, initial_path) = util.path_from_mirror_url(args.source_mirror,
-                                                           args.path)
+    (mirror_url, initial_path) = util.path_from_mirror_url(
+        args.source_mirror, args.path
+    )

     def policy(content, path):
-        if initial_path.endswith('sjson'):
-            return util.read_signed(content,
-                                    keyring=args.keyring,
-                                    checked=args.verify)
+        if initial_path.endswith("sjson"):
+            return util.read_signed(
+                content, keyring=args.keyring, checked=args.verify
+            )
         else:
             return content

     filter_list = filters.get_filters(args.filters)
-    mirror_config = {'max_items': args.max, 'keep_items': args.keep,
-                     'filters': filter_list,
-                     'item_download': not args.no_item_download,
-                     'checksumming_reader': args.checksumming_reader}
+    mirror_config = {
+        "max_items": args.max,
+        "keep_items": args.keep,
+        "filters": filter_list,
+        "item_download": not args.no_item_download,
+        "checksumming_reader": args.checksumming_reader,
+    }

     level = (log.ERROR, log.INFO, log.DEBUG)[min(args.verbose, 2)]
     log.basicConfig(stream=args.log_file, level=level)

-    smirror = mirrors.UrlMirrorReader(mirror_url, mirrors=args.mirrors,
-                                      policy=policy)
+    smirror = mirrors.UrlMirrorReader(
+        mirror_url, mirrors=args.mirrors, policy=policy
+    )
     tstore = objectstores.FileStore(args.output_d)

-    drmirror = mirrors.DryRunMirrorWriter(config=mirror_config,
-                                          objectstore=tstore)
+    drmirror = mirrors.DryRunMirrorWriter(
+        config=mirror_config, objectstore=tstore
+    )
     drmirror.sync(smirror, initial_path)

     def print_diff(char, items):
         for pedigree, path, size in items:
             fmt = "{char} {pedigree} {path} {size} Mb\n"
             size = int(size / (1024 * 1024))
-            sys.stderr.write(fmt.format(
-                char=char, pedigree=' '.join(pedigree), path=path, size=size))
-
-    print_diff('+', drmirror.downloading)
-    print_diff('-', drmirror.removing)
+            sys.stderr.write(
+                fmt.format(
+                    char=char,
+                    pedigree=" ".join(pedigree),
+                    path=path,
+                    size=size,
+                )
+            )
+
+    print_diff("+", drmirror.downloading)
+    print_diff("-", drmirror.removing)
     sys.stderr.write("%d Mb change\n" % (drmirror.size / (1024 * 1024)))

     if args.dry_run:
@@ -147,13 +198,14 @@ def main():

     tstore = objectstores.FileStore(args.output_d, complete_callback=callback)

-    tmirror = mirrors.ObjectFilterMirror(config=mirror_config,
-                                         objectstore=tstore)
+    tmirror = mirrors.ObjectFilterMirror(
+        config=mirror_config, objectstore=tstore
+    )

     tmirror.sync(smirror, initial_path)


-if __name__ == '__main__':
+if __name__ == "__main__":
     main()

 # vi: ts=4 expandtab syntax=python
diff --git a/bin/sstream-mirror-glance b/bin/sstream-mirror-glance
index 5907137..953a2d3 100755
--- a/bin/sstream-mirror-glance
+++ b/bin/sstream-mirror-glance
@@ -23,15 +23,11 @@ import argparse
 import os.path
 import sys

-from simplestreams import objectstores
-from simplestreams.objectstores import swift
-from simplestreams import log
-from simplestreams import mirrors
-from simplestreams import openstack
-from simplestreams import util
+from simplestreams import log, mirrors, objectstores, openstack, util
 from simplestreams.mirrors import glance
+from simplestreams.objectstores import swift

-DEFAULT_FILTERS = ['ftype~(disk1.img|disk.img)', 'arch~(x86_64|amd64|i386)']
+DEFAULT_FILTERS = ["ftype~(disk1.img|disk.img)", "arch~(x86_64|amd64|i386)"]
 DEFAULT_KEYRING = "/usr/share/keyrings/ubuntu-cloudimage-keyring.gpg"


@@ -44,94 +40,172 @@ class StdoutProgressAggregator(util.ProgressAggregator):
         super(StdoutProgressAggregator, self).__init__(remaining_items)

     def emit(self, progress):
-        size = float(progress['size'])
-        written = float(progress['written'])
-        print("%.2f %s (%d of %d images) - %.2f" %
-              (written / size, progress['name'],
-               self.total_image_count - len(self.remaining_items) + 1,
-               self.total_image_count,
-               float(self.total_written) / self.total_size))
+        size = float(progress["size"])
+        written = float(progress["written"])
+        print(
+            "%.2f %s (%d of %d images) - %.2f"
+            % (
+                written / size,
+                progress["name"],
+                self.total_image_count - len(self.remaining_items) + 1,
+                self.total_image_count,
+                float(self.total_written) / self.total_size,
+            )
+        )


 def main():
     parser = argparse.ArgumentParser()

-    parser.add_argument('--keep', action='store_true', default=False,
-                        help='keep items in target up to MAX items '
-                             'even after they have fallen out of the source')
-    parser.add_argument('--max', type=int, default=None,
-                        help='store at most MAX items in the target')
-
-    parser.add_argument('--region', action='append', default=None,
-                        dest='regions',
-                        help='operate on specified region '
-                             '[useable multiple times]')
-
-    parser.add_argument('--mirror', action='append', default=[],
-                        dest="mirrors",
-                        help='additional mirrors to find referenced files')
-    parser.add_argument('--path', default=None,
-                        help='sync from index or products file in mirror')
-    parser.add_argument('--output-dir', metavar="DIR", default=False,
-                        help='write image data to storage in dir')
-    parser.add_argument('--output-swift', metavar="prefix", default=False,
-                        help='write image data to swift under prefix')
-
-    parser.add_argument('--name-prefix', metavar="prefix", default=None,
-                        help='prefix for each published image name')
-    parser.add_argument('--cloud-name', metavar="name", default=None,
-                        required=True, help='unique name for this cloud')
-    parser.add_argument('--modify-hook', metavar="cmd", default=None,
-                        required=False,
-                        help='invoke cmd on each image prior to upload')
-    parser.add_argument('--content-id', metavar="name", default=None,
-                        required=True,
-                        help='content-id to use for published data.'
-                             ' may contain "%%(region)s"')
-
-    parser.add_argument('--progress', action='store_true', default=False,
-                        help='display per-item download progress')
-    parser.add_argument('--verbose', '-v', action='count', default=0)
-    parser.add_argument('--log-file', default=sys.stderr,
-                        type=argparse.FileType('w'))
-
-    parser.add_argument('--keyring', action='store', default=DEFAULT_KEYRING,
-                        help='The keyring for gpg --keyring')
-
-    parser.add_argument('source_mirror')
-    parser.add_argument('item_filters', nargs='*', default=DEFAULT_FILTERS,
-                        help="Filter expression for mirrored items. "
-                        "Multiple filter arguments can be specified"
-                        "and will be combined with logical AND. "
-                        "Expressions are key[!]=literal_string "
-                        "or key[!]~regexp.")
-
-    parser.add_argument('--hypervisor-mapping', action='store_true',
-                        default=False,
-                        help="Set hypervisor_type attribute on stored images "
-                        "and the virt attribute in the associated stream "
-                        "data. This is useful in OpenStack Clouds which use "
-                        "multiple hypervisor types with in a single region.")
-
-    parser.add_argument('--custom-property', action='append', default=[],
-                        dest="custom_properties",
-                        help='additional properties to add to glance'
-                             ' image metadata (key=value format).')
-
-    parser.add_argument('--visibility', action='store', default='public',
-                        choices=('public', 'private', 'community', 'shared'),
-                        help='Visibility to apply to stored images.')
-
-    parser.add_argument('--image-import-conversion', action='store_true',
-                        default=False,
-                        help="Enable conversion of images to raw format using "
-                        "image import option in Glance.")
-
-    parser.add_argument('--set-latest-property', action='store_true',
-                        default=False,
-                        help="Set 'latest=true' property to latest synced "
-                        "os_version/architecture image metadata and remove "
-                        "latest property from the old images.")
+    parser.add_argument(
+        "--keep",
+        action="store_true",
+        default=False,
+        help="keep items in target up to MAX items "
+        "even after they have fallen out of the source",
+    )
+    parser.add_argument(
+        "--max",
+        type=int,
+        default=None,
+        help="store at most MAX items in the target",
+    )
+
+    parser.add_argument(
+        "--region",
+        action="append",
+        default=None,
+        dest="regions",
+        help="operate on specified region " "[useable multiple times]",
+    )
+
+    parser.add_argument(
+        "--mirror",
+        action="append",
+        default=[],
+        dest="mirrors",
+        help="additional mirrors to find referenced files",
+    )
+    parser.add_argument(
+        "--path",
+        default=None,
+        help="sync from index or products file in mirror",
+    )
+    parser.add_argument(
+        "--output-dir",
+        metavar="DIR",
+        default=False,
+        help="write image data to storage in dir",
+    )
+    parser.add_argument(
+        "--output-swift",
+        metavar="prefix",
+        default=False,
+        help="write image data to swift under prefix",
+    )
+
+    parser.add_argument(
+        "--name-prefix",
+        metavar="prefix",
+        default=None,
+        help="prefix for each published image name",
+    )
+    parser.add_argument(
+        "--cloud-name",
+        metavar="name",
+        default=None,
+        required=True,
+        help="unique name for this cloud",
+    )
+    parser.add_argument(
+        "--modify-hook",
+        metavar="cmd",
+        default=None,
+        required=False,
+        help="invoke cmd on each image prior to upload",
+    )
+    parser.add_argument(
+        "--content-id",
+        metavar="name",
+        default=None,
+        required=True,
+        help="content-id to use for published data."
+        ' may contain "%%(region)s"',
+    )
+
+    parser.add_argument(
+        "--progress",
+        action="store_true",
+        default=False,
+        help="display per-item download progress",
+    )
+    parser.add_argument("--verbose", "-v", action="count", default=0)
+    parser.add_argument(
+        "--log-file", default=sys.stderr, type=argparse.FileType("w")
+    )
+
+    parser.add_argument(
+        "--keyring",
+        action="store",
+        default=DEFAULT_KEYRING,
+        help="The keyring for gpg --keyring",
+    )
+
+    parser.add_argument("source_mirror")
+    parser.add_argument(
+        "item_filters",
+        nargs="*",
+        default=DEFAULT_FILTERS,
+        help="Filter expression for mirrored items. "
+        "Multiple filter arguments can be specified"
+        "and will be combined with logical AND. "
+        "Expressions are key[!]=literal_string "
+        "or key[!]~regexp.",
+    )

+    parser.add_argument(
+        "--hypervisor-mapping",
+        action="store_true",
+        default=False,
+        help="Set hypervisor_type attribute on stored images "
+        "and the virt attribute in the associated stream "
+        "data. This is useful in OpenStack Clouds which use "
+        "multiple hypervisor types with in a single region.",
+    )
+
+    parser.add_argument(
+        "--custom-property",
+        action="append",
+        default=[],
+        dest="custom_properties",
+        help="additional properties to add to glance"
+        " image metadata (key=value format).",
+    )
+
+    parser.add_argument(
+        "--visibility",
+        action="store",
+        default="public",
+        choices=("public", "private", "community", "shared"),
+        help="Visibility to apply to stored images.",
+    )
+
+    parser.add_argument(
+        "--image-import-conversion",
+        action="store_true",
+        default=False,
+        help="Enable conversion of images to raw format using "
+        "image import option in Glance.",
+    )
+
+    parser.add_argument(
+        "--set-latest-property",
+        action="store_true",
+        default=False,
+        help="Set 'latest=true' property to latest synced "
+        "os_version/architecture image metadata and remove "
+        "latest property from the old images.",
+    )

     args = parser.parse_args()

@@ -139,27 +213,32 @@ def main():
     if args.modify_hook:
         modify_hook = args.modify_hook.split()

-    mirror_config = {'max_items': args.max, 'keep_items': args.keep,
-                     'cloud_name': args.cloud_name,
-                     'modify_hook': modify_hook,
-                     'item_filters': args.item_filters,
-                     'hypervisor_mapping': args.hypervisor_mapping,
-                     'custom_properties': args.custom_properties,
-                     'visibility': args.visibility,
-                     'image_import_conversion': args.image_import_conversion,
-                     'set_latest_property': args.set_latest_property}
+    mirror_config = {
+        "max_items": args.max,
+        "keep_items": args.keep,
+        "cloud_name": args.cloud_name,
+        "modify_hook": modify_hook,
+        "item_filters": args.item_filters,
+        "hypervisor_mapping": args.hypervisor_mapping,
+        "custom_properties": args.custom_properties,
+        "visibility": args.visibility,
+        "image_import_conversion": args.image_import_conversion,
+        "set_latest_property": args.set_latest_property,
+    }

-    (mirror_url, args.path) = util.path_from_mirror_url(args.source_mirror,
-                                                        args.path)
+    (mirror_url, args.path) = util.path_from_mirror_url(
+        args.source_mirror, args.path
+    )

     def policy(content, path):  # pylint: disable=W0613
-        if args.path.endswith('sjson'):
+        if args.path.endswith("sjson"):
             return util.read_signed(content, keyring=args.keyring)
         else:
             return content

-    smirror = mirrors.UrlMirrorReader(mirror_url, mirrors=args.mirrors,
-                                      policy=policy)
+    smirror = mirrors.UrlMirrorReader(
+        mirror_url, mirrors=args.mirrors, policy=policy
+    )
     if args.output_dir and args.output_swift:
         error("--output-dir and --output-swift are mutually exclusive\n")
         sys.exit(1)
@@ -169,7 +248,7 @@ def main():

     regions = args.regions
     if regions is None:
-        regions = openstack.get_regions(services=['image'])
+        regions = openstack.get_regions(services=["image"])

     for region in regions:
         if args.output_dir:
@@ -181,25 +260,29 @@ def main():
             sys.stderr.write("not writing data anywhere\n")
             tstore = None

-        mirror_config['content_id'] = args.content_id % {'region': region}
+        mirror_config["content_id"] = args.content_id % {"region": region}

         if args.progress:
-            drmirror = glance.ItemInfoDryRunMirror(config=mirror_config,
-                                                   objectstore=tstore)
+            drmirror = glance.ItemInfoDryRunMirror(
+                config=mirror_config, objectstore=tstore
+            )
             drmirror.sync(smirror, args.path)
             p = StdoutProgressAggregator(drmirror.items)
             progress_callback = p.progress_callback
         else:
             progress_callback = None

-        tmirror = glance.GlanceMirror(config=mirror_config,
-                                      objectstore=tstore, region=region,
-                                      name_prefix=args.name_prefix,
-                                      progress_callback=progress_callback)
+        tmirror = glance.GlanceMirror(
+            config=mirror_config,
+            objectstore=tstore,
+            region=region,
+            name_prefix=args.name_prefix,
+            progress_callback=progress_callback,
+        )
         tmirror.sync(smirror, args.path)


-if __name__ == '__main__':
+if __name__ == "__main__":
     main()

 # vi: ts=4 expandtab syntax=python
diff --git a/bin/sstream-query b/bin/sstream-query
index 6534317..3752cac 100755
--- a/bin/sstream-query
+++ b/bin/sstream-query
@@ -16,11 +16,6 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.

-from simplestreams import filters
-from simplestreams import mirrors
-from simplestreams import log
-from simplestreams import util
-
 import argparse
 import errno
 import json
@@ -28,6 +23,8 @@ import pprint
 import signal
 import sys

+from simplestreams import filters, log, mirrors, util
+
 FORMAT_PRETTY = "PRETTY"
 FORMAT_JSON = "JSON"
 DEFAULT_KEYRING = "/usr/share/keyrings/ubuntu-cloudimage-keyring.gpg"
@@ -43,15 +40,15 @@ class FilterMirror(mirrors.BasicMirrorWriter):
         if config is None:
             config = {}
         self.config = config
-        self.filters = config.get('filters', [])
-        outfmt = config.get('output_format')
+        self.filters = config.get("filters", [])
+        outfmt = config.get("output_format")
         if not outfmt:
             outfmt = "%s"
         self.output_format = outfmt
         self.json_entries = []

     def load_products(self, path=None, content_id=None):
-        return {'content_id': content_id, 'products': {}}
+        return {"content_id": content_id, "products": {}}

     def filter_item(self, data, src, target, pedigree):
         return filters.filter_item(self.filters, data, src, pedigree)
@@ -61,8 +58,8 @@ class FilterMirror(mirrors.BasicMirrorWriter):
         # data is src['products'][ped[0]]['versions'][ped[1]]['items'][ped[2]]
         # contentsource is a ContentSource if 'path' exists in data or None
         data = util.products_exdata(src, pedigree)
-        if 'path' in data:
-            data.update({'item_url': contentsource.url})
+        if "path" in data:
+            data.update({"item_url": contentsource.url})

         if self.output_format == FORMAT_PRETTY:
             pprint.pprint(data)
@@ -79,39 +76,71 @@ def main():
 def main():
     parser = argparse.ArgumentParser()

-    parser.add_argument('--max', type=int, default=None, dest='max_items',
-                        help='store at most MAX items in the target')
+    parser.add_argument(
+        "--max",
+        type=int,
+        default=None,
+        dest="max_items",
+        help="store at most MAX items in the target",
+    )

-    parser.add_argument('--path', default=None,
-                        help='sync from index or products file in mirror')
+    parser.add_argument(
+        "--path",
+        default=None,
+        help="sync from index or products file in mirror",
+    )

     fmt_group = parser.add_mutually_exclusive_group()
-    fmt_group.add_argument('--output-format', '-o', action='store',
-                           dest='output_format', default=None,
-                           help="specify output format per python str.format")
-    fmt_group.add_argument('--pretty', action='store_const',
-                           const=FORMAT_PRETTY, dest='output_format',
-                           help="pretty print output")
-    fmt_group.add_argument('--json', action='store_const',
-                           const=FORMAT_JSON, dest='output_format',
-                           help="output in JSON as a list of dicts.")
-    parser.add_argument('--verbose', '-v', action='count', default=0)
-    parser.add_argument('--log-file', default=sys.stderr,
-                        type=argparse.FileType('w'))
-
-    parser.add_argument('--keyring', action='store', default=DEFAULT_KEYRING,
-                        help='keyring to be specified to gpg via --keyring')
-    parser.add_argument('--no-verify', '-U', action='store_false',
-                        dest='verify', default=True,
-                        help="do not gpg check signed json files")
-
-    parser.add_argument('mirror_url')
-    parser.add_argument('filters', nargs='*', default=[])
+    fmt_group.add_argument(
+        "--output-format",
+        "-o",
+        action="store",
+        dest="output_format",
+        default=None,
+        help="specify output format per python str.format",
+    )
+    fmt_group.add_argument(
+        "--pretty",
+        action="store_const",
+        const=FORMAT_PRETTY,
+        dest="output_format",
+        help="pretty print output",
+    )
+    fmt_group.add_argument(
+        "--json",
+        action="store_const",
+        const=FORMAT_JSON,
+        dest="output_format",
+        help="output in JSON as a list of dicts.",
+    )
+    parser.add_argument("--verbose", "-v", action="count", default=0)
+    parser.add_argument(
+        "--log-file", default=sys.stderr, type=argparse.FileType("w")
+    )
+
+    parser.add_argument(
+        "--keyring",
+        action="store",
+        default=DEFAULT_KEYRING,
+        help="keyring to be specified to gpg via --keyring",
+    )
+    parser.add_argument(
+        "--no-verify",
+        "-U",
+        action="store_false",
+        dest="verify",
+        default=True,
+        help="do not gpg check signed json files",
+    )
+
+    parser.add_argument("mirror_url")
+    parser.add_argument("filters", nargs="*", default=[])

     cmdargs = parser.parse_args()

-    (mirror_url, path) = util.path_from_mirror_url(cmdargs.mirror_url,
-                                                   cmdargs.path)
+    (mirror_url, path) = util.path_from_mirror_url(
+        cmdargs.mirror_url, cmdargs.path
+    )

     level = (log.ERROR, log.INFO, log.DEBUG)[min(cmdargs.verbose, 2)]
     log.basicConfig(stream=cmdargs.log_file, level=level)
@@ -119,33 +148,41 @@ def main():
     initial_path = path

     def policy(content, path):
-        if initial_path.endswith('sjson'):
-            return util.read_signed(content,
-                                    keyring=cmdargs.keyring,
-                                    checked=cmdargs.verify)
+        if initial_path.endswith("sjson"):
+            return util.read_signed(
+                content, keyring=cmdargs.keyring, checked=cmdargs.verify
+            )
         else:
             return content

     smirror = mirrors.UrlMirrorReader(mirror_url, policy=policy)

     filter_list = filters.get_filters(cmdargs.filters)
-    cfg = {'max_items': cmdargs.max_items,
-           'filters': filter_list,
-           'output_format': cmdargs.output_format}
+    cfg = {
+        "max_items": cmdargs.max_items,
+        "filters": filter_list,
+        "output_format": cmdargs.output_format,
+    }

     tmirror = FilterMirror(config=cfg)
     try:
         tmirror.sync(smirror, path)
         if tmirror.output_format == FORMAT_JSON:
-            print(json.dumps(tmirror.json_entries, indent=2, sort_keys=True,
-                             separators=(',', ': ')))
+            print(
+                json.dumps(
+                    tmirror.json_entries,
+                    indent=2,
+                    sort_keys=True,
+                    separators=(",", ": "),
+                )
+            )
     except IOError as e:
         if e.errno == errno.EPIPE:
             sys.exit(0x80 | signal.SIGPIPE)
         raise


-if __name__ == '__main__':
+if __name__ == "__main__":
     main()

 # vi: ts=4 expandtab syntax=python
diff --git a/bin/sstream-sync b/bin/sstream-sync
index d10b90f..4082400 100755
--- a/bin/sstream-sync
+++ b/bin/sstream-sync
@@ -16,18 +16,17 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.

-from simplestreams import mirrors
-from simplestreams.mirrors import command_hook
-from simplestreams import log
-from simplestreams import util
-
 import argparse
 import errno
 import os
 import signal
 import sys
+
 import yaml

+from simplestreams import log, mirrors, util
+from simplestreams.mirrors import command_hook
+

 def which(program):
     def is_exe(fpath):
@@ -55,51 +54,90 @@ def main():
     parser = argparse.ArgumentParser()
     defhook = command_hook.DEFAULT_HOOK_NAME

-    hooks = [("--hook-%s" % hook.replace("_", "-"), hook, False)
-             for hook in command_hook.HOOK_NAMES]
-    hooks.append(('--hook', defhook, False,))
-
-    parser.add_argument('--config', '-c',
-                        help='read config file',
-                        type=argparse.FileType('rb'))
-
-    for (argname, cfgname, _required) in hooks:
+    hooks = [
+        ("--hook-%s" % hook.replace("_", "-"), hook, False)
+        for hook in command_hook.HOOK_NAMES
+    ]
+    hooks.append(
+        (
+            "--hook",
+            defhook,
+            False,
+        )
+    )
+
+    parser.add_argument(
+        "--config", "-c", help="read config file", type=argparse.FileType("rb")
+    )
+
+    for argname, cfgname, _required in hooks:
         parser.add_argument(argname, dest=cfgname, required=False)

-    parser.add_argument('--keep', action='store_true', default=False,
-                        dest='keep_items',
-                        help='keep items in target up to MAX items '
-                             'even after they have fallen out of the source')
-    parser.add_argument('--max', type=int, default=None, dest='max_items',
-                        help='store at most MAX items in the target')
-    parser.add_argument('--item-skip-download', action='store_true',
-                        default=False,
-                        help='Do not download items that are to be inserted.')
-    parser.add_argument('--delete', action='store_true', default=False,
-                        dest='delete_filtered_items',
-                        help='remove filtered items from the target')
-    parser.add_argument('--path', default=None,
-                        help='sync from index or products file in mirror')
-
-    parser.add_argument('--verbose', '-v', action='count', default=0)
-    parser.add_argument('--log-file', default=sys.stderr,
-                        type=argparse.FileType('w'))
-
-    parser.add_argument('--keyring', action='store', default=None,
-                        help='keyring to be specified to gpg via --keyring')
-    parser.add_argument('--no-verify', '-U', action='store_false',
-                        dest='verify', default=True,
-                        help="do not gpg check signed json files")
-
-    parser.add_argument('mirror_url')
+    parser.add_argument(
+        "--keep",
+        action="store_true",
+        default=False,
+        dest="keep_items",
+        help="keep items in target up to MAX items "
+        "even after they have fallen out of the source",
+    )
+    parser.add_argument(
+        "--max",
+        type=int,
+        default=None,
+        dest="max_items",
+        help="store at most MAX items in the target",
+    )
+    parser.add_argument(
+        "--item-skip-download",
+        action="store_true",
+        default=False,
+        help="Do not download items that are to be inserted.",
+    )
+    parser.add_argument(
+        "--delete",
+        action="store_true",
+        default=False,
+        dest="delete_filtered_items",
+        help="remove filtered items from the target",
+    )
+    parser.add_argument(
+        "--path",
+        default=None,
+        help="sync from index or products file in mirror",
+    )
+
+    parser.add_argument("--verbose", "-v", action="count", default=0)
+    parser.add_argument(
+        "--log-file", default=sys.stderr, type=argparse.FileType("w")
+    )
+
+    parser.add_argument(
+        "--keyring",
+        action="store",
+        default=None,
+        help="keyring to be specified to gpg via --keyring",
+    )
+    parser.add_argument(
+        "--no-verify",
+        "-U",
+        action="store_false",
+        dest="verify",
+        default=True,
+        help="do not gpg check signed json files",
+    )
+
+    parser.add_argument("mirror_url")
     cmdargs = parser.parse_args()

-    known_cfg = [('--item-skip-download', 'item_skip_download', False),
-                 ('--max', 'max_items', False),
-                 ('--keep', 'keep_items', False),
-                 ('--delete', 'delete_filtered_items', False),
-                 ('mirror_url', 'mirror_url', True),
-                 ('--path', 'path', True)]
+    known_cfg = [
+        ("--item-skip-download", "item_skip_download", False),
+        ("--max", "max_items", False),
+        ("--keep", "keep_items", False),
+        ("--delete", "delete_filtered_items", False),
+        ("mirror_url", "mirror_url", True),
+        ("--path", "path", True),
+    ]
     known_cfg.extend(hooks)

     cfg = {}
@@ -116,31 +154,41 @@ def main():
     missing = []
     fallback = cfg.get(defhook, getattr(cmdargs, defhook, None))

-    for (argname, cfgname, _required) in known_cfg:
+    for argname, cfgname, _required in known_cfg:
         val = getattr(cmdargs, cfgname)
         if val is not None:
             cfg[cfgname] = val
         if val == "":
             cfg[cfgname] = None

-        if ((cfgname in command_hook.HOOK_NAMES or cfgname == defhook) and
-                cfg.get(cfgname) is not None):
+        if (
+            cfgname in command_hook.HOOK_NAMES or cfgname == defhook
+        ) and cfg.get(cfgname) is not None:
             if which(cfg[cfgname]) is None:
                 msg = "invalid input for %s. '%s' is not executable\n"
                 sys.stderr.write(msg % (argname, val))
                 sys.exit(1)

-        if (cfgname in command_hook.REQUIRED_FIELDS and
-                cfg.get(cfgname) is None and not fallback):
-            missing.append((argname, cfgname,))
+        if (
+            cfgname in command_hook.REQUIRED_FIELDS
+            and cfg.get(cfgname) is None
+            and not fallback
+        ):
+            missing.append(
+                (
+                    argname,
+                    cfgname,
+                )
+            )

     pfm = util.path_from_mirror_url
-    (cfg['mirror_url'], cfg['path']) = pfm(cfg['mirror_url'], cfg.get('path'))
+    (cfg["mirror_url"], cfg["path"]) = pfm(cfg["mirror_url"], cfg.get("path"))

     if missing:
-        sys.stderr.write("must provide input for (--hook/%s for default):\n"
-                         % defhook)
-        for (flag, cfg) in missing:
+        sys.stderr.write(
+            "must provide input for (--hook/%s for default):\n" % defhook
+        )
+        for flag, cfg in missing:
             sys.stderr.write("  cmdline '%s' or cfgname '%s'\n" % (flag, cfg))
         sys.exit(1)

@@ -148,24 +196,24 @@ def main():
     log.basicConfig(stream=cmdargs.log_file, level=level)

     def policy(content, path):
-        if cfg['path'].endswith('sjson'):
-            return util.read_signed(content,
-                                    keyring=cmdargs.keyring,
-                                    checked=cmdargs.verify)
+        if cfg["path"].endswith("sjson"):
+            return util.read_signed(
+                content, keyring=cmdargs.keyring, checked=cmdargs.verify
+            )
         else:
             return content

-    smirror = mirrors.UrlMirrorReader(cfg['mirror_url'], policy=policy)
+    smirror = mirrors.UrlMirrorReader(cfg["mirror_url"], policy=policy)
     tmirror = command_hook.CommandHookMirror(config=cfg)
     try:
-        tmirror.sync(smirror, cfg['path'])
+        tmirror.sync(smirror, cfg["path"])
     except IOError as e:
         if e.errno == errno.EPIPE:
             sys.exit(0x80 | signal.SIGPIPE)
         raise


-if __name__ == '__main__':
+if __name__ == "__main__":
     main()

 # vi: ts=4 expandtab syntax=python
diff --git a/pyproject.toml b/pyproject.toml
new file mode 100644
index 0000000..d84cc51
--- /dev/null
+++ b/pyproject.toml
@@ -0,0 +1,6 @@
+[tool.black]
+line-length = 79
+
+[tool.isort]
+profile = "black"
+line_length = 79
diff --git a/setup.py b/setup.py
index 6b5a29b..301721b 100644
--- a/setup.py
+++ b/setup.py
@@ -1,8 +1,9 @@
-from setuptools import setup
-from glob import glob
 import os
+from glob import glob
+
+from setuptools import setup

-VERSION = '0.1.0'
+VERSION = "0.1.0"


 def is_f(p):
@@ -11,18 +12,20 @@ def is_f(p):

 setup(
     name="python-simplestreams",
-    description='Library and tools for using Simple Streams data',
+    description="Library and tools for using Simple Streams data",
     version=VERSION,
-    author='Scott Moser',
-    author_email='scott.moser@canonical.com',
+    author="Scott Moser",
+    author_email="scott.moser@canonical.com",
     license="AGPL",
-    url='http://launchpad.net/simplestreams/',
-    packages=['simplestreams', 'simplestreams.mirrors',
-              'simplestreams.objectstores'],
-    scripts=glob('bin/*'),
+    url="http://launchpad.net/simplestreams/",
+    packages=[
+        "simplestreams",
+        "simplestreams.mirrors",
+        "simplestreams.objectstores",
+    ],
+    scripts=glob("bin/*"),
     data_files=[
-        ('lib/simplestreams', glob('tools/hook-*')),
-        ('share/doc/simplestreams',
-         [f for f in glob('doc/*') if is_f(f)]),
-    ]
+        ("lib/simplestreams", glob("tools/hook-*")),
+        ("share/doc/simplestreams", [f for f in glob("doc/*") if is_f(f)]),
+    ],
 )
diff --git a/simplestreams/checksum_util.py b/simplestreams/checksum_util.py
index dc695e3..bc98161 100644
--- a/simplestreams/checksum_util.py
+++ b/simplestreams/checksum_util.py
@@ -20,7 +20,7 @@ import hashlib
 CHECKSUMS = ("md5", "sha256", "sha512")

 try:
-    ALGORITHMS = list(getattr(hashlib, 'algorithms'))
+    ALGORITHMS = list(getattr(hashlib, "algorithms"))
 except AttributeError:
     ALGORITHMS = list(hashlib.algorithms_available)

@@ -56,11 +56,13 @@ class checksummer(object):
         return self._hasher.hexdigest()

     def check(self):
-        return (self.expected is None or self.expected == self.hexdigest())
+        return self.expected is None or self.expected == self.hexdigest()

     def __str__(self):
-        return ("checksummer (algorithm=%s expected=%s)" %
-                (self.algorithm, self.expected))
+        return "checksummer (algorithm=%s expected=%s)" % (
+            self.algorithm,
+            self.expected,
+        )


 def item_checksums(item):
@@ -69,14 +71,16 @@ def item_checksums(item):


 class SafeCheckSummer(checksummer):
     """SafeCheckSummer raises ValueError if checksums are not provided."""
+
     def __init__(self, checksums, allowed=None):
         if allowed is None:
             allowed = CHECKSUMS
         super(SafeCheckSummer, self).__init__(checksums)
         if self.algorithm not in allowed:
             raise ValueError(
-                "provided checksums (%s) did not include any allowed (%s)" %
-                (checksums, allowed))
+                "provided checksums (%s) did not include any allowed (%s)"
+                % (checksums, allowed)
+            )


 class InvalidChecksum(ValueError):
@@ -93,18 +97,31 @@ class InvalidChecksum(ValueError):
         if not isinstance(self.expected_size, int):
             msg = "Invalid size '%s' at %s." % (self.expected_size, self.path)
         else:
-            msg = ("Invalid %s Checksum at %s. Found %s. Expected %s. "
-                   "read %s bytes expected %s bytes." %
-                   (self.cksum.algorithm, self.path,
-                    self.cksum.hexdigest(), self.cksum.expected,
-                    self.size, self.expected_size))
+            msg = (
+                "Invalid %s Checksum at %s. Found %s. Expected %s. "
+                "read %s bytes expected %s bytes."
+                % (
+                    self.cksum.algorithm,
+                    self.path,
+                    self.cksum.hexdigest(),
+                    self.cksum.expected,
+                    self.size,
+                    self.expected_size,
+                )
+            )
         if self.size:
-            msg += (" (size %s expected %s)" %
-                    (self.size, self.expected_size))
+            msg += " (size %s expected %s)" % (
+                self.size,
+                self.expected_size,
+            )
         return msg


 def invalid_checksum_for_reader(reader, msg=None):
-    return InvalidChecksum(path=reader.url, cksum=reader.checksummer,
-                           size=reader.bytes_read, expected_size=reader.size,
-                           msg=msg)
+    return InvalidChecksum(
+        path=reader.url,
+        cksum=reader.checksummer,
+        size=reader.bytes_read,
+        expected_size=reader.size,
+        msg=msg,
+    )
diff --git a/simplestreams/contentsource.py b/simplestreams/contentsource.py
index ce45097..e5c6c98 100644
--- a/simplestreams/contentsource.py
+++ b/simplestreams/contentsource.py
@@ -23,12 +23,13 @@ import sys
 from . import checksum_util

 if sys.version_info > (3, 0):
+    import urllib.error as urllib_error
     import urllib.parse as urlparse
     import urllib.request as urllib_request
-    import urllib.error as urllib_error
 else:
-    import urlparse
     import urllib2 as urllib_request
+    import urlparse
+
     urllib_error = urllib_request

 READ_BUFFER_SIZE = 1024 * 10
@@ -38,12 +39,14 @@ try:
     # We try to use requests because we can do gzip encoding with it.
     # however requests < 1.1 didn't have 'stream' argument to 'get'
     # making it completely unsuitable for downloading large files.
-    import requests
     from distutils.version import LooseVersion
+
     import pkg_resources
-    _REQ = pkg_resources.get_distribution('requests')
+    import requests
+
+    _REQ = pkg_resources.get_distribution("requests")
     _REQ_VER = LooseVersion(_REQ.version)
-    if _REQ_VER < LooseVersion('1.1'):
+    if _REQ_VER < LooseVersion("1.1"):
         raise ImportError("Requests version < 1.1, not suitable for usage.")
     URL_READER_CLASSNAME = "RequestsUrlReader"
 except ImportError:
@@ -63,8 +66,8 @@ class ContentSource(object):
         raise NotImplementedError()

     def set_start_pos(self, offset):
-        """ Implemented if the ContentSource supports seeking within content.
-        Used to resume failed transfers. """
+        """Implemented if the ContentSource supports seeking within content.
+        Used to resume failed transfers."""

         class SetStartPosNotImplementedError(NotImplementedError):
             pass
@@ -122,8 +125,9 @@ class UrlContentSource(ContentSource):
                 if e.errno != errno.ENOENT:
                     raise
                 continue
-        myerr = IOError("Unable to open %s. mirrors=%s" %
-                        (self.input_url, self.mirrors))
+        myerr = IOError(
+            "Unable to open %s. mirrors=%s" % (self.input_url, self.mirrors)
+        )
         myerr.errno = errno.ENOENT
         raise myerr

@@ -181,7 +185,7 @@ class IteratorContentSource(ContentSource):
             raise exc

     def is_enoent(self, exc):
-        return (isinstance(exc, IOError) and exc.errno == errno.ENOENT)
+        return isinstance(exc, IOError) and exc.errno == errno.ENOENT

     def read(self, size=None):
         self.open()
@@ -189,7 +193,7 @@ class IteratorContentSource(ContentSource):
         if self.consumed:
             return bytes()

-        if (size is None or size < 0):
+        if size is None or size < 0:
             # read everything
             ret = self.leftover
             self.leftover = bytes()
@@ -227,7 +231,7 @@ class IteratorContentSource(ContentSource):
 class MemoryContentSource(FdContentSource):
     def __init__(self, url=None, content=""):
         if isinstance(content, str):
-            content = content.encode('utf-8')
+            content = content.encode("utf-8")
         fd = io.BytesIO(content)
         if url is None:
             url = "MemoryContentSource://undefined"
@@ -265,8 +269,10 @@ class ChecksummingContentSource(ContentSource):

     def _set_checksummer(self, checksummer):
         if checksummer.algorithm not in checksum_util.CHECKSUMS:
-            raise ValueError("algorithm %s is not valid (%s)" %
-                             (checksummer.algorithm, checksum_util.CHECKSUMS))
+            raise ValueError(
+                "algorithm %s is not valid (%s)"
+                % (checksummer.algorithm, checksum_util.CHECKSUMS)
+            )
         self.checksummer = checksummer

     def check(self):
@@ -322,7 +328,6 @@ class FileReader(UrlReader):


 class Urllib2UrlReader(UrlReader):
-
     timeout = TIMEOUT

     def __init__(self, url, offset=None, user_agent=None):
@@ -339,9 +344,9 @@ class Urllib2UrlReader(UrlReader):
         try:
             req = urllib_request.Request(url)
             if user_agent is not None:
-                req.add_header('User-Agent', user_agent)
+                req.add_header("User-Agent", user_agent)
             if offset is not None:
-                req.add_header('Range', 'bytes=%d-' % offset)
+                req.add_header("Range", "bytes=%d-" % offset)
             self.req = opener(req, timeout=self.timeout)
         except urllib_error.HTTPError as e:
             if e.code == 404:
@@ -373,8 +378,10 @@ class RequestsUrlReader(UrlReader):

     def __init__(self, url, buflen=None, offset=None, user_agent=None):
         if requests is None:
-            raise ImportError("Attempt to use RequestsUrlReader "
-                              "without suitable requests library.")
+            raise ImportError(
+                "Attempt to use RequestsUrlReader "
+                "without suitable requests library."
+            )
         self.url = url
         (url, user, password) = parse_url_auth(url)
         if user is None:
@@ -384,15 +391,15 @@ class RequestsUrlReader(UrlReader):

         headers = {}
         if user_agent is not None:
-            headers['User-Agent'] = user_agent
+            headers["User-Agent"] = user_agent
         if offset is not None:
-            headers['Range'] = 'bytes=%d-' % offset
+            headers["Range"] = "bytes=%d-" % offset
         if headers == {}:
             headers = None

         # requests version less than 2.4.1 takes an optional
         # float for timeout. There is no separate read timeout
-        if _REQ_VER < LooseVersion('2.4.1'):
+        if _REQ_VER < LooseVersion("2.4.1"):
             self.timeout = TIMEOUT

         self.req = requests.get(
@@ -405,13 +412,13 @@ class RequestsUrlReader(UrlReader):
         self.leftover = bytes()
         self.consumed = False

-        if (self.req.status_code == requests.codes.NOT_FOUND):
+        if self.req.status_code == requests.codes.NOT_FOUND:
             myerr = IOError("Unable to open %s" % url)
             myerr.errno = errno.ENOENT
             raise myerr

-        ce = self.req.headers.get('content-encoding', '').lower()
-        if 'gzip' in ce or 'deflate' in ce:
+        ce = self.req.headers.get("content-encoding", "").lower()
+        if "gzip" in ce or "deflate" in ce:
             self._read = self.read_compressed
         else:
             self._read = self.read_raw
@@ -426,7 +433,7 @@ class RequestsUrlReader(UrlReader):
         if self.consumed:
             return bytes()

-        if (size is None or size < 0):
+        if size is None or size < 0:
             # read everything
             ret = self.leftover
             self.leftover = bytes()
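The import shuffling at the top of contentsource.py is isort's doing: within each block it alphabetizes plain "import x" lines and "from x import y" lines. A rough equivalent using isort's public code() helper (isort >= 5; the "black" profile is an assumption based on the pre-commit hooks this branch adds):

    import isort

    src = (
        "import urllib.parse as urlparse\n"
        "import urllib.request as urllib_request\n"
        "import urllib.error as urllib_error\n"
    )
    # urllib.error sorts first, matching the hunk above.
    print(isort.code(src, profile="black"), end="")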
diff --git a/simplestreams/filters.py b/simplestreams/filters.py
index 3818949..330a475 100644
--- a/simplestreams/filters.py
+++ b/simplestreams/filters.py
@@ -15,10 +15,10 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.

-from simplestreams import util
-
 import re

+from simplestreams import util
+

 class ItemFilter(object):
     def __init__(self, content, noneval=""):
@@ -37,7 +37,7 @@ class ItemFilter(object):
         else:
             raise ValueError("Bad parsing of %s" % content)

-        self.negator = (op[0] != "!")
+        self.negator = op[0] != "!"
         self.op = op
         self.key = key
         self.value = val
@@ -45,15 +45,19 @@ class ItemFilter(object):
         self.noneval = noneval

     def __str__(self):
-        return "%s %s %s [none=%s]" % (self.key, self.op,
-                                       self.value, self.noneval)
+        return "%s %s %s [none=%s]" % (
+            self.key,
+            self.op,
+            self.value,
+            self.noneval,
+        )

     def __repr__(self):
         return self.__str__()

     def matches(self, item):
         val = str(item.get(self.key, self.noneval))
-        return (self.negator == bool(self._matches(val)))
+        return self.negator == bool(self._matches(val))


 def get_filters(filters, noneval=""):
diff --git a/simplestreams/generate_simplestreams.py b/simplestreams/generate_simplestreams.py
index 9b7b919..329a91d 100644
--- a/simplestreams/generate_simplestreams.py
+++ b/simplestreams/generate_simplestreams.py
@@ -13,54 +13,58 @@
 # or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public
 # License for more details.
 #
-from collections import namedtuple
-from copy import deepcopy
 import json
 import os
 import sys
+from collections import namedtuple
+from copy import deepcopy

 from simplestreams import util

-
-Item = namedtuple('Item', ['content_id', 'product_name', 'version_name',
-                           'item_name', 'data'])
+Item = namedtuple(
+    "Item", ["content_id", "product_name", "version_name", "item_name", "data"]
+)


 def items2content_trees(itemslist, exdata):
     # input is a list with each item having:
     # (content_id, product_name, version_name, item_name, {data})
     ctrees = {}
-    for (content_id, prodname, vername, itemname, data) in itemslist:
+    for content_id, prodname, vername, itemname, data in itemslist:
         if content_id not in ctrees:
-            ctrees[content_id] = {'content_id': content_id,
-                                  'format': 'products:1.0', 'products': {}}
+            ctrees[content_id] = {
+                "content_id": content_id,
+                "format": "products:1.0",
+                "products": {},
+            }
             ctrees[content_id].update(exdata)

         ctree = ctrees[content_id]
-        if prodname not in ctree['products']:
-            ctree['products'][prodname] = {'versions': {}}
+        if prodname not in ctree["products"]:
+            ctree["products"][prodname] = {"versions": {}}

-        prodtree = ctree['products'][prodname]
-        if vername not in prodtree['versions']:
-            prodtree['versions'][vername] = {'items': {}}
+        prodtree = ctree["products"][prodname]
+        if vername not in prodtree["versions"]:
+            prodtree["versions"][vername] = {"items": {}}

-        vertree = prodtree['versions'][vername]
+        vertree = prodtree["versions"][vername]

-        if itemname in vertree['items']:
-            raise ValueError("%s: already existed" %
-                             str([content_id, prodname, vername, itemname]))
+        if itemname in vertree["items"]:
+            raise ValueError(
+                "%s: already existed"
+                % str([content_id, prodname, vername, itemname])
+            )

-        vertree['items'][itemname] = data
+        vertree["items"][itemname] = data
     return ctrees


 class FileNamer:
-
-    streamdir = 'streams/v1'
+    streamdir = "streams/v1"

     @classmethod
     def get_index_path(cls):
-        return "%s/%s" % (cls.streamdir, 'index.json')
+        return "%s/%s" % (cls.streamdir, "index.json")

     @classmethod
     def get_content_path(cls, content_id):
@@ -68,16 +72,16 @@ class FileNamer:


 def generate_index(trees, updated, namer):
-    index = {"index": {}, 'format': 'index:1.0', 'updated': updated}
-    not_copied_up = ['content_id']
+    index = {"index": {}, "format": "index:1.0", "updated": updated}
+    not_copied_up = ["content_id"]
     for content_id, content in trees.items():
-        index['index'][content_id] = {
-            'products': sorted(list(content['products'].keys())),
-            'path': namer.get_content_path(content_id),
+        index["index"][content_id] = {
+            "products": sorted(list(content["products"].keys())),
+            "path": namer.get_content_path(content_id),
         }
         for k in util.stringitems(content):
             if k not in not_copied_up:
-                index['index'][content_id][k] = content[k]
+                index["index"][content_id][k] = content[k]
     return index


@@ -85,20 +89,29 @@ def write_streams(out_d, trees, updated, namer=None, condense=True):
     if namer is None:
         namer = FileNamer
     index = generate_index(trees, updated, namer)
-    to_write = [(namer.get_index_path(), index,)]
+    to_write = [
+        (
+            namer.get_index_path(),
+            index,
+        )
+    ]
     # Don't let products_condense modify the input
     trees = deepcopy(trees)
     for content_id in trees:
         if condense:
-            util.products_condense(trees[content_id],
-                                   sticky=[
-                                       'path', 'sha256', 'md5',
-                                       'size', 'mirrors'
-                                   ])
+            util.products_condense(
+                trees[content_id],
+                sticky=["path", "sha256", "md5", "size", "mirrors"],
+            )
         content = trees[content_id]
-        to_write.append((index['index'][content_id]['path'], content,))
+        to_write.append(
+            (
+                index["index"][content_id]["path"],
+                content,
+            )
+        )
     out_filenames = []
-    for (outfile, data) in to_write:
+    for outfile, data in to_write:
         filef = os.path.join(out_d, outfile)
         util.mkdir_p(os.path.dirname(filef))
         json_dump(data, filef)
@@ -108,6 +121,8 @@ def write_streams(out_d, trees, updated, namer=None, condense=True):

 def json_dump(data, filename):
     with open(filename, "w") as fp:
-        sys.stderr.write(u"writing %s\n" % filename)
-        fp.write(json.dumps(data, indent=2, sort_keys=True,
-                 separators=(',', ': ')) + "\n")
+        sys.stderr.write("writing %s\n" % filename)
+        fp.write(
+            json.dumps(data, indent=2, sort_keys=True, separators=(",", ": "))
+            + "\n"
+        )
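One detail worth noting in this file: to_write = [(namer.get_index_path(), index,)] explodes to five lines only because the tuple already carried a trailing comma; black's "magic trailing comma" treats that as a request to keep the collection one element per line. A quick way to see the behavior (same black assumptions as above; namer/index are just parsed, not executed):

    import black

    exploded = black.format_str(
        "to_write = [(namer.get_index_path(), index,)]\n", mode=black.Mode()
    )
    collapsed = black.format_str(
        "to_write = [(namer.get_index_path(), index)]\n", mode=black.Mode()
    )
    # `exploded` spans multiple lines; `collapsed` stays on one,
    # because only the first form has the magic trailing comma.
    print(exploded, collapsed, sep="---\n")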
diff --git a/simplestreams/json2streams.py b/simplestreams/json2streams.py
index e9749f6..20b09e7 100755
--- a/simplestreams/json2streams.py
+++ b/simplestreams/json2streams.py
@@ -1,43 +1,41 @@
 #!/usr/bin/env python3
 # Copyright (C) 2013, 2015 Canonical Ltd.

-from argparse import ArgumentParser
 import json
 import os
 import sys
+from argparse import ArgumentParser

 from simplestreams import util
-
 from simplestreams.generate_simplestreams import (
     FileNamer,
     Item,
     items2content_trees,
     json_dump,
     write_streams,
-    )
+)


 class JujuFileNamer(FileNamer):
-
     @classmethod
     def get_index_path(cls):
-        return "%s/%s" % (cls.streamdir, 'index2.json')
+        return "%s/%s" % (cls.streamdir, "index2.json")

     @classmethod
     def get_content_path(cls, content_id):
-        return "%s/%s.json" % (cls.streamdir, content_id.replace(':', '-'))
+        return "%s/%s.json" % (cls.streamdir, content_id.replace(":", "-"))


 def dict_to_item(item_dict):
     """Convert a dict into an Item, mutating input."""
-    item_dict.pop('item_url', None)
-    size = item_dict.get('size')
+    item_dict.pop("item_url", None)
+    size = item_dict.get("size")
     if size is not None:
-        item_dict['size'] = int(size)
-    content_id = item_dict.pop('content_id')
-    product_name = item_dict.pop('product_name')
-    version_name = item_dict.pop('version_name')
-    item_name = item_dict.pop('item_name')
+        item_dict["size"] = int(size)
+    content_id = item_dict.pop("content_id")
+    product_name = item_dict.pop("product_name")
+    version_name = item_dict.pop("version_name")
+    item_name = item_dict.pop("item_name")
     return Item(content_id, product_name, version_name, item_name, item_dict)


@@ -51,9 +49,11 @@ def write_release_index(out_d):
     in_path = os.path.join(out_d, JujuFileNamer.get_index_path())
     with open(in_path) as in_file:
         full_index = json.load(in_file)
-    full_index['index'] = dict(
-        (k, v) for k, v in list(full_index['index'].items())
-        if k == 'com.ubuntu.juju:released:tools')
+    full_index["index"] = dict(
+        (k, v)
+        for k, v in list(full_index["index"].items())
+        if k == "com.ubuntu.juju:released:tools"
+    )
     out_path = os.path.join(out_d, FileNamer.get_index_path())
     json_dump(full_index, out_path)
     return out_path
@@ -70,7 +70,7 @@ def filenames_to_streams(filenames, updated, out_d, juju_format=False):
     for items_file in filenames:
         items.extend(read_items_file(items_file))

-    data = {'updated': updated, 'datatype': 'content-download'}
+    data = {"updated": updated, "datatype": "content-download"}
     trees = items2content_trees(items, data)
     if juju_format:
         write = write_juju_streams
@@ -88,23 +88,31 @@ def write_juju_streams(out_d, trees, updated):
 def parse_args(argv=None):
     parser = ArgumentParser()
     parser.add_argument(
-        'items_file', metavar='items-file', help='File to read items from',
-        nargs='+')
+        "items_file",
+        metavar="items-file",
+        help="File to read items from",
+        nargs="+",
+    )
     parser.add_argument(
-        'out_d', metavar='output-dir',
-        help='The directory to write stream files to.')
+        "out_d",
+        metavar="output-dir",
+        help="The directory to write stream files to.",
+    )
     parser.add_argument(
-        '--juju-format', action='store_true',
-        help='Write stream files in juju format.')
+        "--juju-format",
+        action="store_true",
+        help="Write stream files in juju format.",
+    )
     return parser.parse_args(argv)


 def main():
     args = parse_args()
     updated = util.timestamp()
-    filenames_to_streams(args.items_file, updated, args.out_d,
-                         args.juju_format)
+    filenames_to_streams(
+        args.items_file, updated, args.out_d, args.juju_format
+    )


-if __name__ == '__main__':
+if __name__ == "__main__":
     sys.exit(main())
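Since this branch touches nearly every file, a reviewer may want reassurance that the rewrite is behavior-preserving. black itself verifies after each run that the reformatted source parses to an equivalent AST; the same spot-check can be scripted independently (a sketch, not part of this branch):

    import ast

    def same_ast(before: str, after: str) -> bool:
        # True if both sources parse to the same abstract syntax tree.
        return ast.dump(ast.parse(before)) == ast.dump(ast.parse(after))

    old = "item_dict.pop('item_url', None)\n"
    new = 'item_dict.pop("item_url", None)\n'
    assert same_ast(old, new)  # quote style never reaches the AST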
diff --git a/simplestreams/log.py b/simplestreams/log.py
index 061103a..824f248 100644
--- a/simplestreams/log.py
+++ b/simplestreams/log.py
@@ -33,18 +33,22 @@ class NullHandler(logging.Handler):

 def basicConfig(**kwargs):
     # basically like logging.basicConfig but only output for our logger
-    if kwargs.get('filename'):
-        handler = logging.FileHandler(filename=kwargs['filename'],
-                                      mode=kwargs.get('filemode', 'a'))
-    elif kwargs.get('stream'):
-        handler = logging.StreamHandler(stream=kwargs['stream'])
+    if kwargs.get("filename"):
+        handler = logging.FileHandler(
+            filename=kwargs["filename"], mode=kwargs.get("filemode", "a")
+        )
+    elif kwargs.get("stream"):
+        handler = logging.StreamHandler(stream=kwargs["stream"])
     else:
         handler = NullHandler()

-    level = kwargs.get('level', NOTSET)
+    level = kwargs.get("level", NOTSET)

-    handler.setFormatter(logging.Formatter(fmt=kwargs.get('format'),
-                                           datefmt=kwargs.get('datefmt')))
+    handler.setFormatter(
+        logging.Formatter(
+            fmt=kwargs.get("format"), datefmt=kwargs.get("datefmt")
+        )
+    )
     handler.setLevel(level)

     logging.getLogger().setLevel(level)
@@ -56,7 +60,7 @@ def basicConfig(**kwargs):
     logger.addHandler(handler)


-def _getLogger(name='sstreams'):
+def _getLogger(name="sstreams"):
     return logging.getLogger(name)

diff --git a/simplestreams/mirrors/__init__.py b/simplestreams/mirrors/__init__.py
index 4a9593a..b3374f8 100644
--- a/simplestreams/mirrors/__init__.py
+++ b/simplestreams/mirrors/__init__.py
@@ -18,10 +18,10 @@ import errno
 import io
 import json

+import simplestreams.contentsource as cs
 import simplestreams.filters as filters
 import simplestreams.util as util
 from simplestreams import checksum_util
-import simplestreams.contentsource as cs
 from simplestreams.log import LOG

 DEFAULT_USER_AGENT = "python-simplestreams/0.1"
@@ -29,8 +29,8 @@ DEFAULT_USER_AGENT = "python-simplestreams/0.1"

 class MirrorReader(object):
     def __init__(self, policy=util.policy_read_signed):
-        """ policy should be a function which returns the extracted payload or
-        raises an exception if the policy is violated. """
+        """policy should be a function which returns the extracted payload or
+        raises an exception if the policy is violated."""
         self.policy = policy

     def load_products(self, path):
@@ -39,7 +39,7 @@ class MirrorReader(object):

     def read_json(self, path):
         with self.source(path) as source:
-            raw = source.read().decode('utf-8')
+            raw = source.read().decode("utf-8")
         return raw, self.policy(content=raw, path=path)

     def source(self, path):
@@ -164,8 +164,13 @@ class MirrorWriter(object):


 class UrlMirrorReader(MirrorReader):
-    def __init__(self, prefix, mirrors=None, policy=util.policy_read_signed,
-                 user_agent=DEFAULT_USER_AGENT):
+    def __init__(
+        self,
+        prefix,
+        mirrors=None,
+        policy=util.policy_read_signed,
+        user_agent=DEFAULT_USER_AGENT,
+    ):
         super(UrlMirrorReader, self).__init__(policy=policy)
         self._cs = cs.UrlContentSource
         if mirrors is None:
@@ -184,13 +189,18 @@ class UrlMirrorReader(MirrorReader):

             def url_reader_factory(*args, **kwargs):
                 return cs.URL_READER(
-                    *args, user_agent=self.user_agent, **kwargs)
+                    *args, user_agent=self.user_agent, **kwargs
+                )
+
         else:
             url_reader_factory = None

         if self._trailing_slash_checked:
-            return self._cs(self.prefix + path, mirrors=mirrors,
-                            url_reader=url_reader_factory)
+            return self._cs(
+                self.prefix + path,
+                mirrors=mirrors,
+                url_reader=url_reader_factory,
+            )

         # A little hack to fix up the user's path. It's fairly common to
         # specify URLs without a trailing slash, so we try to do that here as
@@ -198,22 +208,31 @@ class UrlMirrorReader(MirrorReader):
         # returned is not yet open (LP: #1237658)
         self._trailing_slash_checked = True
         try:
-            with self._cs(self.prefix + path, mirrors=None,
-                          url_reader=url_reader_factory) as csource:
+            with self._cs(
+                self.prefix + path, mirrors=None, url_reader=url_reader_factory
+            ) as csource:
                 csource.read(1024)
         except Exception as e:
             if isinstance(e, IOError) and (e.errno == errno.ENOENT):
-                LOG.warning("got ENOENT for (%s, %s), trying with trailing /",
-                            self.prefix, path)
-                self.prefix = self.prefix + '/'
+                LOG.warning(
+                    "got ENOENT for (%s, %s), trying with trailing /",
+                    self.prefix,
+                    path,
+                )
+                self.prefix = self.prefix + "/"
             else:
                 # this raised exception, but it was sneaky to do it
                 # so just ignore it.
-                LOG.debug("trailing / check on (%s, %s) resulted in %s",
-                          self.prefix, path, e)
+                LOG.debug(
+                    "trailing / check on (%s, %s) resulted in %s",
+                    self.prefix,
+                    path,
+                    e,
+                )

-        return self._cs(self.prefix + path, mirrors=mirrors,
-                        url_reader=url_reader_factory)
+        return self._cs(
+            self.prefix + path, mirrors=mirrors, url_reader=url_reader_factory
+        )


 class ObjectStoreMirrorReader(MirrorReader):
@@ -231,7 +250,7 @@ class BasicMirrorWriter(MirrorWriter):
         if config is None:
             config = {}
         self.config = config
-        self.checksumming_reader = self.config.get('checksumming_reader', True)
+        self.checksumming_reader = self.config.get("checksumming_reader", True)

     def load_products(self, path=None, content_id=None):
         super(BasicMirrorWriter, self).load_products(path, content_id)
@@ -243,14 +262,14 @@

         check_tree_paths(src)

-        itree = src.get('index')
+        itree = src.get("index")
         for content_id, index_entry in itree.items():
             if not self.filter_index_entry(index_entry, src, (content_id,)):
                 continue
-            epath = index_entry.get('path', None)
+            epath = index_entry.get("path", None)
             mycs = None
             if epath:
-                if index_entry.get('format') in ("index:1.0", "products:1.0"):
+                if index_entry.get("format") in ("index:1.0", "products:1.0"):
                     self.sync(reader, path=epath)
                 mycs = reader.source(epath)

@@ -265,36 +284,37 @@

         check_tree_paths(src)

-        content_id = src['content_id']
+        content_id = src["content_id"]
         target = self.load_products(path, content_id)
         if not target:
             target = util.stringitems(src)

         util.expand_tree(target)

-        stree = src.get('products', {})
-        if 'products' not in target:
-            target['products'] = {}
+        stree = src.get("products", {})
+        if "products" not in target:
+            target["products"] = {}

-        tproducts = target['products']
+        tproducts = target["products"]

         filtered_products = []
         prodname = None

         # Apply filters to items before filtering versions
         for prodname, product in list(stree.items()):
-
-            for vername, version in list(product.get('versions', {}).items()):
-                for itemname, item in list(version.get('items', {}).items()):
+            for vername, version in list(product.get("versions", {}).items()):
+                for itemname, item in list(version.get("items", {}).items()):
                     pgree = (prodname, vername, itemname)
                     if not self.filter_item(item, src, target, pgree):
                         LOG.debug("Filtered out item: %s/%s", itemname, item)
-                        del stree[prodname]['versions'][vername]['items'][
-                            itemname]
-                        if not stree[prodname]['versions'][vername].get(
-                                'items', {}):
-                            del stree[prodname]['versions'][vername]
-                        if not stree[prodname].get('versions', {}):
+                        del stree[prodname]["versions"][vername]["items"][
+                            itemname
+                        ]
+                        if not stree[prodname]["versions"][vername].get(
+                            "items", {}
+                        ):
+                            del stree[prodname]["versions"][vername]
+                        if not stree[prodname].get("versions", {}):
                             del stree[prodname]

         for prodname, product in stree.items():
@@ -305,50 +325,62 @@ class BasicMirrorWriter(MirrorWriter):
             if prodname not in tproducts:
                 tproducts[prodname] = util.stringitems(product)
             tproduct = tproducts[prodname]
-            if 'versions' not in tproduct:
-                tproduct['versions'] = {}
+            if "versions" not in tproduct:
+                tproduct["versions"] = {}

             src_filtered_items = []

             def _filter(itemkey):
-                ret = self.filter_version(product['versions'][itemkey],
-                                          src, target, (prodname, itemkey))
+                ret = self.filter_version(
+                    product["versions"][itemkey],
+                    src,
+                    target,
+                    (prodname, itemkey),
+                )
                 if not ret:
                     src_filtered_items.append(itemkey)
                 return ret

             (to_add, to_remove) = util.resolve_work(
-                src=list(product.get('versions', {}).keys()),
-                target=list(tproduct.get('versions', {}).keys()),
-                maxnum=self.config.get('max_items'),
-                keep=self.config.get('keep_items'), itemfilter=_filter)
-
-            LOG.info("%s/%s: to_add=%s to_remove=%s", content_id, prodname,
-                     to_add, to_remove)
-
-            tversions = tproduct['versions']
+                src=list(product.get("versions", {}).keys()),
+                target=list(tproduct.get("versions", {}).keys()),
+                maxnum=self.config.get("max_items"),
+                keep=self.config.get("keep_items"),
+                itemfilter=_filter,
+            )
+
+            LOG.info(
+                "%s/%s: to_add=%s to_remove=%s",
+                content_id,
+                prodname,
+                to_add,
+                to_remove,
+            )
+
+            tversions = tproduct["versions"]
             skipped_versions = []
             for vername in to_add:
-                version = product['versions'][vername]
+                version = product["versions"][vername]

                 if vername not in tversions:
                     tversions[vername] = util.stringitems(version)

                 added_items = []
-                for itemname, item in version.get('items', {}).items():
+                for itemname, item in version.get("items", {}).items():
                     pgree = (prodname, vername, itemname)

                     added_items.append(itemname)

-                    ipath = item.get('path', None)
+                    ipath = item.get("path", None)
                     ipath_cs = None
                     if ipath and reader:
                         if self.checksumming_reader:
                             flat = util.products_exdata(src, pgree)
                             ipath_cs = cs.ChecksummingContentSource(
                                 csrc=reader.source(ipath),
-                                size=flat.get('size'),
-                                checksums=checksum_util.item_checksums(flat))
+                                size=flat.get("size"),
+                                checksums=checksum_util.item_checksums(flat),
+                            )
                         else:
                             ipath_cs = reader.source(ipath)

@@ -356,28 +388,38 @@ class BasicMirrorWriter(MirrorWriter):

                 if len(added_items):
                     # do not insert versions that had all items filtered
-                    self.insert_version(version, src, target,
-                                        (prodname, vername))
+                    self.insert_version(
+                        version, src, target, (prodname, vername)
+                    )
                 else:
                     skipped_versions.append(vername)

             for vername in skipped_versions:
-                if vername in tproduct['versions']:
-                    del tproduct['versions'][vername]
+                if vername in tproduct["versions"]:
+                    del tproduct["versions"][vername]

-            if self.config.get('delete_filtered_items', False):
-                tkeys = tproduct.get('versions', {}).keys()
+            if self.config.get("delete_filtered_items", False):
+                tkeys = tproduct.get("versions", {}).keys()
                 for v in src_filtered_items:
                     if v not in to_remove and v in tkeys:
                         to_remove.append(v)
-                LOG.info("After deletions %s/%s: to_add=%s to_remove=%s",
-                         content_id, prodname, to_add, to_remove)
+                LOG.info(
+                    "After deletions %s/%s: to_add=%s to_remove=%s",
+                    content_id,
+                    prodname,
+                    to_add,
+                    to_remove,
+                )

             for vername in to_remove:
                 tversion = tversions[vername]
-                for itemname in list(tversion.get('items', {}).keys()):
-                    self.remove_item(tversion['items'][itemname], src, target,
-                                     (prodname, vername, itemname))
+                for itemname in list(tversion.get("items", {}).keys()):
+                    self.remove_item(
+                        tversion["items"][itemname],
+                        src,
+                        target,
+                        (prodname, vername, itemname),
+                    )

                 self.remove_version(tversion, src, target, (prodname, vername))
                 del tversions[vername]
@@ -389,12 +431,14 @@ class BasicMirrorWriter(MirrorWriter):
         # that could accidentally delete a lot.
         #
         del_products = []
-        if self.config.get('delete_products', False):
-            del_products.extend([p for p in list(tproducts.keys())
-                                 if p not in stree])
-        if self.config.get('delete_filtered_products', False):
-            del_products.extend([p for p in filtered_products
-                                 if p not in stree])
+        if self.config.get("delete_products", False):
+            del_products.extend(
+                [p for p in list(tproducts.keys()) if p not in stree]
+            )
+        if self.config.get("delete_filtered_products", False):
+            del_products.extend(
+                [p for p in filtered_products if p not in stree]
+            )

         for prodname in del_products:
             # FIXME: we remove a product here, but unless that acts
@@ -421,7 +465,7 @@ class ObjectStoreMirrorWriter(BasicMirrorWriter):
         try:
             with self.source(self._reference_count_data_path()) as source:
                 raw = source.read()
-                return json.load(io.StringIO(raw.decode('utf-8')))
+                return json.load(io.StringIO(raw.decode("utf-8")))
         except IOError as e:
             if e.errno == errno.ENOENT:
                 return {}
@@ -432,7 +476,7 @@ class ObjectStoreMirrorWriter(BasicMirrorWriter):
         self.store.insert(self._reference_count_data_path(), source)

     def _build_rc_id(self, src, pedigree):
-        return '/'.join([src['content_id']] + list(pedigree))
+        return "/".join([src["content_id"]] + list(pedigree))

     def _inc_rc(self, path, src, pedigree):
         rc = self._load_rc_dict()
@@ -482,25 +526,30 @@ class ObjectStoreMirrorWriter(BasicMirrorWriter):

     def insert_item(self, data, src, target, pedigree, contentsource):
         util.products_set(target, data, pedigree)
-        if 'path' not in data:
+        if "path" not in data:
             return
-        if not self.config.get('item_download', True):
+        if not self.config.get("item_download", True):
             return
-        LOG.debug("inserting %s to %s", contentsource.url, data['path'])
-        self.store.insert(data['path'], contentsource,
-                          checksums=checksum_util.item_checksums(data),
-                          mutable=False, size=data.get('size'))
-        self._inc_rc(data['path'], src, pedigree)
+        LOG.debug("inserting %s to %s", contentsource.url, data["path"])
+        self.store.insert(
+            data["path"],
+            contentsource,
+            checksums=checksum_util.item_checksums(data),
+            mutable=False,
+            size=data.get("size"),
+        )
+        self._inc_rc(data["path"], src, pedigree)

     def insert_index_entry(self, data, src, pedigree, contentsource):
-        epath = data.get('path', None)
+        epath = data.get("path", None)
         if not epath:
             return
-        self.store.insert(epath, contentsource,
-                          checksums=checksum_util.item_checksums(data))
+        self.store.insert(
+            epath, contentsource, checksums=checksum_util.item_checksums(data)
+        )

     def insert_products(self, path, target, content):
-        dpath = self.products_data_path(target['content_id'])
+        dpath = self.products_data_path(target["content_id"])
         self.store.insert_content(dpath, util.dump_data(target))
         if not path:
             return
@@ -517,16 +566,16 @@ class ObjectStoreMirrorWriter(BasicMirrorWriter):

     def remove_item(self, data, src, target, pedigree):
         util.products_del(target, pedigree)
-        if 'path' not in data:
+        if "path" not in data:
             return
-        if self._dec_rc(data['path'], src, pedigree):
-            self.store.remove(data['path'])
+        if self._dec_rc(data["path"], src, pedigree):
+            self.store.remove(data["path"])


 class ObjectFilterMirror(ObjectStoreMirrorWriter):
     def __init__(self, *args, **kwargs):
         super(ObjectFilterMirror, self).__init__(*args, **kwargs)
-        self.filters = self.config.get('filters', [])
+        self.filters = self.config.get("filters", [])

     def filter_item(self, data, src, target, pedigree):
         return filters.filter_item(self.filters, data, src, pedigree)
@@ -552,15 +601,15 @@ class DryRunMirrorWriter(ObjectFilterMirror):

     def insert_item(self, data, src, target, pedigree, contentsource):
         data = util.products_exdata(src, pedigree)
-        if 'size' in data and 'path' in data:
+        if "size" in data and "path" in data:
             self.downloading.append(
-                (pedigree, data['path'], int(data['size'])))
+                (pedigree, data["path"], int(data["size"]))
+            )

     def remove_item(self, data, src, target, pedigree):
         data = util.products_exdata(src, pedigree)
-        if 'size' in data and 'path' in data:
-            self.removing.append(
-                (pedigree, data['path'], int(data['size'])))
+        if "size" in data and "path" in data:
+            self.removing.append((pedigree, data["path"], int(data["size"])))

     @property
     def size(self):
@@ -573,27 +622,31 @@ def _get_data_content(path, data, content, reader):
     if content is None and path:
         _, content = reader.read(path)
     if isinstance(content, bytes):
-        content = content.decode('utf-8')
+        content = content.decode("utf-8")

     if data is None and content:
         data = util.load_content(content)

     if not data:
-        raise ValueError("Data could not be loaded. "
-                         "Path or content is required")
+        raise ValueError(
+            "Data could not be loaded. " "Path or content is required"
+        )
     return (data, content)


 def check_tree_paths(tree, fmt=None):
     if fmt is None:
-        fmt = tree.get('format')
+        fmt = tree.get("format")
     if fmt == "products:1.0":
+
         def check_path(item, tree, pedigree):
-            util.assert_safe_path(item.get('path'))
+            util.assert_safe_path(item.get("path"))
+
         util.walk_products(tree, cb_item=check_path)
     elif fmt == "index:1.0":
-        index = tree.get('index')
+        index = tree.get("index")
         for content_id in index:
-            util.assert_safe_path(index[content_id].get('path'))
+            util.assert_safe_path(index[content_id].get("path"))
+

 # vi: ts=4 expandtab
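The UrlMirrorReader.__init__ rewrite above shows black's signature handling: when a def cannot fit on one line, black first tries a single indented parameter line and otherwise falls back to one parameter per line with a trailing comma. A sketch of reproducing it (line_length=79 is a guess at this branch's setting, since pyproject.toml is not shown in this part of the diff):

    import black

    sig = (
        "def __init__(self, prefix, mirrors=None, "
        "policy=util.policy_read_signed, user_agent=DEFAULT_USER_AGENT): pass\n"
    )
    # Prints the exploded, one-parameter-per-line form seen above.
    print(black.format_str(sig, mode=black.Mode(line_length=79)))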
diff --git a/simplestreams/mirrors/command_hook.py b/simplestreams/mirrors/command_hook.py
index ab70691..de42623 100644
--- a/simplestreams/mirrors/command_hook.py
+++ b/simplestreams/mirrors/command_hook.py
@@ -15,15 +15,15 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.

-import simplestreams.mirrors as mirrors
-import simplestreams.util as util
-
-import os
 import errno
+import os
 import signal
 import subprocess
 import tempfile

+import simplestreams.mirrors as mirrors
+import simplestreams.util as util
+
 REQUIRED_FIELDS = ("load_products",)
 HOOK_NAMES = (
     "filter_index_entry",
@@ -92,6 +92,7 @@ class CommandHookMirror(mirrors.BasicMirrorWriter):
     If the configuration setting 'item_skip_download' is set to True, then
     'path_url' will be set instead to a url where the item can be found.
     """
+
     def __init__(self, config):
         if isinstance(config, str):
             config = util.load_content(config)
@@ -100,32 +101,34 @@ class CommandHookMirror(mirrors.BasicMirrorWriter):
         super(CommandHookMirror, self).__init__(config=config)

     def load_products(self, path=None, content_id=None):
-        (_rc, output) = self.call_hook('load_products',
-                                       data={'content_id': content_id},
-                                       capture=True)
+        (_rc, output) = self.call_hook(
+            "load_products", data={"content_id": content_id}, capture=True
+        )
         fmt = self.config.get("product_load_output_format", "serial_list")

-        loaded = load_product_output(output=output, content_id=content_id,
-                                     fmt=fmt)
+        loaded = load_product_output(
+            output=output, content_id=content_id, fmt=fmt
+        )
         return loaded

     def filter_index_entry(self, data, src, pedigree):
         mdata = util.stringitems(src)
-        mdata['content_id'] = pedigree[0]
+        mdata["content_id"] = pedigree[0]
         mdata.update(util.stringitems(data))

-        (ret, _output) = self.call_hook('filter_index_entry', data=mdata,
-                                        rcs=[0, 1])
+        (ret, _output) = self.call_hook(
+            "filter_index_entry", data=mdata, rcs=[0, 1]
+        )
         return ret == 0

     def filter_product(self, data, src, target, pedigree):
-        return self._call_filter('filter_product', src, pedigree)
+        return self._call_filter("filter_product", src, pedigree)

     def filter_version(self, data, src, target, pedigree):
-        return self._call_filter('filter_version', src, pedigree)
+        return self._call_filter("filter_version", src, pedigree)

     def filter_item(self, data, src, target, pedigree):
-        return self._call_filter('filter_item', src, pedigree)
+        return self._call_filter("filter_item", src, pedigree)

     def _call_filter(self, name, src, pedigree):
         data = util.products_exdata(src, pedigree)
@@ -133,20 +136,27 @@ class CommandHookMirror(mirrors.BasicMirrorWriter):
         return ret == 0

     def insert_index(self, path, src, content):
-        return self.call_hook('insert_index', data=src, content=content,
-                              extra={'path': path})
+        return self.call_hook(
+            "insert_index", data=src, content=content, extra={"path": path}
+        )

     def insert_products(self, path, target, content):
-        return self.call_hook('insert_products', data=target,
-                              content=content, extra={'path': path})
+        return self.call_hook(
+            "insert_products",
+            data=target,
+            content=content,
+            extra={"path": path},
+        )

     def insert_product(self, data, src, target, pedigree):
-        return self.call_hook('insert_product',
-                              data=util.products_exdata(src, pedigree))
+        return self.call_hook(
+            "insert_product", data=util.products_exdata(src, pedigree)
+        )

     def insert_version(self, data, src, target, pedigree):
-        return self.call_hook('insert_version',
-                              data=util.products_exdata(src, pedigree))
+        return self.call_hook(
+            "insert_version", data=util.products_exdata(src, pedigree)
+        )

     def insert_item(self, data, src, target, pedigree, contentsource):
         mdata = util.products_exdata(src, pedigree)
@@ -154,43 +164,47 @@ class CommandHookMirror(mirrors.BasicMirrorWriter):
         tmp_path = None
         tmp_del = None
         extra = {}
-        if 'path' in data:
-            extra.update({'item_url': contentsource.url})
-            if not self.config.get('item_skip_download', False):
+        if "path" in data:
+            extra.update({"item_url": contentsource.url})
+            if not self.config.get("item_skip_download", False):
                 try:
                     (tmp_path, tmp_del) = util.get_local_copy(contentsource)
-                    extra['path_local'] = tmp_path
+                    extra["path_local"] = tmp_path
                 finally:
                     contentsource.close()

         try:
-            ret = self.call_hook('insert_item', data=mdata, extra=extra)
+            ret = self.call_hook("insert_item", data=mdata, extra=extra)
         finally:
             if tmp_del and os.path.exists(tmp_path):
                 os.unlink(tmp_path)
         return ret

     def remove_product(self, data, src, target, pedigree):
-        return self.call_hook('remove_product',
-                              data=util.products_exdata(src, pedigree))
+        return self.call_hook(
+            "remove_product", data=util.products_exdata(src, pedigree)
+        )

     def remove_version(self, data, src, target, pedigree):
-        return self.call_hook('remove_version',
-                              data=util.products_exdata(src, pedigree))
+        return self.call_hook(
+            "remove_version", data=util.products_exdata(src, pedigree)
        )

     def remove_item(self, data, src, target, pedigree):
-        return self.call_hook('remove_item',
-                              data=util.products_exdata(target, pedigree))
+        return self.call_hook(
+            "remove_item", data=util.products_exdata(target, pedigree)
+        )

-    def call_hook(self, hookname, data, capture=False, rcs=None, extra=None,
-                  content=None):
+    def call_hook(
+        self, hookname, data, capture=False, rcs=None, extra=None, content=None
+    ):
         command = self.config.get(hookname, self.config.get(DEFAULT_HOOK_NAME))
         if not command:
             # return successful execution with no output
-            return (0, '')
+            return (0, "")

         if isinstance(command, str):
-            command = ['sh', '-c', command]
+            command = ["sh", "-c", command]

         fdata = util.stringitems(data)

@@ -200,16 +214,20 @@ class CommandHookMirror(mirrors.BasicMirrorWriter):
             tfile = os.fdopen(tfd, "w")
             tfile.write(content)
             tfile.close()
-            fdata['content_file_path'] = content_file
+            fdata["content_file_path"] = content_file

         if extra:
             fdata.update(extra)
-        fdata['HOOK'] = hookname
+        fdata["HOOK"] = hookname

         try:
-            return call_hook(command=command, data=fdata,
-                             unset=self.config.get('unset_value', None),
-                             capture=capture, rcs=rcs)
+            return call_hook(
+                command=command,
+                data=fdata,
+                unset=self.config.get("unset_value", None),
+                capture=capture,
+                rcs=rcs,
+            )
         finally:
             if content_file:
                 os.unlink(content_file)
@@ -219,7 +237,7 @@ def call_hook(command, data, unset=None, capture=False, rcs=None):
     env = os.environ.copy()
     data = data.copy()

-    data[ENV_FIELDS_NAME] = ' '.join([k for k in data if k != ENV_HOOK_NAME])
+    data[ENV_FIELDS_NAME] = " ".join([k for k in data if k != ENV_HOOK_NAME])

     mcommand = render(command, data, unset=unset)

@@ -257,12 +275,12 @@ def load_product_output(output, content_id, fmt="serial_list"):

     if fmt == "serial_list":
         # "line" format just is a list of serials that are present
-        working = {'content_id': content_id, 'products': {}}
+        working = {"content_id": content_id, "products": {}}
         for line in output.splitlines():
             (product_id, version) = line.split(None, 1)
-            if product_id not in working['products']:
-                working['products'][product_id] = {'versions': {}}
-            working['products'][product_id]['versions'][version] = {}
+            if product_id not in working["products"]:
+                working["products"][product_id] = {"versions": {}}
+            working["products"][product_id]["versions"][version] = {}
         return working

     elif fmt == "json":
@@ -293,10 +311,11 @@ def run_command(cmd, env=None, capture=False, rcs=None):
         raise subprocess.CalledProcessError(rc, cmd)

     if out is None:
-        out = ''
+        out = ""
     elif isinstance(out, bytes):
-        out = out.decode('utf-8')
+        out = out.decode("utf-8")

     return (rc, out)

+
 # vi: ts=4 expandtab syntax=python
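Hunks like these are exactly what the new pre-commit hooks are meant to keep from regressing. A hedged sketch of the equivalent one-off check from a working tree (--check/--diff and --check-only/--diff are black's and isort's standard CLI flags; the tox/lpci wiring itself lives outside this part of the diff):

    import subprocess

    for tool in (["black", "--check", "--diff", "."],
                 ["isort", "--check-only", "--diff", "."]):
        # Exit status 0 means the tree is already formatted.
        result = subprocess.run(tool)
        print(tool[0], "->", result.returncode)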
diff --git a/simplestreams/mirrors/glance.py b/simplestreams/mirrors/glance.py
index b96f8eb..22e46e9 100644
--- a/simplestreams/mirrors/glance.py
+++ b/simplestreams/mirrors/glance.py
@@ -15,42 +15,47 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.
 
-import simplestreams.filters as filters
-import simplestreams.mirrors as mirrors
-import simplestreams.util as util
-from simplestreams import checksum_util
-import simplestreams.openstack as openstack
-from simplestreams.log import LOG
-
-import copy
 import collections
+import copy
 import errno
-import glanceclient
 import json
 import os
 import re
 
+import glanceclient
+
+import simplestreams.filters as filters
+import simplestreams.mirrors as mirrors
+import simplestreams.openstack as openstack
+import simplestreams.util as util
+from simplestreams import checksum_util
+from simplestreams.log import LOG
+
 
-def get_glanceclient(version='1', **kwargs):
+def get_glanceclient(version="1", **kwargs):
     # newer versions of the glanceclient will do this 'strip_version' for
     # us, but older versions do not.
-    kwargs['endpoint'] = _strip_version(kwargs['endpoint'])
-    pt = ('endpoint', 'token', 'insecure', 'cacert')
+    kwargs["endpoint"] = _strip_version(kwargs["endpoint"])
+    pt = ("endpoint", "token", "insecure", "cacert")
     kskw = {k: kwargs.get(k) for k in pt if k in kwargs}
-    if kwargs.get('session'):
-        sess = kwargs.get('session')
+    if kwargs.get("session"):
+        sess = kwargs.get("session")
         return glanceclient.Client(version, session=sess)
     else:
         return glanceclient.Client(version, **kskw)
 
 
 def empty_iid_products(content_id):
-    return {'content_id': content_id, 'products': {},
-            'datatype': 'image-ids', 'format': 'products:1.0'}
+    return {
+        "content_id": content_id,
+        "products": {},
+        "datatype": "image-ids",
+        "format": "products:1.0",
+    }
 
 
 def canonicalize_arch(arch):
-    '''Canonicalize Ubuntu archs for use in OpenStack'''
+    """Canonicalize Ubuntu archs for use in OpenStack"""
     newarch = arch.lower()
     if newarch == "amd64":
         newarch = "x86_64"
@@ -68,21 +73,21 @@ def canonicalize_arch(arch):
 
 
 LXC_FTYPES = {
-    'root.tar.gz': 'root-tar',
-    'root.tar.xz': 'root-tar',
-    'squashfs': 'squashfs',
+    "root.tar.gz": "root-tar",
+    "root.tar.xz": "root-tar",
+    "squashfs": "squashfs",
 }
 
 QEMU_FTYPES = {
-    'disk.img': 'qcow2',
-    'disk1.img': 'qcow2',
+    "disk.img": "qcow2",
+    "disk1.img": "qcow2",
 }
 
 
 def disk_format(ftype):
-    '''Canonicalize disk formats for use in OpenStack.
+    """Canonicalize disk formats for use in OpenStack.
     Input ftype is a 'ftype' from a simplestream feed.
-    Return value is the appropriate 'disk_format' for glance.'''
+    Return value is the appropriate 'disk_format' for glance."""
     newftype = ftype.lower()
     if newftype in LXC_FTYPES:
         return LXC_FTYPES[newftype]
@@ -92,22 +97,22 @@ def disk_format(ftype):
 
 
 def hypervisor_type(ftype):
-    '''Determine hypervisor type based on image format'''
+    """Determine hypervisor type based on image format"""
     newftype = ftype.lower()
     if newftype in LXC_FTYPES:
-        return 'lxc'
+        return "lxc"
     if newftype in QEMU_FTYPES:
-        return 'qemu'
+        return "qemu"
     return None
 
 
 def virt_type(hypervisor_type):
-    '''Map underlying hypervisor types into high level virt types'''
+    """Map underlying hypervisor types into high level virt types"""
     newhtype = hypervisor_type.lower()
-    if newhtype == 'qemu':
-        return 'kvm'
-    if newhtype == 'lxc':
-        return 'lxd'
+    if newhtype == "qemu":
+        return "kvm"
+    if newhtype == "lxc":
+        return "lxd"
     return None
 
 
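Only quoting changes in the mapping helpers above, so the ftype chain is easy to verify directly. Assuming simplestreams.mirrors.glance is importable, these all hold by the tables and functions shown:

    from simplestreams.mirrors.glance import disk_format, hypervisor_type, virt_type

    assert disk_format("disk.img") == "qcow2"        # via QEMU_FTYPES
    assert disk_format("root.tar.xz") == "root-tar"  # via LXC_FTYPES
    assert hypervisor_type("squashfs") == "lxc"
    assert hypervisor_type("disk1.img") == "qemu"
    assert virt_type("qemu") == "kvm" and virt_type("lxc") == "lxd"
    assert hypervisor_type("iso") is None            # unknown ftype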
@@ -120,20 +125,29 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
     `client` argument is used for testing to override openstack module:
     allows dependency injection of fake "openstack" module.
     """
-    def __init__(self, config, objectstore=None, region=None,
-                 name_prefix=None, progress_callback=None,
-                 client=None):
+
+    def __init__(
+        self,
+        config,
+        objectstore=None,
+        region=None,
+        name_prefix=None,
+        progress_callback=None,
+        client=None,
+    ):
         super(GlanceMirror, self).__init__(config=config)
 
-        self.item_filters = self.config.get('item_filters', [])
+        self.item_filters = self.config.get("item_filters", [])
         if len(self.item_filters) == 0:
-            self.item_filters = ['ftype~(disk1.img|disk.img)',
-                                 'arch~(x86_64|amd64|i386)']
+            self.item_filters = [
+                "ftype~(disk1.img|disk.img)",
+                "arch~(x86_64|amd64|i386)",
+            ]
         self.item_filters = filters.get_filters(self.item_filters)
 
-        self.index_filters = self.config.get('index_filters', [])
+        self.index_filters = self.config.get("index_filters", [])
         if len(self.index_filters) == 0:
-            self.index_filters = ['datatype=image-downloads']
+            self.index_filters = ["datatype=image-downloads"]
         self.index_filters = filters.get_filters(self.index_filters)
 
         self.loaded_content = {}
@@ -146,21 +160,28 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
 
         self.name_prefix = name_prefix or ""
         if region is not None:
-            self.keystone_creds['region_name'] = region
+            self.keystone_creds["region_name"] = region
 
         self.progress_callback = progress_callback
 
         conn_info = client.get_service_conn_info(
-            'image', **self.keystone_creds)
-        self.glance_api_version = conn_info['glance_version']
-        self.gclient = get_glanceclient(version=self.glance_api_version,
-                                        **conn_info)
-        self.tenant_id = conn_info['tenant_id']
-
-        self.region = self.keystone_creds.get('region_name', 'nullregion')
-        self.cloudname = config.get("cloud_name", 'nullcloud')
-        self.crsn = '-'.join((self.cloudname, self.region,))
-        self.auth_url = self.keystone_creds['auth_url']
+            "image", **self.keystone_creds
+        )
+        self.glance_api_version = conn_info["glance_version"]
+        self.gclient = get_glanceclient(
+            version=self.glance_api_version, **conn_info
+        )
+        self.tenant_id = conn_info["tenant_id"]
+
+        self.region = self.keystone_creds.get("region_name", "nullregion")
+        self.cloudname = config.get("cloud_name", "nullcloud")
+        self.crsn = "-".join(
+            (
+                self.cloudname,
+                self.region,
+            )
+        )
+        self.auth_url = self.keystone_creds["auth_url"]
 
         self.content_id = config.get("content_id")
         self.modify_hook = config.get("modify_hook")
@@ -170,7 +191,7 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
             raise TypeError("content_id is required")
 
         self.custom_properties = collections.OrderedDict(
-            prop.split('=') for prop in config.get("custom_properties", [])
+            prop.split("=") for prop in config.get("custom_properties", [])
         )
         self.visibility = config.get("visibility", "public")
 
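One note on the reflowed custom_properties line: prop.split("=") splits on every "=", so each entry must be a plain key=value pair. A small sketch with invented config values:

    import collections

    config = {"custom_properties": ["os_distro=ubuntu", "os_version=22.04"]}
    custom_properties = collections.OrderedDict(
        prop.split("=") for prop in config.get("custom_properties", [])
    )
    assert custom_properties == {"os_distro": "ubuntu", "os_version": "22.04"}
    # A value containing "=" would need prop.split("=", 1) instead.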
@@ -208,82 +229,100 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         for image in images:
             if self.glance_api_version == "1":
                 image = image.to_dict()
-                props = image['properties']
+                props = image["properties"]
             else:
                 props = copy.deepcopy(image)
 
-            if image['owner'] != self.tenant_id:
+            if image["owner"] != self.tenant_id:
                 continue
 
-            if props.get('content_id') != my_cid:
+            if props.get("content_id") != my_cid:
                 continue
 
-            if image.get('status') != "active":
-                LOG.warning("Ignoring inactive image %s with status '%s'" % (
-                    image['id'], image.get('status')))
+            if image.get("status") != "active":
+                LOG.warning(
+                    "Ignoring inactive image %s with status '%s'"
+                    % (image["id"], image.get("status"))
+                )
                 continue
 
-            source_content_id = props.get('source_content_id')
+            source_content_id = props.get("source_content_id")
 
-            product = props.get('product_name')
-            version = props.get('version_name')
-            item = props.get('item_name')
+            product = props.get("product_name")
+            version = props.get("version_name")
+            item = props.get("item_name")
             if not (version and product and item and source_content_id):
-                LOG.warning("%s missing required fields" % image['id'])
+                LOG.warning("%s missing required fields" % image["id"])
                 continue
 
             # get data from the datastore for this item, if it exists
             # and then update that with glance data (just in case different)
             try:
-                item_data = util.products_exdata(store_t,
-                                                 (product, version, item,),
-                                                 include_top=False,
-                                                 insert_fieldnames=False)
+                item_data = util.products_exdata(
+                    store_t,
+                    (
+                        product,
+                        version,
+                        item,
+                    ),
+                    include_top=False,
+                    insert_fieldnames=False,
+                )
             except KeyError:
                 item_data = {}
 
             # If original simplestreams-metadata is stored on the image,
             # use that as well.
-            if 'simplestreams_metadata' in props:
+            if "simplestreams_metadata" in props:
                 simplestreams_metadata = json.loads(
-                    props.get('simplestreams_metadata'))
+                    props.get("simplestreams_metadata")
+                )
             else:
                 simplestreams_metadata = {}
             item_data.update(simplestreams_metadata)
 
-            item_data.update({'name': image['name'], 'id': image['id']})
-            if 'owner_id' not in item_data:
-                item_data['owner_id'] = self.tenant_id
+            item_data.update({"name": image["name"], "id": image["id"]})
+            if "owner_id" not in item_data:
+                item_data["owner_id"] = self.tenant_id
 
-            util.products_set(glance_t, item_data,
-                              (product, version, item,))
+            util.products_set(
+                glance_t,
+                item_data,
+                (
+                    product,
+                    version,
+                    item,
+                ),
+            )
 
-        for product in glance_t['products']:
-            glance_t['products'][product]['region'] = self.region
-            glance_t['products'][product]['endpoint'] = self.auth_url
+        for product in glance_t["products"]:
+            glance_t["products"][product]["region"] = self.region
+            glance_t["products"][product]["endpoint"] = self.auth_url
 
         return glance_t
 
     def filter_item(self, data, src, target, pedigree):
         return filters.filter_item(self.item_filters, data, src, pedigree)
 
-    def create_glance_properties(self, content_id, source_content_id,
-                                 image_metadata, hypervisor_mapping):
+    def create_glance_properties(
+        self, content_id, source_content_id, image_metadata, hypervisor_mapping
+    ):
         """
         Construct extra properties to store in Glance for an image.
 
         Based on source image metadata.
         """
         properties = {
-            'content_id': content_id,
-            'source_content_id': source_content_id,
+            "content_id": content_id,
+            "source_content_id": source_content_id,
         }
         # An iterator of properties to carry over: if a property needs
         # renaming, uses a tuple (old name, new name).
-        carry_over_simple = (
-            'product_name', 'version_name', 'item_name')
+        carry_over_simple = ("product_name", "version_name", "item_name")
         carry_over = carry_over_simple + (
-            ('os', 'os_distro'), ('version', 'os_version'))
+            ("os", "os_distro"),
+            ("version", "os_version"),
+        )
         for carry_over_property in carry_over:
             if isinstance(carry_over_property, tuple):
                 name_old, name_new = carry_over_property
@@ -291,33 +330,41 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
                 name_old = name_new = carry_over_property
             properties[name_new] = image_metadata.get(name_old)
 
-        if 'arch' in image_metadata:
-            properties['architecture'] = canonicalize_arch(
-                image_metadata['arch'])
+        if "arch" in image_metadata:
+            properties["architecture"] = canonicalize_arch(
+                image_metadata["arch"]
+            )
 
-        if hypervisor_mapping and 'ftype' in image_metadata:
-            _hypervisor_type = hypervisor_type(image_metadata['ftype'])
+        if hypervisor_mapping and "ftype" in image_metadata:
+            _hypervisor_type = hypervisor_type(image_metadata["ftype"])
             if _hypervisor_type:
-                properties['hypervisor_type'] = _hypervisor_type
+                properties["hypervisor_type"] = _hypervisor_type
 
         properties.update(self.custom_properties)
 
         if self.set_latest_property:
-            properties['latest'] = "true"
+            properties["latest"] = "true"
 
         # Store flattened metadata for a source image along with the
         # image in 'simplestreams_metadata' property.
         simplestreams_metadata = image_metadata.copy()
-        drop_keys = carry_over_simple + ('path',)
+        drop_keys = carry_over_simple + ("path",)
         for remove_key in drop_keys:
             if remove_key in simplestreams_metadata:
                 del simplestreams_metadata[remove_key]
-        properties['simplestreams_metadata'] = json.dumps(
-            simplestreams_metadata, sort_keys=True)
+        properties["simplestreams_metadata"] = json.dumps(
+            simplestreams_metadata, sort_keys=True
+        )
         return properties
 
-    def prepare_glance_arguments(self, full_image_name, image_metadata,
-                                 image_md5_hash, image_size, image_properties):
+    def prepare_glance_arguments(
+        self,
+        full_image_name,
+        image_metadata,
+        image_md5_hash,
+        image_size,
+        image_properties,
+    ):
         """
         Prepare arguments to pass into Glance image creation method.
 
@@ -334,37 +381,39 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         GlanceClient.images.create().
         """
         create_kwargs = {
-            'name': full_image_name,
-            'container_format': 'bare',
-            'is_public': self.visibility == 'public',
-            'properties': image_properties,
+            "name": full_image_name,
+            "container_format": "bare",
+            "is_public": self.visibility == "public",
+            "properties": image_properties,
         }
 
         # In v2 is_public=True is visibility='public'
         if self.glance_api_version == "2":
-            del create_kwargs['is_public']
-            create_kwargs['visibility'] = self.visibility
+            del create_kwargs["is_public"]
+            create_kwargs["visibility"] = self.visibility
 
         # v2 automatically calculates size and checksum
         if self.glance_api_version == "1":
-            if 'size' in image_metadata:
-                create_kwargs['size'] = int(image_metadata.get('size'))
-            if 'md5' in image_metadata:
-                create_kwargs['checksum'] = image_metadata.get('md5')
+            if "size" in image_metadata:
+                create_kwargs["size"] = int(image_metadata.get("size"))
+            if "md5" in image_metadata:
+                create_kwargs["checksum"] = image_metadata.get("md5")
             if image_md5_hash and image_size:
-                create_kwargs.update({
-                    'checksum': image_md5_hash,
-                    'size': image_size,
-                })
+                create_kwargs.update(
+                    {
+                        "checksum": image_md5_hash,
+                        "size": image_size,
+                    }
+                )
 
         if self.image_import_conversion:
-            create_kwargs['disk_format'] = 'raw'
-        elif 'ftype' in image_metadata:
-            create_kwargs['disk_format'] = (
-                disk_format(image_metadata['ftype']) or 'qcow2'
+            create_kwargs["disk_format"] = "raw"
+        elif "ftype" in image_metadata:
+            create_kwargs["disk_format"] = (
+                disk_format(image_metadata["ftype"]) or "qcow2"
             )
         else:
-            create_kwargs['disk_format'] = 'qcow2'
+            create_kwargs["disk_format"] = "qcow2"
 
         return create_kwargs
 
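The v1/v2 branches above are behavior-preserving after the reflow. A reduced, stand-alone restatement of the version handling (names and values invented; this is a sketch, not the class method itself):

    def demo_create_kwargs(glance_api_version, visibility="public"):
        create_kwargs = {
            "name": "example-image",
            "container_format": "bare",
            "is_public": visibility == "public",
            "properties": {},
        }
        # In v2, is_public=True becomes visibility="public"
        if glance_api_version == "2":
            del create_kwargs["is_public"]
            create_kwargs["visibility"] = visibility
        create_kwargs["disk_format"] = "qcow2"
        return create_kwargs

    assert demo_create_kwargs("1")["is_public"] is True
    assert demo_create_kwargs("2")["visibility"] == "public"
    assert "is_public" not in demo_create_kwargs("2")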
@@ -378,37 +427,51 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         Returns a tuple of
         (str(local-image-path), int(image-size), str(image-md5-hash)).
         """
-        image_name = image_stream_data.get('pubname')
-        image_size = image_stream_data.get('size')
+        image_name = image_stream_data.get("pubname")
+        image_size = image_stream_data.get("size")
 
         if self.progress_callback:
+
             def progress_wrapper(written):
                 self.progress_callback(
-                    dict(status="Downloading", name=image_name,
-                         size=None if image_size is None else int(image_size),
-                         written=written))
+                    dict(
+                        status="Downloading",
+                        name=image_name,
+                        size=None if image_size is None else int(image_size),
+                        written=written,
+                    )
+                )
+
         else:
+
             def progress_wrapper(written):
                 pass
 
         try:
             tmp_path, _ = util.get_local_copy(
-                contentsource, progress_callback=progress_wrapper)
+                contentsource, progress_callback=progress_wrapper
+            )
 
             if self.modify_hook:
                 (new_size, new_md5) = call_hook(
-                    item=image_stream_data, path=tmp_path,
-                    cmd=self.modify_hook)
+                    item=image_stream_data, path=tmp_path, cmd=self.modify_hook
+                )
             else:
                 new_size = os.path.getsize(tmp_path)
-                new_md5 = image_stream_data.get('md5')
+                new_md5 = image_stream_data.get("md5")
         finally:
             contentsource.close()
 
         return tmp_path, new_size, new_md5
 
-    def adapt_source_entry(self, source_entry, hypervisor_mapping, image_name,
-                           image_md5_hash, image_size):
+    def adapt_source_entry(
+        self,
+        source_entry,
+        hypervisor_mapping,
+        image_name,
+        image_md5_hash,
+        image_size,
+    ):
         """
         Adapts the source simplestreams dict `source_entry` for use in the
         generated local simplestreams index.
@@ -416,26 +479,30 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         output_entry = source_entry.copy()
 
         # Drop attributes not needed for the simplestreams index itself.
-        for property_name in ('path', 'product_name', 'version_name',
-                              'item_name'):
+        for property_name in (
+            "path",
+            "product_name",
+            "version_name",
+            "item_name",
+        ):
             if property_name in output_entry:
                 del output_entry[property_name]
 
-        if hypervisor_mapping and 'ftype' in output_entry:
-            _hypervisor_type = hypervisor_type(output_entry['ftype'])
+        if hypervisor_mapping and "ftype" in output_entry:
+            _hypervisor_type = hypervisor_type(output_entry["ftype"])
             if _hypervisor_type:
                 _virt_type = virt_type(_hypervisor_type)
                 if _virt_type:
-                    output_entry['virt'] = _virt_type
+                    output_entry["virt"] = _virt_type
 
-        output_entry['region'] = self.region
-        output_entry['endpoint'] = self.auth_url
-        output_entry['owner_id'] = self.tenant_id
+        output_entry["region"] = self.region
+        output_entry["endpoint"] = self.auth_url
+        output_entry["owner_id"] = self.tenant_id
 
-        output_entry['name'] = image_name
+        output_entry["name"] = image_name
         if image_md5_hash and image_size:
-            output_entry['md5'] = image_md5_hash
-            output_entry['size'] = str(image_size)
+            output_entry["md5"] = image_md5_hash
+            output_entry["size"] = str(image_size)
 
         return output_entry
 
@@ -459,58 +526,74 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         # (product-name, version-name, image-type)
         # from the tuple `pedigree` in the source simplestreams index.
         flattened_img_data = util.products_exdata(
-            src, pedigree, include_top=False)
+            src, pedigree, include_top=False
+        )
 
         tmp_path = None
 
         full_image_name = "{}{}".format(
             self.name_prefix,
-            flattened_img_data.get('pubname', flattened_img_data.get('name')))
-        if not full_image_name.endswith(flattened_img_data['item_name']):
-            full_image_name += "-{}".format(flattened_img_data['item_name'])
+            flattened_img_data.get("pubname", flattened_img_data.get("name")),
+        )
+        if not full_image_name.endswith(flattened_img_data["item_name"]):
+            full_image_name += "-{}".format(flattened_img_data["item_name"])
 
         # Download images locally into a temporary file.
         tmp_path, new_size, new_md5 = self.download_image(
-            contentsource, flattened_img_data)
+            contentsource, flattened_img_data
+        )
 
-        hypervisor_mapping = self.config.get('hypervisor_mapping', False)
+        hypervisor_mapping = self.config.get("hypervisor_mapping", False)
 
         glance_props = self.create_glance_properties(
-            target['content_id'], src['content_id'], flattened_img_data,
-            hypervisor_mapping)
+            target["content_id"],
+            src["content_id"],
+            flattened_img_data,
+            hypervisor_mapping,
+        )
         LOG.debug("glance properties %s", glance_props)
         create_kwargs = self.prepare_glance_arguments(
-            full_image_name, flattened_img_data, new_md5, new_size,
-            glance_props)
+            full_image_name,
+            flattened_img_data,
+            new_md5,
+            new_size,
+            glance_props,
+        )
 
         target_sstream_item = self.adapt_source_entry(
-            flattened_img_data, hypervisor_mapping, full_image_name, new_md5,
-            new_size)
+            flattened_img_data,
+            hypervisor_mapping,
+            full_image_name,
+            new_md5,
+            new_size,
+        )
 
         try:
             if self.glance_api_version == "1":
                 # Set data as string if v1
-                create_kwargs['data'] = open(tmp_path, 'rb')
+                create_kwargs["data"] = open(tmp_path, "rb")
             else:
                 # Keep properties for v2 update call
-                _properties = create_kwargs['properties']
-                del create_kwargs['properties']
+                _properties = create_kwargs["properties"]
+                del create_kwargs["properties"]
 
             LOG.debug("glance create_kwargs %s", create_kwargs)
             glance_image = self.gclient.images.create(**create_kwargs)
-            target_sstream_item['id'] = glance_image.id
+            target_sstream_item["id"] = glance_image.id
 
             if self.glance_api_version == "2":
                 if self.image_import_conversion:
                     # Stage the image before starting import
-                    self.gclient.images.stage(glance_image.id,
-                                              open(tmp_path, 'rb'))
+                    self.gclient.images.stage(
+                        glance_image.id, open(tmp_path, "rb")
+                    )
                     # Import the Glance image
                     self.gclient.images.image_import(glance_image.id)
                 else:
                     # Upload for v2
-                    self.gclient.images.upload(glance_image.id,
-                                               open(tmp_path, 'rb'))
+                    self.gclient.images.upload(
+                        glance_image.id, open(tmp_path, "rb")
+                    )
                     # Update properties for v2
                     self.gclient.images.update(glance_image.id, **_properties)
 
@@ -527,15 +610,19 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
             # self.load_products() instead
             if self.set_latest_property:
                 # Search all images with the same target attributtes
-                _filter_properties = {'filters': {
-                    'latest': 'true',
-                    'os_version': glance_props['os_version'],
-                    'architecture': glance_props['architecture']}}
+                _filter_properties = {
+                    "filters": {
+                        "latest": "true",
+                        "os_version": glance_props["os_version"],
+                        "architecture": glance_props["architecture"],
+                    }
+                }
                 images = self.gclient.images.list(**_filter_properties)
                 for image in images:
                     if image.id != glance_image.id:
-                        self.gclient.images.update(image.id,
-                                                   remove_props=['latest'])
+                        self.gclient.images.update(
+                            image.id, remove_props=["latest"]
+                        )
 
         finally:
             if tmp_path and os.path.exists(tmp_path):
@@ -562,10 +649,10 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         found = self.gclient.images.get(image_id)
         if found.size == size and found.checksum == checksum:
             return
-        msg = (
-            ("Invalid glance image: %s. " % image_id) +
-            ("Expected size=%s md5=%s. Found size=%s md5=%s." %
-             (size, checksum, found.size, found.checksum)))
+        msg = ("Invalid glance image: %s. " % image_id) + (
+            "Expected size=%s md5=%s. Found size=%s md5=%s."
+            % (size, checksum, found.size, found.checksum)
+        )
         if delete:
             LOG.warning("Deleting image %s: %s", image_id, msg)
             self.gclient.images.delete(image_id)
@@ -587,13 +674,15 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         if version_name not in self.inserts[product_name]:
             self.inserts[product_name][version_name] = {}
 
-        if 'ftype' in data:
-            ftype = data['ftype']
+        if "ftype" in data:
+            ftype = data["ftype"]
         else:
             flat = util.products_exdata(src, pedigree, include_top=False)
-            ftype = flat.get('ftype')
+            ftype = flat.get("ftype")
         self.inserts[product_name][version_name][item_name] = (
-            ftype, (data, src, target, pedigree, contentsource))
+            ftype,
+            (data, src, target, pedigree, contentsource),
+        )
 
     def insert_version(self, data, src, target, pedigree):
         """Upload all images for this version into glance
@@ -605,14 +694,20 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         product_name, version_name = pedigree
         inserts = self.inserts.get(product_name, {}).get(version_name, [])
 
-        rtar_names = [f for f in inserts
-                      if inserts[f][0] in ('root.tar.gz', 'root.tar.xz')]
+        rtar_names = [
+            f
+            for f in inserts
+            if inserts[f][0] in ("root.tar.gz", "root.tar.xz")
+        ]
 
         for _iname, (ftype, iargs) in inserts.items():
             if ftype == "squashfs" and rtar_names:
-                LOG.info("[%s] Skipping ftype 'squashfs' image in preference"
-                         "for root tarball type in %s",
-                         '/'.join(pedigree), rtar_names)
+                LOG.info(
+                    "[%s] Skipping ftype 'squashfs' image in preference"
+                    "for root tarball type in %s",
+                    "/".join(pedigree),
+                    rtar_names,
+                )
                 continue
             self._insert_item(*iargs)
 
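The reflowed comprehension and LOG.info call keep the same preference rule: squashfs items are skipped for a version whenever a root tarball is queued alongside them. (Incidentally, the implicit string concatenation in the log message still yields "preferencefor", both before and after the reformat.) A sketch with an invented inserts mapping:

    inserts = {
        "squashfs": ("squashfs", "upload-args-a"),
        "root.tar.xz": ("root.tar.xz", "upload-args-b"),
    }
    rtar_names = [
        f for f in inserts if inserts[f][0] in ("root.tar.gz", "root.tar.xz")
    ]
    uploaded = [
        name
        for name, (ftype, _args) in inserts.items()
        if not (ftype == "squashfs" and rtar_names)
    ]
    assert uploaded == ["root.tar.xz"]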
@@ -622,9 +717,9 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
 
     def remove_item(self, data, src, target, pedigree):
         util.products_del(target, pedigree)
-        if 'id' in data:
-            print("removing %s: %s" % (data['id'], data['name']))
-            self.gclient.images.delete(data['id'])
+        if "id" in data:
+            print("removing %s: %s" % (data["id"], data["name"]))
+            self.gclient.images.delete(data["id"])
 
     def filter_index_entry(self, data, src, pedigree):
         return filters.filter_dict(self.index_filters, data)
@@ -637,18 +732,18 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         util.products_prune(tree, preserve_empty_products=True)
 
         # stop these items from copying up when we call condense
-        sticky = ['ftype', 'md5', 'sha256', 'size', 'name', 'id']
+        sticky = ["ftype", "md5", "sha256", "size", "name", "id"]
 
         # LP: #1329805. Juju expects these on the item.
-        if self.config.get('sticky_endpoint_region', True):
-            sticky += ['endpoint', 'region']
+        if self.config.get("sticky_endpoint_region", True):
+            sticky += ["endpoint", "region"]
 
         util.products_condense(tree, sticky=sticky)
 
         tsnow = util.timestamp()
-        tree['updated'] = tsnow
+        tree["updated"] = tsnow
 
-        dpath = self._cidpath(tree['content_id'])
+        dpath = self._cidpath(tree["content_id"])
         LOG.info("writing data: %s", dpath)
         self.store.insert_content(dpath, util.dump_data(tree))
 
@@ -659,17 +754,20 @@ class GlanceMirror(mirrors.BasicMirrorWriter):
         except IOError as exc:
             if exc.errno != errno.ENOENT:
                 raise
-        index = {"index": {}, 'format': 'index:1.0',
-                 'updated': util.timestamp()}
-
-        index['index'][tree['content_id']] = {
-            'updated': tsnow,
-            'datatype': 'image-ids',
-            'clouds': [{'region': self.region, 'endpoint': self.auth_url}],
-            'cloudname': self.cloudname,
-            'path': dpath,
-            'products': list(tree['products'].keys()),
-            'format': tree['format'],
+        index = {
+            "index": {},
+            "format": "index:1.0",
+            "updated": util.timestamp(),
+        }
+
+        index["index"][tree["content_id"]] = {
+            "updated": tsnow,
+            "datatype": "image-ids",
+            "clouds": [{"region": self.region, "endpoint": self.auth_url}],
+            "cloudname": self.cloudname,
+            "path": dpath,
+            "products": list(tree["products"].keys()),
+            "format": tree["format"],
         }
         LOG.info("writing data: %s", ipath)
         self.store.insert_content(ipath, util.dump_data(index))
@@ -694,13 +792,13 @@ class ItemInfoDryRunMirror(GlanceMirror):
 
     def insert_item(self, data, src, target, pedigree, contentsource):
         data = util.products_exdata(src, pedigree)
-        if 'size' in data and 'path' in data and 'pubname' in data:
-            self.items[data['pubname']] = int(data['size'])
+        if "size" in data and "path" in data and "pubname" in data:
+            self.items[data["pubname"]] = int(data["size"])
 
 
 def _checksum_file(fobj, read_size=util.READ_SIZE, checksums=None):
     if checksums is None:
-        checksums = {'md5': None}
+        checksums = {"md5": None}
     cksum = checksum_util.checksummer(checksums=checksums)
     while True:
         buf = fobj.read(read_size)
@@ -713,13 +811,13 @@ def _checksum_file(fobj, read_size=util.READ_SIZE, checksums=None):
 def call_hook(item, path, cmd):
     env = os.environ.copy()
     env.update(item)
-    env['IMAGE_PATH'] = path
-    env['FIELDS'] = ' '.join(item.keys()) + ' IMAGE_PATH'
+    env["IMAGE_PATH"] = path
+    env["FIELDS"] = " ".join(item.keys()) + " IMAGE_PATH"
 
     util.subp(cmd, env=env, capture=False)
 
     with open(path, "rb") as fp:
-        md5 = _checksum_file(fp, checksums={'md5': None})
+        md5 = _checksum_file(fp, checksums={"md5": None})
 
     return (os.path.getsize(path), md5)
 
@@ -728,12 +826,13 @@ def _strip_version(endpoint):
     """Strip a version from the last component of an endpoint if present"""
 
     # Get rid of trailing '/' if present
-    if endpoint.endswith('/'):
+    if endpoint.endswith("/"):
         endpoint = endpoint[:-1]
-    url_bits = endpoint.split('/')
+    url_bits = endpoint.split("/")
     # regex to match 'v1' or 'v2.0' etc
-    if re.match(r'v\d+\.?\d*', url_bits[-1]):
-        endpoint = '/'.join(url_bits[:-1])
+    if re.match(r"v\d+\.?\d*", url_bits[-1]):
+        endpoint = "/".join(url_bits[:-1])
     return endpoint
 
+
 # vi: ts=4 expandtab syntax=python
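_strip_version is small enough to restate verbatim and exercise; the endpoint URLs below are invented:

    import re

    def _strip_version(endpoint):
        # Same logic as the function in the hunk above.
        if endpoint.endswith("/"):
            endpoint = endpoint[:-1]
        url_bits = endpoint.split("/")
        if re.match(r"v\d+\.?\d*", url_bits[-1]):
            endpoint = "/".join(url_bits[:-1])
        return endpoint

    assert _strip_version("http://glance.example:9292/v2.0/") == "http://glance.example:9292"
    assert _strip_version("http://glance.example:9292/v1") == "http://glance.example:9292"
    assert _strip_version("http://glance.example:9292") == "http://glance.example:9292"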
diff --git a/simplestreams/objectstores/__init__.py b/simplestreams/objectstores/__init__.py
index f118a92..b9a6bbe 100644
--- a/simplestreams/objectstores/__init__.py
+++ b/simplestreams/objectstores/__init__.py
@@ -35,9 +35,13 @@ class ObjectStore(object):
 
     def insert_content(self, path, content, checksums=None, mutable=True):
         if not isinstance(content, bytes):
-            content = content.encode('utf-8')
-        self.insert(path=path, reader=cs.MemoryContentSource(content=content),
-                    checksums=checksums, mutable=mutable)
+            content = content.encode("utf-8")
+        self.insert(
+            path=path,
+            reader=cs.MemoryContentSource(content=content),
+            checksums=checksums,
+            mutable=mutable,
+        )
 
     def remove(self, path):
         # remove path from store
@@ -48,9 +52,12 @@ class ObjectStore(object):
         raise NotImplementedError()
 
     def exists_with_checksum(self, path, checksums=None):
-        return has_valid_checksum(path=path, reader=self.source,
-                                  checksums=checksums,
-                                  read_size=self.read_size)
+        return has_valid_checksum(
+            path=path,
+            reader=self.source,
+            checksums=checksums,
+            read_size=self.read_size,
+        )
 
 
 class MemoryObjectStore(ObjectStore):
@@ -73,35 +80,43 @@ class MemoryObjectStore(ObjectStore):
             url = "%s://%s" % (self.__class__, path)
             return cs.MemoryContentSource(content=self.data[path], url=url)
         except KeyError:
-            raise IOError(errno.ENOENT, '%s not found' % path)
+            raise IOError(errno.ENOENT, "%s not found" % path)
 
 
 class FileStore(ObjectStore):
-
     def __init__(self, prefix, complete_callback=None):
-        """ complete_callback is called periodically to notify users when a
+        """complete_callback is called periodically to notify users when a
         file is being inserted. It takes three arguments: the path that is
         inserted, the number of bytes downloaded, and the number of total
-        bytes. """
+        bytes."""
        self.prefix = prefix
         self.complete_callback = complete_callback
 
-    def insert(self, path, reader, checksums=None, mutable=True, size=None,
-               sparse=False):
-
+    def insert(
+        self,
+        path,
+        reader,
+        checksums=None,
+        mutable=True,
+        size=None,
+        sparse=False,
+    ):
         wpath = self._fullpath(path)
         if os.path.isfile(wpath):
             if not mutable:
                 # if the file exists, and not mutable, return
                 return
-            if has_valid_checksum(path=path, reader=self.source,
-                                  checksums=checksums,
-                                  read_size=self.read_size):
+            if has_valid_checksum(
+                path=path,
+                reader=self.source,
+                checksums=checksums,
+                read_size=self.read_size,
+            ):
                 return
 
         zeros = None
         if sparse is True:
-            zeros = '\0' * self.read_size
+            zeros = "\0" * self.read_size
 
         cksum = checksum_util.checksummer(checksums)
         out_d = os.path.dirname(wpath)
@@ -110,8 +125,9 @@ class FileStore(ObjectStore):
             util.mkdir_p(out_d)
         orig_part_size = 0
         reader_does_checksum = (
-            isinstance(reader, cs.ChecksummingContentSource) and
-            cksum.algorithm == reader.algorithm)
+            isinstance(reader, cs.ChecksummingContentSource)
+            and cksum.algorithm == reader.algorithm
+        )
 
         if os.path.exists(partfile):
             try:
@@ -121,8 +137,12 @@ class FileStore(ObjectStore):
             else:
                 reader.set_start_pos(orig_part_size)
 
-            LOG.debug("resuming partial (%s) download of '%s' from '%s'",
-                      orig_part_size, path, partfile)
+            LOG.debug(
+                "resuming partial (%s) download of '%s' from '%s'",
+                orig_part_size,
+                path,
+                partfile,
+            )
             with open(partfile, "rb") as fp:
                 while True:
                     buf = fp.read(self.read_size)
@@ -136,15 +156,17 @@ class FileStore(ObjectStore):
                 os.unlink(partfile)
 
         with open(partfile, "ab") as wfp:
-
             while True:
                 try:
                     buf = reader.read(self.read_size)
                 except checksum_util.InvalidChecksum:
                     break
                 buflen = len(buf)
-                if (buflen != self.read_size and zeros is not None and
-                        zeros[0:buflen] == buf):
+                if (
+                    buflen != self.read_size
+                    and zeros is not None
+                    and zeros[0:buflen] == buf
+                ):
                     wfp.seek(wfp.tell() + buflen)
                 elif buf == zeros:
                     wfp.seek(wfp.tell() + buflen)
@@ -209,8 +231,9 @@ class FileStore(ObjectStore):
         return os.path.join(self.prefix, path)
 
 
-def has_valid_checksum(path, reader, checksums=None,
-                       read_size=READ_BUFFER_SIZE):
+def has_valid_checksum(
+    path, reader, checksums=None, read_size=READ_BUFFER_SIZE
+):
     if checksums is None:
         return False
     try:
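The multi-line condition introduced in FileStore.insert() guards the sparse-write path: buffers that are entirely NULs are skipped over with seek() rather than written. A self-contained sketch of that trick, using bytes for the demo (the module itself keeps a str of NULs, and does more bookkeeping than shown here):

    import io

    def copy_sparse(reader, wfp, read_size=4):
        zeros = b"\0" * read_size
        while True:
            buf = reader.read(read_size)
            if not buf:
                break
            if zeros[: len(buf)] == buf:
                wfp.seek(wfp.tell() + len(buf))  # leave a hole
            else:
                wfp.write(buf)

    src = io.BytesIO(b"abcd" + b"\0" * 4 + b"efgh")
    dst = io.BytesIO()
    copy_sparse(src, dst)
    assert dst.getvalue() == b"abcd" + b"\0" * 4 + b"efgh"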
diff --git a/simplestreams/objectstores/s3.py b/simplestreams/objectstores/s3.py
index f1e9602..b07507f 100644
--- a/simplestreams/objectstores/s3.py
+++ b/simplestreams/objectstores/s3.py
@@ -15,19 +15,19 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.
 
+import errno
+import tempfile
+from contextlib import closing
+
 import boto.exception
 import boto.s3
 import boto.s3.connection
-from contextlib import closing
-import errno
-import tempfile
 
-import simplestreams.objectstores as objectstores
 import simplestreams.contentsource as cs
+import simplestreams.objectstores as objectstores
 
 
 class S3ObjectStore(objectstores.ObjectStore):
-
     _bucket = None
     _connection = None
 
@@ -92,8 +92,8 @@ class S3ObjectStore(objectstores.ObjectStore):
         if key is None:
             return False
 
-        if 'md5' in checksums:
-            return checksums['md5'] == key.etag.replace('"', "")
+        if "md5" in checksums:
+            return checksums["md5"] == key.etag.replace('"', "")
 
         return False
 
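Beyond import ordering, only the quoting changes here; the checksum test compares the stored md5 against the S3 ETag with its surrounding quotes removed (for simple, non-multipart uploads the ETag is the object's MD5). Illustrative values:

    etag = '"d41d8cd98f00b204e9800998ecf8427e"'  # as boto returns it, quoted
    checksums = {"md5": "d41d8cd98f00b204e9800998ecf8427e"}
    assert checksums["md5"] == etag.replace('"', "")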
diff --git a/simplestreams/objectstores/swift.py b/simplestreams/objectstores/swift.py
index f2c0d5b..d33fa3b 100644
--- a/simplestreams/objectstores/swift.py
+++ b/simplestreams/objectstores/swift.py
@@ -15,30 +15,30 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.
 
-import simplestreams.objectstores as objectstores
-import simplestreams.contentsource as cs
-import simplestreams.openstack as openstack
-
 import errno
 import hashlib
-from swiftclient import Connection, ClientException
+
+from swiftclient import ClientException, Connection
+
+import simplestreams.contentsource as cs
+import simplestreams.objectstores as objectstores
+import simplestreams.openstack as openstack
 
 
 def get_swiftclient(**kwargs):
     # nmap has entries that need name changes from a 'get_service_conn_info'
     # to a swift Connection name.
     # pt has names that pass straight through
-    nmap = {'endpoint': 'preauthurl', 'token': 'preauthtoken'}
-    pt = ('insecure', 'cacert')
+    nmap = {"endpoint": "preauthurl", "token": "preauthtoken"}
+    pt = ("insecure", "cacert")
 
     connargs = {v: kwargs.get(k) for k, v in nmap.items() if k in kwargs}
     connargs.update({k: kwargs.get(k) for k in pt if k in kwargs})
-    if kwargs.get('session'):
-        sess = kwargs.get('session')
+    if kwargs.get("session"):
+        sess = kwargs.get("session")
         try:
             # If session is available try it
-            return Connection(session=sess,
-                              cacert=kwargs.get('cacert'))
+            return Connection(session=sess, cacert=kwargs.get("cacert"))
         except TypeError:
             # The edge case where session is availble but swiftclient is
             # < 3.3.0. Use the old style method for Connection.
@@ -52,7 +52,6 @@ class SwiftContentSource(cs.IteratorContentSource):
 
 
 class SwiftObjectStore(objectstores.ObjectStore):
-
     def __init__(self, prefix, region=None):
         # expect 'swift://bucket/path_prefix'
         self.prefix = prefix
@@ -67,35 +66,41 @@ class SwiftObjectStore(objectstores.ObjectStore):
 
         self.keystone_creds = openstack.load_keystone_creds()
         if region is not None:
-            self.keystone_creds['region_name'] = region
+            self.keystone_creds["region_name"] = region
 
-        conn_info = openstack.get_service_conn_info('object-store',
-                                                    **self.keystone_creds)
+        conn_info = openstack.get_service_conn_info(
+            "object-store", **self.keystone_creds
+        )
         self.swiftclient = get_swiftclient(**conn_info)
 
         # http://docs.openstack.org/developer/swift/misc.html#acls
-        self.swiftclient.put_container(self.container,
-                                       headers={'X-Container-Read':
-                                                '.r:*,.rlistings'})
+        self.swiftclient.put_container(
+            self.container, headers={"X-Container-Read": ".r:*,.rlistings"}
+        )
 
     def insert(self, path, reader, checksums=None, mutable=True, size=None):
         # store content from reader.read() into path, expecting result checksum
-        self._insert(path=path, contents=reader, checksums=checksums,
-                     mutable=mutable)
+        self._insert(
+            path=path, contents=reader, checksums=checksums, mutable=mutable
+        )
 
     def insert_content(self, path, content, checksums=None, mutable=True):
-        self._insert(path=path, contents=content, checksums=checksums,
-                     mutable=mutable)
+        self._insert(
+            path=path, contents=content, checksums=checksums, mutable=mutable
+        )
 
     def remove(self, path):
-        self.swiftclient.delete_object(container=self.container,
-                                       obj=self.path_prefix + path)
+        self.swiftclient.delete_object(
+            container=self.container, obj=self.path_prefix + path
+        )
 
     def source(self, path):
         def itgen():
             (_headers, iterator) = self.swiftclient.get_object(
-                container=self.container, obj=self.path_prefix + path,
-                resp_chunk_size=self.read_size)
+                container=self.container,
+                obj=self.path_prefix + path,
+                resp_chunk_size=self.read_size,
+            )
             return iterator
 
         return SwiftContentSource(itgen=itgen, url=self.prefix + path)
@@ -105,8 +110,9 @@ class SwiftObjectStore(objectstores.ObjectStore):
 
     def _head_path(self, path):
         try:
-            headers = self.swiftclient.head_object(container=self.container,
-                                                   obj=self.path_prefix + path)
+            headers = self.swiftclient.head_object(
+                container=self.container, obj=self.path_prefix + path
+            )
         except Exception as exc:
             if is_enoent(exc):
                 return {}
@@ -122,19 +128,22 @@ class SwiftObjectStore(objectstores.ObjectStore):
         if headers_match_checksums(headers, checksums):
             return
 
-        insargs = {'container': self.container, 'obj': self.path_prefix + path,
-                   'contents': contents}
+        insargs = {
+            "container": self.container,
+            "obj": self.path_prefix + path,
+            "contents": contents,
+        }
 
         if size is not None and isinstance(contents, str):
             size = len(contents)
 
         if size is not None:
-            insargs['content_length'] = size
+            insargs["content_length"] = size
 
-        if checksums and checksums.get('md5'):
-            insargs['etag'] = checksums.get('md5')
+        if checksums and checksums.get("md5"):
+            insargs["etag"] = checksums.get("md5")
         elif isinstance(contents, str):
-            insargs['etag'] = hashlib.md5(contents).hexdigest()
+            insargs["etag"] = hashlib.md5(contents).hexdigest()
 
         self.swiftclient.put_object(**insargs)
 
@@ -142,13 +151,15 @@
 def headers_match_checksums(headers, checksums):
     if not (headers and checksums):
         return False
-    if ('md5' in checksums and headers.get('etag') == checksums.get('md5')):
+    if "md5" in checksums and headers.get("etag") == checksums.get("md5"):
         return True
     return False
 
 
 def is_enoent(exc):
-    return ((isinstance(exc, IOError) and exc.errno == errno.ENOENT) or
-            (isinstance(exc, ClientException) and exc.http_status == 404))
+    return (isinstance(exc, IOError) and exc.errno == errno.ENOENT) or (
+        isinstance(exc, ClientException) and exc.http_status == 404
+    )
+
 
 # vi: ts=4 expandtab
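is_enoent() folds two different "not found" signals into one test; a sketch using a stand-in class for swiftclient's ClientException:

    import errno

    class FakeClientException(Exception):
        def __init__(self, http_status):
            self.http_status = http_status

    def is_enoent(exc):
        return (isinstance(exc, IOError) and exc.errno == errno.ENOENT) or (
            isinstance(exc, FakeClientException) and exc.http_status == 404
        )

    assert is_enoent(IOError(errno.ENOENT, "missing"))
    assert is_enoent(FakeClientException(404))
    assert not is_enoent(FakeClientException(500))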
diff --git a/simplestreams/openstack.py b/simplestreams/openstack.py
index ebf63c4..48101e7 100644
--- a/simplestreams/openstack.py
+++ b/simplestreams/openstack.py
@@ -20,9 +20,11 @@ import os
 
 from keystoneclient.v2_0 import client as ksclient_v2
 from keystoneclient.v3 import client as ksclient_v3
+
 try:
     from keystoneauth1 import session
-    from keystoneauth1.identity import (v2, v3)
+    from keystoneauth1.identity import v2, v3
+
     _LEGACY_CLIENTS = False
 except ImportError:
     # 14.04 level packages do not have this.
@@ -31,43 +33,88 @@ except ImportError:
 
 
 OS_ENV_VARS = (
-    'OS_AUTH_TOKEN', 'OS_AUTH_URL', 'OS_CACERT', 'OS_IMAGE_API_VERSION',
-    'OS_IMAGE_URL', 'OS_PASSWORD', 'OS_REGION_NAME', 'OS_STORAGE_URL',
-    'OS_TENANT_ID', 'OS_TENANT_NAME', 'OS_USERNAME', 'OS_INSECURE',
-    'OS_USER_DOMAIN_NAME', 'OS_PROJECT_DOMAIN_NAME',
-    'OS_USER_DOMAIN_ID', 'OS_PROJECT_DOMAIN_ID', 'OS_PROJECT_NAME',
-    'OS_PROJECT_ID'
+    "OS_AUTH_TOKEN",
+    "OS_AUTH_URL",
+    "OS_CACERT",
+    "OS_IMAGE_API_VERSION",
+    "OS_IMAGE_URL",
+    "OS_PASSWORD",
+    "OS_REGION_NAME",
+    "OS_STORAGE_URL",
+    "OS_TENANT_ID",
+    "OS_TENANT_NAME",
+    "OS_USERNAME",
+    "OS_INSECURE",
+    "OS_USER_DOMAIN_NAME",
+    "OS_PROJECT_DOMAIN_NAME",
+    "OS_USER_DOMAIN_ID",
+    "OS_PROJECT_DOMAIN_ID",
+    "OS_PROJECT_NAME",
+    "OS_PROJECT_ID",
 )
 
 
 # only used for legacy client connection
-PT_V2 = ('username', 'password', 'tenant_id', 'tenant_name', 'auth_url',
-         'cacert', 'insecure', )
+PT_V2 = (
+    "username",
+    "password",
+    "tenant_id",
+    "tenant_name",
+    "auth_url",
+    "cacert",
+    "insecure",
+)
 
 # annoyingly the 'insecure' option in the old client constructor is now called
 # the 'verify' option in the session.Session() constructor
-PASSWORD_V2 = ('auth_url', 'username', 'password', 'user_id', 'trust_id',
-               'tenant_id', 'tenant_name', 'reauthenticate')
-PASSWORD_V3 = ('auth_url', 'password', 'username',
-               'user_id', 'user_domain_id', 'user_domain_name',
-               'trust_id', 'system_scope',
-               'domain_id', 'domain_name',
-               'project_id', 'project_name',
-               'project_domain_id', 'project_domain_name',
-               'reauthenticate')
-SESSION_ARGS = ('cert', 'timeout', 'verify', 'original_ip', 'redirect',
-                'addition_headers', 'app_name', 'app_version',
-                'additional_user_agent',
-                'discovery_cache', 'split_loggers', 'collect_timing')
-
-
-Settings = collections.namedtuple('Settings', 'mod ident arg_set')
-KS_VERSION_RESOLVER = {2: Settings(mod=ksclient_v2,
-                                   ident=v2,
-                                   arg_set=PASSWORD_V2),
-                       3: Settings(mod=ksclient_v3,
-                                   ident=v3,
-                                   arg_set=PASSWORD_V3)}
+PASSWORD_V2 = (
+    "auth_url",
+    "username",
+    "password",
+    "user_id",
+    "trust_id",
+    "tenant_id",
+    "tenant_name",
+    "reauthenticate",
+)
+PASSWORD_V3 = (
+    "auth_url",
+    "password",
+    "username",
+    "user_id",
+    "user_domain_id",
+    "user_domain_name",
+    "trust_id",
+    "system_scope",
+    "domain_id",
+    "domain_name",
+    "project_id",
+    "project_name",
+    "project_domain_id",
+    "project_domain_name",
+    "reauthenticate",
+)
+SESSION_ARGS = (
+    "cert",
+    "timeout",
+    "verify",
+    "original_ip",
+    "redirect",
+    "addition_headers",
+    "app_name",
+    "app_version",
+    "additional_user_agent",
+    "discovery_cache",
+    "split_loggers",
+    "collect_timing",
+)
 
 
+Settings = collections.namedtuple("Settings", "mod ident arg_set")
+KS_VERSION_RESOLVER = {
+    2: Settings(mod=ksclient_v2, ident=v2, arg_set=PASSWORD_V2),
+    3: Settings(mod=ksclient_v3, ident=v3, arg_set=PASSWORD_V3),
+}
 
 
 def load_keystone_creds(**kwargs):
@@ -87,32 +134,38 @@ def load_keystone_creds(**kwargs):
             # take off 'os_'
             ret[short] = os.environ[name]
 
-    if 'insecure' in ret:
-        if isinstance(ret['insecure'], str):
-            ret['insecure'] = (ret['insecure'].lower() not in
-                               ("", "0", "no", "off", 'false'))
+    if "insecure" in ret:
+        if isinstance(ret["insecure"], str):
+            ret["insecure"] = ret["insecure"].lower() not in (
+                "",
+                "0",
+                "no",
+                "off",
+                "false",
+            )
         else:
-            ret['insecure'] = bool(ret['insecure'])
+            ret["insecure"] = bool(ret["insecure"])
 
     # verify is the key that is used by requests, and thus the Session object.
     # i.e. verify is either False or a certificate path or file.
-    if not ret.get('insecure', False) and 'cacert' in ret:
-        ret['verify'] = ret['cacert']
+    if not ret.get("insecure", False) and "cacert" in ret:
+        ret["verify"] = ret["cacert"]
 
     missing = []
-    for req in ('username', 'auth_url'):
+    for req in ("username", "auth_url"):
         if not ret.get(req, None):
             missing.append(req)
 
-    if not (ret.get('auth_token') or ret.get('password')):
+    if not (ret.get("auth_token") or ret.get("password")):
         missing.append("(auth_token or password)")
 
-    api_version = get_ks_api_version(ret.get('auth_url', '')) or 2
-    if (api_version == 2 and
-            not (ret.get('tenant_id') or ret.get('tenant_name'))):
+    api_version = get_ks_api_version(ret.get("auth_url", "")) or 2
+    if api_version == 2 and not (
+        ret.get("tenant_id") or ret.get("tenant_name")
+    ):
         missing.append("(tenant_id or tenant_name)")
     if api_version == 3:
-        for k in ('user_domain_name', 'project_domain_name', 'project_name'):
+        for k in ("user_domain_name", "project_domain_name", "project_name"):
             if not ret.get(k, None):
                 missing.append(k)
 
@@ -124,8 +177,8 @@ def load_keystone_creds(**kwargs):
 
 def get_regions(client=None, services=None, kscreds=None):
     # if kscreds had 'region_name', then return that
-    if kscreds and kscreds.get('region_name'):
-        return [kscreds.get('region_name')]
+    if kscreds and kscreds.get("region_name"):
+        return [kscreds.get("region_name")]
 
     if client is None:
         creds = kscreds
@@ -139,7 +192,7 @@ def get_regions(client=None, services=None, kscreds=None):
     regions = set()
     for service in services:
         for r in endpoints.get(service, {}):
-            regions.add(r['region'])
+            regions.add(r["region"])
 
     return list(regions)
 
@@ -153,15 +206,15 @@ def get_ks_api_version(auth_url=None, env=None):
     if env is None:
         env = os.environ
 
-    if env.get('OS_IDENTITY_API_VERSION'):
-        return int(env['OS_IDENTITY_API_VERSION'])
+    if env.get("OS_IDENTITY_API_VERSION"):
+        return int(env["OS_IDENTITY_API_VERSION"])
 
     if auth_url is None:
         auth_url = ""
 
-    if auth_url.endswith('/v3'):
+    if auth_url.endswith("/v3"):
         return 3
-    elif auth_url.endswith('/v2.0'):
+    elif auth_url.endswith("/v2.0"):
         return 2
     # Return None if we can't determine the keystone version
     return None
@@ -178,20 +231,20 @@ def get_ksclient(**kwargs):
     if _LEGACY_CLIENTS:
         return _legacy_ksclient(**kwargs)
 
-    api_version = get_ks_api_version(kwargs.get('auth_url', '')) or 2
+    api_version = get_ks_api_version(kwargs.get("auth_url", "")) or 2
     arg_set = KS_VERSION_RESOLVER[api_version].arg_set
     # Filter/select the args for the api version from the kwargs dictionary
     kskw = {k: v for k, v in kwargs.items() if k in arg_set}
     auth = KS_VERSION_RESOLVER[api_version].ident.Password(**kskw)
     authkw = {k: v for k, v in kwargs.items() if k in SESSION_ARGS}
-    authkw['auth'] = auth
+    authkw["auth"] = auth
     sess = session.Session(**authkw)
     client = KS_VERSION_RESOLVER[api_version].mod.Client(session=sess)
     client.auth_ref = auth.get_access(sess)
     return client
 
 
-def get_service_conn_info(service='image', client=None, **kwargs):
+def get_service_conn_info(service="image", client=None, **kwargs):
     # return a dict with token, insecure, cacert, endpoint
     if not client:
         client = get_ksclient(**kwargs)
@@ -199,16 +252,23 @@ def get_service_conn_info(service='image', client=None, **kwargs):
     endpoint = _get_endpoint(client, service, **kwargs)
     # Session client does not have tenant_id set at client.tenant_id
     # If client.tenant_id not set use method to get it
-    tenant_id = (client.tenant_id or client.get_project_id(client.session) or
-                 client.auth.client.get_project_id())
-    info = {'token': client.auth_token, 'insecure': kwargs.get('insecure'),
-            'cacert': kwargs.get('cacert'), 'endpoint': endpoint,
-            'tenant_id': tenant_id}
+    tenant_id = (
+        client.tenant_id
+        or client.get_project_id(client.session)
+        or client.auth.client.get_project_id()
+    )
+    info = {
+        "token": client.auth_token,
+        "insecure": kwargs.get("insecure"),
+        "cacert": kwargs.get("cacert"),
+        "endpoint": endpoint,
+        "tenant_id": tenant_id,
+    }
     if not _LEGACY_CLIENTS:
-        info['session'] = client.session
-        info['glance_version'] = '2'
+        info["session"] = client.session
+        info["glance_version"] = "2"
     else:
-        info['glance_version'] = '1'
+        info["glance_version"] = "1"
 
     return info
 
@@ -216,12 +276,12 @@ def get_service_conn_info(service='image', client=None, **kwargs):
 def _get_endpoint(client, service, **kwargs):
     """Get an endpoint using the provided keystone client."""
     endpoint_kwargs = {
-        'service_type': service,
-        'interface': kwargs.get('endpoint_type') or 'publicURL',
-        'region_name': kwargs.get('region_name'),
+        "service_type": service,
+        "interface": kwargs.get("endpoint_type") or "publicURL",
+        "region_name": kwargs.get("region_name"),
     }
     if _LEGACY_CLIENTS:
-        del endpoint_kwargs['interface']
+        del endpoint_kwargs["interface"]
 
     endpoint = client.service_catalog.url_for(**endpoint_kwargs)
     return endpoint
diff --git a/simplestreams/util.py b/simplestreams/util.py
index 9866893..ebfe741 100644
--- a/simplestreams/util.py
+++ b/simplestreams/util.py
@@ -16,15 +16,15 @@
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.
 
 import errno
+import json
 import os
 import re
 import subprocess
 import tempfile
 import time
-import json
 
-import simplestreams.contentsource as cs
 import simplestreams.checksum_util as checksum_util
+import simplestreams.contentsource as cs
 from simplestreams.log import LOG
 
 ALIASNAME = "_aliases"
@@ -35,7 +35,7 @@ PGP_SIGNATURE_FOOTER = "-----END PGP SIGNATURE-----"
 
 _UNSET = object()
 
-READ_SIZE = (1024 * 10)
+READ_SIZE = 1024 * 10
 
 PRODUCTS_TREE_DATA = (
     ("products", "product_name"),
@@ -84,7 +84,7 @@ def products_exdata(tree, pedigree, include_top=True, insert_fieldnames=True):
     if include_top and tree:
         exdata.update(stringitems(tree))
     clevel = tree
-    for (n, key) in enumerate(pedigree):
+    for n, key in enumerate(pedigree):
         dictname, fieldname = harchy[n]
         clevel = clevel.get(dictname, {}).get(key, {})
         exdata.update(stringitems(clevel))
@@ -131,48 +131,54 @@ def products_del(tree, pedigree):
 
 
 def products_prune(tree, preserve_empty_products=False):
-    for prodname in list(tree.get('products', {}).keys()):
-        keys = list(tree['products'][prodname].get('versions', {}).keys())
+    for prodname in list(tree.get("products", {}).keys()):
+        keys = list(tree["products"][prodname].get("versions", {}).keys())
         for vername in keys:
-            vtree = tree['products'][prodname]['versions'][vername]
-            for itemname in list(vtree.get('items', {}).keys()):
-                if not vtree['items'][itemname]:
-                    del vtree['items'][itemname]
+            vtree = tree["products"][prodname]["versions"][vername]
+            for itemname in list(vtree.get("items", {}).keys()):
+                if not vtree["items"][itemname]:
+                    del vtree["items"][itemname]
 
-            if 'items' not in vtree or not vtree['items']:
-                del tree['products'][prodname]['versions'][vername]
+            if "items" not in vtree or not vtree["items"]:
+                del tree["products"][prodname]["versions"][vername]
 
-        if ('versions' not in tree['products'][prodname] or
-                not tree['products'][prodname]['versions']):
-            del tree['products'][prodname]
-
-    if (not preserve_empty_products and 'products' in tree and
-            not tree['products']):
-        del tree['products']
-
-
-def walk_products(tree, cb_product=None, cb_version=None, cb_item=None,
-                  ret_finished=_UNSET):
+        if (
+            "versions" not in tree["products"][prodname]
+            or not tree["products"][prodname]["versions"]
+        ):
+            del tree["products"][prodname]
+
+    if (
+        not preserve_empty_products
+        and "products" in tree
+        and not tree["products"]
+    ):
+        del tree["products"]
+
+
+def walk_products(
+    tree, cb_product=None, cb_version=None, cb_item=None, ret_finished=_UNSET
+):
     # walk a product tree. callbacks are called with (item, tree, (pedigree))
-    for prodname, proddata in tree['products'].items():
+    for prodname, proddata in tree["products"].items():
         if cb_product:
             ret = cb_product(proddata, tree, (prodname,))
             if ret_finished != _UNSET and ret == ret_finished:
                 return
 
-        if (not cb_version and not cb_item) or 'versions' not in proddata:
+        if (not cb_version and not cb_item) or "versions" not in proddata:
             continue
 
-        for vername, verdata in proddata['versions'].items():
+        for vername, verdata in proddata["versions"].items():
             if cb_version:
                 ret = cb_version(verdata, tree, (prodname, vername))
                 if ret_finished != _UNSET and ret == ret_finished:
                     return
 
-            if not cb_item or 'items' not in verdata:
+            if not cb_item or "items" not in verdata:
                 continue
 
-            for itemname, itemdata in verdata['items'].items():
+            for itemname, itemdata in verdata["items"].items():
                 ret = cb_item(itemdata, tree, (prodname, vername, itemname))
                 if ret_finished != _UNSET and ret == ret_finished:
                     return
@@ -205,8 +211,9 @@ def expand_data(data, refs=None, delete=False):
             expand_data(item, refs)
 
 
-def resolve_work(src, target, maxnum=None, keep=False, itemfilter=None,
-                 sort_reverse=True):
+def resolve_work(
+    src, target, maxnum=None, keep=False, itemfilter=None, sort_reverse=True
+):
     # if more than maxnum items are in src, only the most recent maxnum will be
     # stored in target. If keep is true, then the most recent maxnum items
     # will be kept in target even if they are no longer in src.
@@ -241,8 +248,9 @@ def resolve_work(src, target, maxnum=None, keep=False, itemfilter=None,
     while len(remove) and (maxnum > (after_add - len(remove))):
         remove.pop(0)
 
-    mtarget = sorted([f for f in target + add if f not in remove],
-                     reverse=reverse)
+    mtarget = sorted(
+        [f for f in target + add if f not in remove], reverse=reverse
+    )
     if maxnum is not None and len(mtarget) > maxnum:
         for item in mtarget[maxnum:]:
             if item in target:
@@ -263,12 +271,12 @@ def has_gpgv():
     if _HAS_GPGV is not None:
         return _HAS_GPGV
 
-    if which('gpgv'):
+    if which("gpgv"):
         try:
             env = os.environ.copy()
-            env['LANG'] = 'C'
+            env["LANG"] = "C"
             out, err = subp(["gpgv", "--help"], capture=True, env=env)
-            _HAS_GPGV = 'gnupg' in out.lower() or 'gnupg' in err.lower()
+            _HAS_GPGV = "gnupg" in out.lower() or "gnupg" in err.lower()
         except subprocess.CalledProcessError:
             _HAS_GPGV = False
     else:
@@ -292,11 +300,13 @@ def read_signed(content, keyring=None, checked=True):
         try:
             subp(cmd, data=content)
         except subprocess.CalledProcessError as e:
-            LOG.debug("failed: %s\n out=%s\n err=%s" %
-                      (' '.join(cmd), e.output[0], e.output[1]))
+            LOG.debug(
+                "failed: %s\n out=%s\n err=%s"
+                % (" ".join(cmd), e.output[0], e.output[1])
+            )
             raise e
 
-        ret = {'body': [], 'signature': [], 'garbage': []}
+        ret = {"body": [], "signature": [], "garbage": []}
         lines = content.splitlines()
         i = 0
         for i in range(0, len(lines)):
@@ -320,21 +330,22 @@ def read_signed(content, keyring=None, checked=True):
             else:
                 ret[mode].append(lines[i])
 
-        ret['body'].append('')  # need empty line at end
-        return "\n".join(ret['body'])
+        ret["body"].append("")  # need empty line at end
+        return "\n".join(ret["body"])
     else:
         raise SignatureMissingException("No signature found!")
 
 
 def load_content(content):
     if isinstance(content, bytes):
-        content = content.decode('utf-8')
+        content = content.decode("utf-8")
     return json.loads(content)
 
 
 def dump_data(data):
-    return json.dumps(data, indent=1, sort_keys=True,
-                      separators=(',', ': ')).encode('utf-8')
+    return json.dumps(
+        data, indent=1, sort_keys=True, separators=(",", ": ")
+    ).encode("utf-8")
 
 
 def timestamp(ts=None):
@@ -375,24 +386,26 @@ def move_dups(src, target, sticky=None):
     target.update(updates)
 
 
-def products_condense(ptree, sticky=None, top='versions'):
+def products_condense(ptree, sticky=None, top="versions"):
     # walk a products tree, copying up item keys as far as they'll go
     # only move items to a sibling of the 'top'.
 
-    if top not in ('versions', 'products'):
-        raise ValueError("'top' must be one of: %s" %
-                         ','.join(PRODUCTS_TREE_HIERARCHY))
+    if top not in ("versions", "products"):
+        raise ValueError(
+            "'top' must be one of: %s" % ",".join(PRODUCTS_TREE_HIERARCHY)
+        )
 
     def call_move_dups(cur, _tree, pedigree):
-        (_mtype, stname) = (("product", "versions"),
-                            ("version", "items"))[len(pedigree) - 1]
+        (_mtype, stname) = (("product", "versions"), ("version", "items"))[
+            len(pedigree) - 1
+        ]
         move_dups(cur.get(stname, {}), cur, sticky=sticky)
 
     walk_products(ptree, cb_version=call_move_dups)
     walk_products(ptree, cb_product=call_move_dups)
-    if top == 'versions':
+    if top == "versions":
         return
-    move_dups(ptree['products'], ptree)
+    move_dups(ptree["products"], ptree)
 
 
 def assert_safe_path(path):
@@ -449,16 +462,22 @@ def subp(args, data=None, capture=True, shell=False, env=None):
     else:
         stdout, stderr = (subprocess.PIPE, subprocess.PIPE)
 
-    sp = subprocess.Popen(args, stdout=stdout, stderr=stderr,
-                          stdin=subprocess.PIPE, shell=shell, env=env)
+    sp = subprocess.Popen(
+        args,
+        stdout=stdout,
+        stderr=stderr,
+        stdin=subprocess.PIPE,
+        shell=shell,
+        env=env,
+    )
     if isinstance(data, str):
-        data = data.encode('utf-8')
+        data = data.encode("utf-8")
 
     (out, err) = sp.communicate(data)
     if isinstance(out, bytes):
-        out = out.decode('utf-8')
+        out = out.decode("utf-8")
     if isinstance(err, bytes):
-        err = err.decode('utf-8')
+        err = err.decode("utf-8")
 
     rc = sp.returncode
     if rc != 0:
@@ -468,22 +487,22 @@ def subp(args, data=None, capture=True, shell=False, env=None):
 
 
 def get_sign_cmd(path, output=None, inline=False):
-    cmd = ['gpg']
-    defkey = os.environ.get('SS_GPG_DEFAULT_KEY')
+    cmd = ["gpg"]
+    defkey = os.environ.get("SS_GPG_DEFAULT_KEY")
     if defkey:
-        cmd.extend(['--default-key', defkey])
+        cmd.extend(["--default-key", defkey])
 
-    batch = os.environ.get('SS_GPG_BATCH', "1").lower()
+    batch = os.environ.get("SS_GPG_BATCH", "1").lower()
     if batch not in ("0", "false"):
-        cmd.append('--batch')
+        cmd.append("--batch")
 
     if output:
-        cmd.extend(['--output', output])
+        cmd.extend(["--output", output])
 
     if inline:
-        cmd.append('--clearsign')
+        cmd.append("--clearsign")
     else:
-        cmd.extend(['--armor', '--detach-sign'])
+        cmd.extend(["--armor", "--detach-sign"])
 
     cmd.extend([path])
     return cmd
@@ -498,17 +517,17 @@ def make_signed_content_paths(content):
     if data.get("format") != "index:1.0":
         return (False, None)
 
-    for content_ent in list(data.get('index', {}).values()):
-        path = content_ent.get('path')
+    for content_ent in list(data.get("index", {}).values()):
+        path = content_ent.get("path")
         if path.endswith(".json"):
-            content_ent['path'] = signed_fname(path, inline=True)
+            content_ent["path"] = signed_fname(path, inline=True)
 
     return (True, json.dumps(data, indent=1))
 
 
 def signed_fname(fname, inline=True):
     if inline:
-        sfname = fname[0:-len(".json")] + ".sjson"
+        sfname = fname[0 : -len(".json")] + ".sjson"
     else:
         sfname = fname + ".gpg"
 
@@ -555,8 +574,10 @@ def sign_file(fname, inline=True, outfile=None):
 
 def sign_content(content, outfile="-", inline=True):
     rm_f_file(outfile, skip=["-"])
-    return subp(args=get_sign_cmd(path="-", output=outfile, inline=inline),
-                data=content)[0]
+    return subp(
+        args=get_sign_cmd(path="-", output=outfile, inline=inline),
+        data=content,
+    )[0]
 
 
 def path_from_mirror_url(mirror, path):
@@ -566,8 +587,8 @@ def path_from_mirror_url(mirror, path):
     path_regex = "streams/v1/.*[.](sjson|json)$"
     result = re.search(path_regex, mirror)
     if result:
-        path = mirror[result.start():]
-        mirror = mirror[:result.start()]
+        path = mirror[result.start() :]
+        mirror = mirror[: result.start()]
     else:
         path = "streams/v1/index.sjson"
 
@@ -589,21 +610,21 @@ class ProgressAggregator(object):
         self.total_written = 0
 
     def progress_callback(self, progress):
-        if self.current_file != progress['name']:
+        if self.current_file != progress["name"]:
             if self.remaining_items and self.current_file is not None:
                 del self.remaining_items[self.current_file]
-            self.current_file = progress['name']
+            self.current_file = progress["name"]
             self.last_emitted = 0
             self.current_written = 0
 
-        size = float(progress['size'])
-        written = float(progress['written'])
+        size = float(progress["size"])
+        written = float(progress["written"])
         self.current_written += written
         self.total_written += written
         interval = self.current_written - self.last_emitted
         if interval > size / 100:
             self.last_emitted = self.current_written
-            progress['written'] = self.current_written
+            progress["written"] = self.current_written
             self.emit(progress)
 
     def emit(self, progress):
diff --git a/tests/httpserver.py b/tests/httpserver.py
index e88eb56..10bde9a 100644
--- a/tests/httpserver.py
+++ b/tests/httpserver.py
@@ -1,50 +1,61 @@
 #!/usr/bin/env python
 import os
 import sys
+
 if sys.version_info.major == 2:
-    from SimpleHTTPServer import SimpleHTTPRequestHandler
     from BaseHTTPServer import HTTPServer
+    from SimpleHTTPServer import SimpleHTTPRequestHandler
 else:
-    from http.server import SimpleHTTPRequestHandler
-    from http.server import HTTPServer
+    from http.server import HTTPServer, SimpleHTTPRequestHandler
 
 
 class LoggingHTTPRequestHandler(SimpleHTTPRequestHandler):
-    def log_request(self, code='-', size='-'):
+    def log_request(self, code="-", size="-"):
         """
         Log an accepted request along with user-agent string.
         """
 
         user_agent = self.headers.get("user-agent")
-        self.log_message('"%s" %s %s (%s)',
-                         self.requestline, str(code), str(size), user_agent)
+        self.log_message(
+            '"%s" %s %s (%s)',
+            self.requestline,
+            str(code),
+            str(size),
+            user_agent,
+        )
 
 
-def run(address, port,
-        HandlerClass=LoggingHTTPRequestHandler, ServerClass=HTTPServer):
+def run(
+    address,
+    port,
+    HandlerClass=LoggingHTTPRequestHandler,
+    ServerClass=HTTPServer,
+):
     try:
         server = ServerClass((address, port), HandlerClass)
         address, port = server.socket.getsockname()
-        sys.stderr.write("Serving HTTP: %s %s %s\n" %
-                         (address, port, os.getcwd()))
+        sys.stderr.write(
+            "Serving HTTP: %s %s %s\n" % (address, port, os.getcwd())
+        )
         server.serve_forever()
     except KeyboardInterrupt:
         server.socket.close()
 
 
-if __name__ == '__main__':
+if __name__ == "__main__":
     import sys
+
     if len(sys.argv) == 3:
         # 2 args: address and port
         address = sys.argv[1]
         port = int(sys.argv[2])
     elif len(sys.argv) == 2:
         # 1 arg: port
-        address = '0.0.0.0'
+        address = "0.0.0.0"
         port = int(sys.argv[1])
     elif len(sys.argv) == 1:
         # no args random port (port=0)
-        address = '0.0.0.0'
+        address = "0.0.0.0"
         port = 0
     else:
         sys.stderr.write("Expect [address] [port]\n")
diff --git a/tests/testutil.py b/tests/testutil.py
index 2c89e3a..2816366 100644
--- a/tests/testutil.py
+++ b/tests/testutil.py
@@ -1,10 +1,10 @@
 import os
-from simplestreams import objectstores
-from simplestreams import mirrors
 
+from simplestreams import mirrors, objectstores
 
-EXAMPLES_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__),
-                                            "..", "examples"))
+EXAMPLES_DIR = os.path.abspath(
+    os.path.join(os.path.dirname(__file__), "..", "examples")
+)
 
 
 def get_mirror_reader(name, docdir=None, signed=False):
@@ -20,4 +20,5 @@ def get_mirror_reader(name, docdir=None, signed=False):
     kwargs = {} if signed else {"policy": policy}
     return mirrors.ObjectStoreMirrorReader(sstore, **kwargs)
 
+
 # vi: ts=4 expandtab syntax=python
diff --git a/tests/unittests/test_badmirrors.py b/tests/unittests/test_badmirrors.py
index 6c5546f..d640e28 100644
--- a/tests/unittests/test_badmirrors.py
+++ b/tests/unittests/test_badmirrors.py
@@ -1,11 +1,12 @@
 from unittest import TestCase
-from tests.testutil import get_mirror_reader
+
+from simplestreams import checksum_util, mirrors, util
 from simplestreams.mirrors import (
-    ObjectStoreMirrorWriter, ObjectStoreMirrorReader)
+    ObjectStoreMirrorReader,
+    ObjectStoreMirrorWriter,
+)
 from simplestreams.objectstores import MemoryObjectStore
-from simplestreams import util
-from simplestreams import checksum_util
-from simplestreams import mirrors
+from tests.testutil import get_mirror_reader
 
 
 class TestBadDataSources(TestCase):
@@ -19,7 +20,8 @@ class TestBadDataSources(TestCase):
     def setUp(self):
         self.src = self.get_clean_src(self.example, path=self.dlpath)
         self.target = ObjectStoreMirrorWriter(
-            config={}, objectstore=MemoryObjectStore())
+            config={}, objectstore=MemoryObjectStore()
+        )
 
     def get_clean_src(self, exname, path):
         good_src = get_mirror_reader(exname)
@@ -34,7 +36,8 @@ class TestBadDataSources(TestCase):
                 del objectstore.data[k]
 
         return ObjectStoreMirrorReader(
-            objectstore=objectstore, policy=lambda content, path: content)
+            objectstore=objectstore, policy=lambda content, path: content
+        )
 
     def test_sanity_valid(self):
         # verify that the tests are fine on expected pass
@@ -43,50 +46,77 @@ class TestBadDataSources(TestCase):
 
     def test_larger_size_causes_bad_checksum(self):
         def size_plus_1(item):
-            item['size'] = int(item['size']) + 1
+            item["size"] = int(item["size"]) + 1
             return item
 
         _moditem(self.src, self.dlpath, self.pedigree, size_plus_1)
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )
 
     def test_smaller_size_causes_bad_checksum(self):
         def size_minus_1(item):
-            item['size'] = int(item['size']) - 1
+            item["size"] = int(item["size"]) - 1
             return item
+
         _moditem(self.src, self.dlpath, self.pedigree, size_minus_1)
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )
 
     def test_too_much_content_causes_bad_checksum(self):
         self.src.objectstore.data[self.item_path] += b"extra"
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )
 
     def test_too_little_content_causes_bad_checksum(self):
         orig = self.src.objectstore.data[self.item_path]
         self.src.objectstore.data[self.item_path] = orig[0:-1]
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )
 
     def test_busted_checksum_causes_bad_checksum(self):
         def break_checksum(item):
             chars = "0123456789abcdef"
-            orig = item['sha256']
-            item['sha256'] = ''.join(
-                [chars[(chars.find(c) + 1) % len(chars)] for c in orig])
+            orig = item["sha256"]
+            item["sha256"] = "".join(
+                [chars[(chars.find(c) + 1) % len(chars)] for c in orig]
+            )
             return item
 
         _moditem(self.src, self.dlpath, self.pedigree, break_checksum)
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )
 
     def test_changed_content_causes_bad_checksum(self):
         # correct size but different content should raise bad checksum
-        self.src.objectstore.data[self.item_path] = ''.join(
-            ["x" for c in self.src.objectstore.data[self.item_path]])
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.src.objectstore.data[self.item_path] = "".join(
+            ["x" for c in self.src.objectstore.data[self.item_path]]
+        )
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )
 
     def test_no_checksums_cause_bad_checksum(self):
         def del_checksums(item):
@@ -96,32 +126,41 @@ class TestBadDataSources(TestCase):
 
         _moditem(self.src, self.dlpath, self.pedigree, del_checksums)
         with _patched_missing_sum("fail"):
-            self.assertRaises(checksum_util.InvalidChecksum,
-                              self.target.sync, self.src, self.dlpath)
+            self.assertRaises(
+                checksum_util.InvalidChecksum,
+                self.target.sync,
+                self.src,
+                self.dlpath,
+            )
 
     def test_missing_size_causes_bad_checksum(self):
         def del_size(item):
-            del item['size']
+            del item["size"]
             return item
 
         _moditem(self.src, self.dlpath, self.pedigree, del_size)
         with _patched_missing_sum("fail"):
-            self.assertRaises(checksum_util.InvalidChecksum,
-                              self.target.sync, self.src, self.dlpath)
+            self.assertRaises(
+                checksum_util.InvalidChecksum,
+                self.target.sync,
+                self.src,
+                self.dlpath,
+            )
 
 
 class _patched_missing_sum(object):
     """This patches the legacy mode for missing checksum info so
     that it behaves like the new code path. Thus we can make
     the test run correctly"""
+
     def __init__(self, mode="fail"):
         self.mode = mode
 
     def __enter__(self):
-        self.modmcb = getattr(mirrors, '_missing_cksum_behavior', {})
+        self.modmcb = getattr(mirrors, "_missing_cksum_behavior", {})
         self.orig = self.modmcb.copy()
         if self.modmcb:
-            self.modmcb['mode'] = self.mode
+            self.modmcb["mode"] = self.mode
         return self
 
     def __exit__(self, type, value, traceback):
diff --git a/tests/unittests/test_command_hook_mirror.py b/tests/unittests/test_command_hook_mirror.py
index 6a72749..e718c9e 100644
--- a/tests/unittests/test_command_hook_mirror.py
+++ b/tests/unittests/test_command_hook_mirror.py
@@ -1,4 +1,5 @@
 from unittest import TestCase
+
 import simplestreams.mirrors.command_hook as chm
 from tests.testutil import get_mirror_reader
 
@@ -13,12 +14,11 @@ class TestCommandHookMirror(TestCase):
         self.assertRaises(TypeError, chm.CommandHookMirror, {})
 
     def test_init_with_load_products_works(self):
-        chm.CommandHookMirror({'load_products': 'true'})
+        chm.CommandHookMirror({"load_products": "true"})
 
     def test_stream_load_empty(self):
-
         src = get_mirror_reader("foocloud")
-        target = chm.CommandHookMirror({'load_products': ['true']})
+        target = chm.CommandHookMirror({"load_products": ["true"]})
         oruncmd = chm.run_command
 
         try:
@@ -30,14 +30,16 @@ class TestCommandHookMirror(TestCase):
 
         # the 'load_products' should be called once for each content
         # in the stream.
-        self.assertEqual(self._run_commands, [['true'], ['true']])
+        self.assertEqual(self._run_commands, [["true"], ["true"]])
 
     def test_stream_insert_product(self):
-
         src = get_mirror_reader("foocloud")
         target = chm.CommandHookMirror(
-            {'load_products': ['load-products'],
-             'insert_products': ['insert-products']})
+            {
+                "load_products": ["load-products"],
+                "insert_products": ["insert-products"],
+            }
+        )
         oruncmd = chm.run_command
 
         try:
@@ -49,15 +51,18 @@ class TestCommandHookMirror(TestCase):
 
         # the 'load_products' should be called once for each content
         # in the stream. same for 'insert-products'
-        self.assertEqual(len([f for f in self._run_commands
-                              if f == ['load-products']]), 2)
-        self.assertEqual(len([f for f in self._run_commands
-                              if f == ['insert-products']]), 2)
+        self.assertEqual(
+            len([f for f in self._run_commands if f == ["load-products"]]), 2
+        )
+        self.assertEqual(
+            len([f for f in self._run_commands if f == ["insert-products"]]), 2
+        )
 
     def _run_command(self, cmd, env=None, capture=False, rcs=None):
         self._run_commands.append(cmd)
         rc = 0
-        output = ''
+        output = ""
         return (rc, output)
 
+
 # vi: ts=4 expandtab syntax=python
diff --git a/tests/unittests/test_contentsource.py b/tests/unittests/test_contentsource.py
index ef838a2..2b88a15 100644
--- a/tests/unittests/test_contentsource.py
+++ b/tests/unittests/test_contentsource.py
@@ -2,14 +2,14 @@ import os
 import shutil
 import sys
 import tempfile
-
-from os.path import join, dirname
-from simplestreams import objectstores
-from simplestreams import contentsource
-from subprocess import Popen, PIPE, STDOUT
+from os.path import dirname, join
+from subprocess import PIPE, STDOUT, Popen
 from unittest import TestCase, skipIf
+
 import pytest
 
+from simplestreams import contentsource, objectstores
+
 
 class RandomPortServer(object):
     def __init__(self, path):
@@ -22,14 +22,15 @@ class RandomPortServer(object):
         if self.port and self.process:
             return
         testserver_path = join(
-            dirname(__file__), "..", "..", "tests", "httpserver.py")
-        pre = b'Serving HTTP:'
+            dirname(__file__), "..", "..", "tests", "httpserver.py"
+        )
+        pre = b"Serving HTTP:"
 
-        cmd = [sys.executable, '-u', testserver_path, "0"]
+        cmd = [sys.executable, "-u", testserver_path, "0"]
         p = Popen(cmd, cwd=self.path, stdout=PIPE, stderr=STDOUT)
         line = p.stdout.readline()  # pylint: disable=E1101
         if line.startswith(pre):
-            data = line[len(pre):].strip()
+            data = line[len(pre) :].strip()
             addr, port_str, cwd = data.decode().split(" ", 2)
             self.port = int(port_str)
             self.addr = addr
@@ -39,8 +40,9 @@ class RandomPortServer(object):
         else:
             p.kill()
             raise RuntimeError(
-                "Failed to start server in %s with %s. pid=%s. got: %s" %
-                (self.path, cmd, self.process, line))
+                "Failed to start server in %s with %s. pid=%s. got: %s"
+                % (self.path, cmd, self.process, line)
+            )
 
     def read_output(self):
         return str(self.process.stdout.readline())
@@ -63,13 +65,17 @@ class RandomPortServer(object):
         if self.process:
             pid = self.process.pid
 
-        return ("RandomPortServer(port=%s, addr=%s, process=%s, path=%s)" %
-                (self.port, self.addr, pid, self.path))
+        return "RandomPortServer(port=%s, addr=%s, process=%s, path=%s)" % (
+            self.port,
+            self.addr,
+            pid,
+            self.path,
+        )
 
     def url_for(self, fpath=""):
         if self.port is None:
             raise ValueError("No port available")
-        return 'http://127.0.0.1:%d/' % self.port + fpath
+        return "http://127.0.0.1:%d/" % self.port + fpath
 
 
 class BaseDirUsingTestCase(TestCase):
@@ -100,7 +106,8 @@ class BaseDirUsingTestCase(TestCase):
 
     def getcs(self, path, url_reader=None, rel=None):
         return contentsource.UrlContentSource(
-            self.url_for(path, rel=rel), url_reader=url_reader)
+            self.url_for(path, rel=rel), url_reader=url_reader
+        )
 
     def path_for(self, fpath, rel=None):
         # return full path to fpath.
@@ -120,8 +127,9 @@ class BaseDirUsingTestCase(TestCase):
 
         if not fullpath.startswith(self.tmpd + os.path.sep):
             raise ValueError(
-                "%s is not a valid path. Not under tmpdir: %s" %
-                (fpath, self.tmpd))
+                "%s is not a valid path. Not under tmpdir: %s"
+                % (fpath, self.tmpd)
+            )
 
         return fullpath
 
@@ -133,17 +141,18 @@ class BaseDirUsingTestCase(TestCase):
         if not self.server:
             raise ValueError("No server available, but proto == http")
         return self.server.url_for(
-            self.path_for(fpath=fpath, rel=rel)[len(self.tmpd)+1:])
+            self.path_for(fpath=fpath, rel=rel)[len(self.tmpd) + 1 :]
+        )
 
 
 class TestUrlContentSource(BaseDirUsingTestCase):
     http = True
-    fpath = 'foo'
-    fdata = b'hello world\n'
+    fpath = "foo"
+    fdata = b"hello world\n"
 
     def setUp(self):
         super(TestUrlContentSource, self).setUp()
-        with open(join(self.test_d, self.fpath), 'wb') as f:
+        with open(join(self.test_d, self.fpath), "wb") as f:
             f.write(self.fdata)
 
     def test_default_url_read_handles_None(self):
@@ -171,8 +180,10 @@ class TestUrlContentSource(BaseDirUsingTestCase):
 
     @skipIf(contentsource.requests is None, "requests not available")
     def test_requests_default_timeout(self):
-        self.assertEqual(contentsource.RequestsUrlReader.timeout,
-                         (contentsource.TIMEOUT, None))
+        self.assertEqual(
+            contentsource.RequestsUrlReader.timeout,
+            (contentsource.TIMEOUT, None),
+        )
 
     @skipIf(contentsource.requests is None, "requests not available")
     def test_requests_url_read_handles_None(self):
The diff has been truncated for viewing.
