Merge ~paride/simplestreams:apply-black into simplestreams:master
Status: Work in progress
Proposed branch: ~paride/simplestreams:apply-black
Merge into: simplestreams:master
Related bugs: none
Review via email: mp+440053@code.launchpad.net

Diff against target: 10555 lines (+3956/-2434), 47 files modified:
.git-blame-ignore-revs (+4/-0), .launchpad.yaml (+38/-0), .pre-commit-config.yaml (+15/-0), bin/json2streams (+2/-2), bin/sstream-mirror (+118/-66), bin/sstream-mirror-glance (+197/-114), bin/sstream-query (+84/-47), bin/sstream-sync (+111/-63), pyproject.toml (+6/-0), setup.py (+17/-14), simplestreams/checksum_util.py (+33/-16), simplestreams/contentsource.py (+33/-26), simplestreams/filters.py (+10/-6), simplestreams/generate_simplestreams.py (+53/-38), simplestreams/json2streams.py (+34/-26), simplestreams/log.py (+13/-9), simplestreams/mirrors/__init__.py (+153/-100), simplestreams/mirrors/command_hook.py (+69/-50), simplestreams/mirrors/glance.py (+305/-206), simplestreams/objectstores/__init__.py (+49/-26), simplestreams/objectstores/s3.py (+7/-7), simplestreams/objectstores/swift.py (+48/-37), simplestreams/openstack.py (+126/-66), simplestreams/util.py (+100/-79), tests/httpserver.py (+24/-13), tests/testutil.py (+5/-4), tests/unittests/test_badmirrors.py (+72/-33), tests/unittests/test_command_hook_mirror.py (+17/-12), tests/unittests/test_contentsource.py (+62/-51), tests/unittests/test_generate_simplestreams.py (+322/-179), tests/unittests/test_glancemirror.py (+702/-390), tests/unittests/test_json2streams.py (+126/-95), tests/unittests/test_mirrorreaders.py (+24/-14), tests/unittests/test_mirrorwriters.py (+6/-6), tests/unittests/test_openstack.py (+184/-151), tests/unittests/test_resolvework.py (+90/-38), tests/unittests/test_signed_data.py (+7/-7), tests/unittests/test_util.py (+232/-132), tests/unittests/tests_filestore.py (+19/-13), tools/install-deps (+1/-1), tools/js2signed (+4/-3), tools/make-test-data (+240/-185), tools/sign_helper.py (+8/-4), tools/tab2streams (+31/-15), tools/toolutil.py (+66/-43), tools/ubuntu_versions.py (+71/-43), tox.ini (+18/-4)

Reviewer | Review Type | Date Requested | Status
---|---|---|---
Server Team CI bot | continuous-integration | | Approve
simplestreams-dev | | | Pending
Commit message
- Apply black and isort
- Add black and isort pre-commit hooks
- Run (read-only) pre-commit in tox environment
- Run tox in lpci
Description of the change
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:43978e994d5
https:/
Executed test runs:
FAILURE: https:/
FAILURE: https:/
FAILURE: https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:43978e994d5
https:/
Executed test runs:
FAILURE: https:/
FAILURE: https:/
FAILURE: https:/
Paride Legovini (paride) wrote:
CI failures fixed by:
https:/
- 246f30e... by Paride Legovini
  lpci: don't run the pre-commit environment

  Apparently the lpci workers do not have access to github.com,
  so pre-commit can't download the hooks.
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:40311284d92
https:/
Executed test runs:
SUCCESS: https:/
FAILURE: https:/
FAILURE: https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:7833c12c86e
https:/
Executed test runs:
SUCCESS: https:/
FAILURE: https:/
FAILURE: https:/
- 274817d... by Paride Legovini
  install-deps: install pkg-config on non-amd64 archs
- a4d7a3f... by Paride Legovini
  lpci: expand the test matrix

  Cover: (amd64, arm64, ppc64el, s390x) * (focal, jammy, devel)
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:a4d7a3f8132
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
- 39c2e39... by Paride Legovini
  lpci: add test dependency: python3-dev
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:39c2e398be3
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
- 19608c6... by Paride Legovini
  mm
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:fe4262017af
https:/
Executed test runs:
FAILURE: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
- b405e84... by Paride Legovini
  tox: pass the lowercase *_proxy variables

  Workaround for https://github.com/tox-dev/tox/pull/2378/.
- 363d3b4... by Paride Legovini
  tox: skip_install in the pre_commit environment
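As a sketch of what such a workaround amounts to in tox.ini (the branch's actual tox.ini entry is not shown in the captured diff, so treat the exact contents as an assumption): tox only forwards an allowlist of environment variables into test environments, and the lowercase proxy variants have to be added explicitly.

```ini
# Sketch only: forward the lowercase proxy variables, which tox did not
# pass through by default (the linked PR later added them upstream).
[testenv]
passenv =
    http_proxy
    https_proxy
    no_proxy
```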
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:37a21b666ed
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
- 75441dd... by Paride Legovini
  ci: deal with pre-commit modifying files

  We don't need to jump through hoops to avoid pre-commit modifying files
  during the tox run: if that happens, the CI run will fail anyway.
- 5449069... by Paride Legovini
  prefer bin
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:19608c61d57
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:6feb6ce9622
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:6bbfb4fbe85
https:/
Executed test runs:
FAILURE: https:/
FAILURE: https:/
FAILURE: https:/
FAILURE: https:/
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:dec84e7e729
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
- 4c88164... by Paride Legovini
  use-install-deps

  The tool requires sudo to install dependencies.
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:f79823a18f7
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:5f19b7bdc80
https:/
Executed test runs:
FAILURE: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:0a7d8c46717
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
Adam Collard (adam-collard) wrote:
I wouldn't mix a large reformatting like this (applying black across the codebase) with anything else.
I suggest landing the blacken commit, then separately isort, then separately again tox / lp-ci changes.
You'll want to ignore the black commits for e.g. git blame, but the tox and lp-ci config changes should be reviewed in their own right.
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:5f1da7595cc
https:/
Executed test runs:
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
SUCCESS: https:/
Paride Legovini (paride) wrote:
Hi Adam, thanks for chiming in. Yeah I started this with the idea of quickly fixing CI (currently failing in master), then I started experimenting with integrating pre-commit, tox and lpci and the changeset grew.
I'll definitely split it up and resubmit it in smaller chunks.
Paride Legovini (paride) wrote:
CI is stuck (running for 17 hours). I'll do a no-op force push to retrigger.
- 6da3cc1... by Paride Legovini
  ci: append 127.0.0.1 to no_proxy
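The append uses the shell's `${var:+alt}` parameter expansion (visible in the .launchpad.yaml hunk in the preview diff below), which adds the comma separator only when no_proxy is already set. A quick illustration:

```sh
# ${no_proxy:+$no_proxy,} expands to "$no_proxy," when no_proxy is set
# and non-empty, and to nothing otherwise, so no leading comma appears.
unset no_proxy
no_proxy="${no_proxy:+$no_proxy,}127.0.0.1"; echo "$no_proxy"
# -> 127.0.0.1
no_proxy="example.com"
no_proxy="${no_proxy:+$no_proxy,}127.0.0.1"; echo "$no_proxy"
# -> example.com,127.0.0.1
```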
Paride Legovini (paride) wrote:
CI passed! Next steps:
- Clean up the git history
- Split this MP into smaller ones.
I'll get back to this after 2023-06-13.
Adam Collard (adam-collard) wrote:
@Paride - don't forget about this one!
Paride Legovini (paride) wrote:
heh, knowing that other people are waiting for it will help. thanks!
Unmerged commits

- 6da3cc1... by Paride Legovini
  ci: append 127.0.0.1 to no_proxy
- 4c88164... by Paride Legovini
  use-install-deps

  The tool requires sudo to install dependencies.
- 5449069... by Paride Legovini
  prefer bin
- 19608c6... by Paride Legovini
  mm
- 75441dd... by Paride Legovini
  ci: deal with pre-commit modifying files

  We don't need to jump through hoops to avoid pre-commit modifying files
  during the tox run: if that happens, the CI run will fail anyway.
- 363d3b4... by Paride Legovini
  tox: skip_install in the pre_commit environment
- b405e84... by Paride Legovini
  tox: pass the lowercase *_proxy variables

  Workaround for https://github.com/tox-dev/tox/pull/2378/.
- 39c2e39... by Paride Legovini
  lpci: add test dependency: python3-dev
- a4d7a3f... by Paride Legovini
  lpci: expand the test matrix

  Cover: (amd64, arm64, ppc64el, s390x) * (focal, jammy, devel)
- 274817d... by Paride Legovini
  install-deps: install pkg-config on non-amd64 archs
Preview Diff
```diff
diff --git a/.git-blame-ignore-revs b/.git-blame-ignore-revs
new file mode 100644
index 0000000..da1123d
--- /dev/null
+++ b/.git-blame-ignore-revs
@@ -0,0 +1,4 @@
+# Automatically apply to git blame with `git config blame.ignorerevsfile .git-blame-ignore-revs`
+
+# Apply black and isort formatting
+0d4060f7fa2de1e5b8c8b263100b1ed3a2c479bb
```
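As the comment in the file notes, the list only takes effect once git is pointed at it. Typical usage (the canonical spelling of the config key is blame.ignoreRevsFile; git config keys are case-insensitive):

```sh
# Enable once per clone:
git config blame.ignoreRevsFile .git-blame-ignore-revs
# Or per invocation:
git blame --ignore-revs-file .git-blame-ignore-revs simplestreams/util.py
```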
```diff
diff --git a/.launchpad.yaml b/.launchpad.yaml
new file mode 100644
index 0000000..20a3afb
--- /dev/null
+++ b/.launchpad.yaml
@@ -0,0 +1,38 @@
+pipeline:
+  - lint
+  - unit-jammy
+  - unit-devel
+
+jobs:
+  lint:
+    series: jammy
+    architectures:
+      - amd64
+    packages:
+      - git
+      - tox
+    run: tox -e flake8,pre-commit
+  unit-jammy:
+    series: jammy
+    architectures:
+      - amd64
+      - arm64
+      - ppc64el
+      - s390x
+    packages:
+      - sudo
+    run: |
+      tools/install-deps tox
+      no_proxy="${no_proxy:+$no_proxy,}127.0.0.1" tox -e py3
+  unit-devel:
+    series: devel
+    architectures:
+      - amd64
+      - arm64
+      - ppc64el
+      - s390x
+    packages:
+      - sudo
+    run: |
+      tools/install-deps tox
+      no_proxy="${no_proxy:+$no_proxy,}127.0.0.1" tox -e py3
```
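The job definitions double as a local reproduction recipe; running the same steps outside lpci is just (assuming a jammy or devel machine with tox available):

```sh
# lint job:
tox -e flake8,pre-commit
# unit-jammy / unit-devel jobs:
tools/install-deps tox
no_proxy="${no_proxy:+$no_proxy,}127.0.0.1" tox -e py3
```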
```diff
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
new file mode 100644
index 0000000..0d08ba7
--- /dev/null
+++ b/.pre-commit-config.yaml
@@ -0,0 +1,15 @@
+# Hooks called in the "manual" stage are meant to be read-only. We'll trigger
+# them from tox, and we don't want the tox pre-commit environment to modify
+# code, as this may interfere with other environments.
+#
+# To update the pinned versions run: pre-commit autoupdate
+
+repos:
+  - repo: https://github.com/ambv/black
+    rev: 23.3.0
+    hooks:
+      - id: black
+  - repo: https://github.com/pycqa/isort
+    rev: 5.12.0
+    hooks:
+      - id: isort
```
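With this configuration the hooks can also be exercised directly; standard pre-commit invocations (how the branch's tox environment actually calls pre-commit is not part of the captured diff):

```sh
# Check the whole tree once:
pre-commit run --all-files
# Run hooks for the "manual" stage mentioned in the comment above:
pre-commit run --all-files --hook-stage manual
# Install as a commit-time hook for day-to-day work:
pre-commit install
```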
```diff
diff --git a/bin/json2streams b/bin/json2streams
index bca6a7b..7dfd873 100755
--- a/bin/json2streams
+++ b/bin/json2streams
@@ -2,8 +2,8 @@
 # Copyright (C) 2013, 2015 Canonical Ltd.
 
 import sys
-from simplestreams.json2streams import main
 
+from simplestreams.json2streams import main
 
-if __name__ == '__main__':
+if __name__ == "__main__":
     sys.exit(main())
```
```diff
diff --git a/bin/sstream-mirror b/bin/sstream-mirror
index 08b71a5..fbe86da 100755
--- a/bin/sstream-mirror
+++ b/bin/sstream-mirror
@@ -18,11 +18,7 @@
 import argparse
 import sys
 
-from simplestreams import filters
-from simplestreams import log
-from simplestreams import mirrors
-from simplestreams import objectstores
-from simplestreams import util
+from simplestreams import filters, log, mirrors, objectstores, util
 
 
 class DotProgress(object):
@@ -39,9 +35,10 @@ class DotProgress(object):
         self.curpath = path
         status = ""
         if self.expected:
-            status = (" %02s%%" %
-                      (int(self.bytes_read * 100 / self.expected)))
-        sys.stderr.write('=> %s [%s]%s\n' % (path, total, status))
+            status = " %02s%%" % (
+                int(self.bytes_read * 100 / self.expected)
+            )
+        sys.stderr.write("=> %s [%s]%s\n" % (path, total, status))
 
         if cur == total:
             sys.stderr.write("\n")
@@ -52,7 +49,7 @@ class DotProgress(object):
         toprint = int(cur * self.columns / total) - self.printed
         if toprint <= 0:
             return
-        sys.stderr.write('.' * toprint)
+        sys.stderr.write("." * toprint)
         sys.stderr.flush()
         self.printed += toprint
 
@@ -60,81 +57,135 @@ class DotProgress(object):
 def main():
     parser = argparse.ArgumentParser()
 
-    parser.add_argument('--keep', action='store_true', default=False,
-                        help='keep items in target up to MAX items '
-                        'even after they have fallen out of the source')
-    parser.add_argument('--max', type=int, default=None,
-                        help='store at most MAX items in the target')
-    parser.add_argument('--path', default=None,
-                        help='sync from index or products file in mirror')
-    parser.add_argument('--no-item-download', action='store_true',
-                        default=False,
-                        help='do not download items with a "path"')
-    parser.add_argument('--dry-run', action='store_true', default=False,
-                        help='only report what would be done')
-    parser.add_argument('--progress', action='store_true', default=False,
-                        help='show progress for downloading files')
-    parser.add_argument('--mirror', action='append', default=[],
-                        dest="mirrors",
-                        help='additional mirrors to find referenced files')
-
-    parser.add_argument('--verbose', '-v', action='count', default=0)
-    parser.add_argument('--log-file', default=sys.stderr,
-                        type=argparse.FileType('w'))
-
-    parser.add_argument('--keyring', action='store', default=None,
-                        help='keyring to be specified to gpg via --keyring')
-    parser.add_argument('--no-verify', '-U', action='store_false',
-                        dest='verify', default=True,
-                        help="do not gpg check signed json files")
-    parser.add_argument('--no-checksumming-reader', action='store_false',
-                        dest='checksumming_reader', default=True,
-                        help=("do not call 'insert_item' with a reader"
-                              " that does checksumming."))
-
-    parser.add_argument('source_mirror')
-    parser.add_argument('output_d')
-    parser.add_argument('filters', nargs='*', default=[])
+    parser.add_argument(
+        "--keep",
+        action="store_true",
+        default=False,
+        help="keep items in target up to MAX items "
+        "even after they have fallen out of the source",
+    )
+    parser.add_argument(
+        "--max",
+        type=int,
+        default=None,
+        help="store at most MAX items in the target",
+    )
+    parser.add_argument(
+        "--path",
+        default=None,
+        help="sync from index or products file in mirror",
+    )
+    parser.add_argument(
+        "--no-item-download",
+        action="store_true",
+        default=False,
+        help='do not download items with a "path"',
+    )
+    parser.add_argument(
+        "--dry-run",
+        action="store_true",
+        default=False,
+        help="only report what would be done",
+    )
+    parser.add_argument(
+        "--progress",
+        action="store_true",
+        default=False,
+        help="show progress for downloading files",
+    )
+    parser.add_argument(
+        "--mirror",
+        action="append",
+        default=[],
+        dest="mirrors",
+        help="additional mirrors to find referenced files",
+    )
+
+    parser.add_argument("--verbose", "-v", action="count", default=0)
+    parser.add_argument(
+        "--log-file", default=sys.stderr, type=argparse.FileType("w")
+    )
+
+    parser.add_argument(
+        "--keyring",
+        action="store",
+        default=None,
+        help="keyring to be specified to gpg via --keyring",
+    )
+    parser.add_argument(
+        "--no-verify",
+        "-U",
+        action="store_false",
+        dest="verify",
+        default=True,
+        help="do not gpg check signed json files",
+    )
+    parser.add_argument(
+        "--no-checksumming-reader",
+        action="store_false",
+        dest="checksumming_reader",
+        default=True,
+        help=(
+            "do not call 'insert_item' with a reader"
+            " that does checksumming."
+        ),
+    )
+
+    parser.add_argument("source_mirror")
+    parser.add_argument("output_d")
+    parser.add_argument("filters", nargs="*", default=[])
 
     args = parser.parse_args()
 
-    (mirror_url, initial_path) = util.path_from_mirror_url(args.source_mirror,
-                                                           args.path)
+    (mirror_url, initial_path) = util.path_from_mirror_url(
+        args.source_mirror, args.path
+    )
 
     def policy(content, path):
-        if initial_path.endswith('sjson'):
-            return util.read_signed(content,
-                                    keyring=args.keyring,
-                                    checked=args.verify)
+        if initial_path.endswith("sjson"):
+            return util.read_signed(
+                content, keyring=args.keyring, checked=args.verify
+            )
         else:
             return content
 
     filter_list = filters.get_filters(args.filters)
-    mirror_config = {'max_items': args.max, 'keep_items': args.keep,
-                     'filters': filter_list,
-                     'item_download': not args.no_item_download,
-                     'checksumming_reader': args.checksumming_reader}
+    mirror_config = {
+        "max_items": args.max,
+        "keep_items": args.keep,
+        "filters": filter_list,
+        "item_download": not args.no_item_download,
+        "checksumming_reader": args.checksumming_reader,
+    }
 
     level = (log.ERROR, log.INFO, log.DEBUG)[min(args.verbose, 2)]
     log.basicConfig(stream=args.log_file, level=level)
 
-    smirror = mirrors.UrlMirrorReader(mirror_url, mirrors=args.mirrors,
-                                      policy=policy)
+    smirror = mirrors.UrlMirrorReader(
+        mirror_url, mirrors=args.mirrors, policy=policy
+    )
     tstore = objectstores.FileStore(args.output_d)
 
-    drmirror = mirrors.DryRunMirrorWriter(config=mirror_config,
-                                          objectstore=tstore)
+    drmirror = mirrors.DryRunMirrorWriter(
+        config=mirror_config, objectstore=tstore
+    )
     drmirror.sync(smirror, initial_path)
 
     def print_diff(char, items):
         for pedigree, path, size in items:
             fmt = "{char} {pedigree} {path} {size} Mb\n"
             size = int(size / (1024 * 1024))
-            sys.stderr.write(fmt.format(
-                char=char, pedigree=' '.join(pedigree), path=path, size=size))
-
-    print_diff('+', drmirror.downloading)
-    print_diff('-', drmirror.removing)
+            sys.stderr.write(
+                fmt.format(
+                    char=char,
+                    pedigree=" ".join(pedigree),
+                    path=path,
+                    size=size,
+                )
+            )
+
+    print_diff("+", drmirror.downloading)
+    print_diff("-", drmirror.removing)
     sys.stderr.write("%d Mb change\n" % (drmirror.size / (1024 * 1024)))
 
     if args.dry_run:
@@ -147,13 +198,14 @@ def main():
 
     tstore = objectstores.FileStore(args.output_d, complete_callback=callback)
 
-    tmirror = mirrors.ObjectFilterMirror(config=mirror_config,
-                                         objectstore=tstore)
+    tmirror = mirrors.ObjectFilterMirror(
+        config=mirror_config, objectstore=tstore
+    )
 
     tmirror.sync(smirror, initial_path)
 
 
-if __name__ == '__main__':
+if __name__ == "__main__":
     main()
 
 # vi: ts=4 expandtab syntax=python
```
```diff
diff --git a/bin/sstream-mirror-glance b/bin/sstream-mirror-glance
index 5907137..953a2d3 100755
--- a/bin/sstream-mirror-glance
+++ b/bin/sstream-mirror-glance
@@ -23,15 +23,11 @@ import argparse
 import os.path
 import sys
 
-from simplestreams import objectstores
-from simplestreams.objectstores import swift
-from simplestreams import log
-from simplestreams import mirrors
-from simplestreams import openstack
-from simplestreams import util
+from simplestreams import log, mirrors, objectstores, openstack, util
 from simplestreams.mirrors import glance
+from simplestreams.objectstores import swift
 
-DEFAULT_FILTERS = ['ftype~(disk1.img|disk.img)', 'arch~(x86_64|amd64|i386)']
+DEFAULT_FILTERS = ["ftype~(disk1.img|disk.img)", "arch~(x86_64|amd64|i386)"]
 DEFAULT_KEYRING = "/usr/share/keyrings/ubuntu-cloudimage-keyring.gpg"
 
 
@@ -44,94 +40,172 @@ class StdoutProgressAggregator(util.ProgressAggregator):
         super(StdoutProgressAggregator, self).__init__(remaining_items)
 
     def emit(self, progress):
-        size = float(progress['size'])
-        written = float(progress['written'])
-        print("%.2f %s (%d of %d images) - %.2f" %
-              (written / size, progress['name'],
-               self.total_image_count - len(self.remaining_items) + 1,
-               self.total_image_count,
-               float(self.total_written) / self.total_size))
+        size = float(progress["size"])
+        written = float(progress["written"])
+        print(
+            "%.2f %s (%d of %d images) - %.2f"
+            % (
+                written / size,
+                progress["name"],
+                self.total_image_count - len(self.remaining_items) + 1,
+                self.total_image_count,
+                float(self.total_written) / self.total_size,
+            )
+        )
 
 
 def main():
     parser = argparse.ArgumentParser()
 
-    parser.add_argument('--keep', action='store_true', default=False,
-                        help='keep items in target up to MAX items '
-                        'even after they have fallen out of the source')
-    parser.add_argument('--max', type=int, default=None,
-                        help='store at most MAX items in the target')
+    parser.add_argument(
+        "--keep",
+        action="store_true",
+        default=False,
+        help="keep items in target up to MAX items "
+        "even after they have fallen out of the source",
+    )
+    parser.add_argument(
+        "--max",
+        type=int,
+        default=None,
+        help="store at most MAX items in the target",
+    )
 
-    parser.add_argument('--region', action='append', default=None,
-                        dest='regions',
-                        help='operate on specified region '
-                        '[useable multiple times]')
+    parser.add_argument(
+        "--region",
+        action="append",
+        default=None,
+        dest="regions",
+        help="operate on specified region " "[useable multiple times]",
+    )
 
-    parser.add_argument('--mirror', action='append', default=[],
-                        dest="mirrors",
-                        help='additional mirrors to find referenced files')
-    parser.add_argument('--path', default=None,
-                        help='sync from index or products file in mirror')
-    parser.add_argument('--output-dir', metavar="DIR", default=False,
-                        help='write image data to storage in dir')
-    parser.add_argument('--output-swift', metavar="prefix", default=False,
-                        help='write image data to swift under prefix')
+    parser.add_argument(
+        "--mirror",
+        action="append",
+        default=[],
+        dest="mirrors",
+        help="additional mirrors to find referenced files",
+    )
+    parser.add_argument(
+        "--path",
+        default=None,
+        help="sync from index or products file in mirror",
+    )
+    parser.add_argument(
+        "--output-dir",
+        metavar="DIR",
+        default=False,
+        help="write image data to storage in dir",
+    )
+    parser.add_argument(
+        "--output-swift",
+        metavar="prefix",
+        default=False,
+        help="write image data to swift under prefix",
+    )
 
-    parser.add_argument('--name-prefix', metavar="prefix", default=None,
-                        help='prefix for each published image name')
-    parser.add_argument('--cloud-name', metavar="name", default=None,
-                        required=True, help='unique name for this cloud')
-    parser.add_argument('--modify-hook', metavar="cmd", default=None,
-                        required=False,
-                        help='invoke cmd on each image prior to upload')
-    parser.add_argument('--content-id', metavar="name", default=None,
-                        required=True,
-                        help='content-id to use for published data.'
-                        ' may contain "%%(region)s"')
+    parser.add_argument(
+        "--name-prefix",
+        metavar="prefix",
+        default=None,
+        help="prefix for each published image name",
+    )
+    parser.add_argument(
+        "--cloud-name",
+        metavar="name",
+        default=None,
+        required=True,
+        help="unique name for this cloud",
+    )
+    parser.add_argument(
+        "--modify-hook",
+        metavar="cmd",
+        default=None,
+        required=False,
+        help="invoke cmd on each image prior to upload",
+    )
+    parser.add_argument(
+        "--content-id",
+        metavar="name",
+        default=None,
+        required=True,
+        help="content-id to use for published data."
+        ' may contain "%%(region)s"',
+    )
 
-    parser.add_argument('--progress', action='store_true', default=False,
-                        help='display per-item download progress')
-    parser.add_argument('--verbose', '-v', action='count', default=0)
-    parser.add_argument('--log-file', default=sys.stderr,
-                        type=argparse.FileType('w'))
+    parser.add_argument(
+        "--progress",
+        action="store_true",
+        default=False,
+        help="display per-item download progress",
+    )
+    parser.add_argument("--verbose", "-v", action="count", default=0)
+    parser.add_argument(
+        "--log-file", default=sys.stderr, type=argparse.FileType("w")
+    )
 
-    parser.add_argument('--keyring', action='store', default=DEFAULT_KEYRING,
-                        help='The keyring for gpg --keyring')
+    parser.add_argument(
+        "--keyring",
+        action="store",
+        default=DEFAULT_KEYRING,
+        help="The keyring for gpg --keyring",
+    )
 
-    parser.add_argument('source_mirror')
-    parser.add_argument('item_filters', nargs='*', default=DEFAULT_FILTERS,
-                        help="Filter expression for mirrored items. "
-                        "Multiple filter arguments can be specified"
-                        "and will be combined with logical AND. "
-                        "Expressions are key[!]=literal_string "
-                        "or key[!]~regexp.")
+    parser.add_argument("source_mirror")
+    parser.add_argument(
+        "item_filters",
+        nargs="*",
+        default=DEFAULT_FILTERS,
+        help="Filter expression for mirrored items. "
+        "Multiple filter arguments can be specified"
+        "and will be combined with logical AND. "
+        "Expressions are key[!]=literal_string "
+        "or key[!]~regexp.",
+    )
 
-    parser.add_argument('--hypervisor-mapping', action='store_true',
-                        default=False,
-                        help="Set hypervisor_type attribute on stored images "
-                        "and the virt attribute in the associated stream "
-                        "data. This is useful in OpenStack Clouds which use "
-                        "multiple hypervisor types with in a single region.")
+    parser.add_argument(
+        "--hypervisor-mapping",
+        action="store_true",
+        default=False,
+        help="Set hypervisor_type attribute on stored images "
+        "and the virt attribute in the associated stream "
+        "data. This is useful in OpenStack Clouds which use "
+        "multiple hypervisor types with in a single region.",
+    )
 
-    parser.add_argument('--custom-property', action='append', default=[],
-                        dest="custom_properties",
-                        help='additional properties to add to glance'
-                        ' image metadata (key=value format).')
+    parser.add_argument(
+        "--custom-property",
+        action="append",
+        default=[],
+        dest="custom_properties",
+        help="additional properties to add to glance"
+        " image metadata (key=value format).",
+    )
 
-    parser.add_argument('--visibility', action='store', default='public',
-                        choices=('public', 'private', 'community', 'shared'),
-                        help='Visibility to apply to stored images.')
+    parser.add_argument(
+        "--visibility",
+        action="store",
+        default="public",
+        choices=("public", "private", "community", "shared"),
+        help="Visibility to apply to stored images.",
+    )
 
-    parser.add_argument('--image-import-conversion', action='store_true',
-                        default=False,
-                        help="Enable conversion of images to raw format using "
-                        "image import option in Glance.")
+    parser.add_argument(
+        "--image-import-conversion",
+        action="store_true",
+        default=False,
+        help="Enable conversion of images to raw format using "
+        "image import option in Glance.",
+    )
 
-    parser.add_argument('--set-latest-property', action='store_true',
-                        default=False,
-                        help="Set 'latest=true' property to latest synced "
-                        "os_version/architecture image metadata and remove "
-                        "latest property from the old images.")
+    parser.add_argument(
+        "--set-latest-property",
+        action="store_true",
+        default=False,
+        help="Set 'latest=true' property to latest synced "
+        "os_version/architecture image metadata and remove "
+        "latest property from the old images.",
+    )
 
     args = parser.parse_args()
 
@@ -139,27 +213,32 @@ def main():
     if args.modify_hook:
         modify_hook = args.modify_hook.split()
 
-    mirror_config = {'max_items': args.max, 'keep_items': args.keep,
-                     'cloud_name': args.cloud_name,
-                     'modify_hook': modify_hook,
-                     'item_filters': args.item_filters,
-                     'hypervisor_mapping': args.hypervisor_mapping,
-                     'custom_properties': args.custom_properties,
-                     'visibility': args.visibility,
-                     'image_import_conversion': args.image_import_conversion,
-                     'set_latest_property': args.set_latest_property}
-
-    (mirror_url, args.path) = util.path_from_mirror_url(args.source_mirror,
-                                                        args.path)
+    mirror_config = {
+        "max_items": args.max,
+        "keep_items": args.keep,
+        "cloud_name": args.cloud_name,
+        "modify_hook": modify_hook,
+        "item_filters": args.item_filters,
+        "hypervisor_mapping": args.hypervisor_mapping,
+        "custom_properties": args.custom_properties,
+        "visibility": args.visibility,
+        "image_import_conversion": args.image_import_conversion,
+        "set_latest_property": args.set_latest_property,
+    }
+
+    (mirror_url, args.path) = util.path_from_mirror_url(
+        args.source_mirror, args.path
+    )
 
     def policy(content, path):  # pylint: disable=W0613
-        if args.path.endswith('sjson'):
+        if args.path.endswith("sjson"):
             return util.read_signed(content, keyring=args.keyring)
         else:
             return content
 
-    smirror = mirrors.UrlMirrorReader(mirror_url, mirrors=args.mirrors,
-                                      policy=policy)
+    smirror = mirrors.UrlMirrorReader(
+        mirror_url, mirrors=args.mirrors, policy=policy
+    )
     if args.output_dir and args.output_swift:
         error("--output-dir and --output-swift are mutually exclusive\n")
         sys.exit(1)
@@ -169,7 +248,7 @@ def main():
 
     regions = args.regions
     if regions is None:
-        regions = openstack.get_regions(services=['image'])
+        regions = openstack.get_regions(services=["image"])
 
     for region in regions:
         if args.output_dir:
@@ -181,25 +260,29 @@ def main():
             sys.stderr.write("not writing data anywhere\n")
             tstore = None
 
-        mirror_config['content_id'] = args.content_id % {'region': region}
+        mirror_config["content_id"] = args.content_id % {"region": region}
 
         if args.progress:
-            drmirror = glance.ItemInfoDryRunMirror(config=mirror_config,
-                                                   objectstore=tstore)
+            drmirror = glance.ItemInfoDryRunMirror(
+                config=mirror_config, objectstore=tstore
+            )
             drmirror.sync(smirror, args.path)
             p = StdoutProgressAggregator(drmirror.items)
             progress_callback = p.progress_callback
         else:
             progress_callback = None
 
-        tmirror = glance.GlanceMirror(config=mirror_config,
-                                      objectstore=tstore, region=region,
-                                      name_prefix=args.name_prefix,
-                                      progress_callback=progress_callback)
+        tmirror = glance.GlanceMirror(
+            config=mirror_config,
+            objectstore=tstore,
+            region=region,
+            name_prefix=args.name_prefix,
+            progress_callback=progress_callback,
+        )
         tmirror.sync(smirror, args.path)
 
 
-if __name__ == '__main__':
+if __name__ == "__main__":
     main()
 
 # vi: ts=4 expandtab syntax=python
```
```diff
diff --git a/bin/sstream-query b/bin/sstream-query
index 6534317..3752cac 100755
--- a/bin/sstream-query
+++ b/bin/sstream-query
@@ -16,11 +16,6 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.
 
-from simplestreams import filters
-from simplestreams import mirrors
-from simplestreams import log
-from simplestreams import util
-
 import argparse
 import errno
 import json
@@ -28,6 +23,8 @@ import pprint
 import signal
 import sys
 
+from simplestreams import filters, log, mirrors, util
+
 FORMAT_PRETTY = "PRETTY"
 FORMAT_JSON = "JSON"
 DEFAULT_KEYRING = "/usr/share/keyrings/ubuntu-cloudimage-keyring.gpg"
@@ -43,15 +40,15 @@ class FilterMirror(mirrors.BasicMirrorWriter):
         if config is None:
             config = {}
         self.config = config
-        self.filters = config.get('filters', [])
-        outfmt = config.get('output_format')
+        self.filters = config.get("filters", [])
+        outfmt = config.get("output_format")
         if not outfmt:
             outfmt = "%s"
         self.output_format = outfmt
         self.json_entries = []
 
     def load_products(self, path=None, content_id=None):
-        return {'content_id': content_id, 'products': {}}
+        return {"content_id": content_id, "products": {}}
 
     def filter_item(self, data, src, target, pedigree):
         return filters.filter_item(self.filters, data, src, pedigree)
@@ -61,8 +58,8 @@ class FilterMirror(mirrors.BasicMirrorWriter):
         # data is src['products'][ped[0]]['versions'][ped[1]]['items'][ped[2]]
        # contentsource is a ContentSource if 'path' exists in data or None
         data = util.products_exdata(src, pedigree)
-        if 'path' in data:
-            data.update({'item_url': contentsource.url})
+        if "path" in data:
+            data.update({"item_url": contentsource.url})
 
         if self.output_format == FORMAT_PRETTY:
             pprint.pprint(data)
@@ -79,39 +76,71 @@ class FilterMirror(mirrors.BasicMirrorWriter):
 def main():
     parser = argparse.ArgumentParser()
 
-    parser.add_argument('--max', type=int, default=None, dest='max_items',
-                        help='store at most MAX items in the target')
+    parser.add_argument(
+        "--max",
+        type=int,
+        default=None,
+        dest="max_items",
+        help="store at most MAX items in the target",
+    )
 
-    parser.add_argument('--path', default=None,
-                        help='sync from index or products file in mirror')
+    parser.add_argument(
+        "--path",
+        default=None,
+        help="sync from index or products file in mirror",
+    )
 
     fmt_group = parser.add_mutually_exclusive_group()
-    fmt_group.add_argument('--output-format', '-o', action='store',
-                           dest='output_format', default=None,
-                           help="specify output format per python str.format")
-    fmt_group.add_argument('--pretty', action='store_const',
-                           const=FORMAT_PRETTY, dest='output_format',
-                           help="pretty print output")
-    fmt_group.add_argument('--json', action='store_const',
-                           const=FORMAT_JSON, dest='output_format',
-                           help="output in JSON as a list of dicts.")
-    parser.add_argument('--verbose', '-v', action='count', default=0)
-    parser.add_argument('--log-file', default=sys.stderr,
-                        type=argparse.FileType('w'))
-
-    parser.add_argument('--keyring', action='store', default=DEFAULT_KEYRING,
-                        help='keyring to be specified to gpg via --keyring')
-    parser.add_argument('--no-verify', '-U', action='store_false',
-                        dest='verify', default=True,
-                        help="do not gpg check signed json files")
-
-    parser.add_argument('mirror_url')
-    parser.add_argument('filters', nargs='*', default=[])
+    fmt_group.add_argument(
+        "--output-format",
+        "-o",
+        action="store",
+        dest="output_format",
+        default=None,
+        help="specify output format per python str.format",
+    )
+    fmt_group.add_argument(
+        "--pretty",
+        action="store_const",
+        const=FORMAT_PRETTY,
+        dest="output_format",
+        help="pretty print output",
+    )
+    fmt_group.add_argument(
+        "--json",
+        action="store_const",
+        const=FORMAT_JSON,
+        dest="output_format",
+        help="output in JSON as a list of dicts.",
+    )
+    parser.add_argument("--verbose", "-v", action="count", default=0)
+    parser.add_argument(
+        "--log-file", default=sys.stderr, type=argparse.FileType("w")
+    )
+
+    parser.add_argument(
+        "--keyring",
+        action="store",
+        default=DEFAULT_KEYRING,
+        help="keyring to be specified to gpg via --keyring",
+    )
+    parser.add_argument(
+        "--no-verify",
+        "-U",
+        action="store_false",
+        dest="verify",
+        default=True,
+        help="do not gpg check signed json files",
+    )
+
+    parser.add_argument("mirror_url")
+    parser.add_argument("filters", nargs="*", default=[])
 
     cmdargs = parser.parse_args()
 
-    (mirror_url, path) = util.path_from_mirror_url(cmdargs.mirror_url,
-                                                   cmdargs.path)
+    (mirror_url, path) = util.path_from_mirror_url(
+        cmdargs.mirror_url, cmdargs.path
+    )
 
     level = (log.ERROR, log.INFO, log.DEBUG)[min(cmdargs.verbose, 2)]
     log.basicConfig(stream=cmdargs.log_file, level=level)
@@ -119,33 +148,41 @@ def main():
     initial_path = path
 
     def policy(content, path):
-        if initial_path.endswith('sjson'):
-            return util.read_signed(content,
-                                    keyring=cmdargs.keyring,
-                                    checked=cmdargs.verify)
+        if initial_path.endswith("sjson"):
+            return util.read_signed(
+                content, keyring=cmdargs.keyring, checked=cmdargs.verify
+            )
         else:
             return content
 
     smirror = mirrors.UrlMirrorReader(mirror_url, policy=policy)
 
     filter_list = filters.get_filters(cmdargs.filters)
-    cfg = {'max_items': cmdargs.max_items,
-           'filters': filter_list,
-           'output_format': cmdargs.output_format}
+    cfg = {
+        "max_items": cmdargs.max_items,
+        "filters": filter_list,
+        "output_format": cmdargs.output_format,
+    }
 
     tmirror = FilterMirror(config=cfg)
     try:
         tmirror.sync(smirror, path)
         if tmirror.output_format == FORMAT_JSON:
-            print(json.dumps(tmirror.json_entries, indent=2, sort_keys=True,
-                             separators=(',', ': ')))
+            print(
+                json.dumps(
+                    tmirror.json_entries,
+                    indent=2,
+                    sort_keys=True,
+                    separators=(",", ": "),
+                )
+            )
     except IOError as e:
         if e.errno == errno.EPIPE:
             sys.exit(0x80 | signal.SIGPIPE)
         raise
 
 
-if __name__ == '__main__':
+if __name__ == "__main__":
     main()
 
 # vi: ts=4 expandtab syntax=python
```
```diff
diff --git a/bin/sstream-sync b/bin/sstream-sync
index d10b90f..4082400 100755
--- a/bin/sstream-sync
+++ b/bin/sstream-sync
@@ -16,18 +16,17 @@
 # You should have received a copy of the GNU Affero General Public License
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.
 
-from simplestreams import mirrors
-from simplestreams.mirrors import command_hook
-from simplestreams import log
-from simplestreams import util
-
 import argparse
 import errno
 import os
 import signal
 import sys
+
 import yaml
 
+from simplestreams import log, mirrors, util
+from simplestreams.mirrors import command_hook
+
 
 def which(program):
     def is_exe(fpath):
@@ -55,51 +54,90 @@ def main():
     parser = argparse.ArgumentParser()
     defhook = command_hook.DEFAULT_HOOK_NAME
 
-    hooks = [("--hook-%s" % hook.replace("_", "-"), hook, False)
-             for hook in command_hook.HOOK_NAMES]
-    hooks.append(('--hook', defhook, False,))
-
-    parser.add_argument('--config', '-c',
-                        help='read config file',
-                        type=argparse.FileType('rb'))
-
-    for (argname, cfgname, _required) in hooks:
+    hooks = [
+        ("--hook-%s" % hook.replace("_", "-"), hook, False)
+        for hook in command_hook.HOOK_NAMES
+    ]
+    hooks.append(
+        (
+            "--hook",
+            defhook,
+            False,
+        )
+    )
+
+    parser.add_argument(
+        "--config", "-c", help="read config file", type=argparse.FileType("rb")
+    )
+
+    for argname, cfgname, _required in hooks:
         parser.add_argument(argname, dest=cfgname, required=False)
 
-    parser.add_argument('--keep', action='store_true', default=False,
-                        dest='keep_items',
-                        help='keep items in target up to MAX items '
-                        'even after they have fallen out of the source')
-    parser.add_argument('--max', type=int, default=None, dest='max_items',
-                        help='store at most MAX items in the target')
-    parser.add_argument('--item-skip-download', action='store_true',
-                        default=False,
-                        help='Do not download items that are to be inserted.')
-    parser.add_argument('--delete', action='store_true', default=False,
-                        dest='delete_filtered_items',
-                        help='remove filtered items from the target')
-    parser.add_argument('--path', default=None,
-                        help='sync from index or products file in mirror')
-
-    parser.add_argument('--verbose', '-v', action='count', default=0)
-    parser.add_argument('--log-file', default=sys.stderr,
-                        type=argparse.FileType('w'))
-
-    parser.add_argument('--keyring', action='store', default=None,
-                        help='keyring to be specified to gpg via --keyring')
-    parser.add_argument('--no-verify', '-U', action='store_false',
-                        dest='verify', default=True,
-                        help="do not gpg check signed json files")
-
-    parser.add_argument('mirror_url')
+    parser.add_argument(
+        "--keep",
+        action="store_true",
+        default=False,
+        dest="keep_items",
+        help="keep items in target up to MAX items "
+        "even after they have fallen out of the source",
+    )
+    parser.add_argument(
+        "--max",
+        type=int,
+        default=None,
+        dest="max_items",
+        help="store at most MAX items in the target",
+    )
+    parser.add_argument(
+        "--item-skip-download",
+        action="store_true",
+        default=False,
+        help="Do not download items that are to be inserted.",
+    )
+    parser.add_argument(
+        "--delete",
+        action="store_true",
+        default=False,
+        dest="delete_filtered_items",
+        help="remove filtered items from the target",
+    )
+    parser.add_argument(
+        "--path",
+        default=None,
+        help="sync from index or products file in mirror",
+    )
+
+    parser.add_argument("--verbose", "-v", action="count", default=0)
+    parser.add_argument(
+        "--log-file", default=sys.stderr, type=argparse.FileType("w")
+    )
+
+    parser.add_argument(
+        "--keyring",
+        action="store",
+        default=None,
+        help="keyring to be specified to gpg via --keyring",
+    )
+    parser.add_argument(
+        "--no-verify",
+        "-U",
+        action="store_false",
+        dest="verify",
+        default=True,
+        help="do not gpg check signed json files",
+    )
+
+    parser.add_argument("mirror_url")
     cmdargs = parser.parse_args()
 
-    known_cfg = [('--item-skip-download', 'item_skip_download', False),
-                 ('--max', 'max_items', False),
-                 ('--keep', 'keep_items', False),
-                 ('--delete', 'delete_filtered_items', False),
-                 ('mirror_url', 'mirror_url', True),
-                 ('--path', 'path', True)]
+    known_cfg = [
+        ("--item-skip-download", "item_skip_download", False),
+        ("--max", "max_items", False),
+        ("--keep", "keep_items", False),
+        ("--delete", "delete_filtered_items", False),
+        ("mirror_url", "mirror_url", True),
+        ("--path", "path", True),
+    ]
     known_cfg.extend(hooks)
 
     cfg = {}
@@ -116,31 +154,41 @@ def main():
     missing = []
     fallback = cfg.get(defhook, getattr(cmdargs, defhook, None))
 
-    for (argname, cfgname, _required) in known_cfg:
+    for argname, cfgname, _required in known_cfg:
         val = getattr(cmdargs, cfgname)
         if val is not None:
             cfg[cfgname] = val
             if val == "":
                 cfg[cfgname] = None
 
-        if ((cfgname in command_hook.HOOK_NAMES or cfgname == defhook) and
-                cfg.get(cfgname) is not None):
+        if (
+            cfgname in command_hook.HOOK_NAMES or cfgname == defhook
+        ) and cfg.get(cfgname) is not None:
             if which(cfg[cfgname]) is None:
                 msg = "invalid input for %s. '%s' is not executable\n"
                 sys.stderr.write(msg % (argname, val))
                 sys.exit(1)
 
-        if (cfgname in command_hook.REQUIRED_FIELDS and
-                cfg.get(cfgname) is None and not fallback):
-            missing.append((argname, cfgname,))
+        if (
+            cfgname in command_hook.REQUIRED_FIELDS
+            and cfg.get(cfgname) is None
```
1106 | 175 | and not fallback | ||
1107 | 176 | ): | ||
1108 | 177 | missing.append( | ||
1109 | 178 | ( | ||
1110 | 179 | argname, | ||
1111 | 180 | cfgname, | ||
1112 | 181 | ) | ||
1113 | 182 | ) | ||
1114 | 136 | 183 | ||
1115 | 137 | pfm = util.path_from_mirror_url | 184 | pfm = util.path_from_mirror_url |
1117 | 138 | (cfg['mirror_url'], cfg['path']) = pfm(cfg['mirror_url'], cfg.get('path')) | 185 | (cfg["mirror_url"], cfg["path"]) = pfm(cfg["mirror_url"], cfg.get("path")) |
1118 | 139 | 186 | ||
1119 | 140 | if missing: | 187 | if missing: |
1123 | 141 | sys.stderr.write("must provide input for (--hook/%s for default):\n" | 188 | sys.stderr.write( |
1124 | 142 | % defhook) | 189 | "must provide input for (--hook/%s for default):\n" % defhook |
1125 | 143 | for (flag, cfg) in missing: | 190 | ) |
1126 | 191 | for flag, cfg in missing: | ||
1127 | 144 | sys.stderr.write(" cmdline '%s' or cfgname '%s'\n" % (flag, cfg)) | 192 | sys.stderr.write(" cmdline '%s' or cfgname '%s'\n" % (flag, cfg)) |
1128 | 145 | sys.exit(1) | 193 | sys.exit(1) |
1129 | 146 | 194 | ||
1130 | @@ -148,24 +196,24 @@ def main(): | |||
1131 | 148 | log.basicConfig(stream=cmdargs.log_file, level=level) | 196 | log.basicConfig(stream=cmdargs.log_file, level=level) |
1132 | 149 | 197 | ||
1133 | 150 | def policy(content, path): | 198 | def policy(content, path): |
1138 | 151 | if cfg['path'].endswith('sjson'): | 199 | if cfg["path"].endswith("sjson"): |
1139 | 152 | return util.read_signed(content, | 200 | return util.read_signed( |
1140 | 153 | keyring=cmdargs.keyring, | 201 | content, keyring=cmdargs.keyring, checked=cmdargs.verify |
1141 | 154 | checked=cmdargs.verify) | 202 | ) |
1142 | 155 | else: | 203 | else: |
1143 | 156 | return content | 204 | return content |
1144 | 157 | 205 | ||
1146 | 158 | smirror = mirrors.UrlMirrorReader(cfg['mirror_url'], policy=policy) | 206 | smirror = mirrors.UrlMirrorReader(cfg["mirror_url"], policy=policy) |
1147 | 159 | tmirror = command_hook.CommandHookMirror(config=cfg) | 207 | tmirror = command_hook.CommandHookMirror(config=cfg) |
1148 | 160 | try: | 208 | try: |
1150 | 161 | tmirror.sync(smirror, cfg['path']) | 209 | tmirror.sync(smirror, cfg["path"]) |
1151 | 162 | except IOError as e: | 210 | except IOError as e: |
1152 | 163 | if e.errno == errno.EPIPE: | 211 | if e.errno == errno.EPIPE: |
1153 | 164 | sys.exit(0x80 | signal.SIGPIPE) | 212 | sys.exit(0x80 | signal.SIGPIPE) |
1154 | 165 | raise | 213 | raise |
1155 | 166 | 214 | ||
1156 | 167 | 215 | ||
1158 | 168 | if __name__ == '__main__': | 216 | if __name__ == "__main__": |
1159 | 169 | main() | 217 | main() |
1160 | 170 | 218 | ||
1161 | 171 | # vi: ts=4 expandtab syntax=python | 219 | # vi: ts=4 expandtab syntax=python |
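The sstream-sync churn above is almost entirely two mechanical black rules: any call that no longer fits in 79 columns is exploded to one argument per line with a trailing comma, and single quotes are normalized to double quotes. A minimal sketch of the pattern follows; the --example option is illustrative only, not one of sstream-sync's real flags:

    import argparse

    parser = argparse.ArgumentParser()

    # Hand-wrapped style, roughly as it looked before black:
    #   parser.add_argument('--example', action='store_true', default=False,
    #                       help='an illustrative help string long enough '
    #                            'to push the call past 79 columns')

    # black (line-length 79) explodes the call, adds a trailing comma,
    # and normalizes quotes; implicit string concatenation is kept,
    # with the continuation fragment at argument indent:
    parser.add_argument(
        "--example",
        action="store_true",
        default=False,
        help="an illustrative help string long enough "
        "to push the call past 79 columns",
    )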
1162 | diff --git a/pyproject.toml b/pyproject.toml | |||
1163 | 172 | new file mode 100644 | 220 | new file mode 100644 |
1164 | index 0000000..d84cc51 | |||
1165 | --- /dev/null | |||
1166 | +++ b/pyproject.toml | |||
1167 | @@ -0,0 +1,6 @@ | |||
1168 | 1 | [tool.black] | ||
1169 | 2 | line-length = 79 | ||
1170 | 3 | |||
1171 | 4 | [tool.isort] | ||
1172 | 5 | profile = "black" | ||
1173 | 6 | line_length = 79 | ||
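These six lines are the entire formatter configuration. The line length is pinned to 79 rather than black's default of 88, so the output still satisfies pycodestyle's default limit, and isort's black profile keeps the two tools from fighting over import layout. A sketch of the ordering that profile produces, reusing the modules setup.py imports just below:

    # A plausible unsorted block, before this change:
    #   from setuptools import setup
    #   from glob import glob
    #   import os

    # After `isort --profile black`: standard-library imports first,
    # third-party packages second, one blank line between groups, and
    # plain `import x` statements ahead of `from x import y` within a
    # group.
    import os
    from glob import glob

    from setuptools import setup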
1174 | diff --git a/setup.py b/setup.py | |||
1175 | index 6b5a29b..301721b 100644 | |||
1176 | --- a/setup.py | |||
1177 | +++ b/setup.py | |||
1178 | @@ -1,8 +1,9 @@ | |||
1179 | 1 | from setuptools import setup | ||
1180 | 2 | from glob import glob | ||
1181 | 3 | import os | 1 | import os |
1182 | 2 | from glob import glob | ||
1183 | 3 | |||
1184 | 4 | from setuptools import setup | ||
1185 | 4 | 5 | ||
1187 | 5 | VERSION = '0.1.0' | 6 | VERSION = "0.1.0" |
1188 | 6 | 7 | ||
1189 | 7 | 8 | ||
1190 | 8 | def is_f(p): | 9 | def is_f(p): |
1191 | @@ -11,18 +12,20 @@ def is_f(p): | |||
1192 | 11 | 12 | ||
1193 | 12 | setup( | 13 | setup( |
1194 | 13 | name="python-simplestreams", | 14 | name="python-simplestreams", |
1196 | 14 | description='Library and tools for using Simple Streams data', | 15 | description="Library and tools for using Simple Streams data", |
1197 | 15 | version=VERSION, | 16 | version=VERSION, |
1200 | 16 | author='Scott Moser', | 17 | author="Scott Moser", |
1201 | 17 | author_email='scott.moser@canonical.com', | 18 | author_email="scott.moser@canonical.com", |
1202 | 18 | license="AGPL", | 19 | license="AGPL", |
1207 | 19 | url='http://launchpad.net/simplestreams/', | 20 | url="http://launchpad.net/simplestreams/", |
1208 | 20 | packages=['simplestreams', 'simplestreams.mirrors', | 21 | packages=[ |
1209 | 21 | 'simplestreams.objectstores'], | 22 | "simplestreams", |
1210 | 22 | scripts=glob('bin/*'), | 23 | "simplestreams.mirrors", |
1211 | 24 | "simplestreams.objectstores", | ||
1212 | 25 | ], | ||
1213 | 26 | scripts=glob("bin/*"), | ||
1214 | 23 | data_files=[ | 27 | data_files=[ |
1219 | 24 | ('lib/simplestreams', glob('tools/hook-*')), | 28 | ("lib/simplestreams", glob("tools/hook-*")), |
1220 | 25 | ('share/doc/simplestreams', | 29 | ("share/doc/simplestreams", [f for f in glob("doc/*") if is_f(f)]), |
1221 | 26 | [f for f in glob('doc/*') if is_f(f)]), | 30 | ], |
1218 | 27 | ] | ||
1222 | 28 | ) | 31 | ) |
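One detail worth knowing about the exploded packages list above: the trailing comma black leaves behind is load-bearing. Black treats a pre-existing trailing comma inside brackets as a request to keep the collection one element per line even when it would fit on one. A small sketch with illustrative values:

    # Without a trailing comma the list fits in 79 columns, so black
    # collapses it onto a single line:
    short = ["simplestreams", "mirrors"]

    # A pre-existing ("magic") trailing comma keeps it exploded
    # across reformats:
    exploded = [
        "simplestreams",
        "mirrors",
    ]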
1223 | diff --git a/simplestreams/checksum_util.py b/simplestreams/checksum_util.py | |||
1224 | index dc695e3..bc98161 100644 | |||
1225 | --- a/simplestreams/checksum_util.py | |||
1226 | +++ b/simplestreams/checksum_util.py | |||
1227 | @@ -20,7 +20,7 @@ import hashlib | |||
1228 | 20 | CHECKSUMS = ("md5", "sha256", "sha512") | 20 | CHECKSUMS = ("md5", "sha256", "sha512") |
1229 | 21 | 21 | ||
1230 | 22 | try: | 22 | try: |
1232 | 23 | ALGORITHMS = list(getattr(hashlib, 'algorithms')) | 23 | ALGORITHMS = list(getattr(hashlib, "algorithms")) |
1233 | 24 | except AttributeError: | 24 | except AttributeError: |
1234 | 25 | ALGORITHMS = list(hashlib.algorithms_available) | 25 | ALGORITHMS = list(hashlib.algorithms_available) |
1235 | 26 | 26 | ||
1236 | @@ -56,11 +56,13 @@ class checksummer(object): | |||
1237 | 56 | return self._hasher.hexdigest() | 56 | return self._hasher.hexdigest() |
1238 | 57 | 57 | ||
1239 | 58 | def check(self): | 58 | def check(self): |
1241 | 59 | return (self.expected is None or self.expected == self.hexdigest()) | 59 | return self.expected is None or self.expected == self.hexdigest() |
1242 | 60 | 60 | ||
1243 | 61 | def __str__(self): | 61 | def __str__(self): |
1246 | 62 | return ("checksummer (algorithm=%s expected=%s)" % | 62 | return "checksummer (algorithm=%s expected=%s)" % ( |
1247 | 63 | (self.algorithm, self.expected)) | 63 | self.algorithm, |
1248 | 64 | self.expected, | ||
1249 | 65 | ) | ||
1250 | 64 | 66 | ||
1251 | 65 | 67 | ||
1252 | 66 | def item_checksums(item): | 68 | def item_checksums(item): |
1253 | @@ -69,14 +71,16 @@ def item_checksums(item): | |||
1254 | 69 | 71 | ||
1255 | 70 | class SafeCheckSummer(checksummer): | 72 | class SafeCheckSummer(checksummer): |
1256 | 71 | """SafeCheckSummer raises ValueError if checksums are not provided.""" | 73 | """SafeCheckSummer raises ValueError if checksums are not provided.""" |
1257 | 74 | |||
1258 | 72 | def __init__(self, checksums, allowed=None): | 75 | def __init__(self, checksums, allowed=None): |
1259 | 73 | if allowed is None: | 76 | if allowed is None: |
1260 | 74 | allowed = CHECKSUMS | 77 | allowed = CHECKSUMS |
1261 | 75 | super(SafeCheckSummer, self).__init__(checksums) | 78 | super(SafeCheckSummer, self).__init__(checksums) |
1262 | 76 | if self.algorithm not in allowed: | 79 | if self.algorithm not in allowed: |
1263 | 77 | raise ValueError( | 80 | raise ValueError( |
1266 | 78 | "provided checksums (%s) did not include any allowed (%s)" % | 81 | "provided checksums (%s) did not include any allowed (%s)" |
1267 | 79 | (checksums, allowed)) | 82 | % (checksums, allowed) |
1268 | 83 | ) | ||
1269 | 80 | 84 | ||
1270 | 81 | 85 | ||
1271 | 82 | class InvalidChecksum(ValueError): | 86 | class InvalidChecksum(ValueError): |
1272 | @@ -93,18 +97,31 @@ class InvalidChecksum(ValueError): | |||
1273 | 93 | if not isinstance(self.expected_size, int): | 97 | if not isinstance(self.expected_size, int): |
1274 | 94 | msg = "Invalid size '%s' at %s." % (self.expected_size, self.path) | 98 | msg = "Invalid size '%s' at %s." % (self.expected_size, self.path) |
1275 | 95 | else: | 99 | else: |
1281 | 96 | msg = ("Invalid %s Checksum at %s. Found %s. Expected %s. " | 100 | msg = ( |
1282 | 97 | "read %s bytes expected %s bytes." % | 101 | "Invalid %s Checksum at %s. Found %s. Expected %s. " |
1283 | 98 | (self.cksum.algorithm, self.path, | 102 | "read %s bytes expected %s bytes." |
1284 | 99 | self.cksum.hexdigest(), self.cksum.expected, | 103 | % ( |
1285 | 100 | self.size, self.expected_size)) | 104 | self.cksum.algorithm, |
1286 | 105 | self.path, | ||
1287 | 106 | self.cksum.hexdigest(), | ||
1288 | 107 | self.cksum.expected, | ||
1289 | 108 | self.size, | ||
1290 | 109 | self.expected_size, | ||
1291 | 110 | ) | ||
1292 | 111 | ) | ||
1293 | 101 | if self.size: | 112 | if self.size: |
1296 | 102 | msg += (" (size %s expected %s)" % | 113 | msg += " (size %s expected %s)" % ( |
1297 | 103 | (self.size, self.expected_size)) | 114 | self.size, |
1298 | 115 | self.expected_size, | ||
1299 | 116 | ) | ||
1300 | 104 | return msg | 117 | return msg |
1301 | 105 | 118 | ||
1302 | 106 | 119 | ||
1303 | 107 | def invalid_checksum_for_reader(reader, msg=None): | 120 | def invalid_checksum_for_reader(reader, msg=None): |
1307 | 108 | return InvalidChecksum(path=reader.url, cksum=reader.checksummer, | 121 | return InvalidChecksum( |
1308 | 109 | size=reader.bytes_read, expected_size=reader.size, | 122 | path=reader.url, |
1309 | 110 | msg=msg) | 123 | cksum=reader.checksummer, |
1310 | 124 | size=reader.bytes_read, | ||
1311 | 125 | expected_size=reader.size, | ||
1312 | 126 | msg=msg, | ||
1313 | 127 | ) | ||
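A pattern that recurs in checksum_util.py and throughout the branch: when a binary expression overflows the line, black breaks before the operator, so every wrapped %-formatting expression now leads its continuation line with % instead of leaving it dangling at the end of the previous one. A standalone sketch of the rule, assuming black's usual treatment of an overflowing assignment:

    checksums = ("md5", "sha1")
    allowed = ("sha256",)

    # Old hand-wrapped style left the operator trailing the line:
    #   msg = ("provided checksums (%s) did not include any allowed (%s)" %
    #          (checksums, allowed))

    # black moves the operator to the head of the continuation line:
    msg = (
        "provided checksums (%s) did not include any allowed (%s)"
        % (checksums, allowed)
    )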
1314 | diff --git a/simplestreams/contentsource.py b/simplestreams/contentsource.py | |||
1315 | index ce45097..e5c6c98 100644 | |||
1316 | --- a/simplestreams/contentsource.py | |||
1317 | +++ b/simplestreams/contentsource.py | |||
1318 | @@ -23,12 +23,13 @@ import sys | |||
1319 | 23 | from . import checksum_util | 23 | from . import checksum_util |
1320 | 24 | 24 | ||
1321 | 25 | if sys.version_info > (3, 0): | 25 | if sys.version_info > (3, 0): |
1322 | 26 | import urllib.error as urllib_error | ||
1323 | 26 | import urllib.parse as urlparse | 27 | import urllib.parse as urlparse |
1324 | 27 | import urllib.request as urllib_request | 28 | import urllib.request as urllib_request |
1325 | 28 | import urllib.error as urllib_error | ||
1326 | 29 | else: | 29 | else: |
1327 | 30 | import urlparse | ||
1328 | 31 | import urllib2 as urllib_request | 30 | import urllib2 as urllib_request |
1329 | 31 | import urlparse | ||
1330 | 32 | |||
1331 | 32 | urllib_error = urllib_request | 33 | urllib_error = urllib_request |
1332 | 33 | 34 | ||
1333 | 34 | READ_BUFFER_SIZE = 1024 * 10 | 35 | READ_BUFFER_SIZE = 1024 * 10 |
1334 | @@ -38,12 +39,14 @@ try: | |||
1335 | 38 | # We try to use requests because we can do gzip encoding with it. | 39 | # We try to use requests because we can do gzip encoding with it. |
1336 | 39 | # however requests < 1.1 didn't have 'stream' argument to 'get' | 40 | # however requests < 1.1 didn't have 'stream' argument to 'get' |
1337 | 40 | # making it completely unsuitable for downloading large files. | 41 | # making it completely unsuitable for downloading large files. |
1338 | 41 | import requests | ||
1339 | 42 | from distutils.version import LooseVersion | 42 | from distutils.version import LooseVersion |
1340 | 43 | |||
1341 | 43 | import pkg_resources | 44 | import pkg_resources |
1343 | 44 | _REQ = pkg_resources.get_distribution('requests') | 45 | import requests |
1344 | 46 | |||
1345 | 47 | _REQ = pkg_resources.get_distribution("requests") | ||
1346 | 45 | _REQ_VER = LooseVersion(_REQ.version) | 48 | _REQ_VER = LooseVersion(_REQ.version) |
1348 | 46 | if _REQ_VER < LooseVersion('1.1'): | 49 | if _REQ_VER < LooseVersion("1.1"): |
1349 | 47 | raise ImportError("Requests version < 1.1, not suitable for usage.") | 50 | raise ImportError("Requests version < 1.1, not suitable for usage.") |
1350 | 48 | URL_READER_CLASSNAME = "RequestsUrlReader" | 51 | URL_READER_CLASSNAME = "RequestsUrlReader" |
1351 | 49 | except ImportError: | 52 | except ImportError: |
1352 | @@ -63,8 +66,8 @@ class ContentSource(object): | |||
1353 | 63 | raise NotImplementedError() | 66 | raise NotImplementedError() |
1354 | 64 | 67 | ||
1355 | 65 | def set_start_pos(self, offset): | 68 | def set_start_pos(self, offset): |
1358 | 66 | """ Implemented if the ContentSource supports seeking within content. | 69 | """Implemented if the ContentSource supports seeking within content. |
1359 | 67 | Used to resume failed transfers. """ | 70 | Used to resume failed transfers.""" |
1360 | 68 | 71 | ||
1361 | 69 | class SetStartPosNotImplementedError(NotImplementedError): | 72 | class SetStartPosNotImplementedError(NotImplementedError): |
1362 | 70 | pass | 73 | pass |
1363 | @@ -122,8 +125,9 @@ class UrlContentSource(ContentSource): | |||
1364 | 122 | if e.errno != errno.ENOENT: | 125 | if e.errno != errno.ENOENT: |
1365 | 123 | raise | 126 | raise |
1366 | 124 | continue | 127 | continue |
1369 | 125 | myerr = IOError("Unable to open %s. mirrors=%s" % | 128 | myerr = IOError( |
1370 | 126 | (self.input_url, self.mirrors)) | 129 | "Unable to open %s. mirrors=%s" % (self.input_url, self.mirrors) |
1371 | 130 | ) | ||
1372 | 127 | myerr.errno = errno.ENOENT | 131 | myerr.errno = errno.ENOENT |
1373 | 128 | raise myerr | 132 | raise myerr |
1374 | 129 | 133 | ||
1375 | @@ -181,7 +185,7 @@ class IteratorContentSource(ContentSource): | |||
1376 | 181 | raise exc | 185 | raise exc |
1377 | 182 | 186 | ||
1378 | 183 | def is_enoent(self, exc): | 187 | def is_enoent(self, exc): |
1380 | 184 | return (isinstance(exc, IOError) and exc.errno == errno.ENOENT) | 188 | return isinstance(exc, IOError) and exc.errno == errno.ENOENT |
1381 | 185 | 189 | ||
1382 | 186 | def read(self, size=None): | 190 | def read(self, size=None): |
1383 | 187 | self.open() | 191 | self.open() |
1384 | @@ -189,7 +193,7 @@ class IteratorContentSource(ContentSource): | |||
1385 | 189 | if self.consumed: | 193 | if self.consumed: |
1386 | 190 | return bytes() | 194 | return bytes() |
1387 | 191 | 195 | ||
1389 | 192 | if (size is None or size < 0): | 196 | if size is None or size < 0: |
1390 | 193 | # read everything | 197 | # read everything |
1391 | 194 | ret = self.leftover | 198 | ret = self.leftover |
1392 | 195 | self.leftover = bytes() | 199 | self.leftover = bytes() |
1393 | @@ -227,7 +231,7 @@ class IteratorContentSource(ContentSource): | |||
1394 | 227 | class MemoryContentSource(FdContentSource): | 231 | class MemoryContentSource(FdContentSource): |
1395 | 228 | def __init__(self, url=None, content=""): | 232 | def __init__(self, url=None, content=""): |
1396 | 229 | if isinstance(content, str): | 233 | if isinstance(content, str): |
1398 | 230 | content = content.encode('utf-8') | 234 | content = content.encode("utf-8") |
1399 | 231 | fd = io.BytesIO(content) | 235 | fd = io.BytesIO(content) |
1400 | 232 | if url is None: | 236 | if url is None: |
1401 | 233 | url = "MemoryContentSource://undefined" | 237 | url = "MemoryContentSource://undefined" |
1402 | @@ -265,8 +269,10 @@ class ChecksummingContentSource(ContentSource): | |||
1403 | 265 | 269 | ||
1404 | 266 | def _set_checksummer(self, checksummer): | 270 | def _set_checksummer(self, checksummer): |
1405 | 267 | if checksummer.algorithm not in checksum_util.CHECKSUMS: | 271 | if checksummer.algorithm not in checksum_util.CHECKSUMS: |
1408 | 268 | raise ValueError("algorithm %s is not valid (%s)" % | 272 | raise ValueError( |
1409 | 269 | (checksummer.algorithm, checksum_util.CHECKSUMS)) | 273 | "algorithm %s is not valid (%s)" |
1410 | 274 | % (checksummer.algorithm, checksum_util.CHECKSUMS) | ||
1411 | 275 | ) | ||
1412 | 270 | self.checksummer = checksummer | 276 | self.checksummer = checksummer |
1413 | 271 | 277 | ||
1414 | 272 | def check(self): | 278 | def check(self): |
1415 | @@ -322,7 +328,6 @@ class FileReader(UrlReader): | |||
1416 | 322 | 328 | ||
1417 | 323 | 329 | ||
1418 | 324 | class Urllib2UrlReader(UrlReader): | 330 | class Urllib2UrlReader(UrlReader): |
1419 | 325 | |||
1420 | 326 | timeout = TIMEOUT | 331 | timeout = TIMEOUT |
1421 | 327 | 332 | ||
1422 | 328 | def __init__(self, url, offset=None, user_agent=None): | 333 | def __init__(self, url, offset=None, user_agent=None): |
1423 | @@ -339,9 +344,9 @@ class Urllib2UrlReader(UrlReader): | |||
1424 | 339 | try: | 344 | try: |
1425 | 340 | req = urllib_request.Request(url) | 345 | req = urllib_request.Request(url) |
1426 | 341 | if user_agent is not None: | 346 | if user_agent is not None: |
1428 | 342 | req.add_header('User-Agent', user_agent) | 347 | req.add_header("User-Agent", user_agent) |
1429 | 343 | if offset is not None: | 348 | if offset is not None: |
1431 | 344 | req.add_header('Range', 'bytes=%d-' % offset) | 349 | req.add_header("Range", "bytes=%d-" % offset) |
1432 | 345 | self.req = opener(req, timeout=self.timeout) | 350 | self.req = opener(req, timeout=self.timeout) |
1433 | 346 | except urllib_error.HTTPError as e: | 351 | except urllib_error.HTTPError as e: |
1434 | 347 | if e.code == 404: | 352 | if e.code == 404: |
1435 | @@ -373,8 +378,10 @@ class RequestsUrlReader(UrlReader): | |||
1436 | 373 | 378 | ||
1437 | 374 | def __init__(self, url, buflen=None, offset=None, user_agent=None): | 379 | def __init__(self, url, buflen=None, offset=None, user_agent=None): |
1438 | 375 | if requests is None: | 380 | if requests is None: |
1441 | 376 | raise ImportError("Attempt to use RequestsUrlReader " | 381 | raise ImportError( |
1442 | 377 | "without suitable requests library.") | 382 | "Attempt to use RequestsUrlReader " |
1443 | 383 | "without suitable requests library." | ||
1444 | 384 | ) | ||
1445 | 378 | self.url = url | 385 | self.url = url |
1446 | 379 | (url, user, password) = parse_url_auth(url) | 386 | (url, user, password) = parse_url_auth(url) |
1447 | 380 | if user is None: | 387 | if user is None: |
1448 | @@ -384,15 +391,15 @@ class RequestsUrlReader(UrlReader): | |||
1449 | 384 | 391 | ||
1450 | 385 | headers = {} | 392 | headers = {} |
1451 | 386 | if user_agent is not None: | 393 | if user_agent is not None: |
1453 | 387 | headers['User-Agent'] = user_agent | 394 | headers["User-Agent"] = user_agent |
1454 | 388 | if offset is not None: | 395 | if offset is not None: |
1456 | 389 | headers['Range'] = 'bytes=%d-' % offset | 396 | headers["Range"] = "bytes=%d-" % offset |
1457 | 390 | if headers == {}: | 397 | if headers == {}: |
1458 | 391 | headers = None | 398 | headers = None |
1459 | 392 | 399 | ||
1460 | 393 | # requests version less than 2.4.1 takes an optional | 400 | # requests version less than 2.4.1 takes an optional |
1461 | 394 | # float for timeout. There is no separate read timeout | 401 | # float for timeout. There is no separate read timeout |
1463 | 395 | if _REQ_VER < LooseVersion('2.4.1'): | 402 | if _REQ_VER < LooseVersion("2.4.1"): |
1464 | 396 | self.timeout = TIMEOUT | 403 | self.timeout = TIMEOUT |
1465 | 397 | 404 | ||
1466 | 398 | self.req = requests.get( | 405 | self.req = requests.get( |
1467 | @@ -405,13 +412,13 @@ class RequestsUrlReader(UrlReader): | |||
1468 | 405 | self.leftover = bytes() | 412 | self.leftover = bytes() |
1469 | 406 | self.consumed = False | 413 | self.consumed = False |
1470 | 407 | 414 | ||
1472 | 408 | if (self.req.status_code == requests.codes.NOT_FOUND): | 415 | if self.req.status_code == requests.codes.NOT_FOUND: |
1473 | 409 | myerr = IOError("Unable to open %s" % url) | 416 | myerr = IOError("Unable to open %s" % url) |
1474 | 410 | myerr.errno = errno.ENOENT | 417 | myerr.errno = errno.ENOENT |
1475 | 411 | raise myerr | 418 | raise myerr |
1476 | 412 | 419 | ||
1479 | 413 | ce = self.req.headers.get('content-encoding', '').lower() | 420 | ce = self.req.headers.get("content-encoding", "").lower() |
1480 | 414 | if 'gzip' in ce or 'deflate' in ce: | 421 | if "gzip" in ce or "deflate" in ce: |
1481 | 415 | self._read = self.read_compressed | 422 | self._read = self.read_compressed |
1482 | 416 | else: | 423 | else: |
1483 | 417 | self._read = self.read_raw | 424 | self._read = self.read_raw |
1484 | @@ -426,7 +433,7 @@ class RequestsUrlReader(UrlReader): | |||
1485 | 426 | if self.consumed: | 433 | if self.consumed: |
1486 | 427 | return bytes() | 434 | return bytes() |
1487 | 428 | 435 | ||
1489 | 429 | if (size is None or size < 0): | 436 | if size is None or size < 0: |
1490 | 430 | # read everything | 437 | # read everything |
1491 | 431 | ret = self.leftover | 438 | ret = self.leftover |
1492 | 432 | self.leftover = bytes() | 439 | self.leftover = bytes() |
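Besides the wrapping changes, contentsource.py shows black stripping parentheses that add no grouping, as in is_enoent and the two read methods above. A self-contained sketch of the same cleanup:

    import errno


    def is_enoent(exc):
        # Before black: return (isinstance(exc, IOError) and
        #                       exc.errno == errno.ENOENT)
        # The parentheses add no grouping, so black drops them, as it
        # also does around bare `if (cond):` tests elsewhere in this
        # file:
        return isinstance(exc, IOError) and exc.errno == errno.ENOENT


    print(is_enoent(IOError(errno.ENOENT, "missing")))  # True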
1493 | diff --git a/simplestreams/filters.py b/simplestreams/filters.py | |||
1494 | index 3818949..330a475 100644 | |||
1495 | --- a/simplestreams/filters.py | |||
1496 | +++ b/simplestreams/filters.py | |||
1497 | @@ -15,10 +15,10 @@ | |||
1498 | 15 | # You should have received a copy of the GNU Affero General Public License | 15 | # You should have received a copy of the GNU Affero General Public License |
1499 | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. |
1500 | 17 | 17 | ||
1501 | 18 | from simplestreams import util | ||
1502 | 19 | |||
1503 | 20 | import re | 18 | import re |
1504 | 21 | 19 | ||
1505 | 20 | from simplestreams import util | ||
1506 | 21 | |||
1507 | 22 | 22 | ||
1508 | 23 | class ItemFilter(object): | 23 | class ItemFilter(object): |
1509 | 24 | def __init__(self, content, noneval=""): | 24 | def __init__(self, content, noneval=""): |
1510 | @@ -37,7 +37,7 @@ class ItemFilter(object): | |||
1511 | 37 | else: | 37 | else: |
1512 | 38 | raise ValueError("Bad parsing of %s" % content) | 38 | raise ValueError("Bad parsing of %s" % content) |
1513 | 39 | 39 | ||
1515 | 40 | self.negator = (op[0] != "!") | 40 | self.negator = op[0] != "!" |
1516 | 41 | self.op = op | 41 | self.op = op |
1517 | 42 | self.key = key | 42 | self.key = key |
1518 | 43 | self.value = val | 43 | self.value = val |
1519 | @@ -45,15 +45,19 @@ class ItemFilter(object): | |||
1520 | 45 | self.noneval = noneval | 45 | self.noneval = noneval |
1521 | 46 | 46 | ||
1522 | 47 | def __str__(self): | 47 | def __str__(self): |
1525 | 48 | return "%s %s %s [none=%s]" % (self.key, self.op, | 48 | return "%s %s %s [none=%s]" % ( |
1526 | 49 | self.value, self.noneval) | 49 | self.key, |
1527 | 50 | self.op, | ||
1528 | 51 | self.value, | ||
1529 | 52 | self.noneval, | ||
1530 | 53 | ) | ||
1531 | 50 | 54 | ||
1532 | 51 | def __repr__(self): | 55 | def __repr__(self): |
1533 | 52 | return self.__str__() | 56 | return self.__str__() |
1534 | 53 | 57 | ||
1535 | 54 | def matches(self, item): | 58 | def matches(self, item): |
1536 | 55 | val = str(item.get(self.key, self.noneval)) | 59 | val = str(item.get(self.key, self.noneval)) |
1538 | 56 | return (self.negator == bool(self._matches(val))) | 60 | return self.negator == bool(self._matches(val)) |
1539 | 57 | 61 | ||
1540 | 58 | 62 | ||
1541 | 59 | def get_filters(filters, noneval=""): | 63 | def get_filters(filters, noneval=""): |
1542 | diff --git a/simplestreams/generate_simplestreams.py b/simplestreams/generate_simplestreams.py | |||
1543 | index 9b7b919..329a91d 100644 | |||
1544 | --- a/simplestreams/generate_simplestreams.py | |||
1545 | +++ b/simplestreams/generate_simplestreams.py | |||
1546 | @@ -13,54 +13,58 @@ | |||
1547 | 13 | # or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public | 13 | # or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public |
1548 | 14 | # License for more details. | 14 | # License for more details. |
1549 | 15 | # | 15 | # |
1550 | 16 | from collections import namedtuple | ||
1551 | 17 | from copy import deepcopy | ||
1552 | 18 | import json | 16 | import json |
1553 | 19 | import os | 17 | import os |
1554 | 20 | import sys | 18 | import sys |
1555 | 19 | from collections import namedtuple | ||
1556 | 20 | from copy import deepcopy | ||
1557 | 21 | 21 | ||
1558 | 22 | from simplestreams import util | 22 | from simplestreams import util |
1559 | 23 | 23 | ||
1563 | 24 | 24 | Item = namedtuple( | |
1564 | 25 | Item = namedtuple('Item', ['content_id', 'product_name', 'version_name', | 25 | "Item", ["content_id", "product_name", "version_name", "item_name", "data"] |
1565 | 26 | 'item_name', 'data']) | 26 | ) |
1566 | 27 | 27 | ||
1567 | 28 | 28 | ||
1568 | 29 | def items2content_trees(itemslist, exdata): | 29 | def items2content_trees(itemslist, exdata): |
1569 | 30 | # input is a list with each item having: | 30 | # input is a list with each item having: |
1570 | 31 | # (content_id, product_name, version_name, item_name, {data}) | 31 | # (content_id, product_name, version_name, item_name, {data}) |
1571 | 32 | ctrees = {} | 32 | ctrees = {} |
1573 | 33 | for (content_id, prodname, vername, itemname, data) in itemslist: | 33 | for content_id, prodname, vername, itemname, data in itemslist: |
1574 | 34 | if content_id not in ctrees: | 34 | if content_id not in ctrees: |
1577 | 35 | ctrees[content_id] = {'content_id': content_id, | 35 | ctrees[content_id] = { |
1578 | 36 | 'format': 'products:1.0', 'products': {}} | 36 | "content_id": content_id, |
1579 | 37 | "format": "products:1.0", | ||
1580 | 38 | "products": {}, | ||
1581 | 39 | } | ||
1582 | 37 | ctrees[content_id].update(exdata) | 40 | ctrees[content_id].update(exdata) |
1583 | 38 | 41 | ||
1584 | 39 | ctree = ctrees[content_id] | 42 | ctree = ctrees[content_id] |
1587 | 40 | if prodname not in ctree['products']: | 43 | if prodname not in ctree["products"]: |
1588 | 41 | ctree['products'][prodname] = {'versions': {}} | 44 | ctree["products"][prodname] = {"versions": {}} |
1589 | 42 | 45 | ||
1593 | 43 | prodtree = ctree['products'][prodname] | 46 | prodtree = ctree["products"][prodname] |
1594 | 44 | if vername not in prodtree['versions']: | 47 | if vername not in prodtree["versions"]: |
1595 | 45 | prodtree['versions'][vername] = {'items': {}} | 48 | prodtree["versions"][vername] = {"items": {}} |
1596 | 46 | 49 | ||
1598 | 47 | vertree = prodtree['versions'][vername] | 50 | vertree = prodtree["versions"][vername] |
1599 | 48 | 51 | ||
1603 | 49 | if itemname in vertree['items']: | 52 | if itemname in vertree["items"]: |
1604 | 50 | raise ValueError("%s: already existed" % | 53 | raise ValueError( |
1605 | 51 | str([content_id, prodname, vername, itemname])) | 54 | "%s: already existed" |
1606 | 55 | % str([content_id, prodname, vername, itemname]) | ||
1607 | 56 | ) | ||
1608 | 52 | 57 | ||
1610 | 53 | vertree['items'][itemname] = data | 58 | vertree["items"][itemname] = data |
1611 | 54 | return ctrees | 59 | return ctrees |
1612 | 55 | 60 | ||
1613 | 56 | 61 | ||
1614 | 57 | class FileNamer: | 62 | class FileNamer: |
1617 | 58 | 63 | streamdir = "streams/v1" | |
1616 | 59 | streamdir = 'streams/v1' | ||
1618 | 60 | 64 | ||
1619 | 61 | @classmethod | 65 | @classmethod |
1620 | 62 | def get_index_path(cls): | 66 | def get_index_path(cls): |
1622 | 63 | return "%s/%s" % (cls.streamdir, 'index.json') | 67 | return "%s/%s" % (cls.streamdir, "index.json") |
1623 | 64 | 68 | ||
1624 | 65 | @classmethod | 69 | @classmethod |
1625 | 66 | def get_content_path(cls, content_id): | 70 | def get_content_path(cls, content_id): |
1626 | @@ -68,16 +72,16 @@ class FileNamer: | |||
1627 | 68 | 72 | ||
1628 | 69 | 73 | ||
1629 | 70 | def generate_index(trees, updated, namer): | 74 | def generate_index(trees, updated, namer): |
1632 | 71 | index = {"index": {}, 'format': 'index:1.0', 'updated': updated} | 75 | index = {"index": {}, "format": "index:1.0", "updated": updated} |
1633 | 72 | not_copied_up = ['content_id'] | 76 | not_copied_up = ["content_id"] |
1634 | 73 | for content_id, content in trees.items(): | 77 | for content_id, content in trees.items(): |
1638 | 74 | index['index'][content_id] = { | 78 | index["index"][content_id] = { |
1639 | 75 | 'products': sorted(list(content['products'].keys())), | 79 | "products": sorted(list(content["products"].keys())), |
1640 | 76 | 'path': namer.get_content_path(content_id), | 80 | "path": namer.get_content_path(content_id), |
1641 | 77 | } | 81 | } |
1642 | 78 | for k in util.stringitems(content): | 82 | for k in util.stringitems(content): |
1643 | 79 | if k not in not_copied_up: | 83 | if k not in not_copied_up: |
1645 | 80 | index['index'][content_id][k] = content[k] | 84 | index["index"][content_id][k] = content[k] |
1646 | 81 | return index | 85 | return index |
1647 | 82 | 86 | ||
1648 | 83 | 87 | ||
1649 | @@ -85,20 +89,29 @@ def write_streams(out_d, trees, updated, namer=None, condense=True): | |||
1650 | 85 | if namer is None: | 89 | if namer is None: |
1651 | 86 | namer = FileNamer | 90 | namer = FileNamer |
1652 | 87 | index = generate_index(trees, updated, namer) | 91 | index = generate_index(trees, updated, namer) |
1654 | 88 | to_write = [(namer.get_index_path(), index,)] | 92 | to_write = [ |
1655 | 93 | ( | ||
1656 | 94 | namer.get_index_path(), | ||
1657 | 95 | index, | ||
1658 | 96 | ) | ||
1659 | 97 | ] | ||
1660 | 89 | # Don't let products_condense modify the input | 98 | # Don't let products_condense modify the input |
1661 | 90 | trees = deepcopy(trees) | 99 | trees = deepcopy(trees) |
1662 | 91 | for content_id in trees: | 100 | for content_id in trees: |
1663 | 92 | if condense: | 101 | if condense: |
1669 | 93 | util.products_condense(trees[content_id], | 102 | util.products_condense( |
1670 | 94 | sticky=[ | 103 | trees[content_id], |
1671 | 95 | 'path', 'sha256', 'md5', | 104 | sticky=["path", "sha256", "md5", "size", "mirrors"], |
1672 | 96 | 'size', 'mirrors' | 105 | ) |
1668 | 97 | ]) | ||
1673 | 98 | content = trees[content_id] | 106 | content = trees[content_id] |
1675 | 99 | to_write.append((index['index'][content_id]['path'], content,)) | 107 | to_write.append( |
1676 | 108 | ( | ||
1677 | 109 | index["index"][content_id]["path"], | ||
1678 | 110 | content, | ||
1679 | 111 | ) | ||
1680 | 112 | ) | ||
1681 | 100 | out_filenames = [] | 113 | out_filenames = [] |
1683 | 101 | for (outfile, data) in to_write: | 114 | for outfile, data in to_write: |
1684 | 102 | filef = os.path.join(out_d, outfile) | 115 | filef = os.path.join(out_d, outfile) |
1685 | 103 | util.mkdir_p(os.path.dirname(filef)) | 116 | util.mkdir_p(os.path.dirname(filef)) |
1686 | 104 | json_dump(data, filef) | 117 | json_dump(data, filef) |
1687 | @@ -108,6 +121,8 @@ def write_streams(out_d, trees, updated, namer=None, condense=True): | |||
1688 | 108 | 121 | ||
1689 | 109 | def json_dump(data, filename): | 122 | def json_dump(data, filename): |
1690 | 110 | with open(filename, "w") as fp: | 123 | with open(filename, "w") as fp: |
1694 | 111 | sys.stderr.write(u"writing %s\n" % filename) | 124 | sys.stderr.write("writing %s\n" % filename) |
1695 | 112 | fp.write(json.dumps(data, indent=2, sort_keys=True, | 125 | fp.write( |
1696 | 113 | separators=(',', ': ')) + "\n") | 126 | json.dumps(data, indent=2, sort_keys=True, separators=(",", ": ")) |
1697 | 127 | + "\n" | ||
1698 | 128 | ) | ||
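generate_simplestreams.py also picks up two smaller rewrites black applies everywhere: redundant parentheses around tuple-unpacking targets in for loops are dropped, and the legacy u"" prefix on str literals goes away (the json_dump hunk above shows the latter). A runnable sketch of the loop change, with illustrative data:

    to_write = [("streams/v1/index.json", {"format": "index:1.0"})]

    # Before black: for (outfile, data) in to_write:
    # The parentheses around the unpacking target are removed:
    for outfile, data in to_write:
        print(outfile, data["format"])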
1699 | diff --git a/simplestreams/json2streams.py b/simplestreams/json2streams.py | |||
1700 | index e9749f6..20b09e7 100755 | |||
1701 | --- a/simplestreams/json2streams.py | |||
1702 | +++ b/simplestreams/json2streams.py | |||
1703 | @@ -1,43 +1,41 @@ | |||
1704 | 1 | #!/usr/bin/env python3 | 1 | #!/usr/bin/env python3 |
1705 | 2 | # Copyright (C) 2013, 2015 Canonical Ltd. | 2 | # Copyright (C) 2013, 2015 Canonical Ltd. |
1706 | 3 | 3 | ||
1707 | 4 | from argparse import ArgumentParser | ||
1708 | 5 | import json | 4 | import json |
1709 | 6 | import os | 5 | import os |
1710 | 7 | import sys | 6 | import sys |
1711 | 7 | from argparse import ArgumentParser | ||
1712 | 8 | 8 | ||
1713 | 9 | from simplestreams import util | 9 | from simplestreams import util |
1714 | 10 | |||
1715 | 11 | from simplestreams.generate_simplestreams import ( | 10 | from simplestreams.generate_simplestreams import ( |
1716 | 12 | FileNamer, | 11 | FileNamer, |
1717 | 13 | Item, | 12 | Item, |
1718 | 14 | items2content_trees, | 13 | items2content_trees, |
1719 | 15 | json_dump, | 14 | json_dump, |
1720 | 16 | write_streams, | 15 | write_streams, |
1722 | 17 | ) | 16 | ) |
1723 | 18 | 17 | ||
1724 | 19 | 18 | ||
1725 | 20 | class JujuFileNamer(FileNamer): | 19 | class JujuFileNamer(FileNamer): |
1726 | 21 | |||
1727 | 22 | @classmethod | 20 | @classmethod |
1728 | 23 | def get_index_path(cls): | 21 | def get_index_path(cls): |
1730 | 24 | return "%s/%s" % (cls.streamdir, 'index2.json') | 22 | return "%s/%s" % (cls.streamdir, "index2.json") |
1731 | 25 | 23 | ||
1732 | 26 | @classmethod | 24 | @classmethod |
1733 | 27 | def get_content_path(cls, content_id): | 25 | def get_content_path(cls, content_id): |
1735 | 28 | return "%s/%s.json" % (cls.streamdir, content_id.replace(':', '-')) | 26 | return "%s/%s.json" % (cls.streamdir, content_id.replace(":", "-")) |
1736 | 29 | 27 | ||
1737 | 30 | 28 | ||
1738 | 31 | def dict_to_item(item_dict): | 29 | def dict_to_item(item_dict): |
1739 | 32 | """Convert a dict into an Item, mutating input.""" | 30 | """Convert a dict into an Item, mutating input.""" |
1742 | 33 | item_dict.pop('item_url', None) | 31 | item_dict.pop("item_url", None) |
1743 | 34 | size = item_dict.get('size') | 32 | size = item_dict.get("size") |
1744 | 35 | if size is not None: | 33 | if size is not None: |
1750 | 36 | item_dict['size'] = int(size) | 34 | item_dict["size"] = int(size) |
1751 | 37 | content_id = item_dict.pop('content_id') | 35 | content_id = item_dict.pop("content_id") |
1752 | 38 | product_name = item_dict.pop('product_name') | 36 | product_name = item_dict.pop("product_name") |
1753 | 39 | version_name = item_dict.pop('version_name') | 37 | version_name = item_dict.pop("version_name") |
1754 | 40 | item_name = item_dict.pop('item_name') | 38 | item_name = item_dict.pop("item_name") |
1755 | 41 | return Item(content_id, product_name, version_name, item_name, item_dict) | 39 | return Item(content_id, product_name, version_name, item_name, item_dict) |
1756 | 42 | 40 | ||
1757 | 43 | 41 | ||
1758 | @@ -51,9 +49,11 @@ def write_release_index(out_d): | |||
1759 | 51 | in_path = os.path.join(out_d, JujuFileNamer.get_index_path()) | 49 | in_path = os.path.join(out_d, JujuFileNamer.get_index_path()) |
1760 | 52 | with open(in_path) as in_file: | 50 | with open(in_path) as in_file: |
1761 | 53 | full_index = json.load(in_file) | 51 | full_index = json.load(in_file) |
1765 | 54 | full_index['index'] = dict( | 52 | full_index["index"] = dict( |
1766 | 55 | (k, v) for k, v in list(full_index['index'].items()) | 53 | (k, v) |
1767 | 56 | if k == 'com.ubuntu.juju:released:tools') | 54 | for k, v in list(full_index["index"].items()) |
1768 | 55 | if k == "com.ubuntu.juju:released:tools" | ||
1769 | 56 | ) | ||
1770 | 57 | out_path = os.path.join(out_d, FileNamer.get_index_path()) | 57 | out_path = os.path.join(out_d, FileNamer.get_index_path()) |
1771 | 58 | json_dump(full_index, out_path) | 58 | json_dump(full_index, out_path) |
1772 | 59 | return out_path | 59 | return out_path |
1773 | @@ -70,7 +70,7 @@ def filenames_to_streams(filenames, updated, out_d, juju_format=False): | |||
1774 | 70 | for items_file in filenames: | 70 | for items_file in filenames: |
1775 | 71 | items.extend(read_items_file(items_file)) | 71 | items.extend(read_items_file(items_file)) |
1776 | 72 | 72 | ||
1778 | 73 | data = {'updated': updated, 'datatype': 'content-download'} | 73 | data = {"updated": updated, "datatype": "content-download"} |
1779 | 74 | trees = items2content_trees(items, data) | 74 | trees = items2content_trees(items, data) |
1780 | 75 | if juju_format: | 75 | if juju_format: |
1781 | 76 | write = write_juju_streams | 76 | write = write_juju_streams |
1782 | @@ -88,23 +88,31 @@ def write_juju_streams(out_d, trees, updated): | |||
1783 | 88 | def parse_args(argv=None): | 88 | def parse_args(argv=None): |
1784 | 89 | parser = ArgumentParser() | 89 | parser = ArgumentParser() |
1785 | 90 | parser.add_argument( | 90 | parser.add_argument( |
1788 | 91 | 'items_file', metavar='items-file', help='File to read items from', | 91 | "items_file", |
1789 | 92 | nargs='+') | 92 | metavar="items-file", |
1790 | 93 | help="File to read items from", | ||
1791 | 94 | nargs="+", | ||
1792 | 95 | ) | ||
1793 | 93 | parser.add_argument( | 96 | parser.add_argument( |
1796 | 94 | 'out_d', metavar='output-dir', | 97 | "out_d", |
1797 | 95 | help='The directory to write stream files to.') | 98 | metavar="output-dir", |
1798 | 99 | help="The directory to write stream files to.", | ||
1799 | 100 | ) | ||
1800 | 96 | parser.add_argument( | 101 | parser.add_argument( |
1803 | 97 | '--juju-format', action='store_true', | 102 | "--juju-format", |
1804 | 98 | help='Write stream files in juju format.') | 103 | action="store_true", |
1805 | 104 | help="Write stream files in juju format.", | ||
1806 | 105 | ) | ||
1807 | 99 | return parser.parse_args(argv) | 106 | return parser.parse_args(argv) |
1808 | 100 | 107 | ||
1809 | 101 | 108 | ||
1810 | 102 | def main(): | 109 | def main(): |
1811 | 103 | args = parse_args() | 110 | args = parse_args() |
1812 | 104 | updated = util.timestamp() | 111 | updated = util.timestamp() |
1815 | 105 | filenames_to_streams(args.items_file, updated, args.out_d, | 112 | filenames_to_streams( |
1816 | 106 | args.juju_format) | 113 | args.items_file, updated, args.out_d, args.juju_format |
1817 | 114 | ) | ||
1818 | 107 | 115 | ||
1819 | 108 | 116 | ||
1821 | 109 | if __name__ == '__main__': | 117 | if __name__ == "__main__": |
1822 | 110 | sys.exit(main()) | 118 | sys.exit(main()) |
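The write_release_index hunk above is a good example of how black lays out a comprehension or generator expression once it overflows the line: the element expression, each for clause, and each if clause land on their own lines. A sketch with illustrative data, not the real index contents:

    full_index = {"index": {"a:released:tools": 1, "b:devel:tools": 2}}

    # black splits the overflowing generator expression clause by
    # clause: the element, the `for`, and the `if` each get a line.
    filtered = dict(
        (k, v)
        for k, v in list(full_index["index"].items())
        if k == "a:released:tools"
    )
    print(filtered)  # {'a:released:tools': 1}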
1823 | diff --git a/simplestreams/log.py b/simplestreams/log.py | |||
1824 | index 061103a..824f248 100644 | |||
1825 | --- a/simplestreams/log.py | |||
1826 | +++ b/simplestreams/log.py | |||
1827 | @@ -33,18 +33,22 @@ class NullHandler(logging.Handler): | |||
1828 | 33 | 33 | ||
1829 | 34 | def basicConfig(**kwargs): | 34 | def basicConfig(**kwargs): |
1830 | 35 | # basically like logging.basicConfig but only output for our logger | 35 | # basically like logging.basicConfig but only output for our logger |
1836 | 36 | if kwargs.get('filename'): | 36 | if kwargs.get("filename"): |
1837 | 37 | handler = logging.FileHandler(filename=kwargs['filename'], | 37 | handler = logging.FileHandler( |
1838 | 38 | mode=kwargs.get('filemode', 'a')) | 38 | filename=kwargs["filename"], mode=kwargs.get("filemode", "a") |
1839 | 39 | elif kwargs.get('stream'): | 39 | ) |
1840 | 40 | handler = logging.StreamHandler(stream=kwargs['stream']) | 40 | elif kwargs.get("stream"): |
1841 | 41 | handler = logging.StreamHandler(stream=kwargs["stream"]) | ||
1842 | 41 | else: | 42 | else: |
1843 | 42 | handler = NullHandler() | 43 | handler = NullHandler() |
1844 | 43 | 44 | ||
1846 | 44 | level = kwargs.get('level', NOTSET) | 45 | level = kwargs.get("level", NOTSET) |
1847 | 45 | 46 | ||
1850 | 46 | handler.setFormatter(logging.Formatter(fmt=kwargs.get('format'), | 47 | handler.setFormatter( |
1851 | 47 | datefmt=kwargs.get('datefmt'))) | 48 | logging.Formatter( |
1852 | 49 | fmt=kwargs.get("format"), datefmt=kwargs.get("datefmt") | ||
1853 | 50 | ) | ||
1854 | 51 | ) | ||
1855 | 48 | handler.setLevel(level) | 52 | handler.setLevel(level) |
1856 | 49 | 53 | ||
1857 | 50 | logging.getLogger().setLevel(level) | 54 | logging.getLogger().setLevel(level) |
1858 | @@ -56,7 +60,7 @@ def basicConfig(**kwargs): | |||
1859 | 56 | logger.addHandler(handler) | 60 | logger.addHandler(handler) |
1860 | 57 | 61 | ||
1861 | 58 | 62 | ||
1863 | 59 | def _getLogger(name='sstreams'): | 63 | def _getLogger(name="sstreams"): |
1864 | 60 | return logging.getLogger(name) | 64 | return logging.getLogger(name) |
1865 | 61 | 65 | ||
1866 | 62 | 66 | ||
1867 | diff --git a/simplestreams/mirrors/__init__.py b/simplestreams/mirrors/__init__.py | |||
1868 | index 4a9593a..b3374f8 100644 | |||
1869 | --- a/simplestreams/mirrors/__init__.py | |||
1870 | +++ b/simplestreams/mirrors/__init__.py | |||
1871 | @@ -18,10 +18,10 @@ import errno | |||
1872 | 18 | import io | 18 | import io |
1873 | 19 | import json | 19 | import json |
1874 | 20 | 20 | ||
1875 | 21 | import simplestreams.contentsource as cs | ||
1876 | 21 | import simplestreams.filters as filters | 22 | import simplestreams.filters as filters |
1877 | 22 | import simplestreams.util as util | 23 | import simplestreams.util as util |
1878 | 23 | from simplestreams import checksum_util | 24 | from simplestreams import checksum_util |
1879 | 24 | import simplestreams.contentsource as cs | ||
1880 | 25 | from simplestreams.log import LOG | 25 | from simplestreams.log import LOG |
1881 | 26 | 26 | ||
1882 | 27 | DEFAULT_USER_AGENT = "python-simplestreams/0.1" | 27 | DEFAULT_USER_AGENT = "python-simplestreams/0.1" |
1883 | @@ -29,8 +29,8 @@ DEFAULT_USER_AGENT = "python-simplestreams/0.1" | |||
1884 | 29 | 29 | ||
1885 | 30 | class MirrorReader(object): | 30 | class MirrorReader(object): |
1886 | 31 | def __init__(self, policy=util.policy_read_signed): | 31 | def __init__(self, policy=util.policy_read_signed): |
1889 | 32 | """ policy should be a function which returns the extracted payload or | 32 | """policy should be a function which returns the extracted payload or |
1890 | 33 | raises an exception if the policy is violated. """ | 33 | raises an exception if the policy is violated.""" |
1891 | 34 | self.policy = policy | 34 | self.policy = policy |
1892 | 35 | 35 | ||
1893 | 36 | def load_products(self, path): | 36 | def load_products(self, path): |
1894 | @@ -39,7 +39,7 @@ class MirrorReader(object): | |||
1895 | 39 | 39 | ||
1896 | 40 | def read_json(self, path): | 40 | def read_json(self, path): |
1897 | 41 | with self.source(path) as source: | 41 | with self.source(path) as source: |
1899 | 42 | raw = source.read().decode('utf-8') | 42 | raw = source.read().decode("utf-8") |
1900 | 43 | return raw, self.policy(content=raw, path=path) | 43 | return raw, self.policy(content=raw, path=path) |
1901 | 44 | 44 | ||
1902 | 45 | def source(self, path): | 45 | def source(self, path): |
1903 | @@ -164,8 +164,13 @@ class MirrorWriter(object): | |||
1904 | 164 | 164 | ||
1905 | 165 | 165 | ||
1906 | 166 | class UrlMirrorReader(MirrorReader): | 166 | class UrlMirrorReader(MirrorReader): |
1909 | 167 | def __init__(self, prefix, mirrors=None, policy=util.policy_read_signed, | 167 | def __init__( |
1910 | 168 | user_agent=DEFAULT_USER_AGENT): | 168 | self, |
1911 | 169 | prefix, | ||
1912 | 170 | mirrors=None, | ||
1913 | 171 | policy=util.policy_read_signed, | ||
1914 | 172 | user_agent=DEFAULT_USER_AGENT, | ||
1915 | 173 | ): | ||
1916 | 169 | super(UrlMirrorReader, self).__init__(policy=policy) | 174 | super(UrlMirrorReader, self).__init__(policy=policy) |
1917 | 170 | self._cs = cs.UrlContentSource | 175 | self._cs = cs.UrlContentSource |
1918 | 171 | if mirrors is None: | 176 | if mirrors is None: |
1919 | @@ -184,13 +189,18 @@ class UrlMirrorReader(MirrorReader): | |||
1920 | 184 | 189 | ||
1921 | 185 | def url_reader_factory(*args, **kwargs): | 190 | def url_reader_factory(*args, **kwargs): |
1922 | 186 | return cs.URL_READER( | 191 | return cs.URL_READER( |
1924 | 187 | *args, user_agent=self.user_agent, **kwargs) | 192 | *args, user_agent=self.user_agent, **kwargs |
1925 | 193 | ) | ||
1926 | 194 | |||
1927 | 188 | else: | 195 | else: |
1928 | 189 | url_reader_factory = None | 196 | url_reader_factory = None |
1929 | 190 | 197 | ||
1930 | 191 | if self._trailing_slash_checked: | 198 | if self._trailing_slash_checked: |
1933 | 192 | return self._cs(self.prefix + path, mirrors=mirrors, | 199 | return self._cs( |
1934 | 193 | url_reader=url_reader_factory) | 200 | self.prefix + path, |
1935 | 201 | mirrors=mirrors, | ||
1936 | 202 | url_reader=url_reader_factory, | ||
1937 | 203 | ) | ||
1938 | 194 | 204 | ||
1939 | 195 | # A little hack to fix up the user's path. It's fairly common to | 205 | # A little hack to fix up the user's path. It's fairly common to |
1940 | 196 | # specify URLs without a trailing slash, so we try to do that here as | 206 | # specify URLs without a trailing slash, so we try to do that here as |
1941 | @@ -198,22 +208,31 @@ class UrlMirrorReader(MirrorReader): | |||
1942 | 198 | # returned is not yet open (LP: #1237658) | 208 | # returned is not yet open (LP: #1237658) |
1943 | 199 | self._trailing_slash_checked = True | 209 | self._trailing_slash_checked = True |
1944 | 200 | try: | 210 | try: |
1947 | 201 | with self._cs(self.prefix + path, mirrors=None, | 211 | with self._cs( |
1948 | 202 | url_reader=url_reader_factory) as csource: | 212 | self.prefix + path, mirrors=None, url_reader=url_reader_factory |
1949 | 213 | ) as csource: | ||
1950 | 203 | csource.read(1024) | 214 | csource.read(1024) |
1951 | 204 | except Exception as e: | 215 | except Exception as e: |
1952 | 205 | if isinstance(e, IOError) and (e.errno == errno.ENOENT): | 216 | if isinstance(e, IOError) and (e.errno == errno.ENOENT): |
1956 | 206 | LOG.warning("got ENOENT for (%s, %s), trying with trailing /", | 217 | LOG.warning( |
1957 | 207 | self.prefix, path) | 218 | "got ENOENT for (%s, %s), trying with trailing /", |
1958 | 208 | self.prefix = self.prefix + '/' | 219 | self.prefix, |
1959 | 220 | path, | ||
1960 | 221 | ) | ||
1961 | 222 | self.prefix = self.prefix + "/" | ||
1962 | 209 | else: | 223 | else: |
1963 | 210 | # this raised exception, but it was sneaky to do it | 224 | # this raised exception, but it was sneaky to do it |
1964 | 211 | # so just ignore it. | 225 | # so just ignore it. |
1967 | 212 | LOG.debug("trailing / check on (%s, %s) resulted in %s", | 226 | LOG.debug( |
1968 | 213 | self.prefix, path, e) | 227 | "trailing / check on (%s, %s) resulted in %s", |
1969 | 228 | self.prefix, | ||
1970 | 229 | path, | ||
1971 | 230 | e, | ||
1972 | 231 | ) | ||
1973 | 214 | 232 | ||
1976 | 215 | return self._cs(self.prefix + path, mirrors=mirrors, | 233 | return self._cs( |
1977 | 216 | url_reader=url_reader_factory) | 234 | self.prefix + path, mirrors=mirrors, url_reader=url_reader_factory |
1978 | 235 | ) | ||
1979 | 217 | 236 | ||
1980 | 218 | 237 | ||
1981 | 219 | class ObjectStoreMirrorReader(MirrorReader): | 238 | class ObjectStoreMirrorReader(MirrorReader): |
1982 | @@ -231,7 +250,7 @@ class BasicMirrorWriter(MirrorWriter): | |||
1983 | 231 | if config is None: | 250 | if config is None: |
1984 | 232 | config = {} | 251 | config = {} |
1985 | 233 | self.config = config | 252 | self.config = config |
1987 | 234 | self.checksumming_reader = self.config.get('checksumming_reader', True) | 253 | self.checksumming_reader = self.config.get("checksumming_reader", True) |
1988 | 235 | 254 | ||
1989 | 236 | def load_products(self, path=None, content_id=None): | 255 | def load_products(self, path=None, content_id=None): |
1990 | 237 | super(BasicMirrorWriter, self).load_products(path, content_id) | 256 | super(BasicMirrorWriter, self).load_products(path, content_id) |
1991 | @@ -243,14 +262,14 @@ class BasicMirrorWriter(MirrorWriter): | |||
1992 | 243 | 262 | ||
1993 | 244 | check_tree_paths(src) | 263 | check_tree_paths(src) |
1994 | 245 | 264 | ||
1996 | 246 | itree = src.get('index') | 265 | itree = src.get("index") |
1997 | 247 | for content_id, index_entry in itree.items(): | 266 | for content_id, index_entry in itree.items(): |
1998 | 248 | if not self.filter_index_entry(index_entry, src, (content_id,)): | 267 | if not self.filter_index_entry(index_entry, src, (content_id,)): |
1999 | 249 | continue | 268 | continue |
2001 | 250 | epath = index_entry.get('path', None) | 269 | epath = index_entry.get("path", None) |
2002 | 251 | mycs = None | 270 | mycs = None |
2003 | 252 | if epath: | 271 | if epath: |
2005 | 253 | if index_entry.get('format') in ("index:1.0", "products:1.0"): | 272 | if index_entry.get("format") in ("index:1.0", "products:1.0"): |
2006 | 254 | self.sync(reader, path=epath) | 273 | self.sync(reader, path=epath) |
2007 | 255 | mycs = reader.source(epath) | 274 | mycs = reader.source(epath) |
2008 | 256 | 275 | ||
2009 | @@ -265,36 +284,37 @@ class BasicMirrorWriter(MirrorWriter): | |||
2010 | 265 | 284 | ||
2011 | 266 | check_tree_paths(src) | 285 | check_tree_paths(src) |
2012 | 267 | 286 | ||
2014 | 268 | content_id = src['content_id'] | 287 | content_id = src["content_id"] |
2015 | 269 | target = self.load_products(path, content_id) | 288 | target = self.load_products(path, content_id) |
2016 | 270 | if not target: | 289 | if not target: |
2017 | 271 | target = util.stringitems(src) | 290 | target = util.stringitems(src) |
2018 | 272 | 291 | ||
2019 | 273 | util.expand_tree(target) | 292 | util.expand_tree(target) |
2020 | 274 | 293 | ||
2024 | 275 | stree = src.get('products', {}) | 294 | stree = src.get("products", {}) |
2025 | 276 | if 'products' not in target: | 295 | if "products" not in target: |
2026 | 277 | target['products'] = {} | 296 | target["products"] = {} |
2027 | 278 | 297 | ||
2029 | 279 | tproducts = target['products'] | 298 | tproducts = target["products"] |
2030 | 280 | 299 | ||
2031 | 281 | filtered_products = [] | 300 | filtered_products = [] |
2032 | 282 | prodname = None | 301 | prodname = None |
2033 | 283 | 302 | ||
2034 | 284 | # Apply filters to items before filtering versions | 303 | # Apply filters to items before filtering versions |
2035 | 285 | for prodname, product in list(stree.items()): | 304 | for prodname, product in list(stree.items()): |
2039 | 286 | 305 | for vername, version in list(product.get("versions", {}).items()): | |
2040 | 287 | for vername, version in list(product.get('versions', {}).items()): | 306 | for itemname, item in list(version.get("items", {}).items()): |
2038 | 288 | for itemname, item in list(version.get('items', {}).items()): | ||
2041 | 289 | pgree = (prodname, vername, itemname) | 307 | pgree = (prodname, vername, itemname) |
2042 | 290 | if not self.filter_item(item, src, target, pgree): | 308 | if not self.filter_item(item, src, target, pgree): |
2043 | 291 | LOG.debug("Filtered out item: %s/%s", itemname, item) | 309 | LOG.debug("Filtered out item: %s/%s", itemname, item) |
2050 | 292 | del stree[prodname]['versions'][vername]['items'][ | 310 | del stree[prodname]["versions"][vername]["items"][ |
2051 | 293 | itemname] | 311 | itemname |
2052 | 294 | if not stree[prodname]['versions'][vername].get( | 312 | ] |
2053 | 295 | 'items', {}): | 313 | if not stree[prodname]["versions"][vername].get( |
2054 | 296 | del stree[prodname]['versions'][vername] | 314 | "items", {} |
2055 | 297 | if not stree[prodname].get('versions', {}): | 315 | ): |
2056 | 316 | del stree[prodname]["versions"][vername] | ||
2057 | 317 | if not stree[prodname].get("versions", {}): | ||
2058 | 298 | del stree[prodname] | 318 | del stree[prodname] |
2059 | 299 | 319 | ||
2060 | 300 | for prodname, product in stree.items(): | 320 | for prodname, product in stree.items(): |
2061 | @@ -305,50 +325,62 @@ class BasicMirrorWriter(MirrorWriter): | |||
2062 | 305 | if prodname not in tproducts: | 325 | if prodname not in tproducts: |
2063 | 306 | tproducts[prodname] = util.stringitems(product) | 326 | tproducts[prodname] = util.stringitems(product) |
2064 | 307 | tproduct = tproducts[prodname] | 327 | tproduct = tproducts[prodname] |
2067 | 308 | if 'versions' not in tproduct: | 328 | if "versions" not in tproduct: |
2068 | 309 | tproduct['versions'] = {} | 329 | tproduct["versions"] = {} |
2069 | 310 | 330 | ||
2070 | 311 | src_filtered_items = [] | 331 | src_filtered_items = [] |
2071 | 312 | 332 | ||
2072 | 313 | def _filter(itemkey): | 333 | def _filter(itemkey): |
2075 | 314 | ret = self.filter_version(product['versions'][itemkey], | 334 | ret = self.filter_version( |
2076 | 315 | src, target, (prodname, itemkey)) | 335 | product["versions"][itemkey], |
2077 | 336 | src, | ||
2078 | 337 | target, | ||
2079 | 338 | (prodname, itemkey), | ||
2080 | 339 | ) | ||
2081 | 316 | if not ret: | 340 | if not ret: |
2082 | 317 | src_filtered_items.append(itemkey) | 341 | src_filtered_items.append(itemkey) |
2083 | 318 | return ret | 342 | return ret |
2084 | 319 | 343 | ||
2085 | 320 | (to_add, to_remove) = util.resolve_work( | 344 | (to_add, to_remove) = util.resolve_work( |
2095 | 321 | src=list(product.get('versions', {}).keys()), | 345 | src=list(product.get("versions", {}).keys()), |
2096 | 322 | target=list(tproduct.get('versions', {}).keys()), | 346 | target=list(tproduct.get("versions", {}).keys()), |
2097 | 323 | maxnum=self.config.get('max_items'), | 347 | maxnum=self.config.get("max_items"), |
2098 | 324 | keep=self.config.get('keep_items'), itemfilter=_filter) | 348 | keep=self.config.get("keep_items"), |
2099 | 325 | 349 | itemfilter=_filter, | |
2100 | 326 | LOG.info("%s/%s: to_add=%s to_remove=%s", content_id, prodname, | 350 | ) |
2101 | 327 | to_add, to_remove) | 351 | |
2102 | 328 | 352 | LOG.info( | |
2103 | 329 | tversions = tproduct['versions'] | 353 | "%s/%s: to_add=%s to_remove=%s", |
2104 | 354 | content_id, | ||
2105 | 355 | prodname, | ||
2106 | 356 | to_add, | ||
2107 | 357 | to_remove, | ||
2108 | 358 | ) | ||
2109 | 359 | |||
2110 | 360 | tversions = tproduct["versions"] | ||
2111 | 330 | skipped_versions = [] | 361 | skipped_versions = [] |
2112 | 331 | for vername in to_add: | 362 | for vername in to_add: |
2114 | 332 | version = product['versions'][vername] | 363 | version = product["versions"][vername] |
2115 | 333 | 364 | ||
2116 | 334 | if vername not in tversions: | 365 | if vername not in tversions: |
2117 | 335 | tversions[vername] = util.stringitems(version) | 366 | tversions[vername] = util.stringitems(version) |
2118 | 336 | 367 | ||
2119 | 337 | added_items = [] | 368 | added_items = [] |
2121 | 338 | for itemname, item in version.get('items', {}).items(): | 369 | for itemname, item in version.get("items", {}).items(): |
2122 | 339 | pgree = (prodname, vername, itemname) | 370 | pgree = (prodname, vername, itemname) |
2123 | 340 | 371 | ||
2124 | 341 | added_items.append(itemname) | 372 | added_items.append(itemname) |
2125 | 342 | 373 | ||
2127 | 343 | ipath = item.get('path', None) | 374 | ipath = item.get("path", None) |
2128 | 344 | ipath_cs = None | 375 | ipath_cs = None |
2129 | 345 | if ipath and reader: | 376 | if ipath and reader: |
2130 | 346 | if self.checksumming_reader: | 377 | if self.checksumming_reader: |
2131 | 347 | flat = util.products_exdata(src, pgree) | 378 | flat = util.products_exdata(src, pgree) |
2132 | 348 | ipath_cs = cs.ChecksummingContentSource( | 379 | ipath_cs = cs.ChecksummingContentSource( |
2133 | 349 | csrc=reader.source(ipath), | 380 | csrc=reader.source(ipath), |
2136 | 350 | size=flat.get('size'), | 381 | size=flat.get("size"), |
2137 | 351 | checksums=checksum_util.item_checksums(flat)) | 382 | checksums=checksum_util.item_checksums(flat), |
2138 | 383 | ) | ||
2139 | 352 | else: | 384 | else: |
2140 | 353 | ipath_cs = reader.source(ipath) | 385 | ipath_cs = reader.source(ipath) |
2141 | 354 | 386 | ||
2142 | @@ -356,28 +388,38 @@ class BasicMirrorWriter(MirrorWriter): | |||
2143 | 356 | 388 | ||
2144 | 357 | if len(added_items): | 389 | if len(added_items): |
2145 | 358 | # do not insert versions that had all items filtered | 390 | # do not insert versions that had all items filtered |
2148 | 359 | self.insert_version(version, src, target, | 391 | self.insert_version( |
2149 | 360 | (prodname, vername)) | 392 | version, src, target, (prodname, vername) |
2150 | 393 | ) | ||
2151 | 361 | else: | 394 | else: |
2152 | 362 | skipped_versions.append(vername) | 395 | skipped_versions.append(vername) |
2153 | 363 | 396 | ||
2154 | 364 | for vername in skipped_versions: | 397 | for vername in skipped_versions: |
2157 | 365 | if vername in tproduct['versions']: | 398 | if vername in tproduct["versions"]: |
2158 | 366 | del tproduct['versions'][vername] | 399 | del tproduct["versions"][vername] |
2159 | 367 | 400 | ||
2162 | 368 | if self.config.get('delete_filtered_items', False): | 401 | if self.config.get("delete_filtered_items", False): |
2163 | 369 | tkeys = tproduct.get('versions', {}).keys() | 402 | tkeys = tproduct.get("versions", {}).keys() |
2164 | 370 | for v in src_filtered_items: | 403 | for v in src_filtered_items: |
2165 | 371 | if v not in to_remove and v in tkeys: | 404 | if v not in to_remove and v in tkeys: |
2166 | 372 | to_remove.append(v) | 405 | to_remove.append(v) |
2169 | 373 | LOG.info("After deletions %s/%s: to_add=%s to_remove=%s", | 406 | LOG.info( |
2170 | 374 | content_id, prodname, to_add, to_remove) | 407 | "After deletions %s/%s: to_add=%s to_remove=%s", |
2171 | 408 | content_id, | ||
2172 | 409 | prodname, | ||
2173 | 410 | to_add, | ||
2174 | 411 | to_remove, | ||
2175 | 412 | ) | ||
2176 | 375 | 413 | ||
2177 | 376 | for vername in to_remove: | 414 | for vername in to_remove: |
2178 | 377 | tversion = tversions[vername] | 415 | tversion = tversions[vername] |
2182 | 378 | for itemname in list(tversion.get('items', {}).keys()): | 416 | for itemname in list(tversion.get("items", {}).keys()): |
2183 | 379 | self.remove_item(tversion['items'][itemname], src, target, | 417 | self.remove_item( |
2184 | 380 | (prodname, vername, itemname)) | 418 | tversion["items"][itemname], |
2185 | 419 | src, | ||
2186 | 420 | target, | ||
2187 | 421 | (prodname, vername, itemname), | ||
2188 | 422 | ) | ||
2189 | 381 | 423 | ||
2190 | 382 | self.remove_version(tversion, src, target, (prodname, vername)) | 424 | self.remove_version(tversion, src, target, (prodname, vername)) |
2191 | 383 | del tversions[vername] | 425 | del tversions[vername] |
2192 | @@ -389,12 +431,14 @@ class BasicMirrorWriter(MirrorWriter): | |||
2193 | 389 | # that could accidentally delete a lot. | 431 | # that could accidentally delete a lot. |
2194 | 390 | # | 432 | # |
2195 | 391 | del_products = [] | 433 | del_products = [] |
2202 | 392 | if self.config.get('delete_products', False): | 434 | if self.config.get("delete_products", False): |
2203 | 393 | del_products.extend([p for p in list(tproducts.keys()) | 435 | del_products.extend( |
2204 | 394 | if p not in stree]) | 436 | [p for p in list(tproducts.keys()) if p not in stree] |
2205 | 395 | if self.config.get('delete_filtered_products', False): | 437 | ) |
2206 | 396 | del_products.extend([p for p in filtered_products | 438 | if self.config.get("delete_filtered_products", False): |
2207 | 397 | if p not in stree]) | 439 | del_products.extend( |
2208 | 440 | [p for p in filtered_products if p not in stree] | ||
2209 | 441 | ) | ||
2210 | 398 | 442 | ||
2211 | 399 | for prodname in del_products: | 443 | for prodname in del_products: |
2212 | 400 | # FIXME: we remove a product here, but unless that acts | 444 | # FIXME: we remove a product here, but unless that acts |
2213 | @@ -421,7 +465,7 @@ class ObjectStoreMirrorWriter(BasicMirrorWriter): | |||
2214 | 421 | try: | 465 | try: |
2215 | 422 | with self.source(self._reference_count_data_path()) as source: | 466 | with self.source(self._reference_count_data_path()) as source: |
2216 | 423 | raw = source.read() | 467 | raw = source.read() |
2218 | 424 | return json.load(io.StringIO(raw.decode('utf-8'))) | 468 | return json.load(io.StringIO(raw.decode("utf-8"))) |
2219 | 425 | except IOError as e: | 469 | except IOError as e: |
2220 | 426 | if e.errno == errno.ENOENT: | 470 | if e.errno == errno.ENOENT: |
2221 | 427 | return {} | 471 | return {} |
2222 | @@ -432,7 +476,7 @@ class ObjectStoreMirrorWriter(BasicMirrorWriter): | |||
2223 | 432 | self.store.insert(self._reference_count_data_path(), source) | 476 | self.store.insert(self._reference_count_data_path(), source) |
2224 | 433 | 477 | ||
2225 | 434 | def _build_rc_id(self, src, pedigree): | 478 | def _build_rc_id(self, src, pedigree): |
2227 | 435 | return '/'.join([src['content_id']] + list(pedigree)) | 479 | return "/".join([src["content_id"]] + list(pedigree)) |
2228 | 436 | 480 | ||
2229 | 437 | def _inc_rc(self, path, src, pedigree): | 481 | def _inc_rc(self, path, src, pedigree): |
2230 | 438 | rc = self._load_rc_dict() | 482 | rc = self._load_rc_dict() |
2231 | @@ -482,25 +526,30 @@ class ObjectStoreMirrorWriter(BasicMirrorWriter): | |||
2232 | 482 | 526 | ||
2233 | 483 | def insert_item(self, data, src, target, pedigree, contentsource): | 527 | def insert_item(self, data, src, target, pedigree, contentsource): |
2234 | 484 | util.products_set(target, data, pedigree) | 528 | util.products_set(target, data, pedigree) |
2236 | 485 | if 'path' not in data: | 529 | if "path" not in data: |
2237 | 486 | return | 530 | return |
2239 | 487 | if not self.config.get('item_download', True): | 531 | if not self.config.get("item_download", True): |
2240 | 488 | return | 532 | return |
2246 | 489 | LOG.debug("inserting %s to %s", contentsource.url, data['path']) | 533 | LOG.debug("inserting %s to %s", contentsource.url, data["path"]) |
2247 | 490 | self.store.insert(data['path'], contentsource, | 534 | self.store.insert( |
2248 | 491 | checksums=checksum_util.item_checksums(data), | 535 | data["path"], |
2249 | 492 | mutable=False, size=data.get('size')) | 536 | contentsource, |
2250 | 493 | self._inc_rc(data['path'], src, pedigree) | 537 | checksums=checksum_util.item_checksums(data), |
2251 | 538 | mutable=False, | ||
2252 | 539 | size=data.get("size"), | ||
2253 | 540 | ) | ||
2254 | 541 | self._inc_rc(data["path"], src, pedigree) | ||
2255 | 494 | 542 | ||
2256 | 495 | def insert_index_entry(self, data, src, pedigree, contentsource): | 543 | def insert_index_entry(self, data, src, pedigree, contentsource): |
2258 | 496 | epath = data.get('path', None) | 544 | epath = data.get("path", None) |
2259 | 497 | if not epath: | 545 | if not epath: |
2260 | 498 | return | 546 | return |
2263 | 499 | self.store.insert(epath, contentsource, | 547 | self.store.insert( |
2264 | 500 | checksums=checksum_util.item_checksums(data)) | 548 | epath, contentsource, checksums=checksum_util.item_checksums(data) |
2265 | 549 | ) | ||
2266 | 501 | 550 | ||
2267 | 502 | def insert_products(self, path, target, content): | 551 | def insert_products(self, path, target, content): |
2269 | 503 | dpath = self.products_data_path(target['content_id']) | 552 | dpath = self.products_data_path(target["content_id"]) |
2270 | 504 | self.store.insert_content(dpath, util.dump_data(target)) | 553 | self.store.insert_content(dpath, util.dump_data(target)) |
2271 | 505 | if not path: | 554 | if not path: |
2272 | 506 | return | 555 | return |
2273 | @@ -517,16 +566,16 @@ class ObjectStoreMirrorWriter(BasicMirrorWriter): | |||
2274 | 517 | 566 | ||
2275 | 518 | def remove_item(self, data, src, target, pedigree): | 567 | def remove_item(self, data, src, target, pedigree): |
2276 | 519 | util.products_del(target, pedigree) | 568 | util.products_del(target, pedigree) |
2278 | 520 | if 'path' not in data: | 569 | if "path" not in data: |
2279 | 521 | return | 570 | return |
2282 | 522 | if self._dec_rc(data['path'], src, pedigree): | 571 | if self._dec_rc(data["path"], src, pedigree): |
2283 | 523 | self.store.remove(data['path']) | 572 | self.store.remove(data["path"]) |
2284 | 524 | 573 | ||
2285 | 525 | 574 | ||
2286 | 526 | class ObjectFilterMirror(ObjectStoreMirrorWriter): | 575 | class ObjectFilterMirror(ObjectStoreMirrorWriter): |
2287 | 527 | def __init__(self, *args, **kwargs): | 576 | def __init__(self, *args, **kwargs): |
2288 | 528 | super(ObjectFilterMirror, self).__init__(*args, **kwargs) | 577 | super(ObjectFilterMirror, self).__init__(*args, **kwargs) |
2290 | 529 | self.filters = self.config.get('filters', []) | 578 | self.filters = self.config.get("filters", []) |
2291 | 530 | 579 | ||
2292 | 531 | def filter_item(self, data, src, target, pedigree): | 580 | def filter_item(self, data, src, target, pedigree): |
2293 | 532 | return filters.filter_item(self.filters, data, src, pedigree) | 581 | return filters.filter_item(self.filters, data, src, pedigree) |
2294 | @@ -552,15 +601,15 @@ class DryRunMirrorWriter(ObjectFilterMirror): | |||
2295 | 552 | 601 | ||
2296 | 553 | def insert_item(self, data, src, target, pedigree, contentsource): | 602 | def insert_item(self, data, src, target, pedigree, contentsource): |
2297 | 554 | data = util.products_exdata(src, pedigree) | 603 | data = util.products_exdata(src, pedigree) |
2299 | 555 | if 'size' in data and 'path' in data: | 604 | if "size" in data and "path" in data: |
2300 | 556 | self.downloading.append( | 605 | self.downloading.append( |
2302 | 557 | (pedigree, data['path'], int(data['size']))) | 606 | (pedigree, data["path"], int(data["size"])) |
2303 | 607 | ) | ||
2304 | 558 | 608 | ||
2305 | 559 | def remove_item(self, data, src, target, pedigree): | 609 | def remove_item(self, data, src, target, pedigree): |
2306 | 560 | data = util.products_exdata(src, pedigree) | 610 | data = util.products_exdata(src, pedigree) |
2310 | 561 | if 'size' in data and 'path' in data: | 611 | if "size" in data and "path" in data: |
2311 | 562 | self.removing.append( | 612 | self.removing.append((pedigree, data["path"], int(data["size"]))) |
2309 | 563 | (pedigree, data['path'], int(data['size']))) | ||
2312 | 564 | 613 | ||
2313 | 565 | @property | 614 | @property |
2314 | 566 | def size(self): | 615 | def size(self): |
2315 | @@ -573,27 +622,31 @@ def _get_data_content(path, data, content, reader): | |||
2316 | 573 | if content is None and path: | 622 | if content is None and path: |
2317 | 574 | _, content = reader.read(path) | 623 | _, content = reader.read(path) |
2318 | 575 | if isinstance(content, bytes): | 624 | if isinstance(content, bytes): |
2320 | 576 | content = content.decode('utf-8') | 625 | content = content.decode("utf-8") |
2321 | 577 | 626 | ||
2322 | 578 | if data is None and content: | 627 | if data is None and content: |
2323 | 579 | data = util.load_content(content) | 628 | data = util.load_content(content) |
2324 | 580 | 629 | ||
2325 | 581 | if not data: | 630 | if not data: |
2328 | 582 | raise ValueError("Data could not be loaded. " | 631 | raise ValueError( |
2329 | 583 | "Path or content is required") | 632 | "Data could not be loaded. " "Path or content is required" |
2330 | 633 | ) | ||
2331 | 584 | return (data, content) | 634 | return (data, content) |
2332 | 585 | 635 | ||
2333 | 586 | 636 | ||
2334 | 587 | def check_tree_paths(tree, fmt=None): | 637 | def check_tree_paths(tree, fmt=None): |
2335 | 588 | if fmt is None: | 638 | if fmt is None: |
2337 | 589 | fmt = tree.get('format') | 639 | fmt = tree.get("format") |
2338 | 590 | if fmt == "products:1.0": | 640 | if fmt == "products:1.0": |
2339 | 641 | |||
2340 | 591 | def check_path(item, tree, pedigree): | 642 | def check_path(item, tree, pedigree): |
2342 | 592 | util.assert_safe_path(item.get('path')) | 643 | util.assert_safe_path(item.get("path")) |
2343 | 644 | |||
2344 | 593 | util.walk_products(tree, cb_item=check_path) | 645 | util.walk_products(tree, cb_item=check_path) |
2345 | 594 | elif fmt == "index:1.0": | 646 | elif fmt == "index:1.0": |
2347 | 595 | index = tree.get('index') | 647 | index = tree.get("index") |
2348 | 596 | for content_id in index: | 648 | for content_id in index: |
2350 | 597 | util.assert_safe_path(index[content_id].get('path')) | 649 | util.assert_safe_path(index[content_id].get("path")) |
2351 | 650 | |||
2352 | 598 | 651 | ||
2353 | 599 | # vi: ts=4 expandtab | 652 | # vi: ts=4 expandtab |
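The simplestreams/mirrors/__init__.py hunks above are intended to be purely mechanical: quote normalization and call rewrapping with no behavior change. A quick way to spot-check any hunk is to run the old side through isort and then black and compare with the new side. A minimal sketch — the 79-column line length and the isort "black" profile are inferred from the wrapping above; the authoritative settings live in this MP's pyproject.toml and .pre-commit-config.yaml hunks, not shown here:

import black
import isort

# One old-side line from the hunk above; isort is a no-op here since
# the snippet has no imports, and black normalizes quotes.
old = "ipath = item.get('path', None)\n"

new = black.format_str(
    isort.code(old, profile="black", line_length=79),
    mode=black.Mode(line_length=79),
)
assert new == 'ipath = item.get("path", None)\n'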
2354 | diff --git a/simplestreams/mirrors/command_hook.py b/simplestreams/mirrors/command_hook.py | |||
2355 | index ab70691..de42623 100644 | |||
2356 | --- a/simplestreams/mirrors/command_hook.py | |||
2357 | +++ b/simplestreams/mirrors/command_hook.py | |||
2358 | @@ -15,15 +15,15 @@ | |||
2359 | 15 | # You should have received a copy of the GNU Affero General Public License | 15 | # You should have received a copy of the GNU Affero General Public License |
2360 | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. |
2361 | 17 | 17 | ||
2362 | 18 | import simplestreams.mirrors as mirrors | ||
2363 | 19 | import simplestreams.util as util | ||
2364 | 20 | |||
2365 | 21 | import os | ||
2366 | 22 | import errno | 18 | import errno |
2367 | 19 | import os | ||
2368 | 23 | import signal | 20 | import signal |
2369 | 24 | import subprocess | 21 | import subprocess |
2370 | 25 | import tempfile | 22 | import tempfile |
2371 | 26 | 23 | ||
2372 | 24 | import simplestreams.mirrors as mirrors | ||
2373 | 25 | import simplestreams.util as util | ||
2374 | 26 | |||
2375 | 27 | REQUIRED_FIELDS = ("load_products",) | 27 | REQUIRED_FIELDS = ("load_products",) |
2376 | 28 | HOOK_NAMES = ( | 28 | HOOK_NAMES = ( |
2377 | 29 | "filter_index_entry", | 29 | "filter_index_entry", |
2378 | @@ -92,6 +92,7 @@ class CommandHookMirror(mirrors.BasicMirrorWriter): | |||
2379 | 92 | If the configuration setting 'item_skip_download' is set to True, then | 92 | If the configuration setting 'item_skip_download' is set to True, then |
2380 | 93 | 'path_url' will be set instead to a url where the item can be found. | 93 | 'path_url' will be set instead to a url where the item can be found. |
2381 | 94 | """ | 94 | """ |
2382 | 95 | |||
2383 | 95 | def __init__(self, config): | 96 | def __init__(self, config): |
2384 | 96 | if isinstance(config, str): | 97 | if isinstance(config, str): |
2385 | 97 | config = util.load_content(config) | 98 | config = util.load_content(config) |
2386 | @@ -100,32 +101,34 @@ class CommandHookMirror(mirrors.BasicMirrorWriter): | |||
2387 | 100 | super(CommandHookMirror, self).__init__(config=config) | 101 | super(CommandHookMirror, self).__init__(config=config) |
2388 | 101 | 102 | ||
2389 | 102 | def load_products(self, path=None, content_id=None): | 103 | def load_products(self, path=None, content_id=None): |
2393 | 103 | (_rc, output) = self.call_hook('load_products', | 104 | (_rc, output) = self.call_hook( |
2394 | 104 | data={'content_id': content_id}, | 105 | "load_products", data={"content_id": content_id}, capture=True |
2395 | 105 | capture=True) | 106 | ) |
2396 | 106 | fmt = self.config.get("product_load_output_format", "serial_list") | 107 | fmt = self.config.get("product_load_output_format", "serial_list") |
2397 | 107 | 108 | ||
2400 | 108 | loaded = load_product_output(output=output, content_id=content_id, | 109 | loaded = load_product_output( |
2401 | 109 | fmt=fmt) | 110 | output=output, content_id=content_id, fmt=fmt |
2402 | 111 | ) | ||
2403 | 110 | return loaded | 112 | return loaded |
2404 | 111 | 113 | ||
2405 | 112 | def filter_index_entry(self, data, src, pedigree): | 114 | def filter_index_entry(self, data, src, pedigree): |
2406 | 113 | mdata = util.stringitems(src) | 115 | mdata = util.stringitems(src) |
2408 | 114 | mdata['content_id'] = pedigree[0] | 116 | mdata["content_id"] = pedigree[0] |
2409 | 115 | mdata.update(util.stringitems(data)) | 117 | mdata.update(util.stringitems(data)) |
2410 | 116 | 118 | ||
2413 | 117 | (ret, _output) = self.call_hook('filter_index_entry', data=mdata, | 119 | (ret, _output) = self.call_hook( |
2414 | 118 | rcs=[0, 1]) | 120 | "filter_index_entry", data=mdata, rcs=[0, 1] |
2415 | 121 | ) | ||
2416 | 119 | return ret == 0 | 122 | return ret == 0 |
2417 | 120 | 123 | ||
2418 | 121 | def filter_product(self, data, src, target, pedigree): | 124 | def filter_product(self, data, src, target, pedigree): |
2420 | 122 | return self._call_filter('filter_product', src, pedigree) | 125 | return self._call_filter("filter_product", src, pedigree) |
2421 | 123 | 126 | ||
2422 | 124 | def filter_version(self, data, src, target, pedigree): | 127 | def filter_version(self, data, src, target, pedigree): |
2424 | 125 | return self._call_filter('filter_version', src, pedigree) | 128 | return self._call_filter("filter_version", src, pedigree) |
2425 | 126 | 129 | ||
2426 | 127 | def filter_item(self, data, src, target, pedigree): | 130 | def filter_item(self, data, src, target, pedigree): |
2428 | 128 | return self._call_filter('filter_item', src, pedigree) | 131 | return self._call_filter("filter_item", src, pedigree) |
2429 | 129 | 132 | ||
2430 | 130 | def _call_filter(self, name, src, pedigree): | 133 | def _call_filter(self, name, src, pedigree): |
2431 | 131 | data = util.products_exdata(src, pedigree) | 134 | data = util.products_exdata(src, pedigree) |
2432 | @@ -133,20 +136,27 @@ class CommandHookMirror(mirrors.BasicMirrorWriter): | |||
2433 | 133 | return ret == 0 | 136 | return ret == 0 |
2434 | 134 | 137 | ||
2435 | 135 | def insert_index(self, path, src, content): | 138 | def insert_index(self, path, src, content): |
2438 | 136 | return self.call_hook('insert_index', data=src, content=content, | 139 | return self.call_hook( |
2439 | 137 | extra={'path': path}) | 140 | "insert_index", data=src, content=content, extra={"path": path} |
2440 | 141 | ) | ||
2441 | 138 | 142 | ||
2442 | 139 | def insert_products(self, path, target, content): | 143 | def insert_products(self, path, target, content): |
2445 | 140 | return self.call_hook('insert_products', data=target, | 144 | return self.call_hook( |
2446 | 141 | content=content, extra={'path': path}) | 145 | "insert_products", |
2447 | 146 | data=target, | ||
2448 | 147 | content=content, | ||
2449 | 148 | extra={"path": path}, | ||
2450 | 149 | ) | ||
2451 | 142 | 150 | ||
2452 | 143 | def insert_product(self, data, src, target, pedigree): | 151 | def insert_product(self, data, src, target, pedigree): |
2455 | 144 | return self.call_hook('insert_product', | 152 | return self.call_hook( |
2456 | 145 | data=util.products_exdata(src, pedigree)) | 153 | "insert_product", data=util.products_exdata(src, pedigree) |
2457 | 154 | ) | ||
2458 | 146 | 155 | ||
2459 | 147 | def insert_version(self, data, src, target, pedigree): | 156 | def insert_version(self, data, src, target, pedigree): |
2462 | 148 | return self.call_hook('insert_version', | 157 | return self.call_hook( |
2463 | 149 | data=util.products_exdata(src, pedigree)) | 158 | "insert_version", data=util.products_exdata(src, pedigree) |
2464 | 159 | ) | ||
2465 | 150 | 160 | ||
2466 | 151 | def insert_item(self, data, src, target, pedigree, contentsource): | 161 | def insert_item(self, data, src, target, pedigree, contentsource): |
2467 | 152 | mdata = util.products_exdata(src, pedigree) | 162 | mdata = util.products_exdata(src, pedigree) |
2468 | @@ -154,43 +164,47 @@ class CommandHookMirror(mirrors.BasicMirrorWriter): | |||
2469 | 154 | tmp_path = None | 164 | tmp_path = None |
2470 | 155 | tmp_del = None | 165 | tmp_del = None |
2471 | 156 | extra = {} | 166 | extra = {} |
2475 | 157 | if 'path' in data: | 167 | if "path" in data: |
2476 | 158 | extra.update({'item_url': contentsource.url}) | 168 | extra.update({"item_url": contentsource.url}) |
2477 | 159 | if not self.config.get('item_skip_download', False): | 169 | if not self.config.get("item_skip_download", False): |
2478 | 160 | try: | 170 | try: |
2479 | 161 | (tmp_path, tmp_del) = util.get_local_copy(contentsource) | 171 | (tmp_path, tmp_del) = util.get_local_copy(contentsource) |
2481 | 162 | extra['path_local'] = tmp_path | 172 | extra["path_local"] = tmp_path |
2482 | 163 | finally: | 173 | finally: |
2483 | 164 | contentsource.close() | 174 | contentsource.close() |
2484 | 165 | 175 | ||
2485 | 166 | try: | 176 | try: |
2487 | 167 | ret = self.call_hook('insert_item', data=mdata, extra=extra) | 177 | ret = self.call_hook("insert_item", data=mdata, extra=extra) |
2488 | 168 | finally: | 178 | finally: |
2489 | 169 | if tmp_del and os.path.exists(tmp_path): | 179 | if tmp_del and os.path.exists(tmp_path): |
2490 | 170 | os.unlink(tmp_path) | 180 | os.unlink(tmp_path) |
2491 | 171 | return ret | 181 | return ret |
2492 | 172 | 182 | ||
2493 | 173 | def remove_product(self, data, src, target, pedigree): | 183 | def remove_product(self, data, src, target, pedigree): |
2496 | 174 | return self.call_hook('remove_product', | 184 | return self.call_hook( |
2497 | 175 | data=util.products_exdata(src, pedigree)) | 185 | "remove_product", data=util.products_exdata(src, pedigree) |
2498 | 186 | ) | ||
2499 | 176 | 187 | ||
2500 | 177 | def remove_version(self, data, src, target, pedigree): | 188 | def remove_version(self, data, src, target, pedigree): |
2503 | 178 | return self.call_hook('remove_version', | 189 | return self.call_hook( |
2504 | 179 | data=util.products_exdata(src, pedigree)) | 190 | "remove_version", data=util.products_exdata(src, pedigree) |
2505 | 191 | ) | ||
2506 | 180 | 192 | ||
2507 | 181 | def remove_item(self, data, src, target, pedigree): | 193 | def remove_item(self, data, src, target, pedigree): |
2510 | 182 | return self.call_hook('remove_item', | 194 | return self.call_hook( |
2511 | 183 | data=util.products_exdata(target, pedigree)) | 195 | "remove_item", data=util.products_exdata(target, pedigree) |
2512 | 196 | ) | ||
2513 | 184 | 197 | ||
2516 | 185 | def call_hook(self, hookname, data, capture=False, rcs=None, extra=None, | 198 | def call_hook( |
2517 | 186 | content=None): | 199 | self, hookname, data, capture=False, rcs=None, extra=None, content=None |
2518 | 200 | ): | ||
2519 | 187 | command = self.config.get(hookname, self.config.get(DEFAULT_HOOK_NAME)) | 201 | command = self.config.get(hookname, self.config.get(DEFAULT_HOOK_NAME)) |
2520 | 188 | if not command: | 202 | if not command: |
2521 | 189 | # return successful execution with no output | 203 | # return successful execution with no output |
2523 | 190 | return (0, '') | 204 | return (0, "") |
2524 | 191 | 205 | ||
2525 | 192 | if isinstance(command, str): | 206 | if isinstance(command, str): |
2527 | 193 | command = ['sh', '-c', command] | 207 | command = ["sh", "-c", command] |
2528 | 194 | 208 | ||
2529 | 195 | fdata = util.stringitems(data) | 209 | fdata = util.stringitems(data) |
2530 | 196 | 210 | ||
2531 | @@ -200,16 +214,20 @@ class CommandHookMirror(mirrors.BasicMirrorWriter): | |||
2532 | 200 | tfile = os.fdopen(tfd, "w") | 214 | tfile = os.fdopen(tfd, "w") |
2533 | 201 | tfile.write(content) | 215 | tfile.write(content) |
2534 | 202 | tfile.close() | 216 | tfile.close() |
2536 | 203 | fdata['content_file_path'] = content_file | 217 | fdata["content_file_path"] = content_file |
2537 | 204 | 218 | ||
2538 | 205 | if extra: | 219 | if extra: |
2539 | 206 | fdata.update(extra) | 220 | fdata.update(extra) |
2541 | 207 | fdata['HOOK'] = hookname | 221 | fdata["HOOK"] = hookname |
2542 | 208 | 222 | ||
2543 | 209 | try: | 223 | try: |
2547 | 210 | return call_hook(command=command, data=fdata, | 224 | return call_hook( |
2548 | 211 | unset=self.config.get('unset_value', None), | 225 | command=command, |
2549 | 212 | capture=capture, rcs=rcs) | 226 | data=fdata, |
2550 | 227 | unset=self.config.get("unset_value", None), | ||
2551 | 228 | capture=capture, | ||
2552 | 229 | rcs=rcs, | ||
2553 | 230 | ) | ||
2554 | 213 | finally: | 231 | finally: |
2555 | 214 | if content_file: | 232 | if content_file: |
2556 | 215 | os.unlink(content_file) | 233 | os.unlink(content_file) |
2557 | @@ -219,7 +237,7 @@ def call_hook(command, data, unset=None, capture=False, rcs=None): | |||
2558 | 219 | env = os.environ.copy() | 237 | env = os.environ.copy() |
2559 | 220 | data = data.copy() | 238 | data = data.copy() |
2560 | 221 | 239 | ||
2562 | 222 | data[ENV_FIELDS_NAME] = ' '.join([k for k in data if k != ENV_HOOK_NAME]) | 240 | data[ENV_FIELDS_NAME] = " ".join([k for k in data if k != ENV_HOOK_NAME]) |
2563 | 223 | 241 | ||
2564 | 224 | mcommand = render(command, data, unset=unset) | 242 | mcommand = render(command, data, unset=unset) |
2565 | 225 | 243 | ||
2566 | @@ -257,12 +275,12 @@ def load_product_output(output, content_id, fmt="serial_list"): | |||
2567 | 257 | 275 | ||
2568 | 258 | if fmt == "serial_list": | 276 | if fmt == "serial_list": |
2569 | 259 | # "line" format just is a list of serials that are present | 277 | # "line" format just is a list of serials that are present |
2571 | 260 | working = {'content_id': content_id, 'products': {}} | 278 | working = {"content_id": content_id, "products": {}} |
2572 | 261 | for line in output.splitlines(): | 279 | for line in output.splitlines(): |
2573 | 262 | (product_id, version) = line.split(None, 1) | 280 | (product_id, version) = line.split(None, 1) |
2577 | 263 | if product_id not in working['products']: | 281 | if product_id not in working["products"]: |
2578 | 264 | working['products'][product_id] = {'versions': {}} | 282 | working["products"][product_id] = {"versions": {}} |
2579 | 265 | working['products'][product_id]['versions'][version] = {} | 283 | working["products"][product_id]["versions"][version] = {} |
2580 | 266 | return working | 284 | return working |
2581 | 267 | 285 | ||
2582 | 268 | elif fmt == "json": | 286 | elif fmt == "json": |
2583 | @@ -293,10 +311,11 @@ def run_command(cmd, env=None, capture=False, rcs=None): | |||
2584 | 293 | raise subprocess.CalledProcessError(rc, cmd) | 311 | raise subprocess.CalledProcessError(rc, cmd) |
2585 | 294 | 312 | ||
2586 | 295 | if out is None: | 313 | if out is None: |
2588 | 296 | out = '' | 314 | out = "" |
2589 | 297 | elif isinstance(out, bytes): | 315 | elif isinstance(out, bytes): |
2591 | 298 | out = out.decode('utf-8') | 316 | out = out.decode("utf-8") |
2592 | 299 | 317 | ||
2593 | 300 | return (rc, out) | 318 | return (rc, out) |
2594 | 301 | 319 | ||
2595 | 320 | |||
2596 | 302 | # vi: ts=4 expandtab syntax=python | 321 | # vi: ts=4 expandtab syntax=python |
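To make the "serial_list" handling in load_product_output above concrete: each line of hook output is a product id and a version serial separated by whitespace, folded into a products tree with empty version leaves. An illustrative run mirroring the loop in the hunk above (the product ids and content_id are made up):

output = (
    "com.example:server:22.04:amd64 20230101\n"
    "com.example:server:22.04:amd64 20230215\n"
)

# Same parsing as load_product_output(fmt="serial_list"):
working = {"content_id": "com.example:download", "products": {}}
for line in output.splitlines():
    (product_id, version) = line.split(None, 1)
    if product_id not in working["products"]:
        working["products"][product_id] = {"versions": {}}
    working["products"][product_id]["versions"][version] = {}

# working now holds:
# {"content_id": "com.example:download",
#  "products": {"com.example:server:22.04:amd64":
#               {"versions": {"20230101": {}, "20230215": {}}}}}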
2597 | diff --git a/simplestreams/mirrors/glance.py b/simplestreams/mirrors/glance.py | |||
2598 | index b96f8eb..22e46e9 100644 | |||
2599 | --- a/simplestreams/mirrors/glance.py | |||
2600 | +++ b/simplestreams/mirrors/glance.py | |||
2601 | @@ -15,42 +15,47 @@ | |||
2602 | 15 | # You should have received a copy of the GNU Affero General Public License | 15 | # You should have received a copy of the GNU Affero General Public License |
2603 | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. |
2604 | 17 | 17 | ||
2605 | 18 | import simplestreams.filters as filters | ||
2606 | 19 | import simplestreams.mirrors as mirrors | ||
2607 | 20 | import simplestreams.util as util | ||
2608 | 21 | from simplestreams import checksum_util | ||
2609 | 22 | import simplestreams.openstack as openstack | ||
2610 | 23 | from simplestreams.log import LOG | ||
2611 | 24 | |||
2612 | 25 | import copy | ||
2613 | 26 | import collections | 18 | import collections |
2614 | 19 | import copy | ||
2615 | 27 | import errno | 20 | import errno |
2616 | 28 | import glanceclient | ||
2617 | 29 | import json | 21 | import json |
2618 | 30 | import os | 22 | import os |
2619 | 31 | import re | 23 | import re |
2620 | 32 | 24 | ||
2621 | 25 | import glanceclient | ||
2622 | 26 | |||
2623 | 27 | import simplestreams.filters as filters | ||
2624 | 28 | import simplestreams.mirrors as mirrors | ||
2625 | 29 | import simplestreams.openstack as openstack | ||
2626 | 30 | import simplestreams.util as util | ||
2627 | 31 | from simplestreams import checksum_util | ||
2628 | 32 | from simplestreams.log import LOG | ||
2629 | 33 | |||
2630 | 33 | 34 | ||
2632 | 34 | def get_glanceclient(version='1', **kwargs): | 35 | def get_glanceclient(version="1", **kwargs): |
2633 | 35 | # newer versions of the glanceclient will do this 'strip_version' for | 36 | # newer versions of the glanceclient will do this 'strip_version' for |
2634 | 36 | # us, but older versions do not. | 37 | # us, but older versions do not. |
2637 | 37 | kwargs['endpoint'] = _strip_version(kwargs['endpoint']) | 38 | kwargs["endpoint"] = _strip_version(kwargs["endpoint"]) |
2638 | 38 | pt = ('endpoint', 'token', 'insecure', 'cacert') | 39 | pt = ("endpoint", "token", "insecure", "cacert") |
2639 | 39 | kskw = {k: kwargs.get(k) for k in pt if k in kwargs} | 40 | kskw = {k: kwargs.get(k) for k in pt if k in kwargs} |
2642 | 40 | if kwargs.get('session'): | 41 | if kwargs.get("session"): |
2643 | 41 | sess = kwargs.get('session') | 42 | sess = kwargs.get("session") |
2644 | 42 | return glanceclient.Client(version, session=sess) | 43 | return glanceclient.Client(version, session=sess) |
2645 | 43 | else: | 44 | else: |
2646 | 44 | return glanceclient.Client(version, **kskw) | 45 | return glanceclient.Client(version, **kskw) |
2647 | 45 | 46 | ||
2648 | 46 | 47 | ||
2649 | 47 | def empty_iid_products(content_id): | 48 | def empty_iid_products(content_id): |
2652 | 48 | return {'content_id': content_id, 'products': {}, | 49 | return { |
2653 | 49 | 'datatype': 'image-ids', 'format': 'products:1.0'} | 50 | "content_id": content_id, |
2654 | 51 | "products": {}, | ||
2655 | 52 | "datatype": "image-ids", | ||
2656 | 53 | "format": "products:1.0", | ||
2657 | 54 | } | ||
2658 | 50 | 55 | ||
2659 | 51 | 56 | ||
2660 | 52 | def canonicalize_arch(arch): | 57 | def canonicalize_arch(arch): |
2662 | 53 | '''Canonicalize Ubuntu archs for use in OpenStack''' | 58 | """Canonicalize Ubuntu archs for use in OpenStack""" |
2663 | 54 | newarch = arch.lower() | 59 | newarch = arch.lower() |
2664 | 55 | if newarch == "amd64": | 60 | if newarch == "amd64": |
2665 | 56 | newarch = "x86_64" | 61 | newarch = "x86_64" |
2666 | @@ -68,21 +73,21 @@ def canonicalize_arch(arch): | |||
2667 | 68 | 73 | ||
2668 | 69 | 74 | ||
2669 | 70 | LXC_FTYPES = { | 75 | LXC_FTYPES = { |
2673 | 71 | 'root.tar.gz': 'root-tar', | 76 | "root.tar.gz": "root-tar", |
2674 | 72 | 'root.tar.xz': 'root-tar', | 77 | "root.tar.xz": "root-tar", |
2675 | 73 | 'squashfs': 'squashfs', | 78 | "squashfs": "squashfs", |
2676 | 74 | } | 79 | } |
2677 | 75 | 80 | ||
2678 | 76 | QEMU_FTYPES = { | 81 | QEMU_FTYPES = { |
2681 | 77 | 'disk.img': 'qcow2', | 82 | "disk.img": "qcow2", |
2682 | 78 | 'disk1.img': 'qcow2', | 83 | "disk1.img": "qcow2", |
2683 | 79 | } | 84 | } |
2684 | 80 | 85 | ||
2685 | 81 | 86 | ||
2686 | 82 | def disk_format(ftype): | 87 | def disk_format(ftype): |
2688 | 83 | '''Canonicalize disk formats for use in OpenStack. | 88 | """Canonicalize disk formats for use in OpenStack. |
2689 | 84 | Input ftype is a 'ftype' from a simplestream feed. | 89 | Input ftype is a 'ftype' from a simplestream feed. |
2691 | 85 | Return value is the appropriate 'disk_format' for glance.''' | 90 | Return value is the appropriate 'disk_format' for glance.""" |
2692 | 86 | newftype = ftype.lower() | 91 | newftype = ftype.lower() |
2693 | 87 | if newftype in LXC_FTYPES: | 92 | if newftype in LXC_FTYPES: |
2694 | 88 | return LXC_FTYPES[newftype] | 93 | return LXC_FTYPES[newftype] |
2695 | @@ -92,22 +97,22 @@ def disk_format(ftype): | |||
2696 | 92 | 97 | ||
2697 | 93 | 98 | ||
2698 | 94 | def hypervisor_type(ftype): | 99 | def hypervisor_type(ftype): |
2700 | 95 | '''Determine hypervisor type based on image format''' | 100 | """Determine hypervisor type based on image format""" |
2701 | 96 | newftype = ftype.lower() | 101 | newftype = ftype.lower() |
2702 | 97 | if newftype in LXC_FTYPES: | 102 | if newftype in LXC_FTYPES: |
2704 | 98 | return 'lxc' | 103 | return "lxc" |
2705 | 99 | if newftype in QEMU_FTYPES: | 104 | if newftype in QEMU_FTYPES: |
2707 | 100 | return 'qemu' | 105 | return "qemu" |
2708 | 101 | return None | 106 | return None |
2709 | 102 | 107 | ||
2710 | 103 | 108 | ||
2711 | 104 | def virt_type(hypervisor_type): | 109 | def virt_type(hypervisor_type): |
2713 | 105 | '''Map underlying hypervisor types into high level virt types''' | 110 | """Map underlying hypervisor types into high level virt types""" |
2714 | 106 | newhtype = hypervisor_type.lower() | 111 | newhtype = hypervisor_type.lower() |
2719 | 107 | if newhtype == 'qemu': | 112 | if newhtype == "qemu": |
2720 | 108 | return 'kvm' | 113 | return "kvm" |
2721 | 109 | if newhtype == 'lxc': | 114 | if newhtype == "lxc": |
2722 | 110 | return 'lxd' | 115 | return "lxd" |
2723 | 111 | return None | 116 | return None |
2724 | 112 | 117 | ||
2725 | 113 | 118 | ||
2726 | @@ -120,20 +125,29 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
2727 | 120 | `client` argument is used for testing to override openstack module: | 125 | `client` argument is used for testing to override openstack module: |
2728 | 121 | allows dependency injection of fake "openstack" module. | 126 | allows dependency injection of fake "openstack" module. |
2729 | 122 | """ | 127 | """ |
2733 | 123 | def __init__(self, config, objectstore=None, region=None, | 128 | |
2734 | 124 | name_prefix=None, progress_callback=None, | 129 | def __init__( |
2735 | 125 | client=None): | 130 | self, |
2736 | 131 | config, | ||
2737 | 132 | objectstore=None, | ||
2738 | 133 | region=None, | ||
2739 | 134 | name_prefix=None, | ||
2740 | 135 | progress_callback=None, | ||
2741 | 136 | client=None, | ||
2742 | 137 | ): | ||
2743 | 126 | super(GlanceMirror, self).__init__(config=config) | 138 | super(GlanceMirror, self).__init__(config=config) |
2744 | 127 | 139 | ||
2746 | 128 | self.item_filters = self.config.get('item_filters', []) | 140 | self.item_filters = self.config.get("item_filters", []) |
2747 | 129 | if len(self.item_filters) == 0: | 141 | if len(self.item_filters) == 0: |
2750 | 130 | self.item_filters = ['ftype~(disk1.img|disk.img)', | 142 | self.item_filters = [ |
2751 | 131 | 'arch~(x86_64|amd64|i386)'] | 143 | "ftype~(disk1.img|disk.img)", |
2752 | 144 | "arch~(x86_64|amd64|i386)", | ||
2753 | 145 | ] | ||
2754 | 132 | self.item_filters = filters.get_filters(self.item_filters) | 146 | self.item_filters = filters.get_filters(self.item_filters) |
2755 | 133 | 147 | ||
2757 | 134 | self.index_filters = self.config.get('index_filters', []) | 148 | self.index_filters = self.config.get("index_filters", []) |
2758 | 135 | if len(self.index_filters) == 0: | 149 | if len(self.index_filters) == 0: |
2760 | 136 | self.index_filters = ['datatype=image-downloads'] | 150 | self.index_filters = ["datatype=image-downloads"] |
2761 | 137 | self.index_filters = filters.get_filters(self.index_filters) | 151 | self.index_filters = filters.get_filters(self.index_filters) |
2762 | 138 | 152 | ||
2763 | 139 | self.loaded_content = {} | 153 | self.loaded_content = {} |
2764 | @@ -146,21 +160,28 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
2765 | 146 | 160 | ||
2766 | 147 | self.name_prefix = name_prefix or "" | 161 | self.name_prefix = name_prefix or "" |
2767 | 148 | if region is not None: | 162 | if region is not None: |
2769 | 149 | self.keystone_creds['region_name'] = region | 163 | self.keystone_creds["region_name"] = region |
2770 | 150 | 164 | ||
2771 | 151 | self.progress_callback = progress_callback | 165 | self.progress_callback = progress_callback |
2772 | 152 | 166 | ||
2773 | 153 | conn_info = client.get_service_conn_info( | 167 | conn_info = client.get_service_conn_info( |
2784 | 154 | 'image', **self.keystone_creds) | 168 | "image", **self.keystone_creds |
2785 | 155 | self.glance_api_version = conn_info['glance_version'] | 169 | ) |
2786 | 156 | self.gclient = get_glanceclient(version=self.glance_api_version, | 170 | self.glance_api_version = conn_info["glance_version"] |
2787 | 157 | **conn_info) | 171 | self.gclient = get_glanceclient( |
2788 | 158 | self.tenant_id = conn_info['tenant_id'] | 172 | version=self.glance_api_version, **conn_info |
2789 | 159 | 173 | ) | |
2790 | 160 | self.region = self.keystone_creds.get('region_name', 'nullregion') | 174 | self.tenant_id = conn_info["tenant_id"] |
2791 | 161 | self.cloudname = config.get("cloud_name", 'nullcloud') | 175 | |
2792 | 162 | self.crsn = '-'.join((self.cloudname, self.region,)) | 176 | self.region = self.keystone_creds.get("region_name", "nullregion") |
2793 | 163 | self.auth_url = self.keystone_creds['auth_url'] | 177 | self.cloudname = config.get("cloud_name", "nullcloud") |
2794 | 178 | self.crsn = "-".join( | ||
2795 | 179 | ( | ||
2796 | 180 | self.cloudname, | ||
2797 | 181 | self.region, | ||
2798 | 182 | ) | ||
2799 | 183 | ) | ||
2800 | 184 | self.auth_url = self.keystone_creds["auth_url"] | ||
2801 | 164 | 185 | ||
2802 | 165 | self.content_id = config.get("content_id") | 186 | self.content_id = config.get("content_id") |
2803 | 166 | self.modify_hook = config.get("modify_hook") | 187 | self.modify_hook = config.get("modify_hook") |
2804 | @@ -170,7 +191,7 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
2805 | 170 | raise TypeError("content_id is required") | 191 | raise TypeError("content_id is required") |
2806 | 171 | 192 | ||
2807 | 172 | self.custom_properties = collections.OrderedDict( | 193 | self.custom_properties = collections.OrderedDict( |
2809 | 173 | prop.split('=') for prop in config.get("custom_properties", []) | 194 | prop.split("=") for prop in config.get("custom_properties", []) |
2810 | 174 | ) | 195 | ) |
2811 | 175 | self.visibility = config.get("visibility", "public") | 196 | self.visibility = config.get("visibility", "public") |
2812 | 176 | 197 | ||
2813 | @@ -208,82 +229,100 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
2814 | 208 | for image in images: | 229 | for image in images: |
2815 | 209 | if self.glance_api_version == "1": | 230 | if self.glance_api_version == "1": |
2816 | 210 | image = image.to_dict() | 231 | image = image.to_dict() |
2818 | 211 | props = image['properties'] | 232 | props = image["properties"] |
2819 | 212 | else: | 233 | else: |
2820 | 213 | props = copy.deepcopy(image) | 234 | props = copy.deepcopy(image) |
2821 | 214 | 235 | ||
2823 | 215 | if image['owner'] != self.tenant_id: | 236 | if image["owner"] != self.tenant_id: |
2824 | 216 | continue | 237 | continue |
2825 | 217 | 238 | ||
2827 | 218 | if props.get('content_id') != my_cid: | 239 | if props.get("content_id") != my_cid: |
2828 | 219 | continue | 240 | continue |
2829 | 220 | 241 | ||
2833 | 221 | if image.get('status') != "active": | 242 | if image.get("status") != "active": |
2834 | 222 | LOG.warning("Ignoring inactive image %s with status '%s'" % ( | 243 | LOG.warning( |
2835 | 223 | image['id'], image.get('status'))) | 244 | "Ignoring inactive image %s with status '%s'" |
2836 | 245 | % (image["id"], image.get("status")) | ||
2837 | 246 | ) | ||
2838 | 224 | continue | 247 | continue |
2839 | 225 | 248 | ||
2841 | 226 | source_content_id = props.get('source_content_id') | 249 | source_content_id = props.get("source_content_id") |
2842 | 227 | 250 | ||
2846 | 228 | product = props.get('product_name') | 251 | product = props.get("product_name") |
2847 | 229 | version = props.get('version_name') | 252 | version = props.get("version_name") |
2848 | 230 | item = props.get('item_name') | 253 | item = props.get("item_name") |
2849 | 231 | if not (version and product and item and source_content_id): | 254 | if not (version and product and item and source_content_id): |
2851 | 232 | LOG.warning("%s missing required fields" % image['id']) | 255 | LOG.warning("%s missing required fields" % image["id"]) |
2852 | 233 | continue | 256 | continue |
2853 | 234 | 257 | ||
2854 | 235 | # get data from the datastore for this item, if it exists | 258 | # get data from the datastore for this item, if it exists |
2855 | 236 | # and then update that with glance data (just in case different) | 259 | # and then update that with glance data (just in case different) |
2856 | 237 | try: | 260 | try: |
2861 | 238 | item_data = util.products_exdata(store_t, | 261 | item_data = util.products_exdata( |
2862 | 239 | (product, version, item,), | 262 | store_t, |
2863 | 240 | include_top=False, | 263 | ( |
2864 | 241 | insert_fieldnames=False) | 264 | product, |
2865 | 265 | version, | ||
2866 | 266 | item, | ||
2867 | 267 | ), | ||
2868 | 268 | include_top=False, | ||
2869 | 269 | insert_fieldnames=False, | ||
2870 | 270 | ) | ||
2871 | 242 | except KeyError: | 271 | except KeyError: |
2872 | 243 | item_data = {} | 272 | item_data = {} |
2873 | 244 | 273 | ||
2874 | 245 | # If original simplestreams-metadata is stored on the image, | 274 | # If original simplestreams-metadata is stored on the image, |
2875 | 246 | # use that as well. | 275 | # use that as well. |
2877 | 247 | if 'simplestreams_metadata' in props: | 276 | if "simplestreams_metadata" in props: |
2878 | 248 | simplestreams_metadata = json.loads( | 277 | simplestreams_metadata = json.loads( |
2880 | 249 | props.get('simplestreams_metadata')) | 278 | props.get("simplestreams_metadata") |
2881 | 279 | ) | ||
2882 | 250 | else: | 280 | else: |
2883 | 251 | simplestreams_metadata = {} | 281 | simplestreams_metadata = {} |
2884 | 252 | item_data.update(simplestreams_metadata) | 282 | item_data.update(simplestreams_metadata) |
2885 | 253 | 283 | ||
2892 | 254 | item_data.update({'name': image['name'], 'id': image['id']}) | 284 | item_data.update({"name": image["name"], "id": image["id"]}) |
2893 | 255 | if 'owner_id' not in item_data: | 285 | if "owner_id" not in item_data: |
2894 | 256 | item_data['owner_id'] = self.tenant_id | 286 | item_data["owner_id"] = self.tenant_id |
2895 | 257 | 287 | ||
2896 | 258 | util.products_set(glance_t, item_data, | 288 | util.products_set( |
2897 | 259 | (product, version, item,)) | 289 | glance_t, |
2898 | 290 | item_data, | ||
2899 | 291 | ( | ||
2900 | 292 | product, | ||
2901 | 293 | version, | ||
2902 | 294 | item, | ||
2903 | 295 | ), | ||
2904 | 296 | ) | ||
2905 | 260 | 297 | ||
2909 | 261 | for product in glance_t['products']: | 298 | for product in glance_t["products"]: |
2910 | 262 | glance_t['products'][product]['region'] = self.region | 299 | glance_t["products"][product]["region"] = self.region |
2911 | 263 | glance_t['products'][product]['endpoint'] = self.auth_url | 300 | glance_t["products"][product]["endpoint"] = self.auth_url |
2912 | 264 | 301 | ||
2913 | 265 | return glance_t | 302 | return glance_t |
2914 | 266 | 303 | ||
2915 | 267 | def filter_item(self, data, src, target, pedigree): | 304 | def filter_item(self, data, src, target, pedigree): |
2916 | 268 | return filters.filter_item(self.item_filters, data, src, pedigree) | 305 | return filters.filter_item(self.item_filters, data, src, pedigree) |
2917 | 269 | 306 | ||
2920 | 270 | def create_glance_properties(self, content_id, source_content_id, | 307 | def create_glance_properties( |
2921 | 271 | image_metadata, hypervisor_mapping): | 308 | self, content_id, source_content_id, image_metadata, hypervisor_mapping |
2922 | 309 | ): | ||
2923 | 272 | """ | 310 | """ |
2924 | 273 | Construct extra properties to store in Glance for an image. | 311 | Construct extra properties to store in Glance for an image. |
2925 | 274 | 312 | ||
2926 | 275 | Based on source image metadata. | 313 | Based on source image metadata. |
2927 | 276 | """ | 314 | """ |
2928 | 277 | properties = { | 315 | properties = { |
2931 | 278 | 'content_id': content_id, | 316 | "content_id": content_id, |
2932 | 279 | 'source_content_id': source_content_id, | 317 | "source_content_id": source_content_id, |
2933 | 280 | } | 318 | } |
2934 | 281 | # An iterator of properties to carry over: if a property needs | 319 | # An iterator of properties to carry over: if a property needs |
2935 | 282 | # renaming, uses a tuple (old name, new name). | 320 | # renaming, uses a tuple (old name, new name). |
2938 | 283 | carry_over_simple = ( | 321 | carry_over_simple = ("product_name", "version_name", "item_name") |
2937 | 284 | 'product_name', 'version_name', 'item_name') | ||
2939 | 285 | carry_over = carry_over_simple + ( | 322 | carry_over = carry_over_simple + ( |
2941 | 286 | ('os', 'os_distro'), ('version', 'os_version')) | 323 | ("os", "os_distro"), |
2942 | 324 | ("version", "os_version"), | ||
2943 | 325 | ) | ||
2944 | 287 | for carry_over_property in carry_over: | 326 | for carry_over_property in carry_over: |
2945 | 288 | if isinstance(carry_over_property, tuple): | 327 | if isinstance(carry_over_property, tuple): |
2946 | 289 | name_old, name_new = carry_over_property | 328 | name_old, name_new = carry_over_property |
2947 | @@ -291,33 +330,41 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
2948 | 291 | name_old = name_new = carry_over_property | 330 | name_old = name_new = carry_over_property |
2949 | 292 | properties[name_new] = image_metadata.get(name_old) | 331 | properties[name_new] = image_metadata.get(name_old) |
2950 | 293 | 332 | ||
2954 | 294 | if 'arch' in image_metadata: | 333 | if "arch" in image_metadata: |
2955 | 295 | properties['architecture'] = canonicalize_arch( | 334 | properties["architecture"] = canonicalize_arch( |
2956 | 296 | image_metadata['arch']) | 335 | image_metadata["arch"] |
2957 | 336 | ) | ||
2958 | 297 | 337 | ||
2961 | 298 | if hypervisor_mapping and 'ftype' in image_metadata: | 338 | if hypervisor_mapping and "ftype" in image_metadata: |
2962 | 299 | _hypervisor_type = hypervisor_type(image_metadata['ftype']) | 339 | _hypervisor_type = hypervisor_type(image_metadata["ftype"]) |
2963 | 300 | if _hypervisor_type: | 340 | if _hypervisor_type: |
2965 | 301 | properties['hypervisor_type'] = _hypervisor_type | 341 | properties["hypervisor_type"] = _hypervisor_type |
2966 | 302 | 342 | ||
2967 | 303 | properties.update(self.custom_properties) | 343 | properties.update(self.custom_properties) |
2968 | 304 | 344 | ||
2969 | 305 | if self.set_latest_property: | 345 | if self.set_latest_property: |
2971 | 306 | properties['latest'] = "true" | 346 | properties["latest"] = "true" |
2972 | 307 | 347 | ||
2973 | 308 | # Store flattened metadata for a source image along with the | 348 | # Store flattened metadata for a source image along with the |
2974 | 309 | # image in 'simplestreams_metadata' property. | 349 | # image in 'simplestreams_metadata' property. |
2975 | 310 | simplestreams_metadata = image_metadata.copy() | 350 | simplestreams_metadata = image_metadata.copy() |
2977 | 311 | drop_keys = carry_over_simple + ('path',) | 351 | drop_keys = carry_over_simple + ("path",) |
2978 | 312 | for remove_key in drop_keys: | 352 | for remove_key in drop_keys: |
2979 | 313 | if remove_key in simplestreams_metadata: | 353 | if remove_key in simplestreams_metadata: |
2980 | 314 | del simplestreams_metadata[remove_key] | 354 | del simplestreams_metadata[remove_key] |
2983 | 315 | properties['simplestreams_metadata'] = json.dumps( | 355 | properties["simplestreams_metadata"] = json.dumps( |
2984 | 316 | simplestreams_metadata, sort_keys=True) | 356 | simplestreams_metadata, sort_keys=True |
2985 | 357 | ) | ||
2986 | 317 | return properties | 358 | return properties |
2987 | 318 | 359 | ||
2990 | 319 | def prepare_glance_arguments(self, full_image_name, image_metadata, | 360 | def prepare_glance_arguments( |
2991 | 320 | image_md5_hash, image_size, image_properties): | 361 | self, |
2992 | 362 | full_image_name, | ||
2993 | 363 | image_metadata, | ||
2994 | 364 | image_md5_hash, | ||
2995 | 365 | image_size, | ||
2996 | 366 | image_properties, | ||
2997 | 367 | ): | ||
2998 | 321 | """ | 368 | """ |
2999 | 322 | Prepare arguments to pass into Glance image creation method. | 369 | Prepare arguments to pass into Glance image creation method. |
3000 | 323 | 370 | ||
3001 | @@ -334,37 +381,39 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3002 | 334 | GlanceClient.images.create(). | 381 | GlanceClient.images.create(). |
3003 | 335 | """ | 382 | """ |
3004 | 336 | create_kwargs = { | 383 | create_kwargs = { |
3009 | 337 | 'name': full_image_name, | 384 | "name": full_image_name, |
3010 | 338 | 'container_format': 'bare', | 385 | "container_format": "bare", |
3011 | 339 | 'is_public': self.visibility == 'public', | 386 | "is_public": self.visibility == "public", |
3012 | 340 | 'properties': image_properties, | 387 | "properties": image_properties, |
3013 | 341 | } | 388 | } |
3014 | 342 | 389 | ||
3015 | 343 | # In v2 is_public=True is visibility='public' | 390 | # In v2 is_public=True is visibility='public' |
3016 | 344 | if self.glance_api_version == "2": | 391 | if self.glance_api_version == "2": |
3019 | 345 | del create_kwargs['is_public'] | 392 | del create_kwargs["is_public"] |
3020 | 346 | create_kwargs['visibility'] = self.visibility | 393 | create_kwargs["visibility"] = self.visibility |
3021 | 347 | 394 | ||
3022 | 348 | # v2 automatically calculates size and checksum | 395 | # v2 automatically calculates size and checksum |
3023 | 349 | if self.glance_api_version == "1": | 396 | if self.glance_api_version == "1": |
3028 | 350 | if 'size' in image_metadata: | 397 | if "size" in image_metadata: |
3029 | 351 | create_kwargs['size'] = int(image_metadata.get('size')) | 398 | create_kwargs["size"] = int(image_metadata.get("size")) |
3030 | 352 | if 'md5' in image_metadata: | 399 | if "md5" in image_metadata: |
3031 | 353 | create_kwargs['checksum'] = image_metadata.get('md5') | 400 | create_kwargs["checksum"] = image_metadata.get("md5") |
3032 | 354 | if image_md5_hash and image_size: | 401 | if image_md5_hash and image_size: |
3037 | 355 | create_kwargs.update({ | 402 | create_kwargs.update( |
3038 | 356 | 'checksum': image_md5_hash, | 403 | { |
3039 | 357 | 'size': image_size, | 404 | "checksum": image_md5_hash, |
3040 | 358 | }) | 405 | "size": image_size, |
3041 | 406 | } | ||
3042 | 407 | ) | ||
3043 | 359 | 408 | ||
3044 | 360 | if self.image_import_conversion: | 409 | if self.image_import_conversion: |
3049 | 361 | create_kwargs['disk_format'] = 'raw' | 410 | create_kwargs["disk_format"] = "raw" |
3050 | 362 | elif 'ftype' in image_metadata: | 411 | elif "ftype" in image_metadata: |
3051 | 363 | create_kwargs['disk_format'] = ( | 412 | create_kwargs["disk_format"] = ( |
3052 | 364 | disk_format(image_metadata['ftype']) or 'qcow2' | 413 | disk_format(image_metadata["ftype"]) or "qcow2" |
3053 | 365 | ) | 414 | ) |
3054 | 366 | else: | 415 | else: |
3056 | 367 | create_kwargs['disk_format'] = 'qcow2' | 416 | create_kwargs["disk_format"] = "qcow2" |
3057 | 368 | 417 | ||
3058 | 369 | return create_kwargs | 418 | return create_kwargs |
3059 | 370 | 419 | ||
3060 | @@ -378,37 +427,51 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3061 | 378 | Returns a tuple of | 427 | Returns a tuple of |
3062 | 379 | (str(local-image-path), int(image-size), str(image-md5-hash)). | 428 | (str(local-image-path), int(image-size), str(image-md5-hash)). |
3063 | 380 | """ | 429 | """ |
3066 | 381 | image_name = image_stream_data.get('pubname') | 430 | image_name = image_stream_data.get("pubname") |
3067 | 382 | image_size = image_stream_data.get('size') | 431 | image_size = image_stream_data.get("size") |
3068 | 383 | 432 | ||
3069 | 384 | if self.progress_callback: | 433 | if self.progress_callback: |
3070 | 434 | |||
3071 | 385 | def progress_wrapper(written): | 435 | def progress_wrapper(written): |
3072 | 386 | self.progress_callback( | 436 | self.progress_callback( |
3076 | 387 | dict(status="Downloading", name=image_name, | 437 | dict( |
3077 | 388 | size=None if image_size is None else int(image_size), | 438 | status="Downloading", |
3078 | 389 | written=written)) | 439 | name=image_name, |
3079 | 440 | size=None if image_size is None else int(image_size), | ||
3080 | 441 | written=written, | ||
3081 | 442 | ) | ||
3082 | 443 | ) | ||
3083 | 444 | |||
3084 | 390 | else: | 445 | else: |
3085 | 446 | |||
3086 | 391 | def progress_wrapper(written): | 447 | def progress_wrapper(written): |
3087 | 392 | pass | 448 | pass |
3088 | 393 | 449 | ||
3089 | 394 | try: | 450 | try: |
3090 | 395 | tmp_path, _ = util.get_local_copy( | 451 | tmp_path, _ = util.get_local_copy( |
3092 | 396 | contentsource, progress_callback=progress_wrapper) | 452 | contentsource, progress_callback=progress_wrapper |
3093 | 453 | ) | ||
3094 | 397 | 454 | ||
3095 | 398 | if self.modify_hook: | 455 | if self.modify_hook: |
3096 | 399 | (new_size, new_md5) = call_hook( | 456 | (new_size, new_md5) = call_hook( |
3099 | 400 | item=image_stream_data, path=tmp_path, | 457 | item=image_stream_data, path=tmp_path, cmd=self.modify_hook |
3100 | 401 | cmd=self.modify_hook) | 458 | ) |
3101 | 402 | else: | 459 | else: |
3102 | 403 | new_size = os.path.getsize(tmp_path) | 460 | new_size = os.path.getsize(tmp_path) |
3104 | 404 | new_md5 = image_stream_data.get('md5') | 461 | new_md5 = image_stream_data.get("md5") |
3105 | 405 | finally: | 462 | finally: |
3106 | 406 | contentsource.close() | 463 | contentsource.close() |
3107 | 407 | 464 | ||
3108 | 408 | return tmp_path, new_size, new_md5 | 465 | return tmp_path, new_size, new_md5 |
3109 | 409 | 466 | ||
3112 | 410 | def adapt_source_entry(self, source_entry, hypervisor_mapping, image_name, | 467 | def adapt_source_entry( |
3113 | 411 | image_md5_hash, image_size): | 468 | self, |
3114 | 469 | source_entry, | ||
3115 | 470 | hypervisor_mapping, | ||
3116 | 471 | image_name, | ||
3117 | 472 | image_md5_hash, | ||
3118 | 473 | image_size, | ||
3119 | 474 | ): | ||
3120 | 412 | """ | 475 | """ |
3121 | 413 | Adapts the source simplestreams dict `source_entry` for use in the | 476 | Adapts the source simplestreams dict `source_entry` for use in the |
3122 | 414 | generated local simplestreams index. | 477 | generated local simplestreams index. |
3123 | @@ -416,26 +479,30 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3124 | 416 | output_entry = source_entry.copy() | 479 | output_entry = source_entry.copy() |
3125 | 417 | 480 | ||
3126 | 418 | # Drop attributes not needed for the simplestreams index itself. | 481 | # Drop attributes not needed for the simplestreams index itself. |
3129 | 419 | for property_name in ('path', 'product_name', 'version_name', | 482 | for property_name in ( |
3130 | 420 | 'item_name'): | 483 | "path", |
3131 | 484 | "product_name", | ||
3132 | 485 | "version_name", | ||
3133 | 486 | "item_name", | ||
3134 | 487 | ): | ||
3135 | 421 | if property_name in output_entry: | 488 | if property_name in output_entry: |
3136 | 422 | del output_entry[property_name] | 489 | del output_entry[property_name] |
3137 | 423 | 490 | ||
3140 | 424 | if hypervisor_mapping and 'ftype' in output_entry: | 491 | if hypervisor_mapping and "ftype" in output_entry: |
3141 | 425 | _hypervisor_type = hypervisor_type(output_entry['ftype']) | 492 | _hypervisor_type = hypervisor_type(output_entry["ftype"]) |
3142 | 426 | if _hypervisor_type: | 493 | if _hypervisor_type: |
3143 | 427 | _virt_type = virt_type(_hypervisor_type) | 494 | _virt_type = virt_type(_hypervisor_type) |
3144 | 428 | if _virt_type: | 495 | if _virt_type: |
3146 | 429 | output_entry['virt'] = _virt_type | 496 | output_entry["virt"] = _virt_type |
3147 | 430 | 497 | ||
3151 | 431 | output_entry['region'] = self.region | 498 | output_entry["region"] = self.region |
3152 | 432 | output_entry['endpoint'] = self.auth_url | 499 | output_entry["endpoint"] = self.auth_url |
3153 | 433 | output_entry['owner_id'] = self.tenant_id | 500 | output_entry["owner_id"] = self.tenant_id |
3154 | 434 | 501 | ||
3156 | 435 | output_entry['name'] = image_name | 502 | output_entry["name"] = image_name |
3157 | 436 | if image_md5_hash and image_size: | 503 | if image_md5_hash and image_size: |
3160 | 437 | output_entry['md5'] = image_md5_hash | 504 | output_entry["md5"] = image_md5_hash |
3161 | 438 | output_entry['size'] = str(image_size) | 505 | output_entry["size"] = str(image_size) |
3162 | 439 | 506 | ||
3163 | 440 | return output_entry | 507 | return output_entry |
3164 | 441 | 508 | ||
3165 | @@ -459,58 +526,74 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3166 | 459 | # (product-name, version-name, image-type) | 526 | # (product-name, version-name, image-type) |
3167 | 460 | # from the tuple `pedigree` in the source simplestreams index. | 527 | # from the tuple `pedigree` in the source simplestreams index. |
3168 | 461 | flattened_img_data = util.products_exdata( | 528 | flattened_img_data = util.products_exdata( |
3170 | 462 | src, pedigree, include_top=False) | 529 | src, pedigree, include_top=False |
3171 | 530 | ) | ||
3172 | 463 | 531 | ||
3173 | 464 | tmp_path = None | 532 | tmp_path = None |
3174 | 465 | 533 | ||
3175 | 466 | full_image_name = "{}{}".format( | 534 | full_image_name = "{}{}".format( |
3176 | 467 | self.name_prefix, | 535 | self.name_prefix, |
3180 | 468 | flattened_img_data.get('pubname', flattened_img_data.get('name'))) | 536 | flattened_img_data.get("pubname", flattened_img_data.get("name")), |
3181 | 469 | if not full_image_name.endswith(flattened_img_data['item_name']): | 537 | ) |
3182 | 470 | full_image_name += "-{}".format(flattened_img_data['item_name']) | 538 | if not full_image_name.endswith(flattened_img_data["item_name"]): |
3183 | 539 | full_image_name += "-{}".format(flattened_img_data["item_name"]) | ||
3184 | 471 | 540 | ||
3185 | 472 | # Download images locally into a temporary file. | 541 | # Download images locally into a temporary file. |
3186 | 473 | tmp_path, new_size, new_md5 = self.download_image( | 542 | tmp_path, new_size, new_md5 = self.download_image( |
3188 | 474 | contentsource, flattened_img_data) | 543 | contentsource, flattened_img_data |
3189 | 544 | ) | ||
3190 | 475 | 545 | ||
3192 | 476 | hypervisor_mapping = self.config.get('hypervisor_mapping', False) | 546 | hypervisor_mapping = self.config.get("hypervisor_mapping", False) |
3193 | 477 | 547 | ||
3194 | 478 | glance_props = self.create_glance_properties( | 548 | glance_props = self.create_glance_properties( |
3197 | 479 | target['content_id'], src['content_id'], flattened_img_data, | 549 | target["content_id"], |
3198 | 480 | hypervisor_mapping) | 550 | src["content_id"], |
3199 | 551 | flattened_img_data, | ||
3200 | 552 | hypervisor_mapping, | ||
3201 | 553 | ) | ||
3202 | 481 | LOG.debug("glance properties %s", glance_props) | 554 | LOG.debug("glance properties %s", glance_props) |
3203 | 482 | create_kwargs = self.prepare_glance_arguments( | 555 | create_kwargs = self.prepare_glance_arguments( |
3206 | 483 | full_image_name, flattened_img_data, new_md5, new_size, | 556 | full_image_name, |
3207 | 484 | glance_props) | 557 | flattened_img_data, |
3208 | 558 | new_md5, | ||
3209 | 559 | new_size, | ||
3210 | 560 | glance_props, | ||
3211 | 561 | ) | ||
3212 | 485 | 562 | ||
3213 | 486 | target_sstream_item = self.adapt_source_entry( | 563 | target_sstream_item = self.adapt_source_entry( |
3216 | 487 | flattened_img_data, hypervisor_mapping, full_image_name, new_md5, | 564 | flattened_img_data, |
3217 | 488 | new_size) | 565 | hypervisor_mapping, |
3218 | 566 | full_image_name, | ||
3219 | 567 | new_md5, | ||
3220 | 568 | new_size, | ||
3221 | 569 | ) | ||
3222 | 489 | 570 | ||
3223 | 490 | try: | 571 | try: |
3224 | 491 | if self.glance_api_version == "1": | 572 | if self.glance_api_version == "1": |
3225 | 492 | # Set data as string if v1 | 573 | # Set data as string if v1 |
3227 | 493 | create_kwargs['data'] = open(tmp_path, 'rb') | 574 | create_kwargs["data"] = open(tmp_path, "rb") |
3228 | 494 | else: | 575 | else: |
3229 | 495 | # Keep properties for v2 update call | 576 | # Keep properties for v2 update call |
3232 | 496 | _properties = create_kwargs['properties'] | 577 | _properties = create_kwargs["properties"] |
3233 | 497 | del create_kwargs['properties'] | 578 | del create_kwargs["properties"] |
3234 | 498 | 579 | ||
3235 | 499 | LOG.debug("glance create_kwargs %s", create_kwargs) | 580 | LOG.debug("glance create_kwargs %s", create_kwargs) |
3236 | 500 | glance_image = self.gclient.images.create(**create_kwargs) | 581 | glance_image = self.gclient.images.create(**create_kwargs) |
3238 | 501 | target_sstream_item['id'] = glance_image.id | 582 | target_sstream_item["id"] = glance_image.id |
3239 | 502 | 583 | ||
3240 | 503 | if self.glance_api_version == "2": | 584 | if self.glance_api_version == "2": |
3241 | 504 | if self.image_import_conversion: | 585 | if self.image_import_conversion: |
3242 | 505 | # Stage the image before starting import | 586 | # Stage the image before starting import |
3245 | 506 | self.gclient.images.stage(glance_image.id, | 587 | self.gclient.images.stage( |
3246 | 507 | open(tmp_path, 'rb')) | 588 | glance_image.id, open(tmp_path, "rb") |
3247 | 589 | ) | ||
3248 | 508 | # Import the Glance image | 590 | # Import the Glance image |
3249 | 509 | self.gclient.images.image_import(glance_image.id) | 591 | self.gclient.images.image_import(glance_image.id) |
3250 | 510 | else: | 592 | else: |
3251 | 511 | # Upload for v2 | 593 | # Upload for v2 |
3254 | 512 | self.gclient.images.upload(glance_image.id, | 594 | self.gclient.images.upload( |
3255 | 513 | open(tmp_path, 'rb')) | 595 | glance_image.id, open(tmp_path, "rb") |
3256 | 596 | ) | ||
3257 | 514 | # Update properties for v2 | 597 | # Update properties for v2 |
3258 | 515 | self.gclient.images.update(glance_image.id, **_properties) | 598 | self.gclient.images.update(glance_image.id, **_properties) |
3259 | 516 | 599 | ||
3260 | @@ -527,15 +610,19 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3261 | 527 | # self.load_products() instead | 610 | # self.load_products() instead |
3262 | 528 | if self.set_latest_property: | 611 | if self.set_latest_property: |
3263 | 529 | # Search all images with the same target attributes | 612 | # Search all images with the same target attributes |
3268 | 530 | _filter_properties = {'filters': { | 613 | _filter_properties = { |
3269 | 531 | 'latest': 'true', | 614 | "filters": { |
3270 | 532 | 'os_version': glance_props['os_version'], | 615 | "latest": "true", |
3271 | 533 | 'architecture': glance_props['architecture']}} | 616 | "os_version": glance_props["os_version"], |
3272 | 617 | "architecture": glance_props["architecture"], | ||
3273 | 618 | } | ||
3274 | 619 | } | ||
3275 | 534 | images = self.gclient.images.list(**_filter_properties) | 620 | images = self.gclient.images.list(**_filter_properties) |
3276 | 535 | for image in images: | 621 | for image in images: |
3277 | 536 | if image.id != glance_image.id: | 622 | if image.id != glance_image.id: |
3280 | 537 | self.gclient.images.update(image.id, | 623 | self.gclient.images.update( |
3281 | 538 | remove_props=['latest']) | 624 | image.id, remove_props=["latest"] |
3282 | 625 | ) | ||
3283 | 539 | 626 | ||
3284 | 540 | finally: | 627 | finally: |
3285 | 541 | if tmp_path and os.path.exists(tmp_path): | 628 | if tmp_path and os.path.exists(tmp_path): |
3286 | @@ -562,10 +649,10 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3287 | 562 | found = self.gclient.images.get(image_id) | 649 | found = self.gclient.images.get(image_id) |
3288 | 563 | if found.size == size and found.checksum == checksum: | 650 | if found.size == size and found.checksum == checksum: |
3289 | 564 | return | 651 | return |
3294 | 565 | msg = ( | 652 | msg = ("Invalid glance image: %s. " % image_id) + ( |
3295 | 566 | ("Invalid glance image: %s. " % image_id) + | 653 | "Expected size=%s md5=%s. Found size=%s md5=%s." |
3296 | 567 | ("Expected size=%s md5=%s. Found size=%s md5=%s." % | 654 | % (size, checksum, found.size, found.checksum) |
3297 | 568 | (size, checksum, found.size, found.checksum))) | 655 | ) |
3298 | 569 | if delete: | 656 | if delete: |
3299 | 570 | LOG.warning("Deleting image %s: %s", image_id, msg) | 657 | LOG.warning("Deleting image %s: %s", image_id, msg) |
3300 | 571 | self.gclient.images.delete(image_id) | 658 | self.gclient.images.delete(image_id) |
3301 | @@ -587,13 +674,15 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3302 | 587 | if version_name not in self.inserts[product_name]: | 674 | if version_name not in self.inserts[product_name]: |
3303 | 588 | self.inserts[product_name][version_name] = {} | 675 | self.inserts[product_name][version_name] = {} |
3304 | 589 | 676 | ||
3307 | 590 | if 'ftype' in data: | 677 | if "ftype" in data: |
3308 | 591 | ftype = data['ftype'] | 678 | ftype = data["ftype"] |
3309 | 592 | else: | 679 | else: |
3310 | 593 | flat = util.products_exdata(src, pedigree, include_top=False) | 680 | flat = util.products_exdata(src, pedigree, include_top=False) |
3312 | 594 | ftype = flat.get('ftype') | 681 | ftype = flat.get("ftype") |
3313 | 595 | self.inserts[product_name][version_name][item_name] = ( | 682 | self.inserts[product_name][version_name][item_name] = ( |
3315 | 596 | ftype, (data, src, target, pedigree, contentsource)) | 683 | ftype, |
3316 | 684 | (data, src, target, pedigree, contentsource), | ||
3317 | 685 | ) | ||
3318 | 597 | 686 | ||
3319 | 598 | def insert_version(self, data, src, target, pedigree): | 687 | def insert_version(self, data, src, target, pedigree): |
3320 | 599 | """Upload all images for this version into glance | 688 | """Upload all images for this version into glance |
3321 | @@ -605,14 +694,20 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3322 | 605 | product_name, version_name = pedigree | 694 | product_name, version_name = pedigree |
3323 | 606 | inserts = self.inserts.get(product_name, {}).get(version_name, []) | 695 | inserts = self.inserts.get(product_name, {}).get(version_name, []) |
3324 | 607 | 696 | ||
3327 | 608 | rtar_names = [f for f in inserts | 697 | rtar_names = [ |
3328 | 609 | if inserts[f][0] in ('root.tar.gz', 'root.tar.xz')] | 698 | f |
3329 | 699 | for f in inserts | ||
3330 | 700 | if inserts[f][0] in ("root.tar.gz", "root.tar.xz") | ||
3331 | 701 | ] | ||
3332 | 610 | 702 | ||
3333 | 611 | for _iname, (ftype, iargs) in inserts.items(): | 703 | for _iname, (ftype, iargs) in inserts.items(): |
3334 | 612 | if ftype == "squashfs" and rtar_names: | 704 | if ftype == "squashfs" and rtar_names: |
3338 | 613 | LOG.info("[%s] Skipping ftype 'squashfs' image in preference" | 705 | LOG.info( |
3339 | 614 | "for root tarball type in %s", | 706 | "[%s] Skipping ftype 'squashfs' image in preference" |
3340 | 615 | '/'.join(pedigree), rtar_names) | 707 | "for root tarball type in %s", |
3341 | 708 | "/".join(pedigree), | ||
3342 | 709 | rtar_names, | ||
3343 | 710 | ) | ||
3344 | 616 | continue | 711 | continue |
3345 | 617 | self._insert_item(*iargs) | 712 | self._insert_item(*iargs) |
3346 | 618 | 713 | ||
3347 | @@ -622,9 +717,9 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3348 | 622 | 717 | ||
3349 | 623 | def remove_item(self, data, src, target, pedigree): | 718 | def remove_item(self, data, src, target, pedigree): |
3350 | 624 | util.products_del(target, pedigree) | 719 | util.products_del(target, pedigree) |
3354 | 625 | if 'id' in data: | 720 | if "id" in data: |
3355 | 626 | print("removing %s: %s" % (data['id'], data['name'])) | 721 | print("removing %s: %s" % (data["id"], data["name"])) |
3356 | 627 | self.gclient.images.delete(data['id']) | 722 | self.gclient.images.delete(data["id"]) |
3357 | 628 | 723 | ||
3358 | 629 | def filter_index_entry(self, data, src, pedigree): | 724 | def filter_index_entry(self, data, src, pedigree): |
3359 | 630 | return filters.filter_dict(self.index_filters, data) | 725 | return filters.filter_dict(self.index_filters, data) |
3360 | @@ -637,18 +732,18 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3361 | 637 | util.products_prune(tree, preserve_empty_products=True) | 732 | util.products_prune(tree, preserve_empty_products=True) |
3362 | 638 | 733 | ||
3363 | 639 | # stop these items from copying up when we call condense | 734 | # stop these items from copying up when we call condense |
3365 | 640 | sticky = ['ftype', 'md5', 'sha256', 'size', 'name', 'id'] | 735 | sticky = ["ftype", "md5", "sha256", "size", "name", "id"] |
3366 | 641 | 736 | ||
3367 | 642 | # LP: #1329805. Juju expects these on the item. | 737 | # LP: #1329805. Juju expects these on the item. |
3370 | 643 | if self.config.get('sticky_endpoint_region', True): | 738 | if self.config.get("sticky_endpoint_region", True): |
3371 | 644 | sticky += ['endpoint', 'region'] | 739 | sticky += ["endpoint", "region"] |
3372 | 645 | 740 | ||
3373 | 646 | util.products_condense(tree, sticky=sticky) | 741 | util.products_condense(tree, sticky=sticky) |
3374 | 647 | 742 | ||
3375 | 648 | tsnow = util.timestamp() | 743 | tsnow = util.timestamp() |
3377 | 649 | tree['updated'] = tsnow | 744 | tree["updated"] = tsnow |
3378 | 650 | 745 | ||
3380 | 651 | dpath = self._cidpath(tree['content_id']) | 746 | dpath = self._cidpath(tree["content_id"]) |
3381 | 652 | LOG.info("writing data: %s", dpath) | 747 | LOG.info("writing data: %s", dpath) |
3382 | 653 | self.store.insert_content(dpath, util.dump_data(tree)) | 748 | self.store.insert_content(dpath, util.dump_data(tree)) |
3383 | 654 | 749 | ||
3384 | @@ -659,17 +754,20 @@ class GlanceMirror(mirrors.BasicMirrorWriter): | |||
3385 | 659 | except IOError as exc: | 754 | except IOError as exc: |
3386 | 660 | if exc.errno != errno.ENOENT: | 755 | if exc.errno != errno.ENOENT: |
3387 | 661 | raise | 756 | raise |
3399 | 662 | index = {"index": {}, 'format': 'index:1.0', | 757 | index = { |
3400 | 663 | 'updated': util.timestamp()} | 758 | "index": {}, |
3401 | 664 | 759 | "format": "index:1.0", | |
3402 | 665 | index['index'][tree['content_id']] = { | 760 | "updated": util.timestamp(), |
3403 | 666 | 'updated': tsnow, | 761 | } |
3404 | 667 | 'datatype': 'image-ids', | 762 | |
3405 | 668 | 'clouds': [{'region': self.region, 'endpoint': self.auth_url}], | 763 | index["index"][tree["content_id"]] = { |
3406 | 669 | 'cloudname': self.cloudname, | 764 | "updated": tsnow, |
3407 | 670 | 'path': dpath, | 765 | "datatype": "image-ids", |
3408 | 671 | 'products': list(tree['products'].keys()), | 766 | "clouds": [{"region": self.region, "endpoint": self.auth_url}], |
3409 | 672 | 'format': tree['format'], | 767 | "cloudname": self.cloudname, |
3410 | 768 | "path": dpath, | ||
3411 | 769 | "products": list(tree["products"].keys()), | ||
3412 | 770 | "format": tree["format"], | ||
3413 | 673 | } | 771 | } |
3414 | 674 | LOG.info("writing data: %s", ipath) | 772 | LOG.info("writing data: %s", ipath) |
3415 | 675 | self.store.insert_content(ipath, util.dump_data(index)) | 773 | self.store.insert_content(ipath, util.dump_data(index)) |
3416 | @@ -694,13 +792,13 @@ class ItemInfoDryRunMirror(GlanceMirror): | |||
3417 | 694 | 792 | ||
3418 | 695 | def insert_item(self, data, src, target, pedigree, contentsource): | 793 | def insert_item(self, data, src, target, pedigree, contentsource): |
3419 | 696 | data = util.products_exdata(src, pedigree) | 794 | data = util.products_exdata(src, pedigree) |
3422 | 697 | if 'size' in data and 'path' in data and 'pubname' in data: | 795 | if "size" in data and "path" in data and "pubname" in data: |
3423 | 698 | self.items[data['pubname']] = int(data['size']) | 796 | self.items[data["pubname"]] = int(data["size"]) |
3424 | 699 | 797 | ||
3425 | 700 | 798 | ||
3426 | 701 | def _checksum_file(fobj, read_size=util.READ_SIZE, checksums=None): | 799 | def _checksum_file(fobj, read_size=util.READ_SIZE, checksums=None): |
3427 | 702 | if checksums is None: | 800 | if checksums is None: |
3429 | 703 | checksums = {'md5': None} | 801 | checksums = {"md5": None} |
3430 | 704 | cksum = checksum_util.checksummer(checksums=checksums) | 802 | cksum = checksum_util.checksummer(checksums=checksums) |
3431 | 705 | while True: | 803 | while True: |
3432 | 706 | buf = fobj.read(read_size) | 804 | buf = fobj.read(read_size) |
3433 | @@ -713,13 +811,13 @@ def _checksum_file(fobj, read_size=util.READ_SIZE, checksums=None): | |||
3434 | 713 | def call_hook(item, path, cmd): | 811 | def call_hook(item, path, cmd): |
3435 | 714 | env = os.environ.copy() | 812 | env = os.environ.copy() |
3436 | 715 | env.update(item) | 813 | env.update(item) |
3439 | 716 | env['IMAGE_PATH'] = path | 814 | env["IMAGE_PATH"] = path |
3440 | 717 | env['FIELDS'] = ' '.join(item.keys()) + ' IMAGE_PATH' | 815 | env["FIELDS"] = " ".join(item.keys()) + " IMAGE_PATH" |
3441 | 718 | 816 | ||
3442 | 719 | util.subp(cmd, env=env, capture=False) | 817 | util.subp(cmd, env=env, capture=False) |
3443 | 720 | 818 | ||
3444 | 721 | with open(path, "rb") as fp: | 819 | with open(path, "rb") as fp: |
3446 | 722 | md5 = _checksum_file(fp, checksums={'md5': None}) | 820 | md5 = _checksum_file(fp, checksums={"md5": None}) |
3447 | 723 | 821 | ||
3448 | 724 | return (os.path.getsize(path), md5) | 822 | return (os.path.getsize(path), md5) |
3449 | 725 | 823 | ||
3450 | @@ -728,12 +826,13 @@ def _strip_version(endpoint): | |||
3451 | 728 | """Strip a version from the last component of an endpoint if present""" | 826 | """Strip a version from the last component of an endpoint if present""" |
3452 | 729 | 827 | ||
3453 | 730 | # Get rid of trailing '/' if present | 828 | # Get rid of trailing '/' if present |
3455 | 731 | if endpoint.endswith('/'): | 829 | if endpoint.endswith("/"): |
3456 | 732 | endpoint = endpoint[:-1] | 830 | endpoint = endpoint[:-1] |
3458 | 733 | url_bits = endpoint.split('/') | 831 | url_bits = endpoint.split("/") |
3459 | 734 | # regex to match 'v1' or 'v2.0' etc | 832 | # regex to match 'v1' or 'v2.0' etc |
3462 | 735 | if re.match(r'v\d+\.?\d*', url_bits[-1]): | 833 | if re.match(r"v\d+\.?\d*", url_bits[-1]): |
3463 | 736 | endpoint = '/'.join(url_bits[:-1]) | 834 | endpoint = "/".join(url_bits[:-1]) |
3464 | 737 | return endpoint | 835 | return endpoint |
3465 | 738 | 836 | ||
3466 | 837 | |||
3467 | 739 | # vi: ts=4 expandtab syntax=python | 838 | # vi: ts=4 expandtab syntax=python |
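The glance.py churn above is dominated by black's "magic trailing comma" handling: a call or signature that no longer fits on one line is exploded to one element per line, with a trailing comma after the last. A minimal sketch of the transformation, mirroring the adapt_source_entry hunk above:

    # before black
    def adapt_source_entry(self, source_entry, hypervisor_mapping, image_name,
                           image_md5_hash, image_size):
        ...

    # after black: one argument per line, trailing comma
    def adapt_source_entry(
        self,
        source_entry,
        hypervisor_mapping,
        image_name,
        image_md5_hash,
        image_size,
    ):
        ...

The trailing comma is deliberate: adding a seventh argument later touches a single line of the diff instead of two.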
3468 | diff --git a/simplestreams/objectstores/__init__.py b/simplestreams/objectstores/__init__.py | |||
3469 | index f118a92..b9a6bbe 100644 | |||
3470 | --- a/simplestreams/objectstores/__init__.py | |||
3471 | +++ b/simplestreams/objectstores/__init__.py | |||
3472 | @@ -35,9 +35,13 @@ class ObjectStore(object): | |||
3473 | 35 | 35 | ||
3474 | 36 | def insert_content(self, path, content, checksums=None, mutable=True): | 36 | def insert_content(self, path, content, checksums=None, mutable=True): |
3475 | 37 | if not isinstance(content, bytes): | 37 | if not isinstance(content, bytes): |
3479 | 38 | content = content.encode('utf-8') | 38 | content = content.encode("utf-8") |
3480 | 39 | self.insert(path=path, reader=cs.MemoryContentSource(content=content), | 39 | self.insert( |
3481 | 40 | checksums=checksums, mutable=mutable) | 40 | path=path, |
3482 | 41 | reader=cs.MemoryContentSource(content=content), | ||
3483 | 42 | checksums=checksums, | ||
3484 | 43 | mutable=mutable, | ||
3485 | 44 | ) | ||
3486 | 41 | 45 | ||
3487 | 42 | def remove(self, path): | 46 | def remove(self, path): |
3488 | 43 | # remove path from store | 47 | # remove path from store |
3489 | @@ -48,9 +52,12 @@ class ObjectStore(object): | |||
3490 | 48 | raise NotImplementedError() | 52 | raise NotImplementedError() |
3491 | 49 | 53 | ||
3492 | 50 | def exists_with_checksum(self, path, checksums=None): | 54 | def exists_with_checksum(self, path, checksums=None): |
3496 | 51 | return has_valid_checksum(path=path, reader=self.source, | 55 | return has_valid_checksum( |
3497 | 52 | checksums=checksums, | 56 | path=path, |
3498 | 53 | read_size=self.read_size) | 57 | reader=self.source, |
3499 | 58 | checksums=checksums, | ||
3500 | 59 | read_size=self.read_size, | ||
3501 | 60 | ) | ||
3502 | 54 | 61 | ||
3503 | 55 | 62 | ||
3504 | 56 | class MemoryObjectStore(ObjectStore): | 63 | class MemoryObjectStore(ObjectStore): |
3505 | @@ -73,35 +80,43 @@ class MemoryObjectStore(ObjectStore): | |||
3506 | 73 | url = "%s://%s" % (self.__class__, path) | 80 | url = "%s://%s" % (self.__class__, path) |
3507 | 74 | return cs.MemoryContentSource(content=self.data[path], url=url) | 81 | return cs.MemoryContentSource(content=self.data[path], url=url) |
3508 | 75 | except KeyError: | 82 | except KeyError: |
3510 | 76 | raise IOError(errno.ENOENT, '%s not found' % path) | 83 | raise IOError(errno.ENOENT, "%s not found" % path) |
3511 | 77 | 84 | ||
3512 | 78 | 85 | ||
3513 | 79 | class FileStore(ObjectStore): | 86 | class FileStore(ObjectStore): |
3514 | 80 | |||
3515 | 81 | def __init__(self, prefix, complete_callback=None): | 87 | def __init__(self, prefix, complete_callback=None): |
3517 | 82 | """ complete_callback is called periodically to notify users when a | 88 | """complete_callback is called periodically to notify users when a |
3518 | 83 | file is being inserted. It takes three arguments: the path that is | 89 | file is being inserted. It takes three arguments: the path that is |
3519 | 84 | inserted, the number of bytes downloaded, and the number of total | 90 | inserted, the number of bytes downloaded, and the number of total |
3521 | 85 | bytes. """ | 91 | bytes.""" |
3522 | 86 | self.prefix = prefix | 92 | self.prefix = prefix |
3523 | 87 | self.complete_callback = complete_callback | 93 | self.complete_callback = complete_callback |
3524 | 88 | 94 | ||
3528 | 89 | def insert(self, path, reader, checksums=None, mutable=True, size=None, | 95 | def insert( |
3529 | 90 | sparse=False): | 96 | self, |
3530 | 91 | 97 | path, | |
3531 | 98 | reader, | ||
3532 | 99 | checksums=None, | ||
3533 | 100 | mutable=True, | ||
3534 | 101 | size=None, | ||
3535 | 102 | sparse=False, | ||
3536 | 103 | ): | ||
3537 | 92 | wpath = self._fullpath(path) | 104 | wpath = self._fullpath(path) |
3538 | 93 | if os.path.isfile(wpath): | 105 | if os.path.isfile(wpath): |
3539 | 94 | if not mutable: | 106 | if not mutable: |
3540 | 95 | # if the file exists, and not mutable, return | 107 | # if the file exists, and not mutable, return |
3541 | 96 | return | 108 | return |
3545 | 97 | if has_valid_checksum(path=path, reader=self.source, | 109 | if has_valid_checksum( |
3546 | 98 | checksums=checksums, | 110 | path=path, |
3547 | 99 | read_size=self.read_size): | 111 | reader=self.source, |
3548 | 112 | checksums=checksums, | ||
3549 | 113 | read_size=self.read_size, | ||
3550 | 114 | ): | ||
3551 | 100 | return | 115 | return |
3552 | 101 | 116 | ||
3553 | 102 | zeros = None | 117 | zeros = None |
3554 | 103 | if sparse is True: | 118 | if sparse is True: |
3556 | 104 | zeros = '\0' * self.read_size | 119 | zeros = "\0" * self.read_size |
3557 | 105 | 120 | ||
3558 | 106 | cksum = checksum_util.checksummer(checksums) | 121 | cksum = checksum_util.checksummer(checksums) |
3559 | 107 | out_d = os.path.dirname(wpath) | 122 | out_d = os.path.dirname(wpath) |
3560 | @@ -110,8 +125,9 @@ class FileStore(ObjectStore): | |||
3561 | 110 | util.mkdir_p(out_d) | 125 | util.mkdir_p(out_d) |
3562 | 111 | orig_part_size = 0 | 126 | orig_part_size = 0 |
3563 | 112 | reader_does_checksum = ( | 127 | reader_does_checksum = ( |
3566 | 113 | isinstance(reader, cs.ChecksummingContentSource) and | 128 | isinstance(reader, cs.ChecksummingContentSource) |
3567 | 114 | cksum.algorithm == reader.algorithm) | 129 | and cksum.algorithm == reader.algorithm |
3568 | 130 | ) | ||
3569 | 115 | 131 | ||
3570 | 116 | if os.path.exists(partfile): | 132 | if os.path.exists(partfile): |
3571 | 117 | try: | 133 | try: |
3572 | @@ -121,8 +137,12 @@ class FileStore(ObjectStore): | |||
3573 | 121 | else: | 137 | else: |
3574 | 122 | reader.set_start_pos(orig_part_size) | 138 | reader.set_start_pos(orig_part_size) |
3575 | 123 | 139 | ||
3578 | 124 | LOG.debug("resuming partial (%s) download of '%s' from '%s'", | 140 | LOG.debug( |
3579 | 125 | orig_part_size, path, partfile) | 141 | "resuming partial (%s) download of '%s' from '%s'", |
3580 | 142 | orig_part_size, | ||
3581 | 143 | path, | ||
3582 | 144 | partfile, | ||
3583 | 145 | ) | ||
3584 | 126 | with open(partfile, "rb") as fp: | 146 | with open(partfile, "rb") as fp: |
3585 | 127 | while True: | 147 | while True: |
3586 | 128 | buf = fp.read(self.read_size) | 148 | buf = fp.read(self.read_size) |
3587 | @@ -136,15 +156,17 @@ class FileStore(ObjectStore): | |||
3588 | 136 | os.unlink(partfile) | 156 | os.unlink(partfile) |
3589 | 137 | 157 | ||
3590 | 138 | with open(partfile, "ab") as wfp: | 158 | with open(partfile, "ab") as wfp: |
3591 | 139 | |||
3592 | 140 | while True: | 159 | while True: |
3593 | 141 | try: | 160 | try: |
3594 | 142 | buf = reader.read(self.read_size) | 161 | buf = reader.read(self.read_size) |
3595 | 143 | except checksum_util.InvalidChecksum: | 162 | except checksum_util.InvalidChecksum: |
3596 | 144 | break | 163 | break |
3597 | 145 | buflen = len(buf) | 164 | buflen = len(buf) |
3600 | 146 | if (buflen != self.read_size and zeros is not None and | 165 | if ( |
3601 | 147 | zeros[0:buflen] == buf): | 166 | buflen != self.read_size |
3602 | 167 | and zeros is not None | ||
3603 | 168 | and zeros[0:buflen] == buf | ||
3604 | 169 | ): | ||
3605 | 148 | wfp.seek(wfp.tell() + buflen) | 170 | wfp.seek(wfp.tell() + buflen) |
3606 | 149 | elif buf == zeros: | 171 | elif buf == zeros: |
3607 | 150 | wfp.seek(wfp.tell() + buflen) | 172 | wfp.seek(wfp.tell() + buflen) |
3608 | @@ -209,8 +231,9 @@ class FileStore(ObjectStore): | |||
3609 | 209 | return os.path.join(self.prefix, path) | 231 | return os.path.join(self.prefix, path) |
3610 | 210 | 232 | ||
3611 | 211 | 233 | ||
3614 | 212 | def has_valid_checksum(path, reader, checksums=None, | 234 | def has_valid_checksum( |
3615 | 213 | read_size=READ_BUFFER_SIZE): | 235 | path, reader, checksums=None, read_size=READ_BUFFER_SIZE |
3616 | 236 | ): | ||
3617 | 214 | if checksums is None: | 237 | if checksums is None: |
3618 | 215 | return False | 238 | return False |
3619 | 216 | try: | 239 | try: |
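The objectstores hunks show black's other recurring rewrite: a multi-clause condition is wrapped with each boolean operator leading its continuation line, the style PEP 8 recommends. A minimal sketch, following the sparse-write condition above:

    # before black: operators trail the line
    if (buflen != self.read_size and zeros is not None and
            zeros[0:buflen] == buf):
        wfp.seek(wfp.tell() + buflen)

    # after black: one clause per line, operator first
    if (
        buflen != self.read_size
        and zeros is not None
        and zeros[0:buflen] == buf
    ):
        wfp.seek(wfp.tell() + buflen)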
3620 | diff --git a/simplestreams/objectstores/s3.py b/simplestreams/objectstores/s3.py | |||
3621 | index f1e9602..b07507f 100644 | |||
3622 | --- a/simplestreams/objectstores/s3.py | |||
3623 | +++ b/simplestreams/objectstores/s3.py | |||
3624 | @@ -15,19 +15,19 @@ | |||
3625 | 15 | # You should have received a copy of the GNU Affero General Public License | 15 | # You should have received a copy of the GNU Affero General Public License |
3626 | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. |
3627 | 17 | 17 | ||
3628 | 18 | import errno | ||
3629 | 19 | import tempfile | ||
3630 | 20 | from contextlib import closing | ||
3631 | 21 | |||
3632 | 18 | import boto.exception | 22 | import boto.exception |
3633 | 19 | import boto.s3 | 23 | import boto.s3 |
3634 | 20 | import boto.s3.connection | 24 | import boto.s3.connection |
3635 | 21 | from contextlib import closing | ||
3636 | 22 | import errno | ||
3637 | 23 | import tempfile | ||
3638 | 24 | 25 | ||
3639 | 25 | import simplestreams.objectstores as objectstores | ||
3640 | 26 | import simplestreams.contentsource as cs | 26 | import simplestreams.contentsource as cs |
3641 | 27 | import simplestreams.objectstores as objectstores | ||
3642 | 27 | 28 | ||
3643 | 28 | 29 | ||
3644 | 29 | class S3ObjectStore(objectstores.ObjectStore): | 30 | class S3ObjectStore(objectstores.ObjectStore): |
3645 | 30 | |||
3646 | 31 | _bucket = None | 31 | _bucket = None |
3647 | 32 | _connection = None | 32 | _connection = None |
3648 | 33 | 33 | ||
3649 | @@ -92,8 +92,8 @@ class S3ObjectStore(objectstores.ObjectStore): | |||
3650 | 92 | if key is None: | 92 | if key is None: |
3651 | 93 | return False | 93 | return False |
3652 | 94 | 94 | ||
3655 | 95 | if 'md5' in checksums: | 95 | if "md5" in checksums: |
3656 | 96 | return checksums['md5'] == key.etag.replace('"', "") | 96 | return checksums["md5"] == key.etag.replace('"', "") |
3657 | 97 | 97 | ||
3658 | 98 | return False | 98 | return False |
3659 | 99 | 99 | ||
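The s3.py hunk is pure isort: no code changes, only a regrouping of imports into standard-library, third-party, and first-party sections, alphabetized and separated by blank lines (the exact profile lives in pyproject.toml, not shown here). The resulting order for this module:

    import errno
    import tempfile
    from contextlib import closing

    import boto.exception
    import boto.s3
    import boto.s3.connection

    import simplestreams.contentsource as cs
    import simplestreams.objectstores as objectstores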
3660 | diff --git a/simplestreams/objectstores/swift.py b/simplestreams/objectstores/swift.py | |||
3661 | index f2c0d5b..d33fa3b 100644 | |||
3662 | --- a/simplestreams/objectstores/swift.py | |||
3663 | +++ b/simplestreams/objectstores/swift.py | |||
3664 | @@ -15,30 +15,30 @@ | |||
3665 | 15 | # You should have received a copy of the GNU Affero General Public License | 15 | # You should have received a copy of the GNU Affero General Public License |
3666 | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. | 16 | # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>. |
3667 | 17 | 17 | ||
3668 | 18 | import simplestreams.objectstores as objectstores | ||
3669 | 19 | import simplestreams.contentsource as cs | ||
3670 | 20 | import simplestreams.openstack as openstack | ||
3671 | 21 | |||
3672 | 22 | import errno | 18 | import errno |
3673 | 23 | import hashlib | 19 | import hashlib |
3675 | 24 | from swiftclient import Connection, ClientException | 20 | |
3676 | 21 | from swiftclient import ClientException, Connection | ||
3677 | 22 | |||
3678 | 23 | import simplestreams.contentsource as cs | ||
3679 | 24 | import simplestreams.objectstores as objectstores | ||
3680 | 25 | import simplestreams.openstack as openstack | ||
3681 | 25 | 26 | ||
3682 | 26 | 27 | ||
3683 | 27 | def get_swiftclient(**kwargs): | 28 | def get_swiftclient(**kwargs): |
3684 | 28 | # nmap has entries that need name changes from a 'get_service_conn_info' | 29 | # nmap has entries that need name changes from a 'get_service_conn_info' |
3685 | 29 | # to a swift Connection name. | 30 | # to a swift Connection name. |
3686 | 30 | # pt has names that pass straight through | 31 | # pt has names that pass straight through |
3689 | 31 | nmap = {'endpoint': 'preauthurl', 'token': 'preauthtoken'} | 32 | nmap = {"endpoint": "preauthurl", "token": "preauthtoken"} |
3690 | 32 | pt = ('insecure', 'cacert') | 33 | pt = ("insecure", "cacert") |
3691 | 33 | 34 | ||
3692 | 34 | connargs = {v: kwargs.get(k) for k, v in nmap.items() if k in kwargs} | 35 | connargs = {v: kwargs.get(k) for k, v in nmap.items() if k in kwargs} |
3693 | 35 | connargs.update({k: kwargs.get(k) for k in pt if k in kwargs}) | 36 | connargs.update({k: kwargs.get(k) for k in pt if k in kwargs}) |
3696 | 36 | if kwargs.get('session'): | 37 | if kwargs.get("session"): |
3697 | 37 | sess = kwargs.get('session') | 38 | sess = kwargs.get("session") |
3698 | 38 | try: | 39 | try: |
3699 | 39 | # If session is available try it | 40 | # If session is available try it |
3702 | 40 | return Connection(session=sess, | 41 | return Connection(session=sess, cacert=kwargs.get("cacert")) |
3701 | 41 | cacert=kwargs.get('cacert')) | ||
3703 | 42 | except TypeError: | 42 | except TypeError: |
3704 | 43 | # The edge case where session is available but swiftclient is | 43 | # The edge case where session is available but swiftclient is |
3705 | 44 | # < 3.3.0. Use the old style method for Connection. | 44 | # < 3.3.0. Use the old style method for Connection. |
3706 | @@ -52,7 +52,6 @@ class SwiftContentSource(cs.IteratorContentSource): | |||
3707 | 52 | 52 | ||
3708 | 53 | 53 | ||
3709 | 54 | class SwiftObjectStore(objectstores.ObjectStore): | 54 | class SwiftObjectStore(objectstores.ObjectStore): |
3710 | 55 | |||
3711 | 56 | def __init__(self, prefix, region=None): | 55 | def __init__(self, prefix, region=None): |
3712 | 57 | # expect 'swift://bucket/path_prefix' | 56 | # expect 'swift://bucket/path_prefix' |
3713 | 58 | self.prefix = prefix | 57 | self.prefix = prefix |
3714 | @@ -67,35 +66,41 @@ class SwiftObjectStore(objectstores.ObjectStore): | |||
3715 | 67 | 66 | ||
3716 | 68 | self.keystone_creds = openstack.load_keystone_creds() | 67 | self.keystone_creds = openstack.load_keystone_creds() |
3717 | 69 | if region is not None: | 68 | if region is not None: |
3719 | 70 | self.keystone_creds['region_name'] = region | 69 | self.keystone_creds["region_name"] = region |
3720 | 71 | 70 | ||
3723 | 72 | conn_info = openstack.get_service_conn_info('object-store', | 71 | conn_info = openstack.get_service_conn_info( |
3724 | 73 | **self.keystone_creds) | 72 | "object-store", **self.keystone_creds |
3725 | 73 | ) | ||
3726 | 74 | self.swiftclient = get_swiftclient(**conn_info) | 74 | self.swiftclient = get_swiftclient(**conn_info) |
3727 | 75 | 75 | ||
3728 | 76 | # http://docs.openstack.org/developer/swift/misc.html#acls | 76 | # http://docs.openstack.org/developer/swift/misc.html#acls |
3732 | 77 | self.swiftclient.put_container(self.container, | 77 | self.swiftclient.put_container( |
3733 | 78 | headers={'X-Container-Read': | 78 | self.container, headers={"X-Container-Read": ".r:*,.rlistings"} |
3734 | 79 | '.r:*,.rlistings'}) | 79 | ) |
3735 | 80 | 80 | ||
3736 | 81 | def insert(self, path, reader, checksums=None, mutable=True, size=None): | 81 | def insert(self, path, reader, checksums=None, mutable=True, size=None): |
3737 | 82 | # store content from reader.read() into path, expecting result checksum | 82 | # store content from reader.read() into path, expecting result checksum |
3740 | 83 | self._insert(path=path, contents=reader, checksums=checksums, | 83 | self._insert( |
3741 | 84 | mutable=mutable) | 84 | path=path, contents=reader, checksums=checksums, mutable=mutable |
3742 | 85 | ) | ||
3743 | 85 | 86 | ||
3744 | 86 | def insert_content(self, path, content, checksums=None, mutable=True): | 87 | def insert_content(self, path, content, checksums=None, mutable=True): |
3747 | 87 | self._insert(path=path, contents=content, checksums=checksums, | 88 | self._insert( |
3748 | 88 | mutable=mutable) | 89 | path=path, contents=content, checksums=checksums, mutable=mutable |
3749 | 90 | ) | ||
3750 | 89 | 91 | ||
3751 | 90 | def remove(self, path): | 92 | def remove(self, path): |
3754 | 91 | self.swiftclient.delete_object(container=self.container, | 93 | self.swiftclient.delete_object( |
3755 | 92 | obj=self.path_prefix + path) | 94 | container=self.container, obj=self.path_prefix + path |
3756 | 95 | ) | ||
3757 | 93 | 96 | ||
3758 | 94 | def source(self, path): | 97 | def source(self, path): |
3759 | 95 | def itgen(): | 98 | def itgen(): |
3760 | 96 | (_headers, iterator) = self.swiftclient.get_object( | 99 | (_headers, iterator) = self.swiftclient.get_object( |
3763 | 97 | container=self.container, obj=self.path_prefix + path, | 100 | container=self.container, |
3764 | 98 | resp_chunk_size=self.read_size) | 101 | obj=self.path_prefix + path, |
3765 | 102 | resp_chunk_size=self.read_size, | ||
3766 | 103 | ) | ||
3767 | 99 | return iterator | 104 | return iterator |
3768 | 100 | 105 | ||
3769 | 101 | return SwiftContentSource(itgen=itgen, url=self.prefix + path) | 106 | return SwiftContentSource(itgen=itgen, url=self.prefix + path) |
3770 | @@ -105,8 +110,9 @@ class SwiftObjectStore(objectstores.ObjectStore): | |||
3771 | 105 | 110 | ||
3772 | 106 | def _head_path(self, path): | 111 | def _head_path(self, path): |
3773 | 107 | try: | 112 | try: |
3776 | 108 | headers = self.swiftclient.head_object(container=self.container, | 113 | headers = self.swiftclient.head_object( |
3777 | 109 | obj=self.path_prefix + path) | 114 | container=self.container, obj=self.path_prefix + path |
3778 | 115 | ) | ||
3779 | 110 | except Exception as exc: | 116 | except Exception as exc: |
3780 | 111 | if is_enoent(exc): | 117 | if is_enoent(exc): |
3781 | 112 | return {} | 118 | return {} |
3782 | @@ -122,19 +128,22 @@ class SwiftObjectStore(objectstores.ObjectStore): | |||
3783 | 122 | if headers_match_checksums(headers, checksums): | 128 | if headers_match_checksums(headers, checksums): |
3784 | 123 | return | 129 | return |
3785 | 124 | 130 | ||
3788 | 125 | insargs = {'container': self.container, 'obj': self.path_prefix + path, | 131 | insargs = { |
3789 | 126 | 'contents': contents} | 132 | "container": self.container, |
3790 | 133 | "obj": self.path_prefix + path, | ||
3791 | 134 | "contents": contents, | ||
3792 | 135 | } | ||
3793 | 127 | 136 | ||
3794 | 128 | if size is not None and isinstance(contents, str): | 137 | if size is not None and isinstance(contents, str): |
3795 | 129 | size = len(contents) | 138 | size = len(contents) |
3796 | 130 | 139 | ||
3797 | 131 | if size is not None: | 140 | if size is not None: |
3799 | 132 | insargs['content_length'] = size | 141 | insargs["content_length"] = size |
3800 | 133 | 142 | ||
3803 | 134 | if checksums and checksums.get('md5'): | 143 | if checksums and checksums.get("md5"): |
3804 | 135 | insargs['etag'] = checksums.get('md5') | 144 | insargs["etag"] = checksums.get("md5") |
3805 | 136 | elif isinstance(contents, str): | 145 | elif isinstance(contents, str): |
3807 | 137 | insargs['etag'] = hashlib.md5(contents).hexdigest() | 146 | insargs["etag"] = hashlib.md5(contents).hexdigest() |
3808 | 138 | 147 | ||
3809 | 139 | self.swiftclient.put_object(**insargs) | 148 | self.swiftclient.put_object(**insargs) |
3810 | 140 | 149 | ||
3811 | @@ -142,13 +151,15 @@ class SwiftObjectStore(objectstores.ObjectStore): | |||
3812 | 142 | def headers_match_checksums(headers, checksums): | 151 | def headers_match_checksums(headers, checksums): |
3813 | 143 | if not (headers and checksums): | 152 | if not (headers and checksums): |
3814 | 144 | return False | 153 | return False |
3816 | 145 | if ('md5' in checksums and headers.get('etag') == checksums.get('md5')): | 154 | if "md5" in checksums and headers.get("etag") == checksums.get("md5"): |
3817 | 146 | return True | 155 | return True |
3818 | 147 | return False | 156 | return False |
3819 | 148 | 157 | ||
3820 | 149 | 158 | ||
3821 | 150 | def is_enoent(exc): | 159 | def is_enoent(exc): |
3824 | 151 | return ((isinstance(exc, IOError) and exc.errno == errno.ENOENT) or | 160 | return (isinstance(exc, IOError) and exc.errno == errno.ENOENT) or ( |
3825 | 152 | (isinstance(exc, ClientException) and exc.http_status == 404)) | 161 | isinstance(exc, ClientException) and exc.http_status == 404 |
3826 | 162 | ) | ||
3827 | 163 | |||
3828 | 153 | 164 | ||
3829 | 154 | # vi: ts=4 expandtab | 165 | # vi: ts=4 expandtab |
3830 | diff --git a/simplestreams/openstack.py b/simplestreams/openstack.py | |||
3831 | index ebf63c4..48101e7 100644 | |||
3832 | --- a/simplestreams/openstack.py | |||
3833 | +++ b/simplestreams/openstack.py | |||
3834 | @@ -20,9 +20,11 @@ import os | |||
3835 | 20 | 20 | ||
3836 | 21 | from keystoneclient.v2_0 import client as ksclient_v2 | 21 | from keystoneclient.v2_0 import client as ksclient_v2 |
3837 | 22 | from keystoneclient.v3 import client as ksclient_v3 | 22 | from keystoneclient.v3 import client as ksclient_v3 |
3838 | 23 | |||
3839 | 23 | try: | 24 | try: |
3840 | 24 | from keystoneauth1 import session | 25 | from keystoneauth1 import session |
3842 | 25 | from keystoneauth1.identity import (v2, v3) | 26 | from keystoneauth1.identity import v2, v3 |
3843 | 27 | |||
3844 | 26 | _LEGACY_CLIENTS = False | 28 | _LEGACY_CLIENTS = False |
3845 | 27 | except ImportError: | 29 | except ImportError: |
3846 | 28 | # 14.04 level packages do not have this. | 30 | # 14.04 level packages do not have this. |
3847 | @@ -31,43 +33,88 @@ except ImportError: | |||
3848 | 31 | 33 | ||
3849 | 32 | 34 | ||
3850 | 33 | OS_ENV_VARS = ( | 35 | OS_ENV_VARS = ( |
3857 | 34 | 'OS_AUTH_TOKEN', 'OS_AUTH_URL', 'OS_CACERT', 'OS_IMAGE_API_VERSION', | 36 | "OS_AUTH_TOKEN", |
3858 | 35 | 'OS_IMAGE_URL', 'OS_PASSWORD', 'OS_REGION_NAME', 'OS_STORAGE_URL', | 37 | "OS_AUTH_URL", |
3859 | 36 | 'OS_TENANT_ID', 'OS_TENANT_NAME', 'OS_USERNAME', 'OS_INSECURE', | 38 | "OS_CACERT", |
3860 | 37 | 'OS_USER_DOMAIN_NAME', 'OS_PROJECT_DOMAIN_NAME', | 39 | "OS_IMAGE_API_VERSION", |
3861 | 38 | 'OS_USER_DOMAIN_ID', 'OS_PROJECT_DOMAIN_ID', 'OS_PROJECT_NAME', | 40 | "OS_IMAGE_URL", |
3862 | 39 | 'OS_PROJECT_ID' | 41 | "OS_PASSWORD", |
3863 | 42 | "OS_REGION_NAME", | ||
3864 | 43 | "OS_STORAGE_URL", | ||
3865 | 44 | "OS_TENANT_ID", | ||
3866 | 45 | "OS_TENANT_NAME", | ||
3867 | 46 | "OS_USERNAME", | ||
3868 | 47 | "OS_INSECURE", | ||
3869 | 48 | "OS_USER_DOMAIN_NAME", | ||
3870 | 49 | "OS_PROJECT_DOMAIN_NAME", | ||
3871 | 50 | "OS_USER_DOMAIN_ID", | ||
3872 | 51 | "OS_PROJECT_DOMAIN_ID", | ||
3873 | 52 | "OS_PROJECT_NAME", | ||
3874 | 53 | "OS_PROJECT_ID", | ||
3875 | 40 | ) | 54 | ) |
3876 | 41 | 55 | ||
3877 | 42 | 56 | ||
3878 | 43 | # only used for legacy client connection | 57 | # only used for legacy client connection |
3881 | 44 | PT_V2 = ('username', 'password', 'tenant_id', 'tenant_name', 'auth_url', | 58 | PT_V2 = ( |
3882 | 45 | 'cacert', 'insecure', ) | 59 | "username", |
3883 | 60 | "password", | ||
3884 | 61 | "tenant_id", | ||
3885 | 62 | "tenant_name", | ||
3886 | 63 | "auth_url", | ||
3887 | 64 | "cacert", | ||
3888 | 65 | "insecure", | ||
3889 | 66 | ) | ||
3890 | 46 | 67 | ||
3891 | 47 | # annoyingly the 'insecure' option in the old client constructor is now called | 68 | # annoyingly the 'insecure' option in the old client constructor is now called |
3892 | 48 | # the 'verify' option in the session.Session() constructor | 69 | # the 'verify' option in the session.Session() constructor |
3915 | 49 | PASSWORD_V2 = ('auth_url', 'username', 'password', 'user_id', 'trust_id', | 70 | PASSWORD_V2 = ( |
3916 | 50 | 'tenant_id', 'tenant_name', 'reauthenticate') | 71 | "auth_url", |
3917 | 51 | PASSWORD_V3 = ('auth_url', 'password', 'username', | 72 | "username", |
3918 | 52 | 'user_id', 'user_domain_id', 'user_domain_name', | 73 | "password", |
3919 | 53 | 'trust_id', 'system_scope', | 74 | "user_id", |
3920 | 54 | 'domain_id', 'domain_name', | 75 | "trust_id", |
3921 | 55 | 'project_id', 'project_name', | 76 | "tenant_id", |
3922 | 56 | 'project_domain_id', 'project_domain_name', | 77 | "tenant_name", |
3923 | 57 | 'reauthenticate') | 78 | "reauthenticate", |
3924 | 58 | SESSION_ARGS = ('cert', 'timeout', 'verify', 'original_ip', 'redirect', | 79 | ) |
3925 | 59 | 'addition_headers', 'app_name', 'app_version', | 80 | PASSWORD_V3 = ( |
3926 | 60 | 'additional_user_agent', | 81 | "auth_url", |
3927 | 61 | 'discovery_cache', 'split_loggers', 'collect_timing') | 82 | "password", |
3928 | 62 | 83 | "username", | |
3929 | 63 | 84 | "user_id", | |
3930 | 64 | Settings = collections.namedtuple('Settings', 'mod ident arg_set') | 85 | "user_domain_id", |
3931 | 65 | KS_VERSION_RESOLVER = {2: Settings(mod=ksclient_v2, | 86 | "user_domain_name", |
3932 | 66 | ident=v2, | 87 | "trust_id", |
3933 | 67 | arg_set=PASSWORD_V2), | 88 | "system_scope", |
3934 | 68 | 3: Settings(mod=ksclient_v3, | 89 | "domain_id", |
3935 | 69 | ident=v3, | 90 | "domain_name", |
3936 | 70 | arg_set=PASSWORD_V3)} | 91 | "project_id", |
3937 | 92 | "project_name", | ||
3938 | 93 | "project_domain_id", | ||
3939 | 94 | "project_domain_name", | ||
3940 | 95 | "reauthenticate", | ||
3941 | 96 | ) | ||
3942 | 97 | SESSION_ARGS = ( | ||
3943 | 98 | "cert", | ||
3944 | 99 | "timeout", | ||
3945 | 100 | "verify", | ||
3946 | 101 | "original_ip", | ||
3947 | 102 | "redirect", | ||
3948 | 103 | "addition_headers", | ||
3949 | 104 | "app_name", | ||
3950 | 105 | "app_version", | ||
3951 | 106 | "additional_user_agent", | ||
3952 | 107 | "discovery_cache", | ||
3953 | 108 | "split_loggers", | ||
3954 | 109 | "collect_timing", | ||
3955 | 110 | ) | ||
3956 | 111 | |||
3957 | 112 | |||
3958 | 113 | Settings = collections.namedtuple("Settings", "mod ident arg_set") | ||
3959 | 114 | KS_VERSION_RESOLVER = { | ||
3960 | 115 | 2: Settings(mod=ksclient_v2, ident=v2, arg_set=PASSWORD_V2), | ||
3961 | 116 | 3: Settings(mod=ksclient_v3, ident=v3, arg_set=PASSWORD_V3), | ||
3962 | 117 | } | ||
3963 | 71 | 118 | ||
3964 | 72 | 119 | ||
3965 | 73 | def load_keystone_creds(**kwargs): | 120 | def load_keystone_creds(**kwargs): |
3966 | @@ -87,32 +134,38 @@ def load_keystone_creds(**kwargs): | |||
3967 | 87 | # take off 'os_' | 134 | # take off 'os_' |
3968 | 88 | ret[short] = os.environ[name] | 135 | ret[short] = os.environ[name] |
3969 | 89 | 136 | ||
3974 | 90 | if 'insecure' in ret: | 137 | if "insecure" in ret: |
3975 | 91 | if isinstance(ret['insecure'], str): | 138 | if isinstance(ret["insecure"], str): |
3976 | 92 | ret['insecure'] = (ret['insecure'].lower() not in | 139 | ret["insecure"] = ret["insecure"].lower() not in ( |
3977 | 93 | ("", "0", "no", "off", 'false')) | 140 | "", |
3978 | 141 | "0", | ||
3979 | 142 | "no", | ||
3980 | 143 | "off", | ||
3981 | 144 | "false", | ||
3982 | 145 | ) | ||
3983 | 94 | else: | 146 | else: |
3985 | 95 | ret['insecure'] = bool(ret['insecure']) | 147 | ret["insecure"] = bool(ret["insecure"]) |
3986 | 96 | 148 | ||
3987 | 97 | # verify is the key that is used by requests, and thus the Session object. | 149 | # verify is the key that is used by requests, and thus the Session object. |
3988 | 98 | # i.e. verify is either False or a certificate path or file. | 150 | # i.e. verify is either False or a certificate path or file. |
3991 | 99 | if not ret.get('insecure', False) and 'cacert' in ret: | 151 | if not ret.get("insecure", False) and "cacert" in ret: |
3992 | 100 | ret['verify'] = ret['cacert'] | 152 | ret["verify"] = ret["cacert"] |
3993 | 101 | 153 | ||
3994 | 102 | missing = [] | 154 | missing = [] |
3996 | 103 | for req in ('username', 'auth_url'): | 155 | for req in ("username", "auth_url"): |
3997 | 104 | if not ret.get(req, None): | 156 | if not ret.get(req, None): |
3998 | 105 | missing.append(req) | 157 | missing.append(req) |
3999 | 106 | 158 | ||
4001 | 107 | if not (ret.get('auth_token') or ret.get('password')): | 159 | if not (ret.get("auth_token") or ret.get("password")): |
4002 | 108 | missing.append("(auth_token or password)") | 160 | missing.append("(auth_token or password)") |
4003 | 109 | 161 | ||
4007 | 110 | api_version = get_ks_api_version(ret.get('auth_url', '')) or 2 | 162 | api_version = get_ks_api_version(ret.get("auth_url", "")) or 2 |
4008 | 111 | if (api_version == 2 and | 163 | if api_version == 2 and not ( |
4009 | 112 | not (ret.get('tenant_id') or ret.get('tenant_name'))): | 164 | ret.get("tenant_id") or ret.get("tenant_name") |
4010 | 165 | ): | ||
4011 | 113 | missing.append("(tenant_id or tenant_name)") | 166 | missing.append("(tenant_id or tenant_name)") |
4012 | 114 | if api_version == 3: | 167 | if api_version == 3: |
4014 | 115 | for k in ('user_domain_name', 'project_domain_name', 'project_name'): | 168 | for k in ("user_domain_name", "project_domain_name", "project_name"): |
4015 | 116 | if not ret.get(k, None): | 169 | if not ret.get(k, None): |
4016 | 117 | missing.append(k) | 170 | missing.append(k) |
4017 | 118 | 171 | ||
4018 | @@ -124,8 +177,8 @@ def load_keystone_creds(**kwargs): | |||
4019 | 124 | 177 | ||
4020 | 125 | def get_regions(client=None, services=None, kscreds=None): | 178 | def get_regions(client=None, services=None, kscreds=None): |
4021 | 126 | # if kscreds had 'region_name', then return that | 179 | # if kscreds had 'region_name', then return that |
4024 | 127 | if kscreds and kscreds.get('region_name'): | 180 | if kscreds and kscreds.get("region_name"): |
4025 | 128 | return [kscreds.get('region_name')] | 181 | return [kscreds.get("region_name")] |
4026 | 129 | 182 | ||
4027 | 130 | if client is None: | 183 | if client is None: |
4028 | 131 | creds = kscreds | 184 | creds = kscreds |
4029 | @@ -139,7 +192,7 @@ def get_regions(client=None, services=None, kscreds=None): | |||
4030 | 139 | regions = set() | 192 | regions = set() |
4031 | 140 | for service in services: | 193 | for service in services: |
4032 | 141 | for r in endpoints.get(service, {}): | 194 | for r in endpoints.get(service, {}): |
4034 | 142 | regions.add(r['region']) | 195 | regions.add(r["region"]) |
4035 | 143 | 196 | ||
4036 | 144 | return list(regions) | 197 | return list(regions) |
4037 | 145 | 198 | ||
4038 | @@ -153,15 +206,15 @@ def get_ks_api_version(auth_url=None, env=None): | |||
4039 | 153 | if env is None: | 206 | if env is None: |
4040 | 154 | env = os.environ | 207 | env = os.environ |
4041 | 155 | 208 | ||
4044 | 156 | if env.get('OS_IDENTITY_API_VERSION'): | 209 | if env.get("OS_IDENTITY_API_VERSION"): |
4045 | 157 | return int(env['OS_IDENTITY_API_VERSION']) | 210 | return int(env["OS_IDENTITY_API_VERSION"]) |
4046 | 158 | 211 | ||
4047 | 159 | if auth_url is None: | 212 | if auth_url is None: |
4048 | 160 | auth_url = "" | 213 | auth_url = "" |
4049 | 161 | 214 | ||
4051 | 162 | if auth_url.endswith('/v3'): | 215 | if auth_url.endswith("/v3"): |
4052 | 163 | return 3 | 216 | return 3 |
4054 | 164 | elif auth_url.endswith('/v2.0'): | 217 | elif auth_url.endswith("/v2.0"): |
4055 | 165 | return 2 | 218 | return 2 |
4056 | 166 | # Return None if we can't determine the keystone version | 219 | # Return None if we can't determine the keystone version |
4057 | 167 | return None | 220 | return None |
4058 | @@ -178,20 +231,20 @@ def get_ksclient(**kwargs): | |||
4059 | 178 | if _LEGACY_CLIENTS: | 231 | if _LEGACY_CLIENTS: |
4060 | 179 | return _legacy_ksclient(**kwargs) | 232 | return _legacy_ksclient(**kwargs) |
4061 | 180 | 233 | ||
4063 | 181 | api_version = get_ks_api_version(kwargs.get('auth_url', '')) or 2 | 234 | api_version = get_ks_api_version(kwargs.get("auth_url", "")) or 2 |
4064 | 182 | arg_set = KS_VERSION_RESOLVER[api_version].arg_set | 235 | arg_set = KS_VERSION_RESOLVER[api_version].arg_set |
4065 | 183 | # Filter/select the args for the api version from the kwargs dictionary | 236 | # Filter/select the args for the api version from the kwargs dictionary |
4066 | 184 | kskw = {k: v for k, v in kwargs.items() if k in arg_set} | 237 | kskw = {k: v for k, v in kwargs.items() if k in arg_set} |
4067 | 185 | auth = KS_VERSION_RESOLVER[api_version].ident.Password(**kskw) | 238 | auth = KS_VERSION_RESOLVER[api_version].ident.Password(**kskw) |
4068 | 186 | authkw = {k: v for k, v in kwargs.items() if k in SESSION_ARGS} | 239 | authkw = {k: v for k, v in kwargs.items() if k in SESSION_ARGS} |
4070 | 187 | authkw['auth'] = auth | 240 | authkw["auth"] = auth |
4071 | 188 | sess = session.Session(**authkw) | 241 | sess = session.Session(**authkw) |
4072 | 189 | client = KS_VERSION_RESOLVER[api_version].mod.Client(session=sess) | 242 | client = KS_VERSION_RESOLVER[api_version].mod.Client(session=sess) |
4073 | 190 | client.auth_ref = auth.get_access(sess) | 243 | client.auth_ref = auth.get_access(sess) |
4074 | 191 | return client | 244 | return client |
4075 | 192 | 245 | ||
4076 | 193 | 246 | ||
4078 | 194 | def get_service_conn_info(service='image', client=None, **kwargs): | 247 | def get_service_conn_info(service="image", client=None, **kwargs): |
4079 | 195 | # return a dict with token, insecure, cacert, endpoint | 248 | # return a dict with token, insecure, cacert, endpoint |
4080 | 196 | if not client: | 249 | if not client: |
4081 | 197 | client = get_ksclient(**kwargs) | 250 | client = get_ksclient(**kwargs) |
4082 | @@ -199,16 +252,23 @@ def get_service_conn_info(service='image', client=None, **kwargs): | |||
4083 | 199 | endpoint = _get_endpoint(client, service, **kwargs) | 252 | endpoint = _get_endpoint(client, service, **kwargs) |
4084 | 200 | # Session client does not have tenant_id set at client.tenant_id | 253 | # Session client does not have tenant_id set at client.tenant_id |
4085 | 201 | # If client.tenant_id not set use method to get it | 254 | # If client.tenant_id not set use method to get it |
4091 | 202 | tenant_id = (client.tenant_id or client.get_project_id(client.session) or | 255 | tenant_id = ( |
4092 | 203 | client.auth.client.get_project_id()) | 256 | client.tenant_id |
4093 | 204 | info = {'token': client.auth_token, 'insecure': kwargs.get('insecure'), | 257 | or client.get_project_id(client.session) |
4094 | 205 | 'cacert': kwargs.get('cacert'), 'endpoint': endpoint, | 258 | or client.auth.client.get_project_id() |
4095 | 206 | 'tenant_id': tenant_id} | 259 | ) |
4096 | 260 | info = { | ||
4097 | 261 | "token": client.auth_token, | ||
4098 | 262 | "insecure": kwargs.get("insecure"), | ||
4099 | 263 | "cacert": kwargs.get("cacert"), | ||
4100 | 264 | "endpoint": endpoint, | ||
4101 | 265 | "tenant_id": tenant_id, | ||
4102 | 266 | } | ||
4103 | 207 | if not _LEGACY_CLIENTS: | 267 | if not _LEGACY_CLIENTS: |
4106 | 208 | info['session'] = client.session | 268 | info["session"] = client.session |
4107 | 209 | info['glance_version'] = '2' | 269 | info["glance_version"] = "2" |
4108 | 210 | else: | 270 | else: |
4110 | 211 | info['glance_version'] = '1' | 271 | info["glance_version"] = "1" |
4111 | 212 | 272 | ||
4112 | 213 | return info | 273 | return info |
4113 | 214 | 274 | ||
4114 | @@ -216,12 +276,12 @@ def get_service_conn_info(service='image', client=None, **kwargs): | |||
4115 | 216 | def _get_endpoint(client, service, **kwargs): | 276 | def _get_endpoint(client, service, **kwargs): |
4116 | 217 | """Get an endpoint using the provided keystone client.""" | 277 | """Get an endpoint using the provided keystone client.""" |
4117 | 218 | endpoint_kwargs = { | 278 | endpoint_kwargs = { |
4121 | 219 | 'service_type': service, | 279 | "service_type": service, |
4122 | 220 | 'interface': kwargs.get('endpoint_type') or 'publicURL', | 280 | "interface": kwargs.get("endpoint_type") or "publicURL", |
4123 | 221 | 'region_name': kwargs.get('region_name'), | 281 | "region_name": kwargs.get("region_name"), |
4124 | 222 | } | 282 | } |
4125 | 223 | if _LEGACY_CLIENTS: | 283 | if _LEGACY_CLIENTS: |
4127 | 224 | del endpoint_kwargs['interface'] | 284 | del endpoint_kwargs["interface"] |
4128 | 225 | 285 | ||
4129 | 226 | endpoint = client.service_catalog.url_for(**endpoint_kwargs) | 286 | endpoint = client.service_catalog.url_for(**endpoint_kwargs) |
4130 | 227 | return endpoint | 287 | return endpoint |
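For context on the openstack.py hunks: get_ks_api_version, reformatted above, decides the keystone API version from only two inputs, the OS_IDENTITY_API_VERSION environment variable (which wins when set) and the auth_url suffix. A sketch of its behavior with hypothetical URLs:

    get_ks_api_version("https://keystone.example.com/v3")    # -> 3
    get_ks_api_version("https://keystone.example.com/v2.0")  # -> 2
    get_ks_api_version("https://keystone.example.com")       # -> None; callers fall back to 2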
diff --git a/simplestreams/util.py b/simplestreams/util.py
index 9866893..ebfe741 100644
--- a/simplestreams/util.py
+++ b/simplestreams/util.py
@@ -16,15 +16,15 @@
 # along with Simplestreams. If not, see <http://www.gnu.org/licenses/>.

 import errno
+import json
 import os
 import re
 import subprocess
 import tempfile
 import time
-import json

-import simplestreams.contentsource as cs
 import simplestreams.checksum_util as checksum_util
+import simplestreams.contentsource as cs
 from simplestreams.log import LOG

 ALIASNAME = "_aliases"
@@ -35,7 +35,7 @@ PGP_SIGNATURE_FOOTER = "-----END PGP SIGNATURE-----"

 _UNSET = object()

-READ_SIZE = (1024 * 10)
+READ_SIZE = 1024 * 10

 PRODUCTS_TREE_DATA = (
     ("products", "product_name"),
@@ -84,7 +84,7 @@ def products_exdata(tree, pedigree, include_top=True, insert_fieldnames=True):
     if include_top and tree:
         exdata.update(stringitems(tree))
     clevel = tree
-    for (n, key) in enumerate(pedigree):
+    for n, key in enumerate(pedigree):
         dictname, fieldname = harchy[n]
         clevel = clevel.get(dictname, {}).get(key, {})
         exdata.update(stringitems(clevel))
@@ -131,48 +131,54 @@ def products_del(tree, pedigree):


 def products_prune(tree, preserve_empty_products=False):
-    for prodname in list(tree.get('products', {}).keys()):
-        keys = list(tree['products'][prodname].get('versions', {}).keys())
+    for prodname in list(tree.get("products", {}).keys()):
+        keys = list(tree["products"][prodname].get("versions", {}).keys())
         for vername in keys:
-            vtree = tree['products'][prodname]['versions'][vername]
-            for itemname in list(vtree.get('items', {}).keys()):
-                if not vtree['items'][itemname]:
-                    del vtree['items'][itemname]
+            vtree = tree["products"][prodname]["versions"][vername]
+            for itemname in list(vtree.get("items", {}).keys()):
+                if not vtree["items"][itemname]:
+                    del vtree["items"][itemname]

-            if 'items' not in vtree or not vtree['items']:
-                del tree['products'][prodname]['versions'][vername]
+            if "items" not in vtree or not vtree["items"]:
+                del tree["products"][prodname]["versions"][vername]

-        if ('versions' not in tree['products'][prodname] or
-                not tree['products'][prodname]['versions']):
-            del tree['products'][prodname]
-
-    if (not preserve_empty_products and 'products' in tree and
-            not tree['products']):
-        del tree['products']
-
-
-def walk_products(tree, cb_product=None, cb_version=None, cb_item=None,
-                  ret_finished=_UNSET):
+        if (
+            "versions" not in tree["products"][prodname]
+            or not tree["products"][prodname]["versions"]
+        ):
+            del tree["products"][prodname]
+
+    if (
+        not preserve_empty_products
+        and "products" in tree
+        and not tree["products"]
+    ):
+        del tree["products"]
+
+
+def walk_products(
+    tree, cb_product=None, cb_version=None, cb_item=None, ret_finished=_UNSET
+):
     # walk a product tree. callbacks are called with (item, tree, (pedigree))
-    for prodname, proddata in tree['products'].items():
+    for prodname, proddata in tree["products"].items():
         if cb_product:
             ret = cb_product(proddata, tree, (prodname,))
             if ret_finished != _UNSET and ret == ret_finished:
                 return

-        if (not cb_version and not cb_item) or 'versions' not in proddata:
+        if (not cb_version and not cb_item) or "versions" not in proddata:
             continue

-        for vername, verdata in proddata['versions'].items():
+        for vername, verdata in proddata["versions"].items():
             if cb_version:
                 ret = cb_version(verdata, tree, (prodname, vername))
                 if ret_finished != _UNSET and ret == ret_finished:
                     return

-            if not cb_item or 'items' not in verdata:
+            if not cb_item or "items" not in verdata:
                 continue

-            for itemname, itemdata in verdata['items'].items():
+            for itemname, itemdata in verdata["items"].items():
                 ret = cb_item(itemdata, tree, (prodname, vername, itemname))
                 if ret_finished != _UNSET and ret == ret_finished:
                     return
@@ -205,8 +211,9 @@ def expand_data(data, refs=None, delete=False):
             expand_data(item, refs)


-def resolve_work(src, target, maxnum=None, keep=False, itemfilter=None,
-                 sort_reverse=True):
+def resolve_work(
+    src, target, maxnum=None, keep=False, itemfilter=None, sort_reverse=True
+):
     # if more than maxnum items are in src, only the most recent maxnum will be
     # stored in target. If keep is true, then the most recent maxnum items
     # will be kept in target even if they are no longer in src.
@@ -241,8 +248,9 @@ def resolve_work(src, target, maxnum=None, keep=False, itemfilter=None,
     while len(remove) and (maxnum > (after_add - len(remove))):
         remove.pop(0)

-    mtarget = sorted([f for f in target + add if f not in remove],
-                     reverse=reverse)
+    mtarget = sorted(
+        [f for f in target + add if f not in remove], reverse=reverse
+    )
     if maxnum is not None and len(mtarget) > maxnum:
         for item in mtarget[maxnum:]:
             if item in target:
@@ -263,12 +271,12 @@ def has_gpgv():
     if _HAS_GPGV is not None:
         return _HAS_GPGV

-    if which('gpgv'):
+    if which("gpgv"):
         try:
             env = os.environ.copy()
-            env['LANG'] = 'C'
+            env["LANG"] = "C"
             out, err = subp(["gpgv", "--help"], capture=True, env=env)
-            _HAS_GPGV = 'gnupg' in out.lower() or 'gnupg' in err.lower()
+            _HAS_GPGV = "gnupg" in out.lower() or "gnupg" in err.lower()
         except subprocess.CalledProcessError:
             _HAS_GPGV = False
     else:
@@ -292,11 +300,13 @@ def read_signed(content, keyring=None, checked=True):
         try:
             subp(cmd, data=content)
         except subprocess.CalledProcessError as e:
-            LOG.debug("failed: %s\n out=%s\n err=%s" %
-                      (' '.join(cmd), e.output[0], e.output[1]))
+            LOG.debug(
+                "failed: %s\n out=%s\n err=%s"
+                % (" ".join(cmd), e.output[0], e.output[1])
+            )
             raise e

-        ret = {'body': [], 'signature': [], 'garbage': []}
+        ret = {"body": [], "signature": [], "garbage": []}
         lines = content.splitlines()
         i = 0
         for i in range(0, len(lines)):
@@ -320,21 +330,22 @@ def read_signed(content, keyring=None, checked=True):
             else:
                 ret[mode].append(lines[i])

-        ret['body'].append('')  # need empty line at end
-        return "\n".join(ret['body'])
+        ret["body"].append("")  # need empty line at end
+        return "\n".join(ret["body"])
     else:
         raise SignatureMissingException("No signature found!")


 def load_content(content):
     if isinstance(content, bytes):
-        content = content.decode('utf-8')
+        content = content.decode("utf-8")
     return json.loads(content)


 def dump_data(data):
-    return json.dumps(data, indent=1, sort_keys=True,
-                      separators=(',', ': ')).encode('utf-8')
+    return json.dumps(
+        data, indent=1, sort_keys=True, separators=(",", ": ")
+    ).encode("utf-8")


 def timestamp(ts=None):
@@ -375,24 +386,26 @@ def move_dups(src, target, sticky=None):
     target.update(updates)


-def products_condense(ptree, sticky=None, top='versions'):
+def products_condense(ptree, sticky=None, top="versions"):
     # walk a products tree, copying up item keys as far as they'll go
     # only move items to a sibling of the 'top'.

-    if top not in ('versions', 'products'):
-        raise ValueError("'top' must be one of: %s" %
-                         ','.join(PRODUCTS_TREE_HIERARCHY))
+    if top not in ("versions", "products"):
+        raise ValueError(
+            "'top' must be one of: %s" % ",".join(PRODUCTS_TREE_HIERARCHY)
+        )

     def call_move_dups(cur, _tree, pedigree):
-        (_mtype, stname) = (("product", "versions"),
-                            ("version", "items"))[len(pedigree) - 1]
+        (_mtype, stname) = (("product", "versions"), ("version", "items"))[
+            len(pedigree) - 1
+        ]
         move_dups(cur.get(stname, {}), cur, sticky=sticky)

     walk_products(ptree, cb_version=call_move_dups)
     walk_products(ptree, cb_product=call_move_dups)
-    if top == 'versions':
+    if top == "versions":
         return
-    move_dups(ptree['products'], ptree)
+    move_dups(ptree["products"], ptree)


 def assert_safe_path(path):
@@ -449,16 +462,22 @@ def subp(args, data=None, capture=True, shell=False, env=None):
     else:
         stdout, stderr = (subprocess.PIPE, subprocess.PIPE)

-    sp = subprocess.Popen(args, stdout=stdout, stderr=stderr,
-                          stdin=subprocess.PIPE, shell=shell, env=env)
+    sp = subprocess.Popen(
+        args,
+        stdout=stdout,
+        stderr=stderr,
+        stdin=subprocess.PIPE,
+        shell=shell,
+        env=env,
+    )
     if isinstance(data, str):
-        data = data.encode('utf-8')
+        data = data.encode("utf-8")

     (out, err) = sp.communicate(data)
     if isinstance(out, bytes):
-        out = out.decode('utf-8')
+        out = out.decode("utf-8")
     if isinstance(err, bytes):
-        err = err.decode('utf-8')
+        err = err.decode("utf-8")

     rc = sp.returncode
     if rc != 0:
@@ -468,22 +487,22 @@ def subp(args, data=None, capture=True, shell=False, env=None):


 def get_sign_cmd(path, output=None, inline=False):
-    cmd = ['gpg']
-    defkey = os.environ.get('SS_GPG_DEFAULT_KEY')
+    cmd = ["gpg"]
+    defkey = os.environ.get("SS_GPG_DEFAULT_KEY")
     if defkey:
-        cmd.extend(['--default-key', defkey])
+        cmd.extend(["--default-key", defkey])

-    batch = os.environ.get('SS_GPG_BATCH', "1").lower()
+    batch = os.environ.get("SS_GPG_BATCH", "1").lower()
     if batch not in ("0", "false"):
-        cmd.append('--batch')
+        cmd.append("--batch")

     if output:
-        cmd.extend(['--output', output])
+        cmd.extend(["--output", output])

     if inline:
-        cmd.append('--clearsign')
+        cmd.append("--clearsign")
     else:
-        cmd.extend(['--armor', '--detach-sign'])
+        cmd.extend(["--armor", "--detach-sign"])

     cmd.extend([path])
     return cmd
@@ -498,17 +517,17 @@ def make_signed_content_paths(content):
     if data.get("format") != "index:1.0":
         return (False, None)

-    for content_ent in list(data.get('index', {}).values()):
-        path = content_ent.get('path')
+    for content_ent in list(data.get("index", {}).values()):
+        path = content_ent.get("path")
         if path.endswith(".json"):
-            content_ent['path'] = signed_fname(path, inline=True)
+            content_ent["path"] = signed_fname(path, inline=True)

     return (True, json.dumps(data, indent=1))


 def signed_fname(fname, inline=True):
     if inline:
-        sfname = fname[0:-len(".json")] + ".sjson"
+        sfname = fname[0 : -len(".json")] + ".sjson"
     else:
         sfname = fname + ".gpg"

@@ -555,8 +574,10 @@ def sign_file(fname, inline=True, outfile=None):

 def sign_content(content, outfile="-", inline=True):
     rm_f_file(outfile, skip=["-"])
-    return subp(args=get_sign_cmd(path="-", output=outfile, inline=inline),
-                data=content)[0]
+    return subp(
+        args=get_sign_cmd(path="-", output=outfile, inline=inline),
+        data=content,
+    )[0]


 def path_from_mirror_url(mirror, path):
@@ -566,8 +587,8 @@ def path_from_mirror_url(mirror, path):
     path_regex = "streams/v1/.*[.](sjson|json)$"
     result = re.search(path_regex, mirror)
     if result:
-        path = mirror[result.start():]
-        mirror = mirror[:result.start()]
+        path = mirror[result.start() :]
+        mirror = mirror[: result.start()]
     else:
         path = "streams/v1/index.sjson"

@@ -589,21 +610,21 @@ class ProgressAggregator(object):
         self.total_written = 0

     def progress_callback(self, progress):
-        if self.current_file != progress['name']:
+        if self.current_file != progress["name"]:
             if self.remaining_items and self.current_file is not None:
                 del self.remaining_items[self.current_file]
-            self.current_file = progress['name']
+            self.current_file = progress["name"]
             self.last_emitted = 0
             self.current_written = 0

-        size = float(progress['size'])
-        written = float(progress['written'])
+        size = float(progress["size"])
+        written = float(progress["written"])
         self.current_written += written
         self.total_written += written
         interval = self.current_written - self.last_emitted
         if interval > size / 100:
             self.last_emitted = self.current_written
-            progress['written'] = self.current_written
+            progress["written"] = self.current_written
             self.emit(progress)

     def emit(self, progress):
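Several hunks in util.py above (signed_fname, path_from_mirror_url) gain spaces around slice colons. That is black treating the slice colon as a binary operator, per PEP 8: no spaces when both bounds are simple, a space on each side when a bound is a non-trivial expression. A small sketch with illustrative values only:

name = "index.json"

head = name[0:5]                # simple bounds: no spaces
stem = name[0 : -len(".json")]  # complex bound: spaced colon
assert head == "index"
assert stem == "index"

flake8 flags the spaced form as E203, which is why black's documentation recommends adding E203 to flake8's ignore list when the two tools are combined.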
diff --git a/tests/httpserver.py b/tests/httpserver.py
index e88eb56..10bde9a 100644
--- a/tests/httpserver.py
+++ b/tests/httpserver.py
@@ -1,50 +1,61 @@
 #!/usr/bin/env python
 import os
 import sys
+
 if sys.version_info.major == 2:
-    from SimpleHTTPServer import SimpleHTTPRequestHandler
     from BaseHTTPServer import HTTPServer
+    from SimpleHTTPServer import SimpleHTTPRequestHandler
 else:
-    from http.server import SimpleHTTPRequestHandler
-    from http.server import HTTPServer
+    from http.server import HTTPServer, SimpleHTTPRequestHandler


 class LoggingHTTPRequestHandler(SimpleHTTPRequestHandler):
-    def log_request(self, code='-', size='-'):
+    def log_request(self, code="-", size="-"):
         """
         Log an accepted request along with user-agent string.
         """

         user_agent = self.headers.get("user-agent")
-        self.log_message('"%s" %s %s (%s)',
-                         self.requestline, str(code), str(size), user_agent)
+        self.log_message(
+            '"%s" %s %s (%s)',
+            self.requestline,
+            str(code),
+            str(size),
+            user_agent,
+        )


-def run(address, port,
-        HandlerClass=LoggingHTTPRequestHandler, ServerClass=HTTPServer):
+def run(
+    address,
+    port,
+    HandlerClass=LoggingHTTPRequestHandler,
+    ServerClass=HTTPServer,
+):
     try:
         server = ServerClass((address, port), HandlerClass)
         address, port = server.socket.getsockname()
-        sys.stderr.write("Serving HTTP: %s %s %s\n" %
-                         (address, port, os.getcwd()))
+        sys.stderr.write(
+            "Serving HTTP: %s %s %s\n" % (address, port, os.getcwd())
+        )
         server.serve_forever()
     except KeyboardInterrupt:
         server.socket.close()


-if __name__ == '__main__':
+if __name__ == "__main__":
     import sys
+
     if len(sys.argv) == 3:
         # 2 args: address and port
         address = sys.argv[1]
         port = int(sys.argv[2])
     elif len(sys.argv) == 2:
         # 1 arg: port
-        address = '0.0.0.0'
+        address = "0.0.0.0"
         port = int(sys.argv[1])
     elif len(sys.argv) == 1:
         # no args random port (port=0)
-        address = '0.0.0.0'
+        address = "0.0.0.0"
         port = 0
     else:
         sys.stderr.write("Expect [address] [port]\n")
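The exploded run() signature and log_message() call above show black's "magic trailing comma": when a call or signature does not fit within the line length, black splits it one element per line and appends a trailing comma, and that trailing comma then keeps the construct exploded on later runs even if it would fit again. A sketch (assumes the `black` package is installed):

import black

flat = "pair = (1, 2)\n"    # fits on one line: left alone
magic = "pair = (1, 2,)\n"  # trailing comma: forced one-per-line
print(black.format_str(flat, mode=black.Mode()))
print(black.format_str(magic, mode=black.Mode()))
# pair = (1, 2)
# pair = (
#     1,
#     2,
# )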
diff --git a/tests/testutil.py b/tests/testutil.py
index 2c89e3a..2816366 100644
--- a/tests/testutil.py
+++ b/tests/testutil.py
@@ -1,10 +1,10 @@
 import os
-from simplestreams import objectstores
-from simplestreams import mirrors

+from simplestreams import mirrors, objectstores

-EXAMPLES_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__),
-                               "..", "examples"))
+EXAMPLES_DIR = os.path.abspath(
+    os.path.join(os.path.dirname(__file__), "..", "examples")
+)


 def get_mirror_reader(name, docdir=None, signed=False):
@@ -20,4 +20,5 @@ def get_mirror_reader(name, docdir=None, signed=False):
     kwargs = {} if signed else {"policy": policy}
     return mirrors.ObjectStoreMirrorReader(sstore, **kwargs)

+
 # vi: ts=4 expandtab syntax=python
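The testutil.py import change is isort's default merging: multiple `from X import ...` statements for the same module collapse into one, with the imported names sorted. A sketch (assumes the `isort` package is installed):

import isort

src = (
    "from simplestreams import objectstores\n"
    "from simplestreams import mirrors\n"
)
print(isort.code(src), end="")
# -> from simplestreams import mirrors, objectstores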
diff --git a/tests/unittests/test_badmirrors.py b/tests/unittests/test_badmirrors.py
index 6c5546f..d640e28 100644
--- a/tests/unittests/test_badmirrors.py
+++ b/tests/unittests/test_badmirrors.py
@@ -1,11 +1,12 @@
 from unittest import TestCase
-from tests.testutil import get_mirror_reader
+
+from simplestreams import checksum_util, mirrors, util
 from simplestreams.mirrors import (
-    ObjectStoreMirrorWriter, ObjectStoreMirrorReader)
+    ObjectStoreMirrorReader,
+    ObjectStoreMirrorWriter,
+)
 from simplestreams.objectstores import MemoryObjectStore
-from simplestreams import util
-from simplestreams import checksum_util
-from simplestreams import mirrors
+from tests.testutil import get_mirror_reader


 class TestBadDataSources(TestCase):
@@ -19,7 +20,8 @@ class TestBadDataSources(TestCase):
     def setUp(self):
         self.src = self.get_clean_src(self.example, path=self.dlpath)
         self.target = ObjectStoreMirrorWriter(
-            config={}, objectstore=MemoryObjectStore())
+            config={}, objectstore=MemoryObjectStore()
+        )

     def get_clean_src(self, exname, path):
         good_src = get_mirror_reader(exname)
@@ -34,7 +36,8 @@ class TestBadDataSources(TestCase):
                 del objectstore.data[k]

         return ObjectStoreMirrorReader(
-            objectstore=objectstore, policy=lambda content, path: content)
+            objectstore=objectstore, policy=lambda content, path: content
+        )

     def test_sanity_valid(self):
         # verify that the tests are fine on expected pass
@@ -43,50 +46,77 @@ class TestBadDataSources(TestCase):

     def test_larger_size_causes_bad_checksum(self):
         def size_plus_1(item):
-            item['size'] = int(item['size']) + 1
+            item["size"] = int(item["size"]) + 1
             return item

         _moditem(self.src, self.dlpath, self.pedigree, size_plus_1)
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )

     def test_smaller_size_causes_bad_checksum(self):
         def size_minus_1(item):
-            item['size'] = int(item['size']) - 1
+            item["size"] = int(item["size"]) - 1
             return item
+
         _moditem(self.src, self.dlpath, self.pedigree, size_minus_1)
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )

     def test_too_much_content_causes_bad_checksum(self):
         self.src.objectstore.data[self.item_path] += b"extra"
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )

     def test_too_little_content_causes_bad_checksum(self):
         orig = self.src.objectstore.data[self.item_path]
         self.src.objectstore.data[self.item_path] = orig[0:-1]
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )

     def test_busted_checksum_causes_bad_checksum(self):
         def break_checksum(item):
             chars = "0123456789abcdef"
-            orig = item['sha256']
-            item['sha256'] = ''.join(
-                [chars[(chars.find(c) + 1) % len(chars)] for c in orig])
+            orig = item["sha256"]
+            item["sha256"] = "".join(
+                [chars[(chars.find(c) + 1) % len(chars)] for c in orig]
+            )
             return item

         _moditem(self.src, self.dlpath, self.pedigree, break_checksum)
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )

     def test_changed_content_causes_bad_checksum(self):
         # correct size but different content should raise bad checksum
-        self.src.objectstore.data[self.item_path] = ''.join(
-            ["x" for c in self.src.objectstore.data[self.item_path]])
-        self.assertRaises(checksum_util.InvalidChecksum,
-                          self.target.sync, self.src, self.dlpath)
+        self.src.objectstore.data[self.item_path] = "".join(
+            ["x" for c in self.src.objectstore.data[self.item_path]]
+        )
+        self.assertRaises(
+            checksum_util.InvalidChecksum,
+            self.target.sync,
+            self.src,
+            self.dlpath,
+        )

     def test_no_checksums_cause_bad_checksum(self):
         def del_checksums(item):
@@ -96,32 +126,41 @@ class TestBadDataSources(TestCase):

         _moditem(self.src, self.dlpath, self.pedigree, del_checksums)
         with _patched_missing_sum("fail"):
-            self.assertRaises(checksum_util.InvalidChecksum,
-                              self.target.sync, self.src, self.dlpath)
+            self.assertRaises(
+                checksum_util.InvalidChecksum,
+                self.target.sync,
+                self.src,
+                self.dlpath,
+            )

     def test_missing_size_causes_bad_checksum(self):
         def del_size(item):
-            del item['size']
+            del item["size"]
             return item

         _moditem(self.src, self.dlpath, self.pedigree, del_size)
         with _patched_missing_sum("fail"):
-            self.assertRaises(checksum_util.InvalidChecksum,
-                              self.target.sync, self.src, self.dlpath)
+            self.assertRaises(
+                checksum_util.InvalidChecksum,
+                self.target.sync,
+                self.src,
+                self.dlpath,
+            )


 class _patched_missing_sum(object):
     """This patches the legacy mode for missing checksum info so
     that it behaves like the new code path. Thus we can make
     the test run correctly"""
+
     def __init__(self, mode="fail"):
         self.mode = mode

     def __enter__(self):
-        self.modmcb = getattr(mirrors, '_missing_cksum_behavior', {})
+        self.modmcb = getattr(mirrors, "_missing_cksum_behavior", {})
         self.orig = self.modmcb.copy()
         if self.modmcb:
-            self.modmcb['mode'] = self.mode
+            self.modmcb["mode"] = self.mode
         return self

     def __exit__(self, type, value, traceback):
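The reshuffled imports at the top of test_badmirrors.py follow isort's section order: stdlib first, then third-party, then first-party, each group separated by a blank line and sorted alphabetically within itself. A sketch (treating `simplestreams` and `tests` as first-party here is an assumption about the project's isort configuration):

import isort

src = (
    "from tests.testutil import get_mirror_reader\n"
    "from simplestreams import util\n"
    "from unittest import TestCase\n"
)
print(isort.code(src, known_first_party=["simplestreams", "tests"]))
# from unittest import TestCase
#
# from simplestreams import util
# from tests.testutil import get_mirror_reader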
diff --git a/tests/unittests/test_command_hook_mirror.py b/tests/unittests/test_command_hook_mirror.py
index 6a72749..e718c9e 100644
--- a/tests/unittests/test_command_hook_mirror.py
+++ b/tests/unittests/test_command_hook_mirror.py
@@ -1,4 +1,5 @@
 from unittest import TestCase
+
 import simplestreams.mirrors.command_hook as chm
 from tests.testutil import get_mirror_reader

@@ -13,12 +14,11 @@ class TestCommandHookMirror(TestCase):
         self.assertRaises(TypeError, chm.CommandHookMirror, {})

     def test_init_with_load_products_works(self):
-        chm.CommandHookMirror({'load_products': 'true'})
+        chm.CommandHookMirror({"load_products": "true"})

     def test_stream_load_empty(self):
-
         src = get_mirror_reader("foocloud")
-        target = chm.CommandHookMirror({'load_products': ['true']})
+        target = chm.CommandHookMirror({"load_products": ["true"]})
         oruncmd = chm.run_command

         try:
@@ -30,14 +30,16 @@ class TestCommandHookMirror(TestCase):

         # the 'load_products' should be called once for each content
         # in the stream.
-        self.assertEqual(self._run_commands, [['true'], ['true']])
+        self.assertEqual(self._run_commands, [["true"], ["true"]])

     def test_stream_insert_product(self):
-
         src = get_mirror_reader("foocloud")
         target = chm.CommandHookMirror(
-            {'load_products': ['load-products'],
-             'insert_products': ['insert-products']})
+            {
+                "load_products": ["load-products"],
+                "insert_products": ["insert-products"],
+            }
+        )
         oruncmd = chm.run_command

         try:
@@ -49,15 +51,18 @@ class TestCommandHookMirror(TestCase):

         # the 'load_products' should be called once for each content
         # in the stream. same for 'insert-products'
-        self.assertEqual(len([f for f in self._run_commands
-                              if f == ['load-products']]), 2)
-        self.assertEqual(len([f for f in self._run_commands
-                              if f == ['insert-products']]), 2)
+        self.assertEqual(
+            len([f for f in self._run_commands if f == ["load-products"]]), 2
+        )
+        self.assertEqual(
+            len([f for f in self._run_commands if f == ["insert-products"]]), 2
+        )

     def _run_command(self, cmd, env=None, capture=False, rcs=None):
         self._run_commands.append(cmd)
         rc = 0
-        output = ''
+        output = ""
         return (rc, output)

+
 # vi: ts=4 expandtab syntax=python
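The lone `+` blank line before the `# vi:` modeline here (and in testutil.py above) is black enforcing two blank lines after a top-level definition before any following code or comment. A sketch (assumes the `black` package is installed):

import black

src = "def f():\n    return 0\n# trailing comment\n"
print(black.format_str(src, mode=black.Mode()), end="")
# def f():
#     return 0
#
#
# # trailing comment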
diff --git a/tests/unittests/test_contentsource.py b/tests/unittests/test_contentsource.py
index ef838a2..2b88a15 100644
--- a/tests/unittests/test_contentsource.py
+++ b/tests/unittests/test_contentsource.py
@@ -2,14 +2,14 @@ import os
 import shutil
 import sys
 import tempfile
-
-from os.path import join, dirname
-from simplestreams import objectstores
-from simplestreams import contentsource
-from subprocess import Popen, PIPE, STDOUT
+from os.path import dirname, join
+from subprocess import PIPE, STDOUT, Popen
 from unittest import TestCase, skipIf
+
 import pytest

+from simplestreams import contentsource, objectstores
+

 class RandomPortServer(object):
     def __init__(self, path):
@@ -22,14 +22,15 @@ class RandomPortServer(object):
         if self.port and self.process:
             return
         testserver_path = join(
-            dirname(__file__), "..", "..", "tests", "httpserver.py")
-        pre = b'Serving HTTP:'
+            dirname(__file__), "..", "..", "tests", "httpserver.py"
+        )
+        pre = b"Serving HTTP:"

-        cmd = [sys.executable, '-u', testserver_path, "0"]
+        cmd = [sys.executable, "-u", testserver_path, "0"]
         p = Popen(cmd, cwd=self.path, stdout=PIPE, stderr=STDOUT)
         line = p.stdout.readline()  # pylint: disable=E1101
         if line.startswith(pre):
-            data = line[len(pre):].strip()
+            data = line[len(pre) :].strip()
             addr, port_str, cwd = data.decode().split(" ", 2)
             self.port = int(port_str)
             self.addr = addr
@@ -39,8 +40,9 @@ class RandomPortServer(object):
         else:
             p.kill()
             raise RuntimeError(
-                "Failed to start server in %s with %s. pid=%s. got: %s" %
-                (self.path, cmd, self.process, line))
+                "Failed to start server in %s with %s. pid=%s. got: %s"
+                % (self.path, cmd, self.process, line)
+            )

     def read_output(self):
         return str(self.process.stdout.readline())
@@ -63,13 +65,17 @@ class RandomPortServer(object):
         if self.process:
             pid = self.process.pid

-        return ("RandomPortServer(port=%s, addr=%s, process=%s, path=%s)" %
-                (self.port, self.addr, pid, self.path))
+        return "RandomPortServer(port=%s, addr=%s, process=%s, path=%s)" % (
+            self.port,
+            self.addr,
+            pid,
+            self.path,
+        )

     def url_for(self, fpath=""):
         if self.port is None:
             raise ValueError("No port available")
-        return 'http://127.0.0.1:%d/' % self.port + fpath
+        return "http://127.0.0.1:%d/" % self.port + fpath


 class BaseDirUsingTestCase(TestCase):
@@ -100,7 +106,8 @@ class BaseDirUsingTestCase(TestCase):

     def getcs(self, path, url_reader=None, rel=None):
         return contentsource.UrlContentSource(
-            self.url_for(path, rel=rel), url_reader=url_reader)
+            self.url_for(path, rel=rel), url_reader=url_reader
+        )

     def path_for(self, fpath, rel=None):
         # return full path to fpath.
@@ -120,8 +127,9 @@ class BaseDirUsingTestCase(TestCase):

         if not fullpath.startswith(self.tmpd + os.path.sep):
             raise ValueError(
-                "%s is not a valid path. Not under tmpdir: %s" %
-                (fpath, self.tmpd))
+                "%s is not a valid path. Not under tmpdir: %s"
+                % (fpath, self.tmpd)
+            )

         return fullpath

@@ -133,17 +141,18 @@ class BaseDirUsingTestCase(TestCase):
         if not self.server:
             raise ValueError("No server available, but proto == http")
         return self.server.url_for(
-            self.path_for(fpath=fpath, rel=rel)[len(self.tmpd)+1:])
+            self.path_for(fpath=fpath, rel=rel)[len(self.tmpd) + 1 :]
+        )


 class TestUrlContentSource(BaseDirUsingTestCase):
     http = True
-    fpath = 'foo'
-    fdata = b'hello world\n'
+    fpath = "foo"
+    fdata = b"hello world\n"

     def setUp(self):
         super(TestUrlContentSource, self).setUp()
-        with open(join(self.test_d, self.fpath), 'wb') as f:
+        with open(join(self.test_d, self.fpath), "wb") as f:
             f.write(self.fdata)

     def test_default_url_read_handles_None(self):
@@ -171,8 +180,10 @@ class TestUrlContentSource(BaseDirUsingTestCase):

     @skipIf(contentsource.requests is None, "requests not available")
     def test_requests_default_timeout(self):
-        self.assertEqual(contentsource.RequestsUrlReader.timeout,
-                         (contentsource.TIMEOUT, None))
+        self.assertEqual(
+            contentsource.RequestsUrlReader.timeout,
+            (contentsource.TIMEOUT, None),
+        )

     @skipIf(contentsource.requests is None, "requests not available")
     def test_requests_url_read_handles_None(self):