Merge lp:~fo0bar/turku/turku-agent-cleanup into lp:turku/turku-agent
Status: Merged
Approved by: Barry Price
Approved revision: 57
Merged at revision: 57
Proposed branch: lp:~fo0bar/turku/turku-agent-cleanup
Merge into: lp:turku/turku-agent
Diff against target: 1204 lines (+487/-298), 10 files modified:
  .bzrignore (+61/-5), MANIFEST.in (+13/-0), Makefile (+28/-0), setup.py (+13/-13), tests/test_stub.py (+8/-0), tox.ini (+38/-0), turku_agent/ping.py (+108/-84), turku_agent/rsyncd_wrapper.py (+11/-7), turku_agent/update_config.py (+71/-69), turku_agent/utils.py (+136/-120)
To merge this branch: bzr merge lp:~fo0bar/turku/turku-agent-cleanup
Related bugs: (none)
Reviewer: Stuart Bishop (community), status: Approve
Review via email: mp+386145@code.launchpad.net
Commit message
Mega-noop cleanup
Description of the change
This is the minimum required for:
- tox test suite with all passing tests
- black-managed formatting
- Shippable sdist module
It is intended as a base for the other MPs, so they don't have to e.g. establish tests/*, worry about existing flake8 failures, or work out how to add additional optional modules.
🤖 Canonical IS Merge Bot (canonical-is-mergebot) wrote:
57. By Ryan Finnie:
Mega-noop cleanup
- Sort imports
- Remove shebangs from non-scripts
- Update MANIFEST.in so `setup.py sdist` produces usable tarballs
- Create stub tests
- Add tox.ini
- Add blank requirements.txt
- Add Makefile
- make black
- Update .bzrignore
- Clean up flake8:
- update_config.py: '.utils.json_dump_p' imported but unused
- update_config.py: 'api_reply' is assigned to but never used
- utils.py: 'sources_secrets_d' is assigned to but never used
Stuart Bishop (stub) wrote:
Looks good. The new bits seem fine (bzr, buildchain, tox). The code changes all appear to be nothing but Black reformatting (as expected).
🤖 Canonical IS Merge Bot (canonical-is-mergebot) wrote:
Change successfully merged at revision 57
Preview Diff
=== modified file '.bzrignore'
--- .bzrignore	2019-04-22 01:16:04 +0000
+++ .bzrignore	2020-06-21 22:22:36 +0000
@@ -1,5 +1,61 @@
-*.pyc
-./build/
-./dist/
-./MANIFEST
-./*.egg-info/
+MANIFEST
+.pybuild/
+.pytest_cache/
+
+# Byte-compiled / optimized / DLL files
+__pycache__/
+*.py[cod]
+
+# C extensions
+*.so
+
+# Distribution / packaging
+.Python
+env/
+build/
+develop-eggs/
+dist/
+downloads/
+eggs/
+.eggs/
+lib/
+lib64/
+parts/
+sdist/
+var/
+*.egg-info/
+.installed.cfg
+*.egg
+
+# PyInstaller
+# Usually these files are written by a python script from a template
+# before PyInstaller builds the exe, so as to inject date/other infos into it.
+*.manifest
+*.spec
+
+# Installer logs
+pip-log.txt
+pip-delete-this-directory.txt
+
+# Unit test / coverage reports
+htmlcov/
+.tox/
+.coverage
+.coverage.*
+.cache
+nosetests.xml
+coverage.xml
+*,cover
+
+# Translations
+*.mo
+*.pot
+
+# Django stuff:
+*.log
+
+# Sphinx documentation
+docs/_build/
+
+# PyBuilder
+target/
=== modified file 'MANIFEST.in'
--- MANIFEST.in	2015-03-29 07:59:33 +0000
+++ MANIFEST.in	2020-06-21 22:22:36 +0000
@@ -1,3 +1,16 @@
+include Makefile
+include README
+include requirements.txt
+include tests/*.py
+include tox.ini
 include turku-agent.cron
+include turku-agent-ping
+include turku-agent-ping.service
+include turku-agent-ping.timer
 include turku-agent-rsyncd.conf
+include turku-agent-rsyncd.init-debian
 include turku-agent-rsyncd.service
+include turku-agent-rsyncd-wrapper
+include turku-update-config
+include turku-update-config.service
+include turku-update-config.timer
=== added file 'Makefile'
--- Makefile	1970-01-01 00:00:00 +0000
+++ Makefile	2020-06-21 22:22:36 +0000
@@ -0,0 +1,28 @@
+PYTHON := python3
+
+all: build
+
+build:
+	$(PYTHON) setup.py build
+
+lint:
+	$(PYTHON) -mtox -e flake8
+
+test:
+	$(PYTHON) -mtox
+
+test-quick:
+	$(PYTHON) -mtox -e black,flake8,pytest-quick
+
+black-check:
+	$(PYTHON) -mtox -e black
+
+black:
+	$(PYTHON) -mblack $(CURDIR)
+
+install: build
+	$(PYTHON) setup.py install
+
+clean:
+	$(PYTHON) setup.py clean
+	$(RM) -r build MANIFEST
=== added file 'requirements.txt'
=== modified file 'setup.py'
--- setup.py	2019-04-22 01:16:04 +0000
+++ setup.py	2020-06-21 22:22:36 +0000
@@ -18,22 +18,22 @@
 import sys
 from setuptools import setup
 
-assert(sys.version_info > (3, 4))
+assert sys.version_info > (3, 4)
 
 
 setup(
-    name='turku_agent',
-    description='Turku backups - client agent',
-    version='0.2.0',
-    author='Ryan Finnie',
-    author_email='ryan.finnie@canonical.com',
-    url='https://launchpad.net/turku',
-    packages=['turku_agent'],
+    name="turku_agent",
+    description="Turku backups - client agent",
+    version="0.2.0",
+    author="Ryan Finnie",
+    author_email="ryan.finnie@canonical.com",
+    url="https://launchpad.net/turku",
+    packages=["turku_agent"],
     entry_points={
-        'console_scripts': [
-            'turku-agent-ping = turku_agent.ping:main',
-            'turku-agent-rsyncd-wrapper = turku_agent.rsyncd_wrapper:main',
-            'turku-update-config = turku_agent.update_config:main',
-        ],
+        "console_scripts": [
+            "turku-agent-ping = turku_agent.ping:main",
+            "turku-agent-rsyncd-wrapper = turku_agent.rsyncd_wrapper:main",
+            "turku-update-config = turku_agent.update_config:main",
+        ]
     },
 )
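The entry_points change in setup.py is quote style only; the console_scripts mapping still turns each `name = module:function` spec into an installed executable. As a minimal sketch of how such a spec resolves to a callable (a stdlib module stands in here, since turku_agent itself may not be importable in this context):

```python
from importlib import import_module


def resolve_entry_point(spec):
    # Resolve a "module.path:function" spec, the same shape used by
    # setuptools console_scripts such as "turku-agent-ping = turku_agent.ping:main".
    module_name, func_name = spec.split(":")
    return getattr(import_module(module_name), func_name)


# Demonstrated with a stdlib function rather than turku_agent.ping:main:
fn = resolve_entry_point("platform:python_version")
print(fn())  # prints the running interpreter's version string
```

The generated `turku-agent-ping` script does essentially this lookup and then calls the resolved function.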
=== added directory 'tests'
=== added file 'tests/__init__.py'
=== added file 'tests/test_stub.py'
--- tests/test_stub.py	1970-01-01 00:00:00 +0000
+++ tests/test_stub.py	2020-06-21 22:22:36 +0000
@@ -0,0 +1,8 @@
+import unittest
+import warnings
+
+
+class TestStub(unittest.TestCase):
+    def test_stub(self):
+        # pytest doesn't like a tests/ with no tests
+        warnings.warn("Remove this file once unit tests are added")
=== added file 'tox.ini'
--- tox.ini	1970-01-01 00:00:00 +0000
+++ tox.ini	2020-06-21 22:22:36 +0000
@@ -0,0 +1,38 @@
+[tox]
+envlist = black, flake8, pytest
+
+[testenv]
+basepython = python
+
+[testenv:black]
+commands = python -mblack --check .
+deps = black
+
+[testenv:flake8]
+commands = python -mflake8
+deps = flake8
+
+[testenv:pytest]
+commands = python -mpytest --cov=turku_agent --cov-report=term-missing
+deps = pytest
+       pytest-cov
+       -r{toxinidir}/requirements.txt
+
+[testenv:pytest-quick]
+commands = python -mpytest -m "not slow"
+deps = pytest
+       -r{toxinidir}/requirements.txt
+
+[flake8]
+exclude =
+    .git,
+    __pycache__,
+    .tox,
+# TODO: remove C901 once complexity is reduced
+ignore = C901,E203,E231,W503
+max-line-length = 120
+max-complexity = 10
+
+[pytest]
+markers =
+    slow
=== modified file 'turku_agent/ping.py'
--- turku_agent/ping.py	2020-03-26 06:06:23 +0000
+++ turku_agent/ping.py	2020-06-21 22:22:36 +0000
@@ -1,5 +1,3 @@
-#!/usr/bin/env python3
-
 # Turku backups - client agent
 # Copyright 2015 Canonical Ltd.
 #
@@ -16,13 +14,14 @@
 # this program. If not, see <http://www.gnu.org/licenses/>.
 
 
+import json
 import os
-import json
 import random
 import shlex
 import subprocess
 import tempfile
 import time
+
 from .utils import load_config, acquire_lock, api_call
 
 
@@ -30,42 +29,60 @@
     import argparse
 
     parser = argparse.ArgumentParser(
-        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
-    parser.add_argument('--config-dir', '-c', type=str, default='/etc/turku-agent')
-    parser.add_argument('--wait', '-w', type=float)
-    parser.add_argument('--restore', action='store_true')
-    parser.add_argument('--restore-storage', type=str, default=None)
-    parser.add_argument('--gonogo-program', type=str, default=None,
-                        help='Go/no-go program run each time to determine whether to ping')
+        formatter_class=argparse.ArgumentDefaultsHelpFormatter
+    )
+    parser.add_argument("--config-dir", "-c", type=str, default="/etc/turku-agent")
+    parser.add_argument("--wait", "-w", type=float)
+    parser.add_argument("--restore", action="store_true")
+    parser.add_argument("--restore-storage", type=str, default=None)
+    parser.add_argument(
+        "--gonogo-program",
+        type=str,
+        default=None,
+        help="Go/no-go program run each time to determine whether to ping",
+    )
     return parser.parse_args()
 
 
 def call_ssh(config, storage, ssh_req):
     # Write the server host public key
-    t = tempfile.NamedTemporaryFile(mode='w+', encoding='UTF-8')
-    for key in storage['ssh_ping_host_keys']:
-        t.write('%s %s\n' % (storage['ssh_ping_host'], key))
+    t = tempfile.NamedTemporaryFile(mode="w+", encoding="UTF-8")
+    for key in storage["ssh_ping_host_keys"]:
+        t.write("%s %s\n" % (storage["ssh_ping_host"], key))
     t.flush()
 
     # Call ssh
-    ssh_command = config['ssh_command']
+    ssh_command = config["ssh_command"]
     ssh_command += [
-        '-T',
-        '-o', 'BatchMode=yes',
-        '-o', 'UserKnownHostsFile=%s' % t.name,
-        '-o', 'StrictHostKeyChecking=yes',
-        '-o', 'CheckHostIP=no',
-        '-i', config['ssh_private_key_file'],
-        '-R', '%d:%s:%d' % (ssh_req['port'], config['rsyncd_local_address'], config['rsyncd_local_port']),
-        '-p', str(storage['ssh_ping_port']),
-        '-l', storage['ssh_ping_user'],
-        storage['ssh_ping_host'],
-        'turku-ping-remote',
+        "-T",
+        "-o",
+        "BatchMode=yes",
+        "-o",
+        "UserKnownHostsFile=%s" % t.name,
+        "-o",
+        "StrictHostKeyChecking=yes",
+        "-o",
+        "CheckHostIP=no",
+        "-i",
+        config["ssh_private_key_file"],
+        "-R",
+        "%d:%s:%d"
+        % (
+            ssh_req["port"],
+            config["rsyncd_local_address"],
+            config["rsyncd_local_port"],
+        ),
+        "-p",
+        str(storage["ssh_ping_port"]),
+        "-l",
+        storage["ssh_ping_user"],
+        storage["ssh_ping_host"],
+        "turku-ping-remote",
     ]
     p = subprocess.Popen(ssh_command, stdin=subprocess.PIPE)
 
     # Write the ssh request
-    p.stdin.write((json.dumps(ssh_req) + '\n.\n').encode('UTF-8'))
+    p.stdin.write((json.dumps(ssh_req) + "\n.\n").encode("UTF-8"))
    p.stdin.flush()
 
     # Wait for the server to close the SSH connection
@@ -88,16 +105,18 @@
     config = load_config(args.config_dir)
 
     # Basic checks
-    for i in ('ssh_private_key_file', 'machine_uuid', 'machine_secret', 'api_url'):
+    for i in ("ssh_private_key_file", "machine_uuid", "machine_secret", "api_url"):
         if i not in config:
             return
-    if not os.path.isfile(config['ssh_private_key_file']):
+    if not os.path.isfile(config["ssh_private_key_file"]):
         return
 
     # If a go/no-go program is defined, run it and only go if it exits 0.
     # Example: prevent backups during high-load for sensitive systems:
     # ['check_load', '-c', '1,5,15']
-    gonogo_program = args.gonogo_program if args.gonogo_program else config['gonogo_program']
+    gonogo_program = (
+        args.gonogo_program if args.gonogo_program else config["gonogo_program"]
+    )
     if isinstance(gonogo_program, (list, tuple)):
         # List, program name first, optional arguments after
         gonogo_program_and_args = list(gonogo_program)
@@ -113,106 +132,111 @@
     except (subprocess.CalledProcessError, OSError):
         return
 
-    lock = acquire_lock(os.path.join(config['lock_dir'], 'turku-agent-ping.lock'))
+    lock = acquire_lock(os.path.join(config["lock_dir"], "turku-agent-ping.lock"))
 
     restore_mode = args.restore
 
     # Check with the API server
     api_out = {}
 
-    machine_merge_map = (
-        ('machine_uuid', 'uuid'),
-        ('machine_secret', 'secret'),
-    )
-    api_out['machine'] = {}
+    machine_merge_map = (("machine_uuid", "uuid"), ("machine_secret", "secret"))
+    api_out["machine"] = {}
     for a, b in machine_merge_map:
         if a in config:
-            api_out['machine'][b] = config[a]
+            api_out["machine"][b] = config[a]
 
     if restore_mode:
-        print('Entering restore mode.')
+        print("Entering restore mode.")
         print()
-        api_reply = api_call(config['api_url'], 'agent_ping_restore', api_out)
+        api_reply = api_call(config["api_url"], "agent_ping_restore", api_out)
 
         sources_by_storage = {}
-        for source_name in api_reply['machine']['sources']:
-            source = api_reply['machine']['sources'][source_name]
-            if source_name not in config['sources']:
+        for source_name in api_reply["machine"]["sources"]:
+            source = api_reply["machine"]["sources"][source_name]
+            if source_name not in config["sources"]:
                 continue
-            if 'storage' not in source:
+            if "storage" not in source:
                 continue
-            if source['storage']['name'] not in sources_by_storage:
-                sources_by_storage[source['storage']['name']] = {}
-            sources_by_storage[source['storage']['name']][source_name] = source
+            if source["storage"]["name"] not in sources_by_storage:
+                sources_by_storage[source["storage"]["name"]] = {}
+            sources_by_storage[source["storage"]["name"]][source_name] = source
 
         if len(sources_by_storage) == 0:
-            print('Cannot find any appropraite sources.')
+            print("Cannot find any appropraite sources.")
             return
-        print('This machine\'s sources are on the following storage units:')
+        print("This machine's sources are on the following storage units:")
         for storage_name in sources_by_storage:
-            print(' %s' % storage_name)
+            print(" %s" % storage_name)
             for source_name in sources_by_storage[storage_name]:
-                print(' %s' % source_name)
+                print(" %s" % source_name)
         print()
         if len(sources_by_storage) == 1:
-            storage = list(list(sources_by_storage.values())[0].values())[0]['storage']
+            storage = list(list(sources_by_storage.values())[0].values())[0]["storage"]
        elif args.restore_storage:
            if args.restore_storage in sources_by_storage:
-                storage = sources_by_storage[args.restore_storage]['storage']
+                storage = sources_by_storage[args.restore_storage]["storage"]
            else:
                print('Cannot find appropriate storage "%s"' % args.restore_storage)
                return
        else:
-            print('Multiple storages found. Please use --restore-storage to specify one.')
+            print(
+                "Multiple storages found. Please use --restore-storage to specify one."
+            )
            return
 
        ssh_req = {
-            'verbose': True,
-            'action': 'restore',
-            'port': random.randint(49152, 65535),
+            "verbose": True,
+            "action": "restore",
+            "port": random.randint(49152, 65535),
        }
-        print('Storage unit: %s' % storage['name'])
-        if 'restore_path' in config:
-            print('Local destination path: %s' % config['restore_path'])
-            print('Sample restore usage from storage unit:')
+        print("Storage unit: %s" % storage["name"])
+        if "restore_path" in config:
+            print("Local destination path: %s" % config["restore_path"])
+            print("Sample restore usage from storage unit:")
            print(
-                ' RSYNC_PASSWORD=%s rsync -avzP --numeric-ids ${P?}/ rsync://%s@127.0.0.1:%s/%s/' % (
-                    config['restore_password'],
-                    config['restore_username'],
-                    ssh_req['port'], config['restore_module']
+                " RSYNC_PASSWORD=%s rsync -avzP --numeric-ids ${P?}/ rsync://%s@127.0.0.1:%s/%s/"
+                % (
+                    config["restore_password"],
+                    config["restore_username"],
+                    ssh_req["port"],
+                    config["restore_module"],
                )
            )
            print()
        call_ssh(config, storage, ssh_req)
    else:
-        api_reply = api_call(config['api_url'], 'agent_ping_checkin', api_out)
+        api_reply = api_call(config["api_url"], "agent_ping_checkin", api_out)
 
-        if 'scheduled_sources' not in api_reply:
+        if "scheduled_sources" not in api_reply:
            return
        sources_by_storage = {}
-        for source_name in api_reply['machine']['scheduled_sources']:
-            source = api_reply['machine']['scheduled_sources'][source_name]
-            if source_name not in config['sources']:
+        for source_name in api_reply["machine"]["scheduled_sources"]:
+            source = api_reply["machine"]["scheduled_sources"][source_name]
+            if source_name not in config["sources"]:
                continue
-            if 'storage' not in source:
+            if "storage" not in source:
                continue
-            if source['storage']['name'] not in sources_by_storage:
-                sources_by_storage[source['storage']['name']] = {}
-            sources_by_storage[source['storage']['name']][source_name] = source
+            if source["storage"]["name"] not in sources_by_storage:
+                sources_by_storage[source["storage"]["name"]] = {}
+            sources_by_storage[source["storage"]["name"]][source_name] = source
 
        for storage_name in sources_by_storage:
            ssh_req = {
-                'verbose': True,
-                'action': 'checkin',
-                'port': random.randint(49152, 65535),
-                'sources': {},
+                "verbose": True,
+                "action": "checkin",
+                "port": random.randint(49152, 65535),
+                "sources": {},
            }
            for source in sources_by_storage[storage_name]:
-                ssh_req['sources'][source] = {
-                    'username': config['sources'][source]['username'],
-                    'password': config['sources'][source]['password'],
+                ssh_req["sources"][source] = {
+                    "username": config["sources"][source]["username"],
+                    "password": config["sources"][source]["password"],
                }
-            call_ssh(config, list(sources_by_storage[storage_name].values())[0]['storage'], ssh_req)
+            call_ssh(
+                config,
+                list(sources_by_storage[storage_name].values())[0]["storage"],
+                ssh_req,
+            )
 
    # Cleanup
    lock.close()
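Nearly all of the ping.py churn above is Black reformatting; the underlying logic is unchanged, and its core move is grouping each configured source under its storage unit's name so that one SSH session is opened per storage unit. A toy sketch of that grouping, with made-up source and storage names:

```python
# Hypothetical API reply data, mimicking the shape ping.py iterates over.
api_sources = {
    "etc": {"storage": {"name": "storage-a"}},
    "home": {"storage": {"name": "storage-a"}},
    "var-backups": {"storage": {"name": "storage-b"}},
}
configured = {"etc", "home", "var-backups"}  # stand-in for config["sources"]

# Same filtering and grouping as ping.py, condensed with setdefault().
sources_by_storage = {}
for source_name, source in api_sources.items():
    if source_name not in configured or "storage" not in source:
        continue
    sources_by_storage.setdefault(source["storage"]["name"], {})[source_name] = source

print(sorted(sources_by_storage))               # ['storage-a', 'storage-b']
print(sorted(sources_by_storage["storage-a"]))  # ['etc', 'home']
```

Each key of the resulting dict then drives one call_ssh() invocation carrying only that unit's sources.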
=== modified file 'turku_agent/rsyncd_wrapper.py'
--- turku_agent/rsyncd_wrapper.py	2019-04-22 01:16:04 +0000
+++ turku_agent/rsyncd_wrapper.py	2020-06-21 22:22:36 +0000
@@ -16,6 +16,7 @@
 # this program. If not, see <http://www.gnu.org/licenses/>.
 
 import os
+
 from .utils import load_config
 
 
@@ -23,9 +24,10 @@
     import argparse
 
     parser = argparse.ArgumentParser(
-        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
-    parser.add_argument('--config-dir', '-c', type=str, default='/etc/turku-agent')
-    parser.add_argument('--detach', action='store_true')
+        formatter_class=argparse.ArgumentDefaultsHelpFormatter
+    )
+    parser.add_argument("--config-dir", "-c", type=str, default="/etc/turku-agent")
+    parser.add_argument("--detach", action="store_true")
     return parser.parse_known_args()
 
 
@@ -33,10 +35,12 @@
     args, rest = parse_args()
 
     config = load_config(args.config_dir)
-    rsyncd_command = config['rsyncd_command']
+    rsyncd_command = config["rsyncd_command"]
     if not args.detach:
-        rsyncd_command.append('--no-detach')
-    rsyncd_command.append('--daemon')
-    rsyncd_command.append('--config=%s' % os.path.join(config['var_dir'], 'rsyncd.conf'))
+        rsyncd_command.append("--no-detach")
+    rsyncd_command.append("--daemon")
+    rsyncd_command.append(
+        "--config=%s" % os.path.join(config["var_dir"], "rsyncd.conf")
+    )
     rsyncd_command += rest
     os.execvp(rsyncd_command[0], rsyncd_command)
=== modified file 'turku_agent/update_config.py'
--- turku_agent/update_config.py	2019-05-23 14:18:25 +0000
+++ turku_agent/update_config.py	2020-06-21 22:22:36 +0000
@@ -1,5 +1,3 @@
-#!/usr/bin/env python3
-
 # Turku backups - client agent
 # Copyright 2015 Canonical Ltd.
 #
@@ -15,12 +13,13 @@
 # You should have received a copy of the GNU General Public License along with
 # this program. If not, see <http://www.gnu.org/licenses/>.
 
+import logging
+import os
 import random
-import os
 import subprocess
 import time
-import logging
-from .utils import json_dump_p, json_dumps_p, load_config, fill_config, acquire_lock, api_call
+
+from .utils import json_dumps_p, load_config, fill_config, acquire_lock, api_call
 
 
 class IncompleteConfigError(Exception):
@@ -31,69 +30,70 @@
     import argparse
 
     parser = argparse.ArgumentParser(
-        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
-    parser.add_argument('--config-dir', '-c', type=str, default='/etc/turku-agent')
-    parser.add_argument('--wait', '-w', type=float)
-    parser.add_argument('--debug', action='store_true')
+        formatter_class=argparse.ArgumentDefaultsHelpFormatter
+    )
+    parser.add_argument("--config-dir", "-c", type=str, default="/etc/turku-agent")
+    parser.add_argument("--wait", "-w", type=float)
+    parser.add_argument("--debug", action="store_true")
     return parser.parse_args()
 
 
 def write_conf_files(config):
     # Build rsyncd.conf
     built_rsyncd_conf = (
-        'address = %s\n' % config['rsyncd_local_address'] +
-        'port = %d\n' % config['rsyncd_local_port'] +
-        'log file = /dev/stdout\n' +
-        'uid = root\n' +
-        'gid = root\n' +
-        'list = false\n\n'
+        "address = %s\n" % config["rsyncd_local_address"]
+        + "port = %d\n" % config["rsyncd_local_port"]
+        + "log file = /dev/stdout\n"
+        + "uid = root\n"
+        + "gid = root\n"
+        + "list = false\n\n"
     )
     rsyncd_secrets = []
-    rsyncd_secrets.append((config['restore_username'], config['restore_password']))
+    rsyncd_secrets.append((config["restore_username"], config["restore_password"]))
     built_rsyncd_conf += (
-        '[%s]\n' +
-        ' path = %s\n' +
-        ' auth users = %s\n' +
-        ' secrets file = %s\n' +
-        ' read only = false\n\n'
+        "[%s]\n"
+        + " path = %s\n"
+        + " auth users = %s\n"
+        + " secrets file = %s\n"
+        + " read only = false\n\n"
     ) % (
-        config['restore_module'],
-        config['restore_path'],
-        config['restore_username'],
-        os.path.join(config['var_dir'], 'rsyncd.secrets'),
+        config["restore_module"],
+        config["restore_path"],
+        config["restore_username"],
+        os.path.join(config["var_dir"], "rsyncd.secrets"),
     )
-    for s in config['sources']:
-        sd = config['sources'][s]
-        rsyncd_secrets.append((sd['username'], sd['password']))
+    for s in config["sources"]:
+        sd = config["sources"][s]
+        rsyncd_secrets.append((sd["username"], sd["password"]))
         built_rsyncd_conf += (
-            '[%s]\n' +
-            ' path = %s\n' +
-            ' auth users = %s\n' +
-            ' secrets file = %s\n' +
-            ' read only = true\n\n'
+            "[%s]\n"
+            + " path = %s\n"
+            + " auth users = %s\n"
+            + " secrets file = %s\n"
+            + " read only = true\n\n"
         ) % (
             s,
-            sd['path'],
-            sd['username'],
-            os.path.join(config['var_dir'], 'rsyncd.secrets'),
+            sd["path"],
+            sd["username"],
+            os.path.join(config["var_dir"], "rsyncd.secrets"),
         )
-    with open(os.path.join(config['var_dir'], 'rsyncd.conf'), 'w') as f:
+    with open(os.path.join(config["var_dir"], "rsyncd.conf"), "w") as f:
         f.write(built_rsyncd_conf)
 
     # Build rsyncd.secrets
-    built_rsyncd_secrets = ''
+    built_rsyncd_secrets = ""
     for (username, password) in rsyncd_secrets:
-        built_rsyncd_secrets += username + ':' + password + '\n'
-    with open(os.path.join(config['var_dir'], 'rsyncd.secrets'), 'w') as f:
+        built_rsyncd_secrets += username + ":" + password + "\n"
+    with open(os.path.join(config["var_dir"], "rsyncd.secrets"), "w") as f:
         os.fchmod(f.fileno(), 0o600)
         f.write(built_rsyncd_secrets)
 
 
 def init_is_upstart():
     try:
-        return 'upstart' in subprocess.check_output(
-            ['initctl', 'version'],
-            stderr=subprocess.DEVNULL, universal_newlines=True)
+        return "upstart" in subprocess.check_output(
+            ["initctl", "version"], stderr=subprocess.DEVNULL, universal_newlines=True
+        )
     except (FileNotFoundError, subprocess.CalledProcessError):
         return False
701 | 99 | 99 | ||
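The `init_is_upstart()` hunk above wraps `subprocess.check_output` to test whether a marker string appears in a command's output, treating both a missing binary and a non-zero exit as "no". A minimal standalone sketch of that pattern (the `output_contains` helper name is mine, not from the diff):

```python
import subprocess


def output_contains(marker, cmd):
    """Return True if `marker` appears in the command's stdout.

    Mirrors the init_is_upstart() idiom: stderr is discarded, and a
    missing or failing command simply yields False instead of raising.
    """
    try:
        return marker in subprocess.check_output(
            cmd, stderr=subprocess.DEVNULL, universal_newlines=True
        )
    except (FileNotFoundError, subprocess.CalledProcessError):
        return False
```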
702 | @@ -107,52 +107,54 @@ | |||
703 | 107 | # With Upstart, start will fail if the service is already running, | 107 | # With Upstart, start will fail if the service is already running, |
704 | 108 | # so we need to check for that first. | 108 | # so we need to check for that first. |
705 | 109 | try: | 109 | try: |
709 | 110 | if 'start/running' in subprocess.check_output( | 110 | if "start/running" in subprocess.check_output( |
710 | 111 | ['status', 'turku-agent-rsyncd'], | 111 | ["status", "turku-agent-rsyncd"], |
711 | 112 | stderr=subprocess.STDOUT, universal_newlines=True): | 112 | stderr=subprocess.STDOUT, |
712 | 113 | universal_newlines=True, | ||
713 | 114 | ): | ||
714 | 113 | return | 115 | return |
715 | 114 | except subprocess.CalledProcessError: | 116 | except subprocess.CalledProcessError: |
716 | 115 | pass | 117 | pass |
718 | 116 | subprocess.check_call(['service', 'turku-agent-rsyncd', 'start']) | 118 | subprocess.check_call(["service", "turku-agent-rsyncd", "start"]) |
719 | 117 | 119 | ||
720 | 118 | 120 | ||
721 | 119 | def send_config(config): | 121 | def send_config(config): |
725 | 120 | required_keys = ['api_url'] | 122 | required_keys = ["api_url"] |
726 | 121 | if 'api_auth' not in config: | 123 | if "api_auth" not in config: |
727 | 122 | required_keys += ['api_auth_name', 'api_auth_secret'] | 124 | required_keys += ["api_auth_name", "api_auth_secret"] |
728 | 123 | for k in required_keys: | 125 | for k in required_keys: |
729 | 124 | if k not in config: | 126 | if k not in config: |
730 | 125 | raise IncompleteConfigError('Required config "%s" not found.' % k) | 127 | raise IncompleteConfigError('Required config "%s" not found.' % k) |
731 | 126 | 128 | ||
732 | 127 | api_out = {} | 129 | api_out = {} |
734 | 128 | if ('api_auth_name' in config) and ('api_auth_secret' in config): | 130 | if ("api_auth_name" in config) and ("api_auth_secret" in config): |
735 | 129 | # name/secret style | 131 | # name/secret style |
739 | 130 | api_out['auth'] = { | 132 | api_out["auth"] = { |
740 | 131 | 'name': config['api_auth_name'], | 133 | "name": config["api_auth_name"], |
741 | 132 | 'secret': config['api_auth_secret'], | 134 | "secret": config["api_auth_secret"], |
742 | 133 | } | 135 | } |
743 | 134 | else: | 136 | else: |
744 | 135 | # nameless secret style | 137 | # nameless secret style |
746 | 136 | api_out['auth'] = config['api_auth'] | 138 | api_out["auth"] = config["api_auth"] |
747 | 137 | 139 | ||
748 | 138 | # Merge the following options into the machine section | 140 | # Merge the following options into the machine section |
749 | 139 | machine_merge_map = ( | 141 | machine_merge_map = ( |
757 | 140 | ('machine_uuid', 'uuid'), | 142 | ("machine_uuid", "uuid"), |
758 | 141 | ('machine_secret', 'secret'), | 143 | ("machine_secret", "secret"), |
759 | 142 | ('environment_name', 'environment_name'), | 144 | ("environment_name", "environment_name"), |
760 | 143 | ('service_name', 'service_name'), | 145 | ("service_name", "service_name"), |
761 | 144 | ('unit_name', 'unit_name'), | 146 | ("unit_name", "unit_name"), |
762 | 145 | ('ssh_public_key', 'ssh_public_key'), | 147 | ("ssh_public_key", "ssh_public_key"), |
763 | 146 | ('published', 'published'), | 148 | ("published", "published"), |
764 | 147 | ) | 149 | ) |
766 | 148 | api_out['machine'] = {} | 150 | api_out["machine"] = {} |
767 | 149 | for a, b in machine_merge_map: | 151 | for a, b in machine_merge_map: |
768 | 150 | if a in config: | 152 | if a in config: |
774 | 151 | api_out['machine'][b] = config[a] | 153 | api_out["machine"][b] = config[a] |
775 | 152 | 154 | ||
776 | 153 | api_out['machine']['sources'] = config['sources'] | 155 | api_out["machine"]["sources"] = config["sources"] |
777 | 154 | 156 | ||
778 | 155 | api_reply = api_call(config['api_url'], 'update_config', api_out) | 157 | api_call(config["api_url"], "update_config", api_out) |
779 | 156 | 158 | ||
780 | 157 | 159 | ||
781 | 158 | def main(): | 160 | def main(): |
782 | @@ -162,7 +164,7 @@ | |||
783 | 162 | time.sleep(random.uniform(0, args.wait)) | 164 | time.sleep(random.uniform(0, args.wait)) |
784 | 163 | 165 | ||
785 | 164 | config = load_config(args.config_dir) | 166 | config = load_config(args.config_dir) |
787 | 165 | lock = acquire_lock(os.path.join(config['lock_dir'], 'turku-update-config.lock')) | 167 | lock = acquire_lock(os.path.join(config["lock_dir"], "turku-update-config.lock")) |
788 | 166 | fill_config(config) | 168 | fill_config(config) |
789 | 167 | if args.debug: | 169 | if args.debug: |
790 | 168 | print(json_dumps_p(config)) | 170 | print(json_dumps_p(config)) |
791 | 169 | 171 | ||
792 | === modified file 'turku_agent/utils.py' | |||
793 | --- turku_agent/utils.py 2020-03-23 22:31:56 +0000 | |||
794 | +++ turku_agent/utils.py 2020-06-21 22:22:36 +0000 | |||
795 | @@ -1,5 +1,3 @@ | |||
796 | 1 | #!/usr/bin/env python3 | ||
797 | 2 | |||
798 | 3 | # Turku backups - client agent | 1 | # Turku backups - client agent |
799 | 4 | # Copyright 2015 Canonical Ltd. | 2 | # Copyright 2015 Canonical Ltd. |
800 | 5 | # | 3 | # |
801 | @@ -15,32 +13,34 @@ | |||
802 | 15 | # You should have received a copy of the GNU General Public License along with | 13 | # You should have received a copy of the GNU General Public License along with |
803 | 16 | # this program. If not, see <http://www.gnu.org/licenses/>. | 14 | # this program. If not, see <http://www.gnu.org/licenses/>. |
804 | 17 | 15 | ||
808 | 18 | import uuid | 16 | import copy |
809 | 19 | import string | 17 | import http.client |
807 | 20 | import random | ||
810 | 21 | import json | 18 | import json |
811 | 22 | import os | 19 | import os |
813 | 23 | import copy | 20 | import platform |
814 | 21 | import random | ||
815 | 22 | import string | ||
816 | 24 | import subprocess | 23 | import subprocess |
817 | 25 | import platform | ||
818 | 26 | import urllib.parse | 24 | import urllib.parse |
823 | 27 | import http.client | 25 | import uuid |
824 | 28 | 26 | ||
825 | 29 | 27 | ||
826 | 30 | class RuntimeLock(): | 28 | class RuntimeLock: |
827 | 31 | name = None | 29 | name = None |
828 | 32 | file = None | 30 | file = None |
829 | 33 | 31 | ||
830 | 34 | def __init__(self, name): | 32 | def __init__(self, name): |
831 | 35 | import fcntl | 33 | import fcntl |
833 | 36 | file = open(name, 'w') | 34 | |
834 | 35 | file = open(name, "w") | ||
835 | 37 | try: | 36 | try: |
836 | 38 | fcntl.lockf(file, fcntl.LOCK_EX | fcntl.LOCK_NB) | 37 | fcntl.lockf(file, fcntl.LOCK_EX | fcntl.LOCK_NB) |
837 | 39 | except IOError as e: | 38 | except IOError as e: |
838 | 40 | import errno | 39 | import errno |
839 | 40 | |||
840 | 41 | if e.errno in (errno.EACCES, errno.EAGAIN): | 41 | if e.errno in (errno.EACCES, errno.EAGAIN): |
841 | 42 | raise | 42 | raise |
843 | 43 | file.write('%10s\n' % os.getpid()) | 43 | file.write("%10s\n" % os.getpid()) |
844 | 44 | file.flush() | 44 | file.flush() |
845 | 45 | file.seek(0) | 45 | file.seek(0) |
846 | 46 | self.name = name | 46 | self.name = name |
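The `RuntimeLock` hunk is pure Black reformatting, but the locking idiom it touches is worth seeing in isolation: an exclusive, non-blocking `fcntl.lockf` lock with the holder's PID written into the lock file. A minimal sketch (the lock path here is illustrative, not the agent's real lock location):

```python
import fcntl
import os
import tempfile

# Acquire an exclusive, non-blocking lock and record our PID, the
# same idiom RuntimeLock.__init__ uses.
lock_path = os.path.join(tempfile.gettempdir(), "turku-example.lock")
lock_file = open(lock_path, "w")
fcntl.lockf(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
lock_file.write("%10s\n" % os.getpid())
lock_file.flush()

# Any other process attempting LOCK_EX | LOCK_NB on this file would
# now get an IOError (EACCES/EAGAIN) until the file is closed.
with open(lock_path) as f:
    recorded_pid = int(f.read().strip())
```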
847 | @@ -71,12 +71,12 @@ | |||
848 | 71 | 71 | ||
849 | 72 | def json_dump_p(obj, f): | 72 | def json_dump_p(obj, f): |
850 | 73 | """Calls json.dump with standard (pretty) formatting""" | 73 | """Calls json.dump with standard (pretty) formatting""" |
852 | 74 | return json.dump(obj, f, sort_keys=True, indent=4, separators=(',', ': ')) | 74 | return json.dump(obj, f, sort_keys=True, indent=4, separators=(",", ": ")) |
853 | 75 | 75 | ||
854 | 76 | 76 | ||
855 | 77 | def json_dumps_p(obj): | 77 | def json_dumps_p(obj): |
856 | 78 | """Calls json.dumps with standard (pretty) formatting""" | 78 | """Calls json.dumps with standard (pretty) formatting""" |
858 | 79 | return json.dumps(obj, sort_keys=True, indent=4, separators=(',', ': ')) | 79 | return json.dumps(obj, sort_keys=True, indent=4, separators=(",", ": ")) |
859 | 80 | 80 | ||
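`json_dump_p`/`json_dumps_p` only centralise the `json.dump(s)` arguments; for reference, the deterministic "pretty" formatting they produce looks like this:

```python
import json


def json_dumps_p(obj):
    """Pretty, deterministic JSON: sorted keys, 4-space indent."""
    return json.dumps(obj, sort_keys=True, indent=4, separators=(",", ": "))


example = json_dumps_p({"b": 1, "a": [1, 2]})
```

Sorting keys makes the on-disk config.d/sources.d files stable across runs, so rewrites don't produce spurious diffs.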
860 | 81 | 81 | ||
861 | 82 | def json_load_file(file): | 82 | def json_load_file(file): |
862 | @@ -103,10 +103,10 @@ | |||
863 | 103 | 103 | ||
864 | 104 | def load_config(config_dir): | 104 | def load_config(config_dir): |
865 | 105 | config = {} | 105 | config = {} |
867 | 106 | config['config_dir'] = config_dir | 106 | config["config_dir"] = config_dir |
868 | 107 | 107 | ||
871 | 108 | config_d = os.path.join(config['config_dir'], 'config.d') | 108 | config_d = os.path.join(config["config_dir"], "config.d") |
872 | 109 | sources_d = os.path.join(config['config_dir'], 'sources.d') | 109 | sources_d = os.path.join(config["config_dir"], "sources.d") |
873 | 110 | 110 | ||
874 | 111 | # Merge in config.d/*.json to the root level | 111 | # Merge in config.d/*.json to the root level |
875 | 112 | config_files = [] | 112 | config_files = [] |
876 | @@ -114,7 +114,7 @@ | |||
877 | 114 | config_files = [ | 114 | config_files = [ |
878 | 115 | os.path.join(config_d, fn) | 115 | os.path.join(config_d, fn) |
879 | 116 | for fn in os.listdir(config_d) | 116 | for fn in os.listdir(config_d) |
881 | 117 | if fn.endswith('.json') | 117 | if fn.endswith(".json") |
882 | 118 | and os.path.isfile(os.path.join(config_d, fn)) | 118 | and os.path.isfile(os.path.join(config_d, fn)) |
883 | 119 | and os.access(os.path.join(config_d, fn), os.R_OK) | 119 | and os.access(os.path.join(config_d, fn), os.R_OK) |
884 | 120 | ] | 120 | ] |
885 | @@ -122,10 +122,10 @@ | |||
886 | 122 | for file in config_files: | 122 | for file in config_files: |
887 | 123 | config = dict_merge(config, json_load_file(file)) | 123 | config = dict_merge(config, json_load_file(file)) |
888 | 124 | 124 | ||
891 | 125 | if 'var_dir' not in config: | 125 | if "var_dir" not in config: |
892 | 126 | config['var_dir'] = '/var/lib/turku-agent' | 126 | config["var_dir"] = "/var/lib/turku-agent" |
893 | 127 | 127 | ||
895 | 128 | var_config_d = os.path.join(config['var_dir'], 'config.d') | 128 | var_config_d = os.path.join(config["var_dir"], "config.d") |
896 | 129 | 129 | ||
897 | 130 | # Load /var config.d files | 130 | # Load /var config.d files |
898 | 131 | var_config = {} | 131 | var_config = {} |
899 | @@ -134,7 +134,7 @@ | |||
900 | 134 | var_config_files = [ | 134 | var_config_files = [ |
901 | 135 | os.path.join(var_config_d, fn) | 135 | os.path.join(var_config_d, fn) |
902 | 136 | for fn in os.listdir(var_config_d) | 136 | for fn in os.listdir(var_config_d) |
904 | 137 | if fn.endswith('.json') | 137 | if fn.endswith(".json") |
905 | 138 | and os.path.isfile(os.path.join(var_config_d, fn)) | 138 | and os.path.isfile(os.path.join(var_config_d, fn)) |
906 | 139 | and os.access(os.path.join(var_config_d, fn), os.R_OK) | 139 | and os.access(os.path.join(var_config_d, fn), os.R_OK) |
907 | 140 | ] | 140 | ] |
908 | @@ -145,40 +145,40 @@ | |||
909 | 145 | var_config = dict_merge(var_config, config) | 145 | var_config = dict_merge(var_config, config) |
910 | 146 | config = var_config | 146 | config = var_config |
911 | 147 | 147 | ||
926 | 148 | if 'lock_dir' not in config: | 148 | if "lock_dir" not in config: |
927 | 149 | config['lock_dir'] = '/var/lock' | 149 | config["lock_dir"] = "/var/lock" |
928 | 150 | 150 | ||
929 | 151 | if 'rsyncd_command' not in config: | 151 | if "rsyncd_command" not in config: |
930 | 152 | config['rsyncd_command'] = ['rsync'] | 152 | config["rsyncd_command"] = ["rsync"] |
931 | 153 | 153 | ||
932 | 154 | if 'rsyncd_local_address' not in config: | 154 | if "rsyncd_local_address" not in config: |
933 | 155 | config['rsyncd_local_address'] = '127.0.0.1' | 155 | config["rsyncd_local_address"] = "127.0.0.1" |
934 | 156 | 156 | ||
935 | 157 | if 'rsyncd_local_port' not in config: | 157 | if "rsyncd_local_port" not in config: |
936 | 158 | config['rsyncd_local_port'] = 27873 | 158 | config["rsyncd_local_port"] = 27873 |
937 | 159 | 159 | ||
938 | 160 | if 'ssh_command' not in config: | 160 | if "ssh_command" not in config: |
939 | 161 | config['ssh_command'] = ['ssh'] | 161 | config["ssh_command"] = ["ssh"] |
940 | 162 | 162 | ||
941 | 163 | # If a go/no-go program is defined, run it and only go if it exits 0. | 163 | # If a go/no-go program is defined, run it and only go if it exits 0. |
942 | 164 | # Type: String (program with no args) or list (program first, optional arguments after) | 164 | # Type: String (program with no args) or list (program first, optional arguments after) |
945 | 165 | if 'gonogo_program' not in config: | 165 | if "gonogo_program" not in config: |
946 | 166 | config['gonogo_program'] = None | 166 | config["gonogo_program"] = None |
947 | 167 | 167 | ||
949 | 168 | var_sources_d = os.path.join(config['var_dir'], 'sources.d') | 168 | var_sources_d = os.path.join(config["var_dir"], "sources.d") |
950 | 169 | 169 | ||
951 | 170 | # Validate the unit name | 170 | # Validate the unit name |
954 | 171 | if 'unit_name' not in config: | 171 | if "unit_name" not in config: |
955 | 172 | config['unit_name'] = platform.node() | 172 | config["unit_name"] = platform.node() |
956 | 173 | # If this isn't in the on-disk config, don't write it; just | 173 | # If this isn't in the on-disk config, don't write it; just |
957 | 174 | # generate it every time | 174 | # generate it every time |
958 | 175 | 175 | ||
959 | 176 | # Pull the SSH public key | 176 | # Pull the SSH public key |
965 | 177 | if os.path.isfile(os.path.join(config['var_dir'], 'ssh_key.pub')): | 177 | if os.path.isfile(os.path.join(config["var_dir"], "ssh_key.pub")): |
966 | 178 | with open(os.path.join(config['var_dir'], 'ssh_key.pub')) as f: | 178 | with open(os.path.join(config["var_dir"], "ssh_key.pub")) as f: |
967 | 179 | config['ssh_public_key'] = f.read().rstrip() | 179 | config["ssh_public_key"] = f.read().rstrip() |
968 | 180 | config['ssh_public_key_file'] = os.path.join(config['var_dir'], 'ssh_key.pub') | 180 | config["ssh_public_key_file"] = os.path.join(config["var_dir"], "ssh_key.pub") |
969 | 181 | config['ssh_private_key_file'] = os.path.join(config['var_dir'], 'ssh_key') | 181 | config["ssh_private_key_file"] = os.path.join(config["var_dir"], "ssh_key") |
970 | 182 | 182 | ||
971 | 183 | sources_config = {} | 183 | sources_config = {} |
972 | 184 | # Merge in sources.d/*.json to the sources dict | 184 | # Merge in sources.d/*.json to the sources dict |
973 | @@ -187,7 +187,7 @@ | |||
974 | 187 | sources_files = [ | 187 | sources_files = [ |
975 | 188 | os.path.join(sources_d, fn) | 188 | os.path.join(sources_d, fn) |
976 | 189 | for fn in os.listdir(sources_d) | 189 | for fn in os.listdir(sources_d) |
978 | 190 | if fn.endswith('.json') | 190 | if fn.endswith(".json") |
979 | 191 | and os.path.isfile(os.path.join(sources_d, fn)) | 191 | and os.path.isfile(os.path.join(sources_d, fn)) |
980 | 192 | and os.access(os.path.join(sources_d, fn), os.R_OK) | 192 | and os.access(os.path.join(sources_d, fn), os.R_OK) |
981 | 193 | ] | 193 | ] |
982 | @@ -197,7 +197,7 @@ | |||
983 | 197 | var_sources_files = [ | 197 | var_sources_files = [ |
984 | 198 | os.path.join(var_sources_d, fn) | 198 | os.path.join(var_sources_d, fn) |
985 | 199 | for fn in os.listdir(var_sources_d) | 199 | for fn in os.listdir(var_sources_d) |
987 | 200 | if fn.endswith('.json') | 200 | if fn.endswith(".json") |
988 | 201 | and os.path.isfile(os.path.join(var_sources_d, fn)) | 201 | and os.path.isfile(os.path.join(var_sources_d, fn)) |
989 | 202 | and os.access(os.path.join(var_sources_d, fn), os.R_OK) | 202 | and os.access(os.path.join(var_sources_d, fn), os.R_OK) |
990 | 203 | ] | 203 | ] |
991 | @@ -208,19 +208,19 @@ | |||
992 | 208 | 208 | ||
993 | 209 | # Check for required sources options | 209 | # Check for required sources options |
994 | 210 | for s in list(sources_config.keys()): | 210 | for s in list(sources_config.keys()): |
996 | 211 | if 'path' not in sources_config[s]: | 211 | if "path" not in sources_config[s]: |
997 | 212 | del sources_config[s] | 212 | del sources_config[s] |
998 | 213 | 213 | ||
1000 | 214 | config['sources'] = sources_config | 214 | config["sources"] = sources_config |
1001 | 215 | 215 | ||
1002 | 216 | return config | 216 | return config |
1003 | 217 | 217 | ||
1004 | 218 | 218 | ||
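`load_config` layers config.d, /var config.d, and sources.d fragments via `dict_merge`, which is defined elsewhere in utils.py and not shown in this hunk. A typical recursive merge consistent with that usage might look like this (assumed semantics, not the file's actual implementation):

```python
import copy


def dict_merge(base, overlay):
    """Recursively merge `overlay` into a copy of `base`.

    Overlay values win; nested dicts are merged rather than replaced.
    (Assumed semantics -- the real dict_merge lives elsewhere in
    utils.py.)
    """
    merged = copy.deepcopy(base)
    for key, value in overlay.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = dict_merge(merged[key], value)
        else:
            merged[key] = value
    return merged
```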
1005 | 219 | def fill_config(config): | 219 | def fill_config(config): |
1010 | 220 | config_d = os.path.join(config['config_dir'], 'config.d') | 220 | config_d = os.path.join(config["config_dir"], "config.d") |
1011 | 221 | sources_d = os.path.join(config['config_dir'], 'sources.d') | 221 | sources_d = os.path.join(config["config_dir"], "sources.d") |
1012 | 222 | var_config_d = os.path.join(config['var_dir'], 'config.d') | 222 | var_config_d = os.path.join(config["var_dir"], "config.d") |
1013 | 223 | var_sources_d = os.path.join(config['var_dir'], 'sources.d') | 223 | var_sources_d = os.path.join(config["var_dir"], "sources.d") |
1014 | 224 | 224 | ||
1015 | 225 | # Create required directories | 225 | # Create required directories |
1016 | 226 | for d in (config_d, sources_d, var_config_d, var_sources_d): | 226 | for d in (config_d, sources_d, var_config_d, var_sources_d): |
1017 | @@ -229,106 +229,122 @@ | |||
1018 | 229 | 229 | ||
1019 | 230 | # Validate the machine UUID/secret | 230 | # Validate the machine UUID/secret |
1020 | 231 | write_uuid_data = False | 231 | write_uuid_data = False |
1023 | 232 | if 'machine_uuid' not in config: | 232 | if "machine_uuid" not in config: |
1024 | 233 | config['machine_uuid'] = str(uuid.uuid4()) | 233 | config["machine_uuid"] = str(uuid.uuid4()) |
1025 | 234 | write_uuid_data = True | 234 | write_uuid_data = True |
1030 | 235 | if 'machine_secret' not in config: | 235 | if "machine_secret" not in config: |
1031 | 236 | config['machine_secret'] = ''.join( | 236 | config["machine_secret"] = "".join( |
1032 | 237 | random.choice(string.ascii_letters + string.digits) | 237 | random.choice(string.ascii_letters + string.digits) for i in range(30) |
1029 | 238 | for i in range(30) | ||
1033 | 239 | ) | 238 | ) |
1034 | 240 | write_uuid_data = True | 239 | write_uuid_data = True |
1035 | 241 | # Write out the machine UUID/secret if needed | 240 | # Write out the machine UUID/secret if needed |
1036 | 242 | if write_uuid_data: | 241 | if write_uuid_data: |
1038 | 243 | with open(os.path.join(var_config_d, '10-machine_uuid.json'), 'w') as f: | 242 | with open(os.path.join(var_config_d, "10-machine_uuid.json"), "w") as f: |
1039 | 244 | os.fchmod(f.fileno(), 0o600) | 243 | os.fchmod(f.fileno(), 0o600) |
1044 | 245 | json_dump_p({ | 244 | json_dump_p( |
1045 | 246 | 'machine_uuid': config['machine_uuid'], | 245 | { |
1046 | 247 | 'machine_secret': config['machine_secret'], | 246 | "machine_uuid": config["machine_uuid"], |
1047 | 248 | }, f) | 247 | "machine_secret": config["machine_secret"], |
1048 | 248 | }, | ||
1049 | 249 | f, | ||
1050 | 250 | ) | ||
1051 | 249 | 251 | ||
1052 | 250 | # Restoration configuration | 252 | # Restoration configuration |
1053 | 251 | write_restore_data = False | 253 | write_restore_data = False |
1067 | 252 | if 'restore_path' not in config: | 254 | if "restore_path" not in config: |
1068 | 253 | config['restore_path'] = '/var/backups/turku-agent/restore' | 255 | config["restore_path"] = "/var/backups/turku-agent/restore" |
1069 | 254 | write_restore_data = True | 256 | write_restore_data = True |
1070 | 255 | if 'restore_module' not in config: | 257 | if "restore_module" not in config: |
1071 | 256 | config['restore_module'] = 'turku-restore' | 258 | config["restore_module"] = "turku-restore" |
1072 | 257 | write_restore_data = True | 259 | write_restore_data = True |
1073 | 258 | if 'restore_username' not in config: | 260 | if "restore_username" not in config: |
1074 | 259 | config['restore_username'] = str(uuid.uuid4()) | 261 | config["restore_username"] = str(uuid.uuid4()) |
1075 | 260 | write_restore_data = True | 262 | write_restore_data = True |
1076 | 261 | if 'restore_password' not in config: | 263 | if "restore_password" not in config: |
1077 | 262 | config['restore_password'] = ''.join( | 264 | config["restore_password"] = "".join( |
1078 | 263 | random.choice(string.ascii_letters + string.digits) | 265 | random.choice(string.ascii_letters + string.digits) for i in range(30) |
1066 | 264 | for i in range(30) | ||
1079 | 265 | ) | 266 | ) |
1080 | 266 | write_restore_data = True | 267 | write_restore_data = True |
1081 | 267 | if write_restore_data: | 268 | if write_restore_data: |
1083 | 268 | with open(os.path.join(var_config_d, '10-restore.json'), 'w') as f: | 269 | with open(os.path.join(var_config_d, "10-restore.json"), "w") as f: |
1084 | 269 | os.fchmod(f.fileno(), 0o600) | 270 | os.fchmod(f.fileno(), 0o600) |
1085 | 270 | restore_out = { | 271 | restore_out = { |
1090 | 271 | 'restore_path': config['restore_path'], | 272 | "restore_path": config["restore_path"], |
1091 | 272 | 'restore_module': config['restore_module'], | 273 | "restore_module": config["restore_module"], |
1092 | 273 | 'restore_username': config['restore_username'], | 274 | "restore_username": config["restore_username"], |
1093 | 274 | 'restore_password': config['restore_password'], | 275 | "restore_password": config["restore_password"], |
1094 | 275 | } | 276 | } |
1095 | 276 | json_dump_p(restore_out, f) | 277 | json_dump_p(restore_out, f) |
1098 | 277 | if not os.path.isdir(config['restore_path']): | 278 | if not os.path.isdir(config["restore_path"]): |
1099 | 278 | os.makedirs(config['restore_path']) | 279 | os.makedirs(config["restore_path"]) |
1100 | 279 | 280 | ||
1101 | 280 | # Generate the SSH keypair if it doesn't exist | 281 | # Generate the SSH keypair if it doesn't exist |
1111 | 281 | if 'ssh_private_key_file' not in config: | 282 | if "ssh_private_key_file" not in config: |
1112 | 282 | subprocess.check_call([ | 283 | subprocess.check_call( |
1113 | 283 | 'ssh-keygen', '-t', 'rsa', '-N', '', '-C', 'turku', | 284 | [ |
1114 | 284 | '-f', os.path.join(config['var_dir'], 'ssh_key') | 285 | "ssh-keygen", |
1115 | 285 | ]) | 286 | "-t", |
1116 | 286 | with open(os.path.join(config['var_dir'], 'ssh_key.pub')) as f: | 287 | "rsa", |
1117 | 287 | config['ssh_public_key'] = f.read().rstrip() | 288 | "-N", |
1118 | 288 | config['ssh_public_key_file'] = os.path.join(config['var_dir'], 'ssh_key.pub') | 289 | "", |
1119 | 289 | config['ssh_private_key_file'] = os.path.join(config['var_dir'], 'ssh_key') | 290 | "-C", |
1120 | 291 | "turku", | ||
1121 | 292 | "-f", | ||
1122 | 293 | os.path.join(config["var_dir"], "ssh_key"), | ||
1123 | 294 | ] | ||
1124 | 295 | ) | ||
1125 | 296 | with open(os.path.join(config["var_dir"], "ssh_key.pub")) as f: | ||
1126 | 297 | config["ssh_public_key"] = f.read().rstrip() | ||
1127 | 298 | config["ssh_public_key_file"] = os.path.join(config["var_dir"], "ssh_key.pub") | ||
1128 | 299 | config["ssh_private_key_file"] = os.path.join(config["var_dir"], "ssh_key") | ||
1129 | 290 | 300 | ||
1131 | 291 | for s in config['sources']: | 301 | for s in config["sources"]: |
1132 | 292 | # Check for missing usernames/passwords | 302 | # Check for missing usernames/passwords |
1139 | 293 | if not ('username' in config['sources'][s] or 'password' in config['sources'][s]): | 303 | if not ( |
1140 | 294 | sources_secrets_d = os.path.join(config['config_dir'], 'sources_secrets.d') | 304 | "username" in config["sources"][s] or "password" in config["sources"][s] |
1141 | 295 | if 'username' not in config['sources'][s]: | 305 | ): |
1142 | 296 | config['sources'][s]['username'] = str(uuid.uuid4()) | 306 | if "username" not in config["sources"][s]: |
1143 | 297 | if 'password' not in config['sources'][s]: | 307 | config["sources"][s]["username"] = str(uuid.uuid4()) |
1144 | 298 | config['sources'][s]['password'] = ''.join( | 308 | if "password" not in config["sources"][s]: |
1145 | 309 | config["sources"][s]["password"] = "".join( | ||
1146 | 299 | random.choice(string.ascii_letters + string.digits) | 310 | random.choice(string.ascii_letters + string.digits) |
1147 | 300 | for i in range(30) | 311 | for i in range(30) |
1148 | 301 | ) | 312 | ) |
1150 | 302 | with open(os.path.join(var_sources_d, '10-' + s + '.json'), 'w') as f: | 313 | with open(os.path.join(var_sources_d, "10-" + s + ".json"), "w") as f: |
1151 | 303 | os.fchmod(f.fileno(), 0o600) | 314 | os.fchmod(f.fileno(), 0o600) |
1158 | 304 | json_dump_p({ | 315 | json_dump_p( |
1159 | 305 | s: { | 316 | { |
1160 | 306 | 'username': config['sources'][s]['username'], | 317 | s: { |
1161 | 307 | 'password': config['sources'][s]['password'], | 318 | "username": config["sources"][s]["username"], |
1162 | 308 | } | 319 | "password": config["sources"][s]["password"], |
1163 | 309 | }, f) | 320 | } |
1164 | 321 | }, | ||
1165 | 322 | f, | ||
1166 | 323 | ) | ||
1167 | 310 | 324 | ||
1168 | 311 | 325 | ||
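`fill_config` generates 30-character alphanumeric secrets (machine_secret, restore_password, per-source passwords) with `random.choice`; in isolation the pattern is as below. Note that `random` is not a CSPRNG; `secrets.choice` would be the hardened variant, but that is a suggestion of mine, not a change this MP makes.

```python
import random
import string


def make_secret(length=30):
    """Alphanumeric secret, as fill_config builds for machine_secret,
    restore_password, and per-source passwords."""
    alphabet = string.ascii_letters + string.digits
    return "".join(random.choice(alphabet) for _ in range(length))


secret = make_secret()
```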
1169 | 312 | def api_call(api_url, cmd, post_data, timeout=5): | 326 | def api_call(api_url, cmd, post_data, timeout=5): |
1170 | 313 | url = urllib.parse.urlparse(api_url) | 327 | url = urllib.parse.urlparse(api_url) |
1172 | 314 | if url.scheme == 'https': | 328 | if url.scheme == "https": |
1173 | 315 | h = http.client.HTTPSConnection(url.netloc, timeout=timeout) | 329 | h = http.client.HTTPSConnection(url.netloc, timeout=timeout) |
1174 | 316 | else: | 330 | else: |
1175 | 317 | h = http.client.HTTPConnection(url.netloc, timeout=timeout) | 331 | h = http.client.HTTPConnection(url.netloc, timeout=timeout) |
1176 | 318 | out = json.dumps(post_data) | 332 | out = json.dumps(post_data) |
1181 | 319 | h.putrequest('POST', '%s/%s' % (url.path, cmd)) | 333 | h.putrequest("POST", "%s/%s" % (url.path, cmd)) |
1182 | 320 | h.putheader('Content-Type', 'application/json') | 334 | h.putheader("Content-Type", "application/json") |
1183 | 321 | h.putheader('Content-Length', len(out)) | 335 | h.putheader("Content-Length", len(out)) |
1184 | 322 | h.putheader('Accept', 'application/json') | 336 | h.putheader("Accept", "application/json") |
1185 | 323 | h.endheaders() | 337 | h.endheaders() |
1187 | 324 | h.send(out.encode('UTF-8')) | 338 | h.send(out.encode("UTF-8")) |
1188 | 325 | 339 | ||
1189 | 326 | res = h.getresponse() | 340 | res = h.getresponse() |
1190 | 327 | if not res.status == http.client.OK: | 341 | if not res.status == http.client.OK: |
1194 | 328 | raise Exception('Received error %d (%s) from API server' % (res.status, res.reason)) | 342 | raise Exception( |
1195 | 329 | if not res.getheader('content-type') == 'application/json': | 343 | "Received error %d (%s) from API server" % (res.status, res.reason) |
1196 | 330 | raise Exception('Received invalid reply from API server') | 344 | ) |
1197 | 345 | if not res.getheader("content-type") == "application/json": | ||
1198 | 346 | raise Exception("Received invalid reply from API server") | ||
1199 | 331 | try: | 347 | try: |
1201 | 332 | return json.loads(res.read().decode('UTF-8')) | 348 | return json.loads(res.read().decode("UTF-8")) |
1202 | 333 | except ValueError: | 349 | except ValueError: |
1204 | 334 | raise Exception('Received invalid reply from API server') | 350 | raise Exception("Received invalid reply from API server") |
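`api_call` splits `api_url` with `urllib.parse.urlparse` before choosing between `HTTPSConnection` and `HTTPConnection` and building the POST path. The relevant split, for a hypothetical URL:

```python
import urllib.parse

# Hypothetical API URL for illustration only.
url = urllib.parse.urlparse("https://backup.example.com/v1")
scheme, netloc, path = url.scheme, url.netloc, url.path

# api_call opens an HTTP(S) connection to `netloc` (scheme selects the
# class) and POSTs to "%s/%s" % (path, cmd).
request_path = "%s/%s" % (path, "update_config")
```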