Merge lp:~fo0bar/turku/turku-agent-cleanup into lp:turku/turku-agent

Proposed by Ryan Finnie
Status: Merged
Approved by: Barry Price
Approved revision: 57
Merged at revision: 57
Proposed branch: lp:~fo0bar/turku/turku-agent-cleanup
Merge into: lp:turku/turku-agent
Diff against target: 1204 lines (+487/-298)
10 files modified
.bzrignore (+61/-5)
MANIFEST.in (+13/-0)
Makefile (+28/-0)
setup.py (+13/-13)
tests/test_stub.py (+8/-0)
tox.ini (+38/-0)
turku_agent/ping.py (+108/-84)
turku_agent/rsyncd_wrapper.py (+11/-7)
turku_agent/update_config.py (+71/-69)
turku_agent/utils.py (+136/-120)
To merge this branch: bzr merge lp:~fo0bar/turku/turku-agent-cleanup
Reviewer: Stuart Bishop (community), status: Approve
Review via email: mp+386145@code.launchpad.net

Commit message

Mega-noop cleanup

Description of the change

This is the minimum required for:
- tox test suite with all passing tests
- black-managed formatting
- Shippable sdist module

It is intended as a base for the other MPs, so they don't have to e.g. establish tests/*, worry about pre-existing flake8 failures, or work out how to add additional optional modules.
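As a sketch, the tox suite this establishes has roughly the following shape (a trimmed, illustrative fragment; the complete tox.ini is in the preview diff below):

```ini
# Illustrative fragment only; see the full tox.ini in the preview diff.
[tox]
envlist = black, flake8, pytest

[testenv:black]
commands = python -mblack --check .
deps = black

[testenv:flake8]
commands = python -mflake8
deps = flake8
```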

Revision history for this message
🤖 Canonical IS Merge Bot (canonical-is-mergebot) wrote :

This merge proposal is being monitored by mergebot. Change the status to Approved to merge.

lp:~fo0bar/turku/turku-agent-cleanup updated
57. By Ryan Finnie

Mega-noop cleanup

- Sort imports
- Remove shebangs from non-scripts
- Update MANIFEST.in so `setup.py sdist` produces usable tarballs
- Create stub tests
- Add tox.ini
- Add blank requirements.txt
- Add Makefile
- make black
- Update .bzrignore
- Clean up flake8:
  - update_config.py: '.utils.json_dump_p' imported but unused
  - update_config.py: 'api_reply' is assigned to but never used
  - utils.py: 'sources_secrets_d' is assigned to but never used
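The stub test mentioned above (tests/test_stub.py in the diff below) is the usual placeholder pattern: a single warning-emitting test so pytest has something to collect. A self-contained sketch, runnable directly:

```python
import unittest
import warnings


class TestStub(unittest.TestCase):
    def test_stub(self):
        # pytest doesn't like a tests/ directory with no tests,
        # so keep one warning-emitting placeholder until real tests exist.
        warnings.warn("Remove this file once unit tests are added")


# Run the stub directly to confirm the suite passes.
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestStub)
)
```

Once real unit tests land, the stub (and its warning) can simply be deleted.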

Revision history for this message
Stuart Bishop (stub) wrote :

Looks good. The new bits seem fine (bzr, buildchain, tox). The code changes all appear to be nothing but Black reformatting (as expected).

review: Approve
Revision history for this message
🤖 Canonical IS Merge Bot (canonical-is-mergebot) wrote :

Change successfully merged at revision 57

Preview Diff

=== modified file '.bzrignore'
--- .bzrignore 2019-04-22 01:16:04 +0000
+++ .bzrignore 2020-06-21 22:22:36 +0000
@@ -1,5 +1,61 @@
-*.pyc
-/build/
-/dist/
-/MANIFEST
-/*.egg-info/
+MANIFEST
+.pybuild/
+.pytest_cache/
+
+# Byte-compiled / optimized / DLL files
+__pycache__/
+*.py[cod]
+
+# C extensions
+*.so
+
+# Distribution / packaging
+.Python
+env/
+build/
+develop-eggs/
+dist/
+downloads/
+eggs/
+.eggs/
+lib/
+lib64/
+parts/
+sdist/
+var/
+*.egg-info/
+.installed.cfg
+*.egg
+
+# PyInstaller
+# Usually these files are written by a python script from a template
+# before PyInstaller builds the exe, so as to inject date/other infos into it.
+*.manifest
+*.spec
+
+# Installer logs
+pip-log.txt
+pip-delete-this-directory.txt
+
+# Unit test / coverage reports
+htmlcov/
+.tox/
+.coverage
+.coverage.*
+.cache
+nosetests.xml
+coverage.xml
+*,cover
+
+# Translations
+*.mo
+*.pot
+
+# Django stuff:
+*.log
+
+# Sphinx documentation
+docs/_build/
+
+# PyBuilder
+target/
=== modified file 'MANIFEST.in'
--- MANIFEST.in 2015-03-29 07:59:33 +0000
+++ MANIFEST.in 2020-06-21 22:22:36 +0000
@@ -1,3 +1,16 @@
+include Makefile
+include README
+include requirements.txt
+include tests/*.py
+include tox.ini
 include turku-agent.cron
+include turku-agent-ping
+include turku-agent-ping.service
+include turku-agent-ping.timer
 include turku-agent-rsyncd.conf
+include turku-agent-rsyncd.init-debian
 include turku-agent-rsyncd.service
+include turku-agent-rsyncd-wrapper
+include turku-update-config
+include turku-update-config.service
+include turku-update-config.timer
=== added file 'Makefile'
--- Makefile 1970-01-01 00:00:00 +0000
+++ Makefile 2020-06-21 22:22:36 +0000
@@ -0,0 +1,28 @@
+PYTHON := python3
+
+all: build
+
+build:
+	$(PYTHON) setup.py build
+
+lint:
+	$(PYTHON) -mtox -e flake8
+
+test:
+	$(PYTHON) -mtox
+
+test-quick:
+	$(PYTHON) -mtox -e black,flake8,pytest-quick
+
+black-check:
+	$(PYTHON) -mtox -e black
+
+black:
+	$(PYTHON) -mblack $(CURDIR)
+
+install: build
+	$(PYTHON) setup.py install
+
+clean:
+	$(PYTHON) setup.py clean
+	$(RM) -r build MANIFEST
=== added file 'requirements.txt'
=== modified file 'setup.py'
--- setup.py 2019-04-22 01:16:04 +0000
+++ setup.py 2020-06-21 22:22:36 +0000
@@ -18,22 +18,22 @@
 import sys
 from setuptools import setup
 
-assert(sys.version_info > (3, 4))
+assert sys.version_info > (3, 4)
 
 
 setup(
-    name='turku_agent',
-    description='Turku backups - client agent',
-    version='0.2.0',
-    author='Ryan Finnie',
-    author_email='ryan.finnie@canonical.com',
-    url='https://launchpad.net/turku',
-    packages=['turku_agent'],
-    entry_points={
-        'console_scripts': [
-            'turku-agent-ping = turku_agent.ping:main',
-            'turku-agent-rsyncd-wrapper = turku_agent.rsyncd_wrapper:main',
-            'turku-update-config = turku_agent.update_config:main',
-        ],
+    name="turku_agent",
+    description="Turku backups - client agent",
+    version="0.2.0",
+    author="Ryan Finnie",
+    author_email="ryan.finnie@canonical.com",
+    url="https://launchpad.net/turku",
+    packages=["turku_agent"],
+    entry_points={
+        "console_scripts": [
+            "turku-agent-ping = turku_agent.ping:main",
+            "turku-agent-rsyncd-wrapper = turku_agent.rsyncd_wrapper:main",
+            "turku-update-config = turku_agent.update_config:main",
+        ]
     },
 )
=== added directory 'tests'
=== added file 'tests/__init__.py'
=== added file 'tests/test_stub.py'
--- tests/test_stub.py 1970-01-01 00:00:00 +0000
+++ tests/test_stub.py 2020-06-21 22:22:36 +0000
@@ -0,0 +1,8 @@
+import unittest
+import warnings
+
+
+class TestStub(unittest.TestCase):
+    def test_stub(self):
+        # pytest doesn't like a tests/ with no tests
+        warnings.warn("Remove this file once unit tests are added")
=== added file 'tox.ini'
--- tox.ini 1970-01-01 00:00:00 +0000
+++ tox.ini 2020-06-21 22:22:36 +0000
@@ -0,0 +1,38 @@
+[tox]
+envlist = black, flake8, pytest
+
+[testenv]
+basepython = python
+
+[testenv:black]
+commands = python -mblack --check .
+deps = black
+
+[testenv:flake8]
+commands = python -mflake8
+deps = flake8
+
+[testenv:pytest]
+commands = python -mpytest --cov=turku_agent --cov-report=term-missing
+deps = pytest
+       pytest-cov
+       -r{toxinidir}/requirements.txt
+
+[testenv:pytest-quick]
+commands = python -mpytest -m "not slow"
+deps = pytest
+       -r{toxinidir}/requirements.txt
+
+[flake8]
+exclude =
+    .git,
+    __pycache__,
+    .tox,
+# TODO: remove C901 once complexity is reduced
+ignore = C901,E203,E231,W503
+max-line-length = 120
+max-complexity = 10
+
+[pytest]
+markers =
+    slow
=== modified file 'turku_agent/ping.py'
--- turku_agent/ping.py 2020-03-26 06:06:23 +0000
+++ turku_agent/ping.py 2020-06-21 22:22:36 +0000
@@ -1,5 +1,3 @@
-#!/usr/bin/env python3
-
 # Turku backups - client agent
 # Copyright 2015 Canonical Ltd.
 #
@@ -16,13 +14,14 @@
 # this program. If not, see <http://www.gnu.org/licenses/>.
 
 
+import json
 import os
-import json
 import random
 import shlex
 import subprocess
 import tempfile
 import time
+
 from .utils import load_config, acquire_lock, api_call
 
 
@@ -30,42 +29,60 @@
     import argparse
 
     parser = argparse.ArgumentParser(
-        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
-    parser.add_argument('--config-dir', '-c', type=str, default='/etc/turku-agent')
-    parser.add_argument('--wait', '-w', type=float)
-    parser.add_argument('--restore', action='store_true')
-    parser.add_argument('--restore-storage', type=str, default=None)
-    parser.add_argument('--gonogo-program', type=str, default=None,
-                        help='Go/no-go program run each time to determine whether to ping')
+        formatter_class=argparse.ArgumentDefaultsHelpFormatter
+    )
+    parser.add_argument("--config-dir", "-c", type=str, default="/etc/turku-agent")
+    parser.add_argument("--wait", "-w", type=float)
+    parser.add_argument("--restore", action="store_true")
+    parser.add_argument("--restore-storage", type=str, default=None)
+    parser.add_argument(
+        "--gonogo-program",
+        type=str,
+        default=None,
+        help="Go/no-go program run each time to determine whether to ping",
+    )
     return parser.parse_args()
 
 
 def call_ssh(config, storage, ssh_req):
     # Write the server host public key
-    t = tempfile.NamedTemporaryFile(mode='w+', encoding='UTF-8')
-    for key in storage['ssh_ping_host_keys']:
-        t.write('%s %s\n' % (storage['ssh_ping_host'], key))
+    t = tempfile.NamedTemporaryFile(mode="w+", encoding="UTF-8")
+    for key in storage["ssh_ping_host_keys"]:
+        t.write("%s %s\n" % (storage["ssh_ping_host"], key))
     t.flush()
 
     # Call ssh
-    ssh_command = config['ssh_command']
+    ssh_command = config["ssh_command"]
     ssh_command += [
-        '-T',
-        '-o', 'BatchMode=yes',
-        '-o', 'UserKnownHostsFile=%s' % t.name,
-        '-o', 'StrictHostKeyChecking=yes',
-        '-o', 'CheckHostIP=no',
-        '-i', config['ssh_private_key_file'],
-        '-R', '%d:%s:%d' % (ssh_req['port'], config['rsyncd_local_address'], config['rsyncd_local_port']),
-        '-p', str(storage['ssh_ping_port']),
-        '-l', storage['ssh_ping_user'],
-        storage['ssh_ping_host'],
-        'turku-ping-remote',
+        "-T",
+        "-o",
+        "BatchMode=yes",
+        "-o",
+        "UserKnownHostsFile=%s" % t.name,
+        "-o",
+        "StrictHostKeyChecking=yes",
+        "-o",
+        "CheckHostIP=no",
+        "-i",
+        config["ssh_private_key_file"],
+        "-R",
+        "%d:%s:%d"
+        % (
+            ssh_req["port"],
+            config["rsyncd_local_address"],
+            config["rsyncd_local_port"],
+        ),
+        "-p",
+        str(storage["ssh_ping_port"]),
+        "-l",
+        storage["ssh_ping_user"],
+        storage["ssh_ping_host"],
+        "turku-ping-remote",
     ]
     p = subprocess.Popen(ssh_command, stdin=subprocess.PIPE)
 
     # Write the ssh request
-    p.stdin.write((json.dumps(ssh_req) + '\n.\n').encode('UTF-8'))
+    p.stdin.write((json.dumps(ssh_req) + "\n.\n").encode("UTF-8"))
     p.stdin.flush()
 
     # Wait for the server to close the SSH connection
@@ -88,16 +105,18 @@
     config = load_config(args.config_dir)
 
     # Basic checks
-    for i in ('ssh_private_key_file', 'machine_uuid', 'machine_secret', 'api_url'):
+    for i in ("ssh_private_key_file", "machine_uuid", "machine_secret", "api_url"):
         if i not in config:
             return
-    if not os.path.isfile(config['ssh_private_key_file']):
+    if not os.path.isfile(config["ssh_private_key_file"]):
         return
 
     # If a go/no-go program is defined, run it and only go if it exits 0.
     # Example: prevent backups during high-load for sensitive systems:
     # ['check_load', '-c', '1,5,15']
-    gonogo_program = args.gonogo_program if args.gonogo_program else config['gonogo_program']
+    gonogo_program = (
+        args.gonogo_program if args.gonogo_program else config["gonogo_program"]
+    )
     if isinstance(gonogo_program, (list, tuple)):
         # List, program name first, optional arguments after
         gonogo_program_and_args = list(gonogo_program)
@@ -113,106 +132,111 @@
     except (subprocess.CalledProcessError, OSError):
         return
 
-    lock = acquire_lock(os.path.join(config['lock_dir'], 'turku-agent-ping.lock'))
+    lock = acquire_lock(os.path.join(config["lock_dir"], "turku-agent-ping.lock"))
 
     restore_mode = args.restore
 
     # Check with the API server
     api_out = {}
 
-    machine_merge_map = (
-        ('machine_uuid', 'uuid'),
-        ('machine_secret', 'secret'),
-    )
-    api_out['machine'] = {}
+    machine_merge_map = (("machine_uuid", "uuid"), ("machine_secret", "secret"))
+    api_out["machine"] = {}
     for a, b in machine_merge_map:
         if a in config:
-            api_out['machine'][b] = config[a]
+            api_out["machine"][b] = config[a]
 
     if restore_mode:
-        print('Entering restore mode.')
+        print("Entering restore mode.")
         print()
-        api_reply = api_call(config['api_url'], 'agent_ping_restore', api_out)
+        api_reply = api_call(config["api_url"], "agent_ping_restore", api_out)
 
         sources_by_storage = {}
-        for source_name in api_reply['machine']['sources']:
-            source = api_reply['machine']['sources'][source_name]
-            if source_name not in config['sources']:
+        for source_name in api_reply["machine"]["sources"]:
+            source = api_reply["machine"]["sources"][source_name]
+            if source_name not in config["sources"]:
                 continue
-            if 'storage' not in source:
+            if "storage" not in source:
                 continue
-            if source['storage']['name'] not in sources_by_storage:
-                sources_by_storage[source['storage']['name']] = {}
-            sources_by_storage[source['storage']['name']][source_name] = source
+            if source["storage"]["name"] not in sources_by_storage:
+                sources_by_storage[source["storage"]["name"]] = {}
+            sources_by_storage[source["storage"]["name"]][source_name] = source
 
         if len(sources_by_storage) == 0:
-            print('Cannot find any appropraite sources.')
+            print("Cannot find any appropraite sources.")
             return
-        print('This machine\'s sources are on the following storage units:')
+        print("This machine's sources are on the following storage units:")
         for storage_name in sources_by_storage:
-            print('    %s' % storage_name)
+            print("    %s" % storage_name)
             for source_name in sources_by_storage[storage_name]:
-                print('        %s' % source_name)
+                print("        %s" % source_name)
         print()
         if len(sources_by_storage) == 1:
-            storage = list(list(sources_by_storage.values())[0].values())[0]['storage']
+            storage = list(list(sources_by_storage.values())[0].values())[0]["storage"]
         elif args.restore_storage:
             if args.restore_storage in sources_by_storage:
-                storage = sources_by_storage[args.restore_storage]['storage']
+                storage = sources_by_storage[args.restore_storage]["storage"]
             else:
                 print('Cannot find appropriate storage "%s"' % args.restore_storage)
                 return
         else:
-            print('Multiple storages found. Please use --restore-storage to specify one.')
+            print(
+                "Multiple storages found. Please use --restore-storage to specify one."
+            )
             return
 
         ssh_req = {
-            'verbose': True,
-            'action': 'restore',
-            'port': random.randint(49152, 65535),
+            "verbose": True,
+            "action": "restore",
+            "port": random.randint(49152, 65535),
         }
-        print('Storage unit: %s' % storage['name'])
-        if 'restore_path' in config:
-            print('Local destination path: %s' % config['restore_path'])
-            print('Sample restore usage from storage unit:')
+        print("Storage unit: %s" % storage["name"])
+        if "restore_path" in config:
+            print("Local destination path: %s" % config["restore_path"])
+            print("Sample restore usage from storage unit:")
             print(
-                '    RSYNC_PASSWORD=%s rsync -avzP --numeric-ids ${P?}/ rsync://%s@127.0.0.1:%s/%s/' % (
-                    config['restore_password'],
-                    config['restore_username'],
-                    ssh_req['port'], config['restore_module']
+                "    RSYNC_PASSWORD=%s rsync -avzP --numeric-ids ${P?}/ rsync://%s@127.0.0.1:%s/%s/"
+                % (
+                    config["restore_password"],
+                    config["restore_username"],
+                    ssh_req["port"],
+                    config["restore_module"],
                 )
             )
             print()
         call_ssh(config, storage, ssh_req)
     else:
-        api_reply = api_call(config['api_url'], 'agent_ping_checkin', api_out)
+        api_reply = api_call(config["api_url"], "agent_ping_checkin", api_out)
 
-        if 'scheduled_sources' not in api_reply:
+        if "scheduled_sources" not in api_reply:
             return
         sources_by_storage = {}
-        for source_name in api_reply['machine']['scheduled_sources']:
-            source = api_reply['machine']['scheduled_sources'][source_name]
-            if source_name not in config['sources']:
+        for source_name in api_reply["machine"]["scheduled_sources"]:
+            source = api_reply["machine"]["scheduled_sources"][source_name]
+            if source_name not in config["sources"]:
                 continue
-            if 'storage' not in source:
+            if "storage" not in source:
                 continue
-            if source['storage']['name'] not in sources_by_storage:
-                sources_by_storage[source['storage']['name']] = {}
-            sources_by_storage[source['storage']['name']][source_name] = source
+            if source["storage"]["name"] not in sources_by_storage:
+                sources_by_storage[source["storage"]["name"]] = {}
+            sources_by_storage[source["storage"]["name"]][source_name] = source
 
         for storage_name in sources_by_storage:
             ssh_req = {
-                'verbose': True,
-                'action': 'checkin',
-                'port': random.randint(49152, 65535),
-                'sources': {},
+                "verbose": True,
+                "action": "checkin",
+                "port": random.randint(49152, 65535),
+                "sources": {},
             }
             for source in sources_by_storage[storage_name]:
-                ssh_req['sources'][source] = {
-                    'username': config['sources'][source]['username'],
-                    'password': config['sources'][source]['password'],
+                ssh_req["sources"][source] = {
+                    "username": config["sources"][source]["username"],
+                    "password": config["sources"][source]["password"],
                 }
-            call_ssh(config, list(sources_by_storage[storage_name].values())[0]['storage'], ssh_req)
+            call_ssh(
+                config,
+                list(sources_by_storage[storage_name].values())[0]["storage"],
+                ssh_req,
+            )
 
     # Cleanup
     lock.close()
=== modified file 'turku_agent/rsyncd_wrapper.py'
--- turku_agent/rsyncd_wrapper.py 2019-04-22 01:16:04 +0000
+++ turku_agent/rsyncd_wrapper.py 2020-06-21 22:22:36 +0000
@@ -16,6 +16,7 @@
 # this program. If not, see <http://www.gnu.org/licenses/>.
 
 import os
+
 from .utils import load_config
 
 
@@ -23,9 +24,10 @@
     import argparse
 
     parser = argparse.ArgumentParser(
-        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
-    parser.add_argument('--config-dir', '-c', type=str, default='/etc/turku-agent')
-    parser.add_argument('--detach', action='store_true')
+        formatter_class=argparse.ArgumentDefaultsHelpFormatter
+    )
+    parser.add_argument("--config-dir", "-c", type=str, default="/etc/turku-agent")
+    parser.add_argument("--detach", action="store_true")
     return parser.parse_known_args()
 
 
@@ -33,10 +35,12 @@
     args, rest = parse_args()
 
     config = load_config(args.config_dir)
-    rsyncd_command = config['rsyncd_command']
+    rsyncd_command = config["rsyncd_command"]
     if not args.detach:
-        rsyncd_command.append('--no-detach')
-    rsyncd_command.append('--daemon')
-    rsyncd_command.append('--config=%s' % os.path.join(config['var_dir'], 'rsyncd.conf'))
+        rsyncd_command.append("--no-detach")
+    rsyncd_command.append("--daemon")
+    rsyncd_command.append(
+        "--config=%s" % os.path.join(config["var_dir"], "rsyncd.conf")
+    )
     rsyncd_command += rest
     os.execvp(rsyncd_command[0], rsyncd_command)
=== modified file 'turku_agent/update_config.py'
--- turku_agent/update_config.py 2019-05-23 14:18:25 +0000
+++ turku_agent/update_config.py 2020-06-21 22:22:36 +0000
@@ -1,5 +1,3 @@
-#!/usr/bin/env python3
-
 # Turku backups - client agent
 # Copyright 2015 Canonical Ltd.
 #
@@ -15,12 +13,13 @@
 # You should have received a copy of the GNU General Public License along with
 # this program. If not, see <http://www.gnu.org/licenses/>.
 
+import logging
+import os
 import random
-import os
 import subprocess
 import time
-import logging
-from .utils import json_dump_p, json_dumps_p, load_config, fill_config, acquire_lock, api_call
+
+from .utils import json_dumps_p, load_config, fill_config, acquire_lock, api_call
 
 
 class IncompleteConfigError(Exception):
@@ -31,69 +30,70 @@
     import argparse
 
     parser = argparse.ArgumentParser(
-        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
-    parser.add_argument('--config-dir', '-c', type=str, default='/etc/turku-agent')
-    parser.add_argument('--wait', '-w', type=float)
-    parser.add_argument('--debug', action='store_true')
+        formatter_class=argparse.ArgumentDefaultsHelpFormatter
+    )
+    parser.add_argument("--config-dir", "-c", type=str, default="/etc/turku-agent")
+    parser.add_argument("--wait", "-w", type=float)
+    parser.add_argument("--debug", action="store_true")
     return parser.parse_args()
 
 
 def write_conf_files(config):
     # Build rsyncd.conf
     built_rsyncd_conf = (
-        'address = %s\n' % config['rsyncd_local_address'] +
-        'port = %d\n' % config['rsyncd_local_port'] +
-        'log file = /dev/stdout\n' +
-        'uid = root\n' +
-        'gid = root\n' +
-        'list = false\n\n'
+        "address = %s\n" % config["rsyncd_local_address"]
+        + "port = %d\n" % config["rsyncd_local_port"]
+        + "log file = /dev/stdout\n"
+        + "uid = root\n"
+        + "gid = root\n"
+        + "list = false\n\n"
     )
     rsyncd_secrets = []
-    rsyncd_secrets.append((config['restore_username'], config['restore_password']))
+    rsyncd_secrets.append((config["restore_username"], config["restore_password"]))
     built_rsyncd_conf += (
-        '[%s]\n' +
-        '    path = %s\n' +
-        '    auth users = %s\n' +
-        '    secrets file = %s\n' +
-        '    read only = false\n\n'
+        "[%s]\n"
+        + "    path = %s\n"
+        + "    auth users = %s\n"
+        + "    secrets file = %s\n"
+        + "    read only = false\n\n"
     ) % (
-        config['restore_module'],
-        config['restore_path'],
-        config['restore_username'],
-        os.path.join(config['var_dir'], 'rsyncd.secrets'),
+        config["restore_module"],
+        config["restore_path"],
+        config["restore_username"],
+        os.path.join(config["var_dir"], "rsyncd.secrets"),
     )
-    for s in config['sources']:
-        sd = config['sources'][s]
-        rsyncd_secrets.append((sd['username'], sd['password']))
+    for s in config["sources"]:
+        sd = config["sources"][s]
+        rsyncd_secrets.append((sd["username"], sd["password"]))
         built_rsyncd_conf += (
-            '[%s]\n' +
-            '    path = %s\n' +
-            '    auth users = %s\n' +
-            '    secrets file = %s\n' +
-            '    read only = true\n\n'
+            "[%s]\n"
+            + "    path = %s\n"
+            + "    auth users = %s\n"
+            + "    secrets file = %s\n"
+            + "    read only = true\n\n"
         ) % (
             s,
-            sd['path'],
-            sd['username'],
-            os.path.join(config['var_dir'], 'rsyncd.secrets'),
+            sd["path"],
+            sd["username"],
+            os.path.join(config["var_dir"], "rsyncd.secrets"),
         )
-    with open(os.path.join(config['var_dir'], 'rsyncd.conf'), 'w') as f:
+    with open(os.path.join(config["var_dir"], "rsyncd.conf"), "w") as f:
         f.write(built_rsyncd_conf)
 
     # Build rsyncd.secrets
-    built_rsyncd_secrets = ''
+    built_rsyncd_secrets = ""
     for (username, password) in rsyncd_secrets:
-        built_rsyncd_secrets += username + ':' + password + '\n'
-    with open(os.path.join(config['var_dir'], 'rsyncd.secrets'), 'w') as f:
+        built_rsyncd_secrets += username + ":" + password + "\n"
+    with open(os.path.join(config["var_dir"], "rsyncd.secrets"), "w") as f:
         os.fchmod(f.fileno(), 0o600)
         f.write(built_rsyncd_secrets)
 
 
 def init_is_upstart():
     try:
-        return 'upstart' in subprocess.check_output(
-            ['initctl', 'version'],
-            stderr=subprocess.DEVNULL, universal_newlines=True)
+        return "upstart" in subprocess.check_output(
+            ["initctl", "version"], stderr=subprocess.DEVNULL, universal_newlines=True
+        )
     except (FileNotFoundError, subprocess.CalledProcessError):
         return False
 
@@ -107,52 +107,54 @@
     # With Upstart, start will fail if the service is already running,
     # so we need to check for that first.
     try:
-        if 'start/running' in subprocess.check_output(
-            ['status', 'turku-agent-rsyncd'],
-            stderr=subprocess.STDOUT, universal_newlines=True):
+        if "start/running" in subprocess.check_output(
+            ["status", "turku-agent-rsyncd"],
+            stderr=subprocess.STDOUT,
+            universal_newlines=True,
+        ):
             return
     except subprocess.CalledProcessError:
         pass
-    subprocess.check_call(['service', 'turku-agent-rsyncd', 'start'])
+    subprocess.check_call(["service", "turku-agent-rsyncd", "start"])
 
 
 def send_config(config):
-    required_keys = ['api_url']
-    if 'api_auth' not in config:
-        required_keys += ['api_auth_name', 'api_auth_secret']
+    required_keys = ["api_url"]
+    if "api_auth" not in config:
+        required_keys += ["api_auth_name", "api_auth_secret"]
     for k in required_keys:
         if k not in config:
             raise IncompleteConfigError('Required config "%s" not found.' % k)
 
     api_out = {}
-    if ('api_auth_name' in config) and ('api_auth_secret' in config):
+    if ("api_auth_name" in config) and ("api_auth_secret" in config):
         # name/secret style
-        api_out['auth'] = {
-            'name': config['api_auth_name'],
-            'secret': config['api_auth_secret'],
+        api_out["auth"] = {
+            "name": config["api_auth_name"],
+            "secret": config["api_auth_secret"],
         }
     else:
         # nameless secret style
-        api_out['auth'] = config['api_auth']
+        api_out["auth"] = config["api_auth"]
 
     # Merge the following options into the machine section
     machine_merge_map = (
-        ('machine_uuid', 'uuid'),
-        ('machine_secret', 'secret'),
-        ('environment_name', 'environment_name'),
-        ('service_name', 'service_name'),
-        ('unit_name', 'unit_name'),
-        ('ssh_public_key', 'ssh_public_key'),
-        ('published', 'published'),
+        ("machine_uuid", "uuid"),
+        ("machine_secret", "secret"),
+        ("environment_name", "environment_name"),
+        ("service_name", "service_name"),
+        ("unit_name", "unit_name"),
+        ("ssh_public_key", "ssh_public_key"),
+        ("published", "published"),
     )
-    api_out['machine'] = {}
+    api_out["machine"] = {}
     for a, b in machine_merge_map:
         if a in config:
-            api_out['machine'][b] = config[a]
+            api_out["machine"][b] = config[a]
 
-    api_out['machine']['sources'] = config['sources']
+    api_out["machine"]["sources"] = config["sources"]
 
-    api_reply = api_call(config['api_url'], 'update_config', api_out)
+    api_call(config["api_url"], "update_config", api_out)
 
 
 def main():
@@ -162,7 +164,7 @@
         time.sleep(random.uniform(0, args.wait))
 
     config = load_config(args.config_dir)
-    lock = acquire_lock(os.path.join(config['lock_dir'], 'turku-update-config.lock'))
+    lock = acquire_lock(os.path.join(config["lock_dir"], "turku-update-config.lock"))
     fill_config(config)
     if args.debug:
         print(json_dumps_p(config))
=== modified file 'turku_agent/utils.py'
--- turku_agent/utils.py 2020-03-23 22:31:56 +0000
+++ turku_agent/utils.py 2020-06-21 22:22:36 +0000
@@ -1,5 +1,3 @@
-#!/usr/bin/env python3
-
 # Turku backups - client agent
 # Copyright 2015 Canonical Ltd.
 #
@@ -15,32 +13,34 @@
 # You should have received a copy of the GNU General Public License along with
 # this program. If not, see <http://www.gnu.org/licenses/>.
 
-import uuid
-import string
-import random
+import copy
+import http.client
 import json
 import os
-import copy
+import platform
+import random
+import string
 import subprocess
-import platform
 import urllib.parse
-import http.client
+import uuid
 
 
-class RuntimeLock():
+class RuntimeLock:
     name = None
     file = None
 
     def __init__(self, name):
         import fcntl
-        file = open(name, 'w')
+
+        file = open(name, "w")
         try:
             fcntl.lockf(file, fcntl.LOCK_EX | fcntl.LOCK_NB)
         except IOError as e:
             import errno
+
             if e.errno in (errno.EACCES, errno.EAGAIN):
                 raise
-        file.write('%10s\n' % os.getpid())
+        file.write("%10s\n" % os.getpid())
         file.flush()
         file.seek(0)
         self.name = name
@@ -71,12 +71,12 @@
 
 def json_dump_p(obj, f):
     """Calls json.dump with standard (pretty) formatting"""
-    return json.dump(obj, f, sort_keys=True, indent=4, separators=(',', ': '))
+    return json.dump(obj, f, sort_keys=True, indent=4, separators=(",", ": "))
 
 
 def json_dumps_p(obj):
     """Calls json.dumps with standard (pretty) formatting"""
-    return json.dumps(obj, sort_keys=True, indent=4, separators=(',', ': '))
+    return json.dumps(obj, sort_keys=True, indent=4, separators=(",", ": "))
 
 
 def json_load_file(file):
@@ -103,10 +103,10 @@
103103
104def load_config(config_dir):104def load_config(config_dir):
105 config = {}105 config = {}
106 config['config_dir'] = config_dir106 config["config_dir"] = config_dir
107107
108 config_d = os.path.join(config['config_dir'], 'config.d')108 config_d = os.path.join(config["config_dir"], "config.d")
109 sources_d = os.path.join(config['config_dir'], 'sources.d')109 sources_d = os.path.join(config["config_dir"], "sources.d")
110110
111 # Merge in config.d/*.json to the root level111 # Merge in config.d/*.json to the root level
112 config_files = []112 config_files = []
@@ -114,7 +114,7 @@
114 config_files = [114 config_files = [
115 os.path.join(config_d, fn)115 os.path.join(config_d, fn)
116 for fn in os.listdir(config_d)116 for fn in os.listdir(config_d)
117 if fn.endswith('.json')117 if fn.endswith(".json")
118 and os.path.isfile(os.path.join(config_d, fn))118 and os.path.isfile(os.path.join(config_d, fn))
119 and os.access(os.path.join(config_d, fn), os.R_OK)119 and os.access(os.path.join(config_d, fn), os.R_OK)
120 ]120 ]
@@ -122,10 +122,10 @@
122 for file in config_files:122 for file in config_files:
123 config = dict_merge(config, json_load_file(file))123 config = dict_merge(config, json_load_file(file))
124124
125 if 'var_dir' not in config:125 if "var_dir" not in config:
126 config['var_dir'] = '/var/lib/turku-agent'126 config["var_dir"] = "/var/lib/turku-agent"
127127
128 var_config_d = os.path.join(config['var_dir'], 'config.d')128 var_config_d = os.path.join(config["var_dir"], "config.d")
129129
130 # Load /var config.d files130 # Load /var config.d files
131 var_config = {}131 var_config = {}
@@ -134,7 +134,7 @@
134 var_config_files = [134 var_config_files = [
135 os.path.join(var_config_d, fn)135 os.path.join(var_config_d, fn)
136 for fn in os.listdir(var_config_d)136 for fn in os.listdir(var_config_d)
137 if fn.endswith('.json')137 if fn.endswith(".json")
138 and os.path.isfile(os.path.join(var_config_d, fn))138 and os.path.isfile(os.path.join(var_config_d, fn))
139 and os.access(os.path.join(var_config_d, fn), os.R_OK)139 and os.access(os.path.join(var_config_d, fn), os.R_OK)
140 ]140 ]
@@ -145,40 +145,40 @@
145 var_config = dict_merge(var_config, config)145 var_config = dict_merge(var_config, config)
146 config = var_config146 config = var_config
147147
148 if 'lock_dir' not in config:148 if "lock_dir" not in config:
149 config['lock_dir'] = '/var/lock'149 config["lock_dir"] = "/var/lock"
150150
151 if 'rsyncd_command' not in config:151 if "rsyncd_command" not in config:
152 config['rsyncd_command'] = ['rsync']152 config["rsyncd_command"] = ["rsync"]
153153
154 if 'rsyncd_local_address' not in config:154 if "rsyncd_local_address" not in config:
155 config['rsyncd_local_address'] = '127.0.0.1'155 config["rsyncd_local_address"] = "127.0.0.1"
156156
157 if 'rsyncd_local_port' not in config:157 if "rsyncd_local_port" not in config:
158 config['rsyncd_local_port'] = 27873158 config["rsyncd_local_port"] = 27873
159159
160 if 'ssh_command' not in config:160 if "ssh_command" not in config:
161 config['ssh_command'] = ['ssh']161 config["ssh_command"] = ["ssh"]
162162
163 # If a go/no-go program is defined, run it and only go if it exits 0.163 # If a go/no-go program is defined, run it and only go if it exits 0.
164 # Type: String (program with no args) or list (program first, optional arguments after)164 # Type: String (program with no args) or list (program first, optional arguments after)
165 if 'gonogo_program' not in config:165 if "gonogo_program" not in config:
166 config['gonogo_program'] = None166 config["gonogo_program"] = None
167167
168 var_sources_d = os.path.join(config['var_dir'], 'sources.d')168 var_sources_d = os.path.join(config["var_dir"], "sources.d")
169169
170 # Validate the unit name170 # Validate the unit name
171 if 'unit_name' not in config:171 if "unit_name" not in config:
172 config['unit_name'] = platform.node()172 config["unit_name"] = platform.node()
173 # If this isn't in the on-disk config, don't write it; just173 # If this isn't in the on-disk config, don't write it; just
174 # generate it every time174 # generate it every time
175175
176 # Pull the SSH public key176 # Pull the SSH public key
177 if os.path.isfile(os.path.join(config['var_dir'], 'ssh_key.pub')):177 if os.path.isfile(os.path.join(config["var_dir"], "ssh_key.pub")):
178 with open(os.path.join(config['var_dir'], 'ssh_key.pub')) as f:178 with open(os.path.join(config["var_dir"], "ssh_key.pub")) as f:
179 config['ssh_public_key'] = f.read().rstrip()179 config["ssh_public_key"] = f.read().rstrip()
180 config['ssh_public_key_file'] = os.path.join(config['var_dir'], 'ssh_key.pub')180 config["ssh_public_key_file"] = os.path.join(config["var_dir"], "ssh_key.pub")
181 config['ssh_private_key_file'] = os.path.join(config['var_dir'], 'ssh_key')181 config["ssh_private_key_file"] = os.path.join(config["var_dir"], "ssh_key")
182182
183 sources_config = {}183 sources_config = {}
184 # Merge in sources.d/*.json to the sources dict184 # Merge in sources.d/*.json to the sources dict
@@ -187,7 +187,7 @@
187 sources_files = [187 sources_files = [
188 os.path.join(sources_d, fn)188 os.path.join(sources_d, fn)
189 for fn in os.listdir(sources_d)189 for fn in os.listdir(sources_d)
190 if fn.endswith('.json')190 if fn.endswith(".json")
191 and os.path.isfile(os.path.join(sources_d, fn))191 and os.path.isfile(os.path.join(sources_d, fn))
192 and os.access(os.path.join(sources_d, fn), os.R_OK)192 and os.access(os.path.join(sources_d, fn), os.R_OK)
193 ]193 ]
@@ -197,7 +197,7 @@
197 var_sources_files = [197 var_sources_files = [
198 os.path.join(var_sources_d, fn)198 os.path.join(var_sources_d, fn)
199 for fn in os.listdir(var_sources_d)199 for fn in os.listdir(var_sources_d)
200 if fn.endswith('.json')200 if fn.endswith(".json")
201 and os.path.isfile(os.path.join(var_sources_d, fn))201 and os.path.isfile(os.path.join(var_sources_d, fn))
202 and os.access(os.path.join(var_sources_d, fn), os.R_OK)202 and os.access(os.path.join(var_sources_d, fn), os.R_OK)
203 ]203 ]
@@ -208,19 +208,19 @@
208208
209 # Check for required sources options209 # Check for required sources options
210 for s in list(sources_config.keys()):210 for s in list(sources_config.keys()):
211 if 'path' not in sources_config[s]:211 if "path" not in sources_config[s]:
212 del sources_config[s]212 del sources_config[s]
213213
214 config['sources'] = sources_config214 config["sources"] = sources_config
215215
216 return config216 return config
217217
218218
219def fill_config(config):219def fill_config(config):
220 config_d = os.path.join(config['config_dir'], 'config.d')220 config_d = os.path.join(config["config_dir"], "config.d")
221 sources_d = os.path.join(config['config_dir'], 'sources.d')221 sources_d = os.path.join(config["config_dir"], "sources.d")
222 var_config_d = os.path.join(config['var_dir'], 'config.d')222 var_config_d = os.path.join(config["var_dir"], "config.d")
223 var_sources_d = os.path.join(config['var_dir'], 'sources.d')223 var_sources_d = os.path.join(config["var_dir"], "sources.d")
224224
225 # Create required directories225 # Create required directories
226 for d in (config_d, sources_d, var_config_d, var_sources_d):226 for d in (config_d, sources_d, var_config_d, var_sources_d):
@@ -229,106 +229,122 @@
229229
230 # Validate the machine UUID/secret230 # Validate the machine UUID/secret
231 write_uuid_data = False231 write_uuid_data = False
232 if 'machine_uuid' not in config:232 if "machine_uuid" not in config:
233 config['machine_uuid'] = str(uuid.uuid4())233 config["machine_uuid"] = str(uuid.uuid4())
234 write_uuid_data = True234 write_uuid_data = True
235 if 'machine_secret' not in config:235 if "machine_secret" not in config:
236 config['machine_secret'] = ''.join(236 config["machine_secret"] = "".join(
237 random.choice(string.ascii_letters + string.digits)237 random.choice(string.ascii_letters + string.digits) for i in range(30)
238 for i in range(30)
239 )238 )
240 write_uuid_data = True239 write_uuid_data = True
241 # Write out the machine UUID/secret if needed240 # Write out the machine UUID/secret if needed
242 if write_uuid_data:241 if write_uuid_data:
243 with open(os.path.join(var_config_d, '10-machine_uuid.json'), 'w') as f:242 with open(os.path.join(var_config_d, "10-machine_uuid.json"), "w") as f:
244 os.fchmod(f.fileno(), 0o600)243 os.fchmod(f.fileno(), 0o600)
245 json_dump_p({244 json_dump_p(
246 'machine_uuid': config['machine_uuid'],245 {
247 'machine_secret': config['machine_secret'],246 "machine_uuid": config["machine_uuid"],
248 }, f)247 "machine_secret": config["machine_secret"],
248 },
249 f,
250 )
249251
250 # Restoration configuration252 # Restoration configuration
251 write_restore_data = False253 write_restore_data = False
252 if 'restore_path' not in config:254 if "restore_path" not in config:
253 config['restore_path'] = '/var/backups/turku-agent/restore'255 config["restore_path"] = "/var/backups/turku-agent/restore"
254 write_restore_data = True256 write_restore_data = True
255 if 'restore_module' not in config:257 if "restore_module" not in config:
256 config['restore_module'] = 'turku-restore'258 config["restore_module"] = "turku-restore"
257 write_restore_data = True259 write_restore_data = True
258 if 'restore_username' not in config:260 if "restore_username" not in config:
259 config['restore_username'] = str(uuid.uuid4())261 config["restore_username"] = str(uuid.uuid4())
260 write_restore_data = True262 write_restore_data = True
261 if 'restore_password' not in config:263 if "restore_password" not in config:
262 config['restore_password'] = ''.join(264 config["restore_password"] = "".join(
263 random.choice(string.ascii_letters + string.digits)265 random.choice(string.ascii_letters + string.digits) for i in range(30)
264 for i in range(30)
265 )266 )
266 write_restore_data = True267 write_restore_data = True
267 if write_restore_data:268 if write_restore_data:
268 with open(os.path.join(var_config_d, '10-restore.json'), 'w') as f:269 with open(os.path.join(var_config_d, "10-restore.json"), "w") as f:
269 os.fchmod(f.fileno(), 0o600)270 os.fchmod(f.fileno(), 0o600)
270 restore_out = {271 restore_out = {
271 'restore_path': config['restore_path'],272 "restore_path": config["restore_path"],
272 'restore_module': config['restore_module'],273 "restore_module": config["restore_module"],
273 'restore_username': config['restore_username'],274 "restore_username": config["restore_username"],
274 'restore_password': config['restore_password'],275 "restore_password": config["restore_password"],
275 }276 }
276 json_dump_p(restore_out, f)277 json_dump_p(restore_out, f)
277 if not os.path.isdir(config['restore_path']):278 if not os.path.isdir(config["restore_path"]):
278 os.makedirs(config['restore_path'])279 os.makedirs(config["restore_path"])
279280
280 # Generate the SSH keypair if it doesn't exist281 # Generate the SSH keypair if it doesn't exist
281 if 'ssh_private_key_file' not in config:282 if "ssh_private_key_file" not in config:
282 subprocess.check_call([283 subprocess.check_call(
283 'ssh-keygen', '-t', 'rsa', '-N', '', '-C', 'turku',284 [
284 '-f', os.path.join(config['var_dir'], 'ssh_key')285 "ssh-keygen",
285 ])286 "-t",
286 with open(os.path.join(config['var_dir'], 'ssh_key.pub')) as f:287 "rsa",
287 config['ssh_public_key'] = f.read().rstrip()288 "-N",
288 config['ssh_public_key_file'] = os.path.join(config['var_dir'], 'ssh_key.pub')289 "",
289 config['ssh_private_key_file'] = os.path.join(config['var_dir'], 'ssh_key')290 "-C",
291 "turku",
292 "-f",
293 os.path.join(config["var_dir"], "ssh_key"),
294 ]
295 )
296 with open(os.path.join(config["var_dir"], "ssh_key.pub")) as f:
297 config["ssh_public_key"] = f.read().rstrip()
298 config["ssh_public_key_file"] = os.path.join(config["var_dir"], "ssh_key.pub")
299 config["ssh_private_key_file"] = os.path.join(config["var_dir"], "ssh_key")
290300
291 for s in config['sources']:301 for s in config["sources"]:
292 # Check for missing usernames/passwords302 # Check for missing usernames/passwords
293 if not ('username' in config['sources'][s] or 'password' in config['sources'][s]):303 if not (
294 sources_secrets_d = os.path.join(config['config_dir'], 'sources_secrets.d')304 "username" in config["sources"][s] or "password" in config["sources"][s]
295 if 'username' not in config['sources'][s]:305 ):
296 config['sources'][s]['username'] = str(uuid.uuid4())306 if "username" not in config["sources"][s]:
297 if 'password' not in config['sources'][s]:307 config["sources"][s]["username"] = str(uuid.uuid4())
298 config['sources'][s]['password'] = ''.join(308 if "password" not in config["sources"][s]:
309 config["sources"][s]["password"] = "".join(
299 random.choice(string.ascii_letters + string.digits)310 random.choice(string.ascii_letters + string.digits)
300 for i in range(30)311 for i in range(30)
301 )312 )
302 with open(os.path.join(var_sources_d, '10-' + s + '.json'), 'w') as f:313 with open(os.path.join(var_sources_d, "10-" + s + ".json"), "w") as f:
303 os.fchmod(f.fileno(), 0o600)314 os.fchmod(f.fileno(), 0o600)
304 json_dump_p({315 json_dump_p(
305 s: {316 {
306 'username': config['sources'][s]['username'],317 s: {
307 'password': config['sources'][s]['password'],318 "username": config["sources"][s]["username"],
308 }319 "password": config["sources"][s]["password"],
309 }, f)320 }
321 },
322 f,
323 )
310324
311325
312def api_call(api_url, cmd, post_data, timeout=5):326def api_call(api_url, cmd, post_data, timeout=5):
313 url = urllib.parse.urlparse(api_url)327 url = urllib.parse.urlparse(api_url)
314 if url.scheme == 'https':328 if url.scheme == "https":
315 h = http.client.HTTPSConnection(url.netloc, timeout=timeout)329 h = http.client.HTTPSConnection(url.netloc, timeout=timeout)
316 else:330 else:
317 h = http.client.HTTPConnection(url.netloc, timeout=timeout)331 h = http.client.HTTPConnection(url.netloc, timeout=timeout)
318 out = json.dumps(post_data)332 out = json.dumps(post_data)
319 h.putrequest('POST', '%s/%s' % (url.path, cmd))333 h.putrequest("POST", "%s/%s" % (url.path, cmd))
320 h.putheader('Content-Type', 'application/json')334 h.putheader("Content-Type", "application/json")
321 h.putheader('Content-Length', len(out))335 h.putheader("Content-Length", len(out))
322 h.putheader('Accept', 'application/json')336 h.putheader("Accept", "application/json")
323 h.endheaders()337 h.endheaders()
324 h.send(out.encode('UTF-8'))338 h.send(out.encode("UTF-8"))
325339
326 res = h.getresponse()340 res = h.getresponse()
327 if not res.status == http.client.OK:341 if not res.status == http.client.OK:
328 raise Exception('Received error %d (%s) from API server' % (res.status, res.reason))342 raise Exception(
329 if not res.getheader('content-type') == 'application/json':343 "Received error %d (%s) from API server" % (res.status, res.reason)
330 raise Exception('Received invalid reply from API server')344 )
345 if not res.getheader("content-type") == "application/json":
346 raise Exception("Received invalid reply from API server")
331 try:347 try:
332 return json.loads(res.read().decode('UTF-8'))348 return json.loads(res.read().decode("UTF-8"))
333 except ValueError:349 except ValueError:
334 raise Exception('Received invalid reply from API server')350 raise Exception("Received invalid reply from API server")
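As a quick sanity check that the Black pass above is behavior-preserving: the quote-style and call-layout changes don't alter runtime semantics. The sketch below copies `json_dumps_p` as it reads post-merge (standalone here, with an illustrative input dict that is not from the repo) and shows that single- vs double-quoted separator strings produce identical output:

```python
import json


def json_dumps_p(obj):
    """Calls json.dumps with standard (pretty) formatting"""
    return json.dumps(obj, sort_keys=True, indent=4, separators=(",", ": "))


# The separator tuples before and after the Black pass are equal strings,
# so only the source formatting changed, not the serialized output.
assert json.dumps({"a": 1}, separators=(',', ': ')) == json.dumps(
    {"a": 1}, separators=(",", ": ")
)

print(json_dumps_p({"machine_uuid": "example", "sources": {}}))
# {
#     "machine_uuid": "example",
#     "sources": {}
# }
```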