Merge lp:~fo0bar/turku/turku-api-cleanup into lp:turku/turku-api
Status: Merged
Approved by: Barry Price
Approved revision: 65
Merged at revision: 65
Proposed branch: lp:~fo0bar/turku/turku-api-cleanup
Merge into: lp:turku/turku-api
Diff against target: 2291 lines (+901/-541), 12 files modified:
- .bzrignore (+61/-1)
- MANIFEST.in (+9/-0)
- Makefile (+28/-0)
- setup.py (+28/-0)
- tests/test_stub.py (+8/-0)
- tox.ini (+38/-0)
- turku_api/admin.py (+109/-82)
- turku_api/models.py (+167/-162)
- turku_api/settings.py (+37/-35)
- turku_api/urls.py (+31/-13)
- turku_api/views.py (+382/-247)
- turku_api/wsgi.py (+3/-1)
To merge this branch: bzr merge lp:~fo0bar/turku/turku-api-cleanup
Related bugs: (none)
| Reviewer | Review Type | Date Requested | Status |
|---|---|---|---|
| Stuart Bishop (community) | | | Approve |

Review via email: mp+386143@code.launchpad.net
Commit message
Mega-noop cleanup
Description of the change
This is the minimum required for:
- tox test suite with all passing tests
- black-managed formatting
- Shippable sdist module
It is intended as a base for the other MPs, so they don't have to e.g. establish tests/*, worry about existing failing flake8, or work out how to add additional optional modules.
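As a quick orientation, here is a minimal sketch of the workflows this branch establishes, using the Makefile and tox.ini targets added in the diff below (assuming python3 and tox are installed; tox installs black, flake8, and pytest itself from its declared deps):

    make test                 # full tox run: black --check, flake8, pytest with coverage
    make black                # reformat the tree in place with black
    python3 setup.py sdist    # build the shippable tarball, drawing file lists from MANIFEST.in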
🤖 Canonical IS Merge Bot (canonical-is-mergebot) wrote:
65. By Ryan Finnie

Mega-noop cleanup
- Add setup.py
  - This is a Django application, but treating it as a full Python module helps with tox testing
- Sort imports
- Create MANIFEST.in so `setup.py sdist` produces usable tarballs
- Create stub tests
- Add tox.ini
- Add blank requirements.txt
- Add Makefile
- make black
- Update .bzrignore
- Remove blank turku_api/tests.py
- Clean up flake8:
  - Ignore local_settings/local_urls import * F401/F403
  - Ignore wsgi.py import E402
  - Fix urls.py 'django.conf.urls.include' imported but unused
Stuart Bishop (stub) wrote:
Yup, same deal as the other two turku cleanup branches.
🤖 Canonical IS Merge Bot (canonical-is-mergebot) wrote:
Change successfully merged at revision 65
Preview Diff
1 | === modified file '.bzrignore' |
2 | --- .bzrignore 2015-03-26 05:04:41 +0000 |
3 | +++ .bzrignore 2020-06-21 23:58:30 +0000 |
4 | @@ -1,4 +1,64 @@ |
5 | -*.pyc |
6 | db.sqlite3 |
7 | turku_api/local_settings.py |
8 | turku_api/local_urls.py |
9 | +MANIFEST |
10 | +.pybuild/ |
11 | +.pytest_cache/ |
12 | + |
13 | +# Byte-compiled / optimized / DLL files |
14 | +__pycache__/ |
15 | +*.py[cod] |
16 | + |
17 | +# C extensions |
18 | +*.so |
19 | + |
20 | +# Distribution / packaging |
21 | +.Python |
22 | +env/ |
23 | +build/ |
24 | +develop-eggs/ |
25 | +dist/ |
26 | +downloads/ |
27 | +eggs/ |
28 | +.eggs/ |
29 | +lib/ |
30 | +lib64/ |
31 | +parts/ |
32 | +sdist/ |
33 | +var/ |
34 | +*.egg-info/ |
35 | +.installed.cfg |
36 | +*.egg |
37 | + |
38 | +# PyInstaller |
39 | +# Usually these files are written by a python script from a template |
40 | +# before PyInstaller builds the exe, so as to inject date/other infos into it. |
41 | +*.manifest |
42 | +*.spec |
43 | + |
44 | +# Installer logs |
45 | +pip-log.txt |
46 | +pip-delete-this-directory.txt |
47 | + |
48 | +# Unit test / coverage reports |
49 | +htmlcov/ |
50 | +.tox/ |
51 | +.coverage |
52 | +.coverage.* |
53 | +.cache |
54 | +nosetests.xml |
55 | +coverage.xml |
56 | +*,cover |
57 | + |
58 | +# Translations |
59 | +*.mo |
60 | +*.pot |
61 | + |
62 | +# Django stuff: |
63 | +*.log |
64 | + |
65 | +# Sphinx documentation |
66 | +docs/_build/ |
67 | + |
68 | +# PyBuilder |
69 | +target/ |
70 | |
71 | === added file 'MANIFEST.in' |
72 | --- MANIFEST.in 1970-01-01 00:00:00 +0000 |
73 | +++ MANIFEST.in 2020-06-21 23:58:30 +0000 |
74 | @@ -0,0 +1,9 @@ |
75 | +include Makefile |
76 | +include manage.py |
77 | +include MANIFEST.in |
78 | +include README.md |
79 | +include requirements.txt |
80 | +include scripts/turku_health |
81 | +include tests/*.py |
82 | +include tox.ini |
83 | +include turku_api/templates/admin/*.html |
84 | |
85 | === added file 'Makefile' |
86 | --- Makefile 1970-01-01 00:00:00 +0000 |
87 | +++ Makefile 2020-06-21 23:58:30 +0000 |
88 | @@ -0,0 +1,28 @@ |
89 | +PYTHON := python3 |
90 | + |
91 | +all: build |
92 | + |
93 | +build: |
94 | + $(PYTHON) setup.py build |
95 | + |
96 | +lint: |
97 | + $(PYTHON) -mtox -e flake8 |
98 | + |
99 | +test: |
100 | + $(PYTHON) -mtox |
101 | + |
102 | +test-quick: |
103 | + $(PYTHON) -mtox -e black,flake8,pytest-quick |
104 | + |
105 | +black-check: |
106 | + $(PYTHON) -mtox -e black |
107 | + |
108 | +black: |
109 | + $(PYTHON) -mblack $(CURDIR) |
110 | + |
111 | +install: build |
112 | + $(PYTHON) setup.py install |
113 | + |
114 | +clean: |
115 | + $(PYTHON) setup.py clean |
116 | + $(RM) -r build MANIFEST |
117 | |
118 | === added file 'requirements.txt' |
119 | === added file 'setup.py' |
120 | --- setup.py 1970-01-01 00:00:00 +0000 |
121 | +++ setup.py 2020-06-21 23:58:30 +0000 |
122 | @@ -0,0 +1,28 @@ |
123 | +#!/usr/bin/env python3 |
124 | + |
125 | +# Turku backups - API application |
126 | +# Copyright 2015-2020 Canonical Ltd. |
127 | +# |
128 | +# This program is free software: you can redistribute it and/or modify it |
129 | +# under the terms of the GNU General Public License version 3, as published by |
130 | +# the Free Software Foundation. |
131 | +# |
132 | +# This program is distributed in the hope that it will be useful, but WITHOUT |
133 | +# ANY WARRANTY; without even the implied warranties of MERCHANTABILITY, |
134 | +# SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR PURPOSE. See the GNU |
135 | +# General Public License for more details. |
136 | +# |
137 | +# You should have received a copy of the GNU General Public License along with |
138 | +# this program. If not, see <http://www.gnu.org/licenses/>. |
139 | + |
140 | +from setuptools import setup |
141 | + |
142 | + |
143 | +setup( |
144 | + name="turku_api", |
145 | + description="Turku backups - API application", |
146 | + author="Ryan Finnie", |
147 | + url="https://launchpad.net/turku", |
148 | + python_requires="~=3.4", |
149 | + packages=["turku_api"], |
150 | +) |
151 | |
152 | === added directory 'tests' |
153 | === added file 'tests/__init__.py' |
154 | === added file 'tests/test_stub.py' |
155 | --- tests/test_stub.py 1970-01-01 00:00:00 +0000 |
156 | +++ tests/test_stub.py 2020-06-21 23:58:30 +0000 |
157 | @@ -0,0 +1,8 @@ |
158 | +import unittest |
159 | +import warnings |
160 | + |
161 | + |
162 | +class TestStub(unittest.TestCase): |
163 | + def test_stub(self): |
164 | + # pytest doesn't like a tests/ with no tests |
165 | + warnings.warn("Remove this file once unit tests are added") |
166 | |
167 | === added file 'tox.ini' |
168 | --- tox.ini 1970-01-01 00:00:00 +0000 |
169 | +++ tox.ini 2020-06-21 23:58:30 +0000 |
170 | @@ -0,0 +1,38 @@ |
171 | +[tox] |
172 | +envlist = black, flake8, pytest |
173 | + |
174 | +[testenv] |
175 | +basepython = python |
176 | + |
177 | +[testenv:black] |
178 | +commands = python -mblack --check . |
179 | +deps = black |
180 | + |
181 | +[testenv:flake8] |
182 | +commands = python -mflake8 |
183 | +deps = flake8 |
184 | + |
185 | +[testenv:pytest] |
186 | +commands = python -mpytest --cov=turku_api --cov-report=term-missing |
187 | +deps = pytest |
188 | + pytest-cov |
189 | + -r{toxinidir}/requirements.txt |
190 | + |
191 | +[testenv:pytest-quick] |
192 | +commands = python -mpytest -m "not slow" |
193 | +deps = pytest |
194 | + -r{toxinidir}/requirements.txt |
195 | + |
196 | +[flake8] |
197 | +exclude = |
198 | + .git, |
199 | + __pycache__, |
200 | + .tox, |
201 | +# TODO: remove C901 once complexity is reduced |
202 | +ignore = C901,E203,E231,W503 |
203 | +max-line-length = 120 |
204 | +max-complexity = 10 |
205 | + |
206 | +[pytest] |
207 | +markers = |
208 | + slow |
209 | |
210 | === modified file 'turku_api/admin.py' |
211 | --- turku_api/admin.py 2020-05-06 02:41:37 +0000 |
212 | +++ turku_api/admin.py 2020-06-21 23:58:30 +0000 |
213 | @@ -14,38 +14,39 @@ |
214 | # License along with this program. If not, see |
215 | # <http://www.gnu.org/licenses/>. |
216 | |
217 | +import datetime |
218 | + |
219 | from django import forms |
220 | from django.contrib import admin |
221 | -from turku_api.models import Machine, Source, Auth, Storage, BackupLog, FilterSet |
222 | +from django.contrib.humanize.templatetags.humanize import naturaltime |
223 | +from django.utils import timezone |
224 | from django.utils.html import format_html |
225 | -from django.utils import timezone |
226 | -from django.contrib.humanize.templatetags.humanize import naturaltime |
227 | + |
228 | try: |
229 | from django.urls import reverse # 1.10+ |
230 | except ModuleNotFoundError: |
231 | from django.core.urlresolvers import reverse # pre-1.10 |
232 | -import datetime |
233 | + |
234 | +from turku_api.models import Auth, BackupLog, FilterSet, Machine, Source, Storage |
235 | |
236 | |
237 | def get_admin_change_link(obj, name=None): |
238 | url = reverse( |
239 | - 'admin:%s_%s_change' % (obj._meta.app_label, obj._meta.model_name), |
240 | - args=(obj.id,) |
241 | + "admin:%s_%s_change" % (obj._meta.app_label, obj._meta.model_name), |
242 | + args=(obj.id,), |
243 | ) |
244 | if not name: |
245 | name = obj |
246 | - return format_html( |
247 | - '<a href="{}">{}</a>'.format(url, name) |
248 | - ) |
249 | + return format_html('<a href="{}">{}</a>'.format(url, name)) |
250 | |
251 | |
252 | def human_si(v, begin=0): |
253 | - p = ('', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi', 'Yi') |
254 | + p = ("", "Ki", "Mi", "Gi", "Ti", "Pi", "Ei", "Zi", "Yi") |
255 | i = begin |
256 | while v >= 1024.0: |
257 | v = int(v / 10.24) / 100.0 |
258 | i += 1 |
259 | - return '%g %sB' % (v, p[i]) |
260 | + return "%g %sB" % (v, p[i]) |
261 | |
262 | |
263 | def human_time(t): |
264 | @@ -58,58 +59,62 @@ |
265 | |
266 | |
267 | class CustomModelAdmin(admin.ModelAdmin): |
268 | - change_form_template = 'admin/custom_change_form.html' |
269 | + change_form_template = "admin/custom_change_form.html" |
270 | |
271 | def render_change_form(self, request, context, *args, **kwargs): |
272 | # Build a list of related children objects and their counts |
273 | # so they may be linked to in the admin interface |
274 | related_links = [] |
275 | - if 'object_id' in context and hasattr(self.model._meta, 'get_fields'): |
276 | + if "object_id" in context and hasattr(self.model._meta, "get_fields"): |
277 | related_objs = [ |
278 | - f for f in self.model._meta.get_fields() |
279 | - if (f.one_to_many or f.one_to_one) |
280 | - and f.auto_created and not f.concrete |
281 | + f |
282 | + for f in self.model._meta.get_fields() |
283 | + if (f.one_to_many or f.one_to_one) and f.auto_created and not f.concrete |
284 | ] |
285 | for obj in related_objs: |
286 | - count = obj.related_model.objects.filter(**{obj.field.name: context['object_id']}).count() |
287 | + count = obj.related_model.objects.filter( |
288 | + **{obj.field.name: context["object_id"]} |
289 | + ).count() |
290 | if count > 0: |
291 | related_links.append((obj, obj.related_model._meta, count)) |
292 | - context.update({'related_links': related_links}) |
293 | + context.update({"related_links": related_links}) |
294 | |
295 | - return super(CustomModelAdmin, self).render_change_form(request, context, *args, **kwargs) |
296 | + return super(CustomModelAdmin, self).render_change_form( |
297 | + request, context, *args, **kwargs |
298 | + ) |
299 | |
300 | |
301 | class MachineAdminForm(forms.ModelForm): |
302 | class Meta: |
303 | model = Machine |
304 | - fields = '__all__' |
305 | + fields = "__all__" |
306 | |
307 | def __init__(self, *args, **kwargs): |
308 | super(MachineAdminForm, self).__init__(*args, **kwargs) |
309 | - self.fields['auth'].queryset = Auth.objects.filter(secret_type='machine_reg') |
310 | + self.fields["auth"].queryset = Auth.objects.filter(secret_type="machine_reg") |
311 | |
312 | |
313 | class StorageAdminForm(forms.ModelForm): |
314 | class Meta: |
315 | model = Storage |
316 | - fields = '__all__' |
317 | + fields = "__all__" |
318 | |
319 | def __init__(self, *args, **kwargs): |
320 | super(StorageAdminForm, self).__init__(*args, **kwargs) |
321 | - self.fields['auth'].queryset = Auth.objects.filter(secret_type='storage_reg') |
322 | + self.fields["auth"].queryset = Auth.objects.filter(secret_type="storage_reg") |
323 | |
324 | |
325 | class AuthAdmin(CustomModelAdmin): |
326 | - list_display = ('name', 'secret_type', 'date_added', 'active') |
327 | - ordering = ('name',) |
328 | - search_fields = ('name', 'comment',) |
329 | + list_display = ("name", "secret_type", "date_added", "active") |
330 | + ordering = ("name",) |
331 | + search_fields = ("name", "comment") |
332 | |
333 | |
334 | class ExcludeListFilter(admin.SimpleListFilter): |
335 | def __init__(self, *args, **kwargs): |
336 | if not self.title: |
337 | self.title = self.parameter_name |
338 | - self.parameter_name += '__exclude' |
339 | + self.parameter_name += "__exclude" |
340 | super(ExcludeListFilter, self).__init__(*args, **kwargs) |
341 | |
342 | def has_output(self): |
343 | @@ -125,7 +130,7 @@ |
344 | |
345 | |
346 | class NameExcludeListFilter(ExcludeListFilter): |
347 | - parameter_name = 'name' |
348 | + parameter_name = "name" |
349 | |
350 | |
351 | class MachineAdmin(CustomModelAdmin): |
352 | @@ -133,61 +138,73 @@ |
353 | return get_admin_change_link(obj.storage) |
354 | |
355 | storage_link.allow_tags = True |
356 | - storage_link.admin_order_field = 'storage__name' |
357 | - storage_link.short_description = 'storage' |
358 | + storage_link.admin_order_field = "storage__name" |
359 | + storage_link.short_description = "storage" |
360 | |
361 | def date_checked_in_human(self, obj): |
362 | return human_time(obj.date_checked_in) |
363 | |
364 | - date_checked_in_human.admin_order_field = 'date_checked_in' |
365 | - date_checked_in_human.short_description = 'date checked in' |
366 | + date_checked_in_human.admin_order_field = "date_checked_in" |
367 | + date_checked_in_human.short_description = "date checked in" |
368 | |
369 | form = MachineAdminForm |
370 | list_display = ( |
371 | - 'unit_name', 'uuid', 'storage_link', 'environment_name', |
372 | - 'service_name', 'date_checked_in_human', 'published', 'active', |
373 | - 'healthy', |
374 | - ) |
375 | - list_display_links = ('unit_name',) |
376 | - list_filter = ('date_checked_in', 'storage', 'active', 'published') |
377 | - ordering = ('unit_name',) |
378 | - search_fields = ( |
379 | - 'unit_name', 'uuid', 'environment_name', 'service_name', |
380 | - 'comment', |
381 | - ) |
382 | + "unit_name", |
383 | + "uuid", |
384 | + "storage_link", |
385 | + "environment_name", |
386 | + "service_name", |
387 | + "date_checked_in_human", |
388 | + "published", |
389 | + "active", |
390 | + "healthy", |
391 | + ) |
392 | + list_display_links = ("unit_name",) |
393 | + list_filter = ("date_checked_in", "storage", "active", "published") |
394 | + ordering = ("unit_name",) |
395 | + search_fields = ("unit_name", "uuid", "environment_name", "service_name", "comment") |
396 | |
397 | |
398 | class SourceAdmin(CustomModelAdmin): |
399 | def date_last_backed_up_human(self, obj): |
400 | return human_time(obj.date_last_backed_up) |
401 | |
402 | - date_last_backed_up_human.admin_order_field = 'date_last_backed_up' |
403 | - date_last_backed_up_human.short_description = 'date last backed up' |
404 | + date_last_backed_up_human.admin_order_field = "date_last_backed_up" |
405 | + date_last_backed_up_human.short_description = "date last backed up" |
406 | |
407 | def date_next_backup_human(self, obj): |
408 | return human_time(obj.date_next_backup) |
409 | |
410 | - date_next_backup_human.admin_order_field = 'date_next_backup' |
411 | - date_next_backup_human.short_description = 'date next backup' |
412 | + date_next_backup_human.admin_order_field = "date_next_backup" |
413 | + date_next_backup_human.short_description = "date next backup" |
414 | |
415 | def machine_link(self, obj): |
416 | return get_admin_change_link(obj.machine) |
417 | |
418 | machine_link.allow_tags = True |
419 | - machine_link.admin_order_field = 'machine__unit_name' |
420 | - machine_link.short_description = 'machine' |
421 | + machine_link.admin_order_field = "machine__unit_name" |
422 | + machine_link.short_description = "machine" |
423 | |
424 | list_display = ( |
425 | - 'name', 'machine_link', 'path', 'date_last_backed_up_human', |
426 | - 'date_next_backup_human', 'published', 'active', 'healthy', |
427 | + "name", |
428 | + "machine_link", |
429 | + "path", |
430 | + "date_last_backed_up_human", |
431 | + "date_next_backup_human", |
432 | + "published", |
433 | + "active", |
434 | + "healthy", |
435 | ) |
436 | - list_display_links = ('name',) |
437 | + list_display_links = ("name",) |
438 | list_filter = ( |
439 | - 'date_last_backed_up', 'date_next_backup', 'active', 'published', |
440 | + "date_last_backed_up", |
441 | + "date_next_backup", |
442 | + "active", |
443 | + "published", |
444 | NameExcludeListFilter, |
445 | ) |
446 | - ordering = ('machine__unit_name', 'name') |
447 | - search_fields = ('name', 'comment', 'path',) |
448 | + ordering = ("machine__unit_name", "name") |
449 | + search_fields = ("name", "comment", "path") |
450 | |
451 | |
452 | class BackupLogAdmin(CustomModelAdmin): |
453 | @@ -195,8 +212,8 @@ |
454 | return get_admin_change_link(obj.source) |
455 | |
456 | source_link.allow_tags = True |
457 | - source_link.admin_order_field = 'source__name' |
458 | - source_link.short_description = 'source' |
459 | + source_link.admin_order_field = "source__name" |
460 | + source_link.short_description = "source" |
461 | |
462 | def duration(self, obj): |
463 | if not (obj.date_end and obj.date_begin): |
464 | @@ -204,8 +221,8 @@ |
465 | d = obj.date_end - obj.date_begin |
466 | return d - datetime.timedelta(microseconds=d.microseconds) |
467 | |
468 | - duration.admin_order_field = 'date_end' |
469 | - duration.short_description = 'duration' |
470 | + duration.admin_order_field = "date_end" |
471 | + duration.short_description = "duration" |
472 | |
473 | def storage_link(self, obj): |
474 | if not obj.storage: |
475 | @@ -213,57 +230,67 @@ |
476 | return get_admin_change_link(obj.storage) |
477 | |
478 | storage_link.allow_tags = True |
479 | - storage_link.admin_order_field = 'storage__name' |
480 | - storage_link.short_description = 'storage' |
481 | + storage_link.admin_order_field = "storage__name" |
482 | + storage_link.short_description = "storage" |
483 | |
484 | def date_human(self, obj): |
485 | return human_time(obj.date) |
486 | |
487 | - date_human.admin_order_field = 'date' |
488 | - date_human.short_description = 'date' |
489 | + date_human.admin_order_field = "date" |
490 | + date_human.short_description = "date" |
491 | |
492 | list_display = ( |
493 | - 'date_human', 'source_link', 'success', 'snapshot', 'storage_link', |
494 | - 'duration', |
495 | + "date_human", |
496 | + "source_link", |
497 | + "success", |
498 | + "snapshot", |
499 | + "storage_link", |
500 | + "duration", |
501 | ) |
502 | - list_display_links = ('date_human',) |
503 | - list_filter = ('date', 'success') |
504 | - ordering = ('-date',) |
505 | + list_display_links = ("date_human",) |
506 | + list_filter = ("date", "success") |
507 | + ordering = ("-date",) |
508 | |
509 | |
510 | class FilterSetAdmin(CustomModelAdmin): |
511 | - list_display = ('name', 'date_added', 'active') |
512 | - ordering = ('name',) |
513 | - search_fields = ('name', 'comment',) |
514 | + list_display = ("name", "date_added", "active") |
515 | + ordering = ("name",) |
516 | + search_fields = ("name", "comment") |
517 | |
518 | |
519 | class StorageAdmin(CustomModelAdmin): |
520 | def space_total_human(self, obj): |
521 | return human_si(obj.space_total, 2) |
522 | |
523 | - space_total_human.admin_order_field = 'space_total' |
524 | - space_total_human.short_description = 'space total' |
525 | + space_total_human.admin_order_field = "space_total" |
526 | + space_total_human.short_description = "space total" |
527 | |
528 | def space_available_human(self, obj): |
529 | return human_si(obj.space_available, 2) |
530 | |
531 | - space_available_human.admin_order_field = 'space_available' |
532 | - space_available_human.short_description = 'space available' |
533 | + space_available_human.admin_order_field = "space_available" |
534 | + space_available_human.short_description = "space available" |
535 | |
536 | def date_checked_in_human(self, obj): |
537 | return human_time(obj.date_checked_in) |
538 | |
539 | - date_checked_in_human.admin_order_field = 'date_checked_in' |
540 | - date_checked_in_human.short_description = 'date checked in' |
541 | + date_checked_in_human.admin_order_field = "date_checked_in" |
542 | + date_checked_in_human.short_description = "date checked in" |
543 | |
544 | form = StorageAdminForm |
545 | list_display = ( |
546 | - 'name', 'ssh_ping_host', 'ssh_ping_user', 'date_checked_in_human', |
547 | - 'space_total_human', 'space_available_human', 'published', 'active', |
548 | - 'healthy', |
549 | + "name", |
550 | + "ssh_ping_host", |
551 | + "ssh_ping_user", |
552 | + "date_checked_in_human", |
553 | + "space_total_human", |
554 | + "space_available_human", |
555 | + "published", |
556 | + "active", |
557 | + "healthy", |
558 | ) |
559 | - ordering = ('name',) |
560 | - search_fields = ('name', 'comment', 'ssh_ping_host',) |
561 | + ordering = ("name",) |
562 | + search_fields = ("name", "comment", "ssh_ping_host") |
563 | |
564 | |
565 | admin.site.register(Auth, AuthAdmin) |
566 | |
567 | === modified file 'turku_api/models.py' |
568 | --- turku_api/models.py 2020-04-11 21:20:31 +0000 |
569 | +++ turku_api/models.py 2020-06-21 23:58:30 +0000 |
570 | @@ -14,14 +14,15 @@ |
571 | # License along with this program. If not, see |
572 | # <http://www.gnu.org/licenses/>. |
573 | |
574 | +from datetime import timedelta |
575 | +import json |
576 | +import uuid |
577 | + |
578 | from django.db import models |
579 | +from django.contrib.auth.hashers import is_password_usable |
580 | from django.core.exceptions import ValidationError |
581 | from django.core.validators import MaxValueValidator, MinValueValidator |
582 | -from django.contrib.auth.hashers import is_password_usable |
583 | from django.utils import timezone |
584 | -from datetime import timedelta |
585 | -import json |
586 | -import uuid |
587 | |
588 | |
589 | def new_uuid(): |
590 | @@ -32,85 +33,83 @@ |
591 | try: |
592 | str(uuid.UUID(value)) |
593 | except ValueError: |
594 | - raise ValidationError('Invalid UUID format') |
595 | + raise ValidationError("Invalid UUID format") |
596 | |
597 | |
598 | def validate_hashed_password(value): |
599 | if not is_password_usable(value): |
600 | - raise ValidationError('Invalid hashed password') |
601 | + raise ValidationError("Invalid hashed password") |
602 | |
603 | |
604 | def validate_json_string_list(value): |
605 | try: |
606 | decoded_json = json.loads(value) |
607 | except ValueError: |
608 | - raise ValidationError('Must be a valid JSON string list') |
609 | + raise ValidationError("Must be a valid JSON string list") |
610 | if not isinstance(decoded_json, (list, tuple, set)): |
611 | - raise ValidationError('Must be a valid JSON string list') |
612 | + raise ValidationError("Must be a valid JSON string list") |
613 | for i in decoded_json: |
614 | if not isinstance(i, str): |
615 | - raise ValidationError('Must be a valid JSON string list') |
616 | + raise ValidationError("Must be a valid JSON string list") |
617 | |
618 | |
619 | def validate_storage_auth(value): |
620 | try: |
621 | a = Auth.objects.get(id=value) |
622 | except Auth.DoesNotExist: |
623 | - raise ValidationError('Auth %s does not exist' % value) |
624 | - if a.secret_type != 'storage_reg': |
625 | - raise ValidationError('Must be a Storage registration') |
626 | + raise ValidationError("Auth %s does not exist" % value) |
627 | + if a.secret_type != "storage_reg": |
628 | + raise ValidationError("Must be a Storage registration") |
629 | |
630 | |
631 | def validate_machine_auth(value): |
632 | try: |
633 | a = Auth.objects.get(id=value) |
634 | except Auth.DoesNotExist: |
635 | - raise ValidationError('Auth %s does not exist' % value) |
636 | - if a.secret_type != 'machine_reg': |
637 | - raise ValidationError('Must be a Machine registration') |
638 | + raise ValidationError("Auth %s does not exist" % value) |
639 | + if a.secret_type != "machine_reg": |
640 | + raise ValidationError("Must be a Machine registration") |
641 | |
642 | |
643 | class UuidPrimaryKeyField(models.CharField): |
644 | def __init__(self, *args, **kwargs): |
645 | - kwargs['blank'] = True |
646 | - kwargs['default'] = new_uuid |
647 | - kwargs['editable'] = False |
648 | - kwargs['max_length'] = 36 |
649 | - kwargs['primary_key'] = True |
650 | + kwargs["blank"] = True |
651 | + kwargs["default"] = new_uuid |
652 | + kwargs["editable"] = False |
653 | + kwargs["max_length"] = 36 |
654 | + kwargs["primary_key"] = True |
655 | super(UuidPrimaryKeyField, self).__init__(*args, **kwargs) |
656 | |
657 | |
658 | class Auth(models.Model): |
659 | SECRET_TYPES = ( |
660 | - ('machine_reg', 'Machine registration'), |
661 | - ('storage_reg', 'Storage registration'), |
662 | + ("machine_reg", "Machine registration"), |
663 | + ("storage_reg", "Storage registration"), |
664 | ) |
665 | id = UuidPrimaryKeyField() |
666 | name = models.CharField( |
667 | - max_length=200, unique=True, |
668 | - help_text='Human-readable name of this auth.', |
669 | + max_length=200, unique=True, help_text="Human-readable name of this auth." |
670 | ) |
671 | secret_hash = models.CharField( |
672 | max_length=200, |
673 | validators=[validate_hashed_password], |
674 | - help_text='Hashed secret (password) of this auth.', |
675 | + help_text="Hashed secret (password) of this auth.", |
676 | ) |
677 | secret_type = models.CharField( |
678 | - max_length=200, choices=SECRET_TYPES, |
679 | - help_text='Auth secret type (machine/storage).', |
680 | + max_length=200, |
681 | + choices=SECRET_TYPES, |
682 | + help_text="Auth secret type (machine/storage).", |
683 | ) |
684 | comment = models.CharField( |
685 | - max_length=200, blank=True, null=True, |
686 | - help_text='Human-readable comment.', |
687 | + max_length=200, blank=True, null=True, help_text="Human-readable comment." |
688 | ) |
689 | active = models.BooleanField( |
690 | default=True, |
691 | - help_text='Whether this auth is enabled. Disabling prevents new registrations using its key, and prevents ' + |
692 | - 'existing machines using its key from updating their configs.', |
693 | + help_text="Whether this auth is enabled. Disabling prevents new registrations using its key, and prevents " |
694 | + + "existing machines using its key from updating their configs.", |
695 | ) |
696 | date_added = models.DateTimeField( |
697 | - default=timezone.now, |
698 | - help_text='Date/time this auth was added.', |
699 | + default=timezone.now, help_text="Date/time this auth was added." |
700 | ) |
701 | |
702 | def __str__(self): |
703 | @@ -125,77 +124,78 @@ |
704 | return True |
705 | if not self.date_checked_in: |
706 | return False |
707 | - return (now <= (self.date_checked_in + timedelta(minutes=30))) |
708 | + return now <= (self.date_checked_in + timedelta(minutes=30)) |
709 | + |
710 | healthy.boolean = True |
711 | |
712 | id = UuidPrimaryKeyField() |
713 | name = models.CharField( |
714 | - max_length=200, unique=True, |
715 | - help_text='Name of this storage unit. This is used as its login ID and must be unique.', |
716 | + max_length=200, |
717 | + unique=True, |
718 | + help_text="Name of this storage unit. This is used as its login ID and must be unique.", |
719 | ) |
720 | secret_hash = models.CharField( |
721 | max_length=200, |
722 | validators=[validate_hashed_password], |
723 | - help_text='Hashed secret (password) of this storage unit.', |
724 | + help_text="Hashed secret (password) of this storage unit.", |
725 | ) |
726 | comment = models.CharField( |
727 | - max_length=200, blank=True, null=True, |
728 | - help_text='Human-readable comment.', |
729 | + max_length=200, blank=True, null=True, help_text="Human-readable comment." |
730 | ) |
731 | ssh_ping_host = models.CharField( |
732 | max_length=200, |
733 | - verbose_name='SSH ping host', |
734 | - help_text='Hostname/IP address of this storage unit\'s SSH server.', |
735 | + verbose_name="SSH ping host", |
736 | + help_text="Hostname/IP address of this storage unit's SSH server.", |
737 | ) |
738 | ssh_ping_host_keys = models.CharField( |
739 | - max_length=65536, default='[]', |
740 | + max_length=65536, |
741 | + default="[]", |
742 | validators=[validate_json_string_list], |
743 | - verbose_name='SSH ping host keys', |
744 | - help_text='JSON list of this storage unit\'s SSH host keys.', |
745 | + verbose_name="SSH ping host keys", |
746 | + help_text="JSON list of this storage unit's SSH host keys.", |
747 | ) |
748 | ssh_ping_port = models.PositiveIntegerField( |
749 | validators=[MinValueValidator(1), MaxValueValidator(65535)], |
750 | - verbose_name='SSH ping port', |
751 | - help_text='Port number of this storage unit\'s SSH server.', |
752 | + verbose_name="SSH ping port", |
753 | + help_text="Port number of this storage unit's SSH server.", |
754 | ) |
755 | ssh_ping_user = models.CharField( |
756 | max_length=200, |
757 | - verbose_name='SSH ping user', |
758 | - help_text='Username of this storage unit\'s SSH server.', |
759 | + verbose_name="SSH ping user", |
760 | + help_text="Username of this storage unit's SSH server.", |
761 | ) |
762 | space_total = models.PositiveIntegerField( |
763 | default=0, |
764 | - help_text='Total disk space of this storage unit\'s storage directories, in MiB.', |
765 | + help_text="Total disk space of this storage unit's storage directories, in MiB.", |
766 | ) |
767 | space_available = models.PositiveIntegerField( |
768 | default=0, |
769 | - help_text='Available disk space of this storage unit\'s storage directories, in MiB.', |
770 | + help_text="Available disk space of this storage unit's storage directories, in MiB.", |
771 | ) |
772 | auth = models.ForeignKey( |
773 | - Auth, validators=[validate_storage_auth], on_delete=models.CASCADE, |
774 | - help_text='Storage auth used to register this storage unit.', |
775 | + Auth, |
776 | + validators=[validate_storage_auth], |
777 | + on_delete=models.CASCADE, |
778 | + help_text="Storage auth used to register this storage unit.", |
779 | ) |
780 | active = models.BooleanField( |
781 | default=True, |
782 | - help_text='Whether this storage unit is enabled. Disabling prevents this storage unit from checking in or ' + |
783 | - 'being assigned to new machines. Existing machines which ping this storage unit will get errors ' + |
784 | - 'because this storage unit can no longer query the API server.', |
785 | + help_text="Whether this storage unit is enabled. Disabling prevents this storage unit from checking in or " |
786 | + + "being assigned to new machines. Existing machines which ping this storage unit will get errors " |
787 | + + "because this storage unit can no longer query the API server.", |
788 | ) |
789 | published = models.BooleanField( |
790 | - default=True, |
791 | - help_text='Whether this storage unit has been enabled by itself.', |
792 | + default=True, help_text="Whether this storage unit has been enabled by itself." |
793 | ) |
794 | date_registered = models.DateTimeField( |
795 | - default=timezone.now, |
796 | - help_text='Date/time this storage unit was registered.', |
797 | + default=timezone.now, help_text="Date/time this storage unit was registered." |
798 | ) |
799 | date_updated = models.DateTimeField( |
800 | default=timezone.now, |
801 | - help_text='Date/time this storage unit presented a modified config.', |
802 | + help_text="Date/time this storage unit presented a modified config.", |
803 | ) |
804 | date_checked_in = models.DateTimeField( |
805 | - blank=True, null=True, |
806 | - help_text='Date/time this storage unit last checked in.', |
807 | + blank=True, null=True, help_text="Date/time this storage unit last checked in." |
808 | ) |
809 | |
810 | def __str__(self): |
811 | @@ -210,75 +210,82 @@ |
812 | return True |
813 | if not self.date_checked_in: |
814 | return False |
815 | - return (now <= (self.date_checked_in + timedelta(hours=10))) |
816 | + return now <= (self.date_checked_in + timedelta(hours=10)) |
817 | + |
818 | healthy.boolean = True |
819 | |
820 | id = UuidPrimaryKeyField() |
821 | uuid = models.CharField( |
822 | - max_length=36, unique=True, validators=[validate_uuid], |
823 | - verbose_name='UUID', |
824 | - help_text='UUID of this machine. This UUID is set by the machine and must be globally unique.', |
825 | + max_length=36, |
826 | + unique=True, |
827 | + validators=[validate_uuid], |
828 | + verbose_name="UUID", |
829 | + help_text="UUID of this machine. This UUID is set by the machine and must be globally unique.", |
830 | ) |
831 | secret_hash = models.CharField( |
832 | max_length=200, |
833 | validators=[validate_hashed_password], |
834 | - help_text='Hashed secret (password) of this machine.', |
835 | + help_text="Hashed secret (password) of this machine.", |
836 | ) |
837 | environment_name = models.CharField( |
838 | - max_length=200, blank=True, null=True, |
839 | - help_text='Environment this machine is part of.', |
840 | + max_length=200, |
841 | + blank=True, |
842 | + null=True, |
843 | + help_text="Environment this machine is part of.", |
844 | ) |
845 | service_name = models.CharField( |
846 | - max_length=200, blank=True, null=True, |
847 | - help_text='Service this machine is part of. For Juju units, this is the first part of the unit name ' + |
848 | - '(before the slash).', |
849 | + max_length=200, |
850 | + blank=True, |
851 | + null=True, |
852 | + help_text="Service this machine is part of. For Juju units, this is the first part of the unit name " |
853 | + + "(before the slash).", |
854 | ) |
855 | unit_name = models.CharField( |
856 | max_length=200, |
857 | - help_text='Unit name of this machine. For Juju units, this is the full unit name (e.g. "service-name/0"). ' + |
858 | - 'Otherwise, this should be the machine\'s hostname.', |
859 | + help_text='Unit name of this machine. For Juju units, this is the full unit name (e.g. "service-name/0"). ' |
860 | + + "Otherwise, this should be the machine's hostname.", |
861 | ) |
862 | comment = models.CharField( |
863 | - max_length=200, blank=True, null=True, |
864 | - help_text='Human-readable comment.', |
865 | + max_length=200, blank=True, null=True, help_text="Human-readable comment." |
866 | ) |
867 | ssh_public_key = models.CharField( |
868 | max_length=2048, |
869 | - verbose_name='SSH public key', |
870 | - help_text='SSH public key of this machine\'s agent.', |
871 | + verbose_name="SSH public key", |
872 | + help_text="SSH public key of this machine's agent.", |
873 | ) |
874 | auth = models.ForeignKey( |
875 | - Auth, validators=[validate_machine_auth], on_delete=models.CASCADE, |
876 | - help_text='Machine auth used to register this machine.', |
877 | + Auth, |
878 | + validators=[validate_machine_auth], |
879 | + on_delete=models.CASCADE, |
880 | + help_text="Machine auth used to register this machine.", |
881 | ) |
882 | storage = models.ForeignKey( |
883 | - Storage, on_delete=models.CASCADE, |
884 | - help_text='Storage unit this machine is assigned to.', |
885 | + Storage, |
886 | + on_delete=models.CASCADE, |
887 | + help_text="Storage unit this machine is assigned to.", |
888 | ) |
889 | active = models.BooleanField( |
890 | default=True, |
891 | - help_text='Whether this machine is enabled. Disabling removes its key from its storage unit, stops this ' + |
892 | - 'machine from updating its registration, etc.', |
893 | + help_text="Whether this machine is enabled. Disabling removes its key from its storage unit, stops this " |
894 | + + "machine from updating its registration, etc.", |
895 | ) |
896 | published = models.BooleanField( |
897 | default=True, |
898 | - help_text='Whether this machine has been enabled by the machine agent.', |
899 | + help_text="Whether this machine has been enabled by the machine agent.", |
900 | ) |
901 | date_registered = models.DateTimeField( |
902 | - default=timezone.now, |
903 | - help_text='Date/time this machine was registered.', |
904 | + default=timezone.now, help_text="Date/time this machine was registered." |
905 | ) |
906 | date_updated = models.DateTimeField( |
907 | default=timezone.now, |
908 | - help_text='Date/time this machine presented a modified config.', |
909 | + help_text="Date/time this machine presented a modified config.", |
910 | ) |
911 | date_checked_in = models.DateTimeField( |
912 | - blank=True, null=True, |
913 | - help_text='Date/time this machine last checked in.', |
914 | + blank=True, null=True, help_text="Date/time this machine last checked in." |
915 | ) |
916 | |
917 | def __str__(self): |
918 | - return '%s (%s)' % (self.unit_name, self.uuid[0:8]) |
919 | + return "%s (%s)" % (self.unit_name, self.uuid[0:8]) |
920 | |
921 | |
922 | class Source(models.Model): |
923 | @@ -289,174 +296,172 @@ |
924 | return True |
925 | if not self.success: |
926 | return False |
927 | - return (now <= (self.date_next_backup + timedelta(hours=10))) |
928 | + return now <= (self.date_next_backup + timedelta(hours=10)) |
929 | + |
930 | healthy.boolean = True |
931 | |
932 | SNAPSHOT_MODES = ( |
933 | - ('none', 'No snapshotting'), |
934 | - ('attic', 'Attic'), |
935 | - ('link-dest', 'Hardlink trees (rsync --link-dest)'), |
936 | + ("none", "No snapshotting"), |
937 | + ("attic", "Attic"), |
938 | + ("link-dest", "Hardlink trees (rsync --link-dest)"), |
939 | ) |
940 | id = UuidPrimaryKeyField() |
941 | name = models.CharField( |
942 | - max_length=200, |
943 | - help_text='Computer-readable source name identifier.', |
944 | + max_length=200, help_text="Computer-readable source name identifier." |
945 | ) |
946 | machine = models.ForeignKey( |
947 | - Machine, on_delete=models.CASCADE, |
948 | - help_text='Machine this source belongs to.', |
949 | + Machine, on_delete=models.CASCADE, help_text="Machine this source belongs to." |
950 | ) |
951 | comment = models.CharField( |
952 | - max_length=200, blank=True, null=True, |
953 | - help_text='Human-readable comment.', |
954 | + max_length=200, blank=True, null=True, help_text="Human-readable comment." |
955 | ) |
956 | path = models.CharField( |
957 | - max_length=200, |
958 | - help_text='Full filesystem path of this source.', |
959 | + max_length=200, help_text="Full filesystem path of this source." |
960 | ) |
961 | filter = models.CharField( |
962 | - max_length=2048, default='[]', validators=[validate_json_string_list], |
963 | - help_text='JSON list of rsync-compatible --filter options.', |
964 | + max_length=2048, |
965 | + default="[]", |
966 | + validators=[validate_json_string_list], |
967 | + help_text="JSON list of rsync-compatible --filter options.", |
968 | ) |
969 | exclude = models.CharField( |
970 | - max_length=2048, default='[]', validators=[validate_json_string_list], |
971 | - help_text='JSON list of rsync-compatible --exclude options.', |
972 | + max_length=2048, |
973 | + default="[]", |
974 | + validators=[validate_json_string_list], |
975 | + help_text="JSON list of rsync-compatible --exclude options.", |
976 | ) |
977 | frequency = models.CharField( |
978 | - max_length=200, default='daily', |
979 | - help_text='How often to back up this source.', |
980 | + max_length=200, default="daily", help_text="How often to back up this source." |
981 | ) |
982 | retention = models.CharField( |
983 | - max_length=200, default='last 5 days, earliest of month', |
984 | - help_text='Retention schedule, describing when to preserve snapshots.', |
985 | + max_length=200, |
986 | + default="last 5 days, earliest of month", |
987 | + help_text="Retention schedule, describing when to preserve snapshots.", |
988 | ) |
989 | bwlimit = models.CharField( |
990 | max_length=200, |
991 | - blank=True, null=True, |
992 | - verbose_name='bandwidth limit', |
993 | - help_text='Bandwith limit for remote transfer, using the rsync --bwlimit format.', |
994 | + blank=True, |
995 | + null=True, |
996 | + verbose_name="bandwidth limit", |
997 | + help_text="Bandwith limit for remote transfer, using the rsync --bwlimit format.", |
998 | ) |
999 | snapshot_mode = models.CharField( |
1000 | - blank=True, null=True, |
1001 | - max_length=200, choices=SNAPSHOT_MODES, |
1002 | - help_text='Override the storage unit\'s snapshot logic and use an explicit snapshot mode for this source.', |
1003 | + blank=True, |
1004 | + null=True, |
1005 | + max_length=200, |
1006 | + choices=SNAPSHOT_MODES, |
1007 | + help_text="Override the storage unit's snapshot logic and use an explicit snapshot mode for this source.", |
1008 | ) |
1009 | preserve_hard_links = models.BooleanField( |
1010 | default=False, |
1011 | - help_text='Whether to preserve hard links when backing up this source.', |
1012 | + help_text="Whether to preserve hard links when backing up this source.", |
1013 | ) |
1014 | shared_service = models.BooleanField( |
1015 | default=False, |
1016 | - help_text='Whether this source is part of a shared service of multiple machines to be backed up.', |
1017 | + help_text="Whether this source is part of a shared service of multiple machines to be backed up.", |
1018 | ) |
1019 | large_rotating_files = models.BooleanField( |
1020 | default=False, |
1021 | - help_text='Whether this source contains a number of large files which rotate through filenames, e.g. ' + |
1022 | - '"postgresql.1.dump.gz" becomes "postgresql.2.dump.gz".', |
1023 | + help_text="Whether this source contains a number of large files which rotate through filenames, e.g. " |
1024 | + + '"postgresql.1.dump.gz" becomes "postgresql.2.dump.gz".', |
1025 | ) |
1026 | large_modifying_files = models.BooleanField( |
1027 | default=False, |
1028 | - help_text='Whether this source contains a number of large files which grow or are otherwise modified, ' + |
1029 | - 'e.g. log files or filesystem images.', |
1030 | + help_text="Whether this source contains a number of large files which grow or are otherwise modified, " |
1031 | + + "e.g. log files or filesystem images.", |
1032 | ) |
1033 | active = models.BooleanField( |
1034 | default=True, |
1035 | - help_text='Whether this source is enabled. Disabling means the API server no longer gives it to the ' + |
1036 | - 'storage unit, even if it\'s time for a backup.', |
1037 | + help_text="Whether this source is enabled. Disabling means the API server no longer gives it to the " |
1038 | + + "storage unit, even if it's time for a backup.", |
1039 | ) |
1040 | success = models.BooleanField( |
1041 | - default=True, |
1042 | - help_text='Whether this source\'s last backup was successful.', |
1043 | + default=True, help_text="Whether this source's last backup was successful." |
1044 | ) |
1045 | published = models.BooleanField( |
1046 | default=True, |
1047 | - help_text='Whether this source is actively being published by the machine agent.', |
1048 | + help_text="Whether this source is actively being published by the machine agent.", |
1049 | ) |
1050 | date_added = models.DateTimeField( |
1051 | default=timezone.now, |
1052 | - help_text='Date/time this source was first added by the machine agent.', |
1053 | + help_text="Date/time this source was first added by the machine agent.", |
1054 | ) |
1055 | date_updated = models.DateTimeField( |
1056 | default=timezone.now, |
1057 | - help_text='Date/time the machine presented a modified config of this source.', |
1058 | + help_text="Date/time the machine presented a modified config of this source.", |
1059 | ) |
1060 | date_last_backed_up = models.DateTimeField( |
1061 | - blank=True, null=True, |
1062 | - help_text='Date/time this source was last successfully backed up.', |
1063 | + blank=True, |
1064 | + null=True, |
1065 | + help_text="Date/time this source was last successfully backed up.", |
1066 | ) |
1067 | date_next_backup = models.DateTimeField( |
1068 | default=timezone.now, |
1069 | - help_text='Date/time this source is next scheduled to be backed up. Set to now (or in the past) to ' + |
1070 | - 'trigger a backup as soon as possible.', |
1071 | + help_text="Date/time this source is next scheduled to be backed up. Set to now (or in the past) to " |
1072 | + + "trigger a backup as soon as possible.", |
1073 | ) |
1074 | |
1075 | class Meta: |
1076 | - unique_together = (('machine', 'name'),) |
1077 | + unique_together = (("machine", "name"),) |
1078 | |
1079 | def __str__(self): |
1080 | - return '%s %s' % (self.machine.unit_name, self.name) |
1081 | + return "%s %s" % (self.machine.unit_name, self.name) |
1082 | |
1083 | |
1084 | class BackupLog(models.Model): |
1085 | id = UuidPrimaryKeyField() |
1086 | source = models.ForeignKey( |
1087 | - Source, on_delete=models.CASCADE, |
1088 | - help_text='Source this log entry belongs to.', |
1089 | + Source, on_delete=models.CASCADE, help_text="Source this log entry belongs to." |
1090 | ) |
1091 | date = models.DateTimeField( |
1092 | default=timezone.now, |
1093 | - help_text='Date/time this log entry was received/processed.', |
1094 | + help_text="Date/time this log entry was received/processed.", |
1095 | ) |
1096 | storage = models.ForeignKey( |
1097 | - Storage, blank=True, null=True, on_delete=models.CASCADE, |
1098 | - help_text='Storage unit this backup occurred on.', |
1099 | + Storage, |
1100 | + blank=True, |
1101 | + null=True, |
1102 | + on_delete=models.CASCADE, |
1103 | + help_text="Storage unit this backup occurred on.", |
1104 | ) |
1105 | success = models.BooleanField( |
1106 | - default=False, |
1107 | - help_text='Whether this backup succeeded.', |
1108 | + default=False, help_text="Whether this backup succeeded." |
1109 | ) |
1110 | date_begin = models.DateTimeField( |
1111 | - blank=True, null=True, |
1112 | - help_text='Date/time this backup began.', |
1113 | + blank=True, null=True, help_text="Date/time this backup began." |
1114 | ) |
1115 | date_end = models.DateTimeField( |
1116 | - blank=True, null=True, |
1117 | - help_text='Date/time this backup ended.', |
1118 | + blank=True, null=True, help_text="Date/time this backup ended." |
1119 | ) |
1120 | snapshot = models.CharField( |
1121 | - max_length=200, blank=True, null=True, |
1122 | - help_text='Name of the created snapshot.', |
1123 | + max_length=200, blank=True, null=True, help_text="Name of the created snapshot." |
1124 | ) |
1125 | summary = models.TextField( |
1126 | - blank=True, null=True, |
1127 | - help_text='Summary of the backup\'s events.', |
1128 | + blank=True, null=True, help_text="Summary of the backup's events." |
1129 | ) |
1130 | |
1131 | def __str__(self): |
1132 | - return '%s %s' % (str(self.source), self.date.strftime('%Y-%m-%d %H:%M:%S')) |
1133 | + return "%s %s" % (str(self.source), self.date.strftime("%Y-%m-%d %H:%M:%S")) |
1134 | |
1135 | |
1136 | class FilterSet(models.Model): |
1137 | id = UuidPrimaryKeyField() |
1138 | name = models.CharField( |
1139 | - max_length=200, unique=True, |
1140 | - help_text='Name of this filter set.', |
1141 | + max_length=200, unique=True, help_text="Name of this filter set." |
1142 | ) |
1143 | filters = models.TextField( |
1144 | - default='[]', validators=[validate_json_string_list], |
1145 | - help_text='JSON list of this filter set\'s filter rules.', |
1146 | + default="[]", |
1147 | + validators=[validate_json_string_list], |
1148 | + help_text="JSON list of this filter set's filter rules.", |
1149 | ) |
1150 | comment = models.CharField( |
1151 | - max_length=200, blank=True, null=True, |
1152 | - help_text='Human-readable comment.', |
1153 | + max_length=200, blank=True, null=True, help_text="Human-readable comment." |
1154 | ) |
1155 | active = models.BooleanField( |
1156 | - default=True, |
1157 | - help_text='Whether this filter set is enabled.', |
1158 | + default=True, help_text="Whether this filter set is enabled." |
1159 | ) |
1160 | date_added = models.DateTimeField( |
1161 | - default=timezone.now, |
1162 | - help_text='Date/time this filter set was added.', |
1163 | + default=timezone.now, help_text="Date/time this filter set was added." |
1164 | ) |
1165 | |
1166 | def __str__(self): |
1167 | |
1168 | === modified file 'turku_api/settings.py' |
1169 | --- turku_api/settings.py 2020-04-11 21:20:31 +0000 |
1170 | +++ turku_api/settings.py 2020-06-21 23:58:30 +0000 |
1171 | @@ -21,57 +21,59 @@ |
1172 | BASE_DIR = os.path.dirname(os.path.dirname(__file__)) |
1173 | DEBUG = False |
1174 | TEMPLATE_DEBUG = False |
1175 | -ALLOWED_HOSTS = ('*',) |
1176 | +ALLOWED_HOSTS = ("*",) |
1177 | INSTALLED_APPS = ( |
1178 | - 'django.contrib.admin', |
1179 | - 'django.contrib.auth', |
1180 | - 'django.contrib.contenttypes', |
1181 | - 'django.contrib.sessions', |
1182 | - 'django.contrib.messages', |
1183 | - 'django.contrib.staticfiles', |
1184 | - 'turku_api', |
1185 | + "django.contrib.admin", |
1186 | + "django.contrib.auth", |
1187 | + "django.contrib.contenttypes", |
1188 | + "django.contrib.sessions", |
1189 | + "django.contrib.messages", |
1190 | + "django.contrib.staticfiles", |
1191 | + "turku_api", |
1192 | ) |
1193 | MIDDLEWARE = ( |
1194 | - 'django.contrib.sessions.middleware.SessionMiddleware', |
1195 | - 'django.middleware.common.CommonMiddleware', |
1196 | - 'django.middleware.csrf.CsrfViewMiddleware', |
1197 | - 'django.contrib.auth.middleware.AuthenticationMiddleware', |
1198 | - 'django.contrib.messages.middleware.MessageMiddleware', |
1199 | - 'django.middleware.clickjacking.XFrameOptionsMiddleware', |
1200 | + "django.contrib.sessions.middleware.SessionMiddleware", |
1201 | + "django.middleware.common.CommonMiddleware", |
1202 | + "django.middleware.csrf.CsrfViewMiddleware", |
1203 | + "django.contrib.auth.middleware.AuthenticationMiddleware", |
1204 | + "django.contrib.messages.middleware.MessageMiddleware", |
1205 | + "django.middleware.clickjacking.XFrameOptionsMiddleware", |
1206 | ) |
1207 | MIDDLEWARE_CLASSES = MIDDLEWARE # pre-1.10 |
1208 | -ROOT_URLCONF = 'turku_api.urls' |
1209 | -WSGI_APPLICATION = 'turku_api.wsgi.application' |
1210 | -LANGUAGE_CODE = 'en-us' |
1211 | -TIME_ZONE = 'UTC' |
1212 | +ROOT_URLCONF = "turku_api.urls" |
1213 | +WSGI_APPLICATION = "turku_api.wsgi.application" |
1214 | +LANGUAGE_CODE = "en-us" |
1215 | +TIME_ZONE = "UTC" |
1216 | USE_I18N = True |
1217 | USE_L10N = True |
1218 | USE_TZ = True |
1219 | -STATIC_URL = '/static/' |
1220 | +STATIC_URL = "/static/" |
1221 | TEMPLATES = [ |
1222 | { |
1223 | - 'BACKEND': 'django.template.backends.django.DjangoTemplates', |
1224 | - 'DIRS': [os.path.join(BASE_DIR, 'turku_api/templates')], |
1225 | - 'APP_DIRS': True, |
1226 | - 'OPTIONS': { |
1227 | - 'context_processors': [ |
1228 | - 'django.template.context_processors.debug', |
1229 | - 'django.template.context_processors.request', |
1230 | - 'django.contrib.auth.context_processors.auth', |
1231 | - 'django.contrib.messages.context_processors.messages', |
1232 | - ], |
1233 | + "BACKEND": "django.template.backends.django.DjangoTemplates", |
1234 | + "DIRS": [os.path.join(BASE_DIR, "turku_api/templates")], |
1235 | + "APP_DIRS": True, |
1236 | + "OPTIONS": { |
1237 | + "context_processors": [ |
1238 | + "django.template.context_processors.debug", |
1239 | + "django.template.context_processors.request", |
1240 | + "django.contrib.auth.context_processors.auth", |
1241 | + "django.contrib.messages.context_processors.messages", |
1242 | + ] |
1243 | }, |
1244 | - }, |
1245 | + } |
1246 | ] |
1247 | DATABASES = { |
1248 | - 'default': { |
1249 | - 'ENGINE': 'django.db.backends.sqlite3', |
1250 | - 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'), |
1251 | + "default": { |
1252 | + "ENGINE": "django.db.backends.sqlite3", |
1253 | + "NAME": os.path.join(BASE_DIR, "db.sqlite3"), |
1254 | } |
1255 | } |
1256 | -SECRET_KEY = ''.join(random.choice(string.ascii_letters + string.digits) for i in range(30)) |
1257 | +SECRET_KEY = "".join( |
1258 | + random.choice(string.ascii_letters + string.digits) for i in range(30) |
1259 | +) |
1260 | |
1261 | try: |
1262 | - from turku_api.local_settings import * |
1263 | + from turku_api.local_settings import * # noqa: F401,F403 |
1264 | except ImportError: |
1265 | pass |
1266 | |
1267 | === removed file 'turku_api/tests.py' |
1268 | === modified file 'turku_api/urls.py' |
1269 | --- turku_api/urls.py 2020-04-11 21:20:31 +0000 |
1270 | +++ turku_api/urls.py 2020-06-21 23:58:30 +0000 |
1271 | @@ -14,31 +14,49 @@ |
1272 | # License along with this program. If not, see |
1273 | # <http://www.gnu.org/licenses/>. |
1274 | |
1275 | -from django.conf.urls import include, url |
1276 | +from django.conf.urls import url |
1277 | +from django.contrib import admin |
1278 | +from django.views.generic.base import RedirectView |
1279 | + |
1280 | try: |
1281 | from django.urls import reverse_lazy # 1.10+ |
1282 | except ModuleNotFoundError: |
1283 | from django.core.urlresolvers import reverse_lazy # pre-1.10 |
1284 | -from django.views.generic.base import RedirectView |
1285 | + |
1286 | from turku_api import views |
1287 | -from django.contrib import admin |
1288 | |
1289 | |
1290 | admin.autodiscover() |
1291 | |
1292 | urlpatterns = [ |
1293 | - url(r'^$', RedirectView.as_view(url=reverse_lazy('admin:index'))), |
1294 | - url(r'^v1/health$', views.health, name='health'), |
1295 | - url(r'^v1/update_config$', views.update_config, name='update_config'), |
1296 | - url(r'^v1/agent_ping_checkin$', views.agent_ping_checkin, name='agent_ping_checkin'), |
1297 | - url(r'^v1/agent_ping_restore$', views.agent_ping_restore, name='agent_ping_restore'), |
1298 | - url(r'^v1/storage_ping_checkin$', views.storage_ping_checkin, name='storage_ping_checkin'), |
1299 | - url(r'^v1/storage_ping_source_update$', views.storage_ping_source_update, name='storage_ping_source_update'), |
1300 | - url(r'^v1/storage_update_config$', views.storage_update_config, name='storage_update_config'), |
1301 | - url(r'^admin/', admin.site.urls), |
1302 | + url(r"^$", RedirectView.as_view(url=reverse_lazy("admin:index"))), |
1303 | + url(r"^v1/health$", views.health, name="health"), |
1304 | + url(r"^v1/update_config$", views.update_config, name="update_config"), |
1305 | + url( |
1306 | + r"^v1/agent_ping_checkin$", views.agent_ping_checkin, name="agent_ping_checkin" |
1307 | + ), |
1308 | + url( |
1309 | + r"^v1/agent_ping_restore$", views.agent_ping_restore, name="agent_ping_restore" |
1310 | + ), |
1311 | + url( |
1312 | + r"^v1/storage_ping_checkin$", |
1313 | + views.storage_ping_checkin, |
1314 | + name="storage_ping_checkin", |
1315 | + ), |
1316 | + url( |
1317 | + r"^v1/storage_ping_source_update$", |
1318 | + views.storage_ping_source_update, |
1319 | + name="storage_ping_source_update", |
1320 | + ), |
1321 | + url( |
1322 | + r"^v1/storage_update_config$", |
1323 | + views.storage_update_config, |
1324 | + name="storage_update_config", |
1325 | + ), |
1326 | + url(r"^admin/", admin.site.urls), |
1327 | ] |
1328 | |
1329 | try: |
1330 | - from local_urls import * |
1331 | + from local_urls import * # noqa: F401,F403 |
1332 | except ImportError: |
1333 | pass |
1334 | |
1335 | === modified file 'turku_api/views.py' |
1336 | --- turku_api/views.py 2020-03-24 23:07:22 +0000 |
1337 | +++ turku_api/views.py 2020-06-21 23:58:30 +0000 |
1338 | @@ -14,87 +14,107 @@ |
1339 | # License along with this program. If not, see |
1340 | # <http://www.gnu.org/licenses/>. |
1341 | |
1342 | +from datetime import datetime, timedelta |
1343 | +import json |
1344 | +import random |
1345 | + |
1346 | +from django.contrib.auth import hashers |
1347 | +from django.core.exceptions import ValidationError |
1348 | from django.http import ( |
1349 | - HttpResponse, HttpResponseBadRequest, HttpResponseNotAllowed, |
1350 | - HttpResponseForbidden, HttpResponseNotFound, |
1351 | + HttpResponse, |
1352 | + HttpResponseBadRequest, |
1353 | + HttpResponseForbidden, |
1354 | + HttpResponseNotAllowed, |
1355 | + HttpResponseNotFound, |
1356 | ) |
1357 | +from django.utils import timezone |
1358 | from django.views.decorators.csrf import csrf_exempt |
1359 | -from django.utils import timezone |
1360 | -from django.core.exceptions import ValidationError |
1361 | - |
1362 | -from turku_api.models import Auth, Machine, Source, Storage, BackupLog, FilterSet |
1363 | - |
1364 | -import json |
1365 | -import random |
1366 | -from datetime import timedelta, datetime |
1367 | -from django.contrib.auth import hashers |
1368 | + |
1369 | +from turku_api.models import Auth, BackupLog, FilterSet, Machine, Source, Storage |
1370 | |
1371 | |
1372 | def frequency_next_scheduled(frequency, base_time=None): |
1373 | if not base_time: |
1374 | base_time = timezone.now() |
1375 | - f = [x.strip() for x in frequency.split(',')] |
1376 | + f = [x.strip() for x in frequency.split(",")] |
1377 | |
1378 | - if f[0] == 'hourly': |
1379 | - target_time = ( |
1380 | - base_time.replace( |
1381 | - minute=random.randint(0, 59), second=random.randint(0, 59), microsecond=0 |
1382 | - ) + timedelta(hours=1) |
1383 | - ) |
1384 | + if f[0] == "hourly": |
1385 | + target_time = base_time.replace( |
1386 | + minute=random.randint(0, 59), second=random.randint(0, 59), microsecond=0 |
1387 | + ) + timedelta(hours=1) |
1388 | # Push it out 10 minutes if it falls within 10 minutes of now |
1389 | if target_time < (base_time + timedelta(minutes=10)): |
1390 | - target_time = (target_time + timedelta(minutes=10)) |
1391 | + target_time = target_time + timedelta(minutes=10) |
1392 | return target_time |
1393 | |
1394 | today = base_time.replace(hour=0, minute=0, second=0, microsecond=0) |
1395 | - if f[0] == 'daily': |
1396 | + if f[0] == "daily": |
1397 | # Tomorrow |
1398 | - target_date = (today + timedelta(days=1)) |
1399 | - elif f[0] == 'weekly': |
1400 | + target_date = today + timedelta(days=1) |
1401 | + elif f[0] == "weekly": |
1402 | # Random day next week |
1403 | target_day = random.randint(0, 6) |
1404 | - target_date = (today + timedelta(weeks=1) - timedelta(days=((today.weekday() + 1) % 7)) + timedelta(days=target_day)) |
1405 | + target_date = ( |
1406 | + today |
1407 | + + timedelta(weeks=1) |
1408 | + - timedelta(days=((today.weekday() + 1) % 7)) |
1409 | + + timedelta(days=target_day) |
1410 | + ) |
1411 | # Push it out 3 days if it falls within 3 days of now |
1412 | if target_date < (base_time + timedelta(days=3)): |
1413 | - target_date = (target_date + timedelta(days=3)) |
1414 | - elif f[0] in ('sunday', 'monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday'): |
1415 | + target_date = target_date + timedelta(days=3) |
1416 | + elif f[0] in ( |
1417 | + "sunday", |
1418 | + "monday", |
1419 | + "tuesday", |
1420 | + "wednesday", |
1421 | + "thursday", |
1422 | + "friday", |
1423 | + "saturday", |
1424 | + ): |
1425 | # Next Xday |
1426 | day_map = { |
1427 | - 'sunday': 0, |
1428 | - 'monday': 1, |
1429 | - 'tuesday': 2, |
1430 | - 'wednesday': 3, |
1431 | - 'thursday': 4, |
1432 | - 'friday': 5, |
1433 | - 'saturday': 6, |
1434 | + "sunday": 0, |
1435 | + "monday": 1, |
1436 | + "tuesday": 2, |
1437 | + "wednesday": 3, |
1438 | + "thursday": 4, |
1439 | + "friday": 5, |
1440 | + "saturday": 6, |
1441 | } |
1442 | target_day = day_map[f[0]] |
1443 | - target_date = (today - timedelta(days=((today.weekday() + 1) % 7)) + timedelta(days=target_day)) |
1444 | + target_date = ( |
1445 | + today |
1446 | + - timedelta(days=((today.weekday() + 1) % 7)) |
1447 | + + timedelta(days=target_day) |
1448 | + ) |
1449 | if target_date < today: |
1450 | - target_date = (target_date + timedelta(weeks=1)) |
1451 | - elif f[0] == 'monthly': |
1452 | + target_date = target_date + timedelta(weeks=1) |
1453 | + elif f[0] == "monthly": |
1454 | next_month = (today.replace(day=1) + timedelta(days=40)).replace(day=1) |
1455 | month_after = (next_month.replace(day=1) + timedelta(days=40)).replace(day=1) |
1456 | - target_date = (next_month + timedelta(days=random.randint(1, (month_after - next_month).days))) |
1457 | + target_date = next_month + timedelta( |
1458 | + days=random.randint(1, (month_after - next_month).days) |
1459 | + ) |
1460 | # Push it out a week if it falls within a week of now |
1461 | if target_date < (base_time + timedelta(days=7)): |
1462 | - target_date = (target_date + timedelta(days=7)) |
1463 | + target_date = target_date + timedelta(days=7) |
1464 | else: |
1465 | # Fall back to tomorrow |
1466 | - target_date = (today + timedelta(days=1)) |
1467 | + target_date = today + timedelta(days=1) |
1468 | |
1469 | if len(f) == 1: |
1470 | - return (target_date + timedelta(seconds=random.randint(0, 86399))) |
1471 | - time_range = f[1].split('-') |
1472 | + return target_date + timedelta(seconds=random.randint(0, 86399)) |
1473 | + time_range = f[1].split("-") |
1474 | start = (int(time_range[0][0:2]) * 60 * 60) + (int(time_range[0][2:4]) * 60) |
1475 | if len(time_range) == 1: |
1476 | # Not a range |
1477 | - return (target_date + timedelta(seconds=start)) |
1478 | + return target_date + timedelta(seconds=start) |
1479 | end = (int(time_range[1][0:2]) * 60 * 60) + (int(time_range[1][2:4]) * 60) |
1480 | if end < start: |
1481 | # Day rollover |
1482 | end = end + 86400 |
1483 | - return (target_date + timedelta(seconds=random.randint(start, end))) |
1484 | + return target_date + timedelta(seconds=random.randint(start, end)) |
1485 | |
1486 | |
1487 | def random_weighted(m): |
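The reformatted frequency_next_scheduled() above accepts strings such as "hourly", "daily", "weekly", a weekday name, or "monthly", optionally followed by a comma and an HHMM or HHMM-HHMM window. A quick sanity check, as a sketch only (it must run in a configured Django environment, since the function calls django.utils.timezone.now()):

    # Sketch: run inside a configured Django shell, because
    # frequency_next_scheduled() calls django.utils.timezone.now().
    from turku_api.views import frequency_next_scheduled

    print(frequency_next_scheduled("daily"))             # some time tomorrow
    print(frequency_next_scheduled("weekly,0200-0400"))  # next week, between 02:00 and 04:00
    print(frequency_next_scheduled("monthly"))           # a random day next month, at least a week out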
1488 | @@ -115,8 +135,9 @@ |
1489 | |
1490 | def get_repo_revision(): |
1491 | import os |
1492 | + |
1493 | base_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) |
1494 | - if os.path.isdir(os.path.join(base_dir, '.bzr')): |
1495 | + if os.path.isdir(os.path.join(base_dir, ".bzr")): |
1496 | try: |
1497 | import bzrlib.errors |
1498 | from bzrlib.branch import Branch |
1499 | @@ -137,17 +158,22 @@ |
1500 | return repr(self.message) |
1501 | |
1502 | |
1503 | -class ViewV1(): |
1504 | +class ViewV1: |
1505 | def __init__(self, django_request): |
1506 | self.django_request = django_request |
1507 | self._parse_json_post() |
1508 | |
1509 | def _parse_json_post(self): |
1510 | # Require JSON POST |
1511 | - if not self.django_request.method == 'POST': |
1512 | - raise HttpResponseException(HttpResponseNotAllowed(['POST'])) |
1513 | - if not (('CONTENT_TYPE' in self.django_request.META) and (self.django_request.META['CONTENT_TYPE'] == 'application/json')): |
1514 | - raise HttpResponseException(HttpResponseBadRequest('Bad Content-Type (expected application/json)')) |
1515 | + if not self.django_request.method == "POST": |
1516 | + raise HttpResponseException(HttpResponseNotAllowed(["POST"])) |
1517 | + if not ( |
1518 | + ("CONTENT_TYPE" in self.django_request.META) |
1519 | + and (self.django_request.META["CONTENT_TYPE"] == "application/json") |
1520 | + ): |
1521 | + raise HttpResponseException( |
1522 | + HttpResponseBadRequest("Bad Content-Type (expected application/json)") |
1523 | + ) |
1524 | |
1525 | # Load the POSTed JSON |
1526 | try: |
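ViewV1._parse_json_post() above pins down the v1 wire contract: POST only, Content-Type: application/json, JSON body. A hypothetical agent check-in against a local instance would then look like this; the host, UUID, and secret are placeholders, not values from this branch:

    # Sketch of an agent check-in; endpoint and payload shape follow the
    # views in this diff, but host/uuid/secret are placeholders.
    import json
    import urllib.request

    payload = {
        "machine": {
            "uuid": "00000000-0000-0000-0000-000000000000",
            "secret": "machine-secret",
        }
    }
    req = urllib.request.Request(
        "http://localhost:8000/v1/agent_ping_checkin",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},  # required by _parse_json_post()
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["machine"]["scheduled_sources"])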
1527 | @@ -157,80 +183,103 @@ |
1528 | |
1529 | def _storage_authenticate(self): |
1530 | # Check for storage auth |
1531 | - if 'storage' not in self.req: |
1532 | - raise HttpResponseException(HttpResponseBadRequest('Missing required option "storage"')) |
1533 | - for k in ('name', 'secret'): |
1534 | - if k not in self.req['storage']: |
1535 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1536 | + if "storage" not in self.req: |
1537 | + raise HttpResponseException( |
1538 | + HttpResponseBadRequest('Missing required option "storage"') |
1539 | + ) |
1540 | + for k in ("name", "secret"): |
1541 | + if k not in self.req["storage"]: |
1542 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1543 | try: |
1544 | - self.storage = Storage.objects.get(name=self.req['storage']['name'], active=True) |
1545 | + self.storage = Storage.objects.get( |
1546 | + name=self.req["storage"]["name"], active=True |
1547 | + ) |
1548 | except Storage.DoesNotExist: |
1549 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1550 | - if not hashers.check_password(self.req['storage']['secret'], self.storage.secret_hash): |
1551 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1552 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1553 | + if not hashers.check_password( |
1554 | + self.req["storage"]["secret"], self.storage.secret_hash |
1555 | + ): |
1556 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1557 | |
1558 | def _storage_get_machine(self): |
1559 | # Make sure these exist in the request |
1560 | - if 'machine' not in self.req: |
1561 | - raise HttpResponseException(HttpResponseBadRequest('Missing required option "machine"')) |
1562 | - if 'uuid' not in self.req['machine']: |
1563 | - raise HttpResponseException(HttpResponseBadRequest('Missing required option "machine.uuid"')) |
1564 | + if "machine" not in self.req: |
1565 | + raise HttpResponseException( |
1566 | + HttpResponseBadRequest('Missing required option "machine"') |
1567 | + ) |
1568 | + if "uuid" not in self.req["machine"]: |
1569 | + raise HttpResponseException( |
1570 | + HttpResponseBadRequest('Missing required option "machine.uuid"') |
1571 | + ) |
1572 | |
1573 | # Create or load the machine |
1574 | try: |
1575 | - return Machine.objects.get(uuid=self.req['machine']['uuid'], storage=self.storage, active=True, published=True) |
1576 | + return Machine.objects.get( |
1577 | + uuid=self.req["machine"]["uuid"], |
1578 | + storage=self.storage, |
1579 | + active=True, |
1580 | + published=True, |
1581 | + ) |
1582 | except Machine.DoesNotExist: |
1583 | - raise HttpResponseException(HttpResponseNotFound('Machine not found')) |
1584 | + raise HttpResponseException(HttpResponseNotFound("Machine not found")) |
1585 | |
1586 | def get_registration_auth(self, secret_type): |
1587 | # Check for global auth |
1588 | - if 'auth' not in self.req: |
1589 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1590 | - if isinstance(self.req['auth'], dict): |
1591 | - if not (('name' in self.req['auth']) and ('secret' in self.req['auth'])): |
1592 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1593 | + if "auth" not in self.req: |
1594 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1595 | + if isinstance(self.req["auth"], dict): |
1596 | + if not (("name" in self.req["auth"]) and ("secret" in self.req["auth"])): |
1597 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1598 | try: |
1599 | - a = Auth.objects.get(name=self.req['auth']['name'], secret_type=secret_type, active=True) |
1600 | + a = Auth.objects.get( |
1601 | + name=self.req["auth"]["name"], secret_type=secret_type, active=True |
1602 | + ) |
1603 | except Auth.DoesNotExist: |
1604 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1605 | - if hashers.check_password(self.req['auth']['secret'], a.secret_hash): |
1606 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1607 | + if hashers.check_password(self.req["auth"]["secret"], a.secret_hash): |
1608 | return a |
1609 | else: |
1610 | # XXX inefficient but temporary (legacy) |
1611 | for a in Auth.objects.filter(secret_type=secret_type, active=True): |
1612 | - if hashers.check_password(self.req['auth'], a.secret_hash): |
1613 | + if hashers.check_password(self.req["auth"], a.secret_hash): |
1614 | return a |
1615 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1616 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1617 | |
1618 | def update_config(self): |
1619 | - if not (('machine' in self.req) and (isinstance(self.req['machine'], dict))): |
1620 | - raise HttpResponseException(HttpResponseBadRequest('"machine" dict required')) |
1621 | - req_machine = self.req['machine'] |
1622 | + if not (("machine" in self.req) and (isinstance(self.req["machine"], dict))): |
1623 | + raise HttpResponseException( |
1624 | + HttpResponseBadRequest('"machine" dict required') |
1625 | + ) |
1626 | + req_machine = self.req["machine"] |
1627 | |
1628 | # Make sure these exist in the request (validation comes later) |
1629 | - for k in ('uuid', 'secret'): |
1630 | + for k in ("uuid", "secret"): |
1631 | if k not in req_machine: |
1632 | - raise HttpResponseException(HttpResponseBadRequest('Missing required machine option "%s"' % k)) |
1633 | + raise HttpResponseException( |
1634 | + HttpResponseBadRequest('Missing required machine option "%s"' % k) |
1635 | + ) |
1636 | |
1637 | # Create or load the machine |
1638 | try: |
1639 | - m = Machine.objects.get(uuid=req_machine['uuid'], active=True) |
1640 | + m = Machine.objects.get(uuid=req_machine["uuid"], active=True) |
1641 | modified = False |
1642 | except Machine.DoesNotExist: |
1643 | - m = Machine(uuid=req_machine['uuid']) |
1644 | - m.secret_hash = hashers.make_password(req_machine['secret']) |
1645 | - m.auth = self.get_registration_auth('machine_reg') |
1646 | + m = Machine(uuid=req_machine["uuid"]) |
1647 | + m.secret_hash = hashers.make_password(req_machine["secret"]) |
1648 | + m.auth = self.get_registration_auth("machine_reg") |
1649 | modified = True |
1650 | |
1651 | # If the machine existed before, it had a secret. Make sure that |
1652 | # hasn't changed. |
1653 | - if not hashers.check_password(req_machine['secret'], m.secret_hash): |
1654 | - raise HttpResponseException(HttpResponseForbidden('Bad secret for existing machine')) |
1655 | + if not hashers.check_password(req_machine["secret"], m.secret_hash): |
1656 | + raise HttpResponseException( |
1657 | + HttpResponseForbidden("Bad secret for existing machine") |
1658 | + ) |
1659 | |
1660 | # Change the machine published status if needed |
1661 | - if ('published' in req_machine): |
1662 | - if req_machine['published'] != m.published: |
1663 | - m.published = req_machine['published'] |
1664 | + if "published" in req_machine: |
1665 | + if req_machine["published"] != m.published: |
1666 | + m.published = req_machine["published"] |
1667 | modified = True |
1668 | else: |
1669 | # If not present, default to want published |
1670 | @@ -251,11 +300,19 @@ |
1671 | m.storage = random_weighted(weights) |
1672 | modified = True |
1673 | except IndexError: |
1674 | - raise HttpResponseException(HttpResponseNotFound('No storages are currently available')) |
1675 | + raise HttpResponseException( |
1676 | + HttpResponseNotFound("No storages are currently available") |
1677 | + ) |
1678 | |
1679 | # If any of these exist in the request, add or update them in the |
1680 | # machine. |
1681 | - for k in ('environment_name', 'service_name', 'unit_name', 'comment', 'ssh_public_key'): |
1682 | + for k in ( |
1683 | + "environment_name", |
1684 | + "service_name", |
1685 | + "unit_name", |
1686 | + "comment", |
1687 | + "ssh_public_key", |
1688 | + ): |
1689 | if (k in req_machine) and (getattr(m, k) != req_machine[k]): |
1690 | setattr(m, k, req_machine[k]) |
1691 | modified = True |
1692 | @@ -266,18 +323,24 @@ |
1693 | try: |
1694 | m.full_clean() |
1695 | except ValidationError as e: |
1696 | - raise HttpResponseException(HttpResponseBadRequest('Validation error: %s' % str(e))) |
1697 | + raise HttpResponseException( |
1698 | + HttpResponseBadRequest("Validation error: %s" % str(e)) |
1699 | + ) |
1700 | m.save() |
1701 | |
1702 | - if 'sources' in req_machine: |
1703 | - req_sources = req_machine['sources'] |
1704 | + if "sources" in req_machine: |
1705 | + req_sources = req_machine["sources"] |
1706 | if not isinstance(req_sources, dict): |
1707 | - raise HttpResponseException(HttpResponseBadRequest('Invalid type for "sources"')) |
1708 | - elif 'sources' in self.req: |
1709 | + raise HttpResponseException( |
1710 | + HttpResponseBadRequest('Invalid type for "sources"') |
1711 | + ) |
1712 | + elif "sources" in self.req: |
1713 | # XXX legacy |
1714 | - req_sources = self.req['sources'] |
1715 | + req_sources = self.req["sources"] |
1716 | if not isinstance(req_sources, dict): |
1717 | - raise HttpResponseException(HttpResponseBadRequest('Invalid type for "sources"')) |
1718 | + raise HttpResponseException( |
1719 | + HttpResponseBadRequest('Invalid type for "sources"') |
1720 | + ) |
1721 | else: |
1722 | req_sources = {} |
1723 | |
1724 | @@ -291,17 +354,27 @@ |
1725 | |
1726 | modified = False |
1727 | for k in ( |
1728 | - 'path', 'frequency', 'retention', |
1729 | - 'comment', 'shared_service', 'large_rotating_files', |
1730 | - 'large_modifying_files', 'bwlimit', 'snapshot_mode', |
1731 | - 'preserve_hard_links', |
1732 | + "path", |
1733 | + "frequency", |
1734 | + "retention", |
1735 | + "comment", |
1736 | + "shared_service", |
1737 | + "large_rotating_files", |
1738 | + "large_modifying_files", |
1739 | + "bwlimit", |
1740 | + "snapshot_mode", |
1741 | + "preserve_hard_links", |
1742 | ): |
1743 | - if (k in req_sources[s.name]) and (getattr(s, k) != req_sources[s.name][k]): |
1744 | + if (k in req_sources[s.name]) and ( |
1745 | + getattr(s, k) != req_sources[s.name][k] |
1746 | + ): |
1747 | setattr(s, k, req_sources[s.name][k]) |
1748 | - if k == 'frequency': |
1749 | - s.date_next_backup = frequency_next_scheduled(req_sources[s.name][k]) |
1750 | + if k == "frequency": |
1751 | + s.date_next_backup = frequency_next_scheduled( |
1752 | + req_sources[s.name][k] |
1753 | + ) |
1754 | modified = True |
1755 | - for k in ('filter', 'exclude'): |
1756 | + for k in ("filter", "exclude"): |
1757 | if k not in req_sources[s.name]: |
1758 | continue |
1759 | v = json.dumps(req_sources[s.name][k], sort_keys=True) |
1760 | @@ -315,7 +388,9 @@ |
1761 | try: |
1762 | s.full_clean() |
1763 | except ValidationError as e: |
1764 | - raise HttpResponseException(HttpResponseBadRequest('Validation error: %s' % str(e))) |
1765 | + raise HttpResponseException( |
1766 | + HttpResponseBadRequest("Validation error: %s" % str(e)) |
1767 | + ) |
1768 | s.save() |
1769 | |
1770 | for name in req_sources: |
1771 | @@ -326,15 +401,21 @@ |
1772 | s.machine = m |
1773 | |
1774 | for k in ( |
1775 | - 'path', 'frequency', 'retention', |
1776 | - 'comment', 'shared_service', 'large_rotating_files', |
1777 | - 'large_modifying_files', 'bwlimit', 'snapshot_mode', |
1778 | - 'preserve_hard_links', |
1779 | + "path", |
1780 | + "frequency", |
1781 | + "retention", |
1782 | + "comment", |
1783 | + "shared_service", |
1784 | + "large_rotating_files", |
1785 | + "large_modifying_files", |
1786 | + "bwlimit", |
1787 | + "snapshot_mode", |
1788 | + "preserve_hard_links", |
1789 | ): |
1790 | if k not in req_sources[s.name]: |
1791 | continue |
1792 | setattr(s, k, req_sources[s.name][k]) |
1793 | - for k in ('filter', 'exclude'): |
1794 | + for k in ("filter", "exclude"): |
1795 | if k not in req_sources[s.name]: |
1796 | continue |
1797 | v = json.dumps(req_sources[s.name][k], sort_keys=True) |
1798 | @@ -346,18 +427,20 @@ |
1799 | try: |
1800 | s.full_clean() |
1801 | except ValidationError as e: |
1802 | - raise HttpResponseException(HttpResponseBadRequest('Validation error: %s' % str(e))) |
1803 | + raise HttpResponseException( |
1804 | + HttpResponseBadRequest("Validation error: %s" % str(e)) |
1805 | + ) |
1806 | s.save() |
1807 | |
1808 | # XXX legacy |
1809 | out = { |
1810 | - 'storage_name': m.storage.name, |
1811 | - 'ssh_ping_host': m.storage.ssh_ping_host, |
1812 | - 'ssh_ping_host_keys': json.loads(m.storage.ssh_ping_host_keys), |
1813 | - 'ssh_ping_port': m.storage.ssh_ping_port, |
1814 | - 'ssh_ping_user': m.storage.ssh_ping_user, |
1815 | + "storage_name": m.storage.name, |
1816 | + "ssh_ping_host": m.storage.ssh_ping_host, |
1817 | + "ssh_ping_host_keys": json.loads(m.storage.ssh_ping_host_keys), |
1818 | + "ssh_ping_port": m.storage.ssh_ping_port, |
1819 | + "ssh_ping_user": m.storage.ssh_ping_user, |
1820 | } |
1821 | - return HttpResponse(json.dumps(out), content_type='application/json') |
1822 | + return HttpResponse(json.dumps(out), content_type="application/json") |
1823 | |
1824 | def build_filters(self, set, loaded_sets=None): |
1825 | if not loaded_sets: |
1826 | @@ -365,10 +448,10 @@ |
1827 | out = [] |
1828 | for f in set: |
1829 | try: |
1830 | - (verb, subsetname) = f.split(' ', 1) |
1831 | + (verb, subsetname) = f.split(" ", 1) |
1832 | except ValueError: |
1833 | continue |
1834 | - if verb in ('merge', '.'): |
1835 | + if verb in ("merge", "."): |
1836 | if subsetname in loaded_sets: |
1837 | continue |
1838 | try: |
1839 | @@ -379,10 +462,22 @@ |
1840 | out.append(f2) |
1841 | loaded_sets.append(subsetname) |
1842 | elif verb in ( |
1843 | - 'dir-merge', ':', 'clear', '!', |
1844 | - 'exclude', '-', 'include', '+', |
1845 | - 'hide', 'H', 'show', 'S', |
1846 | - 'protect', 'P', 'risk', 'R', |
1847 | + "dir-merge", |
1848 | + ":", |
1849 | + "clear", |
1850 | + "!", |
1851 | + "exclude", |
1852 | + "-", |
1853 | + "include", |
1854 | + "+", |
1855 | + "hide", |
1856 | + "H", |
1857 | + "show", |
1858 | + "S", |
1859 | + "protect", |
1860 | + "P", |
1861 | + "risk", |
1862 | + "R", |
1863 | ): |
1864 | out.append(f) |
1865 | return out |
1866 | @@ -390,109 +485,117 @@ |
1867 | def get_checkin_scheduled_sources(self, m): |
1868 | scheduled_sources = {} |
1869 | now = timezone.now() |
1870 | - for s in m.source_set.filter(date_next_backup__lte=now, active=True, published=True): |
1871 | + for s in m.source_set.filter( |
1872 | + date_next_backup__lte=now, active=True, published=True |
1873 | + ): |
1874 | scheduled_sources[s.name] = { |
1875 | - 'path': s.path, |
1876 | - 'retention': s.retention, |
1877 | - 'bwlimit': s.bwlimit, |
1878 | - 'filter': self.build_filters(json.loads(s.filter)), |
1879 | - 'exclude': json.loads(s.exclude), |
1880 | - 'shared_service': s.shared_service, |
1881 | - 'large_rotating_files': s.large_rotating_files, |
1882 | - 'large_modifying_files': s.large_modifying_files, |
1883 | - 'snapshot_mode': s.snapshot_mode, |
1884 | - 'preserve_hard_links': s.preserve_hard_links, |
1885 | - 'storage': { |
1886 | - 'name': s.machine.storage.name, |
1887 | - 'ssh_ping_host': s.machine.storage.ssh_ping_host, |
1888 | - 'ssh_ping_host_keys': json.loads(s.machine.storage.ssh_ping_host_keys), |
1889 | - 'ssh_ping_port': s.machine.storage.ssh_ping_port, |
1890 | - 'ssh_ping_user': s.machine.storage.ssh_ping_user, |
1891 | - } |
1892 | + "path": s.path, |
1893 | + "retention": s.retention, |
1894 | + "bwlimit": s.bwlimit, |
1895 | + "filter": self.build_filters(json.loads(s.filter)), |
1896 | + "exclude": json.loads(s.exclude), |
1897 | + "shared_service": s.shared_service, |
1898 | + "large_rotating_files": s.large_rotating_files, |
1899 | + "large_modifying_files": s.large_modifying_files, |
1900 | + "snapshot_mode": s.snapshot_mode, |
1901 | + "preserve_hard_links": s.preserve_hard_links, |
1902 | + "storage": { |
1903 | + "name": s.machine.storage.name, |
1904 | + "ssh_ping_host": s.machine.storage.ssh_ping_host, |
1905 | + "ssh_ping_host_keys": json.loads( |
1906 | + s.machine.storage.ssh_ping_host_keys |
1907 | + ), |
1908 | + "ssh_ping_port": s.machine.storage.ssh_ping_port, |
1909 | + "ssh_ping_user": s.machine.storage.ssh_ping_user, |
1910 | + }, |
1911 | } |
1912 | return scheduled_sources |
1913 | |
1914 | def agent_ping_checkin(self): |
1915 | - if not (('machine' in self.req) and (isinstance(self.req['machine'], dict))): |
1916 | - raise HttpResponseException(HttpResponseBadRequest('"machine" dict required')) |
1917 | - req_machine = self.req['machine'] |
1918 | + if not (("machine" in self.req) and (isinstance(self.req["machine"], dict))): |
1919 | + raise HttpResponseException( |
1920 | + HttpResponseBadRequest('"machine" dict required') |
1921 | + ) |
1922 | + req_machine = self.req["machine"] |
1923 | |
1924 | # Make sure these exist in the request |
1925 | - for k in ('uuid', 'secret'): |
1926 | + for k in ("uuid", "secret"): |
1927 | if k not in req_machine: |
1928 | - raise HttpResponseException(HttpResponseBadRequest('Missing required machine option "%s"' % k)) |
1929 | + raise HttpResponseException( |
1930 | + HttpResponseBadRequest('Missing required machine option "%s"' % k) |
1931 | + ) |
1932 | |
1933 | # Load the machine |
1934 | try: |
1935 | - m = Machine.objects.get(uuid=req_machine['uuid'], active=True, published=True) |
1936 | + m = Machine.objects.get( |
1937 | + uuid=req_machine["uuid"], active=True, published=True |
1938 | + ) |
1939 | except Machine.DoesNotExist: |
1940 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1941 | - if not hashers.check_password(req_machine['secret'], m.secret_hash): |
1942 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1943 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1944 | + if not hashers.check_password(req_machine["secret"], m.secret_hash): |
1945 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1946 | |
1947 | scheduled_sources = self.get_checkin_scheduled_sources(m) |
1948 | now = timezone.now() |
1949 | |
1950 | - out = { |
1951 | - 'machine': { |
1952 | - 'scheduled_sources': scheduled_sources, |
1953 | - }, |
1954 | - } |
1955 | + out = {"machine": {"scheduled_sources": scheduled_sources}} |
1956 | |
1957 | # XXX legacy |
1958 | - out['scheduled_sources'] = scheduled_sources |
1959 | + out["scheduled_sources"] = scheduled_sources |
1960 | |
1961 | m.date_checked_in = now |
1962 | m.save() |
1963 | - return HttpResponse(json.dumps(out), content_type='application/json') |
1964 | + return HttpResponse(json.dumps(out), content_type="application/json") |
1965 | |
1966 | def agent_ping_restore(self): |
1967 | - if not (('machine' in self.req) and (isinstance(self.req['machine'], dict))): |
1968 | - raise HttpResponseException(HttpResponseBadRequest('"machine" dict required')) |
1969 | - req_machine = self.req['machine'] |
1970 | + if not (("machine" in self.req) and (isinstance(self.req["machine"], dict))): |
1971 | + raise HttpResponseException( |
1972 | + HttpResponseBadRequest('"machine" dict required') |
1973 | + ) |
1974 | + req_machine = self.req["machine"] |
1975 | |
1976 | # Make sure these exist in the request |
1977 | - for k in ('uuid', 'secret'): |
1978 | + for k in ("uuid", "secret"): |
1979 | if k not in req_machine: |
1980 | - raise HttpResponseException(HttpResponseBadRequest('Missing required machine option "%s"' % k)) |
1981 | + raise HttpResponseException( |
1982 | + HttpResponseBadRequest('Missing required machine option "%s"' % k) |
1983 | + ) |
1984 | |
1985 | # Load the machine |
1986 | try: |
1987 | - m = Machine.objects.get(uuid=req_machine['uuid'], active=True) |
1988 | + m = Machine.objects.get(uuid=req_machine["uuid"], active=True) |
1989 | except Machine.DoesNotExist: |
1990 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1991 | - if not hashers.check_password(req_machine['secret'], m.secret_hash): |
1992 | - raise HttpResponseException(HttpResponseForbidden('Bad auth')) |
1993 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1994 | + if not hashers.check_password(req_machine["secret"], m.secret_hash): |
1995 | + raise HttpResponseException(HttpResponseForbidden("Bad auth")) |
1996 | |
1997 | sources = {} |
1998 | for s in m.source_set.filter(active=True): |
1999 | sources[s.name] = { |
2000 | - 'path': s.path, |
2001 | - 'retention': s.retention, |
2002 | - 'bwlimit': s.bwlimit, |
2003 | - 'filter': self.build_filters(json.loads(s.filter)), |
2004 | - 'exclude': json.loads(s.exclude), |
2005 | - 'shared_service': s.shared_service, |
2006 | - 'large_rotating_files': s.large_rotating_files, |
2007 | - 'large_modifying_files': s.large_modifying_files, |
2008 | - 'snapshot_mode': s.snapshot_mode, |
2009 | - 'preserve_hard_links': s.preserve_hard_links, |
2010 | - 'storage': { |
2011 | - 'name': s.machine.storage.name, |
2012 | - 'ssh_ping_host': s.machine.storage.ssh_ping_host, |
2013 | - 'ssh_ping_host_keys': json.loads(s.machine.storage.ssh_ping_host_keys), |
2014 | - 'ssh_ping_port': s.machine.storage.ssh_ping_port, |
2015 | - 'ssh_ping_user': s.machine.storage.ssh_ping_user, |
2016 | - } |
2017 | + "path": s.path, |
2018 | + "retention": s.retention, |
2019 | + "bwlimit": s.bwlimit, |
2020 | + "filter": self.build_filters(json.loads(s.filter)), |
2021 | + "exclude": json.loads(s.exclude), |
2022 | + "shared_service": s.shared_service, |
2023 | + "large_rotating_files": s.large_rotating_files, |
2024 | + "large_modifying_files": s.large_modifying_files, |
2025 | + "snapshot_mode": s.snapshot_mode, |
2026 | + "preserve_hard_links": s.preserve_hard_links, |
2027 | + "storage": { |
2028 | + "name": s.machine.storage.name, |
2029 | + "ssh_ping_host": s.machine.storage.ssh_ping_host, |
2030 | + "ssh_ping_host_keys": json.loads( |
2031 | + s.machine.storage.ssh_ping_host_keys |
2032 | + ), |
2033 | + "ssh_ping_port": s.machine.storage.ssh_ping_port, |
2034 | + "ssh_ping_user": s.machine.storage.ssh_ping_user, |
2035 | + }, |
2036 | } |
2037 | |
2038 | - out = { |
2039 | - 'machine': { |
2040 | - 'sources': sources, |
2041 | - }, |
2042 | - } |
2043 | + out = {"machine": {"sources": sources}} |
2044 | |
2045 | - return HttpResponse(json.dumps(out), content_type='application/json') |
2046 | + return HttpResponse(json.dumps(out), content_type="application/json") |
2047 | |
2048 | def storage_ping_checkin(self): |
2049 | self._storage_authenticate() |
2050 | @@ -502,32 +605,34 @@ |
2051 | now = timezone.now() |
2052 | |
2053 | out = { |
2054 | - 'machine': { |
2055 | - 'uuid': m.uuid, |
2056 | - 'environment_name': m.environment_name, |
2057 | - 'service_name': m.service_name, |
2058 | - 'unit_name': m.unit_name, |
2059 | - 'scheduled_sources': scheduled_sources, |
2060 | - }, |
2061 | + "machine": { |
2062 | + "uuid": m.uuid, |
2063 | + "environment_name": m.environment_name, |
2064 | + "service_name": m.service_name, |
2065 | + "unit_name": m.unit_name, |
2066 | + "scheduled_sources": scheduled_sources, |
2067 | + } |
2068 | } |
2069 | m.date_checked_in = now |
2070 | m.save() |
2071 | - return HttpResponse(json.dumps(out), content_type='application/json') |
2072 | + return HttpResponse(json.dumps(out), content_type="application/json") |
2073 | |
2074 | def storage_ping_source_update(self): |
2075 | self._storage_authenticate() |
2076 | m = self._storage_get_machine() |
2077 | |
2078 | - if 'sources' not in self.req['machine']: |
2079 | - raise HttpResponseException(HttpResponseBadRequest('Missing required option "machine.sources"')) |
2080 | - for source_name in self.req['machine']['sources']: |
2081 | - source_data = self.req['machine']['sources'][source_name] |
2082 | + if "sources" not in self.req["machine"]: |
2083 | + raise HttpResponseException( |
2084 | + HttpResponseBadRequest('Missing required option "machine.sources"') |
2085 | + ) |
2086 | + for source_name in self.req["machine"]["sources"]: |
2087 | + source_data = self.req["machine"]["sources"][source_name] |
2088 | try: |
2089 | s = m.source_set.get(name=source_name, active=True, published=True) |
2090 | except Source.DoesNotExist: |
2091 | - raise HttpResponseException(HttpResponseNotFound('Source not found')) |
2092 | + raise HttpResponseException(HttpResponseNotFound("Source not found")) |
2093 | now = timezone.now() |
2094 | - is_success = ('success' in source_data and source_data['success']) |
2095 | + is_success = "success" in source_data and source_data["success"] |
2096 | s.success = is_success |
2097 | if is_success: |
2098 | s.date_last_backed_up = now |
2099 | @@ -538,46 +643,63 @@ |
2100 | bl.date = now |
2101 | bl.storage = self.storage |
2102 | bl.success = is_success |
2103 | - if 'snapshot' in source_data: |
2104 | - bl.snapshot = source_data['snapshot'] |
2105 | - if 'summary' in source_data: |
2106 | - bl.summary = source_data['summary'] |
2107 | - if 'time_begin' in source_data: |
2108 | - bl.date_begin = timezone.make_aware(datetime.utcfromtimestamp(source_data['time_begin']), timezone.utc) |
2109 | - if 'time_end' in source_data: |
2110 | - bl.date_end = timezone.make_aware(datetime.utcfromtimestamp(source_data['time_end']), timezone.utc) |
2111 | + if "snapshot" in source_data: |
2112 | + bl.snapshot = source_data["snapshot"] |
2113 | + if "summary" in source_data: |
2114 | + bl.summary = source_data["summary"] |
2115 | + if "time_begin" in source_data: |
2116 | + bl.date_begin = timezone.make_aware( |
2117 | + datetime.utcfromtimestamp(source_data["time_begin"]), timezone.utc |
2118 | + ) |
2119 | + if "time_end" in source_data: |
2120 | + bl.date_end = timezone.make_aware( |
2121 | + datetime.utcfromtimestamp(source_data["time_end"]), timezone.utc |
2122 | + ) |
2123 | bl.save() |
2124 | - return HttpResponse(json.dumps({}), content_type='application/json') |
2125 | + return HttpResponse(json.dumps({}), content_type="application/json") |
2126 | |
2127 | def storage_update_config(self): |
2128 | - if not (('storage' in self.req) and (isinstance(self.req['storage'], dict))): |
2129 | - raise HttpResponseException(HttpResponseBadRequest('"storage" dict required')) |
2130 | - req_storage = self.req['storage'] |
2131 | + if not (("storage" in self.req) and (isinstance(self.req["storage"], dict))): |
2132 | + raise HttpResponseException( |
2133 | + HttpResponseBadRequest('"storage" dict required') |
2134 | + ) |
2135 | + req_storage = self.req["storage"] |
2136 | |
2137 | # Make sure these exist in the request (validation comes later) |
2138 | - for k in ('name', 'secret', 'ssh_ping_host', 'ssh_ping_port', 'ssh_ping_user', 'ssh_ping_host_keys'): |
2139 | + for k in ( |
2140 | + "name", |
2141 | + "secret", |
2142 | + "ssh_ping_host", |
2143 | + "ssh_ping_port", |
2144 | + "ssh_ping_user", |
2145 | + "ssh_ping_host_keys", |
2146 | + ): |
2147 | if k not in req_storage: |
2148 | - raise HttpResponseException(HttpResponseBadRequest('Missing required storage option "%s"' % k)) |
2149 | + raise HttpResponseException( |
2150 | + HttpResponseBadRequest('Missing required storage option "%s"' % k) |
2151 | + ) |
2152 | |
2153 | # Create or load the storage |
2154 | try: |
2155 | - self.storage = Storage.objects.get(name=req_storage['name'], active=True) |
2156 | + self.storage = Storage.objects.get(name=req_storage["name"], active=True) |
2157 | modified = False |
2158 | except Storage.DoesNotExist: |
2159 | - self.storage = Storage(name=req_storage['name']) |
2160 | - self.storage.secret_hash = hashers.make_password(req_storage['secret']) |
2161 | - self.storage.auth = self.get_registration_auth('storage_reg') |
2162 | + self.storage = Storage(name=req_storage["name"]) |
2163 | + self.storage.secret_hash = hashers.make_password(req_storage["secret"]) |
2164 | + self.storage.auth = self.get_registration_auth("storage_reg") |
2165 | modified = True |
2166 | |
2167 | # If the storage existed before, it had a secret. Make sure that |
2168 | # hasn't changed. |
2169 | - if not hashers.check_password(req_storage['secret'], self.storage.secret_hash): |
2170 | - raise HttpResponseException(HttpResponseForbidden('Bad secret for existing storage')) |
2171 | + if not hashers.check_password(req_storage["secret"], self.storage.secret_hash): |
2172 | + raise HttpResponseException( |
2173 | + HttpResponseForbidden("Bad secret for existing storage") |
2174 | + ) |
2175 | |
2176 | # Change the storage published status if needed |
2177 | - if ('published' in req_storage): |
2178 | - if req_storage['published'] != self.storage.published: |
2179 | - self.storage.published = req_storage['published'] |
2180 | + if "published" in req_storage: |
2181 | + if req_storage["published"] != self.storage.published: |
2182 | + self.storage.published = req_storage["published"] |
2183 | modified = True |
2184 | else: |
2185 | # If not present, default to want published |
2186 | @@ -587,12 +709,19 @@ |
2187 | |
2188 | # If any of these exist in the request, add or update them in the |
2189 | # self.storage. |
2190 | - for k in ('comment', 'ssh_ping_host', 'ssh_ping_port', 'ssh_ping_user', 'space_total', 'space_available'): |
2191 | + for k in ( |
2192 | + "comment", |
2193 | + "ssh_ping_host", |
2194 | + "ssh_ping_port", |
2195 | + "ssh_ping_user", |
2196 | + "space_total", |
2197 | + "space_available", |
2198 | + ): |
2199 | if (k in req_storage) and (getattr(self.storage, k) != req_storage[k]): |
2200 | setattr(self.storage, k, req_storage[k]) |
2201 | modified = True |
2202 | |
2203 | - for k in ('ssh_ping_host_keys',): |
2204 | + for k in ("ssh_ping_host_keys",): |
2205 | if k not in req_storage: |
2206 | continue |
2207 | v = json.dumps(req_storage[k], sort_keys=True) |
2208 | @@ -606,21 +735,27 @@ |
2209 | try: |
2210 | self.storage.full_clean() |
2211 | except ValidationError as e: |
2212 | - raise HttpResponseException(HttpResponseBadRequest('Validation error: %s' % str(e))) |
2213 | + raise HttpResponseException( |
2214 | + HttpResponseBadRequest("Validation error: %s" % str(e)) |
2215 | + ) |
2216 | |
2217 | self.storage.date_checked_in = timezone.now() |
2218 | self.storage.save() |
2219 | |
2220 | machines = {} |
2221 | - for m in Machine.objects.filter(storage=self.storage, active=True, published=True): |
2222 | + for m in Machine.objects.filter( |
2223 | + storage=self.storage, active=True, published=True |
2224 | + ): |
2225 | machines[m.uuid] = { |
2226 | - 'environment_name': m.environment_name, |
2227 | - 'service_name': m.service_name, |
2228 | - 'unit_name': m.unit_name, |
2229 | - 'comment': m.comment, |
2230 | - 'ssh_public_key': m.ssh_public_key, |
2231 | + "environment_name": m.environment_name, |
2232 | + "service_name": m.service_name, |
2233 | + "unit_name": m.unit_name, |
2234 | + "comment": m.comment, |
2235 | + "ssh_public_key": m.ssh_public_key, |
2236 | } |
2237 | - return HttpResponse(json.dumps({'machines': machines}), content_type='application/json') |
2238 | + return HttpResponse( |
2239 | + json.dumps({"machines": machines}), content_type="application/json" |
2240 | + ) |
2241 | |
2242 | |
2243 | @csrf_exempt |
2244 | @@ -629,19 +764,19 @@ |
2245 | # to connect to its database and serve data). It does not |
2246 | # indicate the health of machines, storage units, etc. |
2247 | out = { |
2248 | - 'healthy': True, |
2249 | - 'date': timezone.now().isoformat(), |
2250 | - 'repo_revision': get_repo_revision(), |
2251 | - 'counts': { |
2252 | - 'auth': Auth.objects.count(), |
2253 | - 'storage': Storage.objects.count(), |
2254 | - 'machine': Machine.objects.count(), |
2255 | - 'source': Source.objects.count(), |
2256 | - 'filter_set': FilterSet.objects.count(), |
2257 | - 'backup_log': BackupLog.objects.count(), |
2258 | + "healthy": True, |
2259 | + "date": timezone.now().isoformat(), |
2260 | + "repo_revision": get_repo_revision(), |
2261 | + "counts": { |
2262 | + "auth": Auth.objects.count(), |
2263 | + "storage": Storage.objects.count(), |
2264 | + "machine": Machine.objects.count(), |
2265 | + "source": Source.objects.count(), |
2266 | + "filter_set": FilterSet.objects.count(), |
2267 | + "backup_log": BackupLog.objects.count(), |
2268 | }, |
2269 | } |
2270 | - return HttpResponse(json.dumps(out), content_type='application/json') |
2271 | + return HttpResponse(json.dumps(out), content_type="application/json") |
2272 | |
2273 | |
2274 | @csrf_exempt |
2275 | |
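The health view above only asserts that the API itself is up (it can reach its database and count objects), so an external monitor just fetches it and inspects the JSON. A minimal probe, assuming a local instance and that the endpoint answers a plain GET:

    # Hypothetical liveness probe against a locally served instance.
    import json
    import urllib.request

    with urllib.request.urlopen("http://localhost:8000/v1/health") as resp:
        health = json.load(resp)
    assert health["healthy"] is True
    print(health["counts"])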
2276 | === modified file 'turku_api/wsgi.py' |
2277 | --- turku_api/wsgi.py 2015-07-30 22:41:42 +0000 |
2278 | +++ turku_api/wsgi.py 2020-06-21 23:58:30 +0000 |
2279 | @@ -25,9 +25,11 @@ |
2280 | |
2281 | import os |
2282 | import sys |
2283 | + |
2284 | BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) |
2285 | sys.path.append(BASE_DIR) |
2286 | os.environ.setdefault("DJANGO_SETTINGS_MODULE", "turku_api.settings") |
2287 | |
2288 | -from django.core.wsgi import get_wsgi_application |
2289 | +from django.core.wsgi import get_wsgi_application # noqa: E402 |
2290 | + |
2291 | application = get_wsgi_application() |
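wsgi.py has to extend sys.path before importing Django, which is why the late import now carries # noqa: E402 instead of being reordered. A stdlib-only smoke test of the exported application, assuming settings and a migrated database are already in place:

    # Minimal sketch: serve the WSGI app with wsgiref, assuming turku_api
    # is importable and its settings/database are configured.
    from wsgiref.simple_server import make_server

    from turku_api.wsgi import application

    with make_server("127.0.0.1", 8080, application) as httpd:
        print("Serving on http://127.0.0.1:8080/")
        httpd.serve_forever()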