Merge ~cjwatson/charm-clamav-database-mirror:initial into charm-clamav-database-mirror:main
Status: Merged
Merged at revision: d09540ef93fe67e180e1c07145a48127cc634bc6
Proposed branch: ~cjwatson/charm-clamav-database-mirror:initial
Merge into: charm-clamav-database-mirror:main

Diff against target: 2117 lines (+2009/-0), 18 files modified:

- .gitignore (+9/-0)
- CONTRIBUTING.md (+31/-0)
- LICENSE (+202/-0)
- README.md (+12/-0)
- charmcraft.yaml (+16/-0)
- config.yaml (+13/-0)
- files/cvdupdate.timer (+12/-0)
- lib/charms/operator_libs_linux/v0/apt.py (+1329/-0)
- metadata.yaml (+14/-0)
- pyproject.toml (+33/-0)
- requirements.txt (+2/-0)
- src/charm.py (+106/-0)
- templates/cvdupdate.service.j2 (+22/-0)
- templates/nginx-site.conf.j2 (+9/-0)
- tests/integration/test_charm.py (+34/-0)
- tests/unit/test_charm.py (+88/-0)
- tox.ini (+73/-0)
- update-lib (+4/-0)

Reviewer: Andrey Fedoseev (community): Approve
Review via email: mp+431861@code.launchpad.net
Commit message
Add new charm
Description of the change
Much of the project skeleton came from `charmcraft init --profile machine`.
`lib/charms/
I considered using either the existing `apache2` charm or the existing `nginx` charm. However, `apache2` currently only works up to focal, and the `clamav-cvdupdate` package is only available starting from jammy; while the `nginx` charm doesn't allow deploying a site as a subordinate, which makes it rather cumbersome to use here. Since it's fairly trivial to handle installing `nginx` directly, I just went with that.
The test suite doesn't quite have 100% coverage, but there's enough here that it caught a number of my mistakes during development, and I wanted to get this up for review without too much more delay.
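The `host` and `port` config options feed the nginx site template (`templates/nginx-site.conf.j2` in the diff stats) that the charm writes out when configuring nginx. As a rough sketch of that flow only: the template body below is invented for illustration (the real `.j2` file's contents aren't reproduced in this excerpt), and stdlib `string.Template` stands in for the Jinja2 rendering the charm actually uses.

```python
from string import Template

# Hypothetical stand-in for templates/nginx-site.conf.j2; the real template's
# contents are not shown here, and the charm renders with Jinja2, not string.Template.
NGINX_SITE = Template("""\
server {
    listen $port;
    server_name $host;
    root /srv/clamav-mirror;
    autoindex off;
}
""")


def render_site(host: str = "localhost", port: int = 80) -> str:
    """Render a site config from values like the charm's config.yaml defaults."""
    return NGINX_SITE.substitute(host=host, port=port)


print(render_site("mirror.example.com", 8080))
```

Changing either config option and re-rendering is all a config-changed hook would need to do before reloading nginx.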
Preview Diff
diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..a26d707
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,9 @@
+venv/
+build/
+*.charm
+.tox/
+.coverage
+__pycache__/
+*.py[cod]
+.idea
+.vscode/
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
new file mode 100644
index 0000000..5863a00
--- /dev/null
+++ b/CONTRIBUTING.md
@@ -0,0 +1,31 @@
+# Contributing
+
+To make contributions to this charm, you'll need a working [development setup](https://juju.is/docs/sdk/dev-setup).
+
+You can use the environments created by `tox` for development:
+
+```shell
+tox --notest -e unit
+source .tox/unit/bin/activate
+```
+
+## Testing
+
+This project uses `tox` for managing test environments. There are some pre-configured environments
+that can be used for linting and formatting code when you're preparing contributions to the charm:
+
+```shell
+tox -e fmt           # update your code according to linting rules
+tox -e lint          # code style
+tox -e unit          # unit tests
+tox -e integration   # integration tests
+tox                  # runs 'lint' and 'unit' environments
+```
+
+## Build the charm
+
+Build the charm in this git repository using:
+
+```shell
+charmcraft pack
+```
diff --git a/LICENSE b/LICENSE
new file mode 100644
index 0000000..7e9d504
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,202 @@
+
+                                 Apache License
+                           Version 2.0, January 2004
+                        http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+      "License" shall mean the terms and conditions for use, reproduction,
+      and distribution as defined by Sections 1 through 9 of this document.
+
+      "Licensor" shall mean the copyright owner or entity authorized by
+      the copyright owner that is granting the License.
+
+      "Legal Entity" shall mean the union of the acting entity and all
+      other entities that control, are controlled by, or are under common
+      control with that entity. For the purposes of this definition,
+      "control" means (i) the power, direct or indirect, to cause the
+      direction or management of such entity, whether by contract or
+      otherwise, or (ii) ownership of fifty percent (50%) or more of the
+      outstanding shares, or (iii) beneficial ownership of such entity.
+
+      "You" (or "Your") shall mean an individual or Legal Entity
+      exercising permissions granted by this License.
+
+      "Source" form shall mean the preferred form for making modifications,
+      including but not limited to software source code, documentation
+      source, and configuration files.
+
+      "Object" form shall mean any form resulting from mechanical
+      transformation or translation of a Source form, including but
+      not limited to compiled object code, generated documentation,
+      and conversions to other media types.
+
+      "Work" shall mean the work of authorship, whether in Source or
+      Object form, made available under the License, as indicated by a
+      copyright notice that is included in or attached to the work
+      (an example is provided in the Appendix below).
+
+      "Derivative Works" shall mean any work, whether in Source or Object
+      form, that is based on (or derived from) the Work and for which the
+      editorial revisions, annotations, elaborations, or other modifications
+      represent, as a whole, an original work of authorship. For the purposes
+      of this License, Derivative Works shall not include works that remain
+      separable from, or merely link (or bind by name) to the interfaces of,
+      the Work and Derivative Works thereof.
+
+      "Contribution" shall mean any work of authorship, including
+      the original version of the Work and any modifications or additions
+      to that Work or Derivative Works thereof, that is intentionally
+      submitted to Licensor for inclusion in the Work by the copyright owner
+      or by an individual or Legal Entity authorized to submit on behalf of
+      the copyright owner. For the purposes of this definition, "submitted"
+      means any form of electronic, verbal, or written communication sent
+      to the Licensor or its representatives, including but not limited to
+      communication on electronic mailing lists, source code control systems,
+      and issue tracking systems that are managed by, or on behalf of, the
+      Licensor for the purpose of discussing and improving the Work, but
+      excluding communication that is conspicuously marked or otherwise
+      designated in writing by the copyright owner as "Not a Contribution."
+
+      "Contributor" shall mean Licensor and any individual or Legal Entity
+      on behalf of whom a Contribution has been received by Licensor and
+      subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      copyright license to reproduce, prepare Derivative Works of,
+      publicly display, publicly perform, sublicense, and distribute the
+      Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      (except as stated in this section) patent license to make, have made,
+      use, offer to sell, sell, import, and otherwise transfer the Work,
+      where such license applies only to those patent claims licensable
+      by such Contributor that are necessarily infringed by their
+      Contribution(s) alone or by combination of their Contribution(s)
+      with the Work to which such Contribution(s) was submitted. If You
+      institute patent litigation against any entity (including a
+      cross-claim or counterclaim in a lawsuit) alleging that the Work
+      or a Contribution incorporated within the Work constitutes direct
+      or contributory patent infringement, then any patent licenses
+      granted to You under this License for that Work shall terminate
+      as of the date such litigation is filed.
+
+   4. Redistribution. You may reproduce and distribute copies of the
+      Work or Derivative Works thereof in any medium, with or without
+      modifications, and in Source or Object form, provided that You
+      meet the following conditions:
+
+      (a) You must give any other recipients of the Work or
+          Derivative Works a copy of this License; and
+
+      (b) You must cause any modified files to carry prominent notices
+          stating that You changed the files; and
+
+      (c) You must retain, in the Source form of any Derivative Works
+          that You distribute, all copyright, patent, trademark, and
+          attribution notices from the Source form of the Work,
+          excluding those notices that do not pertain to any part of
+          the Derivative Works; and
+
+      (d) If the Work includes a "NOTICE" text file as part of its
+          distribution, then any Derivative Works that You distribute must
+          include a readable copy of the attribution notices contained
+          within such NOTICE file, excluding those notices that do not
+          pertain to any part of the Derivative Works, in at least one
+          of the following places: within a NOTICE text file distributed
+          as part of the Derivative Works; within the Source form or
+          documentation, if provided along with the Derivative Works; or,
+          within a display generated by the Derivative Works, if and
+          wherever such third-party notices normally appear. The contents
+          of the NOTICE file are for informational purposes only and
+          do not modify the License. You may add Your own attribution
+          notices within Derivative Works that You distribute, alongside
+          or as an addendum to the NOTICE text from the Work, provided
+          that such additional attribution notices cannot be construed
+          as modifying the License.
+
+      You may add Your own copyright statement to Your modifications and
+      may provide additional or different license terms and conditions
+      for use, reproduction, or distribution of Your modifications, or
+      for any such Derivative Works as a whole, provided Your use,
+      reproduction, and distribution of the Work otherwise complies with
+      the conditions stated in this License.
+
+   5. Submission of Contributions. Unless You explicitly state otherwise,
+      any Contribution intentionally submitted for inclusion in the Work
+      by You to the Licensor shall be under the terms and conditions of
+      this License, without any additional terms or conditions.
+      Notwithstanding the above, nothing herein shall supersede or modify
+      the terms of any separate license agreement you may have executed
+      with Licensor regarding such Contributions.
+
+   6. Trademarks. This License does not grant permission to use the trade
+      names, trademarks, service marks, or product names of the Licensor,
+      except as required for reasonable and customary use in describing the
+      origin of the Work and reproducing the content of the NOTICE file.
+
+   7. Disclaimer of Warranty. Unless required by applicable law or
+      agreed to in writing, Licensor provides the Work (and each
+      Contributor provides its Contributions) on an "AS IS" BASIS,
+      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+      implied, including, without limitation, any warranties or conditions
+      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+      PARTICULAR PURPOSE. You are solely responsible for determining the
+      appropriateness of using or redistributing the Work and assume any
+      risks associated with Your exercise of permissions under this License.
+
+   8. Limitation of Liability. In no event and under no legal theory,
+      whether in tort (including negligence), contract, or otherwise,
+      unless required by applicable law (such as deliberate and grossly
+      negligent acts) or agreed to in writing, shall any Contributor be
+      liable to You for damages, including any direct, indirect, special,
+      incidental, or consequential damages of any character arising as a
+      result of this License or out of the use or inability to use the
+      Work (including but not limited to damages for loss of goodwill,
+      work stoppage, computer failure or malfunction, or any and all
+      other commercial damages or losses), even if such Contributor
+      has been advised of the possibility of such damages.
+
+   9. Accepting Warranty or Additional Liability. While redistributing
+      the Work or Derivative Works thereof, You may choose to offer,
+      and charge a fee for, acceptance of support, warranty, indemnity,
+      or other liability obligations and/or rights consistent with this
+      License. However, in accepting such obligations, You may act only
+      on Your own behalf and on Your sole responsibility, not on behalf
+      of any other Contributor, and only if You agree to indemnify,
+      defend, and hold each Contributor harmless for any liability
+      incurred by, or claims asserted against, such Contributor by reason
+      of your accepting any such warranty or additional liability.
+
+   END OF TERMS AND CONDITIONS
+
+   APPENDIX: How to apply the Apache License to your work.
+
+      To apply the Apache License to your work, attach the following
+      boilerplate notice, with the fields enclosed by brackets "[]"
+      replaced with your own identifying information. (Don't include
+      the brackets!) The text should be enclosed in the appropriate
+      comment syntax for the file format. We also recommend that a
+      file or class name and description of purpose be included on the
+      same "printed page" as the copyright notice for easier
+      identification within third-party archives.
+
+   Copyright 2022 Canonical Ltd.
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..d78d2a3
--- /dev/null
+++ b/README.md
@@ -0,0 +1,12 @@
+# clamav-database-mirror
+
+Charmhub package name: clamav-database-mirror
+More information: https://charmhub.io/clamav-database-mirror
+
+Maintains a local mirror of the latest ClamAV databases.
+
+## Other resources
+
+- [Contributing](CONTRIBUTING.md) <!-- or link to other contribution documentation -->
+
+- See the [Juju SDK documentation](https://juju.is/docs/sdk) for more information about developing and improving charms.
diff --git a/charmcraft.yaml b/charmcraft.yaml
new file mode 100644
index 0000000..0cd3b34
--- /dev/null
+++ b/charmcraft.yaml
@@ -0,0 +1,16 @@
+# This file configures Charmcraft.
+# See https://juju.is/docs/sdk/charmcraft-config for guidance.
+
+type: charm
+bases:
+  - build-on:
+      - name: ubuntu
+        channel: "22.04"
+    run-on:
+      - name: ubuntu
+        channel: "22.04"
+parts:
+  charm:
+    prime:
+      - files/*
+      - templates/*
diff --git a/config.yaml b/config.yaml
new file mode 100644
index 0000000..f100171
--- /dev/null
+++ b/config.yaml
@@ -0,0 +1,13 @@
+options:
+  host:
+    type: string
+    default: "localhost"
+    description: The mirror's host name.
+  port:
+    type: int
+    default: 80
+    description: The mirror's listen port.
+  http-proxy:
+    type: string
+    default: ""
+    description: HTTP proxy to use for requests.
diff --git a/files/cvdupdate.timer b/files/cvdupdate.timer
new file mode 100644
index 0000000..dda6d5d
--- /dev/null
+++ b/files/cvdupdate.timer
@@ -0,0 +1,12 @@
+[Unit]
+Description=Update ClamAV database mirror
+Documentation=man:cvdupdate(1)
+
+[Timer]
+OnCalendar=*-*-* 0/4:30:00
+RandomizedDelaySec=1h
+FixedRandomDelay=true
+Persistent=true
+
+[Install]
+WantedBy=timers.target
338 | diff --git a/lib/charms/operator_libs_linux/v0/apt.py b/lib/charms/operator_libs_linux/v0/apt.py | |||
339 | 0 | new file mode 100644 | 13 | new file mode 100644 |
340 | index 0000000..2b5c8f2 | |||
341 | --- /dev/null | |||
342 | +++ b/lib/charms/operator_libs_linux/v0/apt.py | |||
343 | @@ -0,0 +1,1329 @@ | |||
344 | 1 | # Copyright 2021 Canonical Ltd. | ||
345 | 2 | # | ||
346 | 3 | # Licensed under the Apache License, Version 2.0 (the "License"); | ||
347 | 4 | # you may not use this file except in compliance with the License. | ||
348 | 5 | # You may obtain a copy of the License at | ||
349 | 6 | # | ||
350 | 7 | # http://www.apache.org/licenses/LICENSE-2.0 | ||
351 | 8 | # | ||
352 | 9 | # Unless required by applicable law or agreed to in writing, software | ||
353 | 10 | # distributed under the License is distributed on an "AS IS" BASIS, | ||
354 | 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | ||
355 | 12 | # See the License for the specific language governing permissions and | ||
356 | 13 | # limitations under the License. | ||
357 | 14 | |||
358 | 15 | """Abstractions for the system's Debian/Ubuntu package information and repositories. | ||
359 | 16 | |||
360 | 17 | This module contains abstractions and wrappers around Debian/Ubuntu-style repositories and | ||
361 | 18 | packages, in order to easily provide an idiomatic and Pythonic mechanism for adding packages and/or | ||
362 | 19 | repositories to systems for use in machine charms. | ||
363 | 20 | |||
364 | 21 | A sane default configuration is attainable through nothing more than instantiation of the | ||
365 | 22 | appropriate classes. `DebianPackage` objects provide information about the architecture, version, | ||
366 | 23 | name, and status of a package. | ||
367 | 24 | |||
368 | 25 | `DebianPackage` will try to look up a package either from `dpkg -L` or from `apt-cache` when | ||
369 | 26 | provided with a string indicating the package name. If it cannot be located, `PackageNotFoundError` | ||
370 | 27 | will be returned, as `apt` and `dpkg` otherwise return `100` for all errors, and a meaningful error | ||
371 | 28 | message if the package is not known is desirable. | ||
372 | 29 | |||
373 | 30 | To install packages with convenience methods: | ||
374 | 31 | |||
375 | 32 | ```python | ||
376 | 33 | try: | ||
377 | 34 | # Run `apt-get update` | ||
378 | 35 | apt.update() | ||
379 | 36 | apt.add_package("zsh") | ||
380 | 37 | apt.add_package(["vim", "htop", "wget"]) | ||
381 | 38 | except PackageNotFoundError: | ||
382 | 39 | logger.error("a specified package not found in package cache or on system") | ||
383 | 40 | except PackageError as e: | ||
384 | 41 | logger.error("could not install package. Reason: %s", e.message) | ||
385 | 42 | ```` | ||
386 | 43 | |||
387 | 44 | To find details of a specific package: | ||
388 | 45 | |||
389 | 46 | ```python | ||
390 | 47 | try: | ||
391 | 48 | vim = apt.DebianPackage.from_system("vim") | ||
392 | 49 | |||
393 | 50 | # To find from the apt cache only | ||
394 | 51 | # apt.DebianPackage.from_apt_cache("vim") | ||
395 | 52 | |||
396 | 53 | # To find from installed packages only | ||
397 | 54 | # apt.DebianPackage.from_installed_package("vim") | ||
398 | 55 | |||
399 | 56 | vim.ensure(PackageState.Latest) | ||
400 | 57 | logger.info("updated vim to version: %s", vim.fullversion) | ||
401 | 58 | except PackageNotFoundError: | ||
402 | 59 | logger.error("a specified package not found in package cache or on system") | ||
403 | 60 | except PackageError as e: | ||
404 | 61 | logger.error("could not install package. Reason: %s", e.message) | ||
405 | 62 | ``` | ||
406 | 63 | |||
407 | 64 | |||
408 | 65 | `RepositoryMapping` will return a dict-like object containing enabled system repositories | ||
409 | 66 | and their properties (available groups, baseuri. gpg key). This class can add, disable, or | ||
410 | 67 | manipulate repositories. Items can be retrieved as `DebianRepository` objects. | ||
411 | 68 | |||
412 | 69 | In order add a new repository with explicit details for fields, a new `DebianRepository` can | ||
413 | 70 | be added to `RepositoryMapping` | ||
414 | 71 | |||
415 | 72 | `RepositoryMapping` provides an abstraction around the existing repositories on the system, | ||
416 | 73 | and can be accessed and iterated over like any `Mapping` object, to retrieve values by key, | ||
417 | 74 | iterate, or perform other operations. | ||
418 | 75 | |||
419 | 76 | Keys are constructed as `{repo_type}-{}-{release}` in order to uniquely identify a repository. | ||
420 | 77 | |||
421 | 78 | Repositories can be added with explicit values through a Python constructor. | ||
422 | 79 | |||
423 | 80 | Example: | ||
424 | 81 | |||
425 | 82 | ```python | ||
426 | 83 | repositories = apt.RepositoryMapping() | ||
427 | 84 | |||
428 | 85 | if "deb-example.com-focal" not in repositories: | ||
429 | 86 | repositories.add(DebianRepository(enabled=True, repotype="deb", | ||
430 | 87 | uri="https://example.com", release="focal", groups=["universe"])) | ||
431 | 88 | ``` | ||
432 | 89 | |||
433 | 90 | Alternatively, any valid `sources.list` line may be used to construct a new | ||
434 | 91 | `DebianRepository`. | ||
435 | 92 | |||
436 | 93 | Example: | ||
437 | 94 | |||
438 | 95 | ```python | ||
439 | 96 | repositories = apt.RepositoryMapping() | ||
440 | 97 | |||
441 | 98 | if "deb-us.archive.ubuntu.com-xenial" not in repositories: | ||
442 | 99 | line = "deb http://us.archive.ubuntu.com/ubuntu xenial main restricted" | ||
443 | 100 | repo = DebianRepository.from_repo_line(line) | ||
444 | 101 | repositories.add(repo) | ||
445 | 102 | ``` | ||
446 | 103 | """ | ||
447 | 104 | |||
448 | 105 | import fileinput | ||
449 | 106 | import glob | ||
450 | 107 | import logging | ||
451 | 108 | import os | ||
452 | 109 | import re | ||
453 | 110 | import subprocess | ||
454 | 111 | from collections.abc import Mapping | ||
455 | 112 | from enum import Enum | ||
456 | 113 | from subprocess import PIPE, CalledProcessError, check_call, check_output | ||
457 | 114 | from typing import Iterable, List, Optional, Tuple, Union | ||
458 | 115 | from urllib.parse import urlparse | ||
459 | 116 | |||
460 | 117 | logger = logging.getLogger(__name__) | ||
461 | 118 | |||
462 | 119 | # The unique Charmhub library identifier, never change it | ||
463 | 120 | LIBID = "7c3dbc9c2ad44a47bd6fcb25caa270e5" | ||
464 | 121 | |||
465 | 122 | # Increment this major API version when introducing breaking changes | ||
466 | 123 | LIBAPI = 0 | ||
467 | 124 | |||
468 | 125 | # Increment this PATCH version before using `charmcraft publish-lib` or reset | ||
469 | 126 | # to 0 if you are raising the major API version | ||
470 | 127 | LIBPATCH = 7 | ||
471 | 128 | |||
472 | 129 | |||
473 | 130 | VALID_SOURCE_TYPES = ("deb", "deb-src") | ||
474 | 131 | OPTIONS_MATCHER = re.compile(r"\[.*?\]") | ||
475 | 132 | |||
476 | 133 | |||
477 | 134 | class Error(Exception): | ||
478 | 135 | """Base class of most errors raised by this library.""" | ||
479 | 136 | |||
480 | 137 | def __repr__(self): | ||
481 | 138 | """String representation of Error.""" | ||
482 | 139 | return "<{}.{} {}>".format(type(self).__module__, type(self).__name__, self.args) | ||
483 | 140 | |||
484 | 141 | @property | ||
485 | 142 | def name(self): | ||
486 | 143 | """Return a string representation of the model plus class.""" | ||
487 | 144 | return "<{}.{}>".format(type(self).__module__, type(self).__name__) | ||
488 | 145 | |||
489 | 146 | @property | ||
490 | 147 | def message(self): | ||
491 | 148 | """Return the message passed as an argument.""" | ||
492 | 149 | return self.args[0] | ||
493 | 150 | |||
494 | 151 | |||
495 | 152 | class PackageError(Error): | ||
496 | 153 | """Raised when there's an error installing or removing a package.""" | ||
497 | 154 | |||
498 | 155 | |||
499 | 156 | class PackageNotFoundError(Error): | ||
500 | 157 | """Raised when a requested package is not known to the system.""" | ||
501 | 158 | |||
502 | 159 | |||
503 | 160 | class PackageState(Enum): | ||
504 | 161 | """A class to represent possible package states.""" | ||
505 | 162 | |||
506 | 163 | Present = "present" | ||
507 | 164 | Absent = "absent" | ||
508 | 165 | Latest = "latest" | ||
509 | 166 | Available = "available" | ||
510 | 167 | |||
511 | 168 | |||
512 | 169 | class DebianPackage: | ||
513 | 170 | """Represents a traditional Debian package and its utility functions. | ||
514 | 171 | |||
515 | 172 | `DebianPackage` wraps information and functionality around a known package, whether installed | ||
516 | 173 | or available. The version, epoch, name, and architecture can be easily queried and compared | ||
517 | 174 | against other `DebianPackage` objects to determine the latest version or to install a specific | ||
518 | 175 | version. | ||
519 | 176 | |||
520 | 177 | The representation of this object as a string mimics the output from `dpkg` for familiarity. | ||
521 | 178 | |||
522 | 179 | Installation and removal of packages is handled through the `state` property or `ensure` | ||
523 | 180 | method, with the following options: | ||
524 | 181 | |||
525 | 182 | apt.PackageState.Absent | ||
526 | 183 | apt.PackageState.Available | ||
527 | 184 | apt.PackageState.Present | ||
528 | 185 | apt.PackageState.Latest | ||
529 | 186 | |||
530 | 187 | When `DebianPackage` is initialized, the state of a given `DebianPackage` object will be set to | ||
531 | 188 | `Available`, `Present`, or `Latest`, with `Absent` implemented as a convenience for removal | ||
532 | 189 | (though it operates essentially the same as `Available`). | ||
533 | 190 | """ | ||
534 | 191 | |||
535 | 192 | def __init__( | ||
536 | 193 | self, name: str, version: str, epoch: str, arch: str, state: PackageState | ||
537 | 194 | ) -> None: | ||
538 | 195 | self._name = name | ||
539 | 196 | self._arch = arch | ||
540 | 197 | self._state = state | ||
541 | 198 | self._version = Version(version, epoch) | ||
542 | 199 | |||
543 | 200 | def __eq__(self, other) -> bool: | ||
544 | 201 | """Equality for comparison. | ||
545 | 202 | |||
546 | 203 | Args: | ||
547 | 204 | other: a `DebianPackage` object for comparison | ||
548 | 205 | |||
549 | 206 | Returns: | ||
550 | 207 | A boolean reflecting equality | ||
551 | 208 | """ | ||
552 | 209 | return isinstance(other, self.__class__) and ( | ||
553 | 210 | self._name, | ||
554 | 211 | self._version.number, | ||
555 | 212 | ) == (other._name, other._version.number) | ||
556 | 213 | |||
557 | 214 | def __hash__(self): | ||
558 | 215 | """A basic hash so this class can be used in Mappings and dicts.""" | ||
559 | 216 | return hash((self._name, self._version.number)) | ||
560 | 217 | |||
561 | 218 | def __repr__(self): | ||
562 | 219 | """A representation of the package.""" | ||
563 | 220 | return "<{}.{}: {}>".format(self.__module__, self.__class__.__name__, self.__dict__) | ||
564 | 221 | |||
565 | 222 | def __str__(self): | ||
566 | 223 | """A human-readable representation of the package.""" | ||
567 | 224 | return "<{}: {}-{}.{} -- {}>".format( | ||
568 | 225 | self.__class__.__name__, | ||
569 | 226 | self._name, | ||
570 | 227 | self._version, | ||
571 | 228 | self._arch, | ||
572 | 229 | str(self._state), | ||
573 | 230 | ) | ||
574 | 231 | |||
575 | 232 | @staticmethod | ||
576 | 233 | def _apt( | ||
577 | 234 | command: str, | ||
578 | 235 | package_names: Union[str, List], | ||
579 | 236 | optargs: Optional[List[str]] = None, | ||
580 | 237 | ) -> None: | ||
581 | 238 | """Wrap package management commands for Debian/Ubuntu systems. | ||
582 | 239 | |||
583 | 240 | Args: | ||
584 | 241 | command: the command given to `apt-get` | ||
585 | 242 | package_names: a package name or list of package names to operate on | ||
586 | 243 | optargs: an (Optional) list of additional arguments | ||
587 | 244 | |||
588 | 245 | Raises: | ||
589 | 246 | PackageError if an error is encountered | ||
590 | 247 | """ | ||
591 | 248 | optargs = optargs if optargs is not None else [] | ||
592 | 249 | if isinstance(package_names, str): | ||
593 | 250 | package_names = [package_names] | ||
594 | 251 | _cmd = ["apt-get", "-y", *optargs, command, *package_names] | ||
595 | 252 | try: | ||
596 | 253 | check_call(_cmd, stderr=PIPE, stdout=PIPE) | ||
597 | 254 | except CalledProcessError as e: | ||
598 | 255 | raise PackageError( | ||
599 | 256 | "Could not {} package(s) [{}]: {}".format(command, [*package_names], e.output) | ||
600 | 257 | ) from None | ||
601 | 258 | |||
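One quirk of the `_apt` wrapper above: `check_call` with `stderr=PIPE`/`stdout=PIPE` discards rather than captures output, so the `e.output` interpolated into the `PackageError` message is always `None`. If the failing command's output is wanted in the message, `check_output` captures it; a generic sketch of that pattern (deliberately not tied to `apt-get`, and not part of the library):

```python
import subprocess
import sys


def run_captured(cmd):
    """Run a command, raising RuntimeError with its combined output on failure."""
    try:
        return subprocess.check_output(
            cmd, stderr=subprocess.STDOUT, universal_newlines=True
        )
    except subprocess.CalledProcessError as e:
        # e.output is populated here, unlike with check_call(..., stdout=PIPE)
        raise RuntimeError("command {} failed: {}".format(cmd, e.output)) from None
```
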
602 | 259 | def _add(self) -> None: | ||
603 | 260 | """Add a package to the system.""" | ||
604 | 261 | self._apt( | ||
605 | 262 | "install", | ||
606 | 263 | "{}={}".format(self.name, self.version), | ||
607 | 264 | optargs=["--option=Dpkg::Options::=--force-confold"], | ||
608 | 265 | ) | ||
609 | 266 | |||
610 | 267 | def _remove(self) -> None: | ||
611 | 268 | """Removes a package from the system. Implementation-specific.""" | ||
612 | 269 | return self._apt("remove", "{}={}".format(self.name, self.version)) | ||
613 | 270 | |||
614 | 271 | @property | ||
615 | 272 | def name(self) -> str: | ||
616 | 273 | """Returns the name of the package.""" | ||
617 | 274 | return self._name | ||
618 | 275 | |||
619 | 276 | def ensure(self, state: PackageState): | ||
620 | 277 | """Ensures that a package is in a given state. | ||
621 | 278 | |||
622 | 279 | Args: | ||
623 | 280 | state: a `PackageState` to reconcile the package to | ||
624 | 281 | |||
625 | 282 | Raises: | ||
626 | 283 | PackageError from the underlying call to apt | ||
627 | 284 | """ | ||
628 | 285 | if self._state is not state: | ||
629 | 286 | if state not in (PackageState.Present, PackageState.Latest): | ||
630 | 287 | self._remove() | ||
631 | 288 | else: | ||
632 | 289 | self._add() | ||
633 | 290 | self._state = state | ||
634 | 291 | |||
635 | 292 | @property | ||
636 | 293 | def present(self) -> bool: | ||
637 | 294 | """Returns whether or not a package is present.""" | ||
638 | 295 | return self._state in (PackageState.Present, PackageState.Latest) | ||
639 | 296 | |||
640 | 297 | @property | ||
641 | 298 | def latest(self) -> bool: | ||
642 | 299 | """Returns whether the package is the most recent version.""" | ||
643 | 300 | return self._state is PackageState.Latest | ||
644 | 301 | |||
645 | 302 | @property | ||
646 | 303 | def state(self) -> PackageState: | ||
647 | 304 | """Returns the current package state.""" | ||
648 | 305 | return self._state | ||
649 | 306 | |||
650 | 307 | @state.setter | ||
651 | 308 | def state(self, state: PackageState) -> None: | ||
652 | 309 | """Sets the package state to a given value. | ||
653 | 310 | |||
654 | 311 | Args: | ||
655 | 312 | state: a `PackageState` to reconcile the package to | ||
656 | 313 | |||
657 | 314 | Raises: | ||
658 | 315 | PackageError from the underlying call to apt | ||
659 | 316 | """ | ||
660 | 317 | if state in (PackageState.Latest, PackageState.Present): | ||
661 | 318 | self._add() | ||
662 | 319 | else: | ||
663 | 320 | self._remove() | ||
664 | 321 | self._state = state | ||
665 | 322 | |||
666 | 323 | @property | ||
667 | 324 | def version(self) -> "Version": | ||
668 | 325 | """Returns the version for a package.""" | ||
669 | 326 | return self._version | ||
670 | 327 | |||
671 | 328 | @property | ||
672 | 329 | def epoch(self) -> str: | ||
673 | 330 | """Returns the epoch for a package. May be unset.""" | ||
674 | 331 | return self._version.epoch | ||
675 | 332 | |||
676 | 333 | @property | ||
677 | 334 | def arch(self) -> str: | ||
678 | 335 | """Returns the architecture for a package.""" | ||
679 | 336 | return self._arch | ||
680 | 337 | |||
681 | 338 | @property | ||
682 | 339 | def fullversion(self) -> str: | ||
683 | 340 | """Returns the version+architecture for a package.""" | ||
684 | 341 | return "{}.{}".format(self._version, self._arch) | ||
685 | 342 | |||
686 | 343 | @staticmethod | ||
687 | 344 | def _get_epoch_from_version(version: str) -> Tuple[str, str]: | ||
688 | 345 | """Pull the epoch, if any, out of a version string.""" | ||
689 | 346 | epoch_matcher = re.compile(r"^((?P<epoch>\d+):)?(?P<version>.*)") | ||
690 | 347 | matches = epoch_matcher.search(version).groupdict() | ||
691 | 348 | return matches.get("epoch", ""), matches.get("version") | ||
692 | 349 | |||
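The epoch-splitting regex above has a subtlety worth noting: `re.Match.groupdict()` returns `None` (not a missing key) for an unmatched named group, so a `dict.get("epoch", "")` default never fires and callers see `None` for epoch-less versions. A standalone sketch of the same split, normalising `None` to an empty string:

```python
import re

# Same pattern as the library: an optional "N:" epoch prefix before the version.
EPOCH_RE = re.compile(r"^((?P<epoch>\d+):)?(?P<version>.*)")


def split_epoch(version: str):
    """Split 'epoch:version' into (epoch, version); epoch is '' when absent."""
    matches = EPOCH_RE.search(version).groupdict()
    # groupdict() yields None for an unmatched group, so "or ''" is needed here.
    return matches["epoch"] or "", matches["version"]
```
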
693 | 350 | @classmethod | ||
694 | 351 | def from_system( | ||
695 | 352 | cls, package: str, version: Optional[str] = "", arch: Optional[str] = "" | ||
696 | 353 | ) -> "DebianPackage": | ||
697 | 354 | """Locates a package, either on the system or known to apt, and serializes the information. | ||
698 | 355 | |||
699 | 356 | Args: | ||
700 | 357 | package: a string representing the package | ||
701 | 358 | version: an optional string if a specific version is requested | ||
702 | 359 | arch: an optional architecture, defaulting to `dpkg --print-architecture`. If an | ||
703 | 360 | architecture is not specified, this will be used for selection. | ||
704 | 361 | |||
705 | 362 | """ | ||
706 | 363 | try: | ||
707 | 364 | return DebianPackage.from_installed_package(package, version, arch) | ||
708 | 365 | except PackageNotFoundError: | ||
709 | 366 | logger.debug( | ||
710 | 367 | "package '%s' is not currently installed or has the wrong architecture.", package | ||
711 | 368 | ) | ||
712 | 369 | |||
713 | 370 | # Ok, try `apt-cache ...` | ||
714 | 371 | try: | ||
715 | 372 | return DebianPackage.from_apt_cache(package, version, arch) | ||
716 | 373 | except (PackageNotFoundError, PackageError): | ||
717 | 374 | # If we get here, it's not known to the system. | ||
718 | 375 | # This seems unnecessary, but virtually all `apt` commands have a return code of `100`, | ||
719 | 376 | # and providing meaningful error messages without this is ugly. | ||
720 | 377 | raise PackageNotFoundError( | ||
721 | 378 | "Package '{}{}' could not be found on the system or in the apt cache!".format( | ||
722 | 379 | package, ".{}".format(arch) if arch else "" | ||
723 | 380 | ) | ||
724 | 381 | ) from None | ||
725 | 382 | |||
726 | 383 | @classmethod | ||
727 | 384 | def from_installed_package( | ||
728 | 385 | cls, package: str, version: Optional[str] = "", arch: Optional[str] = "" | ||
729 | 386 | ) -> "DebianPackage": | ||
730 | 387 | """Check whether the package is already installed and return an instance. | ||
731 | 388 | |||
732 | 389 | Args: | ||
733 | 390 | package: a string representing the package | ||
734 | 391 | version: an optional string if a specific version is requested | ||
735 | 392 | arch: an optional architecture, defaulting to `dpkg --print-architecture`. | ||
736 | 393 | If an architecture is not specified, this will be used for selection. | ||
737 | 394 | """ | ||
738 | 395 | system_arch = check_output( | ||
739 | 396 | ["dpkg", "--print-architecture"], universal_newlines=True | ||
740 | 397 | ).strip() | ||
741 | 398 | arch = arch if arch else system_arch | ||
742 | 399 | |||
743 | 400 | # Regexps are a really terrible way to do this. Thanks dpkg | ||
744 | 401 | output = "" | ||
745 | 402 | try: | ||
746 | 403 | output = check_output(["dpkg", "-l", package], stderr=PIPE, universal_newlines=True) | ||
747 | 404 | except CalledProcessError: | ||
748 | 405 | raise PackageNotFoundError("Package is not installed: {}".format(package)) from None | ||
749 | 406 | |||
750 | 407 | # Pop off the header output from `dpkg -l` because there's no | ||
751 | 408 | # flag to omit it | ||
752 | 409 | lines = str(output).splitlines()[5:] | ||
753 | 410 | |||
754 | 411 | dpkg_matcher = re.compile( | ||
755 | 412 | r""" | ||
756 | 413 | ^(?P<package_status>\w+?)\s+ | ||
757 | 414 | (?P<package_name>.*?)(?P<throwaway_arch>:\w+?)?\s+ | ||
758 | 415 | (?P<version>.*?)\s+ | ||
759 | 416 | (?P<arch>\w+?)\s+ | ||
760 | 417 | (?P<description>.*) | ||
761 | 418 | """, | ||
762 | 419 | re.VERBOSE, | ||
763 | 420 | ) | ||
764 | 421 | |||
765 | 422 | for line in lines: | ||
766 | 423 | try: | ||
767 | 424 | matches = dpkg_matcher.search(line).groupdict() | ||
768 | 425 | package_status = matches["package_status"] | ||
769 | 426 | |||
770 | 427 | if not package_status.endswith("i"): | ||
771 | 428 | logger.debug( | ||
772 | 429 | "package '%s' in dpkg output but not installed, status: '%s'", | ||
773 | 430 | package, | ||
774 | 431 | package_status, | ||
775 | 432 | ) | ||
776 | 433 | break | ||
777 | 434 | |||
778 | 435 | epoch, split_version = DebianPackage._get_epoch_from_version(matches["version"]) | ||
779 | 436 | pkg = DebianPackage( | ||
780 | 437 | matches["package_name"], | ||
781 | 438 | split_version, | ||
782 | 439 | epoch, | ||
783 | 440 | matches["arch"], | ||
784 | 441 | PackageState.Present, | ||
785 | 442 | ) | ||
786 | 443 | if (pkg.arch == "all" or pkg.arch == arch) and ( | ||
787 | 444 | version == "" or str(pkg.version) == version | ||
788 | 445 | ): | ||
789 | 446 | return pkg | ||
790 | 447 | except AttributeError: | ||
791 | 448 | logger.warning("dpkg matcher could not parse line: %s", line) | ||
792 | 449 | |||
793 | 450 | # If we didn't find it, fail through | ||
794 | 451 | raise PackageNotFoundError("Package {}.{} is not installed!".format(package, arch)) | ||
795 | 452 | |||
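The `dpkg -l` parser above leans on lazy quantifiers to peel the optional `:arch` suffix off the package name into a throwaway group. A self-contained sketch of the same pattern applied to one line in the shape `dpkg -l` prints (the sample line is illustrative, not real `dpkg` output):

```python
import re

# Same verbose pattern as the library's dpkg_matcher.
DPKG_RE = re.compile(
    r"""
    ^(?P<package_status>\w+?)\s+
    (?P<package_name>.*?)(?P<throwaway_arch>:\w+?)?\s+
    (?P<version>.*?)\s+
    (?P<arch>\w+?)\s+
    (?P<description>.*)
    """,
    re.VERBOSE,
)

# A made-up line in the format that follows dpkg's header lines.
line = "ii  python3:amd64  3.10.6-1~22.04  amd64  interactive high-level language"
fields = DPKG_RE.search(line).groupdict()
```
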
796 | 453 | @classmethod | ||
797 | 454 | def from_apt_cache( | ||
798 | 455 | cls, package: str, version: Optional[str] = "", arch: Optional[str] = "" | ||
799 | 456 | ) -> "DebianPackage": | ||
800 | 457 | """Check whether the package is available in the apt cache and return an instance. | ||
801 | 458 | |||
802 | 459 | Args: | ||
803 | 460 | package: a string representing the package | ||
804 | 461 | version: an optional string if a specific version is requested | ||
805 | 462 | arch: an optional architecture, defaulting to `dpkg --print-architecture`. | ||
806 | 463 | If an architecture is not specified, this will be used for selection. | ||
807 | 464 | """ | ||
808 | 465 | system_arch = check_output( | ||
809 | 466 | ["dpkg", "--print-architecture"], universal_newlines=True | ||
810 | 467 | ).strip() | ||
811 | 468 | arch = arch if arch else system_arch | ||
812 | 469 | |||
813 | 470 | # Regexps are a really terrible way to do this. Thanks dpkg | ||
814 | 471 | keys = ("Package", "Architecture", "Version") | ||
815 | 472 | |||
816 | 473 | try: | ||
817 | 474 | output = check_output( | ||
818 | 475 | ["apt-cache", "show", package], stderr=PIPE, universal_newlines=True | ||
819 | 476 | ) | ||
820 | 477 | except CalledProcessError as e: | ||
821 | 478 | raise PackageError( | ||
822 | 479 | "Could not list packages in apt-cache: {}".format(e.output) | ||
823 | 480 | ) from None | ||
824 | 481 | |||
825 | 482 | pkg_groups = output.strip().split("\n\n") | ||
827 | 484 | |||
828 | 485 | for pkg_raw in pkg_groups: | ||
829 | 486 | lines = str(pkg_raw).splitlines() | ||
830 | 487 | vals = {} | ||
831 | 488 | for line in lines: | ||
832 | 489 | if line.startswith(keys): | ||
833 | 490 | items = line.split(":", 1) | ||
834 | 491 | vals[items[0]] = items[1].strip() | ||
835 | 492 | else: | ||
836 | 493 | continue | ||
837 | 494 | |||
838 | 495 | epoch, split_version = DebianPackage._get_epoch_from_version(vals["Version"]) | ||
839 | 496 | pkg = DebianPackage( | ||
840 | 497 | vals["Package"], | ||
841 | 498 | split_version, | ||
842 | 499 | epoch, | ||
843 | 500 | vals["Architecture"], | ||
844 | 501 | PackageState.Available, | ||
845 | 502 | ) | ||
846 | 503 | |||
847 | 504 | if (pkg.arch == "all" or pkg.arch == arch) and ( | ||
848 | 505 | version == "" or str(pkg.version) == version | ||
849 | 506 | ): | ||
850 | 507 | return pkg | ||
851 | 508 | |||
852 | 509 | # If we didn't find it, fail through | ||
853 | 510 | raise PackageNotFoundError("Package {}.{} is not in the apt cache!".format(package, arch)) | ||
854 | 511 | |||
855 | 512 | |||
856 | 513 | class Version: | ||
857 | 514 | """An abstraction around package versions. | ||
858 | 515 | |||
859 | 516 | This seems like it should be strictly unnecessary, except that `apt_pkg` is not usable inside a | ||
860 | 517 | venv, and wedging version comparisons into `DebianPackage` would overcomplicate it. | ||
861 | 518 | |||
862 | 519 | This class implements the algorithm found here: | ||
863 | 520 | https://www.debian.org/doc/debian-policy/ch-controlfields.html#version | ||
864 | 521 | """ | ||
865 | 522 | |||
866 | 523 | def __init__(self, version: str, epoch: str): | ||
867 | 524 | self._version = version | ||
868 | 525 | self._epoch = epoch or "" | ||
869 | 526 | |||
870 | 527 | def __repr__(self): | ||
871 | 528 | """A representation of the package.""" | ||
872 | 529 | return "<{}.{}: {}>".format(self.__module__, self.__class__.__name__, self.__dict__) | ||
873 | 530 | |||
874 | 531 | def __str__(self): | ||
875 | 532 | """A human-readable representation of the package.""" | ||
876 | 533 | return "{}{}".format("{}:".format(self._epoch) if self._epoch else "", self._version) | ||
877 | 534 | |||
878 | 535 | @property | ||
879 | 536 | def epoch(self): | ||
880 | 537 | """Returns the epoch for a package. May be empty.""" | ||
881 | 538 | return self._epoch | ||
882 | 539 | |||
883 | 540 | @property | ||
884 | 541 | def number(self) -> str: | ||
885 | 542 | """Returns the version number for a package.""" | ||
886 | 543 | return self._version | ||
887 | 544 | |||
888 | 545 | def _get_parts(self, version: str) -> Tuple[str, str]: | ||
889 | 546 | """Separate the version into component upstream and Debian pieces.""" | ||
890 | 547 | try: | ||
891 | 548 | version.rindex("-") | ||
892 | 549 | except ValueError: | ||
893 | 550 | # No hyphens means no Debian version | ||
894 | 551 | return version, "0" | ||
895 | 552 | |||
896 | 553 | upstream, debian = version.rsplit("-", 1) | ||
897 | 554 | return upstream, debian | ||
898 | 555 | |||
899 | 556 | def _listify(self, revision: str) -> List[str]: | ||
900 | 557 | """Split a revision string into a list. | ||
901 | 558 | |||
902 | 559 | This list alternates between strings and numbers, padded so that | ||
903 | 560 | it is always "str, int, str, int..." and always of even length. | ||
904 | 561 | This lets us trivially implement the comparison algorithm | ||
905 | 562 | described in Debian policy. | ||
906 | 563 | """ | ||
907 | 564 | result = [] | ||
908 | 565 | while revision: | ||
909 | 566 | rev_1, remains = self._get_alphas(revision) | ||
910 | 567 | rev_2, remains = self._get_digits(remains) | ||
911 | 568 | result.extend([rev_1, rev_2]) | ||
912 | 569 | revision = remains | ||
913 | 570 | return result | ||
914 | 571 | |||
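The alternating split performed by `_listify`, `_get_alphas`, and `_get_digits` can be sketched compactly with a single regex; this is an illustrative reimplementation with the same behaviour, not the library's code:

```python
import re


def listify(revision: str):
    """Split a revision into alternating [str, int, str, int, ...] parts.

    Digit runs become ints, and a leading digit run is padded with an
    empty string, mirroring the library's _listify behaviour.
    """
    result = []
    while revision:
        # \D* then \d* always consumes at least one character of a
        # non-empty string, so the loop terminates.
        m = re.match(r"(\D*)(\d*)", revision)
        alphas, digits = m.group(1), m.group(2)
        result.extend([alphas, int(digits) if digits else 0])
        revision = revision[m.end():]
    return result
```
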
915 | 572 | def _get_alphas(self, revision: str) -> Tuple[str, str]: | ||
916 | 573 | """Return a tuple of the first non-digit characters of a revision.""" | ||
917 | 574 | # get the index of the first digit | ||
918 | 575 | for i, char in enumerate(revision): | ||
919 | 576 | if char.isdigit(): | ||
920 | 577 | if i == 0: | ||
921 | 578 | return "", revision | ||
922 | 579 | return revision[0:i], revision[i:] | ||
923 | 580 | # string is entirely alphas | ||
924 | 581 | return revision, "" | ||
925 | 582 | |||
926 | 583 | def _get_digits(self, revision: str) -> Tuple[int, str]: | ||
927 | 584 | """Return a tuple of the first integer characters of a revision.""" | ||
928 | 585 | # If the string is empty, return (0,'') | ||
929 | 586 | if not revision: | ||
930 | 587 | return 0, "" | ||
931 | 588 | # get the index of the first non-digit | ||
932 | 589 | for i, char in enumerate(revision): | ||
933 | 590 | if not char.isdigit(): | ||
934 | 591 | if i == 0: | ||
935 | 592 | return 0, revision | ||
936 | 593 | return int(revision[0:i]), revision[i:] | ||
937 | 594 | # string is entirely digits | ||
938 | 595 | return int(revision), "" | ||
939 | 596 | |||
940 | 597 | def _dstringcmp(self, a, b): # noqa: C901 | ||
941 | 598 | """Debian package version string section lexical sort algorithm. | ||
942 | 599 | |||
943 | 600 | The lexical comparison is a comparison of ASCII values modified so | ||
944 | 601 | that all the letters sort earlier than all the non-letters and so that | ||
945 | 602 | a tilde sorts before anything, even the end of a part. | ||
946 | 603 | """ | ||
947 | 604 | if a == b: | ||
948 | 605 | return 0 | ||
949 | 606 | try: | ||
950 | 607 | for i, char in enumerate(a): | ||
951 | 608 | if char == b[i]: | ||
952 | 609 | continue | ||
953 | 610 | # "a tilde sorts before anything, even the end of a part" | ||
954 | 611 | # (emptiness) | ||
955 | 612 | if char == "~": | ||
956 | 613 | return -1 | ||
957 | 614 | if b[i] == "~": | ||
958 | 615 | return 1 | ||
959 | 616 | # "all the letters sort earlier than all the non-letters" | ||
960 | 617 | if char.isalpha() and not b[i].isalpha(): | ||
961 | 618 | return -1 | ||
962 | 619 | if not char.isalpha() and b[i].isalpha(): | ||
963 | 620 | return 1 | ||
964 | 621 | # otherwise lexical sort | ||
965 | 622 | if ord(char) > ord(b[i]): | ||
966 | 623 | return 1 | ||
967 | 624 | if ord(char) < ord(b[i]): | ||
968 | 625 | return -1 | ||
969 | 626 | except IndexError: | ||
970 | 627 | # a is longer than b but otherwise equal, greater unless there are tildes | ||
971 | 628 | if char == "~": | ||
972 | 629 | return -1 | ||
973 | 630 | return 1 | ||
974 | 631 | # if we get here, a is shorter than b but otherwise equal, so check for tildes... | ||
975 | 632 | if b[len(a)] == "~": | ||
976 | 633 | return 1 | ||
977 | 634 | return -1 | ||
978 | 635 | |||
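The tilde and letters-before-non-letters rules above are easiest to confirm with a few concrete comparisons. A standalone copy of the comparison loop (same algorithm as `_dstringcmp`, lifted out of the class purely for illustration) behaves like this:

```python
def dstringcmp(a: str, b: str) -> int:
    """Debian lexical sort: '~' before everything, letters before non-letters."""
    if a == b:
        return 0
    try:
        for i, char in enumerate(a):
            if char == b[i]:
                continue
            if char == "~":
                return -1
            if b[i] == "~":
                return 1
            if char.isalpha() and not b[i].isalpha():
                return -1
            if not char.isalpha() and b[i].isalpha():
                return 1
            return 1 if ord(char) > ord(b[i]) else -1
    except IndexError:
        # a is longer than b but otherwise equal: greater, unless the
        # extra part starts with a tilde
        return -1 if char == "~" else 1
    # a is shorter than b but otherwise equal: lesser, unless b continues
    # with a tilde
    return 1 if b[len(a)] == "~" else -1
```
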
979 | 636 | def _compare_revision_strings(self, first: str, second: str): # noqa: C901 | ||
980 | 637 | """Compare two debian revision strings.""" | ||
981 | 638 | if first == second: | ||
982 | 639 | return 0 | ||
983 | 640 | |||
984 | 641 | # listify pads results so that we will always be comparing ints to ints | ||
985 | 642 | # and strings to strings (at least until we fall off the end of a list) | ||
986 | 643 | first_list = self._listify(first) | ||
987 | 644 | second_list = self._listify(second) | ||
988 | 645 | if first_list == second_list: | ||
989 | 646 | return 0 | ||
990 | 647 | try: | ||
991 | 648 | for i, item in enumerate(first_list): | ||
992 | 649 | # explicitly raise IndexError if we've fallen off the edge of list2 | ||
993 | 650 | if i >= len(second_list): | ||
994 | 651 | raise IndexError | ||
995 | 652 | # if the items are equal, next | ||
996 | 653 | if item == second_list[i]: | ||
997 | 654 | continue | ||
998 | 655 | # numeric comparison | ||
999 | 656 | if isinstance(item, int): | ||
1000 | 657 | if item > second_list[i]: | ||
1001 | 658 | return 1 | ||
1002 | 659 | if item < second_list[i]: | ||
1003 | 660 | return -1 | ||
1004 | 661 | else: | ||
1005 | 662 | # string comparison | ||
1006 | 663 | return self._dstringcmp(item, second_list[i]) | ||
1007 | 664 | except IndexError: | ||
1008 | 665 | # rev1 is longer than rev2 but otherwise equal, hence greater | ||
1009 | 666 | # ...except for goddamn tildes | ||
1010 | 667 | if first_list[len(second_list)][0][0] == "~": | ||
1011 | 668 | return -1 | ||
1012 | 669 | return 1 | ||
1013 | 670 | # rev1 is shorter than rev2 but otherwise equal, hence lesser | ||
1014 | 671 | # ...except for goddamn tildes | ||
1015 | 672 | if second_list[len(first_list)][0][0] == "~": | ||
1016 | 673 | return 1 | ||
1017 | 674 | return -1 | ||
1018 | 675 | |||
1019 | 676 | def _compare_version(self, other) -> int: | ||
1020 | 677 | if (self.number, self.epoch) == (other.number, other.epoch): | ||
1021 | 678 | return 0 | ||
1022 | 679 | |||
1023 | 680 | if self.epoch < other.epoch: | ||
1024 | 681 | return -1 | ||
1025 | 682 | if self.epoch > other.epoch: | ||
1026 | 683 | return 1 | ||
1027 | 684 | |||
1028 | 685 | # If none of these are true, follow the algorithm | ||
1029 | 686 | upstream_version, debian_version = self._get_parts(self.number) | ||
1030 | 687 | other_upstream_version, other_debian_version = self._get_parts(other.number) | ||
1031 | 688 | |||
1032 | 689 | upstream_cmp = self._compare_revision_strings(upstream_version, other_upstream_version) | ||
1033 | 690 | if upstream_cmp != 0: | ||
1034 | 691 | return upstream_cmp | ||
1035 | 692 | |||
1036 | 693 | debian_cmp = self._compare_revision_strings(debian_version, other_debian_version) | ||
1037 | 694 | if debian_cmp != 0: | ||
1038 | 695 | return debian_cmp | ||
1039 | 696 | |||
1040 | 697 | return 0 | ||
1041 | 698 | |||
1042 | 699 | def __lt__(self, other) -> bool: | ||
1043 | 700 | """Less than magic method impl.""" | ||
1044 | 701 | return self._compare_version(other) < 0 | ||
1045 | 702 | |||
1046 | 703 | def __eq__(self, other) -> bool: | ||
1047 | 704 | """Equality magic method impl.""" | ||
1048 | 705 | return self._compare_version(other) == 0 | ||
1049 | 706 | |||
1050 | 707 | def __gt__(self, other) -> bool: | ||
1051 | 708 | """Greater than magic method impl.""" | ||
1052 | 709 | return self._compare_version(other) > 0 | ||
1053 | 710 | |||
1054 | 711 | def __le__(self, other) -> bool: | ||
1055 | 712 | """Less than or equal to magic method impl.""" | ||
1056 | 713 | return self.__eq__(other) or self.__lt__(other) | ||
1057 | 714 | |||
1058 | 715 | def __ge__(self, other) -> bool: | ||
1059 | 716 | """Greater than or equal to magic method impl.""" | ||
1060 | 717 | return self.__gt__(other) or self.__eq__(other) | ||
1061 | 718 | |||
1062 | 719 | def __ne__(self, other) -> bool: | ||
1063 | 720 | """Not equal to magic method impl.""" | ||
1064 | 721 | return not self.__eq__(other) | ||
1065 | 722 | |||
1066 | 723 | |||
1067 | 724 | def add_package( | ||
1068 | 725 | package_names: Union[str, List[str]], | ||
1069 | 726 | version: Optional[str] = "", | ||
1070 | 727 | arch: Optional[str] = "", | ||
1071 | 728 | update_cache: Optional[bool] = False, | ||
1072 | 729 | ) -> Union[DebianPackage, List[DebianPackage]]: | ||
1073 | 730 | """Add a package or list of packages to the system. | ||
1074 | 731 | |||
1075 | 732 | Args: | ||
1076 | 733 | package_names: the name(s) of the package(s) | ||
1077 | 734 | version: an (Optional) version as a string. Defaults to the latest known | ||
1078 | 735 | arch: an optional architecture for the package | ||
1079 | 736 | update_cache: whether or not to run `apt-get update` prior to operating | ||
1080 | 737 | |||
1081 | 738 | Raises: | ||
1082 | 739 | PackageNotFoundError if the package is not in the cache. | ||
1083 | 740 | """ | ||
1084 | 741 | cache_refreshed = False | ||
1085 | 742 | if update_cache: | ||
1086 | 743 | update() | ||
1087 | 744 | cache_refreshed = True | ||
1088 | 745 | |||
1089 | 746 | packages = {"success": [], "retry": [], "failed": []} | ||
1090 | 747 | |||
1091 | 748 | package_names = [package_names] if type(package_names) is str else package_names | ||
1092 | 749 | if not package_names: | ||
1093 | 750 | raise TypeError("Expected at least one package name to add, received zero!") | ||
1094 | 751 | |||
1095 | 752 | if len(package_names) != 1 and version: | ||
1096 | 753 | raise TypeError( | ||
1097 | 754 | "Explicit version should not be set if more than one package is being added!" | ||
1098 | 755 | ) | ||
1099 | 756 | |||
1100 | 757 | for p in package_names: | ||
1101 | 758 | pkg, success = _add(p, version, arch) | ||
1102 | 759 | if success: | ||
1103 | 760 | packages["success"].append(pkg) | ||
1104 | 761 | else: | ||
1105 | 762 | logger.warning("failed to locate and install/update '%s'", pkg) | ||
1106 | 763 | packages["retry"].append(p) | ||
1107 | 764 | |||
1108 | 765 | if packages["retry"] and not cache_refreshed: | ||
1109 | 766 | logger.info("updating the apt-cache and retrying installation of failed packages.") | ||
1110 | 767 | update() | ||
1111 | 768 | |||
1112 | 769 | for p in packages["retry"]: | ||
1113 | 770 | pkg, success = _add(p, version, arch) | ||
1114 | 771 | if success: | ||
1115 | 772 | packages["success"].append(pkg) | ||
1116 | 773 | else: | ||
1117 | 774 | packages["failed"].append(p) | ||
1118 | 775 | |||
1119 | 776 | if packages["failed"]: | ||
1120 | 777 | raise PackageError("Failed to install packages: {}".format(", ".join(packages["failed"]))) | ||
1121 | 778 | |||
1122 | 779 | return packages["success"] if len(packages["success"]) > 1 else packages["success"][0] | ||
1123 | 780 | |||
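The success/retry/failed flow in `add_package` is a common pattern: attempt everything, refresh the cache once (unless it was already refreshed up front), then retry only the failures. A generic sketch of that flow, using hypothetical helper callables rather than the library's `_add` and `update`:

```python
def install_with_one_retry(names, try_install, refresh_cache):
    """Try each name; refresh the cache once and retry any failures."""
    success, retry, failed = [], [], []
    for name in names:
        (success if try_install(name) else retry).append(name)
    if retry:
        refresh_cache()  # analogous to `apt-get update`
        for name in retry:
            (success if try_install(name) else failed).append(name)
    return success, failed
```
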
1124 | 781 | |||
1125 | 782 | def _add( | ||
1126 | 783 | name: str, | ||
1127 | 784 | version: Optional[str] = "", | ||
1128 | 785 | arch: Optional[str] = "", | ||
1129 | 786 | ) -> Tuple[Union[DebianPackage, str], bool]: | ||
1130 | 787 | """Adds a package. | ||
1131 | 788 | |||
1132 | 789 | Args: | ||
1133 | 790 | name: the name of the package | ||
1134 | 791 | version: an (Optional) version as a string. Defaults to the latest known | ||
1135 | 792 | arch: an optional architecture for the package | ||
1136 | 793 | |||
1137 | 794 | Returns: a tuple of the `DebianPackage` if found (or the package name | ||
1138 | 795 | as a str if not) and a boolean indicating success | ||
1139 | 796 | """ | ||
1140 | 797 | try: | ||
1141 | 798 | pkg = DebianPackage.from_system(name, version, arch) | ||
1142 | 799 | pkg.ensure(state=PackageState.Present) | ||
1143 | 800 | return pkg, True | ||
1144 | 801 | except PackageNotFoundError: | ||
1145 | 802 | return name, False | ||
1146 | 803 | |||
1147 | 804 | |||
1148 | 805 | def remove_package( | ||
1149 | 806 | package_names: Union[str, List[str]] | ||
1150 | 807 | ) -> Union[DebianPackage, List[DebianPackage]]: | ||
1151 | 808 | """Remove a package or list of packages from the system. | ||
1152 | 809 | |||
1153 | 810 | Args: | ||
1154 | 811 | package_names: the name of a package | ||
1155 | 812 | |||
1156 | 813 | Raises: | ||
1157 | 814 | PackageNotFoundError if the package is not found. | ||
1158 | 815 | """ | ||
1159 | 816 | packages = [] | ||
1160 | 817 | |||
1161 | 818 | package_names = [package_names] if type(package_names) is str else package_names | ||
1162 | 819 | if not package_names: | ||
1163 | 820 | raise TypeError("Expected at least one package name to remove, received zero!") | ||
1164 | 821 | |||
1165 | 822 | for p in package_names: | ||
1166 | 823 | try: | ||
1167 | 824 | pkg = DebianPackage.from_installed_package(p) | ||
1168 | 825 | pkg.ensure(state=PackageState.Absent) | ||
1169 | 826 | packages.append(pkg) | ||
1170 | 827 | except PackageNotFoundError: | ||
1171 | 828 | logger.info("package '%s' was requested for removal, but it was not installed.", p) | ||
1172 | 829 | |||
1173 | 830 | # the list of packages will be empty when no package is removed | ||
1174 | 831 | logger.debug("packages: '%s'", packages) | ||
1175 | 832 | return packages[0] if len(packages) == 1 else packages | ||
1176 | 833 | |||
1177 | 834 | |||
1178 | 835 | def update() -> None: | ||
1179 | 836 | """Updates the apt cache via `apt-get update`.""" | ||
1180 | 837 | check_call(["apt-get", "update"], stderr=PIPE, stdout=PIPE) | ||
1181 | 838 | |||
1182 | 839 | |||
1183 | 840 | class InvalidSourceError(Error): | ||
1184 | 841 | """Exceptions for invalid source entries.""" | ||
1185 | 842 | |||
1186 | 843 | |||
1187 | 844 | class GPGKeyError(Error): | ||
1188 | 845 | """Exceptions for GPG keys.""" | ||
1189 | 846 | |||
1190 | 847 | |||
1191 | 848 | class DebianRepository: | ||
1192 | 849 | """An abstraction to represent a repository.""" | ||
1193 | 850 | |||
1194 | 851 | def __init__( | ||
1195 | 852 | self, | ||
1196 | 853 | enabled: bool, | ||
1197 | 854 | repotype: str, | ||
1198 | 855 | uri: str, | ||
1199 | 856 | release: str, | ||
1200 | 857 | groups: List[str], | ||
1201 | 858 | filename: Optional[str] = "", | ||
1202 | 859 | gpg_key_filename: Optional[str] = "", | ||
1203 | 860 | options: Optional[dict] = None, | ||
1204 | 861 | ): | ||
1205 | 862 | self._enabled = enabled | ||
1206 | 863 | self._repotype = repotype | ||
1207 | 864 | self._uri = uri | ||
1208 | 865 | self._release = release | ||
1209 | 866 | self._groups = groups | ||
1210 | 867 | self._filename = filename | ||
1211 | 868 | self._gpg_key_filename = gpg_key_filename | ||
1212 | 869 | self._options = options | ||
1213 | 870 | |||
1214 | 871 | @property | ||
1215 | 872 | def enabled(self): | ||
1216 | 873 | """Return whether or not the repository is enabled.""" | ||
1217 | 874 | return self._enabled | ||
1218 | 875 | |||
1219 | 876 | @property | ||
1220 | 877 | def repotype(self): | ||
1221 | 878 | """Return whether it is binary or source.""" | ||
1222 | 879 | return self._repotype | ||
1223 | 880 | |||
1224 | 881 | @property | ||
1225 | 882 | def uri(self): | ||
1226 | 883 | """Return the URI.""" | ||
1227 | 884 | return self._uri | ||
1228 | 885 | |||
1229 | 886 | @property | ||
1230 | 887 | def release(self): | ||
1231 | 888 | """Return which Debian/Ubuntu releases it is valid for.""" | ||
1232 | 889 | return self._release | ||
1233 | 890 | |||
1234 | 891 | @property | ||
1235 | 892 | def groups(self): | ||
1236 | 893 | """Return the enabled package groups.""" | ||
1237 | 894 | return self._groups | ||
1238 | 895 | |||
1239 | 896 | @property | ||
1240 | 897 | def filename(self): | ||
1241 | 898 | """Returns the filename for a repository.""" | ||
1242 | 899 | return self._filename | ||
1243 | 900 | |||
1244 | 901 | @filename.setter | ||
1245 | 902 | def filename(self, fname: str) -> None: | ||
1246 | 903 | """Sets the filename used when a repo is written back to disk. | ||
1247 | 904 | |||
1248 | 905 | Args: | ||
1249 | 906 | fname: a filename to write the repository information to. | ||
1250 | 907 | """ | ||
1251 | 908 | if not fname.endswith(".list"): | ||
1252 | 909 | raise InvalidSourceError("apt source filenames should end in .list!") | ||
1253 | 910 | |||
1254 | 911 | self._filename = fname | ||
1255 | 912 | |||
1256 | 913 | @property | ||
1257 | 914 | def gpg_key(self): | ||
1258 | 915 | """Returns the path to the GPG key for this repository.""" | ||
1259 | 916 | return self._gpg_key_filename | ||
1260 | 917 | |||
1261 | 918 | @property | ||
1262 | 919 | def options(self): | ||
1263 | 920 | """Returns any additional repo options which are set.""" | ||
1264 | 921 | return self._options | ||
1265 | 922 | |||
1266 | 923 | def make_options_string(self) -> str: | ||
1267 | 924 | """Generate the complete options string for a repository. | ||
1268 | 925 | |||
1269 | 926 | Combines `gpg_key`, if set, with the rest of the options to form | ||
1270 | 927 | a complete repo options string. | ||
1271 | 928 | """ | ||
1272 | 929 | options = self._options if self._options else {} | ||
1273 | 930 | if self._gpg_key_filename: | ||
1274 | 931 | options["signed-by"] = self._gpg_key_filename | ||
1275 | 932 | |||
1276 | 933 | return ( | ||
1277 | 934 | "[{}] ".format(" ".join(["{}={}".format(k, v) for k, v in options.items()])) | ||
1278 | 935 | if options | ||
1279 | 936 | else "" | ||
1280 | 937 | ) | ||
1281 | 938 | |||
1282 | 939 | @staticmethod | ||
1283 | 940 | def prefix_from_uri(uri: str) -> str: | ||
1284 | 941 | """Get a repo list prefix from the uri, depending on whether a path is set.""" | ||
1285 | 942 | uridetails = urlparse(uri) | ||
1286 | 943 | path = ( | ||
1287 | 944 | uridetails.path.lstrip("/").replace("/", "-") if uridetails.path else uridetails.netloc | ||
1288 | 945 | ) | ||
1289 | 946 | return "/etc/apt/sources.list.d/{}".format(path) | ||
1290 | 947 | |||
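The prefix derivation above is easiest to see with a concrete URI (the PPA-style URI below is only an example). A mirror of the static method, standalone:

```python
from urllib.parse import urlparse


def prefix_from_uri(uri: str) -> str:
    """Derive a sources.list.d prefix from a URI, as the library does."""
    uridetails = urlparse(uri)
    # Use the path (slashes become dashes) when present, else the host.
    path = (
        uridetails.path.lstrip("/").replace("/", "-")
        if uridetails.path
        else uridetails.netloc
    )
    return "/etc/apt/sources.list.d/{}".format(path)
```
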
1291 | 948 | @staticmethod | ||
1292 | 949 | def from_repo_line(repo_line: str, write_file: Optional[bool] = True) -> "DebianRepository": | ||
1293 | 950 | """Instantiate a new `DebianRepository` from a `sources.list` entry line. | ||
1294 | 951 | |||
1295 | 952 | Args: | ||
1296 | 953 | repo_line: a string representing a repository entry | ||
1297 | 954 | write_file: boolean to enable writing the new repo to disk | ||
1298 | 955 | """ | ||
1299 | 956 | repo = RepositoryMapping._parse(repo_line, "UserInput") | ||
1300 | 957 | fname = "{}-{}.list".format( | ||
1301 | 958 | DebianRepository.prefix_from_uri(repo.uri), repo.release.replace("/", "-") | ||
1302 | 959 | ) | ||
1303 | 960 | repo.filename = fname | ||
1304 | 961 | |||
1305 | 962 | options = repo.options if repo.options else {} | ||
1306 | 963 | if repo.gpg_key: | ||
1307 | 964 | options["signed-by"] = repo.gpg_key | ||
1308 | 965 | |||
1309 | 966 | # On Python 3.5 the options dict must be iterated in sorted order so that | ||
1310 | 967 | # the option order is stable between executions. | ||
1311 | 968 | options_str = ( | ||
1312 | 969 | "[{}] ".format(" ".join(["{}={}".format(k, v) for k, v in sorted(options.items())])) | ||
1313 | 970 | if options | ||
1314 | 971 | else "" | ||
1315 | 972 | ) | ||
1316 | 973 | |||
1317 | 974 | if write_file: | ||
1318 | 975 | with open(fname, "wb") as f: | ||
1319 | 976 | f.write( | ||
1320 | 977 | ( | ||
1321 | 978 | "{}".format("#" if not repo.enabled else "") | ||
1322 | 979 | + "{} {}{} ".format(repo.repotype, options_str, repo.uri) | ||
1323 | 980 | + "{} {}\n".format(repo.release, " ".join(repo.groups)) | ||
1324 | 981 | ).encode("utf-8") | ||
1325 | 982 | ) | ||
1326 | 983 | |||
1327 | 984 | return repo | ||
1328 | 985 | |||
1329 | 986 | def disable(self) -> None: | ||
1330 | 987 | """Remove this repository from consideration. | ||
1331 | 988 | |||
1332 | 989 | Disable it instead of removing it from the repository file. | ||
1333 | 990 | """ | ||
1334 | 991 | searcher = "{} {}{} {}".format( | ||
1335 | 992 | self.repotype, self.make_options_string(), self.uri, self.release | ||
1336 | 993 | ) | ||
1337 | 994 | for line in fileinput.input(self._filename, inplace=True): | ||
1338 | 995 | if re.match(r"^{}\s".format(re.escape(searcher)), line): | ||
1339 | 996 | print("# {}".format(line), end="") | ||
1340 | 997 | else: | ||
1341 | 998 | print(line, end="") | ||
1342 | 999 | |||
1343 | 1000 | def import_key(self, key: str) -> None: | ||
1344 | 1001 | """Import an ASCII Armor key. | ||
1345 | 1002 | |||
1346 | 1003 | A Radix64 format keyid is also supported for backwards | ||
1347 | 1004 | compatibility. In this case Ubuntu keyserver will be | ||
1348 | 1005 | queried for a key via HTTPS by its keyid. This method | ||
1349 | 1006 | is less preferable because HTTPS proxy servers may | ||
1350 | 1007 | require traffic decryption which is equivalent to a | ||
1351 | 1008 | man-in-the-middle attack (a proxy server impersonates | ||
1352 | 1009 | keyserver TLS certificates and has to be explicitly | ||
1353 | 1010 | trusted by the system). | ||
1354 | 1011 | |||
1355 | 1012 | Args: | ||
1356 | 1013 | key: A GPG key in ASCII armor format, | ||
1357 | 1014 | including BEGIN and END markers or a keyid. | ||
1358 | 1015 | |||
1359 | 1016 | Raises: | ||
1360 | 1017 | GPGKeyError if the key could not be imported | ||
1361 | 1018 | """ | ||
1362 | 1019 | key = key.strip() | ||
1363 | 1020 | if "-" in key or "\n" in key: | ||
1364 | 1021 | # Send everything not obviously a keyid to GPG to import, as | ||
1365 | 1022 | # we trust its validation better than our own, e.g. handling | ||
1366 | 1023 | # comments before the key. | ||
1367 | 1024 | logger.debug("PGP key found (looks like ASCII Armor format)") | ||
1368 | 1025 | if ( | ||
1369 | 1026 | "-----BEGIN PGP PUBLIC KEY BLOCK-----" in key | ||
1370 | 1027 | and "-----END PGP PUBLIC KEY BLOCK-----" in key | ||
1371 | 1028 | ): | ||
1372 | 1029 | logger.debug("Writing provided PGP key in the binary format") | ||
1373 | 1030 | key_bytes = key.encode("utf-8") | ||
1374 | 1031 | key_name = self._get_keyid_by_gpg_key(key_bytes) | ||
1375 | 1032 | key_gpg = self._dearmor_gpg_key(key_bytes) | ||
1376 | 1033 | self._gpg_key_filename = "/etc/apt/trusted.gpg.d/{}.gpg".format(key_name) | ||
1377 | 1034 | self._write_apt_gpg_keyfile(key_name=self._gpg_key_filename, key_material=key_gpg) | ||
1378 | 1035 | else: | ||
1379 | 1036 | raise GPGKeyError("ASCII armor markers missing from GPG key") | ||
1380 | 1037 | else: | ||
1381 | 1038 | logger.warning( | ||
1382 | 1039 | "PGP key found (looks like Radix64 format). " | ||
1383 | 1040 | "SECURELY importing PGP key from keyserver; " | ||
1384 | 1041 | "full key not provided." | ||
1385 | 1042 | ) | ||
1386 | 1043 | # as of bionic add-apt-repository uses curl with an HTTPS keyserver URL | ||
1387 | 1044 | # to retrieve GPG keys. `apt-key adv` command is deprecated as is | ||
1388 | 1045 | # apt-key in general as noted in its manpage. See lp:1433761 for more | ||
1389 | 1046 | # history. Instead, /etc/apt/trusted.gpg.d is used directly to drop | ||
1390 | 1047 | # gpg | ||
1391 | 1048 | key_asc = self._get_key_by_keyid(key) | ||
1392 | 1049 | # write the key in GPG format so that apt-key list shows it | ||
1393 | 1050 | key_gpg = self._dearmor_gpg_key(key_asc.encode("utf-8")) | ||
1394 | 1051 | self._gpg_key_filename = "/etc/apt/trusted.gpg.d/{}.gpg".format(key) | ||
1395 | 1052 | self._write_apt_gpg_keyfile(key_name=key, key_material=key_gpg) | ||
1396 | 1053 | |||
1397 | 1054 | @staticmethod | ||
1398 | 1055 | def _get_keyid_by_gpg_key(key_material: bytes) -> str: | ||
1399 | 1056 | """Get a GPG key fingerprint by GPG key material. | ||
1400 | 1057 | |||
1401 | 1058 | Gets a GPG key fingerprint (40-digit, 160-bit) by the ASCII armor-encoded | ||
1402 | 1059 | or binary GPG key material. Can be used, for example, to generate file | ||
1403 | 1060 | names for keys passed via charm options. | ||
1404 | 1061 | """ | ||
1405 | 1062 | # Use the same gpg command for both Xenial and Bionic | ||
1406 | 1063 | cmd = ["gpg", "--with-colons", "--with-fingerprint"] | ||
1407 | 1064 | ps = subprocess.run( | ||
1408 | 1065 | cmd, | ||
1409 | 1066 | stdout=PIPE, | ||
1410 | 1067 | stderr=PIPE, | ||
1411 | 1068 | input=key_material, | ||
1412 | 1069 | ) | ||
1413 | 1070 | out, err = ps.stdout.decode(), ps.stderr.decode() | ||
1414 | 1071 | if "gpg: no valid OpenPGP data found." in err: | ||
1415 | 1072 | raise GPGKeyError("Invalid GPG key material provided") | ||
1416 | 1073 | # from gnupg2 docs: fpr :: Fingerprint (fingerprint is in field 10) | ||
1417 | 1074 | return re.search(r"^fpr:{9}([0-9A-F]{40}):$", out, re.MULTILINE).group(1) | ||
1418 | 1075 | |||
1419 | 1076 | @staticmethod | ||
1420 | 1077 | def _get_key_by_keyid(keyid: str) -> str: | ||
1421 | 1078 | """Get a key via HTTPS from the Ubuntu keyserver. | ||
1422 | 1079 | |||
1423 | 1080 | Different key ID formats are supported by SKS keyservers (the longer ones | ||
1424 | 1081 | are more secure, see "dead beef attack" and https://evil32.com/). Since | ||
1425 | 1082 | HTTPS is used, if SSLBump-like HTTPS proxies are in place, they will | ||
1426 | 1083 | impersonate keyserver.ubuntu.com and generate a certificate with | ||
1427 | 1084 | keyserver.ubuntu.com in the CN field or in SubjAltName fields of a | ||
1428 | 1085 | certificate. If such proxy behavior is expected it is necessary to add the | ||
1429 | 1086 | CA certificate chain containing the intermediate CA of the SSLBump proxy to | ||
1430 | 1087 | every machine that this code runs on via ca-certs cloud-init directive (via | ||
1431 | 1088 | cloudinit-userdata model-config) or via other means (such as through a | ||
1432 | 1089 | custom charm option). Also note that DNS resolution for the hostname in a | ||
1433 | 1090 | URL is done at a proxy server - not at the client side. | ||
1434 | 1091 | 8-digit (32 bit) key ID | ||
1435 | 1092 | https://keyserver.ubuntu.com/pks/lookup?search=0x4652B4E6 | ||
1436 | 1093 | 16-digit (64 bit) key ID | ||
1437 | 1094 | https://keyserver.ubuntu.com/pks/lookup?search=0x6E85A86E4652B4E6 | ||
1438 | 1095 | 40-digit key ID: | ||
1439 | 1096 | https://keyserver.ubuntu.com/pks/lookup?search=0x35F77D63B5CEC106C577ED856E85A86E4652B4E6 | ||
1440 | 1097 | |||
1441 | 1098 | Args: | ||
1442 | 1099 | keyid: An 8, 16 or 40 hex digit keyid to find a key for | ||
1443 | 1100 | |||
1444 | 1101 | Returns: | ||
1445 | 1102 | A string containing key material for the specified GPG key id | ||
1446 | 1103 | |||
1447 | 1104 | |||
1448 | 1105 | Raises: | ||
1449 | 1106 | subprocess.CalledProcessError | ||
1450 | 1107 | """ | ||
1451 | 1108 | # options=mr - machine-readable output (disables html wrappers) | ||
1452 | 1109 | keyserver_url = ( | ||
1453 | 1110 | "https://keyserver.ubuntu.com" "/pks/lookup?op=get&options=mr&exact=on&search=0x{}" | ||
1454 | 1111 | ) | ||
1455 | 1112 | curl_cmd = ["curl", keyserver_url.format(keyid)] | ||
1456 | 1113 | # use proxy server settings in order to retrieve the key | ||
1457 | 1114 | return check_output(curl_cmd).decode() | ||
1458 | 1115 | |||
1459 | 1116 | @staticmethod | ||
1460 | 1117 | def _dearmor_gpg_key(key_asc: bytes) -> bytes: | ||
1461 | 1118 | """Converts a GPG key in the ASCII armor format to the binary format. | ||
1462 | 1119 | |||
1463 | 1120 | Args: | ||
1464 | 1121 | key_asc: A GPG key in ASCII armor format. | ||
1465 | 1122 | |||
1466 | 1123 | Returns: | ||
1467 | 1124 | A GPG key in binary format as bytes | ||
1468 | 1125 | |||
1469 | 1126 | Raises: | ||
1470 | 1127 | GPGKeyError | ||
1471 | 1128 | """ | ||
1472 | 1129 | ps = subprocess.run(["gpg", "--dearmor"], stdout=PIPE, stderr=PIPE, input=key_asc) | ||
1473 | 1130 | out, err = ps.stdout, ps.stderr.decode() | ||
1474 | 1131 | if "gpg: no valid OpenPGP data found." in err: | ||
1475 | 1132 | raise GPGKeyError( | ||
1476 | 1133 | "Invalid GPG key material. Check your network setup" | ||
1477 | 1134 | " (MTU, routing, DNS) and/or proxy server settings" | ||
1478 | 1135 | " as well as destination keyserver status." | ||
1479 | 1136 | ) | ||
1480 | 1137 | else: | ||
1481 | 1138 | return out | ||
1482 | 1139 | |||
1483 | 1140 | @staticmethod | ||
1484 | 1141 | def _write_apt_gpg_keyfile(key_name: str, key_material: bytes) -> None: | ||
1485 | 1142 | """Writes GPG key material into a file at a provided path. | ||
1486 | 1143 | |||
1487 | 1144 | Args: | ||
1488 | 1145 | key_name: A key name to use for a key file (could be a fingerprint) | ||
1489 | 1146 | key_material: A GPG key material (binary) | ||
1490 | 1147 | """ | ||
1491 | 1148 | with open(key_name, "wb") as keyf: | ||
1492 | 1149 | keyf.write(key_material) | ||
1493 | 1150 | |||
1494 | 1151 | |||
1495 | 1152 | class RepositoryMapping(Mapping): | ||
1496 | 1153 | """A representation of known repositories. | ||
1497 | 1154 | |||
1498 | 1155 | Instantiation of `RepositoryMapping` will iterate through the | ||
1499 | 1156 | filesystem, parse out repository files in `/etc/apt/...`, and create | ||
1500 | 1157 | `DebianRepository` objects in this list. | ||
1501 | 1158 | |||
1502 | 1159 | Typical usage: | ||
1503 | 1160 | |||
1504 | 1161 | repositories = apt.RepositoryMapping() | ||
1505 | 1162 | repositories.add(DebianRepository( | ||
1506 | 1163 | enabled=True, repotype="deb", uri="https://example.com", release="focal", | ||
1507 | 1164 | groups=["universe"] | ||
1508 | 1165 | )) | ||
1509 | 1166 | """ | ||
1510 | 1167 | |||
1511 | 1168 | def __init__(self): | ||
1512 | 1169 | self._repository_map = {} | ||
1513 | 1170 | # Repositories that we're adding -- used to implement mode param | ||
1514 | 1171 | self.default_file = "/etc/apt/sources.list" | ||
1515 | 1172 | |||
1516 | 1173 | # read sources.list if it exists | ||
1517 | 1174 | if os.path.isfile(self.default_file): | ||
1518 | 1175 | self.load(self.default_file) | ||
1519 | 1176 | |||
1520 | 1177 | # read sources.list.d | ||
1521 | 1178 | for file in glob.iglob("/etc/apt/sources.list.d/*.list"): | ||
1522 | 1179 | self.load(file) | ||
1523 | 1180 | |||
1524 | 1181 | def __contains__(self, key: str) -> bool: | ||
1525 | 1182 | """Magic method for checking presence of repo in mapping.""" | ||
1526 | 1183 | return key in self._repository_map | ||
1527 | 1184 | |||
1528 | 1185 | def __len__(self) -> int: | ||
1529 | 1186 | """Return number of repositories in map.""" | ||
1530 | 1187 | return len(self._repository_map) | ||
1531 | 1188 | |||
1532 | 1189 | def __iter__(self) -> Iterable[DebianRepository]: | ||
1533 | 1190 | """Iterator magic method for RepositoryMapping.""" | ||
1534 | 1191 | return iter(self._repository_map.values()) | ||
1535 | 1192 | |||
1536 | 1193 | def __getitem__(self, repository_uri: str) -> DebianRepository: | ||
1537 | 1194 | """Return a given `DebianRepository`.""" | ||
1538 | 1195 | return self._repository_map[repository_uri] | ||
1539 | 1196 | |||
1540 | 1197 | def __setitem__(self, repository_uri: str, repository: DebianRepository) -> None: | ||
1541 | 1198 | """Add a `DebianRepository` to the cache.""" | ||
1542 | 1199 | self._repository_map[repository_uri] = repository | ||
1543 | 1200 | |||
1544 | 1201 | def load(self, filename: str): | ||
1545 | 1202 | """Load a repository source file into the cache. | ||
1546 | 1203 | |||
1547 | 1204 | Args: | ||
1548 | 1205 | filename: the path to the repository file | ||
1549 | 1206 | """ | ||
1550 | 1207 | parsed = [] | ||
1551 | 1208 | skipped = [] | ||
1552 | 1209 | with open(filename, "r") as f: | ||
1553 | 1210 | for n, line in enumerate(f): | ||
1554 | 1211 | try: | ||
1555 | 1212 | repo = self._parse(line, filename) | ||
1556 | 1213 | except InvalidSourceError: | ||
1557 | 1214 | skipped.append(n) | ||
1558 | 1215 | else: | ||
1559 | 1216 | repo_identifier = "{}-{}-{}".format(repo.repotype, repo.uri, repo.release) | ||
1560 | 1217 | self._repository_map[repo_identifier] = repo | ||
1561 | 1218 | parsed.append(n) | ||
1562 | 1219 | logger.debug("parsed repo: '%s'", repo_identifier) | ||
1563 | 1220 | |||
1564 | 1221 | if skipped: | ||
1565 | 1222 | skip_list = ", ".join(str(s) for s in skipped) | ||
1566 | 1223 | logger.debug("skipped the following lines in file '%s': %s", filename, skip_list) | ||
1567 | 1224 | |||
1568 | 1225 | if parsed: | ||
1569 | 1226 | logger.info("parsed %d apt package repositories", len(parsed)) | ||
1570 | 1227 | else: | ||
1571 | 1228 | raise InvalidSourceError("all repository lines in '{}' were invalid!".format(filename)) | ||
1572 | 1229 | |||
1573 | 1230 | @staticmethod | ||
1574 | 1231 | def _parse(line: str, filename: str) -> DebianRepository: | ||
1575 | 1232 | """Parse a line in a sources.list file. | ||
1576 | 1233 | |||
1577 | 1234 | Args: | ||
1578 | 1235 | line: a single line from `load` to parse | ||
1579 | 1236 | filename: the filename being read | ||
1580 | 1237 | |||
1581 | 1238 | Raises: | ||
1582 | 1239 | InvalidSourceError if the source type is unknown | ||
1583 | 1240 | """ | ||
1584 | 1241 | enabled = True | ||
1585 | 1242 | repotype = uri = release = gpg_key = "" | ||
1586 | 1243 | options = {} | ||
1587 | 1244 | groups = [] | ||
1588 | 1245 | |||
1589 | 1246 | line = line.strip() | ||
1590 | 1247 | if line.startswith("#"): | ||
1591 | 1248 | enabled = False | ||
1592 | 1249 | line = line[1:] | ||
1593 | 1250 | |||
1594 | 1251 | # Check for "#" in the line; treat the part after it as a comment, then strip it off. | ||
1595 | 1252 | i = line.find("#") | ||
1596 | 1253 | if i > 0: | ||
1597 | 1254 | line = line[:i] | ||
1598 | 1255 | |||
1599 | 1256 | # Split a source into substrings to initialize a new repo. | ||
1600 | 1257 | source = line.strip() | ||
1601 | 1258 | if source: | ||
1602 | 1259 | # Match any repo options, and get a dict representation. | ||
1603 | 1260 | for v in re.findall(OPTIONS_MATCHER, source): | ||
1604 | 1261 | opts = dict(o.split("=") for o in v.strip("[]").split()) | ||
1605 | 1262 | # Extract the 'signed-by' option for the gpg_key | ||
1606 | 1263 | gpg_key = opts.pop("signed-by", "") | ||
1607 | 1264 | options = opts | ||
1608 | 1265 | |||
1609 | 1266 | # Remove any options from the source string and split the string into chunks | ||
1610 | 1267 | source = re.sub(OPTIONS_MATCHER, "", source) | ||
1611 | 1268 | chunks = source.split() | ||
1612 | 1269 | |||
1613 | 1270 | # Check we've got a valid list of chunks | ||
1614 | 1271 | if len(chunks) < 3 or chunks[0] not in VALID_SOURCE_TYPES: | ||
1615 | 1272 | raise InvalidSourceError("An invalid sources line was found in %s!", filename) | ||
1616 | 1273 | |||
1617 | 1274 | repotype = chunks[0] | ||
1618 | 1275 | uri = chunks[1] | ||
1619 | 1276 | release = chunks[2] | ||
1620 | 1277 | groups = chunks[3:] | ||
1621 | 1278 | |||
1622 | 1279 | return DebianRepository( | ||
1623 | 1280 | enabled, repotype, uri, release, groups, filename, gpg_key, options | ||
1624 | 1281 | ) | ||
1625 | 1282 | else: | ||
1626 | 1283 | raise InvalidSourceError("An invalid sources line was found in %s!", filename) | ||
1627 | 1284 | |||
1628 | 1285 | def add(self, repo: DebianRepository, default_filename: Optional[bool] = False) -> None: | ||
1629 | 1286 | """Add a new repository to the system. | ||
1630 | 1287 | |||
1631 | 1288 | Args: | ||
1632 | 1289 | repo: a `DebianRepository` object | ||
1633 | 1290 | default_filename: an (Optional) filename if the default is not desirable | ||
1634 | 1291 | """ | ||
1635 | 1292 | new_filename = "{}-{}.list".format( | ||
1636 | 1293 | DebianRepository.prefix_from_uri(repo.uri), repo.release.replace("/", "-") | ||
1637 | 1294 | ) | ||
1638 | 1295 | |||
1639 | 1296 | fname = repo.filename or new_filename | ||
1640 | 1297 | |||
1641 | 1298 | options = repo.options if repo.options else {} | ||
1642 | 1299 | if repo.gpg_key: | ||
1643 | 1300 | options["signed-by"] = repo.gpg_key | ||
1644 | 1301 | |||
1645 | 1302 | with open(fname, "wb") as f: | ||
1646 | 1303 | f.write( | ||
1647 | 1304 | ( | ||
1648 | 1305 | "{}".format("#" if not repo.enabled else "") | ||
1649 | 1306 | + "{} {}{} ".format(repo.repotype, repo.make_options_string(), repo.uri) | ||
1650 | 1307 | + "{} {}\n".format(repo.release, " ".join(repo.groups)) | ||
1651 | 1308 | ).encode("utf-8") | ||
1652 | 1309 | ) | ||
1653 | 1310 | |||
1654 | 1311 | self._repository_map["{}-{}-{}".format(repo.repotype, repo.uri, repo.release)] = repo | ||
1655 | 1312 | |||
1656 | 1313 | def disable(self, repo: DebianRepository) -> None: | ||
1657 | 1314 | """Disable a repository rather than removing it from its file. | ||
1658 | 1315 | |||
1659 | 1316 | Args: | ||
1660 | 1317 | repo: a `DebianRepository` to disable | ||
1661 | 1318 | """ | ||
1662 | 1319 | searcher = "{} {}{} {}".format( | ||
1663 | 1320 | repo.repotype, repo.make_options_string(), repo.uri, repo.release | ||
1664 | 1321 | ) | ||
1665 | 1322 | |||
1666 | 1323 | for line in fileinput.input(repo.filename, inplace=True): | ||
1667 | 1324 | if re.match(r"^{}\s".format(re.escape(searcher)), line): | ||
1668 | 1325 | print("# {}".format(line), end="") | ||
1669 | 1326 | else: | ||
1670 | 1327 | print(line, end="") | ||
1671 | 1328 | |||
1672 | 1329 | self._repository_map["{}-{}-{}".format(repo.repotype, repo.uri, repo.release)] = repo | ||
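As a standalone illustration of the `_parse` logic in the vendored library above, the following is a minimal re-implementation sketch of how a one-line-style `sources.list` entry decomposes into type, options, URI, release, and groups. This is not the library code itself; `parse_source_line` and the simplified options regex are hypothetical names introduced here for illustration:

```python
import re

# Simplified stand-in for the library's OPTIONS_MATCHER: matches one
# bracketed option block such as "[signed-by=/path arch=amd64]".
OPTIONS_MATCHER = re.compile(r"\[.*?\]")


def parse_source_line(line):
    """Decompose a sources.list entry into its components (sketch)."""
    enabled = True
    line = line.strip()
    if line.startswith("#"):
        # A leading "#" marks the entry as disabled, not invalid.
        enabled = False
        line = line[1:].strip()
    options = {}
    m = OPTIONS_MATCHER.search(line)
    if m:
        # "[k1=v1 k2=v2]" -> {"k1": "v1", "k2": "v2"}
        options = dict(o.split("=") for o in m.group(0).strip("[]").split())
        line = OPTIONS_MATCHER.sub("", line)
    repotype, uri, release, *groups = line.split()
    return enabled, repotype, uri, release, groups, options


print(parse_source_line(
    "deb [signed-by=/etc/apt/keyrings/x.gpg] https://example.com focal main universe"
))
```

The real `_parse` additionally validates the source type against `VALID_SOURCE_TYPES` and raises `InvalidSourceError` on malformed lines; this sketch keeps only the happy path.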
1673 | diff --git a/metadata.yaml b/metadata.yaml | |||
1674 | 0 | new file mode 100644 | 1330 | new file mode 100644 |
1675 | index 0000000..3cd1f95 | |||
1676 | --- /dev/null | |||
1677 | +++ b/metadata.yaml | |||
1678 | @@ -0,0 +1,14 @@ | |||
1679 | 1 | name: clamav-database-mirror | ||
1680 | 2 | display-name: ClamAV Database Mirror | ||
1681 | 3 | summary: A local mirror of the ClamAV database. | ||
1682 | 4 | description: | | ||
1683 | 5 | Maintains a local mirror of the latest ClamAV databases. | ||
1684 | 6 | |||
1685 | 7 | Workloads that perform ClamAV scans in ephemeral environments need to | ||
1686 | 8 | fetch fresh copies of virus definitions before doing anything else; but if | ||
1687 | 9 | they do that using the default configuration that points to | ||
1688 | 10 | database.clamav.net, they will likely find themselves rate-limited in | ||
1689 | 11 | short order. To avoid that, point those workloads at a deployment of this | ||
1690 | 12 | charm instead, which maintains a local mirror more economically. | ||
1691 | 13 | series: | ||
1692 | 14 | - jammy | ||
1693 | diff --git a/pyproject.toml b/pyproject.toml | |||
1694 | 0 | new file mode 100644 | 15 | new file mode 100644 |
1695 | index 0000000..2edc519 | |||
1696 | --- /dev/null | |||
1697 | +++ b/pyproject.toml | |||
1698 | @@ -0,0 +1,33 @@ | |||
1699 | 1 | # Testing tools configuration | ||
1700 | 2 | [tool.coverage.run] | ||
1701 | 3 | branch = true | ||
1702 | 4 | |||
1703 | 5 | [tool.coverage.report] | ||
1704 | 6 | show_missing = true | ||
1705 | 7 | |||
1706 | 8 | [tool.pytest.ini_options] | ||
1707 | 9 | minversion = "6.0" | ||
1708 | 10 | log_cli_level = "INFO" | ||
1709 | 11 | |||
1710 | 12 | # Formatting tools configuration | ||
1711 | 13 | [tool.black] | ||
1712 | 14 | line-length = 99 | ||
1713 | 15 | target-version = ["py38"] | ||
1714 | 16 | |||
1715 | 17 | [tool.isort] | ||
1716 | 18 | line_length = 99 | ||
1717 | 19 | profile = "black" | ||
1718 | 20 | |||
1719 | 21 | # Linting tools configuration | ||
1720 | 22 | [tool.flake8] | ||
1721 | 23 | max-line-length = 99 | ||
1722 | 24 | max-doc-length = 99 | ||
1723 | 25 | max-complexity = 10 | ||
1724 | 26 | exclude = [".git", "__pycache__", ".tox", "build", "dist", "*.egg_info", "venv"] | ||
1725 | 27 | select = ["E", "W", "F", "C", "N", "R", "D", "H"] | ||
1726 | 28 | # Ignore W503, E501 because using black creates errors with this | ||
1727 | 29 | # Ignore D107 Missing docstring in __init__ | ||
1728 | 30 | ignore = ["W503", "E501", "D107"] | ||
1729 | 31 | # D100, D101, D102, D103: Ignore missing docstrings in tests | ||
1730 | 32 | per-file-ignores = ["tests/*:D100,D101,D102,D103,D104"] | ||
1731 | 33 | docstring-convention = "google" | ||
1732 | diff --git a/requirements.txt b/requirements.txt | |||
1733 | 0 | new file mode 100644 | 34 | new file mode 100644 |
1734 | index 0000000..4bbf401 | |||
1735 | --- /dev/null | |||
1736 | +++ b/requirements.txt | |||
1737 | @@ -0,0 +1,2 @@ | |||
1738 | 1 | jinja2 | ||
1739 | 2 | ops >= 1.5.0 | ||
1740 | diff --git a/src/charm.py b/src/charm.py | |||
1741 | 0 | new file mode 100755 | 3 | new file mode 100755 |
1742 | index 0000000..fbbf2c0 | |||
1743 | --- /dev/null | |||
1744 | +++ b/src/charm.py | |||
1745 | @@ -0,0 +1,106 @@ | |||
1746 | 1 | #!/usr/bin/env python3 | ||
1747 | 2 | # Copyright 2022 Canonical Ltd. | ||
1748 | 3 | # See LICENSE file for licensing details. | ||
1749 | 4 | # | ||
1750 | 5 | # Learn more at: https://juju.is/docs/sdk | ||
1751 | 6 | |||
1752 | 7 | """Charm the service. | ||
1753 | 8 | |||
1754 | 9 | Refer to the following post for a quick-start guide that will help you | ||
1755 | 10 | develop a new charm using the Operator Framework: | ||
1756 | 11 | |||
1757 | 12 | https://discourse.charmhub.io/t/4208 | ||
1758 | 13 | """ | ||
1759 | 14 | |||
1760 | 15 | import logging | ||
1761 | 16 | import shutil | ||
1762 | 17 | import subprocess | ||
1763 | 18 | from pathlib import Path | ||
1764 | 19 | |||
1765 | 20 | from charms.operator_libs_linux.v0 import apt | ||
1766 | 21 | from jinja2 import Environment, FileSystemLoader | ||
1767 | 22 | from ops.charm import CharmBase | ||
1768 | 23 | from ops.main import main | ||
1769 | 24 | from ops.model import ActiveStatus, MaintenanceStatus | ||
1770 | 25 | |||
1771 | 26 | logger = logging.getLogger(__name__) | ||
1772 | 27 | |||
1773 | 28 | db_path = "/srv/clamav-database-mirror/database" | ||
1774 | 29 | |||
1775 | 30 | |||
1776 | 31 | class ClamAVDatabaseMirrorCharm(CharmBase): | ||
1777 | 32 | """Charm the service.""" | ||
1778 | 33 | |||
1779 | 34 | def __init__(self, *args): | ||
1780 | 35 | super().__init__(*args) | ||
1781 | 36 | self.framework.observe(self.on.install, self._on_install) | ||
1782 | 37 | self.framework.observe(self.on.config_changed, self._on_config_changed) | ||
1783 | 38 | |||
1784 | 39 | def _set_maintenance_step(self, description): | ||
1785 | 40 | self.unit.status = MaintenanceStatus(description) | ||
1786 | 41 | logger.info(description) | ||
1787 | 42 | |||
1788 | 43 | def _install(self): | ||
1789 | 44 | """Install our dependencies.""" | ||
1790 | 45 | self._set_maintenance_step("Installing dependencies") | ||
1791 | 46 | apt.add_package(package_names=["clamav-cvdupdate", "nginx"], update_cache=True) | ||
1792 | 47 | subprocess.run(["systemctl", "disable", "--now", "nginx.service"], check=True) | ||
1793 | 48 | Path("/etc/nginx/sites-enabled/default").unlink(missing_ok=True) | ||
1794 | 49 | |||
1795 | 50 | def _run_as(self, user, args, **kwargs): | ||
1796 | 51 | subprocess.run( | ||
1797 | 52 | ["setpriv", "--reuid", user, "--regid", user, "--init-groups", "--"] + args, **kwargs | ||
1798 | 53 | ) | ||
1799 | 54 | |||
1800 | 55 | def _configure_database(self): | ||
1801 | 56 | """Configure the database.""" | ||
1802 | 57 | self._set_maintenance_step("Configuring database") | ||
1803 | 58 | subprocess.run(["mkdir", "-p", db_path]) | ||
1804 | 59 | subprocess.run(["chown", "ubuntu:ubuntu", db_path]) | ||
1805 | 60 | self._run_as("ubuntu", ["cvdupdate", "config", "set", "--dbdir", db_path]) | ||
1806 | 61 | |||
1807 | 62 | def _configure_timer(self): | ||
1808 | 63 | """Create a systemd timer to update the database automatically.""" | ||
1809 | 64 | self._set_maintenance_step("Configuring timer") | ||
1810 | 65 | template_env = Environment(loader=FileSystemLoader(str(self.charm_dir / "templates"))) | ||
1811 | 66 | template = template_env.get_template("cvdupdate.service.j2") | ||
1812 | 67 | service = template.render(config=self.config) | ||
1813 | 68 | Path("/lib/systemd/system/cvdupdate.service").write_text(service) | ||
1814 | 69 | shutil.copy( | ||
1815 | 70 | self.charm_dir / "files" / "cvdupdate.timer", "/lib/systemd/system/cvdupdate.timer" | ||
1816 | 71 | ) | ||
1817 | 72 | subprocess.run(["systemctl", "daemon-reload"]) | ||
1818 | 73 | # Start the update service once before enabling the timer, to ensure | ||
1819 | 74 | # that the database is populated immediately. | ||
1820 | 75 | subprocess.run(["systemctl", "start", "cvdupdate.service"], check=True) | ||
1821 | 76 | subprocess.run(["systemctl", "enable", "--now", "cvdupdate.timer"], check=True) | ||
1822 | 77 | |||
1823 | 78 | def _configure_nginx(self): | ||
1824 | 79 | """Configure nginx to serve the database.""" | ||
1825 | 80 | self._set_maintenance_step("Configuring nginx") | ||
1826 | 81 | template_env = Environment(loader=FileSystemLoader(str(self.charm_dir / "templates"))) | ||
1827 | 82 | template = template_env.get_template("nginx-site.conf.j2") | ||
1828 | 83 | site = template.render(config=self.config, db_path=db_path) | ||
1829 | 84 | available_conf = Path("/etc/nginx/sites-available/clamav-database-mirror.conf") | ||
1830 | 85 | enabled_conf = Path("/etc/nginx/sites-enabled/clamav-database-mirror.conf") | ||
1831 | 86 | available_conf.write_text(site) | ||
1832 | 87 | if not enabled_conf.exists(): | ||
1833 | 88 | enabled_conf.symlink_to(available_conf) | ||
1834 | 89 | subprocess.run(["systemctl", "enable", "nginx.service"], check=True) | ||
1835 | 90 | subprocess.run(["systemctl", "reload-or-restart", "nginx.service"], check=True) | ||
1836 | 91 | |||
1837 | 92 | def _on_install(self, event): | ||
1838 | 93 | self._install() | ||
1839 | 94 | self._configure_database() | ||
1840 | 95 | self._configure_timer() | ||
1841 | 96 | self._configure_nginx() | ||
1842 | 97 | self.unit.status = ActiveStatus() | ||
1843 | 98 | |||
1844 | 99 | def _on_config_changed(self, event): | ||
1845 | 100 | self._configure_timer() | ||
1846 | 101 | self._configure_nginx() | ||
1847 | 102 | self.unit.status = ActiveStatus() | ||
1848 | 103 | |||
1849 | 104 | |||
1850 | 105 | if __name__ == "__main__": # pragma: nocover | ||
1851 | 106 | main(ClamAVDatabaseMirrorCharm) | ||
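The `_configure_timer` step above renders `templates/cvdupdate.service.j2` with the charm config. The proxy conditional from that template can be sketched in isolation like this, using an inline copy of the relevant fragment rather than the charm's file on disk (the proxy URL is a made-up example value):

```python
from jinja2 import Template

# Inline copy of the proxy-relevant fragment of cvdupdate.service.j2.
# The "-%}" trims the newline after each block tag, so the unproxied
# render produces no stray blank lines in the unit file.
fragment = (
    "{% if config['http-proxy'] -%}\n"
    "Environment=http_proxy={{ config['http-proxy'] }}\n"
    "Environment=https_proxy={{ config['http-proxy'] }}\n"
    "{% endif -%}\n"
)
template = Template(fragment)

with_proxy = template.render(config={"http-proxy": "http://proxy.internal:3128"})
without_proxy = template.render(config={"http-proxy": ""})

print(with_proxy)
print(repr(without_proxy))
```

Note that the charm reuses the single `http-proxy` config value for both `http_proxy` and `https_proxy`, which matches how `cvdupdate` fetches databases over HTTPS through a forward proxy.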
1852 | diff --git a/templates/cvdupdate.service.j2 b/templates/cvdupdate.service.j2 | |||
1853 | 0 | new file mode 100644 | 107 | new file mode 100644 |
1854 | index 0000000..3946693 | |||
1855 | --- /dev/null | |||
1856 | +++ b/templates/cvdupdate.service.j2 | |||
1857 | @@ -0,0 +1,22 @@ | |||
1858 | 1 | [Unit] | ||
1859 | 2 | Description=Update ClamAV database mirror | ||
1860 | 3 | Documentation=man:cvdupdate(1) | ||
1861 | 4 | |||
1862 | 5 | [Service] | ||
1863 | 6 | Type=oneshot | ||
1864 | 7 | ExecStart=/usr/bin/cvdupdate update | ||
1865 | 8 | User=ubuntu | ||
1866 | 9 | {% if config["http-proxy"] -%} | ||
1867 | 10 | Environment=http_proxy={{ config["http-proxy"] }} | ||
1868 | 11 | Environment=https_proxy={{ config["http-proxy"] }} | ||
1869 | 12 | {% endif -%} | ||
1870 | 13 | ProtectSystem=full | ||
1871 | 14 | PrivateTmp=true | ||
1872 | 15 | PrivateDevices=true | ||
1873 | 16 | ProtectHostname=true | ||
1874 | 17 | ProtectClock=true | ||
1875 | 18 | ProtectKernelTunables=true | ||
1876 | 19 | ProtectKernelModules=true | ||
1877 | 20 | ProtectKernelLogs=true | ||
1878 | 21 | ProtectControlGroups=true | ||
1879 | 22 | |||
1880 | diff --git a/templates/nginx-site.conf.j2 b/templates/nginx-site.conf.j2 | |||
1881 | 0 | new file mode 100644 | 23 | new file mode 100644 |
1882 | index 0000000..5fec382 | |||
1883 | --- /dev/null | |||
1884 | +++ b/templates/nginx-site.conf.j2 | |||
1885 | @@ -0,0 +1,9 @@ | |||
1886 | 1 | server { | ||
1887 | 2 | server_name {{ config["host"] }}; | ||
1888 | 3 | listen {{ config["port"] }}; | ||
1889 | 4 | location / { | ||
1890 | 5 | alias {{ db_path }}/; | ||
1891 | 6 | autoindex on; | ||
1892 | 7 | } | ||
1893 | 8 | } | ||
1894 | 9 | |||
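For illustration, with hypothetical config values of `host: mirror.example.com` and `port: 8080`, the template above would render to roughly:

```nginx
server {
    server_name mirror.example.com;
    listen 8080;
    location / {
        alias /srv/clamav-database-mirror/database/;
        autoindex on;
    }
}
```

The trailing slash on the `alias` path matters: it maps requests under `location /` directly onto files in the database directory, and `autoindex on` lets clients browse the mirrored `.cvd` files.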
1895 | diff --git a/tests/integration/test_charm.py b/tests/integration/test_charm.py | |||
1896 | 0 | new file mode 100644 | 10 | new file mode 100644 |
1897 | index 0000000..ffcc9ec | |||
1898 | --- /dev/null | |||
1899 | +++ b/tests/integration/test_charm.py | |||
1900 | @@ -0,0 +1,34 @@ | |||
1901 | 1 | #!/usr/bin/env python3 | ||
1902 | 2 | # Copyright 2022 Canonical Ltd. | ||
1903 | 3 | # See LICENSE file for licensing details. | ||
1904 | 4 | |||
1905 | 5 | import asyncio | ||
1906 | 6 | import logging | ||
1907 | 7 | from pathlib import Path | ||
1908 | 8 | |||
1909 | 9 | import pytest | ||
1910 | 10 | import yaml | ||
1911 | 11 | from pytest_operator.plugin import OpsTest | ||
1912 | 12 | |||
1913 | 13 | logger = logging.getLogger(__name__) | ||
1914 | 14 | |||
1915 | 15 | METADATA = yaml.safe_load(Path("./metadata.yaml").read_text()) | ||
1916 | 16 | APP_NAME = METADATA["name"] | ||
1917 | 17 | |||
1918 | 18 | |||
1919 | 19 | @pytest.mark.abort_on_fail | ||
1920 | 20 | async def test_build_and_deploy(ops_test: OpsTest): | ||
1921 | 21 | """Build the charm-under-test and deploy it together with related charms. | ||
1922 | 22 | |||
1923 | 23 | Assert on the unit status before any relations/configurations take place. | ||
1924 | 24 | """ | ||
1925 | 25 | # Build and deploy charm from local source folder | ||
1926 | 26 | charm = await ops_test.build_charm(".") | ||
1927 | 27 | |||
1928 | 28 | # Deploy the charm and wait for active/idle status | ||
1929 | 29 | await asyncio.gather( | ||
1930 | 30 | ops_test.model.deploy(charm, application_name=APP_NAME), | ||
1931 | 31 | ops_test.model.wait_for_idle( | ||
1932 | 32 | apps=[APP_NAME], status="active", raise_on_blocked=True, timeout=1000 | ||
1933 | 33 | ), | ||
1934 | 34 | ) | ||
diff --git a/tests/unit/test_charm.py b/tests/unit/test_charm.py
new file mode 100644
index 0000000..bae3788
--- /dev/null
+++ b/tests/unit/test_charm.py
@@ -0,0 +1,88 @@
+# Copyright 2022 Canonical Ltd.
+# See LICENSE file for licensing details.
+#
+# Learn more about testing at: https://juju.is/docs/sdk/testing
+
+from pathlib import Path
+
+import ops.testing
+import pytest
+from ops.model import ActiveStatus
+from ops.testing import Harness
+
+from charm import ClamAVDatabaseMirrorCharm
+
+# The "fs" fixture is a fake filesystem from pyfakefs; the "fp" fixture is
+# from pytest-subprocess.
+
+
+@pytest.fixture(autouse=True)
+def simulate_can_connect(monkeypatch):
+    # Enable more accurate simulation of container networking.
+    # For more information, see https://juju.is/docs/sdk/testing#heading--simulate-can-connect
+    monkeypatch.setattr(ops.testing, "SIMULATE_CAN_CONNECT", True)
+
+
+@pytest.fixture
+def harness():
+    harness = Harness(ClamAVDatabaseMirrorCharm)
+    harness.begin()
+    yield harness
+    harness.cleanup()
+
+
+def test_install(harness, fs, fp):
+    fs.add_real_directory(harness.charm.charm_dir / "files")
+    fs.add_real_directory(harness.charm.charm_dir / "templates")
+    Path("/lib/systemd/system").mkdir(parents=True)
+    Path("/etc/nginx/sites-available").mkdir(parents=True)
+    Path("/etc/nginx/sites-enabled").mkdir(parents=True)
+    fp.keep_last_process(True)
+    fp.register(
+        ["apt-cache", "show", "clamav-cvdupdate"],
+        stdout="Package: clamav-cvdupdate\nArchitecture: all\nVersion: 0.1\n",
+    )
+    fp.register(
+        ["apt-cache", "show", "nginx"], stdout="Package: nginx\nArchitecture: all\nVersion: 0.2\n"
+    )
+    fp.register([fp.any()])
+
+    harness.charm.on.install.emit()
+
+    assert harness.model.unit.status == ActiveStatus()
+    run_as_ubuntu = ["setpriv", "--reuid", "ubuntu", "--regid", "ubuntu", "--init-groups", "--"]
+    assert list(fp.calls) == [
+        ["apt-get", "update"],
+        ["dpkg", "--print-architecture"],
+        ["dpkg", "-l", "clamav-cvdupdate"],
+        ["dpkg", "--print-architecture"],
+        ["apt-cache", "show", "clamav-cvdupdate"],
+        [
+            "apt-get",
+            "-y",
+            "--option=Dpkg::Options::=--force-confold",
+            "install",
+            "clamav-cvdupdate=0.1",
+        ],
+        ["dpkg", "--print-architecture"],
+        ["dpkg", "-l", "nginx"],
+        ["dpkg", "--print-architecture"],
+        ["apt-cache", "show", "nginx"],
+        [
+            "apt-get",
+            "-y",
+            "--option=Dpkg::Options::=--force-confold",
+            "install",
+            "nginx=0.2",
+        ],
+        ["systemctl", "disable", "--now", "nginx.service"],
+        ["mkdir", "-p", "/srv/clamav-database-mirror/database"],
+        ["chown", "ubuntu:ubuntu", "/srv/clamav-database-mirror/database"],
+        run_as_ubuntu
+        + ["cvdupdate", "config", "set", "--dbdir", "/srv/clamav-database-mirror/database"],
+        ["systemctl", "daemon-reload"],
+        ["systemctl", "start", "cvdupdate.service"],
+        ["systemctl", "enable", "--now", "cvdupdate.timer"],
+        ["systemctl", "enable", "nginx.service"],
+        ["systemctl", "reload-or-restart", "nginx.service"],
+    ]
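The `fp` fixture above intercepts every subprocess invocation and records its argv, so `test_install` can assert on the exact command sequence without running anything. For readers without `pytest-subprocess`, the same record-and-assert idea can be sketched with the standard library's `unittest.mock`; the two commands below are illustrative, not the charm's full sequence:

```python
import subprocess
from unittest import mock

# Patch subprocess.run so no real commands execute; the mock records each call.
with mock.patch.object(subprocess, "run") as fake_run:
    subprocess.run(["apt-get", "update"], check=True)
    subprocess.run(["dpkg", "--print-architecture"], check=True)

# Recover the argv of each recorded call, analogous to `list(fp.calls)` above.
calls = [c.args[0] for c in fake_run.call_args_list]
assert calls == [["apt-get", "update"], ["dpkg", "--print-architecture"]]
```

`pytest-subprocess` goes further than this sketch: `fp.register` can also supply fake stdout (as the `apt-cache show` registrations do), and `fp.keep_last_process(True)` plus `fp.register([fp.any()])` act as a catch-all for any command not explicitly registered.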
diff --git a/tox.ini b/tox.ini
new file mode 100644
index 0000000..dc12427
--- /dev/null
+++ b/tox.ini
@@ -0,0 +1,73 @@
+# Copyright 2022 Canonical Ltd.
+# See LICENSE file for licensing details.
+
+[tox]
+skipsdist=True
+skip_missing_interpreters = True
+envlist = lint, unit
+
+[vars]
+src_path = {toxinidir}/src/
+tst_path = {toxinidir}/tests/
+all_path = {[vars]src_path} {[vars]tst_path}
+
+[testenv]
+setenv =
+  PYTHONPATH = {toxinidir}:{toxinidir}/lib:{[vars]src_path}
+  PYTHONBREAKPOINT=pdb.set_trace
+  PY_COLORS=1
+passenv =
+  PYTHONPATH
+  CHARM_BUILD_DIR
+  MODEL_SETTINGS
+
+[testenv:fmt]
+description = Apply coding style standards to code
+deps =
+    black
+    isort
+commands =
+    isort {[vars]all_path}
+    black {[vars]all_path}
+
+[testenv:lint]
+description = Check code against coding style standards
+deps =
+    black
+    flake8-docstrings
+    flake8-builtins
+    pyproject-flake8
+    pep8-naming
+    isort
+    codespell
+commands =
+    codespell {toxinidir}/. --skip {toxinidir}/.git --skip {toxinidir}/.tox \
+      --skip {toxinidir}/build --skip {toxinidir}/lib --skip {toxinidir}/venv \
+      --skip {toxinidir}/.mypy_cache --skip {toxinidir}/icon.svg
+    # pflake8 wrapper supports config from pyproject.toml
+    pflake8 {[vars]all_path}
+    isort --check-only --diff {[vars]all_path}
+    black --check --diff {[vars]all_path}
+
+[testenv:unit]
+description = Run unit tests
+deps =
+    pyfakefs
+    pytest
+    pytest-subprocess
+    coverage[toml]
+    -r{toxinidir}/requirements.txt
+commands =
+    coverage run --source={[vars]src_path} \
+        -m pytest --ignore={[vars]tst_path}integration -v --tb native -s {posargs}
+    coverage report
+
+[testenv:integration]
+description = Run integration tests
+deps =
+    pytest
+    juju
+    pytest-operator
+    -r{toxinidir}/requirements.txt
+commands =
+    pytest -v --tb native --ignore={[vars]tst_path}unit --log-cli-level=INFO -s {posargs}
diff --git a/update-lib b/update-lib
new file mode 100755
index 0000000..943e8fe
--- /dev/null
+++ b/update-lib
@@ -0,0 +1,4 @@
+#! /bin/sh
+set -e
+
+charmcraft fetch-lib charms.operator_libs_linux.v0.apt