Merge lp:~xavpaice/landscape-client-charm/lp1825267 into lp:landscape-client-charm

- lp1825267
- Merge into trunk

Status: Rejected
Rejected by: Haw Loeung
Proposed branch: lp:~xavpaice/landscape-client-charm/lp1825267
Merge into: lp:landscape-client-charm
Diff against target: 2620 lines (+1684/-161), 24 files modified:

- Makefile (+1/-2)
- hooks/charmhelpers/__init__.py (+65/-4)
- hooks/charmhelpers/core/hookenv.py (+450/-28)
- hooks/charmhelpers/core/host.py (+170/-11)
- hooks/charmhelpers/core/host_factory/centos.py (+16/-0)
- hooks/charmhelpers/core/host_factory/ubuntu.py (+58/-0)
- hooks/charmhelpers/core/kernel.py (+2/-2)
- hooks/charmhelpers/core/services/base.py (+18/-7)
- hooks/charmhelpers/core/strutils.py (+64/-5)
- hooks/charmhelpers/core/sysctl.py (+21/-10)
- hooks/charmhelpers/core/templating.py (+18/-9)
- hooks/charmhelpers/core/unitdata.py (+8/-1)
- hooks/charmhelpers/fetch/__init__.py (+19/-9)
- hooks/charmhelpers/fetch/archiveurl.py (+1/-1)
- hooks/charmhelpers/fetch/bzrurl.py (+2/-2)
- hooks/charmhelpers/fetch/centos.py (+1/-1)
- hooks/charmhelpers/fetch/giturl.py (+2/-2)
- hooks/charmhelpers/fetch/python/__init__.py (+13/-0)
- hooks/charmhelpers/fetch/python/debug.py (+54/-0)
- hooks/charmhelpers/fetch/python/packages.py (+154/-0)
- hooks/charmhelpers/fetch/python/rpdb.py (+56/-0)
- hooks/charmhelpers/fetch/python/version.py (+32/-0)
- hooks/charmhelpers/fetch/snap.py (+33/-5)
- hooks/charmhelpers/fetch/ubuntu.py (+426/-62)

To merge this branch: bzr merge lp:~xavpaice/landscape-client-charm/lp1825267
Related bugs:

Reviewer | Review Type | Date Requested | Status
---|---|---|---
Xav Paice (community) | | | Needs Resubmitting
🤖 Landscape Builder | test results | | Approve
Landscape | | | Pending

Review via email: mp+366238@code.launchpad.net
Commit message
Description of the change
🤖 Landscape Builder (landscape-builder) wrote:
Command: make ci-test
Result: Fail
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https:/
🤖 Landscape Builder (landscape-builder) wrote:
Command: make ci-test
Result: Success
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https:/
🤖 Landscape Builder (landscape-builder) wrote:
Command: make ci-test
Result: Success
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https:/
🤖 Landscape Builder (landscape-builder) wrote:
Command: make ci-test
Result: Fail
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https:/
🤖 Landscape Builder (landscape-builder) wrote:
Command: make ci-test
Result: Success
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https:/
🤖 Landscape Builder (landscape-builder) wrote:
Command: make ci-test
Result: Success
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https:/
🤖 Landscape Builder (landscape-builder) wrote:
Command: make ci-test
Result: Success
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https:/
🤖 Landscape Builder (landscape-builder) wrote:
Command: make ci-test
Result: Success
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https:/
Xav Paice (xavpaice) wrote:
if this is still relevant it needs some serious rebase and retesting
Preview Diff
=== modified file 'Makefile'
--- Makefile	2019-02-07 15:13:07 +0000
+++ Makefile	2019-04-18 01:13:06 +0000
@@ -26,8 +26,7 @@
 
 $(SYNC_SCRIPT):
 	@mkdir -p bin
-	@bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
-		> $(SYNC_SCRIPT)
+	@curl -o bin/charm_helpers_sync.py https://raw.githubusercontent.com/juju/charm-helpers/master/tools/charm_helpers_sync/charm_helpers_sync.py
 
 # Note: The target name is unfortunate, but that's what other charms use.
 sync: $(SYNC_SCRIPT)
 
=== modified file 'hooks/charmhelpers/__init__.py'
--- hooks/charmhelpers/__init__.py	2017-03-03 19:56:10 +0000
+++ hooks/charmhelpers/__init__.py	2019-04-18 01:13:06 +0000
@@ -14,23 +14,84 @@
 
 # Bootstrap charm-helpers, installing its dependencies if necessary using
 # only standard libraries.
+from __future__ import print_function
+from __future__ import absolute_import
+
+import functools
+import inspect
 import subprocess
 import sys
 
 try:
-    import six  # flake8: noqa
+    import six  # NOQA:F401
 except ImportError:
     if sys.version_info.major == 2:
         subprocess.check_call(['apt-get', 'install', '-y', 'python-six'])
     else:
         subprocess.check_call(['apt-get', 'install', '-y', 'python3-six'])
-    import six  # flake8: noqa
+    import six  # NOQA:F401
 
 try:
-    import yaml  # flake8: noqa
+    import yaml  # NOQA:F401
 except ImportError:
     if sys.version_info.major == 2:
         subprocess.check_call(['apt-get', 'install', '-y', 'python-yaml'])
     else:
         subprocess.check_call(['apt-get', 'install', '-y', 'python3-yaml'])
-    import yaml  # flake8: noqa
+    import yaml  # NOQA:F401
+
+
+# Holds a list of mapping of mangled function names that have been deprecated
+# using the @deprecate decorator below. This is so that the warning is only
+# printed once for each usage of the function.
+__deprecated_functions = {}
+
+
+def deprecate(warning, date=None, log=None):
+    """Add a deprecation warning the first time the function is used.
+    The date, which is a string in semi-ISO8660 format indicate the year-month
+    that the function is officially going to be removed.
+
+    usage:
+
+    @deprecate('use core/fetch/add_source() instead', '2017-04')
+    def contributed_add_source_thing(...):
+        ...
+
+    And it then prints to the log ONCE that the function is deprecated.
+    The reason for passing the logging function (log) is so that hookenv.log
+    can be used for a charm if needed.
+
+    :param warning: String to indicat where it has moved ot.
+    :param date: optional sting, in YYYY-MM format to indicate when the
+      function will definitely (probably) be removed.
+    :param log: The log function to call to log. If not, logs to stdout
+    """
+    def wrap(f):
+
+        @functools.wraps(f)
+        def wrapped_f(*args, **kwargs):
+            try:
+                module = inspect.getmodule(f)
+                file = inspect.getsourcefile(f)
+                lines = inspect.getsourcelines(f)
+                f_name = "{}-{}-{}..{}-{}".format(
+                    module.__name__, file, lines[0], lines[-1], f.__name__)
+            except (IOError, TypeError):
+                # assume it was local, so just use the name of the function
+                f_name = f.__name__
+            if f_name not in __deprecated_functions:
+                __deprecated_functions[f_name] = True
+                s = "DEPRECATION WARNING: Function {} is being removed".format(
+                    f.__name__)
+                if date:
+                    s = "{} on/around {}".format(s, date)
+                if warning:
+                    s = "{} : {}".format(s, warning)
+                if log:
+                    log(s)
+                else:
+                    print(s)
+            return f(*args, **kwargs)
+        return wrapped_f
+    return wrap
 
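The @deprecate decorator added above warns only on a function's first use. A minimal standalone sketch of that one-shot pattern (the `old_api`/`new_api` names are hypothetical, used only for illustration; this is not the charm-helpers module itself):

```python
import functools

# Remember which functions have already warned, so the deprecation
# message is emitted at most once per function per process.
_deprecated_functions = {}


def deprecate(warning, date=None, log=None):
    """Emit a deprecation message on first call, then behave normally."""
    def wrap(f):
        @functools.wraps(f)
        def wrapped_f(*args, **kwargs):
            if f.__name__ not in _deprecated_functions:
                _deprecated_functions[f.__name__] = True
                s = "DEPRECATION WARNING: Function {} is being removed".format(
                    f.__name__)
                if date:
                    s = "{} on/around {}".format(s, date)
                if warning:
                    s = "{} : {}".format(s, warning)
                (log or print)(s)  # hookenv.log can be passed in a charm
            return f(*args, **kwargs)
        return wrapped_f
    return wrap


messages = []


@deprecate('use new_api() instead', '2019-12', log=messages.append)
def old_api():
    return 42


old_api()
old_api()  # second call: no additional warning is recorded
```

Passing `log=messages.append` here stands in for `hookenv.log`, which is why the decorator takes a logging callable rather than hard-coding `print`.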
108 | === modified file 'hooks/charmhelpers/core/hookenv.py' | |||
109 | --- hooks/charmhelpers/core/hookenv.py 2017-03-03 19:56:10 +0000 | |||
110 | +++ hooks/charmhelpers/core/hookenv.py 2019-04-18 01:13:06 +0000 | |||
111 | @@ -22,10 +22,12 @@ | |||
112 | 22 | import copy | 22 | import copy |
113 | 23 | from distutils.version import LooseVersion | 23 | from distutils.version import LooseVersion |
114 | 24 | from functools import wraps | 24 | from functools import wraps |
115 | 25 | from collections import namedtuple | ||
116 | 25 | import glob | 26 | import glob |
117 | 26 | import os | 27 | import os |
118 | 27 | import json | 28 | import json |
119 | 28 | import yaml | 29 | import yaml |
120 | 30 | import re | ||
121 | 29 | import subprocess | 31 | import subprocess |
122 | 30 | import sys | 32 | import sys |
123 | 31 | import errno | 33 | import errno |
124 | @@ -38,12 +40,20 @@ | |||
125 | 38 | else: | 40 | else: |
126 | 39 | from collections import UserDict | 41 | from collections import UserDict |
127 | 40 | 42 | ||
128 | 43 | |||
129 | 41 | CRITICAL = "CRITICAL" | 44 | CRITICAL = "CRITICAL" |
130 | 42 | ERROR = "ERROR" | 45 | ERROR = "ERROR" |
131 | 43 | WARNING = "WARNING" | 46 | WARNING = "WARNING" |
132 | 44 | INFO = "INFO" | 47 | INFO = "INFO" |
133 | 45 | DEBUG = "DEBUG" | 48 | DEBUG = "DEBUG" |
134 | 49 | TRACE = "TRACE" | ||
135 | 46 | MARKER = object() | 50 | MARKER = object() |
136 | 51 | SH_MAX_ARG = 131071 | ||
137 | 52 | |||
138 | 53 | |||
139 | 54 | RANGE_WARNING = ('Passing NO_PROXY string that includes a cidr. ' | ||
140 | 55 | 'This may not be compatible with software you are ' | ||
141 | 56 | 'running in your shell.') | ||
142 | 47 | 57 | ||
143 | 48 | cache = {} | 58 | cache = {} |
144 | 49 | 59 | ||
145 | @@ -64,7 +74,7 @@ | |||
146 | 64 | @wraps(func) | 74 | @wraps(func) |
147 | 65 | def wrapper(*args, **kwargs): | 75 | def wrapper(*args, **kwargs): |
148 | 66 | global cache | 76 | global cache |
150 | 67 | key = str((func, args, kwargs)) | 77 | key = json.dumps((func, args, kwargs), sort_keys=True, default=str) |
151 | 68 | try: | 78 | try: |
152 | 69 | return cache[key] | 79 | return cache[key] |
153 | 70 | except KeyError: | 80 | except KeyError: |
154 | @@ -94,7 +104,7 @@ | |||
155 | 94 | command += ['-l', level] | 104 | command += ['-l', level] |
156 | 95 | if not isinstance(message, six.string_types): | 105 | if not isinstance(message, six.string_types): |
157 | 96 | message = repr(message) | 106 | message = repr(message) |
159 | 97 | command += [message] | 107 | command += [message[:SH_MAX_ARG]] |
160 | 98 | # Missing juju-log should not cause failures in unit tests | 108 | # Missing juju-log should not cause failures in unit tests |
161 | 99 | # Send log output to stderr | 109 | # Send log output to stderr |
162 | 100 | try: | 110 | try: |
163 | @@ -197,9 +207,56 @@ | |||
164 | 197 | return os.environ.get('JUJU_REMOTE_UNIT', None) | 207 | return os.environ.get('JUJU_REMOTE_UNIT', None) |
165 | 198 | 208 | ||
166 | 199 | 209 | ||
167 | 210 | def application_name(): | ||
168 | 211 | """ | ||
169 | 212 | The name of the deployed application this unit belongs to. | ||
170 | 213 | """ | ||
171 | 214 | return local_unit().split('/')[0] | ||
172 | 215 | |||
173 | 216 | |||
174 | 200 | def service_name(): | 217 | def service_name(): |
177 | 201 | """The name service group this unit belongs to""" | 218 | """ |
178 | 202 | return local_unit().split('/')[0] | 219 | .. deprecated:: 0.19.1 |
179 | 220 | Alias for :func:`application_name`. | ||
180 | 221 | """ | ||
181 | 222 | return application_name() | ||
182 | 223 | |||
183 | 224 | |||
184 | 225 | def model_name(): | ||
185 | 226 | """ | ||
186 | 227 | Name of the model that this unit is deployed in. | ||
187 | 228 | """ | ||
188 | 229 | return os.environ['JUJU_MODEL_NAME'] | ||
189 | 230 | |||
190 | 231 | |||
191 | 232 | def model_uuid(): | ||
192 | 233 | """ | ||
193 | 234 | UUID of the model that this unit is deployed in. | ||
194 | 235 | """ | ||
195 | 236 | return os.environ['JUJU_MODEL_UUID'] | ||
196 | 237 | |||
197 | 238 | |||
198 | 239 | def principal_unit(): | ||
199 | 240 | """Returns the principal unit of this unit, otherwise None""" | ||
200 | 241 | # Juju 2.2 and above provides JUJU_PRINCIPAL_UNIT | ||
201 | 242 | principal_unit = os.environ.get('JUJU_PRINCIPAL_UNIT', None) | ||
202 | 243 | # If it's empty, then this unit is the principal | ||
203 | 244 | if principal_unit == '': | ||
204 | 245 | return os.environ['JUJU_UNIT_NAME'] | ||
205 | 246 | elif principal_unit is not None: | ||
206 | 247 | return principal_unit | ||
207 | 248 | # For Juju 2.1 and below, let's try work out the principle unit by | ||
208 | 249 | # the various charms' metadata.yaml. | ||
209 | 250 | for reltype in relation_types(): | ||
210 | 251 | for rid in relation_ids(reltype): | ||
211 | 252 | for unit in related_units(rid): | ||
212 | 253 | md = _metadata_unit(unit) | ||
213 | 254 | if not md: | ||
214 | 255 | continue | ||
215 | 256 | subordinate = md.pop('subordinate', None) | ||
216 | 257 | if not subordinate: | ||
217 | 258 | return unit | ||
218 | 259 | return None | ||
219 | 203 | 260 | ||
220 | 204 | 261 | ||
221 | 205 | @cached | 262 | @cached |
222 | @@ -263,7 +320,7 @@ | |||
223 | 263 | self.implicit_save = True | 320 | self.implicit_save = True |
224 | 264 | self._prev_dict = None | 321 | self._prev_dict = None |
225 | 265 | self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME) | 322 | self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME) |
227 | 266 | if os.path.exists(self.path): | 323 | if os.path.exists(self.path) and os.stat(self.path).st_size: |
228 | 267 | self.load_previous() | 324 | self.load_previous() |
229 | 268 | atexit(self._implicit_save) | 325 | atexit(self._implicit_save) |
230 | 269 | 326 | ||
231 | @@ -283,7 +340,11 @@ | |||
232 | 283 | """ | 340 | """ |
233 | 284 | self.path = path or self.path | 341 | self.path = path or self.path |
234 | 285 | with open(self.path) as f: | 342 | with open(self.path) as f: |
236 | 286 | self._prev_dict = json.load(f) | 343 | try: |
237 | 344 | self._prev_dict = json.load(f) | ||
238 | 345 | except ValueError as e: | ||
239 | 346 | log('Unable to parse previous config data - {}'.format(str(e)), | ||
240 | 347 | level=ERROR) | ||
241 | 287 | for k, v in copy.deepcopy(self._prev_dict).items(): | 348 | for k, v in copy.deepcopy(self._prev_dict).items(): |
242 | 288 | if k not in self: | 349 | if k not in self: |
243 | 289 | self[k] = v | 350 | self[k] = v |
244 | @@ -319,6 +380,7 @@ | |||
245 | 319 | 380 | ||
246 | 320 | """ | 381 | """ |
247 | 321 | with open(self.path, 'w') as f: | 382 | with open(self.path, 'w') as f: |
248 | 383 | os.fchmod(f.fileno(), 0o600) | ||
249 | 322 | json.dump(self, f) | 384 | json.dump(self, f) |
250 | 323 | 385 | ||
251 | 324 | def _implicit_save(self): | 386 | def _implicit_save(self): |
252 | @@ -326,22 +388,40 @@ | |||
253 | 326 | self.save() | 388 | self.save() |
254 | 327 | 389 | ||
255 | 328 | 390 | ||
257 | 329 | @cached | 391 | _cache_config = None |
258 | 392 | |||
259 | 393 | |||
260 | 330 | def config(scope=None): | 394 | def config(scope=None): |
271 | 331 | """Juju charm configuration""" | 395 | """ |
272 | 332 | config_cmd_line = ['config-get'] | 396 | Get the juju charm configuration (scope==None) or individual key, |
273 | 333 | if scope is not None: | 397 | (scope=str). The returned value is a Python data structure loaded as |
274 | 334 | config_cmd_line.append(scope) | 398 | JSON from the Juju config command. |
275 | 335 | else: | 399 | |
276 | 336 | config_cmd_line.append('--all') | 400 | :param scope: If set, return the value for the specified key. |
277 | 337 | config_cmd_line.append('--format=json') | 401 | :type scope: Optional[str] |
278 | 338 | try: | 402 | :returns: Either the whole config as a Config, or a key from it. |
279 | 339 | config_data = json.loads( | 403 | :rtype: Any |
280 | 340 | subprocess.check_output(config_cmd_line).decode('UTF-8')) | 404 | """ |
281 | 405 | global _cache_config | ||
282 | 406 | config_cmd_line = ['config-get', '--all', '--format=json'] | ||
283 | 407 | try: | ||
284 | 408 | # JSON Decode Exception for Python3.5+ | ||
285 | 409 | exc_json = json.decoder.JSONDecodeError | ||
286 | 410 | except AttributeError: | ||
287 | 411 | # JSON Decode Exception for Python2.7 through Python3.4 | ||
288 | 412 | exc_json = ValueError | ||
289 | 413 | try: | ||
290 | 414 | if _cache_config is None: | ||
291 | 415 | config_data = json.loads( | ||
292 | 416 | subprocess.check_output(config_cmd_line).decode('UTF-8')) | ||
293 | 417 | _cache_config = Config(config_data) | ||
294 | 341 | if scope is not None: | 418 | if scope is not None: |
298 | 342 | return config_data | 419 | return _cache_config.get(scope) |
299 | 343 | return Config(config_data) | 420 | return _cache_config |
300 | 344 | except ValueError: | 421 | except (exc_json, UnicodeDecodeError) as e: |
301 | 422 | log('Unable to parse output from config-get: config_cmd_line="{}" ' | ||
302 | 423 | 'message="{}"' | ||
303 | 424 | .format(config_cmd_line, str(e)), level=ERROR) | ||
304 | 345 | return None | 425 | return None |
305 | 346 | 426 | ||
306 | 347 | 427 | ||
307 | @@ -435,6 +515,67 @@ | |||
308 | 435 | subprocess.check_output(units_cmd_line).decode('UTF-8')) or [] | 515 | subprocess.check_output(units_cmd_line).decode('UTF-8')) or [] |
309 | 436 | 516 | ||
310 | 437 | 517 | ||
311 | 518 | def expected_peer_units(): | ||
312 | 519 | """Get a generator for units we expect to join peer relation based on | ||
313 | 520 | goal-state. | ||
314 | 521 | |||
315 | 522 | The local unit is excluded from the result to make it easy to gauge | ||
316 | 523 | completion of all peers joining the relation with existing hook tools. | ||
317 | 524 | |||
318 | 525 | Example usage: | ||
319 | 526 | log('peer {} of {} joined peer relation' | ||
320 | 527 | .format(len(related_units()), | ||
321 | 528 | len(list(expected_peer_units())))) | ||
322 | 529 | |||
323 | 530 | This function will raise NotImplementedError if used with juju versions | ||
324 | 531 | without goal-state support. | ||
325 | 532 | |||
326 | 533 | :returns: iterator | ||
327 | 534 | :rtype: types.GeneratorType | ||
328 | 535 | :raises: NotImplementedError | ||
329 | 536 | """ | ||
330 | 537 | if not has_juju_version("2.4.0"): | ||
331 | 538 | # goal-state first appeared in 2.4.0. | ||
332 | 539 | raise NotImplementedError("goal-state") | ||
333 | 540 | _goal_state = goal_state() | ||
334 | 541 | return (key for key in _goal_state['units'] | ||
335 | 542 | if '/' in key and key != local_unit()) | ||
336 | 543 | |||
337 | 544 | |||
338 | 545 | def expected_related_units(reltype=None): | ||
339 | 546 | """Get a generator for units we expect to join relation based on | ||
340 | 547 | goal-state. | ||
341 | 548 | |||
342 | 549 | Note that you can not use this function for the peer relation, take a look | ||
343 | 550 | at expected_peer_units() for that. | ||
344 | 551 | |||
345 | 552 | This function will raise KeyError if you request information for a | ||
346 | 553 | relation type for which juju goal-state does not have information. It will | ||
347 | 554 | raise NotImplementedError if used with juju versions without goal-state | ||
348 | 555 | support. | ||
349 | 556 | |||
350 | 557 | Example usage: | ||
351 | 558 | log('participant {} of {} joined relation {}' | ||
352 | 559 | .format(len(related_units()), | ||
353 | 560 | len(list(expected_related_units())), | ||
354 | 561 | relation_type())) | ||
355 | 562 | |||
356 | 563 | :param reltype: Relation type to list data for, default is to list data for | ||
357 | 564 | the realtion type we are currently executing a hook for. | ||
358 | 565 | :type reltype: str | ||
359 | 566 | :returns: iterator | ||
360 | 567 | :rtype: types.GeneratorType | ||
361 | 568 | :raises: KeyError, NotImplementedError | ||
362 | 569 | """ | ||
363 | 570 | if not has_juju_version("2.4.4"): | ||
364 | 571 | # goal-state existed in 2.4.0, but did not list individual units to | ||
365 | 572 | # join a relation in 2.4.1 through 2.4.3. (LP: #1794739) | ||
366 | 573 | raise NotImplementedError("goal-state relation unit count") | ||
367 | 574 | reltype = reltype or relation_type() | ||
368 | 575 | _goal_state = goal_state() | ||
369 | 576 | return (key for key in _goal_state['relations'][reltype] if '/' in key) | ||
370 | 577 | |||
371 | 578 | |||
372 | 438 | @cached | 579 | @cached |
373 | 439 | def relation_for_unit(unit=None, rid=None): | 580 | def relation_for_unit(unit=None, rid=None): |
374 | 440 | """Get the json represenation of a unit's relation""" | 581 | """Get the json represenation of a unit's relation""" |
375 | @@ -478,6 +619,24 @@ | |||
376 | 478 | return yaml.safe_load(md) | 619 | return yaml.safe_load(md) |
377 | 479 | 620 | ||
378 | 480 | 621 | ||
379 | 622 | def _metadata_unit(unit): | ||
380 | 623 | """Given the name of a unit (e.g. apache2/0), get the unit charm's | ||
381 | 624 | metadata.yaml. Very similar to metadata() but allows us to inspect | ||
382 | 625 | other units. Unit needs to be co-located, such as a subordinate or | ||
383 | 626 | principal/primary. | ||
384 | 627 | |||
385 | 628 | :returns: metadata.yaml as a python object. | ||
386 | 629 | |||
387 | 630 | """ | ||
388 | 631 | basedir = os.sep.join(charm_dir().split(os.sep)[:-2]) | ||
389 | 632 | unitdir = 'unit-{}'.format(unit.replace(os.sep, '-')) | ||
390 | 633 | joineddir = os.path.join(basedir, unitdir, 'charm', 'metadata.yaml') | ||
391 | 634 | if not os.path.exists(joineddir): | ||
392 | 635 | return None | ||
393 | 636 | with open(joineddir) as md: | ||
394 | 637 | return yaml.safe_load(md) | ||
395 | 638 | |||
396 | 639 | |||
397 | 481 | @cached | 640 | @cached |
398 | 482 | def relation_types(): | 641 | def relation_types(): |
399 | 483 | """Get a list of relation types supported by this charm""" | 642 | """Get a list of relation types supported by this charm""" |
400 | @@ -602,18 +761,31 @@ | |||
401 | 602 | return False | 761 | return False |
402 | 603 | 762 | ||
403 | 604 | 763 | ||
404 | 764 | def _port_op(op_name, port, protocol="TCP"): | ||
405 | 765 | """Open or close a service network port""" | ||
406 | 766 | _args = [op_name] | ||
407 | 767 | icmp = protocol.upper() == "ICMP" | ||
408 | 768 | if icmp: | ||
409 | 769 | _args.append(protocol) | ||
410 | 770 | else: | ||
411 | 771 | _args.append('{}/{}'.format(port, protocol)) | ||
412 | 772 | try: | ||
413 | 773 | subprocess.check_call(_args) | ||
414 | 774 | except subprocess.CalledProcessError: | ||
415 | 775 | # Older Juju pre 2.3 doesn't support ICMP | ||
416 | 776 | # so treat it as a no-op if it fails. | ||
417 | 777 | if not icmp: | ||
418 | 778 | raise | ||
419 | 779 | |||
420 | 780 | |||
421 | 605 | def open_port(port, protocol="TCP"): | 781 | def open_port(port, protocol="TCP"): |
422 | 606 | """Open a service network port""" | 782 | """Open a service network port""" |
426 | 607 | _args = ['open-port'] | 783 | _port_op('open-port', port, protocol) |
424 | 608 | _args.append('{}/{}'.format(port, protocol)) | ||
425 | 609 | subprocess.check_call(_args) | ||
427 | 610 | 784 | ||
428 | 611 | 785 | ||
429 | 612 | def close_port(port, protocol="TCP"): | 786 | def close_port(port, protocol="TCP"): |
430 | 613 | """Close a service network port""" | 787 | """Close a service network port""" |
434 | 614 | _args = ['close-port'] | 788 | _port_op('close-port', port, protocol) |
432 | 615 | _args.append('{}/{}'.format(port, protocol)) | ||
433 | 616 | subprocess.check_call(_args) | ||
435 | 617 | 789 | ||
436 | 618 | 790 | ||
437 | 619 | def open_ports(start, end, protocol="TCP"): | 791 | def open_ports(start, end, protocol="TCP"): |
438 | @@ -630,6 +802,17 @@ | |||
439 | 630 | subprocess.check_call(_args) | 802 | subprocess.check_call(_args) |
440 | 631 | 803 | ||
441 | 632 | 804 | ||
442 | 805 | def opened_ports(): | ||
443 | 806 | """Get the opened ports | ||
444 | 807 | |||
445 | 808 | *Note that this will only show ports opened in a previous hook* | ||
446 | 809 | |||
447 | 810 | :returns: Opened ports as a list of strings: ``['8080/tcp', '8081-8083/tcp']`` | ||
448 | 811 | """ | ||
449 | 812 | _args = ['opened-ports', '--format=json'] | ||
450 | 813 | return json.loads(subprocess.check_output(_args).decode('UTF-8')) | ||
451 | 814 | |||
452 | 815 | |||
453 | 633 | @cached | 816 | @cached |
454 | 634 | def unit_get(attribute): | 817 | def unit_get(attribute): |
455 | 635 | """Get the unit ID for the remote unit""" | 818 | """Get the unit ID for the remote unit""" |
456 | @@ -751,8 +934,15 @@ | |||
457 | 751 | return wrapper | 934 | return wrapper |
458 | 752 | 935 | ||
459 | 753 | 936 | ||
460 | 937 | class NoNetworkBinding(Exception): | ||
461 | 938 | pass | ||
462 | 939 | |||
463 | 940 | |||
464 | 754 | def charm_dir(): | 941 | def charm_dir(): |
465 | 755 | """Return the root directory of the current charm""" | 942 | """Return the root directory of the current charm""" |
466 | 943 | d = os.environ.get('JUJU_CHARM_DIR') | ||
467 | 944 | if d is not None: | ||
468 | 945 | return d | ||
469 | 756 | return os.environ.get('CHARM_DIR') | 946 | return os.environ.get('CHARM_DIR') |
470 | 757 | 947 | ||
471 | 758 | 948 | ||
472 | @@ -874,6 +1064,14 @@ | |||
473 | 874 | 1064 | ||
474 | 875 | 1065 | ||
475 | 876 | @translate_exc(from_exc=OSError, to_exc=NotImplementedError) | 1066 | @translate_exc(from_exc=OSError, to_exc=NotImplementedError) |
476 | 1067 | @cached | ||
477 | 1068 | def goal_state(): | ||
478 | 1069 | """Juju goal state values""" | ||
479 | 1070 | cmd = ['goal-state', '--format=json'] | ||
480 | 1071 | return json.loads(subprocess.check_output(cmd).decode('UTF-8')) | ||
481 | 1072 | |||
482 | 1073 | |||
483 | 1074 | @translate_exc(from_exc=OSError, to_exc=NotImplementedError) | ||
484 | 877 | def is_leader(): | 1075 | def is_leader(): |
485 | 878 | """Does the current unit hold the juju leadership | 1076 | """Does the current unit hold the juju leadership |
486 | 879 | 1077 | ||
487 | @@ -967,7 +1165,6 @@ | |||
488 | 967 | universal_newlines=True).strip() | 1165 | universal_newlines=True).strip() |
489 | 968 | 1166 | ||
490 | 969 | 1167 | ||
491 | 970 | @cached | ||
492 | 971 | def has_juju_version(minimum_version): | 1168 | def has_juju_version(minimum_version): |
493 | 972 | """Return True if the Juju version is at least the provided version""" | 1169 | """Return True if the Juju version is at least the provided version""" |
494 | 973 | return LooseVersion(juju_version()) >= LooseVersion(minimum_version) | 1170 | return LooseVersion(juju_version()) >= LooseVersion(minimum_version) |
495 | @@ -1027,6 +1224,8 @@ | |||
496 | 1027 | @translate_exc(from_exc=OSError, to_exc=NotImplementedError) | 1224 | @translate_exc(from_exc=OSError, to_exc=NotImplementedError) |
497 | 1028 | def network_get_primary_address(binding): | 1225 | def network_get_primary_address(binding): |
498 | 1029 | ''' | 1226 | ''' |
499 | 1227 | Deprecated since Juju 2.3; use network_get() | ||
500 | 1228 | |||
501 | 1030 | Retrieve the primary network address for a named binding | 1229 | Retrieve the primary network address for a named binding |
502 | 1031 | 1230 | ||
503 | 1032 | :param binding: string. The name of a relation of extra-binding | 1231 | :param binding: string. The name of a relation of extra-binding |
504 | @@ -1034,7 +1233,41 @@ | |||
505 | 1034 | :raise: NotImplementedError if run on Juju < 2.0 | 1233 | :raise: NotImplementedError if run on Juju < 2.0 |
506 | 1035 | ''' | 1234 | ''' |
507 | 1036 | cmd = ['network-get', '--primary-address', binding] | 1235 | cmd = ['network-get', '--primary-address', binding] |
509 | 1037 | return subprocess.check_output(cmd).decode('UTF-8').strip() | 1236 | try: |
510 | 1237 | response = subprocess.check_output( | ||
511 | 1238 | cmd, | ||
512 | 1239 | stderr=subprocess.STDOUT).decode('UTF-8').strip() | ||
513 | 1240 | except CalledProcessError as e: | ||
514 | 1241 | if 'no network config found for binding' in e.output.decode('UTF-8'): | ||
515 | 1242 | raise NoNetworkBinding("No network binding for {}" | ||
516 | 1243 | .format(binding)) | ||
517 | 1244 | else: | ||
518 | 1245 | raise | ||
519 | 1246 | return response | ||
520 | 1247 | |||
521 | 1248 | |||
522 | 1249 | def network_get(endpoint, relation_id=None): | ||
523 | 1250 | """ | ||
524 | 1251 | Retrieve the network details for a relation endpoint | ||
525 | 1252 | |||
526 | 1253 | :param endpoint: string. The name of a relation endpoint | ||
527 | 1254 | :param relation_id: int. The ID of the relation for the current context. | ||
528 | 1255 | :return: dict. The loaded YAML output of the network-get query. | ||
529 | 1256 | :raise: NotImplementedError if request not supported by the Juju version. | ||
530 | 1257 | """ | ||
531 | 1258 | if not has_juju_version('2.2'): | ||
532 | 1259 | raise NotImplementedError(juju_version()) # earlier versions require --primary-address | ||
533 | 1260 | if relation_id and not has_juju_version('2.3'): | ||
534 | 1261 | raise NotImplementedError # 2.3 added the -r option | ||
535 | 1262 | |||
536 | 1263 | cmd = ['network-get', endpoint, '--format', 'yaml'] | ||
537 | 1264 | if relation_id: | ||
538 | 1265 | cmd.append('-r') | ||
539 | 1266 | cmd.append(relation_id) | ||
540 | 1267 | response = subprocess.check_output( | ||
541 | 1268 | cmd, | ||
542 | 1269 | stderr=subprocess.STDOUT).decode('UTF-8').strip() | ||
543 | 1270 | return yaml.safe_load(response) | ||
544 | 1038 | 1271 | ||
545 | 1039 | 1272 | ||
546 | 1040 | def add_metric(*args, **kwargs): | 1273 | def add_metric(*args, **kwargs): |
547 | @@ -1066,3 +1299,192 @@ | |||
548 | 1066 | """Get the meter status information, if running in the meter-status-changed | 1299 | """Get the meter status information, if running in the meter-status-changed |
549 | 1067 | hook.""" | 1300 | hook.""" |
550 | 1068 | return os.environ.get('JUJU_METER_INFO') | 1301 | return os.environ.get('JUJU_METER_INFO') |
551 | 1302 | |||
552 | 1303 | |||
553 | 1304 | def iter_units_for_relation_name(relation_name): | ||
554 | 1305 | """Iterate through all units in a relation | ||
555 | 1306 | |||
556 | 1307 | Generator that iterates through all the units in a relation and yields | ||
557 | 1308 | a named tuple with rid and unit field names. | ||
558 | 1309 | |||
559 | 1310 | Usage: | ||
560 | 1311 | data = [(u.rid, u.unit) | ||
561 | 1312 | for u in iter_units_for_relation_name(relation_name)] | ||
562 | 1313 | |||
563 | 1314 | :param relation_name: string relation name | ||
564 | 1315 | :yield: Named Tuple with rid and unit field names | ||
565 | 1316 | """ | ||
566 | 1317 | RelatedUnit = namedtuple('RelatedUnit', 'rid, unit') | ||
567 | 1318 | for rid in relation_ids(relation_name): | ||
568 | 1319 | for unit in related_units(rid): | ||
569 | 1320 | yield RelatedUnit(rid, unit) | ||
570 | 1321 | |||
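The new `iter_units_for_relation_name` helper above flattens relation data into named tuples. A minimal sketch of the same generator pattern, using a hypothetical dict in place of `relation_ids()`/`related_units()` (which shell out to the Juju hook tools and are not available outside a hook context):

```python
from collections import namedtuple

# Hypothetical relation data standing in for hookenv.relation_ids() and
# related_units(), which shell out to the Juju hook tools.
FAKE_RELATIONS = {
    'db:0': ['postgresql/0', 'postgresql/1'],
    'db:1': ['postgresql/2'],
}

RelatedUnit = namedtuple('RelatedUnit', 'rid, unit')


def iter_units_for_relation_name(relations):
    """Yield a RelatedUnit for every unit in every relation."""
    for rid in sorted(relations):
        for unit in relations[rid]:
            yield RelatedUnit(rid, unit)


data = [(u.rid, u.unit) for u in iter_units_for_relation_name(FAKE_RELATIONS)]
```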
571 | 1322 | |||
572 | 1323 | def ingress_address(rid=None, unit=None): | ||
573 | 1324 | """ | ||
574 | 1325 | Retrieve the ingress-address from a relation when available. | ||
575 | 1326 | Otherwise, return the private-address. | ||
576 | 1327 | |||
577 | 1328 | When used on the consuming side of the relation (unit is a remote | ||
578 | 1329 | unit), the ingress-address is the IP address that this unit needs | ||
579 | 1330 | to use to reach the provided service on the remote unit. | ||
580 | 1331 | |||
581 | 1332 | When used on the providing side of the relation (unit == local_unit()), | ||
582 | 1333 | the ingress-address is the IP address that is advertised to remote | ||
583 | 1334 | units on this relation. Remote units need to use this address to | ||
584 | 1335 | reach the local provided service on this unit. | ||
585 | 1336 | |||
586 | 1337 | Note that charms may document some other method to use in | ||
587 | 1338 | preference to the ingress_address(), such as an address provided | ||
588 | 1339 | on a different relation attribute or a service discovery mechanism. | ||
589 | 1340 | This allows charms to redirect inbound connections to their peers | ||
590 | 1341 | or different applications such as load balancers. | ||
591 | 1342 | |||
592 | 1343 | Usage: | ||
593 | 1344 | addresses = [ingress_address(rid=u.rid, unit=u.unit) | ||
594 | 1345 | for u in iter_units_for_relation_name(relation_name)] | ||
595 | 1346 | |||
596 | 1347 | :param rid: string relation id | ||
597 | 1348 | :param unit: string unit name | ||
598 | 1349 | :side effect: calls relation_get | ||
599 | 1350 | :return: string IP address | ||
600 | 1351 | """ | ||
601 | 1352 | settings = relation_get(rid=rid, unit=unit) | ||
602 | 1353 | return (settings.get('ingress-address') or | ||
603 | 1354 | settings.get('private-address')) | ||
604 | 1355 | |||
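The fallback logic in `ingress_address` is a one-liner worth seeing in isolation. A sketch that takes the relation settings dict directly instead of calling `relation_get`:

```python
def ingress_address(settings):
    # Prefer the network-space-aware 'ingress-address', falling back to the
    # legacy 'private-address' key published by older Juju versions.
    return (settings.get('ingress-address') or
            settings.get('private-address'))


new_style = {'ingress-address': '10.10.0.5', 'private-address': '192.168.0.5'}
old_style = {'private-address': '192.168.0.5'}
```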
605 | 1356 | |||
606 | 1357 | def egress_subnets(rid=None, unit=None): | ||
607 | 1358 | """ | ||
608 | 1359 | Retrieve the egress-subnets from a relation. | ||
609 | 1360 | |||
610 | 1361 | This function is to be used on the providing side of the | ||
611 | 1362 | relation, and provides the ranges of addresses that client | ||
612 | 1363 | connections may come from. The result is uninteresting on | ||
613 | 1364 | the consuming side of a relation (unit == local_unit()). | ||
614 | 1365 | |||
615 | 1366 | Returns a stable list of subnets in CIDR format. | ||
616 | 1367 | eg. ['192.168.1.0/24', '2001::F00F/128'] | ||
617 | 1368 | |||
618 | 1369 | If egress-subnets is not available, falls back to using the published | ||
619 | 1370 | ingress-address, or finally private-address. | ||
620 | 1371 | |||
621 | 1372 | :param rid: string relation id | ||
622 | 1373 | :param unit: string unit name | ||
623 | 1374 | :side effect: calls relation_get | ||
624 | 1375 | :return: list of subnets in CIDR format. eg. ['192.168.1.0/24', '2001::F00F/128'] | ||
625 | 1376 | """ | ||
626 | 1377 | def _to_range(addr): | ||
627 | 1378 | if re.search(r'^(?:\d{1,3}\.){3}\d{1,3}$', addr) is not None: | ||
628 | 1379 | addr += '/32' | ||
629 | 1380 | elif ':' in addr and '/' not in addr: # IPv6 | ||
630 | 1381 | addr += '/128' | ||
631 | 1382 | return addr | ||
632 | 1383 | |||
633 | 1384 | settings = relation_get(rid=rid, unit=unit) | ||
634 | 1385 | if 'egress-subnets' in settings: | ||
635 | 1386 | return [n.strip() for n in settings['egress-subnets'].split(',') if n.strip()] | ||
636 | 1387 | if 'ingress-address' in settings: | ||
637 | 1388 | return [_to_range(settings['ingress-address'])] | ||
638 | 1389 | if 'private-address' in settings: | ||
639 | 1390 | return [_to_range(settings['private-address'])] | ||
640 | 1391 | return [] # Should never happen | ||
641 | 1392 | |||
642 | 1393 | |||
643 | 1394 | def unit_doomed(unit=None): | ||
644 | 1395 | """Determines if the unit is being removed from the model | ||
645 | 1396 | |||
646 | 1397 | Requires Juju 2.4.1. | ||
647 | 1398 | |||
648 | 1399 | :param unit: string unit name, defaults to local_unit | ||
649 | 1400 | :side effect: calls goal_state | ||
650 | 1401 | :side effect: calls local_unit | ||
651 | 1402 | :side effect: calls has_juju_version | ||
652 | 1403 | :return: True if the unit is being removed, already gone, or never existed | ||
653 | 1404 | """ | ||
654 | 1405 | if not has_juju_version("2.4.1"): | ||
655 | 1406 | # We cannot risk blindly returning False for 'we don't know', | ||
656 | 1407 | # because that could cause data loss; if call sites don't | ||
657 | 1408 | # need an accurate answer, they likely don't need this helper | ||
658 | 1409 | # at all. | ||
659 | 1410 | # goal-state existed in 2.4.0, but did not handle removals | ||
660 | 1411 | # correctly until 2.4.1. | ||
661 | 1412 | raise NotImplementedError("unit_doomed") | ||
662 | 1413 | if unit is None: | ||
663 | 1414 | unit = local_unit() | ||
664 | 1415 | gs = goal_state() | ||
665 | 1416 | units = gs.get('units', {}) | ||
666 | 1417 | if unit not in units: | ||
667 | 1418 | return True | ||
668 | 1419 | # I don't think 'dead' units ever show up in the goal-state, but | ||
669 | 1420 | # check anyway in addition to 'dying'. | ||
670 | 1421 | return units[unit]['status'] in ('dying', 'dead') | ||
671 | 1422 | |||
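The goal-state check at the heart of `unit_doomed` can be sketched against a plain dict of the shape `goal-state` returns (the Juju calls themselves are unavailable outside a hook):

```python
def unit_doomed(goal_state, unit):
    # A unit absent from goal-state has already been removed (or never
    # existed); a 'dying' or 'dead' status means removal is in progress.
    units = goal_state.get('units', {})
    if unit not in units:
        return True
    return units[unit]['status'] in ('dying', 'dead')


gs = {'units': {'app/0': {'status': 'active'},
                'app/1': {'status': 'dying'}}}
```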
672 | 1423 | |||
673 | 1424 | def env_proxy_settings(selected_settings=None): | ||
674 | 1425 | """Get proxy settings from process environment variables. | ||
675 | 1426 | |||
676 | 1427 | Get charm proxy settings from environment variables that correspond to | ||
677 | 1428 | juju-http-proxy, juju-https-proxy and juju-no-proxy (available as of 2.4.2, | ||
678 | 1429 | see lp:1782236) in a format suitable for passing to an application that | ||
679 | 1430 | reacts to proxy settings passed as environment variables. Some applications | ||
680 | 1431 | support lowercase or uppercase notation (e.g. curl), some support only | ||
681 | 1432 | lowercase (e.g. wget), and a few rare cases support only uppercase. | ||
682 | 1433 | no_proxy CIDR and wildcard support also varies | ||
683 | 1434 | between runtimes and applications as there is no enforced standard. | ||
684 | 1435 | |||
685 | 1436 | Some applications may connect to multiple destinations and expose config | ||
686 | 1437 | options that would affect only proxy settings for a specific destination; | ||
687 | 1438 | these should be handled in charms in an application-specific manner. | ||
688 | 1439 | |||
689 | 1440 | :param selected_settings: format only a subset of possible settings | ||
690 | 1441 | :type selected_settings: list | ||
691 | 1442 | :rtype: Optional[dict[str, str]] | ||
692 | 1443 | """ | ||
693 | 1444 | SUPPORTED_SETTINGS = { | ||
694 | 1445 | 'http': 'HTTP_PROXY', | ||
695 | 1446 | 'https': 'HTTPS_PROXY', | ||
696 | 1447 | 'no_proxy': 'NO_PROXY', | ||
697 | 1448 | 'ftp': 'FTP_PROXY' | ||
698 | 1449 | } | ||
699 | 1450 | if selected_settings is None: | ||
700 | 1451 | selected_settings = SUPPORTED_SETTINGS | ||
701 | 1452 | |||
702 | 1453 | selected_vars = [v for k, v in SUPPORTED_SETTINGS.items() | ||
703 | 1454 | if k in selected_settings] | ||
704 | 1455 | proxy_settings = {} | ||
705 | 1456 | for var in selected_vars: | ||
706 | 1457 | var_val = os.getenv(var) | ||
707 | 1458 | if var_val: | ||
708 | 1459 | proxy_settings[var] = var_val | ||
709 | 1460 | proxy_settings[var.lower()] = var_val | ||
710 | 1461 | # Now handle juju-prefixed environment variables. The legacy vs new | ||
711 | 1462 | # environment variable usage is mutually exclusive | ||
712 | 1463 | charm_var_val = os.getenv('JUJU_CHARM_{}'.format(var)) | ||
713 | 1464 | if charm_var_val: | ||
714 | 1465 | proxy_settings[var] = charm_var_val | ||
715 | 1466 | proxy_settings[var.lower()] = charm_var_val | ||
716 | 1467 | if 'no_proxy' in proxy_settings: | ||
717 | 1468 | if _contains_range(proxy_settings['no_proxy']): | ||
718 | 1469 | log(RANGE_WARNING, level=WARNING) | ||
719 | 1470 | return proxy_settings if proxy_settings else None | ||
720 | 1471 | |||
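The precedence rule in `env_proxy_settings` (Juju-set `JUJU_CHARM_*` variables win over the legacy plain variables, and each value is emitted in both upper- and lowercase spellings) can be sketched with the environment passed in as a dict:

```python
SUPPORTED_SETTINGS = {
    'http': 'HTTP_PROXY',
    'https': 'HTTPS_PROXY',
    'no_proxy': 'NO_PROXY',
    'ftp': 'FTP_PROXY',
}


def env_proxy_settings(environ, selected_settings=None):
    # Each value is duplicated in upper- and lowercase because tool support
    # varies: curl accepts both, wget wants lowercase.
    if selected_settings is None:
        selected_settings = SUPPORTED_SETTINGS
    proxy_settings = {}
    for key, var in SUPPORTED_SETTINGS.items():
        if key not in selected_settings:
            continue
        value = environ.get('JUJU_CHARM_' + var) or environ.get(var)
        if value:
            proxy_settings[var] = value
            proxy_settings[var.lower()] = value
    return proxy_settings or None


env = {'JUJU_CHARM_HTTP_PROXY': 'http://squid:3128', 'NO_PROXY': 'localhost'}
```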
721 | 1472 | |||
722 | 1473 | def _contains_range(addresses): | ||
723 | 1474 | """Check for cidr or wildcard domain in a string. | ||
724 | 1475 | |||
725 | 1476 | Given a string comprising a comma-separated list of IP addresses | ||
726 | 1477 | and domain names, determine whether the string contains IP ranges | ||
727 | 1478 | or wildcard domains. | ||
728 | 1479 | |||
729 | 1480 | :param addresses: comma-separated list of domains and IP addresses. | ||
730 | 1481 | :type addresses: str | ||
731 | 1482 | """ | ||
732 | 1483 | return ( | ||
733 | 1484 | # Test for cidr (e.g. 10.20.20.0/24) | ||
734 | 1485 | "/" in addresses or | ||
735 | 1486 | # Test for wildcard domains (*.foo.com or .foo.com) | ||
736 | 1487 | "*" in addresses or | ||
737 | 1488 | addresses.startswith(".") or | ||
738 | 1489 | ",." in addresses or | ||
739 | 1490 | " ." in addresses) | ||
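The heuristics in `_contains_range` are cheap string tests, so they are easy to exercise directly. A standalone copy of the check:

```python
def contains_range(addresses):
    # CIDR ("/") or wildcard domains ("*", leading ".", ",." or " .") are
    # flagged because runtimes disagree on whether no_proxy honours them.
    return (
        "/" in addresses or
        "*" in addresses or
        addresses.startswith(".") or
        ",." in addresses or
        " ." in addresses)
```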
740 | 1069 | 1491 | ||
741 | === modified file 'hooks/charmhelpers/core/host.py' | |||
742 | --- hooks/charmhelpers/core/host.py 2017-03-03 19:56:10 +0000 | |||
743 | +++ hooks/charmhelpers/core/host.py 2019-04-18 01:13:06 +0000 | |||
744 | @@ -34,28 +34,33 @@ | |||
745 | 34 | 34 | ||
746 | 35 | from contextlib import contextmanager | 35 | from contextlib import contextmanager |
747 | 36 | from collections import OrderedDict | 36 | from collections import OrderedDict |
749 | 37 | from .hookenv import log | 37 | from .hookenv import log, INFO, DEBUG, local_unit, charm_name |
750 | 38 | from .fstab import Fstab | 38 | from .fstab import Fstab |
751 | 39 | from charmhelpers.osplatform import get_platform | 39 | from charmhelpers.osplatform import get_platform |
752 | 40 | 40 | ||
753 | 41 | __platform__ = get_platform() | 41 | __platform__ = get_platform() |
754 | 42 | if __platform__ == "ubuntu": | 42 | if __platform__ == "ubuntu": |
756 | 43 | from charmhelpers.core.host_factory.ubuntu import ( | 43 | from charmhelpers.core.host_factory.ubuntu import ( # NOQA:F401 |
757 | 44 | service_available, | 44 | service_available, |
758 | 45 | add_new_group, | 45 | add_new_group, |
759 | 46 | lsb_release, | 46 | lsb_release, |
760 | 47 | cmp_pkgrevno, | 47 | cmp_pkgrevno, |
761 | 48 | CompareHostReleases, | ||
762 | 49 | get_distrib_codename, | ||
763 | 50 | arch | ||
764 | 48 | ) # flake8: noqa -- ignore F401 for this import | 51 | ) # flake8: noqa -- ignore F401 for this import |
765 | 49 | elif __platform__ == "centos": | 52 | elif __platform__ == "centos": |
767 | 50 | from charmhelpers.core.host_factory.centos import ( | 53 | from charmhelpers.core.host_factory.centos import ( # NOQA:F401 |
768 | 51 | service_available, | 54 | service_available, |
769 | 52 | add_new_group, | 55 | add_new_group, |
770 | 53 | lsb_release, | 56 | lsb_release, |
771 | 54 | cmp_pkgrevno, | 57 | cmp_pkgrevno, |
772 | 58 | CompareHostReleases, | ||
773 | 55 | ) # flake8: noqa -- ignore F401 for this import | 59 | ) # flake8: noqa -- ignore F401 for this import |
774 | 56 | 60 | ||
775 | 57 | UPDATEDB_PATH = '/etc/updatedb.conf' | 61 | UPDATEDB_PATH = '/etc/updatedb.conf' |
776 | 58 | 62 | ||
777 | 63 | |||
778 | 59 | def service_start(service_name, **kwargs): | 64 | def service_start(service_name, **kwargs): |
779 | 60 | """Start a system service. | 65 | """Start a system service. |
780 | 61 | 66 | ||
781 | @@ -190,6 +195,7 @@ | |||
782 | 190 | sysv_file = os.path.join(initd_dir, service_name) | 195 | sysv_file = os.path.join(initd_dir, service_name) |
783 | 191 | if init_is_systemd(): | 196 | if init_is_systemd(): |
784 | 192 | service('disable', service_name) | 197 | service('disable', service_name) |
785 | 198 | service('mask', service_name) | ||
786 | 193 | elif os.path.exists(upstart_file): | 199 | elif os.path.exists(upstart_file): |
787 | 194 | override_path = os.path.join( | 200 | override_path = os.path.join( |
788 | 195 | init_dir, '{}.override'.format(service_name)) | 201 | init_dir, '{}.override'.format(service_name)) |
789 | @@ -222,6 +228,7 @@ | |||
790 | 222 | upstart_file = os.path.join(init_dir, "{}.conf".format(service_name)) | 228 | upstart_file = os.path.join(init_dir, "{}.conf".format(service_name)) |
791 | 223 | sysv_file = os.path.join(initd_dir, service_name) | 229 | sysv_file = os.path.join(initd_dir, service_name) |
792 | 224 | if init_is_systemd(): | 230 | if init_is_systemd(): |
793 | 231 | service('unmask', service_name) | ||
794 | 225 | service('enable', service_name) | 232 | service('enable', service_name) |
795 | 226 | elif os.path.exists(upstart_file): | 233 | elif os.path.exists(upstart_file): |
796 | 227 | override_path = os.path.join( | 234 | override_path = os.path.join( |
797 | @@ -283,8 +290,8 @@ | |||
798 | 283 | for key, value in six.iteritems(kwargs): | 290 | for key, value in six.iteritems(kwargs): |
799 | 284 | parameter = '%s=%s' % (key, value) | 291 | parameter = '%s=%s' % (key, value) |
800 | 285 | cmd.append(parameter) | 292 | cmd.append(parameter) |
803 | 286 | output = subprocess.check_output(cmd, | 293 | output = subprocess.check_output( |
804 | 287 | stderr=subprocess.STDOUT).decode('UTF-8') | 294 | cmd, stderr=subprocess.STDOUT).decode('UTF-8') |
805 | 288 | except subprocess.CalledProcessError: | 295 | except subprocess.CalledProcessError: |
806 | 289 | return False | 296 | return False |
807 | 290 | else: | 297 | else: |
808 | @@ -306,6 +313,8 @@ | |||
809 | 306 | 313 | ||
810 | 307 | def init_is_systemd(): | 314 | def init_is_systemd(): |
811 | 308 | """Return True if the host system uses systemd, False otherwise.""" | 315 | """Return True if the host system uses systemd, False otherwise.""" |
812 | 316 | if lsb_release()['DISTRIB_CODENAME'] == 'trusty': | ||
813 | 317 | return False | ||
814 | 309 | return os.path.isdir(SYSTEMD_SYSTEM) | 318 | return os.path.isdir(SYSTEMD_SYSTEM) |
815 | 310 | 319 | ||
816 | 311 | 320 | ||
817 | @@ -435,6 +444,51 @@ | |||
818 | 435 | subprocess.check_call(cmd) | 444 | subprocess.check_call(cmd) |
819 | 436 | 445 | ||
820 | 437 | 446 | ||
821 | 447 | def chage(username, lastday=None, expiredate=None, inactive=None, | ||
822 | 448 | mindays=None, maxdays=None, root=None, warndays=None): | ||
823 | 449 | """Change user password expiry information | ||
824 | 450 | |||
825 | 451 | :param str username: User to update | ||
826 | 452 | :param str lastday: Set when password was changed in YYYY-MM-DD format | ||
827 | 453 | :param str expiredate: Set when user's account will no longer be | ||
828 | 454 | accessible in YYYY-MM-DD format. | ||
829 | 455 | -1 will remove an account expiration date. | ||
830 | 456 | :param str inactive: Set the number of days of inactivity after a password | ||
831 | 457 | has expired before the account is locked. | ||
832 | 458 | -1 will remove an account's inactivity. | ||
833 | 459 | :param str mindays: Set the minimum number of days between password | ||
834 | 460 | changes to MIN_DAYS. | ||
835 | 461 | 0 indicates the password can be changed anytime. | ||
836 | 462 | :param str maxdays: Set the maximum number of days during which a | ||
837 | 463 | password is valid. | ||
838 | 464 | -1 as MAX_DAYS will remove checking maxdays | ||
839 | 465 | :param str root: Apply changes in the CHROOT_DIR directory | ||
840 | 466 | :param str warndays: Set the number of days of warning before a password | ||
841 | 467 | change is required | ||
842 | 468 | :raises subprocess.CalledProcessError: if call to chage fails | ||
843 | 469 | """ | ||
844 | 470 | cmd = ['chage'] | ||
845 | 471 | if root: | ||
846 | 472 | cmd.extend(['--root', root]) | ||
847 | 473 | if lastday: | ||
848 | 474 | cmd.extend(['--lastday', lastday]) | ||
849 | 475 | if expiredate: | ||
850 | 476 | cmd.extend(['--expiredate', expiredate]) | ||
851 | 477 | if inactive: | ||
852 | 478 | cmd.extend(['--inactive', inactive]) | ||
853 | 479 | if mindays: | ||
854 | 480 | cmd.extend(['--mindays', mindays]) | ||
855 | 481 | if maxdays: | ||
856 | 482 | cmd.extend(['--maxdays', maxdays]) | ||
857 | 483 | if warndays: | ||
858 | 484 | cmd.extend(['--warndays', warndays]) | ||
859 | 485 | cmd.append(username) | ||
860 | 486 | subprocess.check_call(cmd) | ||
861 | 487 | |||
862 | 488 | |||
863 | 489 | remove_password_expiry = functools.partial(chage, expiredate='-1', inactive='-1', mindays='0', maxdays='-1') | ||
864 | 490 | |||
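`chage()` builds its argv from whichever options are supplied, and `remove_password_expiry` is just a `functools.partial` over it. A sketch of the command construction only (no `subprocess` call, so it runs anywhere; `build_chage_cmd` is a hypothetical name):

```python
import functools


def build_chage_cmd(username, lastday=None, expiredate=None, inactive=None,
                    mindays=None, maxdays=None, root=None, warndays=None):
    # Mirrors how chage() assembles its argv before subprocess.check_call.
    cmd = ['chage']
    for flag, value in (('--root', root), ('--lastday', lastday),
                        ('--expiredate', expiredate), ('--inactive', inactive),
                        ('--mindays', mindays), ('--maxdays', maxdays),
                        ('--warndays', warndays)):
        if value:
            cmd.extend([flag, value])
    cmd.append(username)
    return cmd


remove_password_expiry_cmd = functools.partial(
    build_chage_cmd, expiredate='-1', inactive='-1', mindays='0', maxdays='-1')
```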
865 | 491 | |||
866 | 438 | def rsync(from_path, to_path, flags='-r', options=None, timeout=None): | 492 | def rsync(from_path, to_path, flags='-r', options=None, timeout=None): |
867 | 439 | """Replicate the contents of a path""" | 493 | """Replicate the contents of a path""" |
868 | 440 | options = options or ['--delete', '--executability'] | 494 | options = options or ['--delete', '--executability'] |
869 | @@ -481,13 +535,45 @@ | |||
870 | 481 | 535 | ||
871 | 482 | def write_file(path, content, owner='root', group='root', perms=0o444): | 536 | def write_file(path, content, owner='root', group='root', perms=0o444): |
872 | 483 | """Create or overwrite a file with the contents of a byte string.""" | 537 | """Create or overwrite a file with the contents of a byte string.""" |
873 | 484 | log("Writing file {} {}:{} {:o}".format(path, owner, group, perms)) | ||
874 | 485 | uid = pwd.getpwnam(owner).pw_uid | 538 | uid = pwd.getpwnam(owner).pw_uid |
875 | 486 | gid = grp.getgrnam(group).gr_gid | 539 | gid = grp.getgrnam(group).gr_gid |
880 | 487 | with open(path, 'wb') as target: | 540 | # let's see if we can grab the file and compare the content, to avoid doing
881 | 488 | os.fchown(target.fileno(), uid, gid) | 541 | # a write. |
882 | 489 | os.fchmod(target.fileno(), perms) | 542 | existing_content = None |
883 | 490 | target.write(content) | 543 | existing_uid, existing_gid, existing_perms = None, None, None |
884 | 544 | try: | ||
885 | 545 | with open(path, 'rb') as target: | ||
886 | 546 | existing_content = target.read() | ||
887 | 547 | stat = os.stat(path) | ||
888 | 548 | existing_uid, existing_gid, existing_perms = ( | ||
889 | 549 | stat.st_uid, stat.st_gid, stat.st_mode | ||
890 | 550 | ) | ||
891 | 551 | except Exception: | ||
892 | 552 | pass | ||
893 | 553 | if content != existing_content: | ||
894 | 554 | log("Writing file {} {}:{} {:o}".format(path, owner, group, perms), | ||
895 | 555 | level=DEBUG) | ||
896 | 556 | with open(path, 'wb') as target: | ||
897 | 557 | os.fchown(target.fileno(), uid, gid) | ||
898 | 558 | os.fchmod(target.fileno(), perms) | ||
899 | 559 | if six.PY3 and isinstance(content, six.string_types): | ||
900 | 560 | content = content.encode('UTF-8') | ||
901 | 561 | target.write(content) | ||
902 | 562 | return | ||
903 | 563 | # the contents were the same, but we might still need to change the | ||
904 | 564 | # ownership or permissions. | ||
905 | 565 | if existing_uid != uid: | ||
906 | 566 | log("Changing uid on already existing content: {} -> {}" | ||
907 | 567 | .format(existing_uid, uid), level=DEBUG) | ||
908 | 568 | os.chown(path, uid, -1) | ||
909 | 569 | if existing_gid != gid: | ||
910 | 570 | log("Changing gid on already existing content: {} -> {}" | ||
911 | 571 | .format(existing_gid, gid), level=DEBUG) | ||
912 | 572 | os.chown(path, -1, gid) | ||
913 | 573 | if existing_perms != perms: | ||
914 | 574 | log("Changing permissions on existing content: {} -> {}" | ||
915 | 575 | .format(existing_perms, perms), level=DEBUG) | ||
916 | 576 | os.chmod(path, perms) | ||
917 | 491 | 577 | ||
918 | 492 | 578 | ||
919 | 493 | def fstab_remove(mp): | 579 | def fstab_remove(mp): |
920 | @@ -752,7 +838,7 @@ | |||
921 | 752 | ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n') | 838 | ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n') |
922 | 753 | ip_output = (line.strip() for line in ip_output if line) | 839 | ip_output = (line.strip() for line in ip_output if line) |
923 | 754 | 840 | ||
925 | 755 | key = re.compile('^[0-9]+:\s+(.+):') | 841 | key = re.compile(r'^[0-9]+:\s+(.+):') |
926 | 756 | for line in ip_output: | 842 | for line in ip_output: |
927 | 757 | matched = re.search(key, line) | 843 | matched = re.search(key, line) |
928 | 758 | if matched: | 844 | if matched: |
929 | @@ -897,6 +983,20 @@ | |||
930 | 897 | 983 | ||
931 | 898 | 984 | ||
932 | 899 | def add_to_updatedb_prunepath(path, updatedb_path=UPDATEDB_PATH): | 985 | def add_to_updatedb_prunepath(path, updatedb_path=UPDATEDB_PATH): |
933 | 986 | """Adds the specified path to mlocate's updatedb.conf PRUNEPATHS list. | ||
934 | 987 | |||
935 | 988 | This method has no effect if the path specified by updatedb_path does not | ||
936 | 989 | exist or is not a file. | ||
937 | 990 | |||
938 | 991 | @param path: string the path to add to the updatedb.conf PRUNEPATHS value | ||
939 | 992 | @param updatedb_path: the path to the updatedb.conf file | ||
940 | 993 | """ | ||
941 | 994 | if not os.path.exists(updatedb_path) or os.path.isdir(updatedb_path): | ||
942 | 995 | # If the updatedb.conf file doesn't exist then don't attempt to update | ||
943 | 996 | # the file as the package providing mlocate may not be installed on | ||
944 | 997 | # the local system | ||
945 | 998 | return | ||
946 | 999 | |||
947 | 900 | with open(updatedb_path, 'r+') as f_id: | 1000 | with open(updatedb_path, 'r+') as f_id: |
948 | 901 | updatedb_text = f_id.read() | 1001 | updatedb_text = f_id.read() |
949 | 902 | output = updatedb(updatedb_text, path) | 1002 | output = updatedb(updatedb_text, path) |
950 | @@ -916,3 +1016,62 @@ | |||
951 | 916 | lines[i] = 'PRUNEPATHS="{}"'.format(' '.join(paths)) | 1016 | lines[i] = 'PRUNEPATHS="{}"'.format(' '.join(paths)) |
952 | 917 | output = "\n".join(lines) | 1017 | output = "\n".join(lines) |
953 | 918 | return output | 1018 | return output |
954 | 1019 | |||
955 | 1020 | |||
956 | 1021 | def modulo_distribution(modulo=3, wait=30, non_zero_wait=False): | ||
957 | 1022 | """ Modulo distribution | ||
958 | 1023 | |||
959 | 1024 | This helper uses the unit number, a modulo value and a constant wait time | ||
960 | 1025 | to produce a calculated wait time distribution. This is useful in large | ||
961 | 1026 | scale deployments to distribute load during an expensive operation such as | ||
962 | 1027 | service restarts. | ||
963 | 1028 | |||
964 | 1029 | If you have 1000 nodes that need to restart, 100 at a time, one | ||
965 | 1030 | minute apart: | ||
966 | 1031 | |||
967 | 1032 | time.wait(modulo_distribution(modulo=100, wait=60)) | ||
968 | 1033 | restart() | ||
969 | 1034 | |||
970 | 1035 | If you need restarts to happen serially set modulo to the exact number of | ||
971 | 1036 | nodes and set a high constant wait time: | ||
972 | 1037 | |||
973 | 1038 | time.wait(modulo_distribution(modulo=10, wait=120)) | ||
974 | 1039 | restart() | ||
975 | 1040 | |||
976 | 1041 | @param modulo: int The modulo number creates the group distribution | ||
977 | 1042 | @param wait: int The constant time wait value | ||
978 | 1043 | @param non_zero_wait: boolean Override unit % modulo == 0, | ||
979 | 1044 | return modulo * wait. Used to avoid collisions with | ||
980 | 1045 | leader nodes which are often given priority. | ||
981 | 1046 | @return: int Calculated time to wait for unit operation | ||
982 | 1047 | """ | ||
983 | 1048 | unit_number = int(local_unit().split('/')[1]) | ||
984 | 1049 | calculated_wait_time = (unit_number % modulo) * wait | ||
985 | 1050 | if non_zero_wait and calculated_wait_time == 0: | ||
986 | 1051 | return modulo * wait | ||
987 | 1052 | else: | ||
988 | 1053 | return calculated_wait_time | ||
989 | 1054 | |||
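`modulo_distribution` is pure arithmetic once the unit number is known, so it can be checked standalone. A sketch that takes the unit number as a parameter instead of parsing it from `local_unit()` (e.g. `'app/7'` -> `7`):

```python
def modulo_distribution(unit_number, modulo=3, wait=30, non_zero_wait=False):
    # Units in the same modulo group share a wait time; non_zero_wait pushes
    # group 0 (often the leader) to the back of the queue instead.
    calculated_wait = (unit_number % modulo) * wait
    if non_zero_wait and calculated_wait == 0:
        return modulo * wait
    return calculated_wait


waits = [modulo_distribution(n, modulo=3, wait=30) for n in range(6)]
```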
990 | 1055 | |||
991 | 1056 | def install_ca_cert(ca_cert, name=None): | ||
992 | 1057 | """ | ||
993 | 1058 | Install the given cert as a trusted CA. | ||
994 | 1059 | |||
995 | 1060 | The ``name`` is the stem of the filename where the cert is written, and if | ||
996 | 1061 | not provided, it will default to ``juju-{charm_name}``. | ||
997 | 1062 | |||
998 | 1063 | If the cert is empty or None, or is unchanged, nothing is done. | ||
999 | 1064 | """ | ||
1000 | 1065 | if not ca_cert: | ||
1001 | 1066 | return | ||
1002 | 1067 | if not isinstance(ca_cert, bytes): | ||
1003 | 1068 | ca_cert = ca_cert.encode('utf8') | ||
1004 | 1069 | if not name: | ||
1005 | 1070 | name = 'juju-{}'.format(charm_name()) | ||
1006 | 1071 | cert_file = '/usr/local/share/ca-certificates/{}.crt'.format(name) | ||
1007 | 1072 | new_hash = hashlib.md5(ca_cert).hexdigest() | ||
1008 | 1073 | if file_hash(cert_file) == new_hash: | ||
1009 | 1074 | return | ||
1010 | 1075 | log("Installing new CA cert at: {}".format(cert_file), level=INFO) | ||
1011 | 1076 | write_file(cert_file, ca_cert) | ||
1012 | 1077 | subprocess.check_call(['update-ca-certificates', '--fresh']) | ||
1013 | 919 | 1078 | ||
1014 | === modified file 'hooks/charmhelpers/core/host_factory/centos.py' | |||
1015 | --- hooks/charmhelpers/core/host_factory/centos.py 2017-03-03 19:56:10 +0000 | |||
1016 | +++ hooks/charmhelpers/core/host_factory/centos.py 2019-04-18 01:13:06 +0000 | |||
1017 | @@ -2,6 +2,22 @@ | |||
1018 | 2 | import yum | 2 | import yum |
1019 | 3 | import os | 3 | import os |
1020 | 4 | 4 | ||
1021 | 5 | from charmhelpers.core.strutils import BasicStringComparator | ||
1022 | 6 | |||
1023 | 7 | |||
1024 | 8 | class CompareHostReleases(BasicStringComparator): | ||
1025 | 9 | """Provide comparisons of Host releases. | ||
1026 | 10 | |||
1027 | 11 | Use in the form of | ||
1028 | 12 | |||
1029 | 13 | if CompareHostReleases(release) > 'trusty': | ||
1030 | 14 | # do something with mitaka | ||
1031 | 15 | """ | ||
1032 | 16 | |||
1033 | 17 | def __init__(self, item): | ||
1034 | 18 | raise NotImplementedError( | ||
1035 | 19 | "CompareHostReleases() is not implemented for CentOS") | ||
1036 | 20 | |||
1037 | 5 | 21 | ||
1038 | 6 | def service_available(service_name): | 22 | def service_available(service_name): |
1039 | 7 | # """Determine whether a system service is available.""" | 23 | # """Determine whether a system service is available.""" |
1040 | 8 | 24 | ||
1041 | === modified file 'hooks/charmhelpers/core/host_factory/ubuntu.py' | |||
1042 | --- hooks/charmhelpers/core/host_factory/ubuntu.py 2017-03-03 19:56:10 +0000 | |||
1043 | +++ hooks/charmhelpers/core/host_factory/ubuntu.py 2019-04-18 01:13:06 +0000 | |||
1044 | @@ -1,5 +1,42 @@ | |||
1045 | 1 | import subprocess | 1 | import subprocess |
1046 | 2 | 2 | ||
1047 | 3 | from charmhelpers.core.hookenv import cached | ||
1048 | 4 | from charmhelpers.core.strutils import BasicStringComparator | ||
1049 | 5 | |||
1050 | 6 | |||
1051 | 7 | UBUNTU_RELEASES = ( | ||
1052 | 8 | 'lucid', | ||
1053 | 9 | 'maverick', | ||
1054 | 10 | 'natty', | ||
1055 | 11 | 'oneiric', | ||
1056 | 12 | 'precise', | ||
1057 | 13 | 'quantal', | ||
1058 | 14 | 'raring', | ||
1059 | 15 | 'saucy', | ||
1060 | 16 | 'trusty', | ||
1061 | 17 | 'utopic', | ||
1062 | 18 | 'vivid', | ||
1063 | 19 | 'wily', | ||
1064 | 20 | 'xenial', | ||
1065 | 21 | 'yakkety', | ||
1066 | 22 | 'zesty', | ||
1067 | 23 | 'artful', | ||
1068 | 24 | 'bionic', | ||
1069 | 25 | 'cosmic', | ||
1070 | 26 | 'disco', | ||
1071 | 27 | ) | ||
1072 | 28 | |||
1073 | 29 | |||
1074 | 30 | class CompareHostReleases(BasicStringComparator): | ||
1075 | 31 | """Provide comparisons of Ubuntu releases. | ||
1076 | 32 | |||
1077 | 33 | Use in the form of | ||
1078 | 34 | |||
1079 | 35 | if CompareHostReleases(release) > 'trusty': | ||
1080 | 36 | # do something with mitaka | ||
1081 | 37 | """ | ||
1082 | 38 | _list = UBUNTU_RELEASES | ||
1083 | 39 | |||
1084 | 3 | 40 | ||
1085 | 4 | def service_available(service_name): | 41 | def service_available(service_name): |
1086 | 5 | """Determine whether a system service is available""" | 42 | """Determine whether a system service is available""" |
1087 | @@ -37,6 +74,14 @@ | |||
1088 | 37 | return d | 74 | return d |
1089 | 38 | 75 | ||
1090 | 39 | 76 | ||
1091 | 77 | def get_distrib_codename(): | ||
1092 | 78 | """Return the codename of the distribution | ||
1093 | 79 | :returns: The codename | ||
1094 | 80 | :rtype: str | ||
1095 | 81 | """ | ||
1096 | 82 | return lsb_release()['DISTRIB_CODENAME'].lower() | ||
1097 | 83 | |||
1098 | 84 | |||
1099 | 40 | def cmp_pkgrevno(package, revno, pkgcache=None): | 85 | def cmp_pkgrevno(package, revno, pkgcache=None): |
1100 | 41 | """Compare supplied revno with the revno of the installed package. | 86 | """Compare supplied revno with the revno of the installed package. |
1101 | 42 | 87 | ||
1102 | @@ -54,3 +99,16 @@ | |||
1103 | 54 | pkgcache = apt_cache() | 99 | pkgcache = apt_cache() |
1104 | 55 | pkg = pkgcache[package] | 100 | pkg = pkgcache[package] |
1105 | 56 | return apt_pkg.version_compare(pkg.current_ver.ver_str, revno) | 101 | return apt_pkg.version_compare(pkg.current_ver.ver_str, revno) |
1106 | 102 | |||
1107 | 103 | |||
1108 | 104 | @cached | ||
1109 | 105 | def arch(): | ||
1110 | 106 | """Return the package architecture as a string. | ||
1111 | 107 | |||
1112 | 108 | :returns: the architecture | ||
1113 | 109 | :rtype: str | ||
1114 | 110 | :raises: subprocess.CalledProcessError if dpkg command fails | ||
1115 | 111 | """ | ||
1116 | 112 | return subprocess.check_output( | ||
1117 | 113 | ['dpkg', '--print-architecture'] | ||
1118 | 114 | ).rstrip().decode('UTF-8') | ||
1119 | 57 | 115 | ||
1120 | === modified file 'hooks/charmhelpers/core/kernel.py' | |||
1121 | --- hooks/charmhelpers/core/kernel.py 2017-03-03 19:56:10 +0000 | |||
1122 | +++ hooks/charmhelpers/core/kernel.py 2019-04-18 01:13:06 +0000 | |||
1123 | @@ -26,12 +26,12 @@ | |||
1124 | 26 | 26 | ||
1125 | 27 | __platform__ = get_platform() | 27 | __platform__ = get_platform() |
1126 | 28 | if __platform__ == "ubuntu": | 28 | if __platform__ == "ubuntu": |
1128 | 29 | from charmhelpers.core.kernel_factory.ubuntu import ( | 29 | from charmhelpers.core.kernel_factory.ubuntu import ( # NOQA:F401 |
1129 | 30 | persistent_modprobe, | 30 | persistent_modprobe, |
1130 | 31 | update_initramfs, | 31 | update_initramfs, |
1131 | 32 | ) # flake8: noqa -- ignore F401 for this import | 32 | ) # flake8: noqa -- ignore F401 for this import |
1132 | 33 | elif __platform__ == "centos": | 33 | elif __platform__ == "centos": |
1134 | 34 | from charmhelpers.core.kernel_factory.centos import ( | 34 | from charmhelpers.core.kernel_factory.centos import ( # NOQA:F401 |
1135 | 35 | persistent_modprobe, | 35 | persistent_modprobe, |
1136 | 36 | update_initramfs, | 36 | update_initramfs, |
1137 | 37 | ) # flake8: noqa -- ignore F401 for this import | 37 | ) # flake8: noqa -- ignore F401 for this import |
1138 | 38 | 38 | ||
1139 | === modified file 'hooks/charmhelpers/core/services/base.py' | |||
1140 | --- hooks/charmhelpers/core/services/base.py 2017-03-03 19:56:10 +0000 | |||
1141 | +++ hooks/charmhelpers/core/services/base.py 2019-04-18 01:13:06 +0000 | |||
1142 | @@ -307,23 +307,34 @@ | |||
1143 | 307 | """ | 307 | """ |
1144 | 308 | def __call__(self, manager, service_name, event_name): | 308 | def __call__(self, manager, service_name, event_name): |
1145 | 309 | service = manager.get_service(service_name) | 309 | service = manager.get_service(service_name) |
1147 | 310 | new_ports = service.get('ports', []) | 310 | # turn this generator into a list, |
1148 | 311 | # as we'll be going over it multiple times | ||
1149 | 312 | new_ports = list(service.get('ports', [])) | ||
1150 | 311 | port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name)) | 313 | port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name)) |
1151 | 312 | if os.path.exists(port_file): | 314 | if os.path.exists(port_file): |
1152 | 313 | with open(port_file) as fp: | 315 | with open(port_file) as fp: |
1153 | 314 | old_ports = fp.read().split(',') | 316 | old_ports = fp.read().split(',') |
1154 | 315 | for old_port in old_ports: | 317 | for old_port in old_ports: |
1159 | 316 | if bool(old_port): | 318 | if bool(old_port) and not self.ports_contains(old_port, new_ports): |
1160 | 317 | old_port = int(old_port) | 319 | hookenv.close_port(old_port) |
1157 | 318 | if old_port not in new_ports: | ||
1158 | 319 | hookenv.close_port(old_port) | ||
1161 | 320 | with open(port_file, 'w') as fp: | 320 | with open(port_file, 'w') as fp: |
1162 | 321 | fp.write(','.join(str(port) for port in new_ports)) | 321 | fp.write(','.join(str(port) for port in new_ports)) |
1163 | 322 | for port in new_ports: | 322 | for port in new_ports: |
1164 | 323 | # A port is either a number or 'ICMP' | ||
1165 | 324 | protocol = 'TCP' | ||
1166 | 325 | if str(port).upper() == 'ICMP': | ||
1167 | 326 | protocol = 'ICMP' | ||
1168 | 323 | if event_name == 'start': | 327 | if event_name == 'start': |
1170 | 324 | hookenv.open_port(port) | 328 | hookenv.open_port(port, protocol) |
1171 | 325 | elif event_name == 'stop': | 329 | elif event_name == 'stop': |
1173 | 326 | hookenv.close_port(port) | 330 | hookenv.close_port(port, protocol) |
1174 | 331 | |||
1175 | 332 | def ports_contains(self, port, ports): | ||
1176 | 333 | if not bool(port): | ||
1177 | 334 | return False | ||
1178 | 335 | if str(port).upper() != 'ICMP': | ||
1179 | 336 | port = int(port) | ||
1180 | 337 | return port in ports | ||
1181 | 327 | 338 | ||
1182 | 328 | 339 | ||
1183 | 329 | def service_stop(service_name): | 340 | def service_stop(service_name): |
1184 | 330 | 341 | ||
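The new `ports_contains()` helper in this hunk is easy to exercise standalone. The function below is a copy of the method's logic with `self` dropped; it is a sketch for illustration, not an import from the charm:

```python
def ports_contains(port, ports):
    # Copy of the helper above, minus `self`: empty strings read back
    # from the .ports file never match, 'ICMP' is compared as a string,
    # and anything else is coerced to int before the membership test.
    if not bool(port):
        return False
    if str(port).upper() != 'ICMP':
        port = int(port)
    return port in ports


# Ports persisted to the .ports file come back as strings; the int()
# coercion is what lets '8080' match the configured port 8080.
assert ports_contains('8080', [8080, 'ICMP'])
assert ports_contains('ICMP', [8080, 'ICMP'])
assert not ports_contains('', [8080])
```

This is why the old `old_port = int(old_port)` line had to go: with ICMP in the ports list, `int('ICMP')` would raise before `close_port` was ever reached.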
1185 | === modified file 'hooks/charmhelpers/core/strutils.py' | |||
1186 | --- hooks/charmhelpers/core/strutils.py 2017-03-03 19:56:10 +0000 | |||
1187 | +++ hooks/charmhelpers/core/strutils.py 2019-04-18 01:13:06 +0000 | |||
1188 | @@ -61,10 +61,69 @@ | |||
1189 | 61 | if isinstance(value, six.string_types): | 61 | if isinstance(value, six.string_types): |
1190 | 62 | value = six.text_type(value) | 62 | value = six.text_type(value) |
1191 | 63 | else: | 63 | else: |
1193 | 64 | msg = "Unable to interpret non-string value '%s' as boolean" % (value) | 64 | msg = "Unable to interpret non-string value '%s' as bytes" % (value) |
1194 | 65 | raise ValueError(msg) | 65 | raise ValueError(msg) |
1195 | 66 | matches = re.match("([0-9]+)([a-zA-Z]+)", value) | 66 | matches = re.match("([0-9]+)([a-zA-Z]+)", value) |
1200 | 67 | if not matches: | 67 | if matches: |
1201 | 68 | msg = "Unable to interpret string value '%s' as bytes" % (value) | 68 | size = int(matches.group(1)) * (1024 ** BYTE_POWER[matches.group(2)]) |
1202 | 69 | raise ValueError(msg) | 69 | else: |
1203 | 70 | return int(matches.group(1)) * (1024 ** BYTE_POWER[matches.group(2)]) | 70 | # Assume that value passed in is bytes |
1204 | 71 | try: | ||
1205 | 72 | size = int(value) | ||
1206 | 73 | except ValueError: | ||
1207 | 74 | msg = "Unable to interpret string value '%s' as bytes" % (value) | ||
1208 | 75 | raise ValueError(msg) | ||
1209 | 76 | return size | ||
1210 | 77 | |||
1211 | 78 | |||
1212 | 79 | class BasicStringComparator(object): | ||
1213 | 80 | """Provides a class that will compare strings from an iterator type object. | ||
1214 | 81 | Used to provide > and < comparisons on strings that may not necessarily be | ||
1215 | 82 | alphanumerically ordered. e.g. OpenStack or Ubuntu releases AFTER the | ||
1216 | 83 | z-wrap. | ||
1217 | 84 | """ | ||
1218 | 85 | |||
1219 | 86 | _list = None | ||
1220 | 87 | |||
1221 | 88 | def __init__(self, item): | ||
1222 | 89 | if self._list is None: | ||
1223 | 90 | raise Exception("Must define the _list in the class definition!") | ||
1224 | 91 | try: | ||
1225 | 92 | self.index = self._list.index(item) | ||
1226 | 93 | except Exception: | ||
1227 | 94 | raise KeyError("Item '{}' is not in list '{}'" | ||
1228 | 95 | .format(item, self._list)) | ||
1229 | 96 | |||
1230 | 97 | def __eq__(self, other): | ||
1231 | 98 | assert isinstance(other, str) or isinstance(other, self.__class__) | ||
1232 | 99 | return self.index == self._list.index(other) | ||
1233 | 100 | |||
1234 | 101 | def __ne__(self, other): | ||
1235 | 102 | return not self.__eq__(other) | ||
1236 | 103 | |||
1237 | 104 | def __lt__(self, other): | ||
1238 | 105 | assert isinstance(other, str) or isinstance(other, self.__class__) | ||
1239 | 106 | return self.index < self._list.index(other) | ||
1240 | 107 | |||
1241 | 108 | def __ge__(self, other): | ||
1242 | 109 | return not self.__lt__(other) | ||
1243 | 110 | |||
1244 | 111 | def __gt__(self, other): | ||
1245 | 112 | assert isinstance(other, str) or isinstance(other, self.__class__) | ||
1246 | 113 | return self.index > self._list.index(other) | ||
1247 | 114 | |||
1248 | 115 | def __le__(self, other): | ||
1249 | 116 | return not self.__gt__(other) | ||
1250 | 117 | |||
1251 | 118 | def __str__(self): | ||
1252 | 119 | """Always give back the item at the index so it can be used in | ||
1253 | 120 | comparisons like: | ||
1254 | 121 | |||
1255 | 122 | s_mitaka = CompareOpenStack('mitaka') | ||
1256 | 123 | s_newton = CompareOpenstack('newton') | ||
1257 | 124 | |||
1258 | 125 | assert s_newton > s_mitaka | ||
1259 | 126 | |||
1260 | 127 | @returns: <string> | ||
1261 | 128 | """ | ||
1262 | 129 | return self._list[self.index] | ||
1263 | 71 | 130 | ||
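The new `BasicStringComparator` is meant to be subclassed with an ordered `_list`. A minimal usage sketch follows; the trimmed class mirrors the diff, while `CompareUbuntu` and its release list are illustrative, not part of the patch:

```python
class BasicStringComparator(object):
    """Order strings by their position in a subclass-supplied _list."""
    _list = None

    def __init__(self, item):
        if self._list is None:
            raise Exception("Must define the _list in the class definition!")
        try:
            self.index = self._list.index(item)
        except ValueError:
            raise KeyError("Item '{}' is not in list '{}'"
                           .format(item, self._list))

    def __eq__(self, other):
        return self.index == self._list.index(str(other))

    def __lt__(self, other):
        return self.index < self._list.index(str(other))

    def __gt__(self, other):
        return self.index > self._list.index(str(other))

    def __str__(self):
        return self._list[self.index]


class CompareUbuntu(BasicStringComparator):
    # Illustrative subclass: after the 'z' wrap, release names stop
    # sorting alphabetically, which is exactly what this class fixes.
    _list = ['xenial', 'yakkety', 'zesty', 'artful', 'bionic']


assert CompareUbuntu('artful') > CompareUbuntu('zesty')  # 'a' < 'z' lexically
assert CompareUbuntu('bionic') == 'bionic'
```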
1264 | === modified file 'hooks/charmhelpers/core/sysctl.py' | |||
1265 | --- hooks/charmhelpers/core/sysctl.py 2017-03-03 19:56:10 +0000 | |||
1266 | +++ hooks/charmhelpers/core/sysctl.py 2019-04-18 01:13:06 +0000 | |||
1267 | @@ -28,27 +28,38 @@ | |||
1268 | 28 | __author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>' | 28 | __author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>' |
1269 | 29 | 29 | ||
1270 | 30 | 30 | ||
1272 | 31 | def create(sysctl_dict, sysctl_file): | 31 | def create(sysctl_dict, sysctl_file, ignore=False): |
1273 | 32 | """Creates a sysctl.conf file from a YAML associative array | 32 | """Creates a sysctl.conf file from a YAML associative array |
1274 | 33 | 33 | ||
1276 | 34 | :param sysctl_dict: a YAML-formatted string of sysctl options eg "{ 'kernel.max_pid': 1337 }" | 34 | :param sysctl_dict: a dict or YAML-formatted string of sysctl |
1277 | 35 | options eg "{ 'kernel.max_pid': 1337 }" | ||
1278 | 35 | :type sysctl_dict: str | 36 | :type sysctl_dict: str |
1279 | 36 | :param sysctl_file: path to the sysctl file to be saved | 37 | :param sysctl_file: path to the sysctl file to be saved |
1280 | 37 | :type sysctl_file: str or unicode | 38 | :type sysctl_file: str or unicode |
1281 | 39 | :param ignore: If True, ignore "unknown variable" errors. | ||
1282 | 40 | :type ignore: bool | ||
1283 | 38 | :returns: None | 41 | :returns: None |
1284 | 39 | """ | 42 | """ |
1291 | 40 | try: | 43 | if type(sysctl_dict) is not dict: |
1292 | 41 | sysctl_dict_parsed = yaml.safe_load(sysctl_dict) | 44 | try: |
1293 | 42 | except yaml.YAMLError: | 45 | sysctl_dict_parsed = yaml.safe_load(sysctl_dict) |
1294 | 43 | log("Error parsing YAML sysctl_dict: {}".format(sysctl_dict), | 46 | except yaml.YAMLError: |
1295 | 44 | level=ERROR) | 47 | log("Error parsing YAML sysctl_dict: {}".format(sysctl_dict), |
1296 | 45 | return | 48 | level=ERROR) |
1297 | 49 | return | ||
1298 | 50 | else: | ||
1299 | 51 | sysctl_dict_parsed = sysctl_dict | ||
1300 | 46 | 52 | ||
1301 | 47 | with open(sysctl_file, "w") as fd: | 53 | with open(sysctl_file, "w") as fd: |
1302 | 48 | for key, value in sysctl_dict_parsed.items(): | 54 | for key, value in sysctl_dict_parsed.items(): |
1303 | 49 | fd.write("{}={}\n".format(key, value)) | 55 | fd.write("{}={}\n".format(key, value)) |
1304 | 50 | 56 | ||
1306 | 51 | log("Updating sysctl_file: %s values: %s" % (sysctl_file, sysctl_dict_parsed), | 57 | log("Updating sysctl_file: {} values: {}".format(sysctl_file, |
1307 | 58 | sysctl_dict_parsed), | ||
1308 | 52 | level=DEBUG) | 59 | level=DEBUG) |
1309 | 53 | 60 | ||
1311 | 54 | check_call(["sysctl", "-p", sysctl_file]) | 61 | call = ["sysctl", "-p", sysctl_file] |
1312 | 62 | if ignore: | ||
1313 | 63 | call.append("-e") | ||
1314 | 64 | |||
1315 | 65 | check_call(call) | ||
1316 | 55 | 66 | ||
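The tail of the patched `create()` only changes how the `sysctl` command line is assembled, and that part can be checked in isolation. The function name below is ours for the sketch, not charmhelpers API:

```python
def build_sysctl_call(sysctl_file, ignore=False):
    # Mirrors the end of create(): with ignore=True, sysctl gets '-e'
    # so unknown variables are skipped instead of aborting the hook.
    call = ["sysctl", "-p", sysctl_file]
    if ignore:
        call.append("-e")
    return call


assert build_sysctl_call("/etc/sysctl.d/50-charm.conf") == \
    ["sysctl", "-p", "/etc/sysctl.d/50-charm.conf"]
assert build_sysctl_call("/etc/sysctl.d/50-charm.conf", ignore=True)[-1] == "-e"
```

The `ignore` flag matters in containers, where some kernel tunables simply do not exist and a hard `check_call` failure would otherwise error the hook.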
1317 | === modified file 'hooks/charmhelpers/core/templating.py' | |||
1318 | --- hooks/charmhelpers/core/templating.py 2017-03-03 19:56:10 +0000 | |||
1319 | +++ hooks/charmhelpers/core/templating.py 2019-04-18 01:13:06 +0000 | |||
1320 | @@ -20,7 +20,8 @@ | |||
1321 | 20 | 20 | ||
1322 | 21 | 21 | ||
1323 | 22 | def render(source, target, context, owner='root', group='root', | 22 | def render(source, target, context, owner='root', group='root', |
1325 | 23 | perms=0o444, templates_dir=None, encoding='UTF-8', template_loader=None): | 23 | perms=0o444, templates_dir=None, encoding='UTF-8', |
1326 | 24 | template_loader=None, config_template=None): | ||
1327 | 24 | """ | 25 | """ |
1328 | 25 | Render a template. | 26 | Render a template. |
1329 | 26 | 27 | ||
1330 | @@ -32,6 +33,9 @@ | |||
1331 | 32 | The context should be a dict containing the values to be replaced in the | 33 | The context should be a dict containing the values to be replaced in the |
1332 | 33 | template. | 34 | template. |
1333 | 34 | 35 | ||
1334 | 36 | config_template may be provided to render from a provided template instead | ||
1335 | 37 | of loading from a file. | ||
1336 | 38 | |||
1337 | 35 | The `owner`, `group`, and `perms` options will be passed to `write_file`. | 39 | The `owner`, `group`, and `perms` options will be passed to `write_file`. |
1338 | 36 | 40 | ||
1339 | 37 | If omitted, `templates_dir` defaults to the `templates` folder in the charm. | 41 | If omitted, `templates_dir` defaults to the `templates` folder in the charm. |
1340 | @@ -65,14 +69,19 @@ | |||
1341 | 65 | if templates_dir is None: | 69 | if templates_dir is None: |
1342 | 66 | templates_dir = os.path.join(hookenv.charm_dir(), 'templates') | 70 | templates_dir = os.path.join(hookenv.charm_dir(), 'templates') |
1343 | 67 | template_env = Environment(loader=FileSystemLoader(templates_dir)) | 71 | template_env = Environment(loader=FileSystemLoader(templates_dir)) |
1352 | 68 | try: | 72 | |
1353 | 69 | source = source | 73 | # load from a string if provided explicitly |
1354 | 70 | template = template_env.get_template(source) | 74 | if config_template is not None: |
1355 | 71 | except exceptions.TemplateNotFound as e: | 75 | template = template_env.from_string(config_template) |
1356 | 72 | hookenv.log('Could not load template %s from %s.' % | 76 | else: |
1357 | 73 | (source, templates_dir), | 77 | try: |
1358 | 74 | level=hookenv.ERROR) | 78 | source = source |
1359 | 75 | raise e | 79 | template = template_env.get_template(source) |
1360 | 80 | except exceptions.TemplateNotFound as e: | ||
1361 | 81 | hookenv.log('Could not load template %s from %s.' % | ||
1362 | 82 | (source, templates_dir), | ||
1363 | 83 | level=hookenv.ERROR) | ||
1364 | 84 | raise e | ||
1365 | 76 | content = template.render(context) | 85 | content = template.render(context) |
1366 | 77 | if target is not None: | 86 | if target is not None: |
1367 | 78 | target_dir = os.path.dirname(target) | 87 | target_dir = os.path.dirname(target) |
1368 | 79 | 88 | ||
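The new `config_template` branch in `render()` reduces to a small selection rule: an explicit template string wins over loading `source` from the templates directory. Here it is isolated with a stub standing in for the Jinja2 environment; `choose_template` and `FakeEnv` are ours, for the sketch only:

```python
def choose_template(template_env, source, config_template=None):
    # Mirrors the new branch in render(): an explicit template string
    # wins; otherwise `source` is looked up in the templates dir.
    if config_template is not None:
        return template_env.from_string(config_template)
    return template_env.get_template(source)


class FakeEnv(object):
    # Stand-in for jinja2.Environment, just enough for this sketch.
    def from_string(self, s):
        return ('from_string', s)

    def get_template(self, name):
        return ('get_template', name)


env = FakeEnv()
assert choose_template(env, 'client.conf') == ('get_template', 'client.conf')
assert choose_template(env, 'client.conf', 'x = {{ x }}') == \
    ('from_string', 'x = {{ x }}')
```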
1369 | === modified file 'hooks/charmhelpers/core/unitdata.py' | |||
1370 | --- hooks/charmhelpers/core/unitdata.py 2017-03-03 19:56:10 +0000 | |||
1371 | +++ hooks/charmhelpers/core/unitdata.py 2019-04-18 01:13:06 +0000 | |||
1372 | @@ -166,6 +166,10 @@ | |||
1373 | 166 | 166 | ||
1374 | 167 | To support dicts, lists, integer, floats, and booleans values | 167 | To support dicts, lists, integer, floats, and booleans values |
1375 | 168 | are automatically json encoded/decoded. | 168 | are automatically json encoded/decoded. |
1376 | 169 | |||
1377 | 170 | Note: to facilitate unit testing, ':memory:' can be passed as the | ||
1378 | 171 | path parameter which causes sqlite3 to only build the db in memory. | ||
1379 | 172 | This should only be used for testing purposes. | ||
1380 | 169 | """ | 173 | """ |
1381 | 170 | def __init__(self, path=None): | 174 | def __init__(self, path=None): |
1382 | 171 | self.db_path = path | 175 | self.db_path = path |
1383 | @@ -175,6 +179,9 @@ | |||
1384 | 175 | else: | 179 | else: |
1385 | 176 | self.db_path = os.path.join( | 180 | self.db_path = os.path.join( |
1386 | 177 | os.environ.get('CHARM_DIR', ''), '.unit-state.db') | 181 | os.environ.get('CHARM_DIR', ''), '.unit-state.db') |
1387 | 182 | if self.db_path != ':memory:': | ||
1388 | 183 | with open(self.db_path, 'a') as f: | ||
1389 | 184 | os.fchmod(f.fileno(), 0o600) | ||
1390 | 178 | self.conn = sqlite3.connect('%s' % self.db_path) | 185 | self.conn = sqlite3.connect('%s' % self.db_path) |
1391 | 179 | self.cursor = self.conn.cursor() | 186 | self.cursor = self.conn.cursor() |
1392 | 180 | self.revision = None | 187 | self.revision = None |
1393 | @@ -358,7 +365,7 @@ | |||
1394 | 358 | try: | 365 | try: |
1395 | 359 | yield self.revision | 366 | yield self.revision |
1396 | 360 | self.revision = None | 367 | self.revision = None |
1398 | 361 | except: | 368 | except Exception: |
1399 | 362 | self.flush(False) | 369 | self.flush(False) |
1400 | 363 | self.revision = None | 370 | self.revision = None |
1401 | 364 | raise | 371 | raise |
1402 | 365 | 372 | ||
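The `':memory:'` special case piggybacks on sqlite3's built-in in-memory database, which is also why the `fchmod` is skipped: there is no file to chmod. A trimmed, self-contained sketch of the pattern (this `Storage` is a toy, not the real unitdata class):

```python
import json
import sqlite3


class Storage(object):
    # Toy version of unitdata.Storage: ':memory:' means no file is
    # created, so there is nothing to fchmod() to 0o600.
    def __init__(self, path=':memory:'):
        self.conn = sqlite3.connect(path)
        self.cursor = self.conn.cursor()
        self.cursor.execute('create table if not exists kv '
                            '(key text primary key, data text)')

    def set(self, key, value):
        # JSON-encode so dicts, lists, ints and bools round-trip.
        self.cursor.execute('replace into kv values (?, ?)',
                            (key, json.dumps(value)))

    def get(self, key):
        self.cursor.execute('select data from kv where key=?', (key,))
        row = self.cursor.fetchone()
        return json.loads(row[0]) if row else None


db = Storage(':memory:')
db.set('ports', [8080, 'ICMP'])
assert db.get('ports') == [8080, 'ICMP']
```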
1403 | === modified file 'hooks/charmhelpers/fetch/__init__.py' | |||
1404 | --- hooks/charmhelpers/fetch/__init__.py 2017-03-03 19:56:10 +0000 | |||
1405 | +++ hooks/charmhelpers/fetch/__init__.py 2019-04-18 01:13:06 +0000 | |||
1406 | @@ -48,6 +48,13 @@ | |||
1407 | 48 | pass | 48 | pass |
1408 | 49 | 49 | ||
1409 | 50 | 50 | ||
1410 | 51 | class GPGKeyError(Exception): | ||
1411 | 52 | """Exception occurs when a GPG key cannot be fetched or used. The message | ||
1412 | 53 | indicates what the problem is. | ||
1413 | 54 | """ | ||
1414 | 55 | pass | ||
1415 | 56 | |||
1416 | 57 | |||
1417 | 51 | class BaseFetchHandler(object): | 58 | class BaseFetchHandler(object): |
1418 | 52 | 59 | ||
1419 | 53 | """Base class for FetchHandler implementations in fetch plugins""" | 60 | """Base class for FetchHandler implementations in fetch plugins""" |
1420 | @@ -77,21 +84,24 @@ | |||
1421 | 77 | fetch = importlib.import_module(module) | 84 | fetch = importlib.import_module(module) |
1422 | 78 | 85 | ||
1423 | 79 | filter_installed_packages = fetch.filter_installed_packages | 86 | filter_installed_packages = fetch.filter_installed_packages |
1428 | 80 | install = fetch.install | 87 | filter_missing_packages = fetch.filter_missing_packages |
1429 | 81 | upgrade = fetch.upgrade | 88 | install = fetch.apt_install |
1430 | 82 | update = fetch.update | 89 | upgrade = fetch.apt_upgrade |
1431 | 83 | purge = fetch.purge | 90 | update = _fetch_update = fetch.apt_update |
1432 | 91 | purge = fetch.apt_purge | ||
1433 | 84 | add_source = fetch.add_source | 92 | add_source = fetch.add_source |
1434 | 85 | 93 | ||
1435 | 86 | if __platform__ == "ubuntu": | 94 | if __platform__ == "ubuntu": |
1436 | 87 | apt_cache = fetch.apt_cache | 95 | apt_cache = fetch.apt_cache |
1441 | 88 | apt_install = fetch.install | 96 | apt_install = fetch.apt_install |
1442 | 89 | apt_update = fetch.update | 97 | apt_update = fetch.apt_update |
1443 | 90 | apt_upgrade = fetch.upgrade | 98 | apt_upgrade = fetch.apt_upgrade |
1444 | 91 | apt_purge = fetch.purge | 99 | apt_purge = fetch.apt_purge |
1445 | 100 | apt_autoremove = fetch.apt_autoremove | ||
1446 | 92 | apt_mark = fetch.apt_mark | 101 | apt_mark = fetch.apt_mark |
1447 | 93 | apt_hold = fetch.apt_hold | 102 | apt_hold = fetch.apt_hold |
1448 | 94 | apt_unhold = fetch.apt_unhold | 103 | apt_unhold = fetch.apt_unhold |
1449 | 104 | import_key = fetch.import_key | ||
1450 | 95 | get_upstream_version = fetch.get_upstream_version | 105 | get_upstream_version = fetch.get_upstream_version |
1451 | 96 | elif __platform__ == "centos": | 106 | elif __platform__ == "centos": |
1452 | 97 | yum_search = fetch.yum_search | 107 | yum_search = fetch.yum_search |
1453 | @@ -135,7 +145,7 @@ | |||
1454 | 135 | for source, key in zip(sources, keys): | 145 | for source, key in zip(sources, keys): |
1455 | 136 | add_source(source, key) | 146 | add_source(source, key) |
1456 | 137 | if update: | 147 | if update: |
1458 | 138 | fetch.update(fatal=True) | 148 | _fetch_update(fatal=True) |
1459 | 139 | 149 | ||
1460 | 140 | 150 | ||
1461 | 141 | def install_remote(source, *args, **kwargs): | 151 | def install_remote(source, *args, **kwargs): |
1462 | 142 | 152 | ||
1463 | === modified file 'hooks/charmhelpers/fetch/archiveurl.py' | |||
1464 | --- hooks/charmhelpers/fetch/archiveurl.py 2017-03-03 19:56:10 +0000 | |||
1465 | +++ hooks/charmhelpers/fetch/archiveurl.py 2019-04-18 01:13:06 +0000 | |||
1466 | @@ -89,7 +89,7 @@ | |||
1467 | 89 | :param str source: URL pointing to an archive file. | 89 | :param str source: URL pointing to an archive file. |
1468 | 90 | :param str dest: Local path location to download archive file to. | 90 | :param str dest: Local path location to download archive file to. |
1469 | 91 | """ | 91 | """ |
1471 | 92 | # propogate all exceptions | 92 | # propagate all exceptions |
1472 | 93 | # URLError, OSError, etc | 93 | # URLError, OSError, etc |
1473 | 94 | proto, netloc, path, params, query, fragment = urlparse(source) | 94 | proto, netloc, path, params, query, fragment = urlparse(source) |
1474 | 95 | if proto in ('http', 'https'): | 95 | if proto in ('http', 'https'): |
1475 | 96 | 96 | ||
1476 | === modified file 'hooks/charmhelpers/fetch/bzrurl.py' | |||
1477 | --- hooks/charmhelpers/fetch/bzrurl.py 2017-03-03 19:56:10 +0000 | |||
1478 | +++ hooks/charmhelpers/fetch/bzrurl.py 2019-04-18 01:13:06 +0000 | |||
1479 | @@ -13,7 +13,7 @@ | |||
1480 | 13 | # limitations under the License. | 13 | # limitations under the License. |
1481 | 14 | 14 | ||
1482 | 15 | import os | 15 | import os |
1484 | 16 | from subprocess import check_call | 16 | from subprocess import STDOUT, check_output |
1485 | 17 | from charmhelpers.fetch import ( | 17 | from charmhelpers.fetch import ( |
1486 | 18 | BaseFetchHandler, | 18 | BaseFetchHandler, |
1487 | 19 | UnhandledSource, | 19 | UnhandledSource, |
1488 | @@ -55,7 +55,7 @@ | |||
1489 | 55 | cmd = ['bzr', 'branch'] | 55 | cmd = ['bzr', 'branch'] |
1490 | 56 | cmd += cmd_opts | 56 | cmd += cmd_opts |
1491 | 57 | cmd += [source, dest] | 57 | cmd += [source, dest] |
1493 | 58 | check_call(cmd) | 58 | check_output(cmd, stderr=STDOUT) |
1494 | 59 | 59 | ||
1495 | 60 | def install(self, source, dest=None, revno=None): | 60 | def install(self, source, dest=None, revno=None): |
1496 | 61 | url_parts = self.parse_url(source) | 61 | url_parts = self.parse_url(source) |
1497 | 62 | 62 | ||
1498 | === modified file 'hooks/charmhelpers/fetch/centos.py' | |||
1499 | --- hooks/charmhelpers/fetch/centos.py 2017-03-03 19:56:10 +0000 | |||
1500 | +++ hooks/charmhelpers/fetch/centos.py 2019-04-18 01:13:06 +0000 | |||
1501 | @@ -132,7 +132,7 @@ | |||
1502 | 132 | key_file.write(key) | 132 | key_file.write(key) |
1503 | 133 | key_file.flush() | 133 | key_file.flush() |
1504 | 134 | key_file.seek(0) | 134 | key_file.seek(0) |
1506 | 135 | subprocess.check_call(['rpm', '--import', key_file]) | 135 | subprocess.check_call(['rpm', '--import', key_file.name]) |
1507 | 136 | else: | 136 | else: |
1508 | 137 | subprocess.check_call(['rpm', '--import', key]) | 137 | subprocess.check_call(['rpm', '--import', key]) |
1509 | 138 | 138 | ||
1510 | 139 | 139 | ||
1511 | === modified file 'hooks/charmhelpers/fetch/giturl.py' | |||
1512 | --- hooks/charmhelpers/fetch/giturl.py 2017-03-03 19:56:10 +0000 | |||
1513 | +++ hooks/charmhelpers/fetch/giturl.py 2019-04-18 01:13:06 +0000 | |||
1514 | @@ -13,7 +13,7 @@ | |||
1515 | 13 | # limitations under the License. | 13 | # limitations under the License. |
1516 | 14 | 14 | ||
1517 | 15 | import os | 15 | import os |
1519 | 16 | from subprocess import check_call, CalledProcessError | 16 | from subprocess import check_output, CalledProcessError, STDOUT |
1520 | 17 | from charmhelpers.fetch import ( | 17 | from charmhelpers.fetch import ( |
1521 | 18 | BaseFetchHandler, | 18 | BaseFetchHandler, |
1522 | 19 | UnhandledSource, | 19 | UnhandledSource, |
1523 | @@ -50,7 +50,7 @@ | |||
1524 | 50 | cmd = ['git', 'clone', source, dest, '--branch', branch] | 50 | cmd = ['git', 'clone', source, dest, '--branch', branch] |
1525 | 51 | if depth: | 51 | if depth: |
1526 | 52 | cmd.extend(['--depth', depth]) | 52 | cmd.extend(['--depth', depth]) |
1528 | 53 | check_call(cmd) | 53 | check_output(cmd, stderr=STDOUT) |
1529 | 54 | 54 | ||
1530 | 55 | def install(self, source, branch="master", dest=None, depth=None): | 55 | def install(self, source, branch="master", dest=None, depth=None): |
1531 | 56 | url_parts = self.parse_url(source) | 56 | url_parts = self.parse_url(source) |
1532 | 57 | 57 | ||
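Both VCS handlers here swap `check_call` for `check_output(..., stderr=STDOUT)`. The practical gain is that a failed clone or branch leaves the tool's stderr in the raised `CalledProcessError`, where the hook can log it. A small demonstration with a stand-in command, using the current interpreter instead of git or bzr:

```python
import sys
from subprocess import STDOUT, CalledProcessError, check_output

# Simulate a VCS tool that prints an error to stderr and exits non-zero.
cmd = [sys.executable, '-c',
       'import sys; print("fatal: branch not found", file=sys.stderr); '
       'sys.exit(3)']

try:
    check_output(cmd, stderr=STDOUT)
    raise AssertionError("command unexpectedly succeeded")
except CalledProcessError as e:
    # stderr was folded into .output, so we can log *why* it failed,
    # not just the exit status that check_call would report.
    assert b'fatal: branch not found' in e.output
    assert e.returncode == 3
```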
1533 | === added directory 'hooks/charmhelpers/fetch/python' | |||
1534 | === added file 'hooks/charmhelpers/fetch/python/__init__.py' | |||
1535 | --- hooks/charmhelpers/fetch/python/__init__.py 1970-01-01 00:00:00 +0000 | |||
1536 | +++ hooks/charmhelpers/fetch/python/__init__.py 2019-04-18 01:13:06 +0000 | |||
1537 | @@ -0,0 +1,13 @@ | |||
1538 | 1 | # Copyright 2014-2019 Canonical Limited. | ||
1539 | 2 | # | ||
1540 | 3 | # Licensed under the Apache License, Version 2.0 (the "License"); | ||
1541 | 4 | # you may not use this file except in compliance with the License. | ||
1542 | 5 | # You may obtain a copy of the License at | ||
1543 | 6 | # | ||
1544 | 7 | # http://www.apache.org/licenses/LICENSE-2.0 | ||
1545 | 8 | # | ||
1546 | 9 | # Unless required by applicable law or agreed to in writing, software | ||
1547 | 10 | # distributed under the License is distributed on an "AS IS" BASIS, | ||
1548 | 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | ||
1549 | 12 | # See the License for the specific language governing permissions and | ||
1550 | 13 | # limitations under the License. | ||
1551 | 0 | 14 | ||
1552 | === added file 'hooks/charmhelpers/fetch/python/debug.py' | |||
1553 | --- hooks/charmhelpers/fetch/python/debug.py 1970-01-01 00:00:00 +0000 | |||
1554 | +++ hooks/charmhelpers/fetch/python/debug.py 2019-04-18 01:13:06 +0000 | |||
1555 | @@ -0,0 +1,54 @@ | |||
1556 | 1 | #!/usr/bin/env python | ||
1557 | 2 | # coding: utf-8 | ||
1558 | 3 | |||
1559 | 4 | # Copyright 2014-2015 Canonical Limited. | ||
1560 | 5 | # | ||
1561 | 6 | # Licensed under the Apache License, Version 2.0 (the "License"); | ||
1562 | 7 | # you may not use this file except in compliance with the License. | ||
1563 | 8 | # You may obtain a copy of the License at | ||
1564 | 9 | # | ||
1565 | 10 | # http://www.apache.org/licenses/LICENSE-2.0 | ||
1566 | 11 | # | ||
1567 | 12 | # Unless required by applicable law or agreed to in writing, software | ||
1568 | 13 | # distributed under the License is distributed on an "AS IS" BASIS, | ||
1569 | 14 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | ||
1570 | 15 | # See the License for the specific language governing permissions and | ||
1571 | 16 | # limitations under the License. | ||
1572 | 17 | |||
1573 | 18 | from __future__ import print_function | ||
1574 | 19 | |||
1575 | 20 | import atexit | ||
1576 | 21 | import sys | ||
1577 | 22 | |||
1578 | 23 | from charmhelpers.fetch.python.rpdb import Rpdb | ||
1579 | 24 | from charmhelpers.core.hookenv import ( | ||
1580 | 25 | open_port, | ||
1581 | 26 | close_port, | ||
1582 | 27 | ERROR, | ||
1583 | 28 | log | ||
1584 | 29 | ) | ||
1585 | 30 | |||
1586 | 31 | __author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>" | ||
1587 | 32 | |||
1588 | 33 | DEFAULT_ADDR = "0.0.0.0" | ||
1589 | 34 | DEFAULT_PORT = 4444 | ||
1590 | 35 | |||
1591 | 36 | |||
1592 | 37 | def _error(message): | ||
1593 | 38 | log(message, level=ERROR) | ||
1594 | 39 | |||
1595 | 40 | |||
1596 | 41 | def set_trace(addr=DEFAULT_ADDR, port=DEFAULT_PORT): | ||
1597 | 42 | """ | ||
1598 | 43 | Set a trace point using the remote debugger | ||
1599 | 44 | """ | ||
1600 | 45 | atexit.register(close_port, port) | ||
1601 | 46 | try: | ||
1602 | 47 | log("Starting a remote python debugger session on %s:%s" % (addr, | ||
1603 | 48 | port)) | ||
1604 | 49 | open_port(port) | ||
1605 | 50 | debugger = Rpdb(addr=addr, port=port) | ||
1606 | 51 | debugger.set_trace(sys._getframe().f_back) | ||
1607 | 52 | except Exception: | ||
1608 | 53 | _error("Cannot start a remote debug session on %s:%s" % (addr, | ||
1609 | 54 | port)) | ||
1610 | 0 | 55 | ||
1611 | === added file 'hooks/charmhelpers/fetch/python/packages.py' | |||
1612 | --- hooks/charmhelpers/fetch/python/packages.py 1970-01-01 00:00:00 +0000 | |||
1613 | +++ hooks/charmhelpers/fetch/python/packages.py 2019-04-18 01:13:06 +0000 | |||
1614 | @@ -0,0 +1,154 @@ | |||
1615 | 1 | #!/usr/bin/env python | ||
1616 | 2 | # coding: utf-8 | ||
1617 | 3 | |||
1618 | 4 | # Copyright 2014-2015 Canonical Limited. | ||
1619 | 5 | # | ||
1620 | 6 | # Licensed under the Apache License, Version 2.0 (the "License"); | ||
1621 | 7 | # you may not use this file except in compliance with the License. | ||
1622 | 8 | # You may obtain a copy of the License at | ||
1623 | 9 | # | ||
1624 | 10 | # http://www.apache.org/licenses/LICENSE-2.0 | ||
1625 | 11 | # | ||
1626 | 12 | # Unless required by applicable law or agreed to in writing, software | ||
1627 | 13 | # distributed under the License is distributed on an "AS IS" BASIS, | ||
1628 | 14 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | ||
1629 | 15 | # See the License for the specific language governing permissions and | ||
1630 | 16 | # limitations under the License. | ||
1631 | 17 | |||
1632 | 18 | import os | ||
1633 | 19 | import six | ||
1634 | 20 | import subprocess | ||
1635 | 21 | import sys | ||
1636 | 22 | |||
1637 | 23 | from charmhelpers.fetch import apt_install, apt_update | ||
1638 | 24 | from charmhelpers.core.hookenv import charm_dir, log | ||
1639 | 25 | |||
1640 | 26 | __author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>" | ||
1641 | 27 | |||
1642 | 28 | |||
1643 | 29 | def pip_execute(*args, **kwargs): | ||
1644 | 30 | """Overriden pip_execute() to stop sys.path being changed. | ||
1645 | 31 | |||
1646 | 32 | The act of importing main from the pip module seems to cause add wheels | ||
1647 | 33 | from the /usr/share/python-wheels which are installed by various tools. | ||
1648 | 34 | This function ensures that sys.path remains the same after the call is | ||
1649 | 35 | executed. | ||
1650 | 36 | """ | ||
1651 | 37 | try: | ||
1652 | 38 | _path = sys.path | ||
1653 | 39 | try: | ||
1654 | 40 | from pip import main as _pip_execute | ||
1655 | 41 | except ImportError: | ||
1656 | 42 | apt_update() | ||
1657 | 43 | if six.PY2: | ||
1658 | 44 | apt_install('python-pip') | ||
1659 | 45 | else: | ||
1660 | 46 | apt_install('python3-pip') | ||
1661 | 47 | from pip import main as _pip_execute | ||
1662 | 48 | _pip_execute(*args, **kwargs) | ||
1663 | 49 | finally: | ||
1664 | 50 | sys.path = _path | ||
1665 | 51 | |||
1666 | 52 | |||
1667 | 53 | def parse_options(given, available): | ||
1668 | 54 | """Given a set of options, check if available""" | ||
1669 | 55 | for key, value in sorted(given.items()): | ||
1670 | 56 | if not value: | ||
1671 | 57 | continue | ||
1672 | 58 | if key in available: | ||
1673 | 59 | yield "--{0}={1}".format(key, value) | ||
1674 | 60 | |||
1675 | 61 | |||
1676 | 62 | def pip_install_requirements(requirements, constraints=None, **options): | ||
1677 | 63 | """Install a requirements file. | ||
1678 | 64 | |||
1679 | 65 | :param constraints: Path to pip constraints file. | ||
1680 | 66 | http://pip.readthedocs.org/en/stable/user_guide/#constraints-files | ||
1681 | 67 | """ | ||
1682 | 68 | command = ["install"] | ||
1683 | 69 | |||
1684 | 70 | available_options = ('proxy', 'src', 'log', ) | ||
1685 | 71 | for option in parse_options(options, available_options): | ||
1686 | 72 | command.append(option) | ||
1687 | 73 | |||
1688 | 74 | command.append("-r {0}".format(requirements)) | ||
1689 | 75 | if constraints: | ||
1690 | 76 | command.append("-c {0}".format(constraints)) | ||
1691 | 77 | log("Installing from file: {} with constraints {} " | ||
1692 | 78 | "and options: {}".format(requirements, constraints, command)) | ||
1693 | 79 | else: | ||
1694 | 80 | log("Installing from file: {} with options: {}".format(requirements, | ||
1695 | 81 | command)) | ||
1696 | 82 | pip_execute(command) | ||
1697 | 83 | |||
1698 | 84 | |||
1699 | 85 | def pip_install(package, fatal=False, upgrade=False, venv=None, | ||
1700 | 86 | constraints=None, **options): | ||
1701 | 87 | """Install a python package""" | ||
1702 | 88 | if venv: | ||
1703 | 89 | venv_python = os.path.join(venv, 'bin/pip') | ||
1704 | 90 | command = [venv_python, "install"] | ||
1705 | 91 | else: | ||
1706 | 92 | command = ["install"] | ||
1707 | 93 | |||
1708 | 94 | available_options = ('proxy', 'src', 'log', 'index-url', ) | ||
1709 | 95 | for option in parse_options(options, available_options): | ||
1710 | 96 | command.append(option) | ||
1711 | 97 | |||
1712 | 98 | if upgrade: | ||
1713 | 99 | command.append('--upgrade') | ||
1714 | 100 | |||
1715 | 101 | if constraints: | ||
1716 | 102 | command.extend(['-c', constraints]) | ||
1717 | 103 | |||
1718 | 104 | if isinstance(package, list): | ||
1719 | 105 | command.extend(package) | ||
1720 | 106 | else: | ||
1721 | 107 | command.append(package) | ||
1722 | 108 | |||
1723 | 109 | log("Installing {} package with options: {}".format(package, | ||
1724 | 110 | command)) | ||
1725 | 111 | if venv: | ||
1726 | 112 | subprocess.check_call(command) | ||
1727 | 113 | else: | ||
1728 | 114 | pip_execute(command) | ||
1729 | 115 | |||
1730 | 116 | |||
1731 | 117 | def pip_uninstall(package, **options): | ||
1732 | 118 | """Uninstall a python package""" | ||
1733 | 119 | command = ["uninstall", "-q", "-y"] | ||
1734 | 120 | |||
1735 | 121 | available_options = ('proxy', 'log', ) | ||
1736 | 122 | for option in parse_options(options, available_options): | ||
1737 | 123 | command.append(option) | ||
1738 | 124 | |||
1739 | 125 | if isinstance(package, list): | ||
1740 | 126 | command.extend(package) | ||
1741 | 127 | else: | ||
1742 | 128 | command.append(package) | ||
1743 | 129 | |||
1744 | 130 | log("Uninstalling {} package with options: {}".format(package, | ||
1745 | 131 | command)) | ||
1746 | 132 | pip_execute(command) | ||
1747 | 133 | |||
1748 | 134 | |||
1749 | 135 | def pip_list(): | ||
1750 | 136 | """Returns the list of current python installed packages | ||
1751 | 137 | """ | ||
1752 | 138 | return pip_execute(["list"]) | ||
1753 | 139 | |||
1754 | 140 | |||
1755 | 141 | def pip_create_virtualenv(path=None): | ||
1756 | 142 | """Create an isolated Python environment.""" | ||
1757 | 143 | if six.PY2: | ||
1758 | 144 | apt_install('python-virtualenv') | ||
1759 | 145 | else: | ||
1760 | 146 | apt_install('python3-virtualenv') | ||
1761 | 147 | |||
1762 | 148 | if path: | ||
1763 | 149 | venv_path = path | ||
1764 | 150 | else: | ||
1765 | 151 | venv_path = os.path.join(charm_dir(), 'venv') | ||
1766 | 152 | |||
1767 | 153 | if not os.path.exists(venv_path): | ||
1768 | 154 | subprocess.check_call(['virtualenv', venv_path]) | ||
1769 | 0 | 155 | ||
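Of the new pip helpers, `parse_options()` is the piece most easily checked in isolation; the function below is copied verbatim from the hunk above:

```python
def parse_options(given, available):
    """Given a set of options, check if available"""
    for key, value in sorted(given.items()):
        if not value:
            continue
        if key in available:
            yield "--{0}={1}".format(key, value)


# Unset options are dropped, unknown options are ignored, and the
# rest are rendered as --key=value flags for pip.
opts = list(parse_options({'proxy': 'http://squid:3128', 'log': None,
                           'bogus': 'x'}, ('proxy', 'src', 'log')))
assert opts == ['--proxy=http://squid:3128']
```

Note that options with a falsy value are silently skipped rather than rejected, so `pip_install(pkg, proxy=None)` simply omits the flag.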
1770 | === added file 'hooks/charmhelpers/fetch/python/rpdb.py' | |||
1771 | --- hooks/charmhelpers/fetch/python/rpdb.py 1970-01-01 00:00:00 +0000 | |||
1772 | +++ hooks/charmhelpers/fetch/python/rpdb.py 2019-04-18 01:13:06 +0000 | |||
1773 | @@ -0,0 +1,56 @@ | |||
1774 | 1 | # Copyright 2014-2015 Canonical Limited. | ||
1775 | 2 | # | ||
1776 | 3 | # Licensed under the Apache License, Version 2.0 (the "License"); | ||
1777 | 4 | # you may not use this file except in compliance with the License. | ||
1778 | 5 | # You may obtain a copy of the License at | ||
1779 | 6 | # | ||
1780 | 7 | # http://www.apache.org/licenses/LICENSE-2.0 | ||
1781 | 8 | # | ||
1782 | 9 | # Unless required by applicable law or agreed to in writing, software | ||
1783 | 10 | # distributed under the License is distributed on an "AS IS" BASIS, | ||
1784 | 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | ||
1785 | 12 | # See the License for the specific language governing permissions and | ||
1786 | 13 | # limitations under the License. | ||
1787 | 14 | |||
1788 | 15 | """Remote Python Debugger (pdb wrapper).""" | ||
1789 | 16 | |||
1790 | 17 | import pdb | ||
1791 | 18 | import socket | ||
1792 | 19 | import sys | ||
1793 | 20 | |||
1794 | 21 | __author__ = "Bertrand Janin <b@janin.com>" | ||
1795 | 22 | __version__ = "0.1.3" | ||
1796 | 23 | |||
1797 | 24 | |||
1798 | 25 | class Rpdb(pdb.Pdb): | ||
1799 | 26 | |||
1800 | 27 | def __init__(self, addr="127.0.0.1", port=4444): | ||
1801 | 28 | """Initialize the socket and initialize pdb.""" | ||
1802 | 29 | |||
1803 | 30 | # Backup stdin and stdout before replacing them by the socket handle | ||
1804 | 31 | self.old_stdout = sys.stdout | ||
1805 | 32 | self.old_stdin = sys.stdin | ||
1806 | 33 | |||
1807 | 34 | # Open a 'reusable' socket to let the webapp reload on the same port | ||
1808 | 35 | self.skt = socket.socket(socket.AF_INET, socket.SOCK_STREAM) | ||
1809 | 36 | self.skt.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, True) | ||
1810 | 37 | self.skt.bind((addr, port)) | ||
1811 | 38 | self.skt.listen(1) | ||
1812 | 39 | (clientsocket, address) = self.skt.accept() | ||
1813 | 40 | handle = clientsocket.makefile('rw') | ||
1814 | 41 | pdb.Pdb.__init__(self, completekey='tab', stdin=handle, stdout=handle) | ||
1815 | 42 | sys.stdout = sys.stdin = handle | ||
1816 | 43 | |||
1817 | 44 | def shutdown(self): | ||
1818 | 45 | """Revert stdin and stdout, close the socket.""" | ||
1819 | 46 | sys.stdout = self.old_stdout | ||
1820 | 47 | sys.stdin = self.old_stdin | ||
1821 | 48 | self.skt.close() | ||
1822 | 49 | self.set_continue() | ||
1823 | 50 | |||
1824 | 51 | def do_continue(self, arg): | ||
1825 | 52 | """Stop all operation on ``continue``.""" | ||
1826 | 53 | self.shutdown() | ||
1827 | 54 | return 1 | ||
1828 | 55 | |||
1829 | 56 | do_EOF = do_quit = do_exit = do_c = do_cont = do_continue | ||
1830 | 0 | 57 | ||
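The `Rpdb` class above works by swapping `sys.stdin`/`sys.stdout` for a socket file handle, so a plain TCP client can drive pdb remotely (in a real session you would call `Rpdb(port=4444).set_trace()` in the hook and attach with a raw TCP client such as `nc` or `telnet`). A sketch of just the "reusable" listening socket it opens, binding port 0 so the demo never collides with a live debugger on 4444:

```python
import socket

# SO_REUSEADDR lets the webapp (or hook) rebind the same port quickly
# after a restart, which is why Rpdb sets it before bind().
skt = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
skt.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, True)
skt.bind(("127.0.0.1", 0))  # port 0 = kernel-assigned ephemeral port
skt.listen(1)
addr, port = skt.getsockname()
skt.close()
```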
1831 | === added file 'hooks/charmhelpers/fetch/python/version.py' | |||
1832 | --- hooks/charmhelpers/fetch/python/version.py 1970-01-01 00:00:00 +0000 | |||
1833 | +++ hooks/charmhelpers/fetch/python/version.py 2019-04-18 01:13:06 +0000 | |||
1834 | @@ -0,0 +1,32 @@ | |||
1835 | 1 | #!/usr/bin/env python | ||
1836 | 2 | # coding: utf-8 | ||
1837 | 3 | |||
1838 | 4 | # Copyright 2014-2015 Canonical Limited. | ||
1839 | 5 | # | ||
1840 | 6 | # Licensed under the Apache License, Version 2.0 (the "License"); | ||
1841 | 7 | # you may not use this file except in compliance with the License. | ||
1842 | 8 | # You may obtain a copy of the License at | ||
1843 | 9 | # | ||
1844 | 10 | # http://www.apache.org/licenses/LICENSE-2.0 | ||
1845 | 11 | # | ||
1846 | 12 | # Unless required by applicable law or agreed to in writing, software | ||
1847 | 13 | # distributed under the License is distributed on an "AS IS" BASIS, | ||
1848 | 14 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | ||
1849 | 15 | # See the License for the specific language governing permissions and | ||
1850 | 16 | # limitations under the License. | ||
1851 | 17 | |||
1852 | 18 | import sys | ||
1853 | 19 | |||
1854 | 20 | __author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>" | ||
1855 | 21 | |||
1856 | 22 | |||
1857 | 23 | def current_version(): | ||
1858 | 24 | """Current system python version""" | ||
1859 | 25 | return sys.version_info | ||
1860 | 26 | |||
1861 | 27 | |||
1862 | 28 | def current_version_string(): | ||
1863 | 29 | """Current system python version as string major.minor.micro""" | ||
1864 | 30 | return "{0}.{1}.{2}".format(sys.version_info.major, | ||
1865 | 31 | sys.version_info.minor, | ||
1866 | 32 | sys.version_info.micro) | ||
1867 | 0 | 33 | ||
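The `version.py` module added above is a thin wrapper over `sys.version_info`; its string form can be reproduced directly:

```python
import sys


def current_version_string():
    # Same formatting as the helper above: "major.minor.micro" of the
    # interpreter running the charm hooks.
    return "{0}.{1}.{2}".format(sys.version_info.major,
                                sys.version_info.minor,
                                sys.version_info.micro)
```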
1868 | === modified file 'hooks/charmhelpers/fetch/snap.py' | |||
1869 | --- hooks/charmhelpers/fetch/snap.py 2017-03-03 22:25:32 +0000 | |||
1870 | +++ hooks/charmhelpers/fetch/snap.py 2019-04-18 01:13:06 +0000 | |||
1871 | @@ -18,21 +18,33 @@ | |||
1872 | 18 | https://lists.ubuntu.com/archives/snapcraft/2016-September/001114.html | 18 | https://lists.ubuntu.com/archives/snapcraft/2016-September/001114.html |
1873 | 19 | """ | 19 | """ |
1874 | 20 | import subprocess | 20 | import subprocess |
1876 | 21 | from os import environ | 21 | import os |
1877 | 22 | from time import sleep | 22 | from time import sleep |
1878 | 23 | from charmhelpers.core.hookenv import log | 23 | from charmhelpers.core.hookenv import log |
1879 | 24 | 24 | ||
1880 | 25 | __author__ = 'Joseph Borg <joseph.borg@canonical.com>' | 25 | __author__ = 'Joseph Borg <joseph.borg@canonical.com>' |
1881 | 26 | 26 | ||
1883 | 27 | SNAP_NO_LOCK = 1 # The return code for "couldn't acquire lock" in Snap (hopefully this will be improved). | 27 | # The return code for "couldn't acquire lock" in Snap |
1884 | 28 | # (hopefully this will be improved). | ||
1885 | 29 | SNAP_NO_LOCK = 1 | ||
1886 | 28 | SNAP_NO_LOCK_RETRY_DELAY = 10 # Wait X seconds between Snap lock checks. | 30 | SNAP_NO_LOCK_RETRY_DELAY = 10 # Wait X seconds between Snap lock checks. |
1887 | 29 | SNAP_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times. | 31 | SNAP_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times. |
1888 | 32 | SNAP_CHANNELS = [ | ||
1889 | 33 | 'edge', | ||
1890 | 34 | 'beta', | ||
1891 | 35 | 'candidate', | ||
1892 | 36 | 'stable', | ||
1893 | 37 | ] | ||
1894 | 30 | 38 | ||
1895 | 31 | 39 | ||
1896 | 32 | class CouldNotAcquireLockException(Exception): | 40 | class CouldNotAcquireLockException(Exception): |
1897 | 33 | pass | 41 | pass |
1898 | 34 | 42 | ||
1899 | 35 | 43 | ||
1900 | 44 | class InvalidSnapChannel(Exception): | ||
1901 | 45 | pass | ||
1902 | 46 | |||
1903 | 47 | |||
1904 | 36 | def _snap_exec(commands): | 48 | def _snap_exec(commands): |
1905 | 37 | """ | 49 | """ |
1906 | 38 | Execute snap commands. | 50 | Execute snap commands. |
1907 | @@ -47,13 +59,17 @@ | |||
1908 | 47 | 59 | ||
1909 | 48 | while return_code is None or return_code == SNAP_NO_LOCK: | 60 | while return_code is None or return_code == SNAP_NO_LOCK: |
1910 | 49 | try: | 61 | try: |
1912 | 50 | return_code = subprocess.check_call(['snap'] + commands, env=environ) | 62 | return_code = subprocess.check_call(['snap'] + commands, |
1913 | 63 | env=os.environ) | ||
1914 | 51 | except subprocess.CalledProcessError as e: | 64 | except subprocess.CalledProcessError as e: |
1915 | 52 | retry_count += + 1 | 65 | retry_count += + 1 |
1916 | 53 | if retry_count > SNAP_NO_LOCK_RETRY_COUNT: | 66 | if retry_count > SNAP_NO_LOCK_RETRY_COUNT: |
1918 | 54 | raise CouldNotAcquireLockException('Could not aquire lock after %s attempts' % SNAP_NO_LOCK_RETRY_COUNT) | 67 | raise CouldNotAcquireLockException( |
1919 | 68 | 'Could not aquire lock after {} attempts' | ||
1920 | 69 | .format(SNAP_NO_LOCK_RETRY_COUNT)) | ||
1921 | 55 | return_code = e.returncode | 70 | return_code = e.returncode |
1923 | 56 | log('Snap failed to acquire lock, trying again in %s seconds.' % SNAP_NO_LOCK_RETRY_DELAY, level='WARN') | 71 | log('Snap failed to acquire lock, trying again in {} seconds.' |
1924 | 72 | .format(SNAP_NO_LOCK_RETRY_DELAY, level='WARN')) | ||
1925 | 57 | sleep(SNAP_NO_LOCK_RETRY_DELAY) | 73 | sleep(SNAP_NO_LOCK_RETRY_DELAY) |
1926 | 58 | 74 | ||
1927 | 59 | return return_code | 75 | return return_code |
1928 | @@ -120,3 +136,15 @@ | |||
1929 | 120 | 136 | ||
1930 | 121 | log(message, level='INFO') | 137 | log(message, level='INFO') |
1931 | 122 | return _snap_exec(['refresh'] + flags + packages) | 138 | return _snap_exec(['refresh'] + flags + packages) |
1932 | 139 | |||
1933 | 140 | |||
1934 | 141 | def valid_snap_channel(channel): | ||
1935 | 142 | """ Validate snap channel exists | ||
1936 | 143 | |||
1937 | 144 | :raises InvalidSnapChannel: When channel does not exist | ||
1938 | 145 | :return: Boolean | ||
1939 | 146 | """ | ||
1940 | 147 | if channel.lower() in SNAP_CHANNELS: | ||
1941 | 148 | return True | ||
1942 | 149 | else: | ||
1943 | 150 | raise InvalidSnapChannel("Invalid Snap Channel: {}".format(channel)) | ||
1944 | 123 | 151 | ||
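The new `valid_snap_channel` helper validates against the fixed `SNAP_CHANNELS` list and raises rather than returning `False`, so callers fail loudly on configuration typos. A self-contained sketch of that logic:

```python
SNAP_CHANNELS = ['edge', 'beta', 'candidate', 'stable']


class InvalidSnapChannel(Exception):
    pass


def valid_snap_channel(channel):
    # Case-insensitive membership test against the known snap risk levels.
    if channel.lower() in SNAP_CHANNELS:
        return True
    raise InvalidSnapChannel("Invalid Snap Channel: {}".format(channel))
```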
1945 | === modified file 'hooks/charmhelpers/fetch/ubuntu.py' | |||
1946 | --- hooks/charmhelpers/fetch/ubuntu.py 2017-03-03 20:50:28 +0000 | |||
1947 | +++ hooks/charmhelpers/fetch/ubuntu.py 2019-04-18 01:13:06 +0000 | |||
1948 | @@ -12,29 +12,48 @@ | |||
1949 | 12 | # See the License for the specific language governing permissions and | 12 | # See the License for the specific language governing permissions and |
1950 | 13 | # limitations under the License. | 13 | # limitations under the License. |
1951 | 14 | 14 | ||
1952 | 15 | from collections import OrderedDict | ||
1953 | 15 | import os | 16 | import os |
1954 | 17 | import platform | ||
1955 | 18 | import re | ||
1956 | 16 | import six | 19 | import six |
1957 | 17 | import time | 20 | import time |
1958 | 18 | import subprocess | 21 | import subprocess |
1959 | 19 | 22 | ||
1963 | 20 | from tempfile import NamedTemporaryFile | 23 | from charmhelpers.core.host import get_distrib_codename |
1964 | 21 | from charmhelpers.core.host import ( | 24 | |
1965 | 22 | lsb_release | 25 | from charmhelpers.core.hookenv import ( |
1966 | 26 | log, | ||
1967 | 27 | DEBUG, | ||
1968 | 28 | WARNING, | ||
1969 | 29 | env_proxy_settings, | ||
1970 | 23 | ) | 30 | ) |
1973 | 24 | from charmhelpers.core.hookenv import log | 31 | from charmhelpers.fetch import SourceConfigError, GPGKeyError |
1972 | 25 | from charmhelpers.fetch import SourceConfigError | ||
1974 | 26 | 32 | ||
1975 | 33 | PROPOSED_POCKET = ( | ||
1976 | 34 | "# Proposed\n" | ||
1977 | 35 | "deb http://archive.ubuntu.com/ubuntu {}-proposed main universe " | ||
1978 | 36 | "multiverse restricted\n") | ||
1979 | 37 | PROPOSED_PORTS_POCKET = ( | ||
1980 | 38 | "# Proposed\n" | ||
1981 | 39 | "deb http://ports.ubuntu.com/ubuntu-ports {}-proposed main universe " | ||
1982 | 40 | "multiverse restricted\n") | ||
1983 | 41 | # Only supports 64bit and ppc64 at the moment. | ||
1984 | 42 | ARCH_TO_PROPOSED_POCKET = { | ||
1985 | 43 | 'x86_64': PROPOSED_POCKET, | ||
1986 | 44 | 'ppc64le': PROPOSED_PORTS_POCKET, | ||
1987 | 45 | 'aarch64': PROPOSED_PORTS_POCKET, | ||
1988 | 46 | 's390x': PROPOSED_PORTS_POCKET, | ||
1989 | 47 | } | ||
1990 | 48 | CLOUD_ARCHIVE_URL = "http://ubuntu-cloud.archive.canonical.com/ubuntu" | ||
1991 | 49 | CLOUD_ARCHIVE_KEY_ID = '5EDB1B62EC4926EA' | ||
1992 | 27 | CLOUD_ARCHIVE = """# Ubuntu Cloud Archive | 50 | CLOUD_ARCHIVE = """# Ubuntu Cloud Archive |
1993 | 28 | deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main | 51 | deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main |
1994 | 29 | """ | 52 | """ |
1995 | 30 | |||
1996 | 31 | PROPOSED_POCKET = """# Proposed | ||
1997 | 32 | deb http://archive.ubuntu.com/ubuntu {}-proposed main universe multiverse restricted | ||
1998 | 33 | """ | ||
1999 | 34 | |||
2000 | 35 | CLOUD_ARCHIVE_POCKETS = { | 53 | CLOUD_ARCHIVE_POCKETS = { |
2001 | 36 | # Folsom | 54 | # Folsom |
2002 | 37 | 'folsom': 'precise-updates/folsom', | 55 | 'folsom': 'precise-updates/folsom', |
2003 | 56 | 'folsom/updates': 'precise-updates/folsom', | ||
2004 | 38 | 'precise-folsom': 'precise-updates/folsom', | 57 | 'precise-folsom': 'precise-updates/folsom', |
2005 | 39 | 'precise-folsom/updates': 'precise-updates/folsom', | 58 | 'precise-folsom/updates': 'precise-updates/folsom', |
2006 | 40 | 'precise-updates/folsom': 'precise-updates/folsom', | 59 | 'precise-updates/folsom': 'precise-updates/folsom', |
2007 | @@ -43,6 +62,7 @@ | |||
2008 | 43 | 'precise-proposed/folsom': 'precise-proposed/folsom', | 62 | 'precise-proposed/folsom': 'precise-proposed/folsom', |
2009 | 44 | # Grizzly | 63 | # Grizzly |
2010 | 45 | 'grizzly': 'precise-updates/grizzly', | 64 | 'grizzly': 'precise-updates/grizzly', |
2011 | 65 | 'grizzly/updates': 'precise-updates/grizzly', | ||
2012 | 46 | 'precise-grizzly': 'precise-updates/grizzly', | 66 | 'precise-grizzly': 'precise-updates/grizzly', |
2013 | 47 | 'precise-grizzly/updates': 'precise-updates/grizzly', | 67 | 'precise-grizzly/updates': 'precise-updates/grizzly', |
2014 | 48 | 'precise-updates/grizzly': 'precise-updates/grizzly', | 68 | 'precise-updates/grizzly': 'precise-updates/grizzly', |
2015 | @@ -51,6 +71,7 @@ | |||
2016 | 51 | 'precise-proposed/grizzly': 'precise-proposed/grizzly', | 71 | 'precise-proposed/grizzly': 'precise-proposed/grizzly', |
2017 | 52 | # Havana | 72 | # Havana |
2018 | 53 | 'havana': 'precise-updates/havana', | 73 | 'havana': 'precise-updates/havana', |
2019 | 74 | 'havana/updates': 'precise-updates/havana', | ||
2020 | 54 | 'precise-havana': 'precise-updates/havana', | 75 | 'precise-havana': 'precise-updates/havana', |
2021 | 55 | 'precise-havana/updates': 'precise-updates/havana', | 76 | 'precise-havana/updates': 'precise-updates/havana', |
2022 | 56 | 'precise-updates/havana': 'precise-updates/havana', | 77 | 'precise-updates/havana': 'precise-updates/havana', |
2023 | @@ -59,6 +80,7 @@ | |||
2024 | 59 | 'precise-proposed/havana': 'precise-proposed/havana', | 80 | 'precise-proposed/havana': 'precise-proposed/havana', |
2025 | 60 | # Icehouse | 81 | # Icehouse |
2026 | 61 | 'icehouse': 'precise-updates/icehouse', | 82 | 'icehouse': 'precise-updates/icehouse', |
2027 | 83 | 'icehouse/updates': 'precise-updates/icehouse', | ||
2028 | 62 | 'precise-icehouse': 'precise-updates/icehouse', | 84 | 'precise-icehouse': 'precise-updates/icehouse', |
2029 | 63 | 'precise-icehouse/updates': 'precise-updates/icehouse', | 85 | 'precise-icehouse/updates': 'precise-updates/icehouse', |
2030 | 64 | 'precise-updates/icehouse': 'precise-updates/icehouse', | 86 | 'precise-updates/icehouse': 'precise-updates/icehouse', |
2031 | @@ -67,6 +89,7 @@ | |||
2032 | 67 | 'precise-proposed/icehouse': 'precise-proposed/icehouse', | 89 | 'precise-proposed/icehouse': 'precise-proposed/icehouse', |
2033 | 68 | # Juno | 90 | # Juno |
2034 | 69 | 'juno': 'trusty-updates/juno', | 91 | 'juno': 'trusty-updates/juno', |
2035 | 92 | 'juno/updates': 'trusty-updates/juno', | ||
2036 | 70 | 'trusty-juno': 'trusty-updates/juno', | 93 | 'trusty-juno': 'trusty-updates/juno', |
2037 | 71 | 'trusty-juno/updates': 'trusty-updates/juno', | 94 | 'trusty-juno/updates': 'trusty-updates/juno', |
2038 | 72 | 'trusty-updates/juno': 'trusty-updates/juno', | 95 | 'trusty-updates/juno': 'trusty-updates/juno', |
2039 | @@ -75,6 +98,7 @@ | |||
2040 | 75 | 'trusty-proposed/juno': 'trusty-proposed/juno', | 98 | 'trusty-proposed/juno': 'trusty-proposed/juno', |
2041 | 76 | # Kilo | 99 | # Kilo |
2042 | 77 | 'kilo': 'trusty-updates/kilo', | 100 | 'kilo': 'trusty-updates/kilo', |
2043 | 101 | 'kilo/updates': 'trusty-updates/kilo', | ||
2044 | 78 | 'trusty-kilo': 'trusty-updates/kilo', | 102 | 'trusty-kilo': 'trusty-updates/kilo', |
2045 | 79 | 'trusty-kilo/updates': 'trusty-updates/kilo', | 103 | 'trusty-kilo/updates': 'trusty-updates/kilo', |
2046 | 80 | 'trusty-updates/kilo': 'trusty-updates/kilo', | 104 | 'trusty-updates/kilo': 'trusty-updates/kilo', |
2047 | @@ -83,6 +107,7 @@ | |||
2048 | 83 | 'trusty-proposed/kilo': 'trusty-proposed/kilo', | 107 | 'trusty-proposed/kilo': 'trusty-proposed/kilo', |
2049 | 84 | # Liberty | 108 | # Liberty |
2050 | 85 | 'liberty': 'trusty-updates/liberty', | 109 | 'liberty': 'trusty-updates/liberty', |
2051 | 110 | 'liberty/updates': 'trusty-updates/liberty', | ||
2052 | 86 | 'trusty-liberty': 'trusty-updates/liberty', | 111 | 'trusty-liberty': 'trusty-updates/liberty', |
2053 | 87 | 'trusty-liberty/updates': 'trusty-updates/liberty', | 112 | 'trusty-liberty/updates': 'trusty-updates/liberty', |
2054 | 88 | 'trusty-updates/liberty': 'trusty-updates/liberty', | 113 | 'trusty-updates/liberty': 'trusty-updates/liberty', |
2055 | @@ -91,6 +116,7 @@ | |||
2056 | 91 | 'trusty-proposed/liberty': 'trusty-proposed/liberty', | 116 | 'trusty-proposed/liberty': 'trusty-proposed/liberty', |
2057 | 92 | # Mitaka | 117 | # Mitaka |
2058 | 93 | 'mitaka': 'trusty-updates/mitaka', | 118 | 'mitaka': 'trusty-updates/mitaka', |
2059 | 119 | 'mitaka/updates': 'trusty-updates/mitaka', | ||
2060 | 94 | 'trusty-mitaka': 'trusty-updates/mitaka', | 120 | 'trusty-mitaka': 'trusty-updates/mitaka', |
2061 | 95 | 'trusty-mitaka/updates': 'trusty-updates/mitaka', | 121 | 'trusty-mitaka/updates': 'trusty-updates/mitaka', |
2062 | 96 | 'trusty-updates/mitaka': 'trusty-updates/mitaka', | 122 | 'trusty-updates/mitaka': 'trusty-updates/mitaka', |
2063 | @@ -99,6 +125,7 @@ | |||
2064 | 99 | 'trusty-proposed/mitaka': 'trusty-proposed/mitaka', | 125 | 'trusty-proposed/mitaka': 'trusty-proposed/mitaka', |
2065 | 100 | # Newton | 126 | # Newton |
2066 | 101 | 'newton': 'xenial-updates/newton', | 127 | 'newton': 'xenial-updates/newton', |
2067 | 128 | 'newton/updates': 'xenial-updates/newton', | ||
2068 | 102 | 'xenial-newton': 'xenial-updates/newton', | 129 | 'xenial-newton': 'xenial-updates/newton', |
2069 | 103 | 'xenial-newton/updates': 'xenial-updates/newton', | 130 | 'xenial-newton/updates': 'xenial-updates/newton', |
2070 | 104 | 'xenial-updates/newton': 'xenial-updates/newton', | 131 | 'xenial-updates/newton': 'xenial-updates/newton', |
2071 | @@ -107,17 +134,51 @@ | |||
2072 | 107 | 'xenial-proposed/newton': 'xenial-proposed/newton', | 134 | 'xenial-proposed/newton': 'xenial-proposed/newton', |
2073 | 108 | # Ocata | 135 | # Ocata |
2074 | 109 | 'ocata': 'xenial-updates/ocata', | 136 | 'ocata': 'xenial-updates/ocata', |
2075 | 137 | 'ocata/updates': 'xenial-updates/ocata', | ||
2076 | 110 | 'xenial-ocata': 'xenial-updates/ocata', | 138 | 'xenial-ocata': 'xenial-updates/ocata', |
2077 | 111 | 'xenial-ocata/updates': 'xenial-updates/ocata', | 139 | 'xenial-ocata/updates': 'xenial-updates/ocata', |
2078 | 112 | 'xenial-updates/ocata': 'xenial-updates/ocata', | 140 | 'xenial-updates/ocata': 'xenial-updates/ocata', |
2079 | 113 | 'ocata/proposed': 'xenial-proposed/ocata', | 141 | 'ocata/proposed': 'xenial-proposed/ocata', |
2080 | 114 | 'xenial-ocata/proposed': 'xenial-proposed/ocata', | 142 | 'xenial-ocata/proposed': 'xenial-proposed/ocata', |
2082 | 115 | 'xenial-ocata/newton': 'xenial-proposed/ocata', | 143 | 'xenial-proposed/ocata': 'xenial-proposed/ocata', |
2083 | 144 | # Pike | ||
2084 | 145 | 'pike': 'xenial-updates/pike', | ||
2085 | 146 | 'xenial-pike': 'xenial-updates/pike', | ||
2086 | 147 | 'xenial-pike/updates': 'xenial-updates/pike', | ||
2087 | 148 | 'xenial-updates/pike': 'xenial-updates/pike', | ||
2088 | 149 | 'pike/proposed': 'xenial-proposed/pike', | ||
2089 | 150 | 'xenial-pike/proposed': 'xenial-proposed/pike', | ||
2090 | 151 | 'xenial-proposed/pike': 'xenial-proposed/pike', | ||
2091 | 152 | # Queens | ||
2092 | 153 | 'queens': 'xenial-updates/queens', | ||
2093 | 154 | 'xenial-queens': 'xenial-updates/queens', | ||
2094 | 155 | 'xenial-queens/updates': 'xenial-updates/queens', | ||
2095 | 156 | 'xenial-updates/queens': 'xenial-updates/queens', | ||
2096 | 157 | 'queens/proposed': 'xenial-proposed/queens', | ||
2097 | 158 | 'xenial-queens/proposed': 'xenial-proposed/queens', | ||
2098 | 159 | 'xenial-proposed/queens': 'xenial-proposed/queens', | ||
2099 | 160 | # Rocky | ||
2100 | 161 | 'rocky': 'bionic-updates/rocky', | ||
2101 | 162 | 'bionic-rocky': 'bionic-updates/rocky', | ||
2102 | 163 | 'bionic-rocky/updates': 'bionic-updates/rocky', | ||
2103 | 164 | 'bionic-updates/rocky': 'bionic-updates/rocky', | ||
2104 | 165 | 'rocky/proposed': 'bionic-proposed/rocky', | ||
2105 | 166 | 'bionic-rocky/proposed': 'bionic-proposed/rocky', | ||
2106 | 167 | 'bionic-proposed/rocky': 'bionic-proposed/rocky', | ||
2107 | 168 | # Stein | ||
2108 | 169 | 'stein': 'bionic-updates/stein', | ||
2109 | 170 | 'bionic-stein': 'bionic-updates/stein', | ||
2110 | 171 | 'bionic-stein/updates': 'bionic-updates/stein', | ||
2111 | 172 | 'bionic-updates/stein': 'bionic-updates/stein', | ||
2112 | 173 | 'stein/proposed': 'bionic-proposed/stein', | ||
2113 | 174 | 'bionic-stein/proposed': 'bionic-proposed/stein', | ||
2114 | 175 | 'bionic-proposed/stein': 'bionic-proposed/stein', | ||
2115 | 116 | } | 176 | } |
2116 | 117 | 177 | ||
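Every spelling a user might put in a `source` option (release name, series-prefixed, `/updates`, `/proposed`) maps to one canonical Ubuntu Cloud Archive pocket, which is then substituted into the `CLOUD_ARCHIVE` template. A sketch with a small slice of the table and a hypothetical `cloud_archive_line` helper showing the resolution step (upstream this happens inside `add_source`):

```python
CLOUD_ARCHIVE = """# Ubuntu Cloud Archive
deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
"""

# A small slice of the alias table from the diff.
CLOUD_ARCHIVE_POCKETS = {
    'stein': 'bionic-updates/stein',
    'bionic-stein': 'bionic-updates/stein',
    'stein/proposed': 'bionic-proposed/stein',
}


def cloud_archive_line(alias):
    # Resolve the user-facing alias to the canonical pocket, then build
    # the sources.list line from the template.
    return CLOUD_ARCHIVE.format(CLOUD_ARCHIVE_POCKETS[alias])
```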
2117 | 178 | |||
2118 | 118 | APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT. | 179 | APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT. |
2119 | 119 | CMD_RETRY_DELAY = 10 # Wait 10 seconds between command retries. | 180 | CMD_RETRY_DELAY = 10 # Wait 10 seconds between command retries. |
2121 | 120 | CMD_RETRY_COUNT = 30 # Retry a failing fatal command X times. | 181 | CMD_RETRY_COUNT = 3 # Retry a failing fatal command X times. |
2122 | 121 | 182 | ||
2123 | 122 | 183 | ||
2124 | 123 | def filter_installed_packages(packages): | 184 | def filter_installed_packages(packages): |
2125 | @@ -135,6 +196,18 @@ | |||
2126 | 135 | return _pkgs | 196 | return _pkgs |
2127 | 136 | 197 | ||
2128 | 137 | 198 | ||
2129 | 199 | def filter_missing_packages(packages): | ||
2130 | 200 | """Return a list of packages that are installed. | ||
2131 | 201 | |||
2132 | 202 | :param packages: list of packages to evaluate. | ||
2133 | 203 | :returns list: Packages that are installed. | ||
2134 | 204 | """ | ||
2135 | 205 | return list( | ||
2136 | 206 | set(packages) - | ||
2137 | 207 | set(filter_installed_packages(packages)) | ||
2138 | 208 | ) | ||
2139 | 209 | |||
2140 | 210 | |||
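`filter_missing_packages` is simple set arithmetic: the requested set minus the not-yet-installed set (what `filter_installed_packages` returns) leaves exactly the packages already present. A runnable sketch, with the apt cache modelled by a plain set so it runs anywhere:

```python
def filter_installed_packages(packages, apt_cache_installed):
    # Stand-in for the charmhelpers function of the same name, which
    # returns the subset of `packages` NOT yet installed; the apt cache
    # is modelled here by the `apt_cache_installed` set.
    return [p for p in packages if p not in apt_cache_installed]


def filter_missing_packages(packages, apt_cache_installed):
    # Same set arithmetic as the diff: requested minus not-installed
    # leaves the packages that are already installed.
    return list(
        set(packages) -
        set(filter_installed_packages(packages, apt_cache_installed))
    )
```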
2141 | 138 | def apt_cache(in_memory=True, progress=None): | 211 | def apt_cache(in_memory=True, progress=None): |
2142 | 139 | """Build and return an apt cache.""" | 212 | """Build and return an apt cache.""" |
2143 | 140 | from apt import apt_pkg | 213 | from apt import apt_pkg |
2144 | @@ -145,7 +218,7 @@ | |||
2145 | 145 | return apt_pkg.Cache(progress) | 218 | return apt_pkg.Cache(progress) |
2146 | 146 | 219 | ||
2147 | 147 | 220 | ||
2149 | 148 | def install(packages, options=None, fatal=False): | 221 | def apt_install(packages, options=None, fatal=False): |
2150 | 149 | """Install one or more packages.""" | 222 | """Install one or more packages.""" |
2151 | 150 | if options is None: | 223 | if options is None: |
2152 | 151 | options = ['--option=Dpkg::Options::=--force-confold'] | 224 | options = ['--option=Dpkg::Options::=--force-confold'] |
2153 | @@ -162,7 +235,7 @@ | |||
2154 | 162 | _run_apt_command(cmd, fatal) | 235 | _run_apt_command(cmd, fatal) |
2155 | 163 | 236 | ||
2156 | 164 | 237 | ||
2158 | 165 | def upgrade(options=None, fatal=False, dist=False): | 238 | def apt_upgrade(options=None, fatal=False, dist=False): |
2159 | 166 | """Upgrade all packages.""" | 239 | """Upgrade all packages.""" |
2160 | 167 | if options is None: | 240 | if options is None: |
2161 | 168 | options = ['--option=Dpkg::Options::=--force-confold'] | 241 | options = ['--option=Dpkg::Options::=--force-confold'] |
2162 | @@ -177,13 +250,13 @@ | |||
2163 | 177 | _run_apt_command(cmd, fatal) | 250 | _run_apt_command(cmd, fatal) |
2164 | 178 | 251 | ||
2165 | 179 | 252 | ||
2167 | 180 | def update(fatal=False): | 253 | def apt_update(fatal=False): |
2168 | 181 | """Update local apt cache.""" | 254 | """Update local apt cache.""" |
2169 | 182 | cmd = ['apt-get', 'update'] | 255 | cmd = ['apt-get', 'update'] |
2170 | 183 | _run_apt_command(cmd, fatal) | 256 | _run_apt_command(cmd, fatal) |
2171 | 184 | 257 | ||
2172 | 185 | 258 | ||
2174 | 186 | def purge(packages, fatal=False): | 259 | def apt_purge(packages, fatal=False): |
2175 | 187 | """Purge one or more packages.""" | 260 | """Purge one or more packages.""" |
2176 | 188 | cmd = ['apt-get', '--assume-yes', 'purge'] | 261 | cmd = ['apt-get', '--assume-yes', 'purge'] |
2177 | 189 | if isinstance(packages, six.string_types): | 262 | if isinstance(packages, six.string_types): |
2178 | @@ -194,6 +267,14 @@ | |||
2179 | 194 | _run_apt_command(cmd, fatal) | 267 | _run_apt_command(cmd, fatal) |
2180 | 195 | 268 | ||
2181 | 196 | 269 | ||
2182 | 270 | def apt_autoremove(purge=True, fatal=False): | ||
2183 | 271 | """Purge one or more packages.""" | ||
2184 | 272 | cmd = ['apt-get', '--assume-yes', 'autoremove'] | ||
2185 | 273 | if purge: | ||
2186 | 274 | cmd.append('--purge') | ||
2187 | 275 | _run_apt_command(cmd, fatal) | ||
2188 | 276 | |||
2189 | 277 | |||
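The new `apt_autoremove` builds its argv list the same way the other apt helpers do, appending `--purge` by default so removed packages also drop their configuration files. A sketch of just the command assembly (the function name here is hypothetical; upstream the list is passed straight to `_run_apt_command`):

```python
def autoremove_command(purge=True):
    # Mirror the argv list apt_autoremove assembles in the diff.
    cmd = ['apt-get', '--assume-yes', 'autoremove']
    if purge:
        cmd.append('--purge')
    return cmd
```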
2190 | 197 | def apt_mark(packages, mark, fatal=False): | 278 | def apt_mark(packages, mark, fatal=False): |
2191 | 198 | """Flag one or more packages using apt-mark.""" | 279 | """Flag one or more packages using apt-mark.""" |
2192 | 199 | log("Marking {} as {}".format(packages, mark)) | 280 | log("Marking {} as {}".format(packages, mark)) |
2193 | @@ -217,7 +298,159 @@ | |||
2194 | 217 | return apt_mark(packages, 'unhold', fatal=fatal) | 298 | return apt_mark(packages, 'unhold', fatal=fatal) |
2195 | 218 | 299 | ||
2196 | 219 | 300 | ||
2198 | 220 | def add_source(source, key=None): | 301 | def import_key(key): |
2199 | 302 | """Import an ASCII Armor key. | ||
2200 | 303 | |||
2201 | 304 | A Radix64 format keyid is also supported for backwards | ||
2202 | 305 | compatibility. In this case Ubuntu keyserver will be | ||
2203 | 306 | queried for a key via HTTPS by its keyid. This method | ||
2204 | 307 | is less preferrable because https proxy servers may | ||
2205 | 308 | require traffic decryption which is equivalent to a | ||
2206 | 309 | man-in-the-middle attack (a proxy server impersonates | ||
2207 | 310 | keyserver TLS certificates and has to be explicitly | ||
2208 | 311 | trusted by the system). | ||
2209 | 312 | |||
2210 | 313 | :param key: A GPG key in ASCII armor format, | ||
2211 | 314 | including BEGIN and END markers or a keyid. | ||
2212 | 315 | :type key: (bytes, str) | ||
2213 | 316 | :raises: GPGKeyError if the key could not be imported | ||
2214 | 317 | """ | ||
2215 | 318 | key = key.strip() | ||
2216 | 319 | if '-' in key or '\n' in key: | ||
2217 | 320 | # Send everything not obviously a keyid to GPG to import, as | ||
2218 | 321 | # we trust its validation better than our own. eg. handling | ||
2219 | 322 | # comments before the key. | ||
2220 | 323 | log("PGP key found (looks like ASCII Armor format)", level=DEBUG) | ||
2221 | 324 | if ('-----BEGIN PGP PUBLIC KEY BLOCK-----' in key and | ||
2222 | 325 | '-----END PGP PUBLIC KEY BLOCK-----' in key): | ||
2223 | 326 | log("Writing provided PGP key in the binary format", level=DEBUG) | ||
2224 | 327 | if six.PY3: | ||
2225 | 328 | key_bytes = key.encode('utf-8') | ||
2226 | 329 | else: | ||
2227 | 330 | key_bytes = key | ||
2228 | 331 | key_name = _get_keyid_by_gpg_key(key_bytes) | ||
2229 | 332 | key_gpg = _dearmor_gpg_key(key_bytes) | ||
2230 | 333 | _write_apt_gpg_keyfile(key_name=key_name, key_material=key_gpg) | ||
2231 | 334 | else: | ||
2232 | 335 | raise GPGKeyError("ASCII armor markers missing from GPG key") | ||
2233 | 336 | else: | ||
2234 | 337 | log("PGP key found (looks like Radix64 format)", level=WARNING) | ||
2235 | 338 | log("SECURELY importing PGP key from keyserver; " | ||
2236 | 339 | "full key not provided.", level=WARNING) | ||
2237 | 340 | # as of bionic add-apt-repository uses curl with an HTTPS keyserver URL | ||
2238 | 341 | # to retrieve GPG keys. `apt-key adv` command is deprecated as is | ||
2239 | 342 | # apt-key in general as noted in its manpage. See lp:1433761 for more | ||
2240 | 343 | # history. Instead, /etc/apt/trusted.gpg.d is used directly to drop | ||
2241 | 344 | # gpg | ||
2242 | 345 | key_asc = _get_key_by_keyid(key) | ||
2243 | 346 | # write the key in GPG format so that apt-key list shows it | ||
2244 | 347 | key_gpg = _dearmor_gpg_key(key_asc) | ||
2245 | 348 | _write_apt_gpg_keyfile(key_name=key, key_material=key_gpg) | ||
2246 | 349 | |||
2247 | 350 | |||
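`import_key` routes its input with a simple heuristic: any dash or newline means "armored key material, hand it to gpg"; a bare hex string is treated as a keyid to fetch over HTTPS. That check, extracted as a standalone sketch (the function name is illustrative):

```python
def looks_like_ascii_armor(key):
    # import_key's routing heuristic: armored keys contain dashes
    # (the BEGIN/END markers) or newlines; keyids are bare hex.
    key = key.strip()
    return '-' in key or '\n' in key
```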
2248 | 351 | def _get_keyid_by_gpg_key(key_material): | ||
2249 | 352 | """Get a GPG key fingerprint by GPG key material. | ||
2250 | 353 | Gets a GPG key fingerprint (40-digit, 160-bit) by the ASCII armor-encoded | ||
2251 | 354 | or binary GPG key material. Can be used, for example, to generate file | ||
2252 | 355 | names for keys passed via charm options. | ||
2253 | 356 | |||
2254 | 357 | :param key_material: ASCII armor-encoded or binary GPG key material | ||
2255 | 358 | :type key_material: bytes | ||
2256 | 359 | :raises: GPGKeyError if invalid key material has been provided | ||
2257 | 360 | :returns: A GPG key fingerprint | ||
2258 | 361 | :rtype: str | ||
2259 | 362 | """ | ||
2260 | 363 | # Use the same gpg command for both Xenial and Bionic | ||
2261 | 364 | cmd = 'gpg --with-colons --with-fingerprint' | ||
2262 | 365 | ps = subprocess.Popen(cmd.split(), | ||
2263 | 366 | stdout=subprocess.PIPE, | ||
2264 | 367 | stderr=subprocess.PIPE, | ||
2265 | 368 | stdin=subprocess.PIPE) | ||
2266 | 369 | out, err = ps.communicate(input=key_material) | ||
2267 | 370 | if six.PY3: | ||
2268 | 371 | out = out.decode('utf-8') | ||
2269 | 372 | err = err.decode('utf-8') | ||
2270 | 373 | if 'gpg: no valid OpenPGP data found.' in err: | ||
2271 | 374 | raise GPGKeyError('Invalid GPG key material provided') | ||
2272 | 375 | # from gnupg2 docs: fpr :: Fingerprint (fingerprint is in field 10) | ||
2273 | 376 | return re.search(r"^fpr:{9}([0-9A-F]{40}):$", out, re.MULTILINE).group(1) | ||
2274 | 377 | |||
2275 | 378 | |||
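The fingerprint extraction relies on gpg's machine-readable (`--with-colons`) output, where the `fpr` record carries the 40-digit fingerprint in field 10 (fields 2 through 9 are empty, hence the nine consecutive colons in the regex). A demo of that same regex against constructed sample data, reusing the 40-digit example keyid from the `_get_key_by_keyid` docstring:

```python
import re

# An 'fpr' record: "fpr", nine empty colon-separated fields, the
# fingerprint in field 10, and a trailing colon.
sample = "fpr" + ":" * 9 + "35F77D63B5CEC106C577ED856E85A86E4652B4E6:"
fingerprint = re.search(
    r"^fpr:{9}([0-9A-F]{40}):$", sample, re.MULTILINE).group(1)
```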
2276 | 379 | def _get_key_by_keyid(keyid): | ||
2277 | 380 | """Get a key via HTTPS from the Ubuntu keyserver. | ||
2278 | 381 | Different key ID formats are supported by SKS keyservers (the longer ones | ||
2279 | 382 | are more secure, see "dead beef attack" and https://evil32.com/). Since | ||
2280 | 383 | HTTPS is used, if SSLBump-like HTTPS proxies are in place, they will | ||
2281 | 384 | impersonate keyserver.ubuntu.com and generate a certificate with | ||
2282 | 385 | keyserver.ubuntu.com in the CN field or in SubjAltName fields of a | ||
2283 | 386 | certificate. If such proxy behavior is expected it is necessary to add the | ||
2284 | 387 | CA certificate chain containing the intermediate CA of the SSLBump proxy to | ||
2285 | 388 | every machine that this code runs on via ca-certs cloud-init directive (via | ||
2286 | 389 | cloudinit-userdata model-config) or via other means (such as through a | ||
2287 | 390 | custom charm option). Also note that DNS resolution for the hostname in a | ||
2288 | 391 | URL is done at a proxy server - not at the client side. | ||
2289 | 392 | |||
2290 | 393 | 8-digit (32 bit) key ID | ||
2291 | 394 | https://keyserver.ubuntu.com/pks/lookup?search=0x4652B4E6 | ||
2292 | 395 | 16-digit (64 bit) key ID | ||
2293 | 396 | https://keyserver.ubuntu.com/pks/lookup?search=0x6E85A86E4652B4E6 | ||
2294 | 397 | 40-digit key ID: | ||
2295 | 398 | https://keyserver.ubuntu.com/pks/lookup?search=0x35F77D63B5CEC106C577ED856E85A86E4652B4E6 | ||
2296 | 399 | |||
2297 | 400 | :param keyid: An 8, 16 or 40 hex digit keyid to find a key for | ||
2298 | 401 | :type keyid: (bytes, str) | ||
2299 | 402 | :returns: A key material for the specified GPG key id | ||
2300 | 403 | :rtype: (str, bytes) | ||
2301 | 404 | :raises: subprocess.CalledProcessError | ||
2302 | 405 | """ | ||
2303 | 406 | # options=mr - machine-readable output (disables html wrappers) | ||
2304 | 407 | keyserver_url = ('https://keyserver.ubuntu.com' | ||
2305 | 408 | '/pks/lookup?op=get&options=mr&exact=on&search=0x{}') | ||
2306 | 409 | curl_cmd = ['curl', keyserver_url.format(keyid)] | ||
2307 | 410 | # use proxy server settings in order to retrieve the key | ||
2308 | 411 | return subprocess.check_output(curl_cmd, | ||
2309 | 412 | env=env_proxy_settings(['https'])) | ||
2310 | 413 | |||
2311 | 414 | |||
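The keyserver lookup URL is a plain format template; `options=mr` requests machine-readable output so no HTML wrapper has to be stripped. A sketch of the URL construction only (no network call), using a keyid from the docstring examples above:

```python
keyserver_url = ('https://keyserver.ubuntu.com'
                 '/pks/lookup?op=get&options=mr&exact=on&search=0x{}')
# The '0x' prefix tells the keyserver the search term is a hex keyid.
url = keyserver_url.format('6E85A86E4652B4E6')
```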
2312 | 415 | def _dearmor_gpg_key(key_asc): | ||
2313 | 416 | """Converts a GPG key in the ASCII armor format to the binary format. | ||
2314 | 417 | |||
2315 | 418 | :param key_asc: A GPG key in ASCII armor format. | ||
2316 | 419 | :type key_asc: (str, bytes) | ||
2317 | 420 | :returns: A GPG key in binary format | ||
2318 | 421 | :rtype: (str, bytes) | ||
2319 | 422 | :raises: GPGKeyError | ||
2320 | 423 | """ | ||
2321 | 424 | ps = subprocess.Popen(['gpg', '--dearmor'], | ||
2322 | 425 | stdout=subprocess.PIPE, | ||
2323 | 426 | stderr=subprocess.PIPE, | ||
2324 | 427 | stdin=subprocess.PIPE) | ||
2325 | 428 | out, err = ps.communicate(input=key_asc) | ||
2326 | 429 | # no need to decode output as it is binary (invalid utf-8), only error | ||
2327 | 430 | if six.PY3: | ||
2328 | 431 | err = err.decode('utf-8') | ||
2329 | 432 | if 'gpg: no valid OpenPGP data found.' in err: | ||
2330 | 433 | raise GPGKeyError('Invalid GPG key material. Check your network setup' | ||
2331 | 434 | ' (MTU, routing, DNS) and/or proxy server settings' | ||
2332 | 435 | ' as well as destination keyserver status.') | ||
2333 | 436 | else: | ||
2334 | 437 | return out | ||
2335 | 438 | |||
2336 | 439 | |||
2337 | 440 | def _write_apt_gpg_keyfile(key_name, key_material): | ||
2338 | 441 | """Writes GPG key material into a file at a provided path. | ||
2339 | 442 | |||
2340 | 443 | :param key_name: A key name to use for a key file (could be a fingerprint) | ||
2341 | 444 | :type key_name: str | ||
2342 | 445 | :param key_material: A GPG key material (binary) | ||
2343 | 446 | :type key_material: (str, bytes) | ||
2344 | 447 | """ | ||
2345 | 448 | with open('/etc/apt/trusted.gpg.d/{}.gpg'.format(key_name), | ||
2346 | 449 | 'wb') as keyf: | ||
2347 | 450 | keyf.write(key_material) | ||
2348 | 451 | |||
2349 | 452 | |||
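Key material reaches helpers like `_dearmor_gpg_key` either as an ASCII-armored block or as a bare key id, and the armor-header test decides which path to take. A minimal sketch of that check; the header string matches the one used in the old `add_source()` code, but `is_ascii_armored` is a hypothetical name:

```python
# Illustrative sketch: the armor-header check that decides between
# "dearmor this key block" and "fetch the key by id from a keyserver".
ARMOR_HEADER = '-----BEGIN PGP PUBLIC KEY BLOCK-----'


def is_ascii_armored(key):
    """Return True if key material looks like an ASCII-armored key block."""
    if isinstance(key, bytes):
        key = key.decode('utf-8', 'replace')
    return ARMOR_HEADER in key
```

Armored material would then be piped through `gpg --dearmor` as above; anything else is treated as a key id to look up remotely.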
+def add_source(source, key=None, fail_invalid=False):
     """Add a package source to this system.
 
     @param source: a URL or sources.list entry, as supported by
@@ -233,6 +466,33 @@
         such as 'cloud:icehouse'
         'distro' may be used as a noop
 
+    Full list of source specifications supported by the function are:
+
+    'distro': A NOP; i.e. it has no effect.
+    'proposed': the proposed deb spec [2] is written to
+       /etc/apt/sources.list/proposed
+    'distro-proposed': adds <version>-proposed to the debs [2]
+    'ppa:<ppa-name>': add-apt-repository --yes <ppa_name>
+    'deb <deb-spec>': add-apt-repository --yes deb <deb-spec>
+    'http://....': add-apt-repository --yes http://...
+    'cloud-archive:<spec>': add-apt-repository --yes cloud-archive:<spec>
+    'cloud:<release>[-staging]': specify a Cloud Archive pocket <release> with
+       optional staging version. If staging is used then the staging PPA [2]
+       will be used. If staging is NOT used then the cloud archive [3] will be
+       added, and the 'ubuntu-cloud-keyring' package will be added for the
+       current distro.
+
+    Otherwise the source is not recognised and this is logged to the juju log.
+    However, no error is raised, unless fail_invalid is True.
+
+    [1] deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
+        where {} is replaced with the derived pocket name.
+    [2] deb http://archive.ubuntu.com/ubuntu {}-proposed \
+        main universe multiverse restricted
+        where {} is replaced with the lsb_release codename (e.g. xenial)
+    [3] deb http://ubuntu-cloud.archive.canonical.com/ubuntu <pocket>
+        to /etc/apt/sources.list.d/cloud-archive-list
+
     @param key: A key to be added to the system's APT keyring and used
        to verify the signatures on packages. Ideally, this should be an
        ASCII format GPG public key including the block headers. A GPG key
@@ -240,51 +500,150 @@
        available to retrieve the actual public key from a public keyserver
        placing your Juju environment at risk. ppa and cloud archive keys
        are securely added automatically, so should not be provided.
+
+    @param fail_invalid: (boolean) if True, then the function raises a
+    SourceConfigError if there is no matching installation source.
+
+    @raises SourceConfigError() if for cloud:<pocket>, the <pocket> is not a
+    valid pocket in CLOUD_ARCHIVE_POCKETS
     """
+    _mapping = OrderedDict([
+        (r"^distro$", lambda: None),  # This is a NOP
+        (r"^(?:proposed|distro-proposed)$", _add_proposed),
+        (r"^cloud-archive:(.*)$", _add_apt_repository),
+        (r"^((?:deb |http:|https:|ppa:).*)$", _add_apt_repository),
+        (r"^cloud:(.*)-(.*)\/staging$", _add_cloud_staging),
+        (r"^cloud:(.*)-(.*)$", _add_cloud_distro_check),
+        (r"^cloud:(.*)$", _add_cloud_pocket),
+        (r"^snap:.*-(.*)-(.*)$", _add_cloud_distro_check),
+    ])
     if source is None:
-        log('Source is not present. Skipping')
-        return
-
-    if (source.startswith('ppa:') or
-            source.startswith('http') or
-            source.startswith('deb ') or
-            source.startswith('cloud-archive:')):
-        cmd = ['add-apt-repository', '--yes', source]
-        _run_with_retries(cmd)
-    elif source.startswith('cloud:'):
-        install(filter_installed_packages(['ubuntu-cloud-keyring']),
+        source = ''
+    for r, fn in six.iteritems(_mapping):
+        m = re.match(r, source)
+        if m:
+            # call the associated function with the captured groups
+            # raises SourceConfigError on error.
+            fn(*m.groups())
+            if key:
+                try:
+                    import_key(key)
+                except GPGKeyError as e:
+                    raise SourceConfigError(str(e))
+            break
+    else:
+        # nothing matched.  log an error and maybe sys.exit
+        err = "Unknown source: {!r}".format(source)
+        log(err)
+        if fail_invalid:
+            raise SourceConfigError(err)
+
+
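The new `add_source()` replaces the old startswith chain with an ordered regex-to-handler dispatch table, where the first matching pattern wins and its captured groups become the handler's arguments. A standalone sketch of that pattern with toy handlers (all names here are illustrative, not charmhelpers code):

```python
import re
from collections import OrderedDict


def _handle_ppa(name):
    return ('ppa', name)


def _handle_cloud(release, variant):
    return ('cloud', release, variant)


# Order matters: the first pattern that matches wins, so more specific
# patterns must precede more general ones (as in the real _mapping).
_MAPPING = OrderedDict([
    (r"^distro$", lambda: ('noop',)),
    (r"^ppa:(.*)$", _handle_ppa),
    (r"^cloud:(.*)-(.*)$", _handle_cloud),
])


def dispatch(source):
    """Call the first matching handler with the captured regex groups."""
    for pattern, fn in _MAPPING.items():
        m = re.match(pattern, source)
        if m:
            return fn(*m.groups())
    return None  # nothing matched


print(dispatch('ppa:landscape/19.01'))  # ('ppa', 'landscape/19.01')
```

This shape makes adding a new source type a one-line table entry plus a handler, instead of another `elif` branch.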
+def _add_proposed():
+    """Add the PROPOSED_POCKET as /etc/apt/source.list.d/proposed.list
+
+    Uses get_distrib_codename to determine the correct stanza for
+    the deb line.
+
+    For intel architectures PROPOSED_POCKET is used for the release, but for
+    other architectures PROPOSED_PORTS_POCKET is used for the release.
+    """
+    release = get_distrib_codename()
+    arch = platform.machine()
+    if arch not in six.iterkeys(ARCH_TO_PROPOSED_POCKET):
+        raise SourceConfigError("Arch {} not supported for (distro-)proposed"
+                                .format(arch))
+    with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
+        apt.write(ARCH_TO_PROPOSED_POCKET[arch].format(release))
+
+
+def _add_apt_repository(spec):
+    """Add the spec using add_apt_repository
+
+    :param spec: the parameter to pass to add_apt_repository
+    :type spec: str
+    """
+    if '{series}' in spec:
+        series = get_distrib_codename()
+        spec = spec.replace('{series}', series)
+    # software-properties package for bionic properly reacts to proxy settings
+    # passed as environment variables (See lp:1433761). This is not the case
+    # for LTS and non-LTS releases below bionic.
+    _run_with_retries(['add-apt-repository', '--yes', spec],
+                      cmd_env=env_proxy_settings(['https']))
+
+
+def _add_cloud_pocket(pocket):
+    """Add a cloud pocket as /etc/apt/sources.d/cloud-archive.list
+
+    Note that this overwrites the existing file if there is one.
+
+    This function also converts the simple pocket into the actual pocket using
+    the CLOUD_ARCHIVE_POCKETS mapping.
+
+    :param pocket: string representing the pocket to add a deb spec for.
+    :raises: SourceConfigError if the cloud pocket doesn't exist or the
+        requested release doesn't match the current distro version.
+    """
+    apt_install(filter_installed_packages(['ubuntu-cloud-keyring']),
                 fatal=True)
-    pocket = source.split(':')[-1]
-    if pocket not in CLOUD_ARCHIVE_POCKETS:
-        raise SourceConfigError(
-            'Unsupported cloud: source option %s' %
-            pocket)
-    actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
-    with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
-        apt.write(CLOUD_ARCHIVE.format(actual_pocket))
-    elif source == 'proposed':
-        release = lsb_release()['DISTRIB_CODENAME']
-        with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
-            apt.write(PROPOSED_POCKET.format(release))
-    elif source == 'distro':
-        pass
-    else:
-        log("Unknown source: {!r}".format(source))
-
-    if key:
-        if '-----BEGIN PGP PUBLIC KEY BLOCK-----' in key:
-            with NamedTemporaryFile('w+') as key_file:
-                key_file.write(key)
-                key_file.flush()
-                key_file.seek(0)
-                subprocess.check_call(['apt-key', 'add', '-'], stdin=key_file)
-        else:
-            # Note that hkp: is in no way a secure protocol. Using a
-            # GPG key id is pointless from a security POV unless you
-            # absolutely trust your network and DNS.
-            subprocess.check_call(['apt-key', 'adv', '--keyserver',
-                                   'hkp://keyserver.ubuntu.com:80', '--recv',
-                                   key])
+    if pocket not in CLOUD_ARCHIVE_POCKETS:
+        raise SourceConfigError(
+            'Unsupported cloud: source option %s' %
+            pocket)
+    actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
+    with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
+        apt.write(CLOUD_ARCHIVE.format(actual_pocket))
+
+
+def _add_cloud_staging(cloud_archive_release, openstack_release):
+    """Add the cloud staging repository which is in
+    ppa:ubuntu-cloud-archive/<openstack_release>-staging
+
+    This function checks that the cloud_archive_release matches the current
+    codename for the distro that the charm is being installed on.
+
+    :param cloud_archive_release: string, codename for the release.
+    :param openstack_release: String, codename for the openstack release.
+    :raises: SourceConfigError if the cloud_archive_release doesn't match the
+        current version of the os.
+    """
+    _verify_is_ubuntu_rel(cloud_archive_release, openstack_release)
+    ppa = 'ppa:ubuntu-cloud-archive/{}-staging'.format(openstack_release)
+    cmd = 'add-apt-repository -y {}'.format(ppa)
+    _run_with_retries(cmd.split(' '))
+
+
+def _add_cloud_distro_check(cloud_archive_release, openstack_release):
+    """Add the cloud pocket, but also check the cloud_archive_release against
+    the current distro, and use the openstack_release as the full lookup.
+
+    This just calls _add_cloud_pocket() with the openstack_release as pocket
+    to get the correct cloud-archive.list for dpkg to work with.
+
+    :param cloud_archive_release: String, codename for the distro release.
+    :param openstack_release: String, spec for the release to look up in the
+        CLOUD_ARCHIVE_POCKETS
+    :raises: SourceConfigError if this is the wrong distro, or the pocket spec
+        doesn't exist.
+    """
+    _verify_is_ubuntu_rel(cloud_archive_release, openstack_release)
+    _add_cloud_pocket("{}-{}".format(cloud_archive_release, openstack_release))
+
+
+def _verify_is_ubuntu_rel(release, os_release):
+    """Verify that the release is the same as the current Ubuntu release.
+
+    :param release: String, lowercase for the release.
+    :param os_release: String, the os_release being asked for
+    :raises: SourceConfigError if the release is not the same as the ubuntu
+        release.
+    """
+    ubuntu_rel = get_distrib_codename()
+    if release != ubuntu_rel:
+        raise SourceConfigError(
+            'Invalid Cloud Archive release specified: {}-{} on this Ubuntu '
+            'version ({})'.format(release, os_release, ubuntu_rel))
 
 
 def _run_with_retries(cmd, max_retries=CMD_RETRY_COUNT, retry_exitcodes=(1,),
@@ -300,9 +659,12 @@
     :param: cmd_env: dict: Environment variables to add to the command run.
     """
 
-    env = os.environ.copy()
+    env = None
+    kwargs = {}
     if cmd_env:
+        env = os.environ.copy()
         env.update(cmd_env)
+        kwargs['env'] = env
 
     if not retry_message:
         retry_message = "Failed executing '{}'".format(" ".join(cmd))
@@ -314,7 +676,8 @@
     retry_results = (None,) + retry_exitcodes
     while result in retry_results:
        try:
-            result = subprocess.check_call(cmd, env=env)
+            # result = subprocess.check_call(cmd, env=env)
+            result = subprocess.check_call(cmd, **kwargs)
         except subprocess.CalledProcessError as e:
             retry_count = retry_count + 1
             if retry_count > max_retries:
@@ -327,6 +690,7 @@
 def _run_apt_command(cmd, fatal=False):
     """Run an apt command with optional retries.
 
+    :param: cmd: str: The apt command to run.
     :param: fatal: bool: Whether the command's output should be checked and
         retried.
     """
@@ -353,7 +717,7 @@
     cache = apt_cache()
     try:
         pkg = cache[package]
-    except:
+    except Exception:
         # the package is unknown to the current apt cache.
         return None
 
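The final hunk narrows a bare `except:` to `except Exception:`. A bare except also traps BaseException subclasses such as SystemExit and KeyboardInterrupt, which should normally propagate. A short illustration (the `lookup` helper and its toy cache are hypothetical, standing in for the apt-cache lookup above):

```python
def lookup(cache, package):
    """Return the cached entry, or None if the package is unknown.

    Catching Exception (not a bare except) lets SystemExit and
    KeyboardInterrupt, which derive from BaseException, escape.
    """
    try:
        return cache[package]
    except Exception:
        # the package is unknown to this cache.
        return None


# SystemExit is not an Exception subclass, so "except Exception" skips it.
print(issubclass(SystemExit, Exception))  # False
```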
Command: make ci-test
Result: Fail
Revno: 69
Branch: lp:~xavpaice/landscape-client-charm/lp1825267
Jenkins: https://ci.lscape.net/job/latch-test-xenial/3926/