Merge lp:~simpoir/landscape-charm/charmhelpers-and-keys into lp:~landscape/landscape-charm/trunk

Proposed by Simon Poirier
Status: Merged
Approved by: Simon Poirier
Approved revision: 397
Merged at revision: 397
Proposed branch: lp:~simpoir/landscape-charm/charmhelpers-and-keys
Merge into: lp:~landscape/landscape-charm/trunk
Diff against target: 2745 lines (+1661/-199)
26 files modified
Makefile (+1/-2)
charmhelpers/__init__.py (+65/-4)
charmhelpers/contrib/hahelpers/apache.py (+5/-14)
charmhelpers/contrib/hahelpers/cluster.py (+43/-0)
charmhelpers/core/hookenv.py (+450/-28)
charmhelpers/core/host.py (+166/-11)
charmhelpers/core/host_factory/ubuntu.py (+26/-0)
charmhelpers/core/kernel.py (+2/-2)
charmhelpers/core/services/base.py (+18/-7)
charmhelpers/core/strutils.py (+11/-5)
charmhelpers/core/sysctl.py (+21/-10)
charmhelpers/core/templating.py (+18/-9)
charmhelpers/core/unitdata.py (+8/-1)
charmhelpers/fetch/__init__.py (+19/-9)
charmhelpers/fetch/archiveurl.py (+1/-1)
charmhelpers/fetch/bzrurl.py (+2/-2)
charmhelpers/fetch/centos.py (+1/-1)
charmhelpers/fetch/giturl.py (+2/-2)
charmhelpers/fetch/python/__init__.py (+13/-0)
charmhelpers/fetch/python/debug.py (+54/-0)
charmhelpers/fetch/python/packages.py (+154/-0)
charmhelpers/fetch/python/rpdb.py (+56/-0)
charmhelpers/fetch/python/version.py (+32/-0)
charmhelpers/fetch/snap.py (+33/-5)
charmhelpers/fetch/ubuntu.py (+428/-62)
dev/charm_helpers_sync.py (+32/-24)
To merge this branch: bzr merge lp:~simpoir/landscape-charm/charmhelpers-and-keys
Reviewer                    Review Type     Date Requested   Status
🤖 Landscape Builder        test results                     Approve
Adam Collard (community)                                     Approve
Review via email: mp+367881@code.launchpad.net

Commit message

This branch updates charm-helpers and adds the fix proposed in
https://github.com/juju/charm-helpers/pull/326

This should fix apt failures when specifying a deb source and key.

Description of the change

This branch updates charm-helpers and adds the fix proposed in
https://github.com/juju/charm-helpers/pull/326

This should fix apt failures when specifying a deb source and key.
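
For reviewers, a minimal sketch of the code path this fix touches, assuming the charm drives its apt setup through charmhelpers.fetch; the package name below is illustrative only and not taken from this branch:

# Sketch only: shows how install_sources/install_keys reach the updated
# fetch helpers; this is not code from the branch itself.
from charmhelpers import fetch

# configure_sources() reads the YAML lists stored in the charm config
# options 'install_sources' and 'install_keys' (the options used in the
# testing instructions below), calls add_source(source, key) for each
# pair and, with update=True, finishes with a fatal apt update -- the
# step that previously failed when an inline ASCII-armoured key was
# supplied.
fetch.configure_sources(update=True)

# Once the source and key are in place, installs proceed normally.
# 'landscape-server' is a placeholder package name for illustration.
fetch.apt_install(
    fetch.filter_installed_packages(['landscape-server']),
    fatal=True,
)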

Testing instructions:

juju deploy . --config install_sources='["deb http://ppa.launchpad.net/landscape/18.03/ubuntu bionic main"]' --config install_keys='["-----BEGIN PGP PUBLIC KEY BLOCK-----\\nVersion: SKS 1.1.6\\nComment: Hostname: keyserver.ubuntu.com\\n\\nmI0ESXN/egEEAOgRYISU9dnQm4BB5ZEEwKT+NKUDNd/DhMYdtBMw9Yk7S5cyoqpbtwoPJVzK\\nAXxq+ng5e3yYypSv98pLMr5UF09FGaeyGlD4s1uaVFWkFCO4jsTg7pWIY6qzO/jMxB5+Yu/G\\n0GjWQMNKxFk0oHMa0PhNBZtdPacVz65mOVmCsh/lABEBAAG0G0xhdW5jaHBhZCBQUEEgZm9y\\nIExhbmRzY2FwZYi2BBMBAgAgBQJJc396AhsDBgsJCAcDAgQVAggDBBYCAwECHgECF4AACgkQ\\nboWobkZStOb+rwP+ONKUWeX+MTIPqGWkknBPV7jm8nyyIUojC4IhS+9YR6GYnn0hMABSkEHm\\nIV73feKmrT2GESYI1UdYeKiOkWsPN/JyBk+eTvKet0qsw5TluqiHSW+LEi/+zUyrS3dDMX3o\\nyaLgYa+UkjIyxnaKLkQuCiS+D+fYwnJulIkhaKObtdE=\\n=UwRd\\n-----END PGP PUBLIC KEY BLOCK-----"]'

juju debug-log

Check for apt refresh failures (or the lack thereof). Packages will install and the charm will block waiting on relations. The previous behaviour was a series of failed update/install attempts, eventually succeeding after many retries.

Revision history for this message
Adam Collard (adam-collard) wrote :

+1

review: Approve
Revision history for this message
🤖 Landscape Builder (landscape-builder) wrote :

Voting does not meet specified criteria. Required: Approve >= 2, Disapprove == 0. Got: 1 Approve.

Revision history for this message
🤖 Landscape Builder (landscape-builder) wrote :

No approved revision specified.

Revision history for this message
🤖 Landscape Builder (landscape-builder) :
review: Abstain (executing tests)
Revision history for this message
🤖 Landscape Builder (landscape-builder) wrote :

Command: make ci-test
Result: Success
Revno: 397
Branch: lp:~simpoir/landscape-charm/charmhelpers-and-keys
Jenkins: https://ci.lscape.net/job/latch-test-xenial/3942/

review: Approve (test results)

Preview Diff

1=== modified file 'Makefile'
2--- Makefile 2019-01-17 14:23:55 +0000
3+++ Makefile 2019-05-24 12:43:31 +0000
4@@ -88,8 +88,7 @@
5
6 dev/charm_helpers_sync.py:
7 @mkdir -p dev
8- @bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
9- > dev/charm_helpers_sync.py
10+ @curl https://git.launchpad.net/charm-helpers/plain/tools/charm_helpers_sync/charm_helpers_sync.py > dev/charm_helpers_sync.py
11
12 sync: dev/charm_helpers_sync.py
13 $(PYTHON) dev/charm_helpers_sync.py -c charm-helpers.yaml
14
15=== modified file 'charmhelpers/__init__.py'
16--- charmhelpers/__init__.py 2017-03-03 21:03:14 +0000
17+++ charmhelpers/__init__.py 2019-05-24 12:43:31 +0000
18@@ -14,23 +14,84 @@
19
20 # Bootstrap charm-helpers, installing its dependencies if necessary using
21 # only standard libraries.
22+from __future__ import print_function
23+from __future__ import absolute_import
24+
25+import functools
26+import inspect
27 import subprocess
28 import sys
29
30 try:
31- import six # flake8: noqa
32+ import six # NOQA:F401
33 except ImportError:
34 if sys.version_info.major == 2:
35 subprocess.check_call(['apt-get', 'install', '-y', 'python-six'])
36 else:
37 subprocess.check_call(['apt-get', 'install', '-y', 'python3-six'])
38- import six # flake8: noqa
39+ import six # NOQA:F401
40
41 try:
42- import yaml # flake8: noqa
43+ import yaml # NOQA:F401
44 except ImportError:
45 if sys.version_info.major == 2:
46 subprocess.check_call(['apt-get', 'install', '-y', 'python-yaml'])
47 else:
48 subprocess.check_call(['apt-get', 'install', '-y', 'python3-yaml'])
49- import yaml # flake8: noqa
50+ import yaml # NOQA:F401
51+
52+
53+# Holds a list of mapping of mangled function names that have been deprecated
54+# using the @deprecate decorator below. This is so that the warning is only
55+# printed once for each usage of the function.
56+__deprecated_functions = {}
57+
58+
59+def deprecate(warning, date=None, log=None):
60+ """Add a deprecation warning the first time the function is used.
61+ The date, which is a string in semi-ISO8660 format indicate the year-month
62+ that the function is officially going to be removed.
63+
64+ usage:
65+
66+ @deprecate('use core/fetch/add_source() instead', '2017-04')
67+ def contributed_add_source_thing(...):
68+ ...
69+
70+ And it then prints to the log ONCE that the function is deprecated.
71+ The reason for passing the logging function (log) is so that hookenv.log
72+ can be used for a charm if needed.
73+
74+ :param warning: String to indicat where it has moved ot.
75+ :param date: optional sting, in YYYY-MM format to indicate when the
76+ function will definitely (probably) be removed.
77+ :param log: The log function to call to log. If not, logs to stdout
78+ """
79+ def wrap(f):
80+
81+ @functools.wraps(f)
82+ def wrapped_f(*args, **kwargs):
83+ try:
84+ module = inspect.getmodule(f)
85+ file = inspect.getsourcefile(f)
86+ lines = inspect.getsourcelines(f)
87+ f_name = "{}-{}-{}..{}-{}".format(
88+ module.__name__, file, lines[0], lines[-1], f.__name__)
89+ except (IOError, TypeError):
90+ # assume it was local, so just use the name of the function
91+ f_name = f.__name__
92+ if f_name not in __deprecated_functions:
93+ __deprecated_functions[f_name] = True
94+ s = "DEPRECATION WARNING: Function {} is being removed".format(
95+ f.__name__)
96+ if date:
97+ s = "{} on/around {}".format(s, date)
98+ if warning:
99+ s = "{} : {}".format(s, warning)
100+ if log:
101+ log(s)
102+ else:
103+ print(s)
104+ return f(*args, **kwargs)
105+ return wrapped_f
106+ return wrap
107
108=== modified file 'charmhelpers/contrib/hahelpers/apache.py'
109--- charmhelpers/contrib/hahelpers/apache.py 2017-03-03 21:03:14 +0000
110+++ charmhelpers/contrib/hahelpers/apache.py 2019-05-24 12:43:31 +0000
111@@ -23,8 +23,8 @@
112 #
113
114 import os
115-import subprocess
116
117+from charmhelpers.core import host
118 from charmhelpers.core.hookenv import (
119 config as config_get,
120 relation_get,
121@@ -65,7 +65,8 @@
122 if ca_cert is None:
123 log("Inspecting identity-service relations for CA SSL certificate.",
124 level=INFO)
125- for r_id in relation_ids('identity-service'):
126+ for r_id in (relation_ids('identity-service') +
127+ relation_ids('identity-credentials')):
128 for unit in relation_list(r_id):
129 if ca_cert is None:
130 ca_cert = relation_get('ca_cert',
131@@ -76,20 +77,10 @@
132 def retrieve_ca_cert(cert_file):
133 cert = None
134 if os.path.isfile(cert_file):
135- with open(cert_file, 'r') as crt:
136+ with open(cert_file, 'rb') as crt:
137 cert = crt.read()
138 return cert
139
140
141 def install_ca_cert(ca_cert):
142- if ca_cert:
143- cert_file = ('/usr/local/share/ca-certificates/'
144- 'keystone_juju_ca_cert.crt')
145- old_cert = retrieve_ca_cert(cert_file)
146- if old_cert and old_cert == ca_cert:
147- log("CA cert is the same as installed version", level=INFO)
148- else:
149- log("Installing new CA cert", level=INFO)
150- with open(cert_file, 'w') as crt:
151- crt.write(ca_cert)
152- subprocess.check_call(['update-ca-certificates', '--fresh'])
153+ host.install_ca_cert(ca_cert, 'keystone_juju_ca_cert')
154
155=== modified file 'charmhelpers/contrib/hahelpers/cluster.py'
156--- charmhelpers/contrib/hahelpers/cluster.py 2017-03-03 21:03:14 +0000
157+++ charmhelpers/contrib/hahelpers/cluster.py 2019-05-24 12:43:31 +0000
158@@ -27,6 +27,7 @@
159
160 import subprocess
161 import os
162+import time
163
164 from socket import gethostname as get_unit_hostname
165
166@@ -45,6 +46,9 @@
167 is_leader as juju_is_leader,
168 status_set,
169 )
170+from charmhelpers.core.host import (
171+ modulo_distribution,
172+)
173 from charmhelpers.core.decorators import (
174 retry_on_exception,
175 )
176@@ -219,6 +223,11 @@
177 return True
178 if config_get('ssl_cert') and config_get('ssl_key'):
179 return True
180+ for r_id in relation_ids('certificates'):
181+ for unit in relation_list(r_id):
182+ ca = relation_get('ca', rid=r_id, unit=unit)
183+ if ca:
184+ return True
185 for r_id in relation_ids('identity-service'):
186 for unit in relation_list(r_id):
187 # TODO - needs fixing for new helper as ssl_cert/key suffixes with CN
188@@ -361,3 +370,37 @@
189 else:
190 addr = unit_get('private-address')
191 return '%s://%s' % (scheme, addr)
192+
193+
194+def distributed_wait(modulo=None, wait=None, operation_name='operation'):
195+ ''' Distribute operations by waiting based on modulo_distribution
196+
197+ If modulo and or wait are not set, check config_get for those values.
198+ If config values are not set, default to modulo=3 and wait=30.
199+
200+ :param modulo: int The modulo number creates the group distribution
201+ :param wait: int The constant time wait value
202+ :param operation_name: string Operation name for status message
203+ i.e. 'restart'
204+ :side effect: Calls config_get()
205+ :side effect: Calls log()
206+ :side effect: Calls status_set()
207+ :side effect: Calls time.sleep()
208+ '''
209+ if modulo is None:
210+ modulo = config_get('modulo-nodes') or 3
211+ if wait is None:
212+ wait = config_get('known-wait') or 30
213+ if juju_is_leader():
214+ # The leader should never wait
215+ calculated_wait = 0
216+ else:
217+ # non_zero_wait=True guarantees the non-leader who gets modulo 0
218+ # will still wait
219+ calculated_wait = modulo_distribution(modulo=modulo, wait=wait,
220+ non_zero_wait=True)
221+ msg = "Waiting {} seconds for {} ...".format(calculated_wait,
222+ operation_name)
223+ log(msg, DEBUG)
224+ status_set('maintenance', msg)
225+ time.sleep(calculated_wait)
226
227=== modified file 'charmhelpers/core/hookenv.py'
228--- charmhelpers/core/hookenv.py 2017-03-03 21:03:14 +0000
229+++ charmhelpers/core/hookenv.py 2019-05-24 12:43:31 +0000
230@@ -22,10 +22,12 @@
231 import copy
232 from distutils.version import LooseVersion
233 from functools import wraps
234+from collections import namedtuple
235 import glob
236 import os
237 import json
238 import yaml
239+import re
240 import subprocess
241 import sys
242 import errno
243@@ -38,12 +40,20 @@
244 else:
245 from collections import UserDict
246
247+
248 CRITICAL = "CRITICAL"
249 ERROR = "ERROR"
250 WARNING = "WARNING"
251 INFO = "INFO"
252 DEBUG = "DEBUG"
253+TRACE = "TRACE"
254 MARKER = object()
255+SH_MAX_ARG = 131071
256+
257+
258+RANGE_WARNING = ('Passing NO_PROXY string that includes a cidr. '
259+ 'This may not be compatible with software you are '
260+ 'running in your shell.')
261
262 cache = {}
263
264@@ -64,7 +74,7 @@
265 @wraps(func)
266 def wrapper(*args, **kwargs):
267 global cache
268- key = str((func, args, kwargs))
269+ key = json.dumps((func, args, kwargs), sort_keys=True, default=str)
270 try:
271 return cache[key]
272 except KeyError:
273@@ -94,7 +104,7 @@
274 command += ['-l', level]
275 if not isinstance(message, six.string_types):
276 message = repr(message)
277- command += [message]
278+ command += [message[:SH_MAX_ARG]]
279 # Missing juju-log should not cause failures in unit tests
280 # Send log output to stderr
281 try:
282@@ -197,9 +207,56 @@
283 return os.environ.get('JUJU_REMOTE_UNIT', None)
284
285
286+def application_name():
287+ """
288+ The name of the deployed application this unit belongs to.
289+ """
290+ return local_unit().split('/')[0]
291+
292+
293 def service_name():
294- """The name service group this unit belongs to"""
295- return local_unit().split('/')[0]
296+ """
297+ .. deprecated:: 0.19.1
298+ Alias for :func:`application_name`.
299+ """
300+ return application_name()
301+
302+
303+def model_name():
304+ """
305+ Name of the model that this unit is deployed in.
306+ """
307+ return os.environ['JUJU_MODEL_NAME']
308+
309+
310+def model_uuid():
311+ """
312+ UUID of the model that this unit is deployed in.
313+ """
314+ return os.environ['JUJU_MODEL_UUID']
315+
316+
317+def principal_unit():
318+ """Returns the principal unit of this unit, otherwise None"""
319+ # Juju 2.2 and above provides JUJU_PRINCIPAL_UNIT
320+ principal_unit = os.environ.get('JUJU_PRINCIPAL_UNIT', None)
321+ # If it's empty, then this unit is the principal
322+ if principal_unit == '':
323+ return os.environ['JUJU_UNIT_NAME']
324+ elif principal_unit is not None:
325+ return principal_unit
326+ # For Juju 2.1 and below, let's try work out the principle unit by
327+ # the various charms' metadata.yaml.
328+ for reltype in relation_types():
329+ for rid in relation_ids(reltype):
330+ for unit in related_units(rid):
331+ md = _metadata_unit(unit)
332+ if not md:
333+ continue
334+ subordinate = md.pop('subordinate', None)
335+ if not subordinate:
336+ return unit
337+ return None
338
339
340 @cached
341@@ -263,7 +320,7 @@
342 self.implicit_save = True
343 self._prev_dict = None
344 self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME)
345- if os.path.exists(self.path):
346+ if os.path.exists(self.path) and os.stat(self.path).st_size:
347 self.load_previous()
348 atexit(self._implicit_save)
349
350@@ -283,7 +340,11 @@
351 """
352 self.path = path or self.path
353 with open(self.path) as f:
354- self._prev_dict = json.load(f)
355+ try:
356+ self._prev_dict = json.load(f)
357+ except ValueError as e:
358+ log('Unable to parse previous config data - {}'.format(str(e)),
359+ level=ERROR)
360 for k, v in copy.deepcopy(self._prev_dict).items():
361 if k not in self:
362 self[k] = v
363@@ -319,6 +380,7 @@
364
365 """
366 with open(self.path, 'w') as f:
367+ os.fchmod(f.fileno(), 0o600)
368 json.dump(self, f)
369
370 def _implicit_save(self):
371@@ -326,22 +388,40 @@
372 self.save()
373
374
375-@cached
376+_cache_config = None
377+
378+
379 def config(scope=None):
380- """Juju charm configuration"""
381- config_cmd_line = ['config-get']
382- if scope is not None:
383- config_cmd_line.append(scope)
384- else:
385- config_cmd_line.append('--all')
386- config_cmd_line.append('--format=json')
387- try:
388- config_data = json.loads(
389- subprocess.check_output(config_cmd_line).decode('UTF-8'))
390+ """
391+ Get the juju charm configuration (scope==None) or individual key,
392+ (scope=str). The returned value is a Python data structure loaded as
393+ JSON from the Juju config command.
394+
395+ :param scope: If set, return the value for the specified key.
396+ :type scope: Optional[str]
397+ :returns: Either the whole config as a Config, or a key from it.
398+ :rtype: Any
399+ """
400+ global _cache_config
401+ config_cmd_line = ['config-get', '--all', '--format=json']
402+ try:
403+ # JSON Decode Exception for Python3.5+
404+ exc_json = json.decoder.JSONDecodeError
405+ except AttributeError:
406+ # JSON Decode Exception for Python2.7 through Python3.4
407+ exc_json = ValueError
408+ try:
409+ if _cache_config is None:
410+ config_data = json.loads(
411+ subprocess.check_output(config_cmd_line).decode('UTF-8'))
412+ _cache_config = Config(config_data)
413 if scope is not None:
414- return config_data
415- return Config(config_data)
416- except ValueError:
417+ return _cache_config.get(scope)
418+ return _cache_config
419+ except (exc_json, UnicodeDecodeError) as e:
420+ log('Unable to parse output from config-get: config_cmd_line="{}" '
421+ 'message="{}"'
422+ .format(config_cmd_line, str(e)), level=ERROR)
423 return None
424
425
426@@ -435,6 +515,67 @@
427 subprocess.check_output(units_cmd_line).decode('UTF-8')) or []
428
429
430+def expected_peer_units():
431+ """Get a generator for units we expect to join peer relation based on
432+ goal-state.
433+
434+ The local unit is excluded from the result to make it easy to gauge
435+ completion of all peers joining the relation with existing hook tools.
436+
437+ Example usage:
438+ log('peer {} of {} joined peer relation'
439+ .format(len(related_units()),
440+ len(list(expected_peer_units()))))
441+
442+ This function will raise NotImplementedError if used with juju versions
443+ without goal-state support.
444+
445+ :returns: iterator
446+ :rtype: types.GeneratorType
447+ :raises: NotImplementedError
448+ """
449+ if not has_juju_version("2.4.0"):
450+ # goal-state first appeared in 2.4.0.
451+ raise NotImplementedError("goal-state")
452+ _goal_state = goal_state()
453+ return (key for key in _goal_state['units']
454+ if '/' in key and key != local_unit())
455+
456+
457+def expected_related_units(reltype=None):
458+ """Get a generator for units we expect to join relation based on
459+ goal-state.
460+
461+ Note that you can not use this function for the peer relation, take a look
462+ at expected_peer_units() for that.
463+
464+ This function will raise KeyError if you request information for a
465+ relation type for which juju goal-state does not have information. It will
466+ raise NotImplementedError if used with juju versions without goal-state
467+ support.
468+
469+ Example usage:
470+ log('participant {} of {} joined relation {}'
471+ .format(len(related_units()),
472+ len(list(expected_related_units())),
473+ relation_type()))
474+
475+ :param reltype: Relation type to list data for, default is to list data for
476+ the realtion type we are currently executing a hook for.
477+ :type reltype: str
478+ :returns: iterator
479+ :rtype: types.GeneratorType
480+ :raises: KeyError, NotImplementedError
481+ """
482+ if not has_juju_version("2.4.4"):
483+ # goal-state existed in 2.4.0, but did not list individual units to
484+ # join a relation in 2.4.1 through 2.4.3. (LP: #1794739)
485+ raise NotImplementedError("goal-state relation unit count")
486+ reltype = reltype or relation_type()
487+ _goal_state = goal_state()
488+ return (key for key in _goal_state['relations'][reltype] if '/' in key)
489+
490+
491 @cached
492 def relation_for_unit(unit=None, rid=None):
493 """Get the json represenation of a unit's relation"""
494@@ -478,6 +619,24 @@
495 return yaml.safe_load(md)
496
497
498+def _metadata_unit(unit):
499+ """Given the name of a unit (e.g. apache2/0), get the unit charm's
500+ metadata.yaml. Very similar to metadata() but allows us to inspect
501+ other units. Unit needs to be co-located, such as a subordinate or
502+ principal/primary.
503+
504+ :returns: metadata.yaml as a python object.
505+
506+ """
507+ basedir = os.sep.join(charm_dir().split(os.sep)[:-2])
508+ unitdir = 'unit-{}'.format(unit.replace(os.sep, '-'))
509+ joineddir = os.path.join(basedir, unitdir, 'charm', 'metadata.yaml')
510+ if not os.path.exists(joineddir):
511+ return None
512+ with open(joineddir) as md:
513+ return yaml.safe_load(md)
514+
515+
516 @cached
517 def relation_types():
518 """Get a list of relation types supported by this charm"""
519@@ -602,18 +761,31 @@
520 return False
521
522
523+def _port_op(op_name, port, protocol="TCP"):
524+ """Open or close a service network port"""
525+ _args = [op_name]
526+ icmp = protocol.upper() == "ICMP"
527+ if icmp:
528+ _args.append(protocol)
529+ else:
530+ _args.append('{}/{}'.format(port, protocol))
531+ try:
532+ subprocess.check_call(_args)
533+ except subprocess.CalledProcessError:
534+ # Older Juju pre 2.3 doesn't support ICMP
535+ # so treat it as a no-op if it fails.
536+ if not icmp:
537+ raise
538+
539+
540 def open_port(port, protocol="TCP"):
541 """Open a service network port"""
542- _args = ['open-port']
543- _args.append('{}/{}'.format(port, protocol))
544- subprocess.check_call(_args)
545+ _port_op('open-port', port, protocol)
546
547
548 def close_port(port, protocol="TCP"):
549 """Close a service network port"""
550- _args = ['close-port']
551- _args.append('{}/{}'.format(port, protocol))
552- subprocess.check_call(_args)
553+ _port_op('close-port', port, protocol)
554
555
556 def open_ports(start, end, protocol="TCP"):
557@@ -630,6 +802,17 @@
558 subprocess.check_call(_args)
559
560
561+def opened_ports():
562+ """Get the opened ports
563+
564+ *Note that this will only show ports opened in a previous hook*
565+
566+ :returns: Opened ports as a list of strings: ``['8080/tcp', '8081-8083/tcp']``
567+ """
568+ _args = ['opened-ports', '--format=json']
569+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
570+
571+
572 @cached
573 def unit_get(attribute):
574 """Get the unit ID for the remote unit"""
575@@ -751,8 +934,15 @@
576 return wrapper
577
578
579+class NoNetworkBinding(Exception):
580+ pass
581+
582+
583 def charm_dir():
584 """Return the root directory of the current charm"""
585+ d = os.environ.get('JUJU_CHARM_DIR')
586+ if d is not None:
587+ return d
588 return os.environ.get('CHARM_DIR')
589
590
591@@ -874,6 +1064,14 @@
592
593
594 @translate_exc(from_exc=OSError, to_exc=NotImplementedError)
595+@cached
596+def goal_state():
597+ """Juju goal state values"""
598+ cmd = ['goal-state', '--format=json']
599+ return json.loads(subprocess.check_output(cmd).decode('UTF-8'))
600+
601+
602+@translate_exc(from_exc=OSError, to_exc=NotImplementedError)
603 def is_leader():
604 """Does the current unit hold the juju leadership
605
606@@ -967,7 +1165,6 @@
607 universal_newlines=True).strip()
608
609
610-@cached
611 def has_juju_version(minimum_version):
612 """Return True if the Juju version is at least the provided version"""
613 return LooseVersion(juju_version()) >= LooseVersion(minimum_version)
614@@ -1027,6 +1224,8 @@
615 @translate_exc(from_exc=OSError, to_exc=NotImplementedError)
616 def network_get_primary_address(binding):
617 '''
618+ Deprecated since Juju 2.3; use network_get()
619+
620 Retrieve the primary network address for a named binding
621
622 :param binding: string. The name of a relation of extra-binding
623@@ -1034,7 +1233,41 @@
624 :raise: NotImplementedError if run on Juju < 2.0
625 '''
626 cmd = ['network-get', '--primary-address', binding]
627- return subprocess.check_output(cmd).decode('UTF-8').strip()
628+ try:
629+ response = subprocess.check_output(
630+ cmd,
631+ stderr=subprocess.STDOUT).decode('UTF-8').strip()
632+ except CalledProcessError as e:
633+ if 'no network config found for binding' in e.output.decode('UTF-8'):
634+ raise NoNetworkBinding("No network binding for {}"
635+ .format(binding))
636+ else:
637+ raise
638+ return response
639+
640+
641+def network_get(endpoint, relation_id=None):
642+ """
643+ Retrieve the network details for a relation endpoint
644+
645+ :param endpoint: string. The name of a relation endpoint
646+ :param relation_id: int. The ID of the relation for the current context.
647+ :return: dict. The loaded YAML output of the network-get query.
648+ :raise: NotImplementedError if request not supported by the Juju version.
649+ """
650+ if not has_juju_version('2.2'):
651+ raise NotImplementedError(juju_version()) # earlier versions require --primary-address
652+ if relation_id and not has_juju_version('2.3'):
653+ raise NotImplementedError # 2.3 added the -r option
654+
655+ cmd = ['network-get', endpoint, '--format', 'yaml']
656+ if relation_id:
657+ cmd.append('-r')
658+ cmd.append(relation_id)
659+ response = subprocess.check_output(
660+ cmd,
661+ stderr=subprocess.STDOUT).decode('UTF-8').strip()
662+ return yaml.safe_load(response)
663
664
665 def add_metric(*args, **kwargs):
666@@ -1066,3 +1299,192 @@
667 """Get the meter status information, if running in the meter-status-changed
668 hook."""
669 return os.environ.get('JUJU_METER_INFO')
670+
671+
672+def iter_units_for_relation_name(relation_name):
673+ """Iterate through all units in a relation
674+
675+ Generator that iterates through all the units in a relation and yields
676+ a named tuple with rid and unit field names.
677+
678+ Usage:
679+ data = [(u.rid, u.unit)
680+ for u in iter_units_for_relation_name(relation_name)]
681+
682+ :param relation_name: string relation name
683+ :yield: Named Tuple with rid and unit field names
684+ """
685+ RelatedUnit = namedtuple('RelatedUnit', 'rid, unit')
686+ for rid in relation_ids(relation_name):
687+ for unit in related_units(rid):
688+ yield RelatedUnit(rid, unit)
689+
690+
691+def ingress_address(rid=None, unit=None):
692+ """
693+ Retrieve the ingress-address from a relation when available.
694+ Otherwise, return the private-address.
695+
696+ When used on the consuming side of the relation (unit is a remote
697+ unit), the ingress-address is the IP address that this unit needs
698+ to use to reach the provided service on the remote unit.
699+
700+ When used on the providing side of the relation (unit == local_unit()),
701+ the ingress-address is the IP address that is advertised to remote
702+ units on this relation. Remote units need to use this address to
703+ reach the local provided service on this unit.
704+
705+ Note that charms may document some other method to use in
706+ preference to the ingress_address(), such as an address provided
707+ on a different relation attribute or a service discovery mechanism.
708+ This allows charms to redirect inbound connections to their peers
709+ or different applications such as load balancers.
710+
711+ Usage:
712+ addresses = [ingress_address(rid=u.rid, unit=u.unit)
713+ for u in iter_units_for_relation_name(relation_name)]
714+
715+ :param rid: string relation id
716+ :param unit: string unit name
717+ :side effect: calls relation_get
718+ :return: string IP address
719+ """
720+ settings = relation_get(rid=rid, unit=unit)
721+ return (settings.get('ingress-address') or
722+ settings.get('private-address'))
723+
724+
725+def egress_subnets(rid=None, unit=None):
726+ """
727+ Retrieve the egress-subnets from a relation.
728+
729+ This function is to be used on the providing side of the
730+ relation, and provides the ranges of addresses that client
731+ connections may come from. The result is uninteresting on
732+ the consuming side of a relation (unit == local_unit()).
733+
734+ Returns a stable list of subnets in CIDR format.
735+ eg. ['192.168.1.0/24', '2001::F00F/128']
736+
737+ If egress-subnets is not available, falls back to using the published
738+ ingress-address, or finally private-address.
739+
740+ :param rid: string relation id
741+ :param unit: string unit name
742+ :side effect: calls relation_get
743+ :return: list of subnets in CIDR format. eg. ['192.168.1.0/24', '2001::F00F/128']
744+ """
745+ def _to_range(addr):
746+ if re.search(r'^(?:\d{1,3}\.){3}\d{1,3}$', addr) is not None:
747+ addr += '/32'
748+ elif ':' in addr and '/' not in addr: # IPv6
749+ addr += '/128'
750+ return addr
751+
752+ settings = relation_get(rid=rid, unit=unit)
753+ if 'egress-subnets' in settings:
754+ return [n.strip() for n in settings['egress-subnets'].split(',') if n.strip()]
755+ if 'ingress-address' in settings:
756+ return [_to_range(settings['ingress-address'])]
757+ if 'private-address' in settings:
758+ return [_to_range(settings['private-address'])]
759+ return [] # Should never happen
760+
761+
762+def unit_doomed(unit=None):
763+ """Determines if the unit is being removed from the model
764+
765+ Requires Juju 2.4.1.
766+
767+ :param unit: string unit name, defaults to local_unit
768+ :side effect: calls goal_state
769+ :side effect: calls local_unit
770+ :side effect: calls has_juju_version
771+ :return: True if the unit is being removed, already gone, or never existed
772+ """
773+ if not has_juju_version("2.4.1"):
774+ # We cannot risk blindly returning False for 'we don't know',
775+ # because that could cause data loss; if call sites don't
776+ # need an accurate answer, they likely don't need this helper
777+ # at all.
778+ # goal-state existed in 2.4.0, but did not handle removals
779+ # correctly until 2.4.1.
780+ raise NotImplementedError("is_doomed")
781+ if unit is None:
782+ unit = local_unit()
783+ gs = goal_state()
784+ units = gs.get('units', {})
785+ if unit not in units:
786+ return True
787+ # I don't think 'dead' units ever show up in the goal-state, but
788+ # check anyway in addition to 'dying'.
789+ return units[unit]['status'] in ('dying', 'dead')
790+
791+
792+def env_proxy_settings(selected_settings=None):
793+ """Get proxy settings from process environment variables.
794+
795+ Get charm proxy settings from environment variables that correspond to
796+ juju-http-proxy, juju-https-proxy and juju-no-proxy (available as of 2.4.2,
797+ see lp:1782236) in a format suitable for passing to an application that
798+ reacts to proxy settings passed as environment variables. Some applications
799+ support lowercase or uppercase notation (e.g. curl), some support only
800+ lowercase (e.g. wget), there are also subjectively rare cases of only
801+ uppercase notation support. no_proxy CIDR and wildcard support also varies
802+ between runtimes and applications as there is no enforced standard.
803+
804+ Some applications may connect to multiple destinations and expose config
805+ options that would affect only proxy settings for a specific destination
806+ these should be handled in charms in an application-specific manner.
807+
808+ :param selected_settings: format only a subset of possible settings
809+ :type selected_settings: list
810+ :rtype: Option(None, dict[str, str])
811+ """
812+ SUPPORTED_SETTINGS = {
813+ 'http': 'HTTP_PROXY',
814+ 'https': 'HTTPS_PROXY',
815+ 'no_proxy': 'NO_PROXY',
816+ 'ftp': 'FTP_PROXY'
817+ }
818+ if selected_settings is None:
819+ selected_settings = SUPPORTED_SETTINGS
820+
821+ selected_vars = [v for k, v in SUPPORTED_SETTINGS.items()
822+ if k in selected_settings]
823+ proxy_settings = {}
824+ for var in selected_vars:
825+ var_val = os.getenv(var)
826+ if var_val:
827+ proxy_settings[var] = var_val
828+ proxy_settings[var.lower()] = var_val
829+ # Now handle juju-prefixed environment variables. The legacy vs new
830+ # environment variable usage is mutually exclusive
831+ charm_var_val = os.getenv('JUJU_CHARM_{}'.format(var))
832+ if charm_var_val:
833+ proxy_settings[var] = charm_var_val
834+ proxy_settings[var.lower()] = charm_var_val
835+ if 'no_proxy' in proxy_settings:
836+ if _contains_range(proxy_settings['no_proxy']):
837+ log(RANGE_WARNING, level=WARNING)
838+ return proxy_settings if proxy_settings else None
839+
840+
841+def _contains_range(addresses):
842+ """Check for cidr or wildcard domain in a string.
843+
844+ Given a string comprising a comma seperated list of ip addresses
845+ and domain names, determine whether the string contains IP ranges
846+ or wildcard domains.
847+
848+ :param addresses: comma seperated list of domains and ip addresses.
849+ :type addresses: str
850+ """
851+ return (
852+ # Test for cidr (e.g. 10.20.20.0/24)
853+ "/" in addresses or
854+ # Test for wildcard domains (*.foo.com or .foo.com)
855+ "*" in addresses or
856+ addresses.startswith(".") or
857+ ",." in addresses or
858+ " ." in addresses)
859
860=== modified file 'charmhelpers/core/host.py'
861--- charmhelpers/core/host.py 2017-04-11 18:01:45 +0000
862+++ charmhelpers/core/host.py 2019-05-24 12:43:31 +0000
863@@ -34,21 +34,23 @@
864
865 from contextlib import contextmanager
866 from collections import OrderedDict
867-from .hookenv import log
868+from .hookenv import log, INFO, DEBUG, local_unit, charm_name
869 from .fstab import Fstab
870 from charmhelpers.osplatform import get_platform
871
872 __platform__ = get_platform()
873 if __platform__ == "ubuntu":
874- from charmhelpers.core.host_factory.ubuntu import (
875+ from charmhelpers.core.host_factory.ubuntu import ( # NOQA:F401
876 service_available,
877 add_new_group,
878 lsb_release,
879 cmp_pkgrevno,
880 CompareHostReleases,
881+ get_distrib_codename,
882+ arch
883 ) # flake8: noqa -- ignore F401 for this import
884 elif __platform__ == "centos":
885- from charmhelpers.core.host_factory.centos import (
886+ from charmhelpers.core.host_factory.centos import ( # NOQA:F401
887 service_available,
888 add_new_group,
889 lsb_release,
890@@ -58,6 +60,7 @@
891
892 UPDATEDB_PATH = '/etc/updatedb.conf'
893
894+
895 def service_start(service_name, **kwargs):
896 """Start a system service.
897
898@@ -191,6 +194,7 @@
899 upstart_file = os.path.join(init_dir, "{}.conf".format(service_name))
900 sysv_file = os.path.join(initd_dir, service_name)
901 if init_is_systemd():
902+ service('disable', service_name)
903 service('mask', service_name)
904 elif os.path.exists(upstart_file):
905 override_path = os.path.join(
906@@ -225,6 +229,7 @@
907 sysv_file = os.path.join(initd_dir, service_name)
908 if init_is_systemd():
909 service('unmask', service_name)
910+ service('enable', service_name)
911 elif os.path.exists(upstart_file):
912 override_path = os.path.join(
913 init_dir, '{}.override'.format(service_name))
914@@ -285,8 +290,8 @@
915 for key, value in six.iteritems(kwargs):
916 parameter = '%s=%s' % (key, value)
917 cmd.append(parameter)
918- output = subprocess.check_output(cmd,
919- stderr=subprocess.STDOUT).decode('UTF-8')
920+ output = subprocess.check_output(
921+ cmd, stderr=subprocess.STDOUT).decode('UTF-8')
922 except subprocess.CalledProcessError:
923 return False
924 else:
925@@ -439,6 +444,51 @@
926 subprocess.check_call(cmd)
927
928
929+def chage(username, lastday=None, expiredate=None, inactive=None,
930+ mindays=None, maxdays=None, root=None, warndays=None):
931+ """Change user password expiry information
932+
933+ :param str username: User to update
934+ :param str lastday: Set when password was changed in YYYY-MM-DD format
935+ :param str expiredate: Set when user's account will no longer be
936+ accessible in YYYY-MM-DD format.
937+ -1 will remove an account expiration date.
938+ :param str inactive: Set the number of days of inactivity after a password
939+ has expired before the account is locked.
940+ -1 will remove an account's inactivity.
941+ :param str mindays: Set the minimum number of days between password
942+ changes to MIN_DAYS.
943+ 0 indicates the password can be changed anytime.
944+ :param str maxdays: Set the maximum number of days during which a
945+ password is valid.
946+ -1 as MAX_DAYS will remove checking maxdays
947+ :param str root: Apply changes in the CHROOT_DIR directory
948+ :param str warndays: Set the number of days of warning before a password
949+ change is required
950+ :raises subprocess.CalledProcessError: if call to chage fails
951+ """
952+ cmd = ['chage']
953+ if root:
954+ cmd.extend(['--root', root])
955+ if lastday:
956+ cmd.extend(['--lastday', lastday])
957+ if expiredate:
958+ cmd.extend(['--expiredate', expiredate])
959+ if inactive:
960+ cmd.extend(['--inactive', inactive])
961+ if mindays:
962+ cmd.extend(['--mindays', mindays])
963+ if maxdays:
964+ cmd.extend(['--maxdays', maxdays])
965+ if warndays:
966+ cmd.extend(['--warndays', warndays])
967+ cmd.append(username)
968+ subprocess.check_call(cmd)
969+
970+
971+remove_password_expiry = functools.partial(chage, expiredate='-1', inactive='-1', mindays='0', maxdays='-1')
972+
973+
974 def rsync(from_path, to_path, flags='-r', options=None, timeout=None):
975 """Replicate the contents of a path"""
976 options = options or ['--delete', '--executability']
977@@ -485,13 +535,45 @@
978
979 def write_file(path, content, owner='root', group='root', perms=0o444):
980 """Create or overwrite a file with the contents of a byte string."""
981- log("Writing file {} {}:{} {:o}".format(path, owner, group, perms))
982 uid = pwd.getpwnam(owner).pw_uid
983 gid = grp.getgrnam(group).gr_gid
984- with open(path, 'wb') as target:
985- os.fchown(target.fileno(), uid, gid)
986- os.fchmod(target.fileno(), perms)
987- target.write(content)
988+ # lets see if we can grab the file and compare the context, to avoid doing
989+ # a write.
990+ existing_content = None
991+ existing_uid, existing_gid, existing_perms = None, None, None
992+ try:
993+ with open(path, 'rb') as target:
994+ existing_content = target.read()
995+ stat = os.stat(path)
996+ existing_uid, existing_gid, existing_perms = (
997+ stat.st_uid, stat.st_gid, stat.st_mode
998+ )
999+ except Exception:
1000+ pass
1001+ if content != existing_content:
1002+ log("Writing file {} {}:{} {:o}".format(path, owner, group, perms),
1003+ level=DEBUG)
1004+ with open(path, 'wb') as target:
1005+ os.fchown(target.fileno(), uid, gid)
1006+ os.fchmod(target.fileno(), perms)
1007+ if six.PY3 and isinstance(content, six.string_types):
1008+ content = content.encode('UTF-8')
1009+ target.write(content)
1010+ return
1011+ # the contents were the same, but we might still need to change the
1012+ # ownership or permissions.
1013+ if existing_uid != uid:
1014+ log("Changing uid on already existing content: {} -> {}"
1015+ .format(existing_uid, uid), level=DEBUG)
1016+ os.chown(path, uid, -1)
1017+ if existing_gid != gid:
1018+ log("Changing gid on already existing content: {} -> {}"
1019+ .format(existing_gid, gid), level=DEBUG)
1020+ os.chown(path, -1, gid)
1021+ if existing_perms != perms:
1022+ log("Changing permissions on existing content: {} -> {}"
1023+ .format(existing_perms, perms), level=DEBUG)
1024+ os.chmod(path, perms)
1025
1026
1027 def fstab_remove(mp):
1028@@ -756,7 +838,7 @@
1029 ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1030 ip_output = (line.strip() for line in ip_output if line)
1031
1032- key = re.compile('^[0-9]+:\s+(.+):')
1033+ key = re.compile(r'^[0-9]+:\s+(.+):')
1034 for line in ip_output:
1035 matched = re.search(key, line)
1036 if matched:
1037@@ -901,6 +983,20 @@
1038
1039
1040 def add_to_updatedb_prunepath(path, updatedb_path=UPDATEDB_PATH):
1041+ """Adds the specified path to the mlocate's udpatedb.conf PRUNEPATH list.
1042+
1043+ This method has no effect if the path specified by updatedb_path does not
1044+ exist or is not a file.
1045+
1046+ @param path: string the path to add to the updatedb.conf PRUNEPATHS value
1047+ @param updatedb_path: the path the updatedb.conf file
1048+ """
1049+ if not os.path.exists(updatedb_path) or os.path.isdir(updatedb_path):
1050+ # If the updatedb.conf file doesn't exist then don't attempt to update
1051+ # the file as the package providing mlocate may not be installed on
1052+ # the local system
1053+ return
1054+
1055 with open(updatedb_path, 'r+') as f_id:
1056 updatedb_text = f_id.read()
1057 output = updatedb(updatedb_text, path)
1058@@ -920,3 +1016,62 @@
1059 lines[i] = 'PRUNEPATHS="{}"'.format(' '.join(paths))
1060 output = "\n".join(lines)
1061 return output
1062+
1063+
1064+def modulo_distribution(modulo=3, wait=30, non_zero_wait=False):
1065+ """ Modulo distribution
1066+
1067+ This helper uses the unit number, a modulo value and a constant wait time
1068+ to produce a calculated wait time distribution. This is useful in large
1069+ scale deployments to distribute load during an expensive operation such as
1070+ service restarts.
1071+
1072+ If you have 1000 nodes that need to restart 100 at a time 1 minute at a
1073+ time:
1074+
1075+ time.wait(modulo_distribution(modulo=100, wait=60))
1076+ restart()
1077+
1078+ If you need restarts to happen serially set modulo to the exact number of
1079+ nodes and set a high constant wait time:
1080+
1081+ time.wait(modulo_distribution(modulo=10, wait=120))
1082+ restart()
1083+
1084+ @param modulo: int The modulo number creates the group distribution
1085+ @param wait: int The constant time wait value
1086+ @param non_zero_wait: boolean Override unit % modulo == 0,
1087+ return modulo * wait. Used to avoid collisions with
1088+ leader nodes which are often given priority.
1089+ @return: int Calculated time to wait for unit operation
1090+ """
1091+ unit_number = int(local_unit().split('/')[1])
1092+ calculated_wait_time = (unit_number % modulo) * wait
1093+ if non_zero_wait and calculated_wait_time == 0:
1094+ return modulo * wait
1095+ else:
1096+ return calculated_wait_time
1097+
1098+
1099+def install_ca_cert(ca_cert, name=None):
1100+ """
1101+ Install the given cert as a trusted CA.
1102+
1103+ The ``name`` is the stem of the filename where the cert is written, and if
1104+ not provided, it will default to ``juju-{charm_name}``.
1105+
1106+ If the cert is empty or None, or is unchanged, nothing is done.
1107+ """
1108+ if not ca_cert:
1109+ return
1110+ if not isinstance(ca_cert, bytes):
1111+ ca_cert = ca_cert.encode('utf8')
1112+ if not name:
1113+ name = 'juju-{}'.format(charm_name())
1114+ cert_file = '/usr/local/share/ca-certificates/{}.crt'.format(name)
1115+ new_hash = hashlib.md5(ca_cert).hexdigest()
1116+ if file_hash(cert_file) == new_hash:
1117+ return
1118+ log("Installing new CA cert at: {}".format(cert_file), level=INFO)
1119+ write_file(cert_file, ca_cert)
1120+ subprocess.check_call(['update-ca-certificates', '--fresh'])
1121
1122=== modified file 'charmhelpers/core/host_factory/ubuntu.py'
1123--- charmhelpers/core/host_factory/ubuntu.py 2017-04-11 18:01:45 +0000
1124+++ charmhelpers/core/host_factory/ubuntu.py 2019-05-24 12:43:31 +0000
1125@@ -1,5 +1,6 @@
1126 import subprocess
1127
1128+from charmhelpers.core.hookenv import cached
1129 from charmhelpers.core.strutils import BasicStringComparator
1130
1131
1132@@ -19,6 +20,10 @@
1133 'xenial',
1134 'yakkety',
1135 'zesty',
1136+ 'artful',
1137+ 'bionic',
1138+ 'cosmic',
1139+ 'disco',
1140 )
1141
1142
1143@@ -69,6 +74,14 @@
1144 return d
1145
1146
1147+def get_distrib_codename():
1148+ """Return the codename of the distribution
1149+ :returns: The codename
1150+ :rtype: str
1151+ """
1152+ return lsb_release()['DISTRIB_CODENAME'].lower()
1153+
1154+
1155 def cmp_pkgrevno(package, revno, pkgcache=None):
1156 """Compare supplied revno with the revno of the installed package.
1157
1158@@ -86,3 +99,16 @@
1159 pkgcache = apt_cache()
1160 pkg = pkgcache[package]
1161 return apt_pkg.version_compare(pkg.current_ver.ver_str, revno)
1162+
1163+
1164+@cached
1165+def arch():
1166+ """Return the package architecture as a string.
1167+
1168+ :returns: the architecture
1169+ :rtype: str
1170+ :raises: subprocess.CalledProcessError if dpkg command fails
1171+ """
1172+ return subprocess.check_output(
1173+ ['dpkg', '--print-architecture']
1174+ ).rstrip().decode('UTF-8')
1175
1176=== modified file 'charmhelpers/core/kernel.py'
1177--- charmhelpers/core/kernel.py 2017-03-03 21:03:14 +0000
1178+++ charmhelpers/core/kernel.py 2019-05-24 12:43:31 +0000
1179@@ -26,12 +26,12 @@
1180
1181 __platform__ = get_platform()
1182 if __platform__ == "ubuntu":
1183- from charmhelpers.core.kernel_factory.ubuntu import (
1184+ from charmhelpers.core.kernel_factory.ubuntu import ( # NOQA:F401
1185 persistent_modprobe,
1186 update_initramfs,
1187 ) # flake8: noqa -- ignore F401 for this import
1188 elif __platform__ == "centos":
1189- from charmhelpers.core.kernel_factory.centos import (
1190+ from charmhelpers.core.kernel_factory.centos import ( # NOQA:F401
1191 persistent_modprobe,
1192 update_initramfs,
1193 ) # flake8: noqa -- ignore F401 for this import
1194
1195=== modified file 'charmhelpers/core/services/base.py'
1196--- charmhelpers/core/services/base.py 2017-03-03 21:03:14 +0000
1197+++ charmhelpers/core/services/base.py 2019-05-24 12:43:31 +0000
1198@@ -307,23 +307,34 @@
1199 """
1200 def __call__(self, manager, service_name, event_name):
1201 service = manager.get_service(service_name)
1202- new_ports = service.get('ports', [])
1203+ # turn this generator into a list,
1204+ # as we'll be going over it multiple times
1205+ new_ports = list(service.get('ports', []))
1206 port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name))
1207 if os.path.exists(port_file):
1208 with open(port_file) as fp:
1209 old_ports = fp.read().split(',')
1210 for old_port in old_ports:
1211- if bool(old_port):
1212- old_port = int(old_port)
1213- if old_port not in new_ports:
1214- hookenv.close_port(old_port)
1215+ if bool(old_port) and not self.ports_contains(old_port, new_ports):
1216+ hookenv.close_port(old_port)
1217 with open(port_file, 'w') as fp:
1218 fp.write(','.join(str(port) for port in new_ports))
1219 for port in new_ports:
1220+ # A port is either a number or 'ICMP'
1221+ protocol = 'TCP'
1222+ if str(port).upper() == 'ICMP':
1223+ protocol = 'ICMP'
1224 if event_name == 'start':
1225- hookenv.open_port(port)
1226+ hookenv.open_port(port, protocol)
1227 elif event_name == 'stop':
1228- hookenv.close_port(port)
1229+ hookenv.close_port(port, protocol)
1230+
1231+ def ports_contains(self, port, ports):
1232+ if not bool(port):
1233+ return False
1234+ if str(port).upper() != 'ICMP':
1235+ port = int(port)
1236+ return port in ports
1237
1238
1239 def service_stop(service_name):
1240
1241=== modified file 'charmhelpers/core/strutils.py'
1242--- charmhelpers/core/strutils.py 2017-04-11 18:01:45 +0000
1243+++ charmhelpers/core/strutils.py 2019-05-24 12:43:31 +0000
1244@@ -61,13 +61,19 @@
1245 if isinstance(value, six.string_types):
1246 value = six.text_type(value)
1247 else:
1248- msg = "Unable to interpret non-string value '%s' as boolean" % (value)
1249+ msg = "Unable to interpret non-string value '%s' as bytes" % (value)
1250 raise ValueError(msg)
1251 matches = re.match("([0-9]+)([a-zA-Z]+)", value)
1252- if not matches:
1253- msg = "Unable to interpret string value '%s' as bytes" % (value)
1254- raise ValueError(msg)
1255- return int(matches.group(1)) * (1024 ** BYTE_POWER[matches.group(2)])
1256+ if matches:
1257+ size = int(matches.group(1)) * (1024 ** BYTE_POWER[matches.group(2)])
1258+ else:
1259+ # Assume that value passed in is bytes
1260+ try:
1261+ size = int(value)
1262+ except ValueError:
1263+ msg = "Unable to interpret string value '%s' as bytes" % (value)
1264+ raise ValueError(msg)
1265+ return size
1266
1267
1268 class BasicStringComparator(object):
1269
1270=== modified file 'charmhelpers/core/sysctl.py'
1271--- charmhelpers/core/sysctl.py 2017-03-03 21:03:14 +0000
1272+++ charmhelpers/core/sysctl.py 2019-05-24 12:43:31 +0000
1273@@ -28,27 +28,38 @@
1274 __author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
1275
1276
1277-def create(sysctl_dict, sysctl_file):
1278+def create(sysctl_dict, sysctl_file, ignore=False):
1279 """Creates a sysctl.conf file from a YAML associative array
1280
1281- :param sysctl_dict: a YAML-formatted string of sysctl options eg "{ 'kernel.max_pid': 1337 }"
1282+ :param sysctl_dict: a dict or YAML-formatted string of sysctl
1283+ options eg "{ 'kernel.max_pid': 1337 }"
1284 :type sysctl_dict: str
1285 :param sysctl_file: path to the sysctl file to be saved
1286 :type sysctl_file: str or unicode
1287+ :param ignore: If True, ignore "unknown variable" errors.
1288+ :type ignore: bool
1289 :returns: None
1290 """
1291- try:
1292- sysctl_dict_parsed = yaml.safe_load(sysctl_dict)
1293- except yaml.YAMLError:
1294- log("Error parsing YAML sysctl_dict: {}".format(sysctl_dict),
1295- level=ERROR)
1296- return
1297+ if type(sysctl_dict) is not dict:
1298+ try:
1299+ sysctl_dict_parsed = yaml.safe_load(sysctl_dict)
1300+ except yaml.YAMLError:
1301+ log("Error parsing YAML sysctl_dict: {}".format(sysctl_dict),
1302+ level=ERROR)
1303+ return
1304+ else:
1305+ sysctl_dict_parsed = sysctl_dict
1306
1307 with open(sysctl_file, "w") as fd:
1308 for key, value in sysctl_dict_parsed.items():
1309 fd.write("{}={}\n".format(key, value))
1310
1311- log("Updating sysctl_file: %s values: %s" % (sysctl_file, sysctl_dict_parsed),
1312+ log("Updating sysctl_file: {} values: {}".format(sysctl_file,
1313+ sysctl_dict_parsed),
1314 level=DEBUG)
1315
1316- check_call(["sysctl", "-p", sysctl_file])
1317+ call = ["sysctl", "-p", sysctl_file]
1318+ if ignore:
1319+ call.append("-e")
1320+
1321+ check_call(call)
1322
1323=== modified file 'charmhelpers/core/templating.py'
1324--- charmhelpers/core/templating.py 2017-03-03 21:03:14 +0000
1325+++ charmhelpers/core/templating.py 2019-05-24 12:43:31 +0000
1326@@ -20,7 +20,8 @@
1327
1328
1329 def render(source, target, context, owner='root', group='root',
1330- perms=0o444, templates_dir=None, encoding='UTF-8', template_loader=None):
1331+ perms=0o444, templates_dir=None, encoding='UTF-8',
1332+ template_loader=None, config_template=None):
1333 """
1334 Render a template.
1335
1336@@ -32,6 +33,9 @@
1337 The context should be a dict containing the values to be replaced in the
1338 template.
1339
1340+ config_template may be provided to render from a provided template instead
1341+ of loading from a file.
1342+
1343 The `owner`, `group`, and `perms` options will be passed to `write_file`.
1344
1345 If omitted, `templates_dir` defaults to the `templates` folder in the charm.
1346@@ -65,14 +69,19 @@
1347 if templates_dir is None:
1348 templates_dir = os.path.join(hookenv.charm_dir(), 'templates')
1349 template_env = Environment(loader=FileSystemLoader(templates_dir))
1350- try:
1351- source = source
1352- template = template_env.get_template(source)
1353- except exceptions.TemplateNotFound as e:
1354- hookenv.log('Could not load template %s from %s.' %
1355- (source, templates_dir),
1356- level=hookenv.ERROR)
1357- raise e
1358+
1359+ # load from a string if provided explicitly
1360+ if config_template is not None:
1361+ template = template_env.from_string(config_template)
1362+ else:
1363+ try:
1364+ source = source
1365+ template = template_env.get_template(source)
1366+ except exceptions.TemplateNotFound as e:
1367+ hookenv.log('Could not load template %s from %s.' %
1368+ (source, templates_dir),
1369+ level=hookenv.ERROR)
1370+ raise e
1371 content = template.render(context)
1372 if target is not None:
1373 target_dir = os.path.dirname(target)
1374
1375=== modified file 'charmhelpers/core/unitdata.py'
1376--- charmhelpers/core/unitdata.py 2017-03-03 21:03:14 +0000
1377+++ charmhelpers/core/unitdata.py 2019-05-24 12:43:31 +0000
1378@@ -166,6 +166,10 @@
1379
1380 To support dicts, lists, integer, floats, and booleans values
1381 are automatically json encoded/decoded.
1382+
1383+ Note: to facilitate unit testing, ':memory:' can be passed as the
1384+ path parameter which causes sqlite3 to only build the db in memory.
1385+ This should only be used for testing purposes.
1386 """
1387 def __init__(self, path=None):
1388 self.db_path = path
1389@@ -175,6 +179,9 @@
1390 else:
1391 self.db_path = os.path.join(
1392 os.environ.get('CHARM_DIR', ''), '.unit-state.db')
1393+ if self.db_path != ':memory:':
1394+ with open(self.db_path, 'a') as f:
1395+ os.fchmod(f.fileno(), 0o600)
1396 self.conn = sqlite3.connect('%s' % self.db_path)
1397 self.cursor = self.conn.cursor()
1398 self.revision = None
1399@@ -358,7 +365,7 @@
1400 try:
1401 yield self.revision
1402 self.revision = None
1403- except:
1404+ except Exception:
1405 self.flush(False)
1406 self.revision = None
1407 raise
1408
1409=== modified file 'charmhelpers/fetch/__init__.py'
1410--- charmhelpers/fetch/__init__.py 2017-03-03 21:03:14 +0000
1411+++ charmhelpers/fetch/__init__.py 2019-05-24 12:43:31 +0000
1412@@ -48,6 +48,13 @@
1413 pass
1414
1415
1416+class GPGKeyError(Exception):
1417+ """Exception occurs when a GPG key cannot be fetched or used. The message
1418+ indicates what the problem is.
1419+ """
1420+ pass
1421+
1422+
1423 class BaseFetchHandler(object):
1424
1425 """Base class for FetchHandler implementations in fetch plugins"""
1426@@ -77,21 +84,24 @@
1427 fetch = importlib.import_module(module)
1428
1429 filter_installed_packages = fetch.filter_installed_packages
1430-install = fetch.install
1431-upgrade = fetch.upgrade
1432-update = fetch.update
1433-purge = fetch.purge
1434+filter_missing_packages = fetch.filter_missing_packages
1435+install = fetch.apt_install
1436+upgrade = fetch.apt_upgrade
1437+update = _fetch_update = fetch.apt_update
1438+purge = fetch.apt_purge
1439 add_source = fetch.add_source
1440
1441 if __platform__ == "ubuntu":
1442 apt_cache = fetch.apt_cache
1443- apt_install = fetch.install
1444- apt_update = fetch.update
1445- apt_upgrade = fetch.upgrade
1446- apt_purge = fetch.purge
1447+ apt_install = fetch.apt_install
1448+ apt_update = fetch.apt_update
1449+ apt_upgrade = fetch.apt_upgrade
1450+ apt_purge = fetch.apt_purge
1451+ apt_autoremove = fetch.apt_autoremove
1452 apt_mark = fetch.apt_mark
1453 apt_hold = fetch.apt_hold
1454 apt_unhold = fetch.apt_unhold
1455+ import_key = fetch.import_key
1456 get_upstream_version = fetch.get_upstream_version
1457 elif __platform__ == "centos":
1458 yum_search = fetch.yum_search
1459@@ -135,7 +145,7 @@
1460 for source, key in zip(sources, keys):
1461 add_source(source, key)
1462 if update:
1463- fetch.update(fatal=True)
1464+ _fetch_update(fatal=True)
1465
1466
1467 def install_remote(source, *args, **kwargs):
1468
1469=== modified file 'charmhelpers/fetch/archiveurl.py'
1470--- charmhelpers/fetch/archiveurl.py 2017-03-03 21:03:14 +0000
1471+++ charmhelpers/fetch/archiveurl.py 2019-05-24 12:43:31 +0000
1472@@ -89,7 +89,7 @@
1473 :param str source: URL pointing to an archive file.
1474 :param str dest: Local path location to download archive file to.
1475 """
1476- # propogate all exceptions
1477+ # propagate all exceptions
1478 # URLError, OSError, etc
1479 proto, netloc, path, params, query, fragment = urlparse(source)
1480 if proto in ('http', 'https'):
1481
1482=== modified file 'charmhelpers/fetch/bzrurl.py'
1483--- charmhelpers/fetch/bzrurl.py 2017-03-03 21:03:14 +0000
1484+++ charmhelpers/fetch/bzrurl.py 2019-05-24 12:43:31 +0000
1485@@ -13,7 +13,7 @@
1486 # limitations under the License.
1487
1488 import os
1489-from subprocess import check_call
1490+from subprocess import STDOUT, check_output
1491 from charmhelpers.fetch import (
1492 BaseFetchHandler,
1493 UnhandledSource,
1494@@ -55,7 +55,7 @@
1495 cmd = ['bzr', 'branch']
1496 cmd += cmd_opts
1497 cmd += [source, dest]
1498- check_call(cmd)
1499+ check_output(cmd, stderr=STDOUT)
1500
1501 def install(self, source, dest=None, revno=None):
1502 url_parts = self.parse_url(source)
1503
1504=== modified file 'charmhelpers/fetch/centos.py'
1505--- charmhelpers/fetch/centos.py 2017-03-03 21:03:14 +0000
1506+++ charmhelpers/fetch/centos.py 2019-05-24 12:43:31 +0000
1507@@ -132,7 +132,7 @@
1508 key_file.write(key)
1509 key_file.flush()
1510 key_file.seek(0)
1511- subprocess.check_call(['rpm', '--import', key_file])
1512+ subprocess.check_call(['rpm', '--import', key_file.name])
1513 else:
1514 subprocess.check_call(['rpm', '--import', key])
1515
1516
1517=== modified file 'charmhelpers/fetch/giturl.py'
1518--- charmhelpers/fetch/giturl.py 2017-03-03 21:03:14 +0000
1519+++ charmhelpers/fetch/giturl.py 2019-05-24 12:43:31 +0000
1520@@ -13,7 +13,7 @@
1521 # limitations under the License.
1522
1523 import os
1524-from subprocess import check_call, CalledProcessError
1525+from subprocess import check_output, CalledProcessError, STDOUT
1526 from charmhelpers.fetch import (
1527 BaseFetchHandler,
1528 UnhandledSource,
1529@@ -50,7 +50,7 @@
1530 cmd = ['git', 'clone', source, dest, '--branch', branch]
1531 if depth:
1532 cmd.extend(['--depth', depth])
1533- check_call(cmd)
1534+ check_output(cmd, stderr=STDOUT)
1535
1536 def install(self, source, branch="master", dest=None, depth=None):
1537 url_parts = self.parse_url(source)
1538
1539=== added directory 'charmhelpers/fetch/python'
1540=== added file 'charmhelpers/fetch/python/__init__.py'
1541--- charmhelpers/fetch/python/__init__.py 1970-01-01 00:00:00 +0000
1542+++ charmhelpers/fetch/python/__init__.py 2019-05-24 12:43:31 +0000
1543@@ -0,0 +1,13 @@
1544+# Copyright 2014-2019 Canonical Limited.
1545+#
1546+# Licensed under the Apache License, Version 2.0 (the "License");
1547+# you may not use this file except in compliance with the License.
1548+# You may obtain a copy of the License at
1549+#
1550+# http://www.apache.org/licenses/LICENSE-2.0
1551+#
1552+# Unless required by applicable law or agreed to in writing, software
1553+# distributed under the License is distributed on an "AS IS" BASIS,
1554+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1555+# See the License for the specific language governing permissions and
1556+# limitations under the License.
1557
1558=== added file 'charmhelpers/fetch/python/debug.py'
1559--- charmhelpers/fetch/python/debug.py 1970-01-01 00:00:00 +0000
1560+++ charmhelpers/fetch/python/debug.py 2019-05-24 12:43:31 +0000
1561@@ -0,0 +1,54 @@
1562+#!/usr/bin/env python
1563+# coding: utf-8
1564+
1565+# Copyright 2014-2015 Canonical Limited.
1566+#
1567+# Licensed under the Apache License, Version 2.0 (the "License");
1568+# you may not use this file except in compliance with the License.
1569+# You may obtain a copy of the License at
1570+#
1571+# http://www.apache.org/licenses/LICENSE-2.0
1572+#
1573+# Unless required by applicable law or agreed to in writing, software
1574+# distributed under the License is distributed on an "AS IS" BASIS,
1575+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1576+# See the License for the specific language governing permissions and
1577+# limitations under the License.
1578+
1579+from __future__ import print_function
1580+
1581+import atexit
1582+import sys
1583+
1584+from charmhelpers.fetch.python.rpdb import Rpdb
1585+from charmhelpers.core.hookenv import (
1586+ open_port,
1587+ close_port,
1588+ ERROR,
1589+ log
1590+)
1591+
1592+__author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
1593+
1594+DEFAULT_ADDR = "0.0.0.0"
1595+DEFAULT_PORT = 4444
1596+
1597+
1598+def _error(message):
1599+ log(message, level=ERROR)
1600+
1601+
1602+def set_trace(addr=DEFAULT_ADDR, port=DEFAULT_PORT):
1603+ """
1604+ Set a trace point using the remote debugger
1605+ """
1606+ atexit.register(close_port, port)
1607+ try:
1608+ log("Starting a remote python debugger session on %s:%s" % (addr,
1609+ port))
1610+ open_port(port)
1611+ debugger = Rpdb(addr=addr, port=port)
1612+ debugger.set_trace(sys._getframe().f_back)
1613+ except Exception:
1614+ _error("Cannot start a remote debug session on %s:%s" % (addr,
1615+ port))
1616
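The debug helper wires Rpdb into Juju's port management: set_trace() opens the chosen port with open_port(), starts the listener, and registers close_port() to run at interpreter exit. A hedged sketch of using it from a hook (the hook name is arbitrary):

    from charmhelpers.fetch.python.debug import set_trace

    def config_changed():
        # Pauses here until a TCP client attaches, e.g. `telnet <unit-address> 4444`
        set_trace(addr='0.0.0.0', port=4444)
        # ... remainder of the hook runs under pdb control
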
1617=== added file 'charmhelpers/fetch/python/packages.py'
1618--- charmhelpers/fetch/python/packages.py 1970-01-01 00:00:00 +0000
1619+++ charmhelpers/fetch/python/packages.py 2019-05-24 12:43:31 +0000
1620@@ -0,0 +1,154 @@
1621+#!/usr/bin/env python
1622+# coding: utf-8
1623+
1624+# Copyright 2014-2015 Canonical Limited.
1625+#
1626+# Licensed under the Apache License, Version 2.0 (the "License");
1627+# you may not use this file except in compliance with the License.
1628+# You may obtain a copy of the License at
1629+#
1630+# http://www.apache.org/licenses/LICENSE-2.0
1631+#
1632+# Unless required by applicable law or agreed to in writing, software
1633+# distributed under the License is distributed on an "AS IS" BASIS,
1634+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1635+# See the License for the specific language governing permissions and
1636+# limitations under the License.
1637+
1638+import os
1639+import six
1640+import subprocess
1641+import sys
1642+
1643+from charmhelpers.fetch import apt_install, apt_update
1644+from charmhelpers.core.hookenv import charm_dir, log
1645+
1646+__author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
1647+
1648+
1649+def pip_execute(*args, **kwargs):
1650+ """Overridden pip_execute() to stop sys.path being changed.
1651+
1652+ The act of importing main from the pip module seems to add wheels
1653+ from /usr/share/python-wheels, which are installed by various tools.
1654+ This function ensures that sys.path remains the same after the call is
1655+ executed.
1656+ """
1657+ try:
1658+ _path = sys.path
1659+ try:
1660+ from pip import main as _pip_execute
1661+ except ImportError:
1662+ apt_update()
1663+ if six.PY2:
1664+ apt_install('python-pip')
1665+ else:
1666+ apt_install('python3-pip')
1667+ from pip import main as _pip_execute
1668+ _pip_execute(*args, **kwargs)
1669+ finally:
1670+ sys.path = _path
1671+
1672+
1673+def parse_options(given, available):
1674+ """Given a set of options, check if available"""
1675+ for key, value in sorted(given.items()):
1676+ if not value:
1677+ continue
1678+ if key in available:
1679+ yield "--{0}={1}".format(key, value)
1680+
1681+
1682+def pip_install_requirements(requirements, constraints=None, **options):
1683+ """Install a requirements file.
1684+
1685+ :param constraints: Path to pip constraints file.
1686+ http://pip.readthedocs.org/en/stable/user_guide/#constraints-files
1687+ """
1688+ command = ["install"]
1689+
1690+ available_options = ('proxy', 'src', 'log', )
1691+ for option in parse_options(options, available_options):
1692+ command.append(option)
1693+
1694+ command.append("-r {0}".format(requirements))
1695+ if constraints:
1696+ command.append("-c {0}".format(constraints))
1697+ log("Installing from file: {} with constraints {} "
1698+ "and options: {}".format(requirements, constraints, command))
1699+ else:
1700+ log("Installing from file: {} with options: {}".format(requirements,
1701+ command))
1702+ pip_execute(command)
1703+
1704+
1705+def pip_install(package, fatal=False, upgrade=False, venv=None,
1706+ constraints=None, **options):
1707+ """Install a python package"""
1708+ if venv:
1709+ venv_python = os.path.join(venv, 'bin/pip')
1710+ command = [venv_python, "install"]
1711+ else:
1712+ command = ["install"]
1713+
1714+ available_options = ('proxy', 'src', 'log', 'index-url', )
1715+ for option in parse_options(options, available_options):
1716+ command.append(option)
1717+
1718+ if upgrade:
1719+ command.append('--upgrade')
1720+
1721+ if constraints:
1722+ command.extend(['-c', constraints])
1723+
1724+ if isinstance(package, list):
1725+ command.extend(package)
1726+ else:
1727+ command.append(package)
1728+
1729+ log("Installing {} package with options: {}".format(package,
1730+ command))
1731+ if venv:
1732+ subprocess.check_call(command)
1733+ else:
1734+ pip_execute(command)
1735+
1736+
1737+def pip_uninstall(package, **options):
1738+ """Uninstall a python package"""
1739+ command = ["uninstall", "-q", "-y"]
1740+
1741+ available_options = ('proxy', 'log', )
1742+ for option in parse_options(options, available_options):
1743+ command.append(option)
1744+
1745+ if isinstance(package, list):
1746+ command.extend(package)
1747+ else:
1748+ command.append(package)
1749+
1750+ log("Uninstalling {} package with options: {}".format(package,
1751+ command))
1752+ pip_execute(command)
1753+
1754+
1755+def pip_list():
1756+ """Returns the list of currently installed python packages
1757+ """
1758+ return pip_execute(["list"])
1759+
1760+
1761+def pip_create_virtualenv(path=None):
1762+ """Create an isolated Python environment."""
1763+ if six.PY2:
1764+ apt_install('python-virtualenv')
1765+ else:
1766+ apt_install('python3-virtualenv')
1767+
1768+ if path:
1769+ venv_path = path
1770+ else:
1771+ venv_path = os.path.join(charm_dir(), 'venv')
1772+
1773+ if not os.path.exists(venv_path):
1774+ subprocess.check_call(['virtualenv', venv_path])
1775
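A hedged sketch of the pip helpers above; the virtualenv path and package names are placeholders rather than anything this charm actually installs:

    from charmhelpers.fetch.python.packages import (
        pip_create_virtualenv,
        pip_install,
    )

    pip_create_virtualenv('/srv/example-venv')           # hypothetical path
    pip_install('requests', venv='/srv/example-venv')    # runs <venv>/bin/pip install
    pip_install(['flake8', 'pyyaml'], upgrade=True)      # goes through pip's main()
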
1776=== added file 'charmhelpers/fetch/python/rpdb.py'
1777--- charmhelpers/fetch/python/rpdb.py 1970-01-01 00:00:00 +0000
1778+++ charmhelpers/fetch/python/rpdb.py 2019-05-24 12:43:31 +0000
1779@@ -0,0 +1,56 @@
1780+# Copyright 2014-2015 Canonical Limited.
1781+#
1782+# Licensed under the Apache License, Version 2.0 (the "License");
1783+# you may not use this file except in compliance with the License.
1784+# You may obtain a copy of the License at
1785+#
1786+# http://www.apache.org/licenses/LICENSE-2.0
1787+#
1788+# Unless required by applicable law or agreed to in writing, software
1789+# distributed under the License is distributed on an "AS IS" BASIS,
1790+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1791+# See the License for the specific language governing permissions and
1792+# limitations under the License.
1793+
1794+"""Remote Python Debugger (pdb wrapper)."""
1795+
1796+import pdb
1797+import socket
1798+import sys
1799+
1800+__author__ = "Bertrand Janin <b@janin.com>"
1801+__version__ = "0.1.3"
1802+
1803+
1804+class Rpdb(pdb.Pdb):
1805+
1806+ def __init__(self, addr="127.0.0.1", port=4444):
1807+ """Initialize the socket and initialize pdb."""
1808+
1809+ # Backup stdin and stdout before replacing them by the socket handle
1810+ self.old_stdout = sys.stdout
1811+ self.old_stdin = sys.stdin
1812+
1813+ # Open a 'reusable' socket to let the webapp reload on the same port
1814+ self.skt = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
1815+ self.skt.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, True)
1816+ self.skt.bind((addr, port))
1817+ self.skt.listen(1)
1818+ (clientsocket, address) = self.skt.accept()
1819+ handle = clientsocket.makefile('rw')
1820+ pdb.Pdb.__init__(self, completekey='tab', stdin=handle, stdout=handle)
1821+ sys.stdout = sys.stdin = handle
1822+
1823+ def shutdown(self):
1824+ """Revert stdin and stdout, close the socket."""
1825+ sys.stdout = self.old_stdout
1826+ sys.stdin = self.old_stdin
1827+ self.skt.close()
1828+ self.set_continue()
1829+
1830+ def do_continue(self, arg):
1831+ """Stop all operation on ``continue``."""
1832+ self.shutdown()
1833+ return 1
1834+
1835+ do_EOF = do_quit = do_exit = do_c = do_cont = do_continue
1836
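Rpdb only swaps stdin/stdout for a socket file handle, so the session speaks plain pdb text over TCP and any client will do (telnet is the usual choice). A hypothetical client-side snippet, not shipped anywhere in this branch:

    import socket

    def attach_rpdb(host, port=4444):
        # Sends a single pdb command and prints the reply; sending `continue`
        # would trigger do_continue() on the server and end the session.
        with socket.create_connection((host, port), timeout=30) as sock:
            sock.sendall(b'where\n')
            print(sock.recv(4096).decode('utf-8', errors='replace'))
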
1837=== added file 'charmhelpers/fetch/python/version.py'
1838--- charmhelpers/fetch/python/version.py 1970-01-01 00:00:00 +0000
1839+++ charmhelpers/fetch/python/version.py 2019-05-24 12:43:31 +0000
1840@@ -0,0 +1,32 @@
1841+#!/usr/bin/env python
1842+# coding: utf-8
1843+
1844+# Copyright 2014-2015 Canonical Limited.
1845+#
1846+# Licensed under the Apache License, Version 2.0 (the "License");
1847+# you may not use this file except in compliance with the License.
1848+# You may obtain a copy of the License at
1849+#
1850+# http://www.apache.org/licenses/LICENSE-2.0
1851+#
1852+# Unless required by applicable law or agreed to in writing, software
1853+# distributed under the License is distributed on an "AS IS" BASIS,
1854+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1855+# See the License for the specific language governing permissions and
1856+# limitations under the License.
1857+
1858+import sys
1859+
1860+__author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
1861+
1862+
1863+def current_version():
1864+ """Current system python version"""
1865+ return sys.version_info
1866+
1867+
1868+def current_version_string():
1869+ """Current system python version as string major.minor.micro"""
1870+ return "{0}.{1}.{2}".format(sys.version_info.major,
1871+ sys.version_info.minor,
1872+ sys.version_info.micro)
1873
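The version helpers are thin wrappers over sys.version_info; typical usage would look something like:

    from charmhelpers.fetch.python.version import (
        current_version,
        current_version_string,
    )

    if current_version() >= (3, 6):
        print('running on Python %s' % current_version_string())
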
1874=== modified file 'charmhelpers/fetch/snap.py'
1875--- charmhelpers/fetch/snap.py 2017-03-03 21:03:14 +0000
1876+++ charmhelpers/fetch/snap.py 2019-05-24 12:43:31 +0000
1877@@ -18,21 +18,33 @@
1878 https://lists.ubuntu.com/archives/snapcraft/2016-September/001114.html
1879 """
1880 import subprocess
1881-from os import environ
1882+import os
1883 from time import sleep
1884 from charmhelpers.core.hookenv import log
1885
1886 __author__ = 'Joseph Borg <joseph.borg@canonical.com>'
1887
1888-SNAP_NO_LOCK = 1 # The return code for "couldn't acquire lock" in Snap (hopefully this will be improved).
1889+# The return code for "couldn't acquire lock" in Snap
1890+# (hopefully this will be improved).
1891+SNAP_NO_LOCK = 1
1892 SNAP_NO_LOCK_RETRY_DELAY = 10 # Wait X seconds between Snap lock checks.
1893 SNAP_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times.
1894+SNAP_CHANNELS = [
1895+ 'edge',
1896+ 'beta',
1897+ 'candidate',
1898+ 'stable',
1899+]
1900
1901
1902 class CouldNotAcquireLockException(Exception):
1903 pass
1904
1905
1906+class InvalidSnapChannel(Exception):
1907+ pass
1908+
1909+
1910 def _snap_exec(commands):
1911 """
1912 Execute snap commands.
1913@@ -47,13 +59,17 @@
1914
1915 while return_code is None or return_code == SNAP_NO_LOCK:
1916 try:
1917- return_code = subprocess.check_call(['snap'] + commands, env=environ)
1918+ return_code = subprocess.check_call(['snap'] + commands,
1919+ env=os.environ)
1920 except subprocess.CalledProcessError as e:
1921 retry_count += + 1
1922 if retry_count > SNAP_NO_LOCK_RETRY_COUNT:
1923- raise CouldNotAcquireLockException('Could not aquire lock after %s attempts' % SNAP_NO_LOCK_RETRY_COUNT)
1924+ raise CouldNotAcquireLockException(
1925+ 'Could not acquire lock after {} attempts'
1926+ .format(SNAP_NO_LOCK_RETRY_COUNT))
1927 return_code = e.returncode
1928- log('Snap failed to acquire lock, trying again in %s seconds.' % SNAP_NO_LOCK_RETRY_DELAY, level='WARN')
1929+ log('Snap failed to acquire lock, trying again in {} seconds.'
1930+ .format(SNAP_NO_LOCK_RETRY_DELAY), level='WARN')
1931 sleep(SNAP_NO_LOCK_RETRY_DELAY)
1932
1933 return return_code
1934@@ -120,3 +136,15 @@
1935
1936 log(message, level='INFO')
1937 return _snap_exec(['refresh'] + flags + packages)
1938+
1939+
1940+def valid_snap_channel(channel):
1941+ """ Validate snap channel exists
1942+
1943+ :raises InvalidSnapChannel: When channel does not exist
1944+ :return: Boolean
1945+ """
1946+ if channel.lower() in SNAP_CHANNELS:
1947+ return True
1948+ else:
1949+ raise InvalidSnapChannel("Invalid Snap Channel: {}".format(channel))
1950
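valid_snap_channel() either returns True or raises, so callers are expected to wrap it; a small sketch (the channel names are just examples):

    from charmhelpers.fetch.snap import InvalidSnapChannel, valid_snap_channel

    for channel in ('candidate', 'nightly'):
        try:
            valid_snap_channel(channel)   # True for edge/beta/candidate/stable
        except InvalidSnapChannel as err:
            print(err)                    # "Invalid Snap Channel: nightly"
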
1951=== modified file 'charmhelpers/fetch/ubuntu.py'
1952--- charmhelpers/fetch/ubuntu.py 2017-03-03 21:03:14 +0000
1953+++ charmhelpers/fetch/ubuntu.py 2019-05-24 12:43:31 +0000
1954@@ -12,29 +12,48 @@
1955 # See the License for the specific language governing permissions and
1956 # limitations under the License.
1957
1958+from collections import OrderedDict
1959 import os
1960+import platform
1961+import re
1962 import six
1963 import time
1964 import subprocess
1965
1966-from tempfile import NamedTemporaryFile
1967-from charmhelpers.core.host import (
1968- lsb_release
1969+from charmhelpers.core.host import get_distrib_codename
1970+
1971+from charmhelpers.core.hookenv import (
1972+ log,
1973+ DEBUG,
1974+ WARNING,
1975+ env_proxy_settings,
1976 )
1977-from charmhelpers.core.hookenv import log
1978-from charmhelpers.fetch import SourceConfigError
1979+from charmhelpers.fetch import SourceConfigError, GPGKeyError
1980
1981+PROPOSED_POCKET = (
1982+ "# Proposed\n"
1983+ "deb http://archive.ubuntu.com/ubuntu {}-proposed main universe "
1984+ "multiverse restricted\n")
1985+PROPOSED_PORTS_POCKET = (
1986+ "# Proposed\n"
1987+ "deb http://ports.ubuntu.com/ubuntu-ports {}-proposed main universe "
1988+ "multiverse restricted\n")
1989+# Only supports 64bit and ppc64 at the moment.
1990+ARCH_TO_PROPOSED_POCKET = {
1991+ 'x86_64': PROPOSED_POCKET,
1992+ 'ppc64le': PROPOSED_PORTS_POCKET,
1993+ 'aarch64': PROPOSED_PORTS_POCKET,
1994+ 's390x': PROPOSED_PORTS_POCKET,
1995+}
1996+CLOUD_ARCHIVE_URL = "http://ubuntu-cloud.archive.canonical.com/ubuntu"
1997+CLOUD_ARCHIVE_KEY_ID = '5EDB1B62EC4926EA'
1998 CLOUD_ARCHIVE = """# Ubuntu Cloud Archive
1999 deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
2000 """
2001-
2002-PROPOSED_POCKET = """# Proposed
2003-deb http://archive.ubuntu.com/ubuntu {}-proposed main universe multiverse restricted
2004-"""
2005-
2006 CLOUD_ARCHIVE_POCKETS = {
2007 # Folsom
2008 'folsom': 'precise-updates/folsom',
2009+ 'folsom/updates': 'precise-updates/folsom',
2010 'precise-folsom': 'precise-updates/folsom',
2011 'precise-folsom/updates': 'precise-updates/folsom',
2012 'precise-updates/folsom': 'precise-updates/folsom',
2013@@ -43,6 +62,7 @@
2014 'precise-proposed/folsom': 'precise-proposed/folsom',
2015 # Grizzly
2016 'grizzly': 'precise-updates/grizzly',
2017+ 'grizzly/updates': 'precise-updates/grizzly',
2018 'precise-grizzly': 'precise-updates/grizzly',
2019 'precise-grizzly/updates': 'precise-updates/grizzly',
2020 'precise-updates/grizzly': 'precise-updates/grizzly',
2021@@ -51,6 +71,7 @@
2022 'precise-proposed/grizzly': 'precise-proposed/grizzly',
2023 # Havana
2024 'havana': 'precise-updates/havana',
2025+ 'havana/updates': 'precise-updates/havana',
2026 'precise-havana': 'precise-updates/havana',
2027 'precise-havana/updates': 'precise-updates/havana',
2028 'precise-updates/havana': 'precise-updates/havana',
2029@@ -59,6 +80,7 @@
2030 'precise-proposed/havana': 'precise-proposed/havana',
2031 # Icehouse
2032 'icehouse': 'precise-updates/icehouse',
2033+ 'icehouse/updates': 'precise-updates/icehouse',
2034 'precise-icehouse': 'precise-updates/icehouse',
2035 'precise-icehouse/updates': 'precise-updates/icehouse',
2036 'precise-updates/icehouse': 'precise-updates/icehouse',
2037@@ -67,6 +89,7 @@
2038 'precise-proposed/icehouse': 'precise-proposed/icehouse',
2039 # Juno
2040 'juno': 'trusty-updates/juno',
2041+ 'juno/updates': 'trusty-updates/juno',
2042 'trusty-juno': 'trusty-updates/juno',
2043 'trusty-juno/updates': 'trusty-updates/juno',
2044 'trusty-updates/juno': 'trusty-updates/juno',
2045@@ -75,6 +98,7 @@
2046 'trusty-proposed/juno': 'trusty-proposed/juno',
2047 # Kilo
2048 'kilo': 'trusty-updates/kilo',
2049+ 'kilo/updates': 'trusty-updates/kilo',
2050 'trusty-kilo': 'trusty-updates/kilo',
2051 'trusty-kilo/updates': 'trusty-updates/kilo',
2052 'trusty-updates/kilo': 'trusty-updates/kilo',
2053@@ -83,6 +107,7 @@
2054 'trusty-proposed/kilo': 'trusty-proposed/kilo',
2055 # Liberty
2056 'liberty': 'trusty-updates/liberty',
2057+ 'liberty/updates': 'trusty-updates/liberty',
2058 'trusty-liberty': 'trusty-updates/liberty',
2059 'trusty-liberty/updates': 'trusty-updates/liberty',
2060 'trusty-updates/liberty': 'trusty-updates/liberty',
2061@@ -91,6 +116,7 @@
2062 'trusty-proposed/liberty': 'trusty-proposed/liberty',
2063 # Mitaka
2064 'mitaka': 'trusty-updates/mitaka',
2065+ 'mitaka/updates': 'trusty-updates/mitaka',
2066 'trusty-mitaka': 'trusty-updates/mitaka',
2067 'trusty-mitaka/updates': 'trusty-updates/mitaka',
2068 'trusty-updates/mitaka': 'trusty-updates/mitaka',
2069@@ -99,6 +125,7 @@
2070 'trusty-proposed/mitaka': 'trusty-proposed/mitaka',
2071 # Newton
2072 'newton': 'xenial-updates/newton',
2073+ 'newton/updates': 'xenial-updates/newton',
2074 'xenial-newton': 'xenial-updates/newton',
2075 'xenial-newton/updates': 'xenial-updates/newton',
2076 'xenial-updates/newton': 'xenial-updates/newton',
2077@@ -107,17 +134,51 @@
2078 'xenial-proposed/newton': 'xenial-proposed/newton',
2079 # Ocata
2080 'ocata': 'xenial-updates/ocata',
2081+ 'ocata/updates': 'xenial-updates/ocata',
2082 'xenial-ocata': 'xenial-updates/ocata',
2083 'xenial-ocata/updates': 'xenial-updates/ocata',
2084 'xenial-updates/ocata': 'xenial-updates/ocata',
2085 'ocata/proposed': 'xenial-proposed/ocata',
2086 'xenial-ocata/proposed': 'xenial-proposed/ocata',
2087- 'xenial-ocata/newton': 'xenial-proposed/ocata',
2088+ 'xenial-proposed/ocata': 'xenial-proposed/ocata',
2089+ # Pike
2090+ 'pike': 'xenial-updates/pike',
2091+ 'xenial-pike': 'xenial-updates/pike',
2092+ 'xenial-pike/updates': 'xenial-updates/pike',
2093+ 'xenial-updates/pike': 'xenial-updates/pike',
2094+ 'pike/proposed': 'xenial-proposed/pike',
2095+ 'xenial-pike/proposed': 'xenial-proposed/pike',
2096+ 'xenial-proposed/pike': 'xenial-proposed/pike',
2097+ # Queens
2098+ 'queens': 'xenial-updates/queens',
2099+ 'xenial-queens': 'xenial-updates/queens',
2100+ 'xenial-queens/updates': 'xenial-updates/queens',
2101+ 'xenial-updates/queens': 'xenial-updates/queens',
2102+ 'queens/proposed': 'xenial-proposed/queens',
2103+ 'xenial-queens/proposed': 'xenial-proposed/queens',
2104+ 'xenial-proposed/queens': 'xenial-proposed/queens',
2105+ # Rocky
2106+ 'rocky': 'bionic-updates/rocky',
2107+ 'bionic-rocky': 'bionic-updates/rocky',
2108+ 'bionic-rocky/updates': 'bionic-updates/rocky',
2109+ 'bionic-updates/rocky': 'bionic-updates/rocky',
2110+ 'rocky/proposed': 'bionic-proposed/rocky',
2111+ 'bionic-rocky/proposed': 'bionic-proposed/rocky',
2112+ 'bionic-proposed/rocky': 'bionic-proposed/rocky',
2113+ # Stein
2114+ 'stein': 'bionic-updates/stein',
2115+ 'bionic-stein': 'bionic-updates/stein',
2116+ 'bionic-stein/updates': 'bionic-updates/stein',
2117+ 'bionic-updates/stein': 'bionic-updates/stein',
2118+ 'stein/proposed': 'bionic-proposed/stein',
2119+ 'bionic-stein/proposed': 'bionic-proposed/stein',
2120+ 'bionic-proposed/stein': 'bionic-proposed/stein',
2121 }
2122
2123+
2124 APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT.
2125 CMD_RETRY_DELAY = 10 # Wait 10 seconds between command retries.
2126-CMD_RETRY_COUNT = 30 # Retry a failing fatal command X times.
2127+CMD_RETRY_COUNT = 3 # Retry a failing fatal command X times.
2128
2129
2130 def filter_installed_packages(packages):
2131@@ -135,6 +196,18 @@
2132 return _pkgs
2133
2134
2135+def filter_missing_packages(packages):
2136+ """Return a list of packages that are installed.
2137+
2138+ :param packages: list of packages to evaluate.
2139+ :returns list: Packages that are installed.
2140+ """
2141+ return list(
2142+ set(packages) -
2143+ set(filter_installed_packages(packages))
2144+ )
2145+
2146+
2147 def apt_cache(in_memory=True, progress=None):
2148 """Build and return an apt cache."""
2149 from apt import apt_pkg
2150@@ -145,7 +218,7 @@
2151 return apt_pkg.Cache(progress)
2152
2153
2154-def install(packages, options=None, fatal=False):
2155+def apt_install(packages, options=None, fatal=False):
2156 """Install one or more packages."""
2157 if options is None:
2158 options = ['--option=Dpkg::Options::=--force-confold']
2159@@ -162,7 +235,7 @@
2160 _run_apt_command(cmd, fatal)
2161
2162
2163-def upgrade(options=None, fatal=False, dist=False):
2164+def apt_upgrade(options=None, fatal=False, dist=False):
2165 """Upgrade all packages."""
2166 if options is None:
2167 options = ['--option=Dpkg::Options::=--force-confold']
2168@@ -177,13 +250,13 @@
2169 _run_apt_command(cmd, fatal)
2170
2171
2172-def update(fatal=False):
2173+def apt_update(fatal=False):
2174 """Update local apt cache."""
2175 cmd = ['apt-get', 'update']
2176 _run_apt_command(cmd, fatal)
2177
2178
2179-def purge(packages, fatal=False):
2180+def apt_purge(packages, fatal=False):
2181 """Purge one or more packages."""
2182 cmd = ['apt-get', '--assume-yes', 'purge']
2183 if isinstance(packages, six.string_types):
2184@@ -194,6 +267,14 @@
2185 _run_apt_command(cmd, fatal)
2186
2187
2188+def apt_autoremove(purge=True, fatal=False):
2189+ """Remove packages that are no longer required (apt-get autoremove)."""
2190+ cmd = ['apt-get', '--assume-yes', 'autoremove']
2191+ if purge:
2192+ cmd.append('--purge')
2193+ _run_apt_command(cmd, fatal)
2194+
2195+
2196 def apt_mark(packages, mark, fatal=False):
2197 """Flag one or more packages using apt-mark."""
2198 log("Marking {} as {}".format(packages, mark))
2199@@ -217,7 +298,159 @@
2200 return apt_mark(packages, 'unhold', fatal=fatal)
2201
2202
2203-def add_source(source, key=None):
2204+def import_key(key):
2205+ """Import an ASCII Armor key.
2206+
2207+ A Radix64 format keyid is also supported for backwards
2208+ compatibility. In this case Ubuntu keyserver will be
2209+ queried for a key via HTTPS by its keyid. This method
2210+ is less preferable because https proxy servers may
2211+ require traffic decryption which is equivalent to a
2212+ man-in-the-middle attack (a proxy server impersonates
2213+ keyserver TLS certificates and has to be explicitly
2214+ trusted by the system).
2215+
2216+ :param key: A GPG key in ASCII armor format,
2217+ including BEGIN and END markers or a keyid.
2218+ :type key: (bytes, str)
2219+ :raises: GPGKeyError if the key could not be imported
2220+ """
2221+ key = key.strip()
2222+ if '-' in key or '\n' in key:
2223+ # Send everything not obviously a keyid to GPG to import, as
2224+ # we trust its validation better than our own. eg. handling
2225+ # comments before the key.
2226+ log("PGP key found (looks like ASCII Armor format)", level=DEBUG)
2227+ if ('-----BEGIN PGP PUBLIC KEY BLOCK-----' in key and
2228+ '-----END PGP PUBLIC KEY BLOCK-----' in key):
2229+ log("Writing provided PGP key in the binary format", level=DEBUG)
2230+ if six.PY3:
2231+ key_bytes = key.encode('utf-8')
2232+ else:
2233+ key_bytes = key
2234+ key_name = _get_keyid_by_gpg_key(key_bytes)
2235+ key_gpg = _dearmor_gpg_key(key_bytes)
2236+ _write_apt_gpg_keyfile(key_name=key_name, key_material=key_gpg)
2237+ else:
2238+ raise GPGKeyError("ASCII armor markers missing from GPG key")
2239+ else:
2240+ log("PGP key found (looks like Radix64 format)", level=WARNING)
2241+ log("SECURELY importing PGP key from keyserver; "
2242+ "full key not provided.", level=WARNING)
2243+ # as of bionic add-apt-repository uses curl with an HTTPS keyserver URL
2244+ # to retrieve GPG keys. `apt-key adv` command is deprecated as is
2245+ # apt-key in general as noted in its manpage. See lp:1433761 for more
2246+ # history. Instead, /etc/apt/trusted.gpg.d is used directly to drop
2247+ # the dearmored gpg key.
2248+ key_asc = _get_key_by_keyid(key)
2249+ # write the key in GPG format so that apt-key list shows it
2250+ key_gpg = _dearmor_gpg_key(key_asc)
2251+ _write_apt_gpg_keyfile(key_name=key, key_material=key_gpg)
2252+
2253+
2254+def _get_keyid_by_gpg_key(key_material):
2255+ """Get a GPG key fingerprint by GPG key material.
2256+ Gets a GPG key fingerprint (40-digit, 160-bit) by the ASCII armor-encoded
2257+ or binary GPG key material. Can be used, for example, to generate file
2258+ names for keys passed via charm options.
2259+
2260+ :param key_material: ASCII armor-encoded or binary GPG key material
2261+ :type key_material: bytes
2262+ :raises: GPGKeyError if invalid key material has been provided
2263+ :returns: A GPG key fingerprint
2264+ :rtype: str
2265+ """
2266+ # Use the same gpg command for both Xenial and Bionic
2267+ cmd = 'gpg --with-colons --with-fingerprint'
2268+ ps = subprocess.Popen(cmd.split(),
2269+ stdout=subprocess.PIPE,
2270+ stderr=subprocess.PIPE,
2271+ stdin=subprocess.PIPE)
2272+ out, err = ps.communicate(input=key_material)
2273+ if six.PY3:
2274+ out = out.decode('utf-8')
2275+ err = err.decode('utf-8')
2276+ if 'gpg: no valid OpenPGP data found.' in err:
2277+ raise GPGKeyError('Invalid GPG key material provided')
2278+ # from gnupg2 docs: fpr :: Fingerprint (fingerprint is in field 10)
2279+ return re.search(r"^fpr:{9}([0-9A-F]{40}):$", out, re.MULTILINE).group(1)
2280+
2281+
2282+def _get_key_by_keyid(keyid):
2283+ """Get a key via HTTPS from the Ubuntu keyserver.
2284+ Different key ID formats are supported by SKS keyservers (the longer ones
2285+ are more secure, see "dead beef attack" and https://evil32.com/). Since
2286+ HTTPS is used, if SSLBump-like HTTPS proxies are in place, they will
2287+ impersonate keyserver.ubuntu.com and generate a certificate with
2288+ keyserver.ubuntu.com in the CN field or in SubjAltName fields of a
2289+ certificate. If such proxy behavior is expected it is necessary to add the
2290+ CA certificate chain containing the intermediate CA of the SSLBump proxy to
2291+ every machine that this code runs on via ca-certs cloud-init directive (via
2292+ cloudinit-userdata model-config) or via other means (such as through a
2293+ custom charm option). Also note that DNS resolution for the hostname in a
2294+ URL is done at a proxy server - not at the client side.
2295+
2296+ 8-digit (32 bit) key ID
2297+ https://keyserver.ubuntu.com/pks/lookup?search=0x4652B4E6
2298+ 16-digit (64 bit) key ID
2299+ https://keyserver.ubuntu.com/pks/lookup?search=0x6E85A86E4652B4E6
2300+ 40-digit key ID:
2301+ https://keyserver.ubuntu.com/pks/lookup?search=0x35F77D63B5CEC106C577ED856E85A86E4652B4E6
2302+
2303+ :param keyid: An 8, 16 or 40 hex digit keyid to find a key for
2304+ :type keyid: (bytes, str)
2305+ :returns: A key material for the specified GPG key id
2306+ :rtype: (str, bytes)
2307+ :raises: subprocess.CalledProcessError
2308+ """
2309+ # options=mr - machine-readable output (disables html wrappers)
2310+ keyserver_url = ('https://keyserver.ubuntu.com'
2311+ '/pks/lookup?op=get&options=mr&exact=on&search=0x{}')
2312+ curl_cmd = ['curl', keyserver_url.format(keyid)]
2313+ # use proxy server settings in order to retrieve the key
2314+ return subprocess.check_output(curl_cmd,
2315+ env=env_proxy_settings(['https']))
2316+
2317+
2318+def _dearmor_gpg_key(key_asc):
2319+ """Converts a GPG key in the ASCII armor format to the binary format.
2320+
2321+ :param key_asc: A GPG key in ASCII armor format.
2322+ :type key_asc: (str, bytes)
2323+ :returns: A GPG key in binary format
2324+ :rtype: (str, bytes)
2325+ :raises: GPGKeyError
2326+ """
2327+ ps = subprocess.Popen(['gpg', '--dearmor'],
2328+ stdout=subprocess.PIPE,
2329+ stderr=subprocess.PIPE,
2330+ stdin=subprocess.PIPE)
2331+ out, err = ps.communicate(input=key_asc)
2332+ # no need to decode output as it is binary (invalid utf-8), only error
2333+ if six.PY3:
2334+ err = err.decode('utf-8')
2335+ if 'gpg: no valid OpenPGP data found.' in err:
2336+ raise GPGKeyError('Invalid GPG key material. Check your network setup'
2337+ ' (MTU, routing, DNS) and/or proxy server settings'
2338+ ' as well as destination keyserver status.')
2339+ else:
2340+ return out
2341+
2342+
2343+def _write_apt_gpg_keyfile(key_name, key_material):
2344+ """Writes GPG key material into a file at a provided path.
2345+
2346+ :param key_name: A key name to use for a key file (could be a fingerprint)
2347+ :type key_name: str
2348+ :param key_material: A GPG key material (binary)
2349+ :type key_material: (str, bytes)
2350+ """
2351+ with open('/etc/apt/trusted.gpg.d/{}.gpg'.format(key_name),
2352+ 'wb') as keyf:
2353+ keyf.write(key_material)
2354+
2355+
2356+def add_source(source, key=None, fail_invalid=False):
2357 """Add a package source to this system.
2358
2359 @param source: a URL or sources.list entry, as supported by
2360@@ -233,6 +466,33 @@
2361 such as 'cloud:icehouse'
2362 'distro' may be used as a noop
2363
2364+ Full list of source specifications supported by the function are:
2365+
2366+ 'distro': A NOP; i.e. it has no effect.
2367+ 'proposed': the proposed deb spec [2] is written to
2368+ /etc/apt/sources.list.d/proposed.list
2369+ 'distro-proposed': adds <version>-proposed to the debs [2]
2370+ 'ppa:<ppa-name>': add-apt-repository --yes <ppa_name>
2371+ 'deb <deb-spec>': add-apt-repository --yes deb <deb-spec>
2372+ 'http://....': add-apt-repository --yes http://...
2373+ 'cloud-archive:<spec>': add-apt-repository --yes cloud-archive:<spec>
2374+ 'cloud:<release>[-staging]': specify a Cloud Archive pocket <release> with
2375+ optional staging version. If staging is used then the staging PPA [2]
2376+ will be used. If staging is NOT used then the cloud archive [3] will be
2377+ added, and the 'ubuntu-cloud-keyring' package will be added for the
2378+ current distro.
2379+
2380+ Otherwise the source is not recognised and this is logged to the juju log.
2381+ However, no error is raised unless fail_invalid is True.
2382+
2383+ [1] deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
2384+ where {} is replaced with the derived pocket name.
2385+ [2] deb http://archive.ubuntu.com/ubuntu {}-proposed \
2386+ main universe multiverse restricted
2387+ where {} is replaced with the lsb_release codename (e.g. xenial)
2388+ [3] deb http://ubuntu-cloud.archive.canonical.com/ubuntu <pocket>
2389+ to /etc/apt/sources.list.d/cloud-archive-list
2390+
2391 @param key: A key to be added to the system's APT keyring and used
2392 to verify the signatures on packages. Ideally, this should be an
2393 ASCII format GPG public key including the block headers. A GPG key
2394@@ -240,51 +500,152 @@
2395 available to retrieve the actual public key from a public keyserver
2396 placing your Juju environment at risk. ppa and cloud archive keys
2397 are securely added automtically, so sould not be provided.
2398 are securely added automatically, so should not be provided.
2399+ @param fail_invalid: (boolean) if True, then the function raises a
2400+ SourceConfigError if there is no matching installation source.
2401+
2402+ @raises SourceConfigError() if for cloud:<pocket>, the <pocket> is not a
2403+ valid pocket in CLOUD_ARCHIVE_POCKETS
2404 """
2405+ _mapping = OrderedDict([
2406+ (r"^distro$", lambda: None), # This is a NOP
2407+ (r"^(?:proposed|distro-proposed)$", _add_proposed),
2408+ (r"^cloud-archive:(.*)$", _add_apt_repository),
2409+ (r"^((?:deb |http:|https:|ppa:).*)$", _add_apt_repository),
2410+ (r"^cloud:(.*)-(.*)\/staging$", _add_cloud_staging),
2411+ (r"^cloud:(.*)-(.*)$", _add_cloud_distro_check),
2412+ (r"^cloud:(.*)$", _add_cloud_pocket),
2413+ (r"^snap:.*-(.*)-(.*)$", _add_cloud_distro_check),
2414+ ])
2415 if source is None:
2416- log('Source is not present. Skipping')
2417- return
2418-
2419- if (source.startswith('ppa:') or
2420- source.startswith('http') or
2421- source.startswith('deb ') or
2422- source.startswith('cloud-archive:')):
2423- cmd = ['add-apt-repository', '--yes', source]
2424- _run_with_retries(cmd)
2425- elif source.startswith('cloud:'):
2426- install(filter_installed_packages(['ubuntu-cloud-keyring']),
2427+ source = ''
2428+ for r, fn in six.iteritems(_mapping):
2429+ m = re.match(r, source)
2430+ if m:
2431+ if key:
2432+ # Import key before adding the source which depends on it,
2433+ # as refreshing packages could fail otherwise.
2434+ try:
2435+ import_key(key)
2436+ except GPGKeyError as e:
2437+ raise SourceConfigError(str(e))
2438+ # call the associated function with the captured groups
2439+ # raises SourceConfigError on error.
2440+ fn(*m.groups())
2441+ break
2442+ else:
2443+ # nothing matched. log an error and maybe sys.exit
2444+ err = "Unknown source: {!r}".format(source)
2445+ log(err)
2446+ if fail_invalid:
2447+ raise SourceConfigError(err)
2448+
2449+
2450+def _add_proposed():
2451+ """Add the PROPOSED_POCKET as /etc/apt/sources.list.d/proposed.list
2452+
2453+ Uses get_distrib_codename to determine the correct stanza for
2454+ the deb line.
2455+
2456+ For Intel architectures PROPOSED_POCKET is used for the release, but for
2457+ other architectures PROPOSED_PORTS_POCKET is used for the release.
2458+ """
2459+ release = get_distrib_codename()
2460+ arch = platform.machine()
2461+ if arch not in six.iterkeys(ARCH_TO_PROPOSED_POCKET):
2462+ raise SourceConfigError("Arch {} not supported for (distro-)proposed"
2463+ .format(arch))
2464+ with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
2465+ apt.write(ARCH_TO_PROPOSED_POCKET[arch].format(release))
2466+
2467+
2468+def _add_apt_repository(spec):
2469+ """Add the spec using add_apt_repository
2470+
2471+ :param spec: the parameter to pass to add_apt_repository
2472+ :type spec: str
2473+ """
2474+ if '{series}' in spec:
2475+ series = get_distrib_codename()
2476+ spec = spec.replace('{series}', series)
2477+ # software-properties package for bionic properly reacts to proxy settings
2478+ # passed as environment variables (See lp:1433761). This is not the case
2479+ # for LTS and non-LTS releases below bionic.
2480+ _run_with_retries(['add-apt-repository', '--yes', spec],
2481+ cmd_env=env_proxy_settings(['https']))
2482+
2483+
2484+def _add_cloud_pocket(pocket):
2485+ """Add a cloud pocket as /etc/apt/sources.list.d/cloud-archive.list
2486+
2487+ Note that this overwrites the existing file if there is one.
2488+
2489+ This function also converts the simple pocket into the actual pocket using
2490+ the CLOUD_ARCHIVE_POCKETS mapping.
2491+
2492+ :param pocket: string representing the pocket to add a deb spec for.
2493+ :raises: SourceConfigError if the cloud pocket doesn't exist or the
2494+ requested release doesn't match the current distro version.
2495+ """
2496+ apt_install(filter_installed_packages(['ubuntu-cloud-keyring']),
2497 fatal=True)
2498- pocket = source.split(':')[-1]
2499- if pocket not in CLOUD_ARCHIVE_POCKETS:
2500- raise SourceConfigError(
2501- 'Unsupported cloud: source option %s' %
2502- pocket)
2503- actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
2504- with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
2505- apt.write(CLOUD_ARCHIVE.format(actual_pocket))
2506- elif source == 'proposed':
2507- release = lsb_release()['DISTRIB_CODENAME']
2508- with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
2509- apt.write(PROPOSED_POCKET.format(release))
2510- elif source == 'distro':
2511- pass
2512- else:
2513- log("Unknown source: {!r}".format(source))
2514-
2515- if key:
2516- if '-----BEGIN PGP PUBLIC KEY BLOCK-----' in key:
2517- with NamedTemporaryFile('w+') as key_file:
2518- key_file.write(key)
2519- key_file.flush()
2520- key_file.seek(0)
2521- subprocess.check_call(['apt-key', 'add', '-'], stdin=key_file)
2522- else:
2523- # Note that hkp: is in no way a secure protocol. Using a
2524- # GPG key id is pointless from a security POV unless you
2525- # absolutely trust your network and DNS.
2526- subprocess.check_call(['apt-key', 'adv', '--keyserver',
2527- 'hkp://keyserver.ubuntu.com:80', '--recv',
2528- key])
2529+ if pocket not in CLOUD_ARCHIVE_POCKETS:
2530+ raise SourceConfigError(
2531+ 'Unsupported cloud: source option %s' %
2532+ pocket)
2533+ actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
2534+ with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
2535+ apt.write(CLOUD_ARCHIVE.format(actual_pocket))
2536+
2537+
2538+def _add_cloud_staging(cloud_archive_release, openstack_release):
2539+ """Add the cloud staging repository which is in
2540+ ppa:ubuntu-cloud-archive/<openstack_release>-staging
2541+
2542+ This function checks that the cloud_archive_release matches the current
2543+ codename for the distro that charm is being installed on.
2544+
2545+ :param cloud_archive_release: string, codename for the release.
2546+ :param openstack_release: String, codename for the openstack release.
2547+ :raises: SourceConfigError if the cloud_archive_release doesn't match the
2548+ current version of the os.
2549+ """
2550+ _verify_is_ubuntu_rel(cloud_archive_release, openstack_release)
2551+ ppa = 'ppa:ubuntu-cloud-archive/{}-staging'.format(openstack_release)
2552+ cmd = 'add-apt-repository -y {}'.format(ppa)
2553+ _run_with_retries(cmd.split(' '))
2554+
2555+
2556+def _add_cloud_distro_check(cloud_archive_release, openstack_release):
2557+ """Add the cloud pocket, but also check the cloud_archive_release against
2558+ the current distro, and use the openstack_release as the full lookup.
2559+
2560+ This just calls _add_cloud_pocket() with the openstack_release as pocket
2561+ to get the correct cloud-archive.list for dpkg to work with.
2562+
2563+ :param cloud_archive_release: String, codename for the distro release.
2564+ :param openstack_release: String, spec for the release to look up in the
2565+ CLOUD_ARCHIVE_POCKETS
2566+ :raises: SourceConfigError if this is the wrong distro, or the pocket spec
2567+ doesn't exist.
2568+ """
2569+ _verify_is_ubuntu_rel(cloud_archive_release, openstack_release)
2570+ _add_cloud_pocket("{}-{}".format(cloud_archive_release, openstack_release))
2571+
2572+
2573+def _verify_is_ubuntu_rel(release, os_release):
2574+ """Verify that the release is the same as the current Ubuntu release.
2575+
2576+ :param release: String, lowercase for the release.
2577+ :param os_release: String, the os_release being asked for
2578+ :raises: SourceConfigError if the release is not the same as the ubuntu
2579+ release.
2580+ """
2581+ ubuntu_rel = get_distrib_codename()
2582+ if release != ubuntu_rel:
2583+ raise SourceConfigError(
2584+ 'Invalid Cloud Archive release specified: {}-{} on this Ubuntu'
2585+ ' version ({})'.format(release, os_release, ubuntu_rel))
2586
2587
2588 def _run_with_retries(cmd, max_retries=CMD_RETRY_COUNT, retry_exitcodes=(1,),
2589@@ -300,9 +661,12 @@
2590 :param: cmd_env: dict: Environment variables to add to the command run.
2591 """
2592
2593- env = os.environ.copy()
2594+ env = None
2595+ kwargs = {}
2596 if cmd_env:
2597+ env = os.environ.copy()
2598 env.update(cmd_env)
2599+ kwargs['env'] = env
2600
2601 if not retry_message:
2602 retry_message = "Failed executing '{}'".format(" ".join(cmd))
2603@@ -314,7 +678,8 @@
2604 retry_results = (None,) + retry_exitcodes
2605 while result in retry_results:
2606 try:
2607- result = subprocess.check_call(cmd, env=env)
2608+ # result = subprocess.check_call(cmd, env=env)
2609+ result = subprocess.check_call(cmd, **kwargs)
2610 except subprocess.CalledProcessError as e:
2611 retry_count = retry_count + 1
2612 if retry_count > max_retries:
2613@@ -327,6 +692,7 @@
2614 def _run_apt_command(cmd, fatal=False):
2615 """Run an apt command with optional retries.
2616
2617+ :param: cmd: str: The apt command to run.
2618 :param: fatal: bool: Whether the command's output should be checked and
2619 retried.
2620 """
2621@@ -353,7 +719,7 @@
2622 cache = apt_cache()
2623 try:
2624 pkg = cache[package]
2625- except:
2626+ except Exception:
2627 # the package is unknown to the current apt cache.
2628 return None
2629
2630
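The substance of this branch is that add_source() now imports the key, via import_key(), before the source entry is written, so the very next apt refresh can already verify the repository. A hedged sketch of driving it directly; the deb line is hypothetical, and the keyid is the 64-bit example from the _get_key_by_keyid() docstring above:

    from charmhelpers.fetch.ubuntu import add_source, apt_update

    # A bare keyid is fetched from keyserver.ubuntu.com over HTTPS and written
    # to /etc/apt/trusted.gpg.d/; an ASCII-armored key block is dearmored
    # locally instead, avoiding the keyserver round trip.
    add_source('deb https://example.com/apt bionic main',
               key='6E85A86E4652B4E6')
    apt_update(fatal=True)
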
2631=== modified file 'dev/charm_helpers_sync.py'
2632--- dev/charm_helpers_sync.py 2015-01-28 08:59:02 +0000
2633+++ dev/charm_helpers_sync.py 2019-05-24 12:43:31 +0000
2634@@ -2,19 +2,17 @@
2635
2636 # Copyright 2014-2015 Canonical Limited.
2637 #
2638-# This file is part of charm-helpers.
2639-#
2640-# charm-helpers is free software: you can redistribute it and/or modify
2641-# it under the terms of the GNU Lesser General Public License version 3 as
2642-# published by the Free Software Foundation.
2643-#
2644-# charm-helpers is distributed in the hope that it will be useful,
2645-# but WITHOUT ANY WARRANTY; without even the implied warranty of
2646-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
2647-# GNU Lesser General Public License for more details.
2648-#
2649-# You should have received a copy of the GNU Lesser General Public License
2650-# along with charm-helpers. If not, see <http://www.gnu.org/licenses/>.
2651+# Licensed under the Apache License, Version 2.0 (the "License");
2652+# you may not use this file except in compliance with the License.
2653+# You may obtain a copy of the License at
2654+#
2655+# http://www.apache.org/licenses/LICENSE-2.0
2656+#
2657+# Unless required by applicable law or agreed to in writing, software
2658+# distributed under the License is distributed on an "AS IS" BASIS,
2659+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
2660+# See the License for the specific language governing permissions and
2661+# limitations under the License.
2662
2663 # Authors:
2664 # Adam Gandelman <adamg@ubuntu.com>
2665@@ -31,7 +29,7 @@
2666
2667 import six
2668
2669-CHARM_HELPERS_BRANCH = 'lp:charm-helpers'
2670+CHARM_HELPERS_REPO = 'https://github.com/juju/charm-helpers'
2671
2672
2673 def parse_config(conf_file):
2674@@ -41,10 +39,16 @@
2675 return yaml.load(open(conf_file).read())
2676
2677
2678-def clone_helpers(work_dir, branch):
2679+def clone_helpers(work_dir, repo):
2680 dest = os.path.join(work_dir, 'charm-helpers')
2681- logging.info('Checking out %s to %s.' % (branch, dest))
2682- cmd = ['bzr', 'checkout', '--lightweight', branch, dest]
2683+ logging.info('Cloning out %s to %s.' % (repo, dest))
2684+ branch = None
2685+ if '@' in repo:
2686+ repo, branch = repo.split('@', 1)
2687+ cmd = ['git', 'clone', '--depth=1']
2688+ if branch is not None:
2689+ cmd += ['--branch', branch]
2690+ cmd += [repo, dest]
2691 subprocess.check_call(cmd)
2692 return dest
2693
2694@@ -176,6 +180,9 @@
2695
2696
2697 def sync_helpers(include, src, dest, options=None):
2698+ if os.path.exists(dest):
2699+ logging.debug('Removing existing directory: %s' % dest)
2700+ shutil.rmtree(dest)
2701 if not os.path.isdir(dest):
2702 os.makedirs(dest)
2703
2704@@ -193,14 +200,15 @@
2705 inc, opts = extract_options(m, global_options)
2706 sync(src, dest, '%s.%s' % (k, inc), opts)
2707
2708+
2709 if __name__ == '__main__':
2710 parser = optparse.OptionParser()
2711 parser.add_option('-c', '--config', action='store', dest='config',
2712 default=None, help='helper config file')
2713 parser.add_option('-D', '--debug', action='store_true', dest='debug',
2714 default=False, help='debug')
2715- parser.add_option('-b', '--branch', action='store', dest='branch',
2716- help='charm-helpers bzr branch (overrides config)')
2717+ parser.add_option('-r', '--repository', action='store', dest='repo',
2718+ help='charm-helpers git repository (overrides config)')
2719 parser.add_option('-d', '--destination', action='store', dest='dest_dir',
2720 help='sync destination dir (overrides config)')
2721 (opts, args) = parser.parse_args()
2722@@ -219,10 +227,10 @@
2723 else:
2724 config = {}
2725
2726- if 'branch' not in config:
2727- config['branch'] = CHARM_HELPERS_BRANCH
2728- if opts.branch:
2729- config['branch'] = opts.branch
2730+ if 'repo' not in config:
2731+ config['repo'] = CHARM_HELPERS_REPO
2732+ if opts.repo:
2733+ config['repo'] = opts.repo
2734 if opts.dest_dir:
2735 config['destination'] = opts.dest_dir
2736
2737@@ -242,7 +250,7 @@
2738 sync_options = config['options']
2739 tmpd = tempfile.mkdtemp()
2740 try:
2741- checkout = clone_helpers(tmpd, config['branch'])
2742+ checkout = clone_helpers(tmpd, config['repo'])
2743 sync_helpers(config['include'], checkout, config['destination'],
2744 options=sync_options)
2745 except Exception as e:
