Merge lp:~simpoir/landscape-charm/charmhelpers-and-keys into lp:~landscape/landscape-charm/trunk

Proposed by Simon Poirier
Status: Merged
Approved by: Simon Poirier
Approved revision: 397
Merged at revision: 397
Proposed branch: lp:~simpoir/landscape-charm/charmhelpers-and-keys
Merge into: lp:~landscape/landscape-charm/trunk
Diff against target: 2745 lines (+1661/-199)
26 files modified
Makefile (+1/-2)
charmhelpers/__init__.py (+65/-4)
charmhelpers/contrib/hahelpers/apache.py (+5/-14)
charmhelpers/contrib/hahelpers/cluster.py (+43/-0)
charmhelpers/core/hookenv.py (+450/-28)
charmhelpers/core/host.py (+166/-11)
charmhelpers/core/host_factory/ubuntu.py (+26/-0)
charmhelpers/core/kernel.py (+2/-2)
charmhelpers/core/services/base.py (+18/-7)
charmhelpers/core/strutils.py (+11/-5)
charmhelpers/core/sysctl.py (+21/-10)
charmhelpers/core/templating.py (+18/-9)
charmhelpers/core/unitdata.py (+8/-1)
charmhelpers/fetch/__init__.py (+19/-9)
charmhelpers/fetch/archiveurl.py (+1/-1)
charmhelpers/fetch/bzrurl.py (+2/-2)
charmhelpers/fetch/centos.py (+1/-1)
charmhelpers/fetch/giturl.py (+2/-2)
charmhelpers/fetch/python/__init__.py (+13/-0)
charmhelpers/fetch/python/debug.py (+54/-0)
charmhelpers/fetch/python/packages.py (+154/-0)
charmhelpers/fetch/python/rpdb.py (+56/-0)
charmhelpers/fetch/python/version.py (+32/-0)
charmhelpers/fetch/snap.py (+33/-5)
charmhelpers/fetch/ubuntu.py (+428/-62)
dev/charm_helpers_sync.py (+32/-24)
To merge this branch: bzr merge lp:~simpoir/landscape-charm/charmhelpers-and-keys
Reviewer Review Type Date Requested Status
🤖 Landscape Builder test results Approve
Adam Collard (community) Approve
Review via email: mp+367881@code.launchpad.net

Commit message

This branch updates charm-helpers and adds the fix proposed in
https://github.com/juju/charm-helpers/pull/326

This should fix apt failures when specifying a deb source and key.

Description of the change

This branch updates charm-helpers and adds the fix proposed in
https://github.com/juju/charm-helpers/pull/326

This should fix apt failures when specifying a deb source and key.

Testing instructions:

juju deploy . --config install_sources='["deb http://ppa.launchpad.net/landscape/18.03/ubuntu bionic main"]' --config install_keys='["-----BEGIN PGP PUBLIC KEY BLOCK-----\\nVersion: SKS 1.1.6\\nComment: Hostname: keyserver.ubuntu.com\\n\\nmI0ESXN/egEEAOgRYISU9dnQm4BB5ZEEwKT+NKUDNd/DhMYdtBMw9Yk7S5cyoqpbtwoPJVzK\\nAXxq+ng5e3yYypSv98pLMr5UF09FGaeyGlD4s1uaVFWkFCO4jsTg7pWIY6qzO/jMxB5+Yu/G\\n0GjWQMNKxFk0oHMa0PhNBZtdPacVz65mOVmCsh/lABEBAAG0G0xhdW5jaHBhZCBQUEEgZm9y\\nIExhbmRzY2FwZYi2BBMBAgAgBQJJc396AhsDBgsJCAcDAgQVAggDBBYCAwECHgECF4AACgkQ\\nboWobkZStOb+rwP+ONKUWeX+MTIPqGWkknBPV7jm8nyyIUojC4IhS+9YR6GYnn0hMABSkEHm\\nIV73feKmrT2GESYI1UdYeKiOkWsPN/JyBk+eTvKet0qsw5TluqiHSW+LEi/+zUyrS3dDMX3o\\nyaLgYa+UkjIyxnaKLkQuCiS+D+fYwnJulIkhaKObtdE=\\n=UwRd\\n-----END PGP PUBLIC KEY BLOCK-----"]'

juju debug-log

Check the log for apt refresh failures; there should be none. Packages should install and the charm will then block waiting on relations. The previous behaviour was a series of failed update/install attempts that eventually succeeded after many retries.
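For reference, the code path being tested boils down to the charm-helpers fetch helpers this branch updates: each configured source/key pair is passed to add_source() and the apt cache is then refreshed. A minimal sketch of that sequence in Python; the source and key literals are placeholders trimmed from the deploy command above, and reading them from the install_sources/install_keys config options is assumed rather than shown:

from charmhelpers.fetch import add_source, apt_update

# Placeholder values; the charm takes these from its install_sources and
# install_keys config options (see the deploy command above).
source = 'deb http://ppa.launchpad.net/landscape/18.03/ubuntu bionic main'
key = ('-----BEGIN PGP PUBLIC KEY BLOCK-----\n'
       '...\n'
       '-----END PGP PUBLIC KEY BLOCK-----')

# Register the repository together with its ASCII-armoured key, then refresh
# the apt cache. With the charm-helpers update in this branch, this sequence
# is expected to succeed on the first attempt instead of erroring on the key.
add_source(source, key)
apt_update(fatal=True)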

Revision history for this message
Adam Collard (adam-collard) wrote :

+1

review: Approve
Revision history for this message
🤖 Landscape Builder (landscape-builder) wrote :

Voting does not meet specified criteria. Required: Approve >= 2, Disapprove == 0. Got: 1 Approve.

Revision history for this message
🤖 Landscape Builder (landscape-builder) wrote :

No approved revision specified.

Revision history for this message
🤖 Landscape Builder (landscape-builder) :
review: Abstain (executing tests)
Revision history for this message
🤖 Landscape Builder (landscape-builder) wrote :

Command: make ci-test
Result: Success
Revno: 397
Branch: lp:~simpoir/landscape-charm/charmhelpers-and-keys
Jenkins: https://ci.lscape.net/job/latch-test-xenial/3942/

review: Approve (test results)

Preview Diff

=== modified file 'Makefile'
--- Makefile 2019-01-17 14:23:55 +0000
+++ Makefile 2019-05-24 12:43:31 +0000
@@ -88,8 +88,7 @@
 
 dev/charm_helpers_sync.py:
 	@mkdir -p dev
-	@bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
-		> dev/charm_helpers_sync.py
+	@curl https://git.launchpad.net/charm-helpers/plain/tools/charm_helpers_sync/charm_helpers_sync.py > dev/charm_helpers_sync.py
 
 sync: dev/charm_helpers_sync.py
 	$(PYTHON) dev/charm_helpers_sync.py -c charm-helpers.yaml
 
=== modified file 'charmhelpers/__init__.py'
--- charmhelpers/__init__.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/__init__.py 2019-05-24 12:43:31 +0000
@@ -14,23 +14,84 @@
1414
15# Bootstrap charm-helpers, installing its dependencies if necessary using15# Bootstrap charm-helpers, installing its dependencies if necessary using
16# only standard libraries.16# only standard libraries.
17from __future__ import print_function
18from __future__ import absolute_import
19
20import functools
21import inspect
17import subprocess22import subprocess
18import sys23import sys
1924
20try:25try:
21 import six # flake8: noqa26 import six # NOQA:F401
22except ImportError:27except ImportError:
23 if sys.version_info.major == 2:28 if sys.version_info.major == 2:
24 subprocess.check_call(['apt-get', 'install', '-y', 'python-six'])29 subprocess.check_call(['apt-get', 'install', '-y', 'python-six'])
25 else:30 else:
26 subprocess.check_call(['apt-get', 'install', '-y', 'python3-six'])31 subprocess.check_call(['apt-get', 'install', '-y', 'python3-six'])
27 import six # flake8: noqa32 import six # NOQA:F401
2833
29try:34try:
30 import yaml # flake8: noqa35 import yaml # NOQA:F401
31except ImportError:36except ImportError:
32 if sys.version_info.major == 2:37 if sys.version_info.major == 2:
33 subprocess.check_call(['apt-get', 'install', '-y', 'python-yaml'])38 subprocess.check_call(['apt-get', 'install', '-y', 'python-yaml'])
34 else:39 else:
35 subprocess.check_call(['apt-get', 'install', '-y', 'python3-yaml'])40 subprocess.check_call(['apt-get', 'install', '-y', 'python3-yaml'])
36 import yaml # flake8: noqa41 import yaml # NOQA:F401
42
43
44# Holds a list of mapping of mangled function names that have been deprecated
45# using the @deprecate decorator below. This is so that the warning is only
46# printed once for each usage of the function.
47__deprecated_functions = {}
48
49
50def deprecate(warning, date=None, log=None):
51 """Add a deprecation warning the first time the function is used.
52 The date, which is a string in semi-ISO8660 format indicate the year-month
53 that the function is officially going to be removed.
54
55 usage:
56
57 @deprecate('use core/fetch/add_source() instead', '2017-04')
58 def contributed_add_source_thing(...):
59 ...
60
61 And it then prints to the log ONCE that the function is deprecated.
62 The reason for passing the logging function (log) is so that hookenv.log
63 can be used for a charm if needed.
64
65 :param warning: String to indicat where it has moved ot.
66 :param date: optional sting, in YYYY-MM format to indicate when the
67 function will definitely (probably) be removed.
68 :param log: The log function to call to log. If not, logs to stdout
69 """
70 def wrap(f):
71
72 @functools.wraps(f)
73 def wrapped_f(*args, **kwargs):
74 try:
75 module = inspect.getmodule(f)
76 file = inspect.getsourcefile(f)
77 lines = inspect.getsourcelines(f)
78 f_name = "{}-{}-{}..{}-{}".format(
79 module.__name__, file, lines[0], lines[-1], f.__name__)
80 except (IOError, TypeError):
81 # assume it was local, so just use the name of the function
82 f_name = f.__name__
83 if f_name not in __deprecated_functions:
84 __deprecated_functions[f_name] = True
85 s = "DEPRECATION WARNING: Function {} is being removed".format(
86 f.__name__)
87 if date:
88 s = "{} on/around {}".format(s, date)
89 if warning:
90 s = "{} : {}".format(s, warning)
91 if log:
92 log(s)
93 else:
94 print(s)
95 return f(*args, **kwargs)
96 return wrapped_f
97 return wrap
3798
=== modified file 'charmhelpers/contrib/hahelpers/apache.py'
--- charmhelpers/contrib/hahelpers/apache.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/contrib/hahelpers/apache.py 2019-05-24 12:43:31 +0000
@@ -23,8 +23,8 @@
 #
 
 import os
-import subprocess
 
+from charmhelpers.core import host
 from charmhelpers.core.hookenv import (
     config as config_get,
     relation_get,
@@ -65,7 +65,8 @@
     if ca_cert is None:
         log("Inspecting identity-service relations for CA SSL certificate.",
             level=INFO)
-        for r_id in relation_ids('identity-service'):
+        for r_id in (relation_ids('identity-service') +
+                     relation_ids('identity-credentials')):
             for unit in relation_list(r_id):
                 if ca_cert is None:
                     ca_cert = relation_get('ca_cert',
@@ -76,20 +77,10 @@
 def retrieve_ca_cert(cert_file):
     cert = None
     if os.path.isfile(cert_file):
-        with open(cert_file, 'r') as crt:
+        with open(cert_file, 'rb') as crt:
             cert = crt.read()
     return cert
 
 
 def install_ca_cert(ca_cert):
-    if ca_cert:
-        cert_file = ('/usr/local/share/ca-certificates/'
-                     'keystone_juju_ca_cert.crt')
-        old_cert = retrieve_ca_cert(cert_file)
-        if old_cert and old_cert == ca_cert:
-            log("CA cert is the same as installed version", level=INFO)
-        else:
-            log("Installing new CA cert", level=INFO)
-            with open(cert_file, 'w') as crt:
-                crt.write(ca_cert)
-            subprocess.check_call(['update-ca-certificates', '--fresh'])
+    host.install_ca_cert(ca_cert, 'keystone_juju_ca_cert')
 
=== modified file 'charmhelpers/contrib/hahelpers/cluster.py'
--- charmhelpers/contrib/hahelpers/cluster.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/contrib/hahelpers/cluster.py 2019-05-24 12:43:31 +0000
@@ -27,6 +27,7 @@
 
 import subprocess
 import os
+import time
 
 from socket import gethostname as get_unit_hostname
 
@@ -45,6 +46,9 @@
     is_leader as juju_is_leader,
     status_set,
 )
+from charmhelpers.core.host import (
+    modulo_distribution,
+)
 from charmhelpers.core.decorators import (
     retry_on_exception,
 )
@@ -219,6 +223,11 @@
         return True
     if config_get('ssl_cert') and config_get('ssl_key'):
         return True
+    for r_id in relation_ids('certificates'):
+        for unit in relation_list(r_id):
+            ca = relation_get('ca', rid=r_id, unit=unit)
+            if ca:
+                return True
     for r_id in relation_ids('identity-service'):
         for unit in relation_list(r_id):
             # TODO - needs fixing for new helper as ssl_cert/key suffixes with CN
@@ -361,3 +370,37 @@
     else:
         addr = unit_get('private-address')
     return '%s://%s' % (scheme, addr)
+
+
+def distributed_wait(modulo=None, wait=None, operation_name='operation'):
+    ''' Distribute operations by waiting based on modulo_distribution
+
+    If modulo and or wait are not set, check config_get for those values.
+    If config values are not set, default to modulo=3 and wait=30.
+
+    :param modulo: int The modulo number creates the group distribution
+    :param wait: int The constant time wait value
+    :param operation_name: string Operation name for status message
+                           i.e.  'restart'
+    :side effect: Calls config_get()
+    :side effect: Calls log()
+    :side effect: Calls status_set()
+    :side effect: Calls time.sleep()
+    '''
+    if modulo is None:
+        modulo = config_get('modulo-nodes') or 3
+    if wait is None:
+        wait = config_get('known-wait') or 30
+    if juju_is_leader():
+        # The leader should never wait
+        calculated_wait = 0
+    else:
+        # non_zero_wait=True guarantees the non-leader who gets modulo 0
+        # will still wait
+        calculated_wait = modulo_distribution(modulo=modulo, wait=wait,
+                                              non_zero_wait=True)
+    msg = "Waiting {} seconds for {} ...".format(calculated_wait,
+                                                 operation_name)
+    log(msg, DEBUG)
+    status_set('maintenance', msg)
+    time.sleep(calculated_wait)
 
=== modified file 'charmhelpers/core/hookenv.py'
--- charmhelpers/core/hookenv.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/core/hookenv.py 2019-05-24 12:43:31 +0000
@@ -22,10 +22,12 @@
22import copy22import copy
23from distutils.version import LooseVersion23from distutils.version import LooseVersion
24from functools import wraps24from functools import wraps
25from collections import namedtuple
25import glob26import glob
26import os27import os
27import json28import json
28import yaml29import yaml
30import re
29import subprocess31import subprocess
30import sys32import sys
31import errno33import errno
@@ -38,12 +40,20 @@
38else:40else:
39 from collections import UserDict41 from collections import UserDict
4042
43
41CRITICAL = "CRITICAL"44CRITICAL = "CRITICAL"
42ERROR = "ERROR"45ERROR = "ERROR"
43WARNING = "WARNING"46WARNING = "WARNING"
44INFO = "INFO"47INFO = "INFO"
45DEBUG = "DEBUG"48DEBUG = "DEBUG"
49TRACE = "TRACE"
46MARKER = object()50MARKER = object()
51SH_MAX_ARG = 131071
52
53
54RANGE_WARNING = ('Passing NO_PROXY string that includes a cidr. '
55 'This may not be compatible with software you are '
56 'running in your shell.')
4757
48cache = {}58cache = {}
4959
@@ -64,7 +74,7 @@
64 @wraps(func)74 @wraps(func)
65 def wrapper(*args, **kwargs):75 def wrapper(*args, **kwargs):
66 global cache76 global cache
67 key = str((func, args, kwargs))77 key = json.dumps((func, args, kwargs), sort_keys=True, default=str)
68 try:78 try:
69 return cache[key]79 return cache[key]
70 except KeyError:80 except KeyError:
@@ -94,7 +104,7 @@
94 command += ['-l', level]104 command += ['-l', level]
95 if not isinstance(message, six.string_types):105 if not isinstance(message, six.string_types):
96 message = repr(message)106 message = repr(message)
97 command += [message]107 command += [message[:SH_MAX_ARG]]
98 # Missing juju-log should not cause failures in unit tests108 # Missing juju-log should not cause failures in unit tests
99 # Send log output to stderr109 # Send log output to stderr
100 try:110 try:
@@ -197,9 +207,56 @@
197 return os.environ.get('JUJU_REMOTE_UNIT', None)207 return os.environ.get('JUJU_REMOTE_UNIT', None)
198208
199209
210def application_name():
211 """
212 The name of the deployed application this unit belongs to.
213 """
214 return local_unit().split('/')[0]
215
216
200def service_name():217def service_name():
201 """The name service group this unit belongs to"""218 """
202 return local_unit().split('/')[0]219 .. deprecated:: 0.19.1
220 Alias for :func:`application_name`.
221 """
222 return application_name()
223
224
225def model_name():
226 """
227 Name of the model that this unit is deployed in.
228 """
229 return os.environ['JUJU_MODEL_NAME']
230
231
232def model_uuid():
233 """
234 UUID of the model that this unit is deployed in.
235 """
236 return os.environ['JUJU_MODEL_UUID']
237
238
239def principal_unit():
240 """Returns the principal unit of this unit, otherwise None"""
241 # Juju 2.2 and above provides JUJU_PRINCIPAL_UNIT
242 principal_unit = os.environ.get('JUJU_PRINCIPAL_UNIT', None)
243 # If it's empty, then this unit is the principal
244 if principal_unit == '':
245 return os.environ['JUJU_UNIT_NAME']
246 elif principal_unit is not None:
247 return principal_unit
248 # For Juju 2.1 and below, let's try work out the principle unit by
249 # the various charms' metadata.yaml.
250 for reltype in relation_types():
251 for rid in relation_ids(reltype):
252 for unit in related_units(rid):
253 md = _metadata_unit(unit)
254 if not md:
255 continue
256 subordinate = md.pop('subordinate', None)
257 if not subordinate:
258 return unit
259 return None
203260
204261
205@cached262@cached
@@ -263,7 +320,7 @@
263 self.implicit_save = True320 self.implicit_save = True
264 self._prev_dict = None321 self._prev_dict = None
265 self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME)322 self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME)
266 if os.path.exists(self.path):323 if os.path.exists(self.path) and os.stat(self.path).st_size:
267 self.load_previous()324 self.load_previous()
268 atexit(self._implicit_save)325 atexit(self._implicit_save)
269326
@@ -283,7 +340,11 @@
283 """340 """
284 self.path = path or self.path341 self.path = path or self.path
285 with open(self.path) as f:342 with open(self.path) as f:
286 self._prev_dict = json.load(f)343 try:
344 self._prev_dict = json.load(f)
345 except ValueError as e:
346 log('Unable to parse previous config data - {}'.format(str(e)),
347 level=ERROR)
287 for k, v in copy.deepcopy(self._prev_dict).items():348 for k, v in copy.deepcopy(self._prev_dict).items():
288 if k not in self:349 if k not in self:
289 self[k] = v350 self[k] = v
@@ -319,6 +380,7 @@
319380
320 """381 """
321 with open(self.path, 'w') as f:382 with open(self.path, 'w') as f:
383 os.fchmod(f.fileno(), 0o600)
322 json.dump(self, f)384 json.dump(self, f)
323385
324 def _implicit_save(self):386 def _implicit_save(self):
@@ -326,22 +388,40 @@
326 self.save()388 self.save()
327389
328390
329@cached391_cache_config = None
392
393
330def config(scope=None):394def config(scope=None):
331 """Juju charm configuration"""395 """
332 config_cmd_line = ['config-get']396 Get the juju charm configuration (scope==None) or individual key,
333 if scope is not None:397 (scope=str). The returned value is a Python data structure loaded as
334 config_cmd_line.append(scope)398 JSON from the Juju config command.
335 else:399
336 config_cmd_line.append('--all')400 :param scope: If set, return the value for the specified key.
337 config_cmd_line.append('--format=json')401 :type scope: Optional[str]
338 try:402 :returns: Either the whole config as a Config, or a key from it.
339 config_data = json.loads(403 :rtype: Any
340 subprocess.check_output(config_cmd_line).decode('UTF-8'))404 """
405 global _cache_config
406 config_cmd_line = ['config-get', '--all', '--format=json']
407 try:
408 # JSON Decode Exception for Python3.5+
409 exc_json = json.decoder.JSONDecodeError
410 except AttributeError:
411 # JSON Decode Exception for Python2.7 through Python3.4
412 exc_json = ValueError
413 try:
414 if _cache_config is None:
415 config_data = json.loads(
416 subprocess.check_output(config_cmd_line).decode('UTF-8'))
417 _cache_config = Config(config_data)
341 if scope is not None:418 if scope is not None:
342 return config_data419 return _cache_config.get(scope)
343 return Config(config_data)420 return _cache_config
344 except ValueError:421 except (exc_json, UnicodeDecodeError) as e:
422 log('Unable to parse output from config-get: config_cmd_line="{}" '
423 'message="{}"'
424 .format(config_cmd_line, str(e)), level=ERROR)
345 return None425 return None
346426
347427
@@ -435,6 +515,67 @@
435 subprocess.check_output(units_cmd_line).decode('UTF-8')) or []515 subprocess.check_output(units_cmd_line).decode('UTF-8')) or []
436516
437517
518def expected_peer_units():
519 """Get a generator for units we expect to join peer relation based on
520 goal-state.
521
522 The local unit is excluded from the result to make it easy to gauge
523 completion of all peers joining the relation with existing hook tools.
524
525 Example usage:
526 log('peer {} of {} joined peer relation'
527 .format(len(related_units()),
528 len(list(expected_peer_units()))))
529
530 This function will raise NotImplementedError if used with juju versions
531 without goal-state support.
532
533 :returns: iterator
534 :rtype: types.GeneratorType
535 :raises: NotImplementedError
536 """
537 if not has_juju_version("2.4.0"):
538 # goal-state first appeared in 2.4.0.
539 raise NotImplementedError("goal-state")
540 _goal_state = goal_state()
541 return (key for key in _goal_state['units']
542 if '/' in key and key != local_unit())
543
544
545def expected_related_units(reltype=None):
546 """Get a generator for units we expect to join relation based on
547 goal-state.
548
549 Note that you can not use this function for the peer relation, take a look
550 at expected_peer_units() for that.
551
552 This function will raise KeyError if you request information for a
553 relation type for which juju goal-state does not have information. It will
554 raise NotImplementedError if used with juju versions without goal-state
555 support.
556
557 Example usage:
558 log('participant {} of {} joined relation {}'
559 .format(len(related_units()),
560 len(list(expected_related_units())),
561 relation_type()))
562
563 :param reltype: Relation type to list data for, default is to list data for
564 the realtion type we are currently executing a hook for.
565 :type reltype: str
566 :returns: iterator
567 :rtype: types.GeneratorType
568 :raises: KeyError, NotImplementedError
569 """
570 if not has_juju_version("2.4.4"):
571 # goal-state existed in 2.4.0, but did not list individual units to
572 # join a relation in 2.4.1 through 2.4.3. (LP: #1794739)
573 raise NotImplementedError("goal-state relation unit count")
574 reltype = reltype or relation_type()
575 _goal_state = goal_state()
576 return (key for key in _goal_state['relations'][reltype] if '/' in key)
577
578
438@cached579@cached
439def relation_for_unit(unit=None, rid=None):580def relation_for_unit(unit=None, rid=None):
440 """Get the json represenation of a unit's relation"""581 """Get the json represenation of a unit's relation"""
@@ -478,6 +619,24 @@
478 return yaml.safe_load(md)619 return yaml.safe_load(md)
479620
480621
622def _metadata_unit(unit):
623 """Given the name of a unit (e.g. apache2/0), get the unit charm's
624 metadata.yaml. Very similar to metadata() but allows us to inspect
625 other units. Unit needs to be co-located, such as a subordinate or
626 principal/primary.
627
628 :returns: metadata.yaml as a python object.
629
630 """
631 basedir = os.sep.join(charm_dir().split(os.sep)[:-2])
632 unitdir = 'unit-{}'.format(unit.replace(os.sep, '-'))
633 joineddir = os.path.join(basedir, unitdir, 'charm', 'metadata.yaml')
634 if not os.path.exists(joineddir):
635 return None
636 with open(joineddir) as md:
637 return yaml.safe_load(md)
638
639
481@cached640@cached
482def relation_types():641def relation_types():
483 """Get a list of relation types supported by this charm"""642 """Get a list of relation types supported by this charm"""
@@ -602,18 +761,31 @@
602 return False761 return False
603762
604763
764def _port_op(op_name, port, protocol="TCP"):
765 """Open or close a service network port"""
766 _args = [op_name]
767 icmp = protocol.upper() == "ICMP"
768 if icmp:
769 _args.append(protocol)
770 else:
771 _args.append('{}/{}'.format(port, protocol))
772 try:
773 subprocess.check_call(_args)
774 except subprocess.CalledProcessError:
775 # Older Juju pre 2.3 doesn't support ICMP
776 # so treat it as a no-op if it fails.
777 if not icmp:
778 raise
779
780
605def open_port(port, protocol="TCP"):781def open_port(port, protocol="TCP"):
606 """Open a service network port"""782 """Open a service network port"""
607 _args = ['open-port']783 _port_op('open-port', port, protocol)
608 _args.append('{}/{}'.format(port, protocol))
609 subprocess.check_call(_args)
610784
611785
612def close_port(port, protocol="TCP"):786def close_port(port, protocol="TCP"):
613 """Close a service network port"""787 """Close a service network port"""
614 _args = ['close-port']788 _port_op('close-port', port, protocol)
615 _args.append('{}/{}'.format(port, protocol))
616 subprocess.check_call(_args)
617789
618790
619def open_ports(start, end, protocol="TCP"):791def open_ports(start, end, protocol="TCP"):
@@ -630,6 +802,17 @@
630 subprocess.check_call(_args)802 subprocess.check_call(_args)
631803
632804
805def opened_ports():
806 """Get the opened ports
807
808 *Note that this will only show ports opened in a previous hook*
809
810 :returns: Opened ports as a list of strings: ``['8080/tcp', '8081-8083/tcp']``
811 """
812 _args = ['opened-ports', '--format=json']
813 return json.loads(subprocess.check_output(_args).decode('UTF-8'))
814
815
633@cached816@cached
634def unit_get(attribute):817def unit_get(attribute):
635 """Get the unit ID for the remote unit"""818 """Get the unit ID for the remote unit"""
@@ -751,8 +934,15 @@
751 return wrapper934 return wrapper
752935
753936
937class NoNetworkBinding(Exception):
938 pass
939
940
754def charm_dir():941def charm_dir():
755 """Return the root directory of the current charm"""942 """Return the root directory of the current charm"""
943 d = os.environ.get('JUJU_CHARM_DIR')
944 if d is not None:
945 return d
756 return os.environ.get('CHARM_DIR')946 return os.environ.get('CHARM_DIR')
757947
758948
@@ -874,6 +1064,14 @@
8741064
8751065
876@translate_exc(from_exc=OSError, to_exc=NotImplementedError)1066@translate_exc(from_exc=OSError, to_exc=NotImplementedError)
1067@cached
1068def goal_state():
1069 """Juju goal state values"""
1070 cmd = ['goal-state', '--format=json']
1071 return json.loads(subprocess.check_output(cmd).decode('UTF-8'))
1072
1073
1074@translate_exc(from_exc=OSError, to_exc=NotImplementedError)
877def is_leader():1075def is_leader():
878 """Does the current unit hold the juju leadership1076 """Does the current unit hold the juju leadership
8791077
@@ -967,7 +1165,6 @@
967 universal_newlines=True).strip()1165 universal_newlines=True).strip()
9681166
9691167
970@cached
971def has_juju_version(minimum_version):1168def has_juju_version(minimum_version):
972 """Return True if the Juju version is at least the provided version"""1169 """Return True if the Juju version is at least the provided version"""
973 return LooseVersion(juju_version()) >= LooseVersion(minimum_version)1170 return LooseVersion(juju_version()) >= LooseVersion(minimum_version)
@@ -1027,6 +1224,8 @@
1027@translate_exc(from_exc=OSError, to_exc=NotImplementedError)1224@translate_exc(from_exc=OSError, to_exc=NotImplementedError)
1028def network_get_primary_address(binding):1225def network_get_primary_address(binding):
1029 '''1226 '''
1227 Deprecated since Juju 2.3; use network_get()
1228
1030 Retrieve the primary network address for a named binding1229 Retrieve the primary network address for a named binding
10311230
1032 :param binding: string. The name of a relation of extra-binding1231 :param binding: string. The name of a relation of extra-binding
@@ -1034,7 +1233,41 @@
1034 :raise: NotImplementedError if run on Juju < 2.01233 :raise: NotImplementedError if run on Juju < 2.0
1035 '''1234 '''
1036 cmd = ['network-get', '--primary-address', binding]1235 cmd = ['network-get', '--primary-address', binding]
1037 return subprocess.check_output(cmd).decode('UTF-8').strip()1236 try:
1237 response = subprocess.check_output(
1238 cmd,
1239 stderr=subprocess.STDOUT).decode('UTF-8').strip()
1240 except CalledProcessError as e:
1241 if 'no network config found for binding' in e.output.decode('UTF-8'):
1242 raise NoNetworkBinding("No network binding for {}"
1243 .format(binding))
1244 else:
1245 raise
1246 return response
1247
1248
1249def network_get(endpoint, relation_id=None):
1250 """
1251 Retrieve the network details for a relation endpoint
1252
1253 :param endpoint: string. The name of a relation endpoint
1254 :param relation_id: int. The ID of the relation for the current context.
1255 :return: dict. The loaded YAML output of the network-get query.
1256 :raise: NotImplementedError if request not supported by the Juju version.
1257 """
1258 if not has_juju_version('2.2'):
1259 raise NotImplementedError(juju_version()) # earlier versions require --primary-address
1260 if relation_id and not has_juju_version('2.3'):
1261 raise NotImplementedError # 2.3 added the -r option
1262
1263 cmd = ['network-get', endpoint, '--format', 'yaml']
1264 if relation_id:
1265 cmd.append('-r')
1266 cmd.append(relation_id)
1267 response = subprocess.check_output(
1268 cmd,
1269 stderr=subprocess.STDOUT).decode('UTF-8').strip()
1270 return yaml.safe_load(response)
10381271
10391272
1040def add_metric(*args, **kwargs):1273def add_metric(*args, **kwargs):
@@ -1066,3 +1299,192 @@
1066 """Get the meter status information, if running in the meter-status-changed1299 """Get the meter status information, if running in the meter-status-changed
1067 hook."""1300 hook."""
1068 return os.environ.get('JUJU_METER_INFO')1301 return os.environ.get('JUJU_METER_INFO')
1302
1303
1304def iter_units_for_relation_name(relation_name):
1305 """Iterate through all units in a relation
1306
1307 Generator that iterates through all the units in a relation and yields
1308 a named tuple with rid and unit field names.
1309
1310 Usage:
1311 data = [(u.rid, u.unit)
1312 for u in iter_units_for_relation_name(relation_name)]
1313
1314 :param relation_name: string relation name
1315 :yield: Named Tuple with rid and unit field names
1316 """
1317 RelatedUnit = namedtuple('RelatedUnit', 'rid, unit')
1318 for rid in relation_ids(relation_name):
1319 for unit in related_units(rid):
1320 yield RelatedUnit(rid, unit)
1321
1322
1323def ingress_address(rid=None, unit=None):
1324 """
1325 Retrieve the ingress-address from a relation when available.
1326 Otherwise, return the private-address.
1327
1328 When used on the consuming side of the relation (unit is a remote
1329 unit), the ingress-address is the IP address that this unit needs
1330 to use to reach the provided service on the remote unit.
1331
1332 When used on the providing side of the relation (unit == local_unit()),
1333 the ingress-address is the IP address that is advertised to remote
1334 units on this relation. Remote units need to use this address to
1335 reach the local provided service on this unit.
1336
1337 Note that charms may document some other method to use in
1338 preference to the ingress_address(), such as an address provided
1339 on a different relation attribute or a service discovery mechanism.
1340 This allows charms to redirect inbound connections to their peers
1341 or different applications such as load balancers.
1342
1343 Usage:
1344 addresses = [ingress_address(rid=u.rid, unit=u.unit)
1345 for u in iter_units_for_relation_name(relation_name)]
1346
1347 :param rid: string relation id
1348 :param unit: string unit name
1349 :side effect: calls relation_get
1350 :return: string IP address
1351 """
1352 settings = relation_get(rid=rid, unit=unit)
1353 return (settings.get('ingress-address') or
1354 settings.get('private-address'))
1355
1356
1357def egress_subnets(rid=None, unit=None):
1358 """
1359 Retrieve the egress-subnets from a relation.
1360
1361 This function is to be used on the providing side of the
1362 relation, and provides the ranges of addresses that client
1363 connections may come from. The result is uninteresting on
1364 the consuming side of a relation (unit == local_unit()).
1365
1366 Returns a stable list of subnets in CIDR format.
1367 eg. ['192.168.1.0/24', '2001::F00F/128']
1368
1369 If egress-subnets is not available, falls back to using the published
1370 ingress-address, or finally private-address.
1371
1372 :param rid: string relation id
1373 :param unit: string unit name
1374 :side effect: calls relation_get
1375 :return: list of subnets in CIDR format. eg. ['192.168.1.0/24', '2001::F00F/128']
1376 """
1377 def _to_range(addr):
1378 if re.search(r'^(?:\d{1,3}\.){3}\d{1,3}$', addr) is not None:
1379 addr += '/32'
1380 elif ':' in addr and '/' not in addr: # IPv6
1381 addr += '/128'
1382 return addr
1383
1384 settings = relation_get(rid=rid, unit=unit)
1385 if 'egress-subnets' in settings:
1386 return [n.strip() for n in settings['egress-subnets'].split(',') if n.strip()]
1387 if 'ingress-address' in settings:
1388 return [_to_range(settings['ingress-address'])]
1389 if 'private-address' in settings:
1390 return [_to_range(settings['private-address'])]
1391 return [] # Should never happen
1392
1393
1394def unit_doomed(unit=None):
1395 """Determines if the unit is being removed from the model
1396
1397 Requires Juju 2.4.1.
1398
1399 :param unit: string unit name, defaults to local_unit
1400 :side effect: calls goal_state
1401 :side effect: calls local_unit
1402 :side effect: calls has_juju_version
1403 :return: True if the unit is being removed, already gone, or never existed
1404 """
1405 if not has_juju_version("2.4.1"):
1406 # We cannot risk blindly returning False for 'we don't know',
1407 # because that could cause data loss; if call sites don't
1408 # need an accurate answer, they likely don't need this helper
1409 # at all.
1410 # goal-state existed in 2.4.0, but did not handle removals
1411 # correctly until 2.4.1.
1412 raise NotImplementedError("is_doomed")
1413 if unit is None:
1414 unit = local_unit()
1415 gs = goal_state()
1416 units = gs.get('units', {})
1417 if unit not in units:
1418 return True
1419 # I don't think 'dead' units ever show up in the goal-state, but
1420 # check anyway in addition to 'dying'.
1421 return units[unit]['status'] in ('dying', 'dead')
1422
1423
1424def env_proxy_settings(selected_settings=None):
1425 """Get proxy settings from process environment variables.
1426
1427 Get charm proxy settings from environment variables that correspond to
1428 juju-http-proxy, juju-https-proxy and juju-no-proxy (available as of 2.4.2,
1429 see lp:1782236) in a format suitable for passing to an application that
1430 reacts to proxy settings passed as environment variables. Some applications
1431 support lowercase or uppercase notation (e.g. curl), some support only
1432 lowercase (e.g. wget), there are also subjectively rare cases of only
1433 uppercase notation support. no_proxy CIDR and wildcard support also varies
1434 between runtimes and applications as there is no enforced standard.
1435
1436 Some applications may connect to multiple destinations and expose config
1437 options that would affect only proxy settings for a specific destination
1438 these should be handled in charms in an application-specific manner.
1439
1440 :param selected_settings: format only a subset of possible settings
1441 :type selected_settings: list
1442 :rtype: Option(None, dict[str, str])
1443 """
1444 SUPPORTED_SETTINGS = {
1445 'http': 'HTTP_PROXY',
1446 'https': 'HTTPS_PROXY',
1447 'no_proxy': 'NO_PROXY',
1448 'ftp': 'FTP_PROXY'
1449 }
1450 if selected_settings is None:
1451 selected_settings = SUPPORTED_SETTINGS
1452
1453 selected_vars = [v for k, v in SUPPORTED_SETTINGS.items()
1454 if k in selected_settings]
1455 proxy_settings = {}
1456 for var in selected_vars:
1457 var_val = os.getenv(var)
1458 if var_val:
1459 proxy_settings[var] = var_val
1460 proxy_settings[var.lower()] = var_val
1461 # Now handle juju-prefixed environment variables. The legacy vs new
1462 # environment variable usage is mutually exclusive
1463 charm_var_val = os.getenv('JUJU_CHARM_{}'.format(var))
1464 if charm_var_val:
1465 proxy_settings[var] = charm_var_val
1466 proxy_settings[var.lower()] = charm_var_val
1467 if 'no_proxy' in proxy_settings:
1468 if _contains_range(proxy_settings['no_proxy']):
1469 log(RANGE_WARNING, level=WARNING)
1470 return proxy_settings if proxy_settings else None
1471
1472
1473def _contains_range(addresses):
1474 """Check for cidr or wildcard domain in a string.
1475
1476 Given a string comprising a comma seperated list of ip addresses
1477 and domain names, determine whether the string contains IP ranges
1478 or wildcard domains.
1479
1480 :param addresses: comma seperated list of domains and ip addresses.
1481 :type addresses: str
1482 """
1483 return (
1484 # Test for cidr (e.g. 10.20.20.0/24)
1485 "/" in addresses or
1486 # Test for wildcard domains (*.foo.com or .foo.com)
1487 "*" in addresses or
1488 addresses.startswith(".") or
1489 ",." in addresses or
1490 " ." in addresses)
10691491
=== modified file 'charmhelpers/core/host.py'
--- charmhelpers/core/host.py 2017-04-11 18:01:45 +0000
+++ charmhelpers/core/host.py 2019-05-24 12:43:31 +0000
@@ -34,21 +34,23 @@
3434
35from contextlib import contextmanager35from contextlib import contextmanager
36from collections import OrderedDict36from collections import OrderedDict
37from .hookenv import log37from .hookenv import log, INFO, DEBUG, local_unit, charm_name
38from .fstab import Fstab38from .fstab import Fstab
39from charmhelpers.osplatform import get_platform39from charmhelpers.osplatform import get_platform
4040
41__platform__ = get_platform()41__platform__ = get_platform()
42if __platform__ == "ubuntu":42if __platform__ == "ubuntu":
43 from charmhelpers.core.host_factory.ubuntu import (43 from charmhelpers.core.host_factory.ubuntu import ( # NOQA:F401
44 service_available,44 service_available,
45 add_new_group,45 add_new_group,
46 lsb_release,46 lsb_release,
47 cmp_pkgrevno,47 cmp_pkgrevno,
48 CompareHostReleases,48 CompareHostReleases,
49 get_distrib_codename,
50 arch
49 ) # flake8: noqa -- ignore F401 for this import51 ) # flake8: noqa -- ignore F401 for this import
50elif __platform__ == "centos":52elif __platform__ == "centos":
51 from charmhelpers.core.host_factory.centos import (53 from charmhelpers.core.host_factory.centos import ( # NOQA:F401
52 service_available,54 service_available,
53 add_new_group,55 add_new_group,
54 lsb_release,56 lsb_release,
@@ -58,6 +60,7 @@
5860
59UPDATEDB_PATH = '/etc/updatedb.conf'61UPDATEDB_PATH = '/etc/updatedb.conf'
6062
63
61def service_start(service_name, **kwargs):64def service_start(service_name, **kwargs):
62 """Start a system service.65 """Start a system service.
6366
@@ -191,6 +194,7 @@
191 upstart_file = os.path.join(init_dir, "{}.conf".format(service_name))194 upstart_file = os.path.join(init_dir, "{}.conf".format(service_name))
192 sysv_file = os.path.join(initd_dir, service_name)195 sysv_file = os.path.join(initd_dir, service_name)
193 if init_is_systemd():196 if init_is_systemd():
197 service('disable', service_name)
194 service('mask', service_name)198 service('mask', service_name)
195 elif os.path.exists(upstart_file):199 elif os.path.exists(upstart_file):
196 override_path = os.path.join(200 override_path = os.path.join(
@@ -225,6 +229,7 @@
225 sysv_file = os.path.join(initd_dir, service_name)229 sysv_file = os.path.join(initd_dir, service_name)
226 if init_is_systemd():230 if init_is_systemd():
227 service('unmask', service_name)231 service('unmask', service_name)
232 service('enable', service_name)
228 elif os.path.exists(upstart_file):233 elif os.path.exists(upstart_file):
229 override_path = os.path.join(234 override_path = os.path.join(
230 init_dir, '{}.override'.format(service_name))235 init_dir, '{}.override'.format(service_name))
@@ -285,8 +290,8 @@
285 for key, value in six.iteritems(kwargs):290 for key, value in six.iteritems(kwargs):
286 parameter = '%s=%s' % (key, value)291 parameter = '%s=%s' % (key, value)
287 cmd.append(parameter)292 cmd.append(parameter)
288 output = subprocess.check_output(cmd,293 output = subprocess.check_output(
289 stderr=subprocess.STDOUT).decode('UTF-8')294 cmd, stderr=subprocess.STDOUT).decode('UTF-8')
290 except subprocess.CalledProcessError:295 except subprocess.CalledProcessError:
291 return False296 return False
292 else:297 else:
@@ -439,6 +444,51 @@
439 subprocess.check_call(cmd)444 subprocess.check_call(cmd)
440445
441446
447def chage(username, lastday=None, expiredate=None, inactive=None,
448 mindays=None, maxdays=None, root=None, warndays=None):
449 """Change user password expiry information
450
451 :param str username: User to update
452 :param str lastday: Set when password was changed in YYYY-MM-DD format
453 :param str expiredate: Set when user's account will no longer be
454 accessible in YYYY-MM-DD format.
455 -1 will remove an account expiration date.
456 :param str inactive: Set the number of days of inactivity after a password
457 has expired before the account is locked.
458 -1 will remove an account's inactivity.
459 :param str mindays: Set the minimum number of days between password
460 changes to MIN_DAYS.
461 0 indicates the password can be changed anytime.
462 :param str maxdays: Set the maximum number of days during which a
463 password is valid.
464 -1 as MAX_DAYS will remove checking maxdays
465 :param str root: Apply changes in the CHROOT_DIR directory
466 :param str warndays: Set the number of days of warning before a password
467 change is required
468 :raises subprocess.CalledProcessError: if call to chage fails
469 """
470 cmd = ['chage']
471 if root:
472 cmd.extend(['--root', root])
473 if lastday:
474 cmd.extend(['--lastday', lastday])
475 if expiredate:
476 cmd.extend(['--expiredate', expiredate])
477 if inactive:
478 cmd.extend(['--inactive', inactive])
479 if mindays:
480 cmd.extend(['--mindays', mindays])
481 if maxdays:
482 cmd.extend(['--maxdays', maxdays])
483 if warndays:
484 cmd.extend(['--warndays', warndays])
485 cmd.append(username)
486 subprocess.check_call(cmd)
487
488
489remove_password_expiry = functools.partial(chage, expiredate='-1', inactive='-1', mindays='0', maxdays='-1')
490
491
442def rsync(from_path, to_path, flags='-r', options=None, timeout=None):492def rsync(from_path, to_path, flags='-r', options=None, timeout=None):
443 """Replicate the contents of a path"""493 """Replicate the contents of a path"""
444 options = options or ['--delete', '--executability']494 options = options or ['--delete', '--executability']
@@ -485,13 +535,45 @@
485535
486def write_file(path, content, owner='root', group='root', perms=0o444):536def write_file(path, content, owner='root', group='root', perms=0o444):
487 """Create or overwrite a file with the contents of a byte string."""537 """Create or overwrite a file with the contents of a byte string."""
488 log("Writing file {} {}:{} {:o}".format(path, owner, group, perms))
489 uid = pwd.getpwnam(owner).pw_uid538 uid = pwd.getpwnam(owner).pw_uid
490 gid = grp.getgrnam(group).gr_gid539 gid = grp.getgrnam(group).gr_gid
491 with open(path, 'wb') as target:540 # lets see if we can grab the file and compare the context, to avoid doing
492 os.fchown(target.fileno(), uid, gid)541 # a write.
493 os.fchmod(target.fileno(), perms)542 existing_content = None
494 target.write(content)543 existing_uid, existing_gid, existing_perms = None, None, None
544 try:
545 with open(path, 'rb') as target:
546 existing_content = target.read()
547 stat = os.stat(path)
548 existing_uid, existing_gid, existing_perms = (
549 stat.st_uid, stat.st_gid, stat.st_mode
550 )
551 except Exception:
552 pass
553 if content != existing_content:
554 log("Writing file {} {}:{} {:o}".format(path, owner, group, perms),
555 level=DEBUG)
556 with open(path, 'wb') as target:
557 os.fchown(target.fileno(), uid, gid)
558 os.fchmod(target.fileno(), perms)
559 if six.PY3 and isinstance(content, six.string_types):
560 content = content.encode('UTF-8')
561 target.write(content)
562 return
563 # the contents were the same, but we might still need to change the
564 # ownership or permissions.
565 if existing_uid != uid:
566 log("Changing uid on already existing content: {} -> {}"
567 .format(existing_uid, uid), level=DEBUG)
568 os.chown(path, uid, -1)
569 if existing_gid != gid:
570 log("Changing gid on already existing content: {} -> {}"
571 .format(existing_gid, gid), level=DEBUG)
572 os.chown(path, -1, gid)
573 if existing_perms != perms:
574 log("Changing permissions on existing content: {} -> {}"
575 .format(existing_perms, perms), level=DEBUG)
576 os.chmod(path, perms)
495577
496578
497def fstab_remove(mp):579def fstab_remove(mp):
@@ -756,7 +838,7 @@
756 ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')838 ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
757 ip_output = (line.strip() for line in ip_output if line)839 ip_output = (line.strip() for line in ip_output if line)
758840
759 key = re.compile('^[0-9]+:\s+(.+):')841 key = re.compile(r'^[0-9]+:\s+(.+):')
760 for line in ip_output:842 for line in ip_output:
761 matched = re.search(key, line)843 matched = re.search(key, line)
762 if matched:844 if matched:
@@ -901,6 +983,20 @@
901983
902984
903def add_to_updatedb_prunepath(path, updatedb_path=UPDATEDB_PATH):985def add_to_updatedb_prunepath(path, updatedb_path=UPDATEDB_PATH):
986 """Adds the specified path to the mlocate's udpatedb.conf PRUNEPATH list.
987
988 This method has no effect if the path specified by updatedb_path does not
989 exist or is not a file.
990
991 @param path: string the path to add to the updatedb.conf PRUNEPATHS value
992 @param updatedb_path: the path the updatedb.conf file
993 """
994 if not os.path.exists(updatedb_path) or os.path.isdir(updatedb_path):
995 # If the updatedb.conf file doesn't exist then don't attempt to update
996 # the file as the package providing mlocate may not be installed on
997 # the local system
998 return
999
904 with open(updatedb_path, 'r+') as f_id:1000 with open(updatedb_path, 'r+') as f_id:
905 updatedb_text = f_id.read()1001 updatedb_text = f_id.read()
906 output = updatedb(updatedb_text, path)1002 output = updatedb(updatedb_text, path)
@@ -920,3 +1016,62 @@
920 lines[i] = 'PRUNEPATHS="{}"'.format(' '.join(paths))1016 lines[i] = 'PRUNEPATHS="{}"'.format(' '.join(paths))
921 output = "\n".join(lines)1017 output = "\n".join(lines)
922 return output1018 return output
1019
1020
1021def modulo_distribution(modulo=3, wait=30, non_zero_wait=False):
1022 """ Modulo distribution
1023
1024 This helper uses the unit number, a modulo value and a constant wait time
1025 to produce a calculated wait time distribution. This is useful in large
1026 scale deployments to distribute load during an expensive operation such as
1027 service restarts.
1028
1029 If you have 1000 nodes that need to restart 100 at a time 1 minute at a
1030 time:
1031
1032 time.wait(modulo_distribution(modulo=100, wait=60))
1033 restart()
1034
1035 If you need restarts to happen serially set modulo to the exact number of
1036 nodes and set a high constant wait time:
1037
1038 time.wait(modulo_distribution(modulo=10, wait=120))
1039 restart()
1040
1041 @param modulo: int The modulo number creates the group distribution
1042 @param wait: int The constant time wait value
1043 @param non_zero_wait: boolean Override unit % modulo == 0,
1044 return modulo * wait. Used to avoid collisions with
1045 leader nodes which are often given priority.
1046 @return: int Calculated time to wait for unit operation
1047 """
1048 unit_number = int(local_unit().split('/')[1])
1049 calculated_wait_time = (unit_number % modulo) * wait
1050 if non_zero_wait and calculated_wait_time == 0:
1051 return modulo * wait
1052 else:
1053 return calculated_wait_time
1054
1055
1056def install_ca_cert(ca_cert, name=None):
1057 """
1058 Install the given cert as a trusted CA.
1059
1060 The ``name`` is the stem of the filename where the cert is written, and if
1061 not provided, it will default to ``juju-{charm_name}``.
1062
1063 If the cert is empty or None, or is unchanged, nothing is done.
1064 """
1065 if not ca_cert:
1066 return
1067 if not isinstance(ca_cert, bytes):
1068 ca_cert = ca_cert.encode('utf8')
1069 if not name:
1070 name = 'juju-{}'.format(charm_name())
1071 cert_file = '/usr/local/share/ca-certificates/{}.crt'.format(name)
1072 new_hash = hashlib.md5(ca_cert).hexdigest()
1073 if file_hash(cert_file) == new_hash:
1074 return
1075 log("Installing new CA cert at: {}".format(cert_file), level=INFO)
1076 write_file(cert_file, ca_cert)
1077 subprocess.check_call(['update-ca-certificates', '--fresh'])
9231078
=== modified file 'charmhelpers/core/host_factory/ubuntu.py'
--- charmhelpers/core/host_factory/ubuntu.py 2017-04-11 18:01:45 +0000
+++ charmhelpers/core/host_factory/ubuntu.py 2019-05-24 12:43:31 +0000
@@ -1,5 +1,6 @@
 import subprocess
 
+from charmhelpers.core.hookenv import cached
 from charmhelpers.core.strutils import BasicStringComparator
 
 
@@ -19,6 +20,10 @@
     'xenial',
     'yakkety',
     'zesty',
+    'artful',
+    'bionic',
+    'cosmic',
+    'disco',
 )
 
 
@@ -69,6 +74,14 @@
     return d
 
 
+def get_distrib_codename():
+    """Return the codename of the distribution
+    :returns: The codename
+    :rtype: str
+    """
+    return lsb_release()['DISTRIB_CODENAME'].lower()
+
+
 def cmp_pkgrevno(package, revno, pkgcache=None):
     """Compare supplied revno with the revno of the installed package.
 
@@ -86,3 +99,16 @@
         pkgcache = apt_cache()
     pkg = pkgcache[package]
     return apt_pkg.version_compare(pkg.current_ver.ver_str, revno)
+
+
+@cached
+def arch():
+    """Return the package architecture as a string.
+
+    :returns: the architecture
+    :rtype: str
+    :raises: subprocess.CalledProcessError if dpkg command fails
+    """
+    return subprocess.check_output(
+        ['dpkg', '--print-architecture']
+    ).rstrip().decode('UTF-8')
 
=== modified file 'charmhelpers/core/kernel.py'
--- charmhelpers/core/kernel.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/core/kernel.py 2019-05-24 12:43:31 +0000
@@ -26,12 +26,12 @@
 
 __platform__ = get_platform()
 if __platform__ == "ubuntu":
-    from charmhelpers.core.kernel_factory.ubuntu import (
+    from charmhelpers.core.kernel_factory.ubuntu import (  # NOQA:F401
         persistent_modprobe,
         update_initramfs,
     )  # flake8: noqa -- ignore F401 for this import
 elif __platform__ == "centos":
-    from charmhelpers.core.kernel_factory.centos import (
+    from charmhelpers.core.kernel_factory.centos import (  # NOQA:F401
         persistent_modprobe,
         update_initramfs,
     )  # flake8: noqa -- ignore F401 for this import
 
=== modified file 'charmhelpers/core/services/base.py'
--- charmhelpers/core/services/base.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/core/services/base.py 2019-05-24 12:43:31 +0000
@@ -307,23 +307,34 @@
307 """307 """
308 def __call__(self, manager, service_name, event_name):308 def __call__(self, manager, service_name, event_name):
309 service = manager.get_service(service_name)309 service = manager.get_service(service_name)
310 new_ports = service.get('ports', [])310 # turn this generator into a list,
311 # as we'll be going over it multiple times
312 new_ports = list(service.get('ports', []))
311 port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name))313 port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name))
312 if os.path.exists(port_file):314 if os.path.exists(port_file):
313 with open(port_file) as fp:315 with open(port_file) as fp:
314 old_ports = fp.read().split(',')316 old_ports = fp.read().split(',')
315 for old_port in old_ports:317 for old_port in old_ports:
316 if bool(old_port):318 if bool(old_port) and not self.ports_contains(old_port, new_ports):
317 old_port = int(old_port)319 hookenv.close_port(old_port)
318 if old_port not in new_ports:
319 hookenv.close_port(old_port)
320 with open(port_file, 'w') as fp:320 with open(port_file, 'w') as fp:
321 fp.write(','.join(str(port) for port in new_ports))321 fp.write(','.join(str(port) for port in new_ports))
322 for port in new_ports:322 for port in new_ports:
323 # A port is either a number or 'ICMP'
324 protocol = 'TCP'
325 if str(port).upper() == 'ICMP':
326 protocol = 'ICMP'
323 if event_name == 'start':327 if event_name == 'start':
324 hookenv.open_port(port)328 hookenv.open_port(port, protocol)
325 elif event_name == 'stop':329 elif event_name == 'stop':
326 hookenv.close_port(port)330 hookenv.close_port(port, protocol)
331
332 def ports_contains(self, port, ports):
333 if not bool(port):
334 return False
335 if str(port).upper() != 'ICMP':
336 port = int(port)
337 return port in ports
327338
328339
329def service_stop(service_name):340def service_stop(service_name):
330341
=== modified file 'charmhelpers/core/strutils.py'
--- charmhelpers/core/strutils.py 2017-04-11 18:01:45 +0000
+++ charmhelpers/core/strutils.py 2019-05-24 12:43:31 +0000
@@ -61,13 +61,19 @@
     if isinstance(value, six.string_types):
         value = six.text_type(value)
     else:
-        msg = "Unable to interpret non-string value '%s' as boolean" % (value)
+        msg = "Unable to interpret non-string value '%s' as bytes" % (value)
         raise ValueError(msg)
     matches = re.match("([0-9]+)([a-zA-Z]+)", value)
-    if not matches:
-        msg = "Unable to interpret string value '%s' as bytes" % (value)
-        raise ValueError(msg)
-    return int(matches.group(1)) * (1024 ** BYTE_POWER[matches.group(2)])
+    if matches:
+        size = int(matches.group(1)) * (1024 ** BYTE_POWER[matches.group(2)])
+    else:
+        # Assume that value passed in is bytes
+        try:
+            size = int(value)
+        except ValueError:
+            msg = "Unable to interpret string value '%s' as bytes" % (value)
+            raise ValueError(msg)
+    return size
 
 
 class BasicStringComparator(object):
 
=== modified file 'charmhelpers/core/sysctl.py'
--- charmhelpers/core/sysctl.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/core/sysctl.py 2019-05-24 12:43:31 +0000
@@ -28,27 +28,38 @@
 __author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
 
 
-def create(sysctl_dict, sysctl_file):
+def create(sysctl_dict, sysctl_file, ignore=False):
     """Creates a sysctl.conf file from a YAML associative array
 
-    :param sysctl_dict: a YAML-formatted string of sysctl options eg "{ 'kernel.max_pid': 1337 }"
+    :param sysctl_dict: a dict or YAML-formatted string of sysctl
+        options eg "{ 'kernel.max_pid': 1337 }"
     :type sysctl_dict: str
     :param sysctl_file: path to the sysctl file to be saved
     :type sysctl_file: str or unicode
+    :param ignore: If True, ignore "unknown variable" errors.
+    :type ignore: bool
     :returns: None
     """
-    try:
-        sysctl_dict_parsed = yaml.safe_load(sysctl_dict)
-    except yaml.YAMLError:
-        log("Error parsing YAML sysctl_dict: {}".format(sysctl_dict),
-            level=ERROR)
-        return
+    if type(sysctl_dict) is not dict:
+        try:
+            sysctl_dict_parsed = yaml.safe_load(sysctl_dict)
+        except yaml.YAMLError:
+            log("Error parsing YAML sysctl_dict: {}".format(sysctl_dict),
+                level=ERROR)
+            return
+    else:
+        sysctl_dict_parsed = sysctl_dict
 
     with open(sysctl_file, "w") as fd:
         for key, value in sysctl_dict_parsed.items():
             fd.write("{}={}\n".format(key, value))
 
-    log("Updating sysctl_file: %s values: %s" % (sysctl_file, sysctl_dict_parsed),
+    log("Updating sysctl_file: {} values: {}".format(sysctl_file,
+                                                     sysctl_dict_parsed),
         level=DEBUG)
 
-    check_call(["sysctl", "-p", sysctl_file])
+    call = ["sysctl", "-p", sysctl_file]
+    if ignore:
+        call.append("-e")
+
+    check_call(call)
 
=== modified file 'charmhelpers/core/templating.py'
--- charmhelpers/core/templating.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/core/templating.py 2019-05-24 12:43:31 +0000
@@ -20,7 +20,8 @@
2020
2121
22def render(source, target, context, owner='root', group='root',22def render(source, target, context, owner='root', group='root',
23 perms=0o444, templates_dir=None, encoding='UTF-8', template_loader=None):23 perms=0o444, templates_dir=None, encoding='UTF-8',
24 template_loader=None, config_template=None):
24 """25 """
25 Render a template.26 Render a template.
2627
@@ -32,6 +33,9 @@
32 The context should be a dict containing the values to be replaced in the33 The context should be a dict containing the values to be replaced in the
33 template.34 template.
3435
36 config_template may be provided to render from a provided template instead
37 of loading from a file.
38
35 The `owner`, `group`, and `perms` options will be passed to `write_file`.39 The `owner`, `group`, and `perms` options will be passed to `write_file`.
3640
37 If omitted, `templates_dir` defaults to the `templates` folder in the charm.41 If omitted, `templates_dir` defaults to the `templates` folder in the charm.
@@ -65,14 +69,19 @@
65 if templates_dir is None:69 if templates_dir is None:
66 templates_dir = os.path.join(hookenv.charm_dir(), 'templates')70 templates_dir = os.path.join(hookenv.charm_dir(), 'templates')
67 template_env = Environment(loader=FileSystemLoader(templates_dir))71 template_env = Environment(loader=FileSystemLoader(templates_dir))
68 try:72
69 source = source73 # load from a string if provided explicitly
70 template = template_env.get_template(source)74 if config_template is not None:
71 except exceptions.TemplateNotFound as e:75 template = template_env.from_string(config_template)
72 hookenv.log('Could not load template %s from %s.' %76 else:
73 (source, templates_dir),77 try:
74 level=hookenv.ERROR)78 source = source
75 raise e79 template = template_env.get_template(source)
80 except exceptions.TemplateNotFound as e:
81 hookenv.log('Could not load template %s from %s.' %
82 (source, templates_dir),
83 level=hookenv.ERROR)
84 raise e
76 content = template.render(context)85 content = template.render(context)
77 if target is not None:86 if target is not None:
78 target_dir = os.path.dirname(target)87 target_dir = os.path.dirname(target)
7988
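
The new config_template argument lets a charm render a template passed in as a string (for example via a config option) instead of a file under templates/. A hedged sketch; templates_dir is passed explicitly so the call does not depend on hookenv.charm_dir(), and it needs root since the rendered target is chowned to root:root by default:

    from charmhelpers.core.templating import render

    # 'unused.tmpl' never needs to exist: when config_template is given the
    # body is compiled with Environment.from_string() instead of being loaded.
    render(source='unused.tmpl',
           target='/tmp/example-vhost.conf',
           context={'fqdn': 'landscape.example.com'},
           templates_dir='/tmp',
           config_template='server_name {{ fqdn }};\n')
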
=== modified file 'charmhelpers/core/unitdata.py'
--- charmhelpers/core/unitdata.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/core/unitdata.py 2019-05-24 12:43:31 +0000
@@ -166,6 +166,10 @@
166166
167 To support dicts, lists, integer, floats, and booleans values167 To support dicts, lists, integer, floats, and booleans values
168 are automatically json encoded/decoded.168 are automatically json encoded/decoded.
169
170 Note: to facilitate unit testing, ':memory:' can be passed as the
171 path parameter which causes sqlite3 to only build the db in memory.
172 This should only be used for testing purposes.
169 """173 """
170 def __init__(self, path=None):174 def __init__(self, path=None):
171 self.db_path = path175 self.db_path = path
@@ -175,6 +179,9 @@
175 else:179 else:
176 self.db_path = os.path.join(180 self.db_path = os.path.join(
177 os.environ.get('CHARM_DIR', ''), '.unit-state.db')181 os.environ.get('CHARM_DIR', ''), '.unit-state.db')
182 if self.db_path != ':memory:':
183 with open(self.db_path, 'a') as f:
184 os.fchmod(f.fileno(), 0o600)
178 self.conn = sqlite3.connect('%s' % self.db_path)185 self.conn = sqlite3.connect('%s' % self.db_path)
179 self.cursor = self.conn.cursor()186 self.cursor = self.conn.cursor()
180 self.revision = None187 self.revision = None
@@ -358,7 +365,7 @@
358 try:365 try:
359 yield self.revision366 yield self.revision
360 self.revision = None367 self.revision = None
361 except:368 except Exception:
362 self.flush(False)369 self.flush(False)
363 self.revision = None370 self.revision = None
364 raise371 raise
365372
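
The ':memory:' path documented above keeps the key/value store out of CHARM_DIR entirely, which is what unit tests want. A minimal sketch, assuming the class shown here is charmhelpers.core.unitdata.Storage:

    from charmhelpers.core.unitdata import Storage

    store = Storage(path=':memory:')   # no .unit-state.db file, no fchmod needed
    store.set('landscape.ready', True)
    assert store.get('landscape.ready') is True
    store.close()
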
=== modified file 'charmhelpers/fetch/__init__.py'
--- charmhelpers/fetch/__init__.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/fetch/__init__.py 2019-05-24 12:43:31 +0000
@@ -48,6 +48,13 @@
48 pass48 pass
4949
5050
51class GPGKeyError(Exception):
52 """Exception occurs when a GPG key cannot be fetched or used. The message
53 indicates what the problem is.
54 """
55 pass
56
57
51class BaseFetchHandler(object):58class BaseFetchHandler(object):
5259
53 """Base class for FetchHandler implementations in fetch plugins"""60 """Base class for FetchHandler implementations in fetch plugins"""
@@ -77,21 +84,24 @@
77fetch = importlib.import_module(module)84fetch = importlib.import_module(module)
7885
79filter_installed_packages = fetch.filter_installed_packages86filter_installed_packages = fetch.filter_installed_packages
80install = fetch.install87filter_missing_packages = fetch.filter_missing_packages
81upgrade = fetch.upgrade88install = fetch.apt_install
82update = fetch.update89upgrade = fetch.apt_upgrade
83purge = fetch.purge90update = _fetch_update = fetch.apt_update
91purge = fetch.apt_purge
84add_source = fetch.add_source92add_source = fetch.add_source
8593
86if __platform__ == "ubuntu":94if __platform__ == "ubuntu":
87 apt_cache = fetch.apt_cache95 apt_cache = fetch.apt_cache
88 apt_install = fetch.install96 apt_install = fetch.apt_install
89 apt_update = fetch.update97 apt_update = fetch.apt_update
90 apt_upgrade = fetch.upgrade98 apt_upgrade = fetch.apt_upgrade
91 apt_purge = fetch.purge99 apt_purge = fetch.apt_purge
100 apt_autoremove = fetch.apt_autoremove
92 apt_mark = fetch.apt_mark101 apt_mark = fetch.apt_mark
93 apt_hold = fetch.apt_hold102 apt_hold = fetch.apt_hold
94 apt_unhold = fetch.apt_unhold103 apt_unhold = fetch.apt_unhold
104 import_key = fetch.import_key
95 get_upstream_version = fetch.get_upstream_version105 get_upstream_version = fetch.get_upstream_version
96elif __platform__ == "centos":106elif __platform__ == "centos":
97 yum_search = fetch.yum_search107 yum_search = fetch.yum_search
@@ -135,7 +145,7 @@
135 for source, key in zip(sources, keys):145 for source, key in zip(sources, keys):
136 add_source(source, key)146 add_source(source, key)
137 if update:147 if update:
138 fetch.update(fatal=True)148 _fetch_update(fatal=True)
139149
140150
141def install_remote(source, *args, **kwargs):151def install_remote(source, *args, **kwargs):
142152
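
The renamed apt_* helpers keep the old module-level aliases working, so configure_sources() above still drives the whole flow: each source/key pair goes through add_source() (with the key imported before the source is added, see ubuntu.py below) and the cache is then refreshed. A sketch, assuming configure_sources keeps its usual sources_var/keys_var defaults:

    from charmhelpers.fetch import apt_install, configure_sources

    # Reads the YAML/JSON lists from the install_sources / install_keys charm
    # options, calls add_source(source, key) for each pair, then apt-get update.
    configure_sources(update=True,
                      sources_var='install_sources',
                      keys_var='install_keys')
    apt_install(['landscape-server'], fatal=True)   # package name is illustrative
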
=== modified file 'charmhelpers/fetch/archiveurl.py'
--- charmhelpers/fetch/archiveurl.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/fetch/archiveurl.py 2019-05-24 12:43:31 +0000
@@ -89,7 +89,7 @@
89 :param str source: URL pointing to an archive file.89 :param str source: URL pointing to an archive file.
90 :param str dest: Local path location to download archive file to.90 :param str dest: Local path location to download archive file to.
91 """91 """
92 # propogate all exceptions92 # propagate all exceptions
93 # URLError, OSError, etc93 # URLError, OSError, etc
94 proto, netloc, path, params, query, fragment = urlparse(source)94 proto, netloc, path, params, query, fragment = urlparse(source)
95 if proto in ('http', 'https'):95 if proto in ('http', 'https'):
9696
=== modified file 'charmhelpers/fetch/bzrurl.py'
--- charmhelpers/fetch/bzrurl.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/fetch/bzrurl.py 2019-05-24 12:43:31 +0000
@@ -13,7 +13,7 @@
13# limitations under the License.13# limitations under the License.
1414
15import os15import os
16from subprocess import check_call16from subprocess import STDOUT, check_output
17from charmhelpers.fetch import (17from charmhelpers.fetch import (
18 BaseFetchHandler,18 BaseFetchHandler,
19 UnhandledSource,19 UnhandledSource,
@@ -55,7 +55,7 @@
55 cmd = ['bzr', 'branch']55 cmd = ['bzr', 'branch']
56 cmd += cmd_opts56 cmd += cmd_opts
57 cmd += [source, dest]57 cmd += [source, dest]
58 check_call(cmd)58 check_output(cmd, stderr=STDOUT)
5959
60 def install(self, source, dest=None, revno=None):60 def install(self, source, dest=None, revno=None):
61 url_parts = self.parse_url(source)61 url_parts = self.parse_url(source)
6262
=== modified file 'charmhelpers/fetch/centos.py'
--- charmhelpers/fetch/centos.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/fetch/centos.py 2019-05-24 12:43:31 +0000
@@ -132,7 +132,7 @@
132 key_file.write(key)132 key_file.write(key)
133 key_file.flush()133 key_file.flush()
134 key_file.seek(0)134 key_file.seek(0)
135 subprocess.check_call(['rpm', '--import', key_file])135 subprocess.check_call(['rpm', '--import', key_file.name])
136 else:136 else:
137 subprocess.check_call(['rpm', '--import', key])137 subprocess.check_call(['rpm', '--import', key])
138138
139139
=== modified file 'charmhelpers/fetch/giturl.py'
--- charmhelpers/fetch/giturl.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/fetch/giturl.py 2019-05-24 12:43:31 +0000
@@ -13,7 +13,7 @@
13# limitations under the License.13# limitations under the License.
1414
15import os15import os
16from subprocess import check_call, CalledProcessError16from subprocess import check_output, CalledProcessError, STDOUT
17from charmhelpers.fetch import (17from charmhelpers.fetch import (
18 BaseFetchHandler,18 BaseFetchHandler,
19 UnhandledSource,19 UnhandledSource,
@@ -50,7 +50,7 @@
50 cmd = ['git', 'clone', source, dest, '--branch', branch]50 cmd = ['git', 'clone', source, dest, '--branch', branch]
51 if depth:51 if depth:
52 cmd.extend(['--depth', depth])52 cmd.extend(['--depth', depth])
53 check_call(cmd)53 check_output(cmd, stderr=STDOUT)
5454
55 def install(self, source, branch="master", dest=None, depth=None):55 def install(self, source, branch="master", dest=None, depth=None):
56 url_parts = self.parse_url(source)56 url_parts = self.parse_url(source)
5757
=== added directory 'charmhelpers/fetch/python'
=== added file 'charmhelpers/fetch/python/__init__.py'
--- charmhelpers/fetch/python/__init__.py 1970-01-01 00:00:00 +0000
+++ charmhelpers/fetch/python/__init__.py 2019-05-24 12:43:31 +0000
@@ -0,0 +1,13 @@
1# Copyright 2014-2019 Canonical Limited.
2#
3# Licensed under the Apache License, Version 2.0 (the "License");
4# you may not use this file except in compliance with the License.
5# You may obtain a copy of the License at
6#
7# http://www.apache.org/licenses/LICENSE-2.0
8#
9# Unless required by applicable law or agreed to in writing, software
10# distributed under the License is distributed on an "AS IS" BASIS,
11# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12# See the License for the specific language governing permissions and
13# limitations under the License.
014
=== added file 'charmhelpers/fetch/python/debug.py'
--- charmhelpers/fetch/python/debug.py 1970-01-01 00:00:00 +0000
+++ charmhelpers/fetch/python/debug.py 2019-05-24 12:43:31 +0000
@@ -0,0 +1,54 @@
1#!/usr/bin/env python
2# coding: utf-8
3
4# Copyright 2014-2015 Canonical Limited.
5#
6# Licensed under the Apache License, Version 2.0 (the "License");
7# you may not use this file except in compliance with the License.
8# You may obtain a copy of the License at
9#
10# http://www.apache.org/licenses/LICENSE-2.0
11#
12# Unless required by applicable law or agreed to in writing, software
13# distributed under the License is distributed on an "AS IS" BASIS,
14# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15# See the License for the specific language governing permissions and
16# limitations under the License.
17
18from __future__ import print_function
19
20import atexit
21import sys
22
23from charmhelpers.fetch.python.rpdb import Rpdb
24from charmhelpers.core.hookenv import (
25 open_port,
26 close_port,
27 ERROR,
28 log
29)
30
31__author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
32
33DEFAULT_ADDR = "0.0.0.0"
34DEFAULT_PORT = 4444
35
36
37def _error(message):
38 log(message, level=ERROR)
39
40
41def set_trace(addr=DEFAULT_ADDR, port=DEFAULT_PORT):
42 """
43 Set a trace point using the remote debugger
44 """
45 atexit.register(close_port, port)
46 try:
47 log("Starting a remote python debugger session on %s:%s" % (addr,
48 port))
49 open_port(port)
50 debugger = Rpdb(addr=addr, port=port)
51 debugger.set_trace(sys._getframe().f_back)
52 except Exception:
53 _error("Cannot start a remote debug session on %s:%s" % (addr,
54 port))
055
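
set_trace() opens the given port through Juju and parks the hook on a remote pdb prompt; it is strictly a debugging aid. A sketch only -- the hook blocks until something connects from outside (e.g. "nc <unit-address> 4444"):

    from charmhelpers.fetch.python.debug import set_trace

    # Listens on 0.0.0.0:4444 by default and closes the port again at exit.
    set_trace()
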
=== added file 'charmhelpers/fetch/python/packages.py'
--- charmhelpers/fetch/python/packages.py 1970-01-01 00:00:00 +0000
+++ charmhelpers/fetch/python/packages.py 2019-05-24 12:43:31 +0000
@@ -0,0 +1,154 @@
1#!/usr/bin/env python
2# coding: utf-8
3
4# Copyright 2014-2015 Canonical Limited.
5#
6# Licensed under the Apache License, Version 2.0 (the "License");
7# you may not use this file except in compliance with the License.
8# You may obtain a copy of the License at
9#
10# http://www.apache.org/licenses/LICENSE-2.0
11#
12# Unless required by applicable law or agreed to in writing, software
13# distributed under the License is distributed on an "AS IS" BASIS,
14# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15# See the License for the specific language governing permissions and
16# limitations under the License.
17
18import os
19import six
20import subprocess
21import sys
22
23from charmhelpers.fetch import apt_install, apt_update
24from charmhelpers.core.hookenv import charm_dir, log
25
26__author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
27
28
29def pip_execute(*args, **kwargs):
30 """Overriden pip_execute() to stop sys.path being changed.
31
32 The act of importing main from the pip module seems to add wheels
33 from /usr/share/python-wheels (installed by various tools) to sys.path.
34 This function ensures that sys.path remains the same after the call is
35 executed.
36 """
37 try:
38 _path = sys.path
39 try:
40 from pip import main as _pip_execute
41 except ImportError:
42 apt_update()
43 if six.PY2:
44 apt_install('python-pip')
45 else:
46 apt_install('python3-pip')
47 from pip import main as _pip_execute
48 _pip_execute(*args, **kwargs)
49 finally:
50 sys.path = _path
51
52
53def parse_options(given, available):
54 """Given a set of options, check if available"""
55 for key, value in sorted(given.items()):
56 if not value:
57 continue
58 if key in available:
59 yield "--{0}={1}".format(key, value)
60
61
62def pip_install_requirements(requirements, constraints=None, **options):
63 """Install a requirements file.
64
65 :param constraints: Path to pip constraints file.
66 http://pip.readthedocs.org/en/stable/user_guide/#constraints-files
67 """
68 command = ["install"]
69
70 available_options = ('proxy', 'src', 'log', )
71 for option in parse_options(options, available_options):
72 command.append(option)
73
74 command.append("-r {0}".format(requirements))
75 if constraints:
76 command.append("-c {0}".format(constraints))
77 log("Installing from file: {} with constraints {} "
78 "and options: {}".format(requirements, constraints, command))
79 else:
80 log("Installing from file: {} with options: {}".format(requirements,
81 command))
82 pip_execute(command)
83
84
85def pip_install(package, fatal=False, upgrade=False, venv=None,
86 constraints=None, **options):
87 """Install a python package"""
88 if venv:
89 venv_python = os.path.join(venv, 'bin/pip')
90 command = [venv_python, "install"]
91 else:
92 command = ["install"]
93
94 available_options = ('proxy', 'src', 'log', 'index-url', )
95 for option in parse_options(options, available_options):
96 command.append(option)
97
98 if upgrade:
99 command.append('--upgrade')
100
101 if constraints:
102 command.extend(['-c', constraints])
103
104 if isinstance(package, list):
105 command.extend(package)
106 else:
107 command.append(package)
108
109 log("Installing {} package with options: {}".format(package,
110 command))
111 if venv:
112 subprocess.check_call(command)
113 else:
114 pip_execute(command)
115
116
117def pip_uninstall(package, **options):
118 """Uninstall a python package"""
119 command = ["uninstall", "-q", "-y"]
120
121 available_options = ('proxy', 'log', )
122 for option in parse_options(options, available_options):
123 command.append(option)
124
125 if isinstance(package, list):
126 command.extend(package)
127 else:
128 command.append(package)
129
130 log("Uninstalling {} package with options: {}".format(package,
131 command))
132 pip_execute(command)
133
134
135def pip_list():
136 """Returns the list of current python installed packages
137 """
138 return pip_execute(["list"])
139
140
141def pip_create_virtualenv(path=None):
142 """Create an isolated Python environment."""
143 if six.PY2:
144 apt_install('python-virtualenv')
145 else:
146 apt_install('python3-virtualenv')
147
148 if path:
149 venv_path = path
150 else:
151 venv_path = os.path.join(charm_dir(), 'venv')
152
153 if not os.path.exists(venv_path):
154 subprocess.check_call(['virtualenv', venv_path])
0155
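
The pip helpers above wrap "pip install" while keeping sys.path intact, or use the virtualenv's own pip when venv= is given. A minimal sketch; the venv path and proxy value are made up:

    from charmhelpers.fetch.python.packages import (
        pip_create_virtualenv,
        pip_install,
    )

    venv = '/srv/example/venv'
    pip_create_virtualenv(venv)          # installs python{,3}-virtualenv first if needed
    pip_install(['requests'], venv=venv, upgrade=True,
                proxy='http://squid.internal:3128')   # becomes --proxy=...
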
=== added file 'charmhelpers/fetch/python/rpdb.py'
--- charmhelpers/fetch/python/rpdb.py 1970-01-01 00:00:00 +0000
+++ charmhelpers/fetch/python/rpdb.py 2019-05-24 12:43:31 +0000
@@ -0,0 +1,56 @@
1# Copyright 2014-2015 Canonical Limited.
2#
3# Licensed under the Apache License, Version 2.0 (the "License");
4# you may not use this file except in compliance with the License.
5# You may obtain a copy of the License at
6#
7# http://www.apache.org/licenses/LICENSE-2.0
8#
9# Unless required by applicable law or agreed to in writing, software
10# distributed under the License is distributed on an "AS IS" BASIS,
11# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12# See the License for the specific language governing permissions and
13# limitations under the License.
14
15"""Remote Python Debugger (pdb wrapper)."""
16
17import pdb
18import socket
19import sys
20
21__author__ = "Bertrand Janin <b@janin.com>"
22__version__ = "0.1.3"
23
24
25class Rpdb(pdb.Pdb):
26
27 def __init__(self, addr="127.0.0.1", port=4444):
28 """Initialize the socket and initialize pdb."""
29
30 # Backup stdin and stdout before replacing them by the socket handle
31 self.old_stdout = sys.stdout
32 self.old_stdin = sys.stdin
33
34 # Open a 'reusable' socket to let the webapp reload on the same port
35 self.skt = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
36 self.skt.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, True)
37 self.skt.bind((addr, port))
38 self.skt.listen(1)
39 (clientsocket, address) = self.skt.accept()
40 handle = clientsocket.makefile('rw')
41 pdb.Pdb.__init__(self, completekey='tab', stdin=handle, stdout=handle)
42 sys.stdout = sys.stdin = handle
43
44 def shutdown(self):
45 """Revert stdin and stdout, close the socket."""
46 sys.stdout = self.old_stdout
47 sys.stdin = self.old_stdin
48 self.skt.close()
49 self.set_continue()
50
51 def do_continue(self, arg):
52 """Stop all operation on ``continue``."""
53 self.shutdown()
54 return 1
55
56 do_EOF = do_quit = do_exit = do_c = do_cont = do_continue
057
=== added file 'charmhelpers/fetch/python/version.py'
--- charmhelpers/fetch/python/version.py 1970-01-01 00:00:00 +0000
+++ charmhelpers/fetch/python/version.py 2019-05-24 12:43:31 +0000
@@ -0,0 +1,32 @@
1#!/usr/bin/env python
2# coding: utf-8
3
4# Copyright 2014-2015 Canonical Limited.
5#
6# Licensed under the Apache License, Version 2.0 (the "License");
7# you may not use this file except in compliance with the License.
8# You may obtain a copy of the License at
9#
10# http://www.apache.org/licenses/LICENSE-2.0
11#
12# Unless required by applicable law or agreed to in writing, software
13# distributed under the License is distributed on an "AS IS" BASIS,
14# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15# See the License for the specific language governing permissions and
16# limitations under the License.
17
18import sys
19
20__author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
21
22
23def current_version():
24 """Current system python version"""
25 return sys.version_info
26
27
28def current_version_string():
29 """Current system python version as string major.minor.micro"""
30 return "{0}.{1}.{2}".format(sys.version_info.major,
31 sys.version_info.minor,
32 sys.version_info.micro)
033
=== modified file 'charmhelpers/fetch/snap.py'
--- charmhelpers/fetch/snap.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/fetch/snap.py 2019-05-24 12:43:31 +0000
@@ -18,21 +18,33 @@
18https://lists.ubuntu.com/archives/snapcraft/2016-September/001114.html18https://lists.ubuntu.com/archives/snapcraft/2016-September/001114.html
19"""19"""
20import subprocess20import subprocess
21from os import environ21import os
22from time import sleep22from time import sleep
23from charmhelpers.core.hookenv import log23from charmhelpers.core.hookenv import log
2424
25__author__ = 'Joseph Borg <joseph.borg@canonical.com>'25__author__ = 'Joseph Borg <joseph.borg@canonical.com>'
2626
27SNAP_NO_LOCK = 1 # The return code for "couldn't acquire lock" in Snap (hopefully this will be improved).27# The return code for "couldn't acquire lock" in Snap
28# (hopefully this will be improved).
29SNAP_NO_LOCK = 1
28SNAP_NO_LOCK_RETRY_DELAY = 10 # Wait X seconds between Snap lock checks.30SNAP_NO_LOCK_RETRY_DELAY = 10 # Wait X seconds between Snap lock checks.
29SNAP_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times.31SNAP_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times.
32SNAP_CHANNELS = [
33 'edge',
34 'beta',
35 'candidate',
36 'stable',
37]
3038
3139
32class CouldNotAcquireLockException(Exception):40class CouldNotAcquireLockException(Exception):
33 pass41 pass
3442
3543
44class InvalidSnapChannel(Exception):
45 pass
46
47
36def _snap_exec(commands):48def _snap_exec(commands):
37 """49 """
38 Execute snap commands.50 Execute snap commands.
@@ -47,13 +59,17 @@
4759
48 while return_code is None or return_code == SNAP_NO_LOCK:60 while return_code is None or return_code == SNAP_NO_LOCK:
49 try:61 try:
50 return_code = subprocess.check_call(['snap'] + commands, env=environ)62 return_code = subprocess.check_call(['snap'] + commands,
63 env=os.environ)
51 except subprocess.CalledProcessError as e:64 except subprocess.CalledProcessError as e:
52 retry_count += + 165 retry_count += + 1
53 if retry_count > SNAP_NO_LOCK_RETRY_COUNT:66 if retry_count > SNAP_NO_LOCK_RETRY_COUNT:
54 raise CouldNotAcquireLockException('Could not aquire lock after %s attempts' % SNAP_NO_LOCK_RETRY_COUNT)67 raise CouldNotAcquireLockException(
68 'Could not acquire lock after {} attempts'
69 .format(SNAP_NO_LOCK_RETRY_COUNT))
55 return_code = e.returncode70 return_code = e.returncode
56 log('Snap failed to acquire lock, trying again in %s seconds.' % SNAP_NO_LOCK_RETRY_DELAY, level='WARN')71 log('Snap failed to acquire lock, trying again in {} seconds.'
72 .format(SNAP_NO_LOCK_RETRY_DELAY), level='WARN')
57 sleep(SNAP_NO_LOCK_RETRY_DELAY)73 sleep(SNAP_NO_LOCK_RETRY_DELAY)
5874
59 return return_code75 return return_code
@@ -120,3 +136,15 @@
120136
121 log(message, level='INFO')137 log(message, level='INFO')
122 return _snap_exec(['refresh'] + flags + packages)138 return _snap_exec(['refresh'] + flags + packages)
139
140
141def valid_snap_channel(channel):
142 """ Validate snap channel exists
143
144 :raises InvalidSnapChannel: When channel does not exist
145 :return: Boolean
146 """
147 if channel.lower() in SNAP_CHANNELS:
148 return True
149 else:
150 raise InvalidSnapChannel("Invalid Snap Channel: {}".format(channel))
123151
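
valid_snap_channel() gives charms a cheap way to validate a channel name taken from config before handing it to the snap helpers in this module:

    from charmhelpers.fetch.snap import InvalidSnapChannel, valid_snap_channel

    try:
        valid_snap_channel('Stable')      # case-insensitive, returns True
    except InvalidSnapChannel as err:
        print(err)                        # raised e.g. for a typo like 'stabel'
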
=== modified file 'charmhelpers/fetch/ubuntu.py'
--- charmhelpers/fetch/ubuntu.py 2017-03-03 21:03:14 +0000
+++ charmhelpers/fetch/ubuntu.py 2019-05-24 12:43:31 +0000
@@ -12,29 +12,48 @@
12# See the License for the specific language governing permissions and12# See the License for the specific language governing permissions and
13# limitations under the License.13# limitations under the License.
1414
15from collections import OrderedDict
15import os16import os
17import platform
18import re
16import six19import six
17import time20import time
18import subprocess21import subprocess
1922
20from tempfile import NamedTemporaryFile23from charmhelpers.core.host import get_distrib_codename
21from charmhelpers.core.host import (24
22 lsb_release25from charmhelpers.core.hookenv import (
26 log,
27 DEBUG,
28 WARNING,
29 env_proxy_settings,
23)30)
24from charmhelpers.core.hookenv import log31from charmhelpers.fetch import SourceConfigError, GPGKeyError
25from charmhelpers.fetch import SourceConfigError
2632
33PROPOSED_POCKET = (
34 "# Proposed\n"
35 "deb http://archive.ubuntu.com/ubuntu {}-proposed main universe "
36 "multiverse restricted\n")
37PROPOSED_PORTS_POCKET = (
38 "# Proposed\n"
39 "deb http://ports.ubuntu.com/ubuntu-ports {}-proposed main universe "
40 "multiverse restricted\n")
41# Only supports 64bit and ppc64 at the moment.
42ARCH_TO_PROPOSED_POCKET = {
43 'x86_64': PROPOSED_POCKET,
44 'ppc64le': PROPOSED_PORTS_POCKET,
45 'aarch64': PROPOSED_PORTS_POCKET,
46 's390x': PROPOSED_PORTS_POCKET,
47}
48CLOUD_ARCHIVE_URL = "http://ubuntu-cloud.archive.canonical.com/ubuntu"
49CLOUD_ARCHIVE_KEY_ID = '5EDB1B62EC4926EA'
27CLOUD_ARCHIVE = """# Ubuntu Cloud Archive50CLOUD_ARCHIVE = """# Ubuntu Cloud Archive
28deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main51deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
29"""52"""
30
31PROPOSED_POCKET = """# Proposed
32deb http://archive.ubuntu.com/ubuntu {}-proposed main universe multiverse restricted
33"""
34
35CLOUD_ARCHIVE_POCKETS = {53CLOUD_ARCHIVE_POCKETS = {
36 # Folsom54 # Folsom
37 'folsom': 'precise-updates/folsom',55 'folsom': 'precise-updates/folsom',
56 'folsom/updates': 'precise-updates/folsom',
38 'precise-folsom': 'precise-updates/folsom',57 'precise-folsom': 'precise-updates/folsom',
39 'precise-folsom/updates': 'precise-updates/folsom',58 'precise-folsom/updates': 'precise-updates/folsom',
40 'precise-updates/folsom': 'precise-updates/folsom',59 'precise-updates/folsom': 'precise-updates/folsom',
@@ -43,6 +62,7 @@
43 'precise-proposed/folsom': 'precise-proposed/folsom',62 'precise-proposed/folsom': 'precise-proposed/folsom',
44 # Grizzly63 # Grizzly
45 'grizzly': 'precise-updates/grizzly',64 'grizzly': 'precise-updates/grizzly',
65 'grizzly/updates': 'precise-updates/grizzly',
46 'precise-grizzly': 'precise-updates/grizzly',66 'precise-grizzly': 'precise-updates/grizzly',
47 'precise-grizzly/updates': 'precise-updates/grizzly',67 'precise-grizzly/updates': 'precise-updates/grizzly',
48 'precise-updates/grizzly': 'precise-updates/grizzly',68 'precise-updates/grizzly': 'precise-updates/grizzly',
@@ -51,6 +71,7 @@
51 'precise-proposed/grizzly': 'precise-proposed/grizzly',71 'precise-proposed/grizzly': 'precise-proposed/grizzly',
52 # Havana72 # Havana
53 'havana': 'precise-updates/havana',73 'havana': 'precise-updates/havana',
74 'havana/updates': 'precise-updates/havana',
54 'precise-havana': 'precise-updates/havana',75 'precise-havana': 'precise-updates/havana',
55 'precise-havana/updates': 'precise-updates/havana',76 'precise-havana/updates': 'precise-updates/havana',
56 'precise-updates/havana': 'precise-updates/havana',77 'precise-updates/havana': 'precise-updates/havana',
@@ -59,6 +80,7 @@
59 'precise-proposed/havana': 'precise-proposed/havana',80 'precise-proposed/havana': 'precise-proposed/havana',
60 # Icehouse81 # Icehouse
61 'icehouse': 'precise-updates/icehouse',82 'icehouse': 'precise-updates/icehouse',
83 'icehouse/updates': 'precise-updates/icehouse',
62 'precise-icehouse': 'precise-updates/icehouse',84 'precise-icehouse': 'precise-updates/icehouse',
63 'precise-icehouse/updates': 'precise-updates/icehouse',85 'precise-icehouse/updates': 'precise-updates/icehouse',
64 'precise-updates/icehouse': 'precise-updates/icehouse',86 'precise-updates/icehouse': 'precise-updates/icehouse',
@@ -67,6 +89,7 @@
67 'precise-proposed/icehouse': 'precise-proposed/icehouse',89 'precise-proposed/icehouse': 'precise-proposed/icehouse',
68 # Juno90 # Juno
69 'juno': 'trusty-updates/juno',91 'juno': 'trusty-updates/juno',
92 'juno/updates': 'trusty-updates/juno',
70 'trusty-juno': 'trusty-updates/juno',93 'trusty-juno': 'trusty-updates/juno',
71 'trusty-juno/updates': 'trusty-updates/juno',94 'trusty-juno/updates': 'trusty-updates/juno',
72 'trusty-updates/juno': 'trusty-updates/juno',95 'trusty-updates/juno': 'trusty-updates/juno',
@@ -75,6 +98,7 @@
75 'trusty-proposed/juno': 'trusty-proposed/juno',98 'trusty-proposed/juno': 'trusty-proposed/juno',
76 # Kilo99 # Kilo
77 'kilo': 'trusty-updates/kilo',100 'kilo': 'trusty-updates/kilo',
101 'kilo/updates': 'trusty-updates/kilo',
78 'trusty-kilo': 'trusty-updates/kilo',102 'trusty-kilo': 'trusty-updates/kilo',
79 'trusty-kilo/updates': 'trusty-updates/kilo',103 'trusty-kilo/updates': 'trusty-updates/kilo',
80 'trusty-updates/kilo': 'trusty-updates/kilo',104 'trusty-updates/kilo': 'trusty-updates/kilo',
@@ -83,6 +107,7 @@
83 'trusty-proposed/kilo': 'trusty-proposed/kilo',107 'trusty-proposed/kilo': 'trusty-proposed/kilo',
84 # Liberty108 # Liberty
85 'liberty': 'trusty-updates/liberty',109 'liberty': 'trusty-updates/liberty',
110 'liberty/updates': 'trusty-updates/liberty',
86 'trusty-liberty': 'trusty-updates/liberty',111 'trusty-liberty': 'trusty-updates/liberty',
87 'trusty-liberty/updates': 'trusty-updates/liberty',112 'trusty-liberty/updates': 'trusty-updates/liberty',
88 'trusty-updates/liberty': 'trusty-updates/liberty',113 'trusty-updates/liberty': 'trusty-updates/liberty',
@@ -91,6 +116,7 @@
91 'trusty-proposed/liberty': 'trusty-proposed/liberty',116 'trusty-proposed/liberty': 'trusty-proposed/liberty',
92 # Mitaka117 # Mitaka
93 'mitaka': 'trusty-updates/mitaka',118 'mitaka': 'trusty-updates/mitaka',
119 'mitaka/updates': 'trusty-updates/mitaka',
94 'trusty-mitaka': 'trusty-updates/mitaka',120 'trusty-mitaka': 'trusty-updates/mitaka',
95 'trusty-mitaka/updates': 'trusty-updates/mitaka',121 'trusty-mitaka/updates': 'trusty-updates/mitaka',
96 'trusty-updates/mitaka': 'trusty-updates/mitaka',122 'trusty-updates/mitaka': 'trusty-updates/mitaka',
@@ -99,6 +125,7 @@
99 'trusty-proposed/mitaka': 'trusty-proposed/mitaka',125 'trusty-proposed/mitaka': 'trusty-proposed/mitaka',
100 # Newton126 # Newton
101 'newton': 'xenial-updates/newton',127 'newton': 'xenial-updates/newton',
128 'newton/updates': 'xenial-updates/newton',
102 'xenial-newton': 'xenial-updates/newton',129 'xenial-newton': 'xenial-updates/newton',
103 'xenial-newton/updates': 'xenial-updates/newton',130 'xenial-newton/updates': 'xenial-updates/newton',
104 'xenial-updates/newton': 'xenial-updates/newton',131 'xenial-updates/newton': 'xenial-updates/newton',
@@ -107,17 +134,51 @@
107 'xenial-proposed/newton': 'xenial-proposed/newton',134 'xenial-proposed/newton': 'xenial-proposed/newton',
108 # Ocata135 # Ocata
109 'ocata': 'xenial-updates/ocata',136 'ocata': 'xenial-updates/ocata',
137 'ocata/updates': 'xenial-updates/ocata',
110 'xenial-ocata': 'xenial-updates/ocata',138 'xenial-ocata': 'xenial-updates/ocata',
111 'xenial-ocata/updates': 'xenial-updates/ocata',139 'xenial-ocata/updates': 'xenial-updates/ocata',
112 'xenial-updates/ocata': 'xenial-updates/ocata',140 'xenial-updates/ocata': 'xenial-updates/ocata',
113 'ocata/proposed': 'xenial-proposed/ocata',141 'ocata/proposed': 'xenial-proposed/ocata',
114 'xenial-ocata/proposed': 'xenial-proposed/ocata',142 'xenial-ocata/proposed': 'xenial-proposed/ocata',
115 'xenial-ocata/newton': 'xenial-proposed/ocata',143 'xenial-proposed/ocata': 'xenial-proposed/ocata',
144 # Pike
145 'pike': 'xenial-updates/pike',
146 'xenial-pike': 'xenial-updates/pike',
147 'xenial-pike/updates': 'xenial-updates/pike',
148 'xenial-updates/pike': 'xenial-updates/pike',
149 'pike/proposed': 'xenial-proposed/pike',
150 'xenial-pike/proposed': 'xenial-proposed/pike',
151 'xenial-proposed/pike': 'xenial-proposed/pike',
152 # Queens
153 'queens': 'xenial-updates/queens',
154 'xenial-queens': 'xenial-updates/queens',
155 'xenial-queens/updates': 'xenial-updates/queens',
156 'xenial-updates/queens': 'xenial-updates/queens',
157 'queens/proposed': 'xenial-proposed/queens',
158 'xenial-queens/proposed': 'xenial-proposed/queens',
159 'xenial-proposed/queens': 'xenial-proposed/queens',
160 # Rocky
161 'rocky': 'bionic-updates/rocky',
162 'bionic-rocky': 'bionic-updates/rocky',
163 'bionic-rocky/updates': 'bionic-updates/rocky',
164 'bionic-updates/rocky': 'bionic-updates/rocky',
165 'rocky/proposed': 'bionic-proposed/rocky',
166 'bionic-rocky/proposed': 'bionic-proposed/rocky',
167 'bionic-proposed/rocky': 'bionic-proposed/rocky',
168 # Stein
169 'stein': 'bionic-updates/stein',
170 'bionic-stein': 'bionic-updates/stein',
171 'bionic-stein/updates': 'bionic-updates/stein',
172 'bionic-updates/stein': 'bionic-updates/stein',
173 'stein/proposed': 'bionic-proposed/stein',
174 'bionic-stein/proposed': 'bionic-proposed/stein',
175 'bionic-proposed/stein': 'bionic-proposed/stein',
116}176}
117177
178
118APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT.179APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT.
119CMD_RETRY_DELAY = 10 # Wait 10 seconds between command retries.180CMD_RETRY_DELAY = 10 # Wait 10 seconds between command retries.
120CMD_RETRY_COUNT = 30 # Retry a failing fatal command X times.181CMD_RETRY_COUNT = 3 # Retry a failing fatal command X times.
121182
122183
123def filter_installed_packages(packages):184def filter_installed_packages(packages):
@@ -135,6 +196,18 @@
135 return _pkgs196 return _pkgs
136197
137198
199def filter_missing_packages(packages):
200 """Return a list of packages that are installed.
201
202 :param packages: list of packages to evaluate.
203 :returns list: Packages that are installed.
204 """
205 return list(
206 set(packages) -
207 set(filter_installed_packages(packages))
208 )
209
210
138def apt_cache(in_memory=True, progress=None):211def apt_cache(in_memory=True, progress=None):
139 """Build and return an apt cache."""212 """Build and return an apt cache."""
140 from apt import apt_pkg213 from apt import apt_pkg
@@ -145,7 +218,7 @@
145 return apt_pkg.Cache(progress)218 return apt_pkg.Cache(progress)
146219
147220
148def install(packages, options=None, fatal=False):221def apt_install(packages, options=None, fatal=False):
149 """Install one or more packages."""222 """Install one or more packages."""
150 if options is None:223 if options is None:
151 options = ['--option=Dpkg::Options::=--force-confold']224 options = ['--option=Dpkg::Options::=--force-confold']
@@ -162,7 +235,7 @@
162 _run_apt_command(cmd, fatal)235 _run_apt_command(cmd, fatal)
163236
164237
165def upgrade(options=None, fatal=False, dist=False):238def apt_upgrade(options=None, fatal=False, dist=False):
166 """Upgrade all packages."""239 """Upgrade all packages."""
167 if options is None:240 if options is None:
168 options = ['--option=Dpkg::Options::=--force-confold']241 options = ['--option=Dpkg::Options::=--force-confold']
@@ -177,13 +250,13 @@
177 _run_apt_command(cmd, fatal)250 _run_apt_command(cmd, fatal)
178251
179252
180def update(fatal=False):253def apt_update(fatal=False):
181 """Update local apt cache."""254 """Update local apt cache."""
182 cmd = ['apt-get', 'update']255 cmd = ['apt-get', 'update']
183 _run_apt_command(cmd, fatal)256 _run_apt_command(cmd, fatal)
184257
185258
186def purge(packages, fatal=False):259def apt_purge(packages, fatal=False):
187 """Purge one or more packages."""260 """Purge one or more packages."""
188 cmd = ['apt-get', '--assume-yes', 'purge']261 cmd = ['apt-get', '--assume-yes', 'purge']
189 if isinstance(packages, six.string_types):262 if isinstance(packages, six.string_types):
@@ -194,6 +267,14 @@
194 _run_apt_command(cmd, fatal)267 _run_apt_command(cmd, fatal)
195268
196269
270def apt_autoremove(purge=True, fatal=False):
271 """Purge one or more packages."""
272 cmd = ['apt-get', '--assume-yes', 'autoremove']
273 if purge:
274 cmd.append('--purge')
275 _run_apt_command(cmd, fatal)
276
277
197def apt_mark(packages, mark, fatal=False):278def apt_mark(packages, mark, fatal=False):
198 """Flag one or more packages using apt-mark."""279 """Flag one or more packages using apt-mark."""
199 log("Marking {} as {}".format(packages, mark))280 log("Marking {} as {}".format(packages, mark))
@@ -217,7 +298,159 @@
217 return apt_mark(packages, 'unhold', fatal=fatal)298 return apt_mark(packages, 'unhold', fatal=fatal)
218299
219300
220def add_source(source, key=None):301def import_key(key):
302 """Import an ASCII Armor key.
303
304 A Radix64 format keyid is also supported for backwards
305 compatibility. In this case Ubuntu keyserver will be
306 queried for a key via HTTPS by its keyid. This method
307 is less preferable because HTTPS proxy servers may
308 require traffic decryption which is equivalent to a
309 man-in-the-middle attack (a proxy server impersonates
310 keyserver TLS certificates and has to be explicitly
311 trusted by the system).
312
313 :param key: A GPG key in ASCII armor format,
314 including BEGIN and END markers or a keyid.
315 :type key: (bytes, str)
316 :raises: GPGKeyError if the key could not be imported
317 """
318 key = key.strip()
319 if '-' in key or '\n' in key:
320 # Send everything not obviously a keyid to GPG to import, as
321 # we trust its validation better than our own. eg. handling
322 # comments before the key.
323 log("PGP key found (looks like ASCII Armor format)", level=DEBUG)
324 if ('-----BEGIN PGP PUBLIC KEY BLOCK-----' in key and
325 '-----END PGP PUBLIC KEY BLOCK-----' in key):
326 log("Writing provided PGP key in the binary format", level=DEBUG)
327 if six.PY3:
328 key_bytes = key.encode('utf-8')
329 else:
330 key_bytes = key
331 key_name = _get_keyid_by_gpg_key(key_bytes)
332 key_gpg = _dearmor_gpg_key(key_bytes)
333 _write_apt_gpg_keyfile(key_name=key_name, key_material=key_gpg)
334 else:
335 raise GPGKeyError("ASCII armor markers missing from GPG key")
336 else:
337 log("PGP key found (looks like Radix64 format)", level=WARNING)
338 log("SECURELY importing PGP key from keyserver; "
339 "full key not provided.", level=WARNING)
340 # as of bionic add-apt-repository uses curl with an HTTPS keyserver URL
341 # to retrieve GPG keys. `apt-key adv` command is deprecated as is
342 # apt-key in general as noted in its manpage. See lp:1433761 for more
343 # history. Instead, /etc/apt/trusted.gpg.d is used directly to drop
344 # gpg
345 key_asc = _get_key_by_keyid(key)
346 # write the key in GPG format so that apt-key list shows it
347 key_gpg = _dearmor_gpg_key(key_asc)
348 _write_apt_gpg_keyfile(key_name=key, key_material=key_gpg)
349
350
351def _get_keyid_by_gpg_key(key_material):
352 """Get a GPG key fingerprint by GPG key material.
353 Gets a GPG key fingerprint (40-digit, 160-bit) by the ASCII armor-encoded
354 or binary GPG key material. Can be used, for example, to generate file
355 names for keys passed via charm options.
356
357 :param key_material: ASCII armor-encoded or binary GPG key material
358 :type key_material: bytes
359 :raises: GPGKeyError if invalid key material has been provided
360 :returns: A GPG key fingerprint
361 :rtype: str
362 """
363 # Use the same gpg command for both Xenial and Bionic
364 cmd = 'gpg --with-colons --with-fingerprint'
365 ps = subprocess.Popen(cmd.split(),
366 stdout=subprocess.PIPE,
367 stderr=subprocess.PIPE,
368 stdin=subprocess.PIPE)
369 out, err = ps.communicate(input=key_material)
370 if six.PY3:
371 out = out.decode('utf-8')
372 err = err.decode('utf-8')
373 if 'gpg: no valid OpenPGP data found.' in err:
374 raise GPGKeyError('Invalid GPG key material provided')
375 # from gnupg2 docs: fpr :: Fingerprint (fingerprint is in field 10)
376 return re.search(r"^fpr:{9}([0-9A-F]{40}):$", out, re.MULTILINE).group(1)
377
378
379def _get_key_by_keyid(keyid):
380 """Get a key via HTTPS from the Ubuntu keyserver.
381 Different key ID formats are supported by SKS keyservers (the longer ones
382 are more secure, see "dead beef attack" and https://evil32.com/). Since
383 HTTPS is used, if SSLBump-like HTTPS proxies are in place, they will
384 impersonate keyserver.ubuntu.com and generate a certificate with
385 keyserver.ubuntu.com in the CN field or in SubjAltName fields of a
386 certificate. If such proxy behavior is expected it is necessary to add the
387 CA certificate chain containing the intermediate CA of the SSLBump proxy to
388 every machine that this code runs on via ca-certs cloud-init directive (via
389 cloudinit-userdata model-config) or via other means (such as through a
390 custom charm option). Also note that DNS resolution for the hostname in a
391 URL is done at a proxy server - not at the client side.
392
393 8-digit (32 bit) key ID
394 https://keyserver.ubuntu.com/pks/lookup?search=0x4652B4E6
395 16-digit (64 bit) key ID
396 https://keyserver.ubuntu.com/pks/lookup?search=0x6E85A86E4652B4E6
397 40-digit key ID:
398 https://keyserver.ubuntu.com/pks/lookup?search=0x35F77D63B5CEC106C577ED856E85A86E4652B4E6
399
400 :param keyid: An 8, 16 or 40 hex digit keyid to find a key for
401 :type keyid: (bytes, str)
402 :returns: A key material for the specified GPG key id
403 :rtype: (str, bytes)
404 :raises: subprocess.CalledProcessError
405 """
406 # options=mr - machine-readable output (disables html wrappers)
407 keyserver_url = ('https://keyserver.ubuntu.com'
408 '/pks/lookup?op=get&options=mr&exact=on&search=0x{}')
409 curl_cmd = ['curl', keyserver_url.format(keyid)]
410 # use proxy server settings in order to retrieve the key
411 return subprocess.check_output(curl_cmd,
412 env=env_proxy_settings(['https']))
413
414
415def _dearmor_gpg_key(key_asc):
416 """Converts a GPG key in the ASCII armor format to the binary format.
417
418 :param key_asc: A GPG key in ASCII armor format.
419 :type key_asc: (str, bytes)
420 :returns: A GPG key in binary format
421 :rtype: (str, bytes)
422 :raises: GPGKeyError
423 """
424 ps = subprocess.Popen(['gpg', '--dearmor'],
425 stdout=subprocess.PIPE,
426 stderr=subprocess.PIPE,
427 stdin=subprocess.PIPE)
428 out, err = ps.communicate(input=key_asc)
429 # no need to decode output as it is binary (invalid utf-8), only error
430 if six.PY3:
431 err = err.decode('utf-8')
432 if 'gpg: no valid OpenPGP data found.' in err:
433 raise GPGKeyError('Invalid GPG key material. Check your network setup'
434 ' (MTU, routing, DNS) and/or proxy server settings'
435 ' as well as destination keyserver status.')
436 else:
437 return out
438
439
440def _write_apt_gpg_keyfile(key_name, key_material):
441 """Writes GPG key material into a file at a provided path.
442
443 :param key_name: A key name to use for a key file (could be a fingerprint)
444 :type key_name: str
445 :param key_material: A GPG key material (binary)
446 :type key_material: (str, bytes)
447 """
448 with open('/etc/apt/trusted.gpg.d/{}.gpg'.format(key_name),
449 'wb') as keyf:
450 keyf.write(key_material)
451
452
453def add_source(source, key=None, fail_invalid=False):
221 """Add a package source to this system.454 """Add a package source to this system.
222455
223 @param source: a URL or sources.list entry, as supported by456 @param source: a URL or sources.list entry, as supported by
@@ -233,6 +466,33 @@
233 such as 'cloud:icehouse'466 such as 'cloud:icehouse'
234 'distro' may be used as a noop467 'distro' may be used as a noop
235468
469 Full list of source specifications supported by the function are:
470
471 'distro': A NOP; i.e. it has no effect.
472 'proposed': the proposed deb spec [2] is written to
473 /etc/apt/sources.list.d/proposed.list
474 'distro-proposed': adds <version>-proposed to the debs [2]
475 'ppa:<ppa-name>': add-apt-repository --yes <ppa_name>
476 'deb <deb-spec>': add-apt-repository --yes deb <deb-spec>
477 'http://....': add-apt-repository --yes http://...
478 'cloud-archive:<spec>': add-apt-repository --yes cloud-archive:<spec>
479 'cloud:<release>[-staging]': specify a Cloud Archive pocket <release> with
480 optional staging version. If staging is used then the staging PPA [2]
481 will be used. If staging is NOT used then the cloud archive [3] will be
482 added, and the 'ubuntu-cloud-keyring' package will be added for the
483 current distro.
484
485 Otherwise the source is not recognised and this is logged to the juju log.
486 However, no error is raised, unless fail_invalid is True.
487
488 [1] deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
489 where {} is replaced with the derived pocket name.
490 [2] deb http://archive.ubuntu.com/ubuntu {}-proposed \
491 main universe multiverse restricted
492 where {} is replaced with the lsb_release codename (e.g. xenial)
493 [3] deb http://ubuntu-cloud.archive.canonical.com/ubuntu <pocket>
494 to /etc/apt/sources.list.d/cloud-archive-list
495
236 @param key: A key to be added to the system's APT keyring and used496 @param key: A key to be added to the system's APT keyring and used
237 to verify the signatures on packages. Ideally, this should be an497 to verify the signatures on packages. Ideally, this should be an
238 ASCII format GPG public key including the block headers. A GPG key498 ASCII format GPG public key including the block headers. A GPG key
@@ -240,51 +500,152 @@
240 available to retrieve the actual public key from a public keyserver500 available to retrieve the actual public key from a public keyserver
241 placing your Juju environment at risk. ppa and cloud archive keys501 placing your Juju environment at risk. ppa and cloud archive keys
242 are securely added automtically, so sould not be provided.502 are securely added automtically, so sould not be provided.
503
504 @param fail_invalid: (boolean) if True, then the function raises a
505 SourceConfigError if there is no matching installation source.
506
507 @raises SourceConfigError() if for cloud:<pocket>, the <pocket> is not a
508 valid pocket in CLOUD_ARCHIVE_POCKETS
243 """509 """
510 _mapping = OrderedDict([
511 (r"^distro$", lambda: None), # This is a NOP
512 (r"^(?:proposed|distro-proposed)$", _add_proposed),
513 (r"^cloud-archive:(.*)$", _add_apt_repository),
514 (r"^((?:deb |http:|https:|ppa:).*)$", _add_apt_repository),
515 (r"^cloud:(.*)-(.*)\/staging$", _add_cloud_staging),
516 (r"^cloud:(.*)-(.*)$", _add_cloud_distro_check),
517 (r"^cloud:(.*)$", _add_cloud_pocket),
518 (r"^snap:.*-(.*)-(.*)$", _add_cloud_distro_check),
519 ])
244 if source is None:520 if source is None:
245 log('Source is not present. Skipping')521 source = ''
246 return522 for r, fn in six.iteritems(_mapping):
247523 m = re.match(r, source)
248 if (source.startswith('ppa:') or524 if m:
249 source.startswith('http') or525 if key:
250 source.startswith('deb ') or526 # Import key before adding the source which depends on it,
251 source.startswith('cloud-archive:')):527 # as refreshing packages could fail otherwise.
252 cmd = ['add-apt-repository', '--yes', source]528 try:
253 _run_with_retries(cmd)529 import_key(key)
254 elif source.startswith('cloud:'):530 except GPGKeyError as e:
255 install(filter_installed_packages(['ubuntu-cloud-keyring']),531 raise SourceConfigError(str(e))
532 # call the associated function with the captured groups
533 # raises SourceConfigError on error.
534 fn(*m.groups())
535 break
536 else:
537 # nothing matched. log an error and maybe sys.exit
538 err = "Unknown source: {!r}".format(source)
539 log(err)
540 if fail_invalid:
541 raise SourceConfigError(err)
542
543
544def _add_proposed():
545 """Add the PROPOSED_POCKET as /etc/apt/source.list.d/proposed.list
546
547 Uses get_distrib_codename to determine the correct stanza for
548 the deb line.
549
550 For Intel architectures PROPOSED_POCKET is used for the release, but for
551 other architectures PROPOSED_PORTS_POCKET is used for the release.
552 """
553 release = get_distrib_codename()
554 arch = platform.machine()
555 if arch not in six.iterkeys(ARCH_TO_PROPOSED_POCKET):
556 raise SourceConfigError("Arch {} not supported for (distro-)proposed"
557 .format(arch))
558 with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
559 apt.write(ARCH_TO_PROPOSED_POCKET[arch].format(release))
560
561
562def _add_apt_repository(spec):
563 """Add the spec using add_apt_repository
564
565 :param spec: the parameter to pass to add_apt_repository
566 :type spec: str
567 """
568 if '{series}' in spec:
569 series = get_distrib_codename()
570 spec = spec.replace('{series}', series)
571 # software-properties package for bionic properly reacts to proxy settings
572 # passed as environment variables (See lp:1433761). This is not the case
573 for LTS and non-LTS releases below bionic.
574 _run_with_retries(['add-apt-repository', '--yes', spec],
575 cmd_env=env_proxy_settings(['https']))
576
577
578def _add_cloud_pocket(pocket):
579 """Add a cloud pocket as /etc/apt/sources.d/cloud-archive.list
580
581 Note that this overwrites the existing file if there is one.
582
583 This function also converts the simple pocket into the actual pocket using
584 the CLOUD_ARCHIVE_POCKETS mapping.
585
586 :param pocket: string representing the pocket to add a deb spec for.
587 :raises: SourceConfigError if the cloud pocket doesn't exist or the
588 requested release doesn't match the current distro version.
589 """
590 apt_install(filter_installed_packages(['ubuntu-cloud-keyring']),
256 fatal=True)591 fatal=True)
257 pocket = source.split(':')[-1]592 if pocket not in CLOUD_ARCHIVE_POCKETS:
258 if pocket not in CLOUD_ARCHIVE_POCKETS:593 raise SourceConfigError(
259 raise SourceConfigError(594 'Unsupported cloud: source option %s' %
260 'Unsupported cloud: source option %s' %595 pocket)
261 pocket)596 actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
262 actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]597 with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
263 with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:598 apt.write(CLOUD_ARCHIVE.format(actual_pocket))
264 apt.write(CLOUD_ARCHIVE.format(actual_pocket))599
265 elif source == 'proposed':600
266 release = lsb_release()['DISTRIB_CODENAME']601def _add_cloud_staging(cloud_archive_release, openstack_release):
267 with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:602 """Add the cloud staging repository which is in
268 apt.write(PROPOSED_POCKET.format(release))603 ppa:ubuntu-cloud-archive/<openstack_release>-staging
269 elif source == 'distro':604
270 pass605 This function checks that the cloud_archive_release matches the current
271 else:606 codename for the distro that charm is being installed on.
272 log("Unknown source: {!r}".format(source))607
273608 :param cloud_archive_release: string, codename for the release.
274 if key:609 :param openstack_release: String, codename for the openstack release.
275 if '-----BEGIN PGP PUBLIC KEY BLOCK-----' in key:610 :raises: SourceConfigError if the cloud_archive_release doesn't match the
276 with NamedTemporaryFile('w+') as key_file:611 current version of the os.
277 key_file.write(key)612 """
278 key_file.flush()613 _verify_is_ubuntu_rel(cloud_archive_release, openstack_release)
279 key_file.seek(0)614 ppa = 'ppa:ubuntu-cloud-archive/{}-staging'.format(openstack_release)
280 subprocess.check_call(['apt-key', 'add', '-'], stdin=key_file)615 cmd = 'add-apt-repository -y {}'.format(ppa)
281 else:616 _run_with_retries(cmd.split(' '))
282 # Note that hkp: is in no way a secure protocol. Using a617
283 # GPG key id is pointless from a security POV unless you618
284 # absolutely trust your network and DNS.619def _add_cloud_distro_check(cloud_archive_release, openstack_release):
285 subprocess.check_call(['apt-key', 'adv', '--keyserver',620 """Add the cloud pocket, but also check the cloud_archive_release against
286 'hkp://keyserver.ubuntu.com:80', '--recv',621 the current distro, and use the openstack_release as the full lookup.
287 key])622
623 This just calls _add_cloud_pocket() with the openstack_release as pocket
624 to get the correct cloud-archive.list for dpkg to work with.
625
626 :param cloud_archive_release: String, codename for the distro release.
627 :param openstack_release: String, spec for the release to look up in the
628 CLOUD_ARCHIVE_POCKETS
629 :raises: SourceConfigError if this is the wrong distro, or the pocket spec
630 doesn't exist.
631 """
632 _verify_is_ubuntu_rel(cloud_archive_release, openstack_release)
633 _add_cloud_pocket("{}-{}".format(cloud_archive_release, openstack_release))
634
635
636def _verify_is_ubuntu_rel(release, os_release):
637 """Verify that the release is in the same as the current ubuntu release.
638
639 :param release: String, lowercase for the release.
640 :param os_release: String, the os_release being asked for
641 :raises: SourceConfigError if the release is not the same as the ubuntu
642 release.
643 """
644 ubuntu_rel = get_distrib_codename()
645 if release != ubuntu_rel:
646 raise SourceConfigError(
647 'Invalid Cloud Archive release specified: {}-{} on this Ubuntu '
648 'version ({})'.format(release, os_release, ubuntu_rel))
288649
289650
290def _run_with_retries(cmd, max_retries=CMD_RETRY_COUNT, retry_exitcodes=(1,),651def _run_with_retries(cmd, max_retries=CMD_RETRY_COUNT, retry_exitcodes=(1,),
@@ -300,9 +661,12 @@
300 :param: cmd_env: dict: Environment variables to add to the command run.661 :param: cmd_env: dict: Environment variables to add to the command run.
301 """662 """
302663
303 env = os.environ.copy()664 env = None
665 kwargs = {}
304 if cmd_env:666 if cmd_env:
667 env = os.environ.copy()
305 env.update(cmd_env)668 env.update(cmd_env)
669 kwargs['env'] = env
306670
307 if not retry_message:671 if not retry_message:
308 retry_message = "Failed executing '{}'".format(" ".join(cmd))672 retry_message = "Failed executing '{}'".format(" ".join(cmd))
@@ -314,7 +678,8 @@
314 retry_results = (None,) + retry_exitcodes678 retry_results = (None,) + retry_exitcodes
315 while result in retry_results:679 while result in retry_results:
316 try:680 try:
317 result = subprocess.check_call(cmd, env=env)681 # result = subprocess.check_call(cmd, env=env)
682 result = subprocess.check_call(cmd, **kwargs)
318 except subprocess.CalledProcessError as e:683 except subprocess.CalledProcessError as e:
319 retry_count = retry_count + 1684 retry_count = retry_count + 1
320 if retry_count > max_retries:685 if retry_count > max_retries:
@@ -327,6 +692,7 @@
327def _run_apt_command(cmd, fatal=False):692def _run_apt_command(cmd, fatal=False):
328 """Run an apt command with optional retries.693 """Run an apt command with optional retries.
329694
695 :param: cmd: list: The apt command to run.
330 :param: fatal: bool: Whether the command's output should be checked and696 :param: fatal: bool: Whether the command's output should be checked and
331 retried.697 retried.
332 """698 """
@@ -353,7 +719,7 @@
353 cache = apt_cache()719 cache = apt_cache()
354 try:720 try:
355 pkg = cache[package]721 pkg = cache[package]
356 except:722 except Exception:
357 # the package is unknown to the current apt cache.723 # the package is unknown to the current apt cache.
358 return None724 return None
359725
360726
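
Putting the reworked add_source() together: a "deb ..." spec is routed through _add_apt_repository(), and, as the inline comment in add_source() notes, a supplied key is imported via import_key() into /etc/apt/trusted.gpg.d before the source is added, so the subsequent refresh does not fail on a missing key. A hedged sketch; the PPA URL and the key file name are placeholders:

    from charmhelpers.fetch import add_source, apt_update

    with open('my-ppa.asc') as f:      # hypothetical file holding the full
        key = f.read()                 # ASCII-armored public key block

    # '{series}' is expanded to the running release codename by
    # _add_apt_repository(); the key is dearmored and written to
    # /etc/apt/trusted.gpg.d/<fingerprint>.gpg by import_key().
    add_source('deb http://ppa.launchpad.net/my-team/my-ppa/ubuntu {series} main',
               key=key)
    apt_update(fatal=True)
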
=== modified file 'dev/charm_helpers_sync.py'
--- dev/charm_helpers_sync.py 2015-01-28 08:59:02 +0000
+++ dev/charm_helpers_sync.py 2019-05-24 12:43:31 +0000
@@ -2,19 +2,17 @@
22
3# Copyright 2014-2015 Canonical Limited.3# Copyright 2014-2015 Canonical Limited.
4#4#
5# This file is part of charm-helpers.5# Licensed under the Apache License, Version 2.0 (the "License");
6#6# you may not use this file except in compliance with the License.
7# charm-helpers is free software: you can redistribute it and/or modify7# You may obtain a copy of the License at
8# it under the terms of the GNU Lesser General Public License version 3 as8#
9# published by the Free Software Foundation.9# http://www.apache.org/licenses/LICENSE-2.0
10#10#
11# charm-helpers is distributed in the hope that it will be useful,11# Unless required by applicable law or agreed to in writing, software
12# but WITHOUT ANY WARRANTY; without even the implied warranty of12# distributed under the License is distributed on an "AS IS" BASIS,
13# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the13# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14# GNU Lesser General Public License for more details.14# See the License for the specific language governing permissions and
15#15# limitations under the License.
16# You should have received a copy of the GNU Lesser General Public License
17# along with charm-helpers. If not, see <http://www.gnu.org/licenses/>.
1816
19# Authors:17# Authors:
20# Adam Gandelman <adamg@ubuntu.com>18# Adam Gandelman <adamg@ubuntu.com>
@@ -31,7 +29,7 @@
3129
32import six30import six
3331
34CHARM_HELPERS_BRANCH = 'lp:charm-helpers'32CHARM_HELPERS_REPO = 'https://github.com/juju/charm-helpers'
3533
3634
37def parse_config(conf_file):35def parse_config(conf_file):
@@ -41,10 +39,16 @@
41 return yaml.load(open(conf_file).read())39 return yaml.load(open(conf_file).read())
4240
4341
44def clone_helpers(work_dir, branch):42def clone_helpers(work_dir, repo):
45 dest = os.path.join(work_dir, 'charm-helpers')43 dest = os.path.join(work_dir, 'charm-helpers')
46 logging.info('Checking out %s to %s.' % (branch, dest))44 logging.info('Cloning out %s to %s.' % (repo, dest))
47 cmd = ['bzr', 'checkout', '--lightweight', branch, dest]45 branch = None
46 if '@' in repo:
47 repo, branch = repo.split('@', 1)
48 cmd = ['git', 'clone', '--depth=1']
49 if branch is not None:
50 cmd += ['--branch', branch]
51 cmd += [repo, dest]
48 subprocess.check_call(cmd)52 subprocess.check_call(cmd)
49 return dest53 return dest
5054
@@ -176,6 +180,9 @@
176180
177181
178def sync_helpers(include, src, dest, options=None):182def sync_helpers(include, src, dest, options=None):
183 if os.path.exists(dest):
184 logging.debug('Removing existing directory: %s' % dest)
185 shutil.rmtree(dest)
179 if not os.path.isdir(dest):186 if not os.path.isdir(dest):
180 os.makedirs(dest)187 os.makedirs(dest)
181188
@@ -193,14 +200,15 @@
193 inc, opts = extract_options(m, global_options)200 inc, opts = extract_options(m, global_options)
194 sync(src, dest, '%s.%s' % (k, inc), opts)201 sync(src, dest, '%s.%s' % (k, inc), opts)
195202
203
196if __name__ == '__main__':204if __name__ == '__main__':
197 parser = optparse.OptionParser()205 parser = optparse.OptionParser()
198 parser.add_option('-c', '--config', action='store', dest='config',206 parser.add_option('-c', '--config', action='store', dest='config',
199 default=None, help='helper config file')207 default=None, help='helper config file')
200 parser.add_option('-D', '--debug', action='store_true', dest='debug',208 parser.add_option('-D', '--debug', action='store_true', dest='debug',
201 default=False, help='debug')209 default=False, help='debug')
202 parser.add_option('-b', '--branch', action='store', dest='branch',210 parser.add_option('-r', '--repository', action='store', dest='repo',
203 help='charm-helpers bzr branch (overrides config)')211 help='charm-helpers git repository (overrides config)')
204 parser.add_option('-d', '--destination', action='store', dest='dest_dir',212 parser.add_option('-d', '--destination', action='store', dest='dest_dir',
205 help='sync destination dir (overrides config)')213 help='sync destination dir (overrides config)')
206 (opts, args) = parser.parse_args()214 (opts, args) = parser.parse_args()
@@ -219,10 +227,10 @@
219 else:227 else:
220 config = {}228 config = {}
221229
222 if 'branch' not in config:230 if 'repo' not in config:
223 config['branch'] = CHARM_HELPERS_BRANCH231 config['repo'] = CHARM_HELPERS_REPO
224 if opts.branch:232 if opts.repo:
225 config['branch'] = opts.branch233 config['repo'] = opts.repo
226 if opts.dest_dir:234 if opts.dest_dir:
227 config['destination'] = opts.dest_dir235 config['destination'] = opts.dest_dir
228236
@@ -242,7 +250,7 @@
242 sync_options = config['options']250 sync_options = config['options']
243 tmpd = tempfile.mkdtemp()251 tmpd = tempfile.mkdtemp()
244 try:252 try:
245 checkout = clone_helpers(tmpd, config['branch'])253 checkout = clone_helpers(tmpd, config['repo'])
246 sync_helpers(config['include'], checkout, config['destination'],254 sync_helpers(config['include'], checkout, config['destination'],
247 options=sync_options)255 options=sync_options)
248 except Exception as e:256 except Exception as e:
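
clone_helpers() now accepts an optional "@branch" suffix on the repository and performs a shallow git clone instead of a bzr checkout. A sketch mirroring that logic; the branch name here is only an example:

    import os
    import subprocess
    import tempfile

    spec = 'https://github.com/juju/charm-helpers@master'
    repo, _, branch = spec.partition('@')
    dest = os.path.join(tempfile.mkdtemp(), 'charm-helpers')

    cmd = ['git', 'clone', '--depth=1']
    if branch:
        cmd += ['--branch', branch]
    cmd += [repo, dest]
    subprocess.check_call(cmd)
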
