Merge lp:~hopem/charms/trusty/jenkins/python-redux into lp:charms/trusty/jenkins

Proposed by Edward Hope-Morley
Status: Superseded
Proposed branch: lp:~hopem/charms/trusty/jenkins/python-redux
Merge into: lp:charms/trusty/jenkins
Diff against target: 3635 lines (+3190/-236)
35 files modified
Makefile (+30/-0)
bin/charm_helpers_sync.py (+225/-0)
charm-helpers-hooks.yaml (+7/-0)
config.yaml (+1/-1)
hooks/charmhelpers/__init__.py (+22/-0)
hooks/charmhelpers/core/fstab.py (+118/-0)
hooks/charmhelpers/core/hookenv.py (+552/-0)
hooks/charmhelpers/core/host.py (+416/-0)
hooks/charmhelpers/core/services/__init__.py (+2/-0)
hooks/charmhelpers/core/services/base.py (+313/-0)
hooks/charmhelpers/core/services/helpers.py (+243/-0)
hooks/charmhelpers/core/sysctl.py (+34/-0)
hooks/charmhelpers/core/templating.py (+52/-0)
hooks/charmhelpers/fetch/__init__.py (+416/-0)
hooks/charmhelpers/fetch/archiveurl.py (+145/-0)
hooks/charmhelpers/fetch/bzrurl.py (+54/-0)
hooks/charmhelpers/fetch/giturl.py (+51/-0)
hooks/charmhelpers/payload/__init__.py (+1/-0)
hooks/charmhelpers/payload/execd.py (+50/-0)
hooks/config-changed (+0/-7)
hooks/install (+0/-151)
hooks/jenkins_hooks.py (+220/-0)
hooks/jenkins_utils.py (+169/-0)
hooks/master-relation-broken (+0/-17)
hooks/master-relation-changed (+0/-24)
hooks/master-relation-departed (+0/-12)
hooks/master-relation-joined (+0/-5)
hooks/start (+0/-3)
hooks/stop (+0/-3)
hooks/upgrade-charm (+0/-7)
hooks/website-relation-joined (+0/-5)
tests/100-deploy-trusty (+1/-1)
tests/README (+56/-0)
unit_tests/test_jenkins_hooks.py (+6/-0)
unit_tests/test_jenkins_utils.py (+6/-0)
To merge this branch: bzr merge lp:~hopem/charms/trusty/jenkins/python-redux
Reviewer Review Type Date Requested Status
Review Queue (community) automated testing Needs Fixing
James Page Pending
Jorge Niedbalski Pending
Felipe Reyes Pending
Ryan Beisner Pending
Paul Larson Pending
Review via email: mp+244562@code.launchpad.net

This proposal supersedes a proposal from 2014-12-11.

This proposal has been superseded by a proposal from 2015-01-07.

Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10336-results

review: Needs Fixing (automated testing)
Revision history for this message
Felipe Reyes (freyes) wrote : Posted in a previous version of this proposal

Setting the password doesn't work: deploying as below doesn't allow you to log in with admin/admin. Also, first deploying from the charm store and then upgrading to this branch breaks the password.

---
jenkins:
    password: "admin"
---

$ juju deploy --config config.yaml local:trusty/jenkins
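A minimal way to check whether the configured credentials are actually accepted (sketch only; the unit address below is a placeholder):

import requests

resp = requests.get('http://<unit-address>:8080/api/json',
                    auth=('admin', 'admin'))
print(resp.status_code)  # 200 means admin/admin was accepted; 401 means it was not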

review: Needs Fixing
Revision history for this message
Edward Hope-Morley (hopem) wrote : Posted in a previous version of this proposal

Thanks Felipe, taking a look.

Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10636-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10684-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote :

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10704-results

review: Needs Fixing (automated testing)
47. By Edward Hope-Morley

tell amulet to deploy on trusty (default is precise)
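For illustration, pinning the series in an amulet test looks roughly like this (a sketch, not the exact test code from this branch):

import amulet

# Ask amulet for trusty explicitly; it otherwise defaults to precise.
d = amulet.Deployment(series='trusty')
d.add('jenkins')
d.setup(timeout=900)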

Revision history for this message
Review Queue (review-queue) wrote :

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10869-results

review: Needs Fixing (automated testing)
48. By Edward Hope-Morley

fix amulet test

49. By Edward Hope-Morley

synced charmhelpers

50. By Edward Hope-Morley

allow retries when adding node
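The retry pattern described here is roughly the following (illustrative only; the helper name and parameters are assumptions, the real change lives in hooks/jenkins_utils.py):

import time

def add_node_with_retries(add_node, retries=5, delay=10):
    """Call add_node(), retrying a few times before giving up."""
    for attempt in range(1, retries + 1):
        try:
            return add_node()
        except Exception:
            if attempt == retries:
                raise
            # Give the Jenkins master time to become reachable, then retry.
            time.sleep(delay)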

51. By Edward Hope-Morley

 * Fixed Makefile amulet test filename
 * Synced charm-helpers python-six deps
 * Synced charm-helpers test deps

52. By Edward Hope-Morley

added precise and trusty amulet

53. By Edward Hope-Morley

switch makefile rules to names that juju ci (hopefully) understands

54. By Edward Hope-Morley

ensure apt update prior to install
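The pattern being referred to, using the charmhelpers.fetch helpers synced by this branch (the exact call site in the install hook is assumed):

from charmhelpers.fetch import apt_update, apt_install

apt_update(fatal=True)                 # refresh the package index first
apt_install(['jenkins'], fatal=True)   # then install, failing loudly on error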

55. By Edward Hope-Morley

added venv for tests and lint

Unmerged revisions

Preview Diff

1=== added file 'Makefile'
2--- Makefile 1970-01-01 00:00:00 +0000
3+++ Makefile 2015-01-07 17:22:49 +0000
4@@ -0,0 +1,30 @@
5+#!/usr/bin/make
6+PYTHON := /usr/bin/env python
7+
8+lint:
9+ @flake8 --exclude hooks/charmhelpers hooks unit_tests tests
10+ @charm proof
11+
12+test:
13+ @echo Starting Amulet tests...
14+ # coreycb note: The -v should only be temporary until Amulet sends
15+ # raise_status() messages to stderr:
16+ # https://bugs.launchpad.net/amulet/+bug/1320357
17+ @juju test -v -p AMULET_HTTP_PROXY --timeout 900 \
18+ 00-setup 100-deploy
19+
20+unit_test:
21+ @echo Starting unit tests...
22+ @$(PYTHON) /usr/bin/nosetests --nologcapture --with-coverage unit_tests
23+
24+bin/charm_helpers_sync.py:
25+ @mkdir -p bin
26+ @bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
27+ > bin/charm_helpers_sync.py
28+
29+sync: bin/charm_helpers_sync.py
30+ @$(PYTHON) bin/charm_helpers_sync.py -c charm-helpers-hooks.yaml
31+
32+publish: lint unit_test
33+ bzr push lp:charms/jenkins
34+ bzr push lp:charms/trusty/jenkins
35
36=== added directory 'bin'
37=== added file 'bin/charm_helpers_sync.py'
38--- bin/charm_helpers_sync.py 1970-01-01 00:00:00 +0000
39+++ bin/charm_helpers_sync.py 2015-01-07 17:22:49 +0000
40@@ -0,0 +1,225 @@
41+#!/usr/bin/python
42+#
43+# Copyright 2013 Canonical Ltd.
44+
45+# Authors:
46+# Adam Gandelman <adamg@ubuntu.com>
47+#
48+
49+import logging
50+import optparse
51+import os
52+import subprocess
53+import shutil
54+import sys
55+import tempfile
56+import yaml
57+
58+from fnmatch import fnmatch
59+
60+CHARM_HELPERS_BRANCH = 'lp:charm-helpers'
61+
62+
63+def parse_config(conf_file):
64+ if not os.path.isfile(conf_file):
65+ logging.error('Invalid config file: %s.' % conf_file)
66+ return False
67+ return yaml.load(open(conf_file).read())
68+
69+
70+def clone_helpers(work_dir, branch):
71+ dest = os.path.join(work_dir, 'charm-helpers')
72+ logging.info('Checking out %s to %s.' % (branch, dest))
73+ cmd = ['bzr', 'checkout', '--lightweight', branch, dest]
74+ subprocess.check_call(cmd)
75+ return dest
76+
77+
78+def _module_path(module):
79+ return os.path.join(*module.split('.'))
80+
81+
82+def _src_path(src, module):
83+ return os.path.join(src, 'charmhelpers', _module_path(module))
84+
85+
86+def _dest_path(dest, module):
87+ return os.path.join(dest, _module_path(module))
88+
89+
90+def _is_pyfile(path):
91+ return os.path.isfile(path + '.py')
92+
93+
94+def ensure_init(path):
95+ '''
96+ ensure directories leading up to path are importable, omitting
97+ parent directory, eg path='/hooks/helpers/foo'/:
98+ hooks/
99+ hooks/helpers/__init__.py
100+ hooks/helpers/foo/__init__.py
101+ '''
102+ for d, dirs, files in os.walk(os.path.join(*path.split('/')[:2])):
103+ _i = os.path.join(d, '__init__.py')
104+ if not os.path.exists(_i):
105+ logging.info('Adding missing __init__.py: %s' % _i)
106+ open(_i, 'wb').close()
107+
108+
109+def sync_pyfile(src, dest):
110+ src = src + '.py'
111+ src_dir = os.path.dirname(src)
112+ logging.info('Syncing pyfile: %s -> %s.' % (src, dest))
113+ if not os.path.exists(dest):
114+ os.makedirs(dest)
115+ shutil.copy(src, dest)
116+ if os.path.isfile(os.path.join(src_dir, '__init__.py')):
117+ shutil.copy(os.path.join(src_dir, '__init__.py'),
118+ dest)
119+ ensure_init(dest)
120+
121+
122+def get_filter(opts=None):
123+ opts = opts or []
124+ if 'inc=*' in opts:
125+ # do not filter any files, include everything
126+ return None
127+
128+ def _filter(dir, ls):
129+ incs = [opt.split('=').pop() for opt in opts if 'inc=' in opt]
130+ _filter = []
131+ for f in ls:
132+ _f = os.path.join(dir, f)
133+
134+ if not os.path.isdir(_f) and not _f.endswith('.py') and incs:
135+ if True not in [fnmatch(_f, inc) for inc in incs]:
136+ logging.debug('Not syncing %s, does not match include '
137+ 'filters (%s)' % (_f, incs))
138+ _filter.append(f)
139+ else:
140+ logging.debug('Including file, which matches include '
141+ 'filters (%s): %s' % (incs, _f))
142+ elif (os.path.isfile(_f) and not _f.endswith('.py')):
143+ logging.debug('Not syncing file: %s' % f)
144+ _filter.append(f)
145+ elif (os.path.isdir(_f) and not
146+ os.path.isfile(os.path.join(_f, '__init__.py'))):
147+ logging.debug('Not syncing directory: %s' % f)
148+ _filter.append(f)
149+ return _filter
150+ return _filter
151+
152+
153+def sync_directory(src, dest, opts=None):
154+ if os.path.exists(dest):
155+ logging.debug('Removing existing directory: %s' % dest)
156+ shutil.rmtree(dest)
157+ logging.info('Syncing directory: %s -> %s.' % (src, dest))
158+
159+ shutil.copytree(src, dest, ignore=get_filter(opts))
160+ ensure_init(dest)
161+
162+
163+def sync(src, dest, module, opts=None):
164+ if os.path.isdir(_src_path(src, module)):
165+ sync_directory(_src_path(src, module), _dest_path(dest, module), opts)
166+ elif _is_pyfile(_src_path(src, module)):
167+ sync_pyfile(_src_path(src, module),
168+ os.path.dirname(_dest_path(dest, module)))
169+ else:
170+ logging.warn('Could not sync: %s. Neither a pyfile or directory, '
171+ 'does it even exist?' % module)
172+
173+
174+def parse_sync_options(options):
175+ if not options:
176+ return []
177+ return options.split(',')
178+
179+
180+def extract_options(inc, global_options=None):
181+ global_options = global_options or []
182+ if global_options and isinstance(global_options, basestring):
183+ global_options = [global_options]
184+ if '|' not in inc:
185+ return (inc, global_options)
186+ inc, opts = inc.split('|')
187+ return (inc, parse_sync_options(opts) + global_options)
188+
189+
190+def sync_helpers(include, src, dest, options=None):
191+ if not os.path.isdir(dest):
192+ os.makedirs(dest)
193+
194+ global_options = parse_sync_options(options)
195+
196+ for inc in include:
197+ if isinstance(inc, str):
198+ inc, opts = extract_options(inc, global_options)
199+ sync(src, dest, inc, opts)
200+ elif isinstance(inc, dict):
201+ # could also do nested dicts here.
202+ for k, v in inc.iteritems():
203+ if isinstance(v, list):
204+ for m in v:
205+ inc, opts = extract_options(m, global_options)
206+ sync(src, dest, '%s.%s' % (k, inc), opts)
207+
208+if __name__ == '__main__':
209+ parser = optparse.OptionParser()
210+ parser.add_option('-c', '--config', action='store', dest='config',
211+ default=None, help='helper config file')
212+ parser.add_option('-D', '--debug', action='store_true', dest='debug',
213+ default=False, help='debug')
214+ parser.add_option('-b', '--branch', action='store', dest='branch',
215+ help='charm-helpers bzr branch (overrides config)')
216+ parser.add_option('-d', '--destination', action='store', dest='dest_dir',
217+ help='sync destination dir (overrides config)')
218+ (opts, args) = parser.parse_args()
219+
220+ if opts.debug:
221+ logging.basicConfig(level=logging.DEBUG)
222+ else:
223+ logging.basicConfig(level=logging.INFO)
224+
225+ if opts.config:
226+ logging.info('Loading charm helper config from %s.' % opts.config)
227+ config = parse_config(opts.config)
228+ if not config:
229+ logging.error('Could not parse config from %s.' % opts.config)
230+ sys.exit(1)
231+ else:
232+ config = {}
233+
234+ if 'branch' not in config:
235+ config['branch'] = CHARM_HELPERS_BRANCH
236+ if opts.branch:
237+ config['branch'] = opts.branch
238+ if opts.dest_dir:
239+ config['destination'] = opts.dest_dir
240+
241+ if 'destination' not in config:
242+ logging.error('No destination dir. specified as option or config.')
243+ sys.exit(1)
244+
245+ if 'include' not in config:
246+ if not args:
247+ logging.error('No modules to sync specified as option or config.')
248+ sys.exit(1)
249+ config['include'] = []
250+ [config['include'].append(a) for a in args]
251+
252+ sync_options = None
253+ if 'options' in config:
254+ sync_options = config['options']
255+ tmpd = tempfile.mkdtemp()
256+ try:
257+ checkout = clone_helpers(tmpd, config['branch'])
258+ sync_helpers(config['include'], checkout, config['destination'],
259+ options=sync_options)
260+ except Exception, e:
261+ logging.error("Could not sync: %s" % e)
262+ raise e
263+ finally:
264+ logging.debug('Cleaning up %s' % tmpd)
265+ shutil.rmtree(tmpd)
266
267=== added file 'charm-helpers-hooks.yaml'
268--- charm-helpers-hooks.yaml 1970-01-01 00:00:00 +0000
269+++ charm-helpers-hooks.yaml 2015-01-07 17:22:49 +0000
270@@ -0,0 +1,7 @@
271+branch: lp:charm-helpers
272+destination: hooks/charmhelpers
273+include:
274+ - __init__
275+ - core
276+ - fetch
277+ - payload.execd
278
279=== modified file 'config.yaml'
280--- config.yaml 2014-08-14 19:53:02 +0000
281+++ config.yaml 2015-01-07 17:22:49 +0000
282@@ -17,9 +17,9 @@
283 slave nodes so please don't change in Jenkins.
284 password:
285 type: string
286+ default: ""
287 description: Admin user password - used to manage
288 slave nodes so please don't change in Jenkins.
289- default:
290 plugins:
291 type: string
292 default: ""
293
294=== added directory 'hooks/charmhelpers'
295=== added file 'hooks/charmhelpers/__init__.py'
296--- hooks/charmhelpers/__init__.py 1970-01-01 00:00:00 +0000
297+++ hooks/charmhelpers/__init__.py 2015-01-07 17:22:49 +0000
298@@ -0,0 +1,22 @@
299+# Bootstrap charm-helpers, installing its dependencies if necessary using
300+# only standard libraries.
301+import subprocess
302+import sys
303+
304+try:
305+ import six # flake8: noqa
306+except ImportError:
307+ if sys.version_info.major == 2:
308+ subprocess.check_call(['apt-get', 'install', '-y', 'python-six'])
309+ else:
310+ subprocess.check_call(['apt-get', 'install', '-y', 'python3-six'])
311+ import six # flake8: noqa
312+
313+try:
314+ import yaml # flake8: noqa
315+except ImportError:
316+ if sys.version_info.major == 2:
317+ subprocess.check_call(['apt-get', 'install', '-y', 'python-yaml'])
318+ else:
319+ subprocess.check_call(['apt-get', 'install', '-y', 'python3-yaml'])
320+ import yaml # flake8: noqa
321
322=== added directory 'hooks/charmhelpers/contrib'
323=== added file 'hooks/charmhelpers/contrib/__init__.py'
324=== added directory 'hooks/charmhelpers/core'
325=== added file 'hooks/charmhelpers/core/__init__.py'
326=== added file 'hooks/charmhelpers/core/fstab.py'
327--- hooks/charmhelpers/core/fstab.py 1970-01-01 00:00:00 +0000
328+++ hooks/charmhelpers/core/fstab.py 2015-01-07 17:22:49 +0000
329@@ -0,0 +1,118 @@
330+#!/usr/bin/env python
331+# -*- coding: utf-8 -*-
332+
333+__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
334+
335+import io
336+import os
337+
338+
339+class Fstab(io.FileIO):
340+ """This class extends file in order to implement a file reader/writer
341+ for file `/etc/fstab`
342+ """
343+
344+ class Entry(object):
345+ """Entry class represents a non-comment line on the `/etc/fstab` file
346+ """
347+ def __init__(self, device, mountpoint, filesystem,
348+ options, d=0, p=0):
349+ self.device = device
350+ self.mountpoint = mountpoint
351+ self.filesystem = filesystem
352+
353+ if not options:
354+ options = "defaults"
355+
356+ self.options = options
357+ self.d = int(d)
358+ self.p = int(p)
359+
360+ def __eq__(self, o):
361+ return str(self) == str(o)
362+
363+ def __str__(self):
364+ return "{} {} {} {} {} {}".format(self.device,
365+ self.mountpoint,
366+ self.filesystem,
367+ self.options,
368+ self.d,
369+ self.p)
370+
371+ DEFAULT_PATH = os.path.join(os.path.sep, 'etc', 'fstab')
372+
373+ def __init__(self, path=None):
374+ if path:
375+ self._path = path
376+ else:
377+ self._path = self.DEFAULT_PATH
378+ super(Fstab, self).__init__(self._path, 'rb+')
379+
380+ def _hydrate_entry(self, line):
381+ # NOTE: use split with no arguments to split on any
382+ # whitespace including tabs
383+ return Fstab.Entry(*filter(
384+ lambda x: x not in ('', None),
385+ line.strip("\n").split()))
386+
387+ @property
388+ def entries(self):
389+ self.seek(0)
390+ for line in self.readlines():
391+ line = line.decode('us-ascii')
392+ try:
393+ if line.strip() and not line.startswith("#"):
394+ yield self._hydrate_entry(line)
395+ except ValueError:
396+ pass
397+
398+ def get_entry_by_attr(self, attr, value):
399+ for entry in self.entries:
400+ e_attr = getattr(entry, attr)
401+ if e_attr == value:
402+ return entry
403+ return None
404+
405+ def add_entry(self, entry):
406+ if self.get_entry_by_attr('device', entry.device):
407+ return False
408+
409+ self.write((str(entry) + '\n').encode('us-ascii'))
410+ self.truncate()
411+ return entry
412+
413+ def remove_entry(self, entry):
414+ self.seek(0)
415+
416+ lines = [l.decode('us-ascii') for l in self.readlines()]
417+
418+ found = False
419+ for index, line in enumerate(lines):
420+ if not line.startswith("#"):
421+ if self._hydrate_entry(line) == entry:
422+ found = True
423+ break
424+
425+ if not found:
426+ return False
427+
428+ lines.remove(line)
429+
430+ self.seek(0)
431+ self.write(''.join(lines).encode('us-ascii'))
432+ self.truncate()
433+ return True
434+
435+ @classmethod
436+ def remove_by_mountpoint(cls, mountpoint, path=None):
437+ fstab = cls(path=path)
438+ entry = fstab.get_entry_by_attr('mountpoint', mountpoint)
439+ if entry:
440+ return fstab.remove_entry(entry)
441+ return False
442+
443+ @classmethod
444+ def add(cls, device, mountpoint, filesystem, options=None, path=None):
445+ return cls(path=path).add_entry(Fstab.Entry(device,
446+ mountpoint, filesystem,
447+ options=options))
448
449=== added file 'hooks/charmhelpers/core/hookenv.py'
450--- hooks/charmhelpers/core/hookenv.py 1970-01-01 00:00:00 +0000
451+++ hooks/charmhelpers/core/hookenv.py 2015-01-07 17:22:49 +0000
452@@ -0,0 +1,552 @@
453+"Interactions with the Juju environment"
454+# Copyright 2013 Canonical Ltd.
455+#
456+# Authors:
457+# Charm Helpers Developers <juju@lists.ubuntu.com>
458+
459+import os
460+import json
461+import yaml
462+import subprocess
463+import sys
464+from subprocess import CalledProcessError
465+
466+import six
467+if not six.PY3:
468+ from UserDict import UserDict
469+else:
470+ from collections import UserDict
471+
472+CRITICAL = "CRITICAL"
473+ERROR = "ERROR"
474+WARNING = "WARNING"
475+INFO = "INFO"
476+DEBUG = "DEBUG"
477+MARKER = object()
478+
479+cache = {}
480+
481+
482+def cached(func):
483+ """Cache return values for multiple executions of func + args
484+
485+ For example::
486+
487+ @cached
488+ def unit_get(attribute):
489+ pass
490+
491+ unit_get('test')
492+
493+ will cache the result of unit_get + 'test' for future calls.
494+ """
495+ def wrapper(*args, **kwargs):
496+ global cache
497+ key = str((func, args, kwargs))
498+ try:
499+ return cache[key]
500+ except KeyError:
501+ res = func(*args, **kwargs)
502+ cache[key] = res
503+ return res
504+ return wrapper
505+
506+
507+def flush(key):
508+ """Flushes any entries from function cache where the
509+ key is found in the function+args """
510+ flush_list = []
511+ for item in cache:
512+ if key in item:
513+ flush_list.append(item)
514+ for item in flush_list:
515+ del cache[item]
516+
517+
518+def log(message, level=None):
519+ """Write a message to the juju log"""
520+ command = ['juju-log']
521+ if level:
522+ command += ['-l', level]
523+ if not isinstance(message, six.string_types):
524+ message = repr(message)
525+ command += [message]
526+ subprocess.call(command)
527+
528+
529+class Serializable(UserDict):
530+ """Wrapper, an object that can be serialized to yaml or json"""
531+
532+ def __init__(self, obj):
533+ # wrap the object
534+ UserDict.__init__(self)
535+ self.data = obj
536+
537+ def __getattr__(self, attr):
538+ # See if this object has attribute.
539+ if attr in ("json", "yaml", "data"):
540+ return self.__dict__[attr]
541+ # Check for attribute in wrapped object.
542+ got = getattr(self.data, attr, MARKER)
543+ if got is not MARKER:
544+ return got
545+ # Proxy to the wrapped object via dict interface.
546+ try:
547+ return self.data[attr]
548+ except KeyError:
549+ raise AttributeError(attr)
550+
551+ def __getstate__(self):
552+ # Pickle as a standard dictionary.
553+ return self.data
554+
555+ def __setstate__(self, state):
556+ # Unpickle into our wrapper.
557+ self.data = state
558+
559+ def json(self):
560+ """Serialize the object to json"""
561+ return json.dumps(self.data)
562+
563+ def yaml(self):
564+ """Serialize the object to yaml"""
565+ return yaml.dump(self.data)
566+
567+
568+def execution_environment():
569+ """A convenient bundling of the current execution context"""
570+ context = {}
571+ context['conf'] = config()
572+ if relation_id():
573+ context['reltype'] = relation_type()
574+ context['relid'] = relation_id()
575+ context['rel'] = relation_get()
576+ context['unit'] = local_unit()
577+ context['rels'] = relations()
578+ context['env'] = os.environ
579+ return context
580+
581+
582+def in_relation_hook():
583+ """Determine whether we're running in a relation hook"""
584+ return 'JUJU_RELATION' in os.environ
585+
586+
587+def relation_type():
588+ """The scope for the current relation hook"""
589+ return os.environ.get('JUJU_RELATION', None)
590+
591+
592+def relation_id():
593+ """The relation ID for the current relation hook"""
594+ return os.environ.get('JUJU_RELATION_ID', None)
595+
596+
597+def local_unit():
598+ """Local unit ID"""
599+ return os.environ['JUJU_UNIT_NAME']
600+
601+
602+def remote_unit():
603+ """The remote unit for the current relation hook"""
604+ return os.environ['JUJU_REMOTE_UNIT']
605+
606+
607+def service_name():
608+ """The name service group this unit belongs to"""
609+ return local_unit().split('/')[0]
610+
611+
612+def hook_name():
613+ """The name of the currently executing hook"""
614+ return os.path.basename(sys.argv[0])
615+
616+
617+class Config(dict):
618+ """A dictionary representation of the charm's config.yaml, with some
619+ extra features:
620+
621+ - See which values in the dictionary have changed since the previous hook.
622+ - For values that have changed, see what the previous value was.
623+ - Store arbitrary data for use in a later hook.
624+
625+ NOTE: Do not instantiate this object directly - instead call
626+ ``hookenv.config()``, which will return an instance of :class:`Config`.
627+
628+ Example usage::
629+
630+ >>> # inside a hook
631+ >>> from charmhelpers.core import hookenv
632+ >>> config = hookenv.config()
633+ >>> config['foo']
634+ 'bar'
635+ >>> # store a new key/value for later use
636+ >>> config['mykey'] = 'myval'
637+
638+
639+ >>> # user runs `juju set mycharm foo=baz`
640+ >>> # now we're inside subsequent config-changed hook
641+ >>> config = hookenv.config()
642+ >>> config['foo']
643+ 'baz'
644+ >>> # test to see if this val has changed since last hook
645+ >>> config.changed('foo')
646+ True
647+ >>> # what was the previous value?
648+ >>> config.previous('foo')
649+ 'bar'
650+ >>> # keys/values that we add are preserved across hooks
651+ >>> config['mykey']
652+ 'myval'
653+
654+ """
655+ CONFIG_FILE_NAME = '.juju-persistent-config'
656+
657+ def __init__(self, *args, **kw):
658+ super(Config, self).__init__(*args, **kw)
659+ self.implicit_save = True
660+ self._prev_dict = None
661+ self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME)
662+ if os.path.exists(self.path):
663+ self.load_previous()
664+
665+ def __getitem__(self, key):
666+ """For regular dict lookups, check the current juju config first,
667+ then the previous (saved) copy. This ensures that user-saved values
668+ will be returned by a dict lookup.
669+
670+ """
671+ try:
672+ return dict.__getitem__(self, key)
673+ except KeyError:
674+ return (self._prev_dict or {})[key]
675+
676+ def keys(self):
677+ prev_keys = []
678+ if self._prev_dict is not None:
679+ prev_keys = self._prev_dict.keys()
680+ return list(set(prev_keys + list(dict.keys(self))))
681+
682+ def load_previous(self, path=None):
683+ """Load previous copy of config from disk.
684+
685+ In normal usage you don't need to call this method directly - it
686+ is called automatically at object initialization.
687+
688+ :param path:
689+
690+ File path from which to load the previous config. If `None`,
691+ config is loaded from the default location. If `path` is
692+ specified, subsequent `save()` calls will write to the same
693+ path.
694+
695+ """
696+ self.path = path or self.path
697+ with open(self.path) as f:
698+ self._prev_dict = json.load(f)
699+
700+ def changed(self, key):
701+ """Return True if the current value for this key is different from
702+ the previous value.
703+
704+ """
705+ if self._prev_dict is None:
706+ return True
707+ return self.previous(key) != self.get(key)
708+
709+ def previous(self, key):
710+ """Return previous value for this key, or None if there
711+ is no previous value.
712+
713+ """
714+ if self._prev_dict:
715+ return self._prev_dict.get(key)
716+ return None
717+
718+ def save(self):
719+ """Save this config to disk.
720+
721+ If the charm is using the :mod:`Services Framework <services.base>`
722+ or :meth:'@hook <Hooks.hook>' decorator, this
723+ is called automatically at the end of successful hook execution.
724+ Otherwise, it should be called directly by user code.
725+
726+ To disable automatic saves, set ``implicit_save=False`` on this
727+ instance.
728+
729+ """
730+ if self._prev_dict:
731+ for k, v in six.iteritems(self._prev_dict):
732+ if k not in self:
733+ self[k] = v
734+ with open(self.path, 'w') as f:
735+ json.dump(self, f)
736+
737+
738+@cached
739+def config(scope=None):
740+ """Juju charm configuration"""
741+ config_cmd_line = ['config-get']
742+ if scope is not None:
743+ config_cmd_line.append(scope)
744+ config_cmd_line.append('--format=json')
745+ try:
746+ config_data = json.loads(
747+ subprocess.check_output(config_cmd_line).decode('UTF-8'))
748+ if scope is not None:
749+ return config_data
750+ return Config(config_data)
751+ except ValueError:
752+ return None
753+
754+
755+@cached
756+def relation_get(attribute=None, unit=None, rid=None):
757+ """Get relation information"""
758+ _args = ['relation-get', '--format=json']
759+ if rid:
760+ _args.append('-r')
761+ _args.append(rid)
762+ _args.append(attribute or '-')
763+ if unit:
764+ _args.append(unit)
765+ try:
766+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
767+ except ValueError:
768+ return None
769+ except CalledProcessError as e:
770+ if e.returncode == 2:
771+ return None
772+ raise
773+
774+
775+def relation_set(relation_id=None, relation_settings=None, **kwargs):
776+ """Set relation information for the current unit"""
777+ relation_settings = relation_settings if relation_settings else {}
778+ relation_cmd_line = ['relation-set']
779+ if relation_id is not None:
780+ relation_cmd_line.extend(('-r', relation_id))
781+ for k, v in (list(relation_settings.items()) + list(kwargs.items())):
782+ if v is None:
783+ relation_cmd_line.append('{}='.format(k))
784+ else:
785+ relation_cmd_line.append('{}={}'.format(k, v))
786+ subprocess.check_call(relation_cmd_line)
787+ # Flush cache of any relation-gets for local unit
788+ flush(local_unit())
789+
790+
791+@cached
792+def relation_ids(reltype=None):
793+ """A list of relation_ids"""
794+ reltype = reltype or relation_type()
795+ relid_cmd_line = ['relation-ids', '--format=json']
796+ if reltype is not None:
797+ relid_cmd_line.append(reltype)
798+ return json.loads(
799+ subprocess.check_output(relid_cmd_line).decode('UTF-8')) or []
800+ return []
801+
802+
803+@cached
804+def related_units(relid=None):
805+ """A list of related units"""
806+ relid = relid or relation_id()
807+ units_cmd_line = ['relation-list', '--format=json']
808+ if relid is not None:
809+ units_cmd_line.extend(('-r', relid))
810+ return json.loads(
811+ subprocess.check_output(units_cmd_line).decode('UTF-8')) or []
812+
813+
814+@cached
815+def relation_for_unit(unit=None, rid=None):
816+ """Get the json represenation of a unit's relation"""
817+ unit = unit or remote_unit()
818+ relation = relation_get(unit=unit, rid=rid)
819+ for key in relation:
820+ if key.endswith('-list'):
821+ relation[key] = relation[key].split()
822+ relation['__unit__'] = unit
823+ return relation
824+
825+
826+@cached
827+def relations_for_id(relid=None):
828+ """Get relations of a specific relation ID"""
829+ relation_data = []
830+ relid = relid or relation_ids()
831+ for unit in related_units(relid):
832+ unit_data = relation_for_unit(unit, relid)
833+ unit_data['__relid__'] = relid
834+ relation_data.append(unit_data)
835+ return relation_data
836+
837+
838+@cached
839+def relations_of_type(reltype=None):
840+ """Get relations of a specific type"""
841+ relation_data = []
842+ reltype = reltype or relation_type()
843+ for relid in relation_ids(reltype):
844+ for relation in relations_for_id(relid):
845+ relation['__relid__'] = relid
846+ relation_data.append(relation)
847+ return relation_data
848+
849+
850+@cached
851+def metadata():
852+ """Get the current charm metadata.yaml contents as a python object"""
853+ with open(os.path.join(charm_dir(), 'metadata.yaml')) as md:
854+ return yaml.safe_load(md)
855+
856+
857+@cached
858+def relation_types():
859+ """Get a list of relation types supported by this charm"""
860+ rel_types = []
861+ md = metadata()
862+ for key in ('provides', 'requires', 'peers'):
863+ section = md.get(key)
864+ if section:
865+ rel_types.extend(section.keys())
866+ return rel_types
867+
868+
869+@cached
870+def charm_name():
871+ """Get the name of the current charm as is specified on metadata.yaml"""
872+ return metadata().get('name')
873+
874+
875+@cached
876+def relations():
877+ """Get a nested dictionary of relation data for all related units"""
878+ rels = {}
879+ for reltype in relation_types():
880+ relids = {}
881+ for relid in relation_ids(reltype):
882+ units = {local_unit(): relation_get(unit=local_unit(), rid=relid)}
883+ for unit in related_units(relid):
884+ reldata = relation_get(unit=unit, rid=relid)
885+ units[unit] = reldata
886+ relids[relid] = units
887+ rels[reltype] = relids
888+ return rels
889+
890+
891+@cached
892+def is_relation_made(relation, keys='private-address'):
893+ '''
894+ Determine whether a relation is established by checking for
895+ presence of key(s). If a list of keys is provided, they
896+ must all be present for the relation to be identified as made
897+ '''
898+ if isinstance(keys, str):
899+ keys = [keys]
900+ for r_id in relation_ids(relation):
901+ for unit in related_units(r_id):
902+ context = {}
903+ for k in keys:
904+ context[k] = relation_get(k, rid=r_id,
905+ unit=unit)
906+ if None not in context.values():
907+ return True
908+ return False
909+
910+
911+def open_port(port, protocol="TCP"):
912+ """Open a service network port"""
913+ _args = ['open-port']
914+ _args.append('{}/{}'.format(port, protocol))
915+ subprocess.check_call(_args)
916+
917+
918+def close_port(port, protocol="TCP"):
919+ """Close a service network port"""
920+ _args = ['close-port']
921+ _args.append('{}/{}'.format(port, protocol))
922+ subprocess.check_call(_args)
923+
924+
925+@cached
926+def unit_get(attribute):
927+ """Get the unit ID for the remote unit"""
928+ _args = ['unit-get', '--format=json', attribute]
929+ try:
930+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
931+ except ValueError:
932+ return None
933+
934+
935+def unit_private_ip():
936+ """Get this unit's private IP address"""
937+ return unit_get('private-address')
938+
939+
940+class UnregisteredHookError(Exception):
941+ """Raised when an undefined hook is called"""
942+ pass
943+
944+
945+class Hooks(object):
946+ """A convenient handler for hook functions.
947+
948+ Example::
949+
950+ hooks = Hooks()
951+
952+ # register a hook, taking its name from the function name
953+ @hooks.hook()
954+ def install():
955+ pass # your code here
956+
957+ # register a hook, providing a custom hook name
958+ @hooks.hook("config-changed")
959+ def config_changed():
960+ pass # your code here
961+
962+ if __name__ == "__main__":
963+ # execute a hook based on the name the program is called by
964+ hooks.execute(sys.argv)
965+ """
966+
967+ def __init__(self, config_save=True):
968+ super(Hooks, self).__init__()
969+ self._hooks = {}
970+ self._config_save = config_save
971+
972+ def register(self, name, function):
973+ """Register a hook"""
974+ self._hooks[name] = function
975+
976+ def execute(self, args):
977+ """Execute a registered hook based on args[0]"""
978+ hook_name = os.path.basename(args[0])
979+ if hook_name in self._hooks:
980+ self._hooks[hook_name]()
981+ if self._config_save:
982+ cfg = config()
983+ if cfg.implicit_save:
984+ cfg.save()
985+ else:
986+ raise UnregisteredHookError(hook_name)
987+
988+ def hook(self, *hook_names):
989+ """Decorator, registering them as hooks"""
990+ def wrapper(decorated):
991+ for hook_name in hook_names:
992+ self.register(hook_name, decorated)
993+ else:
994+ self.register(decorated.__name__, decorated)
995+ if '_' in decorated.__name__:
996+ self.register(
997+ decorated.__name__.replace('_', '-'), decorated)
998+ return decorated
999+ return wrapper
1000+
1001+
1002+def charm_dir():
1003+ """Return the root directory of the current charm"""
1004+ return os.environ.get('CHARM_DIR')
1005
1006=== added file 'hooks/charmhelpers/core/host.py'
1007--- hooks/charmhelpers/core/host.py 1970-01-01 00:00:00 +0000
1008+++ hooks/charmhelpers/core/host.py 2015-01-07 17:22:49 +0000
1009@@ -0,0 +1,416 @@
1010+"""Tools for working with the host system"""
1011+# Copyright 2012 Canonical Ltd.
1012+#
1013+# Authors:
1014+# Nick Moffitt <nick.moffitt@canonical.com>
1015+# Matthew Wedgwood <matthew.wedgwood@canonical.com>
1016+
1017+import os
1018+import re
1019+import pwd
1020+import grp
1021+import random
1022+import string
1023+import subprocess
1024+import hashlib
1025+from contextlib import contextmanager
1026+from collections import OrderedDict
1027+
1028+import six
1029+
1030+from .hookenv import log
1031+from .fstab import Fstab
1032+
1033+
1034+def service_start(service_name):
1035+ """Start a system service"""
1036+ return service('start', service_name)
1037+
1038+
1039+def service_stop(service_name):
1040+ """Stop a system service"""
1041+ return service('stop', service_name)
1042+
1043+
1044+def service_restart(service_name):
1045+ """Restart a system service"""
1046+ return service('restart', service_name)
1047+
1048+
1049+def service_reload(service_name, restart_on_failure=False):
1050+ """Reload a system service, optionally falling back to restart if
1051+ reload fails"""
1052+ service_result = service('reload', service_name)
1053+ if not service_result and restart_on_failure:
1054+ service_result = service('restart', service_name)
1055+ return service_result
1056+
1057+
1058+def service(action, service_name):
1059+ """Control a system service"""
1060+ cmd = ['service', service_name, action]
1061+ return subprocess.call(cmd) == 0
1062+
1063+
1064+def service_running(service):
1065+ """Determine whether a system service is running"""
1066+ try:
1067+ output = subprocess.check_output(
1068+ ['service', service, 'status'],
1069+ stderr=subprocess.STDOUT).decode('UTF-8')
1070+ except subprocess.CalledProcessError:
1071+ return False
1072+ else:
1073+ if ("start/running" in output or "is running" in output):
1074+ return True
1075+ else:
1076+ return False
1077+
1078+
1079+def service_available(service_name):
1080+ """Determine whether a system service is available"""
1081+ try:
1082+ subprocess.check_output(
1083+ ['service', service_name, 'status'],
1084+ stderr=subprocess.STDOUT).decode('UTF-8')
1085+ except subprocess.CalledProcessError as e:
1086+ return 'unrecognized service' not in e.output
1087+ else:
1088+ return True
1089+
1090+
1091+def adduser(username, password=None, shell='/bin/bash', system_user=False):
1092+ """Add a user to the system"""
1093+ try:
1094+ user_info = pwd.getpwnam(username)
1095+ log('user {0} already exists!'.format(username))
1096+ except KeyError:
1097+ log('creating user {0}'.format(username))
1098+ cmd = ['useradd']
1099+ if system_user or password is None:
1100+ cmd.append('--system')
1101+ else:
1102+ cmd.extend([
1103+ '--create-home',
1104+ '--shell', shell,
1105+ '--password', password,
1106+ ])
1107+ cmd.append(username)
1108+ subprocess.check_call(cmd)
1109+ user_info = pwd.getpwnam(username)
1110+ return user_info
1111+
1112+
1113+def add_group(group_name, system_group=False):
1114+ """Add a group to the system"""
1115+ try:
1116+ group_info = grp.getgrnam(group_name)
1117+ log('group {0} already exists!'.format(group_name))
1118+ except KeyError:
1119+ log('creating group {0}'.format(group_name))
1120+ cmd = ['addgroup']
1121+ if system_group:
1122+ cmd.append('--system')
1123+ else:
1124+ cmd.extend([
1125+ '--group',
1126+ ])
1127+ cmd.append(group_name)
1128+ subprocess.check_call(cmd)
1129+ group_info = grp.getgrnam(group_name)
1130+ return group_info
1131+
1132+
1133+def add_user_to_group(username, group):
1134+ """Add a user to a group"""
1135+ cmd = [
1136+ 'gpasswd', '-a',
1137+ username,
1138+ group
1139+ ]
1140+ log("Adding user {} to group {}".format(username, group))
1141+ subprocess.check_call(cmd)
1142+
1143+
1144+def rsync(from_path, to_path, flags='-r', options=None):
1145+ """Replicate the contents of a path"""
1146+ options = options or ['--delete', '--executability']
1147+ cmd = ['/usr/bin/rsync', flags]
1148+ cmd.extend(options)
1149+ cmd.append(from_path)
1150+ cmd.append(to_path)
1151+ log(" ".join(cmd))
1152+ return subprocess.check_output(cmd).decode('UTF-8').strip()
1153+
1154+
1155+def symlink(source, destination):
1156+ """Create a symbolic link"""
1157+ log("Symlinking {} as {}".format(source, destination))
1158+ cmd = [
1159+ 'ln',
1160+ '-sf',
1161+ source,
1162+ destination,
1163+ ]
1164+ subprocess.check_call(cmd)
1165+
1166+
1167+def mkdir(path, owner='root', group='root', perms=0o555, force=False):
1168+ """Create a directory"""
1169+ log("Making dir {} {}:{} {:o}".format(path, owner, group,
1170+ perms))
1171+ uid = pwd.getpwnam(owner).pw_uid
1172+ gid = grp.getgrnam(group).gr_gid
1173+ realpath = os.path.abspath(path)
1174+ if os.path.exists(realpath):
1175+ if force and not os.path.isdir(realpath):
1176+ log("Removing non-directory file {} prior to mkdir()".format(path))
1177+ os.unlink(realpath)
1178+ else:
1179+ os.makedirs(realpath, perms)
1180+ os.chown(realpath, uid, gid)
1181+
1182+
1183+def write_file(path, content, owner='root', group='root', perms=0o444):
1184+ """Create or overwrite a file with the contents of a string"""
1185+ log("Writing file {} {}:{} {:o}".format(path, owner, group, perms))
1186+ uid = pwd.getpwnam(owner).pw_uid
1187+ gid = grp.getgrnam(group).gr_gid
1188+ with open(path, 'w') as target:
1189+ os.fchown(target.fileno(), uid, gid)
1190+ os.fchmod(target.fileno(), perms)
1191+ target.write(content)
1192+
1193+
1194+def fstab_remove(mp):
1195+ """Remove the given mountpoint entry from /etc/fstab
1196+ """
1197+ return Fstab.remove_by_mountpoint(mp)
1198+
1199+
1200+def fstab_add(dev, mp, fs, options=None):
1201+ """Adds the given device entry to the /etc/fstab file
1202+ """
1203+ return Fstab.add(dev, mp, fs, options=options)
1204+
1205+
1206+def mount(device, mountpoint, options=None, persist=False, filesystem="ext3"):
1207+ """Mount a filesystem at a particular mountpoint"""
1208+ cmd_args = ['mount']
1209+ if options is not None:
1210+ cmd_args.extend(['-o', options])
1211+ cmd_args.extend([device, mountpoint])
1212+ try:
1213+ subprocess.check_output(cmd_args)
1214+ except subprocess.CalledProcessError as e:
1215+ log('Error mounting {} at {}\n{}'.format(device, mountpoint, e.output))
1216+ return False
1217+
1218+ if persist:
1219+ return fstab_add(device, mountpoint, filesystem, options=options)
1220+ return True
1221+
1222+
1223+def umount(mountpoint, persist=False):
1224+ """Unmount a filesystem"""
1225+ cmd_args = ['umount', mountpoint]
1226+ try:
1227+ subprocess.check_output(cmd_args)
1228+ except subprocess.CalledProcessError as e:
1229+ log('Error unmounting {}\n{}'.format(mountpoint, e.output))
1230+ return False
1231+
1232+ if persist:
1233+ return fstab_remove(mountpoint)
1234+ return True
1235+
1236+
1237+def mounts():
1238+ """Get a list of all mounted volumes as [[mountpoint,device],[...]]"""
1239+ with open('/proc/mounts') as f:
1240+ # [['/mount/point','/dev/path'],[...]]
1241+ system_mounts = [m[1::-1] for m in [l.strip().split()
1242+ for l in f.readlines()]]
1243+ return system_mounts
1244+
1245+
1246+def file_hash(path, hash_type='md5'):
1247+ """
1248+ Generate a hash checksum of the contents of 'path' or None if not found.
1249+
1250+ :param str hash_type: Any hash alrgorithm supported by :mod:`hashlib`,
1251+ such as md5, sha1, sha256, sha512, etc.
1252+ """
1253+ if os.path.exists(path):
1254+ h = getattr(hashlib, hash_type)()
1255+ with open(path, 'rb') as source:
1256+ h.update(source.read())
1257+ return h.hexdigest()
1258+ else:
1259+ return None
1260+
1261+
1262+def check_hash(path, checksum, hash_type='md5'):
1263+ """
1264+ Validate a file using a cryptographic checksum.
1265+
1266+ :param str checksum: Value of the checksum used to validate the file.
1267+ :param str hash_type: Hash algorithm used to generate `checksum`.
1268+ Can be any hash alrgorithm supported by :mod:`hashlib`,
1269+ such as md5, sha1, sha256, sha512, etc.
1270+ :raises ChecksumError: If the file fails the checksum
1271+
1272+ """
1273+ actual_checksum = file_hash(path, hash_type)
1274+ if checksum != actual_checksum:
1275+ raise ChecksumError("'%s' != '%s'" % (checksum, actual_checksum))
1276+
1277+
1278+class ChecksumError(ValueError):
1279+ pass
1280+
1281+
1282+def restart_on_change(restart_map, stopstart=False):
1283+ """Restart services based on configuration files changing
1284+
1285+ This function is used a decorator, for example::
1286+
1287+ @restart_on_change({
1288+ '/etc/ceph/ceph.conf': [ 'cinder-api', 'cinder-volume' ]
1289+ })
1290+ def ceph_client_changed():
1291+ pass # your code here
1292+
1293+ In this example, the cinder-api and cinder-volume services
1294+ would be restarted if /etc/ceph/ceph.conf is changed by the
1295+ ceph_client_changed function.
1296+ """
1297+ def wrap(f):
1298+ def wrapped_f(*args):
1299+ checksums = {}
1300+ for path in restart_map:
1301+ checksums[path] = file_hash(path)
1302+ f(*args)
1303+ restarts = []
1304+ for path in restart_map:
1305+ if checksums[path] != file_hash(path):
1306+ restarts += restart_map[path]
1307+ services_list = list(OrderedDict.fromkeys(restarts))
1308+ if not stopstart:
1309+ for service_name in services_list:
1310+ service('restart', service_name)
1311+ else:
1312+ for action in ['stop', 'start']:
1313+ for service_name in services_list:
1314+ service(action, service_name)
1315+ return wrapped_f
1316+ return wrap
1317+
1318+
1319+def lsb_release():
1320+ """Return /etc/lsb-release in a dict"""
1321+ d = {}
1322+ with open('/etc/lsb-release', 'r') as lsb:
1323+ for l in lsb:
1324+ k, v = l.split('=')
1325+ d[k.strip()] = v.strip()
1326+ return d
1327+
1328+
1329+def pwgen(length=None):
1330+ """Generate a random pasword."""
1331+ if length is None:
1332+ length = random.choice(range(35, 45))
1333+ alphanumeric_chars = [
1334+ l for l in (string.ascii_letters + string.digits)
1335+ if l not in 'l0QD1vAEIOUaeiou']
1336+ random_chars = [
1337+ random.choice(alphanumeric_chars) for _ in range(length)]
1338+ return(''.join(random_chars))
1339+
1340+
1341+def list_nics(nic_type):
1342+ '''Return a list of nics of given type(s)'''
1343+ if isinstance(nic_type, six.string_types):
1344+ int_types = [nic_type]
1345+ else:
1346+ int_types = nic_type
1347+ interfaces = []
1348+ for int_type in int_types:
1349+ cmd = ['ip', 'addr', 'show', 'label', int_type + '*']
1350+ ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1351+ ip_output = (line for line in ip_output if line)
1352+ for line in ip_output:
1353+ if line.split()[1].startswith(int_type):
1354+ matched = re.search('.*: (bond[0-9]+\.[0-9]+)@.*', line)
1355+ if matched:
1356+ interface = matched.groups()[0]
1357+ else:
1358+ interface = line.split()[1].replace(":", "")
1359+ interfaces.append(interface)
1360+
1361+ return interfaces
1362+
1363+
1364+def set_nic_mtu(nic, mtu):
1365+ '''Set MTU on a network interface'''
1366+ cmd = ['ip', 'link', 'set', nic, 'mtu', mtu]
1367+ subprocess.check_call(cmd)
1368+
1369+
1370+def get_nic_mtu(nic):
1371+ cmd = ['ip', 'addr', 'show', nic]
1372+ ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1373+ mtu = ""
1374+ for line in ip_output:
1375+ words = line.split()
1376+ if 'mtu' in words:
1377+ mtu = words[words.index("mtu") + 1]
1378+ return mtu
1379+
1380+
1381+def get_nic_hwaddr(nic):
1382+ cmd = ['ip', '-o', '-0', 'addr', 'show', nic]
1383+ ip_output = subprocess.check_output(cmd).decode('UTF-8')
1384+ hwaddr = ""
1385+ words = ip_output.split()
1386+ if 'link/ether' in words:
1387+ hwaddr = words[words.index('link/ether') + 1]
1388+ return hwaddr
1389+
1390+
1391+def cmp_pkgrevno(package, revno, pkgcache=None):
1392+ '''Compare supplied revno with the revno of the installed package
1393+
1394+ * 1 => Installed revno is greater than supplied arg
1395+ * 0 => Installed revno is the same as supplied arg
1396+ * -1 => Installed revno is less than supplied arg
1397+
1398+ '''
1399+ import apt_pkg
1400+ if not pkgcache:
1401+ from charmhelpers.fetch import apt_cache
1402+ pkgcache = apt_cache()
1403+ pkg = pkgcache[package]
1404+ return apt_pkg.version_compare(pkg.current_ver.ver_str, revno)
1405+
1406+
1407+@contextmanager
1408+def chdir(d):
1409+ cur = os.getcwd()
1410+ try:
1411+ yield os.chdir(d)
1412+ finally:
1413+ os.chdir(cur)
1414+
1415+
1416+def chownr(path, owner, group):
1417+ uid = pwd.getpwnam(owner).pw_uid
1418+ gid = grp.getgrnam(group).gr_gid
1419+
1420+ for root, dirs, files in os.walk(path):
1421+ for name in dirs + files:
1422+ full = os.path.join(root, name)
1423+ broken_symlink = os.path.lexists(full) and not os.path.exists(full)
1424+ if not broken_symlink:
1425+ os.chown(full, uid, gid)
1426
1427=== added directory 'hooks/charmhelpers/core/services'
1428=== added file 'hooks/charmhelpers/core/services/__init__.py'
1429--- hooks/charmhelpers/core/services/__init__.py 1970-01-01 00:00:00 +0000
1430+++ hooks/charmhelpers/core/services/__init__.py 2015-01-07 17:22:49 +0000
1431@@ -0,0 +1,2 @@
1432+from .base import * # NOQA
1433+from .helpers import * # NOQA
1434
1435=== added file 'hooks/charmhelpers/core/services/base.py'
1436--- hooks/charmhelpers/core/services/base.py 1970-01-01 00:00:00 +0000
1437+++ hooks/charmhelpers/core/services/base.py 2015-01-07 17:22:49 +0000
1438@@ -0,0 +1,313 @@
1439+import os
1440+import re
1441+import json
1442+from collections import Iterable
1443+
1444+from charmhelpers.core import host
1445+from charmhelpers.core import hookenv
1446+
1447+
1448+__all__ = ['ServiceManager', 'ManagerCallback',
1449+ 'PortManagerCallback', 'open_ports', 'close_ports', 'manage_ports',
1450+ 'service_restart', 'service_stop']
1451+
1452+
1453+class ServiceManager(object):
1454+ def __init__(self, services=None):
1455+ """
1456+ Register a list of services, given their definitions.
1457+
1458+ Service definitions are dicts in the following formats (all keys except
1459+ 'service' are optional)::
1460+
1461+ {
1462+ "service": <service name>,
1463+ "required_data": <list of required data contexts>,
1464+ "provided_data": <list of provided data contexts>,
1465+ "data_ready": <one or more callbacks>,
1466+ "data_lost": <one or more callbacks>,
1467+ "start": <one or more callbacks>,
1468+ "stop": <one or more callbacks>,
1469+ "ports": <list of ports to manage>,
1470+ }
1471+
1472+ The 'required_data' list should contain dicts of required data (or
1473+ dependency managers that act like dicts and know how to collect the data).
1474+ Only when all items in the 'required_data' list are populated are the list
1475+ of 'data_ready' and 'start' callbacks executed. See `is_ready()` for more
1476+ information.
1477+
1478+ The 'provided_data' list should contain relation data providers, most likely
1479+ a subclass of :class:`charmhelpers.core.services.helpers.RelationContext`,
1480+ that will indicate a set of data to set on a given relation.
1481+
1482+ The 'data_ready' value should be either a single callback, or a list of
1483+ callbacks, to be called when all items in 'required_data' pass `is_ready()`.
1484+ Each callback will be called with the service name as the only parameter.
1485+ After all of the 'data_ready' callbacks are called, the 'start' callbacks
1486+ are fired.
1487+
1488+ The 'data_lost' value should be either a single callback, or a list of
1489+ callbacks, to be called when a 'required_data' item no longer passes
1490+ `is_ready()`. Each callback will be called with the service name as the
1491+ only parameter. After all of the 'data_lost' callbacks are called,
1492+ the 'stop' callbacks are fired.
1493+
1494+ The 'start' value should be either a single callback, or a list of
1495+ callbacks, to be called when starting the service, after the 'data_ready'
1496+ callbacks are complete. Each callback will be called with the service
1497+ name as the only parameter. This defaults to
1498+ `[host.service_start, services.open_ports]`.
1499+
1500+ The 'stop' value should be either a single callback, or a list of
1501+ callbacks, to be called when stopping the service. If the service is
1502+ being stopped because it no longer has all of its 'required_data', this
1503+ will be called after all of the 'data_lost' callbacks are complete.
1504+ Each callback will be called with the service name as the only parameter.
1505+ This defaults to `[services.close_ports, host.service_stop]`.
1506+
1507+ The 'ports' value should be a list of ports to manage. The default
1508+ 'start' handler will open the ports after the service is started,
1509+ and the default 'stop' handler will close the ports prior to stopping
1510+ the service.
1511+
1512+
1513+ Examples:
1514+
1515+ The following registers an Upstart service called bingod that depends on
1516+ a mongodb relation and which runs a custom `db_migrate` function prior to
1517+ restarting the service, and a Runit service called spadesd::
1518+
1519+ manager = services.ServiceManager([
1520+ {
1521+ 'service': 'bingod',
1522+ 'ports': [80, 443],
1523+ 'required_data': [MongoRelation(), config(), {'my': 'data'}],
1524+ 'data_ready': [
1525+ services.template(source='bingod.conf'),
1526+ services.template(source='bingod.ini',
1527+ target='/etc/bingod.ini',
1528+ owner='bingo', perms=0400),
1529+ ],
1530+ },
1531+ {
1532+ 'service': 'spadesd',
1533+ 'data_ready': services.template(source='spadesd_run.j2',
1534+ target='/etc/sv/spadesd/run',
1535+ perms=0555),
1536+ 'start': runit_start,
1537+ 'stop': runit_stop,
1538+ },
1539+ ])
1540+ manager.manage()
1541+ """
1542+ self._ready_file = os.path.join(hookenv.charm_dir(), 'READY-SERVICES.json')
1543+ self._ready = None
1544+ self.services = {}
1545+ for service in services or []:
1546+ service_name = service['service']
1547+ self.services[service_name] = service
1548+
1549+ def manage(self):
1550+ """
1551+ Handle the current hook by doing The Right Thing with the registered services.
1552+ """
1553+ hook_name = hookenv.hook_name()
1554+ if hook_name == 'stop':
1555+ self.stop_services()
1556+ else:
1557+ self.provide_data()
1558+ self.reconfigure_services()
1559+ cfg = hookenv.config()
1560+ if cfg.implicit_save:
1561+ cfg.save()
1562+
1563+ def provide_data(self):
1564+ """
1565+ Set the relation data for each provider in the ``provided_data`` list.
1566+
1567+ A provider must have a `name` attribute, which indicates which relation
1568+ to set data on, and a `provide_data()` method, which returns a dict of
1569+ data to set.
1570+ """
1571+ hook_name = hookenv.hook_name()
1572+ for service in self.services.values():
1573+ for provider in service.get('provided_data', []):
1574+ if re.match(r'{}-relation-(joined|changed)'.format(provider.name), hook_name):
1575+ data = provider.provide_data()
1576+ _ready = provider._is_ready(data) if hasattr(provider, '_is_ready') else data
1577+ if _ready:
1578+ hookenv.relation_set(None, data)
1579+
1580+ def reconfigure_services(self, *service_names):
1581+ """
1582+ Update all files for one or more registered services, and,
1583+ if ready, optionally restart them.
1584+
1585+ If no service names are given, reconfigures all registered services.
1586+ """
1587+ for service_name in service_names or self.services.keys():
1588+ if self.is_ready(service_name):
1589+ self.fire_event('data_ready', service_name)
1590+ self.fire_event('start', service_name, default=[
1591+ service_restart,
1592+ manage_ports])
1593+ self.save_ready(service_name)
1594+ else:
1595+ if self.was_ready(service_name):
1596+ self.fire_event('data_lost', service_name)
1597+ self.fire_event('stop', service_name, default=[
1598+ manage_ports,
1599+ service_stop])
1600+ self.save_lost(service_name)
1601+
1602+ def stop_services(self, *service_names):
1603+ """
1604+ Stop one or more registered services, by name.
1605+
1606+ If no service names are given, stops all registered services.
1607+ """
1608+ for service_name in service_names or self.services.keys():
1609+ self.fire_event('stop', service_name, default=[
1610+ manage_ports,
1611+ service_stop])
1612+
1613+ def get_service(self, service_name):
1614+ """
1615+ Given the name of a registered service, return its service definition.
1616+ """
1617+ service = self.services.get(service_name)
1618+ if not service:
1619+ raise KeyError('Service not registered: %s' % service_name)
1620+ return service
1621+
1622+ def fire_event(self, event_name, service_name, default=None):
1623+ """
1624+ Fire a data_ready, data_lost, start, or stop event on a given service.
1625+ """
1626+ service = self.get_service(service_name)
1627+ callbacks = service.get(event_name, default)
1628+ if not callbacks:
1629+ return
1630+ if not isinstance(callbacks, Iterable):
1631+ callbacks = [callbacks]
1632+ for callback in callbacks:
1633+ if isinstance(callback, ManagerCallback):
1634+ callback(self, service_name, event_name)
1635+ else:
1636+ callback(service_name)
1637+
1638+ def is_ready(self, service_name):
1639+ """
1640+ Determine if a registered service is ready, by checking its 'required_data'.
1641+
1642+ A 'required_data' item can be any mapping type, and is considered ready
1643+ if `bool(item)` evaluates as True.
1644+ """
1645+ service = self.get_service(service_name)
1646+ reqs = service.get('required_data', [])
1647+ return all(bool(req) for req in reqs)
1648+
1649+ def _load_ready_file(self):
1650+ if self._ready is not None:
1651+ return
1652+ if os.path.exists(self._ready_file):
1653+ with open(self._ready_file) as fp:
1654+ self._ready = set(json.load(fp))
1655+ else:
1656+ self._ready = set()
1657+
1658+ def _save_ready_file(self):
1659+ if self._ready is None:
1660+ return
1661+ with open(self._ready_file, 'w') as fp:
1662+ json.dump(list(self._ready), fp)
1663+
1664+ def save_ready(self, service_name):
1665+ """
1666+ Save an indicator that the given service is now data_ready.
1667+ """
1668+ self._load_ready_file()
1669+ self._ready.add(service_name)
1670+ self._save_ready_file()
1671+
1672+ def save_lost(self, service_name):
1673+ """
1674+ Save an indicator that the given service is no longer data_ready.
1675+ """
1676+ self._load_ready_file()
1677+ self._ready.discard(service_name)
1678+ self._save_ready_file()
1679+
1680+ def was_ready(self, service_name):
1681+ """
1682+ Determine if the given service was previously data_ready.
1683+ """
1684+ self._load_ready_file()
1685+ return service_name in self._ready
1686+
1687+
1688+class ManagerCallback(object):
1689+ """
1690+ Special case of a callback that takes the `ServiceManager` instance
1691+ in addition to the service name.
1692+
1693+ Subclasses should implement `__call__` which should accept three parameters:
1694+
1695+ * `manager` The `ServiceManager` instance
1696+ * `service_name` The name of the service it's being triggered for
1697+ * `event_name` The name of the event that this callback is handling
1698+ """
1699+ def __call__(self, manager, service_name, event_name):
1700+ raise NotImplementedError()
1701+
1702+
1703+class PortManagerCallback(ManagerCallback):
1704+ """
1705+ Callback class that will open or close ports, for use as either
1706+ a start or stop action.
1707+ """
1708+ def __call__(self, manager, service_name, event_name):
1709+ service = manager.get_service(service_name)
1710+ new_ports = service.get('ports', [])
1711+ port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name))
1712+ if os.path.exists(port_file):
1713+ with open(port_file) as fp:
1714+ old_ports = fp.read().split(',')
1715+ for old_port in old_ports:
1716+ if bool(old_port):
1717+ old_port = int(old_port)
1718+ if old_port not in new_ports:
1719+ hookenv.close_port(old_port)
1720+ with open(port_file, 'w') as fp:
1721+ fp.write(','.join(str(port) for port in new_ports))
1722+ for port in new_ports:
1723+ if event_name == 'start':
1724+ hookenv.open_port(port)
1725+ elif event_name == 'stop':
1726+ hookenv.close_port(port)
1727+
1728+
1729+def service_stop(service_name):
1730+ """
1731+ Wrapper around host.service_stop to prevent spurious "unknown service"
1732+ messages in the logs.
1733+ """
1734+ if host.service_running(service_name):
1735+ host.service_stop(service_name)
1736+
1737+
1738+def service_restart(service_name):
1739+ """
1740+ Wrapper around host.service_restart to prevent spurious "unknown service"
1741+ messages in the logs.
1742+ """
1743+ if host.service_available(service_name):
1744+ if host.service_running(service_name):
1745+ host.service_restart(service_name)
1746+ else:
1747+ host.service_start(service_name)
1748+
1749+
1750+# Convenience aliases
1751+open_ports = close_ports = manage_ports = PortManagerCallback()
1752
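
The ServiceManager framework above is easiest to see end-to-end with a small example. The following is a minimal sketch of a charm's services definition, assuming the ServiceManager class and its manage() entry point defined earlier in this file; the service name, port and callback are illustrative only, not part of this charm:

    # Hypothetical $CHARM_DIR/hooks/services.py using the framework above.
    from charmhelpers.core import hookenv
    from charmhelpers.core.services.base import ServiceManager

    def write_config(service_name):
        # Plain callback: receives only the service name.
        hookenv.log('writing config for %s' % service_name)

    def manage():
        manager = ServiceManager([
            {
                'service': 'myservice',              # init service to restart
                'ports': [8080],                     # handled by manage_ports
                'required_data': [{'ready': True}],  # is_ready() checks bool(item)
                'data_ready': [write_config],        # fired before 'start'
            },
        ])
        manager.manage()  # dispatches based on the current hook

    if __name__ == '__main__':
        manage()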
1753=== added file 'hooks/charmhelpers/core/services/helpers.py'
1754--- hooks/charmhelpers/core/services/helpers.py 1970-01-01 00:00:00 +0000
1755+++ hooks/charmhelpers/core/services/helpers.py 2015-01-07 17:22:49 +0000
1756@@ -0,0 +1,243 @@
1757+import os
1758+import yaml
1759+from charmhelpers.core import hookenv
1760+from charmhelpers.core import templating
1761+
1762+from charmhelpers.core.services.base import ManagerCallback
1763+
1764+
1765+__all__ = ['RelationContext', 'TemplateCallback',
1766+ 'render_template', 'template']
1767+
1768+
1769+class RelationContext(dict):
1770+ """
1771+ Base class for a context generator that gets relation data from juju.
1772+
1773+ Subclasses must provide the attributes `name`, which is the name of the
1774+ interface of interest, `interface`, which is the type of the interface of
1775+ interest, and `required_keys`, which is the set of keys required for the
1776+ relation to be considered complete. The data for all interfaces matching
1777+    the `name` attribute that are complete will be used to populate the dictionary
1778+ values (see `get_data`, below).
1779+
1780+ The generated context will be namespaced under the relation :attr:`name`,
1781+ to prevent potential naming conflicts.
1782+
1783+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
1784+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
1785+ """
1786+ name = None
1787+ interface = None
1788+ required_keys = []
1789+
1790+ def __init__(self, name=None, additional_required_keys=None):
1791+ if name is not None:
1792+ self.name = name
1793+ if additional_required_keys is not None:
1794+            self.required_keys = self.required_keys + additional_required_keys
1795+ self.get_data()
1796+
1797+ def __bool__(self):
1798+ """
1799+ Returns True if all of the required_keys are available.
1800+ """
1801+ return self.is_ready()
1802+
1803+ __nonzero__ = __bool__
1804+
1805+ def __repr__(self):
1806+ return super(RelationContext, self).__repr__()
1807+
1808+ def is_ready(self):
1809+ """
1810+ Returns True if all of the `required_keys` are available from any units.
1811+ """
1812+ ready = len(self.get(self.name, [])) > 0
1813+ if not ready:
1814+ hookenv.log('Incomplete relation: {}'.format(self.__class__.__name__), hookenv.DEBUG)
1815+ return ready
1816+
1817+ def _is_ready(self, unit_data):
1818+ """
1819+ Helper method that tests a set of relation data and returns True if
1820+ all of the `required_keys` are present.
1821+ """
1822+ return set(unit_data.keys()).issuperset(set(self.required_keys))
1823+
1824+ def get_data(self):
1825+ """
1826+ Retrieve the relation data for each unit involved in a relation and,
1827+ if complete, store it in a list under `self[self.name]`. This
1828+ is automatically called when the RelationContext is instantiated.
1829+
1830+        The units are sorted lexicographically first by the service ID, then by
1831+ the unit ID. Thus, if an interface has two other services, 'db:1'
1832+ and 'db:2', with 'db:1' having two units, 'wordpress/0' and 'wordpress/1',
1833+ and 'db:2' having one unit, 'mediawiki/0', all of which have a complete
1834+ set of data, the relation data for the units will be stored in the
1835+ order: 'wordpress/0', 'wordpress/1', 'mediawiki/0'.
1836+
1837+ If you only care about a single unit on the relation, you can just
1838+ access it as `{{ interface[0]['key'] }}`. However, if you can at all
1839+ support multiple units on a relation, you should iterate over the list,
1840+ like::
1841+
1842+ {% for unit in interface -%}
1843+ {{ unit['key'] }}{% if not loop.last %},{% endif %}
1844+ {%- endfor %}
1845+
1846+ Note that since all sets of relation data from all related services and
1847+ units are in a single list, if you need to know which service or unit a
1848+ set of data came from, you'll need to extend this class to preserve
1849+ that information.
1850+ """
1851+ if not hookenv.relation_ids(self.name):
1852+ return
1853+
1854+ ns = self.setdefault(self.name, [])
1855+ for rid in sorted(hookenv.relation_ids(self.name)):
1856+ for unit in sorted(hookenv.related_units(rid)):
1857+ reldata = hookenv.relation_get(rid=rid, unit=unit)
1858+ if self._is_ready(reldata):
1859+ ns.append(reldata)
1860+
1861+ def provide_data(self):
1862+ """
1863+ Return data to be relation_set for this interface.
1864+ """
1865+ return {}
1866+
1867+
1868+class MysqlRelation(RelationContext):
1869+ """
1870+ Relation context for the `mysql` interface.
1871+
1872+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
1873+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
1874+ """
1875+ name = 'db'
1876+ interface = 'mysql'
1877+ required_keys = ['host', 'user', 'password', 'database']
1878+
1879+
1880+class HttpRelation(RelationContext):
1881+ """
1882+ Relation context for the `http` interface.
1883+
1884+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
1885+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
1886+ """
1887+ name = 'website'
1888+ interface = 'http'
1889+ required_keys = ['host', 'port']
1890+
1891+ def provide_data(self):
1892+ return {
1893+ 'host': hookenv.unit_get('private-address'),
1894+ 'port': 80,
1895+ }
1896+
1897+
1898+class RequiredConfig(dict):
1899+ """
1900+    Data context that loads config options, requiring one or more of them to be set.
1901+
1902+ Once the required options have been changed from their default values, all
1903+ config options will be available, namespaced under `config` to prevent
1904+ potential naming conflicts (for example, between a config option and a
1905+ relation property).
1906+
1907+ :param list *args: List of options that must be changed from their default values.
1908+ """
1909+
1910+ def __init__(self, *args):
1911+ self.required_options = args
1912+ self['config'] = hookenv.config()
1913+ with open(os.path.join(hookenv.charm_dir(), 'config.yaml')) as fp:
1914+ self.config = yaml.load(fp).get('options', {})
1915+
1916+ def __bool__(self):
1917+ for option in self.required_options:
1918+ if option not in self['config']:
1919+ return False
1920+ current_value = self['config'][option]
1921+ default_value = self.config[option].get('default')
1922+ if current_value == default_value:
1923+ return False
1924+ if current_value in (None, '') and default_value in (None, ''):
1925+ return False
1926+ return True
1927+
1928+ def __nonzero__(self):
1929+ return self.__bool__()
1930+
1931+
1932+class StoredContext(dict):
1933+ """
1934+ A data context that always returns the data that it was first created with.
1935+
1936+ This is useful to do a one-time generation of things like passwords, that
1937+ will thereafter use the same value that was originally generated, instead
1938+ of generating a new value each time it is run.
1939+ """
1940+ def __init__(self, file_name, config_data):
1941+ """
1942+ If the file exists, populate `self` with the data from the file.
1943+ Otherwise, populate with the given data and persist it to the file.
1944+ """
1945+ if os.path.exists(file_name):
1946+ self.update(self.read_context(file_name))
1947+ else:
1948+ self.store_context(file_name, config_data)
1949+ self.update(config_data)
1950+
1951+ def store_context(self, file_name, config_data):
1952+ if not os.path.isabs(file_name):
1953+ file_name = os.path.join(hookenv.charm_dir(), file_name)
1954+ with open(file_name, 'w') as file_stream:
1955+ os.fchmod(file_stream.fileno(), 0o600)
1956+ yaml.dump(config_data, file_stream)
1957+
1958+ def read_context(self, file_name):
1959+ if not os.path.isabs(file_name):
1960+ file_name = os.path.join(hookenv.charm_dir(), file_name)
1961+ with open(file_name, 'r') as file_stream:
1962+ data = yaml.load(file_stream)
1963+ if not data:
1964+ raise OSError("%s is empty" % file_name)
1965+ return data
1966+
1967+
1968+class TemplateCallback(ManagerCallback):
1969+ """
1970+ Callback class that will render a Jinja2 template, for use as a ready
1971+ action.
1972+
1973+ :param str source: The template source file, relative to
1974+ `$CHARM_DIR/templates`
1975+
1976+ :param str target: The target to write the rendered template to
1977+ :param str owner: The owner of the rendered file
1978+ :param str group: The group of the rendered file
1979+ :param int perms: The permissions of the rendered file
1980+ """
1981+ def __init__(self, source, target,
1982+ owner='root', group='root', perms=0o444):
1983+ self.source = source
1984+ self.target = target
1985+ self.owner = owner
1986+ self.group = group
1987+ self.perms = perms
1988+
1989+ def __call__(self, manager, service_name, event_name):
1990+ service = manager.get_service(service_name)
1991+ context = {}
1992+ for ctx in service.get('required_data', []):
1993+ context.update(ctx)
1994+ templating.render(self.source, self.target, context,
1995+ self.owner, self.group, self.perms)
1996+
1997+
1998+# Convenience aliases for templates
1999+render_template = template = TemplateCallback
2000
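
A short sketch of how these helpers combine in practice; the 'amqp' relation, the 'vhost' config option and the template names are assumptions for illustration, not part of this charm:

    from charmhelpers.core.services.base import ServiceManager
    from charmhelpers.core.services import helpers

    class AMQPRelation(helpers.RelationContext):
        name = 'amqp'
        interface = 'rabbitmq'
        required_keys = ['hostname', 'password']

    manager = ServiceManager([{
        'service': 'myservice',
        # The service is only started and its template rendered once every
        # required_data item evaluates as True.
        'required_data': [AMQPRelation(), helpers.RequiredConfig('vhost')],
        'data_ready': [
            helpers.render_template(source='myservice.conf.j2',
                                    target='/etc/myservice/myservice.conf'),
        ],
    }])
    manager.manage()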
2001=== added file 'hooks/charmhelpers/core/sysctl.py'
2002--- hooks/charmhelpers/core/sysctl.py 1970-01-01 00:00:00 +0000
2003+++ hooks/charmhelpers/core/sysctl.py 2015-01-07 17:22:49 +0000
2004@@ -0,0 +1,34 @@
2005+#!/usr/bin/env python
2006+# -*- coding: utf-8 -*-
2007+
2008+__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
2009+
2010+import yaml
2011+
2012+from subprocess import check_call
2013+
2014+from charmhelpers.core.hookenv import (
2015+ log,
2016+ DEBUG,
2017+)
2018+
2019+
2020+def create(sysctl_dict, sysctl_file):
2021+ """Creates a sysctl.conf file from a YAML associative array
2022+
2023+    :param sysctl_dict: a YAML string of sysctl options eg "{ 'kernel.max_pid': 1337 }"
2024+    :type sysctl_dict: str
2025+ :param sysctl_file: path to the sysctl file to be saved
2026+ :type sysctl_file: str or unicode
2027+ :returns: None
2028+ """
2029+ sysctl_dict = yaml.load(sysctl_dict)
2030+
2031+ with open(sysctl_file, "w") as fd:
2032+ for key, value in sysctl_dict.items():
2033+ fd.write("{}={}\n".format(key, value))
2034+
2035+ log("Updating sysctl_file: %s values: %s" % (sysctl_file, sysctl_dict),
2036+ level=DEBUG)
2037+
2038+ check_call(["sysctl", "-p", sysctl_file])
2039
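
Note that the helper parses its first argument with yaml.load(), so in practice it is handed a YAML mapping as a string. A minimal usage sketch, assuming a hypothetical target path (the call needs root, since it runs sysctl -p):

    from charmhelpers.core.sysctl import create

    create("{net.ipv4.ip_forward: 1, vm.swappiness: 10}",
           "/etc/sysctl.d/50-myservice.conf")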
2040=== added file 'hooks/charmhelpers/core/templating.py'
2041--- hooks/charmhelpers/core/templating.py 1970-01-01 00:00:00 +0000
2042+++ hooks/charmhelpers/core/templating.py 2015-01-07 17:22:49 +0000
2043@@ -0,0 +1,52 @@
2044+import os
2045+
2046+from charmhelpers.core import host
2047+from charmhelpers.core import hookenv
2048+
2049+
2050+def render(source, target, context, owner='root', group='root',
2051+ perms=0o444, templates_dir=None):
2052+ """
2053+ Render a template.
2054+
2055+ The `source` path, if not absolute, is relative to the `templates_dir`.
2056+
2057+ The `target` path should be absolute.
2058+
2059+ The context should be a dict containing the values to be replaced in the
2060+ template.
2061+
2062+ The `owner`, `group`, and `perms` options will be passed to `write_file`.
2063+
2064+ If omitted, `templates_dir` defaults to the `templates` folder in the charm.
2065+
2066+ Note: Using this requires python-jinja2; if it is not installed, calling
2067+ this will attempt to use charmhelpers.fetch.apt_install to install it.
2068+ """
2069+ try:
2070+ from jinja2 import FileSystemLoader, Environment, exceptions
2071+ except ImportError:
2072+ try:
2073+ from charmhelpers.fetch import apt_install
2074+ except ImportError:
2075+ hookenv.log('Could not import jinja2, and could not import '
2076+ 'charmhelpers.fetch to install it',
2077+ level=hookenv.ERROR)
2078+ raise
2079+ apt_install('python-jinja2', fatal=True)
2080+ from jinja2 import FileSystemLoader, Environment, exceptions
2081+
2082+ if templates_dir is None:
2083+ templates_dir = os.path.join(hookenv.charm_dir(), 'templates')
2084+ loader = Environment(loader=FileSystemLoader(templates_dir))
2085+ try:
2087+ template = loader.get_template(source)
2088+ except exceptions.TemplateNotFound as e:
2089+ hookenv.log('Could not load template %s from %s.' %
2090+ (source, templates_dir),
2091+ level=hookenv.ERROR)
2092+ raise e
2093+ content = template.render(context)
2094+ host.mkdir(os.path.dirname(target))
2095+ host.write_file(target, content, owner, group, perms)
2096
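
For example, a charm could render a template from $CHARM_DIR/templates like this; the template name, target path and context keys are illustrative, not files shipped by this charm:

    from charmhelpers.core import templating

    templating.render(source='myservice.conf.j2',
                      target='/etc/myservice/myservice.conf',
                      context={'port': 8080},
                      owner='jenkins', group='nogroup', perms=0o644)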
2097=== added directory 'hooks/charmhelpers/fetch'
2098=== added file 'hooks/charmhelpers/fetch/__init__.py'
2099--- hooks/charmhelpers/fetch/__init__.py 1970-01-01 00:00:00 +0000
2100+++ hooks/charmhelpers/fetch/__init__.py 2015-01-07 17:22:49 +0000
2101@@ -0,0 +1,416 @@
2102+import importlib
2103+from tempfile import NamedTemporaryFile
2104+import time
2105+from yaml import safe_load
2106+from charmhelpers.core.host import (
2107+ lsb_release
2108+)
2109+import subprocess
2110+from charmhelpers.core.hookenv import (
2111+ config,
2112+ log,
2113+)
2114+import os
2115+
2116+import six
2117+if six.PY3:
2118+ from urllib.parse import urlparse, urlunparse
2119+else:
2120+ from urlparse import urlparse, urlunparse
2121+
2122+
2123+CLOUD_ARCHIVE = """# Ubuntu Cloud Archive
2124+deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
2125+"""
2126+PROPOSED_POCKET = """# Proposed
2127+deb http://archive.ubuntu.com/ubuntu {}-proposed main universe multiverse restricted
2128+"""
2129+CLOUD_ARCHIVE_POCKETS = {
2130+ # Folsom
2131+ 'folsom': 'precise-updates/folsom',
2132+ 'precise-folsom': 'precise-updates/folsom',
2133+ 'precise-folsom/updates': 'precise-updates/folsom',
2134+ 'precise-updates/folsom': 'precise-updates/folsom',
2135+ 'folsom/proposed': 'precise-proposed/folsom',
2136+ 'precise-folsom/proposed': 'precise-proposed/folsom',
2137+ 'precise-proposed/folsom': 'precise-proposed/folsom',
2138+ # Grizzly
2139+ 'grizzly': 'precise-updates/grizzly',
2140+ 'precise-grizzly': 'precise-updates/grizzly',
2141+ 'precise-grizzly/updates': 'precise-updates/grizzly',
2142+ 'precise-updates/grizzly': 'precise-updates/grizzly',
2143+ 'grizzly/proposed': 'precise-proposed/grizzly',
2144+ 'precise-grizzly/proposed': 'precise-proposed/grizzly',
2145+ 'precise-proposed/grizzly': 'precise-proposed/grizzly',
2146+ # Havana
2147+ 'havana': 'precise-updates/havana',
2148+ 'precise-havana': 'precise-updates/havana',
2149+ 'precise-havana/updates': 'precise-updates/havana',
2150+ 'precise-updates/havana': 'precise-updates/havana',
2151+ 'havana/proposed': 'precise-proposed/havana',
2152+ 'precise-havana/proposed': 'precise-proposed/havana',
2153+ 'precise-proposed/havana': 'precise-proposed/havana',
2154+ # Icehouse
2155+ 'icehouse': 'precise-updates/icehouse',
2156+ 'precise-icehouse': 'precise-updates/icehouse',
2157+ 'precise-icehouse/updates': 'precise-updates/icehouse',
2158+ 'precise-updates/icehouse': 'precise-updates/icehouse',
2159+ 'icehouse/proposed': 'precise-proposed/icehouse',
2160+ 'precise-icehouse/proposed': 'precise-proposed/icehouse',
2161+ 'precise-proposed/icehouse': 'precise-proposed/icehouse',
2162+ # Juno
2163+ 'juno': 'trusty-updates/juno',
2164+ 'trusty-juno': 'trusty-updates/juno',
2165+ 'trusty-juno/updates': 'trusty-updates/juno',
2166+ 'trusty-updates/juno': 'trusty-updates/juno',
2167+ 'juno/proposed': 'trusty-proposed/juno',
2169+ 'trusty-juno/proposed': 'trusty-proposed/juno',
2170+ 'trusty-proposed/juno': 'trusty-proposed/juno',
2171+}
2172+
2173+# The order of this list is very important. Handlers should be listed from
2174+# least- to most-specific URL matching.
2175+FETCH_HANDLERS = (
2176+ 'charmhelpers.fetch.archiveurl.ArchiveUrlFetchHandler',
2177+ 'charmhelpers.fetch.bzrurl.BzrUrlFetchHandler',
2178+ 'charmhelpers.fetch.giturl.GitUrlFetchHandler',
2179+)
2180+
2181+APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT.
2182+APT_NO_LOCK_RETRY_DELAY = 10 # Wait 10 seconds between apt lock checks.
2183+APT_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times.
2184+
2185+
2186+class SourceConfigError(Exception):
2187+ pass
2188+
2189+
2190+class UnhandledSource(Exception):
2191+ pass
2192+
2193+
2194+class AptLockError(Exception):
2195+ pass
2196+
2197+
2198+class BaseFetchHandler(object):
2199+
2200+ """Base class for FetchHandler implementations in fetch plugins"""
2201+
2202+ def can_handle(self, source):
2203+ """Returns True if the source can be handled. Otherwise returns
2204+ a string explaining why it cannot"""
2205+ return "Wrong source type"
2206+
2207+ def install(self, source):
2208+ """Try to download and unpack the source. Return the path to the
2209+ unpacked files or raise UnhandledSource."""
2210+ raise UnhandledSource("Wrong source type {}".format(source))
2211+
2212+ def parse_url(self, url):
2213+ return urlparse(url)
2214+
2215+ def base_url(self, url):
2216+ """Return url without querystring or fragment"""
2217+ parts = list(self.parse_url(url))
2218+ parts[4:] = ['' for i in parts[4:]]
2219+ return urlunparse(parts)
2220+
2221+
2222+def filter_installed_packages(packages):
2223+ """Returns a list of packages that require installation"""
2224+ cache = apt_cache()
2225+ _pkgs = []
2226+ for package in packages:
2227+ try:
2228+ p = cache[package]
2229+ p.current_ver or _pkgs.append(package)
2230+ except KeyError:
2231+ log('Package {} has no installation candidate.'.format(package),
2232+ level='WARNING')
2233+ _pkgs.append(package)
2234+ return _pkgs
2235+
2236+
2237+def apt_cache(in_memory=True):
2238+ """Build and return an apt cache"""
2239+ import apt_pkg
2240+ apt_pkg.init()
2241+ if in_memory:
2242+ apt_pkg.config.set("Dir::Cache::pkgcache", "")
2243+ apt_pkg.config.set("Dir::Cache::srcpkgcache", "")
2244+ return apt_pkg.Cache()
2245+
2246+
2247+def apt_install(packages, options=None, fatal=False):
2248+ """Install one or more packages"""
2249+ if options is None:
2250+ options = ['--option=Dpkg::Options::=--force-confold']
2251+
2252+ cmd = ['apt-get', '--assume-yes']
2253+ cmd.extend(options)
2254+ cmd.append('install')
2255+ if isinstance(packages, six.string_types):
2256+ cmd.append(packages)
2257+ else:
2258+ cmd.extend(packages)
2259+ log("Installing {} with options: {}".format(packages,
2260+ options))
2261+ _run_apt_command(cmd, fatal)
2262+
2263+
2264+def apt_upgrade(options=None, fatal=False, dist=False):
2265+ """Upgrade all packages"""
2266+ if options is None:
2267+ options = ['--option=Dpkg::Options::=--force-confold']
2268+
2269+ cmd = ['apt-get', '--assume-yes']
2270+ cmd.extend(options)
2271+ if dist:
2272+ cmd.append('dist-upgrade')
2273+ else:
2274+ cmd.append('upgrade')
2275+ log("Upgrading with options: {}".format(options))
2276+ _run_apt_command(cmd, fatal)
2277+
2278+
2279+def apt_update(fatal=False):
2280+ """Update local apt cache"""
2281+ cmd = ['apt-get', 'update']
2282+ _run_apt_command(cmd, fatal)
2283+
2284+
2285+def apt_purge(packages, fatal=False):
2286+ """Purge one or more packages"""
2287+ cmd = ['apt-get', '--assume-yes', 'purge']
2288+ if isinstance(packages, six.string_types):
2289+ cmd.append(packages)
2290+ else:
2291+ cmd.extend(packages)
2292+ log("Purging {}".format(packages))
2293+ _run_apt_command(cmd, fatal)
2294+
2295+
2296+def apt_hold(packages, fatal=False):
2297+ """Hold one or more packages"""
2298+ cmd = ['apt-mark', 'hold']
2299+ if isinstance(packages, six.string_types):
2300+ cmd.append(packages)
2301+ else:
2302+ cmd.extend(packages)
2303+ log("Holding {}".format(packages))
2304+
2305+ if fatal:
2306+ subprocess.check_call(cmd)
2307+ else:
2308+ subprocess.call(cmd)
2309+
2310+
2311+def add_source(source, key=None):
2312+ """Add a package source to this system.
2313+
2314+ @param source: a URL or sources.list entry, as supported by
2315+ add-apt-repository(1). Examples::
2316+
2317+ ppa:charmers/example
2318+ deb https://stub:key@private.example.com/ubuntu trusty main
2319+
2320+ In addition:
2321+ 'proposed:' may be used to enable the standard 'proposed'
2322+ pocket for the release.
2323+ 'cloud:' may be used to activate official cloud archive pockets,
2324+ such as 'cloud:icehouse'
2325+ 'distro' may be used as a noop
2326+
2327+ @param key: A key to be added to the system's APT keyring and used
2328+ to verify the signatures on packages. Ideally, this should be an
2329+ ASCII format GPG public key including the block headers. A GPG key
2330+ id may also be used, but be aware that only insecure protocols are
2331+    available to retrieve the actual public key from a public keyserver,
2332+    placing your Juju environment at risk. ppa and cloud archive keys
2333+    are securely added automatically, so should not be provided.
2334+ """
2335+ if source is None:
2336+ log('Source is not present. Skipping')
2337+ return
2338+
2339+ if (source.startswith('ppa:') or
2340+ source.startswith('http') or
2341+ source.startswith('deb ') or
2342+ source.startswith('cloud-archive:')):
2343+ subprocess.check_call(['add-apt-repository', '--yes', source])
2344+ elif source.startswith('cloud:'):
2345+ apt_install(filter_installed_packages(['ubuntu-cloud-keyring']),
2346+ fatal=True)
2347+ pocket = source.split(':')[-1]
2348+ if pocket not in CLOUD_ARCHIVE_POCKETS:
2349+ raise SourceConfigError(
2350+ 'Unsupported cloud: source option %s' %
2351+ pocket)
2352+ actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
2353+ with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
2354+ apt.write(CLOUD_ARCHIVE.format(actual_pocket))
2355+ elif source == 'proposed':
2356+ release = lsb_release()['DISTRIB_CODENAME']
2357+ with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
2358+ apt.write(PROPOSED_POCKET.format(release))
2359+ elif source == 'distro':
2360+ pass
2361+ else:
2362+ log("Unknown source: {!r}".format(source))
2363+
2364+ if key:
2365+ if '-----BEGIN PGP PUBLIC KEY BLOCK-----' in key:
2366+ with NamedTemporaryFile('w+') as key_file:
2367+ key_file.write(key)
2368+ key_file.flush()
2369+ key_file.seek(0)
2370+ subprocess.check_call(['apt-key', 'add', '-'], stdin=key_file)
2371+ else:
2372+ # Note that hkp: is in no way a secure protocol. Using a
2373+ # GPG key id is pointless from a security POV unless you
2374+ # absolutely trust your network and DNS.
2375+ subprocess.check_call(['apt-key', 'adv', '--keyserver',
2376+ 'hkp://keyserver.ubuntu.com:80', '--recv',
2377+ key])
2378+
2379+
2380+def configure_sources(update=False,
2381+ sources_var='install_sources',
2382+ keys_var='install_keys'):
2383+ """
2384+ Configure multiple sources from charm configuration.
2385+
2386+ The lists are encoded as yaml fragments in the configuration.
2387+    The fragment needs to be included as a string. Sources and their
2388+ corresponding keys are of the types supported by add_source().
2389+
2390+ Example config:
2391+ install_sources: |
2392+ - "ppa:foo"
2393+ - "http://example.com/repo precise main"
2394+ install_keys: |
2395+ - null
2396+ - "a1b2c3d4"
2397+
2398+ Note that 'null' (a.k.a. None) should not be quoted.
2399+ """
2400+ sources = safe_load((config(sources_var) or '').strip()) or []
2401+ keys = safe_load((config(keys_var) or '').strip()) or None
2402+
2403+ if isinstance(sources, six.string_types):
2404+ sources = [sources]
2405+
2406+ if keys is None:
2407+ for source in sources:
2408+ add_source(source, None)
2409+ else:
2410+ if isinstance(keys, six.string_types):
2411+ keys = [keys]
2412+
2413+ if len(sources) != len(keys):
2414+ raise SourceConfigError(
2415+ 'Install sources and keys lists are different lengths')
2416+ for source, key in zip(sources, keys):
2417+ add_source(source, key)
2418+ if update:
2419+ apt_update(fatal=True)
2420+
2421+
2422+def install_remote(source, *args, **kwargs):
2423+ """
2424+ Install a file tree from a remote source
2425+
2426+ The specified source should be a url of the form:
2427+ scheme://[host]/path[#[option=value][&...]]
2428+
2429+    Schemes supported are based on this module's submodules.
2430+ Options supported are submodule-specific.
2431+ Additional arguments are passed through to the submodule.
2432+
2433+ For example::
2434+
2435+ dest = install_remote('http://example.com/archive.tgz',
2436+ checksum='deadbeef',
2437+ hash_type='sha1')
2438+
2439+ This will download `archive.tgz`, validate it using SHA1 and, if
2440+ the file is ok, extract it and return the directory in which it
2441+ was extracted. If the checksum fails, it will raise
2442+ :class:`charmhelpers.core.host.ChecksumError`.
2443+ """
2444+ # We ONLY check for True here because can_handle may return a string
2445+ # explaining why it can't handle a given source.
2446+ handlers = [h for h in plugins() if h.can_handle(source) is True]
2447+ installed_to = None
2448+ for handler in handlers:
2449+ try:
2450+ installed_to = handler.install(source, *args, **kwargs)
2451+ except UnhandledSource:
2452+ pass
2453+ if not installed_to:
2454+ raise UnhandledSource("No handler found for source {}".format(source))
2455+ return installed_to
2456+
2457+
2458+def install_from_config(config_var_name):
2459+ charm_config = config()
2460+ source = charm_config[config_var_name]
2461+ return install_remote(source)
2462+
2463+
2464+def plugins(fetch_handlers=None):
2465+ if not fetch_handlers:
2466+ fetch_handlers = FETCH_HANDLERS
2467+ plugin_list = []
2468+ for handler_name in fetch_handlers:
2469+ package, classname = handler_name.rsplit('.', 1)
2470+ try:
2471+ handler_class = getattr(
2472+ importlib.import_module(package),
2473+ classname)
2474+ plugin_list.append(handler_class())
2475+ except (ImportError, AttributeError):
2476+            # Skip missing plugins so that they can be omitted from
2477+ # installation if desired
2478+ log("FetchHandler {} not found, skipping plugin".format(
2479+ handler_name))
2480+ return plugin_list
2481+
2482+
2483+def _run_apt_command(cmd, fatal=False):
2484+ """
2485+ Run an APT command, checking output and retrying if the fatal flag is set
2486+ to True.
2487+
2488+    :param cmd: list: The apt command to run.
2489+    :param fatal: bool: Whether the command's exit status should be checked,
2490+        retrying while the apt lock cannot be acquired.
2491+ """
2492+ env = os.environ.copy()
2493+
2494+ if 'DEBIAN_FRONTEND' not in env:
2495+ env['DEBIAN_FRONTEND'] = 'noninteractive'
2496+
2497+ if fatal:
2498+ retry_count = 0
2499+ result = None
2500+
2501+ # If the command is considered "fatal", we need to retry if the apt
2502+ # lock was not acquired.
2503+
2504+ while result is None or result == APT_NO_LOCK:
2505+ try:
2506+ result = subprocess.check_call(cmd, env=env)
2507+ except subprocess.CalledProcessError as e:
2508+ retry_count = retry_count + 1
2509+ if retry_count > APT_NO_LOCK_RETRY_COUNT:
2510+ raise
2511+ result = e.returncode
2512+ log("Couldn't acquire DPKG lock. Will retry in {} seconds."
2513+ "".format(APT_NO_LOCK_RETRY_DELAY))
2514+ time.sleep(APT_NO_LOCK_RETRY_DELAY)
2515+
2516+ else:
2517+ subprocess.call(cmd, env=env)
2518
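
Putting the pieces above together, a charm typically drives them from its config. A minimal sketch using the default install_sources/install_keys option names documented in configure_sources() (package names are illustrative):

    from charmhelpers.fetch import (
        configure_sources,
        filter_installed_packages,
        apt_install,
    )

    # config.yaml would carry, for example:
    #   install_sources: |
    #     - ppa:charmers/example
    #   install_keys: |
    #     - null
    configure_sources(update=True)
    apt_install(filter_installed_packages(['jenkins', 'default-jre-headless']),
                fatal=True)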
2519=== added file 'hooks/charmhelpers/fetch/archiveurl.py'
2520--- hooks/charmhelpers/fetch/archiveurl.py 1970-01-01 00:00:00 +0000
2521+++ hooks/charmhelpers/fetch/archiveurl.py 2015-01-07 17:22:49 +0000
2522@@ -0,0 +1,145 @@
2523+import os
2524+import hashlib
2525+import re
2526+
2527+import six
2528+if six.PY3:
2529+ from urllib.request import (
2530+ build_opener, install_opener, urlopen, urlretrieve,
2531+ HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
2532+ )
2533+ from urllib.parse import urlparse, urlunparse, parse_qs
2534+ from urllib.error import URLError
2535+else:
2536+ from urllib import urlretrieve
2537+ from urllib2 import (
2538+ build_opener, install_opener, urlopen,
2539+ HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
2540+ URLError
2541+ )
2542+ from urlparse import urlparse, urlunparse, parse_qs
2543+
2544+from charmhelpers.fetch import (
2545+ BaseFetchHandler,
2546+ UnhandledSource
2547+)
2548+from charmhelpers.payload.archive import (
2549+ get_archive_handler,
2550+ extract,
2551+)
2552+from charmhelpers.core.host import mkdir, check_hash
2553+
2554+
2555+def splituser(host):
2556+ '''urllib.splituser(), but six's support of this seems broken'''
2557+ _userprog = re.compile('^(.*)@(.*)$')
2558+ match = _userprog.match(host)
2559+ if match:
2560+ return match.group(1, 2)
2561+ return None, host
2562+
2563+
2564+def splitpasswd(user):
2565+ '''urllib.splitpasswd(), but six's support of this is missing'''
2566+ _passwdprog = re.compile('^([^:]*):(.*)$', re.S)
2567+ match = _passwdprog.match(user)
2568+ if match:
2569+ return match.group(1, 2)
2570+ return user, None
2571+
2572+
2573+class ArchiveUrlFetchHandler(BaseFetchHandler):
2574+ """
2575+ Handler to download archive files from arbitrary URLs.
2576+
2577+ Can fetch from http, https, ftp, and file URLs.
2578+
2579+ Can install either tarballs (.tar, .tgz, .tbz2, etc) or zip files.
2580+
2581+ Installs the contents of the archive in $CHARM_DIR/fetched/.
2582+ """
2583+ def can_handle(self, source):
2584+ url_parts = self.parse_url(source)
2585+ if url_parts.scheme not in ('http', 'https', 'ftp', 'file'):
2586+ return "Wrong source type"
2587+ if get_archive_handler(self.base_url(source)):
2588+ return True
2589+ return False
2590+
2591+ def download(self, source, dest):
2592+ """
2593+ Download an archive file.
2594+
2595+ :param str source: URL pointing to an archive file.
2596+ :param str dest: Local path location to download archive file to.
2597+ """
2598+        # propagate all exceptions
2599+ # URLError, OSError, etc
2600+ proto, netloc, path, params, query, fragment = urlparse(source)
2601+ if proto in ('http', 'https'):
2602+ auth, barehost = splituser(netloc)
2603+ if auth is not None:
2604+ source = urlunparse((proto, barehost, path, params, query, fragment))
2605+ username, password = splitpasswd(auth)
2606+ passman = HTTPPasswordMgrWithDefaultRealm()
2607+ # Realm is set to None in add_password to force the username and password
2608+ # to be used whatever the realm
2609+ passman.add_password(None, source, username, password)
2610+ authhandler = HTTPBasicAuthHandler(passman)
2611+ opener = build_opener(authhandler)
2612+ install_opener(opener)
2613+ response = urlopen(source)
2614+ try:
2615+            with open(dest, 'wb') as dest_file:
2616+ dest_file.write(response.read())
2617+ except Exception as e:
2618+ if os.path.isfile(dest):
2619+ os.unlink(dest)
2620+ raise e
2621+
2622+ # Mandatory file validation via Sha1 or MD5 hashing.
2623+ def download_and_validate(self, url, hashsum, validate="sha1"):
2624+ tempfile, headers = urlretrieve(url)
2625+ check_hash(tempfile, hashsum, validate)
2626+ return tempfile
2627+
2628+ def install(self, source, dest=None, checksum=None, hash_type='sha1'):
2629+ """
2630+ Download and install an archive file, with optional checksum validation.
2631+
2632+ The checksum can also be given on the `source` URL's fragment.
2633+ For example::
2634+
2635+ handler.install('http://example.com/file.tgz#sha1=deadbeef')
2636+
2637+ :param str source: URL pointing to an archive file.
2638+ :param str dest: Local destination path to install to. If not given,
2639+ installs to `$CHARM_DIR/archives/archive_file_name`.
2640+ :param str checksum: If given, validate the archive file after download.
2641+ :param str hash_type: Algorithm used to generate `checksum`.
2642+            Can be any hash algorithm supported by :mod:`hashlib`,
2643+ such as md5, sha1, sha256, sha512, etc.
2644+
2645+ """
2646+ url_parts = self.parse_url(source)
2647+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), 'fetched')
2648+ if not os.path.exists(dest_dir):
2649+ mkdir(dest_dir, perms=0o755)
2650+ dld_file = os.path.join(dest_dir, os.path.basename(url_parts.path))
2651+ try:
2652+ self.download(source, dld_file)
2653+ except URLError as e:
2654+ raise UnhandledSource(e.reason)
2655+ except OSError as e:
2656+ raise UnhandledSource(e.strerror)
2657+ options = parse_qs(url_parts.fragment)
2658+ for key, value in options.items():
2659+ if not six.PY3:
2660+ algorithms = hashlib.algorithms
2661+ else:
2662+ algorithms = hashlib.algorithms_available
2663+ if key in algorithms:
2664+ check_hash(dld_file, value, key)
2665+ if checksum:
2666+ check_hash(dld_file, checksum, hash_type)
2667+ return extract(dld_file, dest)
2668
2669=== added file 'hooks/charmhelpers/fetch/bzrurl.py'
2670--- hooks/charmhelpers/fetch/bzrurl.py 1970-01-01 00:00:00 +0000
2671+++ hooks/charmhelpers/fetch/bzrurl.py 2015-01-07 17:22:49 +0000
2672@@ -0,0 +1,54 @@
2673+import os
2674+from charmhelpers.fetch import (
2675+ BaseFetchHandler,
2676+ UnhandledSource
2677+)
2678+from charmhelpers.core.host import mkdir
2679+
2680+import six
2681+if six.PY3:
2682+ raise ImportError('bzrlib does not support Python3')
2683+
2684+try:
2685+ from bzrlib.branch import Branch
2686+except ImportError:
2687+ from charmhelpers.fetch import apt_install
2688+ apt_install("python-bzrlib")
2689+ from bzrlib.branch import Branch
2690+
2691+
2692+class BzrUrlFetchHandler(BaseFetchHandler):
2693+ """Handler for bazaar branches via generic and lp URLs"""
2694+ def can_handle(self, source):
2695+ url_parts = self.parse_url(source)
2696+ if url_parts.scheme not in ('bzr+ssh', 'lp'):
2697+ return False
2698+ else:
2699+ return True
2700+
2701+ def branch(self, source, dest):
2702+ url_parts = self.parse_url(source)
2703+ # If we use lp:branchname scheme we need to load plugins
2704+ if not self.can_handle(source):
2705+ raise UnhandledSource("Cannot handle {}".format(source))
2706+ if url_parts.scheme == "lp":
2707+ from bzrlib.plugin import load_plugins
2708+ load_plugins()
2709+ try:
2710+ remote_branch = Branch.open(source)
2711+ remote_branch.bzrdir.sprout(dest).open_branch()
2712+ except Exception as e:
2713+ raise e
2714+
2715+ def install(self, source):
2716+ url_parts = self.parse_url(source)
2717+ branch_name = url_parts.path.strip("/").split("/")[-1]
2718+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched",
2719+ branch_name)
2720+ if not os.path.exists(dest_dir):
2721+ mkdir(dest_dir, perms=0o755)
2722+ try:
2723+ self.branch(source, dest_dir)
2724+ except OSError as e:
2725+ raise UnhandledSource(e.strerror)
2726+ return dest_dir
2727
2728=== added file 'hooks/charmhelpers/fetch/giturl.py'
2729--- hooks/charmhelpers/fetch/giturl.py 1970-01-01 00:00:00 +0000
2730+++ hooks/charmhelpers/fetch/giturl.py 2015-01-07 17:22:49 +0000
2731@@ -0,0 +1,51 @@
2732+import os
2733+from charmhelpers.fetch import (
2734+ BaseFetchHandler,
2735+ UnhandledSource
2736+)
2737+from charmhelpers.core.host import mkdir
2738+
2739+import six
2740+if six.PY3:
2741+ raise ImportError('GitPython does not support Python 3')
2742+
2743+try:
2744+ from git import Repo
2745+except ImportError:
2746+ from charmhelpers.fetch import apt_install
2747+ apt_install("python-git")
2748+ from git import Repo
2749+
2750+
2751+class GitUrlFetchHandler(BaseFetchHandler):
2752+ """Handler for git branches via generic and github URLs"""
2753+ def can_handle(self, source):
2754+ url_parts = self.parse_url(source)
2755+ # TODO (mattyw) no support for ssh git@ yet
2756+ if url_parts.scheme not in ('http', 'https', 'git'):
2757+ return False
2758+ else:
2759+ return True
2760+
2761+ def clone(self, source, dest, branch):
2762+ if not self.can_handle(source):
2763+ raise UnhandledSource("Cannot handle {}".format(source))
2764+
2765+ repo = Repo.clone_from(source, dest)
2766+ repo.git.checkout(branch)
2767+
2768+ def install(self, source, branch="master", dest=None):
2769+ url_parts = self.parse_url(source)
2770+ branch_name = url_parts.path.strip("/").split("/")[-1]
2771+ if dest:
2772+ dest_dir = os.path.join(dest, branch_name)
2773+ else:
2774+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched",
2775+ branch_name)
2776+ if not os.path.exists(dest_dir):
2777+ mkdir(dest_dir, perms=0o755)
2778+ try:
2779+ self.clone(source, dest_dir, branch)
2780+ except OSError as e:
2781+ raise UnhandledSource(e.strerror)
2782+ return dest_dir
2783
2784=== added directory 'hooks/charmhelpers/payload'
2785=== added file 'hooks/charmhelpers/payload/__init__.py'
2786--- hooks/charmhelpers/payload/__init__.py 1970-01-01 00:00:00 +0000
2787+++ hooks/charmhelpers/payload/__init__.py 2015-01-07 17:22:49 +0000
2788@@ -0,0 +1,1 @@
2789+"Tools for working with files injected into a charm just before deployment."
2790
2791=== added file 'hooks/charmhelpers/payload/execd.py'
2792--- hooks/charmhelpers/payload/execd.py 1970-01-01 00:00:00 +0000
2793+++ hooks/charmhelpers/payload/execd.py 2015-01-07 17:22:49 +0000
2794@@ -0,0 +1,50 @@
2795+#!/usr/bin/env python
2796+
2797+import os
2798+import sys
2799+import subprocess
2800+from charmhelpers.core import hookenv
2801+
2802+
2803+def default_execd_dir():
2804+ return os.path.join(os.environ['CHARM_DIR'], 'exec.d')
2805+
2806+
2807+def execd_module_paths(execd_dir=None):
2808+ """Generate a list of full paths to modules within execd_dir."""
2809+ if not execd_dir:
2810+ execd_dir = default_execd_dir()
2811+
2812+ if not os.path.exists(execd_dir):
2813+ return
2814+
2815+ for subpath in os.listdir(execd_dir):
2816+ module = os.path.join(execd_dir, subpath)
2817+ if os.path.isdir(module):
2818+ yield module
2819+
2820+
2821+def execd_submodule_paths(command, execd_dir=None):
2822+    """Generate a list of full paths to the specified command within execd_dir.
2823+ """
2824+ for module_path in execd_module_paths(execd_dir):
2825+ path = os.path.join(module_path, command)
2826+ if os.access(path, os.X_OK) and os.path.isfile(path):
2827+ yield path
2828+
2829+
2830+def execd_run(command, execd_dir=None, die_on_error=False, stderr=None):
2831+ """Run command for each module within execd_dir which defines it."""
2832+ for submodule_path in execd_submodule_paths(command, execd_dir):
2833+ try:
2834+ subprocess.check_call(submodule_path, shell=True, stderr=stderr)
2835+ except subprocess.CalledProcessError as e:
2836+ hookenv.log("Error ({}) running {}. Output: {}".format(
2837+ e.returncode, e.cmd, e.output))
2838+ if die_on_error:
2839+ sys.exit(e.returncode)
2840+
2841+
2842+def execd_preinstall(execd_dir=None):
2843+ """Run charm-pre-install for each module within execd_dir."""
2844+ execd_run('charm-pre-install', execd_dir=execd_dir)
2845
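
A brief sketch of the exec.d overlay convention these helpers implement; the overlay directory name is an example, and each executable exec.d/<overlay>/charm-pre-install is run by execd_preinstall():

    from charmhelpers.payload.execd import execd_preinstall

    # Runs $CHARM_DIR/exec.d/*/charm-pre-install, if any exist.
    execd_preinstall()
    # This charm points it at hooks/install.d instead (see jenkins_hooks.py below).
    execd_preinstall('hooks/install.d')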
2846=== modified file 'hooks/config-changed'
2847--- hooks/config-changed 2011-09-22 14:46:56 +0000
2848+++ hooks/config-changed 1970-01-01 00:00:00 +0000
2849@@ -1,7 +0,0 @@
2850-#!/bin/sh
2851-set -e
2852-
2853-home=`dirname $0`
2854-
2855-juju-log "Reconfiguring charm by installing hook again."
2856-exec $home/install
2857
2858=== target is u'jenkins_hooks.py'
2859=== modified file 'hooks/install'
2860--- hooks/install 2014-04-17 12:35:18 +0000
2861+++ hooks/install 1970-01-01 00:00:00 +0000
2862@@ -1,151 +0,0 @@
2863-#!/bin/bash
2864-
2865-set -eu
2866-
2867-RELEASE=$(config-get release)
2868-ADMIN_USERNAME=$(config-get username)
2869-ADMIN_PASSWORD=$(config-get password)
2870-PLUGINS=$(config-get plugins)
2871-PLUGINS_SITE=$(config-get plugins-site)
2872-PLUGINS_CHECK_CERT=$(config-get plugins-check-certificate)
2873-REMOVE_UNLISTED_PLUGINS=$(config-get remove-unlisted-plugins)
2874-CWD=$(dirname $0)
2875-JENKINS_HOME=/var/lib/jenkins
2876-
2877-setup_source () {
2878- # Do something with < Oneiric releases - maybe PPA
2879- # apt-get -y install python-software-properties
2880- # add-apt-repository ppa:hudson-ubuntu/testing
2881- juju-log "Configuring source of jenkins as $RELEASE"
2882- # Configure to use upstream archives
2883- # lts - debian-stable
2884- # trunk - debian
2885- case $RELEASE in
2886- lts)
2887- SOURCE="debian-stable";;
2888- trunk)
2889- SOURCE="debian";;
2890- *)
2891- juju-log "release configuration not recognised" && exit 1;;
2892- esac
2893- # Setup archive to use appropriate jenkins upstream
2894- wget -q -O - http://pkg.jenkins-ci.org/$SOURCE/jenkins-ci.org.key | apt-key add -
2895- echo "deb http://pkg.jenkins-ci.org/$SOURCE binary/" \
2896- > /etc/apt/sources.list.d/jenkins.list
2897- apt-get update || true
2898-}
2899-# Only setup the source if jenkins is not already installed
2900-# this makes the config 'release' immutable - i.e. you
2901-# can change source once deployed
2902-[[ -d /var/lib/jenkins ]] || setup_source
2903-
2904-# Install jenkins
2905-install_jenkins () {
2906- juju-log "Installing/upgrading jenkins..."
2907- apt-get -y install -qq jenkins default-jre-headless
2908-}
2909-# Re-run whenever called to pickup any updates
2910-install_jenkins
2911-
2912-configure_jenkins_user () {
2913- juju-log "Configuring user for jenkins..."
2914- # Check to see if password provided
2915- if [ -z "$ADMIN_PASSWORD" ]
2916- then
2917- # Generate a random one for security
2918- # User can then override using juju set
2919- ADMIN_PASSWORD=$(< /dev/urandom tr -dc A-Za-z | head -c16)
2920- echo $ADMIN_PASSWORD > $JENKINS_HOME/.admin_password
2921- chmod 0600 $JENKINS_HOME/.admin_password
2922- fi
2923- # Generate Salt and Hash Password for Jenkins
2924- SALT="$(< /dev/urandom tr -dc A-Za-z | head -c6)"
2925- PASSWORD="$SALT:$(echo -n "$ADMIN_PASSWORD{$SALT}" | shasum -a 256 | awk '{ print $1 }')"
2926- mkdir -p $JENKINS_HOME/users/$ADMIN_USERNAME
2927- sed -e s#__USERNAME__#$ADMIN_USERNAME# -e s#__PASSWORD__#$PASSWORD# \
2928- $CWD/../templates/user-config.xml > $JENKINS_HOME/users/$ADMIN_USERNAME/config.xml
2929- chown -R jenkins:nogroup $JENKINS_HOME/users
2930-}
2931-# Always run - even if config has not changed, its safe
2932-configure_jenkins_user
2933-
2934-boostrap_jenkins_configuration (){
2935- juju-log "Bootstrapping secure initial configuration in Jenkins..."
2936- cp $CWD/../templates/jenkins-config.xml $JENKINS_HOME/config.xml
2937- chown jenkins:nogroup $JENKINS_HOME/config.xml
2938- touch /var/lib/jenkins/config.bootstrapped
2939-}
2940-# Only run on first invocation otherwise we blast
2941-# any configuration changes made
2942-[[ -f /var/lib/jenkins/config.bootstrapped ]] || boostrap_jenkins_configuration
2943-
2944-install_plugins(){
2945- juju-log "Installing plugins ($PLUGINS)"
2946- mkdir -p $JENKINS_HOME/plugins
2947- chmod a+rx $JENKINS_HOME/plugins
2948- chown jenkins:nogroup $JENKINS_HOME/plugins
2949- track_dir=`mktemp -d /tmp/plugins.installed.XXXXXXXX`
2950- installed_plugins=`find $JENKINS_HOME/plugins -name '*.hpi'`
2951- [ -z "$installed_plugins" ] || ln -s $installed_plugins $track_dir
2952- local plugin=""
2953- local plugin_file=""
2954- local opts=""
2955- pushd $JENKINS_HOME/plugins
2956- for plugin in $PLUGINS ; do
2957- plugin_file=$JENKINS_HOME/plugins/$plugin.hpi
2958- # Note that by default wget verifies certificates as of 1.10.
2959- if [ "$PLUGINS_CHECK_CERT" = "no" ] ; then
2960- opts="--no-check-certificate"
2961- fi
2962- wget $opts --timestamping $PLUGINS_SITE/latest/$plugin.hpi
2963- chmod a+r $plugin_file
2964- rm -f $track_dir/$plugin.hpi
2965- done
2966- popd
2967- # Warn about undesirable plugins, or remove them.
2968- unlisted_plugins=`ls $track_dir`
2969- [[ -n "$unlisted_plugins" ]] || return 0
2970- if [[ $REMOVE_UNLISTED_PLUGINS = "yes" ]] ; then
2971- for plugin_file in `ls $track_dir` ; do
2972- rm -vf $JENKINS_HOME/plugins/$plugin_file
2973- done
2974- else
2975- juju-log -l WARNING "Unlisted plugins: (`ls $track_dir`) Not removed. Set remove-unlisted-plugins to yes to clear them away."
2976- fi
2977-}
2978-
2979-install_plugins
2980-
2981-juju-log "Restarting jenkins to pickup configuration changes"
2982-service jenkins restart
2983-
2984-# Install helpers - python jenkins ++
2985-install_python_jenkins () {
2986- juju-log "Installing python-jenkins..."
2987- apt-get -y install -qq python-jenkins
2988-}
2989-# Only install once
2990-[[ -d /usr/share/pyshared/jenkins ]] || install_python_jenkins
2991-
2992-# Install some tools - can get set up deployment time
2993-install_tools () {
2994- juju-log "Installing tools..."
2995- apt-get -y install -qq `config-get tools`
2996-}
2997-# Always run - tools might get re-configured
2998-install_tools
2999-
3000-juju-log "Opening ports"
3001-open-port 8080
3002-
3003-# Execute any hook overlay which may be provided
3004-# by forks of this charm
3005-if [ -d hooks/install.d ]
3006-then
3007- for i in `ls -1 hooks/install.d/*`
3008- do
3009- [[ -x $i ]] && . ./$i
3010- done
3011-fi
3012-
3013-exit 0
3014
3015=== target is u'jenkins_hooks.py'
3016=== added file 'hooks/jenkins_hooks.py'
3017--- hooks/jenkins_hooks.py 1970-01-01 00:00:00 +0000
3018+++ hooks/jenkins_hooks.py 2015-01-07 17:22:49 +0000
3019@@ -0,0 +1,220 @@
3020+#!/usr/bin/python
3021+import grp
3022+import hashlib
3023+import os
3024+import pwd
3025+import shutil
3026+import subprocess
3027+import sys
3028+
3029+from charmhelpers.core.hookenv import (
3030+ Hooks,
3031+ UnregisteredHookError,
3032+ config,
3033+ remote_unit,
3034+ relation_get,
3035+ relation_set,
3036+ relation_ids,
3037+ unit_get,
3038+ open_port,
3039+ log,
3040+ DEBUG,
3041+ INFO,
3042+)
3043+from charmhelpers.fetch import apt_install
3044+from charmhelpers.core.host import (
3045+ service_start,
3046+ service_stop,
3047+)
3048+from charmhelpers.payload.execd import execd_preinstall
3049+from jenkins_utils import (
3050+ JENKINS_HOME,
3051+ JENKINS_USERS,
3052+ TEMPLATES_DIR,
3053+ add_node,
3054+ del_node,
3055+ setup_source,
3056+ install_jenkins_plugins,
3057+)
3058+
3059+hooks = Hooks()
3060+
3061+
3062+@hooks.hook('install')
3063+def install():
3064+ execd_preinstall('hooks/install.d')
3065+ # Only setup the source if jenkins is not already installed i.e. makes the
3066+ # config 'release' immutable so you can't change source once deployed
3067+ setup_source(config('release'))
3068+ config_changed()
3069+ open_port(8080)
3070+
3071+
3072+@hooks.hook('config-changed')
3073+def config_changed():
3074+ # Re-run whenever called to pickup any updates
3075+ log("Installing/upgrading jenkins.", level=DEBUG)
3076+ apt_install(['jenkins', 'default-jre-headless', 'pwgen'], fatal=True)
3077+
3078+    # Always run - even if config has not changed, it's safe
3079+ log("Configuring user for jenkins.", level=DEBUG)
3080+ # Check to see if password provided
3081+ admin_passwd = config('password')
3082+ if not admin_passwd:
3083+ # Generate a random one for security. User can then override using juju
3084+ # set.
3085+ admin_passwd = subprocess.check_output(['pwgen', '-N1', '15'])
3086+ admin_passwd = admin_passwd.strip()
3087+
3088+ passwd_file = os.path.join(JENKINS_HOME, '.admin_password')
3089+ with open(passwd_file, 'w+') as fd:
3090+ fd.write(admin_passwd)
3091+
3092+    os.chmod(passwd_file, 0o600)
3093+
3094+ jenkins_uid = pwd.getpwnam('jenkins').pw_uid
3095+ jenkins_gid = grp.getgrnam('jenkins').gr_gid
3096+ nogroup_gid = grp.getgrnam('nogroup').gr_gid
3097+
3098+ # Generate Salt and Hash Password for Jenkins
3099+ salt = subprocess.check_output(['pwgen', '-N1', '6']).strip()
3100+ csum = hashlib.sha256("%s{%s}" % (admin_passwd, salt)).hexdigest()
3101+ salty_password = "%s:%s" % (salt, csum)
3102+
3103+ admin_username = config('username')
3104+ admin_user_home = os.path.join(JENKINS_USERS, admin_username)
3105+ if not os.path.isdir(admin_user_home):
3106+ os.makedirs(admin_user_home, 0o0700)
3107+ os.chown(JENKINS_USERS, jenkins_uid, nogroup_gid)
3108+ os.chown(admin_user_home, jenkins_uid, nogroup_gid)
3109+
3110+ # NOTE: overwriting will destroy any data added by jenkins or via the ui
3111+ admin_user_config = os.path.join(admin_user_home, 'config.xml')
3112+ with open(os.path.join(TEMPLATES_DIR, 'user-config.xml')) as src_fd:
3113+ with open(admin_user_config, 'w') as dst_fd:
3114+ lines = src_fd.readlines()
3115+ for line in lines:
3116+ kvs = {'__USERNAME__': admin_username,
3117+ '__PASSWORD__': salty_password}
3118+
3119+ for key, val in kvs.iteritems():
3120+ if key in line:
3121+ line = line.replace(key, val)
3122+
3123+ dst_fd.write(line)
3124+ os.chown(admin_user_config, jenkins_uid, nogroup_gid)
3125+
3126+ # Only run on first invocation otherwise we blast
3127+ # any configuration changes made
3128+ jenkins_bootstrap_flag = '/var/lib/jenkins/config.bootstrapped'
3129+ if not os.path.exists(jenkins_bootstrap_flag):
3130+ log("Bootstrapping secure initial configuration in Jenkins.",
3131+ level=DEBUG)
3132+ src = os.path.join(TEMPLATES_DIR, 'jenkins-config.xml')
3133+ dst = os.path.join(JENKINS_HOME, 'config.xml')
3134+ shutil.copy(src, dst)
3135+ os.chown(dst, jenkins_uid, nogroup_gid)
3136+ # Touch
3137+ with open(jenkins_bootstrap_flag, 'w'):
3138+ pass
3139+
3140+ log("Stopping jenkins for plugin update(s)", level=DEBUG)
3141+ service_stop('jenkins')
3142+ install_jenkins_plugins(jenkins_uid, jenkins_gid)
3143+ log("Starting jenkins to pickup configuration changes", level=DEBUG)
3144+ service_start('jenkins')
3145+
3146+ apt_install(['python-jenkins'], fatal=True)
3147+ tools = config('tools')
3148+ if tools:
3149+ log("Installing tools.", level=DEBUG)
3150+ apt_install(tools.split(), fatal=True)
3151+
3152+
3153+@hooks.hook('start')
3154+def start():
3155+ service_start('jenkins')
3156+
3157+
3158+@hooks.hook('stop')
3159+def stop():
3160+ service_stop('jenkins')
3161+
3162+
3163+@hooks.hook('upgrade-charm')
3164+def upgrade_charm():
3165+ log("Upgrading charm.", level=DEBUG)
3166+ config_changed()
3167+
3168+
3169+@hooks.hook('master-relation-joined')
3170+def master_relation_joined():
3171+ HOSTNAME = unit_get('private-address')
3172+ log("Setting url relation to http://%s:8080" % (HOSTNAME), level=DEBUG)
3173+ relation_set(url="http://%s:8080" % (HOSTNAME))
3174+
3175+
3176+@hooks.hook('master-relation-changed')
3177+def master_relation_changed():
3178+ PASSWORD = config('password')
3179+    if not PASSWORD:
3180+ with open('/var/lib/jenkins/.admin_password', 'r') as fd:
3181+ PASSWORD = fd.read()
3182+
3183+ required_settings = ['executors', 'labels', 'slavehost']
3184+ settings = relation_get()
3185+ missing = [s for s in required_settings if s not in settings]
3186+ if missing:
3187+ log("Not all required relation settings received yet (missing=%s) - "
3188+ "skipping" % (', '.join(missing)), level=INFO)
3189+ return
3190+
3191+ slavehost = settings['slavehost']
3192+ executors = settings['executors']
3193+ labels = settings['labels']
3194+
3195+ # Double check to see if this has happened yet
3196+    if not slavehost:
3197+ log("Slave host not yet defined - skipping", level=INFO)
3198+ return
3199+
3200+ log("Adding slave with hostname %s." % (slavehost), level=DEBUG)
3201+ add_node(slavehost, executors, labels, config('username'), PASSWORD)
3202+ log("Node slave %s added." % (slavehost), level=DEBUG)
3203+
3204+
3205+@hooks.hook('master-relation-departed')
3206+def master_relation_departed():
3207+ # Slave hostname is derived from unit name so
3208+ # this is pretty safe
3209+ slavehost = remote_unit()
3210+ log("Deleting slave with hostname %s." % (slavehost), level=DEBUG)
3211+ del_node(slavehost, config('username'), config('password'))
3212+
3213+
3214+@hooks.hook('master-relation-broken')
3215+def master_relation_broken():
3216+    PASSWORD = config('password')
3217+    if not PASSWORD:
3218+        passwd_file = os.path.join(JENKINS_HOME, '.admin_password')
3219+        with open(passwd_file, 'r') as fd:
3220+            PASSWORD = fd.read()
3221+
3222+ for member in relation_ids():
3223+ member = member.replace('/', '-')
3224+ log("Removing node %s from Jenkins master." % (member), level=DEBUG)
3225+ del_node(member, config('username'), PASSWORD)
3226+
3227+
3228+@hooks.hook('website-relation-joined')
3229+def website_relation_joined():
3230+ hostname = unit_get('private-address')
3231+ log("Setting website URL to %s:8080" % (hostname), level=DEBUG)
3232+ relation_set(port=8080, hostname=hostname)
3233+
3234+
3235+if __name__ == '__main__':
3236+ try:
3237+ hooks.execute(sys.argv)
3238+ except UnregisteredHookError as e:
3239+ log('Unknown hook {} - skipping.'.format(e), level=INFO)
3240
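
For reference, the salt-and-hash scheme used by config_changed() above can be reproduced standalone; the password and salt values below are made up:

    import hashlib

    admin_passwd = 'admin'
    salt = 'abc123'
    # The user config.xml template expects SHA-256 of "<password>{<salt>}",
    # prefixed with the salt itself.
    csum = hashlib.sha256("%s{%s}" % (admin_passwd, salt)).hexdigest()
    salty_password = "%s:%s" % (salt, csum)  # -> 'abc123:<64 hex chars>'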
3241=== added file 'hooks/jenkins_utils.py'
3242--- hooks/jenkins_utils.py 1970-01-01 00:00:00 +0000
3243+++ hooks/jenkins_utils.py 2015-01-07 17:22:49 +0000
3244@@ -0,0 +1,169 @@
3245+#!/usr/bin/python
3246+import glob
3247+import os
3248+import shutil
3249+import subprocess
3250+import tempfile
3251+
3252+from charmhelpers.core.hookenv import (
3253+ config,
3254+ log,
3255+ DEBUG,
3256+ INFO,
3257+ WARNING,
3258+)
3259+from charmhelpers.fetch import (
3260+ apt_update,
3261+ add_source,
3262+)
3263+
3264+JENKINS_HOME = '/var/lib/jenkins'
3265+JENKINS_USERS = os.path.join(JENKINS_HOME, 'users')
3266+JENKINS_PLUGINS = os.path.join(JENKINS_HOME, 'plugins')
3267+TEMPLATES_DIR = 'templates'
3268+
3269+
3270+def add_node(host, executors, labels, username, password):
3271+ import jenkins
3272+
3273+ l_jenkins = jenkins.Jenkins("http://localhost:8080/", username, password)
3274+
3275+ if l_jenkins.node_exists(host):
3276+ log("Node exists - not adding", level=DEBUG)
3277+ return
3278+
3279+ log("Adding node '%s' to Jenkins master" % (host), level=INFO)
3280+ l_jenkins.create_node(host, int(executors) * 2, host, labels=labels)
3281+
3282+ if not l_jenkins.node_exists(host):
3283+ log("Failed to create node '%s'" % (host), level=WARNING)
3284+
3285+
3286+def del_node(host, username, password):
3287+ import jenkins
3288+
3289+ l_jenkins = jenkins.Jenkins("http://localhost:8080/", username, password)
3290+
3291+ if l_jenkins.node_exists(host):
3292+ log("Node '%s' exists" % (host), level=DEBUG)
3293+ l_jenkins.delete_node(host)
3294+ else:
3295+ log("Node '%s' does not exist - not deleting" % (host), level=INFO)
3296+
3297+
3298+def setup_source(release):
3299+    """Configure the upstream Jenkins package archive for the given release."""
3300+ log("Configuring source of jenkins as %s" % release, level=INFO)
3301+
3302+ # Configure to use upstream archives
3303+ # lts - debian-stable
3304+ # trunk - debian
3305+ if release == 'lts':
3306+ source = "debian-stable"
3307+ elif release == 'trunk':
3308+ source = "debian"
3309+ else:
3310+ errmsg = "Release '%s' configuration not recognised" % (release)
3311+ raise Exception(errmsg)
3312+
3313+ # Setup archive to use appropriate jenkins upstream
3314+ key = 'http://pkg.jenkins-ci.org/%s/jenkins-ci.org.key' % source
3315+ target = "%s-%s" % (source, 'jenkins-ci.org.key')
3316+ subprocess.check_call(['wget', '-q', '-O', target, key])
3317+ with open(target, 'r') as fd:
3318+ key = fd.read()
3319+
3320+ deb = "deb http://pkg.jenkins-ci.org/%s binary/" % (source)
3321+ sources_file = "/etc/apt/sources.list.d/jenkins.list"
3322+
3323+ found = False
3324+ if os.path.exists(sources_file):
3325+ with open(sources_file, 'r') as fd:
3326+ for line in fd:
3327+ if deb in line:
3328+ found = True
3329+ break
3330+
3331+ if not found:
3332+ with open(sources_file, 'a') as fd:
3333+ fd.write("%s\n" % deb)
3334+ else:
3335+ with open(sources_file, 'w') as fd:
3336+ fd.write("%s\n" % deb)
3337+
3338+ if not found:
3339+ # NOTE: don't use add_source for adding source since it adds deb and
3340+ # deb-src entries but pkg.jenkins-ci.org has no deb-src.
3341+ add_source("#dummy-source", key=key)
3342+
3343+ apt_update(fatal=True)
3344+
3345+
3346+def install_jenkins_plugins(jenkins_uid, jenkins_gid):
3347+ plugins = config('plugins')
3348+ if plugins:
3349+ plugins = plugins.split()
3350+ else:
3351+ plugins = []
3352+
3353+ log("Installing plugins (%s)" % (' '.join(plugins)), level=DEBUG)
3354+ if not os.path.isdir(JENKINS_PLUGINS):
3355+ os.makedirs(JENKINS_PLUGINS)
3356+
3357+ os.chmod(JENKINS_PLUGINS, 0o0755)
3358+ os.chown(JENKINS_PLUGINS, jenkins_uid, jenkins_gid)
3359+
3360+ track_dir = tempfile.mkdtemp(prefix='/tmp/plugins.installed')
3361+ try:
3362+ installed_plugins = glob.glob("%s/*.hpi" % (JENKINS_PLUGINS))
3363+ for plugin in installed_plugins:
3364+ # Create a ref of installed plugin
3365+ with open(os.path.join(track_dir, os.path.basename(plugin)),
3366+ 'w'):
3367+ pass
3368+
3369+ plugins_site = config('plugins-site')
3370+ log("Fetching plugins from %s" % (plugins_site), level=DEBUG)
3371+ # NOTE: by default wget verifies certificates as of 1.10.
3372+ if config('plugins-check-certificate') == "no":
3373+ opts = ["--no-check-certificate"]
3374+ else:
3375+ opts = []
3376+
3377+ for plugin in plugins:
3378+ plugin_filename = "%s.hpi" % (plugin)
3379+ url = os.path.join(plugins_site, 'latest', plugin_filename)
3380+ plugin_path = os.path.join(JENKINS_PLUGINS, plugin_filename)
3381+ if not os.path.isfile(plugin_path):
3382+ log("Installing plugin %s" % (plugin_filename), level=DEBUG)
3383+ cmd = ['wget'] + opts + ['--timestamping', url, '-O',
3384+ plugin_path]
3385+ subprocess.check_call(cmd)
3386+ os.chmod(plugin_path, 0o744)
3387+ os.chown(plugin_path, jenkins_uid, jenkins_gid)
3388+
3389+ else:
3390+ log("Plugin %s already installed" % (plugin_filename),
3391+ level=DEBUG)
3392+
3393+ ref = os.path.join(track_dir, plugin_filename)
3394+ if os.path.exists(ref):
3395+ # Delete ref since plugin is installed.
3396+ os.remove(ref)
3397+
3398+ installed_plugins = os.listdir(track_dir)
3399+ if installed_plugins:
3400+ if config('remove-unlisted-plugins') == "yes":
3401+ for plugin in installed_plugins:
3402+ path = os.path.join(JENKINS_HOME, 'plugins', plugin)
3403+ if os.path.isfile(path):
3404+ log("Deleting unlisted plugin '%s'" % (path),
3405+ level=INFO)
3406+ os.remove(path)
3407+ else:
3408+ log("Unlisted plugins: (%s) Not removed. Set "
3409+ "remove-unlisted-plugins to 'yes' to clear them away." %
3410+ ', '.join(installed_plugins), level=INFO)
3411+ finally:
3412+ # Delete install refs
3413+ shutil.rmtree(track_dir)
3414
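For reference, a minimal sketch of how the node helpers above can be driven against a local master, assuming python-jenkins is installed and the master is reachable on localhost:8080; the hostname, label and credentials below are illustrative only:

    from jenkins_utils import add_node, del_node

    # Register a slave with a single executor and a 'trusty' label, then
    # remove it again using the same admin credentials.
    add_node('jenkins-slave-0', 1, 'trusty', 'admin', 'secret')
    del_node('jenkins-slave-0', 'admin', 'secret')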
3415=== modified file 'hooks/master-relation-broken'
3416--- hooks/master-relation-broken 2012-07-31 10:32:36 +0000
3417+++ hooks/master-relation-broken 1970-01-01 00:00:00 +0000
3418@@ -1,17 +0,0 @@
3419-#!/bin/sh
3420-
3421-PASSWORD=`config-get password`
3422-if [ -z "$PASSWORD" ]
3423-then
3424- PASSWORD=`cat /var/lib/jenkins/.admin_password`
3425-fi
3426-
3427-MEMBERS=`relation-list`
3428-
3429-for MEMBER in $MEMBERS
3430-do
3431- juju-log "Removing node $MEMBER from Jenkins master..."
3432- $(dirname $0)/delnode `echo $MEMBER | sed s,/,-,` `config-get username` $PASSWORD
3433-done
3434-
3435-exit 0
3436
3437=== target is u'jenkins_hooks.py'
3438=== modified file 'hooks/master-relation-changed'
3439--- hooks/master-relation-changed 2012-07-31 10:32:36 +0000
3440+++ hooks/master-relation-changed 1970-01-01 00:00:00 +0000
3441@@ -1,24 +0,0 @@
3442-#!/bin/bash
3443-
3444-set -ue
3445-
3446-PASSWORD=`config-get password`
3447-if [ -z "$PASSWORD" ]
3448-then
3449- PASSWORD=`cat /var/lib/jenkins/.admin_password`
3450-fi
3451-
3452-# Grab information that remote unit has posted to relation
3453-slavehost=$(relation-get slavehost)
3454-executors=$(relation-get executors)
3455-labels=$(relation-get labels)
3456-
3457-# Double check to see if this has happened yet
3458-if [ "x$slavehost" = "x" ]; then
3459- juju-log "Slave host not yet defined, exiting..."
3460- exit 0
3461-fi
3462-
3463-juju-log "Adding slave with hostname $slavehost..."
3464-$(dirname $0)/addnode $slavehost $executors "$labels" `config-get username` $PASSWORD
3465-juju-log "Node slave $slavehost added..."
3466
3467=== target is u'jenkins_hooks.py'
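The shell logic removed above is replaced by a Python handler in hooks/jenkins_hooks.py (earlier in this diff). As a rough sketch only, the equivalent flow using charmhelpers' hookenv API and the jenkins_utils helpers looks like this; the function name and fallback password path are illustrative, not the literal handler:

    from charmhelpers.core.hookenv import relation_get, config, log, INFO
    from jenkins_utils import add_node

    def master_relation_changed():
        # Grab the information the remote unit posted to the relation.
        slavehost = relation_get('slavehost')
        executors = relation_get('executors')
        labels = relation_get('labels')
        if not slavehost:
            log("Slave host not yet defined - skipping", level=INFO)
            return

        password = config('password')
        if not password:
            with open('/var/lib/jenkins/.admin_password') as fd:
                password = fd.read()

        add_node(slavehost, executors, labels, config('username'), password)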
3468=== modified file 'hooks/master-relation-departed'
3469--- hooks/master-relation-departed 2011-09-22 14:46:56 +0000
3470+++ hooks/master-relation-departed 1970-01-01 00:00:00 +0000
3471@@ -1,12 +0,0 @@
3472-#!/bin/bash
3473-
3474-set -ue
3475-
3476-# Slave hostname is derived from unit name so
3477-# this is pretty safe
3478-slavehost=`echo $JUJU_REMOTE_UNIT | sed s,/,-,`
3479-
3480-juju-log "Deleting slave with hostname $slavehost..."
3481-$(dirname $0)/delnode $slavehost `config-get username` `config-get password`
3482-
3483-exit 0
3484
3485=== target is u'jenkins_hooks.py'
3486=== modified file 'hooks/master-relation-joined'
3487--- hooks/master-relation-joined 2011-10-07 13:43:19 +0000
3488+++ hooks/master-relation-joined 1970-01-01 00:00:00 +0000
3489@@ -1,5 +0,0 @@
3490-#!/bin/sh
3491-
3492-HOSTNAME=`unit-get private-address`
3493-juju-log "Setting url relation to http://$HOSTNAME:8080"
3494-relation-set url="http://$HOSTNAME:8080"
3495
3496=== target is u'jenkins_hooks.py'
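Likewise, the joined hook is now handled in Python; a hedged sketch of the equivalent using charmhelpers, not the literal handler from this branch:

    from charmhelpers.core.hookenv import relation_set, unit_get, log, INFO

    def master_relation_joined():
        hostname = unit_get('private-address')
        log("Setting url relation to http://%s:8080" % hostname, level=INFO)
        relation_set(url="http://%s:8080" % hostname)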
3497=== modified file 'hooks/start'
3498--- hooks/start 2011-09-22 14:46:56 +0000
3499+++ hooks/start 1970-01-01 00:00:00 +0000
3500@@ -1,3 +0,0 @@
3501-#!/bin/bash
3502-
3503-service jenkins start || true
3504
3505=== target is u'jenkins_hooks.py'
3506=== modified file 'hooks/stop'
3507--- hooks/stop 2011-09-22 14:46:56 +0000
3508+++ hooks/stop 1970-01-01 00:00:00 +0000
3509@@ -1,3 +0,0 @@
3510-#!/bin/bash
3511-
3512-service jenkins stop
3513
3514=== target is u'jenkins_hooks.py'
3515=== modified file 'hooks/upgrade-charm'
3516--- hooks/upgrade-charm 2011-09-22 14:46:56 +0000
3517+++ hooks/upgrade-charm 1970-01-01 00:00:00 +0000
3518@@ -1,7 +0,0 @@
3519-#!/bin/sh
3520-set -e
3521-
3522-home=`dirname $0`
3523-
3524-juju-log "Upgrading charm by running install hook again."
3525-exec $home/install
3526
3527=== target is u'jenkins_hooks.py'
3528=== modified file 'hooks/website-relation-joined'
3529--- hooks/website-relation-joined 2011-10-07 13:43:19 +0000
3530+++ hooks/website-relation-joined 1970-01-01 00:00:00 +0000
3531@@ -1,5 +0,0 @@
3532-#!/bin/sh
3533-
3534-HOSTNAME=`unit-get private-address`
3535-juju-log "Setting website URL to $HOSTNAME:8080"
3536-relation-set port=8080 hostname=$HOSTNAME
3537
3538=== target is u'jenkins_hooks.py'
3539=== renamed file 'tests/100-deploy' => 'tests/100-deploy-trusty'
3540--- tests/100-deploy 2014-03-05 19:18:19 +0000
3541+++ tests/100-deploy-trusty 2015-01-07 17:22:49 +0000
3542@@ -12,7 +12,7 @@
3543 ###
3544 # Deployment Setup
3545 ###
3546-d = amulet.Deployment()
3547+d = amulet.Deployment(series='trusty')
3548
3549 d.add('haproxy') # website-relation
3550 d.add('jenkins') # Subject matter
3551
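The series keyword is the only functional change here: it pins the Amulet deployment to trusty. A minimal, self-contained example of the pattern, assuming the amulet package is installed (the relation endpoint names are assumed from the two charms' metadata):

    import amulet

    # Pin the deployment series so all charms resolve to their trusty branches.
    d = amulet.Deployment(series='trusty')
    d.add('jenkins')
    d.add('haproxy')
    d.relate('jenkins:website', 'haproxy:reverseproxy')
    d.setup(timeout=900)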
3552=== added file 'tests/README'
3553--- tests/README 1970-01-01 00:00:00 +0000
3554+++ tests/README 2015-01-07 17:22:49 +0000
3555@@ -0,0 +1,56 @@
3556+This directory provides Amulet tests that focus on verification of Jenkins
3557+deployments.
3558+
3559+In order to run tests, you'll need charm-tools installed (in addition to
3560+juju, of course):
3561+
3562+ sudo add-apt-repository ppa:juju/stable
3563+ sudo apt-get update
3564+ sudo apt-get install charm-tools
3565+
3566+If you use a web proxy server to access the web, you'll need to set the
3567+AMULET_HTTP_PROXY environment variable to the http URL of the proxy server.
3568+
3569+The following examples demonstrate different ways that tests can be executed.
3570+All examples are run from the charm's root directory.
3571+
3572+ * To run all tests (starting with 00-setup):
3573+
3574+ make test
3575+
3576+ * To run a specific test module (or modules):
3577+
3578+ juju test -v -p AMULET_HTTP_PROXY 100-deploy-trusty
3579+
3580+ * To run a specific test module (or modules), and keep the environment
3581+ deployed after a failure:
3582+
3583+ juju test --set-e -v -p AMULET_HTTP_PROXY 100-deploy-trusty
3584+
3585+ * To re-run a test module against an already deployed environment (one
3586+ that was deployed by a previous call to 'juju test --set-e'):
3587+
3588+ ./tests/100-deploy-trusty
3589+
3590+
3591+For debugging and test development purposes, all code should be idempotent.
3592+In other words, re-running the code should not change the results beyond the
3593+initial run. This enables editing and re-running of a
3594+test module against an already deployed environment, as described above.
3595+
3596+
3597+Notes for additional test writing:
3598+
3599+ * Use DEBUG to turn on debug logging, use ERROR otherwise.
3600+ u = OpenStackAmuletUtils(ERROR)
3601+ u = OpenStackAmuletUtils(DEBUG)
3602+
3603+ * Preserving the deployed environment:
3604+ Even with 'juju test --set-e', Amulet will tear down the juju environment
3605+ when all tests pass. The force_fail 'test' below can be added to a test
3606+ module to simulate a failed test and keep the environment.
3607+
3608+ def test_zzzz_fake_fail(self):
3609+ '''Force a fake fail to keep juju environment after a successful test run'''
3610+ # Useful in test writing, when used with: juju test --set-e
3611+ amulet.raise_status(amulet.FAIL, msg='using fake fail to keep juju environment')
3612
3613=== added directory 'unit_tests'
3614=== added file 'unit_tests/__init__.py'
3615=== added file 'unit_tests/test_jenkins_hooks.py'
3616--- unit_tests/test_jenkins_hooks.py 1970-01-01 00:00:00 +0000
3617+++ unit_tests/test_jenkins_hooks.py 2015-01-07 17:22:49 +0000
3618@@ -0,0 +1,6 @@
3619+import unittest
3620+
3621+
3622+class JenkinsHooksTests(unittest.TestCase):
3623+ def setUp(self):
3624+ super(JenkinsHooksTests, self).setUp()
3625
3626=== added file 'unit_tests/test_jenkins_utils.py'
3627--- unit_tests/test_jenkins_utils.py 1970-01-01 00:00:00 +0000
3628+++ unit_tests/test_jenkins_utils.py 2015-01-07 17:22:49 +0000
3629@@ -0,0 +1,6 @@
3630+import unittest
3631+
3632+
3633+class JenkinsUtilsTests(unittest.TestCase):
3634+ def setUp(self):
3635+ super(JenkinsUtilsTests, self).setUp()
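These files only establish the unit_tests skeleton. A hedged sketch of how a first real test might look, assuming the mock package is available and hooks/ is on the test path (the bogus release name is made up for illustration):

    import unittest

    from mock import patch

    import jenkins_utils

    class JenkinsUtilsTests(unittest.TestCase):
        @patch.object(jenkins_utils, 'log')
        def test_setup_source_unknown_release(self, mock_log):
            # Anything other than 'lts' or 'trunk' should be rejected before
            # any apt configuration is attempted.
            self.assertRaises(Exception, jenkins_utils.setup_source, 'bogus')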
