Merge lp:~hopem/charms/trusty/jenkins/python-redux into lp:charms/trusty/jenkins

Proposed by Edward Hope-Morley
Status: Superseded
Proposed branch: lp:~hopem/charms/trusty/jenkins/python-redux
Merge into: lp:charms/trusty/jenkins
Diff against target: 3754 lines (+3253/-274)
38 files modified
Makefile (+30/-0)
bin/charm_helpers_sync.py (+225/-0)
charm-helpers-hooks.yaml (+7/-0)
config.yaml (+1/-1)
hooks/addnode (+0/-21)
hooks/charmhelpers/__init__.py (+22/-0)
hooks/charmhelpers/core/decorators.py (+41/-0)
hooks/charmhelpers/core/fstab.py (+118/-0)
hooks/charmhelpers/core/hookenv.py (+552/-0)
hooks/charmhelpers/core/host.py (+419/-0)
hooks/charmhelpers/core/services/__init__.py (+2/-0)
hooks/charmhelpers/core/services/base.py (+313/-0)
hooks/charmhelpers/core/services/helpers.py (+243/-0)
hooks/charmhelpers/core/sysctl.py (+34/-0)
hooks/charmhelpers/core/templating.py (+52/-0)
hooks/charmhelpers/fetch/__init__.py (+423/-0)
hooks/charmhelpers/fetch/archiveurl.py (+145/-0)
hooks/charmhelpers/fetch/bzrurl.py (+54/-0)
hooks/charmhelpers/fetch/giturl.py (+51/-0)
hooks/charmhelpers/payload/__init__.py (+1/-0)
hooks/charmhelpers/payload/execd.py (+50/-0)
hooks/config-changed (+0/-7)
hooks/delnode (+0/-16)
hooks/install (+0/-151)
hooks/jenkins_hooks.py (+220/-0)
hooks/jenkins_utils.py (+178/-0)
hooks/master-relation-broken (+0/-17)
hooks/master-relation-changed (+0/-24)
hooks/master-relation-departed (+0/-12)
hooks/master-relation-joined (+0/-5)
hooks/start (+0/-3)
hooks/stop (+0/-3)
hooks/upgrade-charm (+0/-7)
hooks/website-relation-joined (+0/-5)
tests/100-deploy-trusty (+4/-2)
tests/README (+56/-0)
unit_tests/test_jenkins_hooks.py (+6/-0)
unit_tests/test_jenkins_utils.py (+6/-0)
To merge this branch: bzr merge lp:~hopem/charms/trusty/jenkins/python-redux
Reviewer Review Type Date Requested Status
Whit Morriss (community) Needs Fixing
Review Queue (community) automated testing Needs Fixing
Ryan Beisner Pending
Felipe Reyes Pending
Paul Larson Pending
Jorge Niedbalski Pending
James Page Pending
Review via email: mp+245769@code.launchpad.net

This proposal supersedes a proposal from 2014-12-12.

This proposal has been superseded by a proposal from 2015-01-20.

Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10336-results

review: Needs Fixing (automated testing)
Revision history for this message
Felipe Reyes (freyes) wrote : Posted in a previous version of this proposal

Setting the password doesn't work: deploying as below doesn't allow you to log in with admin/admin. Also, deploying first from the charm store and then upgrading to this branch breaks the password.

---
jenkins:
    password: "admin"
---

$ juju deploy --config config.yaml local:trusty/jenkins
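
As a quick sanity check, something like the following (using the same python-jenkins library the charm hooks use; the unit address is a placeholder) should confirm whether the configured password was actually applied:

import jenkins

# Placeholder address for the deployed unit; admin/admin matches the config above.
client = jenkins.Jenkins("http://<unit-ip>:8080/", "admin", "admin")
# An authenticated call such as get_jobs() is expected to fail with an
# exception if the admin password was never set on the Jenkins master.
client.get_jobs()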

review: Needs Fixing
Revision history for this message
Edward Hope-Morley (hopem) wrote : Posted in a previous version of this proposal

Thanks Felipe, taking a look.

Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10636-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10684-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10704-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10869-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote :

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10876-results

review: Needs Fixing (automated testing)
Revision history for this message
Whit Morriss (whitmo) wrote :

Thanks Edward, your Python rewrite generally looks good at a glance.

I confirmed the test failures reported by automated testing: jenkins-slave is not found because it exists only in the precise series.

Changing tests/100-deploy-trusty line 19 to "d.add('jenkins-slave', 'cs:precise/jenkins-slave')" remedies this issue until there is a trusty version of jenkins-slave.
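
As a rough sketch of that change in context (the surrounding amulet setup and the relation names are assumed, not copied from the test), only the second d.add() line reflects the suggested fix:

import amulet

# Assumed deployment scaffolding for tests/100-deploy-trusty.
d = amulet.Deployment(series='trusty')
d.add('jenkins')
d.add('jenkins-slave', 'cs:precise/jenkins-slave')  # pull the slave from precise for now
d.relate('jenkins:master', 'jenkins-slave:slave')   # relation endpoints assumed
d.setup(timeout=900)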

The tests also hit an error in "master-relation-changed" due to what appears to be a race condition between configuring/restarting the jenkins slave and adding the node to the master. Running the hook via debug-hooks works fine.

See logs: https://gist.githubusercontent.com/anonymous/0067138ce2cc697b8c88/raw/3f4c03688400b32f3aa66e1bd3bad5b7398f80a5/jenkins-race-condition

I confirmed this is also an issue in the merge target's bash implementation. Adding some retry logic to add_node should fix this issue.
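
One way to do that is with the retry_on_exception decorator this branch already syncs into hooks/charmhelpers/core/decorators.py; a minimal sketch, assuming the add_node signature in hooks/jenkins_utils.py resembles the removed hooks/addnode script (the real function may differ):

import jenkins
from charmhelpers.core.decorators import retry_on_exception

# Signature assumed from the removed hooks/addnode script.
@retry_on_exception(5, base_delay=10, exc_type=jenkins.JenkinsException)
def add_node(host, executors, labels, username, password):
    client = jenkins.Jenkins("http://localhost:8080/", username, password)
    if client.node_exists(host):
        return
    client.create_node(host, int(executors) * 2, host, labels=labels)
    if not client.node_exists(host):
        # Raising here makes the decorator back off and try again.
        raise jenkins.JenkinsException("failed to create node %s" % host)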

-1 pending the test fixes, but otherwise this looks good. Thanks again!

review: Needs Fixing
Revision history for this message
Edward Hope-Morley (hopem) wrote :

@whitmo awesome, thanks for reviewing. I'll see if I can address the add_node issue and get the amulet test fixed up. Thanks!

48. By Edward Hope-Morley

fix amulet test

49. By Edward Hope-Morley

synced charmhelpers

50. By Edward Hope-Morley

allow retries when adding node

51. By Edward Hope-Morley

 * Fixed Makefile amulet test filename
 * Synced charm-helpers python-six deps
 * Synced charm-helpers test deps

52. By Edward Hope-Morley

added precise and trusty amulet

53. By Edward Hope-Morley

switch makefile rules to names that juju ci (hopefully) understands

54. By Edward Hope-Morley

ensure apt update prior to install

55. By Edward Hope-Morley

added venv for tests and lint


Preview Diff

1=== added file 'Makefile'
2--- Makefile 1970-01-01 00:00:00 +0000
3+++ Makefile 2015-01-20 18:33:44 +0000
4@@ -0,0 +1,30 @@
5+#!/usr/bin/make
6+PYTHON := /usr/bin/env python
7+
8+lint:
9+ @flake8 --exclude hooks/charmhelpers hooks unit_tests tests
10+ @charm proof
11+
12+test:
13+ @echo Starting Amulet tests...
14+ # coreycb note: The -v should only be temporary until Amulet sends
15+ # raise_status() messages to stderr:
16+ # https://bugs.launchpad.net/amulet/+bug/1320357
17+ @juju test -v -p AMULET_HTTP_PROXY --timeout 900 \
18+ 00-setup 100-deploy
19+
20+unit_test:
21+ @echo Starting unit tests...
22+ @$(PYTHON) /usr/bin/nosetests --nologcapture --with-coverage unit_tests
23+
24+bin/charm_helpers_sync.py:
25+ @mkdir -p bin
26+ @bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
27+ > bin/charm_helpers_sync.py
28+
29+sync: bin/charm_helpers_sync.py
30+ @$(PYTHON) bin/charm_helpers_sync.py -c charm-helpers-hooks.yaml
31+
32+publish: lint unit_test
33+ bzr push lp:charms/jenkins
34+ bzr push lp:charms/trusty/jenkins
35
36=== added directory 'bin'
37=== added file 'bin/charm_helpers_sync.py'
38--- bin/charm_helpers_sync.py 1970-01-01 00:00:00 +0000
39+++ bin/charm_helpers_sync.py 2015-01-20 18:33:44 +0000
40@@ -0,0 +1,225 @@
41+#!/usr/bin/python
42+#
43+# Copyright 2013 Canonical Ltd.
44+
45+# Authors:
46+# Adam Gandelman <adamg@ubuntu.com>
47+#
48+
49+import logging
50+import optparse
51+import os
52+import subprocess
53+import shutil
54+import sys
55+import tempfile
56+import yaml
57+
58+from fnmatch import fnmatch
59+
60+CHARM_HELPERS_BRANCH = 'lp:charm-helpers'
61+
62+
63+def parse_config(conf_file):
64+ if not os.path.isfile(conf_file):
65+ logging.error('Invalid config file: %s.' % conf_file)
66+ return False
67+ return yaml.load(open(conf_file).read())
68+
69+
70+def clone_helpers(work_dir, branch):
71+ dest = os.path.join(work_dir, 'charm-helpers')
72+ logging.info('Checking out %s to %s.' % (branch, dest))
73+ cmd = ['bzr', 'checkout', '--lightweight', branch, dest]
74+ subprocess.check_call(cmd)
75+ return dest
76+
77+
78+def _module_path(module):
79+ return os.path.join(*module.split('.'))
80+
81+
82+def _src_path(src, module):
83+ return os.path.join(src, 'charmhelpers', _module_path(module))
84+
85+
86+def _dest_path(dest, module):
87+ return os.path.join(dest, _module_path(module))
88+
89+
90+def _is_pyfile(path):
91+ return os.path.isfile(path + '.py')
92+
93+
94+def ensure_init(path):
95+ '''
96+ ensure directories leading up to path are importable, omitting
97+ parent directory, eg path='/hooks/helpers/foo'/:
98+ hooks/
99+ hooks/helpers/__init__.py
100+ hooks/helpers/foo/__init__.py
101+ '''
102+ for d, dirs, files in os.walk(os.path.join(*path.split('/')[:2])):
103+ _i = os.path.join(d, '__init__.py')
104+ if not os.path.exists(_i):
105+ logging.info('Adding missing __init__.py: %s' % _i)
106+ open(_i, 'wb').close()
107+
108+
109+def sync_pyfile(src, dest):
110+ src = src + '.py'
111+ src_dir = os.path.dirname(src)
112+ logging.info('Syncing pyfile: %s -> %s.' % (src, dest))
113+ if not os.path.exists(dest):
114+ os.makedirs(dest)
115+ shutil.copy(src, dest)
116+ if os.path.isfile(os.path.join(src_dir, '__init__.py')):
117+ shutil.copy(os.path.join(src_dir, '__init__.py'),
118+ dest)
119+ ensure_init(dest)
120+
121+
122+def get_filter(opts=None):
123+ opts = opts or []
124+ if 'inc=*' in opts:
125+ # do not filter any files, include everything
126+ return None
127+
128+ def _filter(dir, ls):
129+ incs = [opt.split('=').pop() for opt in opts if 'inc=' in opt]
130+ _filter = []
131+ for f in ls:
132+ _f = os.path.join(dir, f)
133+
134+ if not os.path.isdir(_f) and not _f.endswith('.py') and incs:
135+ if True not in [fnmatch(_f, inc) for inc in incs]:
136+ logging.debug('Not syncing %s, does not match include '
137+ 'filters (%s)' % (_f, incs))
138+ _filter.append(f)
139+ else:
140+ logging.debug('Including file, which matches include '
141+ 'filters (%s): %s' % (incs, _f))
142+ elif (os.path.isfile(_f) and not _f.endswith('.py')):
143+ logging.debug('Not syncing file: %s' % f)
144+ _filter.append(f)
145+ elif (os.path.isdir(_f) and not
146+ os.path.isfile(os.path.join(_f, '__init__.py'))):
147+ logging.debug('Not syncing directory: %s' % f)
148+ _filter.append(f)
149+ return _filter
150+ return _filter
151+
152+
153+def sync_directory(src, dest, opts=None):
154+ if os.path.exists(dest):
155+ logging.debug('Removing existing directory: %s' % dest)
156+ shutil.rmtree(dest)
157+ logging.info('Syncing directory: %s -> %s.' % (src, dest))
158+
159+ shutil.copytree(src, dest, ignore=get_filter(opts))
160+ ensure_init(dest)
161+
162+
163+def sync(src, dest, module, opts=None):
164+ if os.path.isdir(_src_path(src, module)):
165+ sync_directory(_src_path(src, module), _dest_path(dest, module), opts)
166+ elif _is_pyfile(_src_path(src, module)):
167+ sync_pyfile(_src_path(src, module),
168+ os.path.dirname(_dest_path(dest, module)))
169+ else:
170+ logging.warn('Could not sync: %s. Neither a pyfile or directory, '
171+ 'does it even exist?' % module)
172+
173+
174+def parse_sync_options(options):
175+ if not options:
176+ return []
177+ return options.split(',')
178+
179+
180+def extract_options(inc, global_options=None):
181+ global_options = global_options or []
182+ if global_options and isinstance(global_options, basestring):
183+ global_options = [global_options]
184+ if '|' not in inc:
185+ return (inc, global_options)
186+ inc, opts = inc.split('|')
187+ return (inc, parse_sync_options(opts) + global_options)
188+
189+
190+def sync_helpers(include, src, dest, options=None):
191+ if not os.path.isdir(dest):
192+ os.makedirs(dest)
193+
194+ global_options = parse_sync_options(options)
195+
196+ for inc in include:
197+ if isinstance(inc, str):
198+ inc, opts = extract_options(inc, global_options)
199+ sync(src, dest, inc, opts)
200+ elif isinstance(inc, dict):
201+ # could also do nested dicts here.
202+ for k, v in inc.iteritems():
203+ if isinstance(v, list):
204+ for m in v:
205+ inc, opts = extract_options(m, global_options)
206+ sync(src, dest, '%s.%s' % (k, inc), opts)
207+
208+if __name__ == '__main__':
209+ parser = optparse.OptionParser()
210+ parser.add_option('-c', '--config', action='store', dest='config',
211+ default=None, help='helper config file')
212+ parser.add_option('-D', '--debug', action='store_true', dest='debug',
213+ default=False, help='debug')
214+ parser.add_option('-b', '--branch', action='store', dest='branch',
215+ help='charm-helpers bzr branch (overrides config)')
216+ parser.add_option('-d', '--destination', action='store', dest='dest_dir',
217+ help='sync destination dir (overrides config)')
218+ (opts, args) = parser.parse_args()
219+
220+ if opts.debug:
221+ logging.basicConfig(level=logging.DEBUG)
222+ else:
223+ logging.basicConfig(level=logging.INFO)
224+
225+ if opts.config:
226+ logging.info('Loading charm helper config from %s.' % opts.config)
227+ config = parse_config(opts.config)
228+ if not config:
229+ logging.error('Could not parse config from %s.' % opts.config)
230+ sys.exit(1)
231+ else:
232+ config = {}
233+
234+ if 'branch' not in config:
235+ config['branch'] = CHARM_HELPERS_BRANCH
236+ if opts.branch:
237+ config['branch'] = opts.branch
238+ if opts.dest_dir:
239+ config['destination'] = opts.dest_dir
240+
241+ if 'destination' not in config:
242+ logging.error('No destination dir. specified as option or config.')
243+ sys.exit(1)
244+
245+ if 'include' not in config:
246+ if not args:
247+ logging.error('No modules to sync specified as option or config.')
248+ sys.exit(1)
249+ config['include'] = []
250+ [config['include'].append(a) for a in args]
251+
252+ sync_options = None
253+ if 'options' in config:
254+ sync_options = config['options']
255+ tmpd = tempfile.mkdtemp()
256+ try:
257+ checkout = clone_helpers(tmpd, config['branch'])
258+ sync_helpers(config['include'], checkout, config['destination'],
259+ options=sync_options)
260+ except Exception, e:
261+ logging.error("Could not sync: %s" % e)
262+ raise e
263+ finally:
264+ logging.debug('Cleaning up %s' % tmpd)
265+ shutil.rmtree(tmpd)
266
267=== added file 'charm-helpers-hooks.yaml'
268--- charm-helpers-hooks.yaml 1970-01-01 00:00:00 +0000
269+++ charm-helpers-hooks.yaml 2015-01-20 18:33:44 +0000
270@@ -0,0 +1,7 @@
271+branch: lp:charm-helpers
272+destination: hooks/charmhelpers
273+include:
274+ - __init__
275+ - core
276+ - fetch
277+ - payload.execd
278
279=== modified file 'config.yaml'
280--- config.yaml 2014-08-14 19:53:02 +0000
281+++ config.yaml 2015-01-20 18:33:44 +0000
282@@ -17,9 +17,9 @@
283 slave nodes so please don't change in Jenkins.
284 password:
285 type: string
286+ default: ""
287 description: Admin user password - used to manage
288 slave nodes so please don't change in Jenkins.
289- default:
290 plugins:
291 type: string
292 default: ""
293
294=== removed file 'hooks/addnode'
295--- hooks/addnode 2012-04-27 13:04:33 +0000
296+++ hooks/addnode 1970-01-01 00:00:00 +0000
297@@ -1,21 +0,0 @@
298-#!/usr/bin/python
299-
300-import jenkins
301-import sys
302-
303-host=sys.argv[1]
304-executors=sys.argv[2]
305-labels=sys.argv[3]
306-username=sys.argv[4]
307-password=sys.argv[5]
308-
309-l_jenkins = jenkins.Jenkins("http://localhost:8080/",username,password)
310-
311-if l_jenkins.node_exists(host):
312- print "Node exists - not adding"
313-else:
314- print "Adding node to Jenkins master"
315- l_jenkins.create_node(host, int(executors) * 2, host , labels=labels)
316-
317-if not l_jenkins.node_exists(host):
318- print "Failed to create node"
319
320=== added directory 'hooks/charmhelpers'
321=== added file 'hooks/charmhelpers/__init__.py'
322--- hooks/charmhelpers/__init__.py 1970-01-01 00:00:00 +0000
323+++ hooks/charmhelpers/__init__.py 2015-01-20 18:33:44 +0000
324@@ -0,0 +1,22 @@
325+# Bootstrap charm-helpers, installing its dependencies if necessary using
326+# only standard libraries.
327+import subprocess
328+import sys
329+
330+try:
331+ import six # flake8: noqa
332+except ImportError:
333+ if sys.version_info.major == 2:
334+ subprocess.check_call(['apt-get', 'install', '-y', 'python-six'])
335+ else:
336+ subprocess.check_call(['apt-get', 'install', '-y', 'python3-six'])
337+ import six # flake8: noqa
338+
339+try:
340+ import yaml # flake8: noqa
341+except ImportError:
342+ if sys.version_info.major == 2:
343+ subprocess.check_call(['apt-get', 'install', '-y', 'python-yaml'])
344+ else:
345+ subprocess.check_call(['apt-get', 'install', '-y', 'python3-yaml'])
346+ import yaml # flake8: noqa
347
348=== added directory 'hooks/charmhelpers/contrib'
349=== added file 'hooks/charmhelpers/contrib/__init__.py'
350=== added directory 'hooks/charmhelpers/core'
351=== added file 'hooks/charmhelpers/core/__init__.py'
352=== added file 'hooks/charmhelpers/core/decorators.py'
353--- hooks/charmhelpers/core/decorators.py 1970-01-01 00:00:00 +0000
354+++ hooks/charmhelpers/core/decorators.py 2015-01-20 18:33:44 +0000
355@@ -0,0 +1,41 @@
356+#
357+# Copyright 2014 Canonical Ltd.
358+#
359+# Authors:
360+# Edward Hope-Morley <opentastic@gmail.com>
361+#
362+
363+import time
364+
365+from charmhelpers.core.hookenv import (
366+ log,
367+ INFO,
368+)
369+
370+
371+def retry_on_exception(num_retries, base_delay=0, exc_type=Exception):
372+ """If the decorated function raises exception exc_type, allow num_retries
373+ retry attempts before raise the exception.
374+ """
375+ def _retry_on_exception_inner_1(f):
376+ def _retry_on_exception_inner_2(*args, **kwargs):
377+ retries = num_retries
378+ multiplier = 1
379+ while True:
380+ try:
381+ return f(*args, **kwargs)
382+ except exc_type:
383+ if not retries:
384+ raise
385+
386+ delay = base_delay * multiplier
387+ multiplier += 1
388+ log("Retrying '%s' %d more times (delay=%s)" %
389+ (f.__name__, retries, delay), level=INFO)
390+ retries -= 1
391+ if delay:
392+ time.sleep(delay)
393+
394+ return _retry_on_exception_inner_2
395+
396+ return _retry_on_exception_inner_1
397
398=== added file 'hooks/charmhelpers/core/fstab.py'
399--- hooks/charmhelpers/core/fstab.py 1970-01-01 00:00:00 +0000
400+++ hooks/charmhelpers/core/fstab.py 2015-01-20 18:33:44 +0000
401@@ -0,0 +1,118 @@
402+#!/usr/bin/env python
403+# -*- coding: utf-8 -*-
404+
405+__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
406+
407+import io
408+import os
409+
410+
411+class Fstab(io.FileIO):
412+ """This class extends file in order to implement a file reader/writer
413+ for file `/etc/fstab`
414+ """
415+
416+ class Entry(object):
417+ """Entry class represents a non-comment line on the `/etc/fstab` file
418+ """
419+ def __init__(self, device, mountpoint, filesystem,
420+ options, d=0, p=0):
421+ self.device = device
422+ self.mountpoint = mountpoint
423+ self.filesystem = filesystem
424+
425+ if not options:
426+ options = "defaults"
427+
428+ self.options = options
429+ self.d = int(d)
430+ self.p = int(p)
431+
432+ def __eq__(self, o):
433+ return str(self) == str(o)
434+
435+ def __str__(self):
436+ return "{} {} {} {} {} {}".format(self.device,
437+ self.mountpoint,
438+ self.filesystem,
439+ self.options,
440+ self.d,
441+ self.p)
442+
443+ DEFAULT_PATH = os.path.join(os.path.sep, 'etc', 'fstab')
444+
445+ def __init__(self, path=None):
446+ if path:
447+ self._path = path
448+ else:
449+ self._path = self.DEFAULT_PATH
450+ super(Fstab, self).__init__(self._path, 'rb+')
451+
452+ def _hydrate_entry(self, line):
453+ # NOTE: use split with no arguments to split on any
454+ # whitespace including tabs
455+ return Fstab.Entry(*filter(
456+ lambda x: x not in ('', None),
457+ line.strip("\n").split()))
458+
459+ @property
460+ def entries(self):
461+ self.seek(0)
462+ for line in self.readlines():
463+ line = line.decode('us-ascii')
464+ try:
465+ if line.strip() and not line.startswith("#"):
466+ yield self._hydrate_entry(line)
467+ except ValueError:
468+ pass
469+
470+ def get_entry_by_attr(self, attr, value):
471+ for entry in self.entries:
472+ e_attr = getattr(entry, attr)
473+ if e_attr == value:
474+ return entry
475+ return None
476+
477+ def add_entry(self, entry):
478+ if self.get_entry_by_attr('device', entry.device):
479+ return False
480+
481+ self.write((str(entry) + '\n').encode('us-ascii'))
482+ self.truncate()
483+ return entry
484+
485+ def remove_entry(self, entry):
486+ self.seek(0)
487+
488+ lines = [l.decode('us-ascii') for l in self.readlines()]
489+
490+ found = False
491+ for index, line in enumerate(lines):
492+ if not line.startswith("#"):
493+ if self._hydrate_entry(line) == entry:
494+ found = True
495+ break
496+
497+ if not found:
498+ return False
499+
500+ lines.remove(line)
501+
502+ self.seek(0)
503+ self.write(''.join(lines).encode('us-ascii'))
504+ self.truncate()
505+ return True
506+
507+ @classmethod
508+ def remove_by_mountpoint(cls, mountpoint, path=None):
509+ fstab = cls(path=path)
510+ entry = fstab.get_entry_by_attr('mountpoint', mountpoint)
511+ if entry:
512+ return fstab.remove_entry(entry)
513+ return False
514+
515+ @classmethod
516+ def add(cls, device, mountpoint, filesystem, options=None, path=None):
517+ return cls(path=path).add_entry(Fstab.Entry(device,
518+ mountpoint, filesystem,
519+ options=options))
520
521=== added file 'hooks/charmhelpers/core/hookenv.py'
522--- hooks/charmhelpers/core/hookenv.py 1970-01-01 00:00:00 +0000
523+++ hooks/charmhelpers/core/hookenv.py 2015-01-20 18:33:44 +0000
524@@ -0,0 +1,552 @@
525+"Interactions with the Juju environment"
526+# Copyright 2013 Canonical Ltd.
527+#
528+# Authors:
529+# Charm Helpers Developers <juju@lists.ubuntu.com>
530+
531+import os
532+import json
533+import yaml
534+import subprocess
535+import sys
536+from subprocess import CalledProcessError
537+
538+import six
539+if not six.PY3:
540+ from UserDict import UserDict
541+else:
542+ from collections import UserDict
543+
544+CRITICAL = "CRITICAL"
545+ERROR = "ERROR"
546+WARNING = "WARNING"
547+INFO = "INFO"
548+DEBUG = "DEBUG"
549+MARKER = object()
550+
551+cache = {}
552+
553+
554+def cached(func):
555+ """Cache return values for multiple executions of func + args
556+
557+ For example::
558+
559+ @cached
560+ def unit_get(attribute):
561+ pass
562+
563+ unit_get('test')
564+
565+ will cache the result of unit_get + 'test' for future calls.
566+ """
567+ def wrapper(*args, **kwargs):
568+ global cache
569+ key = str((func, args, kwargs))
570+ try:
571+ return cache[key]
572+ except KeyError:
573+ res = func(*args, **kwargs)
574+ cache[key] = res
575+ return res
576+ return wrapper
577+
578+
579+def flush(key):
580+ """Flushes any entries from function cache where the
581+ key is found in the function+args """
582+ flush_list = []
583+ for item in cache:
584+ if key in item:
585+ flush_list.append(item)
586+ for item in flush_list:
587+ del cache[item]
588+
589+
590+def log(message, level=None):
591+ """Write a message to the juju log"""
592+ command = ['juju-log']
593+ if level:
594+ command += ['-l', level]
595+ if not isinstance(message, six.string_types):
596+ message = repr(message)
597+ command += [message]
598+ subprocess.call(command)
599+
600+
601+class Serializable(UserDict):
602+ """Wrapper, an object that can be serialized to yaml or json"""
603+
604+ def __init__(self, obj):
605+ # wrap the object
606+ UserDict.__init__(self)
607+ self.data = obj
608+
609+ def __getattr__(self, attr):
610+ # See if this object has attribute.
611+ if attr in ("json", "yaml", "data"):
612+ return self.__dict__[attr]
613+ # Check for attribute in wrapped object.
614+ got = getattr(self.data, attr, MARKER)
615+ if got is not MARKER:
616+ return got
617+ # Proxy to the wrapped object via dict interface.
618+ try:
619+ return self.data[attr]
620+ except KeyError:
621+ raise AttributeError(attr)
622+
623+ def __getstate__(self):
624+ # Pickle as a standard dictionary.
625+ return self.data
626+
627+ def __setstate__(self, state):
628+ # Unpickle into our wrapper.
629+ self.data = state
630+
631+ def json(self):
632+ """Serialize the object to json"""
633+ return json.dumps(self.data)
634+
635+ def yaml(self):
636+ """Serialize the object to yaml"""
637+ return yaml.dump(self.data)
638+
639+
640+def execution_environment():
641+ """A convenient bundling of the current execution context"""
642+ context = {}
643+ context['conf'] = config()
644+ if relation_id():
645+ context['reltype'] = relation_type()
646+ context['relid'] = relation_id()
647+ context['rel'] = relation_get()
648+ context['unit'] = local_unit()
649+ context['rels'] = relations()
650+ context['env'] = os.environ
651+ return context
652+
653+
654+def in_relation_hook():
655+ """Determine whether we're running in a relation hook"""
656+ return 'JUJU_RELATION' in os.environ
657+
658+
659+def relation_type():
660+ """The scope for the current relation hook"""
661+ return os.environ.get('JUJU_RELATION', None)
662+
663+
664+def relation_id():
665+ """The relation ID for the current relation hook"""
666+ return os.environ.get('JUJU_RELATION_ID', None)
667+
668+
669+def local_unit():
670+ """Local unit ID"""
671+ return os.environ['JUJU_UNIT_NAME']
672+
673+
674+def remote_unit():
675+ """The remote unit for the current relation hook"""
676+ return os.environ['JUJU_REMOTE_UNIT']
677+
678+
679+def service_name():
680+ """The name service group this unit belongs to"""
681+ return local_unit().split('/')[0]
682+
683+
684+def hook_name():
685+ """The name of the currently executing hook"""
686+ return os.path.basename(sys.argv[0])
687+
688+
689+class Config(dict):
690+ """A dictionary representation of the charm's config.yaml, with some
691+ extra features:
692+
693+ - See which values in the dictionary have changed since the previous hook.
694+ - For values that have changed, see what the previous value was.
695+ - Store arbitrary data for use in a later hook.
696+
697+ NOTE: Do not instantiate this object directly - instead call
698+ ``hookenv.config()``, which will return an instance of :class:`Config`.
699+
700+ Example usage::
701+
702+ >>> # inside a hook
703+ >>> from charmhelpers.core import hookenv
704+ >>> config = hookenv.config()
705+ >>> config['foo']
706+ 'bar'
707+ >>> # store a new key/value for later use
708+ >>> config['mykey'] = 'myval'
709+
710+
711+ >>> # user runs `juju set mycharm foo=baz`
712+ >>> # now we're inside subsequent config-changed hook
713+ >>> config = hookenv.config()
714+ >>> config['foo']
715+ 'baz'
716+ >>> # test to see if this val has changed since last hook
717+ >>> config.changed('foo')
718+ True
719+ >>> # what was the previous value?
720+ >>> config.previous('foo')
721+ 'bar'
722+ >>> # keys/values that we add are preserved across hooks
723+ >>> config['mykey']
724+ 'myval'
725+
726+ """
727+ CONFIG_FILE_NAME = '.juju-persistent-config'
728+
729+ def __init__(self, *args, **kw):
730+ super(Config, self).__init__(*args, **kw)
731+ self.implicit_save = True
732+ self._prev_dict = None
733+ self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME)
734+ if os.path.exists(self.path):
735+ self.load_previous()
736+
737+ def __getitem__(self, key):
738+ """For regular dict lookups, check the current juju config first,
739+ then the previous (saved) copy. This ensures that user-saved values
740+ will be returned by a dict lookup.
741+
742+ """
743+ try:
744+ return dict.__getitem__(self, key)
745+ except KeyError:
746+ return (self._prev_dict or {})[key]
747+
748+ def keys(self):
749+ prev_keys = []
750+ if self._prev_dict is not None:
751+ prev_keys = self._prev_dict.keys()
752+ return list(set(prev_keys + list(dict.keys(self))))
753+
754+ def load_previous(self, path=None):
755+ """Load previous copy of config from disk.
756+
757+ In normal usage you don't need to call this method directly - it
758+ is called automatically at object initialization.
759+
760+ :param path:
761+
762+ File path from which to load the previous config. If `None`,
763+ config is loaded from the default location. If `path` is
764+ specified, subsequent `save()` calls will write to the same
765+ path.
766+
767+ """
768+ self.path = path or self.path
769+ with open(self.path) as f:
770+ self._prev_dict = json.load(f)
771+
772+ def changed(self, key):
773+ """Return True if the current value for this key is different from
774+ the previous value.
775+
776+ """
777+ if self._prev_dict is None:
778+ return True
779+ return self.previous(key) != self.get(key)
780+
781+ def previous(self, key):
782+ """Return previous value for this key, or None if there
783+ is no previous value.
784+
785+ """
786+ if self._prev_dict:
787+ return self._prev_dict.get(key)
788+ return None
789+
790+ def save(self):
791+ """Save this config to disk.
792+
793+ If the charm is using the :mod:`Services Framework <services.base>`
794+ or :meth:'@hook <Hooks.hook>' decorator, this
795+ is called automatically at the end of successful hook execution.
796+ Otherwise, it should be called directly by user code.
797+
798+ To disable automatic saves, set ``implicit_save=False`` on this
799+ instance.
800+
801+ """
802+ if self._prev_dict:
803+ for k, v in six.iteritems(self._prev_dict):
804+ if k not in self:
805+ self[k] = v
806+ with open(self.path, 'w') as f:
807+ json.dump(self, f)
808+
809+
810+@cached
811+def config(scope=None):
812+ """Juju charm configuration"""
813+ config_cmd_line = ['config-get']
814+ if scope is not None:
815+ config_cmd_line.append(scope)
816+ config_cmd_line.append('--format=json')
817+ try:
818+ config_data = json.loads(
819+ subprocess.check_output(config_cmd_line).decode('UTF-8'))
820+ if scope is not None:
821+ return config_data
822+ return Config(config_data)
823+ except ValueError:
824+ return None
825+
826+
827+@cached
828+def relation_get(attribute=None, unit=None, rid=None):
829+ """Get relation information"""
830+ _args = ['relation-get', '--format=json']
831+ if rid:
832+ _args.append('-r')
833+ _args.append(rid)
834+ _args.append(attribute or '-')
835+ if unit:
836+ _args.append(unit)
837+ try:
838+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
839+ except ValueError:
840+ return None
841+ except CalledProcessError as e:
842+ if e.returncode == 2:
843+ return None
844+ raise
845+
846+
847+def relation_set(relation_id=None, relation_settings=None, **kwargs):
848+ """Set relation information for the current unit"""
849+ relation_settings = relation_settings if relation_settings else {}
850+ relation_cmd_line = ['relation-set']
851+ if relation_id is not None:
852+ relation_cmd_line.extend(('-r', relation_id))
853+ for k, v in (list(relation_settings.items()) + list(kwargs.items())):
854+ if v is None:
855+ relation_cmd_line.append('{}='.format(k))
856+ else:
857+ relation_cmd_line.append('{}={}'.format(k, v))
858+ subprocess.check_call(relation_cmd_line)
859+ # Flush cache of any relation-gets for local unit
860+ flush(local_unit())
861+
862+
863+@cached
864+def relation_ids(reltype=None):
865+ """A list of relation_ids"""
866+ reltype = reltype or relation_type()
867+ relid_cmd_line = ['relation-ids', '--format=json']
868+ if reltype is not None:
869+ relid_cmd_line.append(reltype)
870+ return json.loads(
871+ subprocess.check_output(relid_cmd_line).decode('UTF-8')) or []
872+ return []
873+
874+
875+@cached
876+def related_units(relid=None):
877+ """A list of related units"""
878+ relid = relid or relation_id()
879+ units_cmd_line = ['relation-list', '--format=json']
880+ if relid is not None:
881+ units_cmd_line.extend(('-r', relid))
882+ return json.loads(
883+ subprocess.check_output(units_cmd_line).decode('UTF-8')) or []
884+
885+
886+@cached
887+def relation_for_unit(unit=None, rid=None):
888+ """Get the json represenation of a unit's relation"""
889+ unit = unit or remote_unit()
890+ relation = relation_get(unit=unit, rid=rid)
891+ for key in relation:
892+ if key.endswith('-list'):
893+ relation[key] = relation[key].split()
894+ relation['__unit__'] = unit
895+ return relation
896+
897+
898+@cached
899+def relations_for_id(relid=None):
900+ """Get relations of a specific relation ID"""
901+ relation_data = []
902+ relid = relid or relation_ids()
903+ for unit in related_units(relid):
904+ unit_data = relation_for_unit(unit, relid)
905+ unit_data['__relid__'] = relid
906+ relation_data.append(unit_data)
907+ return relation_data
908+
909+
910+@cached
911+def relations_of_type(reltype=None):
912+ """Get relations of a specific type"""
913+ relation_data = []
914+ reltype = reltype or relation_type()
915+ for relid in relation_ids(reltype):
916+ for relation in relations_for_id(relid):
917+ relation['__relid__'] = relid
918+ relation_data.append(relation)
919+ return relation_data
920+
921+
922+@cached
923+def metadata():
924+ """Get the current charm metadata.yaml contents as a python object"""
925+ with open(os.path.join(charm_dir(), 'metadata.yaml')) as md:
926+ return yaml.safe_load(md)
927+
928+
929+@cached
930+def relation_types():
931+ """Get a list of relation types supported by this charm"""
932+ rel_types = []
933+ md = metadata()
934+ for key in ('provides', 'requires', 'peers'):
935+ section = md.get(key)
936+ if section:
937+ rel_types.extend(section.keys())
938+ return rel_types
939+
940+
941+@cached
942+def charm_name():
943+ """Get the name of the current charm as is specified on metadata.yaml"""
944+ return metadata().get('name')
945+
946+
947+@cached
948+def relations():
949+ """Get a nested dictionary of relation data for all related units"""
950+ rels = {}
951+ for reltype in relation_types():
952+ relids = {}
953+ for relid in relation_ids(reltype):
954+ units = {local_unit(): relation_get(unit=local_unit(), rid=relid)}
955+ for unit in related_units(relid):
956+ reldata = relation_get(unit=unit, rid=relid)
957+ units[unit] = reldata
958+ relids[relid] = units
959+ rels[reltype] = relids
960+ return rels
961+
962+
963+@cached
964+def is_relation_made(relation, keys='private-address'):
965+ '''
966+ Determine whether a relation is established by checking for
967+ presence of key(s). If a list of keys is provided, they
968+ must all be present for the relation to be identified as made
969+ '''
970+ if isinstance(keys, str):
971+ keys = [keys]
972+ for r_id in relation_ids(relation):
973+ for unit in related_units(r_id):
974+ context = {}
975+ for k in keys:
976+ context[k] = relation_get(k, rid=r_id,
977+ unit=unit)
978+ if None not in context.values():
979+ return True
980+ return False
981+
982+
983+def open_port(port, protocol="TCP"):
984+ """Open a service network port"""
985+ _args = ['open-port']
986+ _args.append('{}/{}'.format(port, protocol))
987+ subprocess.check_call(_args)
988+
989+
990+def close_port(port, protocol="TCP"):
991+ """Close a service network port"""
992+ _args = ['close-port']
993+ _args.append('{}/{}'.format(port, protocol))
994+ subprocess.check_call(_args)
995+
996+
997+@cached
998+def unit_get(attribute):
999+ """Get the unit ID for the remote unit"""
1000+ _args = ['unit-get', '--format=json', attribute]
1001+ try:
1002+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
1003+ except ValueError:
1004+ return None
1005+
1006+
1007+def unit_private_ip():
1008+ """Get this unit's private IP address"""
1009+ return unit_get('private-address')
1010+
1011+
1012+class UnregisteredHookError(Exception):
1013+ """Raised when an undefined hook is called"""
1014+ pass
1015+
1016+
1017+class Hooks(object):
1018+ """A convenient handler for hook functions.
1019+
1020+ Example::
1021+
1022+ hooks = Hooks()
1023+
1024+ # register a hook, taking its name from the function name
1025+ @hooks.hook()
1026+ def install():
1027+ pass # your code here
1028+
1029+ # register a hook, providing a custom hook name
1030+ @hooks.hook("config-changed")
1031+ def config_changed():
1032+ pass # your code here
1033+
1034+ if __name__ == "__main__":
1035+ # execute a hook based on the name the program is called by
1036+ hooks.execute(sys.argv)
1037+ """
1038+
1039+ def __init__(self, config_save=True):
1040+ super(Hooks, self).__init__()
1041+ self._hooks = {}
1042+ self._config_save = config_save
1043+
1044+ def register(self, name, function):
1045+ """Register a hook"""
1046+ self._hooks[name] = function
1047+
1048+ def execute(self, args):
1049+ """Execute a registered hook based on args[0]"""
1050+ hook_name = os.path.basename(args[0])
1051+ if hook_name in self._hooks:
1052+ self._hooks[hook_name]()
1053+ if self._config_save:
1054+ cfg = config()
1055+ if cfg.implicit_save:
1056+ cfg.save()
1057+ else:
1058+ raise UnregisteredHookError(hook_name)
1059+
1060+ def hook(self, *hook_names):
1061+ """Decorator, registering them as hooks"""
1062+ def wrapper(decorated):
1063+ for hook_name in hook_names:
1064+ self.register(hook_name, decorated)
1065+ else:
1066+ self.register(decorated.__name__, decorated)
1067+ if '_' in decorated.__name__:
1068+ self.register(
1069+ decorated.__name__.replace('_', '-'), decorated)
1070+ return decorated
1071+ return wrapper
1072+
1073+
1074+def charm_dir():
1075+ """Return the root directory of the current charm"""
1076+ return os.environ.get('CHARM_DIR')
1077
1078=== added file 'hooks/charmhelpers/core/host.py'
1079--- hooks/charmhelpers/core/host.py 1970-01-01 00:00:00 +0000
1080+++ hooks/charmhelpers/core/host.py 2015-01-20 18:33:44 +0000
1081@@ -0,0 +1,419 @@
1082+"""Tools for working with the host system"""
1083+# Copyright 2012 Canonical Ltd.
1084+#
1085+# Authors:
1086+# Nick Moffitt <nick.moffitt@canonical.com>
1087+# Matthew Wedgwood <matthew.wedgwood@canonical.com>
1088+
1089+import os
1090+import re
1091+import pwd
1092+import grp
1093+import random
1094+import string
1095+import subprocess
1096+import hashlib
1097+from contextlib import contextmanager
1098+from collections import OrderedDict
1099+
1100+import six
1101+
1102+from .hookenv import log
1103+from .fstab import Fstab
1104+
1105+
1106+def service_start(service_name):
1107+ """Start a system service"""
1108+ return service('start', service_name)
1109+
1110+
1111+def service_stop(service_name):
1112+ """Stop a system service"""
1113+ return service('stop', service_name)
1114+
1115+
1116+def service_restart(service_name):
1117+ """Restart a system service"""
1118+ return service('restart', service_name)
1119+
1120+
1121+def service_reload(service_name, restart_on_failure=False):
1122+ """Reload a system service, optionally falling back to restart if
1123+ reload fails"""
1124+ service_result = service('reload', service_name)
1125+ if not service_result and restart_on_failure:
1126+ service_result = service('restart', service_name)
1127+ return service_result
1128+
1129+
1130+def service(action, service_name):
1131+ """Control a system service"""
1132+ cmd = ['service', service_name, action]
1133+ return subprocess.call(cmd) == 0
1134+
1135+
1136+def service_running(service):
1137+ """Determine whether a system service is running"""
1138+ try:
1139+ output = subprocess.check_output(
1140+ ['service', service, 'status'],
1141+ stderr=subprocess.STDOUT).decode('UTF-8')
1142+ except subprocess.CalledProcessError:
1143+ return False
1144+ else:
1145+ if ("start/running" in output or "is running" in output):
1146+ return True
1147+ else:
1148+ return False
1149+
1150+
1151+def service_available(service_name):
1152+ """Determine whether a system service is available"""
1153+ try:
1154+ subprocess.check_output(
1155+ ['service', service_name, 'status'],
1156+ stderr=subprocess.STDOUT).decode('UTF-8')
1157+ except subprocess.CalledProcessError as e:
1158+ return 'unrecognized service' not in e.output
1159+ else:
1160+ return True
1161+
1162+
1163+def adduser(username, password=None, shell='/bin/bash', system_user=False):
1164+ """Add a user to the system"""
1165+ try:
1166+ user_info = pwd.getpwnam(username)
1167+ log('user {0} already exists!'.format(username))
1168+ except KeyError:
1169+ log('creating user {0}'.format(username))
1170+ cmd = ['useradd']
1171+ if system_user or password is None:
1172+ cmd.append('--system')
1173+ else:
1174+ cmd.extend([
1175+ '--create-home',
1176+ '--shell', shell,
1177+ '--password', password,
1178+ ])
1179+ cmd.append(username)
1180+ subprocess.check_call(cmd)
1181+ user_info = pwd.getpwnam(username)
1182+ return user_info
1183+
1184+
1185+def add_group(group_name, system_group=False):
1186+ """Add a group to the system"""
1187+ try:
1188+ group_info = grp.getgrnam(group_name)
1189+ log('group {0} already exists!'.format(group_name))
1190+ except KeyError:
1191+ log('creating group {0}'.format(group_name))
1192+ cmd = ['addgroup']
1193+ if system_group:
1194+ cmd.append('--system')
1195+ else:
1196+ cmd.extend([
1197+ '--group',
1198+ ])
1199+ cmd.append(group_name)
1200+ subprocess.check_call(cmd)
1201+ group_info = grp.getgrnam(group_name)
1202+ return group_info
1203+
1204+
1205+def add_user_to_group(username, group):
1206+ """Add a user to a group"""
1207+ cmd = [
1208+ 'gpasswd', '-a',
1209+ username,
1210+ group
1211+ ]
1212+ log("Adding user {} to group {}".format(username, group))
1213+ subprocess.check_call(cmd)
1214+
1215+
1216+def rsync(from_path, to_path, flags='-r', options=None):
1217+ """Replicate the contents of a path"""
1218+ options = options or ['--delete', '--executability']
1219+ cmd = ['/usr/bin/rsync', flags]
1220+ cmd.extend(options)
1221+ cmd.append(from_path)
1222+ cmd.append(to_path)
1223+ log(" ".join(cmd))
1224+ return subprocess.check_output(cmd).decode('UTF-8').strip()
1225+
1226+
1227+def symlink(source, destination):
1228+ """Create a symbolic link"""
1229+ log("Symlinking {} as {}".format(source, destination))
1230+ cmd = [
1231+ 'ln',
1232+ '-sf',
1233+ source,
1234+ destination,
1235+ ]
1236+ subprocess.check_call(cmd)
1237+
1238+
1239+def mkdir(path, owner='root', group='root', perms=0o555, force=False):
1240+ """Create a directory"""
1241+ log("Making dir {} {}:{} {:o}".format(path, owner, group,
1242+ perms))
1243+ uid = pwd.getpwnam(owner).pw_uid
1244+ gid = grp.getgrnam(group).gr_gid
1245+ realpath = os.path.abspath(path)
1246+ path_exists = os.path.exists(realpath)
1247+ if path_exists and force:
1248+ if not os.path.isdir(realpath):
1249+ log("Removing non-directory file {} prior to mkdir()".format(path))
1250+ os.unlink(realpath)
1251+ os.makedirs(realpath, perms)
1252+ os.chown(realpath, uid, gid)
1253+ elif not path_exists:
1254+ os.makedirs(realpath, perms)
1255+ os.chown(realpath, uid, gid)
1256+
1257+
1258+def write_file(path, content, owner='root', group='root', perms=0o444):
1259+ """Create or overwrite a file with the contents of a string"""
1260+ log("Writing file {} {}:{} {:o}".format(path, owner, group, perms))
1261+ uid = pwd.getpwnam(owner).pw_uid
1262+ gid = grp.getgrnam(group).gr_gid
1263+ with open(path, 'w') as target:
1264+ os.fchown(target.fileno(), uid, gid)
1265+ os.fchmod(target.fileno(), perms)
1266+ target.write(content)
1267+
1268+
1269+def fstab_remove(mp):
1270+ """Remove the given mountpoint entry from /etc/fstab
1271+ """
1272+ return Fstab.remove_by_mountpoint(mp)
1273+
1274+
1275+def fstab_add(dev, mp, fs, options=None):
1276+ """Adds the given device entry to the /etc/fstab file
1277+ """
1278+ return Fstab.add(dev, mp, fs, options=options)
1279+
1280+
1281+def mount(device, mountpoint, options=None, persist=False, filesystem="ext3"):
1282+ """Mount a filesystem at a particular mountpoint"""
1283+ cmd_args = ['mount']
1284+ if options is not None:
1285+ cmd_args.extend(['-o', options])
1286+ cmd_args.extend([device, mountpoint])
1287+ try:
1288+ subprocess.check_output(cmd_args)
1289+ except subprocess.CalledProcessError as e:
1290+ log('Error mounting {} at {}\n{}'.format(device, mountpoint, e.output))
1291+ return False
1292+
1293+ if persist:
1294+ return fstab_add(device, mountpoint, filesystem, options=options)
1295+ return True
1296+
1297+
1298+def umount(mountpoint, persist=False):
1299+ """Unmount a filesystem"""
1300+ cmd_args = ['umount', mountpoint]
1301+ try:
1302+ subprocess.check_output(cmd_args)
1303+ except subprocess.CalledProcessError as e:
1304+ log('Error unmounting {}\n{}'.format(mountpoint, e.output))
1305+ return False
1306+
1307+ if persist:
1308+ return fstab_remove(mountpoint)
1309+ return True
1310+
1311+
1312+def mounts():
1313+ """Get a list of all mounted volumes as [[mountpoint,device],[...]]"""
1314+ with open('/proc/mounts') as f:
1315+ # [['/mount/point','/dev/path'],[...]]
1316+ system_mounts = [m[1::-1] for m in [l.strip().split()
1317+ for l in f.readlines()]]
1318+ return system_mounts
1319+
1320+
1321+def file_hash(path, hash_type='md5'):
1322+ """
1323+ Generate a hash checksum of the contents of 'path' or None if not found.
1324+
1325+ :param str hash_type: Any hash alrgorithm supported by :mod:`hashlib`,
1326+ such as md5, sha1, sha256, sha512, etc.
1327+ """
1328+ if os.path.exists(path):
1329+ h = getattr(hashlib, hash_type)()
1330+ with open(path, 'rb') as source:
1331+ h.update(source.read())
1332+ return h.hexdigest()
1333+ else:
1334+ return None
1335+
1336+
1337+def check_hash(path, checksum, hash_type='md5'):
1338+ """
1339+ Validate a file using a cryptographic checksum.
1340+
1341+ :param str checksum: Value of the checksum used to validate the file.
1342+ :param str hash_type: Hash algorithm used to generate `checksum`.
1343+ Can be any hash alrgorithm supported by :mod:`hashlib`,
1344+ such as md5, sha1, sha256, sha512, etc.
1345+ :raises ChecksumError: If the file fails the checksum
1346+
1347+ """
1348+ actual_checksum = file_hash(path, hash_type)
1349+ if checksum != actual_checksum:
1350+ raise ChecksumError("'%s' != '%s'" % (checksum, actual_checksum))
1351+
1352+
1353+class ChecksumError(ValueError):
1354+ pass
1355+
1356+
1357+def restart_on_change(restart_map, stopstart=False):
1358+ """Restart services based on configuration files changing
1359+
1360+ This function is used a decorator, for example::
1361+
1362+ @restart_on_change({
1363+ '/etc/ceph/ceph.conf': [ 'cinder-api', 'cinder-volume' ]
1364+ })
1365+ def ceph_client_changed():
1366+ pass # your code here
1367+
1368+ In this example, the cinder-api and cinder-volume services
1369+ would be restarted if /etc/ceph/ceph.conf is changed by the
1370+ ceph_client_changed function.
1371+ """
1372+ def wrap(f):
1373+ def wrapped_f(*args):
1374+ checksums = {}
1375+ for path in restart_map:
1376+ checksums[path] = file_hash(path)
1377+ f(*args)
1378+ restarts = []
1379+ for path in restart_map:
1380+ if checksums[path] != file_hash(path):
1381+ restarts += restart_map[path]
1382+ services_list = list(OrderedDict.fromkeys(restarts))
1383+ if not stopstart:
1384+ for service_name in services_list:
1385+ service('restart', service_name)
1386+ else:
1387+ for action in ['stop', 'start']:
1388+ for service_name in services_list:
1389+ service(action, service_name)
1390+ return wrapped_f
1391+ return wrap
1392+
1393+
1394+def lsb_release():
1395+ """Return /etc/lsb-release in a dict"""
1396+ d = {}
1397+ with open('/etc/lsb-release', 'r') as lsb:
1398+ for l in lsb:
1399+ k, v = l.split('=')
1400+ d[k.strip()] = v.strip()
1401+ return d
1402+
1403+
1404+def pwgen(length=None):
1405+ """Generate a random pasword."""
1406+ if length is None:
1407+ length = random.choice(range(35, 45))
1408+ alphanumeric_chars = [
1409+ l for l in (string.ascii_letters + string.digits)
1410+ if l not in 'l0QD1vAEIOUaeiou']
1411+ random_chars = [
1412+ random.choice(alphanumeric_chars) for _ in range(length)]
1413+ return(''.join(random_chars))
1414+
1415+
1416+def list_nics(nic_type):
1417+ '''Return a list of nics of given type(s)'''
1418+ if isinstance(nic_type, six.string_types):
1419+ int_types = [nic_type]
1420+ else:
1421+ int_types = nic_type
1422+ interfaces = []
1423+ for int_type in int_types:
1424+ cmd = ['ip', 'addr', 'show', 'label', int_type + '*']
1425+ ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1426+ ip_output = (line for line in ip_output if line)
1427+ for line in ip_output:
1428+ if line.split()[1].startswith(int_type):
1429+ matched = re.search('.*: (bond[0-9]+\.[0-9]+)@.*', line)
1430+ if matched:
1431+ interface = matched.groups()[0]
1432+ else:
1433+ interface = line.split()[1].replace(":", "")
1434+ interfaces.append(interface)
1435+
1436+ return interfaces
1437+
1438+
1439+def set_nic_mtu(nic, mtu):
1440+ '''Set MTU on a network interface'''
1441+ cmd = ['ip', 'link', 'set', nic, 'mtu', mtu]
1442+ subprocess.check_call(cmd)
1443+
1444+
1445+def get_nic_mtu(nic):
1446+ cmd = ['ip', 'addr', 'show', nic]
1447+ ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1448+ mtu = ""
1449+ for line in ip_output:
1450+ words = line.split()
1451+ if 'mtu' in words:
1452+ mtu = words[words.index("mtu") + 1]
1453+ return mtu
1454+
1455+
1456+def get_nic_hwaddr(nic):
1457+ cmd = ['ip', '-o', '-0', 'addr', 'show', nic]
1458+ ip_output = subprocess.check_output(cmd).decode('UTF-8')
1459+ hwaddr = ""
1460+ words = ip_output.split()
1461+ if 'link/ether' in words:
1462+ hwaddr = words[words.index('link/ether') + 1]
1463+ return hwaddr
1464+
1465+
1466+def cmp_pkgrevno(package, revno, pkgcache=None):
1467+ '''Compare supplied revno with the revno of the installed package
1468+
1469+ * 1 => Installed revno is greater than supplied arg
1470+ * 0 => Installed revno is the same as supplied arg
1471+ * -1 => Installed revno is less than supplied arg
1472+
1473+ '''
1474+ import apt_pkg
1475+ if not pkgcache:
1476+ from charmhelpers.fetch import apt_cache
1477+ pkgcache = apt_cache()
1478+ pkg = pkgcache[package]
1479+ return apt_pkg.version_compare(pkg.current_ver.ver_str, revno)
1480+
1481+
1482+@contextmanager
1483+def chdir(d):
1484+ cur = os.getcwd()
1485+ try:
1486+ yield os.chdir(d)
1487+ finally:
1488+ os.chdir(cur)
1489+
1490+
1491+def chownr(path, owner, group):
1492+ uid = pwd.getpwnam(owner).pw_uid
1493+ gid = grp.getgrnam(group).gr_gid
1494+
1495+ for root, dirs, files in os.walk(path):
1496+ for name in dirs + files:
1497+ full = os.path.join(root, name)
1498+ broken_symlink = os.path.lexists(full) and not os.path.exists(full)
1499+ if not broken_symlink:
1500+ os.chown(full, uid, gid)
1501
1502=== added directory 'hooks/charmhelpers/core/services'
1503=== added file 'hooks/charmhelpers/core/services/__init__.py'
1504--- hooks/charmhelpers/core/services/__init__.py 1970-01-01 00:00:00 +0000
1505+++ hooks/charmhelpers/core/services/__init__.py 2015-01-20 18:33:44 +0000
1506@@ -0,0 +1,2 @@
1507+from .base import * # NOQA
1508+from .helpers import * # NOQA
1509
1510=== added file 'hooks/charmhelpers/core/services/base.py'
1511--- hooks/charmhelpers/core/services/base.py 1970-01-01 00:00:00 +0000
1512+++ hooks/charmhelpers/core/services/base.py 2015-01-20 18:33:44 +0000
1513@@ -0,0 +1,313 @@
1514+import os
1515+import re
1516+import json
1517+from collections import Iterable
1518+
1519+from charmhelpers.core import host
1520+from charmhelpers.core import hookenv
1521+
1522+
1523+__all__ = ['ServiceManager', 'ManagerCallback',
1524+ 'PortManagerCallback', 'open_ports', 'close_ports', 'manage_ports',
1525+ 'service_restart', 'service_stop']
1526+
1527+
1528+class ServiceManager(object):
1529+ def __init__(self, services=None):
1530+ """
1531+ Register a list of services, given their definitions.
1532+
1533+ Service definitions are dicts in the following formats (all keys except
1534+ 'service' are optional)::
1535+
1536+ {
1537+ "service": <service name>,
1538+ "required_data": <list of required data contexts>,
1539+ "provided_data": <list of provided data contexts>,
1540+ "data_ready": <one or more callbacks>,
1541+ "data_lost": <one or more callbacks>,
1542+ "start": <one or more callbacks>,
1543+ "stop": <one or more callbacks>,
1544+ "ports": <list of ports to manage>,
1545+ }
1546+
1547+ The 'required_data' list should contain dicts of required data (or
1548+ dependency managers that act like dicts and know how to collect the data).
1549+ Only when all items in the 'required_data' list are populated are the list
1550+ of 'data_ready' and 'start' callbacks executed. See `is_ready()` for more
1551+ information.
1552+
1553+ The 'provided_data' list should contain relation data providers, most likely
1554+ a subclass of :class:`charmhelpers.core.services.helpers.RelationContext`,
1555+ that will indicate a set of data to set on a given relation.
1556+
1557+ The 'data_ready' value should be either a single callback, or a list of
1558+ callbacks, to be called when all items in 'required_data' pass `is_ready()`.
1559+ Each callback will be called with the service name as the only parameter.
1560+ After all of the 'data_ready' callbacks are called, the 'start' callbacks
1561+ are fired.
1562+
1563+ The 'data_lost' value should be either a single callback, or a list of
1564+ callbacks, to be called when a 'required_data' item no longer passes
1565+ `is_ready()`. Each callback will be called with the service name as the
1566+ only parameter. After all of the 'data_lost' callbacks are called,
1567+ the 'stop' callbacks are fired.
1568+
1569+ The 'start' value should be either a single callback, or a list of
1570+ callbacks, to be called when starting the service, after the 'data_ready'
1571+ callbacks are complete. Each callback will be called with the service
1572+ name as the only parameter. This defaults to
1573+ `[host.service_start, services.open_ports]`.
1574+
1575+ The 'stop' value should be either a single callback, or a list of
1576+ callbacks, to be called when stopping the service. If the service is
1577+ being stopped because it no longer has all of its 'required_data', this
1578+ will be called after all of the 'data_lost' callbacks are complete.
1579+ Each callback will be called with the service name as the only parameter.
1580+ This defaults to `[services.close_ports, host.service_stop]`.
1581+
1582+ The 'ports' value should be a list of ports to manage. The default
1583+ 'start' handler will open the ports after the service is started,
1584+ and the default 'stop' handler will close the ports prior to stopping
1585+ the service.
1586+
1587+
1588+ Examples:
1589+
1590+ The following registers an Upstart service called bingod that depends on
1591+ a mongodb relation and which runs a custom `db_migrate` function prior to
1592+ restarting the service, and a Runit service called spadesd::
1593+
1594+ manager = services.ServiceManager([
1595+ {
1596+ 'service': 'bingod',
1597+ 'ports': [80, 443],
1598+ 'required_data': [MongoRelation(), config(), {'my': 'data'}],
1599+ 'data_ready': [
1600+ services.template(source='bingod.conf'),
1601+ services.template(source='bingod.ini',
1602+ target='/etc/bingod.ini',
1603+ owner='bingo', perms=0400),
1604+ ],
1605+ },
1606+ {
1607+ 'service': 'spadesd',
1608+ 'data_ready': services.template(source='spadesd_run.j2',
1609+ target='/etc/sv/spadesd/run',
1610+ perms=0555),
1611+ 'start': runit_start,
1612+ 'stop': runit_stop,
1613+ },
1614+ ])
1615+ manager.manage()
1616+ """
1617+ self._ready_file = os.path.join(hookenv.charm_dir(), 'READY-SERVICES.json')
1618+ self._ready = None
1619+ self.services = {}
1620+ for service in services or []:
1621+ service_name = service['service']
1622+ self.services[service_name] = service
1623+
1624+ def manage(self):
1625+ """
1626+ Handle the current hook by doing The Right Thing with the registered services.
1627+ """
1628+ hook_name = hookenv.hook_name()
1629+ if hook_name == 'stop':
1630+ self.stop_services()
1631+ else:
1632+ self.provide_data()
1633+ self.reconfigure_services()
1634+ cfg = hookenv.config()
1635+ if cfg.implicit_save:
1636+ cfg.save()
1637+
1638+ def provide_data(self):
1639+ """
1640+ Set the relation data for each provider in the ``provided_data`` list.
1641+
1642+ A provider must have a `name` attribute, which indicates which relation
1643+ to set data on, and a `provide_data()` method, which returns a dict of
1644+ data to set.
1645+ """
1646+ hook_name = hookenv.hook_name()
1647+ for service in self.services.values():
1648+ for provider in service.get('provided_data', []):
1649+ if re.match(r'{}-relation-(joined|changed)'.format(provider.name), hook_name):
1650+ data = provider.provide_data()
1651+ _ready = provider._is_ready(data) if hasattr(provider, '_is_ready') else data
1652+ if _ready:
1653+ hookenv.relation_set(None, data)
1654+
1655+ def reconfigure_services(self, *service_names):
1656+ """
1657+ Update all files for one or more registered services, and,
1658+ if ready, optionally restart them.
1659+
1660+ If no service names are given, reconfigures all registered services.
1661+ """
1662+ for service_name in service_names or self.services.keys():
1663+ if self.is_ready(service_name):
1664+ self.fire_event('data_ready', service_name)
1665+ self.fire_event('start', service_name, default=[
1666+ service_restart,
1667+ manage_ports])
1668+ self.save_ready(service_name)
1669+ else:
1670+ if self.was_ready(service_name):
1671+ self.fire_event('data_lost', service_name)
1672+ self.fire_event('stop', service_name, default=[
1673+ manage_ports,
1674+ service_stop])
1675+ self.save_lost(service_name)
1676+
1677+ def stop_services(self, *service_names):
1678+ """
1679+ Stop one or more registered services, by name.
1680+
1681+ If no service names are given, stops all registered services.
1682+ """
1683+ for service_name in service_names or self.services.keys():
1684+ self.fire_event('stop', service_name, default=[
1685+ manage_ports,
1686+ service_stop])
1687+
1688+ def get_service(self, service_name):
1689+ """
1690+ Given the name of a registered service, return its service definition.
1691+ """
1692+ service = self.services.get(service_name)
1693+ if not service:
1694+ raise KeyError('Service not registered: %s' % service_name)
1695+ return service
1696+
1697+ def fire_event(self, event_name, service_name, default=None):
1698+ """
1699+ Fire a data_ready, data_lost, start, or stop event on a given service.
1700+ """
1701+ service = self.get_service(service_name)
1702+ callbacks = service.get(event_name, default)
1703+ if not callbacks:
1704+ return
1705+ if not isinstance(callbacks, Iterable):
1706+ callbacks = [callbacks]
1707+ for callback in callbacks:
1708+ if isinstance(callback, ManagerCallback):
1709+ callback(self, service_name, event_name)
1710+ else:
1711+ callback(service_name)
1712+
1713+ def is_ready(self, service_name):
1714+ """
1715+ Determine if a registered service is ready, by checking its 'required_data'.
1716+
1717+ A 'required_data' item can be any mapping type, and is considered ready
1718+ if `bool(item)` evaluates as True.
1719+ """
1720+ service = self.get_service(service_name)
1721+ reqs = service.get('required_data', [])
1722+ return all(bool(req) for req in reqs)
1723+
1724+ def _load_ready_file(self):
1725+ if self._ready is not None:
1726+ return
1727+ if os.path.exists(self._ready_file):
1728+ with open(self._ready_file) as fp:
1729+ self._ready = set(json.load(fp))
1730+ else:
1731+ self._ready = set()
1732+
1733+ def _save_ready_file(self):
1734+ if self._ready is None:
1735+ return
1736+ with open(self._ready_file, 'w') as fp:
1737+ json.dump(list(self._ready), fp)
1738+
1739+ def save_ready(self, service_name):
1740+ """
1741+ Save an indicator that the given service is now data_ready.
1742+ """
1743+ self._load_ready_file()
1744+ self._ready.add(service_name)
1745+ self._save_ready_file()
1746+
1747+ def save_lost(self, service_name):
1748+ """
1749+ Save an indicator that the given service is no longer data_ready.
1750+ """
1751+ self._load_ready_file()
1752+ self._ready.discard(service_name)
1753+ self._save_ready_file()
1754+
1755+ def was_ready(self, service_name):
1756+ """
1757+ Determine if the given service was previously data_ready.
1758+ """
1759+ self._load_ready_file()
1760+ return service_name in self._ready
1761+
1762+
1763+class ManagerCallback(object):
1764+ """
1765+ Special case of a callback that takes the `ServiceManager` instance
1766+ in addition to the service name.
1767+
1768+ Subclasses should implement `__call__` which should accept three parameters:
1769+
1770+ * `manager` The `ServiceManager` instance
1771+ * `service_name` The name of the service it's being triggered for
1772+ * `event_name` The name of the event that this callback is handling
1773+ """
1774+ def __call__(self, manager, service_name, event_name):
1775+ raise NotImplementedError()
1776+
1777+
1778+class PortManagerCallback(ManagerCallback):
1779+ """
1780+ Callback class that will open or close ports, for use as either
1781+ a start or stop action.
1782+ """
1783+ def __call__(self, manager, service_name, event_name):
1784+ service = manager.get_service(service_name)
1785+ new_ports = service.get('ports', [])
1786+ port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name))
1787+ if os.path.exists(port_file):
1788+ with open(port_file) as fp:
1789+ old_ports = fp.read().split(',')
1790+ for old_port in old_ports:
1791+ if bool(old_port):
1792+ old_port = int(old_port)
1793+ if old_port not in new_ports:
1794+ hookenv.close_port(old_port)
1795+ with open(port_file, 'w') as fp:
1796+ fp.write(','.join(str(port) for port in new_ports))
1797+ for port in new_ports:
1798+ if event_name == 'start':
1799+ hookenv.open_port(port)
1800+ elif event_name == 'stop':
1801+ hookenv.close_port(port)
1802+
1803+
1804+def service_stop(service_name):
1805+ """
1806+ Wrapper around host.service_stop to prevent spurious "unknown service"
1807+ messages in the logs.
1808+ """
1809+ if host.service_running(service_name):
1810+ host.service_stop(service_name)
1811+
1812+
1813+def service_restart(service_name):
1814+ """
1815+ Wrapper around host.service_restart to prevent spurious "unknown service"
1816+ messages in the logs.
1817+ """
1818+ if host.service_available(service_name):
1819+ if host.service_running(service_name):
1820+ host.service_restart(service_name)
1821+ else:
1822+ host.service_start(service_name)
1823+
1824+
1825+# Convenience aliases
1826+open_ports = close_ports = manage_ports = PortManagerCallback()
1827
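The ManagerCallback contract described above (a callable that receives the
manager, the service name and the event name) is easiest to see with a small
subclass. A minimal sketch, assuming the charmhelpers tree added by this branch
is importable from the hooks directory; the LogEvent name is hypothetical and
only illustrative::

    from charmhelpers.core import hookenv
    from charmhelpers.core.services.base import ManagerCallback


    class LogEvent(ManagerCallback):
        """Log every event fired for a service (illustration only)."""

        def __call__(self, manager, service_name, event_name):
            # The manager argument exposes the full service definition.
            ports = manager.get_service(service_name).get('ports', [])
            hookenv.log('{} fired for {} (ports={})'.format(
                event_name, service_name, ports))

An instance could then be listed alongside the stock callbacks in a service's
'data_ready', 'start' or 'stop' lists.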
1828=== added file 'hooks/charmhelpers/core/services/helpers.py'
1829--- hooks/charmhelpers/core/services/helpers.py 1970-01-01 00:00:00 +0000
1830+++ hooks/charmhelpers/core/services/helpers.py 2015-01-20 18:33:44 +0000
1831@@ -0,0 +1,243 @@
1832+import os
1833+import yaml
1834+from charmhelpers.core import hookenv
1835+from charmhelpers.core import templating
1836+
1837+from charmhelpers.core.services.base import ManagerCallback
1838+
1839+
1840+__all__ = ['RelationContext', 'TemplateCallback',
1841+ 'render_template', 'template']
1842+
1843+
1844+class RelationContext(dict):
1845+ """
1846+ Base class for a context generator that gets relation data from juju.
1847+
1848+ Subclasses must provide the attributes `name`, which is the name of the
1849+ interface of interest, `interface`, which is the type of the interface of
1850+ interest, and `required_keys`, which is the set of keys required for the
1851+ relation to be considered complete. The data for all interfaces matching
1852+    the `name` attribute that are complete will be used to populate the dictionary
1853+ values (see `get_data`, below).
1854+
1855+ The generated context will be namespaced under the relation :attr:`name`,
1856+ to prevent potential naming conflicts.
1857+
1858+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
1859+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
1860+ """
1861+ name = None
1862+ interface = None
1863+ required_keys = []
1864+
1865+ def __init__(self, name=None, additional_required_keys=None):
1866+ if name is not None:
1867+ self.name = name
1868+ if additional_required_keys is not None:
1869+ self.required_keys.extend(additional_required_keys)
1870+ self.get_data()
1871+
1872+ def __bool__(self):
1873+ """
1874+ Returns True if all of the required_keys are available.
1875+ """
1876+ return self.is_ready()
1877+
1878+ __nonzero__ = __bool__
1879+
1880+ def __repr__(self):
1881+ return super(RelationContext, self).__repr__()
1882+
1883+ def is_ready(self):
1884+ """
1885+ Returns True if all of the `required_keys` are available from any units.
1886+ """
1887+ ready = len(self.get(self.name, [])) > 0
1888+ if not ready:
1889+ hookenv.log('Incomplete relation: {}'.format(self.__class__.__name__), hookenv.DEBUG)
1890+ return ready
1891+
1892+ def _is_ready(self, unit_data):
1893+ """
1894+ Helper method that tests a set of relation data and returns True if
1895+ all of the `required_keys` are present.
1896+ """
1897+ return set(unit_data.keys()).issuperset(set(self.required_keys))
1898+
1899+ def get_data(self):
1900+ """
1901+ Retrieve the relation data for each unit involved in a relation and,
1902+ if complete, store it in a list under `self[self.name]`. This
1903+ is automatically called when the RelationContext is instantiated.
1904+
1905+        The units are sorted lexicographically first by the service ID, then by
1906+ the unit ID. Thus, if an interface has two other services, 'db:1'
1907+ and 'db:2', with 'db:1' having two units, 'wordpress/0' and 'wordpress/1',
1908+ and 'db:2' having one unit, 'mediawiki/0', all of which have a complete
1909+ set of data, the relation data for the units will be stored in the
1910+ order: 'wordpress/0', 'wordpress/1', 'mediawiki/0'.
1911+
1912+ If you only care about a single unit on the relation, you can just
1913+ access it as `{{ interface[0]['key'] }}`. However, if you can at all
1914+ support multiple units on a relation, you should iterate over the list,
1915+ like::
1916+
1917+ {% for unit in interface -%}
1918+ {{ unit['key'] }}{% if not loop.last %},{% endif %}
1919+ {%- endfor %}
1920+
1921+ Note that since all sets of relation data from all related services and
1922+ units are in a single list, if you need to know which service or unit a
1923+ set of data came from, you'll need to extend this class to preserve
1924+ that information.
1925+ """
1926+ if not hookenv.relation_ids(self.name):
1927+ return
1928+
1929+ ns = self.setdefault(self.name, [])
1930+ for rid in sorted(hookenv.relation_ids(self.name)):
1931+ for unit in sorted(hookenv.related_units(rid)):
1932+ reldata = hookenv.relation_get(rid=rid, unit=unit)
1933+ if self._is_ready(reldata):
1934+ ns.append(reldata)
1935+
1936+ def provide_data(self):
1937+ """
1938+ Return data to be relation_set for this interface.
1939+ """
1940+ return {}
1941+
1942+
1943+class MysqlRelation(RelationContext):
1944+ """
1945+ Relation context for the `mysql` interface.
1946+
1947+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
1948+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
1949+ """
1950+ name = 'db'
1951+ interface = 'mysql'
1952+ required_keys = ['host', 'user', 'password', 'database']
1953+
1954+
1955+class HttpRelation(RelationContext):
1956+ """
1957+ Relation context for the `http` interface.
1958+
1959+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
1960+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
1961+ """
1962+ name = 'website'
1963+ interface = 'http'
1964+ required_keys = ['host', 'port']
1965+
1966+ def provide_data(self):
1967+ return {
1968+ 'host': hookenv.unit_get('private-address'),
1969+ 'port': 80,
1970+ }
1971+
1972+
1973+class RequiredConfig(dict):
1974+ """
1975+ Data context that loads config options with one or more mandatory options.
1976+
1977+ Once the required options have been changed from their default values, all
1978+ config options will be available, namespaced under `config` to prevent
1979+ potential naming conflicts (for example, between a config option and a
1980+ relation property).
1981+
1982+ :param list *args: List of options that must be changed from their default values.
1983+ """
1984+
1985+ def __init__(self, *args):
1986+ self.required_options = args
1987+ self['config'] = hookenv.config()
1988+ with open(os.path.join(hookenv.charm_dir(), 'config.yaml')) as fp:
1989+ self.config = yaml.load(fp).get('options', {})
1990+
1991+ def __bool__(self):
1992+ for option in self.required_options:
1993+ if option not in self['config']:
1994+ return False
1995+ current_value = self['config'][option]
1996+ default_value = self.config[option].get('default')
1997+ if current_value == default_value:
1998+ return False
1999+ if current_value in (None, '') and default_value in (None, ''):
2000+ return False
2001+ return True
2002+
2003+ def __nonzero__(self):
2004+ return self.__bool__()
2005+
2006+
2007+class StoredContext(dict):
2008+ """
2009+ A data context that always returns the data that it was first created with.
2010+
2011+ This is useful to do a one-time generation of things like passwords, that
2012+ will thereafter use the same value that was originally generated, instead
2013+ of generating a new value each time it is run.
2014+ """
2015+ def __init__(self, file_name, config_data):
2016+ """
2017+ If the file exists, populate `self` with the data from the file.
2018+ Otherwise, populate with the given data and persist it to the file.
2019+ """
2020+ if os.path.exists(file_name):
2021+ self.update(self.read_context(file_name))
2022+ else:
2023+ self.store_context(file_name, config_data)
2024+ self.update(config_data)
2025+
2026+ def store_context(self, file_name, config_data):
2027+ if not os.path.isabs(file_name):
2028+ file_name = os.path.join(hookenv.charm_dir(), file_name)
2029+ with open(file_name, 'w') as file_stream:
2030+ os.fchmod(file_stream.fileno(), 0o600)
2031+ yaml.dump(config_data, file_stream)
2032+
2033+ def read_context(self, file_name):
2034+ if not os.path.isabs(file_name):
2035+ file_name = os.path.join(hookenv.charm_dir(), file_name)
2036+ with open(file_name, 'r') as file_stream:
2037+ data = yaml.load(file_stream)
2038+ if not data:
2039+ raise OSError("%s is empty" % file_name)
2040+ return data
2041+
2042+
2043+class TemplateCallback(ManagerCallback):
2044+ """
2045+ Callback class that will render a Jinja2 template, for use as a ready
2046+ action.
2047+
2048+ :param str source: The template source file, relative to
2049+ `$CHARM_DIR/templates`
2050+
2051+ :param str target: The target to write the rendered template to
2052+ :param str owner: The owner of the rendered file
2053+ :param str group: The group of the rendered file
2054+ :param int perms: The permissions of the rendered file
2055+ """
2056+ def __init__(self, source, target,
2057+ owner='root', group='root', perms=0o444):
2058+ self.source = source
2059+ self.target = target
2060+ self.owner = owner
2061+ self.group = group
2062+ self.perms = perms
2063+
2064+ def __call__(self, manager, service_name, event_name):
2065+ service = manager.get_service(service_name)
2066+ context = {}
2067+ for ctx in service.get('required_data', []):
2068+ context.update(ctx)
2069+ templating.render(self.source, self.target, context,
2070+ self.owner, self.group, self.perms)
2071+
2072+
2073+# Convenience aliases for templates
2074+render_template = template = TemplateCallback
2075
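To make the RelationContext pattern above concrete: a charm typically defines
one subclass per interface and lists an instance under a service's
'required_data', so the service is only configured and started once the
relation is complete. A minimal sketch; the AmqpRelation class, its interface
name and the template file are illustrative assumptions, not part of this
branch::

    from charmhelpers.core import services
    from charmhelpers.core.services.helpers import RelationContext


    class AmqpRelation(RelationContext):
        """Ready once a related unit has provided hostname and password."""
        name = 'amqp'
        interface = 'rabbitmq'
        required_keys = ['hostname', 'password']


    manager = services.ServiceManager([{
        'service': 'myapp',
        'required_data': [AmqpRelation()],
        'data_ready': [services.template(source='myapp.conf',
                                         target='/etc/myapp.conf')],
    }])
    manager.manage()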
2076=== added file 'hooks/charmhelpers/core/sysctl.py'
2077--- hooks/charmhelpers/core/sysctl.py 1970-01-01 00:00:00 +0000
2078+++ hooks/charmhelpers/core/sysctl.py 2015-01-20 18:33:44 +0000
2079@@ -0,0 +1,34 @@
2080+#!/usr/bin/env python
2081+# -*- coding: utf-8 -*-
2082+
2083+__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
2084+
2085+import yaml
2086+
2087+from subprocess import check_call
2088+
2089+from charmhelpers.core.hookenv import (
2090+ log,
2091+ DEBUG,
2092+)
2093+
2094+
2095+def create(sysctl_dict, sysctl_file):
2096+ """Creates a sysctl.conf file from a YAML associative array
2097+
2098+    :param sysctl_dict: a YAML-formatted string of sysctl options, e.g. "{ 'kernel.max_pid': 1337 }"
2099+    :type sysctl_dict: str
2100+ :param sysctl_file: path to the sysctl file to be saved
2101+ :type sysctl_file: str or unicode
2102+ :returns: None
2103+ """
2104+ sysctl_dict = yaml.load(sysctl_dict)
2105+
2106+ with open(sysctl_file, "w") as fd:
2107+ for key, value in sysctl_dict.items():
2108+ fd.write("{}={}\n".format(key, value))
2109+
2110+ log("Updating sysctl_file: %s values: %s" % (sysctl_file, sysctl_dict),
2111+ level=DEBUG)
2112+
2113+ check_call(["sysctl", "-p", sysctl_file])
2114
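As the implementation above shows, create() takes the options as a
YAML-encoded string (it is passed through yaml.load before being written out
and applied with ``sysctl -p``), so it must run as root. A minimal usage
sketch with illustrative keys and file name::

    from charmhelpers.core.sysctl import create

    # The mapping is passed as a YAML string, not as a dict object.
    create("{net.ipv4.ip_forward: 1, vm.swappiness: 10}",
           "/etc/sysctl.d/50-example.conf")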
2115=== added file 'hooks/charmhelpers/core/templating.py'
2116--- hooks/charmhelpers/core/templating.py 1970-01-01 00:00:00 +0000
2117+++ hooks/charmhelpers/core/templating.py 2015-01-20 18:33:44 +0000
2118@@ -0,0 +1,52 @@
2119+import os
2120+
2121+from charmhelpers.core import host
2122+from charmhelpers.core import hookenv
2123+
2124+
2125+def render(source, target, context, owner='root', group='root',
2126+ perms=0o444, templates_dir=None):
2127+ """
2128+ Render a template.
2129+
2130+ The `source` path, if not absolute, is relative to the `templates_dir`.
2131+
2132+ The `target` path should be absolute.
2133+
2134+ The context should be a dict containing the values to be replaced in the
2135+ template.
2136+
2137+ The `owner`, `group`, and `perms` options will be passed to `write_file`.
2138+
2139+ If omitted, `templates_dir` defaults to the `templates` folder in the charm.
2140+
2141+ Note: Using this requires python-jinja2; if it is not installed, calling
2142+ this will attempt to use charmhelpers.fetch.apt_install to install it.
2143+ """
2144+ try:
2145+ from jinja2 import FileSystemLoader, Environment, exceptions
2146+ except ImportError:
2147+ try:
2148+ from charmhelpers.fetch import apt_install
2149+ except ImportError:
2150+ hookenv.log('Could not import jinja2, and could not import '
2151+ 'charmhelpers.fetch to install it',
2152+ level=hookenv.ERROR)
2153+ raise
2154+ apt_install('python-jinja2', fatal=True)
2155+ from jinja2 import FileSystemLoader, Environment, exceptions
2156+
2157+ if templates_dir is None:
2158+ templates_dir = os.path.join(hookenv.charm_dir(), 'templates')
2159+ loader = Environment(loader=FileSystemLoader(templates_dir))
2160+ try:
2161+ source = source
2162+ template = loader.get_template(source)
2163+ except exceptions.TemplateNotFound as e:
2164+ hookenv.log('Could not load template %s from %s.' %
2165+ (source, templates_dir),
2166+ level=hookenv.ERROR)
2167+ raise e
2168+ content = template.render(context)
2169+ host.mkdir(os.path.dirname(target), owner, group)
2170+ host.write_file(target, content, owner, group, perms)
2171
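A minimal sketch of calling render() from a hook; it assumes a Jinja2 template
at templates/myapp.conf inside the charm, and the context keys are
illustrative::

    from charmhelpers.core import templating

    templating.render(
        source='myapp.conf',            # resolved under $CHARM_DIR/templates
        target='/etc/myapp.conf',
        context={'hostname': 'example', 'port': 8080},
        perms=0o640,
    )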
2172=== added directory 'hooks/charmhelpers/fetch'
2173=== added file 'hooks/charmhelpers/fetch/__init__.py'
2174--- hooks/charmhelpers/fetch/__init__.py 1970-01-01 00:00:00 +0000
2175+++ hooks/charmhelpers/fetch/__init__.py 2015-01-20 18:33:44 +0000
2176@@ -0,0 +1,423 @@
2177+import importlib
2178+from tempfile import NamedTemporaryFile
2179+import time
2180+from yaml import safe_load
2181+from charmhelpers.core.host import (
2182+ lsb_release
2183+)
2184+import subprocess
2185+from charmhelpers.core.hookenv import (
2186+ config,
2187+ log,
2188+)
2189+import os
2190+
2191+import six
2192+if six.PY3:
2193+ from urllib.parse import urlparse, urlunparse
2194+else:
2195+ from urlparse import urlparse, urlunparse
2196+
2197+
2198+CLOUD_ARCHIVE = """# Ubuntu Cloud Archive
2199+deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
2200+"""
2201+PROPOSED_POCKET = """# Proposed
2202+deb http://archive.ubuntu.com/ubuntu {}-proposed main universe multiverse restricted
2203+"""
2204+CLOUD_ARCHIVE_POCKETS = {
2205+ # Folsom
2206+ 'folsom': 'precise-updates/folsom',
2207+ 'precise-folsom': 'precise-updates/folsom',
2208+ 'precise-folsom/updates': 'precise-updates/folsom',
2209+ 'precise-updates/folsom': 'precise-updates/folsom',
2210+ 'folsom/proposed': 'precise-proposed/folsom',
2211+ 'precise-folsom/proposed': 'precise-proposed/folsom',
2212+ 'precise-proposed/folsom': 'precise-proposed/folsom',
2213+ # Grizzly
2214+ 'grizzly': 'precise-updates/grizzly',
2215+ 'precise-grizzly': 'precise-updates/grizzly',
2216+ 'precise-grizzly/updates': 'precise-updates/grizzly',
2217+ 'precise-updates/grizzly': 'precise-updates/grizzly',
2218+ 'grizzly/proposed': 'precise-proposed/grizzly',
2219+ 'precise-grizzly/proposed': 'precise-proposed/grizzly',
2220+ 'precise-proposed/grizzly': 'precise-proposed/grizzly',
2221+ # Havana
2222+ 'havana': 'precise-updates/havana',
2223+ 'precise-havana': 'precise-updates/havana',
2224+ 'precise-havana/updates': 'precise-updates/havana',
2225+ 'precise-updates/havana': 'precise-updates/havana',
2226+ 'havana/proposed': 'precise-proposed/havana',
2227+ 'precise-havana/proposed': 'precise-proposed/havana',
2228+ 'precise-proposed/havana': 'precise-proposed/havana',
2229+ # Icehouse
2230+ 'icehouse': 'precise-updates/icehouse',
2231+ 'precise-icehouse': 'precise-updates/icehouse',
2232+ 'precise-icehouse/updates': 'precise-updates/icehouse',
2233+ 'precise-updates/icehouse': 'precise-updates/icehouse',
2234+ 'icehouse/proposed': 'precise-proposed/icehouse',
2235+ 'precise-icehouse/proposed': 'precise-proposed/icehouse',
2236+ 'precise-proposed/icehouse': 'precise-proposed/icehouse',
2237+ # Juno
2238+ 'juno': 'trusty-updates/juno',
2239+ 'trusty-juno': 'trusty-updates/juno',
2240+ 'trusty-juno/updates': 'trusty-updates/juno',
2241+ 'trusty-updates/juno': 'trusty-updates/juno',
2242+ 'juno/proposed': 'trusty-proposed/juno',
2243+ 'trusty-juno/proposed': 'trusty-proposed/juno',
2244+ 'trusty-proposed/juno': 'trusty-proposed/juno',
2245+ # Kilo
2246+ 'kilo': 'trusty-updates/kilo',
2247+ 'trusty-kilo': 'trusty-updates/kilo',
2248+ 'trusty-kilo/updates': 'trusty-updates/kilo',
2249+ 'trusty-updates/kilo': 'trusty-updates/kilo',
2250+ 'kilo/proposed': 'trusty-proposed/kilo',
2251+ 'trusty-kilo/proposed': 'trusty-proposed/kilo',
2252+ 'trusty-proposed/kilo': 'trusty-proposed/kilo',
2253+}
2254+
2255+# The order of this list is very important. Handlers should be listed from
2256+# least- to most-specific URL matching.
2257+FETCH_HANDLERS = (
2258+ 'charmhelpers.fetch.archiveurl.ArchiveUrlFetchHandler',
2259+ 'charmhelpers.fetch.bzrurl.BzrUrlFetchHandler',
2260+ 'charmhelpers.fetch.giturl.GitUrlFetchHandler',
2261+)
2262+
2263+APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT.
2264+APT_NO_LOCK_RETRY_DELAY = 10 # Wait 10 seconds between apt lock checks.
2265+APT_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times.
2266+
2267+
2268+class SourceConfigError(Exception):
2269+ pass
2270+
2271+
2272+class UnhandledSource(Exception):
2273+ pass
2274+
2275+
2276+class AptLockError(Exception):
2277+ pass
2278+
2279+
2280+class BaseFetchHandler(object):
2281+
2282+ """Base class for FetchHandler implementations in fetch plugins"""
2283+
2284+ def can_handle(self, source):
2285+ """Returns True if the source can be handled. Otherwise returns
2286+ a string explaining why it cannot"""
2287+ return "Wrong source type"
2288+
2289+ def install(self, source):
2290+ """Try to download and unpack the source. Return the path to the
2291+ unpacked files or raise UnhandledSource."""
2292+ raise UnhandledSource("Wrong source type {}".format(source))
2293+
2294+ def parse_url(self, url):
2295+ return urlparse(url)
2296+
2297+ def base_url(self, url):
2298+ """Return url without querystring or fragment"""
2299+ parts = list(self.parse_url(url))
2300+ parts[4:] = ['' for i in parts[4:]]
2301+ return urlunparse(parts)
2302+
2303+
2304+def filter_installed_packages(packages):
2305+ """Returns a list of packages that require installation"""
2306+ cache = apt_cache()
2307+ _pkgs = []
2308+ for package in packages:
2309+ try:
2310+ p = cache[package]
2311+ p.current_ver or _pkgs.append(package)
2312+ except KeyError:
2313+ log('Package {} has no installation candidate.'.format(package),
2314+ level='WARNING')
2315+ _pkgs.append(package)
2316+ return _pkgs
2317+
2318+
2319+def apt_cache(in_memory=True):
2320+ """Build and return an apt cache"""
2321+ import apt_pkg
2322+ apt_pkg.init()
2323+ if in_memory:
2324+ apt_pkg.config.set("Dir::Cache::pkgcache", "")
2325+ apt_pkg.config.set("Dir::Cache::srcpkgcache", "")
2326+ return apt_pkg.Cache()
2327+
2328+
2329+def apt_install(packages, options=None, fatal=False):
2330+ """Install one or more packages"""
2331+ if options is None:
2332+ options = ['--option=Dpkg::Options::=--force-confold']
2333+
2334+ cmd = ['apt-get', '--assume-yes']
2335+ cmd.extend(options)
2336+ cmd.append('install')
2337+ if isinstance(packages, six.string_types):
2338+ cmd.append(packages)
2339+ else:
2340+ cmd.extend(packages)
2341+ log("Installing {} with options: {}".format(packages,
2342+ options))
2343+ _run_apt_command(cmd, fatal)
2344+
2345+
2346+def apt_upgrade(options=None, fatal=False, dist=False):
2347+ """Upgrade all packages"""
2348+ if options is None:
2349+ options = ['--option=Dpkg::Options::=--force-confold']
2350+
2351+ cmd = ['apt-get', '--assume-yes']
2352+ cmd.extend(options)
2353+ if dist:
2354+ cmd.append('dist-upgrade')
2355+ else:
2356+ cmd.append('upgrade')
2357+ log("Upgrading with options: {}".format(options))
2358+ _run_apt_command(cmd, fatal)
2359+
2360+
2361+def apt_update(fatal=False):
2362+ """Update local apt cache"""
2363+ cmd = ['apt-get', 'update']
2364+ _run_apt_command(cmd, fatal)
2365+
2366+
2367+def apt_purge(packages, fatal=False):
2368+ """Purge one or more packages"""
2369+ cmd = ['apt-get', '--assume-yes', 'purge']
2370+ if isinstance(packages, six.string_types):
2371+ cmd.append(packages)
2372+ else:
2373+ cmd.extend(packages)
2374+ log("Purging {}".format(packages))
2375+ _run_apt_command(cmd, fatal)
2376+
2377+
2378+def apt_hold(packages, fatal=False):
2379+ """Hold one or more packages"""
2380+ cmd = ['apt-mark', 'hold']
2381+ if isinstance(packages, six.string_types):
2382+ cmd.append(packages)
2383+ else:
2384+ cmd.extend(packages)
2385+ log("Holding {}".format(packages))
2386+
2387+ if fatal:
2388+ subprocess.check_call(cmd)
2389+ else:
2390+ subprocess.call(cmd)
2391+
2392+
2393+def add_source(source, key=None):
2394+ """Add a package source to this system.
2395+
2396+ @param source: a URL or sources.list entry, as supported by
2397+ add-apt-repository(1). Examples::
2398+
2399+ ppa:charmers/example
2400+ deb https://stub:key@private.example.com/ubuntu trusty main
2401+
2402+ In addition:
2403+ 'proposed:' may be used to enable the standard 'proposed'
2404+ pocket for the release.
2405+ 'cloud:' may be used to activate official cloud archive pockets,
2406+ such as 'cloud:icehouse'
2407+ 'distro' may be used as a noop
2408+
2409+ @param key: A key to be added to the system's APT keyring and used
2410+ to verify the signatures on packages. Ideally, this should be an
2411+ ASCII format GPG public key including the block headers. A GPG key
2412+ id may also be used, but be aware that only insecure protocols are
2413+    available to retrieve the actual public key from a public keyserver,
2414+    placing your Juju environment at risk. PPA and cloud archive keys
2415+    are securely added automatically, so should not be provided.
2416+ """
2417+ if source is None:
2418+ log('Source is not present. Skipping')
2419+ return
2420+
2421+ if (source.startswith('ppa:') or
2422+ source.startswith('http') or
2423+ source.startswith('deb ') or
2424+ source.startswith('cloud-archive:')):
2425+ subprocess.check_call(['add-apt-repository', '--yes', source])
2426+ elif source.startswith('cloud:'):
2427+ apt_install(filter_installed_packages(['ubuntu-cloud-keyring']),
2428+ fatal=True)
2429+ pocket = source.split(':')[-1]
2430+ if pocket not in CLOUD_ARCHIVE_POCKETS:
2431+ raise SourceConfigError(
2432+ 'Unsupported cloud: source option %s' %
2433+ pocket)
2434+ actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
2435+ with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
2436+ apt.write(CLOUD_ARCHIVE.format(actual_pocket))
2437+ elif source == 'proposed':
2438+ release = lsb_release()['DISTRIB_CODENAME']
2439+ with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
2440+ apt.write(PROPOSED_POCKET.format(release))
2441+ elif source == 'distro':
2442+ pass
2443+ else:
2444+ log("Unknown source: {!r}".format(source))
2445+
2446+ if key:
2447+ if '-----BEGIN PGP PUBLIC KEY BLOCK-----' in key:
2448+ with NamedTemporaryFile('w+') as key_file:
2449+ key_file.write(key)
2450+ key_file.flush()
2451+ key_file.seek(0)
2452+ subprocess.check_call(['apt-key', 'add', '-'], stdin=key_file)
2453+ else:
2454+ # Note that hkp: is in no way a secure protocol. Using a
2455+ # GPG key id is pointless from a security POV unless you
2456+ # absolutely trust your network and DNS.
2457+ subprocess.check_call(['apt-key', 'adv', '--keyserver',
2458+ 'hkp://keyserver.ubuntu.com:80', '--recv',
2459+ key])
2460+
2461+
2462+def configure_sources(update=False,
2463+ sources_var='install_sources',
2464+ keys_var='install_keys'):
2465+ """
2466+ Configure multiple sources from charm configuration.
2467+
2468+ The lists are encoded as yaml fragments in the configuration.
2469+    The fragment needs to be included as a string. Sources and their
2470+ corresponding keys are of the types supported by add_source().
2471+
2472+ Example config:
2473+ install_sources: |
2474+ - "ppa:foo"
2475+ - "http://example.com/repo precise main"
2476+ install_keys: |
2477+ - null
2478+ - "a1b2c3d4"
2479+
2480+ Note that 'null' (a.k.a. None) should not be quoted.
2481+ """
2482+ sources = safe_load((config(sources_var) or '').strip()) or []
2483+ keys = safe_load((config(keys_var) or '').strip()) or None
2484+
2485+ if isinstance(sources, six.string_types):
2486+ sources = [sources]
2487+
2488+ if keys is None:
2489+ for source in sources:
2490+ add_source(source, None)
2491+ else:
2492+ if isinstance(keys, six.string_types):
2493+ keys = [keys]
2494+
2495+ if len(sources) != len(keys):
2496+ raise SourceConfigError(
2497+ 'Install sources and keys lists are different lengths')
2498+ for source, key in zip(sources, keys):
2499+ add_source(source, key)
2500+ if update:
2501+ apt_update(fatal=True)
2502+
2503+
2504+def install_remote(source, *args, **kwargs):
2505+ """
2506+ Install a file tree from a remote source
2507+
2508+ The specified source should be a url of the form:
2509+ scheme://[host]/path[#[option=value][&...]]
2510+
2511+    Schemes supported are based on this module's submodules.
2512+ Options supported are submodule-specific.
2513+ Additional arguments are passed through to the submodule.
2514+
2515+ For example::
2516+
2517+ dest = install_remote('http://example.com/archive.tgz',
2518+ checksum='deadbeef',
2519+ hash_type='sha1')
2520+
2521+ This will download `archive.tgz`, validate it using SHA1 and, if
2522+ the file is ok, extract it and return the directory in which it
2523+ was extracted. If the checksum fails, it will raise
2524+ :class:`charmhelpers.core.host.ChecksumError`.
2525+ """
2526+ # We ONLY check for True here because can_handle may return a string
2527+ # explaining why it can't handle a given source.
2528+ handlers = [h for h in plugins() if h.can_handle(source) is True]
2529+ installed_to = None
2530+ for handler in handlers:
2531+ try:
2532+ installed_to = handler.install(source, *args, **kwargs)
2533+ except UnhandledSource:
2534+ pass
2535+ if not installed_to:
2536+ raise UnhandledSource("No handler found for source {}".format(source))
2537+ return installed_to
2538+
2539+
2540+def install_from_config(config_var_name):
2541+ charm_config = config()
2542+ source = charm_config[config_var_name]
2543+ return install_remote(source)
2544+
2545+
2546+def plugins(fetch_handlers=None):
2547+ if not fetch_handlers:
2548+ fetch_handlers = FETCH_HANDLERS
2549+ plugin_list = []
2550+ for handler_name in fetch_handlers:
2551+ package, classname = handler_name.rsplit('.', 1)
2552+ try:
2553+ handler_class = getattr(
2554+ importlib.import_module(package),
2555+ classname)
2556+ plugin_list.append(handler_class())
2557+ except (ImportError, AttributeError):
2558+            # Skip missing plugins so that they can be omitted from
2559+ # installation if desired
2560+ log("FetchHandler {} not found, skipping plugin".format(
2561+ handler_name))
2562+ return plugin_list
2563+
2564+
2565+def _run_apt_command(cmd, fatal=False):
2566+ """
2567+    Run an APT command, checking its exit status and retrying while the apt
2568+    lock cannot be acquired, if the fatal flag is set to True.
2569+
2570+    :param cmd: list: The apt command to run.
2571+    :param fatal: bool: Whether the command's exit status should be checked
2572+        and the call retried while the apt lock is held.
2573+ """
2574+ env = os.environ.copy()
2575+
2576+ if 'DEBIAN_FRONTEND' not in env:
2577+ env['DEBIAN_FRONTEND'] = 'noninteractive'
2578+
2579+ if fatal:
2580+ retry_count = 0
2581+ result = None
2582+
2583+ # If the command is considered "fatal", we need to retry if the apt
2584+ # lock was not acquired.
2585+
2586+ while result is None or result == APT_NO_LOCK:
2587+ try:
2588+ result = subprocess.check_call(cmd, env=env)
2589+ except subprocess.CalledProcessError as e:
2590+ retry_count = retry_count + 1
2591+ if retry_count > APT_NO_LOCK_RETRY_COUNT:
2592+ raise
2593+ result = e.returncode
2594+ log("Couldn't acquire DPKG lock. Will retry in {} seconds."
2595+ "".format(APT_NO_LOCK_RETRY_DELAY))
2596+ time.sleep(APT_NO_LOCK_RETRY_DELAY)
2597+
2598+ else:
2599+ subprocess.call(cmd, env=env)
2600
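A short sketch of how a hook would typically combine the fetch helpers above:
add a source, refresh the package index, then install only the packages that
are not already present. The PPA and package names are illustrative::

    from charmhelpers.fetch import (
        add_source,
        apt_install,
        apt_update,
        filter_installed_packages,
    )

    add_source('ppa:charmers/example')      # also accepts 'cloud:...', 'distro'
    apt_update(fatal=True)
    apt_install(filter_installed_packages(['jenkins', 'default-jre-headless']),
                fatal=True)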
2601=== added file 'hooks/charmhelpers/fetch/archiveurl.py'
2602--- hooks/charmhelpers/fetch/archiveurl.py 1970-01-01 00:00:00 +0000
2603+++ hooks/charmhelpers/fetch/archiveurl.py 2015-01-20 18:33:44 +0000
2604@@ -0,0 +1,145 @@
2605+import os
2606+import hashlib
2607+import re
2608+
2609+import six
2610+if six.PY3:
2611+ from urllib.request import (
2612+ build_opener, install_opener, urlopen, urlretrieve,
2613+ HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
2614+ )
2615+ from urllib.parse import urlparse, urlunparse, parse_qs
2616+ from urllib.error import URLError
2617+else:
2618+ from urllib import urlretrieve
2619+ from urllib2 import (
2620+ build_opener, install_opener, urlopen,
2621+ HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
2622+ URLError
2623+ )
2624+ from urlparse import urlparse, urlunparse, parse_qs
2625+
2626+from charmhelpers.fetch import (
2627+ BaseFetchHandler,
2628+ UnhandledSource
2629+)
2630+from charmhelpers.payload.archive import (
2631+ get_archive_handler,
2632+ extract,
2633+)
2634+from charmhelpers.core.host import mkdir, check_hash
2635+
2636+
2637+def splituser(host):
2638+ '''urllib.splituser(), but six's support of this seems broken'''
2639+ _userprog = re.compile('^(.*)@(.*)$')
2640+ match = _userprog.match(host)
2641+ if match:
2642+ return match.group(1, 2)
2643+ return None, host
2644+
2645+
2646+def splitpasswd(user):
2647+ '''urllib.splitpasswd(), but six's support of this is missing'''
2648+ _passwdprog = re.compile('^([^:]*):(.*)$', re.S)
2649+ match = _passwdprog.match(user)
2650+ if match:
2651+ return match.group(1, 2)
2652+ return user, None
2653+
2654+
2655+class ArchiveUrlFetchHandler(BaseFetchHandler):
2656+ """
2657+ Handler to download archive files from arbitrary URLs.
2658+
2659+ Can fetch from http, https, ftp, and file URLs.
2660+
2661+ Can install either tarballs (.tar, .tgz, .tbz2, etc) or zip files.
2662+
2663+ Installs the contents of the archive in $CHARM_DIR/fetched/.
2664+ """
2665+ def can_handle(self, source):
2666+ url_parts = self.parse_url(source)
2667+ if url_parts.scheme not in ('http', 'https', 'ftp', 'file'):
2668+ return "Wrong source type"
2669+ if get_archive_handler(self.base_url(source)):
2670+ return True
2671+ return False
2672+
2673+ def download(self, source, dest):
2674+ """
2675+ Download an archive file.
2676+
2677+ :param str source: URL pointing to an archive file.
2678+ :param str dest: Local path location to download archive file to.
2679+ """
2680+        # propagate all exceptions
2681+ # URLError, OSError, etc
2682+ proto, netloc, path, params, query, fragment = urlparse(source)
2683+ if proto in ('http', 'https'):
2684+ auth, barehost = splituser(netloc)
2685+ if auth is not None:
2686+ source = urlunparse((proto, barehost, path, params, query, fragment))
2687+ username, password = splitpasswd(auth)
2688+ passman = HTTPPasswordMgrWithDefaultRealm()
2689+ # Realm is set to None in add_password to force the username and password
2690+ # to be used whatever the realm
2691+ passman.add_password(None, source, username, password)
2692+ authhandler = HTTPBasicAuthHandler(passman)
2693+ opener = build_opener(authhandler)
2694+ install_opener(opener)
2695+ response = urlopen(source)
2696+ try:
2697+ with open(dest, 'w') as dest_file:
2698+ dest_file.write(response.read())
2699+ except Exception as e:
2700+ if os.path.isfile(dest):
2701+ os.unlink(dest)
2702+ raise e
2703+
2704+    # Mandatory file validation via SHA1 or MD5 hashing.
2705+ def download_and_validate(self, url, hashsum, validate="sha1"):
2706+ tempfile, headers = urlretrieve(url)
2707+ check_hash(tempfile, hashsum, validate)
2708+ return tempfile
2709+
2710+ def install(self, source, dest=None, checksum=None, hash_type='sha1'):
2711+ """
2712+ Download and install an archive file, with optional checksum validation.
2713+
2714+ The checksum can also be given on the `source` URL's fragment.
2715+ For example::
2716+
2717+ handler.install('http://example.com/file.tgz#sha1=deadbeef')
2718+
2719+ :param str source: URL pointing to an archive file.
2720+ :param str dest: Local destination path to install to. If not given,
2721+ installs to `$CHARM_DIR/archives/archive_file_name`.
2722+ :param str checksum: If given, validate the archive file after download.
2723+ :param str hash_type: Algorithm used to generate `checksum`.
2724+            Can be any hash algorithm supported by :mod:`hashlib`,
2725+ such as md5, sha1, sha256, sha512, etc.
2726+
2727+ """
2728+ url_parts = self.parse_url(source)
2729+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), 'fetched')
2730+ if not os.path.exists(dest_dir):
2731+ mkdir(dest_dir, perms=0o755)
2732+ dld_file = os.path.join(dest_dir, os.path.basename(url_parts.path))
2733+ try:
2734+ self.download(source, dld_file)
2735+ except URLError as e:
2736+ raise UnhandledSource(e.reason)
2737+ except OSError as e:
2738+ raise UnhandledSource(e.strerror)
2739+ options = parse_qs(url_parts.fragment)
2740+ for key, value in options.items():
2741+ if not six.PY3:
2742+ algorithms = hashlib.algorithms
2743+ else:
2744+ algorithms = hashlib.algorithms_available
2745+ if key in algorithms:
2746+ check_hash(dld_file, value, key)
2747+ if checksum:
2748+ check_hash(dld_file, checksum, hash_type)
2749+ return extract(dld_file, dest)
2750
2751=== added file 'hooks/charmhelpers/fetch/bzrurl.py'
2752--- hooks/charmhelpers/fetch/bzrurl.py 1970-01-01 00:00:00 +0000
2753+++ hooks/charmhelpers/fetch/bzrurl.py 2015-01-20 18:33:44 +0000
2754@@ -0,0 +1,54 @@
2755+import os
2756+from charmhelpers.fetch import (
2757+ BaseFetchHandler,
2758+ UnhandledSource
2759+)
2760+from charmhelpers.core.host import mkdir
2761+
2762+import six
2763+if six.PY3:
2764+ raise ImportError('bzrlib does not support Python3')
2765+
2766+try:
2767+ from bzrlib.branch import Branch
2768+except ImportError:
2769+ from charmhelpers.fetch import apt_install
2770+ apt_install("python-bzrlib")
2771+ from bzrlib.branch import Branch
2772+
2773+
2774+class BzrUrlFetchHandler(BaseFetchHandler):
2775+ """Handler for bazaar branches via generic and lp URLs"""
2776+ def can_handle(self, source):
2777+ url_parts = self.parse_url(source)
2778+ if url_parts.scheme not in ('bzr+ssh', 'lp'):
2779+ return False
2780+ else:
2781+ return True
2782+
2783+ def branch(self, source, dest):
2784+ url_parts = self.parse_url(source)
2785+ # If we use lp:branchname scheme we need to load plugins
2786+ if not self.can_handle(source):
2787+ raise UnhandledSource("Cannot handle {}".format(source))
2788+ if url_parts.scheme == "lp":
2789+ from bzrlib.plugin import load_plugins
2790+ load_plugins()
2791+ try:
2792+ remote_branch = Branch.open(source)
2793+ remote_branch.bzrdir.sprout(dest).open_branch()
2794+ except Exception as e:
2795+ raise e
2796+
2797+ def install(self, source):
2798+ url_parts = self.parse_url(source)
2799+ branch_name = url_parts.path.strip("/").split("/")[-1]
2800+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched",
2801+ branch_name)
2802+ if not os.path.exists(dest_dir):
2803+ mkdir(dest_dir, perms=0o755)
2804+ try:
2805+ self.branch(source, dest_dir)
2806+ except OSError as e:
2807+ raise UnhandledSource(e.strerror)
2808+ return dest_dir
2809
2810=== added file 'hooks/charmhelpers/fetch/giturl.py'
2811--- hooks/charmhelpers/fetch/giturl.py 1970-01-01 00:00:00 +0000
2812+++ hooks/charmhelpers/fetch/giturl.py 2015-01-20 18:33:44 +0000
2813@@ -0,0 +1,51 @@
2814+import os
2815+from charmhelpers.fetch import (
2816+ BaseFetchHandler,
2817+ UnhandledSource
2818+)
2819+from charmhelpers.core.host import mkdir
2820+
2821+import six
2822+if six.PY3:
2823+ raise ImportError('GitPython does not support Python 3')
2824+
2825+try:
2826+ from git import Repo
2827+except ImportError:
2828+ from charmhelpers.fetch import apt_install
2829+ apt_install("python-git")
2830+ from git import Repo
2831+
2832+
2833+class GitUrlFetchHandler(BaseFetchHandler):
2834+ """Handler for git branches via generic and github URLs"""
2835+ def can_handle(self, source):
2836+ url_parts = self.parse_url(source)
2837+ # TODO (mattyw) no support for ssh git@ yet
2838+ if url_parts.scheme not in ('http', 'https', 'git'):
2839+ return False
2840+ else:
2841+ return True
2842+
2843+ def clone(self, source, dest, branch):
2844+ if not self.can_handle(source):
2845+ raise UnhandledSource("Cannot handle {}".format(source))
2846+
2847+ repo = Repo.clone_from(source, dest)
2848+ repo.git.checkout(branch)
2849+
2850+ def install(self, source, branch="master", dest=None):
2851+ url_parts = self.parse_url(source)
2852+ branch_name = url_parts.path.strip("/").split("/")[-1]
2853+ if dest:
2854+ dest_dir = os.path.join(dest, branch_name)
2855+ else:
2856+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched",
2857+ branch_name)
2858+ if not os.path.exists(dest_dir):
2859+ mkdir(dest_dir, perms=0o755)
2860+ try:
2861+ self.clone(source, dest_dir, branch)
2862+ except OSError as e:
2863+ raise UnhandledSource(e.strerror)
2864+ return dest_dir
2865
2866=== added directory 'hooks/charmhelpers/payload'
2867=== added file 'hooks/charmhelpers/payload/__init__.py'
2868--- hooks/charmhelpers/payload/__init__.py 1970-01-01 00:00:00 +0000
2869+++ hooks/charmhelpers/payload/__init__.py 2015-01-20 18:33:44 +0000
2870@@ -0,0 +1,1 @@
2871+"Tools for working with files injected into a charm just before deployment."
2872
2873=== added file 'hooks/charmhelpers/payload/execd.py'
2874--- hooks/charmhelpers/payload/execd.py 1970-01-01 00:00:00 +0000
2875+++ hooks/charmhelpers/payload/execd.py 2015-01-20 18:33:44 +0000
2876@@ -0,0 +1,50 @@
2877+#!/usr/bin/env python
2878+
2879+import os
2880+import sys
2881+import subprocess
2882+from charmhelpers.core import hookenv
2883+
2884+
2885+def default_execd_dir():
2886+ return os.path.join(os.environ['CHARM_DIR'], 'exec.d')
2887+
2888+
2889+def execd_module_paths(execd_dir=None):
2890+ """Generate a list of full paths to modules within execd_dir."""
2891+ if not execd_dir:
2892+ execd_dir = default_execd_dir()
2893+
2894+ if not os.path.exists(execd_dir):
2895+ return
2896+
2897+ for subpath in os.listdir(execd_dir):
2898+ module = os.path.join(execd_dir, subpath)
2899+ if os.path.isdir(module):
2900+ yield module
2901+
2902+
2903+def execd_submodule_paths(command, execd_dir=None):
2904+    """Generate a list of full paths to the specified command within execd_dir.
2905+ """
2906+ for module_path in execd_module_paths(execd_dir):
2907+ path = os.path.join(module_path, command)
2908+ if os.access(path, os.X_OK) and os.path.isfile(path):
2909+ yield path
2910+
2911+
2912+def execd_run(command, execd_dir=None, die_on_error=False, stderr=None):
2913+ """Run command for each module within execd_dir which defines it."""
2914+ for submodule_path in execd_submodule_paths(command, execd_dir):
2915+ try:
2916+ subprocess.check_call(submodule_path, shell=True, stderr=stderr)
2917+ except subprocess.CalledProcessError as e:
2918+ hookenv.log("Error ({}) running {}. Output: {}".format(
2919+ e.returncode, e.cmd, e.output))
2920+ if die_on_error:
2921+ sys.exit(e.returncode)
2922+
2923+
2924+def execd_preinstall(execd_dir=None):
2925+ """Run charm-pre-install for each module within execd_dir."""
2926+ execd_run('charm-pre-install', execd_dir=execd_dir)
2927
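The execd helpers above implement the hook-overlay mechanism that the new
install hook relies on: each module is a subdirectory containing an executable
named charm-pre-install, and every such executable is run before installation.
A minimal sketch of invoking it directly; note that jenkins_hooks.py below
passes 'hooks/install.d' rather than the default exec.d directory::

    from charmhelpers.payload.execd import execd_preinstall

    # With no argument this scans $CHARM_DIR/exec.d/<module>/charm-pre-install;
    # the jenkins install hook points it at the charm's hooks/install.d overlay.
    execd_preinstall('hooks/install.d')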
2928=== modified file 'hooks/config-changed'
2929--- hooks/config-changed 2011-09-22 14:46:56 +0000
2930+++ hooks/config-changed 1970-01-01 00:00:00 +0000
2931@@ -1,7 +0,0 @@
2932-#!/bin/sh
2933-set -e
2934-
2935-home=`dirname $0`
2936-
2937-juju-log "Reconfiguring charm by installing hook again."
2938-exec $home/install
2939
2940=== target is u'jenkins_hooks.py'
2941=== removed file 'hooks/delnode'
2942--- hooks/delnode 2011-09-22 14:46:56 +0000
2943+++ hooks/delnode 1970-01-01 00:00:00 +0000
2944@@ -1,16 +0,0 @@
2945-#!/usr/bin/python
2946-
2947-import jenkins
2948-import sys
2949-
2950-host=sys.argv[1]
2951-username=sys.argv[2]
2952-password=sys.argv[3]
2953-
2954-l_jenkins = jenkins.Jenkins("http://localhost:8080/",username,password)
2955-
2956-if l_jenkins.node_exists(host):
2957- print "Node exists"
2958- l_jenkins.delete_node(host)
2959-else:
2960- print "Node does not exist - not deleting"
2961
2962=== modified file 'hooks/install'
2963--- hooks/install 2014-04-17 12:35:18 +0000
2964+++ hooks/install 1970-01-01 00:00:00 +0000
2965@@ -1,151 +0,0 @@
2966-#!/bin/bash
2967-
2968-set -eu
2969-
2970-RELEASE=$(config-get release)
2971-ADMIN_USERNAME=$(config-get username)
2972-ADMIN_PASSWORD=$(config-get password)
2973-PLUGINS=$(config-get plugins)
2974-PLUGINS_SITE=$(config-get plugins-site)
2975-PLUGINS_CHECK_CERT=$(config-get plugins-check-certificate)
2976-REMOVE_UNLISTED_PLUGINS=$(config-get remove-unlisted-plugins)
2977-CWD=$(dirname $0)
2978-JENKINS_HOME=/var/lib/jenkins
2979-
2980-setup_source () {
2981- # Do something with < Oneiric releases - maybe PPA
2982- # apt-get -y install python-software-properties
2983- # add-apt-repository ppa:hudson-ubuntu/testing
2984- juju-log "Configuring source of jenkins as $RELEASE"
2985- # Configure to use upstream archives
2986- # lts - debian-stable
2987- # trunk - debian
2988- case $RELEASE in
2989- lts)
2990- SOURCE="debian-stable";;
2991- trunk)
2992- SOURCE="debian";;
2993- *)
2994- juju-log "release configuration not recognised" && exit 1;;
2995- esac
2996- # Setup archive to use appropriate jenkins upstream
2997- wget -q -O - http://pkg.jenkins-ci.org/$SOURCE/jenkins-ci.org.key | apt-key add -
2998- echo "deb http://pkg.jenkins-ci.org/$SOURCE binary/" \
2999- > /etc/apt/sources.list.d/jenkins.list
3000- apt-get update || true
3001-}
3002-# Only setup the source if jenkins is not already installed
3003-# this makes the config 'release' immutable - i.e. you
3004-# can change source once deployed
3005-[[ -d /var/lib/jenkins ]] || setup_source
3006-
3007-# Install jenkins
3008-install_jenkins () {
3009- juju-log "Installing/upgrading jenkins..."
3010- apt-get -y install -qq jenkins default-jre-headless
3011-}
3012-# Re-run whenever called to pickup any updates
3013-install_jenkins
3014-
3015-configure_jenkins_user () {
3016- juju-log "Configuring user for jenkins..."
3017- # Check to see if password provided
3018- if [ -z "$ADMIN_PASSWORD" ]
3019- then
3020- # Generate a random one for security
3021- # User can then override using juju set
3022- ADMIN_PASSWORD=$(< /dev/urandom tr -dc A-Za-z | head -c16)
3023- echo $ADMIN_PASSWORD > $JENKINS_HOME/.admin_password
3024- chmod 0600 $JENKINS_HOME/.admin_password
3025- fi
3026- # Generate Salt and Hash Password for Jenkins
3027- SALT="$(< /dev/urandom tr -dc A-Za-z | head -c6)"
3028- PASSWORD="$SALT:$(echo -n "$ADMIN_PASSWORD{$SALT}" | shasum -a 256 | awk '{ print $1 }')"
3029- mkdir -p $JENKINS_HOME/users/$ADMIN_USERNAME
3030- sed -e s#__USERNAME__#$ADMIN_USERNAME# -e s#__PASSWORD__#$PASSWORD# \
3031- $CWD/../templates/user-config.xml > $JENKINS_HOME/users/$ADMIN_USERNAME/config.xml
3032- chown -R jenkins:nogroup $JENKINS_HOME/users
3033-}
3034-# Always run - even if config has not changed, its safe
3035-configure_jenkins_user
3036-
3037-boostrap_jenkins_configuration (){
3038- juju-log "Bootstrapping secure initial configuration in Jenkins..."
3039- cp $CWD/../templates/jenkins-config.xml $JENKINS_HOME/config.xml
3040- chown jenkins:nogroup $JENKINS_HOME/config.xml
3041- touch /var/lib/jenkins/config.bootstrapped
3042-}
3043-# Only run on first invocation otherwise we blast
3044-# any configuration changes made
3045-[[ -f /var/lib/jenkins/config.bootstrapped ]] || boostrap_jenkins_configuration
3046-
3047-install_plugins(){
3048- juju-log "Installing plugins ($PLUGINS)"
3049- mkdir -p $JENKINS_HOME/plugins
3050- chmod a+rx $JENKINS_HOME/plugins
3051- chown jenkins:nogroup $JENKINS_HOME/plugins
3052- track_dir=`mktemp -d /tmp/plugins.installed.XXXXXXXX`
3053- installed_plugins=`find $JENKINS_HOME/plugins -name '*.hpi'`
3054- [ -z "$installed_plugins" ] || ln -s $installed_plugins $track_dir
3055- local plugin=""
3056- local plugin_file=""
3057- local opts=""
3058- pushd $JENKINS_HOME/plugins
3059- for plugin in $PLUGINS ; do
3060- plugin_file=$JENKINS_HOME/plugins/$plugin.hpi
3061- # Note that by default wget verifies certificates as of 1.10.
3062- if [ "$PLUGINS_CHECK_CERT" = "no" ] ; then
3063- opts="--no-check-certificate"
3064- fi
3065- wget $opts --timestamping $PLUGINS_SITE/latest/$plugin.hpi
3066- chmod a+r $plugin_file
3067- rm -f $track_dir/$plugin.hpi
3068- done
3069- popd
3070- # Warn about undesirable plugins, or remove them.
3071- unlisted_plugins=`ls $track_dir`
3072- [[ -n "$unlisted_plugins" ]] || return 0
3073- if [[ $REMOVE_UNLISTED_PLUGINS = "yes" ]] ; then
3074- for plugin_file in `ls $track_dir` ; do
3075- rm -vf $JENKINS_HOME/plugins/$plugin_file
3076- done
3077- else
3078- juju-log -l WARNING "Unlisted plugins: (`ls $track_dir`) Not removed. Set remove-unlisted-plugins to yes to clear them away."
3079- fi
3080-}
3081-
3082-install_plugins
3083-
3084-juju-log "Restarting jenkins to pickup configuration changes"
3085-service jenkins restart
3086-
3087-# Install helpers - python jenkins ++
3088-install_python_jenkins () {
3089- juju-log "Installing python-jenkins..."
3090- apt-get -y install -qq python-jenkins
3091-}
3092-# Only install once
3093-[[ -d /usr/share/pyshared/jenkins ]] || install_python_jenkins
3094-
3095-# Install some tools - can get set up deployment time
3096-install_tools () {
3097- juju-log "Installing tools..."
3098- apt-get -y install -qq `config-get tools`
3099-}
3100-# Always run - tools might get re-configured
3101-install_tools
3102-
3103-juju-log "Opening ports"
3104-open-port 8080
3105-
3106-# Execute any hook overlay which may be provided
3107-# by forks of this charm
3108-if [ -d hooks/install.d ]
3109-then
3110- for i in `ls -1 hooks/install.d/*`
3111- do
3112- [[ -x $i ]] && . ./$i
3113- done
3114-fi
3115-
3116-exit 0
3117
3118=== target is u'jenkins_hooks.py'
3119=== added file 'hooks/jenkins_hooks.py'
3120--- hooks/jenkins_hooks.py 1970-01-01 00:00:00 +0000
3121+++ hooks/jenkins_hooks.py 2015-01-20 18:33:44 +0000
3122@@ -0,0 +1,220 @@
3123+#!/usr/bin/python
3124+import grp
3125+import hashlib
3126+import os
3127+import pwd
3128+import shutil
3129+import subprocess
3130+import sys
3131+
3132+from charmhelpers.core.hookenv import (
3133+ Hooks,
3134+ UnregisteredHookError,
3135+ config,
3136+ remote_unit,
3137+ relation_get,
3138+ relation_set,
3139+ relation_ids,
3140+ unit_get,
3141+ open_port,
3142+ log,
3143+ DEBUG,
3144+ INFO,
3145+)
3146+from charmhelpers.fetch import apt_install
3147+from charmhelpers.core.host import (
3148+ service_start,
3149+ service_stop,
3150+)
3151+from charmhelpers.payload.execd import execd_preinstall
3152+from jenkins_utils import (
3153+ JENKINS_HOME,
3154+ JENKINS_USERS,
3155+ TEMPLATES_DIR,
3156+ add_node,
3157+ del_node,
3158+ setup_source,
3159+ install_jenkins_plugins,
3160+)
3161+
3162+hooks = Hooks()
3163+
3164+
3165+@hooks.hook('install')
3166+def install():
3167+ execd_preinstall('hooks/install.d')
3168+ # Only setup the source if jenkins is not already installed i.e. makes the
3169+ # config 'release' immutable so you can't change source once deployed
3170+ setup_source(config('release'))
3171+ config_changed()
3172+ open_port(8080)
3173+
3174+
3175+@hooks.hook('config-changed')
3176+def config_changed():
3177+ # Re-run whenever called to pickup any updates
3178+ log("Installing/upgrading jenkins.", level=DEBUG)
3179+ apt_install(['jenkins', 'default-jre-headless', 'pwgen'], fatal=True)
3180+
3181+    # Always run - even if config has not changed, it's safe
3182+ log("Configuring user for jenkins.", level=DEBUG)
3183+ # Check to see if password provided
3184+ admin_passwd = config('password')
3185+ if not admin_passwd:
3186+ # Generate a random one for security. User can then override using juju
3187+ # set.
3188+ admin_passwd = subprocess.check_output(['pwgen', '-N1', '15'])
3189+ admin_passwd = admin_passwd.strip()
3190+
3191+ passwd_file = os.path.join(JENKINS_HOME, '.admin_password')
3192+ with open(passwd_file, 'w+') as fd:
3193+ fd.write(admin_passwd)
3194+
3195+    os.chmod(passwd_file, 0o600)
3196+
3197+ jenkins_uid = pwd.getpwnam('jenkins').pw_uid
3198+ jenkins_gid = grp.getgrnam('jenkins').gr_gid
3199+ nogroup_gid = grp.getgrnam('nogroup').gr_gid
3200+
3201+ # Generate Salt and Hash Password for Jenkins
3202+ salt = subprocess.check_output(['pwgen', '-N1', '6']).strip()
3203+ csum = hashlib.sha256("%s{%s}" % (admin_passwd, salt)).hexdigest()
3204+ salty_password = "%s:%s" % (salt, csum)
3205+
3206+ admin_username = config('username')
3207+ admin_user_home = os.path.join(JENKINS_USERS, admin_username)
3208+ if not os.path.isdir(admin_user_home):
3209+ os.makedirs(admin_user_home, 0o0700)
3210+ os.chown(JENKINS_USERS, jenkins_uid, nogroup_gid)
3211+ os.chown(admin_user_home, jenkins_uid, nogroup_gid)
3212+
3213+ # NOTE: overwriting will destroy any data added by jenkins or via the ui
3214+ admin_user_config = os.path.join(admin_user_home, 'config.xml')
3215+ with open(os.path.join(TEMPLATES_DIR, 'user-config.xml')) as src_fd:
3216+ with open(admin_user_config, 'w') as dst_fd:
3217+ lines = src_fd.readlines()
3218+ for line in lines:
3219+ kvs = {'__USERNAME__': admin_username,
3220+ '__PASSWORD__': salty_password}
3221+
3222+ for key, val in kvs.iteritems():
3223+ if key in line:
3224+ line = line.replace(key, val)
3225+
3226+ dst_fd.write(line)
3227+ os.chown(admin_user_config, jenkins_uid, nogroup_gid)
3228+
3229+ # Only run on first invocation otherwise we blast
3230+ # any configuration changes made
3231+ jenkins_bootstrap_flag = '/var/lib/jenkins/config.bootstrapped'
3232+ if not os.path.exists(jenkins_bootstrap_flag):
3233+ log("Bootstrapping secure initial configuration in Jenkins.",
3234+ level=DEBUG)
3235+ src = os.path.join(TEMPLATES_DIR, 'jenkins-config.xml')
3236+ dst = os.path.join(JENKINS_HOME, 'config.xml')
3237+ shutil.copy(src, dst)
3238+ os.chown(dst, jenkins_uid, nogroup_gid)
3239+ # Touch
3240+ with open(jenkins_bootstrap_flag, 'w'):
3241+ pass
3242+
3243+ log("Stopping jenkins for plugin update(s)", level=DEBUG)
3244+ service_stop('jenkins')
3245+ install_jenkins_plugins(jenkins_uid, jenkins_gid)
3246+ log("Starting jenkins to pickup configuration changes", level=DEBUG)
3247+ service_start('jenkins')
3248+
3249+ apt_install(['python-jenkins'], fatal=True)
3250+ tools = config('tools')
3251+ if tools:
3252+ log("Installing tools.", level=DEBUG)
3253+ apt_install(tools.split(), fatal=True)
3254+
3255+
3256+@hooks.hook('start')
3257+def start():
3258+ service_start('jenkins')
3259+
3260+
3261+@hooks.hook('stop')
3262+def stop():
3263+ service_stop('jenkins')
3264+
3265+
3266+@hooks.hook('upgrade-charm')
3267+def upgrade_charm():
3268+ log("Upgrading charm.", level=DEBUG)
3269+ config_changed()
3270+
3271+
3272+@hooks.hook('master-relation-joined')
3273+def master_relation_joined():
3274+ HOSTNAME = unit_get('private-address')
3275+ log("Setting url relation to http://%s:8080" % (HOSTNAME), level=DEBUG)
3276+ relation_set(url="http://%s:8080" % (HOSTNAME))
3277+
3278+
3279+@hooks.hook('master-relation-changed')
3280+def master_relation_changed():
3281+ PASSWORD = config('password')
3282+    if not PASSWORD:
3283+ with open('/var/lib/jenkins/.admin_password', 'r') as fd:
3284+ PASSWORD = fd.read()
3285+
3286+ required_settings = ['executors', 'labels', 'slavehost']
3287+ settings = relation_get()
3288+ missing = [s for s in required_settings if s not in settings]
3289+ if missing:
3290+ log("Not all required relation settings received yet (missing=%s) - "
3291+ "skipping" % (', '.join(missing)), level=INFO)
3292+ return
3293+
3294+ slavehost = settings['slavehost']
3295+ executors = settings['executors']
3296+ labels = settings['labels']
3297+
3298+ # Double check to see if this has happened yet
3299+ if "x%s" % (slavehost) == "x":
3300+ log("Slave host not yet defined - skipping", level=INFO)
3301+ return
3302+
3303+ log("Adding slave with hostname %s." % (slavehost), level=DEBUG)
3304+ add_node(slavehost, executors, labels, config('username'), PASSWORD)
3305+ log("Node slave %s added." % (slavehost), level=DEBUG)
3306+
3307+
3308+@hooks.hook('master-relation-departed')
3309+def master_relation_departed():
3310+ # Slave hostname is derived from unit name so
3311+ # this is pretty safe
3312+ slavehost = remote_unit()
3313+ log("Deleting slave with hostname %s." % (slavehost), level=DEBUG)
3314+ del_node(slavehost, config('username'), config('password'))
3315+
3316+
3317+@hooks.hook('master-relation-broken')
3318+def master_relation_broken():
3319+ password = config('password')
3320+ if not password:
3321+ passwd_file = os.path.join(JENKINS_HOME, '.admin_password')
3322+        with open(passwd_file, 'r') as fd:
3323+            password = fd.read()
3324+
3325+ for member in relation_ids():
3326+ member = member.replace('/', '-')
3327+ log("Removing node %s from Jenkins master." % (member), level=DEBUG)
3328+        del_node(member, config('username'), password)
3329+
3330+
3331+@hooks.hook('website-relation-joined')
3332+def website_relation_joined():
3333+ hostname = unit_get('private-address')
3334+ log("Setting website URL to %s:8080" % (hostname), level=DEBUG)
3335+ relation_set(port=8080, hostname=hostname)
3336+
3337+
3338+if __name__ == '__main__':
3339+ try:
3340+ hooks.execute(sys.argv)
3341+ except UnregisteredHookError as e:
3342+ log('Unknown hook {} - skipping.'.format(e), level=INFO)
3343
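Review note: the admin-password fallback (use the 'password' config option if
set, otherwise read /var/lib/jenkins/.admin_password) is duplicated in
master_relation_changed() and master_relation_broken() above. A minimal sketch
of a shared helper that mirrors the old shell hooks' behaviour follows; the
helper name is hypothetical and not part of this branch:

    def _admin_password():
        """Return the admin password to use for the Jenkins API.

        Mirrors the old shell hooks: prefer the 'password' config option and
        fall back to the password generated at install time.
        """
        password = config('password')
        if not password:
            with open(os.path.join(JENKINS_HOME, '.admin_password')) as fd:
                password = fd.read().strip()

        return password

Both relation hooks could then call, for example,
del_node(member, config('username'), _admin_password()) and drop their own
fallback blocks.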
3344=== added file 'hooks/jenkins_utils.py'
3345--- hooks/jenkins_utils.py 1970-01-01 00:00:00 +0000
3346+++ hooks/jenkins_utils.py 2015-01-20 18:33:44 +0000
3347@@ -0,0 +1,178 @@
3348+#!/usr/bin/python
3349+import glob
3350+import os
3351+import shutil
3352+import subprocess
3353+import tempfile
3354+
3355+from charmhelpers.core.hookenv import (
3356+ config,
3357+ log,
3358+ DEBUG,
3359+ INFO,
3360+ WARNING,
3361+)
3362+from charmhelpers.fetch import (
3363+ apt_update,
3364+ add_source,
3365+)
3366+
3367+from charmhelpers.core.decorators import (
3368+ retry_on_exception,
3369+)
3370+
3371+JENKINS_HOME = '/var/lib/jenkins'
3372+JENKINS_USERS = os.path.join(JENKINS_HOME, 'users')
3373+JENKINS_PLUGINS = os.path.join(JENKINS_HOME, 'plugins')
3374+TEMPLATES_DIR = 'templates'
3375+
3376+
3377+def add_node(host, executors, labels, username, password):
3378+ import jenkins
3379+
3380+ @retry_on_exception(2, 2, exc_type=jenkins.JenkinsException)
3381+ def _add_node(*args, **kwargs):
3382+ l_jenkins = jenkins.Jenkins("http://localhost:8080/", username,
3383+ password)
3384+
3385+ if l_jenkins.node_exists(host):
3386+ log("Node exists - not adding", level=DEBUG)
3387+ return
3388+
3389+ log("Adding node '%s' to Jenkins master" % (host), level=INFO)
3390+ l_jenkins.create_node(host, int(executors) * 2, host, labels=labels)
3391+
3392+ if not l_jenkins.node_exists(host):
3393+ log("Failed to create node '%s'" % (host), level=WARNING)
3394+
3395+ return _add_node()
3396+
3397+
3398+def del_node(host, username, password):
3399+ import jenkins
3400+
3401+ l_jenkins = jenkins.Jenkins("http://localhost:8080/", username, password)
3402+
3403+ if l_jenkins.node_exists(host):
3404+ log("Node '%s' exists" % (host), level=DEBUG)
3405+ l_jenkins.delete_node(host)
3406+ else:
3407+ log("Node '%s' does not exist - not deleting" % (host), level=INFO)
3408+
3409+
3410+def setup_source(release):
3411+    """Configure the apt source from which Jenkins will be installed."""
3412+ log("Configuring source of jenkins as %s" % release, level=INFO)
3413+
3414+ # Configure to use upstream archives
3415+ # lts - debian-stable
3416+ # trunk - debian
3417+ if release == 'lts':
3418+ source = "debian-stable"
3419+ elif release == 'trunk':
3420+ source = "debian"
3421+ else:
3422+ errmsg = "Release '%s' configuration not recognised" % (release)
3423+ raise Exception(errmsg)
3424+
3425+ # Setup archive to use appropriate jenkins upstream
3426+ key = 'http://pkg.jenkins-ci.org/%s/jenkins-ci.org.key' % source
3427+ target = "%s-%s" % (source, 'jenkins-ci.org.key')
3428+ subprocess.check_call(['wget', '-q', '-O', target, key])
3429+ with open(target, 'r') as fd:
3430+ key = fd.read()
3431+
3432+ deb = "deb http://pkg.jenkins-ci.org/%s binary/" % (source)
3433+ sources_file = "/etc/apt/sources.list.d/jenkins.list"
3434+
3435+ found = False
3436+ if os.path.exists(sources_file):
3437+ with open(sources_file, 'r') as fd:
3438+ for line in fd:
3439+ if deb in line:
3440+ found = True
3441+ break
3442+
3443+ if not found:
3444+ with open(sources_file, 'a') as fd:
3445+ fd.write("%s\n" % deb)
3446+ else:
3447+ with open(sources_file, 'w') as fd:
3448+ fd.write("%s\n" % deb)
3449+
3450+ if not found:
3451+        # NOTE: the deb line is not added via add_source() since that would
3452+        # also add a deb-src entry, which pkg.jenkins-ci.org does not provide.
3453+ add_source("#dummy-source", key=key)
3454+
3455+ apt_update(fatal=True)
3456+
3457+
3458+def install_jenkins_plugins(jenkins_uid, jenkins_gid):
3459+ plugins = config('plugins')
3460+ if plugins:
3461+ plugins = plugins.split()
3462+ else:
3463+ plugins = []
3464+
3465+ log("Installing plugins (%s)" % (' '.join(plugins)), level=DEBUG)
3466+ if not os.path.isdir(JENKINS_PLUGINS):
3467+ os.makedirs(JENKINS_PLUGINS)
3468+
3469+ os.chmod(JENKINS_PLUGINS, 0o0755)
3470+ os.chown(JENKINS_PLUGINS, jenkins_uid, jenkins_gid)
3471+
3472+ track_dir = tempfile.mkdtemp(prefix='/tmp/plugins.installed')
3473+ try:
3474+ installed_plugins = glob.glob("%s/*.hpi" % (JENKINS_PLUGINS))
3475+ for plugin in installed_plugins:
3476+ # Create a ref of installed plugin
3477+ with open(os.path.join(track_dir, os.path.basename(plugin)),
3478+ 'w'):
3479+ pass
3480+
3481+ plugins_site = config('plugins-site')
3482+ log("Fetching plugins from %s" % (plugins_site), level=DEBUG)
3483+ # NOTE: by default wget verifies certificates as of 1.10.
3484+ if config('plugins-check-certificate') == "no":
3485+ opts = ["--no-check-certificate"]
3486+ else:
3487+ opts = []
3488+
3489+ for plugin in plugins:
3490+ plugin_filename = "%s.hpi" % (plugin)
3491+ url = os.path.join(plugins_site, 'latest', plugin_filename)
3492+ plugin_path = os.path.join(JENKINS_PLUGINS, plugin_filename)
3493+ if not os.path.isfile(plugin_path):
3494+ log("Installing plugin %s" % (plugin_filename), level=DEBUG)
3495+ cmd = ['wget'] + opts + ['--timestamping', url, '-O',
3496+ plugin_path]
3497+ subprocess.check_call(cmd)
3498+                os.chmod(plugin_path, 0o744)
3499+ os.chown(plugin_path, jenkins_uid, jenkins_gid)
3500+
3501+ else:
3502+ log("Plugin %s already installed" % (plugin_filename),
3503+ level=DEBUG)
3504+
3505+ ref = os.path.join(track_dir, plugin_filename)
3506+ if os.path.exists(ref):
3507+ # Delete ref since plugin is installed.
3508+ os.remove(ref)
3509+
3510+        unlisted_plugins = os.listdir(track_dir)
3511+        if unlisted_plugins:
3512+            if config('remove-unlisted-plugins') == "yes":
3513+                for plugin in unlisted_plugins:
3514+                    path = os.path.join(JENKINS_HOME, 'plugins', plugin)
3515+                    if os.path.isfile(path):
3516+                        log("Deleting unlisted plugin '%s'" % (path),
3517+                            level=INFO)
3518+                        os.remove(path)
3519+            else:
3520+                log("Unlisted plugins (%s) not removed. Set "
3521+                    "remove-unlisted-plugins to 'yes' to have them deleted." %
3522+                    ', '.join(unlisted_plugins), level=INFO)
3523+ finally:
3524+ # Delete install refs
3525+ shutil.rmtree(track_dir)
3526
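Review note: add_node() above relies on the retry_on_exception decorator that
this branch syncs in under hooks/charmhelpers/core/decorators.py, so a Jenkins
master that is still starting up is retried rather than failing the hook. A
minimal standalone illustration of the same pattern, using only calls that
already appear in this file (the helper name is hypothetical):

    import jenkins

    from charmhelpers.core.decorators import retry_on_exception

    # Retries up to 2 times on JenkinsException; the second argument is the
    # base delay (in seconds) applied between attempts.
    @retry_on_exception(2, 2, exc_type=jenkins.JenkinsException)
    def _node_exists(host, username, password):
        client = jenkins.Jenkins("http://localhost:8080/", username, password)
        return client.node_exists(host)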
3527=== modified file 'hooks/master-relation-broken'
3528--- hooks/master-relation-broken 2012-07-31 10:32:36 +0000
3529+++ hooks/master-relation-broken 1970-01-01 00:00:00 +0000
3530@@ -1,17 +0,0 @@
3531-#!/bin/sh
3532-
3533-PASSWORD=`config-get password`
3534-if [ -z "$PASSWORD" ]
3535-then
3536- PASSWORD=`cat /var/lib/jenkins/.admin_password`
3537-fi
3538-
3539-MEMBERS=`relation-list`
3540-
3541-for MEMBER in $MEMBERS
3542-do
3543- juju-log "Removing node $MEMBER from Jenkins master..."
3544- $(dirname $0)/delnode `echo $MEMBER | sed s,/,-,` `config-get username` $PASSWORD
3545-done
3546-
3547-exit 0
3548
3549=== target is u'jenkins_hooks.py'
3550=== modified file 'hooks/master-relation-changed'
3551--- hooks/master-relation-changed 2012-07-31 10:32:36 +0000
3552+++ hooks/master-relation-changed 1970-01-01 00:00:00 +0000
3553@@ -1,24 +0,0 @@
3554-#!/bin/bash
3555-
3556-set -ue
3557-
3558-PASSWORD=`config-get password`
3559-if [ -z "$PASSWORD" ]
3560-then
3561- PASSWORD=`cat /var/lib/jenkins/.admin_password`
3562-fi
3563-
3564-# Grab information that remote unit has posted to relation
3565-slavehost=$(relation-get slavehost)
3566-executors=$(relation-get executors)
3567-labels=$(relation-get labels)
3568-
3569-# Double check to see if this has happened yet
3570-if [ "x$slavehost" = "x" ]; then
3571- juju-log "Slave host not yet defined, exiting..."
3572- exit 0
3573-fi
3574-
3575-juju-log "Adding slave with hostname $slavehost..."
3576-$(dirname $0)/addnode $slavehost $executors "$labels" `config-get username` $PASSWORD
3577-juju-log "Node slave $slavehost added..."
3578
3579=== target is u'jenkins_hooks.py'
3580=== modified file 'hooks/master-relation-departed'
3581--- hooks/master-relation-departed 2011-09-22 14:46:56 +0000
3582+++ hooks/master-relation-departed 1970-01-01 00:00:00 +0000
3583@@ -1,12 +0,0 @@
3584-#!/bin/bash
3585-
3586-set -ue
3587-
3588-# Slave hostname is derived from unit name so
3589-# this is pretty safe
3590-slavehost=`echo $JUJU_REMOTE_UNIT | sed s,/,-,`
3591-
3592-juju-log "Deleting slave with hostname $slavehost..."
3593-$(dirname $0)/delnode $slavehost `config-get username` `config-get password`
3594-
3595-exit 0
3596
3597=== target is u'jenkins_hooks.py'
3598=== modified file 'hooks/master-relation-joined'
3599--- hooks/master-relation-joined 2011-10-07 13:43:19 +0000
3600+++ hooks/master-relation-joined 1970-01-01 00:00:00 +0000
3601@@ -1,5 +0,0 @@
3602-#!/bin/sh
3603-
3604-HOSTNAME=`unit-get private-address`
3605-juju-log "Setting url relation to http://$HOSTNAME:8080"
3606-relation-set url="http://$HOSTNAME:8080"
3607
3608=== target is u'jenkins_hooks.py'
3609=== modified file 'hooks/start'
3610--- hooks/start 2011-09-22 14:46:56 +0000
3611+++ hooks/start 1970-01-01 00:00:00 +0000
3612@@ -1,3 +0,0 @@
3613-#!/bin/bash
3614-
3615-service jenkins start || true
3616
3617=== target is u'jenkins_hooks.py'
3618=== modified file 'hooks/stop'
3619--- hooks/stop 2011-09-22 14:46:56 +0000
3620+++ hooks/stop 1970-01-01 00:00:00 +0000
3621@@ -1,3 +0,0 @@
3622-#!/bin/bash
3623-
3624-service jenkins stop
3625
3626=== target is u'jenkins_hooks.py'
3627=== modified file 'hooks/upgrade-charm'
3628--- hooks/upgrade-charm 2011-09-22 14:46:56 +0000
3629+++ hooks/upgrade-charm 1970-01-01 00:00:00 +0000
3630@@ -1,7 +0,0 @@
3631-#!/bin/sh
3632-set -e
3633-
3634-home=`dirname $0`
3635-
3636-juju-log "Upgrading charm by running install hook again."
3637-exec $home/install
3638
3639=== target is u'jenkins_hooks.py'
3640=== modified file 'hooks/website-relation-joined'
3641--- hooks/website-relation-joined 2011-10-07 13:43:19 +0000
3642+++ hooks/website-relation-joined 1970-01-01 00:00:00 +0000
3643@@ -1,5 +0,0 @@
3644-#!/bin/sh
3645-
3646-HOSTNAME=`unit-get private-address`
3647-juju-log "Setting website URL to $HOSTNAME:8080"
3648-relation-set port=8080 hostname=$HOSTNAME
3649
3650=== target is u'jenkins_hooks.py'
3651=== renamed file 'tests/100-deploy' => 'tests/100-deploy-trusty'
3652--- tests/100-deploy 2014-03-05 19:18:19 +0000
3653+++ tests/100-deploy-trusty 2015-01-20 18:33:44 +0000
3654@@ -12,11 +12,13 @@
3655 ###
3656 # Deployment Setup
3657 ###
3658-d = amulet.Deployment()
3659+d = amulet.Deployment(series='trusty')
3660
3661 d.add('haproxy') # website-relation
3662 d.add('jenkins') # Subject matter
3663-d.add('jenkins-slave') # Job Runner
3664+# TODO(hopem): we don't yet have a trusty version of jenkins-slave
3665+# so use the precise version for now.
3666+d.add('jenkins-slave', 'cs:precise/jenkins-slave') # Job Runner
3667
3668
3669 d.relate('jenkins:website', 'haproxy:reverseproxy')
3670
3671=== added file 'tests/README'
3672--- tests/README 1970-01-01 00:00:00 +0000
3673+++ tests/README 2015-01-20 18:33:44 +0000
3674@@ -0,0 +1,56 @@
3675+This directory provides Amulet tests that focus on verification of Jenkins
3676+deployments.
3677+
3678+In order to run tests, you'll need charm-tools installed (in addition to
3679+juju, of course):
3680+
3681+ sudo add-apt-repository ppa:juju/stable
3682+ sudo apt-get update
3683+ sudo apt-get install charm-tools
3684+
3685+If you use a web proxy server to access the web, you'll need to set the
3686+AMULET_HTTP_PROXY environment variable to the http URL of the proxy server.
3687+
3688+The following examples demonstrate different ways that tests can be executed.
3689+All examples are run from the charm's root directory.
3690+
3691+ * To run all tests (starting with 00-setup):
3692+
3693+ make test
3694+
3695+ * To run a specific test module (or modules):
3696+
3697+    juju test -v -p AMULET_HTTP_PROXY 100-deploy-trusty
3698+
3699+ * To run a specific test module (or modules), and keep the environment
3700+ deployed after a failure:
3701+
3702+    juju test --set-e -v -p AMULET_HTTP_PROXY 100-deploy-trusty
3703+
3704+ * To re-run a test module against an already deployed environment (one
3705+ that was deployed by a previous call to 'juju test --set-e'):
3706+
3707+    ./tests/100-deploy-trusty
3708+
3709+
3710+For debugging and test development purposes, all test code should be
3711+idempotent; that is, it should be able to be re-run without changing the
3712+results beyond those of the initial run. This enables editing and re-running
3713+a test module against an already deployed environment, as described above.
3714+
3715+
3716+Notes for additional test writing:
3717+
3718+ * Use DEBUG to turn on debug logging, use ERROR otherwise.
3719+ u = OpenStackAmuletUtils(ERROR)
3720+ u = OpenStackAmuletUtils(DEBUG)
3721+
3722+  * Preserving the deployed environment:
3723+      Even with juju test --set-e, amulet will tear down the juju environment
3724+      when all tests pass.  A force_fail 'test' such as the one below can be
3725+      added to a test module to simulate a failed test and keep the environment.
3726+
3727+ def test_zzzz_fake_fail(self):
3728+ '''Force a fake fail to keep juju environment after a successful test run'''
3729+ # Useful in test writing, when used with: juju test --set-e
3730+ amulet.raise_status(amulet.FAIL, msg='using fake fail to keep juju environment')
3731
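Review note on the idempotency guidance in tests/README above: a minimal
sketch of an idempotent check that could be added to tests/100-deploy-trusty.
It only issues an HTTP GET, so re-running it against an already-deployed
environment changes nothing. The sentry access below follows the usual amulet
pattern, and the /login path is an assumption about what the secured Jenkins
UI serves to anonymous users:

    import urllib2

    import amulet

    # 'd' is the amulet.Deployment already set up at the top of the module.
    jenkins_unit = d.sentry.unit['jenkins/0']
    url = 'http://%s:8080/login' % jenkins_unit.info['public-address']
    try:
        code = urllib2.urlopen(url, timeout=30).getcode()
    except urllib2.URLError as exc:
        amulet.raise_status(amulet.FAIL, msg='jenkins UI unreachable: %s' % exc)
    else:
        if code != 200:
            amulet.raise_status(amulet.FAIL,
                                msg='jenkins UI returned %s' % code)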
3732=== added directory 'unit_tests'
3733=== added file 'unit_tests/__init__.py'
3734=== added file 'unit_tests/test_jenkins_hooks.py'
3735--- unit_tests/test_jenkins_hooks.py 1970-01-01 00:00:00 +0000
3736+++ unit_tests/test_jenkins_hooks.py 2015-01-20 18:33:44 +0000
3737@@ -0,0 +1,6 @@
3738+import unittest
3739+
3740+
3741+class JenkinsHooksTests(unittest.TestCase):
3742+ def setUp(self):
3743+ super(JenkinsHooksTests, self).setUp()
3744
3745=== added file 'unit_tests/test_jenkins_utils.py'
3746--- unit_tests/test_jenkins_utils.py 1970-01-01 00:00:00 +0000
3747+++ unit_tests/test_jenkins_utils.py 2015-01-20 18:33:44 +0000
3748@@ -0,0 +1,6 @@
3749+import unittest
3750+
3751+
3752+class JenkinsUtilsTests(unittest.TestCase):
3753+ def setUp(self):
3754+ super(JenkinsUtilsTests, self).setUp()
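Review note: the unit test modules added above are only empty stubs. A minimal
sketch of a first real test for jenkins_utils, assuming python-mock is
available and hooks/ is on the import path (as the Makefile's unit test target
presumably arranges); it patches out hookenv logging so no juju tools are
needed and only exercises the unknown-release error path of setup_source():

    import unittest

    import mock

    import jenkins_utils


    class JenkinsUtilsSetupSourceTests(unittest.TestCase):
        @mock.patch('jenkins_utils.log')
        def test_setup_source_unknown_release(self, mock_log):
            # An unrecognised 'release' value should raise before any
            # apt or wget side effects are attempted.
            self.assertRaises(Exception, jenkins_utils.setup_source,
                              'not-a-release')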
