Merge lp:~hopem/charms/trusty/jenkins/python-redux into lp:charms/trusty/jenkins

Proposed by Edward Hope-Morley
Status: Superseded
Proposed branch: lp:~hopem/charms/trusty/jenkins/python-redux
Merge into: lp:charms/trusty/jenkins
Diff against target: 4642 lines (+4089/-275)
45 files modified
Makefile (+31/-0)
bin/charm_helpers_sync.py (+225/-0)
charm-helpers-hooks.yaml (+8/-0)
charm-helpers-tests.yaml (+5/-0)
config.yaml (+1/-1)
hooks/addnode (+0/-21)
hooks/charmhelpers/__init__.py (+22/-0)
hooks/charmhelpers/contrib/python/packages.py (+80/-0)
hooks/charmhelpers/core/decorators.py (+41/-0)
hooks/charmhelpers/core/fstab.py (+118/-0)
hooks/charmhelpers/core/hookenv.py (+552/-0)
hooks/charmhelpers/core/host.py (+419/-0)
hooks/charmhelpers/core/services/__init__.py (+2/-0)
hooks/charmhelpers/core/services/base.py (+313/-0)
hooks/charmhelpers/core/services/helpers.py (+243/-0)
hooks/charmhelpers/core/sysctl.py (+34/-0)
hooks/charmhelpers/core/templating.py (+52/-0)
hooks/charmhelpers/fetch/__init__.py (+423/-0)
hooks/charmhelpers/fetch/archiveurl.py (+145/-0)
hooks/charmhelpers/fetch/bzrurl.py (+54/-0)
hooks/charmhelpers/fetch/giturl.py (+51/-0)
hooks/charmhelpers/payload/__init__.py (+1/-0)
hooks/charmhelpers/payload/execd.py (+50/-0)
hooks/config-changed (+0/-7)
hooks/delnode (+0/-16)
hooks/install (+0/-151)
hooks/jenkins_hooks.py (+220/-0)
hooks/jenkins_utils.py (+178/-0)
hooks/master-relation-broken (+0/-17)
hooks/master-relation-changed (+0/-24)
hooks/master-relation-departed (+0/-12)
hooks/master-relation-joined (+0/-5)
hooks/start (+0/-3)
hooks/stop (+0/-3)
hooks/upgrade-charm (+0/-7)
hooks/website-relation-joined (+0/-5)
tests/100-deploy-precise (+123/-0)
tests/100-deploy-trusty (+5/-3)
tests/README (+56/-0)
tests/charmhelpers/contrib/amulet/deployment.py (+77/-0)
tests/charmhelpers/contrib/amulet/utils.py (+178/-0)
tests/charmhelpers/contrib/openstack/amulet/deployment.py (+92/-0)
tests/charmhelpers/contrib/openstack/amulet/utils.py (+278/-0)
unit_tests/test_jenkins_hooks.py (+6/-0)
unit_tests/test_jenkins_utils.py (+6/-0)
To merge this branch: bzr merge lp:~hopem/charms/trusty/jenkins/python-redux
Reviewer | Review Type | Date Requested | Status
Review Queue (community) | automated testing | | Needs Fixing
James Page | | | Pending
Jorge Niedbalski | | | Pending
Felipe Reyes | | | Pending
Paul Larson | | | Pending
Whit Morriss | | | Pending
Ryan Beisner | | | Pending
Review via email: mp+247106@code.launchpad.net

This proposal supersedes a proposal from 2015-01-20.

This proposal has been superseded by a proposal from 2015-01-22.

Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10336-results

review: Needs Fixing (automated testing)
Revision history for this message
Felipe Reyes (freyes) wrote : Posted in a previous version of this proposal

Setting the password doesn't work: deploying as below doesn't allow you to log in with admin/admin. Also, first deploying from the charm store and then upgrading to this branch breaks the password.

---
jenkins:
    password: "admin"
---

$ juju deploy --config config.yaml local:trusty/jenkins

review: Needs Fixing
Revision history for this message
Edward Hope-Morley (hopem) wrote : Posted in a previous version of this proposal

Thanks Felipe, taking a look.

Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10636-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10684-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10704-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10869-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10876-results

review: Needs Fixing (automated testing)
Revision history for this message
Whit Morriss (whitmo) wrote : Posted in a previous version of this proposal

Thanks Edward, your python rewrite generally looks good at a glance.

I confirmed the test failures reported by automated testing: jenkins-slave is not found because it only exists in the precise series.

Changing test/100-deploy-trusty:line 19 to "d.add('jenkins-slave', 'cs:precise/jenkins-slave')" remedies this issue until there is a trusty version of jenkins-slave.

The tests also hit an error in "master-relation-changed" due to what appears to be a race condition between the configuration/restart of the jenkins slave and adding the node to the master. Running the hook via debug-hooks works fine.

See logs: https://gist.githubusercontent.com/anonymous/0067138ce2cc697b8c88/raw/3f4c03688400b32f3aa66e1bd3bad5b7398f80a5/jenkins-race-condition

I confirmed this is also an issue in the merge target's bash implementation. Adding some retry logic to add_node should fix this issue (see the sketch after this comment).

-1 for test fixes, but otherwise looks good. Thanks again!

review: Needs Fixing
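
For reference, a minimal sketch of the retry approach suggested above. It reuses the retry_on_exception decorator this branch adds in hooks/charmhelpers/core/decorators.py together with the same python-jenkins calls used by the removed hooks/addnode script; the add_node signature and retry parameters here are assumptions, not the branch's actual implementation.

import jenkins

from charmhelpers.core.decorators import retry_on_exception
from charmhelpers.core.hookenv import log


# Hypothetical helper: retry while the freshly restarted slave/master settle.
@retry_on_exception(num_retries=5, base_delay=10, exc_type=Exception)
def add_node(host, executors, labels, username, password):
    client = jenkins.Jenkins("http://localhost:8080/", username, password)
    if client.node_exists(host):
        log("Node %s already exists - not adding" % host)
        return
    log("Adding node %s to Jenkins master" % host)
    client.create_node(host, int(executors) * 2, host, labels=labels)
    if not client.node_exists(host):
        # Raising hands control back to the decorator for another attempt.
        raise Exception("Failed to create node %s" % host)

Since base_delay is non-zero, the decorator's delay grows with each attempt, which should be enough to ride out the restart window described above.
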
Revision history for this message
Edward Hope-Morley (hopem) wrote : Posted in a previous version of this proposal

@whitmo awesome, thanks for reviewing. I'll see if I can improve the add_node issue and I'll get the amulet test fixed up. Thanks!

Revision history for this message
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10962-results

review: Needs Fixing (automated testing)
Revision history for this message
Review Queue (review-queue) wrote :

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10963-results

review: Needs Fixing (automated testing)
53. By Edward Hope-Morley

switch makefile rules to names that juju ci (hopefully) understands

54. By Edward Hope-Morley

ensure apt update prior to install

55. By Edward Hope-Morley

added venv for tests and lint

Unmerged revisions

Preview Diff

1=== added file 'Makefile'
2--- Makefile 1970-01-01 00:00:00 +0000
3+++ Makefile 2015-01-22 15:04:01 +0000
4@@ -0,0 +1,31 @@
5+#!/usr/bin/make
6+PYTHON := /usr/bin/env python
7+
8+lint:
9+ @flake8 --exclude hooks/charmhelpers hooks unit_tests tests
10+ @charm proof
11+
12+functional_test:
13+ @echo Starting Amulet tests...
14+ # coreycb note: The -v should only be temporary until Amulet sends
15+ # raise_status() messages to stderr:
16+ # https://bugs.launchpad.net/amulet/+bug/1320357
17+ @juju test -v -p AMULET_HTTP_PROXY --timeout 900 \
18+ 00-setup 100-deploy-precise 100-deploy-trusty
19+
20+test:
21+ @echo Starting unit tests...
22+ @$(PYTHON) /usr/bin/nosetests --nologcapture --with-coverage unit_tests
23+
24+bin/charm_helpers_sync.py:
25+ @mkdir -p bin
26+ @bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
27+ > bin/charm_helpers_sync.py
28+
29+sync: bin/charm_helpers_sync.py
30+ @$(PYTHON) bin/charm_helpers_sync.py -c charm-helpers-hooks.yaml
31+ @$(PYTHON) bin/charm_helpers_sync.py -c charm-helpers-tests.yaml
32+
33+publish: lint unit_test
34+ bzr push lp:charms/jenkins
35+ bzr push lp:charms/trusty/jenkins
36
37=== added directory 'bin'
38=== added file 'bin/charm_helpers_sync.py'
39--- bin/charm_helpers_sync.py 1970-01-01 00:00:00 +0000
40+++ bin/charm_helpers_sync.py 2015-01-22 15:04:01 +0000
41@@ -0,0 +1,225 @@
42+#!/usr/bin/python
43+#
44+# Copyright 2013 Canonical Ltd.
45+
46+# Authors:
47+# Adam Gandelman <adamg@ubuntu.com>
48+#
49+
50+import logging
51+import optparse
52+import os
53+import subprocess
54+import shutil
55+import sys
56+import tempfile
57+import yaml
58+
59+from fnmatch import fnmatch
60+
61+CHARM_HELPERS_BRANCH = 'lp:charm-helpers'
62+
63+
64+def parse_config(conf_file):
65+ if not os.path.isfile(conf_file):
66+ logging.error('Invalid config file: %s.' % conf_file)
67+ return False
68+ return yaml.load(open(conf_file).read())
69+
70+
71+def clone_helpers(work_dir, branch):
72+ dest = os.path.join(work_dir, 'charm-helpers')
73+ logging.info('Checking out %s to %s.' % (branch, dest))
74+ cmd = ['bzr', 'checkout', '--lightweight', branch, dest]
75+ subprocess.check_call(cmd)
76+ return dest
77+
78+
79+def _module_path(module):
80+ return os.path.join(*module.split('.'))
81+
82+
83+def _src_path(src, module):
84+ return os.path.join(src, 'charmhelpers', _module_path(module))
85+
86+
87+def _dest_path(dest, module):
88+ return os.path.join(dest, _module_path(module))
89+
90+
91+def _is_pyfile(path):
92+ return os.path.isfile(path + '.py')
93+
94+
95+def ensure_init(path):
96+ '''
97+ ensure directories leading up to path are importable, omitting
98+ parent directory, eg path='/hooks/helpers/foo'/:
99+ hooks/
100+ hooks/helpers/__init__.py
101+ hooks/helpers/foo/__init__.py
102+ '''
103+ for d, dirs, files in os.walk(os.path.join(*path.split('/')[:2])):
104+ _i = os.path.join(d, '__init__.py')
105+ if not os.path.exists(_i):
106+ logging.info('Adding missing __init__.py: %s' % _i)
107+ open(_i, 'wb').close()
108+
109+
110+def sync_pyfile(src, dest):
111+ src = src + '.py'
112+ src_dir = os.path.dirname(src)
113+ logging.info('Syncing pyfile: %s -> %s.' % (src, dest))
114+ if not os.path.exists(dest):
115+ os.makedirs(dest)
116+ shutil.copy(src, dest)
117+ if os.path.isfile(os.path.join(src_dir, '__init__.py')):
118+ shutil.copy(os.path.join(src_dir, '__init__.py'),
119+ dest)
120+ ensure_init(dest)
121+
122+
123+def get_filter(opts=None):
124+ opts = opts or []
125+ if 'inc=*' in opts:
126+ # do not filter any files, include everything
127+ return None
128+
129+ def _filter(dir, ls):
130+ incs = [opt.split('=').pop() for opt in opts if 'inc=' in opt]
131+ _filter = []
132+ for f in ls:
133+ _f = os.path.join(dir, f)
134+
135+ if not os.path.isdir(_f) and not _f.endswith('.py') and incs:
136+ if True not in [fnmatch(_f, inc) for inc in incs]:
137+ logging.debug('Not syncing %s, does not match include '
138+ 'filters (%s)' % (_f, incs))
139+ _filter.append(f)
140+ else:
141+ logging.debug('Including file, which matches include '
142+ 'filters (%s): %s' % (incs, _f))
143+ elif (os.path.isfile(_f) and not _f.endswith('.py')):
144+ logging.debug('Not syncing file: %s' % f)
145+ _filter.append(f)
146+ elif (os.path.isdir(_f) and not
147+ os.path.isfile(os.path.join(_f, '__init__.py'))):
148+ logging.debug('Not syncing directory: %s' % f)
149+ _filter.append(f)
150+ return _filter
151+ return _filter
152+
153+
154+def sync_directory(src, dest, opts=None):
155+ if os.path.exists(dest):
156+ logging.debug('Removing existing directory: %s' % dest)
157+ shutil.rmtree(dest)
158+ logging.info('Syncing directory: %s -> %s.' % (src, dest))
159+
160+ shutil.copytree(src, dest, ignore=get_filter(opts))
161+ ensure_init(dest)
162+
163+
164+def sync(src, dest, module, opts=None):
165+ if os.path.isdir(_src_path(src, module)):
166+ sync_directory(_src_path(src, module), _dest_path(dest, module), opts)
167+ elif _is_pyfile(_src_path(src, module)):
168+ sync_pyfile(_src_path(src, module),
169+ os.path.dirname(_dest_path(dest, module)))
170+ else:
171+ logging.warn('Could not sync: %s. Neither a pyfile or directory, '
172+ 'does it even exist?' % module)
173+
174+
175+def parse_sync_options(options):
176+ if not options:
177+ return []
178+ return options.split(',')
179+
180+
181+def extract_options(inc, global_options=None):
182+ global_options = global_options or []
183+ if global_options and isinstance(global_options, basestring):
184+ global_options = [global_options]
185+ if '|' not in inc:
186+ return (inc, global_options)
187+ inc, opts = inc.split('|')
188+ return (inc, parse_sync_options(opts) + global_options)
189+
190+
191+def sync_helpers(include, src, dest, options=None):
192+ if not os.path.isdir(dest):
193+ os.makedirs(dest)
194+
195+ global_options = parse_sync_options(options)
196+
197+ for inc in include:
198+ if isinstance(inc, str):
199+ inc, opts = extract_options(inc, global_options)
200+ sync(src, dest, inc, opts)
201+ elif isinstance(inc, dict):
202+ # could also do nested dicts here.
203+ for k, v in inc.iteritems():
204+ if isinstance(v, list):
205+ for m in v:
206+ inc, opts = extract_options(m, global_options)
207+ sync(src, dest, '%s.%s' % (k, inc), opts)
208+
209+if __name__ == '__main__':
210+ parser = optparse.OptionParser()
211+ parser.add_option('-c', '--config', action='store', dest='config',
212+ default=None, help='helper config file')
213+ parser.add_option('-D', '--debug', action='store_true', dest='debug',
214+ default=False, help='debug')
215+ parser.add_option('-b', '--branch', action='store', dest='branch',
216+ help='charm-helpers bzr branch (overrides config)')
217+ parser.add_option('-d', '--destination', action='store', dest='dest_dir',
218+ help='sync destination dir (overrides config)')
219+ (opts, args) = parser.parse_args()
220+
221+ if opts.debug:
222+ logging.basicConfig(level=logging.DEBUG)
223+ else:
224+ logging.basicConfig(level=logging.INFO)
225+
226+ if opts.config:
227+ logging.info('Loading charm helper config from %s.' % opts.config)
228+ config = parse_config(opts.config)
229+ if not config:
230+ logging.error('Could not parse config from %s.' % opts.config)
231+ sys.exit(1)
232+ else:
233+ config = {}
234+
235+ if 'branch' not in config:
236+ config['branch'] = CHARM_HELPERS_BRANCH
237+ if opts.branch:
238+ config['branch'] = opts.branch
239+ if opts.dest_dir:
240+ config['destination'] = opts.dest_dir
241+
242+ if 'destination' not in config:
243+ logging.error('No destination dir. specified as option or config.')
244+ sys.exit(1)
245+
246+ if 'include' not in config:
247+ if not args:
248+ logging.error('No modules to sync specified as option or config.')
249+ sys.exit(1)
250+ config['include'] = []
251+ [config['include'].append(a) for a in args]
252+
253+ sync_options = None
254+ if 'options' in config:
255+ sync_options = config['options']
256+ tmpd = tempfile.mkdtemp()
257+ try:
258+ checkout = clone_helpers(tmpd, config['branch'])
259+ sync_helpers(config['include'], checkout, config['destination'],
260+ options=sync_options)
261+ except Exception, e:
262+ logging.error("Could not sync: %s" % e)
263+ raise e
264+ finally:
265+ logging.debug('Cleaning up %s' % tmpd)
266+ shutil.rmtree(tmpd)
267
268=== added file 'charm-helpers-hooks.yaml'
269--- charm-helpers-hooks.yaml 1970-01-01 00:00:00 +0000
270+++ charm-helpers-hooks.yaml 2015-01-22 15:04:01 +0000
271@@ -0,0 +1,8 @@
272+branch: lp:charm-helpers
273+destination: hooks/charmhelpers
274+include:
275+ - __init__
276+ - contrib.python.packages
277+ - core
278+ - fetch
279+ - payload.execd
280
281=== added file 'charm-helpers-tests.yaml'
282--- charm-helpers-tests.yaml 1970-01-01 00:00:00 +0000
283+++ charm-helpers-tests.yaml 2015-01-22 15:04:01 +0000
284@@ -0,0 +1,5 @@
285+branch: lp:charm-helpers
286+destination: tests/charmhelpers
287+include:
288+ - contrib.amulet
289+ - contrib.openstack.amulet
290
291=== modified file 'config.yaml'
292--- config.yaml 2014-08-14 19:53:02 +0000
293+++ config.yaml 2015-01-22 15:04:01 +0000
294@@ -17,9 +17,9 @@
295 slave nodes so please don't change in Jenkins.
296 password:
297 type: string
298+ default: ""
299 description: Admin user password - used to manage
300 slave nodes so please don't change in Jenkins.
301- default:
302 plugins:
303 type: string
304 default: ""
305
306=== removed file 'hooks/addnode'
307--- hooks/addnode 2012-04-27 13:04:33 +0000
308+++ hooks/addnode 1970-01-01 00:00:00 +0000
309@@ -1,21 +0,0 @@
310-#!/usr/bin/python
311-
312-import jenkins
313-import sys
314-
315-host=sys.argv[1]
316-executors=sys.argv[2]
317-labels=sys.argv[3]
318-username=sys.argv[4]
319-password=sys.argv[5]
320-
321-l_jenkins = jenkins.Jenkins("http://localhost:8080/",username,password)
322-
323-if l_jenkins.node_exists(host):
324- print "Node exists - not adding"
325-else:
326- print "Adding node to Jenkins master"
327- l_jenkins.create_node(host, int(executors) * 2, host , labels=labels)
328-
329-if not l_jenkins.node_exists(host):
330- print "Failed to create node"
331
332=== added directory 'hooks/charmhelpers'
333=== added file 'hooks/charmhelpers/__init__.py'
334--- hooks/charmhelpers/__init__.py 1970-01-01 00:00:00 +0000
335+++ hooks/charmhelpers/__init__.py 2015-01-22 15:04:01 +0000
336@@ -0,0 +1,22 @@
337+# Bootstrap charm-helpers, installing its dependencies if necessary using
338+# only standard libraries.
339+import subprocess
340+import sys
341+
342+try:
343+ import six # flake8: noqa
344+except ImportError:
345+ if sys.version_info.major == 2:
346+ subprocess.check_call(['apt-get', 'install', '-y', 'python-six'])
347+ else:
348+ subprocess.check_call(['apt-get', 'install', '-y', 'python3-six'])
349+ import six # flake8: noqa
350+
351+try:
352+ import yaml # flake8: noqa
353+except ImportError:
354+ if sys.version_info.major == 2:
355+ subprocess.check_call(['apt-get', 'install', '-y', 'python-yaml'])
356+ else:
357+ subprocess.check_call(['apt-get', 'install', '-y', 'python3-yaml'])
358+ import yaml # flake8: noqa
359
360=== added directory 'hooks/charmhelpers/contrib'
361=== added file 'hooks/charmhelpers/contrib/__init__.py'
362=== added directory 'hooks/charmhelpers/contrib/python'
363=== added file 'hooks/charmhelpers/contrib/python/__init__.py'
364=== added file 'hooks/charmhelpers/contrib/python/packages.py'
365--- hooks/charmhelpers/contrib/python/packages.py 1970-01-01 00:00:00 +0000
366+++ hooks/charmhelpers/contrib/python/packages.py 2015-01-22 15:04:01 +0000
367@@ -0,0 +1,80 @@
368+#!/usr/bin/env python
369+# coding: utf-8
370+
371+__author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
372+
373+from charmhelpers.fetch import apt_install, apt_update
374+from charmhelpers.core.hookenv import log
375+
376+try:
377+ from pip import main as pip_execute
378+except ImportError:
379+ apt_update()
380+ apt_install('python-pip')
381+ from pip import main as pip_execute
382+
383+
384+def parse_options(given, available):
385+ """Given a set of options, check if available"""
386+ for key, value in sorted(given.items()):
387+ if key in available:
388+ yield "--{0}={1}".format(key, value)
389+
390+
391+def pip_install_requirements(requirements, **options):
392+ """Install a requirements file """
393+ command = ["install"]
394+
395+ available_options = ('proxy', 'src', 'log', )
396+ for option in parse_options(options, available_options):
397+ command.append(option)
398+
399+ command.append("-r {0}".format(requirements))
400+ log("Installing from file: {} with options: {}".format(requirements,
401+ command))
402+ pip_execute(command)
403+
404+
405+def pip_install(package, fatal=False, upgrade=False, **options):
406+ """Install a python package"""
407+ command = ["install"]
408+
409+ available_options = ('proxy', 'src', 'log', "index-url", )
410+ for option in parse_options(options, available_options):
411+ command.append(option)
412+
413+ if upgrade:
414+ command.append('--upgrade')
415+
416+ if isinstance(package, list):
417+ command.extend(package)
418+ else:
419+ command.append(package)
420+
421+ log("Installing {} package with options: {}".format(package,
422+ command))
423+ pip_execute(command)
424+
425+
426+def pip_uninstall(package, **options):
427+ """Uninstall a python package"""
428+ command = ["uninstall", "-q", "-y"]
429+
430+ available_options = ('proxy', 'log', )
431+ for option in parse_options(options, available_options):
432+ command.append(option)
433+
434+ if isinstance(package, list):
435+ command.extend(package)
436+ else:
437+ command.append(package)
438+
439+ log("Uninstalling {} package with options: {}".format(package,
440+ command))
441+ pip_execute(command)
442+
443+
444+def pip_list():
445+ """Returns the list of current python installed packages
446+ """
447+ return pip_execute(["list"])
448
449=== added directory 'hooks/charmhelpers/core'
450=== added file 'hooks/charmhelpers/core/__init__.py'
451=== added file 'hooks/charmhelpers/core/decorators.py'
452--- hooks/charmhelpers/core/decorators.py 1970-01-01 00:00:00 +0000
453+++ hooks/charmhelpers/core/decorators.py 2015-01-22 15:04:01 +0000
454@@ -0,0 +1,41 @@
455+#
456+# Copyright 2014 Canonical Ltd.
457+#
458+# Authors:
459+# Edward Hope-Morley <opentastic@gmail.com>
460+#
461+
462+import time
463+
464+from charmhelpers.core.hookenv import (
465+ log,
466+ INFO,
467+)
468+
469+
470+def retry_on_exception(num_retries, base_delay=0, exc_type=Exception):
471+ """If the decorated function raises exception exc_type, allow num_retries
472+ retry attempts before raise the exception.
473+ """
474+ def _retry_on_exception_inner_1(f):
475+ def _retry_on_exception_inner_2(*args, **kwargs):
476+ retries = num_retries
477+ multiplier = 1
478+ while True:
479+ try:
480+ return f(*args, **kwargs)
481+ except exc_type:
482+ if not retries:
483+ raise
484+
485+ delay = base_delay * multiplier
486+ multiplier += 1
487+ log("Retrying '%s' %d more times (delay=%s)" %
488+ (f.__name__, retries, delay), level=INFO)
489+ retries -= 1
490+ if delay:
491+ time.sleep(delay)
492+
493+ return _retry_on_exception_inner_2
494+
495+ return _retry_on_exception_inner_1
496
497=== added file 'hooks/charmhelpers/core/fstab.py'
498--- hooks/charmhelpers/core/fstab.py 1970-01-01 00:00:00 +0000
499+++ hooks/charmhelpers/core/fstab.py 2015-01-22 15:04:01 +0000
500@@ -0,0 +1,118 @@
501+#!/usr/bin/env python
502+# -*- coding: utf-8 -*-
503+
504+__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
505+
506+import io
507+import os
508+
509+
510+class Fstab(io.FileIO):
511+ """This class extends file in order to implement a file reader/writer
512+ for file `/etc/fstab`
513+ """
514+
515+ class Entry(object):
516+ """Entry class represents a non-comment line on the `/etc/fstab` file
517+ """
518+ def __init__(self, device, mountpoint, filesystem,
519+ options, d=0, p=0):
520+ self.device = device
521+ self.mountpoint = mountpoint
522+ self.filesystem = filesystem
523+
524+ if not options:
525+ options = "defaults"
526+
527+ self.options = options
528+ self.d = int(d)
529+ self.p = int(p)
530+
531+ def __eq__(self, o):
532+ return str(self) == str(o)
533+
534+ def __str__(self):
535+ return "{} {} {} {} {} {}".format(self.device,
536+ self.mountpoint,
537+ self.filesystem,
538+ self.options,
539+ self.d,
540+ self.p)
541+
542+ DEFAULT_PATH = os.path.join(os.path.sep, 'etc', 'fstab')
543+
544+ def __init__(self, path=None):
545+ if path:
546+ self._path = path
547+ else:
548+ self._path = self.DEFAULT_PATH
549+ super(Fstab, self).__init__(self._path, 'rb+')
550+
551+ def _hydrate_entry(self, line):
552+ # NOTE: use split with no arguments to split on any
553+ # whitespace including tabs
554+ return Fstab.Entry(*filter(
555+ lambda x: x not in ('', None),
556+ line.strip("\n").split()))
557+
558+ @property
559+ def entries(self):
560+ self.seek(0)
561+ for line in self.readlines():
562+ line = line.decode('us-ascii')
563+ try:
564+ if line.strip() and not line.startswith("#"):
565+ yield self._hydrate_entry(line)
566+ except ValueError:
567+ pass
568+
569+ def get_entry_by_attr(self, attr, value):
570+ for entry in self.entries:
571+ e_attr = getattr(entry, attr)
572+ if e_attr == value:
573+ return entry
574+ return None
575+
576+ def add_entry(self, entry):
577+ if self.get_entry_by_attr('device', entry.device):
578+ return False
579+
580+ self.write((str(entry) + '\n').encode('us-ascii'))
581+ self.truncate()
582+ return entry
583+
584+ def remove_entry(self, entry):
585+ self.seek(0)
586+
587+ lines = [l.decode('us-ascii') for l in self.readlines()]
588+
589+ found = False
590+ for index, line in enumerate(lines):
591+ if not line.startswith("#"):
592+ if self._hydrate_entry(line) == entry:
593+ found = True
594+ break
595+
596+ if not found:
597+ return False
598+
599+ lines.remove(line)
600+
601+ self.seek(0)
602+ self.write(''.join(lines).encode('us-ascii'))
603+ self.truncate()
604+ return True
605+
606+ @classmethod
607+ def remove_by_mountpoint(cls, mountpoint, path=None):
608+ fstab = cls(path=path)
609+ entry = fstab.get_entry_by_attr('mountpoint', mountpoint)
610+ if entry:
611+ return fstab.remove_entry(entry)
612+ return False
613+
614+ @classmethod
615+ def add(cls, device, mountpoint, filesystem, options=None, path=None):
616+ return cls(path=path).add_entry(Fstab.Entry(device,
617+ mountpoint, filesystem,
618+ options=options))
619
620=== added file 'hooks/charmhelpers/core/hookenv.py'
621--- hooks/charmhelpers/core/hookenv.py 1970-01-01 00:00:00 +0000
622+++ hooks/charmhelpers/core/hookenv.py 2015-01-22 15:04:01 +0000
623@@ -0,0 +1,552 @@
624+"Interactions with the Juju environment"
625+# Copyright 2013 Canonical Ltd.
626+#
627+# Authors:
628+# Charm Helpers Developers <juju@lists.ubuntu.com>
629+
630+import os
631+import json
632+import yaml
633+import subprocess
634+import sys
635+from subprocess import CalledProcessError
636+
637+import six
638+if not six.PY3:
639+ from UserDict import UserDict
640+else:
641+ from collections import UserDict
642+
643+CRITICAL = "CRITICAL"
644+ERROR = "ERROR"
645+WARNING = "WARNING"
646+INFO = "INFO"
647+DEBUG = "DEBUG"
648+MARKER = object()
649+
650+cache = {}
651+
652+
653+def cached(func):
654+ """Cache return values for multiple executions of func + args
655+
656+ For example::
657+
658+ @cached
659+ def unit_get(attribute):
660+ pass
661+
662+ unit_get('test')
663+
664+ will cache the result of unit_get + 'test' for future calls.
665+ """
666+ def wrapper(*args, **kwargs):
667+ global cache
668+ key = str((func, args, kwargs))
669+ try:
670+ return cache[key]
671+ except KeyError:
672+ res = func(*args, **kwargs)
673+ cache[key] = res
674+ return res
675+ return wrapper
676+
677+
678+def flush(key):
679+ """Flushes any entries from function cache where the
680+ key is found in the function+args """
681+ flush_list = []
682+ for item in cache:
683+ if key in item:
684+ flush_list.append(item)
685+ for item in flush_list:
686+ del cache[item]
687+
688+
689+def log(message, level=None):
690+ """Write a message to the juju log"""
691+ command = ['juju-log']
692+ if level:
693+ command += ['-l', level]
694+ if not isinstance(message, six.string_types):
695+ message = repr(message)
696+ command += [message]
697+ subprocess.call(command)
698+
699+
700+class Serializable(UserDict):
701+ """Wrapper, an object that can be serialized to yaml or json"""
702+
703+ def __init__(self, obj):
704+ # wrap the object
705+ UserDict.__init__(self)
706+ self.data = obj
707+
708+ def __getattr__(self, attr):
709+ # See if this object has attribute.
710+ if attr in ("json", "yaml", "data"):
711+ return self.__dict__[attr]
712+ # Check for attribute in wrapped object.
713+ got = getattr(self.data, attr, MARKER)
714+ if got is not MARKER:
715+ return got
716+ # Proxy to the wrapped object via dict interface.
717+ try:
718+ return self.data[attr]
719+ except KeyError:
720+ raise AttributeError(attr)
721+
722+ def __getstate__(self):
723+ # Pickle as a standard dictionary.
724+ return self.data
725+
726+ def __setstate__(self, state):
727+ # Unpickle into our wrapper.
728+ self.data = state
729+
730+ def json(self):
731+ """Serialize the object to json"""
732+ return json.dumps(self.data)
733+
734+ def yaml(self):
735+ """Serialize the object to yaml"""
736+ return yaml.dump(self.data)
737+
738+
739+def execution_environment():
740+ """A convenient bundling of the current execution context"""
741+ context = {}
742+ context['conf'] = config()
743+ if relation_id():
744+ context['reltype'] = relation_type()
745+ context['relid'] = relation_id()
746+ context['rel'] = relation_get()
747+ context['unit'] = local_unit()
748+ context['rels'] = relations()
749+ context['env'] = os.environ
750+ return context
751+
752+
753+def in_relation_hook():
754+ """Determine whether we're running in a relation hook"""
755+ return 'JUJU_RELATION' in os.environ
756+
757+
758+def relation_type():
759+ """The scope for the current relation hook"""
760+ return os.environ.get('JUJU_RELATION', None)
761+
762+
763+def relation_id():
764+ """The relation ID for the current relation hook"""
765+ return os.environ.get('JUJU_RELATION_ID', None)
766+
767+
768+def local_unit():
769+ """Local unit ID"""
770+ return os.environ['JUJU_UNIT_NAME']
771+
772+
773+def remote_unit():
774+ """The remote unit for the current relation hook"""
775+ return os.environ['JUJU_REMOTE_UNIT']
776+
777+
778+def service_name():
779+ """The name service group this unit belongs to"""
780+ return local_unit().split('/')[0]
781+
782+
783+def hook_name():
784+ """The name of the currently executing hook"""
785+ return os.path.basename(sys.argv[0])
786+
787+
788+class Config(dict):
789+ """A dictionary representation of the charm's config.yaml, with some
790+ extra features:
791+
792+ - See which values in the dictionary have changed since the previous hook.
793+ - For values that have changed, see what the previous value was.
794+ - Store arbitrary data for use in a later hook.
795+
796+ NOTE: Do not instantiate this object directly - instead call
797+ ``hookenv.config()``, which will return an instance of :class:`Config`.
798+
799+ Example usage::
800+
801+ >>> # inside a hook
802+ >>> from charmhelpers.core import hookenv
803+ >>> config = hookenv.config()
804+ >>> config['foo']
805+ 'bar'
806+ >>> # store a new key/value for later use
807+ >>> config['mykey'] = 'myval'
808+
809+
810+ >>> # user runs `juju set mycharm foo=baz`
811+ >>> # now we're inside subsequent config-changed hook
812+ >>> config = hookenv.config()
813+ >>> config['foo']
814+ 'baz'
815+ >>> # test to see if this val has changed since last hook
816+ >>> config.changed('foo')
817+ True
818+ >>> # what was the previous value?
819+ >>> config.previous('foo')
820+ 'bar'
821+ >>> # keys/values that we add are preserved across hooks
822+ >>> config['mykey']
823+ 'myval'
824+
825+ """
826+ CONFIG_FILE_NAME = '.juju-persistent-config'
827+
828+ def __init__(self, *args, **kw):
829+ super(Config, self).__init__(*args, **kw)
830+ self.implicit_save = True
831+ self._prev_dict = None
832+ self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME)
833+ if os.path.exists(self.path):
834+ self.load_previous()
835+
836+ def __getitem__(self, key):
837+ """For regular dict lookups, check the current juju config first,
838+ then the previous (saved) copy. This ensures that user-saved values
839+ will be returned by a dict lookup.
840+
841+ """
842+ try:
843+ return dict.__getitem__(self, key)
844+ except KeyError:
845+ return (self._prev_dict or {})[key]
846+
847+ def keys(self):
848+ prev_keys = []
849+ if self._prev_dict is not None:
850+ prev_keys = self._prev_dict.keys()
851+ return list(set(prev_keys + list(dict.keys(self))))
852+
853+ def load_previous(self, path=None):
854+ """Load previous copy of config from disk.
855+
856+ In normal usage you don't need to call this method directly - it
857+ is called automatically at object initialization.
858+
859+ :param path:
860+
861+ File path from which to load the previous config. If `None`,
862+ config is loaded from the default location. If `path` is
863+ specified, subsequent `save()` calls will write to the same
864+ path.
865+
866+ """
867+ self.path = path or self.path
868+ with open(self.path) as f:
869+ self._prev_dict = json.load(f)
870+
871+ def changed(self, key):
872+ """Return True if the current value for this key is different from
873+ the previous value.
874+
875+ """
876+ if self._prev_dict is None:
877+ return True
878+ return self.previous(key) != self.get(key)
879+
880+ def previous(self, key):
881+ """Return previous value for this key, or None if there
882+ is no previous value.
883+
884+ """
885+ if self._prev_dict:
886+ return self._prev_dict.get(key)
887+ return None
888+
889+ def save(self):
890+ """Save this config to disk.
891+
892+ If the charm is using the :mod:`Services Framework <services.base>`
893+ or :meth:'@hook <Hooks.hook>' decorator, this
894+ is called automatically at the end of successful hook execution.
895+ Otherwise, it should be called directly by user code.
896+
897+ To disable automatic saves, set ``implicit_save=False`` on this
898+ instance.
899+
900+ """
901+ if self._prev_dict:
902+ for k, v in six.iteritems(self._prev_dict):
903+ if k not in self:
904+ self[k] = v
905+ with open(self.path, 'w') as f:
906+ json.dump(self, f)
907+
908+
909+@cached
910+def config(scope=None):
911+ """Juju charm configuration"""
912+ config_cmd_line = ['config-get']
913+ if scope is not None:
914+ config_cmd_line.append(scope)
915+ config_cmd_line.append('--format=json')
916+ try:
917+ config_data = json.loads(
918+ subprocess.check_output(config_cmd_line).decode('UTF-8'))
919+ if scope is not None:
920+ return config_data
921+ return Config(config_data)
922+ except ValueError:
923+ return None
924+
925+
926+@cached
927+def relation_get(attribute=None, unit=None, rid=None):
928+ """Get relation information"""
929+ _args = ['relation-get', '--format=json']
930+ if rid:
931+ _args.append('-r')
932+ _args.append(rid)
933+ _args.append(attribute or '-')
934+ if unit:
935+ _args.append(unit)
936+ try:
937+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
938+ except ValueError:
939+ return None
940+ except CalledProcessError as e:
941+ if e.returncode == 2:
942+ return None
943+ raise
944+
945+
946+def relation_set(relation_id=None, relation_settings=None, **kwargs):
947+ """Set relation information for the current unit"""
948+ relation_settings = relation_settings if relation_settings else {}
949+ relation_cmd_line = ['relation-set']
950+ if relation_id is not None:
951+ relation_cmd_line.extend(('-r', relation_id))
952+ for k, v in (list(relation_settings.items()) + list(kwargs.items())):
953+ if v is None:
954+ relation_cmd_line.append('{}='.format(k))
955+ else:
956+ relation_cmd_line.append('{}={}'.format(k, v))
957+ subprocess.check_call(relation_cmd_line)
958+ # Flush cache of any relation-gets for local unit
959+ flush(local_unit())
960+
961+
962+@cached
963+def relation_ids(reltype=None):
964+ """A list of relation_ids"""
965+ reltype = reltype or relation_type()
966+ relid_cmd_line = ['relation-ids', '--format=json']
967+ if reltype is not None:
968+ relid_cmd_line.append(reltype)
969+ return json.loads(
970+ subprocess.check_output(relid_cmd_line).decode('UTF-8')) or []
971+ return []
972+
973+
974+@cached
975+def related_units(relid=None):
976+ """A list of related units"""
977+ relid = relid or relation_id()
978+ units_cmd_line = ['relation-list', '--format=json']
979+ if relid is not None:
980+ units_cmd_line.extend(('-r', relid))
981+ return json.loads(
982+ subprocess.check_output(units_cmd_line).decode('UTF-8')) or []
983+
984+
985+@cached
986+def relation_for_unit(unit=None, rid=None):
987+ """Get the json represenation of a unit's relation"""
988+ unit = unit or remote_unit()
989+ relation = relation_get(unit=unit, rid=rid)
990+ for key in relation:
991+ if key.endswith('-list'):
992+ relation[key] = relation[key].split()
993+ relation['__unit__'] = unit
994+ return relation
995+
996+
997+@cached
998+def relations_for_id(relid=None):
999+ """Get relations of a specific relation ID"""
1000+ relation_data = []
1001+ relid = relid or relation_ids()
1002+ for unit in related_units(relid):
1003+ unit_data = relation_for_unit(unit, relid)
1004+ unit_data['__relid__'] = relid
1005+ relation_data.append(unit_data)
1006+ return relation_data
1007+
1008+
1009+@cached
1010+def relations_of_type(reltype=None):
1011+ """Get relations of a specific type"""
1012+ relation_data = []
1013+ reltype = reltype or relation_type()
1014+ for relid in relation_ids(reltype):
1015+ for relation in relations_for_id(relid):
1016+ relation['__relid__'] = relid
1017+ relation_data.append(relation)
1018+ return relation_data
1019+
1020+
1021+@cached
1022+def metadata():
1023+ """Get the current charm metadata.yaml contents as a python object"""
1024+ with open(os.path.join(charm_dir(), 'metadata.yaml')) as md:
1025+ return yaml.safe_load(md)
1026+
1027+
1028+@cached
1029+def relation_types():
1030+ """Get a list of relation types supported by this charm"""
1031+ rel_types = []
1032+ md = metadata()
1033+ for key in ('provides', 'requires', 'peers'):
1034+ section = md.get(key)
1035+ if section:
1036+ rel_types.extend(section.keys())
1037+ return rel_types
1038+
1039+
1040+@cached
1041+def charm_name():
1042+ """Get the name of the current charm as is specified on metadata.yaml"""
1043+ return metadata().get('name')
1044+
1045+
1046+@cached
1047+def relations():
1048+ """Get a nested dictionary of relation data for all related units"""
1049+ rels = {}
1050+ for reltype in relation_types():
1051+ relids = {}
1052+ for relid in relation_ids(reltype):
1053+ units = {local_unit(): relation_get(unit=local_unit(), rid=relid)}
1054+ for unit in related_units(relid):
1055+ reldata = relation_get(unit=unit, rid=relid)
1056+ units[unit] = reldata
1057+ relids[relid] = units
1058+ rels[reltype] = relids
1059+ return rels
1060+
1061+
1062+@cached
1063+def is_relation_made(relation, keys='private-address'):
1064+ '''
1065+ Determine whether a relation is established by checking for
1066+ presence of key(s). If a list of keys is provided, they
1067+ must all be present for the relation to be identified as made
1068+ '''
1069+ if isinstance(keys, str):
1070+ keys = [keys]
1071+ for r_id in relation_ids(relation):
1072+ for unit in related_units(r_id):
1073+ context = {}
1074+ for k in keys:
1075+ context[k] = relation_get(k, rid=r_id,
1076+ unit=unit)
1077+ if None not in context.values():
1078+ return True
1079+ return False
1080+
1081+
1082+def open_port(port, protocol="TCP"):
1083+ """Open a service network port"""
1084+ _args = ['open-port']
1085+ _args.append('{}/{}'.format(port, protocol))
1086+ subprocess.check_call(_args)
1087+
1088+
1089+def close_port(port, protocol="TCP"):
1090+ """Close a service network port"""
1091+ _args = ['close-port']
1092+ _args.append('{}/{}'.format(port, protocol))
1093+ subprocess.check_call(_args)
1094+
1095+
1096+@cached
1097+def unit_get(attribute):
1098+ """Get the unit ID for the remote unit"""
1099+ _args = ['unit-get', '--format=json', attribute]
1100+ try:
1101+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
1102+ except ValueError:
1103+ return None
1104+
1105+
1106+def unit_private_ip():
1107+ """Get this unit's private IP address"""
1108+ return unit_get('private-address')
1109+
1110+
1111+class UnregisteredHookError(Exception):
1112+ """Raised when an undefined hook is called"""
1113+ pass
1114+
1115+
1116+class Hooks(object):
1117+ """A convenient handler for hook functions.
1118+
1119+ Example::
1120+
1121+ hooks = Hooks()
1122+
1123+ # register a hook, taking its name from the function name
1124+ @hooks.hook()
1125+ def install():
1126+ pass # your code here
1127+
1128+ # register a hook, providing a custom hook name
1129+ @hooks.hook("config-changed")
1130+ def config_changed():
1131+ pass # your code here
1132+
1133+ if __name__ == "__main__":
1134+ # execute a hook based on the name the program is called by
1135+ hooks.execute(sys.argv)
1136+ """
1137+
1138+ def __init__(self, config_save=True):
1139+ super(Hooks, self).__init__()
1140+ self._hooks = {}
1141+ self._config_save = config_save
1142+
1143+ def register(self, name, function):
1144+ """Register a hook"""
1145+ self._hooks[name] = function
1146+
1147+ def execute(self, args):
1148+ """Execute a registered hook based on args[0]"""
1149+ hook_name = os.path.basename(args[0])
1150+ if hook_name in self._hooks:
1151+ self._hooks[hook_name]()
1152+ if self._config_save:
1153+ cfg = config()
1154+ if cfg.implicit_save:
1155+ cfg.save()
1156+ else:
1157+ raise UnregisteredHookError(hook_name)
1158+
1159+ def hook(self, *hook_names):
1160+ """Decorator, registering them as hooks"""
1161+ def wrapper(decorated):
1162+ for hook_name in hook_names:
1163+ self.register(hook_name, decorated)
1164+ else:
1165+ self.register(decorated.__name__, decorated)
1166+ if '_' in decorated.__name__:
1167+ self.register(
1168+ decorated.__name__.replace('_', '-'), decorated)
1169+ return decorated
1170+ return wrapper
1171+
1172+
1173+def charm_dir():
1174+ """Return the root directory of the current charm"""
1175+ return os.environ.get('CHARM_DIR')
1176
1177=== added file 'hooks/charmhelpers/core/host.py'
1178--- hooks/charmhelpers/core/host.py 1970-01-01 00:00:00 +0000
1179+++ hooks/charmhelpers/core/host.py 2015-01-22 15:04:01 +0000
1180@@ -0,0 +1,419 @@
1181+"""Tools for working with the host system"""
1182+# Copyright 2012 Canonical Ltd.
1183+#
1184+# Authors:
1185+# Nick Moffitt <nick.moffitt@canonical.com>
1186+# Matthew Wedgwood <matthew.wedgwood@canonical.com>
1187+
1188+import os
1189+import re
1190+import pwd
1191+import grp
1192+import random
1193+import string
1194+import subprocess
1195+import hashlib
1196+from contextlib import contextmanager
1197+from collections import OrderedDict
1198+
1199+import six
1200+
1201+from .hookenv import log
1202+from .fstab import Fstab
1203+
1204+
1205+def service_start(service_name):
1206+ """Start a system service"""
1207+ return service('start', service_name)
1208+
1209+
1210+def service_stop(service_name):
1211+ """Stop a system service"""
1212+ return service('stop', service_name)
1213+
1214+
1215+def service_restart(service_name):
1216+ """Restart a system service"""
1217+ return service('restart', service_name)
1218+
1219+
1220+def service_reload(service_name, restart_on_failure=False):
1221+ """Reload a system service, optionally falling back to restart if
1222+ reload fails"""
1223+ service_result = service('reload', service_name)
1224+ if not service_result and restart_on_failure:
1225+ service_result = service('restart', service_name)
1226+ return service_result
1227+
1228+
1229+def service(action, service_name):
1230+ """Control a system service"""
1231+ cmd = ['service', service_name, action]
1232+ return subprocess.call(cmd) == 0
1233+
1234+
1235+def service_running(service):
1236+ """Determine whether a system service is running"""
1237+ try:
1238+ output = subprocess.check_output(
1239+ ['service', service, 'status'],
1240+ stderr=subprocess.STDOUT).decode('UTF-8')
1241+ except subprocess.CalledProcessError:
1242+ return False
1243+ else:
1244+ if ("start/running" in output or "is running" in output):
1245+ return True
1246+ else:
1247+ return False
1248+
1249+
1250+def service_available(service_name):
1251+ """Determine whether a system service is available"""
1252+ try:
1253+ subprocess.check_output(
1254+ ['service', service_name, 'status'],
1255+ stderr=subprocess.STDOUT).decode('UTF-8')
1256+ except subprocess.CalledProcessError as e:
1257+ return 'unrecognized service' not in e.output
1258+ else:
1259+ return True
1260+
1261+
1262+def adduser(username, password=None, shell='/bin/bash', system_user=False):
1263+ """Add a user to the system"""
1264+ try:
1265+ user_info = pwd.getpwnam(username)
1266+ log('user {0} already exists!'.format(username))
1267+ except KeyError:
1268+ log('creating user {0}'.format(username))
1269+ cmd = ['useradd']
1270+ if system_user or password is None:
1271+ cmd.append('--system')
1272+ else:
1273+ cmd.extend([
1274+ '--create-home',
1275+ '--shell', shell,
1276+ '--password', password,
1277+ ])
1278+ cmd.append(username)
1279+ subprocess.check_call(cmd)
1280+ user_info = pwd.getpwnam(username)
1281+ return user_info
1282+
1283+
1284+def add_group(group_name, system_group=False):
1285+ """Add a group to the system"""
1286+ try:
1287+ group_info = grp.getgrnam(group_name)
1288+ log('group {0} already exists!'.format(group_name))
1289+ except KeyError:
1290+ log('creating group {0}'.format(group_name))
1291+ cmd = ['addgroup']
1292+ if system_group:
1293+ cmd.append('--system')
1294+ else:
1295+ cmd.extend([
1296+ '--group',
1297+ ])
1298+ cmd.append(group_name)
1299+ subprocess.check_call(cmd)
1300+ group_info = grp.getgrnam(group_name)
1301+ return group_info
1302+
1303+
1304+def add_user_to_group(username, group):
1305+ """Add a user to a group"""
1306+ cmd = [
1307+ 'gpasswd', '-a',
1308+ username,
1309+ group
1310+ ]
1311+ log("Adding user {} to group {}".format(username, group))
1312+ subprocess.check_call(cmd)
1313+
1314+
1315+def rsync(from_path, to_path, flags='-r', options=None):
1316+ """Replicate the contents of a path"""
1317+ options = options or ['--delete', '--executability']
1318+ cmd = ['/usr/bin/rsync', flags]
1319+ cmd.extend(options)
1320+ cmd.append(from_path)
1321+ cmd.append(to_path)
1322+ log(" ".join(cmd))
1323+ return subprocess.check_output(cmd).decode('UTF-8').strip()
1324+
1325+
1326+def symlink(source, destination):
1327+ """Create a symbolic link"""
1328+ log("Symlinking {} as {}".format(source, destination))
1329+ cmd = [
1330+ 'ln',
1331+ '-sf',
1332+ source,
1333+ destination,
1334+ ]
1335+ subprocess.check_call(cmd)
1336+
1337+
1338+def mkdir(path, owner='root', group='root', perms=0o555, force=False):
1339+ """Create a directory"""
1340+ log("Making dir {} {}:{} {:o}".format(path, owner, group,
1341+ perms))
1342+ uid = pwd.getpwnam(owner).pw_uid
1343+ gid = grp.getgrnam(group).gr_gid
1344+ realpath = os.path.abspath(path)
1345+ path_exists = os.path.exists(realpath)
1346+ if path_exists and force:
1347+ if not os.path.isdir(realpath):
1348+ log("Removing non-directory file {} prior to mkdir()".format(path))
1349+ os.unlink(realpath)
1350+ os.makedirs(realpath, perms)
1351+ os.chown(realpath, uid, gid)
1352+ elif not path_exists:
1353+ os.makedirs(realpath, perms)
1354+ os.chown(realpath, uid, gid)
1355+
1356+
1357+def write_file(path, content, owner='root', group='root', perms=0o444):
1358+ """Create or overwrite a file with the contents of a string"""
1359+ log("Writing file {} {}:{} {:o}".format(path, owner, group, perms))
1360+ uid = pwd.getpwnam(owner).pw_uid
1361+ gid = grp.getgrnam(group).gr_gid
1362+ with open(path, 'w') as target:
1363+ os.fchown(target.fileno(), uid, gid)
1364+ os.fchmod(target.fileno(), perms)
1365+ target.write(content)
1366+
1367+
1368+def fstab_remove(mp):
1369+ """Remove the given mountpoint entry from /etc/fstab
1370+ """
1371+ return Fstab.remove_by_mountpoint(mp)
1372+
1373+
1374+def fstab_add(dev, mp, fs, options=None):
1375+ """Adds the given device entry to the /etc/fstab file
1376+ """
1377+ return Fstab.add(dev, mp, fs, options=options)
1378+
1379+
1380+def mount(device, mountpoint, options=None, persist=False, filesystem="ext3"):
1381+ """Mount a filesystem at a particular mountpoint"""
1382+ cmd_args = ['mount']
1383+ if options is not None:
1384+ cmd_args.extend(['-o', options])
1385+ cmd_args.extend([device, mountpoint])
1386+ try:
1387+ subprocess.check_output(cmd_args)
1388+ except subprocess.CalledProcessError as e:
1389+ log('Error mounting {} at {}\n{}'.format(device, mountpoint, e.output))
1390+ return False
1391+
1392+ if persist:
1393+ return fstab_add(device, mountpoint, filesystem, options=options)
1394+ return True
1395+
1396+
1397+def umount(mountpoint, persist=False):
1398+ """Unmount a filesystem"""
1399+ cmd_args = ['umount', mountpoint]
1400+ try:
1401+ subprocess.check_output(cmd_args)
1402+ except subprocess.CalledProcessError as e:
1403+ log('Error unmounting {}\n{}'.format(mountpoint, e.output))
1404+ return False
1405+
1406+ if persist:
1407+ return fstab_remove(mountpoint)
1408+ return True
1409+
1410+
1411+def mounts():
1412+ """Get a list of all mounted volumes as [[mountpoint,device],[...]]"""
1413+ with open('/proc/mounts') as f:
1414+ # [['/mount/point','/dev/path'],[...]]
1415+ system_mounts = [m[1::-1] for m in [l.strip().split()
1416+ for l in f.readlines()]]
1417+ return system_mounts
1418+
1419+
1420+def file_hash(path, hash_type='md5'):
1421+ """
1422+ Generate a hash checksum of the contents of 'path' or None if not found.
1423+
1424+ :param str hash_type: Any hash alrgorithm supported by :mod:`hashlib`,
1425+ such as md5, sha1, sha256, sha512, etc.
1426+ """
1427+ if os.path.exists(path):
1428+ h = getattr(hashlib, hash_type)()
1429+ with open(path, 'rb') as source:
1430+ h.update(source.read())
1431+ return h.hexdigest()
1432+ else:
1433+ return None
1434+
1435+
1436+def check_hash(path, checksum, hash_type='md5'):
1437+ """
1438+ Validate a file using a cryptographic checksum.
1439+
1440+ :param str checksum: Value of the checksum used to validate the file.
1441+ :param str hash_type: Hash algorithm used to generate `checksum`.
1442+ Can be any hash alrgorithm supported by :mod:`hashlib`,
1443+ such as md5, sha1, sha256, sha512, etc.
1444+ :raises ChecksumError: If the file fails the checksum
1445+
1446+ """
1447+ actual_checksum = file_hash(path, hash_type)
1448+ if checksum != actual_checksum:
1449+ raise ChecksumError("'%s' != '%s'" % (checksum, actual_checksum))
1450+
1451+
1452+class ChecksumError(ValueError):
1453+ pass
1454+
1455+
1456+def restart_on_change(restart_map, stopstart=False):
1457+ """Restart services based on configuration files changing
1458+
1459+ This function is used a decorator, for example::
1460+
1461+ @restart_on_change({
1462+ '/etc/ceph/ceph.conf': [ 'cinder-api', 'cinder-volume' ]
1463+ })
1464+ def ceph_client_changed():
1465+ pass # your code here
1466+
1467+ In this example, the cinder-api and cinder-volume services
1468+ would be restarted if /etc/ceph/ceph.conf is changed by the
1469+ ceph_client_changed function.
1470+ """
1471+ def wrap(f):
1472+ def wrapped_f(*args):
1473+ checksums = {}
1474+ for path in restart_map:
1475+ checksums[path] = file_hash(path)
1476+ f(*args)
1477+ restarts = []
1478+ for path in restart_map:
1479+ if checksums[path] != file_hash(path):
1480+ restarts += restart_map[path]
1481+ services_list = list(OrderedDict.fromkeys(restarts))
1482+ if not stopstart:
1483+ for service_name in services_list:
1484+ service('restart', service_name)
1485+ else:
1486+ for action in ['stop', 'start']:
1487+ for service_name in services_list:
1488+ service(action, service_name)
1489+ return wrapped_f
1490+ return wrap
1491+
1492+
1493+def lsb_release():
1494+ """Return /etc/lsb-release in a dict"""
1495+ d = {}
1496+ with open('/etc/lsb-release', 'r') as lsb:
1497+ for l in lsb:
1498+ k, v = l.split('=')
1499+ d[k.strip()] = v.strip()
1500+ return d
1501+
1502+
1503+def pwgen(length=None):
1504+ """Generate a random pasword."""
1505+ if length is None:
1506+ length = random.choice(range(35, 45))
1507+ alphanumeric_chars = [
1508+ l for l in (string.ascii_letters + string.digits)
1509+ if l not in 'l0QD1vAEIOUaeiou']
1510+ random_chars = [
1511+ random.choice(alphanumeric_chars) for _ in range(length)]
1512+ return(''.join(random_chars))
1513+
1514+
1515+def list_nics(nic_type):
1516+ '''Return a list of nics of given type(s)'''
1517+ if isinstance(nic_type, six.string_types):
1518+ int_types = [nic_type]
1519+ else:
1520+ int_types = nic_type
1521+ interfaces = []
1522+ for int_type in int_types:
1523+ cmd = ['ip', 'addr', 'show', 'label', int_type + '*']
1524+ ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1525+ ip_output = (line for line in ip_output if line)
1526+ for line in ip_output:
1527+ if line.split()[1].startswith(int_type):
1528+ matched = re.search('.*: (bond[0-9]+\.[0-9]+)@.*', line)
1529+ if matched:
1530+ interface = matched.groups()[0]
1531+ else:
1532+ interface = line.split()[1].replace(":", "")
1533+ interfaces.append(interface)
1534+
1535+ return interfaces
1536+
1537+
1538+def set_nic_mtu(nic, mtu):
1539+ '''Set MTU on a network interface'''
1540+ cmd = ['ip', 'link', 'set', nic, 'mtu', mtu]
1541+ subprocess.check_call(cmd)
1542+
1543+
1544+def get_nic_mtu(nic):
1545+ cmd = ['ip', 'addr', 'show', nic]
1546+ ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1547+ mtu = ""
1548+ for line in ip_output:
1549+ words = line.split()
1550+ if 'mtu' in words:
1551+ mtu = words[words.index("mtu") + 1]
1552+ return mtu
1553+
1554+
1555+def get_nic_hwaddr(nic):
1556+ cmd = ['ip', '-o', '-0', 'addr', 'show', nic]
1557+ ip_output = subprocess.check_output(cmd).decode('UTF-8')
1558+ hwaddr = ""
1559+ words = ip_output.split()
1560+ if 'link/ether' in words:
1561+ hwaddr = words[words.index('link/ether') + 1]
1562+ return hwaddr
1563+
1564+
1565+def cmp_pkgrevno(package, revno, pkgcache=None):
1566+ '''Compare supplied revno with the revno of the installed package
1567+
1568+ * 1 => Installed revno is greater than supplied arg
1569+ * 0 => Installed revno is the same as supplied arg
1570+ * -1 => Installed revno is less than supplied arg
1571+
1572+ '''
1573+ import apt_pkg
1574+ if not pkgcache:
1575+ from charmhelpers.fetch import apt_cache
1576+ pkgcache = apt_cache()
1577+ pkg = pkgcache[package]
1578+ return apt_pkg.version_compare(pkg.current_ver.ver_str, revno)
1579+
1580+
1581+@contextmanager
1582+def chdir(d):
1583+ cur = os.getcwd()
1584+ try:
1585+ yield os.chdir(d)
1586+ finally:
1587+ os.chdir(cur)
1588+
1589+
1590+def chownr(path, owner, group):
1591+ uid = pwd.getpwnam(owner).pw_uid
1592+ gid = grp.getgrnam(group).gr_gid
1593+
1594+ for root, dirs, files in os.walk(path):
1595+ for name in dirs + files:
1596+ full = os.path.join(root, name)
1597+ broken_symlink = os.path.lexists(full) and not os.path.exists(full)
1598+ if not broken_symlink:
1599+ os.chown(full, uid, gid)
1600
1601=== added directory 'hooks/charmhelpers/core/services'
1602=== added file 'hooks/charmhelpers/core/services/__init__.py'
1603--- hooks/charmhelpers/core/services/__init__.py 1970-01-01 00:00:00 +0000
1604+++ hooks/charmhelpers/core/services/__init__.py 2015-01-22 15:04:01 +0000
1605@@ -0,0 +1,2 @@
1606+from .base import * # NOQA
1607+from .helpers import * # NOQA
1608
1609=== added file 'hooks/charmhelpers/core/services/base.py'
1610--- hooks/charmhelpers/core/services/base.py 1970-01-01 00:00:00 +0000
1611+++ hooks/charmhelpers/core/services/base.py 2015-01-22 15:04:01 +0000
1612@@ -0,0 +1,313 @@
1613+import os
1614+import re
1615+import json
1616+from collections import Iterable
1617+
1618+from charmhelpers.core import host
1619+from charmhelpers.core import hookenv
1620+
1621+
1622+__all__ = ['ServiceManager', 'ManagerCallback',
1623+ 'PortManagerCallback', 'open_ports', 'close_ports', 'manage_ports',
1624+ 'service_restart', 'service_stop']
1625+
1626+
1627+class ServiceManager(object):
1628+ def __init__(self, services=None):
1629+ """
1630+ Register a list of services, given their definitions.
1631+
1632+ Service definitions are dicts in the following formats (all keys except
1633+ 'service' are optional)::
1634+
1635+ {
1636+ "service": <service name>,
1637+ "required_data": <list of required data contexts>,
1638+ "provided_data": <list of provided data contexts>,
1639+ "data_ready": <one or more callbacks>,
1640+ "data_lost": <one or more callbacks>,
1641+ "start": <one or more callbacks>,
1642+ "stop": <one or more callbacks>,
1643+ "ports": <list of ports to manage>,
1644+ }
1645+
1646+ The 'required_data' list should contain dicts of required data (or
1647+ dependency managers that act like dicts and know how to collect the data).
1648+ Only when all items in the 'required_data' list are populated are the list
1649+ of 'data_ready' and 'start' callbacks executed. See `is_ready()` for more
1650+ information.
1651+
1652+ The 'provided_data' list should contain relation data providers, most likely
1653+ a subclass of :class:`charmhelpers.core.services.helpers.RelationContext`,
1654+ that will indicate a set of data to set on a given relation.
1655+
1656+ The 'data_ready' value should be either a single callback, or a list of
1657+ callbacks, to be called when all items in 'required_data' pass `is_ready()`.
1658+ Each callback will be called with the service name as the only parameter.
1659+ After all of the 'data_ready' callbacks are called, the 'start' callbacks
1660+ are fired.
1661+
1662+ The 'data_lost' value should be either a single callback, or a list of
1663+ callbacks, to be called when a 'required_data' item no longer passes
1664+ `is_ready()`. Each callback will be called with the service name as the
1665+ only parameter. After all of the 'data_lost' callbacks are called,
1666+ the 'stop' callbacks are fired.
1667+
1668+ The 'start' value should be either a single callback, or a list of
1669+ callbacks, to be called when starting the service, after the 'data_ready'
1670+ callbacks are complete. Each callback will be called with the service
1671+ name as the only parameter. This defaults to
1672+        `[services.service_restart, services.open_ports]`.
1673+
1674+ The 'stop' value should be either a single callback, or a list of
1675+ callbacks, to be called when stopping the service. If the service is
1676+ being stopped because it no longer has all of its 'required_data', this
1677+ will be called after all of the 'data_lost' callbacks are complete.
1678+ Each callback will be called with the service name as the only parameter.
1679+        This defaults to `[services.close_ports, services.service_stop]`.
1680+
1681+ The 'ports' value should be a list of ports to manage. The default
1682+ 'start' handler will open the ports after the service is started,
1683+ and the default 'stop' handler will close the ports prior to stopping
1684+ the service.
1685+
1686+
1687+ Examples:
1688+
1689+ The following registers an Upstart service called bingod that depends on
1690+ a mongodb relation and which runs a custom `db_migrate` function prior to
1691+ restarting the service, and a Runit service called spadesd::
1692+
1693+ manager = services.ServiceManager([
1694+ {
1695+ 'service': 'bingod',
1696+ 'ports': [80, 443],
1697+ 'required_data': [MongoRelation(), config(), {'my': 'data'}],
1698+ 'data_ready': [
1699+ services.template(source='bingod.conf'),
1700+ services.template(source='bingod.ini',
1701+ target='/etc/bingod.ini',
1702+ owner='bingo', perms=0400),
1703+ ],
1704+ },
1705+ {
1706+ 'service': 'spadesd',
1707+ 'data_ready': services.template(source='spadesd_run.j2',
1708+ target='/etc/sv/spadesd/run',
1709+ perms=0555),
1710+ 'start': runit_start,
1711+ 'stop': runit_stop,
1712+ },
1713+ ])
1714+ manager.manage()
1715+ """
1716+ self._ready_file = os.path.join(hookenv.charm_dir(), 'READY-SERVICES.json')
1717+ self._ready = None
1718+ self.services = {}
1719+ for service in services or []:
1720+ service_name = service['service']
1721+ self.services[service_name] = service
1722+
1723+ def manage(self):
1724+ """
1725+ Handle the current hook by doing The Right Thing with the registered services.
1726+ """
1727+ hook_name = hookenv.hook_name()
1728+ if hook_name == 'stop':
1729+ self.stop_services()
1730+ else:
1731+ self.provide_data()
1732+ self.reconfigure_services()
1733+ cfg = hookenv.config()
1734+ if cfg.implicit_save:
1735+ cfg.save()
1736+
1737+ def provide_data(self):
1738+ """
1739+ Set the relation data for each provider in the ``provided_data`` list.
1740+
1741+ A provider must have a `name` attribute, which indicates which relation
1742+ to set data on, and a `provide_data()` method, which returns a dict of
1743+ data to set.
1744+ """
1745+ hook_name = hookenv.hook_name()
1746+ for service in self.services.values():
1747+ for provider in service.get('provided_data', []):
1748+ if re.match(r'{}-relation-(joined|changed)'.format(provider.name), hook_name):
1749+ data = provider.provide_data()
1750+ _ready = provider._is_ready(data) if hasattr(provider, '_is_ready') else data
1751+ if _ready:
1752+ hookenv.relation_set(None, data)
1753+
1754+ def reconfigure_services(self, *service_names):
1755+ """
1756+ Update all files for one or more registered services, and,
1757+ if ready, optionally restart them.
1758+
1759+ If no service names are given, reconfigures all registered services.
1760+ """
1761+ for service_name in service_names or self.services.keys():
1762+ if self.is_ready(service_name):
1763+ self.fire_event('data_ready', service_name)
1764+ self.fire_event('start', service_name, default=[
1765+ service_restart,
1766+ manage_ports])
1767+ self.save_ready(service_name)
1768+ else:
1769+ if self.was_ready(service_name):
1770+ self.fire_event('data_lost', service_name)
1771+ self.fire_event('stop', service_name, default=[
1772+ manage_ports,
1773+ service_stop])
1774+ self.save_lost(service_name)
1775+
1776+ def stop_services(self, *service_names):
1777+ """
1778+ Stop one or more registered services, by name.
1779+
1780+ If no service names are given, stops all registered services.
1781+ """
1782+ for service_name in service_names or self.services.keys():
1783+ self.fire_event('stop', service_name, default=[
1784+ manage_ports,
1785+ service_stop])
1786+
1787+ def get_service(self, service_name):
1788+ """
1789+ Given the name of a registered service, return its service definition.
1790+ """
1791+ service = self.services.get(service_name)
1792+ if not service:
1793+ raise KeyError('Service not registered: %s' % service_name)
1794+ return service
1795+
1796+ def fire_event(self, event_name, service_name, default=None):
1797+ """
1798+ Fire a data_ready, data_lost, start, or stop event on a given service.
1799+ """
1800+ service = self.get_service(service_name)
1801+ callbacks = service.get(event_name, default)
1802+ if not callbacks:
1803+ return
1804+ if not isinstance(callbacks, Iterable):
1805+ callbacks = [callbacks]
1806+ for callback in callbacks:
1807+ if isinstance(callback, ManagerCallback):
1808+ callback(self, service_name, event_name)
1809+ else:
1810+ callback(service_name)
1811+
1812+ def is_ready(self, service_name):
1813+ """
1814+ Determine if a registered service is ready, by checking its 'required_data'.
1815+
1816+ A 'required_data' item can be any mapping type, and is considered ready
1817+ if `bool(item)` evaluates as True.
1818+ """
1819+ service = self.get_service(service_name)
1820+ reqs = service.get('required_data', [])
1821+ return all(bool(req) for req in reqs)
1822+
1823+ def _load_ready_file(self):
1824+ if self._ready is not None:
1825+ return
1826+ if os.path.exists(self._ready_file):
1827+ with open(self._ready_file) as fp:
1828+ self._ready = set(json.load(fp))
1829+ else:
1830+ self._ready = set()
1831+
1832+ def _save_ready_file(self):
1833+ if self._ready is None:
1834+ return
1835+ with open(self._ready_file, 'w') as fp:
1836+ json.dump(list(self._ready), fp)
1837+
1838+ def save_ready(self, service_name):
1839+ """
1840+ Save an indicator that the given service is now data_ready.
1841+ """
1842+ self._load_ready_file()
1843+ self._ready.add(service_name)
1844+ self._save_ready_file()
1845+
1846+ def save_lost(self, service_name):
1847+ """
1848+ Save an indicator that the given service is no longer data_ready.
1849+ """
1850+ self._load_ready_file()
1851+ self._ready.discard(service_name)
1852+ self._save_ready_file()
1853+
1854+ def was_ready(self, service_name):
1855+ """
1856+ Determine if the given service was previously data_ready.
1857+ """
1858+ self._load_ready_file()
1859+ return service_name in self._ready
1860+
1861+
1862+class ManagerCallback(object):
1863+ """
1864+ Special case of a callback that takes the `ServiceManager` instance
1865+ in addition to the service name.
1866+
1867+ Subclasses should implement `__call__` which should accept three parameters:
1868+
1869+ * `manager` The `ServiceManager` instance
1870+ * `service_name` The name of the service it's being triggered for
1871+ * `event_name` The name of the event that this callback is handling
1872+ """
1873+ def __call__(self, manager, service_name, event_name):
1874+ raise NotImplementedError()
1875+
1876+
1877+class PortManagerCallback(ManagerCallback):
1878+ """
1879+ Callback class that will open or close ports, for use as either
1880+ a start or stop action.
1881+ """
1882+ def __call__(self, manager, service_name, event_name):
1883+ service = manager.get_service(service_name)
1884+ new_ports = service.get('ports', [])
1885+ port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name))
1886+ if os.path.exists(port_file):
1887+ with open(port_file) as fp:
1888+ old_ports = fp.read().split(',')
1889+ for old_port in old_ports:
1890+ if bool(old_port):
1891+ old_port = int(old_port)
1892+ if old_port not in new_ports:
1893+ hookenv.close_port(old_port)
1894+ with open(port_file, 'w') as fp:
1895+ fp.write(','.join(str(port) for port in new_ports))
1896+ for port in new_ports:
1897+ if event_name == 'start':
1898+ hookenv.open_port(port)
1899+ elif event_name == 'stop':
1900+ hookenv.close_port(port)
1901+
1902+
1903+def service_stop(service_name):
1904+ """
1905+ Wrapper around host.service_stop to prevent spurious "unknown service"
1906+ messages in the logs.
1907+ """
1908+ if host.service_running(service_name):
1909+ host.service_stop(service_name)
1910+
1911+
1912+def service_restart(service_name):
1913+ """
1914+ Wrapper around host.service_restart to prevent spurious "unknown service"
1915+ messages in the logs.
1916+ """
1917+ if host.service_available(service_name):
1918+ if host.service_running(service_name):
1919+ host.service_restart(service_name)
1920+ else:
1921+ host.service_start(service_name)
1922+
1923+
1924+# Convenience aliases
1925+open_ports = close_ports = manage_ports = PortManagerCallback()
1926
1927=== added file 'hooks/charmhelpers/core/services/helpers.py'
1928--- hooks/charmhelpers/core/services/helpers.py 1970-01-01 00:00:00 +0000
1929+++ hooks/charmhelpers/core/services/helpers.py 2015-01-22 15:04:01 +0000
1930@@ -0,0 +1,243 @@
1931+import os
1932+import yaml
1933+from charmhelpers.core import hookenv
1934+from charmhelpers.core import templating
1935+
1936+from charmhelpers.core.services.base import ManagerCallback
1937+
1938+
1939+__all__ = ['RelationContext', 'TemplateCallback',
1940+ 'render_template', 'template']
1941+
1942+
1943+class RelationContext(dict):
1944+ """
1945+ Base class for a context generator that gets relation data from juju.
1946+
1947+ Subclasses must provide the attributes `name`, which is the name of the
1948+ interface of interest, `interface`, which is the type of the interface of
1949+ interest, and `required_keys`, which is the set of keys required for the
1950+ relation to be considered complete. The data for all interfaces matching
1951+    the `name` attribute that are complete will be used to populate the dictionary
1952+ values (see `get_data`, below).
1953+
1954+ The generated context will be namespaced under the relation :attr:`name`,
1955+ to prevent potential naming conflicts.
1956+
1957+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
1958+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
1959+ """
1960+ name = None
1961+ interface = None
1962+ required_keys = []
1963+
1964+ def __init__(self, name=None, additional_required_keys=None):
1965+ if name is not None:
1966+ self.name = name
1967+ if additional_required_keys is not None:
1968+ self.required_keys.extend(additional_required_keys)
1969+ self.get_data()
1970+
1971+ def __bool__(self):
1972+ """
1973+ Returns True if all of the required_keys are available.
1974+ """
1975+ return self.is_ready()
1976+
1977+ __nonzero__ = __bool__
1978+
1979+ def __repr__(self):
1980+ return super(RelationContext, self).__repr__()
1981+
1982+ def is_ready(self):
1983+ """
1984+ Returns True if all of the `required_keys` are available from any units.
1985+ """
1986+ ready = len(self.get(self.name, [])) > 0
1987+ if not ready:
1988+ hookenv.log('Incomplete relation: {}'.format(self.__class__.__name__), hookenv.DEBUG)
1989+ return ready
1990+
1991+ def _is_ready(self, unit_data):
1992+ """
1993+ Helper method that tests a set of relation data and returns True if
1994+ all of the `required_keys` are present.
1995+ """
1996+ return set(unit_data.keys()).issuperset(set(self.required_keys))
1997+
1998+ def get_data(self):
1999+ """
2000+ Retrieve the relation data for each unit involved in a relation and,
2001+ if complete, store it in a list under `self[self.name]`. This
2002+ is automatically called when the RelationContext is instantiated.
2003+
2004+        The units are sorted lexicographically first by the service ID, then by
2005+ the unit ID. Thus, if an interface has two other services, 'db:1'
2006+ and 'db:2', with 'db:1' having two units, 'wordpress/0' and 'wordpress/1',
2007+ and 'db:2' having one unit, 'mediawiki/0', all of which have a complete
2008+ set of data, the relation data for the units will be stored in the
2009+ order: 'wordpress/0', 'wordpress/1', 'mediawiki/0'.
2010+
2011+ If you only care about a single unit on the relation, you can just
2012+ access it as `{{ interface[0]['key'] }}`. However, if you can at all
2013+ support multiple units on a relation, you should iterate over the list,
2014+ like::
2015+
2016+ {% for unit in interface -%}
2017+ {{ unit['key'] }}{% if not loop.last %},{% endif %}
2018+ {%- endfor %}
2019+
2020+ Note that since all sets of relation data from all related services and
2021+ units are in a single list, if you need to know which service or unit a
2022+ set of data came from, you'll need to extend this class to preserve
2023+ that information.
2024+ """
2025+ if not hookenv.relation_ids(self.name):
2026+ return
2027+
2028+ ns = self.setdefault(self.name, [])
2029+ for rid in sorted(hookenv.relation_ids(self.name)):
2030+ for unit in sorted(hookenv.related_units(rid)):
2031+ reldata = hookenv.relation_get(rid=rid, unit=unit)
2032+ if self._is_ready(reldata):
2033+ ns.append(reldata)
2034+
2035+ def provide_data(self):
2036+ """
2037+ Return data to be relation_set for this interface.
2038+ """
2039+ return {}
2040+
2041+
2042+class MysqlRelation(RelationContext):
2043+ """
2044+ Relation context for the `mysql` interface.
2045+
2046+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
2047+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
2048+ """
2049+ name = 'db'
2050+ interface = 'mysql'
2051+ required_keys = ['host', 'user', 'password', 'database']
2052+
2053+
2054+class HttpRelation(RelationContext):
2055+ """
2056+ Relation context for the `http` interface.
2057+
2058+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
2059+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
2060+ """
2061+ name = 'website'
2062+ interface = 'http'
2063+ required_keys = ['host', 'port']
2064+
2065+ def provide_data(self):
2066+ return {
2067+ 'host': hookenv.unit_get('private-address'),
2068+ 'port': 80,
2069+ }
2070+
2071+
2072+class RequiredConfig(dict):
2073+ """
2074+ Data context that loads config options with one or more mandatory options.
2075+
2076+ Once the required options have been changed from their default values, all
2077+ config options will be available, namespaced under `config` to prevent
2078+ potential naming conflicts (for example, between a config option and a
2079+ relation property).
2080+
2081+ :param list *args: List of options that must be changed from their default values.
2082+ """
2083+
2084+ def __init__(self, *args):
2085+ self.required_options = args
2086+ self['config'] = hookenv.config()
2087+ with open(os.path.join(hookenv.charm_dir(), 'config.yaml')) as fp:
2088+ self.config = yaml.load(fp).get('options', {})
2089+
2090+ def __bool__(self):
2091+ for option in self.required_options:
2092+ if option not in self['config']:
2093+ return False
2094+ current_value = self['config'][option]
2095+ default_value = self.config[option].get('default')
2096+ if current_value == default_value:
2097+ return False
2098+ if current_value in (None, '') and default_value in (None, ''):
2099+ return False
2100+ return True
2101+
2102+ def __nonzero__(self):
2103+ return self.__bool__()
2104+
2105+
2106+class StoredContext(dict):
2107+ """
2108+ A data context that always returns the data that it was first created with.
2109+
2110+    This is useful for one-time generation of things like passwords that
2111+ will thereafter use the same value that was originally generated, instead
2112+ of generating a new value each time it is run.
2113+ """
2114+ def __init__(self, file_name, config_data):
2115+ """
2116+ If the file exists, populate `self` with the data from the file.
2117+ Otherwise, populate with the given data and persist it to the file.
2118+ """
2119+ if os.path.exists(file_name):
2120+ self.update(self.read_context(file_name))
2121+ else:
2122+ self.store_context(file_name, config_data)
2123+ self.update(config_data)
2124+
2125+ def store_context(self, file_name, config_data):
2126+ if not os.path.isabs(file_name):
2127+ file_name = os.path.join(hookenv.charm_dir(), file_name)
2128+ with open(file_name, 'w') as file_stream:
2129+ os.fchmod(file_stream.fileno(), 0o600)
2130+ yaml.dump(config_data, file_stream)
2131+
2132+ def read_context(self, file_name):
2133+ if not os.path.isabs(file_name):
2134+ file_name = os.path.join(hookenv.charm_dir(), file_name)
2135+ with open(file_name, 'r') as file_stream:
2136+ data = yaml.load(file_stream)
2137+ if not data:
2138+ raise OSError("%s is empty" % file_name)
2139+ return data
2140+
2141+
2142+class TemplateCallback(ManagerCallback):
2143+ """
2144+ Callback class that will render a Jinja2 template, for use as a ready
2145+ action.
2146+
2147+ :param str source: The template source file, relative to
2148+ `$CHARM_DIR/templates`
2149+
2150+ :param str target: The target to write the rendered template to
2151+ :param str owner: The owner of the rendered file
2152+ :param str group: The group of the rendered file
2153+ :param int perms: The permissions of the rendered file
2154+ """
2155+ def __init__(self, source, target,
2156+ owner='root', group='root', perms=0o444):
2157+ self.source = source
2158+ self.target = target
2159+ self.owner = owner
2160+ self.group = group
2161+ self.perms = perms
2162+
2163+ def __call__(self, manager, service_name, event_name):
2164+ service = manager.get_service(service_name)
2165+ context = {}
2166+ for ctx in service.get('required_data', []):
2167+ context.update(ctx)
2168+ templating.render(self.source, self.target, context,
2169+ self.owner, self.group, self.perms)
2170+
2171+
2172+# Convenience aliases for templates
2173+render_template = template = TemplateCallback
2174
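As a rough illustration of how the pieces above fit together (not part of this charm's code), a service definition could be wired up as follows; the service name, ports, template name and context are assumptions, and the call must run inside a Juju hook so that config and relation data are available::

    from charmhelpers.core import services
    from charmhelpers.core.services.helpers import MysqlRelation

    manager = services.ServiceManager([{
        'service': 'myapp',                  # hypothetical service
        'ports': [8080],
        'required_data': [MysqlRelation()],  # 'db' relation, mysql interface
        'data_ready': [
            # Render $CHARM_DIR/templates/myapp.conf once the relation is
            # complete, then the default 'start' handlers restart the service
            # and open the listed ports.
            services.template(source='myapp.conf',
                              target='/etc/myapp.conf',
                              owner='root', group='root', perms=0o644),
        ],
    }])
    manager.manage()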
2175=== added file 'hooks/charmhelpers/core/sysctl.py'
2176--- hooks/charmhelpers/core/sysctl.py 1970-01-01 00:00:00 +0000
2177+++ hooks/charmhelpers/core/sysctl.py 2015-01-22 15:04:01 +0000
2178@@ -0,0 +1,34 @@
2179+#!/usr/bin/env python
2180+# -*- coding: utf-8 -*-
2181+
2182+__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
2183+
2184+import yaml
2185+
2186+from subprocess import check_call
2187+
2188+from charmhelpers.core.hookenv import (
2189+ log,
2190+ DEBUG,
2191+)
2192+
2193+
2194+def create(sysctl_dict, sysctl_file):
2195+ """Creates a sysctl.conf file from a YAML associative array
2196+
2197+    :param sysctl_dict: a YAML string of sysctl options eg "{ 'kernel.max_pid': 1337 }"
2198+    :type sysctl_dict: str
2199+ :param sysctl_file: path to the sysctl file to be saved
2200+ :type sysctl_file: str or unicode
2201+ :returns: None
2202+ """
2203+ sysctl_dict = yaml.load(sysctl_dict)
2204+
2205+ with open(sysctl_file, "w") as fd:
2206+ for key, value in sysctl_dict.items():
2207+ fd.write("{}={}\n".format(key, value))
2208+
2209+ log("Updating sysctl_file: %s values: %s" % (sysctl_file, sysctl_dict),
2210+ level=DEBUG)
2211+
2212+ check_call(["sysctl", "-p", sysctl_file])
2213
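A short usage sketch for the helper above; note that, per the code, the first argument is a YAML string rather than a dict, and the call needs root privileges plus a hook environment for logging. The target path is an assumption::

    from charmhelpers.core.sysctl import create

    # Writes the key=value pairs to the file and applies them via sysctl -p.
    create("{ net.ipv4.ip_forward: 1, vm.swappiness: 10 }",
           "/etc/sysctl.d/50-example.conf")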
2214=== added file 'hooks/charmhelpers/core/templating.py'
2215--- hooks/charmhelpers/core/templating.py 1970-01-01 00:00:00 +0000
2216+++ hooks/charmhelpers/core/templating.py 2015-01-22 15:04:01 +0000
2217@@ -0,0 +1,52 @@
2218+import os
2219+
2220+from charmhelpers.core import host
2221+from charmhelpers.core import hookenv
2222+
2223+
2224+def render(source, target, context, owner='root', group='root',
2225+ perms=0o444, templates_dir=None):
2226+ """
2227+ Render a template.
2228+
2229+ The `source` path, if not absolute, is relative to the `templates_dir`.
2230+
2231+ The `target` path should be absolute.
2232+
2233+ The context should be a dict containing the values to be replaced in the
2234+ template.
2235+
2236+ The `owner`, `group`, and `perms` options will be passed to `write_file`.
2237+
2238+ If omitted, `templates_dir` defaults to the `templates` folder in the charm.
2239+
2240+ Note: Using this requires python-jinja2; if it is not installed, calling
2241+ this will attempt to use charmhelpers.fetch.apt_install to install it.
2242+ """
2243+ try:
2244+ from jinja2 import FileSystemLoader, Environment, exceptions
2245+ except ImportError:
2246+ try:
2247+ from charmhelpers.fetch import apt_install
2248+ except ImportError:
2249+ hookenv.log('Could not import jinja2, and could not import '
2250+ 'charmhelpers.fetch to install it',
2251+ level=hookenv.ERROR)
2252+ raise
2253+ apt_install('python-jinja2', fatal=True)
2254+ from jinja2 import FileSystemLoader, Environment, exceptions
2255+
2256+ if templates_dir is None:
2257+ templates_dir = os.path.join(hookenv.charm_dir(), 'templates')
2258+ loader = Environment(loader=FileSystemLoader(templates_dir))
2259+ try:
2260+ source = source
2261+ template = loader.get_template(source)
2262+ except exceptions.TemplateNotFound as e:
2263+ hookenv.log('Could not load template %s from %s.' %
2264+ (source, templates_dir),
2265+ level=hookenv.ERROR)
2266+ raise e
2267+ content = template.render(context)
2268+ host.mkdir(os.path.dirname(target), owner, group)
2269+ host.write_file(target, content, owner, group, perms)
2270
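For illustration only, a hedged example of render(); the template name and context keys are hypothetical and the template would live in $CHARM_DIR/templates::

    from charmhelpers.core.templating import render

    # Renders templates/example.conf with Jinja2 (apt-installed on demand)
    # and writes the result with the requested ownership and mode.
    render(source='example.conf',
           target='/etc/example.conf',
           context={'listen_port': 8080},
           owner='root', group='root', perms=0o644)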
2271=== added directory 'hooks/charmhelpers/fetch'
2272=== added file 'hooks/charmhelpers/fetch/__init__.py'
2273--- hooks/charmhelpers/fetch/__init__.py 1970-01-01 00:00:00 +0000
2274+++ hooks/charmhelpers/fetch/__init__.py 2015-01-22 15:04:01 +0000
2275@@ -0,0 +1,423 @@
2276+import importlib
2277+from tempfile import NamedTemporaryFile
2278+import time
2279+from yaml import safe_load
2280+from charmhelpers.core.host import (
2281+ lsb_release
2282+)
2283+import subprocess
2284+from charmhelpers.core.hookenv import (
2285+ config,
2286+ log,
2287+)
2288+import os
2289+
2290+import six
2291+if six.PY3:
2292+ from urllib.parse import urlparse, urlunparse
2293+else:
2294+ from urlparse import urlparse, urlunparse
2295+
2296+
2297+CLOUD_ARCHIVE = """# Ubuntu Cloud Archive
2298+deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
2299+"""
2300+PROPOSED_POCKET = """# Proposed
2301+deb http://archive.ubuntu.com/ubuntu {}-proposed main universe multiverse restricted
2302+"""
2303+CLOUD_ARCHIVE_POCKETS = {
2304+ # Folsom
2305+ 'folsom': 'precise-updates/folsom',
2306+ 'precise-folsom': 'precise-updates/folsom',
2307+ 'precise-folsom/updates': 'precise-updates/folsom',
2308+ 'precise-updates/folsom': 'precise-updates/folsom',
2309+ 'folsom/proposed': 'precise-proposed/folsom',
2310+ 'precise-folsom/proposed': 'precise-proposed/folsom',
2311+ 'precise-proposed/folsom': 'precise-proposed/folsom',
2312+ # Grizzly
2313+ 'grizzly': 'precise-updates/grizzly',
2314+ 'precise-grizzly': 'precise-updates/grizzly',
2315+ 'precise-grizzly/updates': 'precise-updates/grizzly',
2316+ 'precise-updates/grizzly': 'precise-updates/grizzly',
2317+ 'grizzly/proposed': 'precise-proposed/grizzly',
2318+ 'precise-grizzly/proposed': 'precise-proposed/grizzly',
2319+ 'precise-proposed/grizzly': 'precise-proposed/grizzly',
2320+ # Havana
2321+ 'havana': 'precise-updates/havana',
2322+ 'precise-havana': 'precise-updates/havana',
2323+ 'precise-havana/updates': 'precise-updates/havana',
2324+ 'precise-updates/havana': 'precise-updates/havana',
2325+ 'havana/proposed': 'precise-proposed/havana',
2326+ 'precise-havana/proposed': 'precise-proposed/havana',
2327+ 'precise-proposed/havana': 'precise-proposed/havana',
2328+ # Icehouse
2329+ 'icehouse': 'precise-updates/icehouse',
2330+ 'precise-icehouse': 'precise-updates/icehouse',
2331+ 'precise-icehouse/updates': 'precise-updates/icehouse',
2332+ 'precise-updates/icehouse': 'precise-updates/icehouse',
2333+ 'icehouse/proposed': 'precise-proposed/icehouse',
2334+ 'precise-icehouse/proposed': 'precise-proposed/icehouse',
2335+ 'precise-proposed/icehouse': 'precise-proposed/icehouse',
2336+ # Juno
2337+ 'juno': 'trusty-updates/juno',
2338+ 'trusty-juno': 'trusty-updates/juno',
2339+ 'trusty-juno/updates': 'trusty-updates/juno',
2340+ 'trusty-updates/juno': 'trusty-updates/juno',
2341+ 'juno/proposed': 'trusty-proposed/juno',
2342+ 'trusty-juno/proposed': 'trusty-proposed/juno',
2343+ 'trusty-proposed/juno': 'trusty-proposed/juno',
2344+ # Kilo
2345+ 'kilo': 'trusty-updates/kilo',
2346+ 'trusty-kilo': 'trusty-updates/kilo',
2347+ 'trusty-kilo/updates': 'trusty-updates/kilo',
2348+ 'trusty-updates/kilo': 'trusty-updates/kilo',
2349+ 'kilo/proposed': 'trusty-proposed/kilo',
2350+ 'trusty-kilo/proposed': 'trusty-proposed/kilo',
2351+ 'trusty-proposed/kilo': 'trusty-proposed/kilo',
2352+}
2353+
2354+# The order of this list is very important. Handlers should be listed from
2355+# least- to most-specific URL matching.
2356+FETCH_HANDLERS = (
2357+ 'charmhelpers.fetch.archiveurl.ArchiveUrlFetchHandler',
2358+ 'charmhelpers.fetch.bzrurl.BzrUrlFetchHandler',
2359+ 'charmhelpers.fetch.giturl.GitUrlFetchHandler',
2360+)
2361+
2362+APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT.
2363+APT_NO_LOCK_RETRY_DELAY = 10 # Wait 10 seconds between apt lock checks.
2364+APT_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times.
2365+
2366+
2367+class SourceConfigError(Exception):
2368+ pass
2369+
2370+
2371+class UnhandledSource(Exception):
2372+ pass
2373+
2374+
2375+class AptLockError(Exception):
2376+ pass
2377+
2378+
2379+class BaseFetchHandler(object):
2380+
2381+ """Base class for FetchHandler implementations in fetch plugins"""
2382+
2383+ def can_handle(self, source):
2384+ """Returns True if the source can be handled. Otherwise returns
2385+ a string explaining why it cannot"""
2386+ return "Wrong source type"
2387+
2388+ def install(self, source):
2389+ """Try to download and unpack the source. Return the path to the
2390+ unpacked files or raise UnhandledSource."""
2391+ raise UnhandledSource("Wrong source type {}".format(source))
2392+
2393+ def parse_url(self, url):
2394+ return urlparse(url)
2395+
2396+ def base_url(self, url):
2397+ """Return url without querystring or fragment"""
2398+ parts = list(self.parse_url(url))
2399+ parts[4:] = ['' for i in parts[4:]]
2400+ return urlunparse(parts)
2401+
2402+
2403+def filter_installed_packages(packages):
2404+ """Returns a list of packages that require installation"""
2405+ cache = apt_cache()
2406+ _pkgs = []
2407+ for package in packages:
2408+ try:
2409+ p = cache[package]
2410+ p.current_ver or _pkgs.append(package)
2411+ except KeyError:
2412+ log('Package {} has no installation candidate.'.format(package),
2413+ level='WARNING')
2414+ _pkgs.append(package)
2415+ return _pkgs
2416+
2417+
2418+def apt_cache(in_memory=True):
2419+ """Build and return an apt cache"""
2420+ import apt_pkg
2421+ apt_pkg.init()
2422+ if in_memory:
2423+ apt_pkg.config.set("Dir::Cache::pkgcache", "")
2424+ apt_pkg.config.set("Dir::Cache::srcpkgcache", "")
2425+ return apt_pkg.Cache()
2426+
2427+
2428+def apt_install(packages, options=None, fatal=False):
2429+ """Install one or more packages"""
2430+ if options is None:
2431+ options = ['--option=Dpkg::Options::=--force-confold']
2432+
2433+ cmd = ['apt-get', '--assume-yes']
2434+ cmd.extend(options)
2435+ cmd.append('install')
2436+ if isinstance(packages, six.string_types):
2437+ cmd.append(packages)
2438+ else:
2439+ cmd.extend(packages)
2440+ log("Installing {} with options: {}".format(packages,
2441+ options))
2442+ _run_apt_command(cmd, fatal)
2443+
2444+
2445+def apt_upgrade(options=None, fatal=False, dist=False):
2446+ """Upgrade all packages"""
2447+ if options is None:
2448+ options = ['--option=Dpkg::Options::=--force-confold']
2449+
2450+ cmd = ['apt-get', '--assume-yes']
2451+ cmd.extend(options)
2452+ if dist:
2453+ cmd.append('dist-upgrade')
2454+ else:
2455+ cmd.append('upgrade')
2456+ log("Upgrading with options: {}".format(options))
2457+ _run_apt_command(cmd, fatal)
2458+
2459+
2460+def apt_update(fatal=False):
2461+ """Update local apt cache"""
2462+ cmd = ['apt-get', 'update']
2463+ _run_apt_command(cmd, fatal)
2464+
2465+
2466+def apt_purge(packages, fatal=False):
2467+ """Purge one or more packages"""
2468+ cmd = ['apt-get', '--assume-yes', 'purge']
2469+ if isinstance(packages, six.string_types):
2470+ cmd.append(packages)
2471+ else:
2472+ cmd.extend(packages)
2473+ log("Purging {}".format(packages))
2474+ _run_apt_command(cmd, fatal)
2475+
2476+
2477+def apt_hold(packages, fatal=False):
2478+ """Hold one or more packages"""
2479+ cmd = ['apt-mark', 'hold']
2480+ if isinstance(packages, six.string_types):
2481+ cmd.append(packages)
2482+ else:
2483+ cmd.extend(packages)
2484+ log("Holding {}".format(packages))
2485+
2486+ if fatal:
2487+ subprocess.check_call(cmd)
2488+ else:
2489+ subprocess.call(cmd)
2490+
2491+
2492+def add_source(source, key=None):
2493+ """Add a package source to this system.
2494+
2495+ @param source: a URL or sources.list entry, as supported by
2496+ add-apt-repository(1). Examples::
2497+
2498+ ppa:charmers/example
2499+ deb https://stub:key@private.example.com/ubuntu trusty main
2500+
2501+ In addition:
2502+ 'proposed:' may be used to enable the standard 'proposed'
2503+ pocket for the release.
2504+ 'cloud:' may be used to activate official cloud archive pockets,
2505+ such as 'cloud:icehouse'
2506+ 'distro' may be used as a noop
2507+
2508+ @param key: A key to be added to the system's APT keyring and used
2509+ to verify the signatures on packages. Ideally, this should be an
2510+ ASCII format GPG public key including the block headers. A GPG key
2511+ id may also be used, but be aware that only insecure protocols are
2512+    available to retrieve the actual public key from a public keyserver,
2513+    placing your Juju environment at risk. PPA and cloud archive keys
2514+    are securely added automatically, so should not be provided.
2515+ """
2516+ if source is None:
2517+ log('Source is not present. Skipping')
2518+ return
2519+
2520+ if (source.startswith('ppa:') or
2521+ source.startswith('http') or
2522+ source.startswith('deb ') or
2523+ source.startswith('cloud-archive:')):
2524+ subprocess.check_call(['add-apt-repository', '--yes', source])
2525+ elif source.startswith('cloud:'):
2526+ apt_install(filter_installed_packages(['ubuntu-cloud-keyring']),
2527+ fatal=True)
2528+ pocket = source.split(':')[-1]
2529+ if pocket not in CLOUD_ARCHIVE_POCKETS:
2530+ raise SourceConfigError(
2531+ 'Unsupported cloud: source option %s' %
2532+ pocket)
2533+ actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
2534+ with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
2535+ apt.write(CLOUD_ARCHIVE.format(actual_pocket))
2536+ elif source == 'proposed':
2537+ release = lsb_release()['DISTRIB_CODENAME']
2538+ with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
2539+ apt.write(PROPOSED_POCKET.format(release))
2540+ elif source == 'distro':
2541+ pass
2542+ else:
2543+ log("Unknown source: {!r}".format(source))
2544+
2545+ if key:
2546+ if '-----BEGIN PGP PUBLIC KEY BLOCK-----' in key:
2547+ with NamedTemporaryFile('w+') as key_file:
2548+ key_file.write(key)
2549+ key_file.flush()
2550+ key_file.seek(0)
2551+ subprocess.check_call(['apt-key', 'add', '-'], stdin=key_file)
2552+ else:
2553+ # Note that hkp: is in no way a secure protocol. Using a
2554+ # GPG key id is pointless from a security POV unless you
2555+ # absolutely trust your network and DNS.
2556+ subprocess.check_call(['apt-key', 'adv', '--keyserver',
2557+ 'hkp://keyserver.ubuntu.com:80', '--recv',
2558+ key])
2559+
2560+
2561+def configure_sources(update=False,
2562+ sources_var='install_sources',
2563+ keys_var='install_keys'):
2564+ """
2565+ Configure multiple sources from charm configuration.
2566+
2567+ The lists are encoded as yaml fragments in the configuration.
2568+    The fragment needs to be included as a string. Sources and their
2569+ corresponding keys are of the types supported by add_source().
2570+
2571+ Example config:
2572+ install_sources: |
2573+ - "ppa:foo"
2574+ - "http://example.com/repo precise main"
2575+ install_keys: |
2576+ - null
2577+ - "a1b2c3d4"
2578+
2579+ Note that 'null' (a.k.a. None) should not be quoted.
2580+ """
2581+ sources = safe_load((config(sources_var) or '').strip()) or []
2582+ keys = safe_load((config(keys_var) or '').strip()) or None
2583+
2584+ if isinstance(sources, six.string_types):
2585+ sources = [sources]
2586+
2587+ if keys is None:
2588+ for source in sources:
2589+ add_source(source, None)
2590+ else:
2591+ if isinstance(keys, six.string_types):
2592+ keys = [keys]
2593+
2594+ if len(sources) != len(keys):
2595+ raise SourceConfigError(
2596+ 'Install sources and keys lists are different lengths')
2597+ for source, key in zip(sources, keys):
2598+ add_source(source, key)
2599+ if update:
2600+ apt_update(fatal=True)
2601+
2602+
2603+def install_remote(source, *args, **kwargs):
2604+ """
2605+ Install a file tree from a remote source
2606+
2607+ The specified source should be a url of the form:
2608+ scheme://[host]/path[#[option=value][&...]]
2609+
2610+    Schemes supported are based on this module's submodules.
2611+ Options supported are submodule-specific.
2612+ Additional arguments are passed through to the submodule.
2613+
2614+ For example::
2615+
2616+ dest = install_remote('http://example.com/archive.tgz',
2617+ checksum='deadbeef',
2618+ hash_type='sha1')
2619+
2620+ This will download `archive.tgz`, validate it using SHA1 and, if
2621+ the file is ok, extract it and return the directory in which it
2622+ was extracted. If the checksum fails, it will raise
2623+ :class:`charmhelpers.core.host.ChecksumError`.
2624+ """
2625+ # We ONLY check for True here because can_handle may return a string
2626+ # explaining why it can't handle a given source.
2627+ handlers = [h for h in plugins() if h.can_handle(source) is True]
2628+ installed_to = None
2629+ for handler in handlers:
2630+ try:
2631+ installed_to = handler.install(source, *args, **kwargs)
2632+ except UnhandledSource:
2633+ pass
2634+ if not installed_to:
2635+ raise UnhandledSource("No handler found for source {}".format(source))
2636+ return installed_to
2637+
2638+
2639+def install_from_config(config_var_name):
2640+ charm_config = config()
2641+ source = charm_config[config_var_name]
2642+ return install_remote(source)
2643+
2644+
2645+def plugins(fetch_handlers=None):
2646+ if not fetch_handlers:
2647+ fetch_handlers = FETCH_HANDLERS
2648+ plugin_list = []
2649+ for handler_name in fetch_handlers:
2650+ package, classname = handler_name.rsplit('.', 1)
2651+ try:
2652+ handler_class = getattr(
2653+ importlib.import_module(package),
2654+ classname)
2655+ plugin_list.append(handler_class())
2656+ except (ImportError, AttributeError):
2657+            # Skip missing plugins so that they can be omitted from
2658+ # installation if desired
2659+ log("FetchHandler {} not found, skipping plugin".format(
2660+ handler_name))
2661+ return plugin_list
2662+
2663+
2664+def _run_apt_command(cmd, fatal=False):
2665+ """
2666+ Run an APT command, checking output and retrying if the fatal flag is set
2667+ to True.
2668+
2669+    :param cmd: str: The apt command to run.
2670+    :param fatal: bool: Whether the command's output should be checked and
2671+ retried.
2672+ """
2673+ env = os.environ.copy()
2674+
2675+ if 'DEBIAN_FRONTEND' not in env:
2676+ env['DEBIAN_FRONTEND'] = 'noninteractive'
2677+
2678+ if fatal:
2679+ retry_count = 0
2680+ result = None
2681+
2682+ # If the command is considered "fatal", we need to retry if the apt
2683+ # lock was not acquired.
2684+
2685+ while result is None or result == APT_NO_LOCK:
2686+ try:
2687+ result = subprocess.check_call(cmd, env=env)
2688+ except subprocess.CalledProcessError as e:
2689+ retry_count = retry_count + 1
2690+ if retry_count > APT_NO_LOCK_RETRY_COUNT:
2691+ raise
2692+ result = e.returncode
2693+ log("Couldn't acquire DPKG lock. Will retry in {} seconds."
2694+ "".format(APT_NO_LOCK_RETRY_DELAY))
2695+ time.sleep(APT_NO_LOCK_RETRY_DELAY)
2696+
2697+ else:
2698+ subprocess.call(cmd, env=env)
2699
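A minimal sketch of a typical install-hook flow using the helpers above, run as root on an Ubuntu unit; the archive line mirrors the one previously hard-coded in hooks/install, and the signing key is omitted here (add_source() allows that, but a production charm would normally pass one)::

    from charmhelpers.fetch import (
        add_source,
        apt_update,
        apt_install,
        filter_installed_packages,
    )

    # Register the upstream Jenkins archive, refresh the index, then install
    # only the packages that are not already present.
    add_source('deb http://pkg.jenkins-ci.org/debian-stable binary/')
    apt_update(fatal=True)
    apt_install(filter_installed_packages(['jenkins', 'default-jre-headless']),
                fatal=True)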
2700=== added file 'hooks/charmhelpers/fetch/archiveurl.py'
2701--- hooks/charmhelpers/fetch/archiveurl.py 1970-01-01 00:00:00 +0000
2702+++ hooks/charmhelpers/fetch/archiveurl.py 2015-01-22 15:04:01 +0000
2703@@ -0,0 +1,145 @@
2704+import os
2705+import hashlib
2706+import re
2707+
2708+import six
2709+if six.PY3:
2710+ from urllib.request import (
2711+ build_opener, install_opener, urlopen, urlretrieve,
2712+ HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
2713+ )
2714+ from urllib.parse import urlparse, urlunparse, parse_qs
2715+ from urllib.error import URLError
2716+else:
2717+ from urllib import urlretrieve
2718+ from urllib2 import (
2719+ build_opener, install_opener, urlopen,
2720+ HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
2721+ URLError
2722+ )
2723+ from urlparse import urlparse, urlunparse, parse_qs
2724+
2725+from charmhelpers.fetch import (
2726+ BaseFetchHandler,
2727+ UnhandledSource
2728+)
2729+from charmhelpers.payload.archive import (
2730+ get_archive_handler,
2731+ extract,
2732+)
2733+from charmhelpers.core.host import mkdir, check_hash
2734+
2735+
2736+def splituser(host):
2737+ '''urllib.splituser(), but six's support of this seems broken'''
2738+ _userprog = re.compile('^(.*)@(.*)$')
2739+ match = _userprog.match(host)
2740+ if match:
2741+ return match.group(1, 2)
2742+ return None, host
2743+
2744+
2745+def splitpasswd(user):
2746+ '''urllib.splitpasswd(), but six's support of this is missing'''
2747+ _passwdprog = re.compile('^([^:]*):(.*)$', re.S)
2748+ match = _passwdprog.match(user)
2749+ if match:
2750+ return match.group(1, 2)
2751+ return user, None
2752+
2753+
2754+class ArchiveUrlFetchHandler(BaseFetchHandler):
2755+ """
2756+ Handler to download archive files from arbitrary URLs.
2757+
2758+ Can fetch from http, https, ftp, and file URLs.
2759+
2760+ Can install either tarballs (.tar, .tgz, .tbz2, etc) or zip files.
2761+
2762+ Installs the contents of the archive in $CHARM_DIR/fetched/.
2763+ """
2764+ def can_handle(self, source):
2765+ url_parts = self.parse_url(source)
2766+ if url_parts.scheme not in ('http', 'https', 'ftp', 'file'):
2767+ return "Wrong source type"
2768+ if get_archive_handler(self.base_url(source)):
2769+ return True
2770+ return False
2771+
2772+ def download(self, source, dest):
2773+ """
2774+ Download an archive file.
2775+
2776+ :param str source: URL pointing to an archive file.
2777+ :param str dest: Local path location to download archive file to.
2778+ """
2779+        # propagate all exceptions
2780+ # URLError, OSError, etc
2781+ proto, netloc, path, params, query, fragment = urlparse(source)
2782+ if proto in ('http', 'https'):
2783+ auth, barehost = splituser(netloc)
2784+ if auth is not None:
2785+ source = urlunparse((proto, barehost, path, params, query, fragment))
2786+ username, password = splitpasswd(auth)
2787+ passman = HTTPPasswordMgrWithDefaultRealm()
2788+ # Realm is set to None in add_password to force the username and password
2789+ # to be used whatever the realm
2790+ passman.add_password(None, source, username, password)
2791+ authhandler = HTTPBasicAuthHandler(passman)
2792+ opener = build_opener(authhandler)
2793+ install_opener(opener)
2794+ response = urlopen(source)
2795+ try:
2796+ with open(dest, 'w') as dest_file:
2797+ dest_file.write(response.read())
2798+ except Exception as e:
2799+ if os.path.isfile(dest):
2800+ os.unlink(dest)
2801+ raise e
2802+
2803+ # Mandatory file validation via Sha1 or MD5 hashing.
2804+ def download_and_validate(self, url, hashsum, validate="sha1"):
2805+ tempfile, headers = urlretrieve(url)
2806+ check_hash(tempfile, hashsum, validate)
2807+ return tempfile
2808+
2809+ def install(self, source, dest=None, checksum=None, hash_type='sha1'):
2810+ """
2811+ Download and install an archive file, with optional checksum validation.
2812+
2813+ The checksum can also be given on the `source` URL's fragment.
2814+ For example::
2815+
2816+ handler.install('http://example.com/file.tgz#sha1=deadbeef')
2817+
2818+ :param str source: URL pointing to an archive file.
2819+ :param str dest: Local destination path to install to. If not given,
2820+ installs to `$CHARM_DIR/archives/archive_file_name`.
2821+ :param str checksum: If given, validate the archive file after download.
2822+ :param str hash_type: Algorithm used to generate `checksum`.
2823+            Can be any hash algorithm supported by :mod:`hashlib`,
2824+ such as md5, sha1, sha256, sha512, etc.
2825+
2826+ """
2827+ url_parts = self.parse_url(source)
2828+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), 'fetched')
2829+ if not os.path.exists(dest_dir):
2830+ mkdir(dest_dir, perms=0o755)
2831+ dld_file = os.path.join(dest_dir, os.path.basename(url_parts.path))
2832+ try:
2833+ self.download(source, dld_file)
2834+ except URLError as e:
2835+ raise UnhandledSource(e.reason)
2836+ except OSError as e:
2837+ raise UnhandledSource(e.strerror)
2838+ options = parse_qs(url_parts.fragment)
2839+ for key, value in options.items():
2840+ if not six.PY3:
2841+ algorithms = hashlib.algorithms
2842+ else:
2843+ algorithms = hashlib.algorithms_available
2844+ if key in algorithms:
2845+ check_hash(dld_file, value, key)
2846+ if checksum:
2847+ check_hash(dld_file, checksum, hash_type)
2848+ return extract(dld_file, dest)
2849
2850=== added file 'hooks/charmhelpers/fetch/bzrurl.py'
2851--- hooks/charmhelpers/fetch/bzrurl.py 1970-01-01 00:00:00 +0000
2852+++ hooks/charmhelpers/fetch/bzrurl.py 2015-01-22 15:04:01 +0000
2853@@ -0,0 +1,54 @@
2854+import os
2855+from charmhelpers.fetch import (
2856+ BaseFetchHandler,
2857+ UnhandledSource
2858+)
2859+from charmhelpers.core.host import mkdir
2860+
2861+import six
2862+if six.PY3:
2863+ raise ImportError('bzrlib does not support Python3')
2864+
2865+try:
2866+ from bzrlib.branch import Branch
2867+except ImportError:
2868+ from charmhelpers.fetch import apt_install
2869+ apt_install("python-bzrlib")
2870+ from bzrlib.branch import Branch
2871+
2872+
2873+class BzrUrlFetchHandler(BaseFetchHandler):
2874+ """Handler for bazaar branches via generic and lp URLs"""
2875+ def can_handle(self, source):
2876+ url_parts = self.parse_url(source)
2877+ if url_parts.scheme not in ('bzr+ssh', 'lp'):
2878+ return False
2879+ else:
2880+ return True
2881+
2882+ def branch(self, source, dest):
2883+ url_parts = self.parse_url(source)
2884+ # If we use lp:branchname scheme we need to load plugins
2885+ if not self.can_handle(source):
2886+ raise UnhandledSource("Cannot handle {}".format(source))
2887+ if url_parts.scheme == "lp":
2888+ from bzrlib.plugin import load_plugins
2889+ load_plugins()
2890+ try:
2891+ remote_branch = Branch.open(source)
2892+ remote_branch.bzrdir.sprout(dest).open_branch()
2893+ except Exception as e:
2894+ raise e
2895+
2896+ def install(self, source):
2897+ url_parts = self.parse_url(source)
2898+ branch_name = url_parts.path.strip("/").split("/")[-1]
2899+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched",
2900+ branch_name)
2901+ if not os.path.exists(dest_dir):
2902+ mkdir(dest_dir, perms=0o755)
2903+ try:
2904+ self.branch(source, dest_dir)
2905+ except OSError as e:
2906+ raise UnhandledSource(e.strerror)
2907+ return dest_dir
2908
2909=== added file 'hooks/charmhelpers/fetch/giturl.py'
2910--- hooks/charmhelpers/fetch/giturl.py 1970-01-01 00:00:00 +0000
2911+++ hooks/charmhelpers/fetch/giturl.py 2015-01-22 15:04:01 +0000
2912@@ -0,0 +1,51 @@
2913+import os
2914+from charmhelpers.fetch import (
2915+ BaseFetchHandler,
2916+ UnhandledSource
2917+)
2918+from charmhelpers.core.host import mkdir
2919+
2920+import six
2921+if six.PY3:
2922+ raise ImportError('GitPython does not support Python 3')
2923+
2924+try:
2925+ from git import Repo
2926+except ImportError:
2927+ from charmhelpers.fetch import apt_install
2928+ apt_install("python-git")
2929+ from git import Repo
2930+
2931+
2932+class GitUrlFetchHandler(BaseFetchHandler):
2933+ """Handler for git branches via generic and github URLs"""
2934+ def can_handle(self, source):
2935+ url_parts = self.parse_url(source)
2936+ # TODO (mattyw) no support for ssh git@ yet
2937+ if url_parts.scheme not in ('http', 'https', 'git'):
2938+ return False
2939+ else:
2940+ return True
2941+
2942+ def clone(self, source, dest, branch):
2943+ if not self.can_handle(source):
2944+ raise UnhandledSource("Cannot handle {}".format(source))
2945+
2946+ repo = Repo.clone_from(source, dest)
2947+ repo.git.checkout(branch)
2948+
2949+ def install(self, source, branch="master", dest=None):
2950+ url_parts = self.parse_url(source)
2951+ branch_name = url_parts.path.strip("/").split("/")[-1]
2952+ if dest:
2953+ dest_dir = os.path.join(dest, branch_name)
2954+ else:
2955+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched",
2956+ branch_name)
2957+ if not os.path.exists(dest_dir):
2958+ mkdir(dest_dir, perms=0o755)
2959+ try:
2960+ self.clone(source, dest_dir, branch)
2961+ except OSError as e:
2962+ raise UnhandledSource(e.strerror)
2963+ return dest_dir
2964
2965=== added directory 'hooks/charmhelpers/payload'
2966=== added file 'hooks/charmhelpers/payload/__init__.py'
2967--- hooks/charmhelpers/payload/__init__.py 1970-01-01 00:00:00 +0000
2968+++ hooks/charmhelpers/payload/__init__.py 2015-01-22 15:04:01 +0000
2969@@ -0,0 +1,1 @@
2970+"Tools for working with files injected into a charm just before deployment."
2971
2972=== added file 'hooks/charmhelpers/payload/execd.py'
2973--- hooks/charmhelpers/payload/execd.py 1970-01-01 00:00:00 +0000
2974+++ hooks/charmhelpers/payload/execd.py 2015-01-22 15:04:01 +0000
2975@@ -0,0 +1,50 @@
2976+#!/usr/bin/env python
2977+
2978+import os
2979+import sys
2980+import subprocess
2981+from charmhelpers.core import hookenv
2982+
2983+
2984+def default_execd_dir():
2985+ return os.path.join(os.environ['CHARM_DIR'], 'exec.d')
2986+
2987+
2988+def execd_module_paths(execd_dir=None):
2989+ """Generate a list of full paths to modules within execd_dir."""
2990+ if not execd_dir:
2991+ execd_dir = default_execd_dir()
2992+
2993+ if not os.path.exists(execd_dir):
2994+ return
2995+
2996+ for subpath in os.listdir(execd_dir):
2997+ module = os.path.join(execd_dir, subpath)
2998+ if os.path.isdir(module):
2999+ yield module
3000+
3001+
3002+def execd_submodule_paths(command, execd_dir=None):
3003+    """Generate a list of full paths to the specified command within execd_dir.
3004+ """
3005+ for module_path in execd_module_paths(execd_dir):
3006+ path = os.path.join(module_path, command)
3007+ if os.access(path, os.X_OK) and os.path.isfile(path):
3008+ yield path
3009+
3010+
3011+def execd_run(command, execd_dir=None, die_on_error=False, stderr=None):
3012+ """Run command for each module within execd_dir which defines it."""
3013+ for submodule_path in execd_submodule_paths(command, execd_dir):
3014+ try:
3015+ subprocess.check_call(submodule_path, shell=True, stderr=stderr)
3016+ except subprocess.CalledProcessError as e:
3017+ hookenv.log("Error ({}) running {}. Output: {}".format(
3018+ e.returncode, e.cmd, e.output))
3019+ if die_on_error:
3020+ sys.exit(e.returncode)
3021+
3022+
3023+def execd_preinstall(execd_dir=None):
3024+ """Run charm-pre-install for each module within execd_dir."""
3025+ execd_run('charm-pre-install', execd_dir=execd_dir)
3026
3027=== modified file 'hooks/config-changed'
3028--- hooks/config-changed 2011-09-22 14:46:56 +0000
3029+++ hooks/config-changed 1970-01-01 00:00:00 +0000
3030@@ -1,7 +0,0 @@
3031-#!/bin/sh
3032-set -e
3033-
3034-home=`dirname $0`
3035-
3036-juju-log "Reconfiguring charm by installing hook again."
3037-exec $home/install
3038
3039=== target is u'jenkins_hooks.py'
3040=== removed file 'hooks/delnode'
3041--- hooks/delnode 2011-09-22 14:46:56 +0000
3042+++ hooks/delnode 1970-01-01 00:00:00 +0000
3043@@ -1,16 +0,0 @@
3044-#!/usr/bin/python
3045-
3046-import jenkins
3047-import sys
3048-
3049-host=sys.argv[1]
3050-username=sys.argv[2]
3051-password=sys.argv[3]
3052-
3053-l_jenkins = jenkins.Jenkins("http://localhost:8080/",username,password)
3054-
3055-if l_jenkins.node_exists(host):
3056- print "Node exists"
3057- l_jenkins.delete_node(host)
3058-else:
3059- print "Node does not exist - not deleting"
3060
3061=== modified file 'hooks/install'
3062--- hooks/install 2014-04-17 12:35:18 +0000
3063+++ hooks/install 1970-01-01 00:00:00 +0000
3064@@ -1,151 +0,0 @@
3065-#!/bin/bash
3066-
3067-set -eu
3068-
3069-RELEASE=$(config-get release)
3070-ADMIN_USERNAME=$(config-get username)
3071-ADMIN_PASSWORD=$(config-get password)
3072-PLUGINS=$(config-get plugins)
3073-PLUGINS_SITE=$(config-get plugins-site)
3074-PLUGINS_CHECK_CERT=$(config-get plugins-check-certificate)
3075-REMOVE_UNLISTED_PLUGINS=$(config-get remove-unlisted-plugins)
3076-CWD=$(dirname $0)
3077-JENKINS_HOME=/var/lib/jenkins
3078-
3079-setup_source () {
3080- # Do something with < Oneiric releases - maybe PPA
3081- # apt-get -y install python-software-properties
3082- # add-apt-repository ppa:hudson-ubuntu/testing
3083- juju-log "Configuring source of jenkins as $RELEASE"
3084- # Configure to use upstream archives
3085- # lts - debian-stable
3086- # trunk - debian
3087- case $RELEASE in
3088- lts)
3089- SOURCE="debian-stable";;
3090- trunk)
3091- SOURCE="debian";;
3092- *)
3093- juju-log "release configuration not recognised" && exit 1;;
3094- esac
3095- # Setup archive to use appropriate jenkins upstream
3096- wget -q -O - http://pkg.jenkins-ci.org/$SOURCE/jenkins-ci.org.key | apt-key add -
3097- echo "deb http://pkg.jenkins-ci.org/$SOURCE binary/" \
3098- > /etc/apt/sources.list.d/jenkins.list
3099- apt-get update || true
3100-}
3101-# Only setup the source if jenkins is not already installed
3102-# this makes the config 'release' immutable - i.e. you
3103-# can change source once deployed
3104-[[ -d /var/lib/jenkins ]] || setup_source
3105-
3106-# Install jenkins
3107-install_jenkins () {
3108- juju-log "Installing/upgrading jenkins..."
3109- apt-get -y install -qq jenkins default-jre-headless
3110-}
3111-# Re-run whenever called to pickup any updates
3112-install_jenkins
3113-
3114-configure_jenkins_user () {
3115- juju-log "Configuring user for jenkins..."
3116- # Check to see if password provided
3117- if [ -z "$ADMIN_PASSWORD" ]
3118- then
3119- # Generate a random one for security
3120- # User can then override using juju set
3121- ADMIN_PASSWORD=$(< /dev/urandom tr -dc A-Za-z | head -c16)
3122- echo $ADMIN_PASSWORD > $JENKINS_HOME/.admin_password
3123- chmod 0600 $JENKINS_HOME/.admin_password
3124- fi
3125- # Generate Salt and Hash Password for Jenkins
3126- SALT="$(< /dev/urandom tr -dc A-Za-z | head -c6)"
3127- PASSWORD="$SALT:$(echo -n "$ADMIN_PASSWORD{$SALT}" | shasum -a 256 | awk '{ print $1 }')"
3128- mkdir -p $JENKINS_HOME/users/$ADMIN_USERNAME
3129- sed -e s#__USERNAME__#$ADMIN_USERNAME# -e s#__PASSWORD__#$PASSWORD# \
3130- $CWD/../templates/user-config.xml > $JENKINS_HOME/users/$ADMIN_USERNAME/config.xml
3131- chown -R jenkins:nogroup $JENKINS_HOME/users
3132-}
3133-# Always run - even if config has not changed, its safe
3134-configure_jenkins_user
3135-
3136-boostrap_jenkins_configuration (){
3137- juju-log "Bootstrapping secure initial configuration in Jenkins..."
3138- cp $CWD/../templates/jenkins-config.xml $JENKINS_HOME/config.xml
3139- chown jenkins:nogroup $JENKINS_HOME/config.xml
3140- touch /var/lib/jenkins/config.bootstrapped
3141-}
3142-# Only run on first invocation otherwise we blast
3143-# any configuration changes made
3144-[[ -f /var/lib/jenkins/config.bootstrapped ]] || boostrap_jenkins_configuration
3145-
3146-install_plugins(){
3147- juju-log "Installing plugins ($PLUGINS)"
3148- mkdir -p $JENKINS_HOME/plugins
3149- chmod a+rx $JENKINS_HOME/plugins
3150- chown jenkins:nogroup $JENKINS_HOME/plugins
3151- track_dir=`mktemp -d /tmp/plugins.installed.XXXXXXXX`
3152- installed_plugins=`find $JENKINS_HOME/plugins -name '*.hpi'`
3153- [ -z "$installed_plugins" ] || ln -s $installed_plugins $track_dir
3154- local plugin=""
3155- local plugin_file=""
3156- local opts=""
3157- pushd $JENKINS_HOME/plugins
3158- for plugin in $PLUGINS ; do
3159- plugin_file=$JENKINS_HOME/plugins/$plugin.hpi
3160- # Note that by default wget verifies certificates as of 1.10.
3161- if [ "$PLUGINS_CHECK_CERT" = "no" ] ; then
3162- opts="--no-check-certificate"
3163- fi
3164- wget $opts --timestamping $PLUGINS_SITE/latest/$plugin.hpi
3165- chmod a+r $plugin_file
3166- rm -f $track_dir/$plugin.hpi
3167- done
3168- popd
3169- # Warn about undesirable plugins, or remove them.
3170- unlisted_plugins=`ls $track_dir`
3171- [[ -n "$unlisted_plugins" ]] || return 0
3172- if [[ $REMOVE_UNLISTED_PLUGINS = "yes" ]] ; then
3173- for plugin_file in `ls $track_dir` ; do
3174- rm -vf $JENKINS_HOME/plugins/$plugin_file
3175- done
3176- else
3177- juju-log -l WARNING "Unlisted plugins: (`ls $track_dir`) Not removed. Set remove-unlisted-plugins to yes to clear them away."
3178- fi
3179-}
3180-
3181-install_plugins
3182-
3183-juju-log "Restarting jenkins to pickup configuration changes"
3184-service jenkins restart
3185-
3186-# Install helpers - python jenkins ++
3187-install_python_jenkins () {
3188- juju-log "Installing python-jenkins..."
3189- apt-get -y install -qq python-jenkins
3190-}
3191-# Only install once
3192-[[ -d /usr/share/pyshared/jenkins ]] || install_python_jenkins
3193-
3194-# Install some tools - can get set up deployment time
3195-install_tools () {
3196- juju-log "Installing tools..."
3197- apt-get -y install -qq `config-get tools`
3198-}
3199-# Always run - tools might get re-configured
3200-install_tools
3201-
3202-juju-log "Opening ports"
3203-open-port 8080
3204-
3205-# Execute any hook overlay which may be provided
3206-# by forks of this charm
3207-if [ -d hooks/install.d ]
3208-then
3209- for i in `ls -1 hooks/install.d/*`
3210- do
3211- [[ -x $i ]] && . ./$i
3212- done
3213-fi
3214-
3215-exit 0
3216
3217=== target is u'jenkins_hooks.py'
3218=== added file 'hooks/jenkins_hooks.py'
3219--- hooks/jenkins_hooks.py 1970-01-01 00:00:00 +0000
3220+++ hooks/jenkins_hooks.py 2015-01-22 15:04:01 +0000
3221@@ -0,0 +1,220 @@
3222+#!/usr/bin/python
3223+import grp
3224+import hashlib
3225+import os
3226+import pwd
3227+import shutil
3228+import subprocess
3229+import sys
3230+
3231+from charmhelpers.core.hookenv import (
3232+ Hooks,
3233+ UnregisteredHookError,
3234+ config,
3235+ remote_unit,
3236+ relation_get,
3237+ relation_set,
3238+ relation_ids,
3239+ unit_get,
3240+ open_port,
3241+ log,
3242+ DEBUG,
3243+ INFO,
3244+)
3245+from charmhelpers.fetch import apt_install
3246+from charmhelpers.core.host import (
3247+ service_start,
3248+ service_stop,
3249+)
3250+from charmhelpers.payload.execd import execd_preinstall
3251+from jenkins_utils import (
3252+ JENKINS_HOME,
3253+ JENKINS_USERS,
3254+ TEMPLATES_DIR,
3255+ add_node,
3256+ del_node,
3257+ setup_source,
3258+ install_jenkins_plugins,
3259+)
3260+
3261+hooks = Hooks()
3262+
3263+
3264+@hooks.hook('install')
3265+def install():
3266+ execd_preinstall('hooks/install.d')
3267+    # Only set up the source if jenkins is not already installed, i.e. this
3268+    # makes 'release' immutable so the source can't be changed once deployed
3269+ setup_source(config('release'))
3270+ config_changed()
3271+ open_port(8080)
3272+
3273+
3274+@hooks.hook('config-changed')
3275+def config_changed():
3276+ # Re-run whenever called to pickup any updates
3277+ log("Installing/upgrading jenkins.", level=DEBUG)
3278+ apt_install(['jenkins', 'default-jre-headless', 'pwgen'], fatal=True)
3279+
3280+    # Always run - even if config has not changed, it's safe
3281+ log("Configuring user for jenkins.", level=DEBUG)
3282+ # Check to see if password provided
3283+ admin_passwd = config('password')
3284+ if not admin_passwd:
3285+ # Generate a random one for security. User can then override using juju
3286+ # set.
3287+ admin_passwd = subprocess.check_output(['pwgen', '-N1', '15'])
3288+ admin_passwd = admin_passwd.strip()
3289+
3290+ passwd_file = os.path.join(JENKINS_HOME, '.admin_password')
3291+ with open(passwd_file, 'w+') as fd:
3292+ fd.write(admin_passwd)
3293+
3294+ os.chmod(passwd_file, 0600)
3295+
3296+ jenkins_uid = pwd.getpwnam('jenkins').pw_uid
3297+ jenkins_gid = grp.getgrnam('jenkins').gr_gid
3298+ nogroup_gid = grp.getgrnam('nogroup').gr_gid
3299+
3300+ # Generate Salt and Hash Password for Jenkins
3301+ salt = subprocess.check_output(['pwgen', '-N1', '6']).strip()
3302+ csum = hashlib.sha256("%s{%s}" % (admin_passwd, salt)).hexdigest()
3303+ salty_password = "%s:%s" % (salt, csum)
3304+
3305+ admin_username = config('username')
3306+ admin_user_home = os.path.join(JENKINS_USERS, admin_username)
3307+ if not os.path.isdir(admin_user_home):
3308+ os.makedirs(admin_user_home, 0o0700)
3309+ os.chown(JENKINS_USERS, jenkins_uid, nogroup_gid)
3310+ os.chown(admin_user_home, jenkins_uid, nogroup_gid)
3311+
3312+ # NOTE: overwriting will destroy any data added by jenkins or via the ui
3313+ admin_user_config = os.path.join(admin_user_home, 'config.xml')
3314+ with open(os.path.join(TEMPLATES_DIR, 'user-config.xml')) as src_fd:
3315+ with open(admin_user_config, 'w') as dst_fd:
3316+ lines = src_fd.readlines()
3317+ for line in lines:
3318+ kvs = {'__USERNAME__': admin_username,
3319+ '__PASSWORD__': salty_password}
3320+
3321+ for key, val in kvs.iteritems():
3322+ if key in line:
3323+ line = line.replace(key, val)
3324+
3325+ dst_fd.write(line)
3326+ os.chown(admin_user_config, jenkins_uid, nogroup_gid)
3327+
3328+ # Only run on first invocation otherwise we blast
3329+ # any configuration changes made
3330+ jenkins_bootstrap_flag = '/var/lib/jenkins/config.bootstrapped'
3331+ if not os.path.exists(jenkins_bootstrap_flag):
3332+ log("Bootstrapping secure initial configuration in Jenkins.",
3333+ level=DEBUG)
3334+ src = os.path.join(TEMPLATES_DIR, 'jenkins-config.xml')
3335+ dst = os.path.join(JENKINS_HOME, 'config.xml')
3336+ shutil.copy(src, dst)
3337+ os.chown(dst, jenkins_uid, nogroup_gid)
3338+ # Touch
3339+ with open(jenkins_bootstrap_flag, 'w'):
3340+ pass
3341+
3342+ log("Stopping jenkins for plugin update(s)", level=DEBUG)
3343+ service_stop('jenkins')
3344+ install_jenkins_plugins(jenkins_uid, jenkins_gid)
3345+ log("Starting jenkins to pickup configuration changes", level=DEBUG)
3346+ service_start('jenkins')
3347+
3348+ apt_install(['python-jenkins'], fatal=True)
3349+ tools = config('tools')
3350+ if tools:
3351+ log("Installing tools.", level=DEBUG)
3352+ apt_install(tools.split(), fatal=True)
3353+
3354+
3355+@hooks.hook('start')
3356+def start():
3357+ service_start('jenkins')
3358+
3359+
3360+@hooks.hook('stop')
3361+def stop():
3362+ service_stop('jenkins')
3363+
3364+
3365+@hooks.hook('upgrade-charm')
3366+def upgrade_charm():
3367+ log("Upgrading charm.", level=DEBUG)
3368+ config_changed()
3369+
3370+
3371+@hooks.hook('master-relation-joined')
3372+def master_relation_joined():
3373+ HOSTNAME = unit_get('private-address')
3374+ log("Setting url relation to http://%s:8080" % (HOSTNAME), level=DEBUG)
3375+ relation_set(url="http://%s:8080" % (HOSTNAME))
3376+
3377+
3378+@hooks.hook('master-relation-changed')
3379+def master_relation_changed():
3380+ PASSWORD = config('password')
3381+    if not PASSWORD:
3382+ with open('/var/lib/jenkins/.admin_password', 'r') as fd:
3383+ PASSWORD = fd.read()
3384+
3385+ required_settings = ['executors', 'labels', 'slavehost']
3386+ settings = relation_get()
3387+ missing = [s for s in required_settings if s not in settings]
3388+ if missing:
3389+ log("Not all required relation settings received yet (missing=%s) - "
3390+ "skipping" % (', '.join(missing)), level=INFO)
3391+ return
3392+
3393+ slavehost = settings['slavehost']
3394+ executors = settings['executors']
3395+ labels = settings['labels']
3396+
3397+    # Bail if the remote unit has not yet set its slave hostname
3398+    if not slavehost:
3399+ log("Slave host not yet defined - skipping", level=INFO)
3400+ return
3401+
3402+ log("Adding slave with hostname %s." % (slavehost), level=DEBUG)
3403+ add_node(slavehost, executors, labels, config('username'), PASSWORD)
3404+ log("Node slave %s added." % (slavehost), level=DEBUG)
3405+
3406+
3407+@hooks.hook('master-relation-departed')
3408+def master_relation_departed():
3409+    # Slave hostname is derived from the remote unit name (with '/' mapped
3410+    # to '-') so this is pretty safe
3411+    slavehost = remote_unit().replace('/', '-')
3412+ log("Deleting slave with hostname %s." % (slavehost), level=DEBUG)
3413+ del_node(slavehost, config('username'), config('password'))
3414+
3415+
3416+@hooks.hook('master-relation-broken')
3417+def master_relation_broken():
3418+ password = config('password')
3419+ if not password:
3420+ passwd_file = os.path.join(JENKINS_HOME, '.admin_password')
3421+        with open(passwd_file, 'r') as fd:
3422+            password = fd.read()
3423+
3424+ for member in relation_ids():
3425+ member = member.replace('/', '-')
3426+ log("Removing node %s from Jenkins master." % (member), level=DEBUG)
3427+        del_node(member, config('username'), password)
3428+
3429+
3430+@hooks.hook('website-relation-joined')
3431+def website_relation_joined():
3432+ hostname = unit_get('private-address')
3433+ log("Setting website URL to %s:8080" % (hostname), level=DEBUG)
3434+ relation_set(port=8080, hostname=hostname)
3435+
3436+
3437+if __name__ == '__main__':
3438+ try:
3439+ hooks.execute(sys.argv)
3440+ except UnregisteredHookError as e:
3441+ log('Unknown hook {} - skipping.'.format(e), level=INFO)
3442
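
As an aside on the bootstrap step above: config_changed() seeds the admin account by substituting __USERNAME__ and __PASSWORD__ into templates/user-config.xml, with the password stored in the "salt:sha256(password{salt})" form the hook computes. A minimal standalone sketch of that hashing step (placeholder password and salt values, not taken from the charm) is:

    import hashlib

    def jenkins_password_hash(password, salt):
        # Mirror the salty_password format written by config_changed():
        # "<salt>:sha256(<password>{<salt>})"
        payload = "%s{%s}" % (password, salt)
        digest = hashlib.sha256(payload.encode('utf-8')).hexdigest()
        return "%s:%s" % (salt, digest)

    # Example with placeholder values; the charm uses the configured (or
    # pwgen-generated) password and a 6-character pwgen salt.
    print(jenkins_password_hash('admin', 'abc123'))
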
3443=== added file 'hooks/jenkins_utils.py'
3444--- hooks/jenkins_utils.py 1970-01-01 00:00:00 +0000
3445+++ hooks/jenkins_utils.py 2015-01-22 15:04:01 +0000
3446@@ -0,0 +1,178 @@
3447+#!/usr/bin/python
3448+import glob
3449+import os
3450+import shutil
3451+import subprocess
3452+import tempfile
3453+
3454+from charmhelpers.core.hookenv import (
3455+ config,
3456+ log,
3457+ DEBUG,
3458+ INFO,
3459+ WARNING,
3460+)
3461+from charmhelpers.fetch import (
3462+ apt_update,
3463+ add_source,
3464+)
3465+
3466+from charmhelpers.core.decorators import (
3467+ retry_on_exception,
3468+)
3469+
3470+JENKINS_HOME = '/var/lib/jenkins'
3471+JENKINS_USERS = os.path.join(JENKINS_HOME, 'users')
3472+JENKINS_PLUGINS = os.path.join(JENKINS_HOME, 'plugins')
3473+TEMPLATES_DIR = 'templates'
3474+
3475+
3476+def add_node(host, executors, labels, username, password):
3477+ import jenkins
3478+
3479+ @retry_on_exception(2, 2, exc_type=jenkins.JenkinsException)
3480+ def _add_node(*args, **kwargs):
3481+ l_jenkins = jenkins.Jenkins("http://localhost:8080/", username,
3482+ password)
3483+
3484+ if l_jenkins.node_exists(host):
3485+ log("Node exists - not adding", level=DEBUG)
3486+ return
3487+
3488+ log("Adding node '%s' to Jenkins master" % (host), level=INFO)
3489+ l_jenkins.create_node(host, int(executors) * 2, host, labels=labels)
3490+
3491+ if not l_jenkins.node_exists(host):
3492+ log("Failed to create node '%s'" % (host), level=WARNING)
3493+
3494+ return _add_node()
3495+
3496+
3497+def del_node(host, username, password):
3498+ import jenkins
3499+
3500+ l_jenkins = jenkins.Jenkins("http://localhost:8080/", username, password)
3501+
3502+ if l_jenkins.node_exists(host):
3503+ log("Node '%s' exists" % (host), level=DEBUG)
3504+ l_jenkins.delete_node(host)
3505+ else:
3506+ log("Node '%s' does not exist - not deleting" % (host), level=INFO)
3507+
3508+
3509+def setup_source(release):
3510+    """Configure the upstream Jenkins apt archive for the given release."""
3511+ log("Configuring source of jenkins as %s" % release, level=INFO)
3512+
3513+ # Configure to use upstream archives
3514+ # lts - debian-stable
3515+ # trunk - debian
3516+ if release == 'lts':
3517+ source = "debian-stable"
3518+ elif release == 'trunk':
3519+ source = "debian"
3520+ else:
3521+ errmsg = "Release '%s' configuration not recognised" % (release)
3522+ raise Exception(errmsg)
3523+
3524+ # Setup archive to use appropriate jenkins upstream
3525+ key = 'http://pkg.jenkins-ci.org/%s/jenkins-ci.org.key' % source
3526+ target = "%s-%s" % (source, 'jenkins-ci.org.key')
3527+ subprocess.check_call(['wget', '-q', '-O', target, key])
3528+ with open(target, 'r') as fd:
3529+ key = fd.read()
3530+
3531+ deb = "deb http://pkg.jenkins-ci.org/%s binary/" % (source)
3532+ sources_file = "/etc/apt/sources.list.d/jenkins.list"
3533+
3534+ found = False
3535+ if os.path.exists(sources_file):
3536+ with open(sources_file, 'r') as fd:
3537+ for line in fd:
3538+ if deb in line:
3539+ found = True
3540+ break
3541+
3542+ if not found:
3543+ with open(sources_file, 'a') as fd:
3544+ fd.write("%s\n" % deb)
3545+ else:
3546+ with open(sources_file, 'w') as fd:
3547+ fd.write("%s\n" % deb)
3548+
3549+ if not found:
3550+        # NOTE: add_source() would add both deb and deb-src entries, but
3551+        # pkg.jenkins-ci.org has no deb-src, so only use it to import the key.
3552+ add_source("#dummy-source", key=key)
3553+
3554+ apt_update(fatal=True)
3555+
3556+
3557+def install_jenkins_plugins(jenkins_uid, jenkins_gid):
3558+ plugins = config('plugins')
3559+ if plugins:
3560+ plugins = plugins.split()
3561+ else:
3562+ plugins = []
3563+
3564+ log("Installing plugins (%s)" % (' '.join(plugins)), level=DEBUG)
3565+ if not os.path.isdir(JENKINS_PLUGINS):
3566+ os.makedirs(JENKINS_PLUGINS)
3567+
3568+ os.chmod(JENKINS_PLUGINS, 0o0755)
3569+ os.chown(JENKINS_PLUGINS, jenkins_uid, jenkins_gid)
3570+
3571+ track_dir = tempfile.mkdtemp(prefix='/tmp/plugins.installed')
3572+ try:
3573+ installed_plugins = glob.glob("%s/*.hpi" % (JENKINS_PLUGINS))
3574+ for plugin in installed_plugins:
3575+ # Create a ref of installed plugin
3576+ with open(os.path.join(track_dir, os.path.basename(plugin)),
3577+ 'w'):
3578+ pass
3579+
3580+ plugins_site = config('plugins-site')
3581+ log("Fetching plugins from %s" % (plugins_site), level=DEBUG)
3582+ # NOTE: by default wget verifies certificates as of 1.10.
3583+ if config('plugins-check-certificate') == "no":
3584+ opts = ["--no-check-certificate"]
3585+ else:
3586+ opts = []
3587+
3588+ for plugin in plugins:
3589+ plugin_filename = "%s.hpi" % (plugin)
3590+ url = os.path.join(plugins_site, 'latest', plugin_filename)
3591+ plugin_path = os.path.join(JENKINS_PLUGINS, plugin_filename)
3592+ if not os.path.isfile(plugin_path):
3593+ log("Installing plugin %s" % (plugin_filename), level=DEBUG)
3594+ cmd = ['wget'] + opts + ['--timestamping', url, '-O',
3595+ plugin_path]
3596+ subprocess.check_call(cmd)
3597+                os.chmod(plugin_path, 0o744)
3598+ os.chown(plugin_path, jenkins_uid, jenkins_gid)
3599+
3600+ else:
3601+ log("Plugin %s already installed" % (plugin_filename),
3602+ level=DEBUG)
3603+
3604+ ref = os.path.join(track_dir, plugin_filename)
3605+ if os.path.exists(ref):
3606+ # Delete ref since plugin is installed.
3607+ os.remove(ref)
3608+
3609+ installed_plugins = os.listdir(track_dir)
3610+ if installed_plugins:
3611+ if config('remove-unlisted-plugins') == "yes":
3612+ for plugin in installed_plugins:
3613+ path = os.path.join(JENKINS_HOME, 'plugins', plugin)
3614+ if os.path.isfile(path):
3615+ log("Deleting unlisted plugin '%s'" % (path),
3616+ level=INFO)
3617+ os.remove(path)
3618+ else:
3619+ log("Unlisted plugins: (%s) Not removed. Set "
3620+ "remove-unlisted-plugins to 'yes' to clear them away." %
3621+ ', '.join(installed_plugins), level=INFO)
3622+ finally:
3623+ # Delete install refs
3624+ shutil.rmtree(track_dir)
3625
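
For reference, add_node() and del_node() above are thin wrappers around the python-jenkins client. A minimal usage sketch of that client, with placeholder credentials and node name (not taken from the charm), might look like:

    import jenkins  # python-jenkins

    server = jenkins.Jenkins('http://localhost:8080/', 'admin', 'secret')
    if not server.node_exists('slave-0'):
        # name, executor count, description - the same positional arguments
        # add_node() passes through to create_node()
        server.create_node('slave-0', 2, 'slave-0', labels='trusty')
    if server.node_exists('slave-0'):
        server.delete_node('slave-0')
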
3626=== modified file 'hooks/master-relation-broken'
3627--- hooks/master-relation-broken 2012-07-31 10:32:36 +0000
3628+++ hooks/master-relation-broken 1970-01-01 00:00:00 +0000
3629@@ -1,17 +0,0 @@
3630-#!/bin/sh
3631-
3632-PASSWORD=`config-get password`
3633-if [ -z "$PASSWORD" ]
3634-then
3635- PASSWORD=`cat /var/lib/jenkins/.admin_password`
3636-fi
3637-
3638-MEMBERS=`relation-list`
3639-
3640-for MEMBER in $MEMBERS
3641-do
3642- juju-log "Removing node $MEMBER from Jenkins master..."
3643- $(dirname $0)/delnode `echo $MEMBER | sed s,/,-,` `config-get username` $PASSWORD
3644-done
3645-
3646-exit 0
3647
3648=== target is u'jenkins_hooks.py'
3649=== modified file 'hooks/master-relation-changed'
3650--- hooks/master-relation-changed 2012-07-31 10:32:36 +0000
3651+++ hooks/master-relation-changed 1970-01-01 00:00:00 +0000
3652@@ -1,24 +0,0 @@
3653-#!/bin/bash
3654-
3655-set -ue
3656-
3657-PASSWORD=`config-get password`
3658-if [ -z "$PASSWORD" ]
3659-then
3660- PASSWORD=`cat /var/lib/jenkins/.admin_password`
3661-fi
3662-
3663-# Grab information that remote unit has posted to relation
3664-slavehost=$(relation-get slavehost)
3665-executors=$(relation-get executors)
3666-labels=$(relation-get labels)
3667-
3668-# Double check to see if this has happened yet
3669-if [ "x$slavehost" = "x" ]; then
3670- juju-log "Slave host not yet defined, exiting..."
3671- exit 0
3672-fi
3673-
3674-juju-log "Adding slave with hostname $slavehost..."
3675-$(dirname $0)/addnode $slavehost $executors "$labels" `config-get username` $PASSWORD
3676-juju-log "Node slave $slavehost added..."
3677
3678=== target is u'jenkins_hooks.py'
3679=== modified file 'hooks/master-relation-departed'
3680--- hooks/master-relation-departed 2011-09-22 14:46:56 +0000
3681+++ hooks/master-relation-departed 1970-01-01 00:00:00 +0000
3682@@ -1,12 +0,0 @@
3683-#!/bin/bash
3684-
3685-set -ue
3686-
3687-# Slave hostname is derived from unit name so
3688-# this is pretty safe
3689-slavehost=`echo $JUJU_REMOTE_UNIT | sed s,/,-,`
3690-
3691-juju-log "Deleting slave with hostname $slavehost..."
3692-$(dirname $0)/delnode $slavehost `config-get username` `config-get password`
3693-
3694-exit 0
3695
3696=== target is u'jenkins_hooks.py'
3697=== modified file 'hooks/master-relation-joined'
3698--- hooks/master-relation-joined 2011-10-07 13:43:19 +0000
3699+++ hooks/master-relation-joined 1970-01-01 00:00:00 +0000
3700@@ -1,5 +0,0 @@
3701-#!/bin/sh
3702-
3703-HOSTNAME=`unit-get private-address`
3704-juju-log "Setting url relation to http://$HOSTNAME:8080"
3705-relation-set url="http://$HOSTNAME:8080"
3706
3707=== target is u'jenkins_hooks.py'
3708=== modified file 'hooks/start'
3709--- hooks/start 2011-09-22 14:46:56 +0000
3710+++ hooks/start 1970-01-01 00:00:00 +0000
3711@@ -1,3 +0,0 @@
3712-#!/bin/bash
3713-
3714-service jenkins start || true
3715
3716=== target is u'jenkins_hooks.py'
3717=== modified file 'hooks/stop'
3718--- hooks/stop 2011-09-22 14:46:56 +0000
3719+++ hooks/stop 1970-01-01 00:00:00 +0000
3720@@ -1,3 +0,0 @@
3721-#!/bin/bash
3722-
3723-service jenkins stop
3724
3725=== target is u'jenkins_hooks.py'
3726=== modified file 'hooks/upgrade-charm'
3727--- hooks/upgrade-charm 2011-09-22 14:46:56 +0000
3728+++ hooks/upgrade-charm 1970-01-01 00:00:00 +0000
3729@@ -1,7 +0,0 @@
3730-#!/bin/sh
3731-set -e
3732-
3733-home=`dirname $0`
3734-
3735-juju-log "Upgrading charm by running install hook again."
3736-exec $home/install
3737
3738=== target is u'jenkins_hooks.py'
3739=== modified file 'hooks/website-relation-joined'
3740--- hooks/website-relation-joined 2011-10-07 13:43:19 +0000
3741+++ hooks/website-relation-joined 1970-01-01 00:00:00 +0000
3742@@ -1,5 +0,0 @@
3743-#!/bin/sh
3744-
3745-HOSTNAME=`unit-get private-address`
3746-juju-log "Setting website URL to $HOSTNAME:8080"
3747-relation-set port=8080 hostname=$HOSTNAME
3748
3749=== target is u'jenkins_hooks.py'
3750=== added file 'tests/100-deploy-precise'
3751--- tests/100-deploy-precise 1970-01-01 00:00:00 +0000
3752+++ tests/100-deploy-precise 2015-01-22 15:04:01 +0000
3753@@ -0,0 +1,123 @@
3754+#!/usr/bin/env python
3755+
3756+import amulet
3757+import json
3758+import requests
3759+
3760+###
3761+# Quick Config
3762+###
3763+seconds = 900
3764+
3765+###
3766+# Deployment Setup
3767+###
3768+d = amulet.Deployment(series='precise')
3769+
3770+d.add('haproxy') # website-relation
3771+d.add('jenkins') # Subject matter
3772+d.add('jenkins-slave') # Job Runner
3773+
3774+
3775+d.relate('jenkins:website', 'haproxy:reverseproxy')
3776+d.relate('jenkins:master', 'jenkins-slave:slave')
3777+
3778+d.configure('jenkins', {'tools': 'git gcc make bzr vim-tiny',
3779+ 'release': 'lts',
3780+ 'username': 'amulet',
3781+ 'password': 'testautomation',
3782+ 'plugins': 'groovy',
3783+ 'plugins-check-certificate': 'no'})
3784+
3785+d.expose('jenkins')
3786+d.expose('haproxy')
3787+
3788+try:
3789+ d.setup(timeout=seconds)
3790+ d.sentry.wait()
3791+except amulet.helpers.TimeoutError:
3792+ amulet.raise_status(amulet.SKIP, msg="Environment wasn't stood up in time")
3793+except:
3794+ raise
3795+
3796+
3797+###
3798+# Define reconfiguration routine
3799+###
3800+
3801+
3802+###
3803+# Define sentries for quick access
3804+###
3805+jenkins = d.sentry.unit['jenkins/0']
3806+haproxy = d.sentry.unit['haproxy/0']
3807+slave = d.sentry.unit['jenkins-slave/0']
3808+
3809+
3810+###
3811+# Validate Jenkins configuration options exercised
3812+# Validate jenkins tool installation
3813+###
3814+def validate_tools():
3815+ output, code = jenkins.run('dpkg -l vim-tiny')
3816+ if not output:
3817+ amulet.raise_status(amulet.FAIL, msg="No tool installation found")
3818+
3819+
3820+def validate_release():
3821+ list_present = jenkins.file_stat('/etc/apt/sources.list.d/jenkins.list')
3822+ if not list_present:
3823+ amulet.raise_status(amulet.FAIL, msg="No sources.list update")
3824+ lc = jenkins.file_contents('/etc/apt/sources.list.d/jenkins.list')
3825+    if 'debian-stable' not in lc:
3826+ amulet.raise_status(amulet.FAIL, msg="LTS not found in sources.list")
3827+
3828+
3829+def validate_login():
3830+    # First off, validate that we have the jenkins user on the machine
3831+ output, code = jenkins.run('id -u jenkins')
3832+ if code:
3833+ amulet.raise_status(amulet.FAIL, msg="Jenkins system user not found")
3834+    # Validate that we have a running jenkins service to execute the test against
3835+ output, code = jenkins.run('service jenkins status')
3836+ if code:
3837+ amulet.raise_status(amulet.FAIL, msg="No Jenkins Service Running")
3838+ payload = {'j_username': 'amulet',
3839+ 'j_password': 'testautomation',
3840+ 'from': '/'}
3841+ jenkins_url = "http://%s:8080/j_acegi_security_check" % jenkins.info['public-address']
3842+ r = requests.post(jenkins_url, data=payload)
3843+    if r.status_code != 200:
3844+ amulet.raise_status(amulet.FAIL, msg="Failed to login")
3845+
3846+
3847+# TODO: Figure out how to test installation of a non-HTTPS plugin.
3848+# This is passed as a flag to pyjenkins, and I don't know of any non-HTTPS
3849+# plugin repositories. Pinned here for reference later.
3850+def validate_plugins():
3851+ ds = jenkins.directory_stat('/var/lib/jenkins/plugins/groovy')
3852+ if ds['size'] <= 0:
3853+ amulet.raise_status(amulet.FAIL, msg="Failed to locate plugin")
3854+
3855+
3856+def validate_website_relation():
3857+ jenkins_url = "http://%s/" % haproxy.info['public-address']
3858+ r = requests.get(jenkins_url)
3859+    if r.status_code != 200:
3860+ amulet.raise_status(amulet.FAIL,
3861+ msg="Failed to reach jenkins through proxy")
3862+
3863+
3864+def validate_slave_relation():
3865+ jenkins_url = "http://%s:8080/computer/api/json" % jenkins.info['public-address']
3866+ r = requests.get(jenkins_url)
3867+ data = json.loads(r.text)
3868+    if data['computer'][1]['displayName'] != "jenkins-slave-0":
3869+ amulet.raise_status(amulet.FAIL, msg="Failed to locate slave")
3870+
3871+validate_tools()
3872+validate_release()
3873+validate_login()
3874+validate_plugins()
3875+validate_website_relation()
3876+validate_slave_relation()
3877
3878=== renamed file 'tests/100-deploy' => 'tests/100-deploy-trusty'
3879--- tests/100-deploy 2014-03-05 19:18:19 +0000
3880+++ tests/100-deploy-trusty 2015-01-22 15:04:01 +0000
3881@@ -1,4 +1,4 @@
3882-#!/usr/bin/python3
3883+#!/usr/bin/env python
3884
3885 import amulet
3886 import json
3887@@ -12,11 +12,13 @@
3888 ###
3889 # Deployment Setup
3890 ###
3891-d = amulet.Deployment()
3892+d = amulet.Deployment(series='trusty')
3893
3894 d.add('haproxy') # website-relation
3895 d.add('jenkins') # Subject matter
3896-d.add('jenkins-slave') # Job Runner
3897+# TODO(hopem): we don't yet have a trusty version of jenkins-slave
3898+# so use the precise version for now.
3899+d.add('jenkins-slave', 'cs:precise/jenkins-slave') # Job Runner
3900
3901
3902 d.relate('jenkins:website', 'haproxy:reverseproxy')
3903
3904=== added file 'tests/README'
3905--- tests/README 1970-01-01 00:00:00 +0000
3906+++ tests/README 2015-01-22 15:04:01 +0000
3907@@ -0,0 +1,56 @@
3908+This directory provides Amulet tests that focus on verification of Jenkins
3909+deployments.
3910+
3911+In order to run tests, you'll need charm-tools installed (in addition to
3912+juju, of course):
3913+
3914+ sudo add-apt-repository ppa:juju/stable
3915+ sudo apt-get update
3916+ sudo apt-get install charm-tools
3917+
3918+If you use a web proxy server to access the web, you'll need to set the
3919+AMULET_HTTP_PROXY environment variable to the http URL of the proxy server.
3920+
3921+The following examples demonstrate different ways that tests can be executed.
3922+All examples are run from the charm's root directory.
3923+
3924+ * To run all tests (starting with 00-setup):
3925+
3926+ make test
3927+
3928+ * To run a specific test module (or modules):
3929+
3930+     juju test -v -p AMULET_HTTP_PROXY 100-deploy-trusty
3931+
3932+ * To run a specific test module (or modules), and keep the environment
3933+ deployed after a failure:
3934+
3935+     juju test --set-e -v -p AMULET_HTTP_PROXY 100-deploy-trusty
3936+
3937+ * To re-run a test module against an already deployed environment (one
3938+ that was deployed by a previous call to 'juju test --set-e'):
3939+
3940+     ./tests/100-deploy-trusty
3941+
3942+
3943+For debugging and test development purposes, all code should be idempotent.
3944+In other words, the code should have the ability to be re-run without changing
3945+the results beyond the initial run. This enables editing and re-running of a
3946+test module against an already deployed environment, as described above.
3947+
3948+
3949+Notes for additional test writing:
3950+
3951+ * Use DEBUG to turn on debug logging, use ERROR otherwise.
3952+ u = OpenStackAmuletUtils(ERROR)
3953+ u = OpenStackAmuletUtils(DEBUG)
3954+
3955+ * Preserving the deployed environment:
3956+      Even with 'juju test --set-e', amulet will tear down the juju environment
3957+      when all tests pass. A force_fail 'test' like the one below can be added
3958+      to a test module to simulate a failed test and keep the environment.
3959+
3960+ def test_zzzz_fake_fail(self):
3961+ '''Force a fake fail to keep juju environment after a successful test run'''
3962+ # Useful in test writing, when used with: juju test --set-e
3963+ amulet.raise_status(amulet.FAIL, msg='using fake fail to keep juju environment')
3964
3965=== added directory 'tests/charmhelpers'
3966=== added file 'tests/charmhelpers/__init__.py'
3967=== added directory 'tests/charmhelpers/contrib'
3968=== added file 'tests/charmhelpers/contrib/__init__.py'
3969=== added directory 'tests/charmhelpers/contrib/amulet'
3970=== added file 'tests/charmhelpers/contrib/amulet/__init__.py'
3971=== added file 'tests/charmhelpers/contrib/amulet/deployment.py'
3972--- tests/charmhelpers/contrib/amulet/deployment.py 1970-01-01 00:00:00 +0000
3973+++ tests/charmhelpers/contrib/amulet/deployment.py 2015-01-22 15:04:01 +0000
3974@@ -0,0 +1,77 @@
3975+import amulet
3976+import os
3977+import six
3978+
3979+
3980+class AmuletDeployment(object):
3981+ """Amulet deployment.
3982+
3983+ This class provides generic Amulet deployment and test runner
3984+ methods.
3985+ """
3986+
3987+ def __init__(self, series=None):
3988+ """Initialize the deployment environment."""
3989+ self.series = None
3990+
3991+ if series:
3992+ self.series = series
3993+ self.d = amulet.Deployment(series=self.series)
3994+ else:
3995+ self.d = amulet.Deployment()
3996+
3997+ def _add_services(self, this_service, other_services):
3998+ """Add services.
3999+
4000+ Add services to the deployment where this_service is the local charm
4001+ that we're testing and other_services are the other services that
4002+ are being used in the local amulet tests.
4003+ """
4004+ if this_service['name'] != os.path.basename(os.getcwd()):
4005+ s = this_service['name']
4006+ msg = "The charm's root directory name needs to be {}".format(s)
4007+ amulet.raise_status(amulet.FAIL, msg=msg)
4008+
4009+ if 'units' not in this_service:
4010+ this_service['units'] = 1
4011+
4012+ self.d.add(this_service['name'], units=this_service['units'])
4013+
4014+ for svc in other_services:
4015+ if 'location' in svc:
4016+ branch_location = svc['location']
4017+ elif self.series:
4018+                branch_location = 'cs:{}/{}'.format(self.series, svc['name'])
4019+ else:
4020+ branch_location = None
4021+
4022+ if 'units' not in svc:
4023+ svc['units'] = 1
4024+
4025+ self.d.add(svc['name'], charm=branch_location, units=svc['units'])
4026+
4027+ def _add_relations(self, relations):
4028+ """Add all of the relations for the services."""
4029+ for k, v in six.iteritems(relations):
4030+ self.d.relate(k, v)
4031+
4032+ def _configure_services(self, configs):
4033+ """Configure all of the services."""
4034+ for service, config in six.iteritems(configs):
4035+ self.d.configure(service, config)
4036+
4037+ def _deploy(self):
4038+ """Deploy environment and wait for all hooks to finish executing."""
4039+ try:
4040+ self.d.setup(timeout=900)
4041+ self.d.sentry.wait(timeout=900)
4042+ except amulet.helpers.TimeoutError:
4043+ amulet.raise_status(amulet.FAIL, msg="Deployment timed out")
4044+ except Exception:
4045+ raise
4046+
4047+ def run_tests(self):
4048+ """Run all of the methods that are prefixed with 'test_'."""
4049+ for test in dir(self):
4050+ if test.startswith('test_'):
4051+ getattr(self, test)()
4052
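
For context, a charm-specific amulet test would typically drive the AmuletDeployment helper above along the following lines; the class name, service list and config here are purely illustrative and not part of this branch:

    from charmhelpers.contrib.amulet.deployment import AmuletDeployment


    class JenkinsBasicDeployment(AmuletDeployment):
        """Hypothetical deployment wrapper for jenkins amulet tests."""

        def __init__(self, series='trusty'):
            super(JenkinsBasicDeployment, self).__init__(series)
            # _add_services() expects the charm's root directory to be named
            # after this_service, i.e. 'jenkins' in this case.
            self._add_services({'name': 'jenkins', 'units': 1},
                               [{'name': 'haproxy'}, {'name': 'jenkins-slave'}])
            self._add_relations({'jenkins:website': 'haproxy:reverseproxy',
                                 'jenkins:master': 'jenkins-slave:slave'})
            self._configure_services({'jenkins': {'release': 'lts'}})
            self._deploy()
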
4053=== added file 'tests/charmhelpers/contrib/amulet/utils.py'
4054--- tests/charmhelpers/contrib/amulet/utils.py 1970-01-01 00:00:00 +0000
4055+++ tests/charmhelpers/contrib/amulet/utils.py 2015-01-22 15:04:01 +0000
4056@@ -0,0 +1,178 @@
4057+import ConfigParser
4058+import io
4059+import logging
4060+import re
4061+import sys
4062+import time
4063+
4064+import six
4065+
4066+
4067+class AmuletUtils(object):
4068+ """Amulet utilities.
4069+
4070+ This class provides common utility functions that are used by Amulet
4071+ tests.
4072+ """
4073+
4074+ def __init__(self, log_level=logging.ERROR):
4075+ self.log = self.get_logger(level=log_level)
4076+
4077+ def get_logger(self, name="amulet-logger", level=logging.DEBUG):
4078+ """Get a logger object that will log to stdout."""
4079+ log = logging
4080+ logger = log.getLogger(name)
4081+ fmt = log.Formatter("%(asctime)s %(funcName)s "
4082+ "%(levelname)s: %(message)s")
4083+
4084+ handler = log.StreamHandler(stream=sys.stdout)
4085+ handler.setLevel(level)
4086+ handler.setFormatter(fmt)
4087+
4088+ logger.addHandler(handler)
4089+ logger.setLevel(level)
4090+
4091+ return logger
4092+
4093+ def valid_ip(self, ip):
4094+ if re.match(r"^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$", ip):
4095+ return True
4096+ else:
4097+ return False
4098+
4099+ def valid_url(self, url):
4100+ p = re.compile(
4101+ r'^(?:http|ftp)s?://'
4102+ r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}\.?)|' # noqa
4103+ r'localhost|'
4104+ r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})'
4105+ r'(?::\d+)?'
4106+ r'(?:/?|[/?]\S+)$',
4107+ re.IGNORECASE)
4108+ if p.match(url):
4109+ return True
4110+ else:
4111+ return False
4112+
4113+ def validate_services(self, commands):
4114+ """Validate services.
4115+
4116+ Verify the specified services are running on the corresponding
4117+ service units.
4118+ """
4119+ for k, v in six.iteritems(commands):
4120+ for cmd in v:
4121+ output, code = k.run(cmd)
4122+ if code != 0:
4123+ return "command `{}` returned {}".format(cmd, str(code))
4124+ return None
4125+
4126+ def _get_config(self, unit, filename):
4127+ """Get a ConfigParser object for parsing a unit's config file."""
4128+ file_contents = unit.file_contents(filename)
4129+ config = ConfigParser.ConfigParser()
4130+ config.readfp(io.StringIO(file_contents))
4131+ return config
4132+
4133+ def validate_config_data(self, sentry_unit, config_file, section,
4134+ expected):
4135+ """Validate config file data.
4136+
4137+ Verify that the specified section of the config file contains
4138+ the expected option key:value pairs.
4139+ """
4140+ config = self._get_config(sentry_unit, config_file)
4141+
4142+ if section != 'DEFAULT' and not config.has_section(section):
4143+ return "section [{}] does not exist".format(section)
4144+
4145+ for k in expected.keys():
4146+ if not config.has_option(section, k):
4147+ return "section [{}] is missing option {}".format(section, k)
4148+ if config.get(section, k) != expected[k]:
4149+ return "section [{}] {}:{} != expected {}:{}".format(
4150+ section, k, config.get(section, k), k, expected[k])
4151+ return None
4152+
4153+ def _validate_dict_data(self, expected, actual):
4154+ """Validate dictionary data.
4155+
4156+ Compare expected dictionary data vs actual dictionary data.
4157+ The values in the 'expected' dictionary can be strings, bools, ints,
4158+ longs, or can be a function that evaluate a variable and returns a
4159+ bool.
4160+ """
4161+ for k, v in six.iteritems(expected):
4162+ if k in actual:
4163+ if (isinstance(v, six.string_types) or
4164+ isinstance(v, bool) or
4165+ isinstance(v, six.integer_types)):
4166+ if v != actual[k]:
4167+ return "{}:{}".format(k, actual[k])
4168+ elif not v(actual[k]):
4169+ return "{}:{}".format(k, actual[k])
4170+ else:
4171+ return "key '{}' does not exist".format(k)
4172+ return None
4173+
4174+ def validate_relation_data(self, sentry_unit, relation, expected):
4175+ """Validate actual relation data based on expected relation data."""
4176+ actual = sentry_unit.relation(relation[0], relation[1])
4177+ self.log.debug('actual: {}'.format(repr(actual)))
4178+ return self._validate_dict_data(expected, actual)
4179+
4180+ def _validate_list_data(self, expected, actual):
4181+ """Compare expected list vs actual list data."""
4182+ for e in expected:
4183+ if e not in actual:
4184+ return "expected item {} not found in actual list".format(e)
4185+ return None
4186+
4187+ def not_null(self, string):
4188+ if string is not None:
4189+ return True
4190+ else:
4191+ return False
4192+
4193+ def _get_file_mtime(self, sentry_unit, filename):
4194+ """Get last modification time of file."""
4195+ return sentry_unit.file_stat(filename)['mtime']
4196+
4197+ def _get_dir_mtime(self, sentry_unit, directory):
4198+ """Get last modification time of directory."""
4199+ return sentry_unit.directory_stat(directory)['mtime']
4200+
4201+ def _get_proc_start_time(self, sentry_unit, service, pgrep_full=False):
4202+ """Get process' start time.
4203+
4204+ Determine start time of the process based on the last modification
4205+ time of the /proc/pid directory. If pgrep_full is True, the process
4206+ name is matched against the full command line.
4207+ """
4208+ if pgrep_full:
4209+ cmd = 'pgrep -o -f {}'.format(service)
4210+ else:
4211+ cmd = 'pgrep -o {}'.format(service)
4212+ proc_dir = '/proc/{}'.format(sentry_unit.run(cmd)[0].strip())
4213+ return self._get_dir_mtime(sentry_unit, proc_dir)
4214+
4215+ def service_restarted(self, sentry_unit, service, filename,
4216+ pgrep_full=False, sleep_time=20):
4217+ """Check if service was restarted.
4218+
4219+ Compare a service's start time vs a file's last modification time
4220+ (such as a config file for that service) to determine if the service
4221+ has been restarted.
4222+ """
4223+ time.sleep(sleep_time)
4224+ if (self._get_proc_start_time(sentry_unit, service, pgrep_full) >=
4225+ self._get_file_mtime(sentry_unit, filename)):
4226+ return True
4227+ else:
4228+ return False
4229+
4230+ def relation_error(self, name, data):
4231+ return 'unexpected relation data in {} - {}'.format(name, data)
4232+
4233+ def endpoint_error(self, name, data):
4234+ return 'unexpected endpoint data in {} - {}'.format(name, data)
4235
4236=== added directory 'tests/charmhelpers/contrib/openstack'
4237=== added file 'tests/charmhelpers/contrib/openstack/__init__.py'
4238=== added directory 'tests/charmhelpers/contrib/openstack/amulet'
4239=== added file 'tests/charmhelpers/contrib/openstack/amulet/__init__.py'
4240=== added file 'tests/charmhelpers/contrib/openstack/amulet/deployment.py'
4241--- tests/charmhelpers/contrib/openstack/amulet/deployment.py 1970-01-01 00:00:00 +0000
4242+++ tests/charmhelpers/contrib/openstack/amulet/deployment.py 2015-01-22 15:04:01 +0000
4243@@ -0,0 +1,92 @@
4244+import six
4245+from charmhelpers.contrib.amulet.deployment import (
4246+ AmuletDeployment
4247+)
4248+
4249+
4250+class OpenStackAmuletDeployment(AmuletDeployment):
4251+ """OpenStack amulet deployment.
4252+
4253+ This class inherits from AmuletDeployment and has additional support
4254+ that is specifically for use by OpenStack charms.
4255+ """
4256+
4257+ def __init__(self, series=None, openstack=None, source=None, stable=True):
4258+ """Initialize the deployment environment."""
4259+ super(OpenStackAmuletDeployment, self).__init__(series)
4260+ self.openstack = openstack
4261+ self.source = source
4262+ self.stable = stable
4263+ # Note(coreycb): this needs to be changed when new next branches come
4264+ # out.
4265+ self.current_next = "trusty"
4266+
4267+ def _determine_branch_locations(self, other_services):
4268+ """Determine the branch locations for the other services.
4269+
4270+ Determine if the local branch being tested is derived from its
4271+        stable or next (dev) branch, and based on this, use the corresponding
4272+ stable or next branches for the other_services."""
4273+ base_charms = ['mysql', 'mongodb', 'rabbitmq-server']
4274+
4275+ if self.stable:
4276+ for svc in other_services:
4277+ temp = 'lp:charms/{}'
4278+ svc['location'] = temp.format(svc['name'])
4279+ else:
4280+ for svc in other_services:
4281+ if svc['name'] in base_charms:
4282+ temp = 'lp:charms/{}'
4283+ svc['location'] = temp.format(svc['name'])
4284+ else:
4285+ temp = 'lp:~openstack-charmers/charms/{}/{}/next'
4286+ svc['location'] = temp.format(self.current_next,
4287+ svc['name'])
4288+ return other_services
4289+
4290+ def _add_services(self, this_service, other_services):
4291+ """Add services to the deployment and set openstack-origin/source."""
4292+ other_services = self._determine_branch_locations(other_services)
4293+
4294+ super(OpenStackAmuletDeployment, self)._add_services(this_service,
4295+ other_services)
4296+
4297+ services = other_services
4298+ services.append(this_service)
4299+ use_source = ['mysql', 'mongodb', 'rabbitmq-server', 'ceph',
4300+ 'ceph-osd', 'ceph-radosgw']
4301+
4302+ if self.openstack:
4303+ for svc in services:
4304+ if svc['name'] not in use_source:
4305+ config = {'openstack-origin': self.openstack}
4306+ self.d.configure(svc['name'], config)
4307+
4308+ if self.source:
4309+ for svc in services:
4310+ if svc['name'] in use_source:
4311+ config = {'source': self.source}
4312+ self.d.configure(svc['name'], config)
4313+
4314+ def _configure_services(self, configs):
4315+ """Configure all of the services."""
4316+ for service, config in six.iteritems(configs):
4317+ self.d.configure(service, config)
4318+
4319+ def _get_openstack_release(self):
4320+ """Get openstack release.
4321+
4322+ Return an integer representing the enum value of the openstack
4323+ release.
4324+ """
4325+ (self.precise_essex, self.precise_folsom, self.precise_grizzly,
4326+ self.precise_havana, self.precise_icehouse,
4327+ self.trusty_icehouse) = range(6)
4328+ releases = {
4329+ ('precise', None): self.precise_essex,
4330+ ('precise', 'cloud:precise-folsom'): self.precise_folsom,
4331+ ('precise', 'cloud:precise-grizzly'): self.precise_grizzly,
4332+ ('precise', 'cloud:precise-havana'): self.precise_havana,
4333+ ('precise', 'cloud:precise-icehouse'): self.precise_icehouse,
4334+ ('trusty', None): self.trusty_icehouse}
4335+ return releases[(self.series, self.openstack)]
4336
4337=== added file 'tests/charmhelpers/contrib/openstack/amulet/utils.py'
4338--- tests/charmhelpers/contrib/openstack/amulet/utils.py 1970-01-01 00:00:00 +0000
4339+++ tests/charmhelpers/contrib/openstack/amulet/utils.py 2015-01-22 15:04:01 +0000
4340@@ -0,0 +1,278 @@
4341+import logging
4342+import os
4343+import time
4344+import urllib
4345+
4346+import glanceclient.v1.client as glance_client
4347+import keystoneclient.v2_0 as keystone_client
4348+import novaclient.v1_1.client as nova_client
4349+
4350+import six
4351+
4352+from charmhelpers.contrib.amulet.utils import (
4353+ AmuletUtils
4354+)
4355+
4356+DEBUG = logging.DEBUG
4357+ERROR = logging.ERROR
4358+
4359+
4360+class OpenStackAmuletUtils(AmuletUtils):
4361+ """OpenStack amulet utilities.
4362+
4363+ This class inherits from AmuletUtils and has additional support
4364+ that is specifically for use by OpenStack charms.
4365+ """
4366+
4367+ def __init__(self, log_level=ERROR):
4368+ """Initialize the deployment environment."""
4369+ super(OpenStackAmuletUtils, self).__init__(log_level)
4370+
4371+ def validate_endpoint_data(self, endpoints, admin_port, internal_port,
4372+ public_port, expected):
4373+ """Validate endpoint data.
4374+
4375+ Validate actual endpoint data vs expected endpoint data. The ports
4376+ are used to find the matching endpoint.
4377+ """
4378+ found = False
4379+ for ep in endpoints:
4380+ self.log.debug('endpoint: {}'.format(repr(ep)))
4381+ if (admin_port in ep.adminurl and
4382+ internal_port in ep.internalurl and
4383+ public_port in ep.publicurl):
4384+ found = True
4385+ actual = {'id': ep.id,
4386+ 'region': ep.region,
4387+ 'adminurl': ep.adminurl,
4388+ 'internalurl': ep.internalurl,
4389+ 'publicurl': ep.publicurl,
4390+ 'service_id': ep.service_id}
4391+ ret = self._validate_dict_data(expected, actual)
4392+ if ret:
4393+ return 'unexpected endpoint data - {}'.format(ret)
4394+
4395+ if not found:
4396+ return 'endpoint not found'
4397+
4398+ def validate_svc_catalog_endpoint_data(self, expected, actual):
4399+ """Validate service catalog endpoint data.
4400+
4401+ Validate a list of actual service catalog endpoints vs a list of
4402+ expected service catalog endpoints.
4403+ """
4404+ self.log.debug('actual: {}'.format(repr(actual)))
4405+ for k, v in six.iteritems(expected):
4406+ if k in actual:
4407+ ret = self._validate_dict_data(expected[k][0], actual[k][0])
4408+ if ret:
4409+ return self.endpoint_error(k, ret)
4410+ else:
4411+ return "endpoint {} does not exist".format(k)
4412+ return ret
4413+
4414+ def validate_tenant_data(self, expected, actual):
4415+ """Validate tenant data.
4416+
4417+ Validate a list of actual tenant data vs list of expected tenant
4418+ data.
4419+ """
4420+ self.log.debug('actual: {}'.format(repr(actual)))
4421+ for e in expected:
4422+ found = False
4423+ for act in actual:
4424+ a = {'enabled': act.enabled, 'description': act.description,
4425+ 'name': act.name, 'id': act.id}
4426+ if e['name'] == a['name']:
4427+ found = True
4428+ ret = self._validate_dict_data(e, a)
4429+ if ret:
4430+ return "unexpected tenant data - {}".format(ret)
4431+ if not found:
4432+ return "tenant {} does not exist".format(e['name'])
4433+ return ret
4434+
4435+ def validate_role_data(self, expected, actual):
4436+ """Validate role data.
4437+
4438+ Validate a list of actual role data vs a list of expected role
4439+ data.
4440+ """
4441+ self.log.debug('actual: {}'.format(repr(actual)))
4442+ for e in expected:
4443+ found = False
4444+ for act in actual:
4445+ a = {'name': act.name, 'id': act.id}
4446+ if e['name'] == a['name']:
4447+ found = True
4448+ ret = self._validate_dict_data(e, a)
4449+ if ret:
4450+ return "unexpected role data - {}".format(ret)
4451+ if not found:
4452+ return "role {} does not exist".format(e['name'])
4453+ return ret
4454+
4455+ def validate_user_data(self, expected, actual):
4456+ """Validate user data.
4457+
4458+ Validate a list of actual user data vs a list of expected user
4459+ data.
4460+ """
4461+ self.log.debug('actual: {}'.format(repr(actual)))
4462+ for e in expected:
4463+ found = False
4464+ for act in actual:
4465+ a = {'enabled': act.enabled, 'name': act.name,
4466+ 'email': act.email, 'tenantId': act.tenantId,
4467+ 'id': act.id}
4468+ if e['name'] == a['name']:
4469+ found = True
4470+ ret = self._validate_dict_data(e, a)
4471+ if ret:
4472+ return "unexpected user data - {}".format(ret)
4473+ if not found:
4474+ return "user {} does not exist".format(e['name'])
4475+ return ret
4476+
4477+ def validate_flavor_data(self, expected, actual):
4478+ """Validate flavor data.
4479+
4480+ Validate a list of actual flavors vs a list of expected flavors.
4481+ """
4482+ self.log.debug('actual: {}'.format(repr(actual)))
4483+ act = [a.name for a in actual]
4484+ return self._validate_list_data(expected, act)
4485+
4486+ def tenant_exists(self, keystone, tenant):
4487+ """Return True if tenant exists."""
4488+ return tenant in [t.name for t in keystone.tenants.list()]
4489+
4490+ def authenticate_keystone_admin(self, keystone_sentry, user, password,
4491+ tenant):
4492+ """Authenticates admin user with the keystone admin endpoint."""
4493+ unit = keystone_sentry
4494+ service_ip = unit.relation('shared-db',
4495+ 'mysql:shared-db')['private-address']
4496+ ep = "http://{}:35357/v2.0".format(service_ip.strip().decode('utf-8'))
4497+ return keystone_client.Client(username=user, password=password,
4498+ tenant_name=tenant, auth_url=ep)
4499+
4500+ def authenticate_keystone_user(self, keystone, user, password, tenant):
4501+ """Authenticates a regular user with the keystone public endpoint."""
4502+ ep = keystone.service_catalog.url_for(service_type='identity',
4503+ endpoint_type='publicURL')
4504+ return keystone_client.Client(username=user, password=password,
4505+ tenant_name=tenant, auth_url=ep)
4506+
4507+ def authenticate_glance_admin(self, keystone):
4508+ """Authenticates admin user with glance."""
4509+ ep = keystone.service_catalog.url_for(service_type='image',
4510+ endpoint_type='adminURL')
4511+ return glance_client.Client(ep, token=keystone.auth_token)
4512+
4513+ def authenticate_nova_user(self, keystone, user, password, tenant):
4514+ """Authenticates a regular user with nova-api."""
4515+ ep = keystone.service_catalog.url_for(service_type='identity',
4516+ endpoint_type='publicURL')
4517+ return nova_client.Client(username=user, api_key=password,
4518+ project_id=tenant, auth_url=ep)
4519+
4520+ def create_cirros_image(self, glance, image_name):
4521+ """Download the latest cirros image and upload it to glance."""
4522+ http_proxy = os.getenv('AMULET_HTTP_PROXY')
4523+ self.log.debug('AMULET_HTTP_PROXY: {}'.format(http_proxy))
4524+ if http_proxy:
4525+ proxies = {'http': http_proxy}
4526+ opener = urllib.FancyURLopener(proxies)
4527+ else:
4528+ opener = urllib.FancyURLopener()
4529+
4530+ f = opener.open("http://download.cirros-cloud.net/version/released")
4531+ version = f.read().strip()
4532+ cirros_img = "cirros-{}-x86_64-disk.img".format(version)
4533+ local_path = os.path.join('tests', cirros_img)
4534+
4535+ if not os.path.exists(local_path):
4536+ cirros_url = "http://{}/{}/{}".format("download.cirros-cloud.net",
4537+ version, cirros_img)
4538+ opener.retrieve(cirros_url, local_path)
4539+ f.close()
4540+
4541+ with open(local_path) as f:
4542+ image = glance.images.create(name=image_name, is_public=True,
4543+ disk_format='qcow2',
4544+ container_format='bare', data=f)
4545+ count = 1
4546+ status = image.status
4547+ while status != 'active' and count < 10:
4548+ time.sleep(3)
4549+ image = glance.images.get(image.id)
4550+ status = image.status
4551+ self.log.debug('image status: {}'.format(status))
4552+ count += 1
4553+
4554+ if status != 'active':
4555+ self.log.error('image creation timed out')
4556+ return None
4557+
4558+ return image
4559+
4560+ def delete_image(self, glance, image):
4561+ """Delete the specified image."""
4562+ num_before = len(list(glance.images.list()))
4563+ glance.images.delete(image)
4564+
4565+ count = 1
4566+ num_after = len(list(glance.images.list()))
4567+ while num_after != (num_before - 1) and count < 10:
4568+ time.sleep(3)
4569+ num_after = len(list(glance.images.list()))
4570+ self.log.debug('number of images: {}'.format(num_after))
4571+ count += 1
4572+
4573+ if num_after != (num_before - 1):
4574+ self.log.error('image deletion timed out')
4575+ return False
4576+
4577+ return True
4578+
4579+ def create_instance(self, nova, image_name, instance_name, flavor):
4580+ """Create the specified instance."""
4581+ image = nova.images.find(name=image_name)
4582+ flavor = nova.flavors.find(name=flavor)
4583+ instance = nova.servers.create(name=instance_name, image=image,
4584+ flavor=flavor)
4585+
4586+ count = 1
4587+ status = instance.status
4588+ while status != 'ACTIVE' and count < 60:
4589+ time.sleep(3)
4590+ instance = nova.servers.get(instance.id)
4591+ status = instance.status
4592+ self.log.debug('instance status: {}'.format(status))
4593+ count += 1
4594+
4595+ if status != 'ACTIVE':
4596+ self.log.error('instance creation timed out')
4597+ return None
4598+
4599+ return instance
4600+
4601+ def delete_instance(self, nova, instance):
4602+ """Delete the specified instance."""
4603+ num_before = len(list(nova.servers.list()))
4604+ nova.servers.delete(instance)
4605+
4606+ count = 1
4607+ num_after = len(list(nova.servers.list()))
4608+ while num_after != (num_before - 1) and count < 10:
4609+ time.sleep(3)
4610+ num_after = len(list(nova.servers.list()))
4611+ self.log.debug('number of instances: {}'.format(num_after))
4612+ count += 1
4613+
4614+ if num_after != (num_before - 1):
4615+ self.log.error('instance deletion timed out')
4616+ return False
4617+
4618+ return True
4619
4620=== added directory 'unit_tests'
4621=== added file 'unit_tests/__init__.py'
4622=== added file 'unit_tests/test_jenkins_hooks.py'
4623--- unit_tests/test_jenkins_hooks.py 1970-01-01 00:00:00 +0000
4624+++ unit_tests/test_jenkins_hooks.py 2015-01-22 15:04:01 +0000
4625@@ -0,0 +1,6 @@
4626+import unittest
4627+
4628+
4629+class JenkinsHooksTests(unittest.TestCase):
4630+ def setUp(self):
4631+ super(JenkinsHooksTests, self).setUp()
4632
4633=== added file 'unit_tests/test_jenkins_utils.py'
4634--- unit_tests/test_jenkins_utils.py 1970-01-01 00:00:00 +0000
4635+++ unit_tests/test_jenkins_utils.py 2015-01-22 15:04:01 +0000
4636@@ -0,0 +1,6 @@
4637+import unittest
4638+
4639+
4640+class JenkinsUtilsTests(unittest.TestCase):
4641+ def setUp(self):
4642+ super(JenkinsUtilsTests, self).setUp()
