Merge lp:~hopem/charms/trusty/jenkins/python-redux into lp:charms/trusty/jenkins

Proposed by Edward Hope-Morley
Status: Merged
Merged at revision: 34
Proposed branch: lp:~hopem/charms/trusty/jenkins/python-redux
Merge into: lp:charms/trusty/jenkins
Diff against target: 4669 lines (+4110/-275)
46 files modified
Makefile (+44/-0)
bin/charm_helpers_sync.py (+225/-0)
charm-helpers-hooks.yaml (+8/-0)
charm-helpers-tests.yaml (+5/-0)
config.yaml (+1/-1)
hooks/addnode (+0/-21)
hooks/charmhelpers/__init__.py (+22/-0)
hooks/charmhelpers/contrib/python/packages.py (+80/-0)
hooks/charmhelpers/core/decorators.py (+41/-0)
hooks/charmhelpers/core/fstab.py (+118/-0)
hooks/charmhelpers/core/hookenv.py (+552/-0)
hooks/charmhelpers/core/host.py (+419/-0)
hooks/charmhelpers/core/services/__init__.py (+2/-0)
hooks/charmhelpers/core/services/base.py (+313/-0)
hooks/charmhelpers/core/services/helpers.py (+243/-0)
hooks/charmhelpers/core/sysctl.py (+34/-0)
hooks/charmhelpers/core/templating.py (+52/-0)
hooks/charmhelpers/fetch/__init__.py (+423/-0)
hooks/charmhelpers/fetch/archiveurl.py (+145/-0)
hooks/charmhelpers/fetch/bzrurl.py (+54/-0)
hooks/charmhelpers/fetch/giturl.py (+51/-0)
hooks/charmhelpers/payload/__init__.py (+1/-0)
hooks/charmhelpers/payload/execd.py (+50/-0)
hooks/config-changed (+0/-7)
hooks/delnode (+0/-16)
hooks/install (+0/-151)
hooks/jenkins_hooks.py (+224/-0)
hooks/jenkins_utils.py (+178/-0)
hooks/master-relation-broken (+0/-17)
hooks/master-relation-changed (+0/-24)
hooks/master-relation-departed (+0/-12)
hooks/master-relation-joined (+0/-5)
hooks/start (+0/-3)
hooks/stop (+0/-3)
hooks/upgrade-charm (+0/-7)
hooks/website-relation-joined (+0/-5)
test-requirements.txt (+4/-0)
tests/100-deploy-precise (+123/-0)
tests/100-deploy-trusty (+5/-3)
tests/README (+56/-0)
tests/charmhelpers/contrib/amulet/deployment.py (+77/-0)
tests/charmhelpers/contrib/amulet/utils.py (+178/-0)
tests/charmhelpers/contrib/openstack/amulet/deployment.py (+92/-0)
tests/charmhelpers/contrib/openstack/amulet/utils.py (+278/-0)
unit_tests/test_jenkins_hooks.py (+6/-0)
unit_tests/test_jenkins_utils.py (+6/-0)
To merge this branch: bzr merge lp:~hopem/charms/trusty/jenkins/python-redux
Reviewer Review Type Date Requested Status
Review Queue (community) automated testing Needs Fixing
Whit Morriss Pending
Jorge Niedbalski Pending
Paul Larson Pending
Felipe Reyes Pending
James Page Pending
Ryan Beisner Pending
Review via email: mp+247569@code.launchpad.net

This proposal supersedes a proposal from 2015-01-23.

Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10336-results

review: Needs Fixing (automated testing)
Felipe Reyes (freyes) wrote : Posted in a previous version of this proposal

Setting the password doesn't work: deploying with the config below doesn't allow you to log in with admin/admin. Also, deploying from the charm store first and then upgrading to this branch breaks the password.

---
jenkins:
    password: "admin"
---

$ juju deploy --config config.yaml local:trusty/jenkins

review: Needs Fixing
Edward Hope-Morley (hopem) wrote : Posted in a previous version of this proposal

Thanks Felipe, taking a look.

Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10636-results

review: Needs Fixing (automated testing)
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10684-results

review: Needs Fixing (automated testing)
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10704-results

review: Needs Fixing (automated testing)
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10869-results

review: Needs Fixing (automated testing)
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10876-results

review: Needs Fixing (automated testing)
Whit Morriss (whitmo) wrote : Posted in a previous version of this proposal

Thanks Edward, your Python rewrite generally looks good at a glance.

I confirmed the test failures reported by automated testing: jenkins-slave is not found because it exists in the precise series only.

Changing tests/100-deploy-trusty line 19 to "d.add('jenkins-slave', 'cs:precise/jenkins-slave')" remedies this issue until there is a trusty version of jenkins-slave.

The tests also hit an error in "master-relation-changed" due to what appears to be a race condition between configuring and restarting the jenkins slave and adding the node to the master. Running the hook via debug-hooks works fine.

See logs: https://gist.githubusercontent.com/anonymous/0067138ce2cc697b8c88/raw/3f4c03688400b32f3aa66e1bd3bad5b7398f80a5/jenkins-race-condition

I confirmed this is also an issue in the merge target's bash implementation. Adding some retry logic to add_node should fix this issue.
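
As an illustration of that suggestion, here is a minimal sketch of retry wrapping for add_node, using the retry_on_exception decorator that this branch syncs into hooks/charmhelpers/core/decorators.py. The add_node signature, the use of jenkins.JenkinsException, and raising on a failed create are assumptions made for this sketch; the real helper lives in hooks/jenkins_utils.py.

    import jenkins

    from charmhelpers.core.decorators import retry_on_exception
    from charmhelpers.core.hookenv import log


    # Retry a few times with an increasing delay so a slave that is still
    # being configured/restarted does not make master-relation-changed fail
    # outright.
    @retry_on_exception(num_retries=5, base_delay=10,
                        exc_type=jenkins.JenkinsException)
    def add_node(host, executors, labels, username, password):
        client = jenkins.Jenkins("http://localhost:8080/", username, password)
        if client.node_exists(host):
            log("Node %s already exists - not adding" % host)
            return

        log("Adding node %s to Jenkins master" % host)
        client.create_node(host, int(executors) * 2, host, labels=labels)

        if not client.node_exists(host):
            # Raise so the decorator retries (assumption: the old addnode
            # script only printed a warning at this point).
            raise jenkins.JenkinsException("Failed to create node %s" % host)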

-1 for test fixes, but otherwise looks good. Thanks again!

review: Needs Fixing
Edward Hope-Morley (hopem) wrote : Posted in a previous version of this proposal

@whitmo awesome, thanks for reviewing. I'll see if I can address the add_node issue and get the amulet test fixed up. Thanks!

Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10962-results

review: Needs Fixing (automated testing)
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10963-results

review: Needs Fixing (automated testing)
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10974-results

review: Needs Fixing (automated testing)
Review Queue (review-queue) wrote : Posted in a previous version of this proposal

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10975-results

review: Needs Fixing (automated testing)
Review Queue (review-queue) wrote :

This item has failed automated testing! Results are available here: http://reports.vapour.ws/charm-tests/charm-bundle-test-10991-results

review: Needs Fixing (automated testing)
Whit Morriss (whitmo) wrote :

These test errors look like cloud issues rather than charm issues.

Matt Bruzek (mbruzek) wrote :

I tried running these tests with bundletester by hand and got the following errors:

ERROR: jenkins::100-deploy-precise
[/tmp/bundletester-I8dVhl/jenkins/tests/100-deploy-precise exit 1]
Traceback (most recent call last):

  File "/tmp/bundletester-I8dVhl/jenkins/tests/100-deploy-precise", line 3, in <module>

    import amulet

ImportError: No module named amulet

------------------------------------------------------------------------------
ERROR: jenkins::100-deploy-trusty
[/tmp/bundletester-I8dVhl/jenkins/tests/100-deploy-trusty exit 1]
Traceback (most recent call last):

  File "/tmp/bundletester-I8dVhl/jenkins/tests/100-deploy-trusty", line 3, in <module>

    import amulet

ImportError: No module named amulet

It seems that your tests use a different python than the one amulet was installed with in the 00-setup step. Changing the shebang lines to #!/usr/bin/python3 solved this problem.

To reproduce this problem you can install bundletester from pip and try running the tests yourself.

pip install bundletester
bundletester -F -e hp-cloud -l DEBUG -v

Matt Bruzek (mbruzek) wrote :

Edward,

Thanks for this submission to the jenkins charm. I would have liked to see the rewrite of the charm make the `release` configuration option mutable, but since you did not introduce any new immutable configuration I was OK with that one issue. Immutable configuration options break the user experience, and the fact that they cannot be changed should really be documented in the README.md file.

After I changed the shebang for 100-deploy-trusty, the bundletester tests passed. However, the 100-deploy-precise test fails (even with the shebang update), so I removed it. The precise version of jenkins should contain its own test, and I encourage you to move that test to the precise branch.

I also updated the metadata.yaml file to include the new "tags" keyword rather than "categories".

Thanks for your time rewriting this charm in Python.

Preview Diff

1=== added file 'Makefile'
2--- Makefile 1970-01-01 00:00:00 +0000
3+++ Makefile 2015-01-26 11:20:09 +0000
4@@ -0,0 +1,44 @@
5+#!/usr/bin/make
6+PYTHON := /usr/bin/env python
7+
8+ensure_venv:
9+ifeq ("$(shell which virtualenv)","")
10+ @sudo apt-get install -y python-virtualenv
11+endif
12+ @virtualenv .venv --no-site-packages
13+ @. .venv/bin/activate; \
14+ pip install -q -I -r test-requirements.txt; \
15+ deactivate
16+
17+lint: ensure_venv
18+ @. .venv/bin/activate; \
19+ .venv/bin/flake8 --exclude hooks/charmhelpers hooks unit_tests tests; \
20+ charm proof; \
21+ deactivate
22+
23+functional_test:
24+ @echo Starting Amulet tests...
25+ # coreycb note: The -v should only be temporary until Amulet sends
26+ # raise_status() messages to stderr:
27+ # https://bugs.launchpad.net/amulet/+bug/1320357
28+ @juju test -v -p AMULET_HTTP_PROXY --timeout 900 \
29+ 00-setup 100-deploy-precise 100-deploy-trusty
30+
31+test: ensure_venv
32+ @echo Starting unit tests...
33+ @. .venv/bin/activate; \
34+ .venv/bin/nosetests --nologcapture --with-coverage unit_tests; \
35+ deactivate
36+
37+bin/charm_helpers_sync.py:
38+ @mkdir -p bin
39+ @bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
40+ > bin/charm_helpers_sync.py
41+
42+sync: bin/charm_helpers_sync.py
43+ @$(PYTHON) bin/charm_helpers_sync.py -c charm-helpers-hooks.yaml
44+ @$(PYTHON) bin/charm_helpers_sync.py -c charm-helpers-tests.yaml
45+
46+publish: lint unit_test
47+ bzr push lp:charms/jenkins
48+ bzr push lp:charms/trusty/jenkins
49
50=== added directory 'bin'
51=== added file 'bin/charm_helpers_sync.py'
52--- bin/charm_helpers_sync.py 1970-01-01 00:00:00 +0000
53+++ bin/charm_helpers_sync.py 2015-01-26 11:20:09 +0000
54@@ -0,0 +1,225 @@
55+#!/usr/bin/python
56+#
57+# Copyright 2013 Canonical Ltd.
58+
59+# Authors:
60+# Adam Gandelman <adamg@ubuntu.com>
61+#
62+
63+import logging
64+import optparse
65+import os
66+import subprocess
67+import shutil
68+import sys
69+import tempfile
70+import yaml
71+
72+from fnmatch import fnmatch
73+
74+CHARM_HELPERS_BRANCH = 'lp:charm-helpers'
75+
76+
77+def parse_config(conf_file):
78+ if not os.path.isfile(conf_file):
79+ logging.error('Invalid config file: %s.' % conf_file)
80+ return False
81+ return yaml.load(open(conf_file).read())
82+
83+
84+def clone_helpers(work_dir, branch):
85+ dest = os.path.join(work_dir, 'charm-helpers')
86+ logging.info('Checking out %s to %s.' % (branch, dest))
87+ cmd = ['bzr', 'checkout', '--lightweight', branch, dest]
88+ subprocess.check_call(cmd)
89+ return dest
90+
91+
92+def _module_path(module):
93+ return os.path.join(*module.split('.'))
94+
95+
96+def _src_path(src, module):
97+ return os.path.join(src, 'charmhelpers', _module_path(module))
98+
99+
100+def _dest_path(dest, module):
101+ return os.path.join(dest, _module_path(module))
102+
103+
104+def _is_pyfile(path):
105+ return os.path.isfile(path + '.py')
106+
107+
108+def ensure_init(path):
109+ '''
110+ ensure directories leading up to path are importable, omitting
111+ parent directory, eg path='/hooks/helpers/foo'/:
112+ hooks/
113+ hooks/helpers/__init__.py
114+ hooks/helpers/foo/__init__.py
115+ '''
116+ for d, dirs, files in os.walk(os.path.join(*path.split('/')[:2])):
117+ _i = os.path.join(d, '__init__.py')
118+ if not os.path.exists(_i):
119+ logging.info('Adding missing __init__.py: %s' % _i)
120+ open(_i, 'wb').close()
121+
122+
123+def sync_pyfile(src, dest):
124+ src = src + '.py'
125+ src_dir = os.path.dirname(src)
126+ logging.info('Syncing pyfile: %s -> %s.' % (src, dest))
127+ if not os.path.exists(dest):
128+ os.makedirs(dest)
129+ shutil.copy(src, dest)
130+ if os.path.isfile(os.path.join(src_dir, '__init__.py')):
131+ shutil.copy(os.path.join(src_dir, '__init__.py'),
132+ dest)
133+ ensure_init(dest)
134+
135+
136+def get_filter(opts=None):
137+ opts = opts or []
138+ if 'inc=*' in opts:
139+ # do not filter any files, include everything
140+ return None
141+
142+ def _filter(dir, ls):
143+ incs = [opt.split('=').pop() for opt in opts if 'inc=' in opt]
144+ _filter = []
145+ for f in ls:
146+ _f = os.path.join(dir, f)
147+
148+ if not os.path.isdir(_f) and not _f.endswith('.py') and incs:
149+ if True not in [fnmatch(_f, inc) for inc in incs]:
150+ logging.debug('Not syncing %s, does not match include '
151+ 'filters (%s)' % (_f, incs))
152+ _filter.append(f)
153+ else:
154+ logging.debug('Including file, which matches include '
155+ 'filters (%s): %s' % (incs, _f))
156+ elif (os.path.isfile(_f) and not _f.endswith('.py')):
157+ logging.debug('Not syncing file: %s' % f)
158+ _filter.append(f)
159+ elif (os.path.isdir(_f) and not
160+ os.path.isfile(os.path.join(_f, '__init__.py'))):
161+ logging.debug('Not syncing directory: %s' % f)
162+ _filter.append(f)
163+ return _filter
164+ return _filter
165+
166+
167+def sync_directory(src, dest, opts=None):
168+ if os.path.exists(dest):
169+ logging.debug('Removing existing directory: %s' % dest)
170+ shutil.rmtree(dest)
171+ logging.info('Syncing directory: %s -> %s.' % (src, dest))
172+
173+ shutil.copytree(src, dest, ignore=get_filter(opts))
174+ ensure_init(dest)
175+
176+
177+def sync(src, dest, module, opts=None):
178+ if os.path.isdir(_src_path(src, module)):
179+ sync_directory(_src_path(src, module), _dest_path(dest, module), opts)
180+ elif _is_pyfile(_src_path(src, module)):
181+ sync_pyfile(_src_path(src, module),
182+ os.path.dirname(_dest_path(dest, module)))
183+ else:
184+ logging.warn('Could not sync: %s. Neither a pyfile or directory, '
185+ 'does it even exist?' % module)
186+
187+
188+def parse_sync_options(options):
189+ if not options:
190+ return []
191+ return options.split(',')
192+
193+
194+def extract_options(inc, global_options=None):
195+ global_options = global_options or []
196+ if global_options and isinstance(global_options, basestring):
197+ global_options = [global_options]
198+ if '|' not in inc:
199+ return (inc, global_options)
200+ inc, opts = inc.split('|')
201+ return (inc, parse_sync_options(opts) + global_options)
202+
203+
204+def sync_helpers(include, src, dest, options=None):
205+ if not os.path.isdir(dest):
206+ os.makedirs(dest)
207+
208+ global_options = parse_sync_options(options)
209+
210+ for inc in include:
211+ if isinstance(inc, str):
212+ inc, opts = extract_options(inc, global_options)
213+ sync(src, dest, inc, opts)
214+ elif isinstance(inc, dict):
215+ # could also do nested dicts here.
216+ for k, v in inc.iteritems():
217+ if isinstance(v, list):
218+ for m in v:
219+ inc, opts = extract_options(m, global_options)
220+ sync(src, dest, '%s.%s' % (k, inc), opts)
221+
222+if __name__ == '__main__':
223+ parser = optparse.OptionParser()
224+ parser.add_option('-c', '--config', action='store', dest='config',
225+ default=None, help='helper config file')
226+ parser.add_option('-D', '--debug', action='store_true', dest='debug',
227+ default=False, help='debug')
228+ parser.add_option('-b', '--branch', action='store', dest='branch',
229+ help='charm-helpers bzr branch (overrides config)')
230+ parser.add_option('-d', '--destination', action='store', dest='dest_dir',
231+ help='sync destination dir (overrides config)')
232+ (opts, args) = parser.parse_args()
233+
234+ if opts.debug:
235+ logging.basicConfig(level=logging.DEBUG)
236+ else:
237+ logging.basicConfig(level=logging.INFO)
238+
239+ if opts.config:
240+ logging.info('Loading charm helper config from %s.' % opts.config)
241+ config = parse_config(opts.config)
242+ if not config:
243+ logging.error('Could not parse config from %s.' % opts.config)
244+ sys.exit(1)
245+ else:
246+ config = {}
247+
248+ if 'branch' not in config:
249+ config['branch'] = CHARM_HELPERS_BRANCH
250+ if opts.branch:
251+ config['branch'] = opts.branch
252+ if opts.dest_dir:
253+ config['destination'] = opts.dest_dir
254+
255+ if 'destination' not in config:
256+ logging.error('No destination dir. specified as option or config.')
257+ sys.exit(1)
258+
259+ if 'include' not in config:
260+ if not args:
261+ logging.error('No modules to sync specified as option or config.')
262+ sys.exit(1)
263+ config['include'] = []
264+ [config['include'].append(a) for a in args]
265+
266+ sync_options = None
267+ if 'options' in config:
268+ sync_options = config['options']
269+ tmpd = tempfile.mkdtemp()
270+ try:
271+ checkout = clone_helpers(tmpd, config['branch'])
272+ sync_helpers(config['include'], checkout, config['destination'],
273+ options=sync_options)
274+ except Exception, e:
275+ logging.error("Could not sync: %s" % e)
276+ raise e
277+ finally:
278+ logging.debug('Cleaning up %s' % tmpd)
279+ shutil.rmtree(tmpd)
280
281=== added file 'charm-helpers-hooks.yaml'
282--- charm-helpers-hooks.yaml 1970-01-01 00:00:00 +0000
283+++ charm-helpers-hooks.yaml 2015-01-26 11:20:09 +0000
284@@ -0,0 +1,8 @@
285+branch: lp:charm-helpers
286+destination: hooks/charmhelpers
287+include:
288+ - __init__
289+ - contrib.python.packages
290+ - core
291+ - fetch
292+ - payload.execd
293
294=== added file 'charm-helpers-tests.yaml'
295--- charm-helpers-tests.yaml 1970-01-01 00:00:00 +0000
296+++ charm-helpers-tests.yaml 2015-01-26 11:20:09 +0000
297@@ -0,0 +1,5 @@
298+branch: lp:charm-helpers
299+destination: tests/charmhelpers
300+include:
301+ - contrib.amulet
302+ - contrib.openstack.amulet
303
304=== modified file 'config.yaml'
305--- config.yaml 2014-08-14 19:53:02 +0000
306+++ config.yaml 2015-01-26 11:20:09 +0000
307@@ -17,9 +17,9 @@
308 slave nodes so please don't change in Jenkins.
309 password:
310 type: string
311+ default: ""
312 description: Admin user password - used to manage
313 slave nodes so please don't change in Jenkins.
314- default:
315 plugins:
316 type: string
317 default: ""
318
319=== removed file 'hooks/addnode'
320--- hooks/addnode 2012-04-27 13:04:33 +0000
321+++ hooks/addnode 1970-01-01 00:00:00 +0000
322@@ -1,21 +0,0 @@
323-#!/usr/bin/python
324-
325-import jenkins
326-import sys
327-
328-host=sys.argv[1]
329-executors=sys.argv[2]
330-labels=sys.argv[3]
331-username=sys.argv[4]
332-password=sys.argv[5]
333-
334-l_jenkins = jenkins.Jenkins("http://localhost:8080/",username,password)
335-
336-if l_jenkins.node_exists(host):
337- print "Node exists - not adding"
338-else:
339- print "Adding node to Jenkins master"
340- l_jenkins.create_node(host, int(executors) * 2, host , labels=labels)
341-
342-if not l_jenkins.node_exists(host):
343- print "Failed to create node"
344
345=== added directory 'hooks/charmhelpers'
346=== added file 'hooks/charmhelpers/__init__.py'
347--- hooks/charmhelpers/__init__.py 1970-01-01 00:00:00 +0000
348+++ hooks/charmhelpers/__init__.py 2015-01-26 11:20:09 +0000
349@@ -0,0 +1,22 @@
350+# Bootstrap charm-helpers, installing its dependencies if necessary using
351+# only standard libraries.
352+import subprocess
353+import sys
354+
355+try:
356+ import six # flake8: noqa
357+except ImportError:
358+ if sys.version_info.major == 2:
359+ subprocess.check_call(['apt-get', 'install', '-y', 'python-six'])
360+ else:
361+ subprocess.check_call(['apt-get', 'install', '-y', 'python3-six'])
362+ import six # flake8: noqa
363+
364+try:
365+ import yaml # flake8: noqa
366+except ImportError:
367+ if sys.version_info.major == 2:
368+ subprocess.check_call(['apt-get', 'install', '-y', 'python-yaml'])
369+ else:
370+ subprocess.check_call(['apt-get', 'install', '-y', 'python3-yaml'])
371+ import yaml # flake8: noqa
372
373=== added directory 'hooks/charmhelpers/contrib'
374=== added file 'hooks/charmhelpers/contrib/__init__.py'
375=== added directory 'hooks/charmhelpers/contrib/python'
376=== added file 'hooks/charmhelpers/contrib/python/__init__.py'
377=== added file 'hooks/charmhelpers/contrib/python/packages.py'
378--- hooks/charmhelpers/contrib/python/packages.py 1970-01-01 00:00:00 +0000
379+++ hooks/charmhelpers/contrib/python/packages.py 2015-01-26 11:20:09 +0000
380@@ -0,0 +1,80 @@
381+#!/usr/bin/env python
382+# coding: utf-8
383+
384+__author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
385+
386+from charmhelpers.fetch import apt_install, apt_update
387+from charmhelpers.core.hookenv import log
388+
389+try:
390+ from pip import main as pip_execute
391+except ImportError:
392+ apt_update()
393+ apt_install('python-pip')
394+ from pip import main as pip_execute
395+
396+
397+def parse_options(given, available):
398+ """Given a set of options, check if available"""
399+ for key, value in sorted(given.items()):
400+ if key in available:
401+ yield "--{0}={1}".format(key, value)
402+
403+
404+def pip_install_requirements(requirements, **options):
405+ """Install a requirements file """
406+ command = ["install"]
407+
408+ available_options = ('proxy', 'src', 'log', )
409+ for option in parse_options(options, available_options):
410+ command.append(option)
411+
412+ command.append("-r {0}".format(requirements))
413+ log("Installing from file: {} with options: {}".format(requirements,
414+ command))
415+ pip_execute(command)
416+
417+
418+def pip_install(package, fatal=False, upgrade=False, **options):
419+ """Install a python package"""
420+ command = ["install"]
421+
422+ available_options = ('proxy', 'src', 'log', "index-url", )
423+ for option in parse_options(options, available_options):
424+ command.append(option)
425+
426+ if upgrade:
427+ command.append('--upgrade')
428+
429+ if isinstance(package, list):
430+ command.extend(package)
431+ else:
432+ command.append(package)
433+
434+ log("Installing {} package with options: {}".format(package,
435+ command))
436+ pip_execute(command)
437+
438+
439+def pip_uninstall(package, **options):
440+ """Uninstall a python package"""
441+ command = ["uninstall", "-q", "-y"]
442+
443+ available_options = ('proxy', 'log', )
444+ for option in parse_options(options, available_options):
445+ command.append(option)
446+
447+ if isinstance(package, list):
448+ command.extend(package)
449+ else:
450+ command.append(package)
451+
452+ log("Uninstalling {} package with options: {}".format(package,
453+ command))
454+ pip_execute(command)
455+
456+
457+def pip_list():
458+ """Returns the list of current python installed packages
459+ """
460+ return pip_execute(["list"])
461
462=== added directory 'hooks/charmhelpers/core'
463=== added file 'hooks/charmhelpers/core/__init__.py'
464=== added file 'hooks/charmhelpers/core/decorators.py'
465--- hooks/charmhelpers/core/decorators.py 1970-01-01 00:00:00 +0000
466+++ hooks/charmhelpers/core/decorators.py 2015-01-26 11:20:09 +0000
467@@ -0,0 +1,41 @@
468+#
469+# Copyright 2014 Canonical Ltd.
470+#
471+# Authors:
472+# Edward Hope-Morley <opentastic@gmail.com>
473+#
474+
475+import time
476+
477+from charmhelpers.core.hookenv import (
478+ log,
479+ INFO,
480+)
481+
482+
483+def retry_on_exception(num_retries, base_delay=0, exc_type=Exception):
484+ """If the decorated function raises exception exc_type, allow num_retries
485+ retry attempts before raise the exception.
486+ """
487+ def _retry_on_exception_inner_1(f):
488+ def _retry_on_exception_inner_2(*args, **kwargs):
489+ retries = num_retries
490+ multiplier = 1
491+ while True:
492+ try:
493+ return f(*args, **kwargs)
494+ except exc_type:
495+ if not retries:
496+ raise
497+
498+ delay = base_delay * multiplier
499+ multiplier += 1
500+ log("Retrying '%s' %d more times (delay=%s)" %
501+ (f.__name__, retries, delay), level=INFO)
502+ retries -= 1
503+ if delay:
504+ time.sleep(delay)
505+
506+ return _retry_on_exception_inner_2
507+
508+ return _retry_on_exception_inner_1
509
510=== added file 'hooks/charmhelpers/core/fstab.py'
511--- hooks/charmhelpers/core/fstab.py 1970-01-01 00:00:00 +0000
512+++ hooks/charmhelpers/core/fstab.py 2015-01-26 11:20:09 +0000
513@@ -0,0 +1,118 @@
514+#!/usr/bin/env python
515+# -*- coding: utf-8 -*-
516+
517+__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
518+
519+import io
520+import os
521+
522+
523+class Fstab(io.FileIO):
524+ """This class extends file in order to implement a file reader/writer
525+ for file `/etc/fstab`
526+ """
527+
528+ class Entry(object):
529+ """Entry class represents a non-comment line on the `/etc/fstab` file
530+ """
531+ def __init__(self, device, mountpoint, filesystem,
532+ options, d=0, p=0):
533+ self.device = device
534+ self.mountpoint = mountpoint
535+ self.filesystem = filesystem
536+
537+ if not options:
538+ options = "defaults"
539+
540+ self.options = options
541+ self.d = int(d)
542+ self.p = int(p)
543+
544+ def __eq__(self, o):
545+ return str(self) == str(o)
546+
547+ def __str__(self):
548+ return "{} {} {} {} {} {}".format(self.device,
549+ self.mountpoint,
550+ self.filesystem,
551+ self.options,
552+ self.d,
553+ self.p)
554+
555+ DEFAULT_PATH = os.path.join(os.path.sep, 'etc', 'fstab')
556+
557+ def __init__(self, path=None):
558+ if path:
559+ self._path = path
560+ else:
561+ self._path = self.DEFAULT_PATH
562+ super(Fstab, self).__init__(self._path, 'rb+')
563+
564+ def _hydrate_entry(self, line):
565+ # NOTE: use split with no arguments to split on any
566+ # whitespace including tabs
567+ return Fstab.Entry(*filter(
568+ lambda x: x not in ('', None),
569+ line.strip("\n").split()))
570+
571+ @property
572+ def entries(self):
573+ self.seek(0)
574+ for line in self.readlines():
575+ line = line.decode('us-ascii')
576+ try:
577+ if line.strip() and not line.startswith("#"):
578+ yield self._hydrate_entry(line)
579+ except ValueError:
580+ pass
581+
582+ def get_entry_by_attr(self, attr, value):
583+ for entry in self.entries:
584+ e_attr = getattr(entry, attr)
585+ if e_attr == value:
586+ return entry
587+ return None
588+
589+ def add_entry(self, entry):
590+ if self.get_entry_by_attr('device', entry.device):
591+ return False
592+
593+ self.write((str(entry) + '\n').encode('us-ascii'))
594+ self.truncate()
595+ return entry
596+
597+ def remove_entry(self, entry):
598+ self.seek(0)
599+
600+ lines = [l.decode('us-ascii') for l in self.readlines()]
601+
602+ found = False
603+ for index, line in enumerate(lines):
604+ if not line.startswith("#"):
605+ if self._hydrate_entry(line) == entry:
606+ found = True
607+ break
608+
609+ if not found:
610+ return False
611+
612+ lines.remove(line)
613+
614+ self.seek(0)
615+ self.write(''.join(lines).encode('us-ascii'))
616+ self.truncate()
617+ return True
618+
619+ @classmethod
620+ def remove_by_mountpoint(cls, mountpoint, path=None):
621+ fstab = cls(path=path)
622+ entry = fstab.get_entry_by_attr('mountpoint', mountpoint)
623+ if entry:
624+ return fstab.remove_entry(entry)
625+ return False
626+
627+ @classmethod
628+ def add(cls, device, mountpoint, filesystem, options=None, path=None):
629+ return cls(path=path).add_entry(Fstab.Entry(device,
630+ mountpoint, filesystem,
631+ options=options))
632
633=== added file 'hooks/charmhelpers/core/hookenv.py'
634--- hooks/charmhelpers/core/hookenv.py 1970-01-01 00:00:00 +0000
635+++ hooks/charmhelpers/core/hookenv.py 2015-01-26 11:20:09 +0000
636@@ -0,0 +1,552 @@
637+"Interactions with the Juju environment"
638+# Copyright 2013 Canonical Ltd.
639+#
640+# Authors:
641+# Charm Helpers Developers <juju@lists.ubuntu.com>
642+
643+import os
644+import json
645+import yaml
646+import subprocess
647+import sys
648+from subprocess import CalledProcessError
649+
650+import six
651+if not six.PY3:
652+ from UserDict import UserDict
653+else:
654+ from collections import UserDict
655+
656+CRITICAL = "CRITICAL"
657+ERROR = "ERROR"
658+WARNING = "WARNING"
659+INFO = "INFO"
660+DEBUG = "DEBUG"
661+MARKER = object()
662+
663+cache = {}
664+
665+
666+def cached(func):
667+ """Cache return values for multiple executions of func + args
668+
669+ For example::
670+
671+ @cached
672+ def unit_get(attribute):
673+ pass
674+
675+ unit_get('test')
676+
677+ will cache the result of unit_get + 'test' for future calls.
678+ """
679+ def wrapper(*args, **kwargs):
680+ global cache
681+ key = str((func, args, kwargs))
682+ try:
683+ return cache[key]
684+ except KeyError:
685+ res = func(*args, **kwargs)
686+ cache[key] = res
687+ return res
688+ return wrapper
689+
690+
691+def flush(key):
692+ """Flushes any entries from function cache where the
693+ key is found in the function+args """
694+ flush_list = []
695+ for item in cache:
696+ if key in item:
697+ flush_list.append(item)
698+ for item in flush_list:
699+ del cache[item]
700+
701+
702+def log(message, level=None):
703+ """Write a message to the juju log"""
704+ command = ['juju-log']
705+ if level:
706+ command += ['-l', level]
707+ if not isinstance(message, six.string_types):
708+ message = repr(message)
709+ command += [message]
710+ subprocess.call(command)
711+
712+
713+class Serializable(UserDict):
714+ """Wrapper, an object that can be serialized to yaml or json"""
715+
716+ def __init__(self, obj):
717+ # wrap the object
718+ UserDict.__init__(self)
719+ self.data = obj
720+
721+ def __getattr__(self, attr):
722+ # See if this object has attribute.
723+ if attr in ("json", "yaml", "data"):
724+ return self.__dict__[attr]
725+ # Check for attribute in wrapped object.
726+ got = getattr(self.data, attr, MARKER)
727+ if got is not MARKER:
728+ return got
729+ # Proxy to the wrapped object via dict interface.
730+ try:
731+ return self.data[attr]
732+ except KeyError:
733+ raise AttributeError(attr)
734+
735+ def __getstate__(self):
736+ # Pickle as a standard dictionary.
737+ return self.data
738+
739+ def __setstate__(self, state):
740+ # Unpickle into our wrapper.
741+ self.data = state
742+
743+ def json(self):
744+ """Serialize the object to json"""
745+ return json.dumps(self.data)
746+
747+ def yaml(self):
748+ """Serialize the object to yaml"""
749+ return yaml.dump(self.data)
750+
751+
752+def execution_environment():
753+ """A convenient bundling of the current execution context"""
754+ context = {}
755+ context['conf'] = config()
756+ if relation_id():
757+ context['reltype'] = relation_type()
758+ context['relid'] = relation_id()
759+ context['rel'] = relation_get()
760+ context['unit'] = local_unit()
761+ context['rels'] = relations()
762+ context['env'] = os.environ
763+ return context
764+
765+
766+def in_relation_hook():
767+ """Determine whether we're running in a relation hook"""
768+ return 'JUJU_RELATION' in os.environ
769+
770+
771+def relation_type():
772+ """The scope for the current relation hook"""
773+ return os.environ.get('JUJU_RELATION', None)
774+
775+
776+def relation_id():
777+ """The relation ID for the current relation hook"""
778+ return os.environ.get('JUJU_RELATION_ID', None)
779+
780+
781+def local_unit():
782+ """Local unit ID"""
783+ return os.environ['JUJU_UNIT_NAME']
784+
785+
786+def remote_unit():
787+ """The remote unit for the current relation hook"""
788+ return os.environ['JUJU_REMOTE_UNIT']
789+
790+
791+def service_name():
792+ """The name service group this unit belongs to"""
793+ return local_unit().split('/')[0]
794+
795+
796+def hook_name():
797+ """The name of the currently executing hook"""
798+ return os.path.basename(sys.argv[0])
799+
800+
801+class Config(dict):
802+ """A dictionary representation of the charm's config.yaml, with some
803+ extra features:
804+
805+ - See which values in the dictionary have changed since the previous hook.
806+ - For values that have changed, see what the previous value was.
807+ - Store arbitrary data for use in a later hook.
808+
809+ NOTE: Do not instantiate this object directly - instead call
810+ ``hookenv.config()``, which will return an instance of :class:`Config`.
811+
812+ Example usage::
813+
814+ >>> # inside a hook
815+ >>> from charmhelpers.core import hookenv
816+ >>> config = hookenv.config()
817+ >>> config['foo']
818+ 'bar'
819+ >>> # store a new key/value for later use
820+ >>> config['mykey'] = 'myval'
821+
822+
823+ >>> # user runs `juju set mycharm foo=baz`
824+ >>> # now we're inside subsequent config-changed hook
825+ >>> config = hookenv.config()
826+ >>> config['foo']
827+ 'baz'
828+ >>> # test to see if this val has changed since last hook
829+ >>> config.changed('foo')
830+ True
831+ >>> # what was the previous value?
832+ >>> config.previous('foo')
833+ 'bar'
834+ >>> # keys/values that we add are preserved across hooks
835+ >>> config['mykey']
836+ 'myval'
837+
838+ """
839+ CONFIG_FILE_NAME = '.juju-persistent-config'
840+
841+ def __init__(self, *args, **kw):
842+ super(Config, self).__init__(*args, **kw)
843+ self.implicit_save = True
844+ self._prev_dict = None
845+ self.path = os.path.join(charm_dir(), Config.CONFIG_FILE_NAME)
846+ if os.path.exists(self.path):
847+ self.load_previous()
848+
849+ def __getitem__(self, key):
850+ """For regular dict lookups, check the current juju config first,
851+ then the previous (saved) copy. This ensures that user-saved values
852+ will be returned by a dict lookup.
853+
854+ """
855+ try:
856+ return dict.__getitem__(self, key)
857+ except KeyError:
858+ return (self._prev_dict or {})[key]
859+
860+ def keys(self):
861+ prev_keys = []
862+ if self._prev_dict is not None:
863+ prev_keys = self._prev_dict.keys()
864+ return list(set(prev_keys + list(dict.keys(self))))
865+
866+ def load_previous(self, path=None):
867+ """Load previous copy of config from disk.
868+
869+ In normal usage you don't need to call this method directly - it
870+ is called automatically at object initialization.
871+
872+ :param path:
873+
874+ File path from which to load the previous config. If `None`,
875+ config is loaded from the default location. If `path` is
876+ specified, subsequent `save()` calls will write to the same
877+ path.
878+
879+ """
880+ self.path = path or self.path
881+ with open(self.path) as f:
882+ self._prev_dict = json.load(f)
883+
884+ def changed(self, key):
885+ """Return True if the current value for this key is different from
886+ the previous value.
887+
888+ """
889+ if self._prev_dict is None:
890+ return True
891+ return self.previous(key) != self.get(key)
892+
893+ def previous(self, key):
894+ """Return previous value for this key, or None if there
895+ is no previous value.
896+
897+ """
898+ if self._prev_dict:
899+ return self._prev_dict.get(key)
900+ return None
901+
902+ def save(self):
903+ """Save this config to disk.
904+
905+ If the charm is using the :mod:`Services Framework <services.base>`
906+ or :meth:'@hook <Hooks.hook>' decorator, this
907+ is called automatically at the end of successful hook execution.
908+ Otherwise, it should be called directly by user code.
909+
910+ To disable automatic saves, set ``implicit_save=False`` on this
911+ instance.
912+
913+ """
914+ if self._prev_dict:
915+ for k, v in six.iteritems(self._prev_dict):
916+ if k not in self:
917+ self[k] = v
918+ with open(self.path, 'w') as f:
919+ json.dump(self, f)
920+
921+
922+@cached
923+def config(scope=None):
924+ """Juju charm configuration"""
925+ config_cmd_line = ['config-get']
926+ if scope is not None:
927+ config_cmd_line.append(scope)
928+ config_cmd_line.append('--format=json')
929+ try:
930+ config_data = json.loads(
931+ subprocess.check_output(config_cmd_line).decode('UTF-8'))
932+ if scope is not None:
933+ return config_data
934+ return Config(config_data)
935+ except ValueError:
936+ return None
937+
938+
939+@cached
940+def relation_get(attribute=None, unit=None, rid=None):
941+ """Get relation information"""
942+ _args = ['relation-get', '--format=json']
943+ if rid:
944+ _args.append('-r')
945+ _args.append(rid)
946+ _args.append(attribute or '-')
947+ if unit:
948+ _args.append(unit)
949+ try:
950+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
951+ except ValueError:
952+ return None
953+ except CalledProcessError as e:
954+ if e.returncode == 2:
955+ return None
956+ raise
957+
958+
959+def relation_set(relation_id=None, relation_settings=None, **kwargs):
960+ """Set relation information for the current unit"""
961+ relation_settings = relation_settings if relation_settings else {}
962+ relation_cmd_line = ['relation-set']
963+ if relation_id is not None:
964+ relation_cmd_line.extend(('-r', relation_id))
965+ for k, v in (list(relation_settings.items()) + list(kwargs.items())):
966+ if v is None:
967+ relation_cmd_line.append('{}='.format(k))
968+ else:
969+ relation_cmd_line.append('{}={}'.format(k, v))
970+ subprocess.check_call(relation_cmd_line)
971+ # Flush cache of any relation-gets for local unit
972+ flush(local_unit())
973+
974+
975+@cached
976+def relation_ids(reltype=None):
977+ """A list of relation_ids"""
978+ reltype = reltype or relation_type()
979+ relid_cmd_line = ['relation-ids', '--format=json']
980+ if reltype is not None:
981+ relid_cmd_line.append(reltype)
982+ return json.loads(
983+ subprocess.check_output(relid_cmd_line).decode('UTF-8')) or []
984+ return []
985+
986+
987+@cached
988+def related_units(relid=None):
989+ """A list of related units"""
990+ relid = relid or relation_id()
991+ units_cmd_line = ['relation-list', '--format=json']
992+ if relid is not None:
993+ units_cmd_line.extend(('-r', relid))
994+ return json.loads(
995+ subprocess.check_output(units_cmd_line).decode('UTF-8')) or []
996+
997+
998+@cached
999+def relation_for_unit(unit=None, rid=None):
1000+ """Get the json represenation of a unit's relation"""
1001+ unit = unit or remote_unit()
1002+ relation = relation_get(unit=unit, rid=rid)
1003+ for key in relation:
1004+ if key.endswith('-list'):
1005+ relation[key] = relation[key].split()
1006+ relation['__unit__'] = unit
1007+ return relation
1008+
1009+
1010+@cached
1011+def relations_for_id(relid=None):
1012+ """Get relations of a specific relation ID"""
1013+ relation_data = []
1014+ relid = relid or relation_ids()
1015+ for unit in related_units(relid):
1016+ unit_data = relation_for_unit(unit, relid)
1017+ unit_data['__relid__'] = relid
1018+ relation_data.append(unit_data)
1019+ return relation_data
1020+
1021+
1022+@cached
1023+def relations_of_type(reltype=None):
1024+ """Get relations of a specific type"""
1025+ relation_data = []
1026+ reltype = reltype or relation_type()
1027+ for relid in relation_ids(reltype):
1028+ for relation in relations_for_id(relid):
1029+ relation['__relid__'] = relid
1030+ relation_data.append(relation)
1031+ return relation_data
1032+
1033+
1034+@cached
1035+def metadata():
1036+ """Get the current charm metadata.yaml contents as a python object"""
1037+ with open(os.path.join(charm_dir(), 'metadata.yaml')) as md:
1038+ return yaml.safe_load(md)
1039+
1040+
1041+@cached
1042+def relation_types():
1043+ """Get a list of relation types supported by this charm"""
1044+ rel_types = []
1045+ md = metadata()
1046+ for key in ('provides', 'requires', 'peers'):
1047+ section = md.get(key)
1048+ if section:
1049+ rel_types.extend(section.keys())
1050+ return rel_types
1051+
1052+
1053+@cached
1054+def charm_name():
1055+ """Get the name of the current charm as is specified on metadata.yaml"""
1056+ return metadata().get('name')
1057+
1058+
1059+@cached
1060+def relations():
1061+ """Get a nested dictionary of relation data for all related units"""
1062+ rels = {}
1063+ for reltype in relation_types():
1064+ relids = {}
1065+ for relid in relation_ids(reltype):
1066+ units = {local_unit(): relation_get(unit=local_unit(), rid=relid)}
1067+ for unit in related_units(relid):
1068+ reldata = relation_get(unit=unit, rid=relid)
1069+ units[unit] = reldata
1070+ relids[relid] = units
1071+ rels[reltype] = relids
1072+ return rels
1073+
1074+
1075+@cached
1076+def is_relation_made(relation, keys='private-address'):
1077+ '''
1078+ Determine whether a relation is established by checking for
1079+ presence of key(s). If a list of keys is provided, they
1080+ must all be present for the relation to be identified as made
1081+ '''
1082+ if isinstance(keys, str):
1083+ keys = [keys]
1084+ for r_id in relation_ids(relation):
1085+ for unit in related_units(r_id):
1086+ context = {}
1087+ for k in keys:
1088+ context[k] = relation_get(k, rid=r_id,
1089+ unit=unit)
1090+ if None not in context.values():
1091+ return True
1092+ return False
1093+
1094+
1095+def open_port(port, protocol="TCP"):
1096+ """Open a service network port"""
1097+ _args = ['open-port']
1098+ _args.append('{}/{}'.format(port, protocol))
1099+ subprocess.check_call(_args)
1100+
1101+
1102+def close_port(port, protocol="TCP"):
1103+ """Close a service network port"""
1104+ _args = ['close-port']
1105+ _args.append('{}/{}'.format(port, protocol))
1106+ subprocess.check_call(_args)
1107+
1108+
1109+@cached
1110+def unit_get(attribute):
1111+ """Get the unit ID for the remote unit"""
1112+ _args = ['unit-get', '--format=json', attribute]
1113+ try:
1114+ return json.loads(subprocess.check_output(_args).decode('UTF-8'))
1115+ except ValueError:
1116+ return None
1117+
1118+
1119+def unit_private_ip():
1120+ """Get this unit's private IP address"""
1121+ return unit_get('private-address')
1122+
1123+
1124+class UnregisteredHookError(Exception):
1125+ """Raised when an undefined hook is called"""
1126+ pass
1127+
1128+
1129+class Hooks(object):
1130+ """A convenient handler for hook functions.
1131+
1132+ Example::
1133+
1134+ hooks = Hooks()
1135+
1136+ # register a hook, taking its name from the function name
1137+ @hooks.hook()
1138+ def install():
1139+ pass # your code here
1140+
1141+ # register a hook, providing a custom hook name
1142+ @hooks.hook("config-changed")
1143+ def config_changed():
1144+ pass # your code here
1145+
1146+ if __name__ == "__main__":
1147+ # execute a hook based on the name the program is called by
1148+ hooks.execute(sys.argv)
1149+ """
1150+
1151+ def __init__(self, config_save=True):
1152+ super(Hooks, self).__init__()
1153+ self._hooks = {}
1154+ self._config_save = config_save
1155+
1156+ def register(self, name, function):
1157+ """Register a hook"""
1158+ self._hooks[name] = function
1159+
1160+ def execute(self, args):
1161+ """Execute a registered hook based on args[0]"""
1162+ hook_name = os.path.basename(args[0])
1163+ if hook_name in self._hooks:
1164+ self._hooks[hook_name]()
1165+ if self._config_save:
1166+ cfg = config()
1167+ if cfg.implicit_save:
1168+ cfg.save()
1169+ else:
1170+ raise UnregisteredHookError(hook_name)
1171+
1172+ def hook(self, *hook_names):
1173+ """Decorator, registering them as hooks"""
1174+ def wrapper(decorated):
1175+ for hook_name in hook_names:
1176+ self.register(hook_name, decorated)
1177+ else:
1178+ self.register(decorated.__name__, decorated)
1179+ if '_' in decorated.__name__:
1180+ self.register(
1181+ decorated.__name__.replace('_', '-'), decorated)
1182+ return decorated
1183+ return wrapper
1184+
1185+
1186+def charm_dir():
1187+ """Return the root directory of the current charm"""
1188+ return os.environ.get('CHARM_DIR')
1189
1190=== added file 'hooks/charmhelpers/core/host.py'
1191--- hooks/charmhelpers/core/host.py 1970-01-01 00:00:00 +0000
1192+++ hooks/charmhelpers/core/host.py 2015-01-26 11:20:09 +0000
1193@@ -0,0 +1,419 @@
1194+"""Tools for working with the host system"""
1195+# Copyright 2012 Canonical Ltd.
1196+#
1197+# Authors:
1198+# Nick Moffitt <nick.moffitt@canonical.com>
1199+# Matthew Wedgwood <matthew.wedgwood@canonical.com>
1200+
1201+import os
1202+import re
1203+import pwd
1204+import grp
1205+import random
1206+import string
1207+import subprocess
1208+import hashlib
1209+from contextlib import contextmanager
1210+from collections import OrderedDict
1211+
1212+import six
1213+
1214+from .hookenv import log
1215+from .fstab import Fstab
1216+
1217+
1218+def service_start(service_name):
1219+ """Start a system service"""
1220+ return service('start', service_name)
1221+
1222+
1223+def service_stop(service_name):
1224+ """Stop a system service"""
1225+ return service('stop', service_name)
1226+
1227+
1228+def service_restart(service_name):
1229+ """Restart a system service"""
1230+ return service('restart', service_name)
1231+
1232+
1233+def service_reload(service_name, restart_on_failure=False):
1234+ """Reload a system service, optionally falling back to restart if
1235+ reload fails"""
1236+ service_result = service('reload', service_name)
1237+ if not service_result and restart_on_failure:
1238+ service_result = service('restart', service_name)
1239+ return service_result
1240+
1241+
1242+def service(action, service_name):
1243+ """Control a system service"""
1244+ cmd = ['service', service_name, action]
1245+ return subprocess.call(cmd) == 0
1246+
1247+
1248+def service_running(service):
1249+ """Determine whether a system service is running"""
1250+ try:
1251+ output = subprocess.check_output(
1252+ ['service', service, 'status'],
1253+ stderr=subprocess.STDOUT).decode('UTF-8')
1254+ except subprocess.CalledProcessError:
1255+ return False
1256+ else:
1257+ if ("start/running" in output or "is running" in output):
1258+ return True
1259+ else:
1260+ return False
1261+
1262+
1263+def service_available(service_name):
1264+ """Determine whether a system service is available"""
1265+ try:
1266+ subprocess.check_output(
1267+ ['service', service_name, 'status'],
1268+ stderr=subprocess.STDOUT).decode('UTF-8')
1269+ except subprocess.CalledProcessError as e:
1270+ return 'unrecognized service' not in e.output
1271+ else:
1272+ return True
1273+
1274+
1275+def adduser(username, password=None, shell='/bin/bash', system_user=False):
1276+ """Add a user to the system"""
1277+ try:
1278+ user_info = pwd.getpwnam(username)
1279+ log('user {0} already exists!'.format(username))
1280+ except KeyError:
1281+ log('creating user {0}'.format(username))
1282+ cmd = ['useradd']
1283+ if system_user or password is None:
1284+ cmd.append('--system')
1285+ else:
1286+ cmd.extend([
1287+ '--create-home',
1288+ '--shell', shell,
1289+ '--password', password,
1290+ ])
1291+ cmd.append(username)
1292+ subprocess.check_call(cmd)
1293+ user_info = pwd.getpwnam(username)
1294+ return user_info
1295+
1296+
1297+def add_group(group_name, system_group=False):
1298+ """Add a group to the system"""
1299+ try:
1300+ group_info = grp.getgrnam(group_name)
1301+ log('group {0} already exists!'.format(group_name))
1302+ except KeyError:
1303+ log('creating group {0}'.format(group_name))
1304+ cmd = ['addgroup']
1305+ if system_group:
1306+ cmd.append('--system')
1307+ else:
1308+ cmd.extend([
1309+ '--group',
1310+ ])
1311+ cmd.append(group_name)
1312+ subprocess.check_call(cmd)
1313+ group_info = grp.getgrnam(group_name)
1314+ return group_info
1315+
1316+
1317+def add_user_to_group(username, group):
1318+ """Add a user to a group"""
1319+ cmd = [
1320+ 'gpasswd', '-a',
1321+ username,
1322+ group
1323+ ]
1324+ log("Adding user {} to group {}".format(username, group))
1325+ subprocess.check_call(cmd)
1326+
1327+
1328+def rsync(from_path, to_path, flags='-r', options=None):
1329+ """Replicate the contents of a path"""
1330+ options = options or ['--delete', '--executability']
1331+ cmd = ['/usr/bin/rsync', flags]
1332+ cmd.extend(options)
1333+ cmd.append(from_path)
1334+ cmd.append(to_path)
1335+ log(" ".join(cmd))
1336+ return subprocess.check_output(cmd).decode('UTF-8').strip()
1337+
1338+
1339+def symlink(source, destination):
1340+ """Create a symbolic link"""
1341+ log("Symlinking {} as {}".format(source, destination))
1342+ cmd = [
1343+ 'ln',
1344+ '-sf',
1345+ source,
1346+ destination,
1347+ ]
1348+ subprocess.check_call(cmd)
1349+
1350+
1351+def mkdir(path, owner='root', group='root', perms=0o555, force=False):
1352+ """Create a directory"""
1353+ log("Making dir {} {}:{} {:o}".format(path, owner, group,
1354+ perms))
1355+ uid = pwd.getpwnam(owner).pw_uid
1356+ gid = grp.getgrnam(group).gr_gid
1357+ realpath = os.path.abspath(path)
1358+ path_exists = os.path.exists(realpath)
1359+ if path_exists and force:
1360+ if not os.path.isdir(realpath):
1361+ log("Removing non-directory file {} prior to mkdir()".format(path))
1362+ os.unlink(realpath)
1363+ os.makedirs(realpath, perms)
1364+ os.chown(realpath, uid, gid)
1365+ elif not path_exists:
1366+ os.makedirs(realpath, perms)
1367+ os.chown(realpath, uid, gid)
1368+
1369+
1370+def write_file(path, content, owner='root', group='root', perms=0o444):
1371+ """Create or overwrite a file with the contents of a string"""
1372+ log("Writing file {} {}:{} {:o}".format(path, owner, group, perms))
1373+ uid = pwd.getpwnam(owner).pw_uid
1374+ gid = grp.getgrnam(group).gr_gid
1375+ with open(path, 'w') as target:
1376+ os.fchown(target.fileno(), uid, gid)
1377+ os.fchmod(target.fileno(), perms)
1378+ target.write(content)
1379+
1380+
1381+def fstab_remove(mp):
1382+ """Remove the given mountpoint entry from /etc/fstab
1383+ """
1384+ return Fstab.remove_by_mountpoint(mp)
1385+
1386+
1387+def fstab_add(dev, mp, fs, options=None):
1388+ """Adds the given device entry to the /etc/fstab file
1389+ """
1390+ return Fstab.add(dev, mp, fs, options=options)
1391+
1392+
1393+def mount(device, mountpoint, options=None, persist=False, filesystem="ext3"):
1394+ """Mount a filesystem at a particular mountpoint"""
1395+ cmd_args = ['mount']
1396+ if options is not None:
1397+ cmd_args.extend(['-o', options])
1398+ cmd_args.extend([device, mountpoint])
1399+ try:
1400+ subprocess.check_output(cmd_args)
1401+ except subprocess.CalledProcessError as e:
1402+ log('Error mounting {} at {}\n{}'.format(device, mountpoint, e.output))
1403+ return False
1404+
1405+ if persist:
1406+ return fstab_add(device, mountpoint, filesystem, options=options)
1407+ return True
1408+
1409+
1410+def umount(mountpoint, persist=False):
1411+ """Unmount a filesystem"""
1412+ cmd_args = ['umount', mountpoint]
1413+ try:
1414+ subprocess.check_output(cmd_args)
1415+ except subprocess.CalledProcessError as e:
1416+ log('Error unmounting {}\n{}'.format(mountpoint, e.output))
1417+ return False
1418+
1419+ if persist:
1420+ return fstab_remove(mountpoint)
1421+ return True
1422+
1423+
1424+def mounts():
1425+ """Get a list of all mounted volumes as [[mountpoint,device],[...]]"""
1426+ with open('/proc/mounts') as f:
1427+ # [['/mount/point','/dev/path'],[...]]
1428+ system_mounts = [m[1::-1] for m in [l.strip().split()
1429+ for l in f.readlines()]]
1430+ return system_mounts
1431+
1432+
1433+def file_hash(path, hash_type='md5'):
1434+ """
1435+ Generate a hash checksum of the contents of 'path' or None if not found.
1436+
1437+ :param str hash_type: Any hash alrgorithm supported by :mod:`hashlib`,
1438+ such as md5, sha1, sha256, sha512, etc.
1439+ """
1440+ if os.path.exists(path):
1441+ h = getattr(hashlib, hash_type)()
1442+ with open(path, 'rb') as source:
1443+ h.update(source.read())
1444+ return h.hexdigest()
1445+ else:
1446+ return None
1447+
1448+
1449+def check_hash(path, checksum, hash_type='md5'):
1450+ """
1451+ Validate a file using a cryptographic checksum.
1452+
1453+ :param str checksum: Value of the checksum used to validate the file.
1454+ :param str hash_type: Hash algorithm used to generate `checksum`.
1455+ Can be any hash alrgorithm supported by :mod:`hashlib`,
1456+ such as md5, sha1, sha256, sha512, etc.
1457+ :raises ChecksumError: If the file fails the checksum
1458+
1459+ """
1460+ actual_checksum = file_hash(path, hash_type)
1461+ if checksum != actual_checksum:
1462+ raise ChecksumError("'%s' != '%s'" % (checksum, actual_checksum))
1463+
1464+
1465+class ChecksumError(ValueError):
1466+ pass
1467+
1468+
1469+def restart_on_change(restart_map, stopstart=False):
1470+ """Restart services based on configuration files changing
1471+
1472+ This function is used a decorator, for example::
1473+
1474+ @restart_on_change({
1475+ '/etc/ceph/ceph.conf': [ 'cinder-api', 'cinder-volume' ]
1476+ })
1477+ def ceph_client_changed():
1478+ pass # your code here
1479+
1480+ In this example, the cinder-api and cinder-volume services
1481+ would be restarted if /etc/ceph/ceph.conf is changed by the
1482+ ceph_client_changed function.
1483+ """
1484+ def wrap(f):
1485+ def wrapped_f(*args):
1486+ checksums = {}
1487+ for path in restart_map:
1488+ checksums[path] = file_hash(path)
1489+ f(*args)
1490+ restarts = []
1491+ for path in restart_map:
1492+ if checksums[path] != file_hash(path):
1493+ restarts += restart_map[path]
1494+ services_list = list(OrderedDict.fromkeys(restarts))
1495+ if not stopstart:
1496+ for service_name in services_list:
1497+ service('restart', service_name)
1498+ else:
1499+ for action in ['stop', 'start']:
1500+ for service_name in services_list:
1501+ service(action, service_name)
1502+ return wrapped_f
1503+ return wrap
1504+
1505+
1506+def lsb_release():
1507+ """Return /etc/lsb-release in a dict"""
1508+ d = {}
1509+ with open('/etc/lsb-release', 'r') as lsb:
1510+ for l in lsb:
1511+ k, v = l.split('=')
1512+ d[k.strip()] = v.strip()
1513+ return d
1514+
1515+
1516+def pwgen(length=None):
1517+ """Generate a random pasword."""
1518+ if length is None:
1519+ length = random.choice(range(35, 45))
1520+ alphanumeric_chars = [
1521+ l for l in (string.ascii_letters + string.digits)
1522+ if l not in 'l0QD1vAEIOUaeiou']
1523+ random_chars = [
1524+ random.choice(alphanumeric_chars) for _ in range(length)]
1525+ return(''.join(random_chars))
1526+
1527+
1528+def list_nics(nic_type):
1529+ '''Return a list of nics of given type(s)'''
1530+ if isinstance(nic_type, six.string_types):
1531+ int_types = [nic_type]
1532+ else:
1533+ int_types = nic_type
1534+ interfaces = []
1535+ for int_type in int_types:
1536+ cmd = ['ip', 'addr', 'show', 'label', int_type + '*']
1537+ ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1538+ ip_output = (line for line in ip_output if line)
1539+ for line in ip_output:
1540+ if line.split()[1].startswith(int_type):
1541+ matched = re.search('.*: (bond[0-9]+\.[0-9]+)@.*', line)
1542+ if matched:
1543+ interface = matched.groups()[0]
1544+ else:
1545+ interface = line.split()[1].replace(":", "")
1546+ interfaces.append(interface)
1547+
1548+ return interfaces
1549+
1550+
1551+def set_nic_mtu(nic, mtu):
1552+ '''Set MTU on a network interface'''
1553+ cmd = ['ip', 'link', 'set', nic, 'mtu', mtu]
1554+ subprocess.check_call(cmd)
1555+
1556+
1557+def get_nic_mtu(nic):
1558+ cmd = ['ip', 'addr', 'show', nic]
1559+ ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
1560+ mtu = ""
1561+ for line in ip_output:
1562+ words = line.split()
1563+ if 'mtu' in words:
1564+ mtu = words[words.index("mtu") + 1]
1565+ return mtu
1566+
1567+
1568+def get_nic_hwaddr(nic):
1569+ cmd = ['ip', '-o', '-0', 'addr', 'show', nic]
1570+ ip_output = subprocess.check_output(cmd).decode('UTF-8')
1571+ hwaddr = ""
1572+ words = ip_output.split()
1573+ if 'link/ether' in words:
1574+ hwaddr = words[words.index('link/ether') + 1]
1575+ return hwaddr
1576+
1577+
1578+def cmp_pkgrevno(package, revno, pkgcache=None):
1579+ '''Compare supplied revno with the revno of the installed package
1580+
1581+ * 1 => Installed revno is greater than supplied arg
1582+ * 0 => Installed revno is the same as supplied arg
1583+ * -1 => Installed revno is less than supplied arg
1584+
1585+ '''
1586+ import apt_pkg
1587+ if not pkgcache:
1588+ from charmhelpers.fetch import apt_cache
1589+ pkgcache = apt_cache()
1590+ pkg = pkgcache[package]
1591+ return apt_pkg.version_compare(pkg.current_ver.ver_str, revno)
1592+
1593+
1594+@contextmanager
1595+def chdir(d):
1596+ cur = os.getcwd()
1597+ try:
1598+ yield os.chdir(d)
1599+ finally:
1600+ os.chdir(cur)
1601+
1602+
1603+def chownr(path, owner, group):
1604+ uid = pwd.getpwnam(owner).pw_uid
1605+ gid = grp.getgrnam(group).gr_gid
1606+
1607+ for root, dirs, files in os.walk(path):
1608+ for name in dirs + files:
1609+ full = os.path.join(root, name)
1610+ broken_symlink = os.path.lexists(full) and not os.path.exists(full)
1611+ if not broken_symlink:
1612+ os.chown(full, uid, gid)
1613
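A minimal usage sketch for the host helpers added above (the hook code is hypothetical; the package name and version are illustrative):

    from charmhelpers.core.host import lsb_release, cmp_pkgrevno, pwgen

    # Branch on the Ubuntu release this unit runs.
    if lsb_release()['DISTRIB_CODENAME'] == 'trusty':
        # 1/0/-1 depending on whether the installed package is newer than,
        # equal to, or older than the supplied version (the package must be
        # installed, since the comparison reads the apt cache).
        if cmp_pkgrevno('jenkins', '1.532') >= 0:
            secret = pwgen(16)  # 16-character random alphanumeric string
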
1614=== added directory 'hooks/charmhelpers/core/services'
1615=== added file 'hooks/charmhelpers/core/services/__init__.py'
1616--- hooks/charmhelpers/core/services/__init__.py 1970-01-01 00:00:00 +0000
1617+++ hooks/charmhelpers/core/services/__init__.py 2015-01-26 11:20:09 +0000
1618@@ -0,0 +1,2 @@
1619+from .base import * # NOQA
1620+from .helpers import * # NOQA
1621
1622=== added file 'hooks/charmhelpers/core/services/base.py'
1623--- hooks/charmhelpers/core/services/base.py 1970-01-01 00:00:00 +0000
1624+++ hooks/charmhelpers/core/services/base.py 2015-01-26 11:20:09 +0000
1625@@ -0,0 +1,313 @@
1626+import os
1627+import re
1628+import json
1629+from collections import Iterable
1630+
1631+from charmhelpers.core import host
1632+from charmhelpers.core import hookenv
1633+
1634+
1635+__all__ = ['ServiceManager', 'ManagerCallback',
1636+ 'PortManagerCallback', 'open_ports', 'close_ports', 'manage_ports',
1637+ 'service_restart', 'service_stop']
1638+
1639+
1640+class ServiceManager(object):
1641+ def __init__(self, services=None):
1642+ """
1643+ Register a list of services, given their definitions.
1644+
1645+ Service definitions are dicts in the following formats (all keys except
1646+ 'service' are optional)::
1647+
1648+ {
1649+ "service": <service name>,
1650+ "required_data": <list of required data contexts>,
1651+ "provided_data": <list of provided data contexts>,
1652+ "data_ready": <one or more callbacks>,
1653+ "data_lost": <one or more callbacks>,
1654+ "start": <one or more callbacks>,
1655+ "stop": <one or more callbacks>,
1656+ "ports": <list of ports to manage>,
1657+ }
1658+
1659+ The 'required_data' list should contain dicts of required data (or
1660+ dependency managers that act like dicts and know how to collect the data).
1661+        Only when all items in the 'required_data' list are populated are the
1662+        'data_ready' and 'start' callbacks executed. See `is_ready()` for more
1663+ information.
1664+
1665+ The 'provided_data' list should contain relation data providers, most likely
1666+ a subclass of :class:`charmhelpers.core.services.helpers.RelationContext`,
1667+ that will indicate a set of data to set on a given relation.
1668+
1669+ The 'data_ready' value should be either a single callback, or a list of
1670+ callbacks, to be called when all items in 'required_data' pass `is_ready()`.
1671+ Each callback will be called with the service name as the only parameter.
1672+ After all of the 'data_ready' callbacks are called, the 'start' callbacks
1673+ are fired.
1674+
1675+ The 'data_lost' value should be either a single callback, or a list of
1676+ callbacks, to be called when a 'required_data' item no longer passes
1677+ `is_ready()`. Each callback will be called with the service name as the
1678+ only parameter. After all of the 'data_lost' callbacks are called,
1679+ the 'stop' callbacks are fired.
1680+
1681+ The 'start' value should be either a single callback, or a list of
1682+ callbacks, to be called when starting the service, after the 'data_ready'
1683+ callbacks are complete. Each callback will be called with the service
1684+ name as the only parameter. This defaults to
1685+ `[host.service_start, services.open_ports]`.
1686+
1687+ The 'stop' value should be either a single callback, or a list of
1688+ callbacks, to be called when stopping the service. If the service is
1689+ being stopped because it no longer has all of its 'required_data', this
1690+ will be called after all of the 'data_lost' callbacks are complete.
1691+ Each callback will be called with the service name as the only parameter.
1692+ This defaults to `[services.close_ports, host.service_stop]`.
1693+
1694+ The 'ports' value should be a list of ports to manage. The default
1695+ 'start' handler will open the ports after the service is started,
1696+ and the default 'stop' handler will close the ports prior to stopping
1697+ the service.
1698+
1699+
1700+ Examples:
1701+
1702+ The following registers an Upstart service called bingod that depends on
1703+ a mongodb relation and which runs a custom `db_migrate` function prior to
1704+ restarting the service, and a Runit service called spadesd::
1705+
1706+ manager = services.ServiceManager([
1707+ {
1708+ 'service': 'bingod',
1709+ 'ports': [80, 443],
1710+ 'required_data': [MongoRelation(), config(), {'my': 'data'}],
1711+ 'data_ready': [
1712+ services.template(source='bingod.conf'),
1713+ services.template(source='bingod.ini',
1714+ target='/etc/bingod.ini',
1715+ owner='bingo', perms=0400),
1716+ ],
1717+ },
1718+ {
1719+ 'service': 'spadesd',
1720+ 'data_ready': services.template(source='spadesd_run.j2',
1721+ target='/etc/sv/spadesd/run',
1722+ perms=0555),
1723+ 'start': runit_start,
1724+ 'stop': runit_stop,
1725+ },
1726+ ])
1727+ manager.manage()
1728+ """
1729+ self._ready_file = os.path.join(hookenv.charm_dir(), 'READY-SERVICES.json')
1730+ self._ready = None
1731+ self.services = {}
1732+ for service in services or []:
1733+ service_name = service['service']
1734+ self.services[service_name] = service
1735+
1736+ def manage(self):
1737+ """
1738+ Handle the current hook by doing The Right Thing with the registered services.
1739+ """
1740+ hook_name = hookenv.hook_name()
1741+ if hook_name == 'stop':
1742+ self.stop_services()
1743+ else:
1744+ self.provide_data()
1745+ self.reconfigure_services()
1746+ cfg = hookenv.config()
1747+ if cfg.implicit_save:
1748+ cfg.save()
1749+
1750+ def provide_data(self):
1751+ """
1752+ Set the relation data for each provider in the ``provided_data`` list.
1753+
1754+ A provider must have a `name` attribute, which indicates which relation
1755+ to set data on, and a `provide_data()` method, which returns a dict of
1756+ data to set.
1757+ """
1758+ hook_name = hookenv.hook_name()
1759+ for service in self.services.values():
1760+ for provider in service.get('provided_data', []):
1761+ if re.match(r'{}-relation-(joined|changed)'.format(provider.name), hook_name):
1762+ data = provider.provide_data()
1763+ _ready = provider._is_ready(data) if hasattr(provider, '_is_ready') else data
1764+ if _ready:
1765+ hookenv.relation_set(None, data)
1766+
1767+ def reconfigure_services(self, *service_names):
1768+ """
1769+ Update all files for one or more registered services, and,
1770+ if ready, optionally restart them.
1771+
1772+ If no service names are given, reconfigures all registered services.
1773+ """
1774+ for service_name in service_names or self.services.keys():
1775+ if self.is_ready(service_name):
1776+ self.fire_event('data_ready', service_name)
1777+ self.fire_event('start', service_name, default=[
1778+ service_restart,
1779+ manage_ports])
1780+ self.save_ready(service_name)
1781+ else:
1782+ if self.was_ready(service_name):
1783+ self.fire_event('data_lost', service_name)
1784+ self.fire_event('stop', service_name, default=[
1785+ manage_ports,
1786+ service_stop])
1787+ self.save_lost(service_name)
1788+
1789+ def stop_services(self, *service_names):
1790+ """
1791+ Stop one or more registered services, by name.
1792+
1793+ If no service names are given, stops all registered services.
1794+ """
1795+ for service_name in service_names or self.services.keys():
1796+ self.fire_event('stop', service_name, default=[
1797+ manage_ports,
1798+ service_stop])
1799+
1800+ def get_service(self, service_name):
1801+ """
1802+ Given the name of a registered service, return its service definition.
1803+ """
1804+ service = self.services.get(service_name)
1805+ if not service:
1806+ raise KeyError('Service not registered: %s' % service_name)
1807+ return service
1808+
1809+ def fire_event(self, event_name, service_name, default=None):
1810+ """
1811+ Fire a data_ready, data_lost, start, or stop event on a given service.
1812+ """
1813+ service = self.get_service(service_name)
1814+ callbacks = service.get(event_name, default)
1815+ if not callbacks:
1816+ return
1817+ if not isinstance(callbacks, Iterable):
1818+ callbacks = [callbacks]
1819+ for callback in callbacks:
1820+ if isinstance(callback, ManagerCallback):
1821+ callback(self, service_name, event_name)
1822+ else:
1823+ callback(service_name)
1824+
1825+ def is_ready(self, service_name):
1826+ """
1827+ Determine if a registered service is ready, by checking its 'required_data'.
1828+
1829+ A 'required_data' item can be any mapping type, and is considered ready
1830+ if `bool(item)` evaluates as True.
1831+ """
1832+ service = self.get_service(service_name)
1833+ reqs = service.get('required_data', [])
1834+ return all(bool(req) for req in reqs)
1835+
1836+ def _load_ready_file(self):
1837+ if self._ready is not None:
1838+ return
1839+ if os.path.exists(self._ready_file):
1840+ with open(self._ready_file) as fp:
1841+ self._ready = set(json.load(fp))
1842+ else:
1843+ self._ready = set()
1844+
1845+ def _save_ready_file(self):
1846+ if self._ready is None:
1847+ return
1848+ with open(self._ready_file, 'w') as fp:
1849+ json.dump(list(self._ready), fp)
1850+
1851+ def save_ready(self, service_name):
1852+ """
1853+ Save an indicator that the given service is now data_ready.
1854+ """
1855+ self._load_ready_file()
1856+ self._ready.add(service_name)
1857+ self._save_ready_file()
1858+
1859+ def save_lost(self, service_name):
1860+ """
1861+ Save an indicator that the given service is no longer data_ready.
1862+ """
1863+ self._load_ready_file()
1864+ self._ready.discard(service_name)
1865+ self._save_ready_file()
1866+
1867+ def was_ready(self, service_name):
1868+ """
1869+ Determine if the given service was previously data_ready.
1870+ """
1871+ self._load_ready_file()
1872+ return service_name in self._ready
1873+
1874+
1875+class ManagerCallback(object):
1876+ """
1877+ Special case of a callback that takes the `ServiceManager` instance
1878+ in addition to the service name.
1879+
1880+ Subclasses should implement `__call__` which should accept three parameters:
1881+
1882+ * `manager` The `ServiceManager` instance
1883+ * `service_name` The name of the service it's being triggered for
1884+ * `event_name` The name of the event that this callback is handling
1885+ """
1886+ def __call__(self, manager, service_name, event_name):
1887+ raise NotImplementedError()
1888+
1889+
1890+class PortManagerCallback(ManagerCallback):
1891+ """
1892+ Callback class that will open or close ports, for use as either
1893+ a start or stop action.
1894+ """
1895+ def __call__(self, manager, service_name, event_name):
1896+ service = manager.get_service(service_name)
1897+ new_ports = service.get('ports', [])
1898+ port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name))
1899+ if os.path.exists(port_file):
1900+ with open(port_file) as fp:
1901+ old_ports = fp.read().split(',')
1902+ for old_port in old_ports:
1903+ if bool(old_port):
1904+ old_port = int(old_port)
1905+ if old_port not in new_ports:
1906+ hookenv.close_port(old_port)
1907+ with open(port_file, 'w') as fp:
1908+ fp.write(','.join(str(port) for port in new_ports))
1909+ for port in new_ports:
1910+ if event_name == 'start':
1911+ hookenv.open_port(port)
1912+ elif event_name == 'stop':
1913+ hookenv.close_port(port)
1914+
1915+
1916+def service_stop(service_name):
1917+ """
1918+ Wrapper around host.service_stop to prevent spurious "unknown service"
1919+ messages in the logs.
1920+ """
1921+ if host.service_running(service_name):
1922+ host.service_stop(service_name)
1923+
1924+
1925+def service_restart(service_name):
1926+ """
1927+ Wrapper around host.service_restart to prevent spurious "unknown service"
1928+ messages in the logs.
1929+ """
1930+ if host.service_available(service_name):
1931+ if host.service_running(service_name):
1932+ host.service_restart(service_name)
1933+ else:
1934+ host.service_start(service_name)
1935+
1936+
1937+# Convenience aliases
1938+open_ports = close_ports = manage_ports = PortManagerCallback()
1939
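A sketch of how the services framework above is driven (values are hypothetical; the rewritten hooks in hooks/jenkins_hooks.py below use the Hooks decorator rather than this framework):

    from charmhelpers.core import services
    from charmhelpers.core.services.helpers import HttpRelation

    manager = services.ServiceManager([{
        'service': 'jenkins',                # init job to start/stop/restart
        'ports': [8080],                     # opened on 'start', closed on 'stop'
        'provided_data': [HttpRelation()],   # sets host/port on the website relation
        'required_data': [{'ready': True}],  # any truthy mapping unblocks 'start'
    }])
    manager.manage()                         # dispatches on the current hook name
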
1940=== added file 'hooks/charmhelpers/core/services/helpers.py'
1941--- hooks/charmhelpers/core/services/helpers.py 1970-01-01 00:00:00 +0000
1942+++ hooks/charmhelpers/core/services/helpers.py 2015-01-26 11:20:09 +0000
1943@@ -0,0 +1,243 @@
1944+import os
1945+import yaml
1946+from charmhelpers.core import hookenv
1947+from charmhelpers.core import templating
1948+
1949+from charmhelpers.core.services.base import ManagerCallback
1950+
1951+
1952+__all__ = ['RelationContext', 'TemplateCallback',
1953+ 'render_template', 'template']
1954+
1955+
1956+class RelationContext(dict):
1957+ """
1958+ Base class for a context generator that gets relation data from juju.
1959+
1960+ Subclasses must provide the attributes `name`, which is the name of the
1961+ interface of interest, `interface`, which is the type of the interface of
1962+ interest, and `required_keys`, which is the set of keys required for the
1963+ relation to be considered complete. The data for all interfaces matching
1964+    the `name` attribute that are complete will be used to populate the dictionary
1965+ values (see `get_data`, below).
1966+
1967+ The generated context will be namespaced under the relation :attr:`name`,
1968+ to prevent potential naming conflicts.
1969+
1970+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
1971+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
1972+ """
1973+ name = None
1974+ interface = None
1975+ required_keys = []
1976+
1977+ def __init__(self, name=None, additional_required_keys=None):
1978+ if name is not None:
1979+ self.name = name
1980+ if additional_required_keys is not None:
1981+ self.required_keys.extend(additional_required_keys)
1982+ self.get_data()
1983+
1984+ def __bool__(self):
1985+ """
1986+ Returns True if all of the required_keys are available.
1987+ """
1988+ return self.is_ready()
1989+
1990+ __nonzero__ = __bool__
1991+
1992+ def __repr__(self):
1993+ return super(RelationContext, self).__repr__()
1994+
1995+ def is_ready(self):
1996+ """
1997+ Returns True if all of the `required_keys` are available from any units.
1998+ """
1999+ ready = len(self.get(self.name, [])) > 0
2000+ if not ready:
2001+ hookenv.log('Incomplete relation: {}'.format(self.__class__.__name__), hookenv.DEBUG)
2002+ return ready
2003+
2004+ def _is_ready(self, unit_data):
2005+ """
2006+ Helper method that tests a set of relation data and returns True if
2007+ all of the `required_keys` are present.
2008+ """
2009+ return set(unit_data.keys()).issuperset(set(self.required_keys))
2010+
2011+ def get_data(self):
2012+ """
2013+ Retrieve the relation data for each unit involved in a relation and,
2014+ if complete, store it in a list under `self[self.name]`. This
2015+ is automatically called when the RelationContext is instantiated.
2016+
2017+        The units are sorted lexicographically first by the service ID, then by
2018+ the unit ID. Thus, if an interface has two other services, 'db:1'
2019+ and 'db:2', with 'db:1' having two units, 'wordpress/0' and 'wordpress/1',
2020+ and 'db:2' having one unit, 'mediawiki/0', all of which have a complete
2021+ set of data, the relation data for the units will be stored in the
2022+ order: 'wordpress/0', 'wordpress/1', 'mediawiki/0'.
2023+
2024+ If you only care about a single unit on the relation, you can just
2025+ access it as `{{ interface[0]['key'] }}`. However, if you can at all
2026+ support multiple units on a relation, you should iterate over the list,
2027+ like::
2028+
2029+ {% for unit in interface -%}
2030+ {{ unit['key'] }}{% if not loop.last %},{% endif %}
2031+ {%- endfor %}
2032+
2033+ Note that since all sets of relation data from all related services and
2034+ units are in a single list, if you need to know which service or unit a
2035+ set of data came from, you'll need to extend this class to preserve
2036+ that information.
2037+ """
2038+ if not hookenv.relation_ids(self.name):
2039+ return
2040+
2041+ ns = self.setdefault(self.name, [])
2042+ for rid in sorted(hookenv.relation_ids(self.name)):
2043+ for unit in sorted(hookenv.related_units(rid)):
2044+ reldata = hookenv.relation_get(rid=rid, unit=unit)
2045+ if self._is_ready(reldata):
2046+ ns.append(reldata)
2047+
2048+ def provide_data(self):
2049+ """
2050+ Return data to be relation_set for this interface.
2051+ """
2052+ return {}
2053+
2054+
2055+class MysqlRelation(RelationContext):
2056+ """
2057+ Relation context for the `mysql` interface.
2058+
2059+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
2060+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
2061+ """
2062+ name = 'db'
2063+ interface = 'mysql'
2064+ required_keys = ['host', 'user', 'password', 'database']
2065+
2066+
2067+class HttpRelation(RelationContext):
2068+ """
2069+ Relation context for the `http` interface.
2070+
2071+ :param str name: Override the relation :attr:`name`, since it can vary from charm to charm
2072+ :param list additional_required_keys: Extend the list of :attr:`required_keys`
2073+ """
2074+ name = 'website'
2075+ interface = 'http'
2076+ required_keys = ['host', 'port']
2077+
2078+ def provide_data(self):
2079+ return {
2080+ 'host': hookenv.unit_get('private-address'),
2081+ 'port': 80,
2082+ }
2083+
2084+
2085+class RequiredConfig(dict):
2086+ """
2087+ Data context that loads config options with one or more mandatory options.
2088+
2089+ Once the required options have been changed from their default values, all
2090+ config options will be available, namespaced under `config` to prevent
2091+ potential naming conflicts (for example, between a config option and a
2092+ relation property).
2093+
2094+ :param list *args: List of options that must be changed from their default values.
2095+ """
2096+
2097+ def __init__(self, *args):
2098+ self.required_options = args
2099+ self['config'] = hookenv.config()
2100+ with open(os.path.join(hookenv.charm_dir(), 'config.yaml')) as fp:
2101+ self.config = yaml.load(fp).get('options', {})
2102+
2103+ def __bool__(self):
2104+ for option in self.required_options:
2105+ if option not in self['config']:
2106+ return False
2107+ current_value = self['config'][option]
2108+ default_value = self.config[option].get('default')
2109+ if current_value == default_value:
2110+ return False
2111+ if current_value in (None, '') and default_value in (None, ''):
2112+ return False
2113+ return True
2114+
2115+ def __nonzero__(self):
2116+ return self.__bool__()
2117+
2118+
2119+class StoredContext(dict):
2120+ """
2121+ A data context that always returns the data that it was first created with.
2122+
2123+ This is useful to do a one-time generation of things like passwords, that
2124+ will thereafter use the same value that was originally generated, instead
2125+ of generating a new value each time it is run.
2126+ """
2127+ def __init__(self, file_name, config_data):
2128+ """
2129+ If the file exists, populate `self` with the data from the file.
2130+ Otherwise, populate with the given data and persist it to the file.
2131+ """
2132+ if os.path.exists(file_name):
2133+ self.update(self.read_context(file_name))
2134+ else:
2135+ self.store_context(file_name, config_data)
2136+ self.update(config_data)
2137+
2138+ def store_context(self, file_name, config_data):
2139+ if not os.path.isabs(file_name):
2140+ file_name = os.path.join(hookenv.charm_dir(), file_name)
2141+ with open(file_name, 'w') as file_stream:
2142+ os.fchmod(file_stream.fileno(), 0o600)
2143+ yaml.dump(config_data, file_stream)
2144+
2145+ def read_context(self, file_name):
2146+ if not os.path.isabs(file_name):
2147+ file_name = os.path.join(hookenv.charm_dir(), file_name)
2148+ with open(file_name, 'r') as file_stream:
2149+ data = yaml.load(file_stream)
2150+ if not data:
2151+ raise OSError("%s is empty" % file_name)
2152+ return data
2153+
2154+
2155+class TemplateCallback(ManagerCallback):
2156+ """
2157+ Callback class that will render a Jinja2 template, for use as a ready
2158+ action.
2159+
2160+ :param str source: The template source file, relative to
2161+ `$CHARM_DIR/templates`
2162+
2163+ :param str target: The target to write the rendered template to
2164+ :param str owner: The owner of the rendered file
2165+ :param str group: The group of the rendered file
2166+ :param int perms: The permissions of the rendered file
2167+ """
2168+ def __init__(self, source, target,
2169+ owner='root', group='root', perms=0o444):
2170+ self.source = source
2171+ self.target = target
2172+ self.owner = owner
2173+ self.group = group
2174+ self.perms = perms
2175+
2176+ def __call__(self, manager, service_name, event_name):
2177+ service = manager.get_service(service_name)
2178+ context = {}
2179+ for ctx in service.get('required_data', []):
2180+ context.update(ctx)
2181+ templating.render(self.source, self.target, context,
2182+ self.owner, self.group, self.perms)
2183+
2184+
2185+# Convenience aliases for templates
2186+render_template = template = TemplateCallback
2187
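A sketch of a custom RelationContext, assuming a hypothetical interface that publishes an 'executors' key (the class, relation and key names are illustrative):

    from charmhelpers.core.services.helpers import RelationContext

    class SlaveRelation(RelationContext):
        name = 'master'
        interface = 'jenkins-slave'
        required_keys = ['executors']

    ctx = SlaveRelation()
    if ctx.is_ready():
        # Complete units are collected under ctx[name], sorted by relation id
        # and then by unit id.
        for unit_data in ctx['master']:
            executors = unit_data['executors']
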
2188=== added file 'hooks/charmhelpers/core/sysctl.py'
2189--- hooks/charmhelpers/core/sysctl.py 1970-01-01 00:00:00 +0000
2190+++ hooks/charmhelpers/core/sysctl.py 2015-01-26 11:20:09 +0000
2191@@ -0,0 +1,34 @@
2192+#!/usr/bin/env python
2193+# -*- coding: utf-8 -*-
2194+
2195+__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>'
2196+
2197+import yaml
2198+
2199+from subprocess import check_call
2200+
2201+from charmhelpers.core.hookenv import (
2202+ log,
2203+ DEBUG,
2204+)
2205+
2206+
2207+def create(sysctl_dict, sysctl_file):
2208+ """Creates a sysctl.conf file from a YAML associative array
2209+
2210+    :param sysctl_dict: a YAML-formatted string of sysctl options, eg "{ 'kernel.max_pid': 1337 }"
2211+    :type sysctl_dict: str or unicode
2212+ :param sysctl_file: path to the sysctl file to be saved
2213+ :type sysctl_file: str or unicode
2214+ :returns: None
2215+ """
2216+ sysctl_dict = yaml.load(sysctl_dict)
2217+
2218+ with open(sysctl_file, "w") as fd:
2219+ for key, value in sysctl_dict.items():
2220+ fd.write("{}={}\n".format(key, value))
2221+
2222+ log("Updating sysctl_file: %s values: %s" % (sysctl_file, sysctl_dict),
2223+ level=DEBUG)
2224+
2225+ check_call(["sysctl", "-p", sysctl_file])
2226
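A sketch of the sysctl helper above: it takes a YAML string of settings, writes them to the given file and then runs `sysctl -p` against it (the key and file path are illustrative):

    from charmhelpers.core.sysctl import create

    create("{ net.core.somaxconn: 1024 }", "/etc/sysctl.d/50-jenkins.conf")
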
2227=== added file 'hooks/charmhelpers/core/templating.py'
2228--- hooks/charmhelpers/core/templating.py 1970-01-01 00:00:00 +0000
2229+++ hooks/charmhelpers/core/templating.py 2015-01-26 11:20:09 +0000
2230@@ -0,0 +1,52 @@
2231+import os
2232+
2233+from charmhelpers.core import host
2234+from charmhelpers.core import hookenv
2235+
2236+
2237+def render(source, target, context, owner='root', group='root',
2238+ perms=0o444, templates_dir=None):
2239+ """
2240+ Render a template.
2241+
2242+ The `source` path, if not absolute, is relative to the `templates_dir`.
2243+
2244+ The `target` path should be absolute.
2245+
2246+ The context should be a dict containing the values to be replaced in the
2247+ template.
2248+
2249+ The `owner`, `group`, and `perms` options will be passed to `write_file`.
2250+
2251+ If omitted, `templates_dir` defaults to the `templates` folder in the charm.
2252+
2253+ Note: Using this requires python-jinja2; if it is not installed, calling
2254+ this will attempt to use charmhelpers.fetch.apt_install to install it.
2255+ """
2256+ try:
2257+ from jinja2 import FileSystemLoader, Environment, exceptions
2258+ except ImportError:
2259+ try:
2260+ from charmhelpers.fetch import apt_install
2261+ except ImportError:
2262+ hookenv.log('Could not import jinja2, and could not import '
2263+ 'charmhelpers.fetch to install it',
2264+ level=hookenv.ERROR)
2265+ raise
2266+ apt_install('python-jinja2', fatal=True)
2267+ from jinja2 import FileSystemLoader, Environment, exceptions
2268+
2269+ if templates_dir is None:
2270+ templates_dir = os.path.join(hookenv.charm_dir(), 'templates')
2271+ loader = Environment(loader=FileSystemLoader(templates_dir))
2272+ try:
2273+ source = source
2274+ template = loader.get_template(source)
2275+ except exceptions.TemplateNotFound as e:
2276+ hookenv.log('Could not load template %s from %s.' %
2277+ (source, templates_dir),
2278+ level=hookenv.ERROR)
2279+ raise e
2280+ content = template.render(context)
2281+ host.mkdir(os.path.dirname(target), owner, group)
2282+ host.write_file(target, content, owner, group, perms)
2283
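A sketch of the render() helper above (the template name, context and ownership are illustrative; files are written with 0444 permissions unless perms is given):

    from charmhelpers.core.templating import render

    # Renders $CHARM_DIR/templates/jenkins-config.xml and writes the result
    # to the target path with the requested ownership.
    render('jenkins-config.xml', '/var/lib/jenkins/config.xml',
           {'master_executors': 2}, owner='jenkins', group='nogroup')
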
2284=== added directory 'hooks/charmhelpers/fetch'
2285=== added file 'hooks/charmhelpers/fetch/__init__.py'
2286--- hooks/charmhelpers/fetch/__init__.py 1970-01-01 00:00:00 +0000
2287+++ hooks/charmhelpers/fetch/__init__.py 2015-01-26 11:20:09 +0000
2288@@ -0,0 +1,423 @@
2289+import importlib
2290+from tempfile import NamedTemporaryFile
2291+import time
2292+from yaml import safe_load
2293+from charmhelpers.core.host import (
2294+ lsb_release
2295+)
2296+import subprocess
2297+from charmhelpers.core.hookenv import (
2298+ config,
2299+ log,
2300+)
2301+import os
2302+
2303+import six
2304+if six.PY3:
2305+ from urllib.parse import urlparse, urlunparse
2306+else:
2307+ from urlparse import urlparse, urlunparse
2308+
2309+
2310+CLOUD_ARCHIVE = """# Ubuntu Cloud Archive
2311+deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main
2312+"""
2313+PROPOSED_POCKET = """# Proposed
2314+deb http://archive.ubuntu.com/ubuntu {}-proposed main universe multiverse restricted
2315+"""
2316+CLOUD_ARCHIVE_POCKETS = {
2317+ # Folsom
2318+ 'folsom': 'precise-updates/folsom',
2319+ 'precise-folsom': 'precise-updates/folsom',
2320+ 'precise-folsom/updates': 'precise-updates/folsom',
2321+ 'precise-updates/folsom': 'precise-updates/folsom',
2322+ 'folsom/proposed': 'precise-proposed/folsom',
2323+ 'precise-folsom/proposed': 'precise-proposed/folsom',
2324+ 'precise-proposed/folsom': 'precise-proposed/folsom',
2325+ # Grizzly
2326+ 'grizzly': 'precise-updates/grizzly',
2327+ 'precise-grizzly': 'precise-updates/grizzly',
2328+ 'precise-grizzly/updates': 'precise-updates/grizzly',
2329+ 'precise-updates/grizzly': 'precise-updates/grizzly',
2330+ 'grizzly/proposed': 'precise-proposed/grizzly',
2331+ 'precise-grizzly/proposed': 'precise-proposed/grizzly',
2332+ 'precise-proposed/grizzly': 'precise-proposed/grizzly',
2333+ # Havana
2334+ 'havana': 'precise-updates/havana',
2335+ 'precise-havana': 'precise-updates/havana',
2336+ 'precise-havana/updates': 'precise-updates/havana',
2337+ 'precise-updates/havana': 'precise-updates/havana',
2338+ 'havana/proposed': 'precise-proposed/havana',
2339+ 'precise-havana/proposed': 'precise-proposed/havana',
2340+ 'precise-proposed/havana': 'precise-proposed/havana',
2341+ # Icehouse
2342+ 'icehouse': 'precise-updates/icehouse',
2343+ 'precise-icehouse': 'precise-updates/icehouse',
2344+ 'precise-icehouse/updates': 'precise-updates/icehouse',
2345+ 'precise-updates/icehouse': 'precise-updates/icehouse',
2346+ 'icehouse/proposed': 'precise-proposed/icehouse',
2347+ 'precise-icehouse/proposed': 'precise-proposed/icehouse',
2348+ 'precise-proposed/icehouse': 'precise-proposed/icehouse',
2349+ # Juno
2350+ 'juno': 'trusty-updates/juno',
2351+ 'trusty-juno': 'trusty-updates/juno',
2352+ 'trusty-juno/updates': 'trusty-updates/juno',
2353+ 'trusty-updates/juno': 'trusty-updates/juno',
2354+ 'juno/proposed': 'trusty-proposed/juno',
2355+ 'trusty-juno/proposed': 'trusty-proposed/juno',
2356+ 'trusty-proposed/juno': 'trusty-proposed/juno',
2357+ # Kilo
2358+ 'kilo': 'trusty-updates/kilo',
2359+ 'trusty-kilo': 'trusty-updates/kilo',
2360+ 'trusty-kilo/updates': 'trusty-updates/kilo',
2361+ 'trusty-updates/kilo': 'trusty-updates/kilo',
2362+ 'kilo/proposed': 'trusty-proposed/kilo',
2363+ 'trusty-kilo/proposed': 'trusty-proposed/kilo',
2364+ 'trusty-proposed/kilo': 'trusty-proposed/kilo',
2365+}
2366+
2367+# The order of this list is very important. Handlers should be listed from
2368+# least- to most-specific URL matching.
2369+FETCH_HANDLERS = (
2370+ 'charmhelpers.fetch.archiveurl.ArchiveUrlFetchHandler',
2371+ 'charmhelpers.fetch.bzrurl.BzrUrlFetchHandler',
2372+ 'charmhelpers.fetch.giturl.GitUrlFetchHandler',
2373+)
2374+
2375+APT_NO_LOCK = 100 # The return code for "couldn't acquire lock" in APT.
2376+APT_NO_LOCK_RETRY_DELAY = 10 # Wait 10 seconds between apt lock checks.
2377+APT_NO_LOCK_RETRY_COUNT = 30 # Retry to acquire the lock X times.
2378+
2379+
2380+class SourceConfigError(Exception):
2381+ pass
2382+
2383+
2384+class UnhandledSource(Exception):
2385+ pass
2386+
2387+
2388+class AptLockError(Exception):
2389+ pass
2390+
2391+
2392+class BaseFetchHandler(object):
2393+
2394+ """Base class for FetchHandler implementations in fetch plugins"""
2395+
2396+ def can_handle(self, source):
2397+ """Returns True if the source can be handled. Otherwise returns
2398+ a string explaining why it cannot"""
2399+ return "Wrong source type"
2400+
2401+ def install(self, source):
2402+ """Try to download and unpack the source. Return the path to the
2403+ unpacked files or raise UnhandledSource."""
2404+ raise UnhandledSource("Wrong source type {}".format(source))
2405+
2406+ def parse_url(self, url):
2407+ return urlparse(url)
2408+
2409+ def base_url(self, url):
2410+ """Return url without querystring or fragment"""
2411+ parts = list(self.parse_url(url))
2412+ parts[4:] = ['' for i in parts[4:]]
2413+ return urlunparse(parts)
2414+
2415+
2416+def filter_installed_packages(packages):
2417+ """Returns a list of packages that require installation"""
2418+ cache = apt_cache()
2419+ _pkgs = []
2420+ for package in packages:
2421+ try:
2422+ p = cache[package]
2423+ p.current_ver or _pkgs.append(package)
2424+ except KeyError:
2425+ log('Package {} has no installation candidate.'.format(package),
2426+ level='WARNING')
2427+ _pkgs.append(package)
2428+ return _pkgs
2429+
2430+
2431+def apt_cache(in_memory=True):
2432+ """Build and return an apt cache"""
2433+ import apt_pkg
2434+ apt_pkg.init()
2435+ if in_memory:
2436+ apt_pkg.config.set("Dir::Cache::pkgcache", "")
2437+ apt_pkg.config.set("Dir::Cache::srcpkgcache", "")
2438+ return apt_pkg.Cache()
2439+
2440+
2441+def apt_install(packages, options=None, fatal=False):
2442+ """Install one or more packages"""
2443+ if options is None:
2444+ options = ['--option=Dpkg::Options::=--force-confold']
2445+
2446+ cmd = ['apt-get', '--assume-yes']
2447+ cmd.extend(options)
2448+ cmd.append('install')
2449+ if isinstance(packages, six.string_types):
2450+ cmd.append(packages)
2451+ else:
2452+ cmd.extend(packages)
2453+ log("Installing {} with options: {}".format(packages,
2454+ options))
2455+ _run_apt_command(cmd, fatal)
2456+
2457+
2458+def apt_upgrade(options=None, fatal=False, dist=False):
2459+ """Upgrade all packages"""
2460+ if options is None:
2461+ options = ['--option=Dpkg::Options::=--force-confold']
2462+
2463+ cmd = ['apt-get', '--assume-yes']
2464+ cmd.extend(options)
2465+ if dist:
2466+ cmd.append('dist-upgrade')
2467+ else:
2468+ cmd.append('upgrade')
2469+ log("Upgrading with options: {}".format(options))
2470+ _run_apt_command(cmd, fatal)
2471+
2472+
2473+def apt_update(fatal=False):
2474+ """Update local apt cache"""
2475+ cmd = ['apt-get', 'update']
2476+ _run_apt_command(cmd, fatal)
2477+
2478+
2479+def apt_purge(packages, fatal=False):
2480+ """Purge one or more packages"""
2481+ cmd = ['apt-get', '--assume-yes', 'purge']
2482+ if isinstance(packages, six.string_types):
2483+ cmd.append(packages)
2484+ else:
2485+ cmd.extend(packages)
2486+ log("Purging {}".format(packages))
2487+ _run_apt_command(cmd, fatal)
2488+
2489+
2490+def apt_hold(packages, fatal=False):
2491+ """Hold one or more packages"""
2492+ cmd = ['apt-mark', 'hold']
2493+ if isinstance(packages, six.string_types):
2494+ cmd.append(packages)
2495+ else:
2496+ cmd.extend(packages)
2497+ log("Holding {}".format(packages))
2498+
2499+ if fatal:
2500+ subprocess.check_call(cmd)
2501+ else:
2502+ subprocess.call(cmd)
2503+
2504+
2505+def add_source(source, key=None):
2506+ """Add a package source to this system.
2507+
2508+ @param source: a URL or sources.list entry, as supported by
2509+ add-apt-repository(1). Examples::
2510+
2511+ ppa:charmers/example
2512+ deb https://stub:key@private.example.com/ubuntu trusty main
2513+
2514+ In addition:
2515+        'proposed' may be used to enable the standard 'proposed'
2516+        pocket for the release.
2517+ 'cloud:' may be used to activate official cloud archive pockets,
2518+ such as 'cloud:icehouse'
2519+ 'distro' may be used as a noop
2520+
2521+ @param key: A key to be added to the system's APT keyring and used
2522+ to verify the signatures on packages. Ideally, this should be an
2523+ ASCII format GPG public key including the block headers. A GPG key
2524+ id may also be used, but be aware that only insecure protocols are
2525+ available to retrieve the actual public key from a public keyserver
2526+ placing your Juju environment at risk. ppa and cloud archive keys
2527+    are securely added automatically, so should not be provided.
2528+ """
2529+ if source is None:
2530+ log('Source is not present. Skipping')
2531+ return
2532+
2533+ if (source.startswith('ppa:') or
2534+ source.startswith('http') or
2535+ source.startswith('deb ') or
2536+ source.startswith('cloud-archive:')):
2537+ subprocess.check_call(['add-apt-repository', '--yes', source])
2538+ elif source.startswith('cloud:'):
2539+ apt_install(filter_installed_packages(['ubuntu-cloud-keyring']),
2540+ fatal=True)
2541+ pocket = source.split(':')[-1]
2542+ if pocket not in CLOUD_ARCHIVE_POCKETS:
2543+ raise SourceConfigError(
2544+ 'Unsupported cloud: source option %s' %
2545+ pocket)
2546+ actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket]
2547+ with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt:
2548+ apt.write(CLOUD_ARCHIVE.format(actual_pocket))
2549+ elif source == 'proposed':
2550+ release = lsb_release()['DISTRIB_CODENAME']
2551+ with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt:
2552+ apt.write(PROPOSED_POCKET.format(release))
2553+ elif source == 'distro':
2554+ pass
2555+ else:
2556+ log("Unknown source: {!r}".format(source))
2557+
2558+ if key:
2559+ if '-----BEGIN PGP PUBLIC KEY BLOCK-----' in key:
2560+ with NamedTemporaryFile('w+') as key_file:
2561+ key_file.write(key)
2562+ key_file.flush()
2563+ key_file.seek(0)
2564+ subprocess.check_call(['apt-key', 'add', '-'], stdin=key_file)
2565+ else:
2566+ # Note that hkp: is in no way a secure protocol. Using a
2567+ # GPG key id is pointless from a security POV unless you
2568+ # absolutely trust your network and DNS.
2569+ subprocess.check_call(['apt-key', 'adv', '--keyserver',
2570+ 'hkp://keyserver.ubuntu.com:80', '--recv',
2571+ key])
2572+
2573+
2574+def configure_sources(update=False,
2575+ sources_var='install_sources',
2576+ keys_var='install_keys'):
2577+ """
2578+ Configure multiple sources from charm configuration.
2579+
2580+ The lists are encoded as yaml fragments in the configuration.
2581+    The fragment needs to be included as a string. Sources and their
2582+ corresponding keys are of the types supported by add_source().
2583+
2584+ Example config:
2585+ install_sources: |
2586+ - "ppa:foo"
2587+ - "http://example.com/repo precise main"
2588+ install_keys: |
2589+ - null
2590+ - "a1b2c3d4"
2591+
2592+ Note that 'null' (a.k.a. None) should not be quoted.
2593+ """
2594+ sources = safe_load((config(sources_var) or '').strip()) or []
2595+ keys = safe_load((config(keys_var) or '').strip()) or None
2596+
2597+ if isinstance(sources, six.string_types):
2598+ sources = [sources]
2599+
2600+ if keys is None:
2601+ for source in sources:
2602+ add_source(source, None)
2603+ else:
2604+ if isinstance(keys, six.string_types):
2605+ keys = [keys]
2606+
2607+ if len(sources) != len(keys):
2608+ raise SourceConfigError(
2609+ 'Install sources and keys lists are different lengths')
2610+ for source, key in zip(sources, keys):
2611+ add_source(source, key)
2612+ if update:
2613+ apt_update(fatal=True)
2614+
2615+
2616+def install_remote(source, *args, **kwargs):
2617+ """
2618+ Install a file tree from a remote source
2619+
2620+ The specified source should be a url of the form:
2621+ scheme://[host]/path[#[option=value][&...]]
2622+
2623+    Schemes supported are based on this module's submodules.
2624+ Options supported are submodule-specific.
2625+ Additional arguments are passed through to the submodule.
2626+
2627+ For example::
2628+
2629+ dest = install_remote('http://example.com/archive.tgz',
2630+ checksum='deadbeef',
2631+ hash_type='sha1')
2632+
2633+ This will download `archive.tgz`, validate it using SHA1 and, if
2634+ the file is ok, extract it and return the directory in which it
2635+ was extracted. If the checksum fails, it will raise
2636+ :class:`charmhelpers.core.host.ChecksumError`.
2637+ """
2638+ # We ONLY check for True here because can_handle may return a string
2639+ # explaining why it can't handle a given source.
2640+ handlers = [h for h in plugins() if h.can_handle(source) is True]
2641+ installed_to = None
2642+ for handler in handlers:
2643+ try:
2644+ installed_to = handler.install(source, *args, **kwargs)
2645+ except UnhandledSource:
2646+ pass
2647+ if not installed_to:
2648+ raise UnhandledSource("No handler found for source {}".format(source))
2649+ return installed_to
2650+
2651+
2652+def install_from_config(config_var_name):
2653+ charm_config = config()
2654+ source = charm_config[config_var_name]
2655+ return install_remote(source)
2656+
2657+
2658+def plugins(fetch_handlers=None):
2659+ if not fetch_handlers:
2660+ fetch_handlers = FETCH_HANDLERS
2661+ plugin_list = []
2662+ for handler_name in fetch_handlers:
2663+ package, classname = handler_name.rsplit('.', 1)
2664+ try:
2665+ handler_class = getattr(
2666+ importlib.import_module(package),
2667+ classname)
2668+ plugin_list.append(handler_class())
2669+ except (ImportError, AttributeError):
2670+            # Skip missing plugins so that they can be omitted from
2671+ # installation if desired
2672+ log("FetchHandler {} not found, skipping plugin".format(
2673+ handler_name))
2674+ return plugin_list
2675+
2676+
2677+def _run_apt_command(cmd, fatal=False):
2678+ """
2679+ Run an APT command, checking output and retrying if the fatal flag is set
2680+ to True.
2681+
2682+    :param cmd: list: The apt command to run.
2683+    :param fatal: bool: Whether the command's output should be checked and
2684+            retried.
2685+ """
2686+ env = os.environ.copy()
2687+
2688+ if 'DEBIAN_FRONTEND' not in env:
2689+ env['DEBIAN_FRONTEND'] = 'noninteractive'
2690+
2691+ if fatal:
2692+ retry_count = 0
2693+ result = None
2694+
2695+ # If the command is considered "fatal", we need to retry if the apt
2696+ # lock was not acquired.
2697+
2698+ while result is None or result == APT_NO_LOCK:
2699+ try:
2700+ result = subprocess.check_call(cmd, env=env)
2701+ except subprocess.CalledProcessError as e:
2702+ retry_count = retry_count + 1
2703+ if retry_count > APT_NO_LOCK_RETRY_COUNT:
2704+ raise
2705+ result = e.returncode
2706+ log("Couldn't acquire DPKG lock. Will retry in {} seconds."
2707+ "".format(APT_NO_LOCK_RETRY_DELAY))
2708+ time.sleep(APT_NO_LOCK_RETRY_DELAY)
2709+
2710+ else:
2711+ subprocess.call(cmd, env=env)
2712
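A sketch of the fetch helpers above, e.g. adding an extra archive and installing from it (the source line and key id are illustrative, not the real Jenkins repository key):

    from charmhelpers.fetch import add_source, apt_update, apt_install

    add_source('deb http://pkg.jenkins-ci.org/debian-stable binary/',
               key='0123456789ABCDEF')
    apt_update(fatal=True)
    apt_install(['jenkins', 'default-jre-headless'], fatal=True)
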
2713=== added file 'hooks/charmhelpers/fetch/archiveurl.py'
2714--- hooks/charmhelpers/fetch/archiveurl.py 1970-01-01 00:00:00 +0000
2715+++ hooks/charmhelpers/fetch/archiveurl.py 2015-01-26 11:20:09 +0000
2716@@ -0,0 +1,145 @@
2717+import os
2718+import hashlib
2719+import re
2720+
2721+import six
2722+if six.PY3:
2723+ from urllib.request import (
2724+ build_opener, install_opener, urlopen, urlretrieve,
2725+ HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
2726+ )
2727+ from urllib.parse import urlparse, urlunparse, parse_qs
2728+ from urllib.error import URLError
2729+else:
2730+ from urllib import urlretrieve
2731+ from urllib2 import (
2732+ build_opener, install_opener, urlopen,
2733+ HTTPPasswordMgrWithDefaultRealm, HTTPBasicAuthHandler,
2734+ URLError
2735+ )
2736+ from urlparse import urlparse, urlunparse, parse_qs
2737+
2738+from charmhelpers.fetch import (
2739+ BaseFetchHandler,
2740+ UnhandledSource
2741+)
2742+from charmhelpers.payload.archive import (
2743+ get_archive_handler,
2744+ extract,
2745+)
2746+from charmhelpers.core.host import mkdir, check_hash
2747+
2748+
2749+def splituser(host):
2750+ '''urllib.splituser(), but six's support of this seems broken'''
2751+ _userprog = re.compile('^(.*)@(.*)$')
2752+ match = _userprog.match(host)
2753+ if match:
2754+ return match.group(1, 2)
2755+ return None, host
2756+
2757+
2758+def splitpasswd(user):
2759+ '''urllib.splitpasswd(), but six's support of this is missing'''
2760+ _passwdprog = re.compile('^([^:]*):(.*)$', re.S)
2761+ match = _passwdprog.match(user)
2762+ if match:
2763+ return match.group(1, 2)
2764+ return user, None
2765+
2766+
2767+class ArchiveUrlFetchHandler(BaseFetchHandler):
2768+ """
2769+ Handler to download archive files from arbitrary URLs.
2770+
2771+ Can fetch from http, https, ftp, and file URLs.
2772+
2773+ Can install either tarballs (.tar, .tgz, .tbz2, etc) or zip files.
2774+
2775+ Installs the contents of the archive in $CHARM_DIR/fetched/.
2776+ """
2777+ def can_handle(self, source):
2778+ url_parts = self.parse_url(source)
2779+ if url_parts.scheme not in ('http', 'https', 'ftp', 'file'):
2780+ return "Wrong source type"
2781+ if get_archive_handler(self.base_url(source)):
2782+ return True
2783+ return False
2784+
2785+ def download(self, source, dest):
2786+ """
2787+ Download an archive file.
2788+
2789+ :param str source: URL pointing to an archive file.
2790+ :param str dest: Local path location to download archive file to.
2791+ """
2792+        # propagate all exceptions
2793+ # URLError, OSError, etc
2794+ proto, netloc, path, params, query, fragment = urlparse(source)
2795+ if proto in ('http', 'https'):
2796+ auth, barehost = splituser(netloc)
2797+ if auth is not None:
2798+ source = urlunparse((proto, barehost, path, params, query, fragment))
2799+ username, password = splitpasswd(auth)
2800+ passman = HTTPPasswordMgrWithDefaultRealm()
2801+ # Realm is set to None in add_password to force the username and password
2802+ # to be used whatever the realm
2803+ passman.add_password(None, source, username, password)
2804+ authhandler = HTTPBasicAuthHandler(passman)
2805+ opener = build_opener(authhandler)
2806+ install_opener(opener)
2807+ response = urlopen(source)
2808+ try:
2809+ with open(dest, 'w') as dest_file:
2810+ dest_file.write(response.read())
2811+ except Exception as e:
2812+ if os.path.isfile(dest):
2813+ os.unlink(dest)
2814+ raise e
2815+
2816+ # Mandatory file validation via Sha1 or MD5 hashing.
2817+ def download_and_validate(self, url, hashsum, validate="sha1"):
2818+ tempfile, headers = urlretrieve(url)
2819+ check_hash(tempfile, hashsum, validate)
2820+ return tempfile
2821+
2822+ def install(self, source, dest=None, checksum=None, hash_type='sha1'):
2823+ """
2824+ Download and install an archive file, with optional checksum validation.
2825+
2826+ The checksum can also be given on the `source` URL's fragment.
2827+ For example::
2828+
2829+ handler.install('http://example.com/file.tgz#sha1=deadbeef')
2830+
2831+ :param str source: URL pointing to an archive file.
2832+ :param str dest: Local destination path to install to. If not given,
2833+ installs to `$CHARM_DIR/archives/archive_file_name`.
2834+ :param str checksum: If given, validate the archive file after download.
2835+ :param str hash_type: Algorithm used to generate `checksum`.
2836+            Can be any hash algorithm supported by :mod:`hashlib`,
2837+ such as md5, sha1, sha256, sha512, etc.
2838+
2839+ """
2840+ url_parts = self.parse_url(source)
2841+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), 'fetched')
2842+ if not os.path.exists(dest_dir):
2843+ mkdir(dest_dir, perms=0o755)
2844+ dld_file = os.path.join(dest_dir, os.path.basename(url_parts.path))
2845+ try:
2846+ self.download(source, dld_file)
2847+ except URLError as e:
2848+ raise UnhandledSource(e.reason)
2849+ except OSError as e:
2850+ raise UnhandledSource(e.strerror)
2851+ options = parse_qs(url_parts.fragment)
2852+ for key, value in options.items():
2853+ if not six.PY3:
2854+ algorithms = hashlib.algorithms
2855+ else:
2856+ algorithms = hashlib.algorithms_available
2857+ if key in algorithms:
2858+ check_hash(dld_file, value, key)
2859+ if checksum:
2860+ check_hash(dld_file, checksum, hash_type)
2861+ return extract(dld_file, dest)
2862
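A sketch of fetching an archive through this handler via install_remote (the URL and checksum are illustrative, and the sketch assumes the charmhelpers.payload.archive helpers imported above are also synced into the charm):

    from charmhelpers.fetch import install_remote

    # Downloads into $CHARM_DIR/fetched/, verifies the sha1 given on the URL
    # fragment, extracts the archive and returns the extraction directory.
    path = install_remote('http://example.com/tools.tgz#sha1=deadbeef')
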
2863=== added file 'hooks/charmhelpers/fetch/bzrurl.py'
2864--- hooks/charmhelpers/fetch/bzrurl.py 1970-01-01 00:00:00 +0000
2865+++ hooks/charmhelpers/fetch/bzrurl.py 2015-01-26 11:20:09 +0000
2866@@ -0,0 +1,54 @@
2867+import os
2868+from charmhelpers.fetch import (
2869+ BaseFetchHandler,
2870+ UnhandledSource
2871+)
2872+from charmhelpers.core.host import mkdir
2873+
2874+import six
2875+if six.PY3:
2876+ raise ImportError('bzrlib does not support Python3')
2877+
2878+try:
2879+ from bzrlib.branch import Branch
2880+except ImportError:
2881+ from charmhelpers.fetch import apt_install
2882+ apt_install("python-bzrlib")
2883+ from bzrlib.branch import Branch
2884+
2885+
2886+class BzrUrlFetchHandler(BaseFetchHandler):
2887+ """Handler for bazaar branches via generic and lp URLs"""
2888+ def can_handle(self, source):
2889+ url_parts = self.parse_url(source)
2890+ if url_parts.scheme not in ('bzr+ssh', 'lp'):
2891+ return False
2892+ else:
2893+ return True
2894+
2895+ def branch(self, source, dest):
2896+ url_parts = self.parse_url(source)
2897+ # If we use lp:branchname scheme we need to load plugins
2898+ if not self.can_handle(source):
2899+ raise UnhandledSource("Cannot handle {}".format(source))
2900+ if url_parts.scheme == "lp":
2901+ from bzrlib.plugin import load_plugins
2902+ load_plugins()
2903+ try:
2904+ remote_branch = Branch.open(source)
2905+ remote_branch.bzrdir.sprout(dest).open_branch()
2906+ except Exception as e:
2907+ raise e
2908+
2909+ def install(self, source):
2910+ url_parts = self.parse_url(source)
2911+ branch_name = url_parts.path.strip("/").split("/")[-1]
2912+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched",
2913+ branch_name)
2914+ if not os.path.exists(dest_dir):
2915+ mkdir(dest_dir, perms=0o755)
2916+ try:
2917+ self.branch(source, dest_dir)
2918+ except OSError as e:
2919+ raise UnhandledSource(e.strerror)
2920+ return dest_dir
2921
2922=== added file 'hooks/charmhelpers/fetch/giturl.py'
2923--- hooks/charmhelpers/fetch/giturl.py 1970-01-01 00:00:00 +0000
2924+++ hooks/charmhelpers/fetch/giturl.py 2015-01-26 11:20:09 +0000
2925@@ -0,0 +1,51 @@
2926+import os
2927+from charmhelpers.fetch import (
2928+ BaseFetchHandler,
2929+ UnhandledSource
2930+)
2931+from charmhelpers.core.host import mkdir
2932+
2933+import six
2934+if six.PY3:
2935+ raise ImportError('GitPython does not support Python 3')
2936+
2937+try:
2938+ from git import Repo
2939+except ImportError:
2940+ from charmhelpers.fetch import apt_install
2941+ apt_install("python-git")
2942+ from git import Repo
2943+
2944+
2945+class GitUrlFetchHandler(BaseFetchHandler):
2946+ """Handler for git branches via generic and github URLs"""
2947+ def can_handle(self, source):
2948+ url_parts = self.parse_url(source)
2949+ # TODO (mattyw) no support for ssh git@ yet
2950+ if url_parts.scheme not in ('http', 'https', 'git'):
2951+ return False
2952+ else:
2953+ return True
2954+
2955+ def clone(self, source, dest, branch):
2956+ if not self.can_handle(source):
2957+ raise UnhandledSource("Cannot handle {}".format(source))
2958+
2959+ repo = Repo.clone_from(source, dest)
2960+ repo.git.checkout(branch)
2961+
2962+ def install(self, source, branch="master", dest=None):
2963+ url_parts = self.parse_url(source)
2964+ branch_name = url_parts.path.strip("/").split("/")[-1]
2965+ if dest:
2966+ dest_dir = os.path.join(dest, branch_name)
2967+ else:
2968+ dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched",
2969+ branch_name)
2970+ if not os.path.exists(dest_dir):
2971+ mkdir(dest_dir, perms=0o755)
2972+ try:
2973+ self.clone(source, dest_dir, branch)
2974+ except OSError as e:
2975+ raise UnhandledSource(e.strerror)
2976+ return dest_dir
2977
2978=== added directory 'hooks/charmhelpers/payload'
2979=== added file 'hooks/charmhelpers/payload/__init__.py'
2980--- hooks/charmhelpers/payload/__init__.py 1970-01-01 00:00:00 +0000
2981+++ hooks/charmhelpers/payload/__init__.py 2015-01-26 11:20:09 +0000
2982@@ -0,0 +1,1 @@
2983+"Tools for working with files injected into a charm just before deployment."
2984
2985=== added file 'hooks/charmhelpers/payload/execd.py'
2986--- hooks/charmhelpers/payload/execd.py 1970-01-01 00:00:00 +0000
2987+++ hooks/charmhelpers/payload/execd.py 2015-01-26 11:20:09 +0000
2988@@ -0,0 +1,50 @@
2989+#!/usr/bin/env python
2990+
2991+import os
2992+import sys
2993+import subprocess
2994+from charmhelpers.core import hookenv
2995+
2996+
2997+def default_execd_dir():
2998+ return os.path.join(os.environ['CHARM_DIR'], 'exec.d')
2999+
3000+
3001+def execd_module_paths(execd_dir=None):
3002+ """Generate a list of full paths to modules within execd_dir."""
3003+ if not execd_dir:
3004+ execd_dir = default_execd_dir()
3005+
3006+ if not os.path.exists(execd_dir):
3007+ return
3008+
3009+ for subpath in os.listdir(execd_dir):
3010+ module = os.path.join(execd_dir, subpath)
3011+ if os.path.isdir(module):
3012+ yield module
3013+
3014+
3015+def execd_submodule_paths(command, execd_dir=None):
3016+    """Generate a list of full paths to the specified command within execd_dir.
3017+ """
3018+ for module_path in execd_module_paths(execd_dir):
3019+ path = os.path.join(module_path, command)
3020+ if os.access(path, os.X_OK) and os.path.isfile(path):
3021+ yield path
3022+
3023+
3024+def execd_run(command, execd_dir=None, die_on_error=False, stderr=None):
3025+ """Run command for each module within execd_dir which defines it."""
3026+ for submodule_path in execd_submodule_paths(command, execd_dir):
3027+ try:
3028+ subprocess.check_call(submodule_path, shell=True, stderr=stderr)
3029+ except subprocess.CalledProcessError as e:
3030+ hookenv.log("Error ({}) running {}. Output: {}".format(
3031+ e.returncode, e.cmd, e.output))
3032+ if die_on_error:
3033+ sys.exit(e.returncode)
3034+
3035+
3036+def execd_preinstall(execd_dir=None):
3037+ """Run charm-pre-install for each module within execd_dir."""
3038+ execd_run('charm-pre-install', execd_dir=execd_dir)
3039
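A sketch of the execd helper above: it runs every executable named charm-pre-install found in subdirectories of the given directory (default $CHARM_DIR/exec.d), which is how the install hook below keeps supporting hook overlays shipped by forks of the charm:

    from charmhelpers.payload.execd import execd_preinstall

    execd_preinstall('hooks/install.d')
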
3040=== modified file 'hooks/config-changed'
3041--- hooks/config-changed 2011-09-22 14:46:56 +0000
3042+++ hooks/config-changed 1970-01-01 00:00:00 +0000
3043@@ -1,7 +0,0 @@
3044-#!/bin/sh
3045-set -e
3046-
3047-home=`dirname $0`
3048-
3049-juju-log "Reconfiguring charm by installing hook again."
3050-exec $home/install
3051
3052=== target is u'jenkins_hooks.py'
3053=== removed file 'hooks/delnode'
3054--- hooks/delnode 2011-09-22 14:46:56 +0000
3055+++ hooks/delnode 1970-01-01 00:00:00 +0000
3056@@ -1,16 +0,0 @@
3057-#!/usr/bin/python
3058-
3059-import jenkins
3060-import sys
3061-
3062-host=sys.argv[1]
3063-username=sys.argv[2]
3064-password=sys.argv[3]
3065-
3066-l_jenkins = jenkins.Jenkins("http://localhost:8080/",username,password)
3067-
3068-if l_jenkins.node_exists(host):
3069- print "Node exists"
3070- l_jenkins.delete_node(host)
3071-else:
3072- print "Node does not exist - not deleting"
3073
3074=== modified file 'hooks/install'
3075--- hooks/install 2014-04-17 12:35:18 +0000
3076+++ hooks/install 1970-01-01 00:00:00 +0000
3077@@ -1,151 +0,0 @@
3078-#!/bin/bash
3079-
3080-set -eu
3081-
3082-RELEASE=$(config-get release)
3083-ADMIN_USERNAME=$(config-get username)
3084-ADMIN_PASSWORD=$(config-get password)
3085-PLUGINS=$(config-get plugins)
3086-PLUGINS_SITE=$(config-get plugins-site)
3087-PLUGINS_CHECK_CERT=$(config-get plugins-check-certificate)
3088-REMOVE_UNLISTED_PLUGINS=$(config-get remove-unlisted-plugins)
3089-CWD=$(dirname $0)
3090-JENKINS_HOME=/var/lib/jenkins
3091-
3092-setup_source () {
3093- # Do something with < Oneiric releases - maybe PPA
3094- # apt-get -y install python-software-properties
3095- # add-apt-repository ppa:hudson-ubuntu/testing
3096- juju-log "Configuring source of jenkins as $RELEASE"
3097- # Configure to use upstream archives
3098- # lts - debian-stable
3099- # trunk - debian
3100- case $RELEASE in
3101- lts)
3102- SOURCE="debian-stable";;
3103- trunk)
3104- SOURCE="debian";;
3105- *)
3106- juju-log "release configuration not recognised" && exit 1;;
3107- esac
3108- # Setup archive to use appropriate jenkins upstream
3109- wget -q -O - http://pkg.jenkins-ci.org/$SOURCE/jenkins-ci.org.key | apt-key add -
3110- echo "deb http://pkg.jenkins-ci.org/$SOURCE binary/" \
3111- > /etc/apt/sources.list.d/jenkins.list
3112- apt-get update || true
3113-}
3114-# Only setup the source if jenkins is not already installed
3115-# this makes the config 'release' immutable - i.e. you
3116-# can change source once deployed
3117-[[ -d /var/lib/jenkins ]] || setup_source
3118-
3119-# Install jenkins
3120-install_jenkins () {
3121- juju-log "Installing/upgrading jenkins..."
3122- apt-get -y install -qq jenkins default-jre-headless
3123-}
3124-# Re-run whenever called to pickup any updates
3125-install_jenkins
3126-
3127-configure_jenkins_user () {
3128- juju-log "Configuring user for jenkins..."
3129- # Check to see if password provided
3130- if [ -z "$ADMIN_PASSWORD" ]
3131- then
3132- # Generate a random one for security
3133- # User can then override using juju set
3134- ADMIN_PASSWORD=$(< /dev/urandom tr -dc A-Za-z | head -c16)
3135- echo $ADMIN_PASSWORD > $JENKINS_HOME/.admin_password
3136- chmod 0600 $JENKINS_HOME/.admin_password
3137- fi
3138- # Generate Salt and Hash Password for Jenkins
3139- SALT="$(< /dev/urandom tr -dc A-Za-z | head -c6)"
3140- PASSWORD="$SALT:$(echo -n "$ADMIN_PASSWORD{$SALT}" | shasum -a 256 | awk '{ print $1 }')"
3141- mkdir -p $JENKINS_HOME/users/$ADMIN_USERNAME
3142- sed -e s#__USERNAME__#$ADMIN_USERNAME# -e s#__PASSWORD__#$PASSWORD# \
3143- $CWD/../templates/user-config.xml > $JENKINS_HOME/users/$ADMIN_USERNAME/config.xml
3144- chown -R jenkins:nogroup $JENKINS_HOME/users
3145-}
3146-# Always run - even if config has not changed, its safe
3147-configure_jenkins_user
3148-
3149-boostrap_jenkins_configuration (){
3150- juju-log "Bootstrapping secure initial configuration in Jenkins..."
3151- cp $CWD/../templates/jenkins-config.xml $JENKINS_HOME/config.xml
3152- chown jenkins:nogroup $JENKINS_HOME/config.xml
3153- touch /var/lib/jenkins/config.bootstrapped
3154-}
3155-# Only run on first invocation otherwise we blast
3156-# any configuration changes made
3157-[[ -f /var/lib/jenkins/config.bootstrapped ]] || boostrap_jenkins_configuration
3158-
3159-install_plugins(){
3160- juju-log "Installing plugins ($PLUGINS)"
3161- mkdir -p $JENKINS_HOME/plugins
3162- chmod a+rx $JENKINS_HOME/plugins
3163- chown jenkins:nogroup $JENKINS_HOME/plugins
3164- track_dir=`mktemp -d /tmp/plugins.installed.XXXXXXXX`
3165- installed_plugins=`find $JENKINS_HOME/plugins -name '*.hpi'`
3166- [ -z "$installed_plugins" ] || ln -s $installed_plugins $track_dir
3167- local plugin=""
3168- local plugin_file=""
3169- local opts=""
3170- pushd $JENKINS_HOME/plugins
3171- for plugin in $PLUGINS ; do
3172- plugin_file=$JENKINS_HOME/plugins/$plugin.hpi
3173- # Note that by default wget verifies certificates as of 1.10.
3174- if [ "$PLUGINS_CHECK_CERT" = "no" ] ; then
3175- opts="--no-check-certificate"
3176- fi
3177- wget $opts --timestamping $PLUGINS_SITE/latest/$plugin.hpi
3178- chmod a+r $plugin_file
3179- rm -f $track_dir/$plugin.hpi
3180- done
3181- popd
3182- # Warn about undesirable plugins, or remove them.
3183- unlisted_plugins=`ls $track_dir`
3184- [[ -n "$unlisted_plugins" ]] || return 0
3185- if [[ $REMOVE_UNLISTED_PLUGINS = "yes" ]] ; then
3186- for plugin_file in `ls $track_dir` ; do
3187- rm -vf $JENKINS_HOME/plugins/$plugin_file
3188- done
3189- else
3190- juju-log -l WARNING "Unlisted plugins: (`ls $track_dir`) Not removed. Set remove-unlisted-plugins to yes to clear them away."
3191- fi
3192-}
3193-
3194-install_plugins
3195-
3196-juju-log "Restarting jenkins to pickup configuration changes"
3197-service jenkins restart
3198-
3199-# Install helpers - python jenkins ++
3200-install_python_jenkins () {
3201- juju-log "Installing python-jenkins..."
3202- apt-get -y install -qq python-jenkins
3203-}
3204-# Only install once
3205-[[ -d /usr/share/pyshared/jenkins ]] || install_python_jenkins
3206-
3207-# Install some tools - can be set at deployment time
3208-install_tools () {
3209- juju-log "Installing tools..."
3210- apt-get -y install -qq `config-get tools`
3211-}
3212-# Always run - tools might get re-configured
3213-install_tools
3214-
3215-juju-log "Opening ports"
3216-open-port 8080
3217-
3218-# Execute any hook overlay which may be provided
3219-# by forks of this charm
3220-if [ -d hooks/install.d ]
3221-then
3222- for i in `ls -1 hooks/install.d/*`
3223- do
3224- [[ -x $i ]] && . ./$i
3225- done
3226-fi
3227-
3228-exit 0
3229
3230=== target is u'jenkins_hooks.py'
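
For reference, the value substituted for __PASSWORD__ in users/<username>/config.xml
(by the shell hook above and by config_changed() below) has the form
salt:sha256("password{salt}"). A minimal standalone sketch of that transformation,
assuming Python and an illustrative function name that is not part of the charm:

    import hashlib


    def jenkins_password_hash(password, salt):
        # Reproduce the legacy Jenkins hash format used in user-config.xml:
        # "<salt>:sha256('<password>{<salt>}')".
        digest = hashlib.sha256(("%s{%s}" % (password, salt)).encode('utf-8'))
        return "%s:%s" % (salt, digest.hexdigest())

    # e.g. jenkins_password_hash('admin', 'abcdef') -> 'abcdef:<64 hex chars>'
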
3231=== added file 'hooks/jenkins_hooks.py'
3232--- hooks/jenkins_hooks.py 1970-01-01 00:00:00 +0000
3233+++ hooks/jenkins_hooks.py 2015-01-26 11:20:09 +0000
3234@@ -0,0 +1,224 @@
3235+#!/usr/bin/python
3236+import grp
3237+import hashlib
3238+import os
3239+import pwd
3240+import shutil
3241+import subprocess
3242+import sys
3243+
3244+from charmhelpers.core.hookenv import (
3245+ Hooks,
3246+ UnregisteredHookError,
3247+ config,
3248+ remote_unit,
3249+ relation_get,
3250+ relation_set,
3251+ relation_ids,
3252+ unit_get,
3253+ open_port,
3254+ log,
3255+ DEBUG,
3256+ INFO,
3257+)
3258+from charmhelpers.fetch import (
3259+ apt_install,
3260+ apt_update,
3261+)
3262+from charmhelpers.core.host import (
3263+ service_start,
3264+ service_stop,
3265+)
3266+from charmhelpers.payload.execd import execd_preinstall
3267+from jenkins_utils import (
3268+ JENKINS_HOME,
3269+ JENKINS_USERS,
3270+ TEMPLATES_DIR,
3271+ add_node,
3272+ del_node,
3273+ setup_source,
3274+ install_jenkins_plugins,
3275+)
3276+
3277+hooks = Hooks()
3278+
3279+
3280+@hooks.hook('install')
3281+def install():
3282+ execd_preinstall('hooks/install.d')
3283+ # Only setup the source if jenkins is not already installed i.e. makes the
3284+ # config 'release' immutable so you can't change source once deployed
3285+ setup_source(config('release'))
3286+ config_changed()
3287+ open_port(8080)
3288+
3289+
3290+@hooks.hook('config-changed')
3291+def config_changed():
3292+ apt_update()
3293+    # Re-run whenever called to pick up any updates
3294+ log("Installing/upgrading jenkins.", level=DEBUG)
3295+ apt_install(['jenkins', 'default-jre-headless', 'pwgen'], fatal=True)
3296+
3297+    # Always run - even if config has not changed, it's safe
3298+ log("Configuring user for jenkins.", level=DEBUG)
3299+ # Check to see if password provided
3300+ admin_passwd = config('password')
3301+ if not admin_passwd:
3302+ # Generate a random one for security. User can then override using juju
3303+ # set.
3304+ admin_passwd = subprocess.check_output(['pwgen', '-N1', '15'])
3305+ admin_passwd = admin_passwd.strip()
3306+
3307+ passwd_file = os.path.join(JENKINS_HOME, '.admin_password')
3308+ with open(passwd_file, 'w+') as fd:
3309+ fd.write(admin_passwd)
3310+
3311+ os.chmod(passwd_file, 0600)
3312+
3313+ jenkins_uid = pwd.getpwnam('jenkins').pw_uid
3314+ jenkins_gid = grp.getgrnam('jenkins').gr_gid
3315+ nogroup_gid = grp.getgrnam('nogroup').gr_gid
3316+
3317+ # Generate Salt and Hash Password for Jenkins
3318+ salt = subprocess.check_output(['pwgen', '-N1', '6']).strip()
3319+ csum = hashlib.sha256("%s{%s}" % (admin_passwd, salt)).hexdigest()
3320+ salty_password = "%s:%s" % (salt, csum)
3321+
3322+ admin_username = config('username')
3323+ admin_user_home = os.path.join(JENKINS_USERS, admin_username)
3324+ if not os.path.isdir(admin_user_home):
3325+ os.makedirs(admin_user_home, 0o0700)
3326+ os.chown(JENKINS_USERS, jenkins_uid, nogroup_gid)
3327+ os.chown(admin_user_home, jenkins_uid, nogroup_gid)
3328+
3329+ # NOTE: overwriting will destroy any data added by jenkins or via the ui
3330+ admin_user_config = os.path.join(admin_user_home, 'config.xml')
3331+ with open(os.path.join(TEMPLATES_DIR, 'user-config.xml')) as src_fd:
3332+ with open(admin_user_config, 'w') as dst_fd:
3333+ lines = src_fd.readlines()
3334+ for line in lines:
3335+ kvs = {'__USERNAME__': admin_username,
3336+ '__PASSWORD__': salty_password}
3337+
3338+ for key, val in kvs.iteritems():
3339+ if key in line:
3340+ line = line.replace(key, val)
3341+
3342+ dst_fd.write(line)
3343+ os.chown(admin_user_config, jenkins_uid, nogroup_gid)
3344+
3345+ # Only run on first invocation otherwise we blast
3346+ # any configuration changes made
3347+ jenkins_bootstrap_flag = '/var/lib/jenkins/config.bootstrapped'
3348+ if not os.path.exists(jenkins_bootstrap_flag):
3349+ log("Bootstrapping secure initial configuration in Jenkins.",
3350+ level=DEBUG)
3351+ src = os.path.join(TEMPLATES_DIR, 'jenkins-config.xml')
3352+ dst = os.path.join(JENKINS_HOME, 'config.xml')
3353+ shutil.copy(src, dst)
3354+ os.chown(dst, jenkins_uid, nogroup_gid)
3355+ # Touch
3356+ with open(jenkins_bootstrap_flag, 'w'):
3357+ pass
3358+
3359+ log("Stopping jenkins for plugin update(s)", level=DEBUG)
3360+ service_stop('jenkins')
3361+ install_jenkins_plugins(jenkins_uid, jenkins_gid)
3362+ log("Starting jenkins to pickup configuration changes", level=DEBUG)
3363+ service_start('jenkins')
3364+
3365+ apt_install(['python-jenkins'], fatal=True)
3366+ tools = config('tools')
3367+ if tools:
3368+ log("Installing tools.", level=DEBUG)
3369+ apt_install(tools.split(), fatal=True)
3370+
3371+
3372+@hooks.hook('start')
3373+def start():
3374+ service_start('jenkins')
3375+
3376+
3377+@hooks.hook('stop')
3378+def stop():
3379+ service_stop('jenkins')
3380+
3381+
3382+@hooks.hook('upgrade-charm')
3383+def upgrade_charm():
3384+ log("Upgrading charm.", level=DEBUG)
3385+ config_changed()
3386+
3387+
3388+@hooks.hook('master-relation-joined')
3389+def master_relation_joined():
3390+ HOSTNAME = unit_get('private-address')
3391+ log("Setting url relation to http://%s:8080" % (HOSTNAME), level=DEBUG)
3392+ relation_set(url="http://%s:8080" % (HOSTNAME))
3393+
3394+
3395+@hooks.hook('master-relation-changed')
3396+def master_relation_changed():
3397+ PASSWORD = config('password')
3398+    if not PASSWORD:
3399+ with open('/var/lib/jenkins/.admin_password', 'r') as fd:
3400+ PASSWORD = fd.read()
3401+
3402+ required_settings = ['executors', 'labels', 'slavehost']
3403+ settings = relation_get()
3404+ missing = [s for s in required_settings if s not in settings]
3405+ if missing:
3406+ log("Not all required relation settings received yet (missing=%s) - "
3407+ "skipping" % (', '.join(missing)), level=INFO)
3408+ return
3409+
3410+ slavehost = settings['slavehost']
3411+ executors = settings['executors']
3412+ labels = settings['labels']
3413+
3414+ # Double check to see if this has happened yet
3415+ if "x%s" % (slavehost) == "x":
3416+ log("Slave host not yet defined - skipping", level=INFO)
3417+ return
3418+
3419+ log("Adding slave with hostname %s." % (slavehost), level=DEBUG)
3420+ add_node(slavehost, executors, labels, config('username'), PASSWORD)
3421+ log("Node slave %s added." % (slavehost), level=DEBUG)
3422+
3423+
3424+@hooks.hook('master-relation-departed')
3425+def master_relation_departed():
3426+ # Slave hostname is derived from unit name so
3427+ # this is pretty safe
3428+ slavehost = remote_unit()
3429+ log("Deleting slave with hostname %s." % (slavehost), level=DEBUG)
3430+ del_node(slavehost, config('username'), config('password'))
3431+
3432+
3433+@hooks.hook('master-relation-broken')
3434+def master_relation_broken():
3435+ password = config('password')
3436+ if not password:
3437+ passwd_file = os.path.join(JENKINS_HOME, '.admin_password')
3438+        with open(passwd_file, 'r') as fd:
3439+            password = fd.read()
3440+
3441+ for member in relation_ids():
3442+ member = member.replace('/', '-')
3443+ log("Removing node %s from Jenkins master." % (member), level=DEBUG)
3444+        del_node(member, config('username'), password)
3445+
3446+
3447+@hooks.hook('website-relation-joined')
3448+def website_relation_joined():
3449+ hostname = unit_get('private-address')
3450+ log("Setting website URL to %s:8080" % (hostname), level=DEBUG)
3451+ relation_set(port=8080, hostname=hostname)
3452+
3453+
3454+if __name__ == '__main__':
3455+ try:
3456+ hooks.execute(sys.argv)
3457+ except UnregisteredHookError as e:
3458+ log('Unknown hook {} - skipping.'.format(e), level=INFO)
3459
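
Both master_relation_changed() and master_relation_broken() need the effective admin
password: the 'password' config option if set, otherwise the randomly generated value
that config_changed() writes to /var/lib/jenkins/.admin_password. A minimal sketch of
that fallback, with a hypothetical helper name that is not part of this branch:

    import os

    from charmhelpers.core.hookenv import config

    JENKINS_HOME = '/var/lib/jenkins'


    def effective_admin_password():
        # Prefer the charm config value; fall back to the password generated
        # and persisted by config_changed().
        passwd = config('password')
        if not passwd:
            with open(os.path.join(JENKINS_HOME, '.admin_password')) as fd:
                passwd = fd.read().strip()
        return passwd
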
3460=== added file 'hooks/jenkins_utils.py'
3461--- hooks/jenkins_utils.py 1970-01-01 00:00:00 +0000
3462+++ hooks/jenkins_utils.py 2015-01-26 11:20:09 +0000
3463@@ -0,0 +1,178 @@
3464+#!/usr/bin/python
3465+import glob
3466+import os
3467+import shutil
3468+import subprocess
3469+import tempfile
3470+
3471+from charmhelpers.core.hookenv import (
3472+ config,
3473+ log,
3474+ DEBUG,
3475+ INFO,
3476+ WARNING,
3477+)
3478+from charmhelpers.fetch import (
3479+ apt_update,
3480+ add_source,
3481+)
3482+
3483+from charmhelpers.core.decorators import (
3484+ retry_on_exception,
3485+)
3486+
3487+JENKINS_HOME = '/var/lib/jenkins'
3488+JENKINS_USERS = os.path.join(JENKINS_HOME, 'users')
3489+JENKINS_PLUGINS = os.path.join(JENKINS_HOME, 'plugins')
3490+TEMPLATES_DIR = 'templates'
3491+
3492+
3493+def add_node(host, executors, labels, username, password):
3494+ import jenkins
3495+
3496+ @retry_on_exception(2, 2, exc_type=jenkins.JenkinsException)
3497+ def _add_node(*args, **kwargs):
3498+ l_jenkins = jenkins.Jenkins("http://localhost:8080/", username,
3499+ password)
3500+
3501+ if l_jenkins.node_exists(host):
3502+ log("Node exists - not adding", level=DEBUG)
3503+ return
3504+
3505+ log("Adding node '%s' to Jenkins master" % (host), level=INFO)
3506+ l_jenkins.create_node(host, int(executors) * 2, host, labels=labels)
3507+
3508+ if not l_jenkins.node_exists(host):
3509+ log("Failed to create node '%s'" % (host), level=WARNING)
3510+
3511+ return _add_node()
3512+
3513+
3514+def del_node(host, username, password):
3515+ import jenkins
3516+
3517+ l_jenkins = jenkins.Jenkins("http://localhost:8080/", username, password)
3518+
3519+ if l_jenkins.node_exists(host):
3520+ log("Node '%s' exists" % (host), level=DEBUG)
3521+ l_jenkins.delete_node(host)
3522+ else:
3523+ log("Node '%s' does not exist - not deleting" % (host), level=INFO)
3524+
3525+
3526+def setup_source(release):
3527+    """Configure the apt source for the given jenkins release."""
3528+ log("Configuring source of jenkins as %s" % release, level=INFO)
3529+
3530+ # Configure to use upstream archives
3531+ # lts - debian-stable
3532+ # trunk - debian
3533+ if release == 'lts':
3534+ source = "debian-stable"
3535+ elif release == 'trunk':
3536+ source = "debian"
3537+ else:
3538+ errmsg = "Release '%s' configuration not recognised" % (release)
3539+ raise Exception(errmsg)
3540+
3541+ # Setup archive to use appropriate jenkins upstream
3542+ key = 'http://pkg.jenkins-ci.org/%s/jenkins-ci.org.key' % source
3543+ target = "%s-%s" % (source, 'jenkins-ci.org.key')
3544+ subprocess.check_call(['wget', '-q', '-O', target, key])
3545+ with open(target, 'r') as fd:
3546+ key = fd.read()
3547+
3548+ deb = "deb http://pkg.jenkins-ci.org/%s binary/" % (source)
3549+ sources_file = "/etc/apt/sources.list.d/jenkins.list"
3550+
3551+ found = False
3552+ if os.path.exists(sources_file):
3553+ with open(sources_file, 'r') as fd:
3554+ for line in fd:
3555+ if deb in line:
3556+ found = True
3557+ break
3558+
3559+ if not found:
3560+ with open(sources_file, 'a') as fd:
3561+ fd.write("%s\n" % deb)
3562+ else:
3563+ with open(sources_file, 'w') as fd:
3564+ fd.write("%s\n" % deb)
3565+
3566+ if not found:
3567+ # NOTE: don't use add_source for adding source since it adds deb and
3568+ # deb-src entries but pkg.jenkins-ci.org has no deb-src.
3569+ add_source("#dummy-source", key=key)
3570+
3571+ apt_update(fatal=True)
3572+
3573+
3574+def install_jenkins_plugins(jenkins_uid, jenkins_gid):
3575+ plugins = config('plugins')
3576+ if plugins:
3577+ plugins = plugins.split()
3578+ else:
3579+ plugins = []
3580+
3581+ log("Installing plugins (%s)" % (' '.join(plugins)), level=DEBUG)
3582+ if not os.path.isdir(JENKINS_PLUGINS):
3583+ os.makedirs(JENKINS_PLUGINS)
3584+
3585+ os.chmod(JENKINS_PLUGINS, 0o0755)
3586+ os.chown(JENKINS_PLUGINS, jenkins_uid, jenkins_gid)
3587+
3588+ track_dir = tempfile.mkdtemp(prefix='/tmp/plugins.installed')
3589+ try:
3590+ installed_plugins = glob.glob("%s/*.hpi" % (JENKINS_PLUGINS))
3591+ for plugin in installed_plugins:
3592+ # Create a ref of installed plugin
3593+ with open(os.path.join(track_dir, os.path.basename(plugin)),
3594+ 'w'):
3595+ pass
3596+
3597+ plugins_site = config('plugins-site')
3598+ log("Fetching plugins from %s" % (plugins_site), level=DEBUG)
3599+ # NOTE: by default wget verifies certificates as of 1.10.
3600+ if config('plugins-check-certificate') == "no":
3601+ opts = ["--no-check-certificate"]
3602+ else:
3603+ opts = []
3604+
3605+ for plugin in plugins:
3606+ plugin_filename = "%s.hpi" % (plugin)
3607+ url = os.path.join(plugins_site, 'latest', plugin_filename)
3608+ plugin_path = os.path.join(JENKINS_PLUGINS, plugin_filename)
3609+ if not os.path.isfile(plugin_path):
3610+ log("Installing plugin %s" % (plugin_filename), level=DEBUG)
3611+ cmd = ['wget'] + opts + ['--timestamping', url, '-O',
3612+ plugin_path]
3613+ subprocess.check_call(cmd)
3614+ os.chmod(plugin_path, 0744)
3615+ os.chown(plugin_path, jenkins_uid, jenkins_gid)
3616+
3617+ else:
3618+ log("Plugin %s already installed" % (plugin_filename),
3619+ level=DEBUG)
3620+
3621+ ref = os.path.join(track_dir, plugin_filename)
3622+ if os.path.exists(ref):
3623+ # Delete ref since plugin is installed.
3624+ os.remove(ref)
3625+
3626+        unlisted_plugins = os.listdir(track_dir)
3627+        if unlisted_plugins:
3628+            if config('remove-unlisted-plugins') == "yes":
3629+                for plugin in unlisted_plugins:
3630+                    path = os.path.join(JENKINS_HOME, 'plugins', plugin)
3631+                    if os.path.isfile(path):
3632+                        log("Deleting unlisted plugin '%s'" % (path),
3633+                            level=INFO)
3634+                        os.remove(path)
3635+            else:
3636+                log("Unlisted plugins: (%s) Not removed. Set "
3637+                    "remove-unlisted-plugins to 'yes' to clear them away." %
3638+                    ', '.join(unlisted_plugins), level=INFO)
3639+ finally:
3640+ # Delete install refs
3641+ shutil.rmtree(track_dir)
3642
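
add_node() above wraps its python-jenkins calls in charmhelpers' retry_on_exception
decorator because the Jenkins REST API is often briefly unavailable right after the
service restart performed in config_changed(). A minimal standalone sketch of the same
pattern, using an illustrative function rather than charm code:

    import jenkins  # python-jenkins, installed by config_changed()

    from charmhelpers.core.decorators import retry_on_exception


    @retry_on_exception(2, 2, exc_type=jenkins.JenkinsException)
    def node_registered(host, username, password):
        # Retried a couple of times with a short delay in case the Jenkins
        # API is not answering yet.
        client = jenkins.Jenkins("http://localhost:8080/", username, password)
        return client.node_exists(host)
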
3643=== modified file 'hooks/master-relation-broken'
3644--- hooks/master-relation-broken 2012-07-31 10:32:36 +0000
3645+++ hooks/master-relation-broken 1970-01-01 00:00:00 +0000
3646@@ -1,17 +0,0 @@
3647-#!/bin/sh
3648-
3649-PASSWORD=`config-get password`
3650-if [ -z "$PASSWORD" ]
3651-then
3652- PASSWORD=`cat /var/lib/jenkins/.admin_password`
3653-fi
3654-
3655-MEMBERS=`relation-list`
3656-
3657-for MEMBER in $MEMBERS
3658-do
3659- juju-log "Removing node $MEMBER from Jenkins master..."
3660- $(dirname $0)/delnode `echo $MEMBER | sed s,/,-,` `config-get username` $PASSWORD
3661-done
3662-
3663-exit 0
3664
3665=== target is u'jenkins_hooks.py'
3666=== modified file 'hooks/master-relation-changed'
3667--- hooks/master-relation-changed 2012-07-31 10:32:36 +0000
3668+++ hooks/master-relation-changed 1970-01-01 00:00:00 +0000
3669@@ -1,24 +0,0 @@
3670-#!/bin/bash
3671-
3672-set -ue
3673-
3674-PASSWORD=`config-get password`
3675-if [ -z "$PASSWORD" ]
3676-then
3677- PASSWORD=`cat /var/lib/jenkins/.admin_password`
3678-fi
3679-
3680-# Grab information that remote unit has posted to relation
3681-slavehost=$(relation-get slavehost)
3682-executors=$(relation-get executors)
3683-labels=$(relation-get labels)
3684-
3685-# Double check to see if this has happened yet
3686-if [ "x$slavehost" = "x" ]; then
3687- juju-log "Slave host not yet defined, exiting..."
3688- exit 0
3689-fi
3690-
3691-juju-log "Adding slave with hostname $slavehost..."
3692-$(dirname $0)/addnode $slavehost $executors "$labels" `config-get username` $PASSWORD
3693-juju-log "Node slave $slavehost added..."
3694
3695=== target is u'jenkins_hooks.py'
3696=== modified file 'hooks/master-relation-departed'
3697--- hooks/master-relation-departed 2011-09-22 14:46:56 +0000
3698+++ hooks/master-relation-departed 1970-01-01 00:00:00 +0000
3699@@ -1,12 +0,0 @@
3700-#!/bin/bash
3701-
3702-set -ue
3703-
3704-# Slave hostname is derived from unit name so
3705-# this is pretty safe
3706-slavehost=`echo $JUJU_REMOTE_UNIT | sed s,/,-,`
3707-
3708-juju-log "Deleting slave with hostname $slavehost..."
3709-$(dirname $0)/delnode $slavehost `config-get username` `config-get password`
3710-
3711-exit 0
3712
3713=== target is u'jenkins_hooks.py'
3714=== modified file 'hooks/master-relation-joined'
3715--- hooks/master-relation-joined 2011-10-07 13:43:19 +0000
3716+++ hooks/master-relation-joined 1970-01-01 00:00:00 +0000
3717@@ -1,5 +0,0 @@
3718-#!/bin/sh
3719-
3720-HOSTNAME=`unit-get private-address`
3721-juju-log "Setting url relation to http://$HOSTNAME:8080"
3722-relation-set url="http://$HOSTNAME:8080"
3723
3724=== target is u'jenkins_hooks.py'
3725=== modified file 'hooks/start'
3726--- hooks/start 2011-09-22 14:46:56 +0000
3727+++ hooks/start 1970-01-01 00:00:00 +0000
3728@@ -1,3 +0,0 @@
3729-#!/bin/bash
3730-
3731-service jenkins start || true
3732
3733=== target is u'jenkins_hooks.py'
3734=== modified file 'hooks/stop'
3735--- hooks/stop 2011-09-22 14:46:56 +0000
3736+++ hooks/stop 1970-01-01 00:00:00 +0000
3737@@ -1,3 +0,0 @@
3738-#!/bin/bash
3739-
3740-service jenkins stop
3741
3742=== target is u'jenkins_hooks.py'
3743=== modified file 'hooks/upgrade-charm'
3744--- hooks/upgrade-charm 2011-09-22 14:46:56 +0000
3745+++ hooks/upgrade-charm 1970-01-01 00:00:00 +0000
3746@@ -1,7 +0,0 @@
3747-#!/bin/sh
3748-set -e
3749-
3750-home=`dirname $0`
3751-
3752-juju-log "Upgrading charm by running install hook again."
3753-exec $home/install
3754
3755=== target is u'jenkins_hooks.py'
3756=== modified file 'hooks/website-relation-joined'
3757--- hooks/website-relation-joined 2011-10-07 13:43:19 +0000
3758+++ hooks/website-relation-joined 1970-01-01 00:00:00 +0000
3759@@ -1,5 +0,0 @@
3760-#!/bin/sh
3761-
3762-HOSTNAME=`unit-get private-address`
3763-juju-log "Setting website URL to $HOSTNAME:8080"
3764-relation-set port=8080 hostname=$HOSTNAME
3765
3766=== target is u'jenkins_hooks.py'
3767=== added file 'test-requirements.txt'
3768--- test-requirements.txt 1970-01-01 00:00:00 +0000
3769+++ test-requirements.txt 2015-01-26 11:20:09 +0000
3770@@ -0,0 +1,4 @@
3771+distribute
3772+flake8
3773+nose
3774+--allow-external python-apt
3775\ No newline at end of file
3776
3777=== added file 'tests/100-deploy-precise'
3778--- tests/100-deploy-precise 1970-01-01 00:00:00 +0000
3779+++ tests/100-deploy-precise 2015-01-26 11:20:09 +0000
3780@@ -0,0 +1,123 @@
3781+#!/usr/bin/env python
3782+
3783+import amulet
3784+import json
3785+import requests
3786+
3787+###
3788+# Quick Config
3789+###
3790+seconds = 900
3791+
3792+###
3793+# Deployment Setup
3794+###
3795+d = amulet.Deployment(series='precise')
3796+
3797+d.add('haproxy') # website-relation
3798+d.add('jenkins') # Subject matter
3799+d.add('jenkins-slave') # Job Runner
3800+
3801+
3802+d.relate('jenkins:website', 'haproxy:reverseproxy')
3803+d.relate('jenkins:master', 'jenkins-slave:slave')
3804+
3805+d.configure('jenkins', {'tools': 'git gcc make bzr vim-tiny',
3806+ 'release': 'lts',
3807+ 'username': 'amulet',
3808+ 'password': 'testautomation',
3809+ 'plugins': 'groovy',
3810+ 'plugins-check-certificate': 'no'})
3811+
3812+d.expose('jenkins')
3813+d.expose('haproxy')
3814+
3815+try:
3816+ d.setup(timeout=seconds)
3817+ d.sentry.wait()
3818+except amulet.helpers.TimeoutError:
3819+ amulet.raise_status(amulet.SKIP, msg="Environment wasn't stood up in time")
3820+except:
3821+ raise
3822+
3823+
3824+###
3825+# Define reconfiguration routine
3826+###
3827+
3828+
3829+###
3830+# Define sentries for quick access
3831+###
3832+jenkins = d.sentry.unit['jenkins/0']
3833+haproxy = d.sentry.unit['haproxy/0']
3834+slave = d.sentry.unit['jenkins-slave/0']
3835+
3836+
3837+###
3838+# Validate Jenkins configuration options exercised
3839+# Validate jenkins tool installation
3840+###
3841+def validate_tools():
3842+ output, code = jenkins.run('dpkg -l vim-tiny')
3843+ if not output:
3844+ amulet.raise_status(amulet.FAIL, msg="No tool installation found")
3845+
3846+
3847+def validate_release():
3848+ list_present = jenkins.file_stat('/etc/apt/sources.list.d/jenkins.list')
3849+ if not list_present:
3850+ amulet.raise_status(amulet.FAIL, msg="No sources.list update")
3851+ lc = jenkins.file_contents('/etc/apt/sources.list.d/jenkins.list')
3852+    if 'debian-stable' not in lc:
3853+ amulet.raise_status(amulet.FAIL, msg="LTS not found in sources.list")
3854+
3855+
3856+def validate_login():
3857+ #First off, validate that we have the jenkins user on the machine
3858+ output, code = jenkins.run('id -u jenkins')
3859+ if code:
3860+ amulet.raise_status(amulet.FAIL, msg="Jenkins system user not found")
3861+ #validate we have a running service of jenkins to execute the test against
3862+ output, code = jenkins.run('service jenkins status')
3863+ if code:
3864+ amulet.raise_status(amulet.FAIL, msg="No Jenkins Service Running")
3865+ payload = {'j_username': 'amulet',
3866+ 'j_password': 'testautomation',
3867+ 'from': '/'}
3868+ jenkins_url = "http://%s:8080/j_acegi_security_check" % jenkins.info['public-address']
3869+ r = requests.post(jenkins_url, data=payload)
3870+    if r.status_code != 200:
3871+ amulet.raise_status(amulet.FAIL, msg="Failed to login")
3872+
3873+
3874+#TODO: Figure out how to test installation of NonHTTPS plugin
3875+# This is called as a flag to pyjenkins, and I don't know of any non-https
3876+# plugin repositories. Pinned here for reference later.
3877+def validate_plugins():
3878+ ds = jenkins.directory_stat('/var/lib/jenkins/plugins/groovy')
3879+ if ds['size'] <= 0:
3880+ amulet.raise_status(amulet.FAIL, msg="Failed to locate plugin")
3881+
3882+
3883+def validate_website_relation():
3884+ jenkins_url = "http://%s/" % haproxy.info['public-address']
3885+ r = requests.get(jenkins_url)
3886+    if r.status_code != 200:
3887+ amulet.raise_status(amulet.FAIL,
3888+ msg="Failed to reach jenkins through proxy")
3889+
3890+
3891+def validate_slave_relation():
3892+ jenkins_url = "http://%s:8080/computer/api/json" % jenkins.info['public-address']
3893+ r = requests.get(jenkins_url)
3894+ data = json.loads(r.text)
3895+ if not data['computer'][1]['displayName'] == "jenkins-slave-0":
3896+ amulet.raise_status(amulet.FAIL, msg="Failed to locate slave")
3897+
3898+validate_tools()
3899+validate_release()
3900+validate_login()
3901+validate_plugins()
3902+validate_website_relation()
3903+validate_slave_relation()
3904
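
validate_slave_relation() above assumes the slave is always the second entry returned by
/computer/api/json, which is fragile if node ordering ever changes. A hedged alternative,
assuming the same API payload, is to look the node up by displayName:

    import json

    import requests


    def slave_registered(jenkins_address, slave_name="jenkins-slave-0"):
        # Same endpoint as validate_slave_relation(), but match by name
        # rather than by list position.
        url = "http://%s:8080/computer/api/json" % jenkins_address
        data = json.loads(requests.get(url).text)
        return any(c.get('displayName') == slave_name for c in data['computer'])
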
3905=== renamed file 'tests/100-deploy' => 'tests/100-deploy-trusty'
3906--- tests/100-deploy 2014-03-05 19:18:19 +0000
3907+++ tests/100-deploy-trusty 2015-01-26 11:20:09 +0000
3908@@ -1,4 +1,4 @@
3909-#!/usr/bin/python3
3910+#!/usr/bin/env python
3911
3912 import amulet
3913 import json
3914@@ -12,11 +12,13 @@
3915 ###
3916 # Deployment Setup
3917 ###
3918-d = amulet.Deployment()
3919+d = amulet.Deployment(series='trusty')
3920
3921 d.add('haproxy') # website-relation
3922 d.add('jenkins') # Subject matter
3923-d.add('jenkins-slave') # Job Runner
3924+# TODO(hopem): we don't yet have a trusty version of jenkins-slave
3925+# so use the precise version for now.
3926+d.add('jenkins-slave', 'cs:precise/jenkins-slave') # Job Runner
3927
3928
3929 d.relate('jenkins:website', 'haproxy:reverseproxy')
3930
3931=== added file 'tests/README'
3932--- tests/README 1970-01-01 00:00:00 +0000
3933+++ tests/README 2015-01-26 11:20:09 +0000
3934@@ -0,0 +1,56 @@
3935+This directory provides Amulet tests that focus on verification of Jenkins
3936+deployments.
3937+
3938+In order to run tests, you'll need charm-tools installed (in addition to
3939+juju, of course):
3940+
3941+ sudo add-apt-repository ppa:juju/stable
3942+ sudo apt-get update
3943+ sudo apt-get install charm-tools
3944+
3945+If you use a web proxy server to access the web, you'll need to set the
3946+AMULET_HTTP_PROXY environment variable to the http URL of the proxy server.
3947+
3948+The following examples demonstrate different ways that tests can be executed.
3949+All examples are run from the charm's root directory.
3950+
3951+ * To run all tests (starting with 00-setup):
3952+
3953+ make test
3954+
3955+ * To run a specific test module (or modules):
3956+
3957+ juju test -v -p AMULET_HTTP_PROXY 100-deploy
3958+
3959+ * To run a specific test module (or modules), and keep the environment
3960+ deployed after a failure:
3961+
3962+ juju test --set-e -v -p AMULET_HTTP_PROXY 100-deploy
3963+
3964+ * To re-run a test module against an already deployed environment (one
3965+ that was deployed by a previous call to 'juju test --set-e'):
3966+
3967+ ./tests/100-deploy
3968+
3969+
3970+For debugging and test development purposes, all code should be idempotent.
3971+In other words, the code should have the ability to be re-run without changing
3972+the results beyond the initial run. This enables editing and re-running of a
3973+test module against an already deployed environment, as described above.
3974+
3975+
3976+Notes for additional test writing:
3977+
3978+ * Use DEBUG to turn on debug logging, use ERROR otherwise.
3979+ u = OpenStackAmuletUtils(ERROR)
3980+ u = OpenStackAmuletUtils(DEBUG)
3981+
3982+ * Preserving the deployed environment:
3983+    Even with juju test --set-e, amulet will tear down the juju environment
3984+ when all tests pass. This force_fail 'test' can be used in basic_deployment.py
3985+ to simulate a failed test and keep the environment.
3986+
3987+ def test_zzzz_fake_fail(self):
3988+ '''Force a fake fail to keep juju environment after a successful test run'''
3989+ # Useful in test writing, when used with: juju test --set-e
3990+ amulet.raise_status(amulet.FAIL, msg='using fake fail to keep juju environment')
3991
3992=== added directory 'tests/charmhelpers'
3993=== added file 'tests/charmhelpers/__init__.py'
3994=== added directory 'tests/charmhelpers/contrib'
3995=== added file 'tests/charmhelpers/contrib/__init__.py'
3996=== added directory 'tests/charmhelpers/contrib/amulet'
3997=== added file 'tests/charmhelpers/contrib/amulet/__init__.py'
3998=== added file 'tests/charmhelpers/contrib/amulet/deployment.py'
3999--- tests/charmhelpers/contrib/amulet/deployment.py 1970-01-01 00:00:00 +0000
4000+++ tests/charmhelpers/contrib/amulet/deployment.py 2015-01-26 11:20:09 +0000
4001@@ -0,0 +1,77 @@
4002+import amulet
4003+import os
4004+import six
4005+
4006+
4007+class AmuletDeployment(object):
4008+ """Amulet deployment.
4009+
4010+ This class provides generic Amulet deployment and test runner
4011+ methods.
4012+ """
4013+
4014+ def __init__(self, series=None):
4015+ """Initialize the deployment environment."""
4016+ self.series = None
4017+
4018+ if series:
4019+ self.series = series
4020+ self.d = amulet.Deployment(series=self.series)
4021+ else:
4022+ self.d = amulet.Deployment()
4023+
4024+ def _add_services(self, this_service, other_services):
4025+ """Add services.
4026+
4027+ Add services to the deployment where this_service is the local charm
4028+ that we're testing and other_services are the other services that
4029+ are being used in the local amulet tests.
4030+ """
4031+ if this_service['name'] != os.path.basename(os.getcwd()):
4032+ s = this_service['name']
4033+ msg = "The charm's root directory name needs to be {}".format(s)
4034+ amulet.raise_status(amulet.FAIL, msg=msg)
4035+
4036+ if 'units' not in this_service:
4037+ this_service['units'] = 1
4038+
4039+ self.d.add(this_service['name'], units=this_service['units'])
4040+
4041+ for svc in other_services:
4042+ if 'location' in svc:
4043+ branch_location = svc['location']
4044+ elif self.series:
4045+                branch_location = 'cs:{}/{}'.format(self.series, svc['name'])
4046+ else:
4047+ branch_location = None
4048+
4049+ if 'units' not in svc:
4050+ svc['units'] = 1
4051+
4052+ self.d.add(svc['name'], charm=branch_location, units=svc['units'])
4053+
4054+ def _add_relations(self, relations):
4055+ """Add all of the relations for the services."""
4056+ for k, v in six.iteritems(relations):
4057+ self.d.relate(k, v)
4058+
4059+ def _configure_services(self, configs):
4060+ """Configure all of the services."""
4061+ for service, config in six.iteritems(configs):
4062+ self.d.configure(service, config)
4063+
4064+ def _deploy(self):
4065+ """Deploy environment and wait for all hooks to finish executing."""
4066+ try:
4067+ self.d.setup(timeout=900)
4068+ self.d.sentry.wait(timeout=900)
4069+ except amulet.helpers.TimeoutError:
4070+ amulet.raise_status(amulet.FAIL, msg="Deployment timed out")
4071+ except Exception:
4072+ raise
4073+
4074+ def run_tests(self):
4075+ """Run all of the methods that are prefixed with 'test_'."""
4076+ for test in dir(self):
4077+ if test.startswith('test_'):
4078+ getattr(self, test)()
4079
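
The jenkins tests above drive amulet.Deployment directly, but this vendored
AmuletDeployment helper is designed to be subclassed. A hypothetical (unused in this
branch) jenkins deployment built on it would look roughly like:

    from charmhelpers.contrib.amulet.deployment import AmuletDeployment


    class JenkinsBasicDeployment(AmuletDeployment):
        """Illustrative example only - this charm's tests do not use it."""

        def __init__(self, series='trusty'):
            super(JenkinsBasicDeployment, self).__init__(series)
            # Must be run from the charm's root directory (the class checks
            # that the directory name matches this_service['name']).
            self._add_services({'name': 'jenkins'}, [{'name': 'haproxy'}])
            self._add_relations({'jenkins:website': 'haproxy:reverseproxy'})
            self._configure_services({'jenkins': {'release': 'lts'}})
            self._deploy()
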
4080=== added file 'tests/charmhelpers/contrib/amulet/utils.py'
4081--- tests/charmhelpers/contrib/amulet/utils.py 1970-01-01 00:00:00 +0000
4082+++ tests/charmhelpers/contrib/amulet/utils.py 2015-01-26 11:20:09 +0000
4083@@ -0,0 +1,178 @@
4084+import ConfigParser
4085+import io
4086+import logging
4087+import re
4088+import sys
4089+import time
4090+
4091+import six
4092+
4093+
4094+class AmuletUtils(object):
4095+ """Amulet utilities.
4096+
4097+ This class provides common utility functions that are used by Amulet
4098+ tests.
4099+ """
4100+
4101+ def __init__(self, log_level=logging.ERROR):
4102+ self.log = self.get_logger(level=log_level)
4103+
4104+ def get_logger(self, name="amulet-logger", level=logging.DEBUG):
4105+ """Get a logger object that will log to stdout."""
4106+ log = logging
4107+ logger = log.getLogger(name)
4108+ fmt = log.Formatter("%(asctime)s %(funcName)s "
4109+ "%(levelname)s: %(message)s")
4110+
4111+ handler = log.StreamHandler(stream=sys.stdout)
4112+ handler.setLevel(level)
4113+ handler.setFormatter(fmt)
4114+
4115+ logger.addHandler(handler)
4116+ logger.setLevel(level)
4117+
4118+ return logger
4119+
4120+ def valid_ip(self, ip):
4121+ if re.match(r"^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$", ip):
4122+ return True
4123+ else:
4124+ return False
4125+
4126+ def valid_url(self, url):
4127+ p = re.compile(
4128+ r'^(?:http|ftp)s?://'
4129+ r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}\.?)|' # noqa
4130+ r'localhost|'
4131+ r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})'
4132+ r'(?::\d+)?'
4133+ r'(?:/?|[/?]\S+)$',
4134+ re.IGNORECASE)
4135+ if p.match(url):
4136+ return True
4137+ else:
4138+ return False
4139+
4140+ def validate_services(self, commands):
4141+ """Validate services.
4142+
4143+ Verify the specified services are running on the corresponding
4144+ service units.
4145+ """
4146+ for k, v in six.iteritems(commands):
4147+ for cmd in v:
4148+ output, code = k.run(cmd)
4149+ if code != 0:
4150+ return "command `{}` returned {}".format(cmd, str(code))
4151+ return None
4152+
4153+ def _get_config(self, unit, filename):
4154+ """Get a ConfigParser object for parsing a unit's config file."""
4155+ file_contents = unit.file_contents(filename)
4156+ config = ConfigParser.ConfigParser()
4157+ config.readfp(io.StringIO(file_contents))
4158+ return config
4159+
4160+ def validate_config_data(self, sentry_unit, config_file, section,
4161+ expected):
4162+ """Validate config file data.
4163+
4164+ Verify that the specified section of the config file contains
4165+ the expected option key:value pairs.
4166+ """
4167+ config = self._get_config(sentry_unit, config_file)
4168+
4169+ if section != 'DEFAULT' and not config.has_section(section):
4170+ return "section [{}] does not exist".format(section)
4171+
4172+ for k in expected.keys():
4173+ if not config.has_option(section, k):
4174+ return "section [{}] is missing option {}".format(section, k)
4175+ if config.get(section, k) != expected[k]:
4176+ return "section [{}] {}:{} != expected {}:{}".format(
4177+ section, k, config.get(section, k), k, expected[k])
4178+ return None
4179+
4180+ def _validate_dict_data(self, expected, actual):
4181+ """Validate dictionary data.
4182+
4183+ Compare expected dictionary data vs actual dictionary data.
4184+ The values in the 'expected' dictionary can be strings, bools, ints,
4185+ longs, or can be a function that evaluate a variable and returns a
4186+ bool.
4187+ """
4188+ for k, v in six.iteritems(expected):
4189+ if k in actual:
4190+ if (isinstance(v, six.string_types) or
4191+ isinstance(v, bool) or
4192+ isinstance(v, six.integer_types)):
4193+ if v != actual[k]:
4194+ return "{}:{}".format(k, actual[k])
4195+ elif not v(actual[k]):
4196+ return "{}:{}".format(k, actual[k])
4197+ else:
4198+ return "key '{}' does not exist".format(k)
4199+ return None
4200+
4201+ def validate_relation_data(self, sentry_unit, relation, expected):
4202+ """Validate actual relation data based on expected relation data."""
4203+ actual = sentry_unit.relation(relation[0], relation[1])
4204+ self.log.debug('actual: {}'.format(repr(actual)))
4205+ return self._validate_dict_data(expected, actual)
4206+
4207+ def _validate_list_data(self, expected, actual):
4208+ """Compare expected list vs actual list data."""
4209+ for e in expected:
4210+ if e not in actual:
4211+ return "expected item {} not found in actual list".format(e)
4212+ return None
4213+
4214+ def not_null(self, string):
4215+ if string is not None:
4216+ return True
4217+ else:
4218+ return False
4219+
4220+ def _get_file_mtime(self, sentry_unit, filename):
4221+ """Get last modification time of file."""
4222+ return sentry_unit.file_stat(filename)['mtime']
4223+
4224+ def _get_dir_mtime(self, sentry_unit, directory):
4225+ """Get last modification time of directory."""
4226+ return sentry_unit.directory_stat(directory)['mtime']
4227+
4228+ def _get_proc_start_time(self, sentry_unit, service, pgrep_full=False):
4229+ """Get process' start time.
4230+
4231+ Determine start time of the process based on the last modification
4232+ time of the /proc/pid directory. If pgrep_full is True, the process
4233+ name is matched against the full command line.
4234+ """
4235+ if pgrep_full:
4236+ cmd = 'pgrep -o -f {}'.format(service)
4237+ else:
4238+ cmd = 'pgrep -o {}'.format(service)
4239+ proc_dir = '/proc/{}'.format(sentry_unit.run(cmd)[0].strip())
4240+ return self._get_dir_mtime(sentry_unit, proc_dir)
4241+
4242+ def service_restarted(self, sentry_unit, service, filename,
4243+ pgrep_full=False, sleep_time=20):
4244+ """Check if service was restarted.
4245+
4246+ Compare a service's start time vs a file's last modification time
4247+ (such as a config file for that service) to determine if the service
4248+ has been restarted.
4249+ """
4250+ time.sleep(sleep_time)
4251+ if (self._get_proc_start_time(sentry_unit, service, pgrep_full) >=
4252+ self._get_file_mtime(sentry_unit, filename)):
4253+ return True
4254+ else:
4255+ return False
4256+
4257+ def relation_error(self, name, data):
4258+ return 'unexpected relation data in {} - {}'.format(name, data)
4259+
4260+ def endpoint_error(self, name, data):
4261+ return 'unexpected endpoint data in {} - {}'.format(name, data)
4262
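
A typical use of the vendored AmuletUtils helper, assuming the sentry objects created
earlier in the test (illustrative only, not part of this branch):

    import amulet

    from charmhelpers.contrib.amulet.utils import AmuletUtils


    def check_jenkins_running(jenkins_sentry):
        # jenkins_sentry would be e.g. d.sentry.unit['jenkins/0'];
        # validate_services() returns an error string, or None on success.
        u = AmuletUtils()
        error = u.validate_services({jenkins_sentry: ['service jenkins status']})
        if error:
            amulet.raise_status(amulet.FAIL, msg=error)
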
4263=== added directory 'tests/charmhelpers/contrib/openstack'
4264=== added file 'tests/charmhelpers/contrib/openstack/__init__.py'
4265=== added directory 'tests/charmhelpers/contrib/openstack/amulet'
4266=== added file 'tests/charmhelpers/contrib/openstack/amulet/__init__.py'
4267=== added file 'tests/charmhelpers/contrib/openstack/amulet/deployment.py'
4268--- tests/charmhelpers/contrib/openstack/amulet/deployment.py 1970-01-01 00:00:00 +0000
4269+++ tests/charmhelpers/contrib/openstack/amulet/deployment.py 2015-01-26 11:20:09 +0000
4270@@ -0,0 +1,92 @@
4271+import six
4272+from charmhelpers.contrib.amulet.deployment import (
4273+ AmuletDeployment
4274+)
4275+
4276+
4277+class OpenStackAmuletDeployment(AmuletDeployment):
4278+ """OpenStack amulet deployment.
4279+
4280+ This class inherits from AmuletDeployment and has additional support
4281+ that is specifically for use by OpenStack charms.
4282+ """
4283+
4284+ def __init__(self, series=None, openstack=None, source=None, stable=True):
4285+ """Initialize the deployment environment."""
4286+ super(OpenStackAmuletDeployment, self).__init__(series)
4287+ self.openstack = openstack
4288+ self.source = source
4289+ self.stable = stable
4290+ # Note(coreycb): this needs to be changed when new next branches come
4291+ # out.
4292+ self.current_next = "trusty"
4293+
4294+ def _determine_branch_locations(self, other_services):
4295+ """Determine the branch locations for the other services.
4296+
4297+ Determine if the local branch being tested is derived from its
4298+        stable or next (dev) branch, and based on this, use the corresponding
4299+ stable or next branches for the other_services."""
4300+ base_charms = ['mysql', 'mongodb', 'rabbitmq-server']
4301+
4302+ if self.stable:
4303+ for svc in other_services:
4304+ temp = 'lp:charms/{}'
4305+ svc['location'] = temp.format(svc['name'])
4306+ else:
4307+ for svc in other_services:
4308+ if svc['name'] in base_charms:
4309+ temp = 'lp:charms/{}'
4310+ svc['location'] = temp.format(svc['name'])
4311+ else:
4312+ temp = 'lp:~openstack-charmers/charms/{}/{}/next'
4313+ svc['location'] = temp.format(self.current_next,
4314+ svc['name'])
4315+ return other_services
4316+
4317+ def _add_services(self, this_service, other_services):
4318+ """Add services to the deployment and set openstack-origin/source."""
4319+ other_services = self._determine_branch_locations(other_services)
4320+
4321+ super(OpenStackAmuletDeployment, self)._add_services(this_service,
4322+ other_services)
4323+
4324+ services = other_services
4325+ services.append(this_service)
4326+ use_source = ['mysql', 'mongodb', 'rabbitmq-server', 'ceph',
4327+ 'ceph-osd', 'ceph-radosgw']
4328+
4329+ if self.openstack:
4330+ for svc in services:
4331+ if svc['name'] not in use_source:
4332+ config = {'openstack-origin': self.openstack}
4333+ self.d.configure(svc['name'], config)
4334+
4335+ if self.source:
4336+ for svc in services:
4337+ if svc['name'] in use_source:
4338+ config = {'source': self.source}
4339+ self.d.configure(svc['name'], config)
4340+
4341+ def _configure_services(self, configs):
4342+ """Configure all of the services."""
4343+ for service, config in six.iteritems(configs):
4344+ self.d.configure(service, config)
4345+
4346+ def _get_openstack_release(self):
4347+ """Get openstack release.
4348+
4349+ Return an integer representing the enum value of the openstack
4350+ release.
4351+ """
4352+ (self.precise_essex, self.precise_folsom, self.precise_grizzly,
4353+ self.precise_havana, self.precise_icehouse,
4354+ self.trusty_icehouse) = range(6)
4355+ releases = {
4356+ ('precise', None): self.precise_essex,
4357+ ('precise', 'cloud:precise-folsom'): self.precise_folsom,
4358+ ('precise', 'cloud:precise-grizzly'): self.precise_grizzly,
4359+ ('precise', 'cloud:precise-havana'): self.precise_havana,
4360+ ('precise', 'cloud:precise-icehouse'): self.precise_icehouse,
4361+ ('trusty', None): self.trusty_icehouse}
4362+ return releases[(self.series, self.openstack)]
4363
4364=== added file 'tests/charmhelpers/contrib/openstack/amulet/utils.py'
4365--- tests/charmhelpers/contrib/openstack/amulet/utils.py 1970-01-01 00:00:00 +0000
4366+++ tests/charmhelpers/contrib/openstack/amulet/utils.py 2015-01-26 11:20:09 +0000
4367@@ -0,0 +1,278 @@
4368+import logging
4369+import os
4370+import time
4371+import urllib
4372+
4373+import glanceclient.v1.client as glance_client
4374+import keystoneclient.v2_0 as keystone_client
4375+import novaclient.v1_1.client as nova_client
4376+
4377+import six
4378+
4379+from charmhelpers.contrib.amulet.utils import (
4380+ AmuletUtils
4381+)
4382+
4383+DEBUG = logging.DEBUG
4384+ERROR = logging.ERROR
4385+
4386+
4387+class OpenStackAmuletUtils(AmuletUtils):
4388+ """OpenStack amulet utilities.
4389+
4390+ This class inherits from AmuletUtils and has additional support
4391+ that is specifically for use by OpenStack charms.
4392+ """
4393+
4394+ def __init__(self, log_level=ERROR):
4395+ """Initialize the deployment environment."""
4396+ super(OpenStackAmuletUtils, self).__init__(log_level)
4397+
4398+ def validate_endpoint_data(self, endpoints, admin_port, internal_port,
4399+ public_port, expected):
4400+ """Validate endpoint data.
4401+
4402+ Validate actual endpoint data vs expected endpoint data. The ports
4403+ are used to find the matching endpoint.
4404+ """
4405+ found = False
4406+ for ep in endpoints:
4407+ self.log.debug('endpoint: {}'.format(repr(ep)))
4408+ if (admin_port in ep.adminurl and
4409+ internal_port in ep.internalurl and
4410+ public_port in ep.publicurl):
4411+ found = True
4412+ actual = {'id': ep.id,
4413+ 'region': ep.region,
4414+ 'adminurl': ep.adminurl,
4415+ 'internalurl': ep.internalurl,
4416+ 'publicurl': ep.publicurl,
4417+ 'service_id': ep.service_id}
4418+ ret = self._validate_dict_data(expected, actual)
4419+ if ret:
4420+ return 'unexpected endpoint data - {}'.format(ret)
4421+
4422+ if not found:
4423+ return 'endpoint not found'
4424+
4425+ def validate_svc_catalog_endpoint_data(self, expected, actual):
4426+ """Validate service catalog endpoint data.
4427+
4428+ Validate a list of actual service catalog endpoints vs a list of
4429+ expected service catalog endpoints.
4430+ """
4431+ self.log.debug('actual: {}'.format(repr(actual)))
4432+ for k, v in six.iteritems(expected):
4433+ if k in actual:
4434+ ret = self._validate_dict_data(expected[k][0], actual[k][0])
4435+ if ret:
4436+ return self.endpoint_error(k, ret)
4437+ else:
4438+ return "endpoint {} does not exist".format(k)
4439+ return ret
4440+
4441+ def validate_tenant_data(self, expected, actual):
4442+ """Validate tenant data.
4443+
4444+ Validate a list of actual tenant data vs list of expected tenant
4445+ data.
4446+ """
4447+ self.log.debug('actual: {}'.format(repr(actual)))
4448+ for e in expected:
4449+ found = False
4450+ for act in actual:
4451+ a = {'enabled': act.enabled, 'description': act.description,
4452+ 'name': act.name, 'id': act.id}
4453+ if e['name'] == a['name']:
4454+ found = True
4455+ ret = self._validate_dict_data(e, a)
4456+ if ret:
4457+ return "unexpected tenant data - {}".format(ret)
4458+ if not found:
4459+ return "tenant {} does not exist".format(e['name'])
4460+ return ret
4461+
4462+ def validate_role_data(self, expected, actual):
4463+ """Validate role data.
4464+
4465+ Validate a list of actual role data vs a list of expected role
4466+ data.
4467+ """
4468+ self.log.debug('actual: {}'.format(repr(actual)))
4469+ for e in expected:
4470+ found = False
4471+ for act in actual:
4472+ a = {'name': act.name, 'id': act.id}
4473+ if e['name'] == a['name']:
4474+ found = True
4475+ ret = self._validate_dict_data(e, a)
4476+ if ret:
4477+ return "unexpected role data - {}".format(ret)
4478+ if not found:
4479+ return "role {} does not exist".format(e['name'])
4480+ return ret
4481+
4482+ def validate_user_data(self, expected, actual):
4483+ """Validate user data.
4484+
4485+ Validate a list of actual user data vs a list of expected user
4486+ data.
4487+ """
4488+ self.log.debug('actual: {}'.format(repr(actual)))
4489+ for e in expected:
4490+ found = False
4491+ for act in actual:
4492+ a = {'enabled': act.enabled, 'name': act.name,
4493+ 'email': act.email, 'tenantId': act.tenantId,
4494+ 'id': act.id}
4495+ if e['name'] == a['name']:
4496+ found = True
4497+ ret = self._validate_dict_data(e, a)
4498+ if ret:
4499+ return "unexpected user data - {}".format(ret)
4500+ if not found:
4501+ return "user {} does not exist".format(e['name'])
4502+ return ret
4503+
4504+ def validate_flavor_data(self, expected, actual):
4505+ """Validate flavor data.
4506+
4507+ Validate a list of actual flavors vs a list of expected flavors.
4508+ """
4509+ self.log.debug('actual: {}'.format(repr(actual)))
4510+ act = [a.name for a in actual]
4511+ return self._validate_list_data(expected, act)
4512+
4513+ def tenant_exists(self, keystone, tenant):
4514+ """Return True if tenant exists."""
4515+ return tenant in [t.name for t in keystone.tenants.list()]
4516+
4517+ def authenticate_keystone_admin(self, keystone_sentry, user, password,
4518+ tenant):
4519+ """Authenticates admin user with the keystone admin endpoint."""
4520+ unit = keystone_sentry
4521+ service_ip = unit.relation('shared-db',
4522+ 'mysql:shared-db')['private-address']
4523+ ep = "http://{}:35357/v2.0".format(service_ip.strip().decode('utf-8'))
4524+ return keystone_client.Client(username=user, password=password,
4525+ tenant_name=tenant, auth_url=ep)
4526+
4527+ def authenticate_keystone_user(self, keystone, user, password, tenant):
4528+ """Authenticates a regular user with the keystone public endpoint."""
4529+ ep = keystone.service_catalog.url_for(service_type='identity',
4530+ endpoint_type='publicURL')
4531+ return keystone_client.Client(username=user, password=password,
4532+ tenant_name=tenant, auth_url=ep)
4533+
4534+ def authenticate_glance_admin(self, keystone):
4535+ """Authenticates admin user with glance."""
4536+ ep = keystone.service_catalog.url_for(service_type='image',
4537+ endpoint_type='adminURL')
4538+ return glance_client.Client(ep, token=keystone.auth_token)
4539+
4540+ def authenticate_nova_user(self, keystone, user, password, tenant):
4541+ """Authenticates a regular user with nova-api."""
4542+ ep = keystone.service_catalog.url_for(service_type='identity',
4543+ endpoint_type='publicURL')
4544+ return nova_client.Client(username=user, api_key=password,
4545+ project_id=tenant, auth_url=ep)
4546+
4547+ def create_cirros_image(self, glance, image_name):
4548+ """Download the latest cirros image and upload it to glance."""
4549+ http_proxy = os.getenv('AMULET_HTTP_PROXY')
4550+ self.log.debug('AMULET_HTTP_PROXY: {}'.format(http_proxy))
4551+ if http_proxy:
4552+ proxies = {'http': http_proxy}
4553+ opener = urllib.FancyURLopener(proxies)
4554+ else:
4555+ opener = urllib.FancyURLopener()
4556+
4557+ f = opener.open("http://download.cirros-cloud.net/version/released")
4558+ version = f.read().strip()
4559+ cirros_img = "cirros-{}-x86_64-disk.img".format(version)
4560+ local_path = os.path.join('tests', cirros_img)
4561+
4562+ if not os.path.exists(local_path):
4563+ cirros_url = "http://{}/{}/{}".format("download.cirros-cloud.net",
4564+ version, cirros_img)
4565+ opener.retrieve(cirros_url, local_path)
4566+ f.close()
4567+
4568+ with open(local_path) as f:
4569+ image = glance.images.create(name=image_name, is_public=True,
4570+ disk_format='qcow2',
4571+ container_format='bare', data=f)
4572+ count = 1
4573+ status = image.status
4574+ while status != 'active' and count < 10:
4575+ time.sleep(3)
4576+ image = glance.images.get(image.id)
4577+ status = image.status
4578+ self.log.debug('image status: {}'.format(status))
4579+ count += 1
4580+
4581+ if status != 'active':
4582+ self.log.error('image creation timed out')
4583+ return None
4584+
4585+ return image
4586+
4587+ def delete_image(self, glance, image):
4588+ """Delete the specified image."""
4589+ num_before = len(list(glance.images.list()))
4590+ glance.images.delete(image)
4591+
4592+ count = 1
4593+ num_after = len(list(glance.images.list()))
4594+ while num_after != (num_before - 1) and count < 10:
4595+ time.sleep(3)
4596+ num_after = len(list(glance.images.list()))
4597+ self.log.debug('number of images: {}'.format(num_after))
4598+ count += 1
4599+
4600+ if num_after != (num_before - 1):
4601+ self.log.error('image deletion timed out')
4602+ return False
4603+
4604+ return True
4605+
4606+ def create_instance(self, nova, image_name, instance_name, flavor):
4607+ """Create the specified instance."""
4608+ image = nova.images.find(name=image_name)
4609+ flavor = nova.flavors.find(name=flavor)
4610+ instance = nova.servers.create(name=instance_name, image=image,
4611+ flavor=flavor)
4612+
4613+ count = 1
4614+ status = instance.status
4615+ while status != 'ACTIVE' and count < 60:
4616+ time.sleep(3)
4617+ instance = nova.servers.get(instance.id)
4618+ status = instance.status
4619+ self.log.debug('instance status: {}'.format(status))
4620+ count += 1
4621+
4622+ if status != 'ACTIVE':
4623+ self.log.error('instance creation timed out')
4624+ return None
4625+
4626+ return instance
4627+
4628+ def delete_instance(self, nova, instance):
4629+ """Delete the specified instance."""
4630+ num_before = len(list(nova.servers.list()))
4631+ nova.servers.delete(instance)
4632+
4633+ count = 1
4634+ num_after = len(list(nova.servers.list()))
4635+ while num_after != (num_before - 1) and count < 10:
4636+ time.sleep(3)
4637+ num_after = len(list(nova.servers.list()))
4638+ self.log.debug('number of instances: {}'.format(num_after))
4639+ count += 1
4640+
4641+ if num_after != (num_before - 1):
4642+ self.log.error('instance deletion timed out')
4643+ return False
4644+
4645+ return True
4646
4647=== added directory 'unit_tests'
4648=== added file 'unit_tests/__init__.py'
4649=== added file 'unit_tests/test_jenkins_hooks.py'
4650--- unit_tests/test_jenkins_hooks.py 1970-01-01 00:00:00 +0000
4651+++ unit_tests/test_jenkins_hooks.py 2015-01-26 11:20:09 +0000
4652@@ -0,0 +1,6 @@
4653+import unittest
4654+
4655+
4656+class JenkinsHooksTests(unittest.TestCase):
4657+ def setUp(self):
4658+ super(JenkinsHooksTests, self).setUp()
4659
4660=== added file 'unit_tests/test_jenkins_utils.py'
4661--- unit_tests/test_jenkins_utils.py 1970-01-01 00:00:00 +0000
4662+++ unit_tests/test_jenkins_utils.py 2015-01-26 11:20:09 +0000
4663@@ -0,0 +1,6 @@
4664+import unittest
4665+
4666+
4667+class JenkinsUtilsTests(unittest.TestCase):
4668+ def setUp(self):
4669+ super(JenkinsUtilsTests, self).setUp()
