Merge lp:~niedbalski/charms/precise/rsyslog/lp-1310793 into lp:charms/rsyslog
Proposed by Jorge Niedbalski
Status: Merged
Merged at revision: 10
Proposed branch: lp:~niedbalski/charms/precise/rsyslog/lp-1310793
Merge into: lp:charms/rsyslog
Diff against target: 1813 lines (+1595/-70), 24 files modified
- Makefile (+22/-0)
- README (+0/-1)
- charm-helpers.yaml (+5/-0)
- config.yaml (+4/-0)
- files/60-aggregator.conf (+0/-7)
- hooks/charmhelpers/contrib/templating/contexts.py (+104/-0)
- hooks/charmhelpers/core/hookenv.py (+401/-0)
- hooks/charmhelpers/core/host.py (+297/-0)
- hooks/charmhelpers/fetch/__init__.py (+308/-0)
- hooks/charmhelpers/fetch/archiveurl.py (+63/-0)
- hooks/charmhelpers/fetch/bzrurl.py (+49/-0)
- hooks/config-changed (+0/-39)
- hooks/hooks.py (+125/-0)
- hooks/install (+0/-8)
- hooks/start (+0/-8)
- hooks/stop (+0/-4)
- hooks/upgrade-charm (+0/-2)
- revision (+1/-1)
- setup.cfg (+5/-0)
- templates/60-aggregator.conf (+7/-0)
- templates/nova-logging.conf (+12/-0)
- templates/rsyslog.conf (+37/-0)
- test_requirements.txt (+6/-0)
- unit_tests/test_hooks.py (+149/-0)
To merge this branch: bzr merge lp:~niedbalski/charms/precise/rsyslog/lp-1310793
Related bugs:
Reviewer: Marco Ceppi (community), Approve
Review via email: mp+216773@code.launchpad.net
Commit message
Description of the change
Proposed fix for lp:1310793
Revision history for this message
Tim Van Steenburgh (tvansteenburgh) wrote:
Revision history for this message
Marco Ceppi (marcoceppi) wrote:
Hi Jorge,
In addition to the one-line fix that Tim mentioned, make lint is also failing with a few flake8 errors:
$ make lint
hooks/hooks.py:5:1: F401 're' imported but unused
hooks/hooks.
make: *** [lint] Error 1
Could you address those as well prior to merging? Otherwise it's another +1 from me. Amazing work!
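For readers unfamiliar with the flake8 codes: F401 means a module is imported but never used, and the fix is simply to delete the import. As a standalone illustration of the idea (not how flake8 is implemented internally), unused imports can be spotted with the standard-library ast module:

```python
import ast

# Sample hook source: `re` is imported but never referenced.
SOURCE = "import re\nimport os\nprint(os.sep)\n"

tree = ast.parse(SOURCE)
imported = {alias.name
            for node in ast.walk(tree) if isinstance(node, ast.Import)
            for alias in node.names}
used = {node.id for node in ast.walk(tree) if isinstance(node, ast.Name)}
unused = imported - used
print(unused)  # flake8 would report these as F401
```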
- 12. By Jorge Niedbalski
  [review] Modified according to Tim Van Steenburgh's and Marco Ceppi's observations
Revision history for this message
Jorge Niedbalski (niedbalski) wrote:
Hey guys,
Fixed.
[flake8 --exclude=
[make test] == 7 passed, 0 failed
Thanks!
- 13. By Jorge Niedbalski
  [fix] ModLoad imtcp not exposed yet
Preview Diff
1 | === added file 'Makefile' |
2 | --- Makefile 1970-01-01 00:00:00 +0000 |
3 | +++ Makefile 2014-05-01 19:18:03 +0000 |
4 | @@ -0,0 +1,22 @@ |
5 | +#!/usr/bin/make |
6 | +PYTHON := /usr/bin/env python |
7 | + |
8 | +build: sync-charm-helpers lint |
9 | + |
10 | +lint: |
11 | + @flake8 --exclude hooks/charmhelpers --ignore=E125 hooks |
12 | + |
13 | +test: |
14 | + @pip install -r test_requirements.txt |
15 | + @PYTHONPATH=$(PYTHON_PATH):hooks/ nosetests --nologcapture unit_tests |
16 | + |
17 | +bin/charm_helpers_sync.py: |
18 | + @bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py > bin/charm_helpers_sync.py |
19 | + |
20 | +sync-charm-helpers: bin/charm_helpers_sync.py |
21 | + @$(PYTHON) bin/charm_helpers_sync.py -c charm-helpers.yaml |
22 | + |
23 | +deploy: |
24 | + @echo Deploying charm-rsyslog template. |
25 | + @juju deploy --repository=../.. local:rsyslog |
26 | + @echo See the README for explorations after deploying. |
27 | |
28 | === modified file 'README' |
29 | --- README 2012-09-11 15:59:06 +0000 |
30 | +++ README 2014-05-01 19:18:03 +0000 |
31 | @@ -5,5 +5,4 @@ |
32 | TODO: |
33 | ----- |
34 | -Tighter security |
35 | --Configurable filters |
36 | -Forwarding |
37 | |
38 | === added file 'charm-helpers.yaml' |
39 | --- charm-helpers.yaml 1970-01-01 00:00:00 +0000 |
40 | +++ charm-helpers.yaml 2014-05-01 19:18:03 +0000 |
41 | @@ -0,0 +1,5 @@ |
42 | +branch: lp:charm-helpers |
43 | +destination: hooks/charmhelpers |
44 | +include: |
45 | + - core |
46 | + - fetch |
47 | |
48 | === modified file 'config.yaml' |
49 | --- config.yaml 2014-03-14 16:33:25 +0000 |
50 | +++ config.yaml 2014-05-01 19:18:03 +0000 |
51 | @@ -7,3 +7,7 @@ |
52 | type: int |
53 | default: 4 |
54 | description: "Number of days to rotate /var/log/messages and other associated files" |
55 | + nova_logs: |
56 | + type: boolean |
57 | + default: false |
58 | + description: "Have an individual log file for each compute node as well as an aggregated log that contains nova logs from all nodes" |
59 | |
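For reference, once this lands a deployer would flip the new option at deploy time. A sketch of the config fragment (the service name rsyslog is assumed; the option defaults to false as defined above):

```yaml
# e.g. passed via `juju set rsyslog nova_logs=true` or a deploy config file
rsyslog:
  nova_logs: true
```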
60 | === removed directory 'files' |
61 | === removed file 'files/60-aggregator.conf' |
62 | --- files/60-aggregator.conf 2012-04-17 09:35:17 +0000 |
63 | +++ files/60-aggregator.conf 1970-01-01 00:00:00 +0000 |
64 | @@ -1,7 +0,0 @@ |
65 | -# Remote listening for aggregation |
66 | -$ModLoad imudp |
67 | -$UDPServerRun 514 |
68 | - |
69 | -$ModLoad imtcp |
70 | -$InputTCPMaxSessions 500 |
71 | -$InputTCPServerRun 10514 |
72 | |
73 | === added directory 'hooks/charmhelpers' |
74 | === added file 'hooks/charmhelpers/__init__.py' |
75 | === added directory 'hooks/charmhelpers/contrib' |
76 | === added file 'hooks/charmhelpers/contrib/__init__.py' |
77 | === added directory 'hooks/charmhelpers/contrib/templating' |
78 | === added file 'hooks/charmhelpers/contrib/templating/__init__.py' |
79 | === added file 'hooks/charmhelpers/contrib/templating/contexts.py' |
80 | --- hooks/charmhelpers/contrib/templating/contexts.py 1970-01-01 00:00:00 +0000 |
81 | +++ hooks/charmhelpers/contrib/templating/contexts.py 2014-05-01 19:18:03 +0000 |
82 | @@ -0,0 +1,104 @@ |
83 | +# Copyright 2013 Canonical Ltd. |
84 | +# |
85 | +# Authors: |
86 | +# Charm Helpers Developers <juju@lists.ubuntu.com> |
87 | +"""A helper to create a yaml cache of config with namespaced relation data.""" |
88 | +import os |
89 | +import yaml |
90 | + |
91 | +import charmhelpers.core.hookenv |
92 | + |
93 | + |
94 | +charm_dir = os.environ.get('CHARM_DIR', '') |
95 | + |
96 | + |
97 | +def dict_keys_without_hyphens(a_dict): |
98 | + """Return the a new dict with underscores instead of hyphens in keys.""" |
99 | + return dict( |
100 | + (key.replace('-', '_'), val) for key, val in a_dict.items()) |
101 | + |
102 | + |
103 | +def update_relations(context, namespace_separator=':'): |
104 | + """Update the context with the relation data.""" |
105 | + # Add any relation data prefixed with the relation type. |
106 | + relation_type = charmhelpers.core.hookenv.relation_type() |
107 | + relations = [] |
108 | + context['current_relation'] = {} |
109 | + if relation_type is not None: |
110 | + relation_data = charmhelpers.core.hookenv.relation_get() |
111 | + context['current_relation'] = relation_data |
112 | + # Deprecated: the following use of relation data as keys |
113 | + # directly in the context will be removed. |
114 | + relation_data = dict( |
115 | + ("{relation_type}{namespace_separator}{key}".format( |
116 | + relation_type=relation_type, |
117 | + key=key, |
118 | + namespace_separator=namespace_separator), val) |
119 | + for key, val in relation_data.items()) |
120 | + relation_data = dict_keys_without_hyphens(relation_data) |
121 | + context.update(relation_data) |
122 | + relations = charmhelpers.core.hookenv.relations_of_type(relation_type) |
123 | + relations = [dict_keys_without_hyphens(rel) for rel in relations] |
124 | + |
125 | + if 'relations_deprecated' not in context: |
126 | + context['relations_deprecated'] = {} |
127 | + if relation_type is not None: |
128 | + relation_type = relation_type.replace('-', '_') |
129 | + context['relations_deprecated'][relation_type] = relations |
130 | + |
131 | + context['relations'] = charmhelpers.core.hookenv.relations() |
132 | + |
133 | + |
134 | +def juju_state_to_yaml(yaml_path, namespace_separator=':', |
135 | + allow_hyphens_in_keys=True): |
136 | + """Update the juju config and state in a yaml file. |
137 | + |
138 | + This includes any current relation-get data, and the charm |
139 | + directory. |
140 | + |
141 | + This function was created for the ansible and saltstack |
142 | + support, as those libraries can use a yaml file to supply |
143 | + context to templates, but it may be useful generally to |
144 | + create and update an on-disk cache of all the config, including |
145 | + previous relation data. |
146 | + |
147 | + By default, hyphens are allowed in keys as this is supported |
148 | + by yaml, but for tools like ansible, hyphens are not valid [1]. |
149 | + |
150 | + [1] http://www.ansibleworks.com/docs/playbooks_variables.html#what-makes-a-valid-variable-name |
151 | + """ |
152 | + config = charmhelpers.core.hookenv.config() |
153 | + |
154 | + # Add the charm_dir which we will need to refer to charm |
155 | + # file resources etc. |
156 | + config['charm_dir'] = charm_dir |
157 | + config['local_unit'] = charmhelpers.core.hookenv.local_unit() |
158 | + config['unit_private_address'] = charmhelpers.core.hookenv.unit_private_ip() |
159 | + config['unit_public_address'] = charmhelpers.core.hookenv.unit_get( |
160 | + 'public-address' |
161 | + ) |
162 | + |
163 | + # Don't use non-standard tags for unicode which will not |
164 | + # work when salt uses yaml.load_safe. |
165 | + yaml.add_representer(unicode, lambda dumper, |
166 | + value: dumper.represent_scalar( |
167 | + u'tag:yaml.org,2002:str', value)) |
168 | + |
169 | + yaml_dir = os.path.dirname(yaml_path) |
170 | + if not os.path.exists(yaml_dir): |
171 | + os.makedirs(yaml_dir) |
172 | + |
173 | + if os.path.exists(yaml_path): |
174 | + with open(yaml_path, "r") as existing_vars_file: |
175 | + existing_vars = yaml.load(existing_vars_file.read()) |
176 | + else: |
177 | + existing_vars = {} |
178 | + |
179 | + if not allow_hyphens_in_keys: |
180 | + config = dict_keys_without_hyphens(config) |
181 | + existing_vars.update(config) |
182 | + |
183 | + update_relations(existing_vars, namespace_separator) |
184 | + |
185 | + with open(yaml_path, "w+") as fp: |
186 | + fp.write(yaml.dump(existing_vars)) |
187 | |
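The dict_keys_without_hyphens helper added above is small enough to exercise standalone; it rewrites only the keys, leaving values untouched, so relation keys like public-address become valid template variable names:

```python
def dict_keys_without_hyphens(a_dict):
    # Return a new dict with underscores instead of hyphens in keys.
    return dict(
        (key.replace('-', '_'), val) for key, val in a_dict.items())

result = dict_keys_without_hyphens({'public-address': '10.0.0.1', 'port': 514})
print(result)
```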
188 | === added directory 'hooks/charmhelpers/core' |
189 | === added file 'hooks/charmhelpers/core/__init__.py' |
190 | === added file 'hooks/charmhelpers/core/hookenv.py' |
191 | --- hooks/charmhelpers/core/hookenv.py 1970-01-01 00:00:00 +0000 |
192 | +++ hooks/charmhelpers/core/hookenv.py 2014-05-01 19:18:03 +0000 |
193 | @@ -0,0 +1,401 @@ |
194 | +"Interactions with the Juju environment" |
195 | +# Copyright 2013 Canonical Ltd. |
196 | +# |
197 | +# Authors: |
198 | +# Charm Helpers Developers <juju@lists.ubuntu.com> |
199 | + |
200 | +import os |
201 | +import json |
202 | +import yaml |
203 | +import subprocess |
204 | +import sys |
205 | +import UserDict |
206 | +from subprocess import CalledProcessError |
207 | + |
208 | +CRITICAL = "CRITICAL" |
209 | +ERROR = "ERROR" |
210 | +WARNING = "WARNING" |
211 | +INFO = "INFO" |
212 | +DEBUG = "DEBUG" |
213 | +MARKER = object() |
214 | + |
215 | +cache = {} |
216 | + |
217 | + |
218 | +def cached(func): |
219 | + """Cache return values for multiple executions of func + args |
220 | + |
221 | + For example: |
222 | + |
223 | + @cached |
224 | + def unit_get(attribute): |
225 | + pass |
226 | + |
227 | + unit_get('test') |
228 | + |
229 | + will cache the result of unit_get + 'test' for future calls. |
230 | + """ |
231 | + def wrapper(*args, **kwargs): |
232 | + global cache |
233 | + key = str((func, args, kwargs)) |
234 | + try: |
235 | + return cache[key] |
236 | + except KeyError: |
237 | + res = func(*args, **kwargs) |
238 | + cache[key] = res |
239 | + return res |
240 | + return wrapper |
241 | + |
242 | + |
243 | +def flush(key): |
244 | + """Flushes any entries from function cache where the |
245 | + key is found in the function+args """ |
246 | + flush_list = [] |
247 | + for item in cache: |
248 | + if key in item: |
249 | + flush_list.append(item) |
250 | + for item in flush_list: |
251 | + del cache[item] |
252 | + |
253 | + |
254 | +def log(message, level=None): |
255 | + """Write a message to the juju log""" |
256 | + command = ['juju-log'] |
257 | + if level: |
258 | + command += ['-l', level] |
259 | + command += [message] |
260 | + subprocess.call(command) |
261 | + |
262 | + |
263 | +class Serializable(UserDict.IterableUserDict): |
264 | + """Wrapper, an object that can be serialized to yaml or json""" |
265 | + |
266 | + def __init__(self, obj): |
267 | + # wrap the object |
268 | + UserDict.IterableUserDict.__init__(self) |
269 | + self.data = obj |
270 | + |
271 | + def __getattr__(self, attr): |
272 | + # See if this object has attribute. |
273 | + if attr in ("json", "yaml", "data"): |
274 | + return self.__dict__[attr] |
275 | + # Check for attribute in wrapped object. |
276 | + got = getattr(self.data, attr, MARKER) |
277 | + if got is not MARKER: |
278 | + return got |
279 | + # Proxy to the wrapped object via dict interface. |
280 | + try: |
281 | + return self.data[attr] |
282 | + except KeyError: |
283 | + raise AttributeError(attr) |
284 | + |
285 | + def __getstate__(self): |
286 | + # Pickle as a standard dictionary. |
287 | + return self.data |
288 | + |
289 | + def __setstate__(self, state): |
290 | + # Unpickle into our wrapper. |
291 | + self.data = state |
292 | + |
293 | + def json(self): |
294 | + """Serialize the object to json""" |
295 | + return json.dumps(self.data) |
296 | + |
297 | + def yaml(self): |
298 | + """Serialize the object to yaml""" |
299 | + return yaml.dump(self.data) |
300 | + |
301 | + |
302 | +def execution_environment(): |
303 | + """A convenient bundling of the current execution context""" |
304 | + context = {} |
305 | + context['conf'] = config() |
306 | + if relation_id(): |
307 | + context['reltype'] = relation_type() |
308 | + context['relid'] = relation_id() |
309 | + context['rel'] = relation_get() |
310 | + context['unit'] = local_unit() |
311 | + context['rels'] = relations() |
312 | + context['env'] = os.environ |
313 | + return context |
314 | + |
315 | + |
316 | +def in_relation_hook(): |
317 | + """Determine whether we're running in a relation hook""" |
318 | + return 'JUJU_RELATION' in os.environ |
319 | + |
320 | + |
321 | +def relation_type(): |
322 | + """The scope for the current relation hook""" |
323 | + return os.environ.get('JUJU_RELATION', None) |
324 | + |
325 | + |
326 | +def relation_id(): |
327 | + """The relation ID for the current relation hook""" |
328 | + return os.environ.get('JUJU_RELATION_ID', None) |
329 | + |
330 | + |
331 | +def local_unit(): |
332 | + """Local unit ID""" |
333 | + return os.environ['JUJU_UNIT_NAME'] |
334 | + |
335 | + |
336 | +def remote_unit(): |
337 | + """The remote unit for the current relation hook""" |
338 | + return os.environ['JUJU_REMOTE_UNIT'] |
339 | + |
340 | + |
341 | +def service_name(): |
342 | + """The name service group this unit belongs to""" |
343 | + return local_unit().split('/')[0] |
344 | + |
345 | + |
346 | +def hook_name(): |
347 | + """The name of the currently executing hook""" |
348 | + return os.path.basename(sys.argv[0]) |
349 | + |
350 | + |
351 | +@cached |
352 | +def config(scope=None): |
353 | + """Juju charm configuration""" |
354 | + config_cmd_line = ['config-get'] |
355 | + if scope is not None: |
356 | + config_cmd_line.append(scope) |
357 | + config_cmd_line.append('--format=json') |
358 | + try: |
359 | + return json.loads(subprocess.check_output(config_cmd_line)) |
360 | + except ValueError: |
361 | + return None |
362 | + |
363 | + |
364 | +@cached |
365 | +def relation_get(attribute=None, unit=None, rid=None): |
366 | + """Get relation information""" |
367 | + _args = ['relation-get', '--format=json'] |
368 | + if rid: |
369 | + _args.append('-r') |
370 | + _args.append(rid) |
371 | + _args.append(attribute or '-') |
372 | + if unit: |
373 | + _args.append(unit) |
374 | + try: |
375 | + return json.loads(subprocess.check_output(_args)) |
376 | + except ValueError: |
377 | + return None |
378 | + except CalledProcessError, e: |
379 | + if e.returncode == 2: |
380 | + return None |
381 | + raise |
382 | + |
383 | + |
384 | +def relation_set(relation_id=None, relation_settings={}, **kwargs): |
385 | + """Set relation information for the current unit""" |
386 | + relation_cmd_line = ['relation-set'] |
387 | + if relation_id is not None: |
388 | + relation_cmd_line.extend(('-r', relation_id)) |
389 | + for k, v in (relation_settings.items() + kwargs.items()): |
390 | + if v is None: |
391 | + relation_cmd_line.append('{}='.format(k)) |
392 | + else: |
393 | + relation_cmd_line.append('{}={}'.format(k, v)) |
394 | + subprocess.check_call(relation_cmd_line) |
395 | + # Flush cache of any relation-gets for local unit |
396 | + flush(local_unit()) |
397 | + |
398 | + |
399 | +@cached |
400 | +def relation_ids(reltype=None): |
401 | + """A list of relation_ids""" |
402 | + reltype = reltype or relation_type() |
403 | + relid_cmd_line = ['relation-ids', '--format=json'] |
404 | + if reltype is not None: |
405 | + relid_cmd_line.append(reltype) |
406 | + return json.loads(subprocess.check_output(relid_cmd_line)) or [] |
407 | + return [] |
408 | + |
409 | + |
410 | +@cached |
411 | +def related_units(relid=None): |
412 | + """A list of related units""" |
413 | + relid = relid or relation_id() |
414 | + units_cmd_line = ['relation-list', '--format=json'] |
415 | + if relid is not None: |
416 | + units_cmd_line.extend(('-r', relid)) |
417 | + return json.loads(subprocess.check_output(units_cmd_line)) or [] |
418 | + |
419 | + |
420 | +@cached |
421 | +def relation_for_unit(unit=None, rid=None): |
422 | + """Get the json represenation of a unit's relation""" |
423 | + unit = unit or remote_unit() |
424 | + relation = relation_get(unit=unit, rid=rid) |
425 | + for key in relation: |
426 | + if key.endswith('-list'): |
427 | + relation[key] = relation[key].split() |
428 | + relation['__unit__'] = unit |
429 | + return relation |
430 | + |
431 | + |
432 | +@cached |
433 | +def relations_for_id(relid=None): |
434 | + """Get relations of a specific relation ID""" |
435 | + relation_data = [] |
436 | + relid = relid or relation_ids() |
437 | + for unit in related_units(relid): |
438 | + unit_data = relation_for_unit(unit, relid) |
439 | + unit_data['__relid__'] = relid |
440 | + relation_data.append(unit_data) |
441 | + return relation_data |
442 | + |
443 | + |
444 | +@cached |
445 | +def relations_of_type(reltype=None): |
446 | + """Get relations of a specific type""" |
447 | + relation_data = [] |
448 | + reltype = reltype or relation_type() |
449 | + for relid in relation_ids(reltype): |
450 | + for relation in relations_for_id(relid): |
451 | + relation['__relid__'] = relid |
452 | + relation_data.append(relation) |
453 | + return relation_data |
454 | + |
455 | + |
456 | +@cached |
457 | +def relation_types(): |
458 | + """Get a list of relation types supported by this charm""" |
459 | + charmdir = os.environ.get('CHARM_DIR', '') |
460 | + mdf = open(os.path.join(charmdir, 'metadata.yaml')) |
461 | + md = yaml.safe_load(mdf) |
462 | + rel_types = [] |
463 | + for key in ('provides', 'requires', 'peers'): |
464 | + section = md.get(key) |
465 | + if section: |
466 | + rel_types.extend(section.keys()) |
467 | + mdf.close() |
468 | + return rel_types |
469 | + |
470 | + |
471 | +@cached |
472 | +def relations(): |
473 | + """Get a nested dictionary of relation data for all related units""" |
474 | + rels = {} |
475 | + for reltype in relation_types(): |
476 | + relids = {} |
477 | + for relid in relation_ids(reltype): |
478 | + units = {local_unit(): relation_get(unit=local_unit(), rid=relid)} |
479 | + for unit in related_units(relid): |
480 | + reldata = relation_get(unit=unit, rid=relid) |
481 | + units[unit] = reldata |
482 | + relids[relid] = units |
483 | + rels[reltype] = relids |
484 | + return rels |
485 | + |
486 | + |
487 | +@cached |
488 | +def is_relation_made(relation, keys='private-address'): |
489 | + ''' |
490 | + Determine whether a relation is established by checking for |
491 | + presence of key(s). If a list of keys is provided, they |
492 | + must all be present for the relation to be identified as made |
493 | + ''' |
494 | + if isinstance(keys, str): |
495 | + keys = [keys] |
496 | + for r_id in relation_ids(relation): |
497 | + for unit in related_units(r_id): |
498 | + context = {} |
499 | + for k in keys: |
500 | + context[k] = relation_get(k, rid=r_id, |
501 | + unit=unit) |
502 | + if None not in context.values(): |
503 | + return True |
504 | + return False |
505 | + |
506 | + |
507 | +def open_port(port, protocol="TCP"): |
508 | + """Open a service network port""" |
509 | + _args = ['open-port'] |
510 | + _args.append('{}/{}'.format(port, protocol)) |
511 | + subprocess.check_call(_args) |
512 | + |
513 | + |
514 | +def close_port(port, protocol="TCP"): |
515 | + """Close a service network port""" |
516 | + _args = ['close-port'] |
517 | + _args.append('{}/{}'.format(port, protocol)) |
518 | + subprocess.check_call(_args) |
519 | + |
520 | + |
521 | +@cached |
522 | +def unit_get(attribute): |
523 | + """Get the unit ID for the remote unit""" |
524 | + _args = ['unit-get', '--format=json', attribute] |
525 | + try: |
526 | + return json.loads(subprocess.check_output(_args)) |
527 | + except ValueError: |
528 | + return None |
529 | + |
530 | + |
531 | +def unit_private_ip(): |
532 | + """Get this unit's private IP address""" |
533 | + return unit_get('private-address') |
534 | + |
535 | + |
536 | +class UnregisteredHookError(Exception): |
537 | + """Raised when an undefined hook is called""" |
538 | + pass |
539 | + |
540 | + |
541 | +class Hooks(object): |
542 | + """A convenient handler for hook functions. |
543 | + |
544 | + Example: |
545 | + hooks = Hooks() |
546 | + |
547 | + # register a hook, taking its name from the function name |
548 | + @hooks.hook() |
549 | + def install(): |
550 | + ... |
551 | + |
552 | + # register a hook, providing a custom hook name |
553 | + @hooks.hook("config-changed") |
554 | + def config_changed(): |
555 | + ... |
556 | + |
557 | + if __name__ == "__main__": |
558 | + # execute a hook based on the name the program is called by |
559 | + hooks.execute(sys.argv) |
560 | + """ |
561 | + |
562 | + def __init__(self): |
563 | + super(Hooks, self).__init__() |
564 | + self._hooks = {} |
565 | + |
566 | + def register(self, name, function): |
567 | + """Register a hook""" |
568 | + self._hooks[name] = function |
569 | + |
570 | + def execute(self, args): |
571 | + """Execute a registered hook based on args[0]""" |
572 | + hook_name = os.path.basename(args[0]) |
573 | + if hook_name in self._hooks: |
574 | + self._hooks[hook_name]() |
575 | + else: |
576 | + raise UnregisteredHookError(hook_name) |
577 | + |
578 | + def hook(self, *hook_names): |
579 | + """Decorator, registering them as hooks""" |
580 | + def wrapper(decorated): |
581 | + for hook_name in hook_names: |
582 | + self.register(hook_name, decorated) |
583 | + else: |
584 | + self.register(decorated.__name__, decorated) |
585 | + if '_' in decorated.__name__: |
586 | + self.register( |
587 | + decorated.__name__.replace('_', '-'), decorated) |
588 | + return decorated |
589 | + return wrapper |
590 | + |
591 | + |
592 | +def charm_dir(): |
593 | + """Return the root directory of the current charm""" |
594 | + return os.environ.get('CHARM_DIR') |
595 | |
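The @cached decorator added above memoises hook-tool output for the lifetime of a hook run, so repeated relation-get/config-get calls don't fork a subprocess each time. A standalone sketch of the same pattern, with a side-effect counter standing in for the subprocess call:

```python
cache = {}

def cached(func):
    # Memoise results keyed on the function plus its arguments,
    # mirroring the @cached decorator in hookenv.py above.
    def wrapper(*args, **kwargs):
        key = str((func, args, kwargs))
        if key not in cache:
            cache[key] = func(*args, **kwargs)
        return cache[key]
    return wrapper

calls = []

@cached
def unit_get(attribute):
    calls.append(attribute)  # stands in for the real `unit-get` subprocess
    return 'value-for-' + attribute

unit_get('test')
unit_get('test')  # second call is served from the cache
print(len(calls))
```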
596 | === added file 'hooks/charmhelpers/core/host.py' |
597 | --- hooks/charmhelpers/core/host.py 1970-01-01 00:00:00 +0000 |
598 | +++ hooks/charmhelpers/core/host.py 2014-05-01 19:18:03 +0000 |
599 | @@ -0,0 +1,297 @@ |
600 | +"""Tools for working with the host system""" |
601 | +# Copyright 2012 Canonical Ltd. |
602 | +# |
603 | +# Authors: |
604 | +# Nick Moffitt <nick.moffitt@canonical.com> |
605 | +# Matthew Wedgwood <matthew.wedgwood@canonical.com> |
606 | + |
607 | +import os |
608 | +import pwd |
609 | +import grp |
610 | +import random |
611 | +import string |
612 | +import subprocess |
613 | +import hashlib |
614 | + |
615 | +from collections import OrderedDict |
616 | + |
617 | +from hookenv import log |
618 | + |
619 | + |
620 | +def service_start(service_name): |
621 | + """Start a system service""" |
622 | + return service('start', service_name) |
623 | + |
624 | + |
625 | +def service_stop(service_name): |
626 | + """Stop a system service""" |
627 | + return service('stop', service_name) |
628 | + |
629 | + |
630 | +def service_restart(service_name): |
631 | + """Restart a system service""" |
632 | + return service('restart', service_name) |
633 | + |
634 | + |
635 | +def service_reload(service_name, restart_on_failure=False): |
636 | + """Reload a system service, optionally falling back to restart if reload fails""" |
637 | + service_result = service('reload', service_name) |
638 | + if not service_result and restart_on_failure: |
639 | + service_result = service('restart', service_name) |
640 | + return service_result |
641 | + |
642 | + |
643 | +def service(action, service_name): |
644 | + """Control a system service""" |
645 | + cmd = ['service', service_name, action] |
646 | + return subprocess.call(cmd) == 0 |
647 | + |
648 | + |
649 | +def service_running(service): |
650 | + """Determine whether a system service is running""" |
651 | + try: |
652 | + output = subprocess.check_output(['service', service, 'status']) |
653 | + except subprocess.CalledProcessError: |
654 | + return False |
655 | + else: |
656 | + if ("start/running" in output or "is running" in output): |
657 | + return True |
658 | + else: |
659 | + return False |
660 | + |
661 | + |
662 | +def adduser(username, password=None, shell='/bin/bash', system_user=False): |
663 | + """Add a user to the system""" |
664 | + try: |
665 | + user_info = pwd.getpwnam(username) |
666 | + log('user {0} already exists!'.format(username)) |
667 | + except KeyError: |
668 | + log('creating user {0}'.format(username)) |
669 | + cmd = ['useradd'] |
670 | + if system_user or password is None: |
671 | + cmd.append('--system') |
672 | + else: |
673 | + cmd.extend([ |
674 | + '--create-home', |
675 | + '--shell', shell, |
676 | + '--password', password, |
677 | + ]) |
678 | + cmd.append(username) |
679 | + subprocess.check_call(cmd) |
680 | + user_info = pwd.getpwnam(username) |
681 | + return user_info |
682 | + |
683 | + |
684 | +def add_user_to_group(username, group): |
685 | + """Add a user to a group""" |
686 | + cmd = [ |
687 | + 'gpasswd', '-a', |
688 | + username, |
689 | + group |
690 | + ] |
691 | + log("Adding user {} to group {}".format(username, group)) |
692 | + subprocess.check_call(cmd) |
693 | + |
694 | + |
695 | +def rsync(from_path, to_path, flags='-r', options=None): |
696 | + """Replicate the contents of a path""" |
697 | + options = options or ['--delete', '--executability'] |
698 | + cmd = ['/usr/bin/rsync', flags] |
699 | + cmd.extend(options) |
700 | + cmd.append(from_path) |
701 | + cmd.append(to_path) |
702 | + log(" ".join(cmd)) |
703 | + return subprocess.check_output(cmd).strip() |
704 | + |
705 | + |
706 | +def symlink(source, destination): |
707 | + """Create a symbolic link""" |
708 | + log("Symlinking {} as {}".format(source, destination)) |
709 | + cmd = [ |
710 | + 'ln', |
711 | + '-sf', |
712 | + source, |
713 | + destination, |
714 | + ] |
715 | + subprocess.check_call(cmd) |
716 | + |
717 | + |
718 | +def mkdir(path, owner='root', group='root', perms=0555, force=False): |
719 | + """Create a directory""" |
720 | + log("Making dir {} {}:{} {:o}".format(path, owner, group, |
721 | + perms)) |
722 | + uid = pwd.getpwnam(owner).pw_uid |
723 | + gid = grp.getgrnam(group).gr_gid |
724 | + realpath = os.path.abspath(path) |
725 | + if os.path.exists(realpath): |
726 | + if force and not os.path.isdir(realpath): |
727 | + log("Removing non-directory file {} prior to mkdir()".format(path)) |
728 | + os.unlink(realpath) |
729 | + else: |
730 | + os.makedirs(realpath, perms) |
731 | + os.chown(realpath, uid, gid) |
732 | + |
733 | + |
734 | +def write_file(path, content, owner='root', group='root', perms=0444): |
735 | + """Create or overwrite a file with the contents of a string""" |
736 | + log("Writing file {} {}:{} {:o}".format(path, owner, group, perms)) |
737 | + uid = pwd.getpwnam(owner).pw_uid |
738 | + gid = grp.getgrnam(group).gr_gid |
739 | + with open(path, 'w') as target: |
740 | + os.fchown(target.fileno(), uid, gid) |
741 | + os.fchmod(target.fileno(), perms) |
742 | + target.write(content) |
743 | + |
744 | + |
745 | +def mount(device, mountpoint, options=None, persist=False): |
746 | + """Mount a filesystem at a particular mountpoint""" |
747 | + cmd_args = ['mount'] |
748 | + if options is not None: |
749 | + cmd_args.extend(['-o', options]) |
750 | + cmd_args.extend([device, mountpoint]) |
751 | + try: |
752 | + subprocess.check_output(cmd_args) |
753 | + except subprocess.CalledProcessError, e: |
754 | + log('Error mounting {} at {}\n{}'.format(device, mountpoint, e.output)) |
755 | + return False |
756 | + if persist: |
757 | + # TODO: update fstab |
758 | + pass |
759 | + return True |
760 | + |
761 | + |
762 | +def umount(mountpoint, persist=False): |
763 | + """Unmount a filesystem""" |
764 | + cmd_args = ['umount', mountpoint] |
765 | + try: |
766 | + subprocess.check_output(cmd_args) |
767 | + except subprocess.CalledProcessError, e: |
768 | + log('Error unmounting {}\n{}'.format(mountpoint, e.output)) |
769 | + return False |
770 | + if persist: |
771 | + # TODO: update fstab |
772 | + pass |
773 | + return True |
774 | + |
775 | + |
776 | +def mounts(): |
777 | + """Get a list of all mounted volumes as [[mountpoint,device],[...]]""" |
778 | + with open('/proc/mounts') as f: |
779 | + # [['/mount/point','/dev/path'],[...]] |
780 | + system_mounts = [m[1::-1] for m in [l.strip().split() |
781 | + for l in f.readlines()]] |
782 | + return system_mounts |
783 | + |
784 | + |
785 | +def file_hash(path): |
786 | + """Generate a md5 hash of the contents of 'path' or None if not found """ |
787 | + if os.path.exists(path): |
788 | + h = hashlib.md5() |
789 | + with open(path, 'r') as source: |
790 | + h.update(source.read()) # IGNORE:E1101 - it does have update |
791 | + return h.hexdigest() |
792 | + else: |
793 | + return None |
794 | + |
795 | + |
796 | +def restart_on_change(restart_map, stopstart=False): |
797 | + """Restart services based on configuration files changing |
798 | + |
799 | + This function is used a decorator, for example |
800 | + |
801 | + @restart_on_change({ |
802 | + '/etc/ceph/ceph.conf': [ 'cinder-api', 'cinder-volume' ] |
803 | + }) |
804 | + def ceph_client_changed(): |
805 | + ... |
806 | + |
807 | + In this example, the cinder-api and cinder-volume services |
808 | + would be restarted if /etc/ceph/ceph.conf is changed by the |
809 | + ceph_client_changed function. |
810 | + """ |
811 | + def wrap(f): |
812 | + def wrapped_f(*args): |
813 | + checksums = {} |
814 | + for path in restart_map: |
815 | + checksums[path] = file_hash(path) |
816 | + f(*args) |
817 | + restarts = [] |
818 | + for path in restart_map: |
819 | + if checksums[path] != file_hash(path): |
820 | + restarts += restart_map[path] |
821 | + services_list = list(OrderedDict.fromkeys(restarts)) |
822 | + if not stopstart: |
823 | + for service_name in services_list: |
824 | + service('restart', service_name) |
825 | + else: |
826 | + for action in ['stop', 'start']: |
827 | + for service_name in services_list: |
828 | + service(action, service_name) |
829 | + return wrapped_f |
830 | + return wrap |
831 | + |
832 | + |
833 | +def lsb_release(): |
834 | + """Return /etc/lsb-release in a dict""" |
835 | + d = {} |
836 | + with open('/etc/lsb-release', 'r') as lsb: |
837 | + for l in lsb: |
838 | + k, v = l.split('=') |
839 | + d[k.strip()] = v.strip() |
840 | + return d |
841 | + |
842 | + |
843 | +def pwgen(length=None): |
844 | + """Generate a random pasword.""" |
845 | + if length is None: |
846 | + length = random.choice(range(35, 45)) |
847 | + alphanumeric_chars = [ |
848 | + l for l in (string.letters + string.digits) |
849 | + if l not in 'l0QD1vAEIOUaeiou'] |
850 | + random_chars = [ |
851 | + random.choice(alphanumeric_chars) for _ in range(length)] |
852 | + return(''.join(random_chars)) |
853 | + |
854 | + |
855 | +def list_nics(nic_type): |
856 | + '''Return a list of nics of given type(s)''' |
857 | + if isinstance(nic_type, basestring): |
858 | + int_types = [nic_type] |
859 | + else: |
860 | + int_types = nic_type |
861 | + interfaces = [] |
862 | + for int_type in int_types: |
863 | + cmd = ['ip', 'addr', 'show', 'label', int_type + '*'] |
864 | + ip_output = subprocess.check_output(cmd).split('\n') |
865 | + ip_output = (line for line in ip_output if line) |
866 | + for line in ip_output: |
867 | + if line.split()[1].startswith(int_type): |
868 | + interfaces.append(line.split()[1].replace(":", "")) |
869 | + return interfaces |
870 | + |
871 | + |
872 | +def set_nic_mtu(nic, mtu): |
873 | + '''Set MTU on a network interface''' |
874 | + cmd = ['ip', 'link', 'set', nic, 'mtu', mtu] |
875 | + subprocess.check_call(cmd) |
876 | + |
877 | + |
878 | +def get_nic_mtu(nic): |
879 | + cmd = ['ip', 'addr', 'show', nic] |
880 | + ip_output = subprocess.check_output(cmd).split('\n') |
881 | + mtu = "" |
882 | + for line in ip_output: |
883 | + words = line.split() |
884 | + if 'mtu' in words: |
885 | + mtu = words[words.index("mtu") + 1] |
886 | + return mtu |
887 | + |
888 | + |
889 | +def get_nic_hwaddr(nic): |
890 | + cmd = ['ip', '-o', '-0', 'addr', 'show', nic] |
891 | + ip_output = subprocess.check_output(cmd) |
892 | + hwaddr = "" |
893 | + words = ip_output.split() |
894 | + if 'link/ether' in words: |
895 | + hwaddr = words[words.index('link/ether') + 1] |
896 | + return hwaddr |
897 | |
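The `restart_on_change` decorator above checksums each watched file before the wrapped hook runs, runs the hook, then restarts only the services whose files actually changed (de-duplicating with `OrderedDict.fromkeys` to preserve order). A minimal self-contained sketch of that same pattern — `fake_service` and the temporary file are illustrative stand-ins, not part of charmhelpers:

```python
import hashlib
import os
import tempfile
from collections import OrderedDict

restarted = []


def fake_service(action, name):
    # Stand-in for charmhelpers' service(); just records the call.
    restarted.append((action, name))


def file_hash(path):
    if os.path.exists(path):
        with open(path, 'rb') as f:
            return hashlib.md5(f.read()).hexdigest()
    return None


def restart_on_change(restart_map):
    def wrap(f):
        def wrapped_f(*args):
            # Checksum every watched file before running the hook...
            checksums = {p: file_hash(p) for p in restart_map}
            f(*args)
            # ...then restart services whose files changed,
            # de-duplicated while preserving order.
            restarts = []
            for p in restart_map:
                if checksums[p] != file_hash(p):
                    restarts += restart_map[p]
            for name in OrderedDict.fromkeys(restarts):
                fake_service('restart', name)
        return wrapped_f
    return wrap


conf = tempfile.NamedTemporaryFile(delete=False)
conf.close()


@restart_on_change({conf.name: ['rsyslog']})
def hook():
    with open(conf.name, 'w') as f:
        f.write('new contents')


hook()
print(restarted)  # -> [('restart', 'rsyslog')]
os.unlink(conf.name)
```

Because the comparison happens after the hook returns, a hook that rewrites a config file with identical contents triggers no restart at all.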
898 | === added directory 'hooks/charmhelpers/fetch' |
899 | === added file 'hooks/charmhelpers/fetch/__init__.py' |
900 | --- hooks/charmhelpers/fetch/__init__.py 1970-01-01 00:00:00 +0000 |
901 | +++ hooks/charmhelpers/fetch/__init__.py 2014-05-01 19:18:03 +0000 |
902 | @@ -0,0 +1,308 @@ |
903 | +import importlib |
904 | +from yaml import safe_load |
905 | +from charmhelpers.core.host import ( |
906 | + lsb_release |
907 | +) |
908 | +from urlparse import ( |
909 | + urlparse, |
910 | + urlunparse, |
911 | +) |
912 | +import subprocess |
913 | +from charmhelpers.core.hookenv import ( |
914 | + config, |
915 | + log, |
916 | +) |
917 | +import apt_pkg |
918 | +import os |
919 | + |
920 | +CLOUD_ARCHIVE = """# Ubuntu Cloud Archive |
921 | +deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main |
922 | +""" |
923 | +PROPOSED_POCKET = """# Proposed |
924 | +deb http://archive.ubuntu.com/ubuntu {}-proposed main universe multiverse restricted |
925 | +""" |
926 | +CLOUD_ARCHIVE_POCKETS = { |
927 | + # Folsom |
928 | + 'folsom': 'precise-updates/folsom', |
929 | + 'precise-folsom': 'precise-updates/folsom', |
930 | + 'precise-folsom/updates': 'precise-updates/folsom', |
931 | + 'precise-updates/folsom': 'precise-updates/folsom', |
932 | + 'folsom/proposed': 'precise-proposed/folsom', |
933 | + 'precise-folsom/proposed': 'precise-proposed/folsom', |
934 | + 'precise-proposed/folsom': 'precise-proposed/folsom', |
935 | + # Grizzly |
936 | + 'grizzly': 'precise-updates/grizzly', |
937 | + 'precise-grizzly': 'precise-updates/grizzly', |
938 | + 'precise-grizzly/updates': 'precise-updates/grizzly', |
939 | + 'precise-updates/grizzly': 'precise-updates/grizzly', |
940 | + 'grizzly/proposed': 'precise-proposed/grizzly', |
941 | + 'precise-grizzly/proposed': 'precise-proposed/grizzly', |
942 | + 'precise-proposed/grizzly': 'precise-proposed/grizzly', |
943 | + # Havana |
944 | + 'havana': 'precise-updates/havana', |
945 | + 'precise-havana': 'precise-updates/havana', |
946 | + 'precise-havana/updates': 'precise-updates/havana', |
947 | + 'precise-updates/havana': 'precise-updates/havana', |
948 | + 'havana/proposed': 'precise-proposed/havana', |
949 | + 'precise-havana/proposed': 'precise-proposed/havana', |
950 | + 'precise-proposed/havana': 'precise-proposed/havana', |
951 | + # Icehouse |
952 | + 'icehouse': 'precise-updates/icehouse', |
953 | + 'precise-icehouse': 'precise-updates/icehouse', |
954 | + 'precise-icehouse/updates': 'precise-updates/icehouse', |
955 | + 'precise-updates/icehouse': 'precise-updates/icehouse', |
956 | + 'icehouse/proposed': 'precise-proposed/icehouse', |
957 | + 'precise-icehouse/proposed': 'precise-proposed/icehouse', |
958 | + 'precise-proposed/icehouse': 'precise-proposed/icehouse', |
959 | +} |
960 | + |
961 | + |
962 | +def filter_installed_packages(packages): |
963 | + """Returns a list of packages that require installation""" |
964 | + apt_pkg.init() |
965 | + cache = apt_pkg.Cache() |
966 | + _pkgs = [] |
967 | + for package in packages: |
968 | + try: |
969 | + p = cache[package] |
970 | + p.current_ver or _pkgs.append(package) |
971 | + except KeyError: |
972 | + log('Package {} has no installation candidate.'.format(package), |
973 | + level='WARNING') |
974 | + _pkgs.append(package) |
975 | + return _pkgs |
976 | + |
977 | + |
978 | +def apt_install(packages, options=None, fatal=False): |
979 | + """Install one or more packages""" |
980 | + if options is None: |
981 | + options = ['--option=Dpkg::Options::=--force-confold'] |
982 | + |
983 | + cmd = ['apt-get', '--assume-yes'] |
984 | + cmd.extend(options) |
985 | + cmd.append('install') |
986 | + if isinstance(packages, basestring): |
987 | + cmd.append(packages) |
988 | + else: |
989 | + cmd.extend(packages) |
990 | + log("Installing {} with options: {}".format(packages, |
991 | + options)) |
992 | + env = os.environ.copy() |
993 | + if 'DEBIAN_FRONTEND' not in env: |
994 | + env['DEBIAN_FRONTEND'] = 'noninteractive' |
995 | + |
996 | + if fatal: |
997 | + subprocess.check_call(cmd, env=env) |
998 | + else: |
999 | + subprocess.call(cmd, env=env) |
1000 | + |
1001 | + |
1002 | +def apt_upgrade(options=None, fatal=False, dist=False): |
1003 | + """Upgrade all packages""" |
1004 | + if options is None: |
1005 | + options = ['--option=Dpkg::Options::=--force-confold'] |
1006 | + |
1007 | + cmd = ['apt-get', '--assume-yes'] |
1008 | + cmd.extend(options) |
1009 | + if dist: |
1010 | + cmd.append('dist-upgrade') |
1011 | + else: |
1012 | + cmd.append('upgrade') |
1013 | + log("Upgrading with options: {}".format(options)) |
1014 | + |
1015 | + env = os.environ.copy() |
1016 | + if 'DEBIAN_FRONTEND' not in env: |
1017 | + env['DEBIAN_FRONTEND'] = 'noninteractive' |
1018 | + |
1019 | + if fatal: |
1020 | + subprocess.check_call(cmd, env=env) |
1021 | + else: |
1022 | + subprocess.call(cmd, env=env) |
1023 | + |
1024 | + |
1025 | +def apt_update(fatal=False): |
1026 | + """Update local apt cache""" |
1027 | + cmd = ['apt-get', 'update'] |
1028 | + if fatal: |
1029 | + subprocess.check_call(cmd) |
1030 | + else: |
1031 | + subprocess.call(cmd) |
1032 | + |
1033 | + |
1034 | +def apt_purge(packages, fatal=False): |
1035 | + """Purge one or more packages""" |
1036 | + cmd = ['apt-get', '--assume-yes', 'purge'] |
1037 | + if isinstance(packages, basestring): |
1038 | + cmd.append(packages) |
1039 | + else: |
1040 | + cmd.extend(packages) |
1041 | + log("Purging {}".format(packages)) |
1042 | + if fatal: |
1043 | + subprocess.check_call(cmd) |
1044 | + else: |
1045 | + subprocess.call(cmd) |
1046 | + |
1047 | + |
1048 | +def apt_hold(packages, fatal=False): |
1049 | + """Hold one or more packages""" |
1050 | + cmd = ['apt-mark', 'hold'] |
1051 | + if isinstance(packages, basestring): |
1052 | + cmd.append(packages) |
1053 | + else: |
1054 | + cmd.extend(packages) |
1055 | + log("Holding {}".format(packages)) |
1056 | + if fatal: |
1057 | + subprocess.check_call(cmd) |
1058 | + else: |
1059 | + subprocess.call(cmd) |
1060 | + |
1061 | + |
1062 | +def add_source(source, key=None): |
1063 | + if source is None: |
1064 | + log('Source is not present. Skipping') |
1065 | + return |
1066 | + |
1067 | + if (source.startswith('ppa:') or |
1068 | + source.startswith('http') or |
1069 | + source.startswith('deb ') or |
1070 | + source.startswith('cloud-archive:')): |
1071 | + subprocess.check_call(['add-apt-repository', '--yes', source]) |
1072 | + elif source.startswith('cloud:'): |
1073 | + apt_install(filter_installed_packages(['ubuntu-cloud-keyring']), |
1074 | + fatal=True) |
1075 | + pocket = source.split(':')[-1] |
1076 | + if pocket not in CLOUD_ARCHIVE_POCKETS: |
1077 | + raise SourceConfigError( |
1078 | + 'Unsupported cloud: source option %s' % |
1079 | + pocket) |
1080 | + actual_pocket = CLOUD_ARCHIVE_POCKETS[pocket] |
1081 | + with open('/etc/apt/sources.list.d/cloud-archive.list', 'w') as apt: |
1082 | + apt.write(CLOUD_ARCHIVE.format(actual_pocket)) |
1083 | + elif source == 'proposed': |
1084 | + release = lsb_release()['DISTRIB_CODENAME'] |
1085 | + with open('/etc/apt/sources.list.d/proposed.list', 'w') as apt: |
1086 | + apt.write(PROPOSED_POCKET.format(release)) |
1087 | + if key: |
1088 | + subprocess.check_call(['apt-key', 'adv', '--keyserver', |
1089 | + 'keyserver.ubuntu.com', '--recv', |
1090 | + key]) |
1091 | + |
1092 | + |
1093 | +class SourceConfigError(Exception): |
1094 | + pass |
1095 | + |
1096 | + |
1097 | +def configure_sources(update=False, |
1098 | + sources_var='install_sources', |
1099 | + keys_var='install_keys'): |
1100 | + """ |
1101 | + Configure multiple sources from charm configuration |
1102 | + |
1103 | + Example config: |
1104 | + install_sources: |
1105 | + - "ppa:foo" |
1106 | + - "http://example.com/repo precise main" |
1107 | + install_keys: |
1108 | + - null |
1109 | + - "a1b2c3d4" |
1110 | + |
1111 | + Note that 'null' (a.k.a. None) should not be quoted. |
1112 | + """ |
1113 | + sources = safe_load(config(sources_var)) |
1114 | + keys = config(keys_var) |
1115 | + if keys is not None: |
1116 | + keys = safe_load(keys) |
1117 | + if isinstance(sources, basestring) and ( |
1118 | + keys is None or isinstance(keys, basestring)): |
1119 | + add_source(sources, keys) |
1120 | + else: |
1121 | + if not len(sources) == len(keys): |
1122 | + msg = 'Install sources and keys lists are different lengths' |
1123 | + raise SourceConfigError(msg) |
1124 | + for src_num in range(len(sources)): |
1125 | + add_source(sources[src_num], keys[src_num]) |
1126 | + if update: |
1127 | + apt_update(fatal=True) |
1128 | + |
1129 | +# The order of this list is very important. Handlers should be listed from |
1130 | +# least- to most-specific URL matching. |
1131 | +FETCH_HANDLERS = ( |
1132 | + 'charmhelpers.fetch.archiveurl.ArchiveUrlFetchHandler', |
1133 | + 'charmhelpers.fetch.bzrurl.BzrUrlFetchHandler', |
1134 | +) |
1135 | + |
1136 | + |
1137 | +class UnhandledSource(Exception): |
1138 | + pass |
1139 | + |
1140 | + |
1141 | +def install_remote(source): |
1142 | + """ |
1143 | + Install a file tree from a remote source |
1144 | + |
1145 | + The specified source should be a url of the form: |
1146 | + scheme://[host]/path[#[option=value][&...]] |
1147 | + |
1148 | + Schemes supported are based on this module's submodules |
1149 | + Options supported are submodule-specific""" |
1150 | + # We ONLY check for True here because can_handle may return a string |
1151 | + # explaining why it can't handle a given source. |
1152 | + handlers = [h for h in plugins() if h.can_handle(source) is True] |
1153 | + installed_to = None |
1154 | + for handler in handlers: |
1155 | + try: |
1156 | + installed_to = handler.install(source) |
1157 | + except UnhandledSource: |
1158 | + pass |
1159 | + if not installed_to: |
1160 | + raise UnhandledSource("No handler found for source {}".format(source)) |
1161 | + return installed_to |
1162 | + |
1163 | + |
1164 | +def install_from_config(config_var_name): |
1165 | + charm_config = config() |
1166 | + source = charm_config[config_var_name] |
1167 | + return install_remote(source) |
1168 | + |
1169 | + |
1170 | +class BaseFetchHandler(object): |
1171 | + |
1172 | + """Base class for FetchHandler implementations in fetch plugins""" |
1173 | + |
1174 | + def can_handle(self, source): |
1175 | + """Returns True if the source can be handled. Otherwise returns |
1176 | + a string explaining why it cannot""" |
1177 | + return "Wrong source type" |
1178 | + |
1179 | + def install(self, source): |
1180 | + """Try to download and unpack the source. Return the path to the |
1181 | + unpacked files or raise UnhandledSource.""" |
1182 | + raise UnhandledSource("Wrong source type {}".format(source)) |
1183 | + |
1184 | + def parse_url(self, url): |
1185 | + return urlparse(url) |
1186 | + |
1187 | + def base_url(self, url): |
1188 | + """Return url without querystring or fragment""" |
1189 | + parts = list(self.parse_url(url)) |
1190 | + parts[4:] = ['' for i in parts[4:]] |
1191 | + return urlunparse(parts) |
1192 | + |
1193 | + |
1194 | +def plugins(fetch_handlers=None): |
1195 | + if not fetch_handlers: |
1196 | + fetch_handlers = FETCH_HANDLERS |
1197 | + plugin_list = [] |
1198 | + for handler_name in fetch_handlers: |
1199 | + package, classname = handler_name.rsplit('.', 1) |
1200 | + try: |
1201 | + handler_class = getattr( |
1202 | + importlib.import_module(package), |
1203 | + classname) |
1204 | + plugin_list.append(handler_class()) |
1205 | + except (ImportError, AttributeError): |
1206 | + # Skip missing plugins so that they can be omitted from |
1207 | + # installation if desired |
1208 | + log("FetchHandler {} not found, skipping plugin".format( |
1209 | + handler_name)) |
1210 | + return plugin_list |
1211 | |
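`add_source` dispatches on the source prefix; for `cloud:` sources it normalizes the many accepted aliases through `CLOUD_ARCHIVE_POCKETS` before writing the apt list file, raising `SourceConfigError` for anything unrecognized. The lookup step in isolation — a sketch with a trimmed alias table (the real map covers folsom through icehouse):

```python
# Trimmed copy of the alias table above.
CLOUD_ARCHIVE_POCKETS = {
    'havana': 'precise-updates/havana',
    'precise-havana': 'precise-updates/havana',
    'havana/proposed': 'precise-proposed/havana',
}

CLOUD_ARCHIVE = ("# Ubuntu Cloud Archive\n"
                 "deb http://ubuntu-cloud.archive.canonical.com/ubuntu {} main\n")


class SourceConfigError(Exception):
    pass


def cloud_archive_line(source):
    """Resolve a 'cloud:' source to the apt stanza add_source would write."""
    pocket = source.split(':')[-1]
    if pocket not in CLOUD_ARCHIVE_POCKETS:
        raise SourceConfigError(
            'Unsupported cloud: source option %s' % pocket)
    return CLOUD_ARCHIVE.format(CLOUD_ARCHIVE_POCKETS[pocket])


print(cloud_archive_line('cloud:havana'))
```

The alias table is why both `cloud:havana` and `cloud:precise-updates/havana` end up producing the same `precise-updates/havana` deb line.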
1212 | === added file 'hooks/charmhelpers/fetch/archiveurl.py' |
1213 | --- hooks/charmhelpers/fetch/archiveurl.py 1970-01-01 00:00:00 +0000 |
1214 | +++ hooks/charmhelpers/fetch/archiveurl.py 2014-05-01 19:18:03 +0000 |
1215 | @@ -0,0 +1,63 @@ |
1216 | +import os |
1217 | +import urllib2 |
1218 | +import urlparse |
1219 | + |
1220 | +from charmhelpers.fetch import ( |
1221 | + BaseFetchHandler, |
1222 | + UnhandledSource |
1223 | +) |
1224 | +from charmhelpers.payload.archive import ( |
1225 | + get_archive_handler, |
1226 | + extract, |
1227 | +) |
1228 | +from charmhelpers.core.host import mkdir |
1229 | + |
1230 | + |
1231 | +class ArchiveUrlFetchHandler(BaseFetchHandler): |
1232 | + """Handler for archives via generic URLs""" |
1233 | + def can_handle(self, source): |
1234 | + url_parts = self.parse_url(source) |
1235 | + if url_parts.scheme not in ('http', 'https', 'ftp', 'file'): |
1236 | + return "Wrong source type" |
1237 | + if get_archive_handler(self.base_url(source)): |
1238 | + return True |
1239 | + return False |
1240 | + |
1241 | + def download(self, source, dest): |
1242 | + # propagate all exceptions |
1243 | + # URLError, OSError, etc |
1244 | + proto, netloc, path, params, query, fragment = urlparse.urlparse(source) |
1245 | + if proto in ('http', 'https'): |
1246 | + auth, barehost = urllib2.splituser(netloc) |
1247 | + if auth is not None: |
1248 | + source = urlparse.urlunparse((proto, barehost, path, params, query, fragment)) |
1249 | + username, password = urllib2.splitpasswd(auth) |
1250 | + passman = urllib2.HTTPPasswordMgrWithDefaultRealm() |
1251 | + # Realm is set to None in add_password to force the username and password |
1252 | + # to be used whatever the realm |
1253 | + passman.add_password(None, source, username, password) |
1254 | + authhandler = urllib2.HTTPBasicAuthHandler(passman) |
1255 | + opener = urllib2.build_opener(authhandler) |
1256 | + urllib2.install_opener(opener) |
1257 | + response = urllib2.urlopen(source) |
1258 | + try: |
1259 | + with open(dest, 'w') as dest_file: |
1260 | + dest_file.write(response.read()) |
1261 | + except Exception as e: |
1262 | + if os.path.isfile(dest): |
1263 | + os.unlink(dest) |
1264 | + raise e |
1265 | + |
1266 | + def install(self, source): |
1267 | + url_parts = self.parse_url(source) |
1268 | + dest_dir = os.path.join(os.environ.get('CHARM_DIR'), 'fetched') |
1269 | + if not os.path.exists(dest_dir): |
1270 | + mkdir(dest_dir, perms=0755) |
1271 | + dld_file = os.path.join(dest_dir, os.path.basename(url_parts.path)) |
1272 | + try: |
1273 | + self.download(source, dld_file) |
1274 | + except urllib2.URLError as e: |
1275 | + raise UnhandledSource(e.reason) |
1276 | + except OSError as e: |
1277 | + raise UnhandledSource(e.strerror) |
1278 | + return extract(dld_file) |
1279 | |
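`download` above relies on the Python 2-only `urllib2.splituser`/`splitpasswd` helpers to pull basic-auth credentials out of the URL before installing an opener. A rough Python 3 equivalent of just the splitting step, using `urllib.parse` — a sketch, not part of the charm:

```python
from urllib.parse import urlparse, urlunparse


def split_auth(source):
    """Return (bare_url, username, password); credentials may be None."""
    parts = urlparse(source)
    if parts.scheme in ('http', 'https') and parts.username is not None:
        # Rebuild the netloc without the user:pass@ prefix.
        barehost = parts.hostname
        if parts.port:
            barehost += ':%d' % parts.port
        bare = urlunparse((parts.scheme, barehost, parts.path,
                           parts.params, parts.query, parts.fragment))
        return bare, parts.username, parts.password
    return source, None, None


url, user, pw = split_auth('https://jorge:secret@example.com/charm.tgz')
print(url, user, pw)
# -> https://example.com/charm.tgz jorge secret
```

As in the handler above, the credentials would then go into a password manager keyed on the bare URL rather than travelling in the URL itself.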
1280 | === added file 'hooks/charmhelpers/fetch/bzrurl.py' |
1281 | --- hooks/charmhelpers/fetch/bzrurl.py 1970-01-01 00:00:00 +0000 |
1282 | +++ hooks/charmhelpers/fetch/bzrurl.py 2014-05-01 19:18:03 +0000 |
1283 | @@ -0,0 +1,49 @@ |
1284 | +import os |
1285 | +from charmhelpers.fetch import ( |
1286 | + BaseFetchHandler, |
1287 | + UnhandledSource |
1288 | +) |
1289 | +from charmhelpers.core.host import mkdir |
1290 | + |
1291 | +try: |
1292 | + from bzrlib.branch import Branch |
1293 | +except ImportError: |
1294 | + from charmhelpers.fetch import apt_install |
1295 | + apt_install("python-bzrlib") |
1296 | + from bzrlib.branch import Branch |
1297 | + |
1298 | + |
1299 | +class BzrUrlFetchHandler(BaseFetchHandler): |
1300 | + """Handler for bazaar branches via generic and lp URLs""" |
1301 | + def can_handle(self, source): |
1302 | + url_parts = self.parse_url(source) |
1303 | + if url_parts.scheme not in ('bzr+ssh', 'lp'): |
1304 | + return False |
1305 | + else: |
1306 | + return True |
1307 | + |
1308 | + def branch(self, source, dest): |
1309 | + url_parts = self.parse_url(source) |
1310 | + # If we use lp:branchname scheme we need to load plugins |
1311 | + if not self.can_handle(source): |
1312 | + raise UnhandledSource("Cannot handle {}".format(source)) |
1313 | + if url_parts.scheme == "lp": |
1314 | + from bzrlib.plugin import load_plugins |
1315 | + load_plugins() |
1316 | + try: |
1317 | + remote_branch = Branch.open(source) |
1318 | + remote_branch.bzrdir.sprout(dest).open_branch() |
1319 | + except Exception as e: |
1320 | + raise e |
1321 | + |
1322 | + def install(self, source): |
1323 | + url_parts = self.parse_url(source) |
1324 | + branch_name = url_parts.path.strip("/").split("/")[-1] |
1325 | + dest_dir = os.path.join(os.environ.get('CHARM_DIR'), "fetched", branch_name) |
1326 | + if not os.path.exists(dest_dir): |
1327 | + mkdir(dest_dir, perms=0755) |
1328 | + try: |
1329 | + self.branch(source, dest_dir) |
1330 | + except OSError as e: |
1331 | + raise UnhandledSource(e.strerror) |
1332 | + return dest_dir |
1333 | |
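`install` above derives the fetch destination from the last path segment of the branch URL. That extraction in isolation (a sketch):

```python
from urllib.parse import urlparse


def branch_name(source):
    """Last path segment of a branch URL, used to name the fetch directory."""
    return urlparse(source).path.strip("/").split("/")[-1]


print(branch_name('lp:~niedbalski/charms/precise/rsyslog/lp-1310793'))
# -> lp-1310793
```

Both `lp:` shortcuts and `bzr+ssh://` URLs reduce to a single directory name this way, which is why the handler only has to `mkdir` one path under `$CHARM_DIR/fetched`.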
1334 | === added symlink 'hooks/config-changed' |
1335 | === target is u'hooks.py' |
1336 | === removed file 'hooks/config-changed' |
1337 | --- hooks/config-changed 2014-03-17 10:34:02 +0000 |
1338 | +++ hooks/config-changed 1970-01-01 00:00:00 +0000 |
1339 | @@ -1,39 +0,0 @@ |
1340 | -#!/usr/bin/env python |
1341 | - |
1342 | -import re |
1343 | -import subprocess |
1344 | -import json |
1345 | - |
1346 | - |
1347 | -def juju_log(message=None): |
1348 | - return (subprocess.call(['juju-log', str(message)]) == 0) |
1349 | - |
1350 | -log_rotate_conf = '/etc/logrotate.d/rsyslog' |
1351 | - |
1352 | -config_cmd_line = ['config-get', '--format=json'] |
1353 | - |
1354 | -syslog_rotate_interval = str(json.loads(subprocess.check_output( |
1355 | - config_cmd_line + ['syslog_rotate']))) |
1356 | -messages_rotate_interval = str(json.loads(subprocess.check_output( |
1357 | - config_cmd_line + ['messages_rotate']))) |
1358 | - |
1359 | -with open(log_rotate_conf, mode='r') as config_file: |
1360 | - config = config_file.readlines() |
1361 | - config_file.close() |
1362 | - |
1363 | - param_index = config.index('/var/log/syslog\n') |
1364 | - replace_pattern = syslog_rotate_interval + '\n' |
1365 | - config[param_index+2] = re.sub(r'\d*\n$', |
1366 | - replace_pattern, config[param_index+2]) |
1367 | - juju_log("syslog_rotate_interval set to %s" % syslog_rotate_interval) |
1368 | - |
1369 | - param_index = config.index('/var/log/messages\n') |
1370 | - replace_pattern = messages_rotate_interval + '\n' |
1371 | - config[param_index+2] = re.sub(r'\d*\n$', |
1372 | - replace_pattern, config[param_index+2]) |
1373 | - juju_log("messages_rotate_interval set to %s" % messages_rotate_interval) |
1374 | - |
1375 | -with open(log_rotate_conf, mode='w') as config_file: |
1376 | - for line in config: |
1377 | - config_file.write(line) |
1378 | - config_file.close |
1379 | |
1380 | === added file 'hooks/hooks.py' |
1381 | --- hooks/hooks.py 1970-01-01 00:00:00 +0000 |
1382 | +++ hooks/hooks.py 2014-05-01 19:18:03 +0000 |
1383 | @@ -0,0 +1,125 @@ |
1384 | +#!/usr/bin/env python |
1385 | + |
1386 | +import os |
1387 | +import sys |
1388 | + |
1389 | +_HERE = os.path.abspath(os.path.dirname(__file__)) |
1390 | + |
1391 | +sys.path.insert(0, os.path.join(_HERE, 'charmhelpers')) |
1392 | + |
1393 | +from charmhelpers.core.host import ( |
1394 | + service_start, |
1395 | + service_stop, |
1396 | + service_restart, |
1397 | +) |
1398 | + |
1399 | +from charmhelpers.core.hookenv import ( |
1400 | + Hooks, |
1401 | + close_port, |
1402 | + open_port, |
1403 | + config as config_get, |
1404 | + log as juju_log, |
1405 | + charm_dir, |
1406 | +) |
1407 | + |
1408 | +from charmhelpers.fetch import ( |
1409 | + apt_install |
1410 | +) |
1411 | + |
1412 | +DEFAULT_RSYSLOG_PORT = 514 |
1413 | +DEFAULT_RSYSLOG_PATH = os.path.join(os.path.sep, 'etc', 'rsyslog.d') |
1414 | +DEFAULT_LOGROTATE_PATH = os.path.join(os.path.sep, 'etc', 'logrotate.d', |
1415 | + 'rsyslog') |
1416 | +DEFAULT_CONFIGURATION_PARAMS = ( |
1417 | + 'syslog_rotate', |
1418 | + 'messages_rotate', |
1419 | + 'nova_logs' |
1420 | +) |
1421 | + |
1422 | +required_debian_pkgs = [ |
1423 | + 'rsyslog', |
1424 | + 'python-jinja2', |
1425 | +] |
1426 | + |
1427 | +hooks = Hooks() |
1428 | + |
1429 | +try: |
1430 | + from jinja2 import Environment, FileSystemLoader |
1431 | +except ImportError: |
1432 | + apt_install(['python-jinja2'], options=['-f']) |
1433 | + from jinja2 import Environment, FileSystemLoader |
1434 | + |
1435 | + |
1436 | +def get_template_dir(): |
1437 | + return os.path.join(charm_dir(), 'templates') |
1438 | + |
1439 | + |
1440 | +def get_config_template(name): |
1441 | + template_env = Environment(loader=FileSystemLoader(get_template_dir())) |
1442 | + return template_env.get_template(name) |
1443 | + |
1444 | + |
1445 | +def get_changed_config(): |
1446 | + changed = {} |
1447 | + for config_key in DEFAULT_CONFIGURATION_PARAMS: |
1448 | + changed[config_key] = config_get(config_key) |
1449 | + juju_log("Configuration key:%s set to value: %s" % |
1450 | + (config_key, changed[config_key])) |
1451 | + return changed |
1452 | + |
1453 | + |
1454 | +@hooks.hook() |
1455 | +def install(): |
1456 | + apt_install(required_debian_pkgs, options=['-f']) |
1457 | + |
1458 | + |
1459 | +@hooks.hook("upgrade-charm") |
1460 | +def upgrade_charm(): |
1461 | + install() |
1462 | + |
1463 | + |
1464 | +@hooks.hook("start") |
1465 | +def start(): |
1466 | + service_start("rsyslog") |
1467 | + open_port(DEFAULT_RSYSLOG_PORT) |
1468 | + open_port(DEFAULT_RSYSLOG_PORT, protocol="UDP") |
1469 | + |
1470 | + |
1471 | +@hooks.hook("stop") |
1472 | +def stop(): |
1473 | + service_stop("rsyslog") |
1474 | + close_port(DEFAULT_RSYSLOG_PORT) |
1475 | + close_port(DEFAULT_RSYSLOG_PORT, protocol="UDP") |
1476 | + |
1477 | + |
1478 | +def update_logrotate_config(**params): |
1479 | + template_name = "rsyslog.conf" |
1480 | + template = get_config_template(template_name) |
1481 | + |
1482 | + with open(DEFAULT_LOGROTATE_PATH, 'w') as logrotate_config: |
1483 | + logrotate_config.write(template.render(**params)) |
1484 | + |
1485 | + |
1486 | +def update_rsyslog_config(**params): |
1487 | + template_name = "60-aggregator.conf" |
1488 | + template = get_config_template(template_name) |
1489 | + |
1490 | + with open(os.path.join(DEFAULT_RSYSLOG_PATH, |
1491 | + template_name), 'w') as rsyslog_config: |
1492 | + rsyslog_config.write(template.render(**params)) |
1493 | + |
1494 | + |
1495 | +@hooks.hook("config-changed") |
1496 | +def config_changed(): |
1497 | + config = get_changed_config() |
1498 | + |
1499 | + update_logrotate_config(**config) |
1500 | + update_rsyslog_config(**config) |
1501 | + |
1502 | + juju_log("Configuration changed, restart required") |
1503 | + |
1504 | + # configuration changed, restart rsyslog |
1505 | + service_restart("rsyslog") |
1506 | + |
1507 | +if __name__ == "__main__": |
1508 | + hooks.execute(sys.argv) |
1509 | |
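The rewrite replaces four shell hook scripts with symlinks to a single `hooks.py`, dispatched through charmhelpers' `Hooks` registry: each `@hooks.hook(...)` decoration maps a hook name to a function, and `hooks.execute(sys.argv)` routes on the invoked script's basename. A minimal registry of the same shape — a sketch, not charmhelpers' actual implementation:

```python
import os


class Hooks(object):
    """Decorator registry: hook name -> handler function."""

    def __init__(self):
        self._hooks = {}

    def hook(self, *names):
        def wrapper(f):
            # Register under the explicit names, or the function's own
            # name when none are given (as with install() above).
            for name in names or (f.__name__,):
                self._hooks[name] = f
            return f
        return wrapper

    def execute(self, args):
        # Juju invokes the symlink, so argv[0]'s basename is the hook name.
        self._hooks[os.path.basename(args[0])]()


hooks = Hooks()
calls = []


@hooks.hook("config-changed")
def config_changed():
    calls.append("config-changed")


hooks.execute(["hooks/config-changed"])
print(calls)  # -> ['config-changed']
```

This is what makes the `=== added symlink` entries in this diff work: every hook file is the same script, and only the name it was invoked under selects the handler.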
1510 | === added symlink 'hooks/install' |
1511 | === target is u'hooks.py' |
1512 | === removed file 'hooks/install' |
1513 | --- hooks/install 2012-04-17 09:35:17 +0000 |
1514 | +++ hooks/install 1970-01-01 00:00:00 +0000 |
1515 | @@ -1,8 +0,0 @@ |
1516 | -#!/bin/sh |
1517 | - |
1518 | -set -ue |
1519 | - |
1520 | -apt-get install -y rsyslog |
1521 | - |
1522 | -install -m 0644 -o root -g root files/60-aggregator.conf /etc/rsyslog.d/60-aggregator.conf |
1523 | -service rsyslog restart |
1524 | |
1525 | === added symlink 'hooks/start' |
1526 | === target is u'hooks.py' |
1527 | === removed file 'hooks/start' |
1528 | --- hooks/start 2012-04-17 08:06:34 +0000 |
1529 | +++ hooks/start 1970-01-01 00:00:00 +0000 |
1530 | @@ -1,8 +0,0 @@ |
1531 | -#!/bin/sh |
1532 | - |
1533 | -set -ue |
1534 | - |
1535 | -service rsyslog start || : |
1536 | - |
1537 | -open-port 514/tcp |
1538 | -open-port 514/udp |
1539 | |
1540 | === added symlink 'hooks/stop' |
1541 | === target is u'hooks.py' |
1542 | === removed file 'hooks/stop' |
1543 | --- hooks/stop 2012-09-11 15:59:53 +0000 |
1544 | +++ hooks/stop 1970-01-01 00:00:00 +0000 |
1545 | @@ -1,4 +0,0 @@ |
1546 | -#!/bin/bash |
1547 | -service rsyslog stop |
1548 | -close-port 514/tcp |
1549 | -close-port 514/udp |
1550 | |
1551 | === added symlink 'hooks/upgrade-charm' |
1552 | === target is u'hooks.py' |
1553 | === removed file 'hooks/upgrade-charm' |
1554 | --- hooks/upgrade-charm 2012-04-17 09:35:17 +0000 |
1555 | +++ hooks/upgrade-charm 1970-01-01 00:00:00 +0000 |
1556 | @@ -1,2 +0,0 @@ |
1557 | -#!/bin/sh |
1558 | -hooks/install |
1559 | |
1560 | === modified file 'revision' |
1561 | --- revision 2012-04-17 09:35:17 +0000 |
1562 | +++ revision 2014-05-01 19:18:03 +0000 |
1563 | @@ -1,1 +1,1 @@ |
1564 | -9 |
1565 | +14 |
1566 | |
1567 | === added file 'setup.cfg' |
1568 | --- setup.cfg 1970-01-01 00:00:00 +0000 |
1569 | +++ setup.cfg 2014-05-01 19:18:03 +0000 |
1570 | @@ -0,0 +1,5 @@ |
1571 | +[nosetests] |
1572 | +verbosity=2 |
1573 | +with-coverage=1 |
1574 | +cover-erase=1 |
1575 | +cover-package=hooks.hooks |
1576 | |
1577 | === added directory 'templates' |
1578 | === added file 'templates/60-aggregator.conf' |
1579 | --- templates/60-aggregator.conf 1970-01-01 00:00:00 +0000 |
1580 | +++ templates/60-aggregator.conf 2014-05-01 19:18:03 +0000 |
1581 | @@ -0,0 +1,7 @@ |
1582 | +# Remote listening for aggregation |
1583 | +$ModLoad imudp |
1584 | +$UDPServerRun 514 |
1585 | + |
1586 | +{% if nova_logs == True %} |
1587 | + {% include 'nova-logging.conf' %} |
1588 | +{% endif %} |
1589 | |
1590 | === added file 'templates/nova-logging.conf' |
1591 | --- templates/nova-logging.conf 1970-01-01 00:00:00 +0000 |
1592 | +++ templates/nova-logging.conf 2014-05-01 19:18:03 +0000 |
1593 | @@ -0,0 +1,12 @@ |
1594 | +# Create logging templates for nova |
1595 | +$template NovaFile,"/var/log/rsyslog/%HOSTNAME%/nova.log" |
1596 | +$template NovaAll,"/var/log/rsyslog/nova.log" |
1597 | + |
1598 | +# Log everything else to syslog.log |
1599 | +$template DynFile,"/var/log/rsyslog/%HOSTNAME%/syslog.log" |
1600 | +*.* ?DynFile |
1601 | + |
1602 | +# Log various openstack components to their own individual file |
1603 | +local0.* ?NovaFile |
1604 | +local0.* ?NovaAll |
1605 | +& ~ |
1606 | |
1607 | === added file 'templates/rsyslog.conf' |
1608 | --- templates/rsyslog.conf 1970-01-01 00:00:00 +0000 |
1609 | +++ templates/rsyslog.conf 2014-05-01 19:18:03 +0000 |
1610 | @@ -0,0 +1,37 @@ |
1611 | +/var/log/syslog |
1612 | +{ |
1613 | + rotate {{syslog_rotate}} |
1614 | + daily |
1615 | + missingok |
1616 | + notifempty |
1617 | + delaycompress |
1618 | + compress |
1619 | + postrotate |
1620 | + reload rsyslog >/dev/null 2>&1 || true |
1621 | + endscript |
1622 | +} |
1623 | + |
1624 | +/var/log/mail.info |
1625 | +/var/log/mail.warn |
1626 | +/var/log/mail.err |
1627 | +/var/log/mail.log |
1628 | +/var/log/daemon.log |
1629 | +/var/log/kern.log |
1630 | +/var/log/auth.log |
1631 | +/var/log/user.log |
1632 | +/var/log/lpr.log |
1633 | +/var/log/cron.log |
1634 | +/var/log/debug |
1635 | +/var/log/messages |
1636 | +{ |
1637 | + rotate {{messages_rotate}} |
1638 | + weekly |
1639 | + missingok |
1640 | + notifempty |
1641 | + compress |
1642 | + delaycompress |
1643 | + sharedscripts |
1644 | + postrotate |
1645 | + reload rsyslog >/dev/null 2>&1 || true |
1646 | + endscript |
1647 | +} |
1648 | |
1649 | === added file 'test_requirements.txt' |
1650 | --- test_requirements.txt 1970-01-01 00:00:00 +0000 |
1651 | +++ test_requirements.txt 2014-05-01 19:18:03 +0000 |
1652 | @@ -0,0 +1,6 @@ |
1653 | +Jinja2 |
1654 | +PyYAML |
1655 | +coverage |
1656 | +mock==1.0.1 |
1657 | +nose==1.3.1 |
1658 | +https://launchpad.net/python-apt/main/0.7.8/+download/python-apt-0.8.5.tar.gz |
1659 | |
1660 | === added directory 'unit_tests' |
1661 | === added file 'unit_tests/test_hooks.py' |
1662 | --- unit_tests/test_hooks.py 1970-01-01 00:00:00 +0000 |
1663 | +++ unit_tests/test_hooks.py 2014-05-01 19:18:03 +0000 |
1664 | @@ -0,0 +1,149 @@ |
1665 | +#!/usr/bin/env python |
1666 | +# -*- coding: utf-8 -*- |
1667 | + |
1668 | +__author__ = 'Jorge Niedbalski R. <jorge.niedbalski@canonical.com>' |
1669 | + |
1670 | +import os |
1671 | + |
1672 | +_HERE = os.path.abspath(os.path.dirname(__file__)) |
1673 | + |
1674 | +try: |
1675 | + import unittest |
1676 | + import mock |
1677 | +except ImportError as ex: |
1678 | + raise ImportError("Please install unittest and mock modules") |
1679 | + |
1680 | +import hooks |
1681 | + |
1682 | +TO_PATCH = [ |
1683 | + "apt_install", |
1684 | + "install", |
1685 | + "service_restart", |
1686 | + "service_start", |
1687 | + "open_port", |
1688 | + "service_stop", |
1689 | + "close_port", |
1690 | + "juju_log", |
1691 | + "charm_dir", |
1692 | + "config_get" |
1693 | +] |
1694 | + |
1695 | + |
1696 | +class HooksTestCase(unittest.TestCase): |
1697 | + |
1698 | + def setUp(self): |
1699 | + unittest.TestCase.setUp(self) |
1700 | + self.patch_all() |
1701 | + |
1702 | + self.juju_log.return_value = True |
1703 | + self.apt_install.return_value = True |
1704 | + self.charm_dir.return_value = os.path.join(_HERE, '..') |
1705 | + |
1706 | + def patch(self, method): |
1707 | + _m = mock.patch.object(hooks, method) |
1708 | + _mock = _m.start() |
1709 | + self.addCleanup(_m.stop) |
1710 | + return _mock |
1711 | + |
1712 | + def patch_all(self): |
1713 | + for method in TO_PATCH: |
1714 | + setattr(self, method, self.patch(method)) |
1715 | + |
1716 | + def test_install_hook(self): |
717 | + """Check if the install hook is correctly executed |
1718 | + """ |
1719 | + hooks.hooks.execute(['install']) |
1720 | + self.apt_install.assert_called_with(['rsyslog', 'python-jinja2'], |
1721 | + options=["-f"]) |
1722 | + |
1723 | + def test_upgrade_charm(self): |
724 | + """Check if the upgrade-charm hook is correctly executed |
1725 | + """ |
1726 | + hooks.hooks.execute(['upgrade-charm']) |
1727 | + self.install.assert_called() |
1728 | + |
1729 | + def test_start_charm(self): |
1730 | + """Check if start hook is correctly executed |
1731 | + """ |
1732 | + hooks.hooks.execute(['start']) |
1733 | + self.service_start.assert_called_with("rsyslog") |
1734 | + |
1735 | + expected = [mock.call(514), |
1736 | + mock.call(514, protocol="UDP")] |
1737 | + |
1738 | + self.assertEquals(self.open_port.call_args_list, |
1739 | + expected) |
1740 | + |
1741 | + def test_stop_charm(self): |
742 | + """Check if the stop hook is correctly executed |
1743 | + """ |
1744 | + hooks.hooks.execute(['stop']) |
1745 | + |
1746 | + self.service_stop.assert_called_with("rsyslog") |
1747 | + |
1748 | + expected = [mock.call(514), |
1749 | + mock.call(514, protocol="UDP")] |
1750 | + |
1751 | + self.assertTrue(self.close_port.call_args_list == expected) |
1752 | + |
+    def _tpl_has_value(self, template, value):
+        return hooks.get_config_template(template).render(
+            **hooks.get_changed_config()).find(value) > 0
+
+    def test_config_nova_disabled(self):
+        """Check if configuration templates are created correctly \
+        (nova_logs: False)
+        """
+        self.apt_install.return_value = True
+        self.charm_dir.return_value = os.path.join(_HERE, '..')
+
+        config = {
+            "syslog_rotate": 7,
+            "messages_rotate": 4,
+            "nova_logs": False
+        }
+
+        def config_side(*args, **kwargs):
+            return config[args[0]]
+
+        self.config_get.side_effect = config_side
+
+        self.assertFalse(self._tpl_has_value("60-aggregator.conf", "nova"))
+        self.assertTrue(self._tpl_has_value("rsyslog.conf", "7"))
+        self.assertTrue(self._tpl_has_value("rsyslog.conf", "4"))
+    def test_config_nova_enabled(self):
+        """Check if configuration templates are created correctly \
+        (nova_logs: True)
+        """
+        config = {
+            "syslog_rotate": 7,
+            "messages_rotate": 4,
+            "nova_logs": True
+        }
+
+        def config_side(*args, **kwargs):
+            return config[args[0]]
+
+        self.config_get.side_effect = config_side
+        self.assertTrue(self._tpl_has_value("60-aggregator.conf", "nova"))
+    def test_config_changed(self):
+        """Check if config-changed is executed correctly"""
+        _open = mock.mock_open()
+
+        with mock.patch('__builtin__.open', _open, create=True):
+            hooks.config_changed()
+
+        expected = [
+            mock.call(os.path.join(hooks.get_template_dir(),
+                                   '60-aggregator.conf'), 'rb'),
+            mock.call(os.path.join(hooks.DEFAULT_RSYSLOG_PATH,
+                                   '60-aggregator.conf'), 'w'),
+            mock.call(os.path.join(hooks.get_template_dir(),
+                                   'rsyslog.conf'), 'rb'),
+            mock.call(hooks.DEFAULT_LOGROTATE_PATH, 'w')
+        ]
+
+        self.assertEquals(sorted(_open.call_args_list), sorted(expected))
+        self.service_restart.assert_called_once_with("rsyslog")
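The setUp/patch/patch_all pattern in the diff above is a common way to stub out a whole list of module-level helpers before each test. A minimal standalone sketch of the same idea, using `os` and a couple of its functions as illustrative stand-ins for the charm's `hooks` module (none of these names come from the charm itself):

```python
import os  # stand-in module to patch; any imported module object works
import unittest
from unittest import mock

TO_PATCH = ["getcwd", "getpid"]


class PatchAllExample(unittest.TestCase):

    def patch(self, name):
        patcher = mock.patch.object(os, name)
        mocked = patcher.start()
        # addCleanup guarantees the patch is undone after the test,
        # even if setUp raises partway through.
        self.addCleanup(patcher.stop)
        return mocked

    def setUp(self):
        # Expose each mock as an attribute, e.g. self.getcwd
        for name in TO_PATCH:
            setattr(self, name, self.patch(name))

    def test_stubbed(self):
        self.getcwd.return_value = "/fake"
        self.assertEqual(os.getcwd(), "/fake")
```

Because the cleanup is registered per patch, every test in the class starts from fresh mocks and the real functions are restored once the test finishes.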
Hey Jorge,
Thanks for this submission, it looks great! Happy to see all those unit tests! I had to make a small change to the test_hooks.py file to get `make test` to complete successfully, so that should probably be updated on your branch:
--- unit_tests/test_hooks.py	2014-04-30 20:29:29.665068579 -0400
+++ unit_tests/test_hooks.py.new	2014-04-30 20:28:02.954158683 -0400
@@ -13,7 +13,7 @@
 except ImportError as ex:
     raise ImportError("Please install unittest and mock modules")

-from hooks import hooks
+import hooks

 TO_PATCH = [
     "apt_install",
Aside from that, this looks awesome and gets a big +1 from me. Someone from the charmers team will come through soon to do a followup review and promulgate these changes into the charm store.
If you have any questions/comments/concerns about the review, contact us in #juju on irc.freenode.net or email the mailing list <email address hidden>
Thanks again for this contribution!