Merge lp:~james-page/charms/trusty/swift-storage/xenial into lp:~openstack-charmers-archive/charms/trusty/swift-storage/next

Proposed by James Page
Status: Merged
Merged at revision: 104
Proposed branch: lp:~james-page/charms/trusty/swift-storage/xenial
Merge into: lp:~openstack-charmers-archive/charms/trusty/swift-storage/next
Diff against target: 663 lines (+177/-86)
12 files modified
charmhelpers/contrib/openstack/amulet/deployment.py (+3/-2)
charmhelpers/contrib/openstack/context.py (+11/-7)
charmhelpers/contrib/openstack/neutron.py (+18/-6)
charmhelpers/contrib/openstack/utils.py (+65/-23)
charmhelpers/contrib/python/packages.py (+22/-7)
charmhelpers/core/host.py (+41/-26)
charmhelpers/fetch/giturl.py (+3/-1)
lib/swift_storage_context.py (+0/-3)
lib/swift_storage_utils.py (+2/-1)
tests/charmhelpers/contrib/openstack/amulet/deployment.py (+3/-2)
unit_tests/test_swift_storage_context.py (+0/-4)
unit_tests/test_swift_storage_utils.py (+9/-4)
To merge this branch: bzr merge lp:~james-page/charms/trusty/swift-storage/xenial
Reviewer Review Type Date Requested Status
OpenStack Charmers Pending
Review via email: mp+284515@code.launchpad.net

This proposal supersedes a proposal from 2016-01-30.

Description of the change

Resync charm-helpers and fix up xenial (16.04) support.

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_lint_check #18391 swift-storage-next for james-page mp284515
    LINT FAIL: lint-test failed

LINT Results (max last 2 lines):
make: *** [lint] Error 1
ERROR:root:Make target returned non-zero.

Full lint test output: http://paste.ubuntu.com/14731093/
Build: http://10.245.162.77:8080/job/charm_lint_check/18391/

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_unit_test #17133 swift-storage-next for james-page mp284515
    UNIT OK: passed

Build: http://10.245.162.77:8080/job/charm_unit_test/17133/

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_amulet_test #9100 swift-storage-next for james-page mp284515
    AMULET OK: passed

Build: http://10.245.162.77:8080/job/charm_amulet_test/9100/

104. By James Page

Tidy lint

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_lint_check #119 swift-storage-next for james-page mp284515
    LINT OK: passed

Build: http://10.245.162.36:8080/job/charm_lint_check/119/

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_lint_check #125 swift-storage-next for james-page mp284515
    LINT OK: passed

Build: http://10.245.162.36:8080/job/charm_lint_check/125/

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_unit_test #118 swift-storage-next for james-page mp284515
    UNIT OK: passed

Build: http://10.245.162.36:8080/job/charm_unit_test/118/

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_amulet_test #23 swift-storage-next for james-page mp284515
    AMULET OK: passed

Build: http://10.245.162.36:8080/job/charm_amulet_test/23/

105. By James Page

Baseline

106. By James Page

Enable xenial mitaka tests

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_lint_check #237 swift-storage-next for james-page mp284515
    LINT OK: passed

Build: http://10.245.162.36:8080/job/charm_lint_check/237/

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_unit_test #223 swift-storage-next for james-page mp284515
    UNIT OK: passed

Build: http://10.245.162.36:8080/job/charm_unit_test/223/

Revision history for this message
David Ames (thedac) wrote :

Looks good; waiting on amulet, with the possible exception of the xenial-mitaka amulet test, which may fail due to dependencies.

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_amulet_test #118 swift-storage-next for james-page mp284515
    AMULET FAIL: amulet-test failed

AMULET Results (max last 2 lines):
make: *** [functional_test] Error 1
ERROR:root:Make target returned non-zero.

Full amulet test output: http://paste.ubuntu.com/15019707/
Build: http://10.245.162.36:8080/job/charm_amulet_test/118/

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_amulet_test #124 swift-storage-next for james-page mp284515
    AMULET FAIL: amulet-test failed

AMULET Results (max last 2 lines):
make: *** [functional_test] Error 1
ERROR:root:Make target returned non-zero.

Full amulet test output: http://paste.ubuntu.com/15022712/
Build: http://10.245.162.36:8080/job/charm_amulet_test/124/

Revision history for this message
uosci-testing-bot (uosci-testing-bot) wrote :

charm_amulet_test #130 swift-storage-next for james-page mp284515
    AMULET OK: passed

Build: http://10.245.162.36:8080/job/charm_amulet_test/130/

Preview Diff

1=== modified file 'charmhelpers/contrib/openstack/amulet/deployment.py'
2--- charmhelpers/contrib/openstack/amulet/deployment.py 2016-01-04 21:31:57 +0000
3+++ charmhelpers/contrib/openstack/amulet/deployment.py 2016-02-11 16:47:55 +0000
4@@ -121,11 +121,12 @@
5
6 # Charms which should use the source config option
7 use_source = ['mysql', 'mongodb', 'rabbitmq-server', 'ceph',
8- 'ceph-osd', 'ceph-radosgw']
9+ 'ceph-osd', 'ceph-radosgw', 'ceph-mon']
10
11 # Charms which can not use openstack-origin, ie. many subordinates
12 no_origin = ['cinder-ceph', 'hacluster', 'neutron-openvswitch', 'nrpe',
13- 'openvswitch-odl', 'neutron-api-odl', 'odl-controller']
14+ 'openvswitch-odl', 'neutron-api-odl', 'odl-controller',
15+ 'cinder-backup']
16
17 if self.openstack:
18 for svc in services:
19
20=== modified file 'charmhelpers/contrib/openstack/context.py'
21--- charmhelpers/contrib/openstack/context.py 2016-01-08 02:38:26 +0000
22+++ charmhelpers/contrib/openstack/context.py 2016-02-11 16:47:55 +0000
23@@ -90,6 +90,12 @@
24 from charmhelpers.contrib.openstack.utils import get_host_ip
25 from charmhelpers.core.unitdata import kv
26
27+try:
28+ import psutil
29+except ImportError:
30+ apt_install('python-psutil', fatal=True)
31+ import psutil
32+
33 CA_CERT_PATH = '/usr/local/share/ca-certificates/keystone_juju_ca_cert.crt'
34 ADDRESS_TYPES = ['admin', 'internal', 'public']
35
36@@ -1258,13 +1264,11 @@
37
38 @property
39 def num_cpus(self):
40- try:
41- from psutil import NUM_CPUS
42- except ImportError:
43- apt_install('python-psutil', fatal=True)
44- from psutil import NUM_CPUS
45-
46- return NUM_CPUS
47+ # NOTE: use cpu_count if present (16.04 support)
48+ if hasattr(psutil, 'cpu_count'):
49+ return psutil.cpu_count()
50+ else:
51+ return psutil.NUM_CPUS
52
53 def __call__(self):
54 multiplier = config('worker-multiplier') or 0
55
56=== modified file 'charmhelpers/contrib/openstack/neutron.py'
57--- charmhelpers/contrib/openstack/neutron.py 2016-01-04 21:31:57 +0000
58+++ charmhelpers/contrib/openstack/neutron.py 2016-02-11 16:47:55 +0000
59@@ -50,7 +50,7 @@
60 if kernel_version() >= (3, 13):
61 return []
62 else:
63- return ['openvswitch-datapath-dkms']
64+ return [headers_package(), 'openvswitch-datapath-dkms']
65
66
67 # legacy
68@@ -70,7 +70,7 @@
69 relation_prefix='neutron',
70 ssl_dir=QUANTUM_CONF_DIR)],
71 'services': ['quantum-plugin-openvswitch-agent'],
72- 'packages': [[headers_package()] + determine_dkms_package(),
73+ 'packages': [determine_dkms_package(),
74 ['quantum-plugin-openvswitch-agent']],
75 'server_packages': ['quantum-server',
76 'quantum-plugin-openvswitch'],
77@@ -111,7 +111,7 @@
78 relation_prefix='neutron',
79 ssl_dir=NEUTRON_CONF_DIR)],
80 'services': ['neutron-plugin-openvswitch-agent'],
81- 'packages': [[headers_package()] + determine_dkms_package(),
82+ 'packages': [determine_dkms_package(),
83 ['neutron-plugin-openvswitch-agent']],
84 'server_packages': ['neutron-server',
85 'neutron-plugin-openvswitch'],
86@@ -155,7 +155,7 @@
87 relation_prefix='neutron',
88 ssl_dir=NEUTRON_CONF_DIR)],
89 'services': [],
90- 'packages': [[headers_package()] + determine_dkms_package(),
91+ 'packages': [determine_dkms_package(),
92 ['neutron-plugin-cisco']],
93 'server_packages': ['neutron-server',
94 'neutron-plugin-cisco'],
95@@ -174,7 +174,7 @@
96 'neutron-dhcp-agent',
97 'nova-api-metadata',
98 'etcd'],
99- 'packages': [[headers_package()] + determine_dkms_package(),
100+ 'packages': [determine_dkms_package(),
101 ['calico-compute',
102 'bird',
103 'neutron-dhcp-agent',
104@@ -219,7 +219,7 @@
105 relation_prefix='neutron',
106 ssl_dir=NEUTRON_CONF_DIR)],
107 'services': [],
108- 'packages': [[headers_package()] + determine_dkms_package()],
109+ 'packages': [determine_dkms_package()],
110 'server_packages': ['neutron-server',
111 'python-neutron-plugin-midonet'],
112 'server_services': ['neutron-server']
113@@ -233,6 +233,18 @@
114 'neutron-plugin-ml2']
115 # NOTE: patch in vmware renames nvp->nsx for icehouse onwards
116 plugins['nvp'] = plugins['nsx']
117+ if release >= 'kilo':
118+ plugins['midonet']['driver'] = (
119+ 'neutron.plugins.midonet.plugin.MidonetPluginV2')
120+ if release >= 'liberty':
121+ midonet_origin = config('midonet-origin')
122+ if midonet_origin is not None and midonet_origin[4:5] == '1':
123+ plugins['midonet']['driver'] = (
124+ 'midonet.neutron.plugin_v1.MidonetPluginV2')
125+ plugins['midonet']['server_packages'].remove(
126+ 'python-neutron-plugin-midonet')
127+ plugins['midonet']['server_packages'].append(
128+ 'python-networking-midonet')
129 return plugins
130
131
132
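The neutron.py hunks fold the headers package into determine_dkms_package() itself, instead of having every plugin definition prepend it. A simplified sketch of the resulting selection logic (the parameters are flattened here for illustration; the real helper derives both values from the running system):

```python
def determine_dkms_package(kernel_version,
                           headers_package='linux-headers-generic'):
    """Return extra packages needed for the openvswitch kernel module.

    Kernels >= 3.13 ship openvswitch in-tree, so no DKMS build is
    required; older kernels need the DKMS module plus matching kernel
    headers, which this change now returns from a single place.
    """
    if kernel_version >= (3, 13):
        return []
    return [headers_package, 'openvswitch-datapath-dkms']
```

Centralising the headers dependency means a plugin list entry is just `determine_dkms_package()` rather than `[headers_package()] + determine_dkms_package()` repeated at every call site.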
133=== modified file 'charmhelpers/contrib/openstack/utils.py'
134--- charmhelpers/contrib/openstack/utils.py 2016-01-13 18:36:55 +0000
135+++ charmhelpers/contrib/openstack/utils.py 2016-02-11 16:47:55 +0000
136@@ -25,6 +25,7 @@
137 import re
138
139 import six
140+import tempfile
141 import traceback
142 import uuid
143 import yaml
144@@ -41,6 +42,7 @@
145 config,
146 log as juju_log,
147 charm_dir,
148+ DEBUG,
149 INFO,
150 related_units,
151 relation_ids,
152@@ -105,16 +107,26 @@
153
154 # The ugly duckling - must list releases oldest to newest
155 SWIFT_CODENAMES = OrderedDict([
156- ('diablo', ['1.4.3']),
157- ('essex', ['1.4.8']),
158- ('folsom', ['1.7.4']),
159- ('grizzly', ['1.7.6', '1.7.7', '1.8.0']),
160- ('havana', ['1.9.0', '1.9.1', '1.10.0']),
161- ('icehouse', ['1.11.0', '1.12.0', '1.13.0', '1.13.1']),
162- ('juno', ['2.0.0', '2.1.0', '2.2.0']),
163- ('kilo', ['2.2.1', '2.2.2']),
164- ('liberty', ['2.3.0', '2.4.0', '2.5.0']),
165- ('mitaka', ['2.5.0']),
166+ ('diablo',
167+ ['1.4.3']),
168+ ('essex',
169+ ['1.4.8']),
170+ ('folsom',
171+ ['1.7.4']),
172+ ('grizzly',
173+ ['1.7.6', '1.7.7', '1.8.0']),
174+ ('havana',
175+ ['1.9.0', '1.9.1', '1.10.0']),
176+ ('icehouse',
177+ ['1.11.0', '1.12.0', '1.13.0', '1.13.1']),
178+ ('juno',
179+ ['2.0.0', '2.1.0', '2.2.0']),
180+ ('kilo',
181+ ['2.2.1', '2.2.2']),
182+ ('liberty',
183+ ['2.3.0', '2.4.0', '2.5.0']),
184+ ('mitaka',
185+ ['2.5.0']),
186 ])
187
188 # >= Liberty version->codename mapping
189@@ -337,12 +349,42 @@
190
191
192 def import_key(keyid):
193- cmd = "apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 " \
194- "--recv-keys %s" % keyid
195- try:
196- subprocess.check_call(cmd.split(' '))
197- except subprocess.CalledProcessError:
198- error_out("Error importing repo key %s" % keyid)
199+ key = keyid.strip()
200+ if (key.startswith('-----BEGIN PGP PUBLIC KEY BLOCK-----') and
201+ key.endswith('-----END PGP PUBLIC KEY BLOCK-----')):
202+ juju_log("PGP key found (looks like ASCII Armor format)", level=DEBUG)
203+ juju_log("Importing ASCII Armor PGP key", level=DEBUG)
204+ with tempfile.NamedTemporaryFile() as keyfile:
205+ with open(keyfile.name, 'w') as fd:
206+ fd.write(key)
207+ fd.write("\n")
208+
209+ cmd = ['apt-key', 'add', keyfile.name]
210+ try:
211+ subprocess.check_call(cmd)
212+ except subprocess.CalledProcessError:
213+ error_out("Error importing PGP key '%s'" % key)
214+ else:
215+ juju_log("PGP key found (looks like Radix64 format)", level=DEBUG)
216+ juju_log("Importing PGP key from keyserver", level=DEBUG)
217+ cmd = ['apt-key', 'adv', '--keyserver',
218+ 'hkp://keyserver.ubuntu.com:80', '--recv-keys', key]
219+ try:
220+ subprocess.check_call(cmd)
221+ except subprocess.CalledProcessError:
222+ error_out("Error importing PGP key '%s'" % key)
223+
224+
225+def get_source_and_pgp_key(input):
226+ """Look for a pgp key ID or ascii-armor key in the given input."""
227+ index = input.strip()
228+ index = input.rfind('|')
229+ if index < 0:
230+ return input, None
231+
232+ key = input[index + 1:].strip('|')
233+ source = input[:index]
234+ return source, key
235
236
237 def configure_installation_source(rel):
238@@ -354,16 +396,16 @@
239 with open('/etc/apt/sources.list.d/juju_deb.list', 'w') as f:
240 f.write(DISTRO_PROPOSED % ubuntu_rel)
241 elif rel[:4] == "ppa:":
242- src = rel
243+ src, key = get_source_and_pgp_key(rel)
244+ if key:
245+ import_key(key)
246+
247 subprocess.check_call(["add-apt-repository", "-y", src])
248 elif rel[:3] == "deb":
249- l = len(rel.split('|'))
250- if l == 2:
251- src, key = rel.split('|')
252- juju_log("Importing PPA key from keyserver for %s" % src)
253+ src, key = get_source_and_pgp_key(rel)
254+ if key:
255 import_key(key)
256- elif l == 1:
257- src = rel
258+
259 with open('/etc/apt/sources.list.d/juju_deb.list', 'w') as f:
260 f.write(src)
261 elif rel[:6] == 'cloud:':
262
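The utils.py hunk introduces get_source_and_pgp_key() so both the `ppa:` and `deb` branches share one parser for the `source|key` convention. A simplified sketch of that parsing (the key id below is purely illustrative):

```python
def get_source_and_pgp_key(source):
    """Split a "source|key" style config value into (source, key).

    Charm config allows an optional PGP key id, or an ASCII-armored
    key block, after a '|' separator; return (source, None) when no
    key is present.
    """
    index = source.rfind('|')
    if index < 0:
        return source, None
    return source[:index], source[index + 1:].strip()
```

import_key() then decides how to import the key: a value bracketed by `-----BEGIN/END PGP PUBLIC KEY BLOCK-----` is written to a temp file and fed to `apt-key add`, anything else is treated as a key id and fetched from the Ubuntu keyserver via `apt-key adv --recv-keys`.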
263=== modified file 'charmhelpers/contrib/python/packages.py'
264--- charmhelpers/contrib/python/packages.py 2016-01-04 21:31:57 +0000
265+++ charmhelpers/contrib/python/packages.py 2016-02-11 16:47:55 +0000
266@@ -19,20 +19,35 @@
267
268 import os
269 import subprocess
270+import sys
271
272 from charmhelpers.fetch import apt_install, apt_update
273 from charmhelpers.core.hookenv import charm_dir, log
274
275-try:
276- from pip import main as pip_execute
277-except ImportError:
278- apt_update()
279- apt_install('python-pip')
280- from pip import main as pip_execute
281-
282 __author__ = "Jorge Niedbalski <jorge.niedbalski@canonical.com>"
283
284
285+def pip_execute(*args, **kwargs):
286+ """Overriden pip_execute() to stop sys.path being changed.
287+
288+ The act of importing main from the pip module seems to cause add wheels
289+ from the /usr/share/python-wheels which are installed by various tools.
290+ This function ensures that sys.path remains the same after the call is
291+ executed.
292+ """
293+ try:
294+ _path = sys.path
295+ try:
296+ from pip import main as _pip_execute
297+ except ImportError:
298+ apt_update()
299+ apt_install('python-pip')
300+ from pip import main as _pip_execute
301+ _pip_execute(*args, **kwargs)
302+ finally:
303+ sys.path = _path
304+
305+
306 def parse_options(given, available):
307 """Given a set of options, check if available"""
308 for key, value in sorted(given.items()):
309
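The packages.py hunk defers the pip import into a wrapper because importing `pip.main` can pull wheel paths from /usr/share/python-wheels onto sys.path. The restore pattern can be sketched generically as below; note this sketch snapshots a *copy* of sys.path (a plain alias, as in the hunk, only helps if the import rebinds sys.path rather than mutating it in place):

```python
import sys


def call_preserving_sys_path(func, *args, **kwargs):
    """Invoke func, then restore sys.path to its prior contents.

    Useful around imports (such as pip's) that may append entries
    like /usr/share/python-wheels paths as an import side effect.
    """
    saved = list(sys.path)  # copy, not alias, so mutations are undone
    try:
        return func(*args, **kwargs)
    finally:
        sys.path = saved
```

The try/finally guarantees the path is restored even when the wrapped call raises.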
310=== modified file 'charmhelpers/core/host.py'
311--- charmhelpers/core/host.py 2016-01-07 09:19:41 +0000
312+++ charmhelpers/core/host.py 2016-02-11 16:47:55 +0000
313@@ -138,7 +138,8 @@
314 except subprocess.CalledProcessError:
315 return False
316 else:
317- if ("start/running" in output or "is running" in output):
318+ if ("start/running" in output or "is running" in output or
319+ "up and running" in output):
320 return True
321 else:
322 return False
323@@ -160,13 +161,13 @@
324
325
326 def init_is_systemd():
327+ """Return True if the host system uses systemd, False otherwise."""
328 return os.path.isdir(SYSTEMD_SYSTEM)
329
330
331 def adduser(username, password=None, shell='/bin/bash', system_user=False,
332 primary_group=None, secondary_groups=None):
333- """
334- Add a user to the system.
335+ """Add a user to the system.
336
337 Will log but otherwise succeed if the user already exists.
338
339@@ -174,7 +175,7 @@
340 :param str password: Password for user; if ``None``, create a system user
341 :param str shell: The default shell for the user
342 :param bool system_user: Whether to create a login or system user
343- :param str primary_group: Primary group for user; defaults to their username
344+ :param str primary_group: Primary group for user; defaults to username
345 :param list secondary_groups: Optional list of additional groups
346
347 :returns: The password database entry struct, as returned by `pwd.getpwnam`
348@@ -300,14 +301,12 @@
349
350
351 def fstab_remove(mp):
352- """Remove the given mountpoint entry from /etc/fstab
353- """
354+ """Remove the given mountpoint entry from /etc/fstab"""
355 return Fstab.remove_by_mountpoint(mp)
356
357
358 def fstab_add(dev, mp, fs, options=None):
359- """Adds the given device entry to the /etc/fstab file
360- """
361+ """Adds the given device entry to the /etc/fstab file"""
362 return Fstab.add(dev, mp, fs, options=options)
363
364
365@@ -363,8 +362,7 @@
366
367
368 def file_hash(path, hash_type='md5'):
369- """
370- Generate a hash checksum of the contents of 'path' or None if not found.
371+ """Generate a hash checksum of the contents of 'path' or None if not found.
372
373 :param str hash_type: Any hash alrgorithm supported by :mod:`hashlib`,
374 such as md5, sha1, sha256, sha512, etc.
375@@ -379,10 +377,9 @@
376
377
378 def path_hash(path):
379- """
380- Generate a hash checksum of all files matching 'path'. Standard wildcards
381- like '*' and '?' are supported, see documentation for the 'glob' module for
382- more information.
383+ """Generate a hash checksum of all files matching 'path'. Standard
384+ wildcards like '*' and '?' are supported, see documentation for the 'glob'
385+ module for more information.
386
387 :return: dict: A { filename: hash } dictionary for all matched files.
388 Empty if none found.
389@@ -394,8 +391,7 @@
390
391
392 def check_hash(path, checksum, hash_type='md5'):
393- """
394- Validate a file using a cryptographic checksum.
395+ """Validate a file using a cryptographic checksum.
396
397 :param str checksum: Value of the checksum used to validate the file.
398 :param str hash_type: Hash algorithm used to generate `checksum`.
399@@ -410,6 +406,7 @@
400
401
402 class ChecksumError(ValueError):
403+ """A class derived from Value error to indicate the checksum failed."""
404 pass
405
406
407@@ -515,7 +512,7 @@
408
409
410 def list_nics(nic_type=None):
411- '''Return a list of nics of given type(s)'''
412+ """Return a list of nics of given type(s)"""
413 if isinstance(nic_type, six.string_types):
414 int_types = [nic_type]
415 else:
416@@ -557,12 +554,13 @@
417
418
419 def set_nic_mtu(nic, mtu):
420- '''Set MTU on a network interface'''
421+ """Set the Maximum Transmission Unit (MTU) on a network interface."""
422 cmd = ['ip', 'link', 'set', nic, 'mtu', mtu]
423 subprocess.check_call(cmd)
424
425
426 def get_nic_mtu(nic):
427+ """Return the Maximum Transmission Unit (MTU) for a network interface."""
428 cmd = ['ip', 'addr', 'show', nic]
429 ip_output = subprocess.check_output(cmd).decode('UTF-8').split('\n')
430 mtu = ""
431@@ -574,6 +572,7 @@
432
433
434 def get_nic_hwaddr(nic):
435+ """Return the Media Access Control (MAC) for a network interface."""
436 cmd = ['ip', '-o', '-0', 'addr', 'show', nic]
437 ip_output = subprocess.check_output(cmd).decode('UTF-8')
438 hwaddr = ""
439@@ -584,7 +583,7 @@
440
441
442 def cmp_pkgrevno(package, revno, pkgcache=None):
443- '''Compare supplied revno with the revno of the installed package
444+ """Compare supplied revno with the revno of the installed package
445
446 * 1 => Installed revno is greater than supplied arg
447 * 0 => Installed revno is the same as supplied arg
448@@ -593,7 +592,7 @@
449 This function imports apt_cache function from charmhelpers.fetch if
450 the pkgcache argument is None. Be sure to add charmhelpers.fetch if
451 you call this function, or pass an apt_pkg.Cache() instance.
452- '''
453+ """
454 import apt_pkg
455 if not pkgcache:
456 from charmhelpers.fetch import apt_cache
457@@ -603,19 +602,27 @@
458
459
460 @contextmanager
461-def chdir(d):
462+def chdir(directory):
463+ """Change the current working directory to a different directory for a code
464+ block and return the previous directory after the block exits. Useful to
465+ run commands from a specificed directory.
466+
467+ :param str directory: The directory path to change to for this context.
468+ """
469 cur = os.getcwd()
470 try:
471- yield os.chdir(d)
472+ yield os.chdir(directory)
473 finally:
474 os.chdir(cur)
475
476
477 def chownr(path, owner, group, follow_links=True, chowntopdir=False):
478- """
479- Recursively change user and group ownership of files and directories
480+ """Recursively change user and group ownership of files and directories
481 in given path. Doesn't chown path itself by default, only its children.
482
483+ :param str path: The string path to start changing ownership.
484+ :param str owner: The owner string to use when looking up the uid.
485+ :param str group: The group string to use when looking up the gid.
486 :param bool follow_links: Also Chown links if True
487 :param bool chowntopdir: Also chown path itself if True
488 """
489@@ -639,15 +646,23 @@
490
491
492 def lchownr(path, owner, group):
493+ """Recursively change user and group ownership of files and directories
494+ in a given path, not following symbolic links. See the documentation for
495+ 'os.lchown' for more information.
496+
497+ :param str path: The string path to start changing ownership.
498+ :param str owner: The owner string to use when looking up the uid.
499+ :param str group: The group string to use when looking up the gid.
500+ """
501 chownr(path, owner, group, follow_links=False)
502
503
504 def get_total_ram():
505- '''The total amount of system RAM in bytes.
506+ """The total amount of system RAM in bytes.
507
508 This is what is reported by the OS, and may be overcommitted when
509 there are multiple containers hosted on the same machine.
510- '''
511+ """
512 with open('/proc/meminfo', 'r') as f:
513 for line in f.readlines():
514 if line:
515
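Among the host.py docstring cleanups, the chdir() helper also gains documentation; its shape is the standard contextmanager pattern and can be reproduced standalone:

```python
import os
from contextlib import contextmanager


@contextmanager
def chdir(directory):
    """Run the body of a with-block from another directory,
    restoring the previous working directory on exit, even if
    the block raises."""
    cur = os.getcwd()
    try:
        os.chdir(directory)
        yield
    finally:
        os.chdir(cur)
```

Usage is e.g. `with chdir('/etc/swift'): subprocess.check_call(cmd)`, which avoids threading a cwd argument through every command invocation.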
516=== modified file 'charmhelpers/fetch/giturl.py'
517--- charmhelpers/fetch/giturl.py 2016-01-13 18:36:55 +0000
518+++ charmhelpers/fetch/giturl.py 2016-02-11 16:47:55 +0000
519@@ -15,7 +15,7 @@
520 # along with charm-helpers. If not, see <http://www.gnu.org/licenses/>.
521
522 import os
523-from subprocess import check_call
524+from subprocess import check_call, CalledProcessError
525 from charmhelpers.fetch import (
526 BaseFetchHandler,
527 UnhandledSource,
528@@ -63,6 +63,8 @@
529 branch_name)
530 try:
531 self.clone(source, dest_dir, branch, depth)
532+ except CalledProcessError as e:
533+ raise UnhandledSource(e)
534 except OSError as e:
535 raise UnhandledSource(e.strerror)
536 return dest_dir
537
538=== modified file 'lib/swift_storage_context.py'
539--- lib/swift_storage_context.py 2015-11-24 13:51:05 +0000
540+++ lib/swift_storage_context.py 2016-02-11 16:47:55 +0000
541@@ -80,14 +80,11 @@
542 interfaces = []
543
544 def __call__(self):
545- import psutil
546- multiplier = int(config('worker-multiplier')) or 1
547 ctxt = {
548 'local_ip': unit_private_ip(),
549 'account_server_port': config('account-server-port'),
550 'container_server_port': config('container-server-port'),
551 'object_server_port': config('object-server-port'),
552- 'workers': str(psutil.NUM_CPUS * multiplier),
553 'object_server_threads_per_disk': config(
554 'object-server-threads-per-disk'),
555 'account_max_connections': config('account-max-connections'),
556
557=== modified file 'lib/swift_storage_utils.py'
558--- lib/swift_storage_utils.py 2015-11-02 21:40:22 +0000
559+++ lib/swift_storage_utils.py 2016-02-11 16:47:55 +0000
560@@ -132,7 +132,8 @@
561 for server in ['account', 'object', 'container']:
562 configs.register('/etc/swift/%s-server.conf' % server,
563 [SwiftStorageServerContext(),
564- context.BindHostContext()]),
565+ context.BindHostContext(),
566+ context.WorkerConfigContext()]),
567 return configs
568
569
570
571=== modified file 'tests/021-basic-xenial-mitaka' (properties changed: -x to +x)
572=== modified file 'tests/charmhelpers/contrib/openstack/amulet/deployment.py'
573--- tests/charmhelpers/contrib/openstack/amulet/deployment.py 2016-01-04 21:31:57 +0000
574+++ tests/charmhelpers/contrib/openstack/amulet/deployment.py 2016-02-11 16:47:55 +0000
575@@ -121,11 +121,12 @@
576
577 # Charms which should use the source config option
578 use_source = ['mysql', 'mongodb', 'rabbitmq-server', 'ceph',
579- 'ceph-osd', 'ceph-radosgw']
580+ 'ceph-osd', 'ceph-radosgw', 'ceph-mon']
581
582 # Charms which can not use openstack-origin, ie. many subordinates
583 no_origin = ['cinder-ceph', 'hacluster', 'neutron-openvswitch', 'nrpe',
584- 'openvswitch-odl', 'neutron-api-odl', 'odl-controller']
585+ 'openvswitch-odl', 'neutron-api-odl', 'odl-controller',
586+ 'cinder-backup']
587
588 if self.openstack:
589 for svc in services:
590
591=== modified file 'unit_tests/test_swift_storage_context.py'
592--- unit_tests/test_swift_storage_context.py 2015-07-17 15:52:38 +0000
593+++ unit_tests/test_swift_storage_context.py 2016-02-11 16:47:55 +0000
594@@ -67,18 +67,15 @@
595 _file.write.assert_called_with('RSYNC_ENABLE=true\n')
596
597 def test_swift_storage_server_context(self):
598- import psutil
599 self.unit_private_ip.return_value = '10.0.0.5'
600 self.test_config.set('account-server-port', '500')
601 self.test_config.set('object-server-port', '501')
602 self.test_config.set('container-server-port', '502')
603 self.test_config.set('object-server-threads-per-disk', '3')
604- self.test_config.set('worker-multiplier', '3')
605 self.test_config.set('object-replicator-concurrency', '3')
606 self.test_config.set('account-max-connections', '10')
607 self.test_config.set('container-max-connections', '10')
608 self.test_config.set('object-max-connections', '10')
609- num_workers = psutil.NUM_CPUS * 3
610 ctxt = swift_context.SwiftStorageServerContext()
611 result = ctxt()
612 ex = {
613@@ -87,7 +84,6 @@
614 'account_server_port': '500',
615 'local_ip': '10.0.0.5',
616 'object_server_threads_per_disk': '3',
617- 'workers': str(num_workers),
618 'object_replicator_concurrency': '3',
619 'account_max_connections': '10',
620 'container_max_connections': '10',
621
622=== modified file 'unit_tests/test_swift_storage_utils.py'
623--- unit_tests/test_swift_storage_utils.py 2015-11-16 20:56:57 +0000
624+++ unit_tests/test_swift_storage_utils.py 2016-02-11 16:47:55 +0000
625@@ -292,6 +292,7 @@
626 renderer.assert_called_with(templates_dir=swift_utils.TEMPLATES,
627 openstack_release='essex')
628
629+ @patch('charmhelpers.contrib.openstack.context.WorkerConfigContext')
630 @patch('charmhelpers.contrib.openstack.context.BindHostContext')
631 @patch.object(swift_utils, 'SwiftStorageContext')
632 @patch.object(swift_utils, 'RsyncContext')
633@@ -299,11 +300,12 @@
634 @patch('charmhelpers.contrib.openstack.templating.OSConfigRenderer')
635 def test_register_configs_post_install(self, renderer,
636 swift, rsync, server,
637- bind_context):
638+ bind_context, worker_context):
639 swift.return_value = 'swift_context'
640 rsync.return_value = 'rsync_context'
641 server.return_value = 'swift_server_context'
642 bind_context.return_value = 'bind_host_context'
643+ worker_context.return_value = 'worker_context'
644 self.get_os_codename_package.return_value = 'grizzly'
645 configs = MagicMock()
646 configs.register = MagicMock()
647@@ -316,11 +318,14 @@
648 call('/etc/rsync-juju.d/050-swift-storage.conf',
649 ['rsync_context', 'swift_context']),
650 call('/etc/swift/account-server.conf', ['swift_context',
651- 'bind_host_context']),
652+ 'bind_host_context',
653+ 'worker_context']),
654 call('/etc/swift/object-server.conf', ['swift_context',
655- 'bind_host_context']),
656+ 'bind_host_context',
657+ 'worker_context']),
658 call('/etc/swift/container-server.conf', ['swift_context',
659- 'bind_host_context'])
660+ 'bind_host_context',
661+ 'worker_context'])
662 ]
663 self.assertEquals(ex, configs.register.call_args_list)
664
