Merge lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk into lp:charms/trusty/apache-hadoop-plugin

Proposed by Kevin W Monroe
Status: Merged
Merged at revision: 100
Proposed branch: lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk
Merge into: lp:charms/trusty/apache-hadoop-plugin
Diff against target: 538 lines (+160/-87) (has conflicts)
19 files modified
DEV-README.md (+15/-0)
README.md (+11/-5)
actions/parseTerasort.py (+1/-1)
actions/teragen (+4/-0)
actions/terasort (+4/-1)
dist.yaml (+0/-46)
hooks/callbacks.py (+24/-11)
hooks/common.py (+34/-7)
hooks/config-changed (+1/-1)
hooks/hadoop-plugin-relation-changed (+1/-1)
hooks/hadoop-plugin-relation-departed (+15/-0)
hooks/namenode-relation-changed (+1/-1)
hooks/namenode-relation-departed (+16/-0)
hooks/resourcemanager-relation-changed (+1/-1)
hooks/resourcemanager-relation-departed (+16/-0)
hooks/setup.py (+8/-4)
hooks/start (+1/-1)
hooks/stop (+1/-1)
resources.yaml (+6/-6)
Text conflict in DEV-README.md
Text conflict in README.md
To merge this branch: bzr merge lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk
Reviewer: Kevin W Monroe (status: Approve)
Review via email: mp+268670@code.launchpad.net
115. By Cory Johns

Fixed permissions on test_dist_config.py

Revision history for this message
Kevin W Monroe (kwmonroe) wrote:

Realtime syslog analytics bundle test looked good. Merged.

review: Approve

Preview Diff

=== modified file 'DEV-README.md'
--- DEV-README.md 2015-06-29 14:23:33 +0000
+++ DEV-README.md 2015-08-21 21:51:35 +0000
@@ -27,11 +27,19 @@
 
 Additionally, the `JAVA_HOME`, `HADOOP_HOME`, `HADOOP_CONF_DIR`, and other
 environment variables will be set via `/etc/environment`. This includes putting
+<<<<<<< TREE
 the Hadoop bin and sbin directories on the `PATH`. There are
 [helpers](http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/files/head:/common/noarch/)
 in `charmhelpers.contrib.bigdata.utils` to assist with using the environment
 file. For example, to run the `hdfs` command to create a directory as the
 `ubuntu` user:
+=======
+the Hadoop bin and sbin directories on the `PATH`. There are
+[helpers](https://git.launchpad.net/bigdata-data/tree/common/noarch)
+in `charmhelpers.contrib.bigdata.utils` to assist with using the environment
+file. For example, to run the `hdfs` command to create a directory as the
+`ubuntu` user:
+>>>>>>> MERGE-SOURCE
 
     from charmhelpers.contrib.bigdata.utils import run_as
     run_as('ubuntu', 'hdfs', 'dfs', '-mkdir', '-p', '/home/ubuntu/foo')
@@ -102,10 +110,17 @@
 
 ## Manual Deployment
 
+<<<<<<< TREE
 The easiest way to deploy the core Apache Hadoop platform is to use one of
 the [apache bundles](https://jujucharms.com/u/bigdata-charmers/#bundles).
 However, to manually deploy the base Apache Hadoop platform without using one
 of the bundles, you can use the following:
+=======
+The easiest way to deploy an Apache Hadoop platform is to use one of
+the [apache bundles](https://jujucharms.com/u/bigdata-charmers/#bundles).
+However, to manually deploy the base Apache Hadoop platform without using one
+of the bundles, you can use the following:
+>>>>>>> MERGE-SOURCE
 
     juju deploy apache-hadoop-hdfs-master hdfs-master
     juju deploy apache-hadoop-hdfs-secondary secondary-namenode
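The `run_as` helper referenced in the DEV-README runs a command as another user with the Hadoop environment loaded from `/etc/environment`. As a rough illustration of how such a helper can be built (a hypothetical reimplementation; the real one lives in `charmhelpers.contrib.bigdata.utils` and may differ in detail):

```python
import subprocess


def build_run_as(user, *cmd):
    """Build a `su` command line that sources /etc/environment (where the
    charm writes JAVA_HOME, HADOOP_HOME, etc.) before running cmd as user."""
    return ['su', user, '-c', '. /etc/environment; ' + ' '.join(cmd)]


def run_as(user, *cmd):
    # Execute for real; separated from build_run_as so the command
    # construction can be exercised without root.
    subprocess.check_call(build_run_as(user, *cmd))
```

Splitting construction from execution keeps the environment-sourcing behavior testable without actually switching users.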
=== modified file 'README.md'
--- README.md 2015-06-18 17:12:11 +0000
+++ README.md 2015-08-21 21:51:35 +0000
@@ -22,7 +22,11 @@
 If you wanted to also wanted to be able to analyze your data using Apache Pig,
 you could deploy it and attach it to the same plugin:
 
+<<<<<<< TREE
     juju deploy cs:~bigdata-charmers/trusty/apache-pig pig
+=======
+    juju deploy apache-pig pig
+>>>>>>> MERGE-SOURCE
     juju add-relation plugin pig
 
 ## Benchmarking
@@ -92,17 +96,19 @@
 of these resources:
 
     sudo pip install jujuresources
-    juju resources fetch --all apache-hadoop-plugin/resources.yaml -d /tmp/resources
-    juju resources serve -d /tmp/resources
+    juju-resources fetch --all /path/to/resources.yaml -d /tmp/resources
+    juju-resources serve -d /tmp/resources
 
 This will fetch all of the resources needed by this charm and serve them via a
-simple HTTP server. You can then set the `resources_mirror` config option to
-have the charm use this server for retrieving resources.
+simple HTTP server. The output from `juju-resources serve` will give you a
+URL that you can set as the `resources_mirror` config option for this charm.
+Setting this option will cause all resources required by this charm to be
+downloaded from the configured URL.
 
 You can fetch the resources for all of the Apache Hadoop charms
 (`apache-hadoop-hdfs-master`, `apache-hadoop-yarn-master`,
 `apache-hadoop-compute-slave`, `apache-hadoop-plugin`, etc) into a single
-directory and serve them all with a single `juju resources serve` instance.
+directory and serve them all with a single `juju-resources serve` instance.
 
 
 ## Contact Information
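`juju-resources serve` is essentially a static HTTP server over the fetched resources directory, with `resources_mirror` then pointing the charm at that URL. A rough Python equivalent of the serving side, for illustration only (this is not the actual jujuresources implementation; Python 3.7+ assumed for the `directory` parameter):

```python
import http.server
import socketserver
import threading
from functools import partial


def serve_resources(directory, port=0):
    """Serve a directory of fetched resources over HTTP.

    Returns the running server; server_address[1] gives the bound port
    (auto-chosen when port=0), which is what a resources_mirror-style
    option would be pointed at.
    """
    handler = partial(http.server.SimpleHTTPRequestHandler, directory=directory)
    httpd = socketserver.TCPServer(('127.0.0.1', port), handler)
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    return httpd
```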
=== modified file 'actions/parseTerasort.py'
--- actions/parseTerasort.py 2015-06-08 20:05:37 +0000
+++ actions/parseTerasort.py 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 """
 Simple script to parse cassandra-stress' transaction results
 and reformat them as JSON for sending back to juju
=== modified file 'actions/teragen'
--- actions/teragen 2015-06-08 21:32:27 +0000
+++ actions/teragen 2015-08-21 21:51:35 +0000
@@ -3,6 +3,10 @@
 SIZE=`action-get size`
 IN_DIR=`action-get indir`
 
+# The plugin uses a venv to avoid conflicting with system bits; ensure our
+# venv is activated.
+. ${CHARM_DIR}/.venv/bin/activate
+
 benchmark-start
 
 # NB: Escaped vars in the block below (e.g., \${HADOOP_HOME}) come from
=== modified file 'actions/terasort'
--- actions/terasort 2015-06-08 21:32:27 +0000
+++ actions/terasort 2015-08-21 21:51:35 +0000
@@ -25,6 +25,9 @@
 # from this outer scope
 su ubuntu << EOF
 . /etc/environment
+# The plugin uses a venv to avoid conflicting with system bits; ensure our
+# venv is activated.
+. ${CHARM_DIR}/.venv/bin/activate
 
 mkdir -p /opt/terasort/results/$run
 
@@ -45,4 +48,4 @@
 EOF
 PATH=$OLDPATH
 
-`cat /opt/terasort/results/$run/terasort.log | python $CHARM_DIR/actions/parseTerasort.py`
+`cat /opt/terasort/results/$run/terasort.log | $CHARM_DIR/.venv/bin/python $CHARM_DIR/actions/parseTerasort.py`
=== modified file 'dist.yaml'
--- dist.yaml 2015-04-16 15:45:57 +0000
+++ dist.yaml 2015-08-21 21:51:35 +0000
@@ -68,49 +68,3 @@
     owner: 'hdfs'
     group: 'hadoop'
     perms: 0775
-ports:
-    # Ports that need to be exposed, overridden, or manually specified.
-    # Only expose ports serving a UI or external API (i.e., namenode and
-    # resourcemanager). Communication among units within the cluster does
-    # not need ports to be explicitly opened.
-    # If adding a port here, you will need to update
-    # charmhelpers.contrib.bigdata.handlers.apache or hooks/callbacks.py
-    # to ensure that it is supported.
-    namenode:
-        port: 8020
-        exposed_on: 'hdfs-master'
-    nn_webapp_http:
-        port: 50070
-        exposed_on: 'hdfs-master'
-    dn_webapp_http:
-        port: 50075
-        exposed_on: 'compute-slave'
-    resourcemanager:
-        port: 8032
-        exposed_on: 'yarn-master'
-    rm_webapp_http:
-        port: 8088
-        exposed_on: 'yarn-master'
-    rm_log:
-        port: 19888
-    nm_webapp_http:
-        port: 8042
-        exposed_on: 'compute-slave'
-    jobhistory:
-        port: 10020
-    jh_webapp_http:
-        port: 19888
-        exposed_on: 'yarn-master'
-    # TODO: support SSL
-    #nn_webapp_https:
-    #    port: 50470
-    #    exposed_on: 'hdfs-master'
-    #dn_webapp_https:
-    #    port: 50475
-    #    exposed_on: 'compute-slave'
-    #rm_webapp_https:
-    #    port: 8090
-    #    exposed_on: 'yarn-master'
-    #nm_webapp_https:
-    #    port: 8044
-    #    exposed_on: 'compute-slave'
=== modified file 'hooks/callbacks.py'
--- hooks/callbacks.py 2015-06-25 15:41:48 +0000
+++ hooks/callbacks.py 2015-08-21 21:51:35 +0000
@@ -24,28 +24,28 @@
 def update_blocked_status():
     if unitdata.kv().get('charm.active', False):
         return
-    rels = (
-        ('Yarn', 'ResourceManager', ResourceManager()),
+    rels = [
         ('HDFS', 'NameNode', NameNode()),
-    )
+    ]
     missing_rel = [rel for rel, res, impl in rels if not impl.connected_units()]
-    missing_hosts = [rel for rel, res, impl in rels if not impl.am_i_registered()]
-    not_ready = [(rel, res) for rel, res, impl in rels if not impl.is_ready()]
+    rels.append(('Yarn', 'ResourceManager', ResourceManager()))
+    not_ready = [(rel, res) for rel, res, impl in rels if impl.connected_units() and not impl.is_ready()]
+    missing_hosts = [rel for rel, res, impl in rels if impl.connected_units() and not impl.am_i_registered()]
     if missing_rel:
         hookenv.status_set('blocked', 'Waiting for relation to %s master%s' % (
             ' and '.join(missing_rel),
             's' if len(missing_rel) > 1 else '',
         )),
-    elif missing_hosts:
-        hookenv.status_set('waiting', 'Waiting for /etc/hosts registration on %s' % (
-            ' and '.join(missing_hosts),
-        ))
     elif not_ready:
         unready_rels, unready_ress = zip(*not_ready)
         hookenv.status_set('waiting', 'Waiting for %s to provide %s' % (
             ' and '.join(unready_rels),
             ' and '.join(unready_ress),
         ))
+    elif missing_hosts:
+        hookenv.status_set('waiting', 'Waiting for /etc/hosts registration on %s' % (
+            ' and '.join(missing_hosts),
+        ))
 
 
 def update_working_status():
@@ -56,5 +56,18 @@
 
 
 def update_active_status():
-    unitdata.kv().set('charm.active', True)
-    hookenv.status_set('active', 'Ready')
+    hdfs_ready = NameNode().is_ready()
+    yarn_connected = ResourceManager().connected_units()
+    yarn_ready = ResourceManager().is_ready()
+    if hdfs_ready and (not yarn_connected or yarn_ready):
+        unitdata.kv().set('charm.active', True)
+        hookenv.status_set('active', 'Ready%s' % (
+            '' if yarn_ready else ' (HDFS only)'
+        ))
+    else:
+        clear_active_flag()
+        update_blocked_status()
+
+
+def clear_active_flag():
+    unitdata.kv().set('charm.active', False)
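The reworked `update_blocked_status` gives "blocked" (no relation at all) precedence over the "waiting" states, and only considers readiness and /etc/hosts registration for relations that actually have connected units. That precedence logic can be sketched in isolation with stubbed relation objects (class and function names here are hypothetical stand-ins, not the real charmhelpers classes):

```python
# Stub standing in for relation implementations like NameNode()/ResourceManager().
class StubRelation:
    def __init__(self, connected=False, registered=True, ready=False):
        self._connected, self._registered, self._ready = connected, registered, ready

    def connected_units(self):
        return ['unit/0'] if self._connected else []

    def am_i_registered(self):
        return self._registered

    def is_ready(self):
        return self._ready


def compute_status(rels):
    """rels: list of (relation_name, resource_name, impl) triples.

    Mirrors the check ordering in update_blocked_status above:
    blocked > waiting-for-resource > waiting-for-hosts > active.
    """
    missing_rel = [rel for rel, res, impl in rels if not impl.connected_units()]
    not_ready = [(rel, res) for rel, res, impl in rels
                 if impl.connected_units() and not impl.is_ready()]
    missing_hosts = [rel for rel, res, impl in rels
                     if impl.connected_units() and not impl.am_i_registered()]
    if missing_rel:
        return ('blocked', 'Waiting for relation to %s' % ' and '.join(missing_rel))
    elif not_ready:
        unready_rels, unready_ress = zip(*not_ready)
        return ('waiting', 'Waiting for %s to provide %s'
                % (' and '.join(unready_rels), ' and '.join(unready_ress)))
    elif missing_hosts:
        return ('waiting', 'Waiting for /etc/hosts registration on %s'
                % ' and '.join(missing_hosts))
    return ('active', 'Ready')
```

Note that in the charm itself Yarn is appended to `rels` only after `missing_rel` is computed, so a missing Yarn relation never blocks the unit; it only contributes to the waiting states.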
=== modified file 'hooks/common.py'
--- hooks/common.py 2015-06-25 15:41:48 +0000
+++ hooks/common.py 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
@@ -14,12 +14,21 @@
 Common implementation for all hooks.
 """
 
+import os
 import jujuresources
 from charmhelpers.core import hookenv
 from charmhelpers.core import unitdata
 from charmhelpers.core import charmframework
 
 
+# ensure that the venv is used for installing resources
+# (venv is used to ensure library versions needed by plugin
+# don't conflict with the charm the plugin is supporting)
+os.environ['PATH'] = os.pathsep.join([
+    os.path.join(hookenv.charm_dir(), '.venv/bin'),
+    os.environ['PATH']])
+
+
 def bootstrap_resources():
     """
     Install required resources defined in resources.yaml
@@ -51,7 +60,7 @@
 
     # list of keys required to be in the dist.yaml
     client_reqs = ['vendor', 'hadoop_version', 'packages', 'groups', 'users',
-                   'dirs', 'ports']
+                   'dirs']
     dist_config = jujubigdata.utils.DistConfig(filename='dist.yaml',
                                                required_keys=client_reqs)
     hadoop = jujubigdata.handlers.HadoopBase(dist_config)
@@ -71,26 +80,44 @@
         ],
     },
     {
-        'name': 'plugin',
+        'name': 'hdfs',
         'provides': [
             jujubigdata.relations.HadoopPlugin(),
         ],
         'requires': [
             hadoop.is_installed,
+            hdfs_relation,
+        ],
+        'callbacks': [
+            callbacks.update_working_status,
+            hdfs_relation.register_provided_hosts,
+            jujubigdata.utils.manage_etc_hosts,
+            hdfs.configure_client,
+            callbacks.update_active_status,
+        ],
+        'cleanup': [
+            callbacks.clear_active_flag,
+            callbacks.update_blocked_status,
+        ],
+    },
+    {
+        'name': 'yarn',
+        'provides': [],
+        'requires': [
+            hadoop.is_installed,
             yarn_relation,
-            hdfs_relation,
         ],
         'callbacks': [
             callbacks.update_working_status,
             yarn_relation.register_provided_hosts,
-            hdfs_relation.register_provided_hosts,
             jujubigdata.utils.manage_etc_hosts,
             yarn.install_demo,
             yarn.configure_client,
-            hdfs.configure_client,
             callbacks.update_active_status,
         ],
-        'cleanup': [],
+        'cleanup': [
+            callbacks.update_blocked_status,
+        ],
     },
     ])
     manager.manage()
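The block added at the top of common.py prepends the charm's `.venv/bin` to `PATH` so venv-installed tools shadow system ones. The same manipulation can be shown as a small pure function (the function name and sample paths are illustrative, not from the charm):

```python
import os


def prepend_to_path(env, bindir):
    """Return a copy of env with bindir prepended to PATH, as common.py
    does for the charm's .venv/bin so venv tools are found first."""
    new_env = dict(env)
    new_env['PATH'] = os.pathsep.join([bindir, env.get('PATH', '')])
    return new_env
```

Mutating `os.environ['PATH']` directly, as the charm does, has the same effect for every subprocess the hook later spawns, since children inherit the hook's environment.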
=== modified file 'hooks/config-changed'
--- hooks/config-changed 2015-02-09 18:13:28 +0000
+++ hooks/config-changed 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
=== modified file 'hooks/hadoop-plugin-relation-changed'
--- hooks/hadoop-plugin-relation-changed 2015-04-28 13:38:46 +0000
+++ hooks/hadoop-plugin-relation-changed 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
=== added file 'hooks/hadoop-plugin-relation-departed'
--- hooks/hadoop-plugin-relation-departed 1970-01-01 00:00:00 +0000
+++ hooks/hadoop-plugin-relation-departed 2015-08-21 21:51:35 +0000
@@ -0,0 +1,15 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+common.manage()
=== modified file 'hooks/namenode-relation-changed'
--- hooks/namenode-relation-changed 2015-05-07 15:27:21 +0000
+++ hooks/namenode-relation-changed 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
=== added file 'hooks/namenode-relation-departed'
--- hooks/namenode-relation-departed 1970-01-01 00:00:00 +0000
+++ hooks/namenode-relation-departed 2015-08-21 21:51:35 +0000
@@ -0,0 +1,16 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+
+common.manage()
=== modified file 'hooks/resourcemanager-relation-changed'
--- hooks/resourcemanager-relation-changed 2015-05-07 15:11:29 +0000
+++ hooks/resourcemanager-relation-changed 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
=== added file 'hooks/resourcemanager-relation-departed'
--- hooks/resourcemanager-relation-departed 1970-01-01 00:00:00 +0000
+++ hooks/resourcemanager-relation-departed 2015-08-21 21:51:35 +0000
@@ -0,0 +1,16 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+
+common.manage()
=== modified file 'hooks/setup.py'
--- hooks/setup.py 2015-06-25 15:41:48 +0000
+++ hooks/setup.py 2015-08-21 21:51:35 +0000
@@ -9,6 +9,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+import os
 import subprocess
 from glob import glob
 
@@ -17,12 +18,15 @@
     """
     Do any setup required before the install hook.
     """
-    install_pip()
+    setup_venv()
     install_bundled_resources()
 
 
-def install_pip():
-    subprocess.check_call(['apt-get', 'install', '-yq', 'python-pip', 'bzr'])
+def setup_venv():
+    if not os.path.exists('.venv/bin/python'):
+        subprocess.check_call(['apt-get', 'install', '-yq', 'python-virtualenv'])
+        subprocess.check_call(['virtualenv', '.venv'])
+    execfile('.venv/bin/activate_this.py', {'__file__': '.venv/bin/activate_this.py'})
 
 
 def install_bundled_resources():
@@ -30,4 +34,4 @@
     Install the bundled resources libraries.
     """
     archives = glob('resources/python/*')
-    subprocess.check_call(['pip', 'install'] + archives)
+    subprocess.check_call(['.venv/bin/pip', 'install'] + archives)
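The new `setup_venv()` is idempotent: venv creation is skipped whenever `.venv/bin/python` already exists, so re-running the hook is safe. A sketch of the same guard pattern, returning the commands that would run instead of executing them so it can be exercised anywhere (function name and behavior are illustrative, not the charm's actual API):

```python
import os


def ensure_venv(charm_dir):
    """Return the commands needed to create charm_dir/.venv, or [] when the
    venv already exists -- mirroring the idempotent guard in setup_venv().
    Pass each returned argv to subprocess.check_call for real use."""
    venv_python = os.path.join(charm_dir, '.venv', 'bin', 'python')
    if os.path.exists(venv_python):
        return []  # already created on a previous hook run; nothing to do
    return [
        ['apt-get', 'install', '-yq', 'python-virtualenv'],
        ['virtualenv', os.path.join(charm_dir, '.venv')],
    ]
```

Note the charm's version still runs `activate_this.py` unconditionally, because the current process needs the venv on its path even when the venv already existed.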
=== modified file 'hooks/start'
--- hooks/start 2015-02-09 18:13:28 +0000
+++ hooks/start 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
=== modified file 'hooks/stop'
--- hooks/stop 2015-02-09 18:13:28 +0000
+++ hooks/stop 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
=== modified file 'resources.yaml'
--- resources.yaml 2015-06-25 22:43:49 +0000
+++ resources.yaml 2015-08-21 21:51:35 +0000
@@ -4,7 +4,7 @@
   pathlib:
     pypi: path.py>=7.0
   jujubigdata:
-    pypi: jujubigdata>=2.0.2,<3.0.0
+    pypi: jujubigdata>=4.0.0,<5.0.0
   charm-benchmark:
     pypi: charm-benchmark>=1.0.1,<2.0.0
   java-installer:
@@ -12,19 +12,19 @@
     # If replaced with an alternate implementation, it must output *only* two
     # lines containing the JAVA_HOME path, and the Java version, respectively,
     # on stdout. Upon error, it must exit with a non-zero exit code.
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150625222410-qfck64q46ubx5i6p/javainstaller.sh-20150311213053-4vq7369jhlvc6qy8-1/java-installer.sh
-    hash: 8fdff60270ea4be7bbef1e013e503057fe6efc2f4b5761edebc206a54f303023
+    url: https://git.launchpad.net/bigdata-data/plain/common/noarch/java-installer.sh?id=baa0b74b86587f97b446f255deb96c8420021dd8
+    hash: f7df6937bdb4dcc60de559252b4e6b65c77959f871c7ef2e59af57832d7ddfca
     hash_type: sha256
 optional_resources:
   hadoop-aarch64:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150303192631-swhrf8f7q82si75t/hadoop2.4.1.tar.gz-20150303192554-7gqslr4m8ahkwiax-2/hadoop-2.4.1.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/aarch64/hadoop-2.4.1.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: 03ad135835bfe413f85fe176259237a8
     hash_type: md5
   hadoop-ppc64le:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150130165209-nuz1myezjpdx7eus/hadoop2.4.1ppc64le.t-20150130165148-s8i19s002ht88gio-2/hadoop-2.4.1-ppc64le.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/ppc64le/hadoop-2.4.1-ppc64le.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: 09942b168a3db0d183b281477d3dae9deb7b7bc4b5783ba5cda3965b62e71bd5
     hash_type: sha256
   hadoop-x86_64:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/cory.johns%40canonical.com-20150116154822-x5osw3zfhw6e03b1/hadoop2.4.1.tar.gz-20150116154748-yfa2j12rr5m53xd3-1/hadoop-2.4.1.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/x86_64/hadoop-2.4.1.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
     hash_type: sha256
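Each resources.yaml entry pairs a URL with a `hash`/`hash_type`, and a fetched file is only accepted when its digest matches. A minimal sketch of that verification step (the function name is hypothetical; the real check is performed inside jujuresources):

```python
import hashlib


def verify_resource(data, expected_hash, hash_type='sha256'):
    """Check fetched bytes against a resources.yaml-style hash entry.

    hash_type maps directly to a hashlib algorithm name, matching the
    sha256/md5 values used in the file above.
    """
    digest = hashlib.new(hash_type, data).hexdigest()
    return digest == expected_hash
```

This is why updating a URL in resources.yaml (as this merge does for the git.launchpad.net moves) also requires updating the hash whenever the file contents changed.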
=== removed file 'resources/python/jujuresources-0.2.8.tar.gz'
Binary files resources/python/jujuresources-0.2.8.tar.gz 2015-06-25 15:41:48 +0000 and resources/python/jujuresources-0.2.8.tar.gz 1970-01-01 00:00:00 +0000 differ
=== added file 'resources/python/jujuresources-0.2.9.tar.gz'
Binary files resources/python/jujuresources-0.2.9.tar.gz 1970-01-01 00:00:00 +0000 and resources/python/jujuresources-0.2.9.tar.gz 2015-08-21 21:51:35 +0000 differ
=== modified file 'tests/remote/test_dist_config.py' (properties changed: -x to +x)
