Merge lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk into lp:charms/trusty/apache-hadoop-plugin
Proposed by Kevin W Monroe
Status: Merged
Merged at revision: 100
Proposed branch: lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk
Merge into: lp:charms/trusty/apache-hadoop-plugin
Diff against target: 538 lines (+160/-87) (has conflicts), 19 files modified:
DEV-README.md (+15/-0), README.md (+11/-5), actions/parseTerasort.py (+1/-1), actions/teragen (+4/-0), actions/terasort (+4/-1), dist.yaml (+0/-46), hooks/callbacks.py (+24/-11), hooks/common.py (+34/-7), hooks/config-changed (+1/-1), hooks/hadoop-plugin-relation-changed (+1/-1), hooks/hadoop-plugin-relation-departed (+15/-0), hooks/namenode-relation-changed (+1/-1), hooks/namenode-relation-departed (+16/-0), hooks/resourcemanager-relation-changed (+1/-1), hooks/resourcemanager-relation-departed (+16/-0), hooks/setup.py (+8/-4), hooks/start (+1/-1), hooks/stop (+1/-1), resources.yaml (+6/-6)
Text conflicts in DEV-README.md and README.md
To merge this branch: bzr merge lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk
Related bugs:
Reviewer: Kevin W Monroe (Approve)
Review via email: mp+268670@code.launchpad.net
Commit message
Description of the change
Revision 115, by Cory Johns: Fixed permissions on test_dist_config.py
Preview Diff
```diff
=== modified file 'DEV-README.md'
--- DEV-README.md	2015-06-29 14:23:33 +0000
+++ DEV-README.md	2015-08-21 21:51:35 +0000
@@ -27,11 +27,19 @@
 
 Additionally, the `JAVA_HOME`, `HADOOP_HOME`, `HADOOP_CONF_DIR`, and other
 environment variables will be set via `/etc/environment`. This includes putting
+<<<<<<< TREE
 the Hadoop bin and sbin directories on the `PATH`. There are
 [helpers](http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/files/head:/common/noarch/)
 in `charmhelpers.contrib.bigdata.utils` to assist with using the environment
 file. For example, to run the `hdfs` command to create a directory as the
 `ubuntu` user:
+=======
+the Hadoop bin and sbin directories on the `PATH`. There are
+[helpers](https://git.launchpad.net/bigdata-data/tree/common/noarch)
+in `charmhelpers.contrib.bigdata.utils` to assist with using the environment
+file. For example, to run the `hdfs` command to create a directory as the
+`ubuntu` user:
+>>>>>>> MERGE-SOURCE
 
     from charmhelpers.contrib.bigdata.utils import run_as
     run_as('ubuntu', 'hdfs', 'dfs', '-mkdir', '-p', '/home/ubuntu/foo')
@@ -102,10 +110,17 @@
 
 ## Manual Deployment
 
+<<<<<<< TREE
 The easiest way to deploy the core Apache Hadoop platform is to use one of
 the [apache bundles](https://jujucharms.com/u/bigdata-charmers/#bundles).
 However, to manually deploy the base Apache Hadoop platform without using one
 of the bundles, you can use the following:
+=======
+The easiest way to deploy an Apache Hadoop platform is to use one of
+the [apache bundles](https://jujucharms.com/u/bigdata-charmers/#bundles).
+However, to manually deploy the base Apache Hadoop platform without using one
+of the bundles, you can use the following:
+>>>>>>> MERGE-SOURCE
 
     juju deploy apache-hadoop-hdfs-master hdfs-master
     juju deploy apache-hadoop-hdfs-secondary secondary-namenode
 
```
```diff
=== modified file 'README.md'
--- README.md	2015-06-18 17:12:11 +0000
+++ README.md	2015-08-21 21:51:35 +0000
@@ -22,7 +22,11 @@
 If you wanted to also wanted to be able to analyze your data using Apache Pig,
 you could deploy it and attach it to the same plugin:
 
+<<<<<<< TREE
     juju deploy cs:~bigdata-charmers/trusty/apache-pig pig
+=======
+    juju deploy apache-pig pig
+>>>>>>> MERGE-SOURCE
     juju add-relation plugin pig
 
 ## Benchmarking
@@ -92,17 +96,19 @@
 of these resources:
 
     sudo pip install jujuresources
-    juju resources fetch --all apache-hadoop-plugin/resources.yaml -d /tmp/resources
-    juju resources serve -d /tmp/resources
+    juju-resources fetch --all /path/to/resources.yaml -d /tmp/resources
+    juju-resources serve -d /tmp/resources
 
 This will fetch all of the resources needed by this charm and serve them via a
-simple HTTP server. You can then set the `resources_mirror` config option to
-have the charm use this server for retrieving resources.
+simple HTTP server. The output from `juju-resources serve` will give you a
+URL that you can set as the `resources_mirror` config option for this charm.
+Setting this option will cause all resources required by this charm to be
+downloaded from the configured URL.
 
 You can fetch the resources for all of the Apache Hadoop charms
 (`apache-hadoop-hdfs-master`, `apache-hadoop-yarn-master`,
 `apache-hadoop-compute-slave`, `apache-hadoop-plugin`, etc) into a single
-directory and serve them all with a single `juju resources serve` instance.
+directory and serve them all with a single `juju-resources serve` instance.
 
 
 ## Contact Information
 
```
```diff
=== modified file 'actions/parseTerasort.py'
--- actions/parseTerasort.py	2015-06-08 20:05:37 +0000
+++ actions/parseTerasort.py	2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 """
 Simple script to parse cassandra-stress' transaction results
 and reformat them as JSON for sending back to juju
 
```
```diff
=== modified file 'actions/teragen'
--- actions/teragen	2015-06-08 21:32:27 +0000
+++ actions/teragen	2015-08-21 21:51:35 +0000
@@ -3,6 +3,10 @@
 SIZE=`action-get size`
 IN_DIR=`action-get indir`
 
+# The plugin uses a venv to avoid conflicting with system bits; ensure our
+# venv is activated.
+. ${CHARM_DIR}/.venv/bin/activate
+
 benchmark-start
 
 # NB: Escaped vars in the block below (e.g., \${HADOOP_HOME}) come from
 
```
```diff
=== modified file 'actions/terasort'
--- actions/terasort	2015-06-08 21:32:27 +0000
+++ actions/terasort	2015-08-21 21:51:35 +0000
@@ -25,6 +25,9 @@
 # from this outer scope
 su ubuntu << EOF
 . /etc/environment
+# The plugin uses a venv to avoid conflicting with system bits; ensure our
+# venv is activated.
+. ${CHARM_DIR}/.venv/bin/activate
 
 mkdir -p /opt/terasort/results/$run
 
@@ -45,4 +48,4 @@
 EOF
 PATH=$OLDPATH
 
-`cat /opt/terasort/results/$run/terasort.log | python $CHARM_DIR/actions/parseTerasort.py`
+`cat /opt/terasort/results/$run/terasort.log | $CHARM_DIR/.venv/bin/python $CHARM_DIR/actions/parseTerasort.py`
 
```
```diff
=== modified file 'dist.yaml'
--- dist.yaml	2015-04-16 15:45:57 +0000
+++ dist.yaml	2015-08-21 21:51:35 +0000
@@ -68,49 +68,3 @@
     owner: 'hdfs'
     group: 'hadoop'
     perms: 0775
-ports:
-  # Ports that need to be exposed, overridden, or manually specified.
-  # Only expose ports serving a UI or external API (i.e., namenode and
-  # resourcemanager). Communication among units within the cluster does
-  # not need ports to be explicitly opened.
-  # If adding a port here, you will need to update
-  # charmhelpers.contrib.bigdata.handlers.apache or hooks/callbacks.py
-  # to ensure that it is supported.
-  namenode:
-    port: 8020
-    exposed_on: 'hdfs-master'
-  nn_webapp_http:
-    port: 50070
-    exposed_on: 'hdfs-master'
-  dn_webapp_http:
-    port: 50075
-    exposed_on: 'compute-slave'
-  resourcemanager:
-    port: 8032
-    exposed_on: 'yarn-master'
-  rm_webapp_http:
-    port: 8088
-    exposed_on: 'yarn-master'
-  rm_log:
-    port: 19888
-  nm_webapp_http:
-    port: 8042
-    exposed_on: 'compute-slave'
-  jobhistory:
-    port: 10020
-  jh_webapp_http:
-    port: 19888
-    exposed_on: 'yarn-master'
-  # TODO: support SSL
-  #nn_webapp_https:
-  #  port: 50470
-  #  exposed_on: 'hdfs-master'
-  #dn_webapp_https:
-  #  port: 50475
-  #  exposed_on: 'compute-slave'
-  #rm_webapp_https:
-  #  port: 8090
-  #  exposed_on: 'yarn-master'
-  #nm_webapp_https:
-  #  port: 8044
-  #  exposed_on: 'compute-slave'
 
```
```diff
=== modified file 'hooks/callbacks.py'
--- hooks/callbacks.py	2015-06-25 15:41:48 +0000
+++ hooks/callbacks.py	2015-08-21 21:51:35 +0000
@@ -24,28 +24,28 @@
 def update_blocked_status():
     if unitdata.kv().get('charm.active', False):
         return
-    rels = (
-        ('Yarn', 'ResourceManager', ResourceManager()),
+    rels = [
         ('HDFS', 'NameNode', NameNode()),
-    )
+    ]
     missing_rel = [rel for rel, res, impl in rels if not impl.connected_units()]
-    missing_hosts = [rel for rel, res, impl in rels if not impl.am_i_registered()]
-    not_ready = [(rel, res) for rel, res, impl in rels if not impl.is_ready()]
+    rels.append(('Yarn', 'ResourceManager', ResourceManager()))
+    not_ready = [(rel, res) for rel, res, impl in rels if impl.connected_units() and not impl.is_ready()]
+    missing_hosts = [rel for rel, res, impl in rels if impl.connected_units() and not impl.am_i_registered()]
     if missing_rel:
         hookenv.status_set('blocked', 'Waiting for relation to %s master%s' % (
             ' and '.join(missing_rel),
             's' if len(missing_rel) > 1 else '',
         )),
-    elif missing_hosts:
-        hookenv.status_set('waiting', 'Waiting for /etc/hosts registration on %s' % (
-            ' and '.join(missing_hosts),
-        ))
     elif not_ready:
         unready_rels, unready_ress = zip(*not_ready)
         hookenv.status_set('waiting', 'Waiting for %s to provide %s' % (
             ' and '.join(unready_rels),
             ' and '.join(unready_ress),
         ))
+    elif missing_hosts:
+        hookenv.status_set('waiting', 'Waiting for /etc/hosts registration on %s' % (
+            ' and '.join(missing_hosts),
+        ))
 
 
 def update_working_status():
@@ -56,5 +56,18 @@
 
 
 def update_active_status():
-    unitdata.kv().set('charm.active', True)
-    hookenv.status_set('active', 'Ready')
+    hdfs_ready = NameNode().is_ready()
+    yarn_connected = ResourceManager().connected_units()
+    yarn_ready = ResourceManager().is_ready()
+    if hdfs_ready and (not yarn_connected or yarn_ready):
+        unitdata.kv().set('charm.active', True)
+        hookenv.status_set('active', 'Ready%s' % (
+            '' if yarn_ready else ' (HDFS only)'
+        ))
+    else:
+        clear_active_flag()
+        update_blocked_status()
+
+
+def clear_active_flag():
+    unitdata.kv().set('charm.active', False)
 
```
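The reordered logic in this hunk only treats a relation as unready or unregistered once units are actually connected, and makes Yarn optional (the charm can go active with HDFS alone). A minimal self-contained sketch of that decision flow, where `Relation` and `blocked_status` are illustrative stand-ins rather than the charm's real `NameNode`/`ResourceManager` classes:

```python
# Hypothetical stand-in for the charm's relation helpers; only the
# readiness checks used by the status logic are modeled.
class Relation:
    def __init__(self, connected, ready, registered=True):
        self._connected, self._ready, self._registered = connected, ready, registered

    def connected_units(self):
        return self._connected

    def is_ready(self):
        return self._ready

    def am_i_registered(self):
        return self._registered


def blocked_status(hdfs, yarn):
    # HDFS is required, so a missing relation blocks the charm;
    # Yarn is appended afterwards so it is only checked for readiness
    # and /etc/hosts registration once units are connected.
    rels = [('HDFS', 'NameNode', hdfs)]
    missing_rel = [rel for rel, res, impl in rels if not impl.connected_units()]
    rels.append(('Yarn', 'ResourceManager', yarn))
    not_ready = [(rel, res) for rel, res, impl in rels
                 if impl.connected_units() and not impl.is_ready()]
    missing_hosts = [rel for rel, res, impl in rels
                     if impl.connected_units() and not impl.am_i_registered()]
    if missing_rel:
        return ('blocked', 'Waiting for relation to %s master' % ' and '.join(missing_rel))
    elif not_ready:
        return ('waiting', 'Waiting for %s' % ' and '.join(r for r, _ in not_ready))
    elif missing_hosts:
        return ('waiting', 'Waiting for /etc/hosts registration on %s' % ' and '.join(missing_hosts))
    return ('active', 'Ready')
```

With no HDFS units the result is `blocked`; with HDFS ready and Yarn absent it is `active`, matching the "HDFS only" behavior the diff introduces.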
```diff
=== modified file 'hooks/common.py'
--- hooks/common.py	2015-06-25 15:41:48 +0000
+++ hooks/common.py	2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
@@ -14,12 +14,21 @@
 Common implementation for all hooks.
 """
 
+import os
 import jujuresources
 from charmhelpers.core import hookenv
 from charmhelpers.core import unitdata
 from charmhelpers.core import charmframework
 
 
+# ensure that the venv is used for installing resources
+# (venv is used to ensure library versions needed by plugin
+# don't conflict with the charm the plugin is supporting)
+os.environ['PATH'] = os.pathsep.join([
+    os.path.join(hookenv.charm_dir(), '.venv/bin'),
+    os.environ['PATH']])
+
+
 def bootstrap_resources():
     """
     Install required resources defined in resources.yaml
@@ -51,7 +60,7 @@
 
     # list of keys required to be in the dist.yaml
     client_reqs = ['vendor', 'hadoop_version', 'packages', 'groups', 'users',
-                   'dirs', 'ports']
+                   'dirs']
     dist_config = jujubigdata.utils.DistConfig(filename='dist.yaml',
                                                required_keys=client_reqs)
     hadoop = jujubigdata.handlers.HadoopBase(dist_config)
@@ -71,26 +80,44 @@
         ],
     },
     {
-        'name': 'plugin',
+        'name': 'hdfs',
         'provides': [
             jujubigdata.relations.HadoopPlugin(),
         ],
         'requires': [
             hadoop.is_installed,
+            hdfs_relation,
+        ],
+        'callbacks': [
+            callbacks.update_working_status,
+            hdfs_relation.register_provided_hosts,
+            jujubigdata.utils.manage_etc_hosts,
+            hdfs.configure_client,
+            callbacks.update_active_status,
+        ],
+        'cleanup': [
+            callbacks.clear_active_flag,
+            callbacks.update_blocked_status,
+        ],
+    },
+    {
+        'name': 'yarn',
+        'provides': [],
+        'requires': [
+            hadoop.is_installed,
             yarn_relation,
-            hdfs_relation,
         ],
         'callbacks': [
             callbacks.update_working_status,
             yarn_relation.register_provided_hosts,
-            hdfs_relation.register_provided_hosts,
             jujubigdata.utils.manage_etc_hosts,
             yarn.install_demo,
            yarn.configure_client,
-            hdfs.configure_client,
             callbacks.update_active_status,
         ],
-        'cleanup': [],
+        'cleanup': [
+            callbacks.update_blocked_status,
+        ],
     },
 ])
 manager.manage()
 
```
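The `PATH` manipulation added to hooks/common.py makes every subprocess a hook spawns resolve `python` and `pip` to the venv copies before the system ones. The trick can be isolated as a small helper; this is a sketch, and the charm directory shown is a hypothetical example path:

```python
import os

def venv_path(charm_dir, current_path):
    """Return a PATH with <charm_dir>/.venv/bin prepended, so venv
    binaries shadow system ones (mirrors the hooks/common.py change)."""
    venv_bin = os.path.join(charm_dir, '.venv/bin')
    return os.pathsep.join([venv_bin, current_path])

# Example with an illustrative charm directory, not a real deployment:
new_path = venv_path('/var/lib/juju/agents/unit-plugin-0/charm',
                     '/usr/local/bin:/usr/bin')
```

Because the venv directory comes first, lookups hit `.venv/bin` before any system location, which is the whole point of the change.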
```diff
=== modified file 'hooks/config-changed'
--- hooks/config-changed	2015-02-09 18:13:28 +0000
+++ hooks/config-changed	2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
 
```
```diff
=== modified file 'hooks/hadoop-plugin-relation-changed'
--- hooks/hadoop-plugin-relation-changed	2015-04-28 13:38:46 +0000
+++ hooks/hadoop-plugin-relation-changed	2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
 
```
```diff
=== added file 'hooks/hadoop-plugin-relation-departed'
--- hooks/hadoop-plugin-relation-departed	1970-01-01 00:00:00 +0000
+++ hooks/hadoop-plugin-relation-departed	2015-08-21 21:51:35 +0000
@@ -0,0 +1,15 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+common.manage()
```
```diff
=== modified file 'hooks/namenode-relation-changed'
--- hooks/namenode-relation-changed	2015-05-07 15:27:21 +0000
+++ hooks/namenode-relation-changed	2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
 
```
```diff
=== added file 'hooks/namenode-relation-departed'
--- hooks/namenode-relation-departed	1970-01-01 00:00:00 +0000
+++ hooks/namenode-relation-departed	2015-08-21 21:51:35 +0000
@@ -0,0 +1,16 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+
+common.manage()
```
```diff
=== modified file 'hooks/resourcemanager-relation-changed'
--- hooks/resourcemanager-relation-changed	2015-05-07 15:11:29 +0000
+++ hooks/resourcemanager-relation-changed	2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
 
```
```diff
=== added file 'hooks/resourcemanager-relation-departed'
--- hooks/resourcemanager-relation-departed	1970-01-01 00:00:00 +0000
+++ hooks/resourcemanager-relation-departed	2015-08-21 21:51:35 +0000
@@ -0,0 +1,16 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+
+common.manage()
```
```diff
=== modified file 'hooks/setup.py'
--- hooks/setup.py	2015-06-25 15:41:48 +0000
+++ hooks/setup.py	2015-08-21 21:51:35 +0000
@@ -9,6 +9,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+import os
 import subprocess
 from glob import glob
 
@@ -17,12 +18,15 @@
     """
     Do any setup required before the install hook.
     """
-    install_pip()
+    setup_venv()
     install_bundled_resources()
 
 
-def install_pip():
-    subprocess.check_call(['apt-get', 'install', '-yq', 'python-pip', 'bzr'])
+def setup_venv():
+    if not os.path.exists('.venv/bin/python'):
+        subprocess.check_call(['apt-get', 'install', '-yq', 'python-virtualenv'])
+        subprocess.check_call(['virtualenv', '.venv'])
+    execfile('.venv/bin/activate_this.py', {'__file__': '.venv/bin/activate_this.py'})
 
 
 def install_bundled_resources():
@@ -30,4 +34,4 @@
     Install the bundled resources libraries.
     """
     archives = glob('resources/python/*')
-    subprocess.check_call(['pip', 'install'] + archives)
+    subprocess.check_call(['.venv/bin/pip', 'install'] + archives)
 
```
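The new install flow in hooks/setup.py is idempotent: the venv is created only when its interpreter is missing, and the bundled archives are always installed with the venv's own pip so nothing touches system site-packages. A pure sketch that models the commands without executing them (`bootstrap_commands` is an illustrative helper, not part of the charm):

```python
def bootstrap_commands(venv_exists, archives):
    """Return the command lines the install hook would run (sketch only;
    nothing is executed)."""
    cmds = []
    if not venv_exists:
        # First boot: install virtualenv and create the charm-local .venv.
        cmds.append(['apt-get', 'install', '-yq', 'python-virtualenv'])
        cmds.append(['virtualenv', '.venv'])
    # Always (re)install the bundled resources with the venv's pip.
    cmds.append(['.venv/bin/pip', 'install'] + list(archives))
    return cmds
```

On a re-run with an existing venv only the pip step remains, which is what makes repeated hook invocations safe.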
```diff
=== modified file 'hooks/start'
--- hooks/start	2015-02-09 18:13:28 +0000
+++ hooks/start	2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
 
```
```diff
=== modified file 'hooks/stop'
--- hooks/stop	2015-02-09 18:13:28 +0000
+++ hooks/stop	2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
 
```
```diff
=== modified file 'resources.yaml'
--- resources.yaml	2015-06-25 22:43:49 +0000
+++ resources.yaml	2015-08-21 21:51:35 +0000
@@ -4,7 +4,7 @@
   pathlib:
     pypi: path.py>=7.0
   jujubigdata:
-    pypi: jujubigdata>=2.0.2,<3.0.0
+    pypi: jujubigdata>=4.0.0,<5.0.0
   charm-benchmark:
     pypi: charm-benchmark>=1.0.1,<2.0.0
   java-installer:
@@ -12,19 +12,19 @@
     # If replaced with an alternate implementation, it must output *only* two
     # lines containing the JAVA_HOME path, and the Java version, respectively,
     # on stdout. Upon error, it must exit with a non-zero exit code.
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150625222410-qfck64q46ubx5i6p/javainstaller.sh-20150311213053-4vq7369jhlvc6qy8-1/java-installer.sh
-    hash: 8fdff60270ea4be7bbef1e013e503057fe6efc2f4b5761edebc206a54f303023
+    url: https://git.launchpad.net/bigdata-data/plain/common/noarch/java-installer.sh?id=baa0b74b86587f97b446f255deb96c8420021dd8
+    hash: f7df6937bdb4dcc60de559252b4e6b65c77959f871c7ef2e59af57832d7ddfca
     hash_type: sha256
 optional_resources:
   hadoop-aarch64:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150303192631-swhrf8f7q82si75t/hadoop2.4.1.tar.gz-20150303192554-7gqslr4m8ahkwiax-2/hadoop-2.4.1.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/aarch64/hadoop-2.4.1.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: 03ad135835bfe413f85fe176259237a8
     hash_type: md5
   hadoop-ppc64le:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150130165209-nuz1myezjpdx7eus/hadoop2.4.1ppc64le.t-20150130165148-s8i19s002ht88gio-2/hadoop-2.4.1-ppc64le.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/ppc64le/hadoop-2.4.1-ppc64le.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: 09942b168a3db0d183b281477d3dae9deb7b7bc4b5783ba5cda3965b62e71bd5
     hash_type: sha256
   hadoop-x86_64:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/cory.johns%40canonical.com-20150116154822-x5osw3zfhw6e03b1/hadoop2.4.1.tar.gz-20150116154748-yfa2j12rr5m53xd3-1/hadoop-2.4.1.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/x86_64/hadoop-2.4.1.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
     hash_type: sha256
 
```
```diff
=== removed file 'resources/python/jujuresources-0.2.8.tar.gz'
Binary files resources/python/jujuresources-0.2.8.tar.gz 2015-06-25 15:41:48 +0000 and resources/python/jujuresources-0.2.8.tar.gz 1970-01-01 00:00:00 +0000 differ
=== added file 'resources/python/jujuresources-0.2.9.tar.gz'
Binary files resources/python/jujuresources-0.2.9.tar.gz 1970-01-01 00:00:00 +0000 and resources/python/jujuresources-0.2.9.tar.gz 2015-08-21 21:51:35 +0000 differ
=== modified file 'tests/remote/test_dist_config.py' (properties changed: -x to +x)
```
Realtime syslog analytics bundle test looked good. Merged.