Merge lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk into lp:charms/trusty/apache-hadoop-plugin

Proposed by Kevin W Monroe
Status: Merged
Merged at revision: 100
Proposed branch: lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk
Merge into: lp:charms/trusty/apache-hadoop-plugin
Diff against target: 538 lines (+160/-87) (has conflicts)
19 files modified
DEV-README.md (+15/-0)
README.md (+11/-5)
actions/parseTerasort.py (+1/-1)
actions/teragen (+4/-0)
actions/terasort (+4/-1)
dist.yaml (+0/-46)
hooks/callbacks.py (+24/-11)
hooks/common.py (+34/-7)
hooks/config-changed (+1/-1)
hooks/hadoop-plugin-relation-changed (+1/-1)
hooks/hadoop-plugin-relation-departed (+15/-0)
hooks/namenode-relation-changed (+1/-1)
hooks/namenode-relation-departed (+16/-0)
hooks/resourcemanager-relation-changed (+1/-1)
hooks/resourcemanager-relation-departed (+16/-0)
hooks/setup.py (+8/-4)
hooks/start (+1/-1)
hooks/stop (+1/-1)
resources.yaml (+6/-6)
Text conflict in DEV-README.md
Text conflict in README.md
To merge this branch: bzr merge lp:~bigdata-dev/charms/trusty/apache-hadoop-plugin/trunk
Reviewer: Kevin W Monroe
Review status: Approve
Review via email: mp+268670@code.launchpad.net
115. By Cory Johns

Fixed permissions on test_dist_config.py

Revision history for this message
Kevin W Monroe (kwmonroe) wrote:

Realtime syslog analytics bundle test looked good. Merged.

review: Approve

Preview Diff

=== modified file 'DEV-README.md'
--- DEV-README.md 2015-06-29 14:23:33 +0000
+++ DEV-README.md 2015-08-21 21:51:35 +0000
@@ -27,11 +27,19 @@

 Additionally, the `JAVA_HOME`, `HADOOP_HOME`, `HADOOP_CONF_DIR`, and other
 environment variables will be set via `/etc/environment`. This includes putting
+<<<<<<< TREE
 the Hadoop bin and sbin directories on the `PATH`. There are
 [helpers](http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/files/head:/common/noarch/)
 in `charmhelpers.contrib.bigdata.utils` to assist with using the environment
 file. For example, to run the `hdfs` command to create a directory as the
 `ubuntu` user:
+=======
+the Hadoop bin and sbin directories on the `PATH`. There are
+[helpers](https://git.launchpad.net/bigdata-data/tree/common/noarch)
+in `charmhelpers.contrib.bigdata.utils` to assist with using the environment
+file. For example, to run the `hdfs` command to create a directory as the
+`ubuntu` user:
+>>>>>>> MERGE-SOURCE

     from charmhelpers.contrib.bigdata.utils import run_as
     run_as('ubuntu', 'hdfs', 'dfs', '-mkdir', '-p', '/home/ubuntu/foo')
@@ -102,10 +110,17 @@

 ## Manual Deployment

+<<<<<<< TREE
 The easiest way to deploy the core Apache Hadoop platform is to use one of
 the [apache bundles](https://jujucharms.com/u/bigdata-charmers/#bundles).
 However, to manually deploy the base Apache Hadoop platform without using one
 of the bundles, you can use the following:
+=======
+The easiest way to deploy an Apache Hadoop platform is to use one of
+the [apache bundles](https://jujucharms.com/u/bigdata-charmers/#bundles).
+However, to manually deploy the base Apache Hadoop platform without using one
+of the bundles, you can use the following:
+>>>>>>> MERGE-SOURCE

     juju deploy apache-hadoop-hdfs-master hdfs-master
     juju deploy apache-hadoop-hdfs-secondary secondary-namenode

=== modified file 'README.md'
--- README.md 2015-06-18 17:12:11 +0000
+++ README.md 2015-08-21 21:51:35 +0000
@@ -22,7 +22,11 @@
 If you wanted to also wanted to be able to analyze your data using Apache Pig,
 you could deploy it and attach it to the same plugin:

+<<<<<<< TREE
     juju deploy cs:~bigdata-charmers/trusty/apache-pig pig
+=======
+    juju deploy apache-pig pig
+>>>>>>> MERGE-SOURCE
     juju add-relation plugin pig

 ## Benchmarking
@@ -92,17 +96,19 @@
 of these resources:

     sudo pip install jujuresources
-    juju resources fetch --all apache-hadoop-plugin/resources.yaml -d /tmp/resources
-    juju resources serve -d /tmp/resources
+    juju-resources fetch --all /path/to/resources.yaml -d /tmp/resources
+    juju-resources serve -d /tmp/resources

 This will fetch all of the resources needed by this charm and serve them via a
-simple HTTP server. You can then set the `resources_mirror` config option to
-have the charm use this server for retrieving resources.
+simple HTTP server. The output from `juju-resources serve` will give you a
+URL that you can set as the `resources_mirror` config option for this charm.
+Setting this option will cause all resources required by this charm to be
+downloaded from the configured URL.

 You can fetch the resources for all of the Apache Hadoop charms
 (`apache-hadoop-hdfs-master`, `apache-hadoop-yarn-master`,
 `apache-hadoop-compute-slave`, `apache-hadoop-plugin`, etc) into a single
-directory and serve them all with a single `juju resources serve` instance.
+directory and serve them all with a single `juju-resources serve` instance.


 ## Contact Information

=== modified file 'actions/parseTerasort.py'
--- actions/parseTerasort.py 2015-06-08 20:05:37 +0000
+++ actions/parseTerasort.py 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 """
 Simple script to parse cassandra-stress' transaction results
 and reformat them as JSON for sending back to juju

=== modified file 'actions/teragen'
--- actions/teragen 2015-06-08 21:32:27 +0000
+++ actions/teragen 2015-08-21 21:51:35 +0000
@@ -3,6 +3,10 @@
 SIZE=`action-get size`
 IN_DIR=`action-get indir`

+# The plugin uses a venv to avoid conflicting with system bits; ensure our
+# venv is activated.
+. ${CHARM_DIR}/.venv/bin/activate
+
 benchmark-start

 # NB: Escaped vars in the block below (e.g., \${HADOOP_HOME}) come from

=== modified file 'actions/terasort'
--- actions/terasort 2015-06-08 21:32:27 +0000
+++ actions/terasort 2015-08-21 21:51:35 +0000
@@ -25,6 +25,9 @@
 # from this outer scope
 su ubuntu << EOF
 . /etc/environment
+# The plugin uses a venv to avoid conflicting with system bits; ensure our
+# venv is activated.
+. ${CHARM_DIR}/.venv/bin/activate

 mkdir -p /opt/terasort/results/$run

@@ -45,4 +48,4 @@
 EOF
 PATH=$OLDPATH

-`cat /opt/terasort/results/$run/terasort.log | python $CHARM_DIR/actions/parseTerasort.py`
+`cat /opt/terasort/results/$run/terasort.log | $CHARM_DIR/.venv/bin/python $CHARM_DIR/actions/parseTerasort.py`

=== modified file 'dist.yaml'
--- dist.yaml 2015-04-16 15:45:57 +0000
+++ dist.yaml 2015-08-21 21:51:35 +0000
@@ -68,49 +68,3 @@
     owner: 'hdfs'
     group: 'hadoop'
     perms: 0775
-ports:
-  # Ports that need to be exposed, overridden, or manually specified.
-  # Only expose ports serving a UI or external API (i.e., namenode and
-  # resourcemanager). Communication among units within the cluster does
-  # not need ports to be explicitly opened.
-  # If adding a port here, you will need to update
-  # charmhelpers.contrib.bigdata.handlers.apache or hooks/callbacks.py
-  # to ensure that it is supported.
-  namenode:
-    port: 8020
-    exposed_on: 'hdfs-master'
-  nn_webapp_http:
-    port: 50070
-    exposed_on: 'hdfs-master'
-  dn_webapp_http:
-    port: 50075
-    exposed_on: 'compute-slave'
-  resourcemanager:
-    port: 8032
-    exposed_on: 'yarn-master'
-  rm_webapp_http:
-    port: 8088
-    exposed_on: 'yarn-master'
-  rm_log:
-    port: 19888
-  nm_webapp_http:
-    port: 8042
-    exposed_on: 'compute-slave'
-  jobhistory:
-    port: 10020
-  jh_webapp_http:
-    port: 19888
-    exposed_on: 'yarn-master'
-  # TODO: support SSL
-  #nn_webapp_https:
-  #  port: 50470
-  #  exposed_on: 'hdfs-master'
-  #dn_webapp_https:
-  #  port: 50475
-  #  exposed_on: 'compute-slave'
-  #rm_webapp_https:
-  #  port: 8090
-  #  exposed_on: 'yarn-master'
-  #nm_webapp_https:
-  #  port: 8044
-  #  exposed_on: 'compute-slave'

=== modified file 'hooks/callbacks.py'
--- hooks/callbacks.py 2015-06-25 15:41:48 +0000
+++ hooks/callbacks.py 2015-08-21 21:51:35 +0000
@@ -24,28 +24,28 @@
 def update_blocked_status():
     if unitdata.kv().get('charm.active', False):
         return
-    rels = (
-        ('Yarn', 'ResourceManager', ResourceManager()),
+    rels = [
         ('HDFS', 'NameNode', NameNode()),
-    )
+    ]
     missing_rel = [rel for rel, res, impl in rels if not impl.connected_units()]
-    missing_hosts = [rel for rel, res, impl in rels if not impl.am_i_registered()]
-    not_ready = [(rel, res) for rel, res, impl in rels if not impl.is_ready()]
+    rels.append(('Yarn', 'ResourceManager', ResourceManager()))
+    not_ready = [(rel, res) for rel, res, impl in rels if impl.connected_units() and not impl.is_ready()]
+    missing_hosts = [rel for rel, res, impl in rels if impl.connected_units() and not impl.am_i_registered()]
     if missing_rel:
         hookenv.status_set('blocked', 'Waiting for relation to %s master%s' % (
             ' and '.join(missing_rel),
             's' if len(missing_rel) > 1 else '',
         )),
-    elif missing_hosts:
-        hookenv.status_set('waiting', 'Waiting for /etc/hosts registration on %s' % (
-            ' and '.join(missing_hosts),
-        ))
     elif not_ready:
         unready_rels, unready_ress = zip(*not_ready)
         hookenv.status_set('waiting', 'Waiting for %s to provide %s' % (
             ' and '.join(unready_rels),
             ' and '.join(unready_ress),
         ))
+    elif missing_hosts:
+        hookenv.status_set('waiting', 'Waiting for /etc/hosts registration on %s' % (
+            ' and '.join(missing_hosts),
+        ))


 def update_working_status():
@@ -56,5 +56,18 @@


 def update_active_status():
-    unitdata.kv().set('charm.active', True)
-    hookenv.status_set('active', 'Ready')
+    hdfs_ready = NameNode().is_ready()
+    yarn_connected = ResourceManager().connected_units()
+    yarn_ready = ResourceManager().is_ready()
+    if hdfs_ready and (not yarn_connected or yarn_ready):
+        unitdata.kv().set('charm.active', True)
+        hookenv.status_set('active', 'Ready%s' % (
+            '' if yarn_ready else ' (HDFS only)'
+        ))
+    else:
+        clear_active_flag()
+        update_blocked_status()
+
+
+def clear_active_flag():
+    unitdata.kv().set('charm.active', False)

=== modified file 'hooks/common.py'
--- hooks/common.py 2015-06-25 15:41:48 +0000
+++ hooks/common.py 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
@@ -14,12 +14,21 @@
 Common implementation for all hooks.
 """

+import os
 import jujuresources
 from charmhelpers.core import hookenv
 from charmhelpers.core import unitdata
 from charmhelpers.core import charmframework


+# ensure that the venv is used for installing resources
+# (venv is used to ensure library versions needed by plugin
+# don't conflict with the charm the plugin is supporting)
+os.environ['PATH'] = os.pathsep.join([
+    os.path.join(hookenv.charm_dir(), '.venv/bin'),
+    os.environ['PATH']])
+
+
 def bootstrap_resources():
     """
     Install required resources defined in resources.yaml
@@ -51,7 +60,7 @@

     # list of keys required to be in the dist.yaml
     client_reqs = ['vendor', 'hadoop_version', 'packages', 'groups', 'users',
-                   'dirs', 'ports']
+                   'dirs']
     dist_config = jujubigdata.utils.DistConfig(filename='dist.yaml',
                                                required_keys=client_reqs)
     hadoop = jujubigdata.handlers.HadoopBase(dist_config)
@@ -71,26 +80,44 @@
         ],
     },
     {
-        'name': 'plugin',
+        'name': 'hdfs',
         'provides': [
             jujubigdata.relations.HadoopPlugin(),
         ],
         'requires': [
             hadoop.is_installed,
+            hdfs_relation,
+        ],
+        'callbacks': [
+            callbacks.update_working_status,
+            hdfs_relation.register_provided_hosts,
+            jujubigdata.utils.manage_etc_hosts,
+            hdfs.configure_client,
+            callbacks.update_active_status,
+        ],
+        'cleanup': [
+            callbacks.clear_active_flag,
+            callbacks.update_blocked_status,
+        ],
+    },
+    {
+        'name': 'yarn',
+        'provides': [],
+        'requires': [
+            hadoop.is_installed,
             yarn_relation,
-            hdfs_relation,
         ],
         'callbacks': [
             callbacks.update_working_status,
             yarn_relation.register_provided_hosts,
-            hdfs_relation.register_provided_hosts,
             jujubigdata.utils.manage_etc_hosts,
             yarn.install_demo,
             yarn.configure_client,
-            hdfs.configure_client,
             callbacks.update_active_status,
         ],
-        'cleanup': [],
+        'cleanup': [
+            callbacks.update_blocked_status,
+        ],
     },
 ])
 manager.manage()

=== modified file 'hooks/config-changed'
--- hooks/config-changed 2015-02-09 18:13:28 +0000
+++ hooks/config-changed 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at

=== modified file 'hooks/hadoop-plugin-relation-changed'
--- hooks/hadoop-plugin-relation-changed 2015-04-28 13:38:46 +0000
+++ hooks/hadoop-plugin-relation-changed 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at

=== added file 'hooks/hadoop-plugin-relation-departed'
--- hooks/hadoop-plugin-relation-departed 1970-01-01 00:00:00 +0000
+++ hooks/hadoop-plugin-relation-departed 2015-08-21 21:51:35 +0000
@@ -0,0 +1,15 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+common.manage()

=== modified file 'hooks/namenode-relation-changed'
--- hooks/namenode-relation-changed 2015-05-07 15:27:21 +0000
+++ hooks/namenode-relation-changed 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at

=== added file 'hooks/namenode-relation-departed'
--- hooks/namenode-relation-departed 1970-01-01 00:00:00 +0000
+++ hooks/namenode-relation-departed 2015-08-21 21:51:35 +0000
@@ -0,0 +1,16 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+
+common.manage()

=== modified file 'hooks/resourcemanager-relation-changed'
--- hooks/resourcemanager-relation-changed 2015-05-07 15:11:29 +0000
+++ hooks/resourcemanager-relation-changed 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at

=== added file 'hooks/resourcemanager-relation-departed'
--- hooks/resourcemanager-relation-departed 1970-01-01 00:00:00 +0000
+++ hooks/resourcemanager-relation-departed 2015-08-21 21:51:35 +0000
@@ -0,0 +1,16 @@
+#!.venv/bin/python
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import common
+
+common.manage()

=== modified file 'hooks/setup.py'
--- hooks/setup.py 2015-06-25 15:41:48 +0000
+++ hooks/setup.py 2015-08-21 21:51:35 +0000
@@ -9,6 +9,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+import os
 import subprocess
 from glob import glob

@@ -17,12 +18,15 @@
     """
     Do any setup required before the install hook.
     """
-    install_pip()
+    setup_venv()
     install_bundled_resources()


-def install_pip():
-    subprocess.check_call(['apt-get', 'install', '-yq', 'python-pip', 'bzr'])
+def setup_venv():
+    if not os.path.exists('.venv/bin/python'):
+        subprocess.check_call(['apt-get', 'install', '-yq', 'python-virtualenv'])
+        subprocess.check_call(['virtualenv', '.venv'])
+    execfile('.venv/bin/activate_this.py', {'__file__': '.venv/bin/activate_this.py'})


 def install_bundled_resources():
@@ -30,4 +34,4 @@
     Install the bundled resources libraries.
     """
     archives = glob('resources/python/*')
-    subprocess.check_call(['pip', 'install'] + archives)
+    subprocess.check_call(['.venv/bin/pip', 'install'] + archives)

=== modified file 'hooks/start'
--- hooks/start 2015-02-09 18:13:28 +0000
+++ hooks/start 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at

=== modified file 'hooks/stop'
--- hooks/stop 2015-02-09 18:13:28 +0000
+++ hooks/stop 2015-08-21 21:51:35 +0000
@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!.venv/bin/python
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at

=== modified file 'resources.yaml'
--- resources.yaml 2015-06-25 22:43:49 +0000
+++ resources.yaml 2015-08-21 21:51:35 +0000
@@ -4,7 +4,7 @@
   pathlib:
     pypi: path.py>=7.0
   jujubigdata:
-    pypi: jujubigdata>=2.0.2,<3.0.0
+    pypi: jujubigdata>=4.0.0,<5.0.0
   charm-benchmark:
     pypi: charm-benchmark>=1.0.1,<2.0.0
   java-installer:
@@ -12,19 +12,19 @@
     # If replaced with an alternate implementation, it must output *only* two
     # lines containing the JAVA_HOME path, and the Java version, respectively,
     # on stdout. Upon error, it must exit with a non-zero exit code.
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150625222410-qfck64q46ubx5i6p/javainstaller.sh-20150311213053-4vq7369jhlvc6qy8-1/java-installer.sh
-    hash: 8fdff60270ea4be7bbef1e013e503057fe6efc2f4b5761edebc206a54f303023
+    url: https://git.launchpad.net/bigdata-data/plain/common/noarch/java-installer.sh?id=baa0b74b86587f97b446f255deb96c8420021dd8
+    hash: f7df6937bdb4dcc60de559252b4e6b65c77959f871c7ef2e59af57832d7ddfca
     hash_type: sha256
 optional_resources:
   hadoop-aarch64:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150303192631-swhrf8f7q82si75t/hadoop2.4.1.tar.gz-20150303192554-7gqslr4m8ahkwiax-2/hadoop-2.4.1.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/aarch64/hadoop-2.4.1.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: 03ad135835bfe413f85fe176259237a8
     hash_type: md5
   hadoop-ppc64le:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/kevin.monroe%40canonical.com-20150130165209-nuz1myezjpdx7eus/hadoop2.4.1ppc64le.t-20150130165148-s8i19s002ht88gio-2/hadoop-2.4.1-ppc64le.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/ppc64le/hadoop-2.4.1-ppc64le.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: 09942b168a3db0d183b281477d3dae9deb7b7bc4b5783ba5cda3965b62e71bd5
     hash_type: sha256
   hadoop-x86_64:
-    url: http://bazaar.launchpad.net/~bigdata-dev/bigdata-data/trunk/download/cory.johns%40canonical.com-20150116154822-x5osw3zfhw6e03b1/hadoop2.4.1.tar.gz-20150116154748-yfa2j12rr5m53xd3-1/hadoop-2.4.1.tar.gz
+    url: https://git.launchpad.net/bigdata-data/plain/apache/x86_64/hadoop-2.4.1.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
     hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
     hash_type: sha256

=== removed file 'resources/python/jujuresources-0.2.8.tar.gz'
Binary files resources/python/jujuresources-0.2.8.tar.gz 2015-06-25 15:41:48 +0000 and resources/python/jujuresources-0.2.8.tar.gz 1970-01-01 00:00:00 +0000 differ
=== added file 'resources/python/jujuresources-0.2.9.tar.gz'
Binary files resources/python/jujuresources-0.2.9.tar.gz 1970-01-01 00:00:00 +0000 and resources/python/jujuresources-0.2.9.tar.gz 2015-08-21 21:51:35 +0000 differ
=== modified file 'tests/remote/test_dist_config.py' (properties changed: -x to +x)
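For reviewers of the venv changes in this diff: the bootstrap pattern introduced in hooks/setup.py (create `.venv` once) and hooks/common.py (prepend the venv's bin dir to `PATH`) can be sketched roughly as below. The helper names `ensure_venv` and `prepend_venv_path` are illustrative only and do not exist in the charm; this is a simplified standalone sketch, not the charm's actual code.

```python
import os
import subprocess


def ensure_venv(charm_dir):
    """Create the charm-local virtualenv if it is missing.

    Mirrors hooks/setup.py: the .venv isolates the plugin's Python
    dependencies from system packages (and from any charm this plugin
    is related to). setup.py also apt-installs python-virtualenv first.
    """
    venv_python = os.path.join(charm_dir, '.venv', 'bin', 'python')
    if not os.path.exists(venv_python):
        subprocess.check_call(['virtualenv', os.path.join(charm_dir, '.venv')])
    return venv_python


def prepend_venv_path(charm_dir, environ):
    """Put the venv's bin dir first on PATH, as hooks/common.py does,
    so child processes (e.g. jujuresources) resolve to venv binaries."""
    venv_bin = os.path.join(charm_dir, '.venv', 'bin')
    environ['PATH'] = os.pathsep.join([venv_bin, environ['PATH']])
    return environ
```

This is also why the hooks' shebangs change to `#!.venv/bin/python`: hooks run with the charm directory as the working directory, so the relative shebang picks up the venv interpreter directly.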
