Merge lp:~bigdata-dev/charms/trusty/apache-zeppelin/trunk into lp:charms/trusty/apache-zeppelin

Proposed by Kevin W Monroe
Status: Merged
Merged at revision: 17
Proposed branch: lp:~bigdata-dev/charms/trusty/apache-zeppelin/trunk
Merge into: lp:charms/trusty/apache-zeppelin
Diff against target: 71 lines (+16/-15)
3 files modified
README.md (+7/-7)
hooks/callbacks.py (+8/-7)
resources.yaml (+1/-1)
To merge this branch: bzr merge lp:~bigdata-dev/charms/trusty/apache-zeppelin/trunk
Reviewer: Kevin W Monroe
Review status: Approve
Review via email: mp+268675@code.launchpad.net
27. By Cory Johns

Fixed permissions on test_dist_config.py
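Bazaar records the executable bit as a file property (the diff below notes `properties changed: -x to +x` for `tests/remote/test_dist_config.py`). A minimal sketch of the equivalent fix in a scratch directory (the `touch`ed file is a stand-in for the real test script):

```shell
# Reproduce the revision-27 permissions fix: mark the remote test executable.
cd "$(mktemp -d)"
mkdir -p tests/remote
touch tests/remote/test_dist_config.py   # stand-in for the real file
chmod +x tests/remote/test_dist_config.py
ls -l tests/remote/test_dist_config.py   # mode now shows the x bit
```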

Revision history for this message
Kevin W Monroe (kwmonroe) wrote :

Realtime syslog analytics bundle test looked good. Merged.

review: Approve

Preview Diff

=== modified file 'README.md'
--- README.md	2015-07-24 16:38:17 +0000
+++ README.md	2015-08-21 21:52:12 +0000
@@ -14,15 +14,15 @@
 
 ## Usage
 
-This charm leverages our pluggable Hadoop model with the `hadoop-plugin`
-interface. This means that you will need to deploy a base Apache Hadoop cluster
-to run Spark. The suggested deployment method is to use the
-[apache-hadoop-spark-zeppelin](https://jujucharms.com/u/bigdata-dev/apache-hadoop-spark-zeppelin/)
-bundle. This will deploy the Apache Hadoop platform with a single Apache Spark
-unit that communicates with the cluster by relating to the
+This is a subordinate charm that requires the `apache-spark` interface. This
+means that you will need to deploy a base Apache Spark cluster to use
+Zeppelin. An easy way to deploy the recommended environment is to use the
+[apache-hadoop-spark-zeppelin](https://jujucharms.com/apache-hadoop-spark-zeppelin)
+bundle. This will deploy the Apache Hadoop platform with an Apache Spark +
+Zeppelin unit that communicates with the cluster by relating to the
 `apache-hadoop-plugin` subordinate charm:
 
-    juju-quickstart u/bigdata-dev/apache-hadoop-spark-zeppelin
+    juju-quickstart apache-hadoop-spark-zeppelin
 
 Alternatively, you may manually deploy the recommended environment as follows:
 
=== modified file 'hooks/callbacks.py'
--- hooks/callbacks.py	2015-07-24 16:38:17 +0000
+++ hooks/callbacks.py	2015-08-21 21:52:12 +0000
@@ -64,17 +64,18 @@
 
     def setup_zeppelin_config(self):
         '''
-        copy Zeppelin's default configuration files to zeppelin_conf property defined
-        in dist.yaml
+        copy the default configuration files to zeppelin_conf property
+        defined in dist.yaml
         '''
-        conf_dir = self.dist_config.path('zeppelin') / 'conf'
-        self.dist_config.path('zeppelin_conf').rmtree_p()
-        conf_dir.copytree(self.dist_config.path('zeppelin_conf'))
+        default_conf = self.dist_config.path('zeppelin') / 'conf'
+        zeppelin_conf = self.dist_config.path('zeppelin_conf')
+        zeppelin_conf.rmtree_p()
+        default_conf.copytree(zeppelin_conf)
         # Now remove the conf included in the tarball and symlink our real conf
         # dir. we've seen issues where zepp doesn't honor ZEPPELIN_CONF_DIR
         # and instead looks for config in ZEPPELIN_HOME/conf.
-        conf_dir.rmtree_p()
-        self.dist_config.path('zeppelin_conf').symlink(conf_dir)
+        default_conf.rmtree_p()
+        zeppelin_conf.symlink(default_conf)
 
         zeppelin_env = self.dist_config.path('zeppelin_conf') / 'zeppelin-env.sh'
         if not zeppelin_env.exists():
 
=== modified file 'resources.yaml'
--- resources.yaml	2015-07-24 16:38:17 +0000
+++ resources.yaml	2015-08-21 21:52:12 +0000
@@ -4,7 +4,7 @@
   pathlib:
     pypi: path.py>=7.0
   jujubigdata:
-    pypi: jujubigdata>=2.1.0,<3.0.0
+    pypi: jujubigdata>=4.0.0,<5.0.0
 optional_resources:
   zeppelin-ppc64le:
     url: https://git.launchpad.net/bigdata-data/plain/apache/ppc64le/zeppelin-0.5.0-SNAPSHOT.tar.gz?id=c34a21c939f5fce9ab89b95d65fe2df50e7bbab0
 
=== modified file 'tests/remote/test_dist_config.py' (properties changed: -x to +x)
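The `setup_zeppelin_config` refactor above keeps the charm's behavior: copy the tarball's default conf into the managed `zeppelin_conf` directory, delete the tarball copy, and symlink the managed directory back into `ZEPPELIN_HOME/conf` so Zeppelin finds it even when it ignores `ZEPPELIN_CONF_DIR`. A minimal stdlib sketch of that copy-then-symlink pattern (the `link_real_conf` helper and the demo paths are hypothetical; the charm itself uses the `path.py` methods `rmtree_p`, `copytree`, and `symlink`):

```python
import os
import shutil
import tempfile

def link_real_conf(zeppelin_home, zeppelin_conf):
    """Copy the default conf shipped in the tarball to the managed conf
    dir, drop the tarball copy, and symlink the managed dir back into
    ZEPPELIN_HOME so config is found via ZEPPELIN_HOME/conf."""
    default_conf = os.path.join(zeppelin_home, 'conf')
    # stdlib equivalent of path.py's rmtree_p: remove if present
    shutil.rmtree(zeppelin_conf, ignore_errors=True)
    shutil.copytree(default_conf, zeppelin_conf)
    shutil.rmtree(default_conf, ignore_errors=True)
    os.symlink(zeppelin_conf, default_conf)

# Demo in a scratch directory with a fake tarball layout.
root = tempfile.mkdtemp()
home = os.path.join(root, 'zeppelin')
os.makedirs(os.path.join(home, 'conf'))
open(os.path.join(home, 'conf', 'zeppelin-env.sh'), 'w').close()

managed = os.path.join(root, 'etc', 'zeppelin')
link_real_conf(home, managed)

print(os.path.islink(os.path.join(home, 'conf')))  # ZEPPELIN_HOME/conf is now a symlink
```

The symlink is the key design choice: edits made under the managed conf dir are visible through `ZEPPELIN_HOME/conf`, sidestepping the `ZEPPELIN_CONF_DIR` issue the diff comment describes.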
