Merge lp:~bigdata-dev/charms/trusty/apache-spark/trunk into lp:charms/trusty/apache-spark

Proposed by Kevin W Monroe
Status: Merged
Merged at revision: 34
Proposed branch: lp:~bigdata-dev/charms/trusty/apache-spark/trunk
Merge into: lp:charms/trusty/apache-spark
Diff against target: 79 lines (+15/-11)
3 files modified
README.md (+4/-4)
hooks/callbacks.py (+10/-6)
resources.yaml (+1/-1)
To merge this branch: bzr merge lp:~bigdata-dev/charms/trusty/apache-spark/trunk
Reviewer: Kevin W Monroe
Status: Approve
Review via email: mp+268673@code.launchpad.net
45. By Cory Johns

Fixed permissions on test_dist_config.py

Kevin W Monroe (kwmonroe) wrote:

Realtime syslog analytics bundle test looked good. Merged.

review: Approve

Preview Diff

=== modified file 'README.md'
--- README.md 2015-06-25 16:47:56 +0000
+++ README.md 2015-08-21 21:52:05 +0000
@@ -28,12 +28,12 @@
 This charm leverages our pluggable Hadoop model with the `hadoop-plugin`
 interface. This means that you will need to deploy a base Apache Hadoop cluster
 to run Spark. The suggested deployment method is to use the
-[apache-hadoop-spark](https://jujucharms.com/u/bigdata-dev/apache-hadoop-spark/)
+[apache-hadoop-spark](https://jujucharms.com/apache-hadoop-spark/)
 bundle. This will deploy the Apache Hadoop platform with a single Apache Spark
 unit that communicates with the cluster by relating to the
 `apache-hadoop-plugin` subordinate charm:

-    juju-quickstart u/bigdata-dev/apache-hadoop-spark
+    juju-quickstart apache-hadoop-spark

 Alternatively, you may manually deploy the recommended environment as follows:

@@ -75,7 +75,7 @@

 Deploy Apache Zeppelin and relate it to the Spark unit:

-    juju deploy cs:~bigdata-dev/trusty/apache-zeppelin zeppelin
+    juju deploy apache-zeppelin zeppelin
     juju add-relation spark zeppelin

 Once the relation has been made, access the web interface at
@@ -87,7 +87,7 @@
 can combine code execution, rich text, mathematics, plots and rich media.
 Deploy IPython Notebook for Spark and relate it to the Spark unit:

-    juju deploy cs:~bigdata-dev/trusty/apache-spark-notebook notebook
+    juju deploy apache-spark-notebook notebook
     juju add-relation spark notebook

 Once the relation has been made, access the web interface at

=== modified file 'hooks/callbacks.py'
--- hooks/callbacks.py 2015-07-24 16:28:49 +0000
+++ hooks/callbacks.py 2015-08-21 21:52:05 +0000
@@ -80,13 +80,17 @@

     def setup_spark_config(self):
         '''
-        copy Spark's default configuration files to spark_conf property defined
-        in dist.yaml
+        copy the default configuration files to spark_conf property
+        defined in dist.yaml
         '''
-        conf_dir = self.dist_config.path('spark') / 'conf'
-        self.dist_config.path('spark_conf').rmtree_p()
-        conf_dir.copytree(self.dist_config.path('spark_conf'))
-        conf_dir.rmtree_p()
+        default_conf = self.dist_config.path('spark') / 'conf'
+        spark_conf = self.dist_config.path('spark_conf')
+        spark_conf.rmtree_p()
+        default_conf.copytree(spark_conf)
+        # Now remove the conf included in the tarball and symlink our real conf
+        default_conf.rmtree_p()
+        spark_conf.symlink(default_conf)
+
         spark_env = self.dist_config.path('spark_conf') / 'spark-env.sh'
         if not spark_env.exists():
             (self.dist_config.path('spark_conf') / 'spark-env.sh.template').copy(spark_env)

=== modified file 'resources.yaml'
--- resources.yaml 2015-07-24 16:28:49 +0000
+++ resources.yaml 2015-08-21 21:52:05 +0000
@@ -4,7 +4,7 @@
   pathlib:
     pypi: path.py>=7.0
   jujubigdata:
-    pypi: jujubigdata>=2.1.0,<3.0.0
+    pypi: jujubigdata>=4.0.0,<5.0.0
 optional_resources:
   spark-ppc64le:
     url: https://git.launchpad.net/bigdata-data/plain/apache/ppc64le/spark-1.3.1-bin-2.4.0.tgz?id=45f439740a08b93ae72bc48a7103ebf58dbfa60b

=== modified file 'tests/remote/test_dist_config.py' (properties changed: -x to +x)
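
The substantive change here is in setup_spark_config(): the old code copied the tarball's conf directory to the managed spark_conf location and then deleted the original, so anything resolving configuration relative to the unpacked tarball no longer found it. The new code copies the defaults out, removes the tarball's copy, and symlinks it back to the managed directory. Below is a minimal sketch of that copy-then-symlink pattern using only the Python standard library (the charm itself uses path.py's rmtree_p/copytree/symlink); the paths are hypothetical stand-ins for the 'spark' and 'spark_conf' locations that dist.yaml defines:

    # Sketch only: stand-in paths; the charm resolves the real ones from
    # dist.yaml via self.dist_config.path(...).
    import os
    import shutil

    def setup_config(default_conf, managed_conf):
        # Start fresh: drop any stale copy at the managed location.
        shutil.rmtree(managed_conf, ignore_errors=True)
        # Seed the managed location with the defaults shipped in the tarball.
        shutil.copytree(default_conf, managed_conf)
        # Replace the tarball's conf with a symlink to the managed copy, so
        # anything that looks under the unpacked tarball still sees the
        # live configuration.
        shutil.rmtree(default_conf, ignore_errors=True)
        os.symlink(managed_conf, default_conf)

    setup_config('/usr/lib/spark/conf', '/etc/spark/conf')

Note that path.py's spark_conf.symlink(default_conf) creates the link at default_conf pointing back to spark_conf, the same direction as os.symlink(managed_conf, default_conf) above.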
