Merge lp:~admcleod/charms/trusty/apache-hadoop-compute-slave/hadoop-upgrade into lp:~bigdata-dev/charms/trusty/apache-hadoop-compute-slave/trunk

Proposed by Andrew McLeod
Status: Needs review
Proposed branch: lp:~admcleod/charms/trusty/apache-hadoop-compute-slave/hadoop-upgrade
Merge into: lp:~bigdata-dev/charms/trusty/apache-hadoop-compute-slave/trunk
Diff against target: 154 lines (+125/-0)
4 files modified
README.md (+25/-0)
actions.yaml (+10/-0)
actions/hadoop-upgrade (+82/-0)
resources.yaml (+8/-0)
To merge this branch: bzr merge lp:~admcleod/charms/trusty/apache-hadoop-compute-slave/hadoop-upgrade
Reviewer: Juju Big Data Development
Status: Pending
Review via email: mp+275387@code.launchpad.net

Description of the change

Added a hadoop-upgrade action to upgrade or roll back the hadoop software.

Also updated resources.yaml.
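
The intended cluster-wide workflow, sketched as juju 1.x action invocations. This is illustrative only: the unit names hdfs-master/0 and datanode/0 are assumptions based on typical deployments of these charms, and the versions come from resources.yaml in this diff.

```shell
# 1. Stop hadoop processes cluster-wide from the namenode
#    (apache-hadoop-hdfs-master, deployed here as hdfs-master).
juju action do hdfs-master/0 hadoop-pre-upgrade

# 2. Upgrade each compute-slave unit to the new version.
juju action do datanode/0 hadoop-upgrade version=2.7.1 rollback=false

# 3. Restart services once every unit reports the new version.
juju action do hdfs-master/0 hadoop-post-upgrade

# If the new version misbehaves, re-point the symlink back:
juju action do datanode/0 hadoop-upgrade version=2.4.1 rollback=true
```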

Revision history for this message
Andrew McLeod (admcleod) wrote :

Need to update the README.md for rollback action param - fix on the way

93. By Andrew McLeod

updated README.md

Unmerged revisions

93. By Andrew McLeod

updated README.md

92. By Andrew McLeod

trivial text modifications

91. By Andrew McLeod

hadoop upgrade actions additions etc

Preview Diff

=== modified file 'README.md'
--- README.md 2015-10-06 18:20:36 +0000
+++ README.md 2015-10-22 16:46:04 +0000
@@ -57,6 +57,31 @@
     juju set yarn-master ganglia_metrics=true
 
 
+## Upgrading
+
+This charm includes the hadoop-upgrade action, which downloads, untars, and
+upgrades the hadoop software to the specified version. It should be used in
+conjunction with the hadoop-pre-upgrade and hadoop-post-upgrade actions on the
+namenode (apache-hadoop-hdfs-master), which stop any hadoop-related processes
+on the cluster before allowing the upgrade to proceed.
+
+If different versions of hadoop are running on related services, the cluster
+will not function correctly.
+
+The rollback param specifies whether to download and install the new version
+(rollback=false) or simply re-point the /usr/lib/hadoop symlink at a
+previously installed version (rollback=true).
+
+Syntax for this action is:
+
+    juju action do datanode/0 hadoop-upgrade version=X.X.X rollback=false
+
+This action will update the unit's extended status.
+You can also fetch the action results with:
+
+    juju action fetch --wait 0 action-id
+
+
 ## Deploying in Network-Restricted Environments
 
 The Apache Hadoop charms can be deployed in environments with limited network
 
=== added directory 'actions'
=== added file 'actions.yaml'
--- actions.yaml 1970-01-01 00:00:00 +0000
+++ actions.yaml 2015-10-22 16:46:04 +0000
@@ -0,0 +1,10 @@
+hadoop-upgrade:
+    description: upgrade (or roll back) hadoop to the specified version
+    params:
+        version:
+            type: string
+            description: destination hadoop version X.X.X
+        rollback:
+            type: boolean
+            description: true or false - defaults to false
+    required: [version, rollback]

=== added file 'actions/hadoop-upgrade'
--- actions/hadoop-upgrade 1970-01-01 00:00:00 +0000
+++ actions/hadoop-upgrade 2015-10-22 16:46:04 +0000
@@ -0,0 +1,82 @@
+#!/bin/bash
+export SAVEPATH=$PATH
+. /etc/environment
+export PATH=$PATH:$SAVEPATH
+export JAVA_HOME
+
+current_hadoop_ver=$(/usr/lib/hadoop/bin/hadoop version | head -n1 | awk '{print $2}')
+new_hadoop_ver=$(action-get version)
+cpu_arch=$(lscpu | grep -i arch | awk '{print $2}')
+rollback=$(action-get rollback)
+
+
+if pgrep -f Dproc_datanode ; then
+    action-set result="datanode process detected, upgrade aborted"
+    status-set active "datanode process detected, upgrade aborted"
+    exit 1
+fi
+
+
+if pgrep -f Dproc_nodemanager ; then
+    action-set result="nodemanager process detected, upgrade aborted"
+    status-set active "nodemanager process detected, upgrade aborted"
+    exit 1
+fi
+
+if [ "$new_hadoop_ver" == "$current_hadoop_ver" ] ; then
+    action-set result="Same version already installed, aborting"
+    action-fail "Same version already installed"
+    exit 1
+fi
+
+if [ "${rollback}" == "True" ] ; then
+    if [ -d /usr/lib/hadoop-${new_hadoop_ver} ] ; then
+        rm /usr/lib/hadoop
+        ln -s /usr/lib/hadoop-${new_hadoop_ver} /usr/lib/hadoop
+        if [ -d /usr/lib/hadoop-${current_hadoop_ver}/logs ] ; then
+            mv /usr/lib/hadoop-${current_hadoop_ver}/logs /usr/lib/hadoop/
+        fi
+    fi
+    action-set newhadoop.rollback="successfully rolled back"
+    status-set active "Ready - rollback to ${new_hadoop_ver} complete"
+    exit 0
+fi
+
+status-set maintenance "Fetching hadoop-${new_hadoop_ver}-${cpu_arch}"
+juju-resources fetch hadoop-${new_hadoop_ver}-${cpu_arch}
+if [ $? -ne 0 ] ; then
+    action-set newhadoop.fetch="fail"
+    exit 1
+fi
+action-set newhadoop.fetch="success"
+
+status-set maintenance "Verifying hadoop-${new_hadoop_ver}-${cpu_arch}"
+juju-resources verify hadoop-${new_hadoop_ver}-${cpu_arch}
+if [ $? -ne 0 ] ; then
+    action-set newhadoop.verify="fail"
+    exit 1
+fi
+action-set newhadoop.verify="success"
+
+new_hadoop_path=$(juju-resources resource_path hadoop-${new_hadoop_ver}-${cpu_arch})
+# If /usr/lib/hadoop is a symlink its target is already versioned; only a
+# real directory needs to be moved aside.
+if [ -h /usr/lib/hadoop ] ; then
+    rm /usr/lib/hadoop
+elif [ -d /usr/lib/hadoop ] ; then
+    mv /usr/lib/hadoop /usr/lib/hadoop-${current_hadoop_ver}
+fi
+ln -s /usr/lib/hadoop-${current_hadoop_ver}/ /usr/lib/hadoop
+current_hadoop_path=/usr/lib/hadoop-${current_hadoop_ver}
+
+status-set maintenance "Extracting hadoop-${new_hadoop_ver}-${cpu_arch}"
+tar -zxvf ${new_hadoop_path} -C /usr/lib/
+if [ $? -eq 0 ] ; then
+    if [ -h /usr/lib/hadoop ] ; then
+        rm /usr/lib/hadoop
+    fi
+    ln -s /usr/lib/hadoop-${new_hadoop_ver} /usr/lib/hadoop
+fi
+if [ -d ${current_hadoop_path}/logs ] ; then
+    mv ${current_hadoop_path}/logs /usr/lib/hadoop-${new_hadoop_ver}/
+fi
+action-set result="complete"
+status-set active "Ready - hadoop version ${new_hadoop_ver} installed"

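The symlink swap at the heart of this action can be exercised in isolation. A minimal sketch against a throwaway directory (paths are illustrative; the real action operates on /usr/lib, and the versions mirror resources.yaml):

```shell
#!/bin/bash
# Sketch of the upgrade's symlink swap, run against a temp directory
# instead of /usr/lib.
set -e

base=$(mktemp -d)
current_ver=2.4.1
new_ver=2.7.1

# Existing install: a versioned directory with logs, behind a symlink.
mkdir -p "${base}/hadoop-${current_ver}/logs"
ln -s "${base}/hadoop-${current_ver}" "${base}/hadoop"

# Upgrade: the extracted tarball would create hadoop-${new_ver}; re-point
# the symlink and carry the logs directory across, as the action does.
mkdir -p "${base}/hadoop-${new_ver}"
rm "${base}/hadoop"
ln -s "${base}/hadoop-${new_ver}" "${base}/hadoop"
mv "${base}/hadoop-${current_ver}/logs" "${base}/hadoop-${new_ver}/"

echo "hadoop -> $(readlink "${base}/hadoop")"
```

Because only the symlink moves, a rollback is the same swap in reverse, which is why rollback=true never re-downloads anything.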
=== modified file 'resources.yaml'
--- resources.yaml 2015-10-06 18:21:31 +0000
+++ resources.yaml 2015-10-22 16:46:04 +0000
@@ -26,3 +26,11 @@
     url: https://s3.amazonaws.com/jujubigdata/apache/x86_64/hadoop-2.4.1-a790d39.tar.gz
     hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
     hash_type: sha256
+  hadoop-2.4.1-x86_64:
+    url: https://s3.amazonaws.com/jujubigdata/apache/x86_64/hadoop-2.4.1-a790d39.tar.gz
+    hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
+    hash_type: sha256
+  hadoop-2.7.1-x86_64:
+    url: http://mirrors.ukfast.co.uk/sites/ftp.apache.org/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz
+    hash: 991dc34ea42a80b236ca46ff5d207107bcc844174df0441777248fdb6d8c9aa0
+    hash_type: sha256
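
Each entry in resources.yaml pins its tarball to a sha256 digest, which juju-resources verify recomputes after download. The same check can be reproduced by hand with coreutils; an empty temp file stands in for a downloaded tarball here, since its digest is the well-known empty-input constant:

```shell
#!/bin/bash
# Recompute a sha256 digest the way the hash fields in resources.yaml
# are checked. The empty file's digest is a fixed, well-known value.
set -e
tmpfile=$(mktemp)
digest=$(sha256sum "${tmpfile}" | awk '{print $1}')
echo "${digest}"
# → e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
```

For a real resource, compare the computed digest against the `hash:` field before trusting the tarball.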
