Merge lp:~admcleod/charms/trusty/apache-hadoop-compute-slave/hadoop-upgrade into lp:~bigdata-dev/charms/trusty/apache-hadoop-compute-slave/trunk

Proposed by Andrew McLeod
Status: Needs review
Proposed branch: lp:~admcleod/charms/trusty/apache-hadoop-compute-slave/hadoop-upgrade
Merge into: lp:~bigdata-dev/charms/trusty/apache-hadoop-compute-slave/trunk
Diff against target: 154 lines (+125/-0)
4 files modified
README.md (+25/-0)
actions.yaml (+10/-0)
actions/hadoop-upgrade (+82/-0)
resources.yaml (+8/-0)
To merge this branch: bzr merge lp:~admcleod/charms/trusty/apache-hadoop-compute-slave/hadoop-upgrade
Reviewer Review Type Date Requested Status
Juju Big Data Development Pending
Review via email: mp+275387@code.launchpad.net

Description of the change

Added hadoop-upgrade action to upgrade/rollback hadoop software

Also updated resources.yaml

Revision history for this message
Andrew McLeod (admcleod) wrote :

Need to update the README.md for rollback action param - fix on the way

93. By Andrew McLeod

updated README.md

Unmerged revisions

93. By Andrew McLeod

updated README.md

92. By Andrew McLeod

trivial text modifications

91. By Andrew McLeod

hadoop upgrade actions additions etc

Preview Diff

=== modified file 'README.md'
--- README.md 2015-10-06 18:20:36 +0000
+++ README.md 2015-10-22 16:46:04 +0000
@@ -57,6 +57,31 @@
     juju set yarn-master ganglia_metrics=true
 
 
+## Upgrading
+
+This charm includes the hadoop-upgrade action, which will download, untar,
+and upgrade the hadoop software to the specified version. It should be used
+in conjunction with the hadoop-pre-upgrade and hadoop-post-upgrade actions
+on the namenode (apache-hadoop-hdfs-master), which stop any hadoop-related
+processes on the cluster before allowing the upgrade to proceed.
+
+If different versions of hadoop are running on related services, the
+cluster will not function correctly.
+
+The rollback param specifies whether to download and re-extract
+(overwrite) the hadoop software or simply recreate the
+/usr/lib/hadoop symlink.
+
+Syntax for this action is:
+
+    juju action do datanode/0 hadoop-upgrade version=X.X.X rollback=false
+
+This action will update the unit's extended status.
+You can also get action results with:
+
+    juju action fetch --wait 0 action-id
+
+
 ## Deploying in Network-Restricted Environments
 
 The Apache Hadoop charms can be deployed in environments with limited network
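The action behind this README section detects the currently installed version by parsing `hadoop version` output with head and awk. A minimal sketch of that parsing step, run against sample output rather than a real hadoop install:

```shell
# Sketch of the version-detection step used by the hadoop-upgrade action.
# Sample output stands in for a real `hadoop version` invocation.
sample_output="Hadoop 2.7.1
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git
Compiled by jenkins"

# Same pipeline as the action: take the first line, print the second field.
current_hadoop_ver=$(echo "$sample_output" | head -n1 | awk '{print $2}')
echo "$current_hadoop_ver"
```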
=== added directory 'actions'
=== added file 'actions.yaml'
--- actions.yaml 1970-01-01 00:00:00 +0000
+++ actions.yaml 2015-10-22 16:46:04 +0000
@@ -0,0 +1,10 @@
+hadoop-upgrade:
+    description: upgrade (or roll back) hadoop to specified version
+    params:
+        version:
+            type: string
+            description: destination hadoop version X.X.X
+        rollback:
+            type: boolean
+            description: true or false - defaults to false
+    required: [version, rollback]
=== added file 'actions/hadoop-upgrade'
--- actions/hadoop-upgrade 1970-01-01 00:00:00 +0000
+++ actions/hadoop-upgrade 2015-10-22 16:46:04 +0000
@@ -0,0 +1,82 @@
+#!/bin/bash
+export SAVEPATH=$PATH
+. /etc/environment
+export PATH=$PATH:$SAVEPATH
+export JAVA_HOME
+
+current_hadoop_ver=$(/usr/lib/hadoop/bin/hadoop version | head -n1 | awk '{print $2}')
+new_hadoop_ver=$(action-get version)
+cpu_arch=$(lscpu | grep -i arch | awk '{print $2}')
+rollback=$(action-get rollback)
+
+# Refuse to touch anything while hadoop daemons are still running.
+if pgrep -f Dproc_datanode ; then
+    action-set result="datanode process detected, upgrade aborted"
+    status-set active "datanode process detected, upgrade aborted"
+    exit 1
+fi
+
+if pgrep -f Dproc_nodemanager ; then
+    action-set result="nodemanager process detected, upgrade aborted"
+    status-set active "nodemanager process detected, upgrade aborted"
+    exit 1
+fi
+
+if [ "$new_hadoop_ver" == "$current_hadoop_ver" ] ; then
+    action-set result="Same version already installed, aborting"
+    action-fail "Same version already installed"
+    exit 1
+fi
+
+if [ "${rollback}" == "True" ] ; then
+    if [ ! -d /usr/lib/hadoop-${new_hadoop_ver} ] ; then
+        action-fail "rollback target /usr/lib/hadoop-${new_hadoop_ver} not found"
+        exit 1
+    fi
+    rm /usr/lib/hadoop
+    ln -s /usr/lib/hadoop-${new_hadoop_ver} /usr/lib/hadoop
+    if [ -d /usr/lib/hadoop-${current_hadoop_ver}/logs ] ; then
+        mv /usr/lib/hadoop-${current_hadoop_ver}/logs /usr/lib/hadoop/
+    fi
+    action-set newhadoop.rollback="successfully rolled back"
+    status-set active "Ready - rollback to ${new_hadoop_ver} complete"
+    exit 0
+fi
+
+status-set maintenance "Fetching hadoop-${new_hadoop_ver}-${cpu_arch}"
+if ! juju-resources fetch hadoop-${new_hadoop_ver}-${cpu_arch} ; then
+    action-set newhadoop.fetch="fail"
+    exit 1
+fi
+action-set newhadoop.fetch="success"
+
+status-set maintenance "Verifying hadoop-${new_hadoop_ver}-${cpu_arch}"
+if ! juju-resources verify hadoop-${new_hadoop_ver}-${cpu_arch} ; then
+    action-set newhadoop.verify="fail"
+    exit 1
+fi
+action-set newhadoop.verify="success"
+
+new_hadoop_path=$(juju-resources resource_path hadoop-${new_hadoop_ver}-${cpu_arch})
+
+# Move a real directory aside to a versioned path; just drop a stale symlink.
+if [ -h /usr/lib/hadoop ] ; then
+    rm /usr/lib/hadoop
+else
+    mv /usr/lib/hadoop /usr/lib/hadoop-${current_hadoop_ver}
+fi
+ln -s /usr/lib/hadoop-${current_hadoop_ver}/ /usr/lib/hadoop
+current_hadoop_path=hadoop-${current_hadoop_ver}
+
+status-set maintenance "Extracting hadoop-${new_hadoop_ver}-${cpu_arch}"
+if tar -zxvf ${new_hadoop_path} -C /usr/lib/ ; then
+    if [ -h /usr/lib/hadoop ] ; then
+        rm /usr/lib/hadoop
+    fi
+    ln -s /usr/lib/hadoop-${new_hadoop_ver} /usr/lib/hadoop
+fi
+# Carry logs into the new install tree, not the downloaded tarball path.
+if [ -d /usr/lib/${current_hadoop_path}/logs ] ; then
+    mv /usr/lib/${current_hadoop_path}/logs /usr/lib/hadoop-${new_hadoop_ver}/
+fi
+action-set result="complete"
+status-set active "Ready - hadoop version ${new_hadoop_ver} installed"
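The script above boils down to a versioned-directory-plus-symlink pattern: each release lives in /usr/lib/hadoop-X.X.X and /usr/lib/hadoop is a symlink to the active one, so both upgrade and rollback are just a re-link. A self-contained sketch of that pattern in a throwaway temp directory (paths are illustrative; nothing under /usr/lib is touched):

```shell
# Demonstrate the versioned-dir + symlink swap the action relies on,
# inside a temp dir so no real hadoop install is needed.
root=$(mktemp -d)
mkdir "$root/hadoop-2.4.1" "$root/hadoop-2.7.1"
ln -s "$root/hadoop-2.4.1" "$root/hadoop"   # active version: 2.4.1

# Upgrade: remove the link and repoint it at the new version,
# exactly as the action does with /usr/lib/hadoop.
rm "$root/hadoop"
ln -s "$root/hadoop-2.7.1" "$root/hadoop"

readlink "$root/hadoop"                     # now points at hadoop-2.7.1
rm -r "$root"
```

Rollback is the same two commands with the old version as the link target, which is why the action can treat it as a cheap re-link when the old directory is still present.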
=== modified file 'resources.yaml'
--- resources.yaml 2015-10-06 18:21:31 +0000
+++ resources.yaml 2015-10-22 16:46:04 +0000
@@ -26,3 +26,11 @@
     url: https://s3.amazonaws.com/jujubigdata/apache/x86_64/hadoop-2.4.1-a790d39.tar.gz
     hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
     hash_type: sha256
+  hadoop-2.4.1-x86_64:
+    url: https://s3.amazonaws.com/jujubigdata/apache/x86_64/hadoop-2.4.1-a790d39.tar.gz
+    hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
+    hash_type: sha256
+  hadoop-2.7.1-x86_64:
+    url: http://mirrors.ukfast.co.uk/sites/ftp.apache.org/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz
+    hash: 991dc34ea42a80b236ca46ff5d207107bcc844174df0441777248fdb6d8c9aa0
+    hash_type: sha256
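Each entry above pairs a url with a sha256 hash, which is what `juju-resources verify` checks after fetching. The check reduces to comparing sha256sum output against the recorded hash; a sketch using a throwaway local file in place of a downloaded tarball (the expected hash here is computed on the spot, not taken from resources.yaml):

```shell
# Sketch of the hash check behind `juju-resources verify`, using a
# throwaway file instead of a fetched hadoop tarball.
tarball=$(mktemp)
echo "pretend hadoop release" > "$tarball"

expected=$(sha256sum "$tarball" | awk '{print $1}')   # stand-in for the yaml hash
actual=$(sha256sum "$tarball" | awk '{print $1}')

if [ "$actual" = "$expected" ] ; then
    echo "verify: success"
else
    echo "verify: fail"
fi
rm "$tarball"
```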
