Merge lp:~admcleod/charms/trusty/apache-hadoop-hdfs-secondary/hadoop-upgrade into lp:~bigdata-dev/charms/trusty/apache-hadoop-hdfs-secondary/trunk

Proposed by Andrew McLeod
Status: Needs review
Proposed branch: lp:~admcleod/charms/trusty/apache-hadoop-hdfs-secondary/hadoop-upgrade
Merge into: lp:~bigdata-dev/charms/trusty/apache-hadoop-hdfs-secondary/trunk
Diff against target: 149 lines (+120/-0)
4 files modified:
  README.md (+25/-0)
  actions.yaml (+11/-0)
  actions/hadoop-upgrade (+76/-0)
  resources.yaml (+8/-0)
To merge this branch: bzr merge lp:~admcleod/charms/trusty/apache-hadoop-hdfs-secondary/hadoop-upgrade
Reviewer: Juju Big Data Development (review requested; status: Pending)
Review via email: mp+275388@code.launchpad.net

Description of the change

Added the hadoop-upgrade action and modified resources.yaml to add versioned hadoop tarball entries.

Revision history for this message
Andrew McLeod (admcleod) wrote :

Need to update the README.md for rollback action param - fix on the way

73. By Andrew McLeod

updated README.md

Unmerged revisions

73. By Andrew McLeod

updated README.md

72. By Andrew McLeod

trivial text modifications

71. By Andrew McLeod

hadoop upgrade additions

70. By Cory Johns

Get Hadoop binaries to S3 and cleanup tests to favor and improve bundle tests

69. By Kevin W Monroe

[merge] merge bigdata-dev r79..81 into bigdata-charmers

68. By Kevin W Monroe

[merge] merge bigdata-dev r69..r78 into bigdata-charmers

Preview Diff

=== modified file 'README.md'
--- README.md 2015-10-06 18:25:57 +0000
+++ README.md 2015-10-22 16:34:30 +0000
@@ -48,6 +48,31 @@
     juju set hdfs-master ganglia_metrics=true
 
 
+## Upgrading
+
+This charm includes the hadoop-upgrade action, which downloads, untars and
+upgrades the hadoop software to the specified version. It should be used in
+conjunction with the hadoop-pre-upgrade and hadoop-post-upgrade actions on the
+namenode (apache-hadoop-hdfs-master), which stop any hadoop-related processes
+on the cluster before allowing the upgrade to proceed.
+
+If different versions of hadoop are running on related services, the cluster
+will not function correctly.
+
+The rollback param specifies whether to re-download and re-extract (overwrite)
+the hadoop software or, when true, simply recreate the /usr/lib/hadoop symlink
+to point at an already-installed version.
+
+Syntax for this action is:
+
+    juju action do datanode/0 hadoop-upgrade version=X.X.X rollback=false
+
+This action will update the unit's extended status.
+You can also get action results with:
+
+    juju action fetch --wait 0 action-id
+
+
 ## Deploying in Network-Restricted Environments
 
 The Apache Hadoop charms can be deployed in environments with limited network
 
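The upgrade and rollback flow the README describes comes down to keeping each hadoop version in its own versioned directory and repointing a single active symlink. A minimal sketch of that mechanism, using a throwaway prefix from mktemp in place of /usr/lib (all paths here are illustrative, not the charm's code):

```shell
#!/bin/bash
# Sketch of the symlink-swap scheme; PREFIX stands in for /usr/lib
# so this runs anywhere without root.
set -e
PREFIX=$(mktemp -d)

# Two installed versions, each kept in its own directory.
mkdir -p "$PREFIX/hadoop-2.4.1" "$PREFIX/hadoop-2.7.1"

# "Upgrade": point the active symlink at the new version.
ln -sfn "$PREFIX/hadoop-2.7.1" "$PREFIX/hadoop"
echo "active: $(readlink "$PREFIX/hadoop")"

# "Rollback": repoint the symlink at the old version; nothing is
# re-downloaded because the old tree is still on disk.
ln -sfn "$PREFIX/hadoop-2.4.1" "$PREFIX/hadoop"
echo "active: $(readlink "$PREFIX/hadoop")"
```

Because both trees stay on disk, switching direction in either case is a single symlink update.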
=== added directory 'actions'
=== added file 'actions.yaml'
--- actions.yaml 1970-01-01 00:00:00 +0000
+++ actions.yaml 2015-10-22 16:34:30 +0000
@@ -0,0 +1,11 @@
+hadoop-upgrade:
+    description: upgrade (or roll back) hadoop to the specified version
+    params:
+        version:
+            type: string
+            description: destination hadoop version X.X.X
+        rollback:
+            type: boolean
+            description: roll back to an already-installed version (defaults to false)
+    required: [version, rollback]
+
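The version param is expected in X.X.X form, since the action builds directory and resource names from it. A quick way to sanity-check such input before touching the filesystem could look like the following (validate_version is a hypothetical helper for illustration; the charm itself does no such check):

```shell
#!/bin/sh
# Hypothetical input validation for the action's version param.
validate_version() {
    # Accept strictly dotted numeric triples such as 2.7.1.
    echo "$1" | grep -Eq '^[0-9]+\.[0-9]+\.[0-9]+$'
}

validate_version "2.7.1" && echo "2.7.1: ok"            # prints "2.7.1: ok"
validate_version "latest" || echo "latest: rejected"    # prints "latest: rejected"
```

Rejecting malformed versions early would avoid creating oddly named /usr/lib/hadoop-* directories.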
=== added file 'actions/hadoop-upgrade'
--- actions/hadoop-upgrade 1970-01-01 00:00:00 +0000
+++ actions/hadoop-upgrade 2015-10-22 16:34:30 +0000
@@ -0,0 +1,76 @@
+#!/bin/bash
+export SAVEPATH=$PATH
+. /etc/environment
+export PATH=$PATH:$SAVEPATH
+export JAVA_HOME
+
+current_hadoop_ver=$(/usr/lib/hadoop/bin/hadoop version | head -n1 | awk '{print $2}')
+new_hadoop_ver=$(action-get version)
+cpu_arch=$(lscpu | grep -i arch | awk '{print $2}')
+rollback=$(action-get rollback)
+
+if pgrep -f Dproc_secondarynamenode ; then
+    action-fail "secondarynamenode process detected, upgrade aborted"
+    status-set active "secondarynamenode process detected, upgrade aborted"
+    exit 1
+fi
+
+if [ "$new_hadoop_ver" == "$current_hadoop_ver" ] ; then
+    action-set result="Same version already installed, aborting"
+    action-fail "Same version already installed"
+    exit 1
+fi
+
+if [ "${rollback}" == "True" ] ; then
+    if [ ! -d /usr/lib/hadoop-${new_hadoop_ver} ] ; then
+        action-fail "no /usr/lib/hadoop-${new_hadoop_ver} to roll back to" ; exit 1
+    fi
+    rm /usr/lib/hadoop
+    ln -s /usr/lib/hadoop-${new_hadoop_ver} /usr/lib/hadoop
+    if [ -d /usr/lib/hadoop-${current_hadoop_ver}/logs ] ; then
+        mv /usr/lib/hadoop-${current_hadoop_ver}/logs /usr/lib/hadoop/
+    fi
+    action-set newhadoop.rollback="successfully rolled back"
+    status-set active "Ready - rollback to ${new_hadoop_ver} complete"
+    exit 0
+fi
+
+status-set maintenance "Fetching hadoop-${new_hadoop_ver}-${cpu_arch}"
+juju-resources fetch hadoop-${new_hadoop_ver}-${cpu_arch}
+if [ ! $? -eq 0 ] ; then
+    action-set newhadoop.fetch="fail"
+    exit 1
+fi
+action-set newhadoop.fetch="success"
+
+status-set maintenance "Verifying hadoop-${new_hadoop_ver}-${cpu_arch}"
+juju-resources verify hadoop-${new_hadoop_ver}-${cpu_arch}
+if [ ! $? -eq 0 ] ; then
+    action-set newhadoop.verify="fail"
+    exit 1
+fi
+action-set newhadoop.verify="success"
+
+new_hadoop_path=$(juju-resources resource_path hadoop-${new_hadoop_ver}-${cpu_arch})
+if [ -h /usr/lib/hadoop ] ; then  # symlink: a previous upgrade already versioned the tree
+    rm /usr/lib/hadoop
+else  # plain directory: first upgrade on this unit
+    mv /usr/lib/hadoop /usr/lib/hadoop-${current_hadoop_ver}
+fi
+ln -s /usr/lib/hadoop-${current_hadoop_ver} /usr/lib/hadoop
+current_hadoop_path=/usr/lib/hadoop-${current_hadoop_ver}
+
+status-set maintenance "Extracting hadoop-${new_hadoop_ver}-${cpu_arch}"
+tar -zxvf ${new_hadoop_path} -C /usr/lib/
+if [ $? -eq 0 ] ; then
+    if [ -h /usr/lib/hadoop ] ; then
+        rm /usr/lib/hadoop
+    fi
+    ln -s /usr/lib/hadoop-${new_hadoop_ver} /usr/lib/hadoop
+fi
+if [ -d ${current_hadoop_path}/logs ] ; then
+    mv ${current_hadoop_path}/logs /usr/lib/hadoop-${new_hadoop_ver}/
+fi
+
+action-set result="complete"
+status-set active "Ready - hadoop version ${new_hadoop_ver} installed"

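One easy-to-miss step in the script above is carrying the logs directory across the version switch, so log history survives both upgrade and rollback. That step in isolation, against a temporary prefix instead of /usr/lib (paths and file names here are illustrative):

```shell
#!/bin/bash
# Demonstrates the logs carry-over step of the upgrade in isolation.
set -e
prefix=$(mktemp -d)

# Old tree with accumulated logs; new tree freshly extracted.
mkdir -p "$prefix/hadoop-2.4.1/logs" "$prefix/hadoop-2.7.1"
echo "old log entries" > "$prefix/hadoop-2.4.1/logs/hdfs.log"

# Move logs from the previous tree into the newly activated one,
# guarded so a tree without logs is not an error.
if [ -d "$prefix/hadoop-2.4.1/logs" ] ; then
    mv "$prefix/hadoop-2.4.1/logs" "$prefix/hadoop-2.7.1/"
fi
```

After the move the old tree holds only the software, so a later rollback re-runs the same carry-over in the opposite direction.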
=== modified file 'resources.yaml'
--- resources.yaml 2015-10-06 18:26:23 +0000
+++ resources.yaml 2015-10-22 16:34:30 +0000
@@ -26,3 +26,11 @@
     url: https://s3.amazonaws.com/jujubigdata/apache/x86_64/hadoop-2.4.1-a790d39.tar.gz
     hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
     hash_type: sha256
+  hadoop-2.4.1-x86_64:
+    url: https://s3.amazonaws.com/jujubigdata/apache/x86_64/hadoop-2.4.1-a790d39.tar.gz
+    hash: a790d39baba3a597bd226042496764e0520c2336eedb28a1a3d5c48572d3b672
+    hash_type: sha256
+  hadoop-2.7.1-x86_64:
+    url: http://mirrors.ukfast.co.uk/sites/ftp.apache.org/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz
+    hash: 991dc34ea42a80b236ca46ff5d207107bcc844174df0441777248fdb6d8c9aa0
+    hash_type: sha256
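The hash entries above are what the script's juju-resources verify step checks a downloaded tarball against before extraction. The underlying check is a sha256 comparison, which coreutils can reproduce roughly like this (the file and its pinned hash are generated on the fly here purely for demonstration; resources.yaml pins the hash ahead of time):

```shell
#!/bin/bash
# Sketch of hash verification as done for the pinned tarballs.
set -e
tmp=$(mktemp -d)

# Stand-in "tarball" and its recorded (expected) digest.
printf 'stand-in hadoop tarball' > "$tmp/hadoop-2.7.1.tar.gz"
expected=$(sha256sum "$tmp/hadoop-2.7.1.tar.gz" | awk '{print $1}')

# Verification: recompute the digest and compare before extracting.
actual=$(sha256sum "$tmp/hadoop-2.7.1.tar.gz" | awk '{print $1}')
[ "$actual" = "$expected" ] && echo "verify: success"

# Any tampering or truncation changes the digest and fails the check.
printf 'corruption' >> "$tmp/hadoop-2.7.1.tar.gz"
tampered=$(sha256sum "$tmp/hadoop-2.7.1.tar.gz" | awk '{print $1}')
```

Pinning a digest per tarball is what lets the mirror URL (here a non-Apache mirror for 2.7.1) be trusted: the content is validated regardless of where it was fetched from.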
