Merge ~barryprice/influxdb-charm/+git/trunk:master into influxdb-charm:master

Proposed by Barry Price
Status: Merged
Approved by: Barry Price
Approved revision: e705663ff7439e6c17fbac8a121173d654c516fd
Merged at revision: fb57c9cfe2c4ae05279cca4c72279fe40ef11ea6
Proposed branch: ~barryprice/influxdb-charm/+git/trunk:master
Merge into: influxdb-charm:master
Diff against target: 42 lines (+15/-7)
1 file modified
templates/influxdb-backup (+15/-7)
Reviewer: Tom Haddon — status: Approve
Reviewer: Canonical IS Reviewers — status: Pending
Review via email: mp+374962@code.launchpad.net

Commit message

Add compression to backups LP:1850772

Revision history for this message
🤖 Canonical IS Merge Bot (canonical-is-mergebot) wrote :

This merge proposal is being monitored by mergebot. Change the status to Approved to merge.

Revision history for this message
Tom Haddon (mthaddon) wrote :

One question inline

Revision history for this message
Tom Haddon (mthaddon) wrote :

LGTM

review: Approve
Revision history for this message
🤖 Canonical IS Merge Bot (canonical-is-mergebot) wrote :

Change successfully merged at revision fb57c9cfe2c4ae05279cca4c72279fe40ef11ea6

Preview Diff

diff --git a/templates/influxdb-backup b/templates/influxdb-backup
index cd3fabf..b253fbc 100644
--- a/templates/influxdb-backup
+++ b/templates/influxdb-backup
@@ -24,10 +24,10 @@ fi
 auth_args=""
 case $(influx -version) in
 "InfluxDB shell version: 1.0."*|"InfluxDB shell 0."*)
-    auth_args="-username ${INFLUX_USERNAME} -password ${INFLUX_PASSWORD}"
-    ;;
+    auth_args="-username ${INFLUX_USERNAME} -password ${INFLUX_PASSWORD}"
+    ;;
 *)
-    ;;
+    ;;
 esac
 
 DATABASES=$(influx ${auth_args} --execute 'show databases' --format=json | jq --raw-output '.results[0].series[0].values[][0]' | grep -ve '^_internal$')
@@ -40,11 +40,19 @@ for d in ${DATABASES}; do
     influxd backup -database $d $TODAY_DIR/$d
 done
 
+# compress the on-disk backup - gzip will do as the files can be huge, more efficient compressors are too slow/intensive
+tar czf $TODAY_DIR.tgz $TODAY_DIR && rm -rf $TODAY_DIR
+
 # remove old backups - to keep NUM backups we need to delete from directory NUM+1
 (( NUM++ ))
 cd $BACKUP_DIR
-for dir in $(ls -t | tail -n +$NUM); do
-    if [ -d ./$dir ]; then
-        rm -rf ./$dir && echo "Removed $dir"
-    fi
+
+# remove any old backup dirs from pre-compression revisions of this script
+for dir in $(find . -maxdepth 1 -type d | grep -v ^.$ | sort -r | tail -n +$NUM); do
+    rm -rf ./$dir && echo "Removed $dir"
+done
+
+# remove old compressed backup files
+for file in $(find . -maxdepth 1 -type f | sort -r | tail -n +$NUM); do
+    rm -rf ./$file && echo "Removed $file"
 done
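The retention logic in the diff is slightly subtle: `tail -n +$NUM` starts printing at line NUM, so the script increments NUM first in order to keep NUM backups and delete from entry NUM+1 onwards. A minimal standalone sketch of that rotation, using hypothetical date-named archives in a temporary directory (everything here except the `(( NUM++ ))` / `find | sort -r | tail` pattern is made up for the demo):

```shell
#!/bin/bash
# Sketch of the backup-rotation logic from templates/influxdb-backup:
# keep the NUM newest archives, delete the rest.
set -e

demo=$(mktemp -d)
cd "$demo"

# simulate five dated backup archives (hypothetical names)
for d in 2019-10-27 2019-10-28 2019-10-29 2019-10-30 2019-10-31; do
    touch "$d.tgz"
done

NUM=3        # keep the three newest backups
(( NUM++ ))  # tail -n +N prints from line N, so deletion starts at NUM+1

# reverse-sorted names put the newest first; lines NUM onward are stale
for file in $(find . -maxdepth 1 -type f | sort -r | tail -n +$NUM); do
    rm -f "./$file" && echo "Removed $file"
done

remaining=$(ls | sort)
cd /
rm -rf "$demo"
```

Running this removes `2019-10-27.tgz` and `2019-10-28.tgz` and keeps the three newest archives. Note this relies on the date-stamped filenames sorting lexicographically in chronological order, which ISO dates do; the previous `ls -t` approach sorted by mtime instead.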
