Merge lp:~christophe-lyon/cbuild/tools-fix-svn-name into lp:cbuild

Proposed by Christophe Lyon
Status: Needs review
Proposed branch: lp:~christophe-lyon/cbuild/tools-fix-svn-name
Merge into: lp:cbuild
Diff against target: 3321 lines (+3126/-0)
39 files modified
MAINTAIN.txt (+92/-0)
aarch64-merger.sh (+172/-0)
addtestdiffs.py (+133/-0)
cache-builds.py (+131/-0)
chain.sh (+75/-0)
check-leaks.sh (+14/-0)
closest.py (+63/-0)
cron-often.sh (+56/-0)
ctimes.py (+100/-0)
difftests.sh (+45/-0)
ec2-pull.sh (+28/-0)
ec2-spawn.sh (+36/-0)
ec2new.sh (+23/-0)
ec2slave-init-cbuild.sh (+36/-0)
ec2slave-init.sh (+105/-0)
ec2slave.sh (+65/-0)
ec2watch.sh (+18/-0)
enrich-gcc-linaro-repo (+45/-0)
export-bzr.sh (+85/-0)
export-git.sh (+75/-0)
export-hg.sh (+61/-0)
findancestor.py (+204/-0)
findlogs.py (+113/-0)
gcc-release.sh (+168/-0)
getbody.py (+26/-0)
lab-pull.sh (+22/-0)
latestami.sh (+17/-0)
launcher.sh (+146/-0)
libec2.sh (+78/-0)
pull_branch.sh (+21/-0)
s3-diet.py (+91/-0)
spawn.sh (+36/-0)
stamp_branch.sh (+70/-0)
tabulate.sh (+36/-0)
taker.json (+82/-0)
taker.py (+396/-0)
track-tree.sh (+16/-0)
up_branch.sh (+53/-0)
utilisation.py (+93/-0)
To merge this branch: bzr merge lp:~christophe-lyon/cbuild/tools-fix-svn-name
Reviewer: Linaro Toolchain Builder (status: Pending)
Review via email: mp+216447@code.launchpad.net

Description of the change

Attempt to avoid generation of ~/var/snapshots/svn* files and associated scheduler jobs.

Unmerged revisions

108. By Christophe Lyon

Fix previous commit.

107. By Christophe Lyon

Do not generate tarball and delta if no name parameter is provided.

106. By Renato Golin

Adding Clang/Extra/RT to LLVM update

105. By Yvan Roux

Add gcc-linaro-4.8 branch in cron-often polling task.

104. By Linaro Toolchain Builder

Add support for toolchain64 host and GCC Linaro 4.8 branch

103. By Linaro Toolchain Builder

Update find to take gcc-4.8 branches into account.

102. By Christophe Lyon

Fix bug #1156536

101. By Michael Hope

Fix taker stuff

100. By Michael Hope

Make tabulate work on all GCC results.

99. By Michael Hope

Update to use gcc-google-4.7 and not gcc-google-4.6

Preview Diff

=== added file 'MAINTAIN.txt'
--- MAINTAIN.txt 1970-01-01 00:00:00 +0000
+++ MAINTAIN.txt 2014-07-01 15:58:40 +0000
@@ -0,0 +1,92 @@
1cbuild maintenance tasks
2========================
3
4Users
5-----
6 * cbuild - pulls keys from http://launchpad.net/~cbuild
7
8Things to watch
9---------------
10
11Disks:
12
13 * / for free space
14 * /space for free space
15
16`~/var/snapshots/gcc-4.8?svn*`:
17
18 * Youngest is no more than 3 days old
19 * Check cbuild-tools/launcher.sh on fault
20
21`~/var/tcwg-web/*.pickle`:
22
23 * No more than four hours old
24 * Check lib/tcwg-web/update.sh on fault
25
26Common tasks
27------------
28
29Where have all my directories gone?
30
31 * ~/lib/cbuild
32 * ~/var/snapshots
33 * ~/public_html/build
34
35How do I spawn a release build via the shell?
36
37 * Use gcc-release-process.sh! Otherwise:
38 * scp gcc-linaro-4.7-2012.12.tar.bz2 cbuild@toolchain64.lab:~/var/snapshots
39 * ssh cbuild@toolchain64.lab
40 * ~/lib/cbuild-tools/spawn.sh ~/var/snapshots/gcc-linaro-4.7-2012.12.tar.bz2
41
42A fault with frequent tasks like merge requests, tip builds or data
43being out of date?
44
45 * Check ~/lib/cbuild-tools/cron-often.sh
46
47A fault with daily tasks like upstream builds?
48
49 * Check ~/lib/cbuild-tools/launcher.sh
50
51A fault with one of the helpers?
52
53 * sudo service tcwg-web stop
54 * cd lib/tcwg-web
55 * Uncomment 'development = True' in tcwg-web.ini
56 * python index.py
57 * Run the request, see the backtrace
58
59Version out of date on one of the upstream builds?
60
61 * Edit launcher.sh
62
63Want to build/track a new upstream?
64
65 * git clone upstream-url ~/repos/decent-name
66 * Add a new dow line to launcher.sh
67
68Want to track a new Linaro series?
69
70 * cd ~/repos/gcc-linaro
71 * bzr branch lp:gcc-linaro/4.8
72 * Add a new line to cron-often.sh
73 * Consider pushing a new gcc-linaro-4.8+bzr12345.{tar,tar.xz} to ~/var/snapshots/base
74
75Want to propagate a cbuild update to the slaves?
76
77 * cd ~/lib/cbuild
78 * bzr pull
79
80Want to add a new build queue?
81
82 * cd ~/var/scheduler/queue
83 * mkdir queue-name
84 * echo host1 host2 host3 > queue-name/hosts.txt
85 * echo config-fragments-if-any > queue-name/template.txt
86 * cd ../spawn/default
87 * ln -s ../../queue/queue-name
88
89Want to delete a job from the queue?
90
91 * cd ~/var/scheduler/queue
92 * `rm */job-name.job`
093
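The queue-creation steps above can be condensed into one Python sketch — an illustration of the layout only, not part of the scheduler; the root path and queue name are examples:

```python
import os

def make_queue(scheduler_root, queue_name, hosts, template=''):
    """Create a build queue as MAINTAIN.txt describes: a directory
    holding hosts.txt and template.txt, symlinked into spawn/default."""
    queue = os.path.join(scheduler_root, 'queue', queue_name)
    os.makedirs(queue)

    with open(os.path.join(queue, 'hosts.txt'), 'w') as f:
        f.write(' '.join(hosts) + '\n')

    with open(os.path.join(queue, 'template.txt'), 'w') as f:
        f.write(template + '\n')

    # Make the new queue visible to the default spawner
    spawn = os.path.join(scheduler_root, 'spawn', 'default')
    os.makedirs(spawn)
    os.symlink(os.path.join('..', '..', 'queue', queue_name),
               os.path.join(spawn, queue_name))
```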
=== added file 'aarch64-merger.sh'
--- aarch64-merger.sh 1970-01-01 00:00:00 +0000
+++ aarch64-merger.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,172 @@
1#!/bin/bash
2#
3# Automatically merge and create a merge request for any new aarch64
4# commits.
5#
6#
7
8set -e
9
10# Path to Linaro GCC 4.7
11tip=4.7
12# Path to the ARM bzr branch
13from=arm-aarch64-4.7
14# Fill this in with the last merged bzr revno
15# Example: bzr log 4.7 shows r193293..r193328. bzr log arm-* --show-ids shows that
16# r193328 is bzr116407.
17current=116423
18
19get_revision() {
20 echo $(bzr log -c $1 --show-ids --short $2 | awk -F: '/revision-id:/ {print $NF}')
21}
22
23is_fsf_merge() {
24 (bzr log -c $1 -p $2 | grep -qF "+++ gcc/DATESTAMP") && echo y
25}
26
27all_revnos() {
28 bzr log -r $stop.. --log-format=line $1 | awk -F: '{print $1;}'
29}
30
31extract_changelog() {
32 # Pull the ChangeLog out
33 inside=
34
35 while read i; do
36 if [[ "$i" =~ ^=== ]]; then inside=; fi
37
38 if [ -n "$inside" ] && [[ "$i" =~ ^\+(.+) ]]; then
39 echo "${BASH_REMATCH[1]}"
40 fi
41
42 if [[ "$i" =~ ^\+\+\+\ .+/ChangeLog\.aarch64 ]]; then inside=y; fi
43
44 done < /tmp/bzr$revno.diff > /tmp/bzr$revno.changes
45}
46
47search_back() {
48 # Search back to the first non-merge
49 for i in $all; do
50 if [ -z "$(is_fsf_merge $i $from)" ]; then
51 last=$i
52 break
53 fi
54 done
55}
56
57search_forward() {
58 # Search forward to the first non-merge
59 for i in $(seq $((current+1)) 1 $last); do
60 if [ -z "$(is_fsf_merge $i $from)" ]; then
61 first=$i
62 break
63 fi
64 done
65}
66
67dir=`mktemp -td merger.XXXXXXXXXX`
68trap "rm -rf $dir" EXIT
69
70# End point for any scans
71stop=116380
72
73all=$(all_revnos $from)
74
75search_back
76search_forward
77
78sfirst=$(get_revision $first $from)
79slast=$(get_revision $last $from)
80
81name="merge-from-aarch64-4.7-r$sfirst-r$slast"
82
83# See if the merge request already exists
84if false && wget -q -O- https://code.launchpad.net/gcc-linaro | grep -q -F $name; then
85 echo "Branch $name already exists, nothing to do."
86 exit 0
87fi
88
89rm -rf $name
90bzr branch -q --hardlink $tip $name
91
92realname="Michael Hope"
93email="michael.hope@linaro.org"
94login=$(bzr lp-login)
95
96short="Merge from FSF arm/aarch64-4.7-branch r$sfirst..r$slast."
97
98echo -e "$(date +%Y-%m-%d) $realname <$email>\n" > $dir/log
99echo -e "\t$short\n" >> $dir/log
100
101# Go through and merge each revision
102for revno in $(seq $first 1 $last); do
103 if [ -n "$(is_fsf_merge $revno $from)" ]; then
104 echo "Skipping the FSF merge at $revno"
105 continue
106 fi
107
108 svn=$(get_revision $revno $from)
109 stem=$dir/bzr$revno
110
111 echo "Merging $svn (bzr$revno)"
112 # bzr diff returns 1 on changed...
113 (cd $from && bzr diff -c $revno || true) > $stem.diff
114
115 if ! (cd $name && bzr patch $stem.diff); then
116 read -p "Conflicts found. Fix in $name then press enter > "
117 fi
118
119 # Nuke any patch remnants
120 find $name -name "*.orig" -exec rm {} \;
121 find $name -name "*.rej" -exec rm {} \;
122
123 # Add any missing files. bzr patch should have done this already
124 # but doesn't on conflict.
125 (cd $name && bzr add)
126
127 # Pull out the commit message
128 (cd $from && bzr log -c $revno) > $stem.log
129
130 inside=
131 while read i; do
132 if [ -n "$inside" ]; then
133 trimmed=$(echo "$i")
134 echo "$trimmed"
135 fi
136
137 if [[ "$i" =~ ^message: ]]; then inside=y; fi
138 done < $stem.log > $stem.message
139
140 # Commit each as a revision to make future tracking easier. The
141 # merge will roll this up into one.
142 echo -e "Backport $from r$svn.\n" > $stem.commit
143 cat $stem.message >> $stem.commit
144
145 (cd $name && bzr commit -F $stem.commit)
146
147 echo -e "\tBackport $from r$svn:" >> $dir/log
148 while read i; do
149 echo -e "\t$i"
150 done < $stem.message >> $dir/log
151
152 echo >> $dir/log
153done
154
155# Update the ChangeLog and commit
156cat $dir/log $name/ChangeLog.linaro > $dir/new
157mv $dir/new $name/ChangeLog.linaro
158
159cd $name
160bzr commit -m"$short"
161bzr push "lp:~$login/gcc-linaro/$name"
162
163# propose-merge always runs an editor for the message. Make a fake
164# editor that sets the text.
165cat > $dir/editor <<EOF
166#!/bin/bash
167cp $dir/log \$1
168EOF
169
170chmod +x $dir/editor
171
172VISUAL=$dir/editor bzr lp-propose-merge -m"$short"
0173
=== added file 'addtestdiffs.py'
--- addtestdiffs.py 1970-01-01 00:00:00 +0000
+++ addtestdiffs.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,133 @@
1#!/usr/bin/python
2"""Record the difference between the test results of this build and
3its ancestor or predecessor.
4"""
5
6import sys
7import re
8import os
9import pprint
10import subprocess
11import logging
12import time
13
14import findancestor
15import findlogs
16
17TEMPLATE = """Difference in testsuite results between:
18 %(current)s build %(buildid)s
19and the one before it:
20 %(other)s build %(otherbuildid)s
21
22------
23%(diff)s"""
24
25def make_testsuite_diff(left, right):
26 # PENDING: This is bad
27 return subprocess.check_output(['./difftests.sh', left, right])
28
29def make_benchmarks_diff(left, right, suffix):
30 return subprocess.check_output(['../linaro-toolchain-benchmarks/scripts/diffbench.py',
31 '%s/benchmarks-%s.txt' % (left, suffix),
32 '%s/benchmarks-%s.txt' % (right, suffix)
33 ])
34
35def run(allfiles, snapshots, builds, check, final, make_diff):
36 now = time.time()
37
38 # Make the search a bit easier by only keeping the .txt files
39 filtered = [x for x in allfiles if '.txt' in x[-1] or check in x[-1]]
40
41 # And cache by directory
42 directories = {}
43
44 for f in filtered:
45 directory = '/'.join(f[:-1])
46 directories.setdefault(directory, []).append(f)
47
48 prefiltered = findlogs.prefilter(allfiles, check)
49
50 # Find all builds that have finished
51 for build in [x for x in filtered if x[-1] == 'finished.txt']:
52 match = '/'.join(build[:-1])
53
54 # Skip if the build is too old. Either we have a result or
55 # it's never coming in
56 try:
57 age = (now - os.path.getmtime('%s/%s/finished.txt' % (builds, match))) / (60*60*24)
58
59 if age >= 30:
60 continue
61 except OSError:
62 continue
63
64 # Find all of the files in the same directory
65 siblings = directories[match]
66
67 all = ' '.join(x[-1] for x in siblings)
68
69 # Are there any .sum style results?
70 if check not in all:
71 continue
72
73 # See if the diff already exists
74 if ' %s' % final in all:
75 continue
76
77 # Find the ancestor or predecessor
78 current = build[1]
79 other = findancestor.find(allfiles, current, snapshots)
80
81 if not other:
82 logging.info('No known ancestor to %s' % current)
83 continue
84
85 # Find the corresponding build
86 logs = ['/'.join(x) for x in siblings if check in x[-1]]
87 assert logs
88 log = logs[-1]
89
90 try:
91 otherlogs = findlogs.find(prefiltered, log, other)
92 except Exception, ex:
93 logging.info('Error %s while finding logs on %s vs %s' % (ex, log, other))
94 otherlogs = None
95
96 if not otherlogs:
97 logging.debug("Can't find %s of %s" % (log, other))
98 continue
99
100 other, otherbuildid = otherlogs
101 buildid = build[-2]
102 diff = None
103
104 try:
105 logging.debug('Generating a diff between predecessor %s/%s and %s' % (other, otherbuildid, os.path.dirname(log)))
106
107 left = '%s/%s/logs/%s' % (builds, other, otherbuildid)
108 right = os.path.join(builds, os.path.dirname(log))
109 diff = make_diff(left, right)
110
111 except Exception, ex:
112 logging.error(ex)
113
114 if diff == None:
115 continue
116
117 with open('%s/%s/%s' % (builds, match, final), 'w') as f:
118 print >> f, TEMPLATE % locals()
119
120def main():
121# logging.basicConfig(level=logging.DEBUG)
122 allfiles, snapshots, builds = sys.argv[1:]
123
124 with open(allfiles) as f:
125 allf = [x.strip().split(os.sep) for x in f.readlines()]
126
127 run(allf, snapshots, builds, '.sum', 'testsuite-diff.txt', make_testsuite_diff)
128
129 for suffix in 'spec2000 eembc eembc_office'.split():
130 run(allf, snapshots, builds, 'benchmarks-%s.txt' % suffix, 'benchmarks-%s-diff.txt' % suffix, lambda x, y: make_benchmarks_diff(x, y, suffix))
131
132if __name__ == '__main__':
133 main()
0134
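The staleness check in addtestdiffs.py — skip any build whose finished.txt is 30 or more days old — reduces to a small mtime computation. A self-contained sketch of that test (the function name is illustrative):

```python
import os
import time

def is_stale(path, now=None, limit_days=30):
    """Return True when the marker file is at least limit_days old,
    mirroring the age test applied to finished.txt above."""
    if now is None:
        now = time.time()
    age_days = (now - os.path.getmtime(path)) / (60 * 60 * 24)
    return age_days >= limit_days
```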
=== added file 'cache-builds.py'
--- cache-builds.py 1970-01-01 00:00:00 +0000
+++ cache-builds.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,131 @@
1#!/usr/bin/python
2
3"""Scan a directory tree and cache the results of each version and
4build.
5"""
6
7import sys
8import os
9import pprint
10import cPickle
11import json
12
13def list(path):
14 try:
15 return os.walk(path).next()
16 except StopIteration:
17 return None, [], []
18
19def get_date(root, name):
20 return os.path.getmtime('%s/%s' % (root, name))
21
22def intern_all(values):
23 return tuple(intern(x) for x in values)
24
25def make_path(top, path):
26 root = top['root']
27
28 assert path.startswith(root)
29 return path[len(root)+1:]
30
31def add_build(top, name, aroot):
32 parts = intern_all(name.split('-') + [''])
33 arch, distro, cbuild, host, config = parts[:5]
34
35 root, dirs, files = list(aroot)
36 files = intern_all(files)
37
38 top['hosts'][host] = arch
39 top['arches'].setdefault(arch, {})
40 top['configs'].setdefault(config, {})
41
42 build = {
43 'name': name,
44 'arch': arch,
45 'distro': distro,
46 'cbuild': cbuild,
47 'host': host,
48 'config': config,
49 'abspath': aroot,
50 'path': make_path(top, aroot),
51 'files': files,
52 'failed': [],
53 'subfailures': [],
54 'testlog': None,
55 'started': None,
56 'finished': None,
57 'languages': [],
58 'binary': None,
59 }
60
61 for name in files:
62 if 'failed' in name:
63 build['failed'].append(name)
64
65 if '-failed' in name:
66 build['subfailures'].append(name.replace('-failed', ' ').split()[0])
67
68 elif name == 'started.txt':
69 build['started'] = get_date(root, name)
70 elif name == 'finished.txt':
71 build['finished'] = get_date(root, name)
72 elif name == 'gcc-configure.txt':
73 with open('%s/%s' % (root, name)) as f:
74 for line in f:
75 if line.startswith('The following'):
76 end = line.split(': ')[-1]
77 build['languages'] = intern_all(end.strip().split(','))
78 break
79
80 if name.endswith('-testsuite.txt'):
81 build['testlog'] = make_path(top, '%s/%s' % (root, name))
82
83 return build
84
85def add_version(top, name, path):
86 version = {
87 'name': name,
88 'abspath': path,
89 'builds': {}
90 }
91
92 root, dirs, files = list('%s/logs' % path)
93
94 for dir in dirs:
95 version['builds'][dir] = add_build(top, dir, '%s/%s' % (root, dir))
96
97 return version
98
99def add_versions(top, aroot, adirs):
100 for dir in adirs:
101 path = '%s/%s' % (aroot, dir)
102
103 root, dirs, files = list(path)
104
105 if 'logs' in dirs:
106 version = add_version(top, dir, path)
107 top['versions'][dir] = version
108
109def main():
110 top = {
111 'root': None,
112 'all': [],
113 'versions': {},
114 'arches': {},
115 'configs': {},
116 'hosts': {},
117 }
118
119 for path in sys.argv[1:]:
120 root, dirs, files = os.walk(path).next()
121 top['root'] = root
122 add_versions(top, root, dirs)
123
124 with open('builds.pickle', 'w') as f:
125 cPickle.dump(top, f, cPickle.HIGHEST_PROTOCOL)
126
127 with open('builds.json', 'w') as f:
128 json.dump(top, f, indent=2)
129
130if __name__ == '__main__':
131 main()
0132
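The build directory names that cache-builds.py indexes encode five dash-separated fields, as the split in add_build() shows. That split can be exercised on its own — the sample name below is the one used in the difftests.sh usage comment:

```python
def parse_build_name(name):
    """Split a cbuild log directory name into its fields, the way
    add_build() does: arch-distro-cbuild-host-config."""
    parts = name.split('-') + ['']   # pad so a missing config becomes ''
    arch, distro, cbuild, host, config = parts[:5]
    return {'arch': arch, 'distro': distro, 'cbuild': cbuild,
            'host': host, 'config': config}
```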
=== added file 'chain.sh'
--- chain.sh 1970-01-01 00:00:00 +0000
+++ chain.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,75 @@
1#!/bin/bash
2#
3# Chain a job on following another. Generally works on the latest.
4#
5
6set -e
7# If the pattern fails then propagate the error
8set -o pipefail
9
10. ~/.config/cbuild/toolsrc
11
12config=cortexa9hf
13
14type=$1
15shift
16pattern=$1
17shift
18
19# Set to run=echo for testing
20run=
21
22function get_latest {
23 case $type in
24 benchmarks | ubutest)
25 # See what binaries we have locally that match
26 version=$( ls -t $binaries/*${config}r?.tar* | head -n1 )
27 version=$( basename $( dirname $version ))
28 ;;
29 trunk | oecore)
30 # Pick the latest snapshot
31 version=$( ls -t $snapshots/$pattern | head -n1 )
32 version=$( basename $version )
33 version=$( echo $version | sed -r "s/(.+)\.tar.+/\1/" )
34 ;;
35 *)
36 exit -1
37 ;;
38 esac
39
40 [ -n "$version" ]
41}
42
43function spawn {
44 target=$queue/$2/$1.job
45
46 if [ ! -f $target ]; then
47 PYTHONPATH=$lib/tcwg-web python -m schedulejob $2 $1
48 echo Spawned $1 into $2
49 fi
50}
51
52function chain {
53 # Now do something
54 case $type in
55 benchmarks)
56 spawn benchmarks-$version a9hf-ref
57 spawn benchmarks-spec2000-$version a9hf-ref
58 ;;
59 ubutest)
60 spawn ubutest-$version a9hf-ref
61 ;;
62 trunk)
63 spawn $version a9hf-builder
64 ;;
65 oecore)
66 spawn test-oecore-$version x86_64-heavy
67 ;;
68 *)
69 exit -1
70 ;;
71 esac
72}
73
74get_latest
75chain
076
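The sed expression in get_latest strips everything from `.tar` onward to recover the snapshot version. The same transform in Python, using the release tarball name from MAINTAIN.txt as the example:

```python
import re

def strip_tarball_suffix(filename):
    """Equivalent of sed -r "s/(.+)\\.tar.+/\\1/": drop the .tar.*
    extension to recover the snapshot version name."""
    return re.sub(r'(.+)\.tar.+', r'\1', filename)
```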
=== added file 'check-leaks.sh'
--- check-leaks.sh 1970-01-01 00:00:00 +0000
+++ check-leaks.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,14 @@
1#!/bin/sh
2#
3# Watch the build directory for any results that shouldn't be there
4#
5
6. ~/.config/cbuild/toolsrc
7
8benchmarks="coremark eembc denbench spec"
9
10check=$www/build/
11
12for i in $benchmarks; do
13 find $check -name "$i*run.txt" | sed "s#$check##" | head -n 10
14done
015
=== added file 'closest.py'
--- closest.py 1970-01-01 00:00:00 +0000
+++ closest.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,63 @@
1#!/usr/bin/env python
2# Find the closest base filename. Finds the one with the longest
3# common prefix
4#
5
6import sys
7import os.path
8import re
9
10def match(find, path):
11 """Check a build against a queue, returning the common prefix if
12 the needle starts with the queue name.
13
14 The longest match is the closest.
15
16 >>> match('foo', 'baz')
17 >>> match('gcc-foo-bar', 'gcc')
18 'gcc'
19 >>> match('gcc-foo-bar', 'gcc-foo')
20 'gcc-foo'
21 >>> match('gcc-foo-bar', 'default')
22 ''
23 """
24 base = os.path.basename(path)
25
26 if find.startswith(base):
27 return base
28 elif base == 'default':
29 return ''
30 else:
31 return None
32
33def match2(find, path):
34 base = os.path.basename(path)
35 return os.path.commonprefix([find, base])
36
37def main():
38 against = os.path.basename(sys.argv[1])
39
40 prefixes = []
41
42 for full in sys.argv[2:]:
43 key = match(against, full)
44
45 if key != None:
46 prefixes.append((full, key))
47
48 if not prefixes:
49 for full in sys.argv[2:]:
50 key = match2(against, full)
51
52 if key != None:
53 prefixes.append((full, key))
54
55 prefixes.sort(key=lambda x: -len(x[-1]))
56 print prefixes[0][0]
57
58def test():
59 import doctest
60 doctest.testmod()
61
62if __name__ == '__main__':
63 main()
064
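closest.py's selection rule — prefer queues whose name is a prefix of the build, longest match first, falling back to plain common prefixes — can be condensed into one function for experimentation. This is a sketch of the same two-pass logic, not a replacement for the script:

```python
import os.path

def closest(find, paths):
    """Pick the path whose basename best matches `find`: exact
    prefix matches first (with 'default' as an empty-string match),
    then common prefixes, the longest winning."""
    def match(path):
        base = os.path.basename(path)
        if find.startswith(base):
            return base
        return '' if base == 'default' else None

    scored = [(p, k) for p in paths for k in [match(p)] if k is not None]
    if not scored:
        # Second pass: fall back to the longest common prefix
        scored = [(p, os.path.commonprefix([find, os.path.basename(p)]))
                  for p in paths]
    return max(scored, key=lambda x: len(x[1]))[0]
```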
=== added file 'cron-often.sh'
--- cron-often.sh 1970-01-01 00:00:00 +0000
+++ cron-often.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,56 @@
1#!/bin/bash
2#
3# Job that runs often, polling builds, branches, and others.
4#
5
6# Root of the www directory
7. ~/.config/cbuild/toolsrc
8
9timeout="timeout 3600"
10#timeout=
11
12cd $ctools
13
14# Update the list of all files and other caches
15function allfiles {
16 pushd $1 > /dev/null
17 find . | sort > /tmp/all-files.txt
18
19 # Only update if changed
20 ! cmp -s all-files.txt /tmp/all-files.txt \
21 && python ~/lib/cbuild-tools/cache-builds.py . \
22 && mv -f /tmp/all-files.txt . \
23 && gzip -9fc --rsyncable all-files.txt > all-files.txt.gz \
24 && gzip -9fc --rsyncable builds.json > builds.json.gz
25
26 popd > /dev/null
27}
28
29# Tabulate any new benchmarks
30./tabulate.sh
31
32# Update the file list cache
33allfiles $build
34allfiles $benchmarks
35
36# Pull any new merge requests
37timeout 14400 python taker.py -f prompt=false
38
39# Add the testsuite diffs to anything new
40python2.7 addtestdiffs.py $build/all-files.txt $snapshots $build
41python2.7 addtestdiffs.py $benchmarks/all-files.txt $snapshots $benchmarks
42
43# Rebuild all files after the diffs are added
44allfiles $build
45allfiles $benchmarks
46
47# Poll all interesting Linaro branches
48$timeout ./export-git.sh $repos/meta-linaro meta-linaro-0~
49$timeout ./pull_branch.sh $repos/gcc-linaro-4.8/4.8 gcc-linaro-4.8
50$timeout ./pull_branch.sh $repos/gcc-linaro/4.7 gcc-linaro-4.7
51$timeout ./pull_branch.sh $repos/gcc-linaro/4.6 gcc-linaro-4.6
52$timeout ./pull_branch.sh $repos/crosstool-ng/linaro crosstool-ng-linaro-1.13.1
53$timeout ./pull_branch.sh $repos/gdb-linaro/7.5 gdb-linaro-7.5
54$timeout ./pull_branch.sh $repos/cortex-strings cortex-strings-1.0
55$timeout ./export-git.sh $repos/qemu-linaro qemu-linaro-1.0~
56$timeout ./export-git.sh $repos/boot-wrapper boot-wrapper-0~
057
=== added file 'ctimes.py'
--- ctimes.py 1970-01-01 00:00:00 +0000
+++ ctimes.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,100 @@
1"""Parse the scheduler log and print the time spent transitioning from
2state to state for each host.
3"""
4from __future__ import print_function
5import fileinput
6import collections
7import pprint
8
9Edge = collections.namedtuple('Edge', 'a b took time')
10
11def parse():
12 hosts = {}
13 lasts = {}
14 first = None
15
16 for line in fileinput.input():
17 parts = line.split()
18 stamp, host, state, arg = (parts + ['']*4)[:4]
19 stamp = float(stamp)
20
21 if first == None:
22 first = stamp
23
24 if arg:
25 state = arg
26
27 last = lasts.get(host, None)
28 lasts[host] = (stamp, state)
29
30 if last:
31 elapsed = stamp - last[0]
32 hosts.setdefault(host, []).append(Edge(last[1], state, elapsed, stamp - first))
33
34 return hosts
35
36def cost():
37 hosts = parse()
38
39 print('\t'.join('day host job took'.split()))
40
41 for host in hosts:
42 doing = None
43 start = None
44
45 for edge in hosts[host]:
46 if edge.b == 'idle':
47 if doing and start:
48 pretty = doing.replace('-extract-top', '')
49 print('%.1f\t%s\t\t%s\t%.1f' % (edge.time/3600/24, host, pretty, (edge.time - start)/3600))
50 doing = None
51
52 if edge.b in ['lurking', 'idle']:
53 start = edge.time
54 doing = None
55
56 if edge.b in ['starting', 'idle', 'updating', 'running', 'lurking']:
57 # Skip
58 pass
59 elif 'fetch-binary' in edge.b:
60 # Skip
61 pass
62 else:
63 if doing == None:
64 doing = edge.b
65
66def main():
67 hosts = parse()
68
69 # Go through and bin them
70 bins = {}
71
72 for host, edges in hosts.items():
73 bins[host] = {}
74
75 for edge in edges:
76 key = '%s -> %s' % (edge.a, edge.b)
77
78 if edge.a in ['lurking', 'running', 'idle']:
79 continue
80
81 if key in ['idle -> updating', 'running -> idle', 'updating -> running']:
82 continue
83
84 bins[host].setdefault(key, []).append(edge.took)
85
86 for host in sorted(bins):
87 keys = bins[host]
88 print('%s:' % host)
89
90 for key in sorted(keys):
91 values = keys[key]
92 print(' %s: %.1f' % (key, sum(values)/len(values)), end=' ')
93
94 for value in values:
95 print('%.0f' % value, end=' ')
96
97 print()
98
99if __name__ == '__main__':
100 cost()
0101
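ctimes.py assumes scheduler log lines of the form `stamp host state [arg]`; its per-host bookkeeping turns consecutive lines into transition edges with elapsed times. The core of parse() can be checked in isolation — the sample log lines in the test are invented for illustration:

```python
import collections

Edge = collections.namedtuple('Edge', 'a b took time')

def edges(lines):
    """Replay log lines and collect one Edge per state transition,
    per host, the way ctimes.parse() does."""
    lasts = {}
    first = None
    out = collections.defaultdict(list)

    for line in lines:
        parts = line.split()
        stamp, host, state, arg = (parts + [''] * 4)[:4]
        stamp = float(stamp)

        if first is None:
            first = stamp

        # An argument (e.g. a job name) refines the state
        if arg:
            state = arg

        last = lasts.get(host)
        lasts[host] = (stamp, state)

        if last:
            out[host].append(Edge(last[1], state, stamp - last[0], stamp - first))

    return dict(out)
```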
=== added file 'difftests.sh'
--- difftests.sh 1970-01-01 00:00:00 +0000
+++ difftests.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,45 @@
1#!/bin/bash
2#
3# Take all of the .sum files from two directories and generate the differences
4#
5# Example:
6# difftests.sh \
7# build/gcc-4.6+svn173209/logs/armv7l-maverick-cbuild113-ursa3-cortexa9r1 \
8# build/gcc-4.6+svn173722/logs/armv7l-maverick-cbuild114-ursa4-cortexa9r1
9#
10
11set -e
12
13base=$1
14next=$2
15
16dir=`mktemp -d difftests.XXXXXXXXXX`
17trap "rm -rf $dir" EXIT
18
19mkdir -p $dir/base
20mkdir -p $dir/next
21
22# Copy across all logs
23cp $base/*.sum* $dir/base
24cp $next/*.sum* $dir/next
25unxz -f $dir/base/*.xz $dir/next/*.xz
26
27# Pull out just the PASS/FAIL/etc lines and sort by test name
28# * Change absolute path names to .../
29# * Drop all limits tests
30#
31for i in `find $dir -name "*.sum"`; do
32 grep -E '^[A-Z]+:' $i \
33 | grep -Ev limits- \
34 | grep -Ev /guality/ \
35 | sed -r 's#/scratch/\w+/\w+/\w+/\w+/[^/]+#...#g' \
36 | sed -r "s#UNSUPPORTED: .+/testsuite/#UNSUPPORTED: #" \
37 | sort -k 2 > $i.tmp
38 mv $i.tmp $i
39done
40
41# diff returns non-zero if there is a difference
42set +e
43(cd $dir && diff -U 0 -r base next) > $dir/diff.txt
44# Drop anything but changes in test lines
45grep -E '^[+-][A-Z]' $dir/diff.txt || true
046
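difftests.sh works on DejaGnu .sum lines of the form `STATUS: testname`; the core comparison — which tests changed status between two runs — is easy to state in Python. A sketch of the idea, not a drop-in for the script:

```python
def results(lines):
    """Index 'PASS: foo' style lines by test name, keeping only
    all-uppercase statuses as the script's grep -E '^[A-Z]+:' does."""
    table = {}
    for line in lines:
        head, sep, rest = line.partition(':')
        if sep and head.isupper() and head.isalpha():
            table[rest.strip()] = head
    return table

def changed(base, next_):
    """Return (test, old_status, new_status) for every test whose
    status differs, with None for a test present in only one run."""
    a, b = results(base), results(next_)
    return sorted((t, a.get(t), b.get(t))
                  for t in set(a) | set(b) if a.get(t) != b.get(t))
```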
=== added file 'ec2-pull.sh'
--- ec2-pull.sh 1970-01-01 00:00:00 +0000
+++ ec2-pull.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,28 @@
1#!/bin/bash
2#
3# Pull the results from the cloud to the build server
4#
5
6set -e
7
8. ~/.config/cbuild/toolsrc
9
10s3=linaro-toolchain-builds
11
12dir=`mktemp -d`
13trap "rm -rf $dir" EXIT
14
15mkdir -p $dir/build
16cd $dir
17
18# Check what's there and fetch, aborting if it takes too long
19candidates=$( timeout 600 s3cmd ls s3://$s3 | awk '{print $4;}' | grep 'logs.tar.xz$' )
20
21for i in $candidates; do
22 base=$( basename $i )
23 # Get, extract, push, and remove
24 timeout 600 s3cmd get --skip-existing --no-progress $i && \
25 tar xaf $base && \
26 rsync -rtC build $www && \
27 s3cmd del $i
28done
029
=== added file 'ec2-spawn.sh'
--- ec2-spawn.sh 1970-01-01 00:00:00 +0000
+++ ec2-spawn.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,36 @@
1#!/bin/bash
2
3function spawn {
4 host=$1
5 ami=$2
6 type=$3
7
8 dir=/tmp/ec2slave/$host
9 mkdir -p $dir
10 cd $dir
11
12 ls /var/run/screen/S-cbuild/*.$host > /dev/null 2>&1 || \
13 screen -S $host -L -d -m bash -c "cd $HOME/lib/cbuild-tools && exec ./ec2slave.sh $host $ami $type"
14}
15
16# us-east1 x86_64 EBS Natty
17#spawn oort1 ami-fd589594 c1.xlarge
18# Precise
19spawn oort1 ami-a29943cb c1.xlarge
20#spawn oort3 ami-fd589594 c1.xlarge
21# us-east1 i686 EBS Natty
22#spawn oort2 ami-81c31ae8 c1.medium
23# Precise
24spawn oort2 ami-ac9943c5 c1.medium
25#spawn oort4 ami-ac9943c5 c1.medium
26spawn oort6 ami-ac9943c5 c1.medium
27#spawn oort8 ami-ac9943c5 c1.medium
28#spawn oort4 ami-e358958a c1.medium
29#spawn oort6 ami-e358958a c1.medium
30#spawn oort8 ami-e358958a c1.medium
31#spawn oort6 ami-e358958a c1.medium
32#spawn oort8 ami-06ad526f c1.medium
33#spawn oort11 ami-d5e54dbc c1.xlarge
34spawn oort12 ami-dfe54db6 c1.medium
35#spawn oort14 ami-37af765e c1.medium
36#spawn oort14 ami-87dd03ee c1.medium
037
=== added file 'ec2new.sh'
--- ec2new.sh 1970-01-01 00:00:00 +0000
+++ ec2new.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,23 @@
1#!/bin/bash
2#
3# Runs an EC2 slave for local use
4#
5# Usage: ec2new.sh [hostname [ami [type]]]
6#
7# such as:
8# ec2new.sh oort51 ami-1aad5273 c1.xlarge
9# ec2new.sh oort52 ami-06ad526f c1.medium
10#
11
12set -e
13
14# Don't kill on exit...
15kill=false
16. `dirname $0`/libec2.sh
17
18echo Starting an instance
19start_instance
20init_instance
21
22echo instance: $instance
23echo ip: $ip
024
=== added file 'ec2slave-init-cbuild.sh'
--- ec2slave-init-cbuild.sh 1970-01-01 00:00:00 +0000
+++ ec2slave-init-cbuild.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,36 @@
1#!/bin/sh
2#
3# Second stage where the cbuild user sets up
4#
5# Most configuration is in files shipped with this script
6
7touch ~/.in-cloud
8
9# Setup the SSH config
10cat > ~/.ssh/config <<EOF
11# Easier to set the port here than everywhere
12Host cbuild-master
13 ProxyCommand ssh cbuild.validation.linaro.org nc -q0 toolchain64 %p
14EOF
15
16# Set up the cbuild global config
17cat > ~/.config/cbuild/cbuildrc <<EOF
18# Nothing
19EOF
20
21# Seed the known hosts with keys
22for i in linaro-gateway bazaar.launchpad.net validation.linaro.org cbuild.validation.linaro.org $(hostname) localhost cbuild-master; do
23 ssh -o StrictHostKeyChecking=no -o BatchMode=yes $i true || true
24done
25
26# Checkout the cbuild scripts
27if [ ! -d /cbuild/.bzr ]; then
28 bzr lp-login cbuild
29 bzr branch --use-existing-dir lp:cbuild /cbuild
30fi
31
32cd /cbuild
33bzr pull
34
35# Seed files from the nice-and-close S3
36s3cmd sync s3://linaro-toolchain-files files
037
=== added file 'ec2slave-init.sh'
--- ec2slave-init.sh 1970-01-01 00:00:00 +0000
+++ ec2slave-init.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,105 @@
1#!/bin/sh
2
3# Fail on any error
4set -e
5
6# Automatically kill the slave if it runs for too long
7echo /sbin/poweroff | at now + 72 hours
8
9# Set the hostname
10hostname=${1:-oort99}
11
12echo $hostname > /etc/hostname
13hostname $hostname
14echo 127.0.0.1 $hostname proxy >> /etc/hosts
15
16# Bind mount /cbuild
17mkdir -p /cbuild /mnt/cbuild
18mount -o bind /mnt/cbuild /cbuild
19chmod a+w /cbuild
20
21# Use the S3 backed archive
22sed -i.dist 's,archive.ubuntu.com,archive.ubuntu.com.s3.amazonaws.com,g' /etc/apt/sources.list
23# without pipelining to make it more reliable
24echo "Acquire::http::Pipeline-Depth 0;" > /etc/apt/apt.conf.d/99no-pipelining
25
26flags="-yq --force-yes"
27
28# Install the basic packages
29apt-get update || true
30apt-get install $flags puppet jed python-software-properties
31
32# Add in the PPA
33#add-apt-repository ppa:linaro-toolchain-dev/build-deps
34#apt-get update || true
35
36# Basics
37apt-get build-dep $flags gcc binutils gdb
38apt-get build-dep $flags gcc gdb eglibc ffmpeg llvm python libvorbis
39apt-get build-dep $flags gcc-4.6 || \
40 apt-get build-dep $flags gcc-4.5 || \
41 apt-get build-dep $flags gcc-snapshot
42
43# For llvm
44apt-get install $flags python-sphinx || true
45
46# For cross test. Missing on lucid
47apt-get install $flags qemu-kvm qemu-user qemu-system || true
48
49# Needed for baremetal and graphite
50apt-get install $flags binutils-arm-none-eabi libcloog-ppl-dev || true
51
52# Needed for cbuild
53apt-get install $flags time python-irclib wget gnupg ccrypt rsync coreutils \
54 sendemail bzr xdelta3 xz-utils unzip openssh-client gdb libio-socket-ssl-perl libnet-ssleay-perl
55
56# Needed for ubutest
57apt-get build-dep $flags eglibc base-files base-passwd bash coreutils dash \
58 debianutils diffutils dpkg e2fsprogs findutils grep gzip hostname ncurses \
59 perl python-defaults sed shadow tar util-linux
60
61# Needed for qemu
62apt-get install $flags libpixman-1-dev
63
64apt-get build-dep $flags python2.7 || true
65# devscripts asks a question...
66echo 1 | apt-get install $flags devscripts || true
67
68# Needed for crosstool-NG LSB based builds
69apt-get install $flags lsb lsb-build-cc3 lsb-appchk3 ccache gcc-mingw32 flip tk tofrodos
70apt-get install $flags gcc-4.1 g++-4.1 || true
71apt-get install $flags gcc-4.1-multilib g++-4.1-multilib || true
72
73# Needed for cloud builds
74apt-get install $flags polipo s3cmd
75
76# Use Bash as /bin/sh. Michael's fault...
77echo n | dpkg-reconfigure -f teletype dash
78
79# Set up the cbuild user
80cb=/home/cbuild
81
82adduser --disabled-password --gecos cbuild cbuild
83tar xaf *seed*.tar* -C /home/cbuild --strip-components=1
84
85chown -R cbuild.cbuild $cb
86chmod -R go-rwx $cb/.ssh
87
88# Finally put /tmp in RAM for faster builds
89mount -t tmpfs tmfs /tmp
90
91# Break multiarch
92multi=`dpkg-architecture -qDEB_BUILD_MULTIARCH || true`
93
94if [ ! -z "$multi" ]; then
95 # Symlink multiarch startup files where non-multiarch tools expect them
96 cd /usr/lib && ls $multi/*crt*.o | xargs -L 1 ln -sf
97 cd /usr/include
98
99 for i in asm gnu bits sys; do
100 if [ -d $multi/$i ]; then
101 rm -rf $i
102 ln -sf $multi/$i
103 fi
104 done
105fi
0106
=== added file 'ec2slave.sh'
--- ec2slave.sh 1970-01-01 00:00:00 +0000
+++ ec2slave.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,65 @@
1#!/bin/bash
2#
3# Runs an EC2 slave that builds cbuild jobs
4#
5# Usage: ec2slave.sh [hostname [ami [type]]]
6#
7# such as:
8# ec2slave.sh oort2 ami-689c6701 c1.medium
9# ec2slave.sh oort1 ami-40926929 c1.xlarge
10#
11
12set -e
13
14. `dirname $0`/libec2.sh
15
16function kill_old {
17 if [ "$instance" != "" ]; then
18 now=`date +%s`
19 elapsed=$(($now-$used))
20
21 if [ "$elapsed" -gt "600" ]; then
22 echo Killing $instance due to inactivity
23 $lock ec2kill $instance
24 instance=
25 fi
26 fi
27}
28
29function update_status {
30 if [ "$instance" == "" ]; then
31 $wget -O - $SCHEDULER_API/update/$hostname/lurking
32 else
33 $wget -O - $SCHEDULER_API/update/$hostname/idle
34 fi
35}
36
37DRIFT=$(($RANDOM % 30))
38
39while true; do
40 echo $hostname $instance `date`
41
42 sleep $((180 + $DRIFT))
43
44 kill_old
45 update_status
46
47 # See if there's a job waiting
48 next=`$wget -O - $SCHEDULER_API/peek/$hostname`
49
50 case "$next" in
51 False) continue ;;
52 esac
53
54 echo Running a job
55
56 if [ "$instance" == "" ]; then
57 echo Starting an instance
58 start_instance
59 init_instance
60 used=`date +%s`
61 fi
62
63 ssh -o StrictHostKeyChecking=no -o BatchMode=yes -t cbuild@$ip 'cd /cbuild && ./run.sh'
64 used=`date +%s`
65done
066
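The `kill_old` rule above can be isolated as a pure function for clarity. This is an illustrative sketch, not part of the script: `should_kill` and `IDLE_LIMIT` are invented names, but the 600-second threshold and the strictly-greater-than comparison mirror the shell test `[ "$elapsed" -gt "600" ]`.

```python
# Sketch of ec2slave.sh's kill_old rule: a running instance is killed
# once strictly more than IDLE_LIMIT seconds have passed since it last
# ran a job.  Names here are illustrative.
IDLE_LIMIT = 600  # the hard-coded 600 from kill_old

def should_kill(instance, last_used, now):
    """Return True when a running instance has been idle too long."""
    if not instance:
        # No instance running ("lurking" state), nothing to kill.
        return False
    return (now - last_used) > IDLE_LIMIT
```

The strict comparison means an instance idle for exactly 600 seconds survives one more polling round, just as in the shell version.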
=== added file 'ec2watch.sh'
--- ec2watch.sh 1970-01-01 00:00:00 +0000
+++ ec2watch.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,18 @@
1#!/bin/bash
2#
3# Watch the EC2 instances so we can see where the money goes
4#
5# Terribly ugly hack. Add this as a cronjob that runs every half
6# hour. Spits the results of ec2din into a timestamped file under
7# $HOME/var.
8#
9
10# Pull in the local configuration
11. ~/.private/ec2slave.rc
12
13dir=$HOME/var/log/ec2watch
14month=$(date +%Y-%m)
15now=$(date +%F-%H-%M)
16
17mkdir -p $dir/$month
18ec2din --show-empty-fields | gzip > $dir/$month/$now.txt.gz
019
=== added file 'enrich-gcc-linaro-repo'
--- enrich-gcc-linaro-repo 1970-01-01 00:00:00 +0000
+++ enrich-gcc-linaro-repo 2014-07-01 15:58:40 +0000
@@ -0,0 +1,45 @@
1#!/usr/bin/python
2# -*- coding: utf-8 -*-
3#
4# Copies revisions from branches into the dev focus branch as it's
5# also the stacked branch. This speeds up branching when using a
6# shared repo.
7#
8# Runs as a cron job
9#
10# Author: Loïc Minier <loic.minier@linaro.org>
11#
12
13# took 6mn41s on the first run
14# 18s on the second run
15#
16
17import sys
18
19import bzrlib.repository
20import bzrlib.plugin
21
22#bzrlib.initialize()
23bzrlib.plugin.load_plugins()
24
25r_4_7 = bzrlib.repository.Repository.open('lp:gcc-linaro/4.7')
26r_4_6 = bzrlib.repository.Repository.open('lp:gcc-linaro/4.6')
27#r_4_5 = bzrlib.repository.Repository.open('lp:gcc-linaro/4.5')
28#r_4_4 = bzrlib.repository.Repository.open('lp:gcc-linaro/4.4')
29# copy revisions from 4.6 branches into 4.7 which is the devfocus
30# branch because it's used as a stacked branch
31# 4.4 and 4.5 aren't updated anymore
32r_4_7.fetch(r_4_6)
33#r_4_6.fetch(r_4_5)
34#r_4_6.fetch(r_4_4)
35# copy revisions from 4.6 into 4.4 and 4.5 once to work around LP #388269; only
36# needed once ever
37# copying 4.5 into 4.4 took 52mn55s on the first run and 51s on second run
38# (with three up-to-date enriching branches enabled)
39# copying 4.5 into 4.6 took 1h45mn13s on the first run and 48s on second run
40# (with three maybe not fully up-to-date enriching branches enabled)
41#r_4_4.fetch(r_4_6)
42#r_4_5.fetch(r_4_6)
43r_4_6.fetch(r_4_7)
44
45print sys.argv[0], 'done'
\ No newline at end of file
147
=== added file 'export-bzr.sh'
--- export-bzr.sh 1970-01-01 00:00:00 +0000
+++ export-bzr.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,85 @@
1#!/bin/sh
2# Update a branch and snapshot the latest revision
3# Can also snapshot past revisions and add a per-user
4# suffix
5#
6# Example: export-bzr.sh ~/repos/gcc-linaro/4.6 gcc-linaro-4.6 [revno [suffix [basedir]]]
7#
8set -e
9
10. ~/.config/cbuild/toolsrc
11
12dir=`mktemp -d`
13trap "rm -rf $dir" EXIT
14
15# Directory this script runs from
16here=`dirname $0`
17
18# Suffix for manual pulls
19SUFFIX=${4:-}
20
21# Pull a branch and tarball it up
22branch=$1
23head=$2
24
25latest=`bzr revno $branch`
26revno=${3:-$latest}
27
28base=${5:-$branch}
29
30stem="$head+bzr$revno$SUFFIX"
31tar="/tmp/$stem.tar"
32meta="/tmp/$stem.txt"
33diff="/tmp/$stem.diff"
34xdelta="$tar.xdelta3.xz"
35
36if [ -f $snapshots/`basename $xdelta` ]; then
37 exit 0
38fi
39
40closest=`$here/closest.py $tar $snapshots/base/*.tar`
41
42mkdir -p $snapshots
43
44ancestor=`bzr revision-info -d $branch -r ancestor:$base`
45
46# Record some meta data about the branch
47echo \# Branch summary > $meta
48echo name: $stem >> $meta
49echo snapshot-date: `date --rfc-3339=seconds` >> $meta
50bzr log -l1 $branch | grep -E '^(committer|timestamp):' >> $meta
51echo directory: $branch >> $meta
52echo branch: `bzr info $branch | grep parent | awk '{print $3;}'` >> $meta
53echo mainline-branch: $base >> $meta
54echo revno: $revno >> $meta
55echo revision: `bzr revision-info -d $branch -r $revno` >> $meta
56echo tip: `bzr revision-info -d $branch` >> $meta
57echo ancestor: $ancestor >> $meta
58echo closest: `basename $closest` >> $meta
59echo xdelta3: `basename $xdelta` >> $meta
60
61id=`echo $ancestor | awk '{ print $2; }'`
62
63# Create the diff
64if [ "$id" != "$revno" ]; then
65 bzr diff -r $id $branch > $diff || true
66fi
67
68echo Exporting and stamping $revno
69bzr export -r $revno $dir/$stem $branch
70REVNO=$revno $here/stamp_branch.sh $branch $dir/$stem
71
72# Create the tarball
73echo Creating $tar
74(cd $dir && find $stem -print0 | sort -z) | tar caf $tar -C $dir --no-recursion --null -T -
75
76# Create the delta
77echo Making the delta relative to $closest
78xdelta3 -fe -s $closest $tar $tar.xdelta3
79rm -f $tar
80xz -f $tar.xdelta3
81
82mv -f $tar*.xz $diff $meta $snapshots
83
84echo New snapshot `basename $tar` created
85echo stem: $stem
086
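The metadata file that export-bzr.sh builds up with `echo ... >> $meta` is a flat `name: value` list that findancestor.py later parses back. A minimal round trip, assuming nothing beyond what the two files show (`write_meta` and `parse_meta` are illustrative names; the parsing rules follow `findancestor.parse`):

```python
# Round trip for the "name: value" metadata format written by
# export-bzr.sh and read back by findancestor.py.  Parsing skips
# blank lines and '#' comments and splits on the first ':' only.
def write_meta(fields):
    lines = ['# Branch summary']
    lines += ['%s: %s' % (k, v) for k, v in fields]
    return '\n'.join(lines) + '\n'

def parse_meta(text):
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        name, value = [x.strip() for x in line.split(':', 1)]
        values[name] = value
    return values
```

Splitting on the first `:` only is what keeps values that themselves contain colons, such as the RFC 3339 `snapshot-date`, intact.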
=== added file 'export-git.sh'
--- export-git.sh 1970-01-01 00:00:00 +0000
+++ export-git.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,75 @@
1#!/bin/sh
2#
3# Update a git branch and snapshot the latest revision
4#
5
6set -e
7
8. ~/.config/cbuild/toolsrc
9
10dir=`mktemp -d`
11trap "rm -rf $dir" EXIT
12
13# Directory this script runs from
14here=`dirname $0`
15
16# Suffix for manual pulls
17SUFFIX=${4:-}
18
19# Subdirectory to archive. Needed for Android GCC
20#SUBDIR=
21
22# Pull a branch and tarball it up
23branch=$1
24head=${2:-`basename $branch`}
25id=${3:-HEAD}
26
27export GIT_DIR=$branch/.git
28export GIT_WORK_TREE=$branch
29
30(cd $branch && git reset --hard -q && git pull -q)
31latest=`git rev-parse $id`
32short=`echo $latest | cut -c -7`
33now=`date +%Y%m%d`
34
35name=$head$now+git$short$SUFFIX
36tar=/tmp/$name.tar
37xdelta=$tar.xdelta3.xz
38
39if ls $snapshots/$head*git$short*xz > /dev/null 2>&1; then
40 exit 0
41fi
42
43mkdir -p $snapshots
44echo Exporting $short
45git archive --format=tar --prefix=$name/ $id $SUBDIR | tar xf - -C $dir
46
47if [ -n "$SUBDIR" ]; then
48 # We've ended up with $dir/$head/$SUBDIR. Shuffle them about
49 mv $dir/$head/$SUBDIR $dir
50 rm -r $dir/$head
51 mv $dir/$SUBDIR $dir/$head
52fi
53
54# Create the tarball
55echo Creating $tar
56(cd $dir && find $name -print0 | sort -z) | tar caf $tar -C $dir --no-recursion --null -T -
57
58# Use gzip as an approximation to how well it will compress
59size=$( gzip < $tar | wc -c )
60
61if [ "$size" -lt "10000000000" ]; then
62 # Small enough to ship as a plain tar.xz
63 xz -9f $tar
64else
65 # Create the delta
66 closest=`$here/closest.py $tar $snapshots/base/*.tar`
67 echo Making the delta relative to $closest
68 xdelta3 -fe -s $closest $tar $tar.xdelta3
69 rm -f $tar
70 xz -9f $tar.xdelta3
71fi
72
73mv $tar*.xz $snapshots
74echo New snapshot `basename $tar` created
75$here/spawn.sh $tar
076
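export-git.sh derives the snapshot stem from the date plus a seven-character commit hash, then picks plain xz or an xdelta3 delta based on a gzip size estimate. Both decisions, sketched with illustrative function names (the size limit is the literal from the script):

```python
SIZE_LIMIT = 10000000000  # bytes; the literal threshold in export-git.sh

def snapshot_name(head, date, sha, suffix=''):
    """Build the stem <head><YYYYMMDD>+git<7-char sha><suffix>."""
    return '%s%s+git%s%s' % (head, date, sha[:7], suffix)

def pick_compression(gzipped_size):
    """Small trees ship whole as .tar.xz; larger ones as an xdelta3
    delta against the closest base tarball, then xz'd."""
    return 'xz' if gzipped_size < SIZE_LIMIT else 'xdelta3+xz'
```

Note the threshold as written is roughly 10 GB of gzipped data, so in practice almost every tree takes the plain-xz branch.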
=== added file 'export-hg.sh'
--- export-hg.sh 1970-01-01 00:00:00 +0000
+++ export-hg.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,61 @@
1#!/bin/sh
2# Update a branch and snapshot the latest revision
3# Can also snapshot past revisions and add a per-user
4# suffix
5#
6# Example: export-hg.sh ~/repos/gcc-linaro/4.6 gcc-linaro-4.6 [revno [suffix [basedir]]]
7#
8set -e
9
10dir=`mktemp -d`
11trap "rm -rf $dir" EXIT
12
13# Directory this script runs from
14here=`dirname $0`
15
16# Suffix for manual pulls
17SUFFIX=${4:-}
18
19# Pull a branch and tarball it up
20branch=$1
21head=$2
22hgflags="-R $branch -y -q"
23
24hg $hgflags pull
25latest=`hg $hgflags -q tip | awk -F: '{print $1;}'`
26
27revno=${3:-$latest}
28
29base=${5:-$branch}
30
31snapshots=$HOME/snapshots
32stem=$head+hg$revno$SUFFIX
33tar=/tmp/$stem.tar
34xdelta=$tar.xdelta3.xz
35
36if [ -f $snapshots/`basename $xdelta` ]; then
37 exit 0
38fi
39
40closest=`$here/closest.py $tar $snapshots/base/*.tar`
41
42mkdir -p $snapshots
43
44echo Exporting and stamping $revno
45hg $hgflags archive -r $revno $dir/$head
46REVNO=$revno $here/stamp_branch.sh $branch $dir/$head
47
48# Create the tarball
49echo Creating $tar
50(cd $dir && find $head -print0 | sort -z) | tar caf $tar -C $dir --no-recursion --null -T -
51
52# Create the delta
53echo Making the delta relative to $closest
54xdelta3 -fe -s $closest $tar $tar.xdelta3
55rm -f $tar
56xz -f $tar.xdelta3
57
58mv -f $tar.xdelta3.xz $snapshots
59
60echo New snapshot `basename $tar` created
61echo stem: $stem
062
=== added file 'findancestor.py'
--- findancestor.py 1970-01-01 00:00:00 +0000
+++ findancestor.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,204 @@
1#!/usr/bin/python
2"""Given a build name, find the ancestor or the last working build of
3that series.
4
5Examples:
6 findancestor gcc-linaro-4.5+bzr99502~ams~merge-20110413 -> the ancestor
7 findancestor gcc-linaro-4.5+bzr99510 -> probably gcc-linaro-4.5+bzr99509
8 findancestor gcc-4.6+svn174684 -> the build from last week
9"""
10
11import sys
12import os.path
13import pprint
14import re
15import pdb
16import collections
17
18def parse(name):
19 """Parse the meta data file which is a series of name: value pairs"""
20 values = {}
21
22 with open(name) as f:
23 lines = [x.strip() for x in f.readlines()]
24
25 for line in lines:
26 if not line:
27 continue
28
29 if line.startswith('#'):
30 continue
31
32 name, value = [x.strip() for x in line.split(':', 1)]
33 values[name] = value
34
35 return values
36
37def revkey(record):
38 try:
39 if record.date:
40 return int(record.date.replace('~', ''))
41 else:
42 return int(record.revision)
43 except ValueError:
44 return record.revision
45
46def make_splitter():
47 """Splits a build name like gcc-4.5+svn1234,
48 gcc-linaro-4.6+bzr1234, and gdb-7.3~20110606+gitabcdef into the
49 product, series, optional date, version control system, and
50 revision.
51
52 >>> make_splitter().match('gcc-4.5~svn12345').groups()
53 ('gcc-', '4.5', None, '~svn', '12345')
54 >>> make_splitter().match('gcc-linaro-4.6+bzr4567').groups()
55 ('gcc-linaro-', '4.6', None, '+bzr', '4567')
56 >>> make_splitter().match('gdb-7.3~20110606+git057a947').groups()
57 ('gdb-', '7.3', '~20110606', '+git', '057a947')
58 """
59 return re.compile('([\D\-]+)([\d\.]+)([+~]\d+)?([+~][a-z]{3})(\w+)$')
60
61def make_release_splitter():
62 """Splits a release name like gcc-linaro-4.5-2011.03-0 into
63 product, series, and release.
64
65 >>> make_release_splitter().match('gcc-linaro-4.5-2011.03-0').groups()
66 ('gcc-linaro-', '4.5', '-2011.03-0')
67 >>> make_release_splitter().match('gcc-linaro-4.6-2012.03').groups()
68 ('gcc-linaro-', '4.6', '-2012.03')
69 """
70 return re.compile('([\D\-]+)([\d\.]+)(-[\d.\-]+)$')
71
72def find_via_meta(name, snapshots):
73 # See if we can find the meta data and extract the ancestor
74 metaname = os.path.join(snapshots, '%s.txt' % name)
75
76 try:
77 meta = parse(metaname)
78 revision, prev, mainline = meta['revision'], meta['ancestor'], meta['mainline-branch']
79
80 if revision != prev:
81 return '%s+bzr%s' % (mainline, prev.split()[0])
82 except Exception, ex:
83 pass
84
85 return ''
86
87def find_via_build(allfiles, name):
88 # Don't know the ancestor. Find the previous build
89 Record = collections.namedtuple('Record', 'product series date vcs revision')
90
91 splitter = make_splitter()
92 builds = set([x[1] for x in allfiles if x[-1] == 'finished.txt'])
93 builds = [splitter.match(x) for x in sorted(builds)]
94 builds = [Record(*x.groups()) for x in builds if x]
95
96 match = splitter.match(name)
97
98 if match:
99 this = Record(*match.groups())
100
101 # Grab the builds that match
102 matches = [x for x in builds
103 if x.product == this.product
104 and x.series == this.series
105 and x.vcs == this.vcs]
106
107 # Sort them by revno
108 matches.sort(key=revkey)
109
110 # Find this build and the one before
111 revnos = [x.revision for x in matches]
112
113 if this.revision in revnos:
114 idx = revnos.index(this.revision)
115
116 if idx > 0:
117 return ''.join(x for x in matches[idx-1] if x)
118
119 return ''
120
121def find_via_variant(allfiles, name):
122 # Don't know the ancestor. Find the same build without the variant.
123 if '^' not in name:
124 return None
125
126 root, variant = name.split('^')
127
128 builds = [x for x in allfiles if x[-1] == 'finished.txt' and x[-4] == root]
129
130 if builds:
131 return root
132
133 return ''
134
135def find_via_release(allfiles, name):
136 Record = collections.namedtuple('Record', 'product series release')
137
138 # Don't know the ancestor. Find the previous release
139 splitter = make_release_splitter()
140 builds = set([x[1] for x in allfiles if x[-1] == 'finished.txt'])
141 builds = [splitter.match(x) for x in sorted(builds)]
142 builds = [Record(*x.groups()) for x in builds if x]
143
144 match = splitter.match(name)
145
146 if match:
147 this = Record(*match.groups())
148
149 # Grab the builds that match
150 matches = [x for x in builds
151 if x.product == this.product
152 and x.series == this.series]
153
154 # Sort them by release
155 matches.sort(key=lambda x: x.release)
156
157 # Find this build and the one before
158 releases = [x.release for x in matches]
159
160 if this.release in releases:
161 idx = releases.index(this.release)
162
163 if idx > 0:
164 return ''.join(x for x in matches[idx-1] if x)
165
166 return ''
167
168def find(allfiles, name, snapshots):
169 ancestor = find_via_meta(name, snapshots)
170
171 if ancestor:
172 return ancestor
173
174 ancestor = find_via_variant(allfiles, name)
175
176 if ancestor:
177 return ancestor
178
179 ancestor = find_via_build(allfiles, name)
180
181 if ancestor:
182 return ancestor
183
184 ancestor = find_via_release(allfiles, name)
185
186 if ancestor:
187 return ancestor
188
189 return ''
190
191def main():
192 allfiles, name = sys.argv[1:]
193
194 with open(allfiles) as f:
195 allf = [x.strip().split(os.sep) for x in f.readlines()]
196
197 print find(allf, name, '')
198
199def test():
200 import doctest
201 doctest.testmod()
202
203if __name__ == '__main__':
204 main()
0205
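The two splitter regexes in findancestor.py carry doctests; reproduced standalone, their documented behaviour can be checked directly:

```python
import re

# The build-name and release-name splitters from findancestor.py,
# reproduced verbatim so the doctest examples above can be exercised
# outside the module.
splitter = re.compile(r'([\D\-]+)([\d\.]+)([+~]\d+)?([+~][a-z]{3})(\w+)$')
release_splitter = re.compile(r'([\D\-]+)([\d\.]+)(-[\d.\-]+)$')
```

The optional third group only matches when a date precedes the VCS marker, which is why `gcc-4.5~svn12345` yields `None` there while `gdb-7.3~20110606+git057a947` yields `~20110606`.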
=== added file 'findlogs.py'
--- findlogs.py 1970-01-01 00:00:00 +0000
+++ findlogs.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,113 @@
1#!/usr/bin/python
2"""Given a log file from a build and the build's ancestor, find the
3corresponding log file for the ancestor.
4
5Used to find the GCC testsuite results from the ancestor so that they
6can be compared.
7
8Example:
9 findlogs.py build/all-files.txt build/gcc-4.6+svn173722/logs/armv7l-maverick-cbuild114-ursa4-cortexa9r1/gcc.sum.xz gcc-4.6+svn173209
10 "gcc-4.6+svn173209 armv7l-maverick-cbuild113-ursa3-cortexa9r1"
11
12"""
13
14import sys
15import os.path
16import pprint
17import re
18import pdb
19import logging
20
21def prefilter(files, contains):
22 # Select those that could be a log
23 files = [x[-4:] for x in files if len(x) >= 4]
24 # ...and those that match the build log
25 files = [x for x in files if contains in x[-1]]
26
27 return files
28
29def find(allfiles, log, ancestor):
30 logging.debug("FIND: input log %s" % (log))
31 logging.debug("FIND: input ancestor %s" % (ancestor))
32 log = log.split('/')
33 arch, osname, cbuild, host, config = log[-2].split('-')
34
35 # Tidy up the ancestor
36 match = re.match('lp\:([^/]+)(/.+)?([+~]\w+\d+)', ancestor)
37
38 if match:
39 branch, series, revision = match.groups()
40
41 if series:
42 series = series[1:]
43 elif branch == 'gcc-linaro':
44 # gcc-linaro has been different series at different
45 # times. Patch.
46 match = re.match('\+bzr(\d+)', revision)
47
48 if match:
49 revno = int(match.group(1))
50 if revno < 100000:
51 series = '4.5'
52 elif revno < 110000:
53 series = '4.6'
54 elif revno < 121996:
55 series = '4.7'
56 else:
57 series = '4.8'
58
59 logging.debug("FIND: ancestor has branch %s series %s" % (branch, series))
60
61 elif branch == 'gcc':
62 # gcc is the same as gcc/4.8
63 match = re.match('\+bzr(\d+)', revision)
64
65 if match:
66 revno = int(match.group(1))
67 if revno < 120000:
68 series = '4.7'
69 else:
70 series = '4.8'
71 else:
72 assert False, 'Unrecognised development focus for %s' % branch
73
74 ancestor = '%s-%s%s' % (branch, series, revision)
75 logging.debug("FIND: updated ancestor to %s" % (ancestor))
76
77 # Select those that could be a log
78 files = [x[-4:] for x in allfiles if len(x) >= 4]
79
80 # Select results from the ancestor
81 files = [x for x in files if x[-4] == ancestor]
82 # ...and those that match the build log
83 files = [x for x in files if x[-1] == log[-1]]
84
85 # Split out the build ID
86 for f in files:
87 f[-2] = (f[-2].split('-') + ['']*5)[:5]
88
89 # ...and that have the same architecture, config, and os
90 files = [x for x in files if x[-2][0] == arch and x[-2][-1] == config and x[-2][1] == osname]
91
92 # Sort by cbuild number
93 files.sort(key=lambda x: int(x[-2][2].replace('cbuild', '')))
94
95 if files:
96 best = files[-1]
97 return (best[0], '-'.join(best[2]))
98 else:
99 return None
100
101def main():
102 allfiles, log, ancestor = sys.argv[1:]
103
104 with open(allfiles) as f:
105 allf = [x.strip().split(os.sep) for x in f.readlines()]
106
107 best = find(allf, log, ancestor)
108
109 if best:
110 print ' '.join(best)
111
112if __name__ == '__main__':
113 main()
0114
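When the ancestor is a bare `lp:gcc-linaro` branch, findlogs.py infers the release series from hard-coded revno thresholds. That mapping in isolation (`linaro_series` is an illustrative name; the thresholds are verbatim from the file above):

```python
def linaro_series(revno):
    """Map a bzr revno on lp:gcc-linaro to its release series,
    using the thresholds hard-coded in findlogs.py."""
    if revno < 100000:
        return '4.5'
    elif revno < 110000:
        return '4.6'
    elif revno < 121996:
        return '4.7'
    return '4.8'
```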
=== added file 'gcc-release.sh'
--- gcc-release.sh 1970-01-01 00:00:00 +0000
+++ gcc-release.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,168 @@
1#!/bin/bash
2#
3# Run the steps in the GCC release process
4#
5# gcc-release.sh lp:gcc-linaro/4.7 [respin]
6#
7
8set -e
9
10branch=$1
11[ -z "$branch" ] && echo 'Missing branch name' && exit -1
12
13if [ -n "$2" ]; then
14 respin=-$2
15 nextspin=$(($2 + 1))
16else
17 respin=
18 nextspin=1
19fi
20
21# Pull the author and email from bzr whoami
22me=$( bzr whoami )
23fullname=$( echo "$me" | awk -F\< '{print $1;}' | sed 's# *$##' )
24email=$( echo "$me" | awk -F\< '{print $2;}' | tr -d \> )
25
26year=$( date +%Y )
27month=$( date +%m )
28day=$( date +%d )
29
30series=$( echo $branch | sed -r "s#.+/(.+)#\1#" )
31# Die if the series can't be found in the branch name
32[ -n "$series" ]
33
34# For testing:
35# Stack on top of a local branch to cut the branch time
36#stacked=--stacked
37# Override the month
38#month=07
39
40# Use ~/.ssh/config to set up the validation lab bounce host
41host=cbuild@toolchain64.lab
42snapshots="~/var/snapshots"
43
44release=$year.$month$respin
45
46mkdir -p release/$series-$release
47cd release/$series-$release
48ln -sf ~/linaro/gcc/.bzr
49
50# Create a new branch if needed
51if [ ! -d tree/.bzr ]; then
52 echo "Branching $branch"
53 bzr branch $stacked $branch tree
54fi
55
56# Add the release ChangeLog entry and bump the version
57if ! head -n 50 tree/ChangeLog.linaro | grep -q "GCC Linaro $series-$release released"; then
58 echo "Adding release entry to the ChangeLog"
59 cat > message <<EOF
60$year-$month-$day $fullname <$email>
61
62 GCC Linaro $series-$release released.
63
64 gcc/
65 * LINARO-VERSION: Update.
66
67EOF
68 cat message tree/ChangeLog.linaro > tmp
69 mv tmp tree/ChangeLog.linaro
70
71 echo $series-$release > tree/gcc/LINARO-VERSION
72
73 echo "Tagging and committing"
74 bzr commit -m "Make $series-$release release." tree
75 bzr tag -d tree gcc-linaro-$series-$release
76fi
77
78work=gcc-linaro-$series-$release
79
80if [ ! -d $work ]; then
81 echo "Exporting"
82 bzr export $work.tmp tree
83 mv $work.tmp $work
84fi
85
86cd $work
87
88# Update the manuals
89env SOURCEDIR=`pwd`/gcc/doc DESTDIR=`pwd`/INSTALL ./gcc/doc/install.texi2html
90
91# Build the tree and translations
92if [ ! -d ../objdir/gcc/po ]; then
93 echo "Building"
94 ./contrib/gcc_build -d `pwd` -o ../objdir -c "--enable-generated-files-in-srcdir --disable-multilib" -m \-sj4 build > ../build.txt 2>&1
95fi
96
97# Copy the translations across
98cp -uv ../objdir/gcc/po/*.gmo gcc/po/
99
100# Make the md5sums
101if [ ! -f MD5SUMS ]; then
102 echo "Making MD5SUMS"
103 cat > MD5SUMS <<EOF
104# This file contains the MD5 checksums of the files in the
105# $work.tar.bz2 tarball.
106#
107# Besides verifying that all files in the tarball were correctly expanded,
108# it also can be used to determine if any files have changed since the
109# tarball was expanded or to verify that a patchfile was correctly applied.
110#
111# Suggested usage:
112# md5sum -c MD5SUMS | grep -v "OK\$"
113EOF
114 find . -type f | sed -e 's:^\./::' -e '/MD5SUMS/d' | sort | xargs md5sum >> MD5SUMS
115fi
116
117cd ..
118
119next=$series-$release-$nextspin~dev
120
121# Bump the version
122if ! grep -q $next tree/gcc/LINARO-VERSION; then
123 cat > message <<EOF
124$year-$month-$day $fullname <$email>
125
126 gcc/
127 * LINARO-VERSION: Bump version.
128
129EOF
130 cat message tree/ChangeLog.linaro > tmp
131 mv tmp tree/ChangeLog.linaro
132
133 echo $next > tree/gcc/LINARO-VERSION
134 bzr commit -m "Bump version number, post release." tree
135fi
136
137# Tar it up
138if [ ! -f $work.tar.bz2 ]; then
139 # Check for bad files
140 for ext in .orig .rej "~" .svn .bzr; do
141 if [ -n "$(find $work -name "*$ext")" ]; then
142 echo "Found $ext files in the tarball"
143 exit -2
144 fi
145 done
146
147 echo "Making tarball"
148 tar cf $work.tar $work
149 bzip2 -9 $work.tar
150fi
151
152# Only sign on the build infrastructure
153if [ ! -f $work.tar.bz2.asc ]; then
154 read -p "Upload the tarball to snapshots and sign (y/n)? "
155
156 if [ "$REPLY" = "y" ]; then
157 echo "Pushing to snapshots"
158 rsync --progress -a $work.tar.bz2 "$host:$snapshots"
159 # Password is the EEMBC password on https://wiki.linaro.org/Internal/ToolChain
160 ssh -t $host gpg --no-use-agent -q --yes --passphrase-file /home/cbuild/.config/cbuild/password --armor --sign --detach-sig --default-key cbuild "$snapshots/$work.tar.bz2"
161 # Pull the signature back down and check
162 rsync -a "$host:$snapshots/$work*asc" .
163 gpg -q --verify $work.tar.bz2.asc
164 fi
165fi
166
167echo "Done. Results are in $(pwd). Spawn at http://ex.seabright.co.nz/helpers/scheduler/spawn."
168echo "Don't forget to push!"
0169
=== added file 'getbody.py'
--- getbody.py 1970-01-01 00:00:00 +0000
+++ getbody.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,26 @@
1#!/usr/bin/python
2"""Extracts the body of a GCC contrib/test_summary email by extracting
3the lines between a cat <<'EOF' and the EOF.
4"""
5
6import fileinput
7import re
8
9def main():
10 lines = iter([x.rstrip() for x in fileinput.input()])
11
12 while True:
13 line = lines.next()
14 if re.match('cat\s*<<', line):
15 break
16
17 while True:
18 line = lines.next()
19
20 if re.match('EOF', line):
21 break
22 else:
23 print line
24
25if __name__ == '__main__':
26 main()
027
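The same extraction without `fileinput`, as a list-in/list-out helper (the name is illustrative; the loose `re.match('EOF', ...)` terminator test is kept exactly as in the script):

```python
import re

def get_body(lines):
    """Return the lines between the first cat <<EOF marker and the
    next line starting with EOF, as getbody.py does."""
    body, inside = [], False
    for line in (l.rstrip() for l in lines):
        if not inside:
            if re.match(r'cat\s*<<', line):
                inside = True
        elif re.match(r'EOF', line):
            break
        else:
            body.append(line)
    return body
```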
=== added file 'lab-pull.sh'
--- lab-pull.sh 1970-01-01 00:00:00 +0000
+++ lab-pull.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,22 @@
1#!/bin/bash
2#
3# Pull results from the lab into this server
4#
5
6dir=$HOME/incoming/validation
7www=/var/www/ex.seabright.co.nz
8
9mkdir -p $dir
10
11for j in build benchmarks; do
12 # Fetch everything
13 timeout 600 rsync -e "ssh -C" -itr "control.v:~/cbuild/$j/*/*-logs.*" $dir/$j
14
15 # Extract those that are new
16 cd $www
17
18 for i in $dir/$j/*-logs*.xz; do
19 if [ $i.done -nt $i ]; then continue; fi
20 tar xaf $i && touch $i.done
21 done
22done
023
=== added file 'latestami.sh'
--- latestami.sh 1970-01-01 00:00:00 +0000
+++ latestami.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,17 @@
1#!/bin/bash
2#
3# Get the AMI of the latest release of a series
4#
5
6distro=${1:-precise}
7arch=${2:-amd64}
8region=us-east-1
9storage=ebs
10
11wget -q -O - https://cloud-images.ubuntu.com/query/$distro/server/released.current.txt \
12 | grep $region \
13 | grep $arch \
14 | grep $storage \
15 | grep -v hvm \
16 | awk '{ print $8; }'
17
018
=== added file 'launcher.sh'
--- launcher.sh 1970-01-01 00:00:00 +0000
+++ launcher.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,146 @@
1#!/bin/bash
2#
3# Takes all incoming events and launches them. Done as one large
4# script to keep everything together and remove contention and locking
5# issues.
6#
7
8. ~/.config/cbuild/toolsrc
9
10# Use run=echo to test
11run=
12
13#set -e
14
15# Pull in the local configuration
16hostrc=$( dirname $0 )/launcher-$( hostname ).rc
17[ -f $hostrc ] && . $hostrc
18
19function remove_old_jobs {
20 # Nuke any daily jobs early
21 $run find $scheduler -name "gcc-4.*svn*.job" -mtime +5 -exec rm -f {} \;
22 # Nuke any old jobs
23 $run find $scheduler -name "*.job" -mtime +14 -exec rm -f {} \;
24 # Nuke any old locks
25 $run find $scheduler -name "*.job.*" -mtime +20 -exec rm -f {} \;
26}
27
28function daily {
29 remove_old_jobs
30 # Pull the active trunk branches once a day
31 PRIORITY=daily $run $ctools/up_branch.sh $repos/gcc-trunk gcc-4.8~
32 PRIORITY=daily $run $ctools/up_branch.sh $repos/gcc-4.7 gcc-4.7+
33 # Update the md5sums on all snapshots
34 (cd $snapshots && $run rm -f md5sums && $run md5sum *z* > md5sums)
35 # Remove older files from S3
36 $run python $ctools/s3-diet.py linaro-toolchain-builds
37 # Check for leaking benchmark results
38 $run $ctools/check-leaks.sh
39 # Update the benchmarks graph
40 local pattern="logs/arm*precise*ursa1?-*a9hf*/benchmarks.txt"
41 local diffplot="python $lib/linaro-toolchain-benchmarks/scripts/diffplot.py"
42 (cd $benchmarks \
43 && $run $diffplot gcc-4.8~svn.png gcc-4.7.0/$pattern gcc-4.8~svn??????/$pattern \
44 && $run $diffplot gcc-linaro-4.7.png gcc-4.7.0/$pattern gcc-linaro-4.7-201?.??/$pattern \
45 && $run $diffplot gcc-linaro-4.7+bzr.png gcc-linaro-4.7+bzr??????/$pattern \
46 )
47}
48
49function build_many {
50 # Build a range of branches
51 local dow=$1
52
53 case $dow in
54 1)
55 # Benchmark the latest tip build
56 $run $ctools/chain.sh benchmarks "gcc-linaro-4.7+bzr??????"
57 $run $ctools/chain.sh oecore "gcc-linaro-4.7+bzr??????.*"
58 ;;
59 2)
60 $run $ctools/export-git.sh $repos/newlib newlib-1.21~
61 $run $ctools/export-git.sh $repos/qemu-git qemu-1.4~
62 $run $ctools/up_branch.sh $repos/gcc-arm-embedded-4.6 gcc-arm-embedded-4.6+
63 $run $ctools/chain.sh ubutest "gcc-4.6+*"
64 ;;
65 3)
66 # Ensure we get at least one build of trunk a week
67 $run $ctools/chain.sh trunk "gcc-4.8~*"
68 $run $ctools/export-git.sh $repos/openembedded-core openembedded-core-1.4~
69 $run $ctools/export-git.sh $repos/bitbake bitbake-1.15+
70 ;;
71 4)
72 $run $ctools/chain.sh trunk "gcc-4.7+*"
73 $run $ctools/up_branch.sh $repos/gcc-arm-aarch64-4.7 gcc-arm-aarch64-4.7+
74 # Many updates for LLVM, only one tarball (no version number) at the end
75 $run $ctools/up_branch.sh $repos/llvm/tools/clang
76 $run $ctools/up_branch.sh $repos/llvm/tools/clang/tools/extra
77 $run $ctools/up_branch.sh $repos/llvm/projects/compiler-rt
78 $run $ctools/up_branch.sh $repos/llvm llvm-svn~
79 ;;
80 5)
81 # Benchmark the latest trunk build
82 $run $ctools/chain.sh benchmarks "gcc-4.8~*"
83 # Make sure 4.6 is built once a week
84 $run $ctools/up_branch.sh $repos/gcc-4.6 gcc-4.6+
85 $run $ctools/up_branch.sh $repos/eglibc/libc eglibc-2.18~
86 $run $ctools/export-git.sh $repos/libffi libffi-3.1~
87 ;;
88 6)
89 $run $ctools/chain.sh ubutest "gcc-4.8~*"
90 # Make sure 4.7 is built once a week
91 $run $ctools/up_branch.sh $repos/gcc-4.7 gcc-4.7+
92 $run $ctools/export-git.sh $repos/gdb gdb-7.6~
93 $run $ctools/up_branch.sh $repos/gcc-google-4.7 gcc-google-4.7+
94 ;;
95 7)
96 $run $ctools/chain.sh ubutest "gcc-4.7+*"
97 $run $ctools/export-git.sh $repos/binutils binutils-2.24~
98 ;;
99 esac
100}
101
102function dow {
103 local dow=$1
104
105 build_many $dow
106
107 case $dow in
108 4)
109 launch weekly
110 ;;
111 esac
112}
113
114function weekly {
115 true
116}
117
118function often {
119 true
120}
121
122function launch {
123 local event=$1
124 local arg=$2
125
126 case $event in
127 daily)
128 launch dow $( date +%u )
129 daily $arg
130 ;;
131 dow)
132 dow $arg
133 ;;
134 weekly)
135 weekly
136 ;;
137 often)
138 often
139 ;;
140 esac
141}
142
143event=$1; shift
144
145echo launcher: running $event on $( date +%u )
146launch $event $1
0147
=== added file 'libec2.sh'
--- libec2.sh 1970-01-01 00:00:00 +0000
+++ libec2.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,78 @@
1# Common functions for spawning and initialising an EC2 instance
2#
3
4. ~/.config/cbuild/toolsrc
5. $lib/cbuild/siterc
6
7wget="wget --no-proxy -q"
8this=ec2slave
9
10hostname=${1:-oort99}
11ami=${2:-missing-ami}
12# 64 bit high-CPU
13type=${3:-c1.xlarge}
14kill=${kill:-true}
15
16region=us-east-1
17
18lock="flock /tmp/heavy.lock"
19
20# Bring in any local configuration
21[ -f ~/.private/$this.rc ] && . ~/.private/$this.rc
22
23dir=/tmp/$this/$hostname
24
25function get_state {
26 $lock ec2din --show-empty-fields $instance > $dir/din
27 line=`grep ^INSTANCE $dir/din`
28 state=`echo $line | awk '{ print $6; }'`
29 if [ -f ~/.in-cloud ]; then
30 # Use the local IP
31 ip=`echo $line | awk '{ print $5; }'`
32 else
33 # Use the public IP
34 ip=`echo $line | awk '{ print $4; }'`
35 fi
36
37 echo get_state on $instance: $state $ip
38}
39
40function start_instance {
41 $wget -O - $SCHEDULER_API/update/$hostname/starting || true
42 mkdir -p $dir
43
44 $lock ec2run $ami -t $type --region $region -k $key \
45 --instance-initiated-shutdown-behavior terminate > $dir/run
46 instance=`grep ^INSTANCE $dir/run | awk '{ print $2; }'`
47
48 if [ "$kill" = "true" ]; then trap "$lock ec2kill $instance" EXIT; fi
49
50 get_state
51
52 while [ "$state" != "running" ]; do
53 sleep 5
54 get_state
55 done
56
57 # Wait for Ubuntu to boot
58 for i in {1..10}; do
59 if ssh -o StrictHostKeyChecking=no ubuntu@$ip true; then break; fi
60 sleep 10
61 done
62
63 # Test the connection
64 ssh ubuntu@$ip uname -a
65
66 echo Instance $instance started
67}
68
69function init_instance {
70 # Do the system wide setup
71 scp $this-init.sh ~/.private/$this-*seed* ubuntu@$ip:~
72 ssh ubuntu@$ip sudo bash $this-init.sh $hostname
73 # And the user specific
74 scp $this-init-cbuild.sh cbuild@$ip:~
75 ssh -t cbuild@$ip bash $this-init-cbuild.sh
76 # Copy the site configuration across
77 scp $lib/cbuild/siterc cbuild@$ip:/cbuild
78}
079
=== added file 'pull_branch.sh'
--- pull_branch.sh 1970-01-01 00:00:00 +0000
+++ pull_branch.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,21 @@
1#!/bin/sh
2# Update a branch, snapshot the latest revision, and spawn the jobs
3
4set -e
5
6# Directory this script runs from
7here=`dirname $0`
8
9# Pull a branch and tarball it up
10branch=$1
11head=$2
12
13# Silly bzr...
14rm -rf $branch/.bzr/checkout/limbo $branch/.bzr/checkout/pending-deletion
15bzr pull -q --overwrite -d $branch
16
17stem=`$here/export-bzr.sh $branch $head $3 $4 $5 $6 $7 | grep ^stem: | awk '{print $2;}'`
18
19if [ "$stem" != "" ]; then
20 $here/spawn.sh $stem
21fi
022
=== added file 's3-diet.py'
--- s3-diet.py 1970-01-01 00:00:00 +0000
+++ s3-diet.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,91 @@
1#!/usr/bin/env python
2
3"""Put a S3 bucket on a diet by removing old files. Used to delete
4old builds from the cbuild archive.
5"""
6
7import collections
8import sys
9import datetime
10import re
11import os.path
12import ConfigParser
13import logging
14
15import simples3
16import hurry.filesize
17
18
19Entry = collections.namedtuple('Entry', 'key timestamp etag size basename')
20
21
22def main():
23 dry_run = False
24
25 logging.basicConfig(level=logging.INFO)
26
27 config = ConfigParser.SafeConfigParser()
28 config.read(os.path.expanduser('~/.s3cfg'))
29
30 bucket = simples3.S3Bucket(sys.argv[1],
31 access_key=config.get('default', 'access_key'),
32 secret_key=config.get('default', 'secret_key'))
33
34
35 by_key = {}
36 entries = []
37
38 for v in bucket.listdir():
39 entry = Entry(*(v + (os.path.basename(v[0]), )))
40 by_key[entry.key] = entry
41 entries.append(entry)
42
43 # Do some linting
44 to_delete = []
45
46 # Delete the signature if the original file is gone
47 for entry in entries:
48 if entry.key.endswith('.asc'):
49 original = entry.key.replace('.asc', '')
50
51 if original not in by_key:
52 logging.debug('Deleting orphan signature %s' % entry.basename)
53 to_delete.append(entry)
54
55 # Delete anything that's too old
56 now = datetime.datetime.utcnow()
57
58 for entry in entries:
59 age = now - entry.timestamp
60
61 is_merge = '+bzr' in entry.basename and len(entry.basename.split('~')) == 3
62
63 if re.search(r'-201.\.\d\d', entry.basename) and '+bzr' not in entry.basename:
64 logging.debug("Not deleting %s as it's special" % entry.basename)
65 elif is_merge and age.days >= 30:
66 logging.debug('Deleting %d day old merge %s' % (age.days, entry.basename))
67 to_delete.append(entry)
68 elif age.days >= 45:
69 logging.debug('Deleting %d day old %s' % (age.days, entry.basename))
70 to_delete.append(entry)
71 else:
72 pass
73
74 # Nuke any duplicates
75 to_delete = sorted(set(to_delete))
76
77 total = sum(x.size for x in entries)
78 deleting = sum(x.size for x in to_delete)
79
80 logging.info('Total stored: %s in %d entries' % (hurry.filesize.size(total), len(entries)))
81 logging.info('To delete: %s in %d entries' % (hurry.filesize.size(deleting), len(to_delete)))
82
83 for entry in sorted(to_delete, key=lambda x: -x.size):
84 logging.info('Deleting %s' % entry.basename)
85
86 if not dry_run:
87 del bucket[entry.key]
88
89if __name__ == '__main__':
90 main()
91
092
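Review note: the retention policy in s3-diet.py is compact but easy to misread. This stand-alone sketch (with invented file names and a fixed clock) restates the three rules: dated releases are kept, merge snapshots live 30 days, everything else 45.

```python
import collections
import datetime
import re

# Stand-alone sketch of s3-diet.py's retention rules. The Entry type and
# file names here are illustrative; the real script lists them from S3.
Entry = collections.namedtuple('Entry', 'basename timestamp')

def should_delete(entry, now):
    age = now - entry.timestamp
    # A merge snapshot looks like prefix+bzrNNN~owner~branch
    is_merge = '+bzr' in entry.basename and len(entry.basename.split('~')) == 3

    if re.search(r'-201.\.\d\d', entry.basename) and '+bzr' not in entry.basename:
        return False  # dated release: kept forever
    if is_merge:
        return age.days >= 30  # merge snapshots expire after 30 days
    return age.days >= 45      # everything else after 45 days

now = datetime.datetime(2014, 7, 1)
old_merge = Entry('gcc-linaro-4.8+bzr115011~user~branch.tar.xz',
                  datetime.datetime(2014, 5, 1))
release = Entry('gcc-linaro-4.8-2014.04.tar.xz', datetime.datetime(2013, 1, 1))
print(should_delete(old_merge, now))
print(should_delete(release, now))
```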
=== added file 'spawn.sh'
--- spawn.sh 1970-01-01 00:00:00 +0000
+++ spawn.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,36 @@
1#!/bin/bash
2#
3# Spawn a job based on the queues
4#
5# Examples:
6# spawn.sh gcc-4.5+svn12345
7# spawn.sh ~/path/to/thing-to-build.tar.xz
8# spawn.sh ~/path/to/thing-to-build.tar.xz queue-name
9#
10
11set -e
12
13. ~/.config/cbuild/toolsrc
14
15# Directory this script runs from
16here=`dirname $0`
17
18job=$1
19
20priority=${PRIORITY:-.}
21
22# Tidy up job so that you can also supply a path
23job=`basename $job`
24job=`echo $job | sed -r 's/(.+)\.tar.*/\1/'`
25
26queue=${2:-$job}
27closest=$($here/closest.py $queue $scheduler/spawn/$priority/*)
28
29echo Spawning $job into the queue `basename $closest`
30
31# "Demux" via spawn queue into real job queues, use tcwg-web's schedulejob.py
32# script for scheduling jobs with LAVA support, etc.
33for i in $closest/*; do
34 PYTHONPATH=$lib/tcwg-web python -m schedulejob `basename $i` $job
35 echo Spawned into `basename $i`
36done
037
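Review note: the job-name tidy-up in spawn.sh can be exercised on its own; the snapshot path below is made up.

```shell
#!/bin/sh
# Demo of spawn.sh's tidy-up step: a full path to a snapshot tarball is
# reduced to the bare job name before queue matching (paths illustrative).
tidy() {
  job=$(basename "$1")
  echo "$job" | sed -r 's/(.+)\.tar.*/\1/'
}

tidy /home/cbuild/snapshots/gcc-linaro-4.7+bzr115011.tar.xz
tidy gcc-4.5+svn12345
```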
=== added file 'stamp_branch.sh'
--- stamp_branch.sh 1970-01-01 00:00:00 +0000
+++ stamp_branch.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,70 @@
1#!/bin/bash
2repo=$1
3into=$2
4
5# Check for known version control systems.
6if [ -d $repo/.bzr ]; then
7 GCC_BZR=${GCC_BZR-${BZR-bzr}}
8 vcs_type="bzr"
9elif [ -d $repo/.svn ]; then
10 GCC_SVN=${GCC_SVN-${SVN-svn}}
11 vcs_type="svn"
12else
13 echo "This does not seem to be a GCC SVN/Bzr tree!"
14 exit 1
15fi
16
17rm -f $into/LAST_UPDATED $into/gcc/REVISION
18
19case $vcs_type in
20 svn)
21 revision=${REVNO:-`svn info $1 | awk '/Revision:/ { print $2 }'`}
22 branch=`svn info $1 | sed -ne "/URL:/ {
23 s,.*/trunk,trunk,
24 s,.*/branches/,,
25 s,.*/tags/,,
26 p
27 }"`
28 ;;
29
30 bzr)
31 revision=${REVNO:-`bzr revno $repo`}
32 parent=`bzr info $repo | gawk '/parent branch/ { print $3; }'`
33
34 # Keep the branch name out of the version to prevent spurious
35 # test failures.
36 #
37 # ...net/%2Bbranch/gcc-linaro/4.7/ becomes gcc-linaro/4.7
38 # ...net/~michaelh1/gcc-linaro/better-longlong/ becomes gcc-linaro/~michaelh1
39 #
40 parts=( $(echo $parent | tr '/' '\n' | tac) )
41 echo ${parts[@]}
42 n3=${parts[0]}
43 n2=${parts[1]}
44 n1=${parts[2]}
45
46 case $n1,$n2,$n3 in
47 *branch,*,*) branch="$n2/$n3" ;;
48 \~*,*,*) branch="$n2/dev" ;;
49 *) branch="$n2" ;;
50 esac
51 ;;
52esac
53
54{
55 date
56 echo "`TZ=UTC date` (revision $revision)"
57} > $into/LAST_UPDATED
58
59# And the product specific parts
60
61# GCC
62if [ -f $into/gcc/version.c ]; then
63 echo "[$branch revision $revision]" > $into/gcc/REVISION
64fi
65
66# crosstool-NG
67if [ -f $into/ct-ng.in ]; then
68 sed -i -r "s#bzr.+#bzr$revision#" $into/version.sh
69 sed -i -r "s#^REVISION\s*=.*#REVISION = +bzr$revision#" $into/contrib/linaro/build.mk
70fi
071
=== added file 'tabulate.sh'
--- tabulate.sh 1970-01-01 00:00:00 +0000
+++ tabulate.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,36 @@
1#!/bin/bash
2#
3# Tabulate any new benchmark results
4#
5
6. ~/.config/cbuild/toolsrc
7
8tmp=`mktemp -td tabulate.XXXXXXXXXX`
9trap "rm -rf $tmp" EXIT
10
11base=${1:-$benchmarks}
12scripts=${2:-$HOME/lib/linaro-toolchain-benchmarks/scripts}
13
14tabulate() {
15 filter=$1
16 suffix=$2
17
18 name=benchmarks$suffix.txt
19
20 for i in $base/gcc*/logs/*/$filter.txt; do
21 dir=$(dirname $i)
22
23 # Skip if the tabulated version is newer
24 [ ! $i -nt $dir/$name ] && continue
25
26 echo $dir...
27 python $scripts/tabulate.py $dir/$filter.txt > $tmp/$name \
28 && touch -r $i $tmp/$name \
29 && mv $tmp/$name $dir
30 done
31}
32
33tabulate "*run" ""
34tabulate "spec2000-*run" "-spec2000"
35tabulate "eembc-*run" "-eembc"
36tabulate "eembc_office-*run" "-eembc_office"
037
=== added file 'taker.json'
--- taker.json 1970-01-01 00:00:00 +0000
+++ taker.json 2014-07-01 15:58:40 +0000
@@ -0,0 +1,82 @@
1{"build": "http://ex.seabright.co.nz/build/",
2 "instance": "production",
3 "projects": ["gcc-linaro", "gcc", "crosstool-ng", "gdb-linaro"],
4 "prompt": true,
5
6 "hosts": {
7 "crucis": {
8 "dry-run": true,
9 "repos": "/home/michaelh/linaro",
10 "all-files": "build/all-files.txt",
11 "tools": ".",
12 "build": "build"
13 },
14 "orion": {
15 "repos": "/home/cbuild/repos",
16 "all-files": "/var/www/ex.seabright.co.nz/build/all-files.txt",
17 "tools": "/home/cbuild/lib/cbuild-tools",
18 "build": "/var/www/ex.seabright.co.nz/build"
19 },
20 "cbuild-master": {
21 "repos": "/home/cbuild/var/repos",
22 "all-files": "/home/cbuild/public_html/build/all-files.txt",
23 "tools": "/home/cbuild/lib/cbuild-tools",
24 "build": "/home/cbuild/public_html/build"
25 },
26 "toolchain64": {
27 "repos": "/home/cbuild/var/repos",
28 "all-files": "/home/cbuild/public_html/build/all-files.txt",
29 "tools": "/home/cbuild/lib/cbuild-tools",
30 "build": "/home/cbuild/public_html/build"
31 }
32 },
33 "filters": {
34 "xinclude": ["lp663939"]
35 },
36 "branches": {
37 "lp:gcc-linaro": {
38 "prefix": "gcc-linaro-4.8",
39 "reporoot": "gcc-linaro-4.8"
40 },
41 "lp:gcc-linaro/4.5": {
42 "prefix": "gcc-linaro-4.5",
43 "reporoot": "gcc-linaro"
44 },
45 "lp:gcc-linaro/4.6": {
46 "prefix": "gcc-linaro-4.6",
47 "reporoot": "gcc-linaro"
48 },
49 "lp:gcc-linaro/4.7": {
50 "prefix": "gcc-linaro-4.7",
51 "reporoot": "gcc-linaro"
52 },
53 "lp:gcc-linaro/4.8": {
54 "prefix": "gcc-linaro-4.8",
55 "reporoot": "gcc-linaro-4.8"
56 },
57 "lp:gcc-linaro/4.6-2011.07-stable": {
58 "prefix": "gcc-linaro-4.6-2011.07-stable",
59 "reporoot": "gcc-linaro"
60 },
61 "lp:gcc-linaro/4.4": {
62 "prefix": "gcc-linaro-4.4",
63 "reporoot": "gcc-linaro"
64 },
65 "lp:gcc/4.6": {
66 "prefix": "gcc-4.6",
67 "reporoot": "gcc-linaro"
68 },
69 "lp:gcc": {
70 "prefix": "gcc-4.8",
71 "reporoot": "gcc-bzr"
72 },
73 "lp:gdb-linaro": {
74 "prefix": "gdb-linaro-7.5",
75 "reporoot": "gdb-linaro"
76 },
77 "lp:~linaro-toolchain-dev/crosstool-ng/linaro": {
78 "prefix": "crosstool-ng-linaro-1.13.1",
79 "reporoot": "crosstool-ng"
80 }
81 }
82}
083
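Review note: taker.py consumes this file through a lookup where a "hosts" entry shadows the top-level value. A minimal sketch of that fallback, with a trimmed-down config and an explicit hostname argument in place of socket.gethostname():

```python
import json

# Trimmed-down copy of the taker.json shape; the hostname is passed in
# explicitly here, where taker.py would use socket.gethostname().
CONFIG = json.loads('''
{"instance": "production",
 "hosts": {"orion": {"repos": "/home/cbuild/repos"}}}
''')

def get_config(key, hostname):
    # Host-specific override wins; otherwise fall back to the top level.
    try:
        return CONFIG['hosts'][hostname][key]
    except KeyError:
        return CONFIG[key]

print(get_config('repos', 'orion'))
print(get_config('instance', 'orion'))
```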
=== added file 'taker.py'
--- taker.py 1970-01-01 00:00:00 +0000
+++ taker.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,396 @@
1"""Script that monitors merge requests, branches unseen ones, and sets
2up a build.
3"""
4
5import os
6import os.path
7import re
8import subprocess
9import json
10import socket
11import datetime
12import logging
13import getopt
14import sys
15import pprint
16import pdb
17import StringIO
18import findlogs
19
20import launchpadlib.launchpad
21
22# Message sent when a merge request is snapshotted and queued for build
23QUEUED = {
24 'subject': "[cbuild] Queued %(snapshot)s for build",
25 'body': """
26cbuild has taken a snapshot of this branch at r%(revno)s and queued it for build.
27
28The diff against the ancestor r%(ancestor)s is available at:
29 http://cbuild.validation.linaro.org/snapshots/%(snapshot)s.diff
30
31and will be built on the following builders:
32 %(builders)s
33
34You can track the build queue at:
35 http://cbuild.validation.linaro.org/helpers/scheduler
36
37cbuild-snapshot: %(snapshot)s
38cbuild-ancestor: %(target)s+bzr%(ancestor)s
39cbuild-state: check
40"""
41}
42
43# Message sent when an individual build completes
44CHECK_OK = {
45 'subject': "[cbuild] Build OK for %(snapshot)s on %(build)s",
46 'body': """
47cbuild successfully built this on %(build)s.
48
49The build results are available at:
50 http://cbuild.validation.linaro.org/build/%(snapshot)s/logs/%(build)s
51
52%(diff)s
53
54The full testsuite results are at:
55 http://cbuild.validation.linaro.org/build/%(snapshot)s/logs/%(build)s/gcc-testsuite.txt
56
57cbuild-checked: %(build)s
58"""
59}
60
61# Message sent when an individual build fails
62CHECK_FAILED = {
63 'subject': "[cbuild] Build failed for %(snapshot)s on %(build)s",
64 'body': """
65cbuild had trouble building this on %(build)s.
66See the following failure logs:
67 %(logs)s
68
69under the build results at:
70 http://cbuild.validation.linaro.org/build/%(snapshot)s/logs/%(build)s
71
72%(diff)s
73
74cbuild-checked: %(build)s
75""",
76 'vote': 'Needs Fixing',
77 'nice': False
78}
79
80CONFIG = json.load(open('taker.json'))
81
82def run(cmd, cwd=None, const=False):
83 logging.debug('Executing "%s" (cwd=%s)' % (' '.join(cmd), cwd))
84
85 if const or not get_config2(False, 'dry-run'):
86 child = subprocess.Popen(cmd, cwd=cwd, stdout=subprocess.PIPE)
87 stdout, stderr = child.communicate()
88 retcode = child.wait()
89
90 if retcode:
91 raise Exception('Child process returned %s' % retcode)
92
93 lines = [x.rstrip() for x in StringIO.StringIO(stdout).readlines()]
94 logging.debug('Stdout: %r' % lines[:20])
95
96 return lines
97 else:
98 return []
99
100def tidy_branch(v):
101 """Tidy up a branch name from one of the staging servers to the
102 top level server.
103 """
104 return v.replace('lp://qastaging/', 'lp:')
105
106def get_config_raw(v):
107 at = CONFIG
108
109 for name in v.split('!'):
110 at = at[name]
111
112 return at
113
114def get_config(*args):
115 v = '!'.join(args)
116
117 # See if the value exists in the hosts override first
118 try:
119 return get_config_raw('hosts!%s!%s' % (socket.gethostname(), v))
120 except KeyError:
121 return get_config_raw(v)
122
123def get_config2(default, *args):
124 """Get a configuration value, falling back to the default if not set."""
125 try:
126 return get_config(*args)
127 except KeyError:
128 return default
129
130class Proposal:
131 def __init__(self, proposal):
132 self.proposal = proposal
133 self.parse(proposal)
134
135 def parse(self, proposal):
136 """Parse the proposal by scanning all comments for
137 'cbuild-foo: bar' tokens. Pull these into member variables.
138 """
139 self.state = None
140 self.ancestor = None
141 self.checked = []
142
143 for comment in proposal.all_comments:
144 lines = [x.strip() for x in comment.message_body.split('\n')]
145
146 for line in lines:
147 match = re.match(r'cbuild-(\w+):\s+(\S+)', line)
148
149 if match:
150 name, value = match.groups()
151
152 if name in ['checked']:
153 getattr(self, name).append(value)
154 else:
155 setattr(self, name, value)
156
157def add_comment(proposal, template, **kwargs):
158 """Add a comment using the given template for the subject and
159 body text.
160 """
161 vote = template.get('vote', None)
162 # True if this is a 'happy' result. Used in debugging to
163 # prevent irreversible results from going up to Launchpad.
164 nice = template.get('nice', True)
165
166 subject = (template['subject'] % kwargs).rstrip()
167 body = (template['body'] % kwargs).rstrip()
168
169 just_print = get_config2(False, 'dry-run')
170
171 if not nice and get_config2(True, 'cautious'):
172 # Running in a debug/test mode. Don't push irreversible
173 # results.
174 just_print = True
175
176 if get_config2(False, 'prompt'):
177 logging.info("Want to add this comment: %s\n%s\n\nVote: %s\n" % (subject, body, vote))
178
179 if just_print:
180 got = raw_input('Add it anyway? > ')
181 else:
182 got = raw_input('OK to add? > ')
183
184 just_print = got.strip() not in ['Y', 'y']
185
186 if just_print:
187 logging.info("Would have added this comment: %s\n%s\n\nVote: %s\n" % (subject, body, vote))
188 else:
189 if vote:
190 proposal.proposal.createComment(subject=subject, content=body, vote=vote)
191 else:
192 proposal.proposal.createComment(subject=subject, content=body)
193
194def queue(proposal):
195 """Run the queue step by snapshotting and queuing the build."""
196 lp = proposal.proposal
197
198 source = lp.source_branch
199 owner = source.owner.name
200 branch = tidy_branch(source.bzr_identity)
201 revno = source.revision_count
202
203 # We might have hit this before Launchpad has scanned it
204 if not revno:
205 # Work around Launchpad being broken on 2011-09-07 by assuming
206 # the merge was against the latest revno
207 # Work around bzr revno bug with ghost ancestry
208 # (https://launchpad.net/bugs/1161018)
209 revno = run(['bzr', 'revno', 'nosmart+%s' % branch], const=True)
210 revno = int(revno[0])
211 assert revno
212
213 target = tidy_branch(lp.target_branch.bzr_identity)
214
215 # Find where to check it out
216 prefix = get_config('branches', target, 'prefix')
217 reporoot = get_config('branches', target, 'reporoot')
218
219 # Give it an archive name
220 suffix = '~%s~%s' % (owner, os.path.basename(branch))
221
222 snapshot = '%s+bzr%s%s' % (prefix, revno, suffix)
223 path = '%s/%s/%s' % (get_config('repos'), reporoot, snapshot)
224
225 # Remove the old directory
226 run(['rm', '-rf', path])
227 # Branch it
228 run(['bzr', 'branch', '--no-tree', branch, path], cwd=os.path.dirname(path))
229 # Find the common ancestor
230 lines = run(['bzr', 'revision-info', '-d', path, '-r', 'ancestor:%s' % target])
231
232 ancestor = lines[0].split()[0] if lines else 'unknown'
233
234 # Make the snapshot
235 lines = run(['%s/pull_branch.sh' % get_config('tools'), path, prefix, '%s' % revno, suffix, target])
236
237 # Scan the tool results to see which queues the build was
238 # pushed into
239 builders = []
240
241 for line in lines:
242 match = re.match(r'Spawned into (\S+)', line)
243
244 if match:
245 builders.append(match.group(1))
246
247 add_comment(proposal, QUEUED, snapshot=snapshot, revno=revno, builders=' '.join(builders), ancestor=ancestor, target=target)
248
249def make_diff(proposal, all_files, check):
250 """Check if there are any DejaGNU summary files and diff them."""
251 sums = [x for x in check if '.sum' in x[-1]]
252
253 if not proposal.ancestor:
254 lines = ['The test suite was not checked as the branch point was not recorded.']
255 elif not sums:
256 lines = ['The test suite was not checked as this build has no .sum style test results']
257 else:
258 # Have some test results. See if there's a corresponding
259 # version in the branch point build.
260 first = '/'.join(sums[0])
261 ref = findlogs.find(all_files, first, proposal.ancestor)
262
263 if not ref:
264 lines = ['The test suite was not checked as the branch point %s has nothing to compare against.' % proposal.ancestor]
265 else:
266 revision, build = ref
267
268 # Generate a unified diff between the two sets of results
269 diff = run(['%s/difftests.sh' % get_config('tools'),
270 '%s/%s/logs/%s' % (get_config('build'), revision, build),
271 '%s/%s' % (get_config('build'), os.path.dirname(first))],
272 const=True)
273
274 if diff:
275 lines = ['The test suite results changed compared to the branch point %s:' % proposal.ancestor]
276 lines.extend([' %s' % x for x in diff])
277 else:
278 lines = ['The test suite results were unchanged compared to the branch point %s.' % proposal.ancestor]
279
280 if len(lines) > 40:
281 total = len(lines)
282 lines = lines[:40]
283 lines.append(' ...and %d more' % (total - len(lines)))
284
285 return '\n'.join(lines)
286
287def check_build(proposal, all_files, build, matches):
288 """Check a single build to see how the results went."""
289
290 if build in proposal.checked:
291 logging.debug('Already posted a comment on %s' % build)
292 else:
293 check = [x for x in matches if len(x) >= 4 and x[2] == build]
294 failures = [x for x in check if 'failed' in x[-1]]
295 finished = 'finished.txt' in [x[-1] for x in check]
296
297 logging.info('Checking %s (finished=%s, failures=%r)' % (build, finished, failures))
298
299 if finished:
300 logs = ' '.join(x[-1] for x in failures)
301
302 diff = make_diff(proposal, all_files, check)
303
304 if failures:
305 add_comment(proposal, CHECK_FAILED, build=build, snapshot=proposal.snapshot, logs=logs, diff=diff)
306 else:
307 add_comment(proposal, CHECK_OK, build=build, snapshot=proposal.snapshot, logs=logs, diff=diff)
308
309def check(proposal, all_files):
310 """Check an already queued build for build results."""
311 # Find all log files for this snapshot
312 matches = [x for x in all_files if x[0] == proposal.snapshot]
313 matches = [x for x in matches if len(x) >= 3 and x[1] == 'logs']
314
315 if not matches:
316 logging.debug('No log files yet')
317 return
318
319 # Build a list of builds
320 builds = [x[2] for x in matches if len(x) == 3]
321
322 for build in builds:
323 check_build(proposal, all_files, build, matches)
324
325def run_proposal(proposal, all_files, allowed):
326 when = proposal.date_review_requested
327
328 # Only run recent proposals that have had review requested
329 if when:
330 now = datetime.datetime.now(when.tzinfo)
331 elapsed = (now - when).days
332
333 logging.info('Checking proposal %s which is %d days old' % (proposal, elapsed))
334
335 if elapsed <= get_config2(14, 'age-limit'):
336 p = Proposal(proposal)
337
338 state = p.state if p.state else 'queue'
339
340 if state in allowed:
341 # Could use a dodgy getattr()...
342 if state == 'queue':
343 queue(p)
344 elif state == 'check':
345 check(p, all_files)
346 else:
347 assert False, 'Proposal %s is in the invalid state "%s"' % (proposal, p.state)
348 else:
349 logging.info("Skipping %s" % state)
350
351def main():
352 opts, args = getopt.getopt(sys.argv[1:], 'f:vs:')
353 verbose = int(get_config2(0, 'verbose'))
354 steps = get_config2('queue', 'steps').split(',')
355
356 for opt, arg in opts:
357 if opt == '-f':
358 name, value = arg.split('=')
359 option = json.loads('{ "%s": %s }' % (name, value))
360 CONFIG.update(option)
361 elif opt == '-v':
362 verbose += 1
363 elif opt == '-s':
364 steps = arg.split(',')
365
366 if verbose >= 2:
367 logging.basicConfig(level=logging.DEBUG)
368 elif verbose >= 1:
369 logging.basicConfig(level=logging.INFO)
370
371 # Parse all-files into a list of already split paths
372 with open(get_config('all-files')) as f:
373 all_files = f.readlines()
374
375 all_files = [x.rstrip().split('/') for x in all_files]
376 all_files = [x[1:] for x in all_files if len(x) >= 2]
377
378 # Login to Launchpad
379 launchpad = launchpadlib.launchpad.Launchpad.login_with('taker', get_config('instance'), credentials_file=os.path.expanduser('~/.launchpadlib/credentials'))
380
381 # Scan all projects...
382 for name in get_config('projects'):
383 logging.info('Checking project %s' % name)
384 project = launchpad.projects[name]
385
386 # Scan all proposals in the project...
387 proposals = project.getMergeProposals()
388
389 for p in proposals:
390 try:
391 run_proposal(p, all_files, steps)
392 except Exception, ex:
393 logging.error('Error while processing %s: %s' % (p, ex))
394
395if __name__ == '__main__':
396 main()
0397
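Review note: the comment-token protocol that Proposal.parse relies on is worth seeing in isolation. Every merge-proposal comment is scanned for 'cbuild-foo: bar' lines; 'checked' values accumulate, any other token overwrites the previous value. A self-contained sketch with an invented comment body:

```python
import re

# An invented comment body carrying the cbuild tokens taker.py emits
# and later parses back out of the merge proposal.
comment = """
cbuild has queued this branch for build.

cbuild-snapshot: gcc-linaro-4.7+bzr12~user~branch
cbuild-state: check
cbuild-checked: a9500-1
"""

tokens = {}
checked = []

for line in comment.split('\n'):
    match = re.match(r'cbuild-(\w+):\s+(\S+)', line.strip())
    if match:
        name, value = match.groups()
        if name == 'checked':
            checked.append(value)  # one entry per completed builder
        else:
            tokens[name] = value   # later comments overwrite earlier ones

print(tokens['state'])
print(checked)
```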
=== added file 'track-tree.sh'
--- track-tree.sh 1970-01-01 00:00:00 +0000
+++ track-tree.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,16 @@
1#!/bin/sh
2
3set -e
4
5dir=`mktemp -d`
6trap "rm -rf $dir" EXIT
7
8find $@ -type f | sort > $dir/all-files.txt
9cp -a changes/all-files.txt $dir/previous.txt
10
11cmp -s $dir/previous.txt $dir/all-files.txt && exit
12
13stamp=`date +%Y-%m-%d-%H%M%S`
14
15(cd $dir && diff -U 0 previous.txt all-files.txt) | tee $dir/$stamp.diff
16mv $dir/$stamp.diff $dir/all-files.txt changes
017
=== added file 'up_branch.sh'
--- up_branch.sh 1970-01-01 00:00:00 +0000
+++ up_branch.sh 2014-07-01 15:58:40 +0000
@@ -0,0 +1,53 @@
1#!/bin/bash
2# Update a SVN branch and snapshot the latest revision
3# Can also snapshot past revisions and add a per-user
4# suffix
5
6set -e
7
8. ~/.config/cbuild/toolsrc
9
10# Directory this script runs from
11here=`dirname $0`
12
13# Suffix for manual pulls
14SUFFIX=${SUFFIX:-}
15
16# Pull a branch and tarball it up
17branch=$1
18head=$2
19
20svn up -q $branch
21# Recent SVN adds a last changed revision field. Delete spaces to
22# strip the space from the revno and, coincidentally, make the field
23# easier to match.
24latest=$(svn info $branch | tr -d ' ' | awk -F: '/LastChangedRev/ { print $2 }')
25revno=${REVNO:-$latest}
26
27if [ "x${head}" != "x" ]; then
28 name=${head}svn$revno$SUFFIX
29 tar=/tmp/$name.tar
30 xdelta=$snapshots/$name.tar.xdelta3.xz
31
32 if [ ! -f $xdelta ]; then
33 echo Exporting and stamping $revno
34 tmp=`mktemp -d`
35 rm -rf $tmp/$name
36 svn export $branch $tmp/$name
37 REVNO=$revno $here/stamp_branch.sh $branch $tmp/$name
38
39 # Create the tarball
40 echo Creating $tar
41 (cd $tmp && find "$name" -print0 | sort -z) | tar caf $tar -C $tmp --no-recursion --null -T -
42 rm -rf $tmp
43 # Create the delta
44 closest=`$here/closest.py $tar $snapshots/base/*.tar`
45 echo Making the delta relative to $closest
46 xdelta3 -fe -s $closest $tar $tar.xdelta3
47 rm -f $tar
48 xz -f $tar.xdelta3
49 mv $tar.xdelta3.xz $snapshots
50 echo New snapshot $head $revno created
51 $here/spawn.sh $tar
52 fi
53fi
054
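Review note: the "Last Changed Rev" extraction at the top of up_branch.sh can be tested without a working copy by substituting canned `svn info` output (the revision numbers below are invented):

```shell
#!/bin/sh
# Stand-in for `svn info $branch`; the revision numbers are invented.
fake_svn_info() {
  cat <<'EOF'
Path: gcc-4.8
Revision: 198130
Last Changed Rev: 198123
EOF
}

# Same pipeline as up_branch.sh: deleting spaces turns the field name
# into LastChangedRev and strips the blank before the revno, so a plain
# ':'-separated awk split recovers the number.
latest=$(fake_svn_info | tr -d ' ' | awk -F: '/LastChangedRev/ { print $2 }')
echo "$latest"
```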
=== added file 'utilisation.py'
--- utilisation.py 1970-01-01 00:00:00 +0000
+++ utilisation.py 2014-07-01 15:58:40 +0000
@@ -0,0 +1,93 @@
1"""Plots the backlog and utilisation level of a class of build
2machines.
3"""
4
5import fileinput
6
7import matplotlib
8matplotlib.use('agg')
9
10from pylab import *
11
12def process(lines, name, matcher):
13 start = float(lines[0][0])
14 backlog = []
15 running = []
16 states = []
17 hosts = {}
18
19 for line in lines:
20 time, type, host, arg = line[:4]
21
22 if not matcher(host):
23 continue
24
25 offset = float(time) - start
26 days = offset / (60*60*24)
27
28 if type == 'backlog':
29 if hosts.get(host) == 'reserved':
30 # Reserved machines report as a zero backlog
31 pass
32 else:
33 backlog.append((days, int(arg)))
34 elif type == 'state':
35 if arg != 'updating':
36 states.append((host, arg, days))
37 hosts[host] = arg
38
39 # Scan through the states and remove the glitches
40 merged = []
41
42 for host in hosts:
43 just = [x for x in states if x[0] == host]
44
45 for i in range(len(just) - 1):
46 this = just[i]
47 next = just[i+1]
48 elapsed = next[-1] - this[-1]
49
50 if this[1] == 'running' and next[1] == 'idle' and elapsed < 5.0/60/60/24:
51 # Drop this record
52 pass
53 else:
54 merged.append(this)
55
56 merged.sort(key=lambda x: x[-1])
57
58 # Scan through and figure out the number of hosts running at any one time
59 states = {}
60
61 for row in merged:
62 states[row[0]] = row[1]
63
64 running.append((row[-1], sum(1 if x == 'running' else 0 for x in states.values())))
65
66 backlog = array(backlog)
67 running = array(running)
68
69 clf()
70 plot(backlog[:,0], backlog[:,1])
71 scatter(backlog[:,0], backlog[:,1], label='backlog')
72
73 plot(running[:,0], running[:,1], c='g', label='running')
74
75 ylim(0, ylim()[1]+1)
76 xlim(0, xlim()[1])
77 xlabel('Time (days)')
78 ylabel('Utilisation (#boards) & backlog (# jobs)')
79 title('%s utilisation' % name)
80 legend()
81 grid()
82 savefig('%s.png' % name)
83 show()
84
85def main():
86 lines = list(fileinput.input())
87 lines = [x.rstrip().split() for x in lines]
88
89 process(lines, 'ursas', lambda x: x.startswith('ursa'))
90# process(lines, 'cloud', lambda x: x.startswith('oort'))
91
92if __name__ == '__main__':
93 main()
