Merge lp:~christophe-lyon/cbuild/tools-fix-svn-name into lp:cbuild
Proposed by: | Christophe Lyon |
Status: | Needs review |
---|---|
Proposed branch: | lp:~christophe-lyon/cbuild/tools-fix-svn-name |
Merge into: | lp:cbuild |
Diff against target: |
3321 lines (+3126/-0) 39 files modified
MAINTAIN.txt (+92/-0) aarch64-merger.sh (+172/-0) addtestdiffs.py (+133/-0) cache-builds.py (+131/-0) chain.sh (+75/-0) check-leaks.sh (+14/-0) closest.py (+63/-0) cron-often.sh (+56/-0) ctimes.py (+100/-0) difftests.sh (+45/-0) ec2-pull.sh (+28/-0) ec2-spawn.sh (+36/-0) ec2new.sh (+23/-0) ec2slave-init-cbuild.sh (+36/-0) ec2slave-init.sh (+105/-0) ec2slave.sh (+65/-0) ec2watch.sh (+18/-0) enrich-gcc-linaro-repo (+45/-0) export-bzr.sh (+85/-0) export-git.sh (+75/-0) export-hg.sh (+61/-0) findancestor.py (+204/-0) findlogs.py (+113/-0) gcc-release.sh (+168/-0) getbody.py (+26/-0) lab-pull.sh (+22/-0) latestami.sh (+17/-0) launcher.sh (+146/-0) libec2.sh (+78/-0) pull_branch.sh (+21/-0) s3-diet.py (+91/-0) spawn.sh (+36/-0) stamp_branch.sh (+70/-0) tabulate.sh (+36/-0) taker.json (+82/-0) taker.py (+396/-0) track-tree.sh (+16/-0) up_branch.sh (+53/-0) utilisation.py (+93/-0) |
To merge this branch: | bzr merge lp:~christophe-lyon/cbuild/tools-fix-svn-name |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Linaro Toolchain Builder | Pending | ||
Review via email: mp+216447@code.launchpad.net |
Commit message
Description of the change
Attempt to avoid generation of ~/var/snapshots
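Per unmerged revision 107 ("Do not generate tarball and delta if no name parameter is provided"), the change gates snapshot generation on a name argument. A minimal sketch of that guard; the function name, messages, and paths are illustrative, not the actual cbuild code:

```shell
# Hypothetical sketch of the guard this proposal adds: when no name
# parameter is supplied, skip tarball and delta generation entirely so
# nothing lands under ~/var/snapshots.
make_snapshot() {
    local name="$1"
    if [ -z "$name" ]; then
        echo "no name given: skipping tarball and delta generation"
        return 0
    fi
    echo "would create ${name}.tar.xz under ~/var/snapshots"
}
```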
Unmerged revisions
- 108. By Christophe Lyon
-
Fix previous commit.
- 107. By Christophe Lyon
-
Do not generate tarball and delta if no name parameter is provided.
- 106. By Renato Golin
-
Adding Clang/Extra/RT to LLVM update
- 105. By Yvan Roux
-
Add gcc-linaro-4.8 branch in cron-often polling task.
- 104. By Linaro Toolchain Builder
-
Add support for toolchain64 host and GCC Linaro 4.8 branch
- 103. By Linaro Toolchain Builder
-
Update find to take gcc-4.8 branches into account.
- 102. By Christophe Lyon
-
Fix bug #1156536
- 101. By Michael Hope
-
Fix taker stuff
- 100. By Michael Hope
-
Make tabulate work on all GCC results.
- 99. By Michael Hope
-
Update to use gcc-google-4.7 and not gcc-google-4.6
Preview Diff
1 | === added file 'MAINTAIN.txt' |
2 | --- MAINTAIN.txt 1970-01-01 00:00:00 +0000 |
3 | +++ MAINTAIN.txt 2014-07-01 15:58:40 +0000 |
4 | @@ -0,0 +1,92 @@ |
5 | +cbuild maintenance tasks |
6 | +======================== |
7 | + |
8 | +Users |
9 | +----- |
10 | + * cbuild - pulls keys from http://launchpad.net/~cbuild |
11 | + |
12 | +Things to watch |
13 | +--------------- |
14 | + |
15 | +Disks: |
16 | + |
17 | + * / for free space |
18 | + * /space for free space |
19 | + |
20 | +`~/var/snapshots/gcc-4.8?svn*`: |
21 | + |
22 | + * Youngest is no more than 3 days old |
23 | + * Check cbuild-tools/launcher.sh on fault |
24 | + |
25 | +`~/var/tcwg-web/*.pickle`: |
26 | + |
27 | + * No more than four hours old |
28 | + * Check lib/tcwg-web/update.sh on fault |
29 | + |
30 | +Common tasks |
31 | +------------ |
32 | + |
33 | +Where have all my directories gone? |
34 | + |
35 | + * ~/lib/cbuild |
36 | + * ~/var/snapshots |
37 | + * ~/public_html/build |
38 | + |
39 | +How do I spawn a release build via the shell? |
40 | + |
41 | + * Use gcc-release-process.sh! Otherwise: |
42 | + * scp gcc-linaro-4.7-2012.12.tar.bz2 cbuild@toolchain64.lab:~/var/snapshots |
43 | + * ssh cbuild@toolchain64.lab |
44 | + * ~/lib/cbuild-tools/spawn.sh ~/var/snapshots/gcc-linaro-4.7-2012.12.tar.bz2 |
45 | + |
46 | +A fault with frequent tasks like merge requests, tip builds or data |
47 | +being out of date? |
48 | + |
49 | + * Check ~/lib/cbuild-tools/cron-often.sh |
50 | + |
51 | +A fault with daily tasks like upstream builds? |
52 | + |
53 | + * Check ~/lib/cbuild-tools/launcher.sh |
54 | + |
55 | +A fault with one of the helpers? |
56 | + |
57 | + * sudo service tcwg-web stop |
58 | + * cd lib/tcwg-web |
59 | + Uncomment 'development = True' in tcwg-web.ini |
60 | + * python index.py |
61 | + * Run the request, see the backtrace |
62 | + |
63 | +Version out of date on one of the upstream builds? |
64 | + |
65 | + * Edit launcher.sh |
66 | + |
67 | +Want to build/track a new upstream? |
68 | + |
69 | + * git clone upstream-url ~/repos/decent-name |
70 | + * Add a new dow line to launcher.sh |
71 | + |
72 | +Want to track a new Linaro series? |
73 | + |
74 | + * cd ~/repos/gcc-linaro |
75 | + * bzr branch lp:gcc-linaro/4.8 |
76 | + * Add a new line to cron-often.sh |
77 | + * Consider pushing a new gcc-linaro-4.8+bzr12345.{tar,tar.xz} to ~/var/snapshots/base |
78 | + |
79 | +Want to propagate a cbuild update to the slaves? |
80 | + |
81 | + * cd ~/lib/cbuild |
82 | + * bzr pull |
83 | + |
84 | +Want to add a new build queue? |
85 | + |
86 | + * cd ~/var/scheduler/queue |
87 | + * mkdir queue-name |
88 | + * echo host1 host2 host3 > queue-name/hosts.txt |
89 | + * echo config-fragments-if-any > queue-name/template.txt |
90 | + * cd ../spawn/default |
91 | + * ln -s ../../queue/queue-name |
92 | + |
93 | +Want to delete a job from the queue? |
94 | + |
95 | + * cd ~/var/scheduler/queue |
96 | + * `rm */job-name.job` |
97 | |
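The "add a new build queue" steps in MAINTAIN.txt above can be wrapped in one small helper. This is a sketch of those documented steps, not code from the branch; the function name and argument order are invented, and the `queue/` plus `spawn/default/` layout is taken from the notes:

```shell
# Create a build queue under a scheduler root, following the MAINTAIN.txt
# recipe: a queue directory with hosts.txt and template.txt, plus a
# symlink from spawn/default back into queue/.
add_queue() {
    local scheduler="$1" name="$2"
    shift 2
    mkdir -p "$scheduler/queue/$name"
    # Remaining arguments are the hosts allowed to take this queue
    echo "$@" > "$scheduler/queue/$name/hosts.txt"
    # Empty template means no extra config fragments
    : > "$scheduler/queue/$name/template.txt"
    mkdir -p "$scheduler/spawn/default"
    ln -sfn "../../queue/$name" "$scheduler/spawn/default/$name"
}
```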
98 | === added file 'aarch64-merger.sh' |
99 | --- aarch64-merger.sh 1970-01-01 00:00:00 +0000 |
100 | +++ aarch64-merger.sh 2014-07-01 15:58:40 +0000 |
101 | @@ -0,0 +1,172 @@ |
102 | +#!/bin/bash |
103 | +# |
104 | +# Automatically merge and create a merge request for any new aarch64 |
105 | +# commits. |
106 | +# |
107 | +# |
108 | + |
109 | +set -e |
110 | + |
111 | +# Path to Linaro GCC 4.7 |
112 | +tip=4.7 |
113 | +# Path to the ARM bzr branch |
114 | +from=arm-aarch64-4.7 |
115 | +# Fill this in with the last merged bzr revno |
116 | +# Example: bzr log 4.7 shows r193293..r193328. bzr log arm-* --show-ids shows that |
117 | +# r193328 is bzr116407. |
118 | +current=116423 |
119 | + |
120 | +get_revision() { |
121 | + echo $(bzr log -c $1 --show-ids --short $2 | awk -F: '/revision-id:/ {print $NF}') |
122 | +} |
123 | + |
124 | +is_fsf_merge() { |
125 | + (bzr log -c $1 -p $2 | grep -qF "+++ gcc/DATESTAMP") && echo y |
126 | +} |
127 | + |
128 | +all_revnos() { |
129 | + bzr log -r $stop.. --log-format=line $1 | awk -F: '{print $1;}' |
130 | +} |
131 | + |
132 | +extract_changelog() { |
133 | + # Pull the ChangeLog out |
134 | + inside= |
135 | + |
136 | + while read i; do |
137 | + if [[ "$i" =~ ^=== ]]; then inside=; fi |
138 | + |
139 | + if [ -n "$inside" ] && [[ "$i" =~ ^\+(.+) ]]; then |
140 | + echo "${BASH_REMATCH[1]}" |
141 | + fi |
142 | + |
143 | + if [[ "$i" =~ ^\+\+\+\ .+/ChangeLog\.aarch64 ]]; then inside=y; fi |
144 | + |
145 | + done < /tmp/bzr$revno.diff > /tmp/bzr$revno.changes |
146 | +} |
147 | + |
148 | +search_back() { |
149 | + # Search back to the first non-merge |
150 | + for i in $all; do |
151 | + if [ -z "$(is_fsf_merge $i $from)" ]; then |
152 | + last=$i |
153 | + break |
154 | + fi |
155 | + done |
156 | +} |
157 | + |
158 | +search_forward() { |
159 | + # Search forward to the first non-merge |
160 | + for i in $(seq $((current+1)) 1 $last); do |
161 | + if [ -z "$(is_fsf_merge $i $from)" ]; then |
162 | + first=$i |
163 | + break |
164 | + fi |
165 | + done |
166 | +} |
167 | + |
168 | +dir=`mktemp -td merger.XXXXXXXXXX` |
169 | +trap "rm -rf $dir" EXIT |
170 | + |
171 | +# End point for any scans |
172 | +stop=116380 |
173 | + |
174 | +all=$(all_revnos $from) |
175 | + |
176 | +search_back |
177 | +search_forward |
178 | + |
179 | +sfirst=$(get_revision $first $from) |
180 | +slast=$(get_revision $last $from) |
181 | + |
182 | +name="merge-from-aarch64-4.7-r$sfirst-r$slast" |
183 | + |
184 | +# See if the merge request already exists |
185 | +if false && wget -q -O- https://code.launchpad.net/gcc-linaro | grep -q -F $name; then |
186 | + echo "Branch $name already exists, nothing to do." |
187 | + exit 0 |
188 | +fi |
189 | + |
190 | +rm -rf $name |
191 | +bzr branch -q --hardlink $tip $name |
192 | + |
193 | +realname="Michael Hope" |
194 | +email="michael.hope@linaro.org" |
195 | +login=$(bzr lp-login) |
196 | + |
197 | +short="Merge from FSF arm/aarch64-4.7-branch r$sfirst..r$slast." |
198 | + |
199 | +echo -e "$(date +%Y-%m-%d) $realname <$email>\n" > $dir/log |
200 | +echo -e "\t$short\n" >> $dir/log |
201 | + |
202 | +# Go through and merge each revision |
203 | +for revno in $(seq $first 1 $last); do |
204 | + if [ -n "$(is_fsf_merge $revno $from)" ]; then |
205 | + echo "Skipping the FSF merge at $revno" |
206 | + continue |
207 | + fi |
208 | + |
209 | + svn=$(get_revision $revno $from) |
210 | + stem=$dir/bzr$revno |
211 | + |
212 | + echo "Merging $svn (bzr$revno)" |
213 | + # bzr diff returns 1 on changed... |
214 | + (cd $from && bzr diff -c $revno || true) > $stem.diff |
215 | + |
216 | + if ! (cd $name && bzr patch $stem.diff); then |
217 | + read -p "Conflicts found. Fix in $name then press enter > " |
218 | + fi |
219 | + |
220 | + # Nuke any patch remnants |
221 | + find $name -name "*.orig" -exec rm {} \; |
222 | + find $name -name "*.rej" -exec rm {} \; |
223 | + |
224 | + # Add any missing files. bzr patch should have done this already |
225 | + # but doesn't on conflict. |
226 | + (cd $name && bzr add) |
227 | + |
228 | + # Pull out the commit message |
229 | + (cd $from && bzr log -c $revno) > $stem.log |
230 | + |
231 | + inside= |
232 | + while read i; do |
233 | + if [ -n "$inside" ]; then |
234 | + trimmed=$(echo "$i") |
235 | + echo "$trimmed" |
236 | + fi |
237 | + |
238 | + if [[ "$i" =~ ^message: ]]; then inside=y; fi |
239 | + done < $stem.log > $stem.message |
240 | + |
241 | + # Commit each as a revision to make future tracking easier. The |
242 | + # merge will roll this up into one. |
243 | + echo -e "Backport $from r$svn.\n" > $stem.commit |
244 | + cat $stem.message >> $stem.commit |
245 | + |
246 | + (cd $name && bzr commit -F $stem.commit) |
247 | + |
248 | + echo -e "\tBackport $from r$svn:" >> $dir/log |
249 | + while read i; do |
250 | + echo -e "\t$i" |
251 | + done < $stem.message >> $dir/log |
252 | + |
253 | + echo >> $dir/log |
254 | +done |
255 | + |
256 | +# Update the ChangeLog and commit |
257 | +cat $dir/log $name/ChangeLog.linaro > $dir/new |
258 | +mv $dir/new $name/ChangeLog.linaro |
259 | + |
260 | +cd $name |
261 | +bzr commit -m"$short" |
262 | +bzr push "lp:~$login/gcc-linaro/$name" |
263 | + |
264 | +# propose-merge always runs an editor for the message. Make a fake |
265 | +# editor that sets the text. |
266 | +cat > $dir/editor <<EOF |
267 | +#!/bin/bash |
268 | +cp $dir/log \$1 |
269 | +EOF |
270 | + |
271 | +chmod +x $dir/editor |
272 | + |
273 | +VISUAL=$dir/editor bzr lp-propose-merge -m"$short" |
274 | |
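The `get_revision` helper in aarch64-merger.sh above leans on one awk idiom: Subversion-backed bzr revision-ids themselves contain colons, so splitting on `:` and printing the last field yields just the SVN revision number. A standalone sketch, with an invented sample log line:

```shell
# Extract the trailing SVN revision number from a "revision-id:" line of
# bzr log --show-ids output. Splitting on ':' and taking $NF works even
# though the revision-id contains ':' itself.
revision_field() {
    awk -F: '/revision-id:/ {print $NF}'
}
```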
275 | === added file 'addtestdiffs.py' |
276 | --- addtestdiffs.py 1970-01-01 00:00:00 +0000 |
277 | +++ addtestdiffs.py 2014-07-01 15:58:40 +0000 |
278 | @@ -0,0 +1,133 @@ |
279 | +#!/usr/bin/python |
280 | +"""Record the difference between the test results of this build and |
281 | +its ancestor or predecessor. |
282 | +""" |
283 | + |
284 | +import sys |
285 | +import re |
286 | +import os |
287 | +import pprint |
288 | +import subprocess |
289 | +import logging |
290 | +import time |
291 | + |
292 | +import findancestor |
293 | +import findlogs |
294 | + |
295 | +TEMPLATE = """Difference in testsuite results between: |
296 | + %(current)s build %(buildid)s |
297 | +and the one before it: |
298 | + %(other)s build %(otherbuildid)s |
299 | + |
300 | +------ |
301 | +%(diff)s""" |
302 | + |
303 | +def make_testsuite_diff(left, right): |
304 | + # PENDING: This is bad |
305 | + return subprocess.check_output(['./difftests.sh', left, right]) |
306 | + |
307 | +def make_benchmarks_diff(left, right, suffix): |
308 | + return subprocess.check_output(['../linaro-toolchain-benchmarks/scripts/diffbench.py', |
309 | + '%s/benchmarks-%s.txt' % (left, suffix), |
310 | + '%s/benchmarks-%s.txt' % (right, suffix) |
311 | + ]) |
312 | + |
313 | +def run(allfiles, snapshots, builds, check, final, make_diff): |
314 | + now = time.time() |
315 | + |
316 | + # Make the search a bit easier by only keeping the .txt files |
317 | + filtered = [x for x in allfiles if '.txt' in x[-1] or check in x[-1]] |
318 | + |
319 | + # And cache by directory |
320 | + directories = {} |
321 | + |
322 | + for f in filtered: |
323 | + directory = '/'.join(f[:-1]) |
324 | + directories.setdefault(directory, []).append(f) |
325 | + |
326 | + prefiltered = findlogs.prefilter(allfiles, check) |
327 | + |
328 | + # Find all builds that have finished |
329 | + for build in [x for x in filtered if x[-1] == 'finished.txt']: |
330 | + match = '/'.join(build[:-1]) |
331 | + |
332 | + # Skip if the build is too old. Either we have a result or |
333 | + # it's never coming in |
334 | + try: |
335 | + age = (now - os.path.getmtime('%s/%s/finished.txt' % (builds, match))) / (60*60*24) |
336 | + |
337 | + if age >= 30: |
338 | + continue |
339 | + except OSError: |
340 | + continue |
341 | + |
342 | + # Find all of the files in the same directory |
343 | + siblings = directories[match] |
344 | + |
345 | + all = ' '.join(x[-1] for x in siblings) |
346 | + |
347 | + # Are there any .sum style results? |
348 | + if check not in all: |
349 | + continue |
350 | + |
351 | + # See if the diff already exists |
352 | + if ' %s' % final in all: |
353 | + continue |
354 | + |
355 | + # Find the ancestor or predecessor |
356 | + current = build[1] |
357 | + other = findancestor.find(allfiles, current, snapshots) |
358 | + |
359 | + if not other: |
360 | + logging.info('No known ancestor to %s' % current) |
361 | + continue |
362 | + |
363 | + # Find the corresponding build |
364 | + logs = ['/'.join(x) for x in siblings if check in x[-1]] |
365 | + assert logs |
366 | + log = logs[-1] |
367 | + |
368 | + try: |
369 | + otherlogs = findlogs.find(prefiltered, log, other) |
370 | + except Exception, ex: |
371 | + logging.info('Error %s while finding logs on %s vs %s' % (ex, log, other)) |
372 | + otherlogs = None |
373 | + |
374 | + if not otherlogs: |
375 | + logging.debug("Can't find %s of %s" % (log, other)) |
376 | + continue |
377 | + |
378 | + other, otherbuildid = otherlogs |
379 | + buildid = build[-2] |
380 | + diff = None |
381 | + |
382 | + try: |
383 | + logging.debug('Generating a diff between predecessor %s/%s and %s' % (other, otherbuildid, os.path.dirname(log))) |
384 | + |
385 | + left = '%s/%s/logs/%s' % (builds, other, otherbuildid) |
386 | + right = os.path.join(builds, os.path.dirname(log)) |
387 | + diff = make_diff(left, right) |
388 | + |
389 | + except Exception, ex: |
390 | + logging.error(ex) |
391 | + |
392 | + if diff == None: |
393 | + continue |
394 | + |
395 | + with open('%s/%s/%s' % (builds, match, final), 'w') as f: |
396 | + print >> f, TEMPLATE % locals() |
397 | + |
398 | +def main(): |
399 | +# logging.basicConfig(level=logging.DEBUG) |
400 | + allfiles, snapshots, builds = sys.argv[1:] |
401 | + |
402 | + with open(allfiles) as f: |
403 | + allf = [x.strip().split(os.sep) for x in f.readlines()] |
404 | + |
405 | + run(allf, snapshots, builds, '.sum', 'testsuite-diff.txt', make_testsuite_diff) |
406 | + |
407 | + for suffix in 'spec2000 eembc eembc_office'.split(): |
408 | + run(allf, snapshots, builds, 'benchmarks-%s.txt' % suffix, 'benchmarks-%s-diff.txt' % suffix, lambda x, y: make_benchmarks_diff(x, y, suffix)) |
409 | + |
410 | +if __name__ == '__main__': |
411 | + main() |
412 | |
413 | === added file 'cache-builds.py' |
414 | --- cache-builds.py 1970-01-01 00:00:00 +0000 |
415 | +++ cache-builds.py 2014-07-01 15:58:40 +0000 |
416 | @@ -0,0 +1,131 @@ |
417 | +#!/usr/bin/python |
418 | + |
419 | +"""Scan a directory tree and cache the results of each version and |
420 | +build. |
421 | +""" |
422 | + |
423 | +import sys |
424 | +import os |
425 | +import pprint |
426 | +import cPickle |
427 | +import json |
428 | + |
429 | +def list(path): |
430 | + try: |
431 | + return os.walk(path).next() |
432 | + except StopIteration: |
433 | + return None, [], [] |
434 | + |
435 | +def get_date(root, name): |
436 | + return os.path.getmtime('%s/%s' % (root, name)) |
437 | + |
438 | +def intern_all(values): |
439 | + return tuple(intern(x) for x in values) |
440 | + |
441 | +def make_path(top, path): |
442 | + root = top['root'] |
443 | + |
444 | + assert path.startswith(root) |
445 | + return path[len(root)+1:] |
446 | + |
447 | +def add_build(top, name, aroot): |
448 | + parts = intern_all(name.split('-') + ['']) |
449 | + arch, distro, cbuild, host, config = parts[:5] |
450 | + |
451 | + root, dirs, files = list(aroot) |
452 | + files = intern_all(files) |
453 | + |
454 | + top['hosts'][host] = arch |
455 | + top['arches'].setdefault(arch, {}) |
456 | + top['configs'].setdefault(config, {}) |
457 | + |
458 | + build = { |
459 | + 'name': name, |
460 | + 'arch': arch, |
461 | + 'distro': distro, |
462 | + 'cbuild': cbuild, |
463 | + 'host': host, |
464 | + 'config': config, |
465 | + 'abspath': aroot, |
466 | + 'path': make_path(top, aroot), |
467 | + 'files': files, |
468 | + 'failed': [], |
469 | + 'subfailures': [], |
470 | + 'testlog': None, |
471 | + 'started': None, |
472 | + 'finished': None, |
473 | + 'languages': [], |
474 | + 'binary': None, |
475 | + } |
476 | + |
477 | + for name in files: |
478 | + if 'failed' in name: |
479 | + build['failed'].append(name) |
480 | + |
481 | + if '-failed' in name: |
482 | + build['subfailures'].append(name.replace('-failed', ' ').split()[0]) |
483 | + |
484 | + elif name == 'started.txt': |
485 | + build['started'] = get_date(root, name) |
486 | + elif name == 'finished.txt': |
487 | + build['finished'] = get_date(root, name) |
488 | + elif name == 'gcc-configure.txt': |
489 | + with open('%s/%s' % (root, name)) as f: |
490 | + for line in f: |
491 | + if line.startswith('The following'): |
492 | + end = line.split(': ')[-1] |
493 | + build['languages'] = intern_all(end.strip().split(',')) |
494 | + break |
495 | + |
496 | + if name.endswith('-testsuite.txt'): |
497 | + build['testlog'] = make_path(top, '%s/%s' % (root, name)) |
498 | + |
499 | + return build |
500 | + |
501 | +def add_version(top, name, path): |
502 | + version = { |
503 | + 'name': name, |
504 | + 'abspath': path, |
505 | + 'builds': {} |
506 | + } |
507 | + |
508 | + root, dirs, files = list('%s/logs' % path) |
509 | + |
510 | + for dir in dirs: |
511 | + version['builds'][dir] = add_build(top, dir, '%s/%s' % (root, dir)) |
512 | + |
513 | + return version |
514 | + |
515 | +def add_versions(top, aroot, adirs): |
516 | + for dir in adirs: |
517 | + path = '%s/%s' % (aroot, dir) |
518 | + |
519 | + root, dirs, files = list(path) |
520 | + |
521 | + if 'logs' in dirs: |
522 | + version = add_version(top, dir, path) |
523 | + top['versions'][dir] = version |
524 | + |
525 | +def main(): |
526 | + top = { |
527 | + 'root': None, |
528 | + 'all': [], |
529 | + 'versions': {}, |
530 | + 'arches': {}, |
531 | + 'configs': {}, |
532 | + 'hosts': {}, |
533 | + } |
534 | + |
535 | + for path in sys.argv[1:]: |
536 | + root, dirs, files = os.walk(path).next() |
537 | + top['root'] = root |
538 | + add_versions(top, root, dirs) |
539 | + |
540 | + with open('builds.pickle', 'w') as f: |
541 | + cPickle.dump(top, f, cPickle.HIGHEST_PROTOCOL) |
542 | + |
543 | + with open('builds.json', 'w') as f: |
544 | + json.dump(top, f, indent=2) |
545 | + |
546 | +if __name__ == '__main__': |
547 | + main() |
548 | |
549 | === added file 'chain.sh' |
550 | --- chain.sh 1970-01-01 00:00:00 +0000 |
551 | +++ chain.sh 2014-07-01 15:58:40 +0000 |
552 | @@ -0,0 +1,75 @@ |
553 | +#!/bin/bash |
554 | +# |
555 | +# Chain a job on following another. Generally works on the latest. |
556 | +# |
557 | + |
558 | +set -e |
559 | +# If the pattern fails then propagate the error |
560 | +set -o pipefail |
561 | + |
562 | +. ~/.config/cbuild/toolsrc |
563 | + |
564 | +config=cortexa9hf |
565 | + |
566 | +type=$1 |
567 | +shift |
568 | +pattern=$1 |
569 | +shift |
570 | + |
571 | +# Set to run=echo for testing |
572 | +run= |
573 | + |
574 | +function get_latest { |
575 | + case $type in |
576 | + benchmarks | ubutest) |
577 | + # See what binaries we have locally that match |
578 | + version=$( ls -t $binaries/*${config}r?.tar* | head -n1 ) |
579 | + version=$( basename $( dirname $version )) |
580 | + ;; |
581 | + trunk | oecore) |
582 | + # Pick the latest snapshot |
583 | + version=$( ls -t $snapshots/$pattern | head -n1 ) |
584 | + version=$( basename $version ) |
585 | + version=$( echo $version | sed -r "s/(.+)\.tar.+/\1/" ) |
586 | + ;; |
587 | + *) |
588 | + exit -1 |
589 | + ;; |
590 | + esac |
591 | + |
592 | + [ -n "$version" ] |
593 | +} |
594 | + |
595 | +function spawn { |
596 | + target=$queue/$2/$1.job |
597 | + |
598 | + if [ ! -f $target ]; then |
599 | + PYTHONPATH=$lib/tcwg-web python -m schedulejob $2 $1 |
600 | + echo Spawned $1 into $2 |
601 | + fi |
602 | +} |
603 | + |
604 | +function chain { |
605 | + # Now do something |
606 | + case $type in |
607 | + benchmarks) |
608 | + spawn benchmarks-$version a9hf-ref |
609 | + spawn benchmarks-spec2000-$version a9hf-ref |
610 | + ;; |
611 | + ubutest) |
612 | + spawn ubutest-$version a9hf-ref |
613 | + ;; |
614 | + trunk) |
615 | + spawn $version a9hf-builder |
616 | + ;; |
617 | + oecore) |
618 | + spawn test-oecore-$version x86_64-heavy |
619 | + ;; |
620 | + *) |
621 | + exit -1 |
622 | + ;; |
623 | + esac |
624 | +} |
625 | + |
626 | +get_latest |
627 | +chain |
628 | |
629 | === added file 'check-leaks.sh' |
630 | --- check-leaks.sh 1970-01-01 00:00:00 +0000 |
631 | +++ check-leaks.sh 2014-07-01 15:58:40 +0000 |
632 | @@ -0,0 +1,14 @@ |
633 | +#!/bin/sh |
634 | +# |
635 | +# Watch the build directory for any results that shouldn't be there |
636 | +# |
637 | + |
638 | +. ~/.config/cbuild/toolsrc |
639 | + |
640 | +benchmarks="coremark eembc denbench spec" |
641 | + |
642 | +check=$www/build/ |
643 | + |
644 | +for i in $benchmarks; do |
645 | + find $check -name "$i*run.txt" | sed "s#$check##" | head -n 10 |
646 | +done |
647 | |
648 | === added file 'closest.py' |
649 | --- closest.py 1970-01-01 00:00:00 +0000 |
650 | +++ closest.py 2014-07-01 15:58:40 +0000 |
651 | @@ -0,0 +1,63 @@ |
652 | +#!/usr/bin/env python |
653 | +# Find the closest base filename. Finds the one with the longest |
654 | +# common prefix |
655 | +# |
656 | + |
657 | +import sys |
658 | +import os.path |
659 | +import re |
660 | + |
661 | +def match(find, path): |
662 | + """Check a build against a queue, returning the common prefix if |
663 | + the needle starts with the queue name. |
664 | + |
665 | + The longest match is the closest. |
666 | + |
667 | + >>> match('foo', 'baz') |
668 | + >>> match('gcc-foo-bar', 'gcc') |
669 | + 'gcc' |
670 | + >>> match('gcc-foo-bar', 'gcc-foo') |
671 | + 'gcc-foo' |
672 | + >>> match('gcc-foo-bar', 'default') |
673 | + '' |
674 | + """ |
675 | + base = os.path.basename(path) |
676 | + |
677 | + if find.startswith(base): |
678 | + return base |
679 | + elif base == 'default': |
680 | + return '' |
681 | + else: |
682 | + return None |
683 | + |
684 | +def match2(find, path): |
685 | + base = os.path.basename(path) |
686 | + return os.path.commonprefix([find, base]) |
687 | + |
688 | +def main(): |
689 | + against = os.path.basename(sys.argv[1]) |
690 | + |
691 | + prefixes = [] |
692 | + |
693 | + for full in sys.argv[2:]: |
694 | + key = match(against, full) |
695 | + |
696 | + if key != None: |
697 | + prefixes.append((full, key)) |
698 | + |
699 | + if not prefixes: |
700 | + for full in sys.argv[2:]: |
701 | + key = match2(against, full) |
702 | + |
703 | + if key != None: |
704 | + prefixes.append((full, key)) |
705 | + |
706 | + prefixes.sort(key=lambda x: -len(x[-1])) |
707 | + print prefixes[0][0] |
708 | + |
709 | +def test(): |
710 | + import doctest |
711 | + doctest.testmod() |
712 | + |
713 | +if __name__ == '__main__': |
714 | + main() |
715 | |
716 | === added file 'cron-often.sh' |
717 | --- cron-often.sh 1970-01-01 00:00:00 +0000 |
718 | +++ cron-often.sh 2014-07-01 15:58:40 +0000 |
719 | @@ -0,0 +1,56 @@ |
720 | +#!/bin/bash |
721 | +# |
722 | +# Job that runs often, polling builds, branches, and others. |
723 | +# |
724 | + |
725 | +# Root of the www directory |
726 | +. ~/.config/cbuild/toolsrc |
727 | + |
728 | +timeout="timeout 3600" |
729 | +#timeout= |
730 | + |
731 | +cd $ctools |
732 | + |
733 | +# Update the list of all files and other caches |
734 | +function allfiles { |
735 | + pushd $1 > /dev/null |
736 | + find . | sort > /tmp/all-files.txt |
737 | + |
738 | + # Only update if changed |
739 | + ! cmp -s all-files.txt /tmp/all-files.txt \ |
740 | + && python ~/lib/cbuild-tools/cache-builds.py . \ |
741 | + && mv -f /tmp/all-files.txt . \ |
742 | + && gzip -9fc --rsyncable all-files.txt > all-files.txt.gz \ |
743 | + && gzip -9fc --rsyncable builds.json > builds.json.gz |
744 | + |
745 | + popd > /dev/null |
746 | +} |
747 | + |
748 | +# Tabulate any new benchmarks |
749 | +./tabulate.sh |
750 | + |
751 | +# Update the file list cache |
752 | +allfiles $build |
753 | +allfiles $benchmarks |
754 | + |
755 | +# Pull any new merge requests |
756 | +timeout 14400 python taker.py -f prompt=false |
757 | + |
758 | +# Add the testsuite diffs to anything new |
759 | +python2.7 addtestdiffs.py $build/all-files.txt $snapshots $build |
760 | +python2.7 addtestdiffs.py $benchmarks/all-files.txt $snapshots $benchmarks |
761 | + |
762 | +# Rebuild all files after the diffs are added |
763 | +allfiles $build |
764 | +allfiles $benchmarks |
765 | + |
766 | +# Poll all interesting Linaro branches |
767 | +$timeout ./export-git.sh $repos/meta-linaro meta-linaro-0~ |
768 | +$timeout ./pull_branch.sh $repos/gcc-linaro-4.8/4.8 gcc-linaro-4.8 |
769 | +$timeout ./pull_branch.sh $repos/gcc-linaro/4.7 gcc-linaro-4.7 |
770 | +$timeout ./pull_branch.sh $repos/gcc-linaro/4.6 gcc-linaro-4.6 |
771 | +$timeout ./pull_branch.sh $repos/crosstool-ng/linaro crosstool-ng-linaro-1.13.1 |
772 | +$timeout ./pull_branch.sh $repos/gdb-linaro/7.5 gdb-linaro-7.5 |
773 | +$timeout ./pull_branch.sh $repos/cortex-strings cortex-strings-1.0 |
774 | +$timeout ./export-git.sh $repos/qemu-linaro qemu-linaro-1.0~ |
775 | +$timeout ./export-git.sh $repos/boot-wrapper boot-wrapper-0~ |
776 | |
777 | === added file 'ctimes.py' |
778 | --- ctimes.py 1970-01-01 00:00:00 +0000 |
779 | +++ ctimes.py 2014-07-01 15:58:40 +0000 |
780 | @@ -0,0 +1,100 @@ |
781 | +"""Parse the scheduler log and print the time spent transitioning from |
782 | +state to state for each host. |
783 | +""" |
784 | +import sys |
785 | +import fileinput |
786 | +import collections |
787 | +import pprint |
788 | + |
789 | +Edge = collections.namedtuple('Edge', 'a b took time') |
790 | + |
791 | +def parse(): |
792 | + hosts = {} |
793 | + lasts = {} |
794 | + first = None |
795 | + |
796 | + for line in fileinput.input(): |
797 | + parts = line.split() |
798 | + stamp, host, state, arg = (parts + ['']*4)[:4] |
799 | + stamp = float(stamp) |
800 | + |
801 | + if first == None: |
802 | + first = stamp |
803 | + |
804 | + if arg: |
805 | + state = arg |
806 | + |
807 | + last = lasts.get(host, None) |
808 | + lasts[host] = (stamp, state) |
809 | + |
810 | + if last: |
811 | + elapsed = stamp - last[0] |
812 | + hosts.setdefault(host, []).append(Edge(last[1], state, elapsed, stamp - first)) |
813 | + |
814 | + return hosts |
815 | + |
816 | +def cost(): |
817 | + hosts = parse() |
818 | + |
819 | + print('\t'.join('day host job took'.split())) |
820 | + |
821 | + for host in hosts: |
822 | + doing = None |
823 | + start = None |
824 | + |
825 | + for edge in hosts[host]: |
826 | + if edge.b == 'idle': |
827 | + if doing and start: |
828 | + pretty = doing.replace('-extract-top', '') |
829 | + print('%.1f\t%s\t\t%s\t%.1f' % (edge.time/3600/24, host, pretty, (edge.time - start)/3600)) |
830 | + doing = None |
831 | + |
832 | + if edge.b in ['lurking', 'idle']: |
833 | + start = edge.time |
834 | + doing = None |
835 | + |
836 | + if edge.b in ['starting', 'idle', 'updating', 'running', 'lurking']: |
837 | + # Skip |
838 | + pass |
839 | + elif 'fetch-binary' in edge.b: |
840 | + # Skip |
841 | + pass |
842 | + else: |
843 | + if doing == None: |
844 | + doing = edge.b |
845 | + |
846 | +def main(): |
847 | + hosts = parse() |
848 | + |
849 | + # Go through and bin them |
850 | + bins = {} |
851 | + |
852 | + for host, edges in hosts.items(): |
853 | + bins[host] = {} |
854 | + |
855 | + for edge in edges: |
856 | + key = '%s -> %s' % (edge.a, edge.b) |
857 | + |
858 | + if edge.a in ['lurking', 'running', 'idle']: |
859 | + continue |
860 | + |
861 | + if key in ['idle -> updating', 'running -> idle', 'updating -> running']: |
862 | + continue |
863 | + |
864 | + bins[host].setdefault(key, []).append(edge.took) |
865 | + |
866 | + for host in sorted(bins): |
867 | + keys = bins[host] |
868 | + print('%s:' % host) |
869 | + |
870 | + for key in sorted(keys): |
871 | + values = keys[key] |
872 | + print(' %s: %.1f' % (key, sum(values)/len(values)), end=' ') |
873 | + |
874 | + for value in values: |
875 | + print('%.0f' % value, end=' ') |
876 | + |
877 | + print() |
878 | + |
879 | +if __name__ == '__main__': |
880 | + cost() |
881 | |
882 | === added file 'difftests.sh' |
883 | --- difftests.sh 1970-01-01 00:00:00 +0000 |
884 | +++ difftests.sh 2014-07-01 15:58:40 +0000 |
885 | @@ -0,0 +1,45 @@ |
886 | +#!/bin/bash |
887 | +# |
888 | +# Take all of the .sum files from two directories and generate the differences |
889 | +# |
890 | +# Example: |
891 | +# difftests.sh \ |
892 | +# build/gcc-4.6+svn173209/logs/armv7l-maverick-cbuild113-ursa3-cortexa9r1 \ |
893 | +# build/gcc-4.6+svn173722/logs/armv7l-maverick-cbuild114-ursa4-cortexa9r1 |
894 | +# |
895 | + |
896 | +set -e |
897 | + |
898 | +base=$1 |
899 | +next=$2 |
900 | + |
901 | +dir=`mktemp -d difftests.XXXXXXXXXX` |
902 | +trap "rm -rf $dir" EXIT |
903 | + |
904 | +mkdir -p $dir/base |
905 | +mkdir -p $dir/next |
906 | + |
907 | +# Copy across all logs |
908 | +cp $base/*.sum* $dir/base |
909 | +cp $next/*.sum* $dir/next |
910 | +unxz -f $dir/base/*.xz $dir/next/*.xz |
911 | + |
912 | +# Pull out just the PASS/FAIL/etc lines and sort by test name |
913 | +# * Change absolute path names to .../ |
914 | +# * Drop all limits tests |
915 | +# |
916 | +for i in `find $dir -name "*.sum"`; do |
917 | + grep -E '^[A-Z]+:' $i \ |
918 | + | grep -Ev limits- \ |
919 | + | grep -Ev /guality/ \ |
920 | + | sed -r 's#/scratch/\w+/\w+/\w+/\w+/[^/]+#...#g' \ |
921 | + | sed -r "s#UNSUPPORTED: .+/testsuite/#UNSUPPORTED: #" \ |
922 | + | sort -k 2 > $i.tmp |
923 | + mv $i.tmp $i |
924 | +done |
925 | + |
926 | +# diff returns non-zero if there is a difference |
927 | +set +e |
928 | +(cd $dir && diff -U 0 -r base next) > $dir/diff.txt |
929 | +# Drop anything but changes in test lines |
930 | +grep -E '^[+-][A-Z]' $dir/diff.txt || true |
931 | |
932 | === added file 'ec2-pull.sh' |
933 | --- ec2-pull.sh 1970-01-01 00:00:00 +0000 |
934 | +++ ec2-pull.sh 2014-07-01 15:58:40 +0000 |
935 | @@ -0,0 +1,28 @@ |
936 | +#!/bin/bash |
937 | +# |
938 | +# Pull the results from the cloud to the build server |
939 | +# |
940 | + |
941 | +set -e |
942 | + |
943 | +. ~/.config/cbuild/toolsrc |
944 | + |
945 | +s3=linaro-toolchain-builds |
946 | + |
947 | +dir=`mktemp -d` |
948 | +trap "rm -rf $dir" EXIT |
949 | + |
950 | +mkdir -p $dir/build |
951 | +cd $dir |
952 | + |
953 | +# Check what's there and fetch, aborting if it takes too long |
954 | +candidates=$( timeout 600 s3cmd ls s3://$s3 | awk '{print $4;}' | grep 'logs.tar.xz$' ) |
955 | + |
956 | +for i in $candidates; do |
957 | + base=$( basename $i ) |
958 | + # Get, extract, push, and remove |
959 | + timeout 600 s3cmd get --skip-existing --no-progress $i && \ |
960 | + tar xaf $base && \ |
961 | + rsync -rtC build $www && \ |
962 | + s3cmd del $i |
963 | +done |
964 | |
965 | === added file 'ec2-spawn.sh' |
966 | --- ec2-spawn.sh 1970-01-01 00:00:00 +0000 |
967 | +++ ec2-spawn.sh 2014-07-01 15:58:40 +0000 |
968 | @@ -0,0 +1,36 @@ |
969 | +#!/bin/bash |
970 | + |
971 | +function spawn { |
972 | + host=$1 |
973 | + ami=$2 |
974 | + type=$3 |
975 | + |
976 | + dir=/tmp/ec2slave/$host |
977 | + mkdir -p $dir |
978 | + cd $dir |
979 | + |
980 | + ls /var/run/screen/S-cbuild/*.$host > /dev/null 2>&1 || \ |
981 | + screen -S $host -L -d -m bash -c "cd $HOME/lib/cbuild-tools && exec ./ec2slave.sh $host $ami $type" |
982 | +} |
983 | + |
984 | +# us-east1 x86_64 EBS Natty |
985 | +#spawn oort1 ami-fd589594 c1.xlarge |
986 | +# Precise |
987 | +spawn oort1 ami-a29943cb c1.xlarge |
988 | +#spawn oort3 ami-fd589594 c1.xlarge |
989 | +# us-east1 i686 EBS Natty |
990 | +#spawn oort2 ami-81c31ae8 c1.medium |
991 | +# Precise |
992 | +spawn oort2 ami-ac9943c5 c1.medium |
993 | +#spawn oort4 ami-ac9943c5 c1.medium |
994 | +spawn oort6 ami-ac9943c5 c1.medium |
995 | +#spawn oort8 ami-ac9943c5 c1.medium |
996 | +#spawn oort4 ami-e358958a c1.medium |
997 | +#spawn oort6 ami-e358958a c1.medium |
998 | +#spawn oort8 ami-e358958a c1.medium |
999 | +#spawn oort6 ami-e358958a c1.medium |
1000 | +#spawn oort8 ami-06ad526f c1.medium |
1001 | +#spawn oort11 ami-d5e54dbc c1.xlarge |
1002 | +spawn oort12 ami-dfe54db6 c1.medium |
1003 | +#spawn oort14 ami-37af765e c1.medium |
1004 | +#spawn oort14 ami-87dd03ee c1.medium |
1005 | |
1006 | === added file 'ec2new.sh' |
1007 | --- ec2new.sh 1970-01-01 00:00:00 +0000 |
1008 | +++ ec2new.sh 2014-07-01 15:58:40 +0000 |
1009 | @@ -0,0 +1,23 @@ |
1010 | +#!/bin/bash |
1011 | +# |
1012 | +# Runs an EC2 slave for local use |
1013 | +# |
1014 | +# Usage: ec2new.sh [hostname [ami [type]]] |
1015 | +# |
1016 | +# such as: |
1017 | +# ec2new.sh oort51 ami-1aad5273 c1.xlarge |
1018 | +# ec2new.sh oort52 ami-06ad526f c1.medium |
1019 | +# |
1020 | + |
1021 | +set -e |
1022 | + |
1023 | +# Don't kill on exit... |
1024 | +kill=false |
1025 | +. `dirname $0`/libec2.sh |
1026 | + |
1027 | +echo Starting an instance |
1028 | +start_instance |
1029 | +init_instance |
1030 | + |
1031 | +echo instance: $instance |
1032 | +echo ip: $ip |
1033 | |
1034 | === added file 'ec2slave-init-cbuild.sh' |
1035 | --- ec2slave-init-cbuild.sh 1970-01-01 00:00:00 +0000 |
1036 | +++ ec2slave-init-cbuild.sh 2014-07-01 15:58:40 +0000 |
1037 | @@ -0,0 +1,36 @@ |
1038 | +#!/bin/sh |
1039 | +# |
1040 | +# Second stage where the cbuild user sets up |
1041 | +# |
1042 | +# Most configuration is in files shipped with this script |
1043 | + |
1044 | +touch ~/.in-cloud |
1045 | + |
1046 | +# Setup the SSH config |
1047 | +cat > ~/.ssh/config <<EOF |
1048 | +# Easier to set the port here than everywhere |
1049 | +Host cbuild-master |
1050 | + ProxyCommand ssh cbuild.validation.linaro.org nc -q0 toolchain64 %p |
1051 | +EOF |
1052 | + |
1053 | +# Set up the cbuild global config |
1054 | +cat > ~/.config/cbuild/cbuildrc <<EOF |
1055 | +# Nothing |
1056 | +EOF |
1057 | + |
1058 | +# Seed the known hosts with keys |
1059 | +for i in linaro-gateway bazaar.launchpad.net validation.linaro.org cbuild.validation.linaro.org $(hostname) localhost cbuild-master; do |
1060 | + ssh -o StrictHostKeyChecking=no -o BatchMode=yes $i true || true |
1061 | +done |
1062 | + |
1063 | +# Checkout the cbuild scripts |
1064 | +if [ ! -d /cbuild/.bzr ]; then |
1065 | + bzr lp-login cbuild |
1066 | + bzr branch --use-existing-dir lp:cbuild /cbuild |
1067 | +fi |
1068 | + |
1069 | +cd /cbuild |
1070 | +bzr pull |
1071 | + |
1072 | +# Seed files from the nice-and-close S3 |
1073 | +s3cmd sync s3://linaro-toolchain-files files |
1074 | |
1075 | === added file 'ec2slave-init.sh' |
1076 | --- ec2slave-init.sh 1970-01-01 00:00:00 +0000 |
1077 | +++ ec2slave-init.sh 2014-07-01 15:58:40 +0000 |
1078 | @@ -0,0 +1,105 @@ |
1079 | +#!/bin/sh |
1080 | + |
1081 | +# Fail on any error |
1082 | +set -e |
1083 | + |
1084 | +# Automatically kill the slave if it runs for too long |
1085 | +echo /sbin/poweroff | at now + 72 hours |
1086 | + |
1087 | +# Set the hostname |
1088 | +hostname=${1:-oort99} |
1089 | + |
1090 | +echo $hostname > /etc/hostname |
1091 | +hostname $hostname |
1092 | +echo 127.0.0.1 $hostname proxy >> /etc/hosts |
1093 | + |
1094 | +# Bind mount /cbuild |
1095 | +mkdir -p /cbuild /mnt/cbuild |
1096 | +mount -o bind /mnt/cbuild /cbuild |
1097 | +chmod a+w /cbuild |
1098 | + |
1099 | +# Use the S3 backed archive |
1100 | +sed -i.dist 's,archive.ubuntu.com,archive.ubuntu.com.s3.amazonaws.com,g' /etc/apt/sources.list |
1101 | +# without pipelining to make it more reliable |
1102 | +echo "Acquire::http::Pipeline-Depth 0;" > /etc/apt/apt.conf.d/99no-pipelining |
1103 | + |
1104 | +flags="-yq --force-yes" |
1105 | + |
1106 | +# Install the basic packages |
1107 | +apt-get update || true |
1108 | +apt-get install $flags puppet jed python-software-properties |
1109 | + |
1110 | +# Add in the PPA |
1111 | +#add-apt-repository ppa:linaro-toolchain-dev/build-deps |
1112 | +#apt-get update || true |
1113 | + |
1114 | +# Basics |
1115 | +apt-get build-dep $flags gcc binutils gdb |
1116 | +apt-get build-dep $flags gcc gdb eglibc ffmpeg llvm python libvorbis |
1117 | +apt-get build-dep $flags gcc-4.6 || \ |
1118 | + apt-get build-dep $flags gcc-4.5 || \ |
1119 | + apt-get build-dep $flags gcc-snapshot |
1120 | + |
1121 | +# For llvm |
1122 | +apt-get install $flags python-sphinx || true |
1123 | + |
1124 | +# For cross test. Missing on lucid |
1125 | +apt-get install $flags qemu-kvm qemu-user qemu-system || true |
1126 | + |
1127 | +# Needed for baremetal and graphite |
1128 | +apt-get install $flags binutils-arm-none-eabi libcloog-ppl-dev || true |
1129 | + |
1130 | +# Needed for cbuild |
1131 | +apt-get install $flags time python-irclib wget gnupg ccrypt rsync coreutils \ |
1132 | + sendemail bzr xdelta3 xz-utils unzip openssh-client gdb libio-socket-ssl-perl libnet-ssleay-perl |
1133 | + |
1134 | +# Needed for ubutest |
1135 | +apt-get build-dep $flags eglibc base-files base-passwd bash coreutils dash \ |
1136 | + debianutils diffutils dpkg e2fsprogs findutils grep gzip hostname ncurses \ |
1137 | + perl python-defaults sed shadow tar util-linux |
1138 | + |
1139 | +# Needed for qemu |
1140 | +apt-get install $flags libpixman-1-dev |
1141 | + |
1142 | +apt-get build-dep $flags python2.7 || true |
1143 | +# devscripts asks a question... |
1144 | +echo 1 | apt-get install $flags devscripts || true |
1145 | + |
1146 | +# Needed for crosstool-NG LSB based builds |
1147 | +apt-get install $flags lsb lsb-build-cc3 lsb-appchk3 ccache gcc-mingw32 flip tk tofrodos |
1148 | +apt-get install $flags gcc-4.1 g++-4.1 || true |
1149 | +apt-get install $flags gcc-4.1-multilib g++-4.1-multilib || true |
1150 | + |
1151 | +# Needed for cloud builds |
1152 | +apt-get install $flags polipo s3cmd |
1153 | + |
1154 | +# Use Bash as /bin/sh. Michael's fault... |
1155 | +echo n | dpkg-reconfigure -f teletype dash |
1156 | + |
1157 | +# Set up the cbuild user |
1158 | +cb=/home/cbuild |
1159 | + |
1160 | +adduser --disabled-password --gecos cbuild cbuild |
1161 | +tar xaf *seed*.tar* -C /home/cbuild --strip-components=1 |
1162 | + |
1163 | +chown -R cbuild.cbuild $cb |
1164 | +chmod -R go-rwx $cb/.ssh |
1165 | + |
1166 | +# Finally put /tmp in RAM for faster builds |
1167 | +mount -t tmpfs tmfs /tmp |
1168 | + |
1169 | +# Break multiarch |
1170 | +multi=`dpkg-architecture -qDEB_BUILD_MULTIARCH || true` |
1171 | + |
1172 | +if [ ! -z "$multi" ]; then |
1173 | + pushd /usr/lib |
1174 | + cd /usr/lib && ls $multi/*crt*.o | xargs -L 1 ln -sf |
1175 | + cd /usr/include |
1176 | + |
1177 | + for i in asm gnu bits sys; do |
1178 | + if [ -d $multi/$i ]; then |
1179 | + rm -rf $i |
1180 | + ln -sf $multi/$i |
1181 | + fi |
1182 | + done |
1183 | +fi |
1184 | |
1185 | === added file 'ec2slave.sh' |
1186 | --- ec2slave.sh 1970-01-01 00:00:00 +0000 |
1187 | +++ ec2slave.sh 2014-07-01 15:58:40 +0000 |
1188 | @@ -0,0 +1,65 @@ |
1189 | +#!/bin/bash |
1190 | +# |
1191 | +# Runs an EC2 slave that builds cbuild jobs |
1192 | +# |
1193 | +# Usage: ec2slave.sh [hostname [ami [type]]] |
1194 | +# |
1195 | +# such as: |
1196 | +# ec2slave.sh oort2 ami-689c6701 c1.medium |
1197 | +# ec2slave.sh oort1 ami-40926929 c1.xlarge |
1198 | +# |
1199 | + |
1200 | +set -e |
1201 | + |
1202 | +. `dirname $0`/libec2.sh |
1203 | + |
1204 | +function kill_old { |
1205 | + if [ "$instance" != "" ]; then |
1206 | + now=`date +%s` |
1207 | + elapsed=$(($now-$used)) |
1208 | + |
1209 | + if [ "$elapsed" -gt "600" ]; then |
1210 | + echo Killing $instance due to inactivity |
1211 | + $lock ec2kill $instance |
1212 | + instance= |
1213 | + fi |
1214 | + fi |
1215 | +} |
1216 | + |
1217 | +function update_status { |
1218 | + if [ "$instance" == "" ]; then |
1219 | + $wget -O - $SCHEDULER_API/update/$hostname/lurking |
1220 | + else |
1221 | + $wget -O - $SCHEDULER_API/update/$hostname/idle |
1222 | + fi |
1223 | +} |
1224 | + |
1225 | +DRIFT=$(($RANDOM % 30)) |
1226 | + |
1227 | +while true; do |
1228 | + echo $hostname $instance `date` |
1229 | + |
1230 | + sleep $((180 + $DRIFT)) |
1231 | + |
1232 | + kill_old |
1233 | + update_status |
1234 | + |
1235 | + # See if there's a job waiting |
1236 | + next=`$wget -O - $SCHEDULER_API/peek/$hostname` |
1237 | + |
1238 | + case "$next" in |
1239 | + False) continue ;; |
1240 | + esac |
1241 | + |
1242 | + echo Running a job |
1243 | + |
1244 | + if [ "$instance" == "" ]; then |
1245 | + echo Starting an instance |
1246 | + start_instance |
1247 | + init_instance |
1248 | + used=`date +%s` |
1249 | + fi |
1250 | + |
1251 | + ssh -o StrictHostKeyChecking=no -o BatchMode=yes -t cbuild@$ip 'cd /cbuild && ./run.sh' |
1252 | + used=`date +%s` |
1253 | +done |
1254 | |
1255 | === added file 'ec2watch.sh' |
1256 | --- ec2watch.sh 1970-01-01 00:00:00 +0000 |
1257 | +++ ec2watch.sh 2014-07-01 15:58:40 +0000 |
1258 | @@ -0,0 +1,18 @@ |
1259 | +#!/bin/bash |
1260 | +# |
1261 | +# Watch the EC2 instances so we can see where the money goes |
1262 | +# |
1263 | +# Terribly ugly hack. Add this as a cron job that runs once every |
1264 | +# half hour. Spits the results of ec2din into a timestamped file under |
1265 | +# $HOME/var. |
1266 | +# |
1267 | + |
1268 | +# Pull in the local configuration |
1269 | +. ~/.private/ec2slave.rc |
1270 | + |
1271 | +dir=$HOME/var/log/ec2watch |
1272 | +month=$(date +%Y-%m) |
1273 | +now=$(date +%F-%H-%M) |
1274 | + |
1275 | +mkdir -p $dir/$month |
1276 | +ec2din --show-empty-fields | gzip > $dir/$month/$now.txt.gz |
1277 | |
1278 | === added file 'enrich-gcc-linaro-repo' |
1279 | --- enrich-gcc-linaro-repo 1970-01-01 00:00:00 +0000 |
1280 | +++ enrich-gcc-linaro-repo 2014-07-01 15:58:40 +0000 |
1281 | @@ -0,0 +1,45 @@ |
1282 | +#!/usr/bin/python |
1283 | +# -*- coding: utf-8 -*- |
1284 | +# |
1285 | +# Copies revisions from branches into the dev focus branch as it's |
1286 | +# also the stacked branch. This speeds up branching when using a |
1287 | +# shared repo. |
1288 | +# |
1289 | +# Runs as a cron job |
1290 | +# |
1291 | +# Author: Loïc Minier <loic.minier@linaro.org> |
1292 | +# |
1293 | + |
1294 | +# took 6mn41s on the first run |
1295 | +# 18s on the second run |
1296 | +# |
1297 | + |
1298 | +import sys |
1299 | + |
1300 | +import bzrlib.repository |
1301 | +import bzrlib.plugin |
1302 | + |
1303 | +#bzrlib.initialize() |
1304 | +bzrlib.plugin.load_plugins() |
1305 | + |
1306 | +r_4_7 = bzrlib.repository.Repository.open('lp:gcc-linaro/4.7') |
1307 | +r_4_6 = bzrlib.repository.Repository.open('lp:gcc-linaro/4.6') |
1308 | +#r_4_5 = bzrlib.repository.Repository.open('lp:gcc-linaro/4.5') |
1309 | +#r_4_4 = bzrlib.repository.Repository.open('lp:gcc-linaro/4.4') |
1310 | +# copy revisions from 4.6 branches into 4.7 which is the devfocus |
1311 | +# branch because it's used as a stacked branch |
1312 | +# 4.4 and 4.5 aren't updated anymore |
1313 | +r_4_7.fetch(r_4_6) |
1314 | +#r_4_6.fetch(r_4_5) |
1315 | +#r_4_6.fetch(r_4_4) |
1316 | +# copy revisions from 4.6 into 4.4 and 4.5 once to workaround LP #388269; only |
1317 | +# needed once ever |
1318 | +# copying 4.5 into 4.4 took 52mn55s on the first run and 51s on second run |
1319 | +# (with three up-to-date enriching branches enabled) |
1320 | +# copying 4.5 into 4.6 took 1h45mn13s on the first run and 48s on second run |
1321 | +# (with three maybe not fully up-to-date enriching branches enabled) |
1322 | +#r_4_4.fetch(r_4_6) |
1323 | +#r_4_5.fetch(r_4_6) |
1324 | +r_4_6.fetch(r_4_7) |
1325 | + |
1326 | +print sys.argv[0], 'done' |
1327 | \ No newline at end of file |
1328 | |
1329 | === added file 'export-bzr.sh' |
1330 | --- export-bzr.sh 1970-01-01 00:00:00 +0000 |
1331 | +++ export-bzr.sh 2014-07-01 15:58:40 +0000 |
1332 | @@ -0,0 +1,85 @@ |
1333 | +#!/bin/sh |
1334 | +# Update a branch and snapshot the latest revision |
1335 | +# Can also snapshot past revisions and add a per-user |
1336 | +# suffix |
1337 | +# |
1338 | +# Example: export-bzr.sh ~/repos/gcc-linaro/4.6 gcc-linaro-4.6 [revno [suffix [basedir]]] |
1339 | +# |
1340 | +set -e |
1341 | + |
1342 | +. ~/.config/cbuild/toolsrc |
1343 | + |
1344 | +dir=`mktemp -d` |
1345 | +trap "rm -rf $dir" EXIT |
1346 | + |
1347 | +# Directory this script runs from |
1348 | +here=`dirname $0` |
1349 | + |
1350 | +# Suffix for manual pulls |
1351 | +SUFFIX=${4:-} |
1352 | + |
1353 | +# Pull a branch and tarball it up |
1354 | +branch=$1 |
1355 | +head=$2 |
1356 | + |
1357 | +latest=`bzr revno $branch` |
1358 | +revno=${3:-$latest} |
1359 | + |
1360 | +base=${5:-$branch} |
1361 | + |
1362 | +stem="$head+bzr$revno$SUFFIX" |
1363 | +tar="/tmp/$stem.tar" |
1364 | +meta="/tmp/$stem.txt" |
1365 | +diff="/tmp/$stem.diff" |
1366 | +xdelta="$tar.xdelta3.xz" |
1367 | + |
1368 | +if [ -f $snapshots/`basename $xdelta` ]; then |
1369 | + exit 0 |
1370 | +fi |
1371 | + |
1372 | +closest=`$here/closest.py $tar $snapshots/base/*.tar` |
1373 | + |
1374 | +mkdir -p $snapshots |
1375 | + |
1376 | +ancestor=`bzr revision-info -d $branch -r ancestor:$base` |
1377 | + |
1378 | +# Record some meta data about the branch |
1379 | +echo \# Branch summary > $meta |
1380 | +echo name: $stem >> $meta |
1381 | +echo snapshot-date: `date --rfc-3339=seconds` >> $meta |
1382 | +bzr log -l1 $branch | grep -E '^(committer|timestamp):' >> $meta |
1383 | +echo directory: $branch >> $meta |
1384 | +echo branch: `bzr info $branch | grep parent | awk '{print $3;}'` >> $meta |
1385 | +echo mainline-branch: $base >> $meta |
1386 | +echo revno: $revno >> $meta |
1387 | +echo revision: `bzr revision-info -d $branch -r $revno` >> $meta |
1388 | +echo tip: `bzr revision-info -d $branch` >> $meta |
1389 | +echo ancestor: $ancestor >> $meta |
1390 | +echo closest: `basename $closest` >> $meta |
1391 | +echo xdelta3: `basename $xdelta` >> $meta |
1392 | + |
1393 | +id=`echo $ancestor | awk '{ print $2; }'` |
1394 | + |
1395 | +# Create the diff |
1396 | +if [ "$id" != "$revno" ]; then |
1397 | + bzr diff -r $id $branch > $diff || true |
1398 | +fi |
1399 | + |
1400 | +echo Exporting and stamping $revno |
1401 | +bzr export -r $revno $dir/$stem $branch |
1402 | +REVNO=$revno $here/stamp_branch.sh $branch $dir/$stem |
1403 | + |
1404 | +# Create the tarball |
1405 | +echo Creating $tar |
1406 | +(cd $dir && find $stem -print0 | sort -z) | tar caf $tar -C $dir --no-recursion --null -T - |
1407 | + |
1408 | +# Create the delta |
1409 | +echo Making the delta relative to $closest |
1410 | +xdelta3 -fe -s $closest $tar $tar.xdelta3 |
1411 | +rm -f $tar |
1412 | +xz -f $tar.xdelta3 |
1413 | + |
1414 | +mv -f $tar*.xz $diff $meta $snapshots |
1415 | + |
1416 | +echo New snapshot `basename $tar` created |
1417 | +echo stem: $stem |
1418 | |
1419 | === added file 'export-git.sh' |
1420 | --- export-git.sh 1970-01-01 00:00:00 +0000 |
1421 | +++ export-git.sh 2014-07-01 15:58:40 +0000 |
1422 | @@ -0,0 +1,75 @@ |
1423 | +#!/bin/sh |
1424 | +# |
1425 | +# Update a git branch and snapshot the latest revision |
1426 | +# |
1427 | + |
1428 | +set -e |
1429 | + |
1430 | +. ~/.config/cbuild/toolsrc |
1431 | + |
1432 | +dir=`mktemp -d` |
1433 | +trap "rm -rf $dir" EXIT |
1434 | + |
1435 | +# Directory this script runs from |
1436 | +here=`dirname $0` |
1437 | + |
1438 | +# Suffix for manual pulls |
1439 | +SUFFIX=${4:-} |
1440 | + |
1441 | +# Subdirectory to archive. Needed for Android GCC |
1442 | +#SUBDIR= |
1443 | + |
1444 | +# Pull a branch and tarball it up |
1445 | +branch=$1 |
1446 | +head=${2:-`basename $branch`} |
1447 | +id=${3:-HEAD} |
1448 | + |
1449 | +export GIT_DIR=$branch/.git |
1450 | +export GIT_WORK_TREE=$branch |
1451 | + |
1452 | +(cd $branch && git reset --hard -q && git pull -q) |
1453 | +latest=`git rev-parse $id` |
1454 | +short=`echo $latest | cut -c -7` |
1455 | +now=`date +%Y%m%d` |
1456 | + |
1457 | +name=$head$now+git$short$SUFFIX |
1458 | +tar=/tmp/$name.tar |
1459 | +xdelta=$tar.xdelta3.xz |
1460 | + |
1461 | +if ls $snapshots/$head*git$short*xz > /dev/null 2>&1; then |
1462 | + exit 0 |
1463 | +fi |
1464 | + |
1465 | +mkdir -p $snapshots |
1466 | +echo Exporting $short |
1467 | +git archive --format=tar --prefix=$name/ $id $SUBDIR | tar xf - -C $dir |
1468 | + |
1469 | +if [ -n "$SUBDIR" ]; then |
1470 | + # We've ended up with $dir/$head/$SUBDIR. Shuffle them about |
1471 | + mv $dir/$head/$SUBDIR $dir |
1472 | + rm -r $dir/$head |
1473 | + mv $dir/$SUBDIR $dir/$head |
1474 | +fi |
1475 | + |
1476 | +# Create the tarball |
1477 | +echo Creating $tar |
1478 | +(cd $dir && find $name -print0 | sort -z) | tar caf $tar -C $dir --no-recursion --null -T - |
1479 | + |
1480 | +# Use gzip as an approximation to how well it will compress |
1481 | +size=$( gzip < $tar | wc -c ) |
1482 | + |
1483 | +if [ "$size" -lt "10000000000" ]; then |
1484 | + # Small enough to just tar |
1485 | + xz -9f $tar |
1486 | +else |
1487 | + # Create the delta |
1488 | + closest=`$here/closest.py $tar $snapshots/base/*.tar` |
1489 | + echo Making the delta relative to $closest |
1490 | + xdelta3 -fe -s $closest $tar $tar.xdelta3 |
1491 | + rm -f $tar |
1492 | + xz -9f $tar.xdelta3 |
1493 | +fi |
1494 | + |
1495 | +mv $tar*.xz $snapshots |
1496 | +echo New snapshot `basename $tar` created |
1497 | +$here/spawn.sh $tar |
1498 | |
1499 | === added file 'export-hg.sh' |
1500 | --- export-hg.sh 1970-01-01 00:00:00 +0000 |
1501 | +++ export-hg.sh 2014-07-01 15:58:40 +0000 |
1502 | @@ -0,0 +1,61 @@ |
1503 | +#!/bin/sh |
1504 | +# Update a branch and snapshot the latest revision |
1505 | +# Can also snapshot past revisions and add a per-user |
1506 | +# suffix |
1507 | +# |
1508 | +# Example: export-hg.sh ~/repos/gcc-linaro/4.6 gcc-linaro-4.6 [revno [suffix [basedir]]] |
1509 | +# |
1510 | +set -e |
1511 | + |
1512 | +dir=`mktemp -d` |
1513 | +trap "rm -rf $dir" EXIT |
1514 | + |
1515 | +# Directory this script runs from |
1516 | +here=`dirname $0` |
1517 | + |
1518 | +# Suffix for manual pulls |
1519 | +SUFFIX=${4:-} |
1520 | + |
1521 | +# Pull a branch and tarball it up |
1522 | +branch=$1 |
1523 | +head=$2 |
1524 | +hgflags="-R $branch -y -q" |
1525 | + |
1526 | +hg $hgflags pull |
1527 | +latest=`hg $hgflags -q tip | awk -F: '{print $1;}'` |
1528 | + |
1529 | +revno=${3:-$latest} |
1530 | + |
1531 | +base=${5:-$branch} |
1532 | + |
1533 | +snapshots=$HOME/snapshots |
1534 | +stem=$head+hg$revno$SUFFIX |
1535 | +tar=/tmp/$stem.tar |
1536 | +xdelta=$tar.xdelta3.xz |
1537 | + |
1538 | +if [ -f $snapshots/`basename $xdelta` ]; then |
1539 | + exit 0 |
1540 | +fi |
1541 | + |
1542 | +closest=`$here/closest.py $tar $snapshots/base/*.tar` |
1543 | + |
1544 | +mkdir -p $snapshots |
1545 | + |
1546 | +echo Exporting and stamping $revno |
1547 | +hg $hgflags archive -r $revno $dir/$head |
1548 | +REVNO=$revno $here/stamp_branch.sh $branch $dir/$head |
1549 | + |
1550 | +# Create the tarball |
1551 | +echo Creating $tar |
1552 | +(cd $dir && find $head -print0 | sort -z) | tar caf $tar -C $dir --no-recursion --null -T - |
1553 | + |
1554 | +# Create the delta |
1555 | +echo Making the delta relative to $closest |
1556 | +xdelta3 -fe -s $closest $tar $tar.xdelta3 |
1557 | +rm -f $tar |
1558 | +xz -f $tar.xdelta3 |
1559 | + |
1560 | +mv -f $tar.xdelta3.xz $snapshots |
1561 | + |
1562 | +echo New snapshot `basename $tar` created |
1563 | +echo stem: $stem |
1564 | |
1565 | === added file 'findancestor.py' |
1566 | --- findancestor.py 1970-01-01 00:00:00 +0000 |
1567 | +++ findancestor.py 2014-07-01 15:58:40 +0000 |
1568 | @@ -0,0 +1,204 @@ |
1569 | +#!/usr/bin/python |
1570 | +"""Given a build name, find the ancestor or the last working build of |
1571 | +that series. |
1572 | + |
1573 | +Examples: |
1574 | + findancestor gcc-linaro-4.5+bzr99502~ams~merge-20110413 -> the ancestor |
1575 | + findancestor gcc-linaro-4.5+bzr99510 -> probably gcc-linaro-4.5+bzr99509 |
1576 | + findancestor gcc-4.6+svn174684 -> the build from last week |
1577 | +""" |
1578 | + |
1579 | +import sys |
1580 | +import os.path |
1581 | +import pprint |
1582 | +import re |
1583 | +import pdb |
1584 | +import collections |
1585 | + |
1586 | +def parse(name): |
1587 | + """Parse the meta data file which is a series of name: value pairs""" |
1588 | + values = {} |
1589 | + |
1590 | + with open(name) as f: |
1591 | + lines = [x.strip() for x in f.readlines()] |
1592 | + |
1593 | + for line in lines: |
1594 | + if not line: |
1595 | + continue |
1596 | + |
1597 | + if line.startswith('#'): |
1598 | + continue |
1599 | + |
1600 | + name, value = [x.strip() for x in line.split(':', 1)] |
1601 | + values[name] = value |
1602 | + |
1603 | + return values |
1604 | + |
1605 | +def revkey(record): |
1606 | + try: |
1607 | + if record.date: |
1608 | + return int(record.date.replace('~', '')) |
1609 | + else: |
1610 | + return int(record.revision) |
1611 | + except ValueError: |
1612 | + return record.revision |
1613 | + |
1614 | +def make_splitter(): |
1615 | + """Splits a build name like gcc-4.5+svn1234, |
1616 | + gcc-linaro-4.6+bzr1234, and gdb-7.3~20110606+gitabcdef into the |
1617 | + product, series, optional date, version control system, and |
1618 | + revision. |
1619 | + |
1620 | + >>> make_splitter().match('gcc-4.5~svn12345').groups() |
1621 | + ('gcc-', '4.5', None, '~svn', '12345') |
1622 | + >>> make_splitter().match('gcc-linaro-4.6+bzr4567').groups() |
1623 | + ('gcc-linaro-', '4.6', None, '+bzr', '4567') |
1624 | + >>> make_splitter().match('gdb-7.3~20110606+git057a947').groups() |
1625 | + ('gdb-', '7.3', '~20110606', '+git', '057a947') |
1626 | + """ |
1627 | + return re.compile('([\D\-]+)([\d\.]+)([+~]\d+)?([+~][a-z]{3})(\w+)$') |
1628 | + |
1629 | +def make_release_splitter(): |
1630 | + """Splits a release name like gcc-linaro-4.5-2011.03-0 into |
1631 | + product, series, and release. |
1632 | + |
1633 | + >>> make_release_splitter().match('gcc-linaro-4.5-2011.03-0').groups() |
1634 | + ('gcc-linaro-', '4.5', '-2011.03-0') |
1635 | + >>> make_release_splitter().match('gcc-linaro-4.6-2012.03').groups() |
1636 | + ('gcc-linaro-', '4.6', '-2012.03') |
1637 | + """ |
1638 | + return re.compile('([\D\-]+)([\d\.]+)(-[\d.\-]+)$') |
1639 | + |
1640 | +def find_via_meta(name, snapshots): |
1641 | + # See if we can find the meta data and extract the ancestor |
1642 | + metaname = os.path.join(snapshots, '%s.txt' % name) |
1643 | + |
1644 | + try: |
1645 | + meta = parse(metaname) |
1646 | + revision, prev, mainline = meta['revision'], meta['ancestor'], meta['mainline-branch'] |
1647 | + |
1648 | + if revision != prev: |
1649 | + return '%s+bzr%s' % (mainline, prev.split()[0]) |
1650 | + except Exception, ex: |
1651 | + pass |
1652 | + |
1653 | + return '' |
1654 | + |
1655 | +def find_via_build(allfiles, name): |
1656 | + # Don't know the ancestor. Find the previous build |
1657 | + Record = collections.namedtuple('Record', 'product series date vcs revision') |
1658 | + |
1659 | + splitter = make_splitter() |
1660 | + builds = set([x[1] for x in allfiles if x[-1] == 'finished.txt']) |
1661 | + builds = [splitter.match(x) for x in sorted(builds)] |
1662 | + builds = [Record(*x.groups()) for x in builds if x] |
1663 | + |
1664 | + match = splitter.match(name) |
1665 | + |
1666 | + if match: |
1667 | + this = Record(*match.groups()) |
1668 | + |
1669 | + # Grab the builds that match |
1670 | + matches = [x for x in builds |
1671 | + if x.product == this.product |
1672 | + and x.series == this.series |
1673 | + and x.vcs == this.vcs] |
1674 | + |
1675 | + # Sort them by revno |
1676 | + matches.sort(key=revkey) |
1677 | + |
1678 | + # Find this build and the one before |
1679 | + revnos = [x.revision for x in matches] |
1680 | + |
1681 | + if this.revision in revnos: |
1682 | + idx = revnos.index(this.revision) |
1683 | + |
1684 | + if idx > 0: |
1685 | + return ''.join(x for x in matches[idx-1] if x) |
1686 | + |
1687 | + return '' |
1688 | + |
1689 | +def find_via_variant(allfiles, name): |
1690 | + # Don't know the ancestor. Find the same build without the variant. |
1691 | + if '^' not in name: |
1692 | + return None |
1693 | + |
1694 | + root, variant = name.split('^') |
1695 | + |
1696 | + builds = [x for x in allfiles if x[-1] == 'finished.txt' and x[-4] == root] |
1697 | + |
1698 | + if builds: |
1699 | + return root |
1700 | + |
1701 | + return '' |
1702 | + |
1703 | +def find_via_release(allfiles, name): |
1704 | + Record = collections.namedtuple('Record', 'product series release') |
1705 | + |
1706 | + # Don't know the ancestor. Find the previous release |
1707 | + splitter = make_release_splitter() |
1708 | + builds = set([x[1] for x in allfiles if x[-1] == 'finished.txt']) |
1709 | + builds = [splitter.match(x) for x in sorted(builds)] |
1710 | + builds = [Record(*x.groups()) for x in builds if x] |
1711 | + |
1712 | + match = splitter.match(name) |
1713 | + |
1714 | + if match: |
1715 | + this = Record(*match.groups()) |
1716 | + |
1717 | + # Grab the builds that match |
1718 | + matches = [x for x in builds |
1719 | + if x.product == this.product |
1720 | + and x.series == this.series] |
1721 | + |
1722 | + # Sort them by release |
1723 | + matches.sort(key=lambda x: x.release) |
1724 | + |
1725 | + # Find this build and the one before |
1726 | + releases = [x.release for x in matches] |
1727 | + |
1728 | + if this.release in releases: |
1729 | + idx = releases.index(this.release) |
1730 | + |
1731 | + if idx > 0: |
1732 | + return ''.join(x for x in matches[idx-1] if x) |
1733 | + |
1734 | + return '' |
1735 | + |
1736 | +def find(allfiles, name, snapshots): |
1737 | + ancestor = find_via_meta(name, snapshots) |
1738 | + |
1739 | + if ancestor: |
1740 | + return ancestor |
1741 | + |
1742 | + ancestor = find_via_variant(allfiles, name) |
1743 | + |
1744 | + if ancestor: |
1745 | + return ancestor |
1746 | + |
1747 | + ancestor = find_via_build(allfiles, name) |
1748 | + |
1749 | + if ancestor: |
1750 | + return ancestor |
1751 | + |
1752 | + ancestor = find_via_release(allfiles, name) |
1753 | + |
1754 | + if ancestor: |
1755 | + return ancestor |
1756 | + |
1757 | + return '' |
1758 | + |
1759 | +def main(): |
1760 | + allfiles, name = sys.argv[1:] |
1761 | + |
1762 | + with open(allfiles) as f: |
1763 | + allf = [x.strip().split(os.sep) for x in f.readlines()] |
1764 | + |
1765 | + print find(allf, name, '') |
1766 | + |
1767 | +def test(): |
1768 | + import doctest |
1769 | + doctest.testmod() |
1770 | + |
1771 | +if __name__ == '__main__': |
1772 | + main() |
1773 | |
1774 | === added file 'findlogs.py' |
1775 | --- findlogs.py 1970-01-01 00:00:00 +0000 |
1776 | +++ findlogs.py 2014-07-01 15:58:40 +0000 |
1777 | @@ -0,0 +1,113 @@ |
1778 | +#!/usr/bin/python |
1779 | +"""Given a log file from a build and the build's ancestor, find the |
1780 | +corresponding log file for the ancestor. |
1781 | + |
1782 | +Used to find the GCC testsuite results from the ancestor so that they |
1783 | +can be compared. |
1784 | + |
1785 | +Example: |
1786 | + findlogs.py build/all-files.txt build/gcc-4.6+svn173722/logs/armv7l-maverick-cbuild114-ursa4-cortexa9r1/gcc.sum.xz gcc-4.6+svn173209 |
1787 | + "gcc-4.6+svn173209 armv7l-maverick-cbuild113-ursa3-cortexa9r1" |
1788 | + |
1789 | +""" |
1790 | + |
1791 | +import sys |
1792 | +import os.path |
1793 | +import pprint |
1794 | +import re |
1795 | +import pdb |
1796 | +import logging |
1797 | + |
1798 | +def prefilter(files, contains): |
1799 | + # Select those that could be a log |
1800 | + files = [x[-4:] for x in files if len(x) >= 4] |
1801 | + # ...and those that match the build log |
1802 | + files = [x for x in files if contains in x[-1]] |
1803 | + |
1804 | + return files |
1805 | + |
1806 | +def find(allfiles, log, ancestor): |
1807 | + logging.debug("FIND: input log %s" % (log)) |
1808 | + logging.debug("FIND: input ancestor %s" % (ancestor)) |
1809 | + log = log.split('/') |
1810 | + arch, osname, cbuild, host, config = log[-2].split('-') |
1811 | + |
1812 | + # Tidy up the ancestor |
1813 | + match = re.match('lp\:([^/]+)(/.+)?([+~]\w+\d+)', ancestor) |
1814 | + |
1815 | + if match: |
1816 | + branch, series, revision = match.groups() |
1817 | + |
1818 | + if series: |
1819 | + series = series[1:] |
1820 | + elif branch == 'gcc-linaro': |
1821 | + # gcc-linaro has been different series at different |
1822 | + # times. Patch. |
1823 | + match = re.match('\+bzr(\d+)', revision) |
1824 | + |
1825 | + if match: |
1826 | + revno = int(match.group(1)) |
1827 | + if revno < 100000: |
1828 | + series = '4.5' |
1829 | + elif revno < 110000: |
1830 | + series = '4.6' |
1831 | + elif revno < 121996: |
1832 | + series = '4.7' |
1833 | + else: |
1834 | + series = '4.8' |
1835 | + |
1836 | + logging.debug("FIND: ancestor has branch %s series %s" % (branch, series)) |
1837 | + |
1838 | + elif branch == 'gcc': |
1839 | + # gcc is the same as gcc/4.8 |
1840 | + match = re.match('\+bzr(\d+)', revision) |
1841 | + |
1842 | + if match: |
1843 | + revno = int(match.group(1)) |
1844 | + if revno < 120000: |
1845 | + series = '4.7' |
1846 | + else: |
1847 | + series = '4.8' |
1848 | + else: |
1849 | + assert False, 'Unrecognised development focus for %s' % branch |
1850 | + |
1851 | + ancestor = '%s-%s%s' % (branch, series, revision) |
1852 | + logging.debug("FIND: updated ancestor to %s" % (ancestor)) |
1853 | + |
1854 | + # Select those that could be a log |
1855 | + files = [x[-4:] for x in allfiles if len(x) >= 4] |
1856 | + |
1857 | + # Select results from the ancestor |
1858 | + files = [x for x in files if x[-4] == ancestor] |
1859 | + # ...and those that match the build log |
1860 | + files = [x for x in files if x[-1] == log[-1]] |
1861 | + |
1862 | + # Split out the build ID |
1863 | + for f in files: |
1864 | + f[-2] = (f[-2].split('-') + ['']*5)[:5] |
1865 | + |
1866 | + # ...and that have the same architecture, config, and os |
1867 | + files = [x for x in files if x[-2][0] == arch and x[-2][-1] == config and x[-2][1] == osname] |
1868 | + |
1869 | + # Sort by cbuild number |
1870 | + files.sort(key=lambda x: int(x[-2][2].replace('cbuild', ''))) |
1871 | + |
1872 | + if files: |
1873 | + best = files[-1] |
1874 | + return (best[0], '-'.join(best[2])) |
1875 | + else: |
1876 | + return None |
1877 | + |
1878 | +def main(): |
1879 | + allfiles, log, ancestor = sys.argv[1:] |
1880 | + |
1881 | + with open(allfiles) as f: |
1882 | + allf = [x.strip().split(os.sep) for x in f.readlines()] |
1883 | + |
1884 | + best = find(allf, log, ancestor) |
1885 | + |
1886 | + if best: |
1887 | + print ' '.join(best) |
1888 | + |
1889 | +if __name__ == '__main__': |
1890 | + main() |
1891 | |
1892 | === added file 'gcc-release.sh' |
1893 | --- gcc-release.sh 1970-01-01 00:00:00 +0000 |
1894 | +++ gcc-release.sh 2014-07-01 15:58:40 +0000 |
1895 | @@ -0,0 +1,168 @@ |
1896 | +#!/bin/bash |
1897 | +# |
1898 | +# Run the steps in the GCC release process |
1899 | +# |
1900 | +# gcc-release.sh lp:gcc-linaro/4.7 [respin] |
1901 | +# |
1902 | + |
1903 | +set -e |
1904 | + |
1905 | +branch=$1 |
1906 | +[ -z "$branch" ] && echo 'Missing branch name' && exit -1 |
1907 | + |
1908 | +if [ -n "$2" ]; then |
1909 | + respin=-$2 |
1910 | + nextspin=$(($2 + 1)) |
1911 | +else |
1912 | + respin= |
1913 | + nextspin=1 |
1914 | +fi |
1915 | + |
1916 | +# Pull the author and email from bzr whoami |
1917 | +me=$( bzr whoami ) |
1918 | +fullname=$( echo "$me" | awk -F\< '{print $1;}' | sed 's# *$##' ) |
1919 | +email=$( echo "$me" | awk -F\< '{print $2;}' | tr -d \> ) |
1920 | + |
1921 | +year=$( date +%Y ) |
1922 | +month=$( date +%m ) |
1923 | +day=$( date +%d ) |
1924 | + |
1925 | +series=$( echo $branch | sed -r "s#.+/(.+)#\1#" ) |
1926 | +# Die if the series can't be found in the branch name |
1927 | +[ -n "$series" ] |
1928 | + |
1929 | +# For testing: |
1930 | +# Stack on top of a local branch to cut the branch time |
1931 | +#stacked=--stacked |
1932 | +# Override the month |
1933 | +#month=07 |
1934 | + |
1935 | +# Use ~/.ssh/config to set up the validation lab bounce host |
1936 | +host=cbuild@toolchain64.lab |
1937 | +snapshots="~/var/snapshots" |
1938 | + |
1939 | +release=$year.$month$respin |
1940 | + |
1941 | +mkdir -p release/$series-$release |
1942 | +cd release/$series-$release |
1943 | +ln -sf ~/linaro/gcc/.bzr |
1944 | + |
1945 | +# Create a new branch if needed |
1946 | +if [ ! -d tree/.bzr ]; then |
1947 | + echo "Branching $branch" |
1948 | + bzr branch $stacked $branch tree |
1949 | +fi |
1950 | + |
1951 | +# Add the release ChangeLog entry and bump the version |
1952 | +if ! head -n 50 tree/ChangeLog.linaro | grep -q "GCC Linaro $series-$release released"; then |
1953 | + echo "Adding release entry to the ChangeLog" |
1954 | + cat > message <<EOF |
1955 | +$year-$month-$day $fullname <$email> |
1956 | + |
1957 | + GCC Linaro $series-$release released. |
1958 | + |
1959 | + gcc/ |
1960 | + * LINARO-VERSION: Update. |
1961 | + |
1962 | +EOF |
1963 | + cat message tree/ChangeLog.linaro > tmp |
1964 | + mv tmp tree/ChangeLog.linaro |
1965 | + |
1966 | + echo $series-$release > tree/gcc/LINARO-VERSION |
1967 | + |
1968 | + echo "Tagging and committing" |
1969 | + bzr commit -m "Make $series-$release release." tree |
1970 | + bzr tag -d tree gcc-linaro-$series-$release |
1971 | +fi |
1972 | + |
1973 | +work=gcc-linaro-$series-$release |
1974 | + |
1975 | +if [ ! -d $work ]; then |
1976 | + echo "Exporting" |
1977 | + bzr export $work.tmp tree |
1978 | + mv $work.tmp $work |
1979 | +fi |
1980 | + |
1981 | +cd $work |
1982 | + |
1983 | +# Update the manuals |
1984 | +env SOURCEDIR=`pwd`/gcc/doc DESTDIR=`pwd`/INSTALL ./gcc/doc/install.texi2html |
1985 | + |
1986 | +# Build the tree and translations |
1987 | +if [ ! -d ../objdir/gcc/po ]; then |
1988 | + echo "Building" |
1989 | + ./contrib/gcc_build -d `pwd` -o ../objdir -c "--enable-generated-files-in-srcdir --disable-multilib" -m \-sj4 build > ../build.txt 2>&1 |
1990 | +fi |
1991 | + |
1992 | +# Copy the translations across |
1993 | +cp -uv ../objdir/gcc/po/*.gmo gcc/po/ |
1994 | + |
1995 | +# Make the md5sums |
1996 | +if [ ! -f MD5SUMS ]; then |
1997 | + echo "Making MD5SUMS" |
1998 | + cat > MD5SUMS <<EOF |
1999 | +# This file contains the MD5 checksums of the files in the |
2000 | +# $work.tar.bz2 tarball. |
2001 | +# |
2002 | +# Besides verifying that all files in the tarball were correctly expanded, |
2003 | +# it also can be used to determine if any files have changed since the |
2004 | +# tarball was expanded or to verify that a patchfile was correctly applied. |
2005 | +# |
2006 | +# Suggested usage: |
2007 | +# md5sum -c MD5SUMS | grep -v "OK\$" |
2008 | +EOF |
2009 | + find . -type f | sed -e 's:^\./::' -e '/MD5SUMS/d' | sort | xargs md5sum >> MD5SUMS |
2010 | +fi |
2011 | + |
2012 | +cd .. |
2013 | + |
2014 | +next=$series-$release-$nextspin~dev |
2015 | + |
2016 | +# Bump the version |
2017 | +if ! grep -q $next tree/gcc/LINARO-VERSION; then |
2018 | + cat > message <<EOF |
2019 | +$year-$month-$day $fullname <$email> |
2020 | + |
2021 | + gcc/ |
2022 | + * LINARO-VERSION: Bump version. |
2023 | + |
2024 | +EOF |
2025 | + cat message tree/ChangeLog.linaro > tmp |
2026 | + mv tmp tree/ChangeLog.linaro |
2027 | + |
2028 | + echo $next > tree/gcc/LINARO-VERSION |
2029 | + bzr commit -m "Bump version number, post release." tree |
2030 | +fi |
2031 | + |
2032 | +# Tar it up |
2033 | +if [ ! -f $work.tar.bz2 ]; then |
2034 | + # Check for bad files |
2035 | + for ext in .orig .rej "~" .svn .bzr; do |
2036 | + if [ -n "$(find $work -name "*$ext")" ]; then |
2037 | + echo "Found $ext files in the tarball" |
2038 | + exit -2 |
2039 | + fi |
2040 | + done |
2041 | + |
2042 | + echo "Making tarball" |
2043 | + tar cf $work.tar $work |
2044 | + bzip2 -9 $work.tar |
2045 | +fi |
2046 | + |
2047 | +# Only sign on the build infrastructure |
2048 | +if [ ! -f $work.tar.bz2.asc ]; then |
2049 | + read -p "Upload the tarball to snapshots and sign (y/n)? " |
2050 | + |
2051 | + if [ "$REPLY" = "y" ]; then |
2052 | + echo "Pushing to snapshots" |
2053 | + rsync --progress -a $work.tar.bz2 "$host:$snapshots" |
2054 | + # Password is the EEMBC password on https://wiki.linaro.org/Internal/ToolChain |
2055 | + ssh -t $host gpg --no-use-agent -q --yes --passphrase-file /home/cbuild/.config/cbuild/password --armor --sign --detach-sig --default-key cbuild "$snapshots/$work.tar.bz2" |
2056 | + # Pull the signature back down and check |
2057 | + rsync -a "$host:$snapshots/$work*asc" . |
2058 | + gpg -q --verify $work.tar.bz2.asc |
2059 | + fi |
2060 | +fi |
2061 | + |
2062 | +echo "Done. Results are in $(pwd). Spawn at http://ex.seabright.co.nz/helpers/scheduler/spawn." |
2063 | +echo "Don't forget to push!" |
2064 | |
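The MD5SUMS step above shells out to `find | sort | xargs md5sum`. Roughly the same listing could be produced in Python (a sketch of the idea, not what the release script uses):

```python
import hashlib
import os

def md5sums(root):
    # Walk the exported tree and emit "checksum  relative/path" lines,
    # skipping the MD5SUMS file itself, sorted by path like the script.
    out = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name == 'MD5SUMS':
                continue
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            with open(path, 'rb') as f:
                digest = hashlib.md5(f.read()).hexdigest()
            out.append((rel, digest))
    return ['%s  %s' % (digest, rel) for rel, digest in sorted(out)]
```

The two-space separator matches what `md5sum` emits, so `md5sum -c` accepts the file either way.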
2065 | === added file 'getbody.py' |
2066 | --- getbody.py 1970-01-01 00:00:00 +0000 |
2067 | +++ getbody.py 2014-07-01 15:58:40 +0000 |
2068 | @@ -0,0 +1,26 @@ |
2069 | +#!/usr/bin/python |
2070 | +"""Extracts the body of a GCC contrib/test_summary email by extracting |
2071 | +the lines between a cat <<'EOF' and the EOF. |
2072 | +""" |
2073 | + |
2074 | +import fileinput |
2075 | +import re |
2076 | + |
2077 | +def main(): |
2078 | + lines = iter([x.rstrip() for x in fileinput.input()]) |
2079 | + |
2080 | + while True: |
2081 | + line = lines.next() |
2082 | + if re.match(r'cat\s*<<', line): |
2083 | + break |
2084 | + |
2085 | + while True: |
2086 | + line = lines.next() |
2087 | + |
2088 | + if re.match('EOF', line): |
2089 | + break |
2090 | + else: |
2091 | + print line |
2092 | + |
2093 | +if __name__ == '__main__': |
2094 | + main() |
2095 | |
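getbody.py's heredoc extraction can be exercised without fileinput on a small in-memory sample (the email fragment below is hypothetical):

```python
import re

def extract_body(lines):
    # Skip ahead to the "cat <<'EOF'" marker, then collect lines up to
    # the terminating EOF, mirroring what getbody.py does over stdin.
    it = iter(lines)
    for line in it:
        if re.match(r'cat\s*<<', line):
            break
    body = []
    for line in it:
        if re.match('EOF', line):
            break
        body.append(line)
    return body

# Hypothetical fragment of a contrib/test_summary email
sample = ["mv foo bar", "cat <<'EOF' |", "LAST_UPDATED: today", "EOF", "exit 0"]
```

Sharing one iterator between the two loops is the key trick: the second loop resumes right after the marker line.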
2096 | === added file 'lab-pull.sh' |
2097 | --- lab-pull.sh 1970-01-01 00:00:00 +0000 |
2098 | +++ lab-pull.sh 2014-07-01 15:58:40 +0000 |
2099 | @@ -0,0 +1,22 @@ |
2100 | +#!/bin/bash |
2101 | +# |
2102 | +# Pull results from the lab into this server |
2103 | +# |
2104 | + |
2105 | +dir=$HOME/incoming/validation |
2106 | +www=/var/www/ex.seabright.co.nz |
2107 | + |
2108 | +mkdir -p $dir |
2109 | + |
2110 | +for j in build benchmarks; do |
2111 | + # Fetch everything |
2112 | + timeout 600 rsync -e "ssh -C" -itr "control.v:~/cbuild/$j/*/*-logs.*" $dir/$j |
2113 | + |
2114 | + # Extract those that are new |
2115 | + cd $www |
2116 | + |
2117 | + for i in $dir/$j/*-logs*.xz; do |
2118 | + if [ $i.done -nt $i ]; then continue; fi |
2119 | + tar xaf $i && touch $i.done |
2120 | + done |
2121 | +done |
2122 | |
2123 | === added file 'latestami.sh' |
2124 | --- latestami.sh 1970-01-01 00:00:00 +0000 |
2125 | +++ latestami.sh 2014-07-01 15:58:40 +0000 |
2126 | @@ -0,0 +1,17 @@ |
2127 | +#!/bin/bash |
2128 | +# |
2129 | +# Get the AMI of the latest release of a series |
2130 | +# |
2131 | + |
2132 | +distro=${1:-precise} |
2133 | +arch=${2:-amd64} |
2134 | +region=us-east-1 |
2135 | +storage=ebs |
2136 | + |
2137 | +wget -q -O - https://cloud-images.ubuntu.com/query/$distro/server/released.current.txt \ |
2138 | + | grep $region \ |
2139 | + | grep $arch \ |
2140 | + | grep $storage \ |
2141 | + | grep -v hvm \ |
2142 | + | awk '{ print $8; }' |
2143 | + |
2144 | |
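The grep pipeline in latestami.sh can be mirrored in Python. The column layout below is an assumption inferred from the `awk '{ print $8; }'` step (AMI id in field 8); the sample rows are hypothetical, not real released.current.txt content:

```python
def latest_ami(lines, region='us-east-1', arch='amd64', storage='ebs'):
    # Replicate the shell pipeline: keep rows mentioning the region, arch
    # and storage type, drop hvm rows, return field 8 (assumed AMI id).
    for line in lines:
        fields = line.split()
        if ('hvm' not in line and region in line and arch in line
                and storage in line and len(fields) >= 8):
            return fields[7]
    return None

# Hypothetical rows in the released.current.txt column layout
rows = [
    "precise server release 20140101 ebs amd64 us-east-1 ami-12345678 hvm",
    "precise server release 20140101 ebs amd64 us-east-1 ami-87654321 paravirtual",
]
```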
2145 | === added file 'launcher.sh' |
2146 | --- launcher.sh 1970-01-01 00:00:00 +0000 |
2147 | +++ launcher.sh 2014-07-01 15:58:40 +0000 |
2148 | @@ -0,0 +1,146 @@ |
2149 | +#!/bin/bash |
2150 | +# |
2151 | +# Takes all incoming events and launches them. Done as one large |
2152 | +# script to keep everything together and remove contention and locking |
2153 | +# issues. |
2154 | +# |
2155 | + |
2156 | +. ~/.config/cbuild/toolsrc |
2157 | + |
2158 | +# Use run=echo to test |
2159 | +run= |
2160 | + |
2161 | +#set -e |
2162 | + |
2163 | +# Pull in the local configuration |
2164 | +hostrc=$( dirname $0 )/launcher-$( hostname ).rc |
2165 | +[ -f $hostrc ] && . $hostrc |
2166 | + |
2167 | +function remove_old_jobs { |
2168 | + # Nuke any daily jobs early |
2169 | + $run find $scheduler -name "gcc-4.*svn*.job" -mtime +5 -exec rm -f {} \; |
2170 | + # Nuke any old jobs |
2171 | + $run find $scheduler -name "*.job" -mtime +14 -exec rm -f {} \; |
2172 | + # Nuke any old locks |
2173 | + $run find $scheduler -name "*.job.*" -mtime +20 -exec rm -f {} \; |
2174 | +} |
2175 | + |
2176 | +function daily { |
2177 | + remove_old_jobs |
2178 | + # Pull the active trunk branches once a day |
2179 | + PRIORITY=daily $run $ctools/up_branch.sh $repos/gcc-trunk gcc-4.8~ |
2180 | + PRIORITY=daily $run $ctools/up_branch.sh $repos/gcc-4.7 gcc-4.7+ |
2181 | + # Update the md5sums on all snapshots |
2182 | + (cd $snapshots && $run rm -f md5sums && $run md5sum *z* > md5sums) |
2183 | + # Remove older files from S3 |
2184 | + $run python $ctools/s3-diet.py linaro-toolchain-builds |
2185 | + # Check for leaking benchmark results |
2186 | + $run $ctools/check-leaks.sh |
2187 | + # Update the benchmarks graph |
2188 | + local pattern="logs/arm*precise*ursa1?-*a9hf*/benchmarks.txt" |
2189 | + local diffplot="python $lib/linaro-toolchain-benchmarks/scripts/diffplot.py" |
2190 | + (cd $benchmarks \ |
2191 | + && $run $diffplot gcc-4.8~svn.png gcc-4.7.0/$pattern gcc-4.8~svn??????/$pattern \ |
2192 | + && $run $diffplot gcc-linaro-4.7.png gcc-4.7.0/$pattern gcc-linaro-4.7-201?.??/$pattern \ |
2193 | + && $run $diffplot gcc-linaro-4.7+bzr.png gcc-linaro-4.7+bzr??????/$pattern \ |
2194 | + ) |
2195 | +} |
2196 | + |
2197 | +function build_many { |
2198 | + # Build a range of branches |
2199 | + local dow=$1 |
2200 | + |
2201 | + case $dow in |
2202 | + 1) |
2203 | + # Benchmark the latest tip build |
2204 | + $run $ctools/chain.sh benchmarks "gcc-linaro-4.7+bzr??????" |
2205 | + $run $ctools/chain.sh oecore "gcc-linaro-4.7+bzr??????.*" |
2206 | + ;; |
2207 | + 2) |
2208 | + $run $ctools/export-git.sh $repos/newlib newlib-1.21~ |
2209 | + $run $ctools/export-git.sh $repos/qemu-git qemu-1.4~ |
2210 | + $run $ctools/up_branch.sh $repos/gcc-arm-embedded-4.6 gcc-arm-embedded-4.6+ |
2211 | + $run $ctools/chain.sh ubutest "gcc-4.6+*" |
2212 | + ;; |
2213 | + 3) |
2214 | + # Ensure we get at least one build of trunk a week |
2215 | + $run $ctools/chain.sh trunk "gcc-4.8~*" |
2216 | + $run $ctools/export-git.sh $repos/openembedded-core openembedded-core-1.4~ |
2217 | + $run $ctools/export-git.sh $repos/bitbake bitbake-1.15+ |
2218 | + ;; |
2219 | + 4) |
2220 | + $run $ctools/chain.sh trunk "gcc-4.7+*" |
2221 | + $run $ctools/up_branch.sh $repos/gcc-arm-aarch64-4.7 gcc-arm-aarch64-4.7+ |
2222 | + # Many updates for LLVM, only one tarball (no version number) at the end |
2223 | + $run $ctools/up_branch.sh $repos/llvm/tools/clang |
2224 | + $run $ctools/up_branch.sh $repos/llvm/tools/clang/tools/extra |
2225 | + $run $ctools/up_branch.sh $repos/llvm/projects/compiler-rt |
2226 | + $run $ctools/up_branch.sh $repos/llvm llvm-svn~ |
2227 | + ;; |
2228 | + 5) |
2229 | + # Benchmark the latest trunk build |
2230 | + $run $ctools/chain.sh benchmarks "gcc-4.8~*" |
2231 | + # Make sure 4.6 is built once a week |
2232 | + $run $ctools/up_branch.sh $repos/gcc-4.6 gcc-4.6+ |
2233 | + $run $ctools/up_branch.sh $repos/eglibc/libc eglibc-2.18~ |
2234 | + $run $ctools/export-git.sh $repos/libffi libffi-3.1~ |
2235 | + ;; |
2236 | + 6) |
2237 | + $run $ctools/chain.sh ubutest "gcc-4.8~*" |
2238 | + # Make sure 4.7 is built once a week |
2239 | + $run $ctools/up_branch.sh $repos/gcc-4.7 gcc-4.7+ |
2240 | + $run $ctools/export-git.sh $repos/gdb gdb-7.6~ |
2241 | + $run $ctools/up_branch.sh $repos/gcc-google-4.7 gcc-google-4.7+ |
2242 | + ;; |
2243 | + 7) |
2244 | + $run $ctools/chain.sh ubutest "gcc-4.7+*" |
2245 | + $run $ctools/export-git.sh $repos/binutils binutils-2.24~ |
2246 | + ;; |
2247 | + esac |
2248 | +} |
2249 | + |
2250 | +function dow { |
2251 | + local dow=$1 |
2252 | + |
2253 | + build_many $dow |
2254 | + |
2255 | + case $dow in |
2256 | + 4) |
2257 | + launch weekly |
2258 | + ;; |
2259 | + esac |
2260 | +} |
2261 | + |
2262 | +function weekly { |
2263 | + true |
2264 | +} |
2265 | + |
2266 | +function often { |
2267 | + true |
2268 | +} |
2269 | + |
2270 | +function launch { |
2271 | + local event=$1 |
2272 | + local arg=$2 |
2273 | + |
2274 | + case $event in |
2275 | + daily) |
2276 | + launch dow $( date +%u ) |
2277 | + daily $arg |
2278 | + ;; |
2279 | + dow) |
2280 | + dow $arg |
2281 | + ;; |
2282 | + weekly) |
2283 | + weekly |
2284 | + ;; |
2285 | + often) |
2286 | + often |
2287 | + ;; |
2288 | + esac |
2289 | +} |
2290 | + |
2291 | +event=$1; shift |
2292 | + |
2293 | +echo launcher: running $event on $( date +%u ) |
2294 | +launch $event $1 |
2295 | |
2296 | === added file 'libec2.sh' |
2297 | --- libec2.sh 1970-01-01 00:00:00 +0000 |
2298 | +++ libec2.sh 2014-07-01 15:58:40 +0000 |
2299 | @@ -0,0 +1,78 @@ |
2300 | +# Common functions for spawning and initialising an EC2 instance |
2301 | +# |
2302 | + |
2303 | +. ~/.config/cbuild/toolsrc |
2304 | +. $lib/cbuild/siterc |
2305 | + |
2306 | +wget="wget --no-proxy -q" |
2307 | +this=ec2slave |
2308 | + |
2309 | +hostname=${1:-oort99} |
2310 | +ami=${2:-missing-ami} |
2311 | +# 64 bit high-CPU |
2312 | +type=${3:-c1.xlarge} |
2313 | +kill=${kill:-true} |
2314 | + |
2315 | +region=us-east-1 |
2316 | + |
2317 | +lock="flock /tmp/heavy.lock" |
2318 | + |
2319 | +# Bring in any local configuration |
2320 | +[ -f ~/.private/$this.rc ] && . ~/.private/$this.rc |
2321 | + |
2322 | +dir=/tmp/$this/$hostname |
2323 | + |
2324 | +function get_state { |
2325 | + $lock ec2din --show-empty-fields $instance > $dir/din |
2326 | + line=`grep ^INSTANCE $dir/din` |
2327 | + state=`echo $line | awk '{ print $6; }'` |
2328 | + if [ -f ~/.in-cloud ]; then |
2329 | + # Use the local IP |
2330 | + ip=`echo $line | awk '{ print $5; }'` |
2331 | + else |
2332 | + # Use the public IP |
2333 | + ip=`echo $line | awk '{ print $4; }'` |
2334 | + fi |
2335 | + |
2336 | + echo get_state on $instance: $state $ip |
2337 | +} |
2338 | + |
2339 | +function start_instance { |
2340 | + $wget -O - $SCHEDULER_API/update/$hostname/starting || true |
2341 | + mkdir -p $dir |
2342 | + |
2343 | + $lock ec2run $ami -t $type --region $region -k $key \ |
2344 | + --instance-initiated-shutdown-behavior terminate > $dir/run |
2345 | + instance=`grep ^INSTANCE $dir/run | awk '{ print $2; }'` |
2346 | + |
2347 | + if [ "$kill" = "true" ]; then trap "$lock ec2kill $instance" EXIT; fi |
2348 | + |
2349 | + get_state |
2350 | + |
2351 | + while [ "$state" != "running" ]; do |
2352 | + sleep 5 |
2353 | + get_state |
2354 | + done |
2355 | + |
2356 | + # Wait for Ubuntu to boot |
2357 | + for i in {1..10}; do |
2358 | + if ssh -o StrictHostKeyChecking=no ubuntu@$ip true; then break; fi |
2359 | + sleep 10 |
2360 | + done |
2361 | + |
2362 | + # Test the connection |
2363 | + ssh ubuntu@$ip uname -a |
2364 | + |
2365 | + echo Instance $instance started |
2366 | +} |
2367 | + |
2368 | +function init_instance { |
2369 | + # Do the system wide setup |
2370 | + scp $this-init.sh ~/.private/$this-*seed* ubuntu@$ip:~ |
2371 | + ssh ubuntu@$ip sudo bash $this-init.sh $hostname |
2372 | + # And the user specific |
2373 | + scp $this-init-cbuild.sh cbuild@$ip:~ |
2374 | + ssh -t cbuild@$ip bash $this-init-cbuild.sh |
2375 | + # Copy the site configuration across |
2376 | + scp $lib/cbuild/siterc cbuild@$ip:/cbuild |
2377 | +} |
2378 | |
2379 | === added file 'pull_branch.sh' |
2380 | --- pull_branch.sh 1970-01-01 00:00:00 +0000 |
2381 | +++ pull_branch.sh 2014-07-01 15:58:40 +0000 |
2382 | @@ -0,0 +1,21 @@ |
2383 | +#!/bin/sh |
2384 | +# Update a branch, snapshot the latest revision, and spawn the jobs |
2385 | + |
2386 | +set -e |
2387 | + |
2388 | +# Directory this script runs from |
2389 | +here=`dirname $0` |
2390 | + |
2391 | +# Pull a branch and tarball it up |
2392 | +branch=$1 |
2393 | +head=$2 |
2394 | + |
2395 | +# Silly bzr... |
2396 | +rm -rf $branch/.bzr/checkout/limbo $branch/.bzr/checkout/pending-deletion |
2397 | +bzr pull -q --overwrite -d $branch |
2398 | + |
2399 | +stem=`$here/export-bzr.sh $branch $head $3 $4 $5 $6 $7 | grep ^stem: | awk '{print $2;}'` |
2400 | + |
2401 | +if [ "$stem" != "" ]; then |
2402 | + $here/spawn.sh $stem |
2403 | +fi |
2404 | |
2405 | === added file 's3-diet.py' |
2406 | --- s3-diet.py 1970-01-01 00:00:00 +0000 |
2407 | +++ s3-diet.py 2014-07-01 15:58:40 +0000 |
2408 | @@ -0,0 +1,91 @@ |
2409 | +#!/usr/bin/env python |
2410 | + |
2411 | +"""Put a S3 bucket on a diet by removing old files. Used to delete |
2412 | +old builds from the cbuild archive. |
2413 | +""" |
2414 | + |
2415 | +import collections |
2416 | +import sys |
2417 | +import datetime |
2418 | +import re |
2419 | +import os.path |
2420 | +import ConfigParser |
2421 | +import logging |
2422 | + |
2423 | +import simples3 |
2424 | +import hurry.filesize |
2425 | + |
2426 | + |
2427 | +Entry = collections.namedtuple('Entry', 'key timestamp etag size basename') |
2428 | + |
2429 | + |
2430 | +def main(): |
2431 | + dry_run = False |
2432 | + |
2433 | + logging.basicConfig(level=logging.INFO) |
2434 | + |
2435 | + config = ConfigParser.SafeConfigParser() |
2436 | + config.read(os.path.expanduser('~/.s3cfg')) |
2437 | + |
2438 | + bucket = simples3.S3Bucket(sys.argv[1], |
2439 | + access_key=config.get('default', 'access_key'), |
2440 | + secret_key=config.get('default', 'secret_key')) |
2441 | + |
2442 | + |
2443 | + by_key = {} |
2444 | + entries = [] |
2445 | + |
2446 | + for v in bucket.listdir(): |
2447 | + entry = Entry(*(v + (os.path.basename(v[0]), ))) |
2448 | + by_key[entry.key] = entry |
2449 | + entries.append(entry) |
2450 | + |
2451 | + # Do some linting |
2452 | + to_delete = [] |
2453 | + |
2454 | + # Delete the signature if the original file is gone |
2455 | + for entry in entries: |
2456 | + if entry.key.endswith('.asc'): |
2457 | + original = entry.key.replace('.asc', '') |
2458 | + |
2459 | + if original not in by_key: |
2460 | + logging.debug('Deleting orphan signature %s' % entry.basename) |
2461 | + to_delete.append(entry) |
2462 | + |
2463 | + # Delete anything that's too old |
2464 | + now = datetime.datetime.utcnow() |
2465 | + |
2466 | + for entry in entries: |
2467 | + age = now - entry.timestamp |
2468 | + |
2469 | + is_merge = '+bzr' in entry.basename and len(entry.basename.split('~')) == 3 |
2470 | + |
2471 | + if re.search(r'-201.\.\d\d', entry.basename) and '+bzr' not in entry.basename: |
2472 | + logging.debug("Not deleting %s as it's special" % entry.basename) |
2473 | + elif is_merge and age.days >= 30: |
2474 | + logging.debug('Deleting %d day old merge %s' % (age.days, entry.basename)) |
2475 | + to_delete.append(entry) |
2476 | + elif age.days >= 45: |
2477 | + logging.debug('Deleting %d day old %s' % (age.days, entry.basename)) |
2478 | + to_delete.append(entry) |
2479 | + else: |
2480 | + pass |
2481 | + |
2482 | + # Nuke any duplicates |
2483 | + to_delete = sorted(set(to_delete)) |
2484 | + |
2485 | + total = sum(x.size for x in entries) |
2486 | + deleting = sum(x.size for x in to_delete) |
2487 | + |
2488 | + logging.info('Total stored: %s in %d entries' % (hurry.filesize.size(total), len(entries))) |
2489 | + logging.info('To delete: %s in %d entries' % (hurry.filesize.size(deleting), len(to_delete))) |
2490 | + |
2491 | + for entry in sorted(to_delete, key=lambda x: -x.size): |
2492 | + logging.info('Deleting %s' % entry.basename) |
2493 | + |
2494 | + if not dry_run: |
2495 | + del bucket[entry.key] |
2496 | + |
2497 | +if __name__ == '__main__': |
2498 | + main() |
2499 | + |
2500 | |
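The retention policy in s3-diet.py (dated releases kept forever, merge snapshots for 30 days, everything else for 45) can be isolated as a pure function, which makes the rules easy to test:

```python
import re

def should_delete(basename, age_days):
    # Dated releases (e.g. '-2013.05') without '+bzr' are kept forever.
    if re.search(r'-201.\.\d\d', basename) and '+bzr' not in basename:
        return False
    # Merge snapshots carry '+bzr' and split into three '~' parts;
    # they expire after 30 days.
    is_merge = '+bzr' in basename and len(basename.split('~')) == 3
    if is_merge and age_days >= 30:
        return True
    # Everything else expires after 45 days.
    return age_days >= 45
```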
2501 | === added file 'spawn.sh' |
2502 | --- spawn.sh 1970-01-01 00:00:00 +0000 |
2503 | +++ spawn.sh 2014-07-01 15:58:40 +0000 |
2504 | @@ -0,0 +1,36 @@ |
2505 | +#!/bin/bash |
2506 | +# |
2507 | +# Spawn a job based on the queues |
2508 | +# |
2509 | +# Examples: |
2510 | +# spawn.sh gcc-4.5+svn12345 |
2511 | +# spawn.sh ~/path/to/thing-to-build.tar.xz |
2512 | +# spawn.sh ~/path/to/thing-to-build.tar.xz queue-name |
2513 | +# |
2514 | + |
2515 | +set -e |
2516 | + |
2517 | +. ~/.config/cbuild/toolsrc |
2518 | + |
2519 | +# Directory this script runs from |
2520 | +here=`dirname $0` |
2521 | + |
2522 | +job=$1 |
2523 | + |
2524 | +priority=${PRIORITY:-.} |
2525 | + |
2526 | +# Tidy up job so that you can also supply a path |
2527 | +job=`basename $job` |
2528 | +job=`echo $job | sed -r 's/(.+)\.tar.*/\1/'` |
2529 | + |
2530 | +queue=${2:-$job} |
2531 | +closest=$($here/closest.py $queue $scheduler/spawn/$priority/*) |
2532 | + |
2533 | +echo Spawning $job into the queue `basename $closest` |
2534 | + |
2535 | +# "Demux" via spawn queue into real job queues, use tcwg-web's schedulejob.py |
2536 | +# script for scheduling jobs with LAVA support, etc. |
2537 | +for i in $closest/*; do |
2538 | + PYTHONPATH=$lib/tcwg-web python -m schedulejob `basename $i` $job |
2539 | + echo Spawned into `basename $i` |
2540 | +done |
2541 | |
2542 | === added file 'stamp_branch.sh' |
2543 | --- stamp_branch.sh 1970-01-01 00:00:00 +0000 |
2544 | +++ stamp_branch.sh 2014-07-01 15:58:40 +0000 |
2545 | @@ -0,0 +1,70 @@ |
2546 | +#!/bin/bash |
2547 | +repo=$1 |
2548 | +into=$2 |
2549 | + |
2550 | +# Check for known version control systems. |
2551 | +if [ -d $repo/.bzr ]; then |
2552 | + GCC_BZR=${GCC_BZR-${BZR-bzr}} |
2553 | + vcs_type="bzr" |
2554 | +elif [ -d $repo/.svn ]; then |
2555 | + GCC_SVN=${GCC_SVN-${SVN-svn}} |
2556 | + vcs_type="svn" |
2557 | +else |
2558 | + echo "This does not seem to be a GCC Bzr/SVN tree!" |
2559 | + exit 1 |
2560 | +fi |
2561 | + |
2562 | +rm -f $into/LAST_UPDATED $into/gcc/REVISION |
2563 | + |
2564 | +case $vcs_type in |
2565 | + svn) |
2566 | + revision=${REVNO:-`svn info $1 | awk '/Revision:/ { print $2 }'`} |
2567 | + branch=`svn info $1 | sed -ne "/URL:/ { |
2568 | + s,.*/trunk,trunk, |
2569 | + s,.*/branches/,, |
2570 | + s,.*/tags/,, |
2571 | + p |
2572 | + }"` |
2573 | + ;; |
2574 | + |
2575 | + bzr) |
2576 | + revision=${REVNO:-`bzr revno $repo`} |
2577 | + parent=`bzr info $repo | gawk '/parent branch/ { print $3; }'` |
2578 | + |
2579 | + # Keep the branch name out of the version to prevent spurious |
2580 | + # test failures. |
2581 | + # |
2582 | + # ...net/%2Bbranch/gcc-linaro/4.7/ becomes gcc-linaro/4.7 |
2583 | + # ...net/~michaelh1/gcc-linaro/better-longlong/ becomes gcc-linaro/dev |
2584 | + # |
2585 | + parts=( $(echo $parent | tr '/' '\n' | tac) ) |
2586 | + echo ${parts[@]} |
2587 | + n3=${parts[0]} |
2588 | + n2=${parts[1]} |
2589 | + n1=${parts[2]} |
2590 | + |
2591 | + case $n1,$n2,$n3 in |
2592 | + *branch,*,*) branch="$n2/$n3" ;; |
2593 | + \~*,*,*) branch="$n2/dev" ;; |
2594 | + *) branch="$n2" ;; |
2595 | + esac |
2596 | + ;; |
2597 | +esac |
2598 | + |
2599 | +{ |
2600 | + date |
2601 | + echo "`TZ=UTC date` (revision $revision)" |
2602 | +} > $into/LAST_UPDATED |
2603 | + |
2604 | +# And the product specific parts |
2605 | + |
2606 | +# GCC |
2607 | +if [ -f $into/gcc/version.c ]; then |
2608 | + echo "[$branch revision $revision]" > $into/gcc/REVISION |
2609 | +fi |
2610 | + |
2611 | +# crosstool-NG |
2612 | +if [ -f $into/ct-ng.in ]; then |
2613 | + sed -i -r "s#bzr.+#bzr$revision#" $into/version.sh |
2614 | + sed -i -r "s#^REVISION\s*=.*#REVISION = +bzr$revision#" $into/contrib/linaro/build.mk |
2615 | +fi |
2616 | |
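The branch-name derivation in stamp_branch.sh (reverse the parent URL's path components with `tr '/' '\n' | tac`, then classify on the third-from-last part) translates to a few lines of Python. A sketch, assuming the same three `case` patterns as the shell:

```python
def branch_from_parent(parent_url):
    # Reverse the path components, as `tr '/' '\n' | tac` does, then
    # classify on n1 (third from last) like the shell case statement.
    parts = [p for p in parent_url.split('/') if p][::-1]
    n3, n2, n1 = parts[0], parts[1], parts[2]
    if n1.endswith('branch'):        # e.g. %2Bbranch/gcc-linaro/4.7
        return '%s/%s' % (n2, n3)
    if n1.startswith('~'):           # personal branches map to <project>/dev
        return '%s/dev' % n2
    return n2
```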
2617 | === added file 'tabulate.sh' |
2618 | --- tabulate.sh 1970-01-01 00:00:00 +0000 |
2619 | +++ tabulate.sh 2014-07-01 15:58:40 +0000 |
2620 | @@ -0,0 +1,36 @@ |
2621 | +#!/bin/bash |
2622 | +# |
2623 | +# Tabulate any new benchmark results |
2624 | +# |
2625 | + |
2626 | +. ~/.config/cbuild/toolsrc |
2627 | + |
2628 | +tmp=`mktemp -td tabulate.XXXXXXXXXX` |
2629 | +trap "rm -rf $tmp" EXIT |
2630 | + |
2631 | +base=${1:-$benchmarks} |
2632 | +scripts=${2:-$HOME/lib/linaro-toolchain-benchmarks/scripts} |
2633 | + |
2634 | +tabulate() { |
2635 | + filter=$1 |
2636 | + suffix=$2 |
2637 | + |
2638 | + name=benchmarks$suffix.txt |
2639 | + |
2640 | + for i in $base/gcc*/logs/*/$filter.txt; do |
2641 | + dir=$(dirname $i) |
2642 | + |
2643 | + # Skip if the tabulated version is newer |
2644 | + [ ! $i -nt $dir/$name ] && continue |
2645 | + |
2646 | + echo $dir... |
2647 | + python $scripts/tabulate.py $dir/$filter.txt > $tmp/$name \ |
2648 | + && touch -r $i $tmp/$name \ |
2649 | + && mv $tmp/$name $dir |
2650 | + done |
2651 | +} |
2652 | + |
2653 | +tabulate "*run" "" |
2654 | +tabulate "spec2000-*run" "-spec2000" |
2655 | +tabulate "eembc-*run" "-eembc" |
2656 | +tabulate "eembc_office-*run" "-eembc_office" |
2657 | |
2658 | === added file 'taker.json' |
2659 | --- taker.json 1970-01-01 00:00:00 +0000 |
2660 | +++ taker.json 2014-07-01 15:58:40 +0000 |
2661 | @@ -0,0 +1,82 @@ |
2662 | +{"build": "http://ex.seabright.co.nz/build/", |
2663 | + "instance": "production", |
2664 | + "projects": ["gcc-linaro", "gcc", "crosstool-ng", "gdb-linaro"], |
2665 | + "prompt": true, |
2666 | + |
2667 | + "hosts": { |
2668 | + "crucis": { |
2669 | + "dry-run": true, |
2670 | + "repos": "/home/michaelh/linaro", |
2671 | + "all-files": "build/all-files.txt", |
2672 | + "tools": ".", |
2673 | + "build": "build" |
2674 | + }, |
2675 | + "orion": { |
2676 | + "repos": "/home/cbuild/repos", |
2677 | + "all-files": "/var/www/ex.seabright.co.nz/build/all-files.txt", |
2678 | + "tools": "/home/cbuild/lib/cbuild-tools", |
2679 | + "build": "/var/www/ex.seabright.co.nz/build" |
2680 | + }, |
2681 | + "cbuild-master": { |
2682 | + "repos": "/home/cbuild/var/repos", |
2683 | + "all-files": "/home/cbuild/public_html/build/all-files.txt", |
2684 | + "tools": "/home/cbuild/lib/cbuild-tools", |
2685 | + "build": "/home/cbuild/public_html/build" |
2686 | + }, |
2687 | + "toolchain64": { |
2688 | + "repos": "/home/cbuild/var/repos", |
2689 | + "all-files": "/home/cbuild/public_html/build/all-files.txt", |
2690 | + "tools": "/home/cbuild/lib/cbuild-tools", |
2691 | + "build": "/home/cbuild/public_html/build" |
2692 | + } |
2693 | + }, |
2694 | + "filters": { |
2695 | + "xinclude": ["lp663939"] |
2696 | + }, |
2697 | + "branches": { |
2698 | + "lp:gcc-linaro": { |
2699 | + "prefix": "gcc-linaro-4.8", |
2700 | + "reporoot": "gcc-linaro-4.8" |
2701 | + }, |
2702 | + "lp:gcc-linaro/4.5": { |
2703 | + "prefix": "gcc-linaro-4.5", |
2704 | + "reporoot": "gcc-linaro" |
2705 | + }, |
2706 | + "lp:gcc-linaro/4.6": { |
2707 | + "prefix": "gcc-linaro-4.6", |
2708 | + "reporoot": "gcc-linaro" |
2709 | + }, |
2710 | + "lp:gcc-linaro/4.7": { |
2711 | + "prefix": "gcc-linaro-4.7", |
2712 | + "reporoot": "gcc-linaro" |
2713 | + }, |
2714 | + "lp:gcc-linaro/4.8": { |
2715 | + "prefix": "gcc-linaro-4.8", |
2716 | + "reporoot": "gcc-linaro-4.8" |
2717 | + }, |
2718 | + "lp:gcc-linaro/4.6-2011.07-stable": { |
2719 | + "prefix": "gcc-linaro-4.6-2011.07-stable", |
2720 | + "reporoot": "gcc-linaro" |
2721 | + }, |
2722 | + "lp:gcc-linaro/4.4": { |
2723 | + "prefix": "gcc-linaro-4.4", |
2724 | + "reporoot": "gcc-linaro" |
2725 | + }, |
2726 | + "lp:gcc/4.6": { |
2727 | + "prefix": "gcc-4.6", |
2728 | + "reporoot": "gcc-linaro" |
2729 | + }, |
2730 | + "lp:gcc": { |
2731 | + "prefix": "gcc-4.8", |
2732 | + "reporoot": "gcc-bzr" |
2733 | + }, |
2734 | + "lp:gdb-linaro": { |
2735 | + "prefix": "gdb-linaro-7.5", |
2736 | + "reporoot": "gdb-linaro" |
2737 | + }, |
2738 | + "lp:~linaro-toolchain-dev/crosstool-ng/linaro": { |
2739 | + "prefix": "crosstool-ng-linaro-1.13.1", |
2740 | + "reporoot": "crosstool-ng" |
2741 | + } |
2742 | + } |
2743 | +} |
2744 | |
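In taker.json, entries under "hosts" override the top-level defaults for the machine the script runs on. A minimal sketch of that lookup pattern (not the real resolver, which also supports `!`-joined key paths):

```python
def host_config(config, host, key):
    # Prefer a per-host override under "hosts", fall back to the
    # top-level value when the host or key is absent.
    try:
        return config['hosts'][host][key]
    except KeyError:
        return config[key]

# Trimmed-down version of the taker.json structure
cfg = {
    'build': 'http://ex.seabright.co.nz/build/',
    'hosts': {'orion': {'build': '/var/www/ex.seabright.co.nz/build'}},
}
```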
2745 | === added file 'taker.py' |
2746 | --- taker.py 1970-01-01 00:00:00 +0000 |
2747 | +++ taker.py 2014-07-01 15:58:40 +0000 |
2748 | @@ -0,0 +1,396 @@ |
2749 | +"""Script that monitors merge requests, branches unseen ones, and sets |
2750 | +up a build. |
2751 | +""" |
2752 | + |
2753 | +import os |
2754 | +import os.path |
2755 | +import re |
2756 | +import subprocess |
2757 | +import json |
2758 | +import socket |
2759 | +import datetime |
2760 | +import logging |
2761 | +import getopt |
2762 | +import sys |
2763 | +import pprint |
2764 | +import pdb |
2765 | +import StringIO |
2766 | +import findlogs |
2767 | + |
2768 | +import launchpadlib.launchpad |
2769 | + |
2770 | +# Message sent when a merge request is snapshotted and queued for build |
2771 | +QUEUED = { |
2772 | + 'subject': "[cbuild] Queued %(snapshot)s for build", |
2773 | + 'body': """ |
2774 | +cbuild has taken a snapshot of this branch at r%(revno)s and queued it for build. |
2775 | + |
2776 | +The diff against the ancestor r%(ancestor)s is available at: |
2777 | + http://cbuild.validation.linaro.org/snapshots/%(snapshot)s.diff |
2778 | + |
2779 | +and will be built on the following builders: |
2780 | + %(builders)s |
2781 | + |
2782 | +You can track the build queue at: |
2783 | + http://cbuild.validation.linaro.org/helpers/scheduler |
2784 | + |
2785 | +cbuild-snapshot: %(snapshot)s |
2786 | +cbuild-ancestor: %(target)s+bzr%(ancestor)s |
2787 | +cbuild-state: check |
2788 | +""" |
2789 | +} |
2790 | + |
2791 | +# Message sent when an individual build completes |
2792 | +CHECK_OK = { |
2793 | + 'subject': "[cbuild] Build OK for %(snapshot)s on %(build)s", |
2794 | + 'body': """ |
2795 | +cbuild successfully built this on %(build)s. |
2796 | + |
2797 | +The build results are available at: |
2798 | + http://cbuild.validation.linaro.org/build/%(snapshot)s/logs/%(build)s |
2799 | + |
2800 | +%(diff)s |
2801 | + |
2802 | +The full testsuite results are at: |
2803 | + http://cbuild.validation.linaro.org/build/%(snapshot)s/logs/%(build)s/gcc-testsuite.txt |
2804 | + |
2805 | +cbuild-checked: %(build)s |
2806 | +""" |
2807 | +} |
2808 | + |
2809 | +# Message sent when an individual build fails |
2810 | +CHECK_FAILED = { |
2811 | + 'subject': "[cbuild] Build failed for %(snapshot)s on %(build)s", |
2812 | + 'body': """ |
2813 | +cbuild had trouble building this on %(build)s. |
2814 | +See the following failure logs: |
2815 | + %(logs)s |
2816 | + |
2817 | +under the build results at: |
2818 | + http://cbuild.validation.linaro.org/build/%(snapshot)s/logs/%(build)s |
2819 | + |
2820 | +%(diff)s |
2821 | + |
2822 | +cbuild-checked: %(build)s |
2823 | +""", |
2824 | + 'vote': 'Needs Fixing', |
2825 | + 'nice': False |
2826 | +} |
2827 | + |
2828 | +CONFIG = json.load(open('taker.json')) |
2829 | + |
2830 | +def run(cmd, cwd=None, const=False): |
2831 | + logging.debug('Executing "%s" (cwd=%s)' % (' '.join(cmd), cwd)) |
2832 | + |
2833 | + if const or not get_config2(False, 'dry-run'): |
2834 | + child = subprocess.Popen(cmd, cwd=cwd, stdout=subprocess.PIPE) |
2835 | + stdout, stderr = child.communicate() |
2836 | + retcode = child.wait() |
2837 | + |
2838 | + if retcode: |
2839 | + raise Exception('Child process returned %s' % retcode) |
2840 | + |
2841 | + lines = [x.rstrip() for x in StringIO.StringIO(stdout).readlines()] |
2842 | + logging.debug('Stdout: %r' % lines[:20]) |
2843 | + |
2844 | + return lines |
2845 | + else: |
2846 | + return [] |
2847 | + |
2848 | +def tidy_branch(v): |
2849 | + """Tidy up a branch name from one of the staging servers to the |
2850 | + top level server. |
2851 | + """ |
2852 | + return v.replace('lp://qastaging/', 'lp:') |
2853 | + |
2854 | +def get_config_raw(v): |
2855 | + at = CONFIG |
2856 | + |
2857 | + for name in v.split('!'): |
2858 | + at = at[name] |
2859 | + |
2860 | + return at |
2861 | + |
2862 | +def get_config(*args): |
2863 | + v = '!'.join(args) |
2864 | + |
2865 | + # See if the value exists in the hosts override first |
2866 | + try: |
2867 | + return get_config_raw('hosts!%s!%s' % (socket.gethostname(), v)) |
2868 | + except KeyError: |
2869 | + return get_config_raw(v) |
2870 | + |
2871 | +def get_config2(default, *args): |
2872 | + """Get a configuration value, falling back to the default if not set.""" |
2873 | + try: |
2874 | + return get_config(*args) |
2875 | + except KeyError: |
2876 | + return default |
2877 | + |
2878 | +class Proposal: |
2879 | + def __init__(self, proposal): |
2880 | + self.proposal = proposal |
2881 | + self.parse(proposal) |
2882 | + |
2883 | + def parse(self, proposal): |
2884 | + """Parse the proposal by scanning all comments for |
2885 | + 'cbuild-foo: bar' tokens. Pull these into member variables. |
2886 | + """ |
2887 | + self.state = None |
2888 | + self.ancestor = None |
2889 | + self.checked = [] |
2890 | + |
2891 | + for comment in proposal.all_comments: |
2892 | + lines = [x.strip() for x in comment.message_body.split('\n')] |
2893 | + |
2894 | + for line in lines: |
2895 | + match = re.match('cbuild-(\w+):\s+(\S+)', line) |
2896 | + |
2897 | + if match: |
2898 | + name, value = match.groups() |
2899 | + |
2900 | + if name in ['checked']: |
2901 | + getattr(self, name).append(value) |
2902 | + else: |
2903 | + setattr(self, name, value) |
2904 | + |
2905 | +def add_comment(proposal, template, **kwargs): |
2906 | + """Add a comment using the given template for the subject and |
2907 | + body text. |
2908 | + """ |
2909 | + vote = template.get('vote', None) |
2910 | + # True if this is a 'happy' result. Used in debugging to |
2911 | +    # prevent irreversible results from going up to Launchpad. |
2912 | + nice = template.get('nice', True) |
2913 | + |
2914 | + subject = (template['subject'] % kwargs).rstrip() |
2915 | + body = (template['body'] % kwargs).rstrip() |
2916 | + |
2917 | + just_print = get_config2(False, 'dry-run') |
2918 | + |
2919 | + if not nice and get_config2(True, 'cautious'): |
2920 | +        # Running in a debug/test mode. Don't push irreversible |
2921 | + # results. |
2922 | + just_print = True |
2923 | + |
2924 | + if get_config2(False, 'prompt'): |
2925 | + logging.info("Want to add this comment: %s\n%s\n\nVote: %s\n" % (subject, body, vote)) |
2926 | + |
2927 | + if just_print: |
2928 | + got = raw_input('Add it anyway? > ') |
2929 | + else: |
2930 | + got = raw_input('OK to add? > ') |
2931 | + |
2932 | + just_print = got.strip() not in ['Y', 'y'] |
2933 | + |
2934 | + if just_print: |
2935 | + logging.info("Would have added this comment: %s\n%s\n\nVote: %s\n" % (subject, body, vote)) |
2936 | + else: |
2937 | + if vote: |
2938 | + proposal.proposal.createComment(subject=subject, content=body, vote=vote) |
2939 | + else: |
2940 | + proposal.proposal.createComment(subject=subject, content=body) |
2941 | + |
2942 | +def queue(proposal): |
2943 | +    """Run the queue step by snapshotting and queuing the build.""" |
2944 | + lp = proposal.proposal |
2945 | + |
2946 | + source = lp.source_branch |
2947 | + owner = source.owner.name |
2948 | + branch = tidy_branch(source.bzr_identity) |
2949 | + revno = source.revision_count |
2950 | + |
2951 | + # We might have hit this before Launchpad has scanned it |
2952 | + if not revno: |
2953 | + # Work around Launchpad being broken on 2011-09-07 by assuming |
2954 | + # the merge was against the latest revno |
2955 | + # Work around bzr revno bug with ghost ancestry |
2956 | + # (https://launchpad.net/bugs/1161018) |
2957 | + revno = run(['bzr', 'revno', 'nosmart+%s' % branch], const=True) |
2958 | + revno = int(revno[0]) |
2959 | + assert revno |
2960 | + |
2961 | + target = tidy_branch(lp.target_branch.bzr_identity) |
2962 | + |
2963 | + # Find where to check it out |
2964 | + prefix = get_config('branches', target, 'prefix') |
2965 | + reporoot = get_config('branches', target, 'reporoot') |
2966 | + |
2967 | + # Give it an archive name |
2968 | + suffix = '~%s~%s' % (owner, os.path.basename(branch)) |
2969 | + |
2970 | + snapshot = '%s+bzr%s%s' % (prefix, revno, suffix) |
2971 | + path = '%s/%s/%s' % (get_config('repos'), reporoot, snapshot) |
2972 | + |
2973 | + # Remove the old directory |
2974 | + run(['rm', '-rf', path]) |
2975 | + # Branch it |
2976 | + run(['bzr', 'branch', '--no-tree', branch, path], cwd=os.path.dirname(path)) |
2977 | + # Find the common ancestor |
2978 | + lines = run(['bzr', 'revision-info', '-d', path, '-r', 'ancestor:%s' % target]) |
2979 | + |
2980 | + ancestor = lines[0].split()[0] if lines else 'unknown' |
2981 | + |
2982 | + # Make the snapshot |
2983 | + lines = run(['%s/pull_branch.sh' % get_config('tools'), path, prefix, '%s' % revno, suffix, target]) |
2984 | + |
2985 | + # Scan the tool results to see which queues the build was |
2986 | + # pushed into |
2987 | + builders = [] |
2988 | + |
2989 | + for line in lines: |
2990 | + match = re.match('Spawned into (\S+)', line) |
2991 | + |
2992 | + if match: |
2993 | + builders.append(match.group(1)) |
2994 | + |
2995 | + add_comment(proposal, QUEUED, snapshot=snapshot, revno=revno, builders=' '.join(builders), ancestor=ancestor, target=target) |
2996 | + |
2997 | +def make_diff(proposal, all_files, check): |
2998 | + """Check if there are any DejaGNU summary files and diff them.""" |
2999 | + sums = [x for x in check if '.sum' in x[-1]] |
3000 | + |
3001 | + if not proposal.ancestor: |
3002 | + lines = ['The test suite was not checked as the branch point was not recorded.'] |
3003 | + elif not sums: |
3004 | +        lines = ['The test suite was not checked as this build has no .sum style test results.'] |
3005 | + else: |
3006 | + # Have some test results. See if there's a corresponding |
3007 | + # version in the branch point build. |
3008 | + first = '/'.join(sums[0]) |
3009 | + ref = findlogs.find(all_files, first, proposal.ancestor) |
3010 | + |
3011 | + if not ref: |
3012 | + lines = ['The test suite was not checked as the branch point %s has nothing to compare against.' % proposal.ancestor] |
3013 | + else: |
3014 | + revision, build = ref |
3015 | + |
3016 | + # Generate a unified diff between the two sets of results |
3017 | + diff = run(['%s/difftests.sh' % get_config('tools'), |
3018 | + '%s/%s/logs/%s' % (get_config('build'), revision, build), |
3019 | + '%s/%s' % (get_config('build'), os.path.dirname(first))], |
3020 | + const=True) |
3021 | + |
3022 | + if diff: |
3023 | + lines = ['The test suite results changed compared to the branch point %s:' % proposal.ancestor] |
3024 | + lines.extend([' %s' % x for x in diff]) |
3025 | + else: |
3026 | + lines = ['The test suite results were unchanged compared to the branch point %s.' % proposal.ancestor] |
3027 | + |
3028 | + if len(lines) > 40: |
3029 | + total = len(lines) |
3030 | + lines = lines[:40] |
3031 | + lines.append(' ...and %d more' % (total - len(lines))) |
3032 | + |
3033 | + return '\n'.join(lines) |
3034 | + |
3035 | +def check_build(proposal, all_files, build, matches): |
3036 | + """Check a single build to see how the results went.""" |
3037 | + |
3038 | + if build in proposal.checked: |
3039 | + logging.debug('Already posted a comment on %s' % build) |
3040 | + else: |
3041 | + check = [x for x in matches if len(x) >= 4 and x[2] == build] |
3042 | + failures = [x for x in check if 'failed' in x[-1]] |
3043 | + finished = 'finished.txt' in [x[-1] for x in check] |
3044 | + |
3045 | + logging.info('Checking %s (finished=%s, failures=%r)' % (build, finished, failures)) |
3046 | + |
3047 | + if finished: |
3048 | + logs = ' '.join(x[-1] for x in failures) |
3049 | + |
3050 | + diff = make_diff(proposal, all_files, check) |
3051 | + |
3052 | + if failures: |
3053 | + add_comment(proposal, CHECK_FAILED, build=build, snapshot=proposal.snapshot, logs=logs, diff=diff) |
3054 | + else: |
3055 | + add_comment(proposal, CHECK_OK, build=build, snapshot=proposal.snapshot, logs=logs, diff=diff) |
3056 | + |
3057 | +def check(proposal, all_files): |
3058 | + """Check an already queued build for build results.""" |
3059 | + # Find all log files for this snapshot |
3060 | + matches = [x for x in all_files if x[0] == proposal.snapshot] |
3061 | + matches = [x for x in matches if len(x) >= 3 and x[1] == 'logs'] |
3062 | + |
3063 | + if not matches: |
3064 | + logging.debug('No log files yet') |
3065 | + return |
3066 | + |
3067 | + # Build a list of builds |
3068 | + builds = [x[2] for x in matches if len(x) == 3] |
3069 | + |
3070 | + for build in builds: |
3071 | + check_build(proposal, all_files, build, matches) |
3072 | + |
3073 | +def run_proposal(proposal, all_files, allowed): |
3074 | + when = proposal.date_review_requested |
3075 | + |
3076 | + # Only run recent proposals that have had review requested |
3077 | + if when: |
3078 | + now = datetime.datetime.now(when.tzinfo) |
3079 | + elapsed = (now - when).days |
3080 | + |
3081 | + logging.info('Checking proposal %s which is %d days old' % (proposal, elapsed)) |
3082 | + |
3083 | + if elapsed <= get_config2(14, 'age-limit'): |
3084 | + p = Proposal(proposal) |
3085 | + |
3086 | + state = p.state if p.state else 'queue' |
3087 | + |
3088 | + if state in allowed: |
3089 | + # Could use a dodgy getattr()... |
3090 | + if state == 'queue': |
3091 | + queue(p) |
3092 | + elif state == 'check': |
3093 | + check(p, all_files) |
3094 | + else: |
3095 | + assert False, 'Proposal %s is in the invalid state "%s"' % (proposal, p.state) |
3096 | + else: |
3097 | + logging.info("Skipping %s" % state) |
3098 | + |
3099 | +def main(): |
3100 | + opts, args = getopt.getopt(sys.argv[1:], 'f:vs:') |
3101 | + verbose = int(get_config2(0, 'verbose')) |
3102 | + steps = get_config2('queue', 'steps').split(',') |
3103 | + |
3104 | + for opt, arg in opts: |
3105 | + if opt == '-f': |
3106 | + name, value = arg.split('=') |
3107 | + option = json.loads('{ "%s": %s }' % (name, value)) |
3108 | + CONFIG.update(option) |
3109 | + elif opt == '-v': |
3110 | + verbose += 1 |
3111 | + elif opt == '-s': |
3112 | + steps = arg.split(',') |
3113 | + |
3114 | + if verbose >= 2: |
3115 | + logging.basicConfig(level=logging.DEBUG) |
3116 | + elif verbose >= 1: |
3117 | + logging.basicConfig(level=logging.INFO) |
3118 | + |
3119 | + # Parse all-files into a list of already split paths |
3120 | + with open(get_config('all-files')) as f: |
3121 | + all_files = f.readlines() |
3122 | + |
3123 | + all_files = [x.rstrip().split('/') for x in all_files] |
3124 | + all_files = [x[1:] for x in all_files if len(x) >= 2] |
3125 | + |
3126 | + # Login to Launchpad |
3127 | + launchpad = launchpadlib.launchpad.Launchpad.login_with('taker', get_config('instance'), credentials_file=os.path.expanduser('~/.launchpadlib/credentials')) |
3128 | + |
3129 | + # Scan all projects... |
3130 | + for name in get_config('projects'): |
3131 | + logging.info('Checking project %s' % name) |
3132 | + project = launchpad.projects[name] |
3133 | + |
3134 | + # Scan all proposals in the project... |
3135 | + proposals = project.getMergeProposals() |
3136 | + |
3137 | + for p in proposals: |
3138 | + try: |
3139 | + run_proposal(p, all_files, steps) |
3140 | + except Exception, ex: |
3141 | + logging.error('Error while processing %s: %s' % (p, ex)) |
3142 | + |
3143 | +if __name__ == '__main__': |
3144 | + main() |
3145 | |
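The taker.py state machine above is driven entirely by `cbuild-name: value` tokens embedded in merge-proposal comments: `checked` accumulates one entry per builder, while any other key overwrites. A rough, self-contained sketch of that token scan (the function name and sample comments are illustrative, not part of the script):

```python
import re

def parse_tokens(comments):
    """Collect 'cbuild-name: value' tokens from comment bodies.
    'checked' keys accumulate; all other keys overwrite."""
    state = {'checked': []}
    for body in comments:
        for line in body.split('\n'):
            match = re.match(r'cbuild-(\w+):\s+(\S+)', line.strip())
            if match:
                name, value = match.groups()
                if name == 'checked':
                    state['checked'].append(value)
                else:
                    state[name] = value
    return state

tokens = parse_tokens([
    "Queued the build.\ncbuild-state: check\ncbuild-ancestor: rev123",
    "cbuild-checked: a9-builder",
])
# tokens == {'checked': ['a9-builder'], 'state': 'check', 'ancestor': 'rev123'}
```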
3146 | === added file 'track-tree.sh' |
3147 | --- track-tree.sh 1970-01-01 00:00:00 +0000 |
3148 | +++ track-tree.sh 2014-07-01 15:58:40 +0000 |
3149 | @@ -0,0 +1,16 @@ |
3150 | +#!/bin/sh |
3151 | + |
3152 | +set -e |
3153 | + |
3154 | +dir=`mktemp -d` |
3155 | +trap "rm -rf $dir" EXIT |
3156 | + |
3157 | +find "$@" -type f | sort > $dir/all-files.txt |
3158 | +cp -a changes/all-files.txt $dir/previous.txt |
3159 | + |
3160 | +cmp -s $dir/previous.txt $dir/all-files.txt && exit |
3161 | + |
3162 | +stamp=`date +%Y-%m-%d-%H%M%S` |
3163 | + |
3164 | +(cd $dir && diff -U 0 previous.txt all-files.txt) | tee $dir/$stamp.diff |
3165 | +mv $dir/$stamp.diff $dir/all-files.txt changes |
3166 | |
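track-tree.sh records a change only when the sorted file listing differs from the previous run, then stores a zero-context diff. The same compare-then-diff idea, sketched in Python under stated assumptions (`difflib` stands in for the `diff -U 0` call; the function name is hypothetical):

```python
import difflib

def tree_changes(previous, current):
    """Return unified-diff lines between two file listings, or []
    when nothing changed (the `cmp -s ... && exit` case)."""
    # Sort both listings, as `find ... | sort` does in the script
    prev, curr = sorted(previous), sorted(current)
    if prev == curr:
        return []
    # Zero-context diff, like `diff -U 0 previous.txt all-files.txt`
    return list(difflib.unified_diff(prev, curr,
                                     'previous.txt', 'all-files.txt',
                                     n=0, lineterm=''))

changes = tree_changes(['a.tar', 'b.tar'], ['a.tar', 'c.tar'])
# changes contains '-b.tar' and '+c.tar'
```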
3167 | === added file 'up_branch.sh' |
3168 | --- up_branch.sh 1970-01-01 00:00:00 +0000 |
3169 | +++ up_branch.sh 2014-07-01 15:58:40 +0000 |
3170 | @@ -0,0 +1,53 @@ |
3171 | +#!/bin/bash |
3172 | +# Update an SVN branch and snapshot the latest revision |
3173 | +# Can also snapshot past revisions and add a per-user |
3174 | +# suffix |
3175 | + |
3176 | +set -e |
3177 | + |
3178 | +. ~/.config/cbuild/toolsrc |
3179 | + |
3180 | +# Directory this script runs from |
3181 | +here=`dirname $0` |
3182 | + |
3183 | +# Suffix for manual pulls |
3184 | +SUFFIX=${SUFFIX:-} |
3185 | + |
3186 | +# Pull a branch and tarball it up |
3187 | +branch=$1 |
3188 | +head=$2 |
3189 | + |
3190 | +svn up -q $branch |
3191 | +# Recent SVN adds a last changed revision field. Delete spaces to |
3192 | +# strip the space from the revno and, coincidentally, make the field |
3193 | +# easier to match. |
3194 | +latest=$(svn info $branch | tr -d ' ' | awk -F: '/LastChangedRev/ { print $2 }') |
3195 | +revno=${REVNO:-$latest} |
3196 | + |
3197 | +if [ "x${head}" != "x" ]; then |
3198 | + name=${head}svn$revno$SUFFIX |
3199 | + tar=/tmp/$name.tar |
3200 | + xdelta=$snapshots/$name.tar.xdelta3.xz |
3201 | + |
3202 | + if [ ! -f $xdelta ]; then |
3203 | + echo Exporting and stamping $revno |
3204 | + tmp=`mktemp -d` |
3205 | + rm -rf $tmp/$name |
3206 | + svn export $branch $tmp/$name |
3207 | + REVNO=$revno $here/stamp_branch.sh $branch $tmp/$name |
3208 | + |
3209 | + # Create the tarball |
3210 | + echo Creating $tar |
3211 | + (cd $tmp && find "$name" -print0 | sort -z) | tar caf $tar -C $tmp --no-recursion --null -T - |
3212 | + rm -rf $tmp |
3213 | + # Create the delta |
3214 | + closest=`$here/closest.py $tar $snapshots/base/*.tar` |
3215 | + echo Making the delta relative to $closest |
3216 | + xdelta3 -fe -s $closest $tar $tar.xdelta3 |
3217 | + rm -f $tar |
3218 | + xz -f $tar.xdelta3 |
3219 | + mv $tar.xdelta3.xz $snapshots |
3220 | + echo New snapshot $head $revno created |
3221 | + $here/spawn.sh $tar |
3222 | + fi |
3223 | +fi |
3224 | |
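The `tr -d ' ' | awk -F:` pipeline in up_branch.sh exists because recent svn clients print a "Last Changed Rev" field whose internal spaces would otherwise leak into the revno. The same extraction, sketched in Python against sample `svn info` output (the sample text is illustrative):

```python
def last_changed_rev(svn_info_output):
    """Extract the Last Changed Rev from `svn info` output by
    squeezing out spaces first, as up_branch.sh does with tr/awk."""
    for line in svn_info_output.splitlines():
        squeezed = line.replace(' ', '')
        if squeezed.startswith('LastChangedRev:'):
            return int(squeezed.split(':', 1)[1])
    return None

info = """Path: gcc-trunk
Revision: 208930
Last Changed Rev: 208924
Last Changed Date: 2014-03-27"""
# last_changed_rev(info) == 208924
```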
3225 | === added file 'utilisation.py' |
3226 | --- utilisation.py 1970-01-01 00:00:00 +0000 |
3227 | +++ utilisation.py 2014-07-01 15:58:40 +0000 |
3228 | @@ -0,0 +1,93 @@ |
3229 | +"""Plots the backlog and utilisation level of a class of build |
3230 | +machines. |
3231 | +""" |
3232 | + |
3233 | +import fileinput |
3234 | + |
3235 | +import matplotlib |
3236 | +matplotlib.use('agg') |
3237 | + |
3238 | +from pylab import * |
3239 | + |
3240 | +def process(lines, name, matcher): |
3241 | + start = float(lines[0][0]) |
3242 | + backlog = [] |
3243 | + running = [] |
3244 | + states = [] |
3245 | + hosts = {} |
3246 | + |
3247 | + for line in lines: |
3248 | + time, type, host, arg = line[:4] |
3249 | + |
3250 | + if not matcher(host): |
3251 | + continue |
3252 | + |
3253 | + offset = float(time) - start |
3254 | + days = offset / (60*60*24) |
3255 | + |
3256 | + if type == 'backlog': |
3257 | +            if hosts.get(host) == 'reserved': |
3258 | + # Reserved machines report as a zero backlog |
3259 | + pass |
3260 | + else: |
3261 | + backlog.append((days, int(arg))) |
3262 | + elif type == 'state': |
3263 | + if arg != 'updating': |
3264 | + states.append((host, arg, days)) |
3265 | + hosts[host] = arg |
3266 | + |
3267 | + # Scan through the states and remove the glitches |
3268 | + merged = [] |
3269 | + |
3270 | + for host in hosts: |
3271 | + just = [x for x in states if x[0] == host] |
3272 | + |
3273 | + for i in range(len(just) - 1): |
3274 | + this = just[i] |
3275 | + next = just[i+1] |
3276 | + elapsed = next[-1] - this[-1] |
3277 | + |
3278 | + if this[1] == 'running' and next[1] == 'idle' and elapsed < 5.0/60/60/24: |
3279 | + # Drop this record |
3280 | + pass |
3281 | + else: |
3282 | + merged.append(this) |
3283 | + |
3284 | + merged.sort(key=lambda x: x[-1]) |
3285 | + |
3286 | + # Scan through and figure out the number of hosts running at any one time |
3287 | + states = {} |
3288 | + |
3289 | + for row in merged: |
3290 | + states[row[0]] = row[1] |
3291 | + |
3292 | + running.append((row[-1], sum(1 if x == 'running' else 0 for x in states.values()))) |
3293 | + |
3294 | + backlog = array(backlog) |
3295 | + running = array(running) |
3296 | + |
3297 | + clf() |
3298 | + plot(backlog[:,0], backlog[:,1]) |
3299 | + scatter(backlog[:,0], backlog[:,1], label='backlog') |
3300 | + |
3301 | + plot(running[:,0], running[:,1], c='g', label='running') |
3302 | + |
3303 | + ylim(0, ylim()[1]+1) |
3304 | + xlim(0, xlim()[1]) |
3305 | + xlabel('Time (days)') |
3306 | + ylabel('Utilisation (#boards) & backlog (# jobs)') |
3307 | + title('%s utilisation' % name) |
3308 | + legend() |
3309 | + grid() |
3310 | + savefig('%s.png' % name) |
3311 | + show() |
3312 | + |
3313 | +def main(): |
3314 | + lines = list(fileinput.input()) |
3315 | + lines = [x.rstrip().split() for x in lines] |
3316 | + |
3317 | + process(lines, 'ursas', lambda x: x.startswith('ursa')) |
3318 | +# process(lines, 'cloud', lambda x: x.startswith('oort')) |
3319 | + |
3320 | +if __name__ == '__main__': |
3321 | + main() |
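The utilisation curve in process() is a running fold over state events: each event updates one host's state, then the number of hosts currently 'running' is sampled. A minimal sketch of that inner step (event tuples are assumed already sorted by time; the function name is illustrative):

```python
def running_counts(events):
    """Given (day, host, state) events sorted by time, return
    (day, count of hosts in 'running' state) after each event."""
    states = {}
    series = []
    for day, host, state in events:
        states[host] = state
        series.append((day, sum(1 for s in states.values() if s == 'running')))
    return series

points = running_counts([
    (0.0, 'ursa1', 'running'),
    (0.1, 'ursa2', 'running'),
    (0.5, 'ursa1', 'idle'),
])
# points == [(0.0, 1), (0.1, 2), (0.5, 1)]
```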