Merge lp:~robru/bileto/enable-britney into lp:bileto

Proposed by Robert Bruce Park on 2015-11-24
Status: Merged
Approved by: Robert Bruce Park on 2015-12-14
Approved revision: 431
Merged at revision: 332
Proposed branch: lp:~robru/bileto/enable-britney
Merge into: lp:bileto
Diff against target: 727 lines (+435/-39)
16 files modified
.bzr-builddeb/default.conf (+2/-0)
Makefile (+11/-2)
britney/Makefile (+6/-0)
britney/britney.conf.in (+58/-0)
britney/cron (+2/-0)
britney/expire.sh (+8/-0)
britney/fetch-indexes (+139/-0)
britney/iterate.py (+150/-0)
db_migrations.sh (+27/-23)
debian/control (+1/-0)
tests/test_v1.py (+8/-3)
tickets/models.py (+6/-2)
tickets/settings.py (+2/-0)
tickets/static/app.js (+4/-4)
tickets/static/index.html (+6/-1)
tickets/v1.py (+5/-4)
To merge this branch: bzr merge lp:~robru/bileto/enable-britney
Reviewer Review Type Date Requested Status
Martin Pitt Approve on 2015-12-14
Robert Bruce Park (community) Approve on 2015-12-14
Review via email: mp+278521@code.launchpad.net

Commit Message

Run britney inside bileto.

Description of the Change

Experimental addition of britney to bileto instance.

Robert Bruce Park (robru) wrote :

This is so close to done I can smell it.

TODO:

* Expose britney output in flask

* Figure out why vivid doesn't work when xenial does, and why the logging captures some things but not Python tracebacks. I suspect stderr is getting lost somewhere along the way.

Robert Bruce Park (robru) wrote :

Ok, this is looking great for a first iteration:

* All requests in qa state 'Ready for QA' are automatically submitted for autopkgtests.
* Results automatically appear on the request page, with links to the excuses page.
* Publication is not blocked on the autopkgtests succeeding.
* The huge archive indexes that britney consumes are downloaded in parallel.
* Each britney run happens in series due to britney's incredible memory consumption (I'll be increasing the production unit from 2GB to 8GB when this goes live).
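The parallel-fetch / serial-run pattern described above can be sketched like this (a minimal, runnable stand-in: `fetch` here is a toy substitute for the branch's fetch-indexes script, and the `silos` work list is hypothetical):

```python
from subprocess import Popen, PIPE

# Toy stand-in for the real fetch-indexes script so the sketch is runnable.
def fetch(*args):
    return Popen(["sh", "-c", "echo fetch " + " ".join(args)],
                 stdout=PIPE, stderr=PIPE)

# Hypothetical work list: series -> silo names in qa state 'Ready for QA'.
silos = {"xenial": ["landing-001"], "vivid": ["landing-002"]}

# 1. Kick off the big index downloads in parallel...
procs = [fetch(series) for series in sorted(silos)]
# 2. ...then drain their output one process at a time, keeping the log readable.
logs = [proc.communicate()[0] for proc in procs]
# 3. Run the per-silo step strictly in series to bound peak memory.
for series, names in sorted(silos.items()):
    for name in sorted(names):
        logs.append(fetch(series, name).communicate()[0])
```

The real driver in this branch is britney/iterate.py, which uses the same shape: all downloads started up front, then one britney invocation at a time.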

review: Approve
Robert Bruce Park (robru) wrote :

Here is the successful test run:

https://launchpad.net/~ci-train-ppa-service/+archive/ubuntu/landing-015/+packages

(note to self, make train post this on MRs)

Martin Pitt (pitti) wrote :

> * each britney is run in series due to incredible memory consumption (I'll be increasing the production unit from 2GB to 8GB when this goes live)

Yes, it does that I'm afraid -- it slurps in the entire Packages_*/Sources indexes as Python dictionaries so that it can make efficient installability checks, reverse and forward dependency calculations, etc. So its RAM usage can't be significantly reduced.
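To illustrate why the footprint scales with archive size (a simplified sketch, not britney's actual parser), each stanza of a Packages index ends up as an in-memory dict keyed for fast dependency lookups:

```python
# Simplified illustration: every stanza of a Packages index becomes a dict,
# so RAM usage grows linearly with the size of the archive indexes.
def parse_packages(text):
    packages = {}
    for stanza in text.strip().split("\n\n"):
        fields = {}
        for line in stanza.splitlines():
            # Skip deb822 continuation lines, which start with whitespace.
            if ":" in line and not line.startswith(" "):
                key, _, value = line.partition(":")
                fields[key] = value.strip()
        packages[fields["Package"]] = fields
    return packages

sample = """\
Package: hello
Version: 2.10-1
Depends: libc6

Package: libc6
Version: 2.21-0ubuntu4
"""

index = parse_packages(sample)
print(index["hello"]["Depends"])  # fast forward/reverse dependency lookups
```

With the whole index resident like this, installability checks are cheap, but memory is proportional to the archive, which is why the runs are serialized.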

Why does the ticket say "Ready for QA" while tests are still running, and thus packages are still "not considered"? That smells like a bug in evaluating excuses.yaml?
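One hedged way to evaluate this from britney's output (a sketch only; the `sources` / `is-candidate` field names follow the upstream excuses.yaml format but are an assumption here, and the data is invented):

```python
# Hypothetical, minimal model of a parsed excuses.yaml document.
excuses = {
    "sources": [
        {"item-name": "hello", "is-candidate": False},
        {"item-name": "goodbye", "is-candidate": True},
    ],
}

# A ticket should arguably only leave the "tests running" state once every
# source in the silo is a migration candidate; anything else is "not considered".
blocked = [src["item-name"]
           for src in excuses["sources"] if not src["is-candidate"]]
print(blocked)
```

If `blocked` is non-empty, the request's QA state would still be pending rather than 'Ready for QA'.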

Martin Pitt (pitti) wrote :

A few nits, but by and large LGTM. I suggest adding s390x in a separate MP to not make this too messy.

review: Approve
lp:~robru/bileto/enable-britney updated on 2015-12-14
432. By Robert Bruce Park on 2015-12-14

Permanently enable verbose output for now.

433. By Robert Bruce Park on 2015-12-14

Fix amqp uri leak.

Robert Bruce Park (robru) wrote :

Some replies inline.

lp:~robru/bileto/enable-britney updated on 2015-12-14
434. By Robert Bruce Park on 2015-12-14

Record a log of bzr update.

435. By Robert Bruce Park on 2015-12-14

Run autopkgtests on s390x also.

436. By Robert Bruce Park on 2015-12-14

Stupid bzr logging on stderr.

437. By Robert Bruce Park on 2015-12-14

Use series-specific configs with different arches.

438. By Robert Bruce Park on 2015-12-14

Hard-code s390x in ADT_ARCHES as it is ignored pre-xenial.

439. By Robert Bruce Park on 2015-12-14

Fix s390x in xenial.

440. By Robert Bruce Park on 2015-12-14

Drop back to every 15 minutes in cron.

Preview Diff

1=== added directory '.bzr-builddeb'
2=== added file '.bzr-builddeb/default.conf'
3--- .bzr-builddeb/default.conf 1970-01-01 00:00:00 +0000
4+++ .bzr-builddeb/default.conf 2015-12-14 21:05:53 +0000
5@@ -0,0 +1,2 @@
6+[BUILDDEB]
7+split = True
8
9=== modified file 'Makefile'
10--- Makefile 2015-11-09 23:57:30 +0000
11+++ Makefile 2015-12-14 21:05:53 +0000
12@@ -1,6 +1,10 @@
13 envsubst = perl -pe 's/@([A-Z_]+)@/$$ENV{$$1} or die "$$1 undefined"/eg'
14 apt = apt-get install -yq --force-yes
15
16+britney_branch = lp:~ubuntu-release/britney/britney2-ubuntu
17+britney_path = /var/lib/britney
18+britney_prefix = /tmp/britney_
19+
20 check:
21 python3 -m coverage run -m nose2
22 python3 -m coverage report --include="t*/*" --show-missing --fail-under=100
23@@ -13,19 +17,24 @@
24 ./run.py
25
26 install-common-deps:
27- $(apt) python3-launchpadlib python3-flask python3-flask-openid python3-flask-sqlalchemy python3-sqlalchemy python3-gunicorn
28+ $(apt) python3-requests python3-launchpadlib python3-flask python3-flask-openid python3-flask-sqlalchemy python3-sqlalchemy python3-gunicorn
29
30 install-test-deps: install-common-deps
31 $(apt) python3-pyflakes || $(apt) pyflakes
32 $(apt) pylint3 python3-pep8 python3-nose2 python3-coverage python3-mock
33
34 install: install-common-deps
35- $(apt) uuid python3-psycopg2 gunicorn3 supervisor postgresql-client
36+ $(apt) uuid python3-yaml python3-amqplib python3-psycopg2 gunicorn3 supervisor postgresql-client
37 $(envsubst) <files/supervisor.conf >/etc/supervisor/conf.d/bileto.conf
38 cp files/logrotate.conf /etc/logrotate.d/bileto
39 cp files/expire-sessions.sh /etc/cron.hourly/expire-sessions
40+ [ -d "$(britney_path)" ] || bzr checkout --lightweight $(britney_branch) $(britney_path)
41+ bzr update "$(britney_path)" 2>&1 | tee "$(britney_path).info"
42+ cp britney/cron /etc/cron.d/bileto-britney
43+ cp britney/expire.sh /etc/cron.daily/expire-britney-output
44
45 install-user:
46 [ -f "$$HOME/bileto.api.token" ] || uuid > "$$HOME/bileto.api.token"
47 uuid > "$$HOME/bileto.instance.id"
48+ mkdir --parents $(britney_prefix)data $(britney_prefix)output
49 ./db_migrations.sh
50
51=== added directory 'britney'
52=== added file 'britney/Makefile'
53--- britney/Makefile 1970-01-01 00:00:00 +0000
54+++ britney/Makefile 2015-12-14 21:05:53 +0000
55@@ -0,0 +1,6 @@
56+envsubst = perl -pe 's/@([A-Z_]+)@/$$ENV{$$1} or die "$$1 undefined"/eg'
57+input = /var/lib/bileto/britney/britney.conf.in
58+
59+%.conf: $(input) ~/amqp.uri
60+ $(envsubst) <$(input) >$@.new
61+ mv $@.new $@
62
63=== added file 'britney/britney.conf.in'
64--- britney/britney.conf.in 1970-01-01 00:00:00 +0000
65+++ britney/britney.conf.in 2015-12-14 21:05:53 +0000
66@@ -0,0 +1,58 @@
67+# Configuration file for britney
68+
69+# Paths for control files
70+TESTING = /tmp/britney_data/%(SERIES)
71+UNSTABLE = /tmp/britney_data/@PPA@-%(SERIES)
72+PARTIAL_UNSTABLE = yes
73+
74+# Output
75+NONINST_STATUS = /tmp/britney_output/%(SERIES)/@PPA@/non-installable-status
76+EXCUSES_OUTPUT = /tmp/britney_output/%(SERIES)/@PPA@/excuses.html
77+EXCUSES_YAML_OUTPUT = /tmp/britney_output/%(SERIES)/@PPA@/excuses.yaml
78+UPGRADE_OUTPUT = /tmp/britney_output/%(SERIES)/@PPA@/output.txt
79+HEIDI_OUTPUT = /tmp/britney_output/%(SERIES)/@PPA@/HeidiResult
80+
81+# List of release architectures
82+ARCHITECTURES = @ARCHES@ @PORTS_ARCHES@
83+
84+# if you're not in this list, arch: all packages are allowed to break on you
85+NOBREAKALL_ARCHES = amd64
86+
87+# if you're in this list, your packages may not stay in sync with the source
88+OUTOFSYNC_ARCHES =
89+
90+# if you're in this list, your uninstallability count may increase
91+BREAK_ARCHES =
92+
93+# if you're in this list, you are a new architecture
94+NEW_ARCHES = @NEW_ARCHES@
95+
96+# priorities and delays
97+MINDAYS_LOW = 0
98+MINDAYS_MEDIUM = 0
99+MINDAYS_HIGH = 0
100+MINDAYS_CRITICAL = 0
101+MINDAYS_EMERGENCY = 0
102+DEFAULT_URGENCY = medium
103+
104+# support for old libraries in testing (smooth update)
105+# use ALL to enable smooth updates for all the sections
106+#
107+# naming a non-existent section will effectively disable new smooth
108+# updates but still allow removals to occur
109+SMOOTH_UPDATES = badgers
110+
111+REMOVE_OBSOLETE = no
112+
113+ADT_ENABLE = yes
114+ADT_DEBUG = no
115+ADT_ARCHES = amd64 i386 armhf ppc64el s390x
116+ADT_PPAS = @TEAM@/@OVERLAY@ @TEAM@/@PPA@
117+ADT_AMQP = @AMQP_URI@
118+# Swift base URL with the results (must be publicly readable and browsable)
119+ADT_SWIFT_URL = https://objectstorage.prodstack4-5.canonical.com/v1/AUTH_77e2ada1e7a84929a74ba3b87153c0ac
120+
121+BOOTTEST_ENABLE = no
122+BOOTTEST_DEBUG = yes
123+BOOTTEST_ARCHES = armhf amd64
124+BOOTTEST_FETCH = yes
125
126=== added file 'britney/cron'
127--- britney/cron 1970-01-01 00:00:00 +0000
128+++ britney/cron 2015-12-14 21:05:53 +0000
129@@ -0,0 +1,2 @@
130+# Trigger britney every 15 minutes
131+*/15 * * * * www-data python3 /var/lib/bileto/britney/iterate.py
132
133=== added file 'britney/expire.sh'
134--- britney/expire.sh 1970-01-01 00:00:00 +0000
135+++ britney/expire.sh 2015-12-14 21:05:53 +0000
136@@ -0,0 +1,8 @@
137+#!/bin/sh
138+DIR="/tmp/britney_output"
139+
140+# Delete britney output files that are more than 3 months old.
141+find $DIR -type f -mmin +129600 -delete
142+
143+# Delete empty directories too.
144+find $DIR -empty -type d -delete
145
146=== added file 'britney/fetch-indexes'
147--- britney/fetch-indexes 1970-01-01 00:00:00 +0000
148+++ britney/fetch-indexes 2015-12-14 21:05:53 +0000
149@@ -0,0 +1,139 @@
150+#!/bin/sh
151+# download current package indexes to data/<series>{,-proposed}/ for running
152+# britney against a PPA. The PPA will play the role of "-proposed" (i. e.
153+# "unstable" in britney terms, containing the updated packages to test), the
154+# Ubuntu archive has the "-release" part (i. e. "testing" in britney terms, in
155+# which the -proposed packages are being landed).
156+#
157+# Author: Martin Pitt <martin.pitt@ubuntu.com>
158+# Author: Robert Bruce Park <robert.park@canonical.com>
159+set -u
160+
161+export OVERLAY="stable-phone-overlay"
162+export AMQP_URI=$(cat ~/amqp.uri)
163+export TEAM=$(cat ~/ppa.team)
164+export ARCHES="i386 amd64"
165+export PORTS_ARCHES="armhf arm64 powerpc ppc64el"
166+
167+BRITNEY=/var/lib/britney/britney.py
168+CODEDIR=$(dirname "$(readlink -f "$0")")
169+DATADIR=/tmp/britney_data
170+OUTDIR=/tmp/britney_output
171+CACHE=$DATADIR/CACHE
172+ARCHIVE=http://archive.ubuntu.com/ubuntu/dists
173+PORTS=http://ports.ubuntu.com/dists
174+
175+usage() {
176+cat <<EOF
177+Usage: $0 SERIES [SILONAME]
178+$1
179+
180+SERIES:
181+ Ubuntu release codename, eg trusty, xenial, etc.
182+
183+SILONAME (optional):
184+ Name of PPA to fetch; defaults to stable-phone-overlay.
185+EOF
186+exit 1
187+}
188+
189+log() {
190+ echo "$(date -Iseconds)" "$@"
191+}
192+
193+# Echo something before running it.
194+# Needed because 'set -x' goes to stderr which I don't want.
195+loudly() {
196+ log "$@"
197+ "$@"
198+}
199+
200+# Download files in parallel in background, only if there is an update
201+refresh() {
202+ DIR=$CACHE/$pocket/$(echo $1 | rev | cut --delimiter=/ --fields=2,3 | rev)
203+ mkdir --parents $DIR
204+ wget --directory-prefix $DIR --timestamping $1 --append-output $DIR/$$-wget-log --no-verbose &
205+}
206+
207+[ $# -gt 0 ] && export SERIES="$1" && shift || usage "Missing series."
208+
209+if [ $# -gt 0 ]; then
210+ export PPA="$1"
211+else
212+ export PPA="$OVERLAY"
213+fi
214+
215+PPA_URL=http://ppa.launchpad.net/$TEAM/$PPA/ubuntu/dists/$SERIES/main
216+
217+# Enable s390x only for xenial or newer.
218+case "$SERIES" in
219+ precise|trusty|vivid|wily)
220+ export NEW_ARCHES=" "
221+ ;;
222+ *)
223+ export NEW_ARCHES="s390x"
224+ export PORTS_ARCHES="$PORTS_ARCHES s390x"
225+ ;;
226+esac
227+
228+log 'Refreshing package indexes...'
229+
230+# Get archive bits
231+if [ "$PPA" = "$OVERLAY" ]; then
232+ for pocket in $SERIES $SERIES-updates $SERIES-proposed; do
233+ for component in main restricted universe multiverse; do
234+ for arch in $ARCHES; do
235+ refresh $ARCHIVE/$pocket/$component/binary-$arch/Packages.gz
236+ done
237+ for arch in $PORTS_ARCHES; do
238+ refresh $PORTS/$pocket/$component/binary-$arch/Packages.gz
239+ done
240+ refresh $ARCHIVE/$pocket/$component/source/Sources.bz2
241+ done
242+ done
243+fi
244+
245+# Get ppa bits
246+pocket=$PPA-$SERIES
247+for arch in $ARCHES $PORTS_ARCHES; do
248+ refresh $PPA_URL/binary-$arch/Packages.gz
249+done
250+refresh $PPA_URL/source/Sources.bz2
251+
252+wait # for wgets to finish
253+
254+find $DATADIR -name "$$-wget-log*" -exec cat '{}' \; -delete
255+
256+log 'Building britney indexes...'
257+
258+if [ "$PPA" = "$OVERLAY" ]; then
259+ # "Testing" is archive + stable overlay ppa
260+ DEST=$DATADIR/$SERIES
261+ mkdir --parents $DEST
262+ bzcat $CACHE/$SERIES*/*/source/Sources.bz2 \
263+ $CACHE/$OVERLAY-$SERIES/*/source/Sources.bz2 \
264+ > $DEST/Sources
265+ for arch in $ARCHES $PORTS_ARCHES; do
266+ zcat $CACHE/$SERIES*/*/binary-$arch/Packages.gz \
267+ $CACHE/$OVERLAY-$SERIES/*/binary-$arch/Packages.gz \
268+ > $DEST/Packages_${arch}
269+ done
270+else
271+ # "Unstable" is just silo PPA
272+ DEST=$DATADIR/$PPA-$SERIES
273+ mkdir --parents $DEST
274+ bzcat $CACHE/$PPA-$SERIES/*/source/Sources.bz2 > $DEST/Sources
275+ for arch in $ARCHES $PORTS_ARCHES; do
276+ zcat $CACHE/$PPA-$SERIES/*/binary-$arch/Packages.gz > $DEST/Packages_${arch}
277+ done
278+ touch $DEST/Blocks
279+
280+ # Create config file atomically.
281+ CONFIG="$DEST.conf"
282+ make --file "$CODEDIR/Makefile" "$CONFIG"
283+
284+ mkdir --parents "$OUTDIR/$SERIES/$PPA/"
285+ loudly $BRITNEY -v --config "$CONFIG" --series $SERIES | sed "s#$AMQP_URI#AMQP_URI#"
286+fi
287+
288+log "$0 done."
289
290=== added file 'britney/iterate.py'
291--- britney/iterate.py 1970-01-01 00:00:00 +0000
292+++ britney/iterate.py 2015-12-14 21:05:53 +0000
293@@ -0,0 +1,150 @@
294+#!/usr/bin/env python3
295+
296+"""Britney runner.
297+
298+This script iterates over all appropriate bileto requests, downloads archive
299+indexes, and runs britney with those indexes as inputs.
300+"""
301+
302+import sys
303+import requests
304+
305+from glob import glob
306+from time import time
307+from json import dumps
308+from shutil import copy2
309+from subprocess import Popen, PIPE
310+from collections import defaultdict
311+from fcntl import LOCK_EX, LOCK_NB, flock
312+from os.path import abspath, dirname, expanduser
313+
314+
315+PUBLIC = '/static/britney/'
316+OUTPUT = '/tmp/britney_output/'
317+ATOMIC = OUTPUT + 'last-run.txt'
318+LOG_NAME = OUTPUT + 'log.txt'
319+LOCK_FD = open(LOG_NAME, 'a')
320+sys.stdout = sys.stderr = LOCK_FD
321+
322+ORIG_EXCEPTHOOK = sys.excepthook
323+API_PREFIX = 'http://127.0.0.1:8080/v1/'
324+READ_API = API_PREFIX + 'silo/landing'
325+WRITE_API = API_PREFIX + 'tickets?token='
326+HOME = dirname(abspath(__file__))
327+FETCH = HOME + '/fetch-indexes'
328+EXCUSES = OUTPUT + '*/{}/excuses.html'
329+SLASH = '/'
330+PLUS = '+'
331+SPACE = ' '
332+EMPTY = ''
333+BNEWLINE = b'\n'
334+NEWLINE = '\n'
335+
336+
337+def cleanup(*args, **kwargs):
338+ """Actions that must be run no matter how badly everything exploded."""
339+ LOCK_FD.close()
340+ copy2(LOG_NAME, ATOMIC)
341+ ORIG_EXCEPTHOOK(*args, **kwargs)
342+
343+
344+def get_ppa(siloname):
345+ """Extract just the ppa name from the siloname."""
346+ return siloname[siloname.index(SLASH) + 1:]
347+
348+
349+def read_token():
350+ """Read API token."""
351+ with open(expanduser('~/bileto.api.token')) as token:
352+ return token.read().strip()
353+
354+
355+def one_at_a_time():
356+ """Prevent this script from running multiple times."""
357+ flock(LOCK_FD, LOCK_EX | LOCK_NB)
358+
359+
360+def fetch_indexes(*args):
361+ """Invoke the index-fetcher script with the given args."""
362+ return Popen((FETCH,) + args, stdout=PIPE, stderr=PIPE)
363+
364+
365+def get_requests():
366+ """Return list of all active requests from Bileto."""
367+ resp = requests.get(READ_API).json()
368+ return resp.get('requests', [])
369+
370+
371+def update_requests(reqs, token):
372+ """Send excuses links to bileto."""
373+ for req in reqs:
374+ request_id = req['request_id']
375+ landing = get_ppa(req['siloname'])
376+ files = glob(EXCUSES.format(landing))
377+ if files:
378+ new_req = dict(
379+ request_id=request_id,
380+ autopkgtest=NEWLINE.join(
381+ name.replace(OUTPUT, PUBLIC) for name in files))
382+ resp = requests.post(
383+ WRITE_API + token, dumps(new_req),
384+ headers={'content-type': 'application/json'})
385+ print('Updating request {}: {} {}'.format(
386+ request_id, resp.status_code, resp.reason))
387+
388+
389+def build_siloname_dict(reqs):
390+ """Build a dict mapping series to silonames."""
391+ runs = defaultdict(list)
392+ for req in reqs:
393+ landing = get_ppa(req['siloname'])
394+ if 'Ready' in req['qa_signoff']:
395+ for series in req['series'].split(PLUS):
396+ runs[series].append(landing)
397+ return runs
398+
399+
400+def parallelize_fetching(silos):
401+ """Invoke fetchers in parallel but serialize output on stdout."""
402+ procs = []
403+ # Parallelize most of the fetching...
404+ for series in sorted(silos):
405+ procs.append(fetch_indexes(series))
406+ # ... serialize the output ...
407+ for proc in procs:
408+ print_proc(proc)
409+ # ... and serialize the britney invocations
410+ for series, silonames in sorted(silos.items()):
411+ for siloname in sorted(silonames):
412+ print_proc(fetch_indexes(series, siloname))
413+
414+
415+def print_proc(proc):
416+ """Print the output from a subprocess."""
417+ sys.stdout.buffer.write(BNEWLINE.join(
418+ (b'', SPACE.join(proc.args).encode('utf-8')) + proc.communicate()))
419+
420+
421+def main():
422+ """Handle errors."""
423+ start = time()
424+ token = read_token()
425+ try:
426+ one_at_a_time()
427+ LOCK_FD.seek(0)
428+ LOCK_FD.truncate()
429+ reqs = get_requests()
430+ parallelize_fetching(build_siloname_dict(reqs))
431+ update_requests(reqs, token)
432+ except BlockingIOError:
433+ return 0
434+ except (IOError, ConnectionRefusedError):
435+ print('Failed to connect to Bileto. Is it running?')
436+ print('\n\nCompleted in {:0.1f}s.'.format(time() - start))
437+ cleanup()
438+ return 0
439+
440+
441+if __name__ == '__main__':
442+ sys.excepthook = cleanup
443+ sys.exit(main())
444
445=== modified file 'db_migrations.sh'
446--- db_migrations.sh 2015-11-19 23:38:57 +0000
447+++ db_migrations.sh 2015-12-14 21:05:53 +0000
448@@ -3,7 +3,7 @@
449 # Super ultra-lightweight db migrations crafted by hand because the necessary
450 # flask/sqlalchemy migrations bits aren't available for python3 in trusty.
451
452-exit 0 # No migrations are currently required.
453+# exit 0 # No migrations are currently required.
454
455 CONNECTION=$(tr ':/@' ' ' < "$HOME/bileto.postgresql.info")
456 USER=$(echo "$CONNECTION" | awk '{ print $2 }')
457@@ -26,26 +26,30 @@
458 $PG "SELECT pg_terminate_backend($pid);"
459 done
460
461+
462+$PG "ALTER TABLE request ADD COLUMN autopkgtest text DEFAULT '';" || true
463+
464+rm -rf "$PGPASSFILE"
465+exit 0
466+
467 # These ones have succeeded in production and only kept for reference
468-# for column in job_log creator landers siloname sync_request published_versions distribution series dest qa_signoff; do
469-# $PG "ALTER TABLE request ALTER COLUMN $column SET DATA TYPE text;"
470-# done
471-# $PG "ALTER TABLE comment ALTER COLUMN siloname SET DATA TYPE text;"
472-
473-# $PG "ALTER TABLE comment ALTER COLUMN request_id SET DATA TYPE bigint;"
474-# $PG "ALTER TABLE comment ALTER COLUMN id SET DATA TYPE bigint;"
475-# $PG "ALTER TABLE request ALTER COLUMN request_id SET DATA TYPE bigint;"
476-
477-# $PG "ALTER TABLE comment ADD COLUMN log_url text DEFAULT '';" || true
478-# $PG "ALTER TABLE comment ADD COLUMN published_versions text DEFAULT '';" || true
479-# $PG "ALTER TABLE comment ADD COLUMN siloname varchar(1000) DEFAULT '';" || true
480-
481-# $PG "ALTER TABLE request ADD COLUMN job_log varchar(1000) DEFAULT '';" || true
482-# $PG "ALTER TABLE request ADD COLUMN creator varchar(1000) DEFAULT '';" || true
483-
484-# for column in status sources search; do
485-# $PG "ALTER TABLE request ALTER COLUMN $column SET DATA TYPE text;"
486-# done
487-# $PG "ALTER TABLE comment ALTER COLUMN author SET DATA TYPE text;"
488-
489-rm -rf "$PGPASSFILE"
490+for column in job_log creator landers siloname sync_request published_versions distribution series dest qa_signoff; do
491+ $PG "ALTER TABLE request ALTER COLUMN $column SET DATA TYPE text;"
492+done
493+$PG "ALTER TABLE comment ALTER COLUMN siloname SET DATA TYPE text;"
494+
495+$PG "ALTER TABLE comment ALTER COLUMN request_id SET DATA TYPE bigint;"
496+$PG "ALTER TABLE comment ALTER COLUMN id SET DATA TYPE bigint;"
497+$PG "ALTER TABLE request ALTER COLUMN request_id SET DATA TYPE bigint;"
498+
499+$PG "ALTER TABLE comment ADD COLUMN log_url text DEFAULT '';" || true
500+$PG "ALTER TABLE comment ADD COLUMN published_versions text DEFAULT '';" || true
501+$PG "ALTER TABLE comment ADD COLUMN siloname varchar(1000) DEFAULT '';" || true
502+
503+$PG "ALTER TABLE request ADD COLUMN job_log varchar(1000) DEFAULT '';" || true
504+$PG "ALTER TABLE request ADD COLUMN creator varchar(1000) DEFAULT '';" || true
505+
506+for column in status sources search; do
507+ $PG "ALTER TABLE request ALTER COLUMN $column SET DATA TYPE text;"
508+done
509+$PG "ALTER TABLE comment ALTER COLUMN author SET DATA TYPE text;"
510
511=== modified file 'debian/control'
512--- debian/control 2015-12-09 06:31:56 +0000
513+++ debian/control 2015-12-14 21:05:53 +0000
514@@ -3,6 +3,7 @@
515 Priority: extra
516 Maintainer: Robert Bruce Park <robert.park@canonical.com>
517 Build-Depends: debhelper (>= 9),
518+ language-pack-en,
519 pylint3,
520 python3-coverage,
521 python3-flask,
522
523=== modified file 'tests/test_v1.py'
524--- tests/test_v1.py 2015-11-20 20:25:42 +0000
525+++ tests/test_v1.py 2015-12-14 21:05:53 +0000
526@@ -117,11 +117,11 @@
527 '/v1/tickets',
528 data=json.dumps(dict(request_id=1, description='bar')),
529 content_type='application/json')
530- self.assertEqual(ret.status_code, 201)
531+ self.assertEqual(ret.status_code, 200)
532 req = Request.query.get(1)
533 self.assertEqual(req.description, 'bar')
534 self.assertEqual(
535- req.search, 'fritz\nbar\n\n\n\n\n\n\n\n\n\nsiloname\n\nNew\n\n')
536+ req.search, '\nfritz\nbar\n\n\n\n\n\n\n\n\n\nsiloname\n\nNew\n\n')
537 self.assertEqual(Comment.query.get(1).text, 'Updated description.')
538
539 def test_update_ticket_doesnt_exist(self):
540@@ -451,6 +451,9 @@
541 ret = self.app.get('/v1/metadata')
542 self.assertDictEqual(parse(ret.data), dict(
543 docs=dict(
544+ autopkgtest='Autopkgtest Results\n\nThis field will '
545+ 'automatically be filled with links to autopkgtest'
546+ ' results.',
547 comments='Audit Log',
548 creator='Request Creator\n\nLaunchpad nickname of the person '
549 'who created this request.',
550@@ -493,7 +496,7 @@
551 datalists=dict(
552 dest=[
553 '',
554- 'ci-train-ppa-service/ubuntu/stable-phone-overlay',
555+ 'team-not-set/ubuntu/stable-phone-overlay',
556 ],
557 distribution=['ubuntu'],
558 qa_signoff=QA_SIGNOFF,
559@@ -508,7 +511,9 @@
560 'ppa:~team/ubuntu/ppa,vivid',
561 ]),
562 jenkins='https://ci-train.staging.ubuntu.com/',
563+ ppa_team='team-not-set',
564 titles=dict(
565+ autopkgtest='Autopkgtest Results',
566 comments='Audit Log',
567 creator='Request Creator',
568 date='Creation Date',
569
570=== modified file 'tickets/models.py'
571--- tickets/models.py 2015-11-20 20:25:42 +0000
572+++ tickets/models.py 2015-12-14 21:05:53 +0000
573@@ -26,6 +26,10 @@
574
575 PUBLISHABLE = ('publish without qa', 'qa granted')
576
577+AUTOPKGTEST = """Autopkgtest Results
578+
579+This field will automatically be filled with links to autopkgtest results."""
580+
581 CREATOR = """Request Creator
582
583 Launchpad nickname of the person who created this request."""
584@@ -139,8 +143,7 @@
585
586 class Request(db.Model, BiletoModel):
587 """Contains all data associated with one landing request."""
588- request_id = db.Column(
589- BigInt, primary_key=True, doc='Request ID')
590+ request_id = db.Column(BigInt, primary_key=True, doc='Request ID')
591 date = db.Column(db.DateTime, default=right_now, doc='Creation Date')
592
593 test_plan = db.Column(db.Text, default='', doc=TEST_PLAN)
594@@ -148,6 +151,7 @@
595 download_links = db.Column(db.Text, default='', doc=MANUAL_URLS)
596 merge_proposals = db.Column(db.Text, default='', doc=MERGES)
597
598+ autopkgtest = db.Column(db.Text, default='', doc=AUTOPKGTEST)
599 job_log = db.Column(db.Text, default='', doc=JOB_LOG)
600 creator = db.Column(db.Text, default='', doc=CREATOR)
601 landers = db.Column(db.Text, default='', doc=LANDERS)
602
603=== modified file 'tickets/settings.py'
604--- tickets/settings.py 2015-10-19 19:05:19 +0000
605+++ tickets/settings.py 2015-12-14 21:05:53 +0000
606@@ -22,6 +22,7 @@
607 SQLITE_PATH = expanduser('~/bileto.db')
608 PG_PATH = expanduser('~/bileto.postgresql.info')
609 JENKINS_PATH = expanduser('~/bileto.jenkins.url')
610+PPA_TEAM_PATH = expanduser('~/ppa.team')
611
612
613 def read_from_file(path):
614@@ -31,6 +32,7 @@
615 return data.read().strip()
616
617
618+PPA_TEAM = read_from_file(PPA_TEAM_PATH) or 'team-not-set'
619 API_TOKEN = read_from_file(API_PATH)
620 PG_INFO = read_from_file(PG_PATH)
621 JENKINS_INFO = read_from_file(
622
623=== modified file 'tickets/static/app.js'
624--- tickets/static/app.js 2015-11-18 20:54:13 +0000
625+++ tickets/static/app.js 2015-12-14 21:05:53 +0000
626@@ -1,5 +1,5 @@
627 var EXCUSES = 'http://people.canonical.com/~ubuntu-archive/proposed-migration/';
628-var PPA_PREFIX = 'https://launchpad.net/~ci-train-ppa-service/+archive/';
629+var PPA_PREFIX = 'https://launchpad.net/~';
630 var SOURCE_PREFIX = 'https://launchpad.net/ubuntu/+source/';
631 var TEAM_PREFIX = 'https://launchpad.net/~';
632
633@@ -33,6 +33,7 @@
634 text = text || '';
635 return text.replace(/\n/g, '<br />')
636 .replace(/(#|bug)\s?(\d+)/g, ' http://pad.lv/$2')
637+ .replace(/\/static\/[-A-Z0-9+&@#/%?=~_|!:,.;]*[A-Z0-9+&@#/%=~_|]/gi, shorter)
638 .replace(/https?:\/\/[-A-Z0-9+&@#/%?=~_|!:,.;]*[A-Z0-9+&@#/%=~_|]/gi, shorter);
639 }
640
641@@ -45,10 +46,9 @@
642 return text;
643 }
644
645-function display_siloname(text) {
646+function display_siloname(ppa_team, text) {
647 text = (text || '');
648- // TODO: Make PPA link link to correct staging/production PPA team
649- return a(PPA_PREFIX + text, text)
650+ return a(PPA_PREFIX + ppa_team + '/+archive/' + text, text)
651 }
652
653 function link_excuses(match, packagelist) {
654
655=== added symlink 'tickets/static/britney'
656=== target is u'/tmp/britney_output'
657=== modified file 'tickets/static/index.html'
658--- tickets/static/index.html 2015-11-20 19:48:04 +0000
659+++ tickets/static/index.html 2015-12-14 21:05:53 +0000
660@@ -93,7 +93,7 @@
661
662 <tr>
663 <th>{{meta.titles.siloname}}</th>
664-<td ng-bind-html="display_siloname(req.siloname)"></td>
665+<td ng-bind-html="display_siloname(meta.ppa_team, req.siloname)"></td>
666 </tr>
667
668 <tr>
669@@ -116,6 +116,11 @@
670 <td>{{req[field]}}</td>
671 </tr>
672
673+<tr ng-show="req.autopkgtest">
674+<th>{{meta.titles.autopkgtest}}</th>
675+<td ng-bind-html="linkify(req.autopkgtest)"></td>
676+</tr>
677+
678 <tr>
679 <th>{{meta.titles.landers}}</th>
680 <td ng-bind-html="display_landers(req.landers)"></td>
681
682=== removed directory 'tickets/static/json'
683=== modified file 'tickets/v1.py'
684--- tickets/v1.py 2015-11-19 18:59:59 +0000
685+++ tickets/v1.py 2015-12-14 21:05:53 +0000
686@@ -25,7 +25,7 @@
687 from tickets.teams import Teams
688 from tickets.lplib import get_series, memoize
689 from tickets.models import PUBLISHABLE, Request, Comment
690-from tickets.settings import API_TOKEN, JENKINS_INFO, read_from_file
691+from tickets.settings import API_TOKEN, JENKINS_INFO, PPA_TEAM, read_from_file
692
693
694 LOGFILE = '/var/log/bileto.log'
695@@ -120,6 +120,7 @@
696 authenticate()
697 assert_json()
698 requestid = request.json.get('request_id')
699+ success = 200 if requestid else 201
700 if not requestid:
701 req = Request()
702 db.session.add(req)
703@@ -145,7 +146,7 @@
704 db.session.commit()
705 req.update(**request.json)
706 db.session.commit()
707- return jsonify(dict(request_id=req.request_id)), 201
708+ return jsonify(dict(request_id=req.request_id)), success
709
710
711 @app.route('/v1/tickets', methods=['GET'])
712@@ -247,8 +248,7 @@
713 datalists=dict(
714 distribution=['ubuntu'],
715 series=['xenial+vivid'] + series,
716- dest=[
717- '', 'ci-train-ppa-service/ubuntu/stable-phone-overlay'],
718+ dest=['', '{}/ubuntu/stable-phone-overlay'.format(PPA_TEAM)],
719 sync_request=['ubuntu,' + s for s in series] + [
720 '0',
721 'stable-overlay,vivid',
722@@ -260,6 +260,7 @@
723 ) + HIDDEN,
724 qa_signoff=QA_SIGNOFF),
725 jenkins=JENKINS_INFO,
726+ ppa_team=PPA_TEAM,
727 ))
728
729
