Merge lp:~sergiodj/lp-ftbfs-report/fix-internal-pkgset-team-links into lp:lp-ftbfs-report

Proposed by Sergio Durigan Junior
Status: Superseded
Proposed branch: lp:~sergiodj/lp-ftbfs-report/fix-internal-pkgset-team-links
Merge into: lp:lp-ftbfs-report
Diff against target: 460 lines (+182/-37) (has conflicts)
3 files modified
source/build_status.html (+30/-10)
source/build_status.py (+148/-27)
source/style.css (+4/-0)
Text conflict in source/build_status.html
Text conflict in source/build_status.py
To merge this branch: bzr merge lp:~sergiodj/lp-ftbfs-report/fix-internal-pkgset-team-links
Reviewer: William Grant (review pending)
Review via email: mp+407098@code.launchpad.net

Description of the change

When a packageset and a team have the same name, the internal links generated for each of them end up being identical. Because the packageset section comes first in the page, the team's internal link currently points to the packageset section instead.

A simple fix is to use distinct suffixes when composing the internal link names, which keeps them unique.
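The collision and the suffix fix can be sketched in a few lines of Python (a minimal illustration only; `anchor_id` is a hypothetical helper, the real change applies the `-pkgset`/`-team` suffixes directly in the Jinja template):

```python
# Hypothetical helper illustrating the fix: compose a fragment id by
# suffixing the section kind, so a packageset and a team that share a
# name (e.g. "desktop") no longer produce the same anchor.

def anchor_id(name, kind):
    """Return a unique fragment id of the form '<name>-<kind>'."""
    return "%s-%s" % (name, kind)

packagesets = ["desktop", "kernel"]
teams = ["desktop", "foundations"]

# Without suffixes, "desktop" would appear twice and the browser would
# always jump to whichever section comes first in the page.
ids = ([anchor_id(ps, "pkgset") for ps in packagesets]
       + [anchor_id(t, "team") for t in teams])

# All four ids are distinct even though "desktop" is in both lists.
assert len(set(ids)) == len(ids)
```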


Unmerged revisions

65. By Sergio Durigan Junior

Use suffixes for pkgset and team internal links

When a packageset has the same name as a team, the internal links will
point to the same place, which currently is the packageset section
(because it comes first in the page). We can solve that by using
distinct suffixes when creating the links.

64. By William Grant

Exclude copy archive failures that also failed in the primary archive.

63. By William Grant

Update source link.

62. By William Grant

Include only packages in main in team sections.

61. By William Grant

Generate per team lists.

60. By William Grant

Port to modern python-apt.

59. By William Grant

Add --release-only which skips packages that only fail outside the release pocket. Useful for port bootstrapping.

58. By William Grant

Add space before pocket name.

57. By William Grant

Fix spacing with more architectures.

56. By William Grant

Don't always use the current series when comparing test rebuilds to the primary archive. We rebuild old series too.

Preview Diff

1=== modified file 'source/build_status.html'
2--- source/build_status.html 2013-05-28 21:39:21 +0000
3+++ source/build_status.html 2021-08-13 16:24:09 +0000
4@@ -4,7 +4,7 @@
5 {#- Macro definitions -#}
6 {% macro legend_row(row_class, field, field_desc) -%}
7 <tr class="{{ row_class }}" style="text-align: center;">
8- <td class="{{ field }}">arch ({{ field[0] }})</td>
9+ <td class="{{ field }}">arch ({{ abbrs[field] }})</td>
10 <td style="text-align: left;">{{ field_desc }}</td>
11 {% for arch in arch_list -%}
12 {% if stats[field][arch].cnt -%}
13@@ -26,9 +26,9 @@
14 <col width="25%" />
15 <col width="24%" />
16 <col width="2%" />
17- <col width="11%" span="{{ arch_list|count }}" />
18- <col width="5%" />
19- <col width="5%" />
20+ <col width="8%" span="{{ arch_list|count }}" />
21+ <col width="6%" />
22+ <col width="6%" />
23 </colgroup>
24 <thead>
25 <tr>
26@@ -79,7 +79,7 @@
27 {% if arch in ver.logs -%}
28 <td class="{{ ver.logs[arch].buildstate }}"
29 onmouseover="Tip('{{ ver.logs[arch].tooltip|e }}')" onmouseout="UnTip()">
30- <a href="{{ ver.logs[arch].url }}">{{ arch }} ({{ ver.logs[arch].buildstate[0] }})</a>
31+ <a href="{{ ver.logs[arch].url }}">{{ arch }} ({{ abbrs[ver.logs[arch].buildstate] }})</a>
32 &rarr;
33 <a href="{{ ver.logs[arch].log }}">Log</a>
34 </td>
35@@ -130,16 +130,25 @@
36
37 <p>Jump to packageset:
38 {% for (ps, lst) in packagesets.items()|sort if lst|count > 0 -%}
39- <a href="#{{ ps }}">{{ ps }}</a> ({{ lst|count }})
40- {% endfor -%}
41- </p>
42+ <a href="#{{ ps }}-pkgset">{{ ps }}</a> ({{ lst|count }})
43+ {% endfor -%}
44+ </p>
45+ <p>Jump to team:
46+ {% for (team, lst) in teams.items()|sort if lst|count > 0 -%}
47+ <a href="#{{ team }}-team">{{ team }}</a> ({{ lst|count }})
48+ {% endfor -%}
49+ </p>
50+
51+ {% if notice %}
52+ {{ notice }}
53+ {% endif %}
54
55 <h2>Legend and statistics for the displayed data</h2>
56 <table class="grid" width="95%">
57 <colgroup>
58 <col width="6%" />
59 <col width="45%" />
60- <col width="11%" span="{{ arch_list|count }}" />
61+ <col width="8%" span="{{ arch_list|count }}" />
62 </colgroup>
63 <thead>
64 <tr>
65@@ -159,6 +168,7 @@
66 </thead>
67 <tbody>
68 {{ legend_row('even', 'FAILEDTOBUILD', 'Package failed to build') }}
69+ {{ legend_row('odd', 'CANCELLED', 'Cancelled build') }}
70 {{ legend_row('odd', 'MANUALDEPWAIT', 'Package is waiting on another package') }}
71 {{ legend_row('even', 'CHROOTWAIT', 'Failure in the chroot') }}
72 {{ legend_row('odd', 'UPLOADFAIL', 'Failed to upload') }}
73@@ -179,10 +189,16 @@
74
75 <h2>Packagesets</h2>
76 {% for (ps, lst) in packagesets.items()|sort if lst|count > 0 -%}
77- <h3 id="{{ ps }}">{{ ps }}: {{ lst|count }} packages (<a href="#top">top</a>)</h3>
78+ <h3 id="{{ ps }}-pkgset">{{ ps }}: {{ lst|count }} packages (<a href="#top">top</a>)</h3>
79 {{ table(lst, ps, 'Also belongs to another packageset?') }}
80 {% endfor -%}
81
82+ <h2>Teams (main only)</h2>
83+ {% for (team, lst) in teams.items()|sort if lst|count > 0 -%}
84+ <h3 id="{{ team }}-team">{{ team }}: {{ lst|count }} packages (<a href="#top">top</a>)</h3>
85+ {{ table(lst, team, 'Also belongs to another team?') }}
86+ {% endfor -%}
87+
88 {% if main_archive %}
89 <h2 id="main_superseded">main (superseded): {{ main_superseded|count }} packages (<a href="#top">top</a>)</h2>
90 {{ table(main_superseded) }}
91@@ -217,6 +233,7 @@
92 alt="Valid XHTML 1.1" height="31" width="88"
93 style="border-width:0px" /></a>
94 </p>
95+<<<<<<< TREE
96 <p style="font-size:smaller">Source: <a href="https://code.launchpad.net/lp-ftbfs-report">lp:lp-ftbfs-report</a></p>
97 <script type="text/javascript" src="source/wz_tooltip.js"></script>
98 <script type="text/javascript" src="source/jquery.min.js"></script>
99@@ -357,5 +374,8 @@
100 /* Build failures raw data */
101 var json_data={{ historical_json }};
102 </script>
103+=======
104+ <p style="font-size:smaller">Source: <a href="https://code.launchpad.net/~wgrant/lp-ftbfs-report/production">lp:~wgrant/lp-ftbfs-report/production</a></p>
105+>>>>>>> MERGE-SOURCE
106 </body>
107 </html>
108
109=== modified file 'source/build_status.py'
110--- source/build_status.py 2012-04-14 17:11:03 +0000
111+++ source/build_status.py 2021-08-13 16:24:09 +0000
112@@ -20,6 +20,8 @@
113 #import httplib2
114 #httplib2.debuglevel = 1
115
116+import os
117+import requests
118 import sys
119 import time
120 import apt_pkg
121@@ -30,12 +32,13 @@
122 from launchpadlib.errors import HTTPError
123 from launchpadlib.launchpad import Launchpad
124 from operator import (attrgetter, methodcaller)
125+from optparse import OptionParser
126
127 lp_service = 'production'
128 api_version = 'devel'
129 default_arch_list = []
130 find_tagged_bugs = 'ftbfs'
131-apt_pkg.InitSystem()
132+apt_pkg.init_system()
133
134 class PersonTeam(object):
135 _cache = dict()
136@@ -77,7 +80,7 @@
137 class VersionList(list):
138 def append(self, item):
139 super(SourcePackage.VersionList, self).append(item)
140- self.sort(key = attrgetter('version'), cmp = apt_pkg.VersionCompare)
141+ self.sort(key = attrgetter('version'), cmp = apt_pkg.version_compare)
142
143 def __new__(cls, spph):
144 try:
145@@ -99,6 +102,10 @@
146 for ps in srcpkg.packagesets:
147 packagesets_ftbfs[ps].append(srcpkg)
148
149+ srcpkg.teams = set([team for (team, srcpkglist) in teams.items() if spph.source_package_name in srcpkglist and spph.component_name == "main"])
150+ for team in srcpkg.teams:
151+ teams_ftbfs[team].append(srcpkg)
152+
153 # add to cache
154 cls._cache[spph.source_package_name] = srcpkg
155
156@@ -136,6 +143,33 @@
157 else:
158 return list(self.packagesets.difference((name,)))
159
160+class MainArchiveBuilds(object):
161+ _cache = dict()
162+
163+ def __new__(cls, main_archive, source, version):
164+ try:
165+ return cls._cache["%s,%s" % (source, version)]
166+ except KeyError:
167+ bfm = super(MainArchiveBuilds, cls).__new__(cls)
168+ results = {}
169+ sourcepubs = main_archive.getPublishedSources(
170+ exact_match=True, source_name=source, version=version)
171+ for pub in sourcepubs:
172+ for build in pub.getBuilds():
173+ # assumes sourcepubs are sorted latest release to oldest,
174+ # so first record wins
175+ if build.arch_tag not in results:
176+ results[build.arch_tag] = build.buildstate
177+ bfm.results = results
178+ # add to cache
179+ cls._cache["%s,%s" % (source, version)] = bfm
180+
181+ return bfm
182+
183+ @classmethod
184+ def clear(cls):
185+ cls._cache.clear()
186+
187 class SPPH(object):
188 _cache = dict() # dict with all SPPH objects
189
190@@ -172,6 +206,7 @@
191 'Dependency wait': 'MANUALDEPWAIT',
192 'Chroot problem': 'CHROOTWAIT',
193 'Failed to upload': 'UPLOADFAIL',
194+ 'Cancelled build': 'CANCELLED',
195 }
196 self.buildstate = buildstates[build.buildstate]
197 self.url = build.web_link
198@@ -203,7 +238,7 @@
199 return u'Changed-By: %s' % (self.changed_by)
200
201
202-def fetch_pkg_list(archive, series, state, last_published, arch_list=default_arch_list, main_archive=None, main_series=None):
203+def fetch_pkg_list(archive, series, state, last_published, arch_list=default_arch_list, main_archive=None, main_series=None, release_only=False):
204 print "Processing '%s'" % state
205
206 cur_last_published = None
207@@ -250,30 +285,59 @@
208 version=spph._lp.source_package_version,
209 status='Published')
210 spph.current = len(main_publications[:1]) > 0
211+ elif release_only:
212+ release_publications = archive.getPublishedSources(
213+ distro_series=series,
214+ pocket='Release',
215+ exact_match=True,
216+ source_name=spph._lp.source_package_name,
217+ version=spph._lp.source_package_version,
218+ status='Published')
219+ spph.current = len(release_publications[:1]) > 0
220+ if not spph.current:
221+ release_publications = archive.getPublishedSources(
222+ distro_series=series,
223+ pocket='Release',
224+ exact_match=True,
225+ source_name=spph._lp.source_package_name,
226+ version=spph._lp.source_package_version,
227+ status='Pending')
228+ spph.current = len(release_publications[:1]) > 0
229 else:
230 spph.current = True
231
232 if not spph.current:
233 print " superseded"
234+
235+ if main_archive:
236+ # If this build failure is not a regression versus the
237+ # main archive, do not report it.
238+ main_builds = MainArchiveBuilds(main_archive,
239+ spph._lp.source_package_name,
240+ spph._lp.source_package_version)
241+ try:
242+ if main_builds.results[arch] != 'Successfully built':
243+ print " Skipping %s" % build.title
244+ continue
245+ except KeyError:
246+ pass
247+
248 SPPH(csp_link).addBuildLog(build)
249
250 return cur_last_published
251
252
253-def generate_page(archive, series, archs_by_archive, main_archive, template = 'build_status.html', arch_list = default_arch_list):
254- try:
255- out = open('../%s-%s.html' % (archive.name, series.name), 'w')
256- except IOError:
257- return
258-
259+def generate_page(name, archive, series, archs_by_archive, main_archive, template = 'build_status.html', arch_list = default_arch_list, notice=None, release_only=False):
260 # sort the package lists
261 filter_ftbfs = lambda pkglist, current: filter(methodcaller('isFTBFS', arch_list, current), sorted(pkglist))
262 data = {}
263 for comp in ('main', 'restricted', 'universe', 'multiverse'):
264 data[comp] = filter_ftbfs(components[comp], True)
265- data['%s_superseded' % comp] = filter_ftbfs(components[comp], False)
266+ data['%s_superseded' % comp] = filter_ftbfs(components[comp], False) if not release_only else []
267 for pkgset, pkglist in packagesets_ftbfs.items():
268 packagesets_ftbfs[pkgset] = filter_ftbfs(pkglist, True)
269+ for team, pkglist in teams_ftbfs.items():
270+ teams_ftbfs[team] = filter_ftbfs(pkglist, True)
271
272 # container object to hold the counts and the tooltip
273 class StatData(object):
274@@ -284,7 +348,7 @@
275
276 # compute some statistics (number of packages for each build failure type)
277 stats = {}
278- for state in ('FAILEDTOBUILD', 'MANUALDEPWAIT', 'CHROOTWAIT', 'UPLOADFAIL'):
279+ for state in ('FAILEDTOBUILD', 'MANUALDEPWAIT', 'CHROOTWAIT', 'UPLOADFAIL', 'CANCELLED'):
280 stats[state] = {}
281 for arch in arch_list:
282 tooltip = []
283@@ -313,31 +377,47 @@
284 data['archs_by_archive'] = archs_by_archive
285 data['lastupdate'] = time.strftime('%F %T %z')
286 data['packagesets'] = packagesets_ftbfs
287+<<<<<<< TREE
288 data['historical_json'] = generate_historical_json(archive, series)
289+=======
290+ data['teams'] = teams_ftbfs
291+ data['notice'] = notice
292+ data['abbrs'] = {
293+ 'FAILEDTOBUILD': 'F',
294+ 'CANCELLED': 'X',
295+ 'MANUALDEPWAIT': 'M',
296+ 'CHROOTWAIT': 'C',
297+ 'UPLOADFAIL': 'U',
298+ }
299+>>>>>>> MERGE-SOURCE
300
301 env = Environment(loader=FileSystemLoader('.'))
302 template = env.get_template('build_status.html')
303 stream = template.render(**data)
304+
305+ fn = '../%s.html' % name
306+ out = open('%s.new' % fn, 'w')
307 out.write(stream.encode('utf-8'))
308 out.close()
309+ os.rename('%s.new' % fn, fn)
310
311-def generate_csvfile(archive, series, arch_list = default_arch_list):
312- csvout = open('../%s-%s.csv' % (archive.name, series.name), 'w')
313+def generate_csvfile(name, arch_list = default_arch_list):
314+ csvout = open('../%s.csv' % name, 'w')
315 linetemplate = '%(name)s,%(link)s,%(explain)s\n'
316 for comp in components.values():
317 for pkg in comp:
318 for ver in pkg.versions:
319- for state in ('FAILEDTOBUILD', 'MANUALDEPWAIT', 'CHROOTWAIT', 'UPLOADFAIL'):
320+ for state in ('FAILEDTOBUILD', 'MANUALDEPWAIT', 'CHROOTWAIT', 'UPLOADFAIL', 'CANCELLED'):
321 archs = [ arch for (arch, log) in ver.logs.items() if log.buildstate == state ]
322 if archs:
323 log = ver.logs[archs[0]].log
324 csvout.write(linetemplate % {'name': pkg.name, 'link': log,
325 'explain':"[%s] %s" %(', '.join(archs), state)})
326
327-def load_timestamps(archive, series):
328+def load_timestamps(name):
329 '''Load the saved timestamps about the last still published FTBFS build record.'''
330 try:
331- timestamp_file = file('%s-%s.json' % (archive.name, series.name), 'r')
332+ timestamp_file = file('%s.json' % name, 'r')
333 tmp = json.load(timestamp_file)
334 timestamps = {}
335 for state, timestamp in tmp.items():
336@@ -352,11 +432,12 @@
337 'Dependency wait': None,
338 'Chroot problem': None,
339 'Failed to upload': None,
340+ 'Cancelled build': None,
341 }
342
343-def save_timestamps(archive, series, timestamps):
344+def save_timestamps(name, timestamps):
345 '''Save the timestamps of the last still published FTBFS build record into a JSON file.'''
346- timestamp_file = file('%s-%s.json' % (archive.name, series.name), 'w')
347+ timestamp_file = file('%s.json' % name, 'w')
348 tmp = {}
349 for state, timestamp in timestamps.items():
350 if timestamp is not None:
351@@ -439,6 +520,7 @@
352 launchpad = Launchpad.login_anonymously('qa-ftbfs', lp_service, version=api_version)
353
354 ubuntu = launchpad.distributions['ubuntu']
355+<<<<<<< TREE
356 assert len(sys.argv) >= 4
357
358 try:
359@@ -451,15 +533,44 @@
360 except HTTPError:
361 print 'Error: %s is not a valid series.' % sys.argv[2]
362 sys.exit(1)
363+=======
364+
365+ usage = "usage: %prog [options] <archive> <series> <arch> [<arch> ...]"
366+ parser = OptionParser(usage=usage)
367+ parser.add_option(
368+ "-f", "--filename", dest="name",
369+ help="File name prefix for the result.")
370+ parser.add_option(
371+ "-n", "--notice", dest="notice_file",
372+ help="HTML notice file to include in the page header.")
373+ parser.add_option(
374+ "--release-only", dest="release_only", action="store_true",
375+ help="Only include sources currently published in the release pocket.")
376+ (options, args) = parser.parse_args()
377+ if len(args) < 3:
378+ parser.error("Need at least 4 arguments.")
379+
380+ try:
381+ archive = ubuntu.getArchive(name=args[0])
382+ except HTTPError:
383+ print 'Error: %s is not a valid archive.' % args[0]
384+ try:
385+ series = ubuntu.getSeries(name_or_version=args[1])
386+ except HTTPError:
387+ print 'Error: %s is not a valid series.' % args[1]
388+
389+ if options.name is None:
390+ options.name = '%s-%s' % (archive.name, series.name)
391+>>>>>>> MERGE-SOURCE
392
393 if archive.name != 'primary':
394 main_archive = ubuntu.main_archive
395- main_series = ubuntu.current_series
396+ main_series = series
397 else:
398 main_archive = main_series = None
399
400 archs_by_archive = dict(main=[], ports=[])
401- for arch in sys.argv[3:]:
402+ for arch in args[2:]:
403 das = series.getDistroArchSeries(archtag=arch)
404 archs_by_archive[das.official and 'main' or 'ports'].append(arch)
405 default_arch_list.extend(archs_by_archive['main'])
406@@ -472,7 +583,7 @@
407 PersonTeam.clear()
408 SourcePackage.clear()
409 SPPH.clear()
410- last_published = load_timestamps(archive, series)
411+ last_published = load_timestamps(options.name)
412
413 # list of SourcePackages for each component
414 components = {
415@@ -491,14 +602,24 @@
416 packagesets[ps.name] = ps.getSourcesIncluded(direct_inclusion=False)
417 packagesets_ftbfs[ps.name] = [] # empty list to add FTBFS for each package set later
418
419- for state in ('Failed to build', 'Dependency wait', 'Chroot problem', 'Failed to upload'):
420- last_published[state] = fetch_pkg_list(archive, series, state, last_published[state], default_arch_list, main_archive, main_series)
421-
422- save_timestamps(archive, series, last_published)
423+ teams = requests.get('https://people.canonical.com/~ubuntu-archive/package-team-mapping.json').json()
424+
425+ # Per team list of FTBFS
426+ teams_ftbfs = {team: [] for team in teams}
427+
428+ for state in ('Failed to build', 'Dependency wait', 'Chroot problem', 'Failed to upload', 'Cancelled build'):
429+ last_published[state] = fetch_pkg_list(archive, series, state, last_published[state], default_arch_list, main_archive, main_series, options.release_only)
430+
431+ save_timestamps(options.name, last_published)
432+
433+ if options.notice_file:
434+ notice = open(options.notice_file).read()
435+ else:
436+ notice = None
437
438 print "Updating Historical database..."
439 store_historical(archive, series)
440 print "Generating HTML page..."
441- generate_page(archive, series, archs_by_archive, main_archive)
442+ generate_page(options.name, archive, series, archs_by_archive, main_archive, notice=notice, release_only=options.release_only)
443 print "Generating CSV file..."
444- generate_csvfile(archive, series)
445+ generate_csvfile(options.name)
446
447=== modified file 'source/style.css'
448--- source/style.css 2011-10-16 14:22:34 +0000
449+++ source/style.css 2021-08-13 16:24:09 +0000
450@@ -52,6 +52,10 @@
451 background-color:#cc0000;
452 }
453
454+.CANCELLED {
455+ background-color:#aa0000;
456+}
457+
458 .MANUALDEPWAIT {
459 background:#ff9900;
460 }
