Merge lp:~sergiodj/lp-ftbfs-report/fix-internal-pkgset-team-links into lp:lp-ftbfs-report

Proposed by Sergio Durigan Junior
Status: Superseded
Proposed branch: lp:~sergiodj/lp-ftbfs-report/fix-internal-pkgset-team-links
Merge into: lp:lp-ftbfs-report
Diff against target: 460 lines (+182/-37) (has conflicts)
3 files modified
source/build_status.html (+30/-10)
source/build_status.py (+148/-27)
source/style.css (+4/-0)
Text conflict in source/build_status.html
Text conflict in source/build_status.py
To merge this branch: bzr merge lp:~sergiodj/lp-ftbfs-report/fix-internal-pkgset-team-links
Reviewer: William Grant (Pending)
Review via email: mp+407098@code.launchpad.net

Description of the change

When a packageset and a team share the same name, the internal anchor links generated for them end up identical. Because the packageset section comes first in the page, the team link currently points at the packageset section instead of the team section.

A simple fix is to compose the internal link names with distinct suffixes, so the two anchors no longer collide.
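
A minimal sketch of the effect, assuming a hypothetical name "foundations" shared by a packageset and a team (the real anchors are emitted by source/build_status.html, as the diff below shows):

    # Hypothetical illustration; the real anchors are generated by the
    # Jinja2 template in source/build_status.html.
    name = "foundations"  # a packageset and a team that happen to share this name

    # Before: both sections use the bare name as their id, so "#foundations"
    # is ambiguous and the browser jumps to the first match (the packageset).
    old_pkgset_anchor = old_team_anchor = name

    # After: distinct suffixes keep the fragment identifiers unique.
    new_pkgset_anchor = "%s-pkgset" % name  # linked as "#foundations-pkgset"
    new_team_anchor = "%s-team" % name      # linked as "#foundations-team"

    assert new_pkgset_anchor != new_team_anchor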


Unmerged revisions

65. By Sergio Durigan Junior

Use suffixes for pkgset and team internal links

When a packageset has the same name as a team, the internal links will
point to the same place, which is currently the packageset section
(because it comes first in the page). We can solve that by using
distinct suffixes when creating the links.

64. By William Grant

Exclude copy archive failures that also failed in the primary archive.

63. By William Grant

Update source link.

62. By William Grant

Include only packages in main in team sections.

61. By William Grant

Generate per team lists.

60. By William Grant

Port to modern python-apt.

59. By William Grant

Add --release-only which skips packages that only fail outside the release pocket. Useful for port bootstrapping.

58. By William Grant

Add space before pocket name.

57. By William Grant

Fix spacing with more architectures.

56. By William Grant

Don't always use the current series when comparing test rebuilds to the primary archive. We rebuild old series too.

Preview Diff

=== modified file 'source/build_status.html'
--- source/build_status.html 2013-05-28 21:39:21 +0000
+++ source/build_status.html 2021-08-13 16:24:09 +0000
@@ -4,7 +4,7 @@
 {#- Macro definitions -#}
 {% macro legend_row(row_class, field, field_desc) -%}
   <tr class="{{ row_class }}" style="text-align: center;">
-    <td class="{{ field }}">arch ({{ field[0] }})</td>
+    <td class="{{ field }}">arch ({{ abbrs[field] }})</td>
     <td style="text-align: left;">{{ field_desc }}</td>
     {% for arch in arch_list -%}
     {% if stats[field][arch].cnt -%}
@@ -26,9 +26,9 @@
     <col width="25%" />
     <col width="24%" />
     <col width="2%" />
-    <col width="11%" span="{{ arch_list|count }}" />
-    <col width="5%" />
-    <col width="5%" />
+    <col width="8%" span="{{ arch_list|count }}" />
+    <col width="6%" />
+    <col width="6%" />
   </colgroup>
   <thead>
     <tr>
@@ -79,7 +79,7 @@
       {% if arch in ver.logs -%}
       <td class="{{ ver.logs[arch].buildstate }}"
         onmouseover="Tip('{{ ver.logs[arch].tooltip|e }}')" onmouseout="UnTip()">
-        <a href="{{ ver.logs[arch].url }}">{{ arch }} ({{ ver.logs[arch].buildstate[0] }})</a>
+        <a href="{{ ver.logs[arch].url }}">{{ arch }} ({{ abbrs[ver.logs[arch].buildstate] }})</a>
         &rarr;
         <a href="{{ ver.logs[arch].log }}">Log</a>
       </td>
@@ -130,16 +130,25 @@
 
 <p>Jump to packageset:
 {% for (ps, lst) in packagesets.items()|sort if lst|count > 0 -%}
-<a href="#{{ ps }}">{{ ps }}</a> ({{ lst|count }})
+<a href="#{{ ps }}-pkgset">{{ ps }}</a> ({{ lst|count }})
 {% endfor -%}
 </p>
+<p>Jump to team:
+{% for (team, lst) in teams.items()|sort if lst|count > 0 -%}
+<a href="#{{ team }}-team">{{ team }}</a> ({{ lst|count }})
+{% endfor -%}
+</p>
+
+{% if notice %}
+{{ notice }}
+{% endif %}
 
 <h2>Legend and statistics for the displayed data</h2>
 <table class="grid" width="95%">
 <colgroup>
   <col width="6%" />
   <col width="45%" />
-  <col width="11%" span="{{ arch_list|count }}" />
+  <col width="8%" span="{{ arch_list|count }}" />
 </colgroup>
 <thead>
   <tr>
@@ -159,6 +168,7 @@
 </thead>
 <tbody>
 {{ legend_row('even', 'FAILEDTOBUILD', 'Package failed to build') }}
+{{ legend_row('odd', 'CANCELLED', 'Cancelled build') }}
 {{ legend_row('odd', 'MANUALDEPWAIT', 'Package is waiting on another package') }}
 {{ legend_row('even', 'CHROOTWAIT', 'Failure in the chroot') }}
 {{ legend_row('odd', 'UPLOADFAIL', 'Failed to upload') }}
@@ -179,10 +189,16 @@
 
 <h2>Packagesets</h2>
 {% for (ps, lst) in packagesets.items()|sort if lst|count > 0 -%}
-<h3 id="{{ ps }}">{{ ps }}: {{ lst|count }} packages (<a href="#top">top</a>)</h3>
+<h3 id="{{ ps }}-pkgset">{{ ps }}: {{ lst|count }} packages (<a href="#top">top</a>)</h3>
 {{ table(lst, ps, 'Also belongs to another packageset?') }}
 {% endfor -%}
 
+<h2>Teams (main only)</h2>
+{% for (team, lst) in teams.items()|sort if lst|count > 0 -%}
+<h3 id="{{ team }}-team">{{ team }}: {{ lst|count }} packages (<a href="#top">top</a>)</h3>
+{{ table(lst, team, 'Also belongs to another team?') }}
+{% endfor -%}
+
 {% if main_archive %}
 <h2 id="main_superseded">main (superseded): {{ main_superseded|count }} packages (<a href="#top">top</a>)</h2>
 {{ table(main_superseded) }}
@@ -217,6 +233,7 @@
       alt="Valid XHTML 1.1" height="31" width="88"
       style="border-width:0px" /></a>
 </p>
+<<<<<<< TREE
 <p style="font-size:smaller">Source: <a href="https://code.launchpad.net/lp-ftbfs-report">lp:lp-ftbfs-report</a></p>
 <script type="text/javascript" src="source/wz_tooltip.js"></script>
 <script type="text/javascript" src="source/jquery.min.js"></script>
@@ -357,5 +374,8 @@
   /* Build failures raw data */
   var json_data={{ historical_json }};
 </script>
+=======
+<p style="font-size:smaller">Source: <a href="https://code.launchpad.net/~wgrant/lp-ftbfs-report/production">lp:~wgrant/lp-ftbfs-report/production</a></p>
+>>>>>>> MERGE-SOURCE
 </body>
 </html>
 
=== modified file 'source/build_status.py'
--- source/build_status.py 2012-04-14 17:11:03 +0000
+++ source/build_status.py 2021-08-13 16:24:09 +0000
@@ -20,6 +20,8 @@
 #import httplib2
 #httplib2.debuglevel = 1
 
+import os
+import requests
 import sys
 import time
 import apt_pkg
@@ -30,12 +32,13 @@
 from launchpadlib.errors import HTTPError
 from launchpadlib.launchpad import Launchpad
 from operator import (attrgetter, methodcaller)
+from optparse import OptionParser
 
 lp_service = 'production'
 api_version = 'devel'
 default_arch_list = []
 find_tagged_bugs = 'ftbfs'
-apt_pkg.InitSystem()
+apt_pkg.init_system()
 
 class PersonTeam(object):
     _cache = dict()
@@ -77,7 +80,7 @@
     class VersionList(list):
         def append(self, item):
             super(SourcePackage.VersionList, self).append(item)
-            self.sort(key = attrgetter('version'), cmp = apt_pkg.VersionCompare)
+            self.sort(key = attrgetter('version'), cmp = apt_pkg.version_compare)
 
     def __new__(cls, spph):
         try:
@@ -99,6 +102,10 @@
             for ps in srcpkg.packagesets:
                 packagesets_ftbfs[ps].append(srcpkg)
 
+            srcpkg.teams = set([team for (team, srcpkglist) in teams.items() if spph.source_package_name in srcpkglist and spph.component_name == "main"])
+            for team in srcpkg.teams:
+                teams_ftbfs[team].append(srcpkg)
+
             # add to cache
             cls._cache[spph.source_package_name] = srcpkg
104111
@@ -136,6 +143,33 @@
136 else:143 else:
137 return list(self.packagesets.difference((name,)))144 return list(self.packagesets.difference((name,)))
138145
146class MainArchiveBuilds(object):
147 _cache = dict()
148
149 def __new__(cls, main_archive, source, version):
150 try:
151 return cls._cache["%s,%s" % (source, version)]
152 except KeyError:
153 bfm = super(MainArchiveBuilds, cls).__new__(cls)
154 results = {}
155 sourcepubs = main_archive.getPublishedSources(
156 exact_match=True, source_name=source, version=version)
157 for pub in sourcepubs:
158 for build in pub.getBuilds():
159 # assumes sourcepubs are sorted latest release to oldest,
160 # so first record wins
161 if build.arch_tag not in results:
162 results[build.arch_tag] = build.buildstate
163 bfm.results = results
164 # add to cache
165 cls._cache["%s,%s" % (source, version)] = bfm
166
167 return bfm
168
169 @classmethod
170 def clear(cls):
171 cls._cache.clear()
172
139class SPPH(object):173class SPPH(object):
140 _cache = dict() # dict with all SPPH objects174 _cache = dict() # dict with all SPPH objects
141175
@@ -172,6 +206,7 @@
             'Dependency wait': 'MANUALDEPWAIT',
             'Chroot problem': 'CHROOTWAIT',
             'Failed to upload': 'UPLOADFAIL',
+            'Cancelled build': 'CANCELLED',
             }
         self.buildstate = buildstates[build.buildstate]
         self.url = build.web_link
@@ -203,7 +238,7 @@
         return u'Changed-By: %s' % (self.changed_by)
 
 
-def fetch_pkg_list(archive, series, state, last_published, arch_list=default_arch_list, main_archive=None, main_series=None):
+def fetch_pkg_list(archive, series, state, last_published, arch_list=default_arch_list, main_archive=None, main_series=None, release_only=False):
     print "Processing '%s'" % state
 
     cur_last_published = None
@@ -250,30 +285,59 @@
                 version=spph._lp.source_package_version,
                 status='Published')
             spph.current = len(main_publications[:1]) > 0
+        elif release_only:
+            release_publications = archive.getPublishedSources(
+                distro_series=series,
+                pocket='Release',
+                exact_match=True,
+                source_name=spph._lp.source_package_name,
+                version=spph._lp.source_package_version,
+                status='Published')
+            spph.current = len(release_publications[:1]) > 0
+            if not spph.current:
+                release_publications = archive.getPublishedSources(
+                    distro_series=series,
+                    pocket='Release',
+                    exact_match=True,
+                    source_name=spph._lp.source_package_name,
+                    version=spph._lp.source_package_version,
+                    status='Pending')
+                spph.current = len(release_publications[:1]) > 0
         else:
             spph.current = True
 
         if not spph.current:
             print " superseded"
+
+        if main_archive:
+            # If this build failure is not a regression versus the
+            # main archive, do not report it.
+            main_builds = MainArchiveBuilds(main_archive,
+                                            spph._lp.source_package_name,
+                                            spph._lp.source_package_version)
+            try:
+                if main_builds.results[arch] != 'Successfully built':
+                    print " Skipping %s" % build.title
+                    continue
+            except KeyError:
+                pass
+
         SPPH(csp_link).addBuildLog(build)
 
     return cur_last_published
 
 
-def generate_page(archive, series, archs_by_archive, main_archive, template = 'build_status.html', arch_list = default_arch_list):
-    try:
-        out = open('../%s-%s.html' % (archive.name, series.name), 'w')
-    except IOError:
-        return
-
+def generate_page(name, archive, series, archs_by_archive, main_archive, template = 'build_status.html', arch_list = default_arch_list, notice=None, release_only=False):
     # sort the package lists
     filter_ftbfs = lambda pkglist, current: filter(methodcaller('isFTBFS', arch_list, current), sorted(pkglist))
     data = {}
     for comp in ('main', 'restricted', 'universe', 'multiverse'):
         data[comp] = filter_ftbfs(components[comp], True)
-        data['%s_superseded' % comp] = filter_ftbfs(components[comp], False)
+        data['%s_superseded' % comp] = filter_ftbfs(components[comp], False) if not release_only else []
     for pkgset, pkglist in packagesets_ftbfs.items():
         packagesets_ftbfs[pkgset] = filter_ftbfs(pkglist, True)
+    for team, pkglist in teams_ftbfs.items():
+        teams_ftbfs[team] = filter_ftbfs(pkglist, True)
 
     # container object to hold the counts and the tooltip
     class StatData(object):
@@ -284,7 +348,7 @@
 
     # compute some statistics (number of packages for each build failure type)
     stats = {}
-    for state in ('FAILEDTOBUILD', 'MANUALDEPWAIT', 'CHROOTWAIT', 'UPLOADFAIL'):
+    for state in ('FAILEDTOBUILD', 'MANUALDEPWAIT', 'CHROOTWAIT', 'UPLOADFAIL', 'CANCELLED'):
         stats[state] = {}
         for arch in arch_list:
             tooltip = []
@@ -313,31 +377,47 @@
     data['archs_by_archive'] = archs_by_archive
     data['lastupdate'] = time.strftime('%F %T %z')
     data['packagesets'] = packagesets_ftbfs
+<<<<<<< TREE
     data['historical_json'] = generate_historical_json(archive, series)
+=======
+    data['teams'] = teams_ftbfs
+    data['notice'] = notice
+    data['abbrs'] = {
+        'FAILEDTOBUILD': 'F',
+        'CANCELLED': 'X',
+        'MANUALDEPWAIT': 'M',
+        'CHROOTWAIT': 'C',
+        'UPLOADFAIL': 'U',
+    }
+>>>>>>> MERGE-SOURCE
 
     env = Environment(loader=FileSystemLoader('.'))
     template = env.get_template('build_status.html')
     stream = template.render(**data)
+
+    fn = '../%s.html' % name
+    out = open('%s.new' % fn, 'w')
     out.write(stream.encode('utf-8'))
     out.close()
+    os.rename('%s.new' % fn, fn)
 
-def generate_csvfile(archive, series, arch_list = default_arch_list):
-    csvout = open('../%s-%s.csv' % (archive.name, series.name), 'w')
+def generate_csvfile(name, arch_list = default_arch_list):
+    csvout = open('../%s.csv' % name, 'w')
     linetemplate = '%(name)s,%(link)s,%(explain)s\n'
     for comp in components.values():
         for pkg in comp:
             for ver in pkg.versions:
-                for state in ('FAILEDTOBUILD', 'MANUALDEPWAIT', 'CHROOTWAIT', 'UPLOADFAIL'):
+                for state in ('FAILEDTOBUILD', 'MANUALDEPWAIT', 'CHROOTWAIT', 'UPLOADFAIL', 'CANCELLED'):
                     archs = [ arch for (arch, log) in ver.logs.items() if log.buildstate == state ]
                     if archs:
                         log = ver.logs[archs[0]].log
                         csvout.write(linetemplate % {'name': pkg.name, 'link': log,
                             'explain':"[%s] %s" %(', '.join(archs), state)})
 
-def load_timestamps(archive, series):
+def load_timestamps(name):
     '''Load the saved timestamps about the last still published FTBFS build record.'''
     try:
-        timestamp_file = file('%s-%s.json' % (archive.name, series.name), 'r')
+        timestamp_file = file('%s.json' % name, 'r')
         tmp = json.load(timestamp_file)
         timestamps = {}
         for state, timestamp in tmp.items():
@@ -352,11 +432,12 @@
         'Dependency wait': None,
         'Chroot problem': None,
         'Failed to upload': None,
+        'Cancelled build': None,
         }
 
-def save_timestamps(archive, series, timestamps):
+def save_timestamps(name, timestamps):
     '''Save the timestamps of the last still published FTBFS build record into a JSON file.'''
-    timestamp_file = file('%s-%s.json' % (archive.name, series.name), 'w')
+    timestamp_file = file('%s.json' % name, 'w')
     tmp = {}
     for state, timestamp in timestamps.items():
         if timestamp is not None:
@@ -439,6 +520,7 @@
     launchpad = Launchpad.login_anonymously('qa-ftbfs', lp_service, version=api_version)
 
     ubuntu = launchpad.distributions['ubuntu']
+<<<<<<< TREE
     assert len(sys.argv) >= 4
 
     try:
@@ -451,15 +533,44 @@
     except HTTPError:
         print 'Error: %s is not a valid series.' % sys.argv[2]
         sys.exit(1)
+=======
+
+    usage = "usage: %prog [options] <archive> <series> <arch> [<arch> ...]"
+    parser = OptionParser(usage=usage)
+    parser.add_option(
+        "-f", "--filename", dest="name",
+        help="File name prefix for the result.")
+    parser.add_option(
+        "-n", "--notice", dest="notice_file",
+        help="HTML notice file to include in the page header.")
+    parser.add_option(
+        "--release-only", dest="release_only", action="store_true",
+        help="Only include sources currently published in the release pocket.")
+    (options, args) = parser.parse_args()
+    if len(args) < 3:
+        parser.error("Need at least 4 arguments.")
+
+    try:
+        archive = ubuntu.getArchive(name=args[0])
+    except HTTPError:
+        print 'Error: %s is not a valid archive.' % args[0]
+    try:
+        series = ubuntu.getSeries(name_or_version=args[1])
+    except HTTPError:
+        print 'Error: %s is not a valid series.' % args[1]
+
+    if options.name is None:
+        options.name = '%s-%s' % (archive.name, series.name)
+>>>>>>> MERGE-SOURCE
 
     if archive.name != 'primary':
         main_archive = ubuntu.main_archive
-        main_series = ubuntu.current_series
+        main_series = series
     else:
         main_archive = main_series = None
 
     archs_by_archive = dict(main=[], ports=[])
-    for arch in sys.argv[3:]:
+    for arch in args[2:]:
         das = series.getDistroArchSeries(archtag=arch)
         archs_by_archive[das.official and 'main' or 'ports'].append(arch)
     default_arch_list.extend(archs_by_archive['main'])
@@ -472,7 +583,7 @@
     PersonTeam.clear()
     SourcePackage.clear()
     SPPH.clear()
-    last_published = load_timestamps(archive, series)
+    last_published = load_timestamps(options.name)
 
     # list of SourcePackages for each component
     components = {
@@ -491,14 +602,24 @@
         packagesets[ps.name] = ps.getSourcesIncluded(direct_inclusion=False)
         packagesets_ftbfs[ps.name] = [] # empty list to add FTBFS for each package set later
 
-    for state in ('Failed to build', 'Dependency wait', 'Chroot problem', 'Failed to upload'):
-        last_published[state] = fetch_pkg_list(archive, series, state, last_published[state], default_arch_list, main_archive, main_series)
-
-    save_timestamps(archive, series, last_published)
+    teams = requests.get('https://people.canonical.com/~ubuntu-archive/package-team-mapping.json').json()
+
+    # Per team list of FTBFS
+    teams_ftbfs = {team: [] for team in teams}
+
+    for state in ('Failed to build', 'Dependency wait', 'Chroot problem', 'Failed to upload', 'Cancelled build'):
+        last_published[state] = fetch_pkg_list(archive, series, state, last_published[state], default_arch_list, main_archive, main_series, options.release_only)
+
+    save_timestamps(options.name, last_published)
+
+    if options.notice_file:
+        notice = open(options.notice_file).read()
+    else:
+        notice = None
 
     print "Updating Historical database..."
     store_historical(archive, series)
     print "Generating HTML page..."
-    generate_page(archive, series, archs_by_archive, main_archive)
+    generate_page(options.name, archive, series, archs_by_archive, main_archive, notice=notice, release_only=options.release_only)
     print "Generating CSV file..."
-    generate_csvfile(archive, series)
+    generate_csvfile(options.name)
 
=== modified file 'source/style.css'
--- source/style.css 2011-10-16 14:22:34 +0000
+++ source/style.css 2021-08-13 16:24:09 +0000
@@ -52,6 +52,10 @@
     background-color:#cc0000;
 }
 
+.CANCELLED {
+    background-color:#aa0000;
+}
+
 .MANUALDEPWAIT {
     background:#ff9900;
 }
