Merge lp:~milo/launchpad-work-items-tracker/bug996948 into lp:launchpad-work-items-tracker

Proposed by Milo Casagrande
Status: Superseded
Proposed branch: lp:~milo/launchpad-work-items-tracker/bug996948
Merge into: lp:launchpad-work-items-tracker
Diff against target: 3380 lines (+2497/-125) (has conflicts)
32 files modified
all-projects (+64/-2)
burndown-chart (+5/-0)
collect (+160/-66)
collect_jira (+229/-0)
collect_roadmap (+301/-0)
css/status.css (+58/-0)
generate-all (+53/-14)
html-report (+102/-0)
jira.py (+55/-0)
kanban-papyrs-to-jira (+397/-0)
lpworkitems/collect.py (+20/-20)
lpworkitems/collect_roadmap.py (+71/-0)
lpworkitems/database.py (+74/-2)
lpworkitems/error_collector.py (+10/-0)
lpworkitems/factory.py (+32/-1)
lpworkitems/models.py (+32/-14)
lpworkitems/models_roadmap.py (+60/-0)
lpworkitems/tests/test_collect.py (+0/-1)
lpworkitems/tests/test_collect_roadmap.py (+69/-0)
lpworkitems/tests/test_factory.py (+1/-1)
lpworkitems/tests/test_models.py (+17/-2)
report_tools.py (+162/-1)
roadmap-bp-chart (+249/-0)
roadmap_health.py (+102/-0)
templates/base.html (+13/-0)
templates/body.html (+4/-1)
templates/roadmap_card.html (+71/-0)
templates/roadmap_lane.html (+60/-0)
templates/util.html (+9/-0)
tests.py (+1/-0)
themes/linaro/templates/footer.html (+10/-0)
utils.py (+6/-0)
Text conflict in collect
Text conflict in report_tools.py
To merge this branch: bzr merge lp:~milo/launchpad-work-items-tracker/bug996948
Reviewer: Linaro Infrastructure (status: Pending; no review type or request date recorded)
Review via email: mp+128653@code.launchpad.net

This proposal has been superseded by a proposal from 2012-10-09.

Description of the change

Merge proposal to fix the integrity error we receive from status.l.o.

Changes done:
- catch the integrity error and print a warning with the name of the duplicated Blueprint
- skip that Blueprint and continue; otherwise the script would fail later on because a variable is never set (sketched below)
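A minimal sketch of the approach, mirroring the change in the 'collect' hunk further down. Here 'collector' is the existing CollectorStore and 'Blueprint' the model from lpworkitems.models; the wrapper function and the logger name are illustrative only (the real change is inline in lp_import_blueprint_workitems):

    import logging
    import sqlite3

    # Illustrative logger name; collect defines its own logger.
    logger = logging.getLogger("launchpad-work-items")


    def find_blueprint_or_skip(collector, Blueprint, bp_name):
        """Return the stored Blueprint, or None if a duplicate breaks the lookup."""
        try:
            return collector.store.find(
                Blueprint, Blueprint.name == bp_name).one()
        except sqlite3.IntegrityError:
            # Without this clause model_bp would never be set and the script
            # would crash further down; warn and let the caller skip this
            # blueprint instead.
            logger.warn('Duplicated Blueprint found: %s. It will not be '
                        'considered.' % bp_name)
            return None

Revision 344 (which supersedes the diff shown here) reportedly routes the same message through the error collector and adds the Blueprint URL, but that change is not part of this preview.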

344. By Milo Casagrande

Used error logger, added URL to Blueprint.

Unmerged revisions

344. By Milo Casagrande

Used error logger, added URL to Blueprint.

343. By Milo Casagrande

Catch integrity error, and print warning.

342. By James Tunnicliffe

Removed template-driven escaping from some pre-defined HTML so it renders correctly

341. By James Tunnicliffe

Fixed errors parsing meta values containing a colon (see the illustration after this revision list)

340. By Milo Casagrande

Merged lp:~lool/launchpad-work-items-tracker/fix-line-wrap-breakage.

339. By Milo Casagrande

Merge lp:~lool/launchpad-work-items-tracker/jira-support.

338. By Данило Шеган

Merge HTML escaping fix from trunk r301 (by Martin Pitt).

337. By Данило Шеган

Fix problem with graph generation for milestones in the far future. Patch by Loic.

336. By Milo Casagrande

Merge lp:~milo/launchpad-work-items-tracker/bug1017878.

335. By James Tunnicliffe

Merge in migration tools

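Revision 341 above changes parse_meta_item in 'collect' from rsplit(':', 1) to split(':', 1), as visible in the diff below. A small illustration of why that matters for values that themselves contain a colon; the example line is made up:

    # old behaviour: split at the *last* colon, which mangles the value
    line = "Roadmap: http://cards.linaro.org/browse/CARD-1"
    key, value = line.rsplit(':', 1)
    # -> key == 'Roadmap: http', value == '//cards.linaro.org/browse/CARD-1'

    # new behaviour: split at the *first* colon, keeping the value intact
    key, value = line.split(':', 1)
    # -> key == 'Roadmap', value == ' http://cards.linaro.org/browse/CARD-1'
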
Preview Diff

=== modified file 'all-projects'
--- all-projects 2012-08-27 15:32:43 +0000
+++ all-projects 2012-10-09 09:20:30 +0000
@@ -14,9 +14,27 @@
14import sys14import sys
15import datetime15import datetime
1616
17import report_tools
18
1719
18def collect(source_dir, db_file, config_file, extra_args):20def collect(source_dir, db_file, config_file, extra_args):
19 args = [os.path.join(source_dir, "collect")]21 return run_collect_script(source_dir, db_file, config_file, extra_args,
22 "collect")
23
24
25def collect_jira(source_dir, db_file, config_file, extra_args):
26 return run_collect_script(source_dir, db_file, config_file, extra_args,
27 "collect_jira")
28
29
30def collect_roadmap(source_dir, db_file, config_file, extra_args):
31 return run_collect_script(source_dir, db_file, config_file, extra_args,
32 "collect_roadmap")
33
34
35def run_collect_script(source_dir, db_file, config_file, extra_args,
36 collect_script):
37 args = [os.path.join(source_dir, collect_script)]
20 args.extend(["-d", db_file])38 args.extend(["-d", db_file])
21 args.extend(["-c", config_file])39 args.extend(["-c", config_file])
22 args += extra_args40 args += extra_args
@@ -56,9 +74,14 @@
5674
5775
58def main():76def main():
59 parser = optparse.OptionParser(usage="%prog <database dir> <www root dir> [www root url]")77 parser = optparse.OptionParser(
78 usage="%prog <database dir> <www root dir> [www root url]")
60 parser.add_option("--config-dir", dest="config_dir", default="config")79 parser.add_option("--config-dir", dest="config_dir", default="config")
80 parser.add_option("--roadmap-config-file", dest="roadmap_config_file",
81 default="roadmap-config")
61 parser.add_option("--debug", dest="debug", action="store_true")82 parser.add_option("--debug", dest="debug", action="store_true")
83 parser.add_option("--kanban-token-file", dest="kanban_token_file")
84 parser.add_option("--papyrs-token-file", dest="papyrs_token_file")
62 opts, args = parser.parse_args()85 opts, args = parser.parse_args()
6386
64 if os.environ.get("DEBUG", None) is not None:87 if os.environ.get("DEBUG", None) is not None:
@@ -88,6 +111,11 @@
88 os.path.join(source_dir, opts.config_dir, "*%s" % valid_config_suffix))111 os.path.join(source_dir, opts.config_dir, "*%s" % valid_config_suffix))
89112
90 for config_file in filenames:113 for config_file in filenames:
114 # read roadmap config to find where to get cards from
115 cfg = report_tools.load_config(config_file)
116 # default to kanbantool
117 cards_source = cfg.get('cards_source', 'kanban')
118
91 project_name = os.path.basename(config_file)[:-len(valid_config_suffix)]119 project_name = os.path.basename(config_file)[:-len(valid_config_suffix)]
92 project_output_dir = os.path.join(output_dir, project_name)120 project_output_dir = os.path.join(output_dir, project_name)
93 db_file = os.path.join(db_dir, "%s.db" % project_name)121 db_file = os.path.join(db_dir, "%s.db" % project_name)
@@ -111,6 +139,40 @@
111 if not collect(source_dir, db_file, config_file, extra_collect_args):139 if not collect(source_dir, db_file, config_file, extra_collect_args):
112 sys.stderr.write("collect failed for %s" % project_name)140 sys.stderr.write("collect failed for %s" % project_name)
113 continue141 continue
142
143 if cards_source == 'jira':
144 extra_collect_jira_args = []
145 if not collect_jira(source_dir, db_file, opts.roadmap_config_file,
146 extra_collect_jira_args):
147 sys.stderr.write("collect_jira failed for %s" % project_name)
148 continue
149 elif cards_source == 'kanban':
150 extra_collect_roadmap_args = []
151 extra_collect_roadmap_args.extend(["--board", '10721'])
152 if opts.kanban_token_file is not None:
153 with open(opts.kanban_token_file) as token_file:
154 token = token_file.read()
155 extra_collect_roadmap_args.extend(["--kanbantoken", token])
156 else:
157 sys.stderr.write("No Kanbantool API token given to "
158 "collect_roadmap for %s" % project_name)
159 if opts.papyrs_token_file is not None:
160 with open(opts.papyrs_token_file) as token_file:
161 token = token_file.read()
162 extra_collect_roadmap_args.extend(["--papyrstoken", token])
163 else:
164 sys.stderr.write("No Papyrs API token given to "
165 "collect_roadmap for %s" % project_name)
166
167 if not collect_roadmap(source_dir, db_file,
168 opts.roadmap_config_file,
169 extra_collect_roadmap_args):
170 sys.stderr.write("collect_roadmap failed for %s" %
171 project_name)
172 else:
173 sys.stderr.write("Unknown cards source %s" % cards_source)
174 continue
175
114 publish_new_db(project_name, project_output_dir, db_file)176 publish_new_db(project_name, project_output_dir, db_file)
115 generate_reports(project_output_dir, config_file, db_file,177 generate_reports(project_output_dir, config_file, db_file,
116 source_dir, extra_generate_args, debug=opts.debug)178 source_dir, extra_generate_args, debug=opts.debug)
117179
=== modified file 'burndown-chart'
--- burndown-chart 2012-08-24 05:29:13 +0000
+++ burndown-chart 2012-10-09 09:20:30 +0000
@@ -312,6 +312,11 @@
312 opts.team or 'all', opts.group or 'none', milestone_collection and milestone_collection.display_name or 'none')312 opts.team or 'all', opts.group or 'none', milestone_collection and milestone_collection.display_name or 'none')
313 sys.exit(0)313 sys.exit(0)
314314
315if date_to_ordinal(end_date) - date_to_ordinal(start_date) > 366:
316 print 'WARNING: date range %s - %s is over a year, not generating a chart' % (
317 start_date, end_date)
318 sys.exit(0)
319
315# title320# title
316if opts.team:321if opts.team:
317 title = '/20' + opts.team322 title = '/20' + opts.team
318323
=== modified file 'collect'
--- collect 2012-07-11 22:36:56 +0000
+++ collect 2012-10-09 09:20:30 +0000
@@ -6,13 +6,27 @@
6# Copyright (C) 2010, 2011 Canonical Ltd.6# Copyright (C) 2010, 2011 Canonical Ltd.
7# License: GPL-37# License: GPL-3
88
9import urllib, re, sys, optparse, smtplib, pwd, os, urlparse
10import logging9import logging
10import optparse
11import os
12import pwd
13import re
14import smtplib
15import sys
16import urllib
17import urlparse
18import sqlite3
19
11from email.mime.text import MIMEText20from email.mime.text import MIMEText
1221
13from launchpadlib.launchpad import Launchpad, EDGE_SERVICE_ROOT22from launchpadlib.launchpad import Launchpad, EDGE_SERVICE_ROOT
1423
15from lpworkitems.collect import CollectorStore, PersonCache, WorkitemParser, bug_wi_states24from lpworkitems.collect import (
25 CollectorStore,
26 PersonCache,
27 WorkitemParser,
28 bug_wi_states,
29 )
16from lpworkitems.database import get_store30from lpworkitems.database import get_store
17from lpworkitems.error_collector import (31from lpworkitems.error_collector import (
18 BlueprintURLError,32 BlueprintURLError,
@@ -26,6 +40,7 @@
26 TeamParticipation,40 TeamParticipation,
27 Workitem,41 Workitem,
28 )42 )
43from utils import unicode_or_None
29import report_tools44import report_tools
3045
3146
@@ -58,20 +73,16 @@
58 """Get a link to the Launchpad API object on the website."""73 """Get a link to the Launchpad API object on the website."""
59 api_link = item.self_link74 api_link = item.self_link
60 parts = urlparse.urlparse(api_link)75 parts = urlparse.urlparse(api_link)
61 link = parts.scheme + "://" + parts.netloc.replace("api.", "") + "/" + parts.path.split("/", 2)[2]76 link = parts.scheme + "://" + parts.netloc.replace("api.", "") + \
77 "/" + parts.path.split("/", 2)[2]
62 return link.decode("utf-8")78 return link.decode("utf-8")
6379
6480
65def unicode_or_None(attr):
66 if attr is None:
67 return attr
68 if isinstance(attr, unicode):
69 return attr
70 return attr.decode("utf-8")
71
72
73import simplejson81import simplejson
82
74_orig_loads = simplejson.loads83_orig_loads = simplejson.loads
84
85
75def loads(something):86def loads(something):
76 return _orig_loads(unicode_or_None(something))87 return _orig_loads(unicode_or_None(something))
77simplejson.loads = loads88simplejson.loads = loads
@@ -96,7 +107,10 @@
96 '''107 '''
97 model_bp = Blueprint.from_launchpad(bp)108 model_bp = Blueprint.from_launchpad(bp)
98 if model_bp.milestone_name not in collector.valid_milestone_names():109 if model_bp.milestone_name not in collector.valid_milestone_names():
99 data_error(web_link(bp), 'milestone "%s" is unknown/invalid' % model_bp.milestone_name, True)110 data_error(
111 web_link(bp),
112 'milestone "%s" is unknown/invalid' % model_bp.milestone_name,
113 True)
100 model_bp = collector.store_blueprint(model_bp)114 model_bp = collector.store_blueprint(model_bp)
101 if model_bp:115 if model_bp:
102 dbg('lp_import_blueprint: added blueprint: %s' % bp.name)116 dbg('lp_import_blueprint: added blueprint: %s' % bp.name)
@@ -116,7 +130,10 @@
116 model_group = BlueprintGroup.from_launchpad(bp)130 model_group = BlueprintGroup.from_launchpad(bp)
117 model_group.area = area131 model_group.area = area
118 if model_group.milestone_name not in collector.valid_milestone_names():132 if model_group.milestone_name not in collector.valid_milestone_names():
119 data_error(web_link(bp), 'milestone "%s" is unknown/invalid' % model_group.milestone, True)133 data_error(
134 web_link(bp),
135 'milestone "%s" is unknown/invalid' % model_group.milestone,
136 True)
120137
121 model_group = collector.store_blueprint_group(model_group)138 model_group = collector.store_blueprint_group(model_group)
122 if model_group is None:139 if model_group is None:
@@ -126,9 +143,9 @@
126 add_dependencies(collector, model_group.name, deps)143 add_dependencies(collector, model_group.name, deps)
127 return model_group144 return model_group
128145
146
129def parse_meta_item(collector, line, bp_name):147def parse_meta_item(collector, line, bp_name):
130 '''Parse a meta information line from a blueprint148 '''Parse a meta information line from a blueprint
131
132 '''149 '''
133150
134 line = line.strip()151 line = line.strip()
@@ -136,18 +153,19 @@
136 return153 return
137154
138 try:155 try:
139 (key, value) = line.rsplit(':', 1)156 (key, value) = line.split(':', 1)
140 key = key.strip()157 key = key.strip()
141 value = value.strip()158 value = value.strip()
142 except ValueError:159 except ValueError:
143 dbg("\tMeta line '%s' can not be parsed" % line)160 dbg("\tMeta line '%s' can not be parsed" % line)
144 return161 return
145162
146 dbg( "\t\tMeta for %s: key='%s' value='%s'" % (bp_name, key, value) )163 dbg("\t\tMeta for %s: key='%s' value='%s'" % (bp_name, key, value))
147 collector.store_meta(key, value, bp_name)164 collector.store_meta(key, value, bp_name)
148165
149166
150def parse_complexity_item(collector, line, bp_name, bp_url, def_milestone, def_assignee):167def parse_complexity_item(collector, line, bp_name, bp_url, def_milestone,
168 def_assignee):
151 line = line.strip()169 line = line.strip()
152 # remove special characters people tend to type170 # remove special characters people tend to type
153 line = re.sub('[^\w -.]', '', line)171 line = re.sub('[^\w -.]', '', line)
@@ -156,9 +174,9 @@
156174
157 dbg("\tParsing complexity line '%s'" % line)175 dbg("\tParsing complexity line '%s'" % line)
158176
159 num = None177 num = None
160 milestone = None178 milestone = None
161 assignee = None179 assignee = None
162180
163 try:181 try:
164 complexity_list = line.split()182 complexity_list = line.split()
@@ -176,7 +194,9 @@
176 dbg('\tComplexity: %s MS: %s Who: %s' % (num, milestone, assignee))194 dbg('\tComplexity: %s MS: %s Who: %s' % (num, milestone, assignee))
177 collector.store_complexity(assignee, num, milestone, bp_name)195 collector.store_complexity(assignee, num, milestone, bp_name)
178 except ValueError:196 except ValueError:
179 data_error(bp_url, "\tComplexity line '%s' could not be parsed %s" % (line, ValueError))197 data_error(bp_url,
198 "\tComplexity line '%s' could not be parsed %s" %
199 (line, ValueError))
180200
181201
182def milestone_extract(text, valid_milestones):202def milestone_extract(text, valid_milestones):
@@ -188,7 +208,9 @@
188 return word208 return word
189 return None209 return None
190210
191def lp_import_blueprint_workitems(collector, bp, distro_release, people_cache=None, projects=None):211
212def lp_import_blueprint_workitems(collector, bp, distro_release,
213 people_cache=None, projects=None):
192 '''Collect work items from a Launchpad blueprint.214 '''Collect work items from a Launchpad blueprint.
193215
194 This includes work items from the whiteboard as well as linked bugs.216 This includes work items from the whiteboard as well as linked bugs.
@@ -202,17 +224,26 @@
202 in_complexity_block = False224 in_complexity_block = False
203 work_items = []225 work_items = []
204226
205 model_bp = collector.store.find(227 try:
206 Blueprint, Blueprint.name == bp.name).one()228 model_bp = collector.store.find(Blueprint,
207 assert model_bp is not None, "Asked to process workitems of %s when it is not in the db" % bp.name229 Blueprint.name == bp.name).one()
208230 except sqlite3.IntegrityError:
209 dbg('lp_import_blueprint_workitems(): processing %s (spec milestone: %s, spec assignee: %s, spec implementation: %s)' % (231 logger.warn('Duplicated Blueprint found: %s. It will not be '
232 'considered.' % bp.name)
233 return
234
235 assert model_bp is not None, \
236 "Asked to process workitems of %s when it is not in the db" % bp.name
237
238 dbg('lp_import_blueprint_workitems(): processing %s (spec milestone: %s,' \
239 ' spec assignee: %s, spec implementation: %s)' % (
210 bp.name, model_bp.milestone_name, model_bp.assignee_name,240 bp.name, model_bp.milestone_name, model_bp.assignee_name,
211 model_bp.implementation))241 model_bp.implementation))
212242
213 valid_milestones = collector.valid_milestone_names()243 valid_milestones = collector.valid_milestone_names()
214 global error_collector244 global error_collector
215 parser = WorkitemParser(245 parser = WorkitemParser(
246<<<<<<< TREE
216 model_bp, model_bp.milestone_name, collector.lp, people_cache=people_cache,247 model_bp, model_bp.milestone_name, collector.lp, people_cache=people_cache,
217 error_collector=error_collector)248 error_collector=error_collector)
218249
@@ -228,19 +259,43 @@
228 if workitems_text:259 if workitems_text:
229 for l in workitems_text.splitlines():260 for l in workitems_text.splitlines():
230 if not in_workitems_block:261 if not in_workitems_block:
262=======
263 model_bp, model_bp.milestone_name, collector.lp,
264 people_cache=people_cache, error_collector=error_collector)
265
266 # Get work items from both the whiteboard and the new workitems_text
267 # property. Once the migration is completed and nobody's using the
268 # whiteboard for work items we can change this to pull work items only
269 # from bp.workitems_text.
270 workitems_text = bp.whiteboard
271 if workitems_text is None:
272 workitems_text = ''
273 if getattr(bp, 'workitems_text', '') != '':
274 workitems_text += "\n" + bp.workitems_text
275 if workitems_text:
276 for l in workitems_text.splitlines():
277 if not in_workitems_block:
278>>>>>>> MERGE-SOURCE
231 m = work_items_re.search(l)279 m = work_items_re.search(l)
232 if m:280 if m:
233 in_workitems_block = True281 in_workitems_block = True
234 dbg('lp_import_blueprint_workitems(): starting work items block at ' + l)282 dbg('lp_import_blueprint_workitems():'
283 ' starting work items block at ' + l)
235 milestone = milestone_extract(m.group(1), valid_milestones)284 milestone = milestone_extract(m.group(1), valid_milestones)
236 dbg(' ... setting milestone to ' + str(milestone))285 dbg(' ... setting milestone to ' + str(milestone))
286<<<<<<< TREE
237 parser.milestone_name = milestone or parser.blueprint.milestone_name287 parser.milestone_name = milestone or parser.blueprint.milestone_name
288=======
289 parser.milestone_name = \
290 milestone or parser.blueprint.milestone_name
291>>>>>>> MERGE-SOURCE
238 continue292 continue
239293
240 if in_workitems_block:294 if in_workitems_block:
241 dbg("\tworkitem (raw): '%s'" % (l.strip()))295 dbg("\tworkitem (raw): '%s'" % (l.strip()))
242 if not l.strip():296 if not l.strip():
243 dbg('lp_import_blueprint_workitems(): closing work items block with line: ' + l)297 dbg('lp_import_blueprint_workitems():'
298 ' closing work items block with line: ' + l)
244 in_workitems_block = False299 in_workitems_block = False
245 parser.milestone_name = parser.blueprint.milestone_name300 parser.milestone_name = parser.blueprint.milestone_name
246 workitem = parser.parse_blueprint_workitem(l)301 workitem = parser.parse_blueprint_workitem(l)
@@ -308,7 +363,8 @@
308 member.name, team.name)363 member.name, team.name)
309 if recursive or team.name in cfg.get('recursive_teams', []):364 if recursive or team.name in cfg.get('recursive_teams', []):
310 _import_teams_recurse(365 _import_teams_recurse(
311 collector, cfg, member, top_level_team_names + [member.name],366 collector, cfg, member,
367 top_level_team_names + [member.name],
312 people_cache=people_cache, recursive=True)368 people_cache=people_cache, recursive=True)
313369
314370
@@ -346,12 +402,14 @@
346 blueprint.status = lp_project.summary or name402 blueprint.status = lp_project.summary or name
347 collector.store_blueprint(blueprint)403 collector.store_blueprint(blueprint)
348404
349 for task in lp_project.searchTasks(status=bug_wi_states.keys(), **cfg['work_item_bugs']):405 for task in lp_project.searchTasks(status=bug_wi_states.keys(),
406 **cfg['work_item_bugs']):
350 id = task.self_link.split('/')[-1]407 id = task.self_link.split('/')[-1]
351 title = task.title.split('"', 1)[1].rstrip('"')408 title = task.title.split('"', 1)[1].rstrip('"')
352 state = bug_wi_states[task.status]409 state = bug_wi_states[task.status]
353 if state is None:410 if state is None:
354 dbg('lp_import_bug_workitems: ignoring #%s: %s (status: %s)' % (id, title, task.status))411 dbg('lp_import_bug_workitems: ignoring #%s: %s (status: %s)' % (
412 id, title, task.status))
355 continue413 continue
356 dbg('lp_import_bug_workitems: #%s: %s (%s)' % (id, title, state))414 dbg('lp_import_bug_workitems: #%s: %s (%s)' % (id, title, state))
357415
@@ -384,14 +442,17 @@
384 milestones.extend([ms for ms in project.all_milestones])442 milestones.extend([ms for ms in project.all_milestones])
385443
386 if 'release' in cfg:444 if 'release' in cfg:
387 lp_project = collector.lp.distributions['ubuntu'].getSeries(name_or_version=cfg['release'])445 lp_project = collector.lp.distributions['ubuntu'].getSeries(
446 name_or_version=cfg['release'])
388 projects.append((lp_project, None))447 projects.append((lp_project, None))
389 add_milestones(lp_project)448 add_milestones(lp_project)
390 else:449 else:
391 assert 'project' in cfg, 'Configuration needs to specify project or release'450 assert 'project' in cfg, \
451 'Configuration needs to specify project or release'
392 lp_project = collector.lp.projects[cfg['project']]452 lp_project = collector.lp.projects[cfg['project']]
393 if 'project_series' in cfg:453 if 'project_series' in cfg:
394 lp_project_series = lp_project.getSeries(name=cfg['project_series'])454 lp_project_series = lp_project.getSeries(
455 name=cfg['project_series'])
395 add_milestones(lp_project_series)456 add_milestones(lp_project_series)
396 else:457 else:
397 lp_project_series = None458 lp_project_series = None
@@ -413,6 +474,10 @@
413 if is_dict and extra_projects[extra_project_name] is not None:474 if is_dict and extra_projects[extra_project_name] is not None:
414 extra_project_series = extra_project.getSeries(475 extra_project_series = extra_project.getSeries(
415 name=extra_projects[extra_project_name])476 name=extra_projects[extra_project_name])
477 if extra_project_series is None:
478 raise AssertionError(
479 "%s has no series named %s"
480 % (extra_project_name, extra_projects[extra_project_name]))
416 add_milestones(extra_project_series)481 add_milestones(extra_project_series)
417 else:482 else:
418 extra_project_series = None483 extra_project_series = None
@@ -453,10 +518,12 @@
453518
454 for project, series in projects:519 for project, series in projects:
455 # XXX: should this be valid_ or all_specifications?520 # XXX: should this be valid_ or all_specifications?
456 project_spec_group_matcher = spec_group_matchers.get(project.name, None)521 project_spec_group_matcher = spec_group_matchers.get(project.name,
522 None)
457 project_bps = project.valid_specifications523 project_bps = project.valid_specifications
458 for bp in project_bps:524 for bp in project_bps:
459 if name_pattern is not None and re.search(name_pattern, bp.name) is None:525 if name_pattern is not None and \
526 re.search(name_pattern, bp.name) is None:
460 continue527 continue
461 if project_spec_group_matcher is not None:528 if project_spec_group_matcher is not None:
462 match = re.search(project_spec_group_matcher, bp.name)529 match = re.search(project_spec_group_matcher, bp.name)
@@ -471,7 +538,8 @@
471 add_blueprint(bp)538 add_blueprint(bp)
472 if series is not None:539 if series is not None:
473 for bp in series.valid_specifications:540 for bp in series.valid_specifications:
474 if name_pattern is not None and re.search(name_pattern, bp.name) is None:541 if name_pattern is not None and \
542 re.search(name_pattern, bp.name) is None:
475 continue543 continue
476 if project_spec_group_matcher is not None:544 if project_spec_group_matcher is not None:
477 match = re.search(project_spec_group_matcher, bp.name)545 match = re.search(project_spec_group_matcher, bp.name)
@@ -492,7 +560,6 @@
492 deps[possible_dep] = possible_deps[possible_dep]560 deps[possible_dep] = possible_deps[possible_dep]
493 if deps:561 if deps:
494 lp_import_spec_group(collector, spec_group, area, deps)562 lp_import_spec_group(collector, spec_group, area, deps)
495
496 lp_import_bug_workitems(lp_project, collector, cfg)563 lp_import_bug_workitems(lp_project, collector, cfg)
497564
498565
@@ -540,7 +607,8 @@
540 if in_section:607 if in_section:
541 result.append([name, status, section])608 result.append([name, status, section])
542 fields = line.strip().split(u'==')609 fields = line.strip().split(u'==')
543 assert not fields[0] # should be empty610 # should be empty
611 assert not fields[0]
544 name = fields[1].strip()612 name = fields[1].strip()
545 section = []613 section = []
546 collect = 1614 collect = 1
@@ -550,7 +618,8 @@
550 in_section = True618 in_section = True
551 collect = 0619 collect = 0
552 fields = line.strip().split(u'||')620 fields = line.strip().split(u'||')
553 assert not fields[0] # should be empty621 # should be empty
622 assert not fields[0]
554 assignee = default_assignee623 assignee = default_assignee
555 istatus = u'todo'624 istatus = u'todo'
556 milestone = None625 milestone = None
@@ -562,11 +631,12 @@
562 desc = fields[which].strip()631 desc = fields[which].strip()
563 if u'status' in field_off:632 if u'status' in field_off:
564 which = field_off[u'status']633 which = field_off[u'status']
565 status_search = [ fields[which] ]634 status_search = [fields[which]]
566 else:635 else:
567 status_search = fields[2:]636 status_search = fields[2:]
568 for f in status_search:637 for f in status_search:
569 if u'DONE' in f or u'POSTPONED' in f or u'TODO' in f or u'INPROGRESS' in f or u'BLOCKED' in f:638 if u'DONE' in f or u'POSTPONED' in f or u'TODO' in f or \
639 u'INPROGRESS' in f or u'BLOCKED' in f:
570 ff = f.split()640 ff = f.split()
571 if len(ff) == 2:641 if len(ff) == 2:
572 assignee = ff[1]642 assignee = ff[1]
@@ -615,14 +685,17 @@
615 for url, default_assignee in cfg.get('moin_pages', {}).iteritems():685 for url, default_assignee in cfg.get('moin_pages', {}).iteritems():
616 url = unicode_or_None(url)686 url = unicode_or_None(url)
617 default_assignee = unicode_or_None(default_assignee)687 default_assignee = unicode_or_None(default_assignee)
618 dbg('moin_import(): processing %s (default assignee: %s)' % (url, default_assignee))688 dbg('moin_import(): processing %s (default assignee: %s)' % (
619 for group, status, items in get_moin_workitems_group(url, default_assignee):689 url, default_assignee))
690 for group, status, items in get_moin_workitems_group(url,
691 default_assignee):
620 url_clean = url.replace('?action=raw', '')692 url_clean = url.replace('?action=raw', '')
621 name = url_clean.split('://', 1)[1].split('/', 1)[1]693 name = url_clean.split('://', 1)[1].split('/', 1)[1]
622 if group:694 if group:
623 name += u' ' + group695 name += u' ' + group
624 spec_url = u'%s#%s' % (url_clean, escape_url(group))696 spec_url = u'%s#%s' % (url_clean, escape_url(group))
625 dbg(' got group %s: name="%s", url="%s"' % (group, name, spec_url))697 dbg(' got group %s: name="%s", url="%s"' % (
698 group, name, spec_url))
626 else:699 else:
627 spec_url = url_clean700 spec_url = url_clean
628 dbg(' no group: name="%s", url="%s"' % (name, spec_url))701 dbg(' no group: name="%s", url="%s"' % (name, spec_url))
@@ -661,7 +734,8 @@
661 optparser.add_option('-c', '--config',734 optparser.add_option('-c', '--config',
662 help='Path to configuration file', dest='config', metavar='PATH')735 help='Path to configuration file', dest='config', metavar='PATH')
663 optparser.add_option('-p', '--pattern', metavar='REGEX',736 optparser.add_option('-p', '--pattern', metavar='REGEX',
664 help='Regex pattern for blueprint name (optional, mainly for testing)', dest='pattern')737 help='Regex pattern for blueprint name (optional, mainly for testing)',
738 dest='pattern')
665 optparser.add_option('--debug', action='store_true', default=False,739 optparser.add_option('--debug', action='store_true', default=False,
666 help='Enable debugging output in parsing routines')740 help='Enable debugging output in parsing routines')
667 optparser.add_option('--mail', action='store_true', default=False,741 optparser.add_option('--mail', action='store_true', default=False,
@@ -679,41 +753,54 @@
679753
680 return opts, args754 return opts, args
681755
756
682def send_error_mails(cfg):757def send_error_mails(cfg):
683 '''Send data_errors to contacts.758 '''Send data_errors to contacts.
684759
685 Data error contacts are defined in the configuration in the "error_contact"760 Data error contacts are defined in the configuration in the
686 map (which assigns a regexp over spec names to a list of email addresses).761 "project_notification_addresses" map (which assigns project names to a list
687 If no match is found, the error goes to stderr.762 of email addresses). If no address list for a project is found, the error
763 goes to stderr.
688 '''764 '''
689 global error_collector765 global error_collector
690766
691 # sort errors into address buckets767 # sort errors into address buckets
692 emails = {} # email address -> contents768 # email address -> contents
769 emails = {}
693770
694 dbg('mailing %i data errors' % len(error_collector.errors))771 dbg('mailing %i data errors' % len(error_collector.errors))
695 for error in error_collector.errors:772 for error in error_collector.errors:
696 for pattern, addresses in cfg['error_contact'].iteritems():773 project_name = error.get_project_name()
697 if (error.get_blueprint_name() is not None774 if project_name is not None:
698 and re.search(pattern, error.get_blueprint_name())):775 addresses = cfg['project_notification_addresses'][project_name]
699 dbg('spec %s matches error_contact pattern "%s", mailing to %s' % (error.get_blueprint_name(),776 dbg('spec %s is targetted to "%s", mailing to %s' % (
700 pattern, ', '.join(addresses)))777 error.get_blueprint_name(), project_name,
701 for a in addresses:778 ', '.join(addresses)))
702 emails.setdefault(a, '')779 for a in addresses:
703 emails[a] += error.format_for_display() + '\n'780 emails.setdefault(a, '')
704 break781 emails[a] += error.format_for_display() + '\n'
705 else:782 else:
706 print >> sys.stderr, error.format_for_display(), '(no error_contact pattern)'783 print >> sys.stderr, error.format_for_display(), \
784 '(no error_contact pattern)'
707785
708 # send mails786 # send mails
709 for addr, contents in emails.iteritems():787 for addr, contents in emails.iteritems():
710 msg = MIMEText(contents.encode('ascii', 'replace'))788 msg = MIMEText(contents.encode('ascii', 'replace'))
711 msg['Subject'] = 'Errors in work item definitions'789 msg['Subject'] = 'Errors in work item definitions'
712 msg['From'] = 'Launchpad work item tracker <work-items-tracker-hackers@lists.launchpad.net>'790 msg['From'] = 'Launchpad work item tracker ' + \
791 '<work-items-tracker-hackers@lists.launchpad.net>'
713 msg['To'] = addr792 msg['To'] = addr
714 s = smtplib.SMTP()793 s = smtplib.SMTP()
715 s.connect()794 s.connect()
795<<<<<<< TREE
716 s.sendmail('devnull@canonical.com', addr, msg.as_string())796 s.sendmail('devnull@canonical.com', addr, msg.as_string())
797=======
798 s.sendmail(os.environ.get(
799 'EMAIL',
800 pwd.getpwuid(os.geteuid()).pw_name + '@localhost'),
801 addr,
802 msg.as_string())
803>>>>>>> MERGE-SOURCE
717 s.quit()804 s.quit()
718805
719806
@@ -753,22 +840,29 @@
753 bug_status_map[key] = unicode_or_None(val)840 bug_status_map[key] = unicode_or_None(val)
754 bug_wi_states.update(bug_status_map)841 bug_wi_states.update(bug_status_map)
755842
756 lock_path = opts.database + ".collect_lock"843 lock_path = opts.database + ".lock"
757 lock_f = open(lock_path, "wb")844 lock_f = open(lock_path, "wb")
758 if report_tools.lock_file(lock_f) is None:845 if report_tools.lock_file(lock_f) is None:
759 print "Another instance is already running"846 print "Another instance is already running"
760 sys.exit(0)847 sys.exit(0)
761848
762 if "beta" in EDGE_SERVICE_ROOT:849 if "beta" in EDGE_SERVICE_ROOT:
763 lp = Launchpad.login_with('ubuntu-work-items', service_root=EDGE_SERVICE_ROOT.replace("edge.", "").replace("beta", "devel"))850 service_root = EDGE_SERVICE_ROOT
851 service_root = service_root.replace("edge.", "")
852 service_root = service_root.replace("beta", "devel")
853 lp = Launchpad.login_with('ubuntu-work-items',
854 service_root=service_root)
764 else:855 else:
765 lp = Launchpad.login_with('ubuntu-work-items', service_root="production", version="devel")856 lp = Launchpad.login_with('ubuntu-work-items',
857 service_root="production", version="devel")
766858
767 store = get_store(opts.database)859 store = get_store(opts.database)
768 collector = CollectorStore(store, lp, error_collector)860 collector = CollectorStore(store, lp, error_collector)
769861
770 # reset status for current day862 # reset status for current day
771 collector.clear_todays_workitems()863 collector.clear_todays_workitems()
864 # We can delete all blueprints while keeping work items for previous days
865 # because there's no foreign key reference from WorkItem to Blueprint.
772 collector.clear_blueprints()866 collector.clear_blueprints()
773 collector.clear_metas()867 collector.clear_metas()
774 collector.clear_complexitys()868 collector.clear_complexitys()
775869
=== added file 'collect_jira'
--- collect_jira 1970-01-01 00:00:00 +0000
+++ collect_jira 2012-10-09 09:20:30 +0000
@@ -0,0 +1,229 @@
1#!/usr/bin/python
2#
3# Pull items from cards.linaro.org and put them into a database.
4
5import logging
6import optparse
7import os
8import simplejson
9import sys
10import urllib2
11
12import jira
13from lpworkitems.collect_roadmap import (
14 CollectorStore,
15 )
16from lpworkitems.database import get_store
17from lpworkitems.error_collector import (
18 ErrorCollector,
19 StderrErrorCollector,
20 )
21from lpworkitems.models_roadmap import (
22 Lane,
23 Card,
24 )
25import report_tools
26
27
28# An ErrorCollector to collect the data errors for later reporting
29error_collector = None
30
31
32logger = logging.getLogger("linarojira")
33
34JIRA_API_URL = 'http://cards.linaro.org/rest/api/2'
35JIRA_PROJECT_KEY = 'CARD'
36JIRA_ISSUE_BY_KEY_URL = 'http://cards.linaro.org/browse/%s'
37
38
39def dbg(msg):
40 '''Print out debugging message if debugging is enabled.'''
41 logger.debug(msg)
42
43
44def get_json_data(url):
45 data = None
46 try:
47 data = simplejson.load(urllib2.urlopen(url))
48 except urllib2.HTTPError, e:
49 print "HTTP error for url '%s': %d" % (url, e.code)
50 except urllib2.URLError, e:
51 print "Network error for url '%s': %s" % (url, e.reason.args[1])
52 except ValueError, e:
53 print "Data error for url '%s': %s" % (url, e.args[0])
54
55 return data
56
57
58def jira_import(collector, cfg, opts):
59 '''Collect roadmap items from JIRA into DB.'''
60
61 # import JIRA versions as database Lanes
62 result = jira.do_request(opts, 'project/%s/versions' % JIRA_PROJECT_KEY)
63 for version in result:
64 dbg('Adding lane (name = %s, id = %s)' %
65 (version['name'], version['id']))
66 model_lane = Lane(unicode(version['name']), int(version['id']))
67 if model_lane.name == cfg['current_lane']:
68 model_lane.is_current = True
69 else:
70 model_lane.is_current = False
71 collector.store_lane(model_lane)
72
73 # find id of "Sponsor" custom field in JIRA
74 result = jira.do_request(opts, 'field')
75 sponsor_fields = [field for field in result if field['name'] == 'Sponsor']
76 assert len(sponsor_fields) == 1, 'Not a single Sponsor field'
77 sponsor_field_id = sponsor_fields[0]['id']
78
79 # import JIRA issues as database Cards
80 result = jira.do_request(
81 opts, 'search', jql='project = %s' % JIRA_PROJECT_KEY,
82 fields=['summary', 'fixVersions', 'status', 'components',
83 'priority', 'description', 'timetracking', sponsor_field_id])
84 for issue in result['issues']:
85 fields = issue['fields']
86 name = unicode(fields['summary'])
87 card_id = int(issue['id'])
88 key = unicode(issue['key'])
89 fixVersions = fields['fixVersions']
90 if len(fixVersions) == 0:
91 dbg('Skipping card without lane (name = %s, key = %s)' %
92 (name, key))
93 continue
94 # JIRA allows listing multiple versions in fixVersions
95 assert len(fixVersions) == 1
96 lane_id = int(fixVersions[0]['id'])
97
98 dbg('Adding card (name = %s, id = %s, lane_id = %s, key = %s)' %
99 (name, card_id, lane_id, key))
100 model_card = Card(name, card_id, lane_id, key)
101 model_card.status = unicode(fields['status']['name'])
102 components = fields['components']
103 if len(components) == 0:
104 dbg('Skipping card without component (name = %s, key = %s)' %
105 (name, key))
106 # JIRA allows listing multiple components
107 assert len(components) == 1
108 model_card.team = unicode(components[0]['name'])
109 model_card.priority = unicode(fields['priority']['name'])
110 size_fields = []
111 timetracking = fields['timetracking']
112 if 'originalEstimate' in timetracking:
113 size_fields += [
114 'original estimate: %s' % timetracking['originalEstimate']]
115 if 'remainingEstimate' in timetracking:
116 size_fields += [
117 'remaining estimate: %s' % timetracking['remainingEstimate']]
118 model_card.size = unicode(', '.join(size_fields))
119 model_card.sponsor = u''
120 # None if no sponsor is selected
121 if fields[sponsor_field_id] is not None:
122 sponsors = [s['value'] for s in fields[sponsor_field_id]]
123 model_card.sponsor = unicode(', '.join(sorted(sponsors)))
124 model_card.url = JIRA_ISSUE_BY_KEY_URL % key
125 # XXX need to either download the HTML version or convert this to HTML
126 model_card.description = unicode(fields['description'])
127 # acceptance criteria is in the description
128 model_card.acceptance_criteria = u''
129 collector.store_card(model_card)
130 return
131
132########################################################################
133#
134# Program operations and main
135#
136########################################################################
137
138
139def parse_argv():
140 '''Parse CLI arguments.
141
142 Return (options, args) tuple.
143 '''
144 optparser = optparse.OptionParser()
145 optparser.add_option('-d', '--database',
146 help='Path to database', dest='database', metavar='PATH')
147 optparser.add_option('-c', '--config',
148 help='Path to configuration file', dest='config', metavar='PATH')
149 optparser.add_option('--debug', action='store_true', default=False,
150 help='Enable debugging output in parsing routines')
151 optparser.add_option('--mail', action='store_true', default=False,
152 help='Send data errors as email (according to "error_config" map in '
153 'config file) instead of printing to stderr', dest='mail')
154 optparser.add_option('--jira-username', default='robot',
155 help='JIRA username for authentication', dest='jira_username')
156 optparser.add_option('--jira-password', default='cuf4moh2',
157 help='JIRA password for authentication', dest='jira_password')
158
159 (opts, args) = optparser.parse_args()
160
161 if not opts.database:
162 optparser.error('No database given')
163 if not opts.config:
164 optparser.error('No config given')
165
166 return opts, args
167
168
169def setup_logging(debug):
170 ch = logging.StreamHandler()
171 ch.setLevel(logging.INFO)
172 formatter = logging.Formatter("%(message)s")
173 ch.setFormatter(formatter)
174 logger.setLevel(logging.INFO)
175 logger.addHandler(ch)
176 if debug:
177 ch.setLevel(logging.DEBUG)
178 formatter = logging.Formatter(
179 "%(asctime)s - %(name)s - %(levelname)s - %(message)s")
180 ch.setFormatter(formatter)
181 logger.setLevel(logging.DEBUG)
182
183
184def update_todays_blueprint_daily_count_per_state(collector):
185 """Clear today's entries and create them again to reflect the current
186 state of blueprints."""
187 collector.clear_todays_blueprint_daily_count_per_state()
188 collector.store_roadmap_bp_count_per_state()
189
190
191def main():
192 report_tools.fix_stdouterr()
193
194 (opts, args) = parse_argv()
195 opts.jira_api_url = JIRA_API_URL
196
197 setup_logging(opts.debug)
198
199 global error_collector
200 if opts.mail:
201 error_collector = ErrorCollector()
202 else:
203 error_collector = StderrErrorCollector()
204
205 cfg = report_tools.load_config(opts.config)
206
207 lock_path = opts.database + ".lock"
208 lock_f = open(lock_path, "wb")
209 if report_tools.lock_file(lock_f) is None:
210 print "Another instance is already running"
211 sys.exit(0)
212
213 store = get_store(opts.database)
214 collector = CollectorStore(store, '', error_collector)
215
216 collector.clear_lanes()
217 collector.clear_cards()
218
219 jira_import(collector, cfg, opts)
220
221 update_todays_blueprint_daily_count_per_state(collector)
222
223 store.commit()
224
225 os.unlink(lock_path)
226
227
228if __name__ == '__main__':
229 main()
0230
=== added file 'collect_roadmap'
--- collect_roadmap 1970-01-01 00:00:00 +0000
+++ collect_roadmap 2012-10-09 09:20:30 +0000
@@ -0,0 +1,301 @@
1#!/usr/bin/python
2#
3# Pull items from the Linaro roadmap in Kanbantool and put them into a database.
4
5import logging
6import optparse
7import os
8import simplejson
9import sys
10import urllib2
11
12from lpworkitems.collect_roadmap import (
13 CollectorStore,
14 get_json_item,
15 lookup_kanban_priority,
16 )
17from lpworkitems.database import get_store
18from lpworkitems.error_collector import (
19 ErrorCollector,
20 StderrErrorCollector,
21 )
22from lpworkitems.models_roadmap import (
23 Lane,
24 Card,
25 )
26from utils import unicode_or_None
27import report_tools
28
29
30# An ErrorCollector to collect the data errors for later reporting
31error_collector = None
32
33
34logger = logging.getLogger("linaroroadmap")
35
36
37def dbg(msg):
38 '''Print out debugging message if debugging is enabled.'''
39 logger.debug(msg)
40
41
42def get_kanban_url(item_url, api_token):
43 base_url = 'https://linaro.kanbantool.com/api/v1'
44 return "%s/%s?api_token=%s" % (base_url, item_url, api_token)
45
46
47def get_json_data(url):
48 data = None
49 try:
50 data = simplejson.load(urllib2.urlopen(url))
51 except urllib2.HTTPError, e:
52 print "HTTP error for url '%s': %d" % (url, e.code)
53 except urllib2.URLError, e:
54 print "Network error for url '%s': %s" % (url, e.reason.args[1])
55 except ValueError, e:
56 print "Data error for url '%s': %s" % (url, e.args[0])
57
58 return data
59
60
61def kanban_import_lanes(collector, workflow_stages, cfg):
62 nodes = {}
63 root_node_id = None
64 lanes_to_ignore = ['Legend']
65
66 # Iterate over all workflow_stages which may be in any order.
67 for workflow_stage in workflow_stages:
68 if workflow_stage['name'] in lanes_to_ignore:
69 dbg("Ignoring lane %s." % workflow_stage['name'])
70 continue
71 parent_id = workflow_stage['parent_id']
72 if parent_id is None:
73 assert root_node_id is None, 'We have already found the root node.'
74 root_node_id = workflow_stage['id']
75 else:
76 if parent_id not in nodes:
77 nodes[parent_id] = []
78 # Add child workflow_stage
79 nodes[parent_id].append(workflow_stage)
80
81 statuses = []
82 for node in nodes[root_node_id]:
83 assert node['parent_id'] == root_node_id
84 model_lane = Lane(get_json_item(node, 'name'),
85 node['id'])
86 if model_lane.name == cfg['current_lane']:
87 model_lane.is_current = True
88 else:
89 model_lane.is_current = False
90 collector.store_lane(model_lane)
91 node_id = node['id']
92 if node_id in nodes:
93 statuses.extend(nodes[node_id])
94 return statuses
95
96
97def kanban_import_cards(collector, tasks, status_list, card_types, papyrs_token):
98 types_to_ignore = ['Summits']
99 for task in tasks:
100 dbg("Collecting card '%s'." % (task['task']['name']))
101 status_id = task['task']['workflow_stage_id']
102 assert status_id is not None
103 task_status = None
104 for status in status_list:
105 if status['id'] == status_id:
106 task_status = status
107 break
108 card_type_id = task['task']['card_type_id']
109 card_type_name = None
110 for card_type in card_types:
111 if card_type['id'] == card_type_id:
112 card_type_name = card_type['name']
113 break
114 else:
115 dbg("Cannot find type for card '%s'." % (task['task']['name']))
116 if card_type_name in types_to_ignore:
117 dbg("Ignoring card '%s' since it\'s type is '%s'." % \
118 (task['task']['name'], card_type['name']))
119 else:
120 if task_status is not None:
121 lane_id = task_status['parent_id']
122 assert lane_id is not None
123 else:
124 lane_id = status_id
125 if not collector.lane_is_collected(lane_id):
126 dbg("Ignoring card '%s' since it\'s Lane is ignored." % \
127 (task['task']['name']))
128 continue
129 model_card = Card(get_json_item(task['task'], 'name'),
130 task['task']['id'], lane_id,
131 get_json_item(task['task'], 'external_id'))
132 if task_status is not None:
133 model_card.status = get_json_item(task_status, 'name')
134 model_card.team = unicode_or_None(card_type_name)
135 model_card.priority = lookup_kanban_priority(
136 task['task']['priority'])
137 model_card.size = get_json_item(task['task'], 'size_estimate')
138 model_card.sponsor = get_json_item(task['task'],
139 'custom_field_1')
140
141 external_link = task['task']['custom_field_2']
142 if external_link is not None and external_link is not '':
143 model_card.url = unicode_or_None(external_link)
144 dbg('Getting Papyrs information from %s.' % external_link)
145 papyrs_data = papyrs_import(collector, external_link, papyrs_token)
146 model_card.description = get_json_item(papyrs_data,
147 'description')
148 model_card.acceptance_criteria = get_json_item(
149 papyrs_data, 'acceptance_criteria')
150 collector.store_card(model_card)
151
152
153def kanban_import(collector, cfg, board_id, api_token, papyrs_token):
154 '''Collect roadmap items from KanbanTool into DB.'''
155 board_url = get_kanban_url('boards/%s.json' % board_id, api_token)
156 board = get_json_data(board_url)
157 assert board is not None, "Could not access board %s." % board_id
158 card_types = board['board']['card_types']
159 status_list = kanban_import_lanes(collector,
160 board['board']['workflow_stages'], cfg)
161
162 tasks_url = get_kanban_url('boards/%s/tasks.json' % board_id, api_token)
163 tasks = get_json_data(tasks_url)
164 kanban_import_cards(collector, tasks, status_list, card_types, papyrs_token)
165
166
167def papyrs_import(collector, requirement_url, papyrs_token):
168 description = None
169 acceptance_criteria = None
170
171 page = get_json_data(requirement_url + '?json&auth_token=%s' % papyrs_token)
172 if page is None:
173 return {'description': None,
174 'acceptance_criteria': None}
175
176 page_text_items = page[0]
177 page_extra_items = page[1]
178
179 has_found_description = False
180 last_heading = ''
181 for page_item in page_text_items:
182 if page_item['classname'] == 'Heading':
183 last_heading = page_item['text']
184 if page_item['classname'] == 'Paragraph':
185 if not has_found_description:
186 description = page_item['html']
187 has_found_description = True
188 elif 'Acceptance Criteria' in last_heading:
189 acceptance_criteria = page_item['html']
190
191 return {'description': get_first_paragraph(description),
192 'acceptance_criteria': get_first_paragraph(acceptance_criteria)}
193
194
195def get_first_paragraph(text):
196 if text is None:
197 return None
198 # This might break, depending on what type of line breaks
199 # whoever authors the Papyrs document uses.
200 first_pararaph, _, _ = text.partition('<br>')
201 return first_pararaph
202
203
204########################################################################
205#
206# Program operations and main
207#
208########################################################################
209
210def parse_argv():
211 '''Parse CLI arguments.
212
213 Return (options, args) tuple.
214 '''
215 optparser = optparse.OptionParser()
216 optparser.add_option('-d', '--database',
217 help='Path to database', dest='database', metavar='PATH')
218 optparser.add_option('-c', '--config',
219 help='Path to configuration file', dest='config', metavar='PATH')
220 optparser.add_option('--debug', action='store_true', default=False,
221 help='Enable debugging output in parsing routines')
222 optparser.add_option('--mail', action='store_true', default=False,
223 help='Send data errors as email (according to "error_config" map in '
224 'config file) instead of printing to stderr', dest='mail')
225 optparser.add_option('--board',
226 help='Board id at Kanbantool', dest='board')
227 optparser.add_option('--kanbantoken',
228 help='Kanbantool API token for authentication', dest='kanban_token')
229 optparser.add_option('--papyrstoken',
230 help='Papyrs API token for authentication', dest='papyrs_token')
231
232 (opts, args) = optparser.parse_args()
233
234 if not opts.database:
235 optparser.error('No database given')
236 if not opts.config:
237 optparser.error('No config given')
238
239 return opts, args
240
241
242def setup_logging(debug):
243 ch = logging.StreamHandler()
244 ch.setLevel(logging.INFO)
245 formatter = logging.Formatter("%(message)s")
246 ch.setFormatter(formatter)
247 logger.setLevel(logging.INFO)
248 logger.addHandler(ch)
249 if debug:
250 ch.setLevel(logging.DEBUG)
251 formatter = logging.Formatter(
252 "%(asctime)s - %(name)s - %(levelname)s - %(message)s")
253 ch.setFormatter(formatter)
254 logger.setLevel(logging.DEBUG)
255
256
257def update_todays_blueprint_daily_count_per_state(collector):
258 """Clear today's entries and create them again to reflect the current
259 state of blueprints."""
260 collector.clear_todays_blueprint_daily_count_per_state()
261 collector.store_roadmap_bp_count_per_state()
262
263
264def main():
265 report_tools.fix_stdouterr()
266
267 (opts, args) = parse_argv()
268
269 setup_logging(opts.debug)
270
271 global error_collector
272 if opts.mail:
273 error_collector = ErrorCollector()
274 else:
275 error_collector = StderrErrorCollector()
276
277 cfg = report_tools.load_config(opts.config)
278
279 lock_path = opts.database + ".lock"
280 lock_f = open(lock_path, "wb")
281 if report_tools.lock_file(lock_f) is None:
282 print "Another instance is already running"
283 sys.exit(0)
284
285 store = get_store(opts.database)
286 collector = CollectorStore(store, '', error_collector)
287
288 collector.clear_lanes()
289 collector.clear_cards()
290
291 kanban_import(collector, cfg, opts.board, opts.kanban_token, opts.papyrs_token)
292
293 update_todays_blueprint_daily_count_per_state(collector)
294
295 store.commit()
296
297 os.unlink(lock_path)
298
299
300if __name__ == '__main__':
301 main()
0302
=== modified file 'css/status.css'
--- css/status.css 2011-05-18 20:51:50 +0000
+++ css/status.css 2012-10-09 09:20:30 +0000
@@ -46,6 +46,64 @@
46 font-size: 1.2em;46 font-size: 1.2em;
47}47}
4848
49
50.roadmap_progress_text {
51 position: absolute;
52 top:0; left:0;
53
54 padding-top: 5px;
55
56 color: #ffffff;
57 text-align: center;
58 width: 100%;
59}
60
61.roadmap_wrap {
62 border: 1px solid black;
63 position: relative;
64 margin-top: 2px;
65 margin-bottom: 3px;
66 margin-left: auto;
67 margin-right: auto;
68 background-color: #bdbdbd;
69}
70
71.roadmap_wrap, .roadmap_value {
72 width: 300px;
73 height: 28px;
74}
75
76table .roadmap_wrap, table .roadmap_value {
77 border: 0px;
78 width: 155px;
79 height: 1.4em;
80 background-color: #ffffff;
81}
82
83.roadmap_value {
84 float: left;
85}
86
87.roadmap_value .Completed {
88 background-color: green;
89 height: inherit
90}
91
92.roadmap_value .Blocked {
93 background-color: red;
94 height: inherit
95}
96
97.roadmap_value .InProgress {
98 background-color: gray;
99 height: inherit
100}
101
102.roadmap_value .Planned {
103 background-color: orange;
104 height: inherit
105}
106
49.progress_wrap {107.progress_wrap {
50 position: relative;108 position: relative;
51 border: 1px solid black;109 border: 1px solid black;
52110
=== modified file 'generate-all'
--- generate-all 2011-09-09 07:14:11 +0000
+++ generate-all 2012-10-09 09:20:30 +0000
@@ -6,7 +6,7 @@
6# Copyright (C) 2010, 2011 Canonical Ltd.6# Copyright (C) 2010, 2011 Canonical Ltd.
7# License: GPL-37# License: GPL-3
88
9import optparse, os.path, sys9import optparse, os.path, sys, errno
1010
11import report_tools11import report_tools
1212
@@ -47,13 +47,7 @@
47 burnup_chart_teams = []47 burnup_chart_teams = []
48 primary_team = None48 primary_team = None
4949
50lock_path = opts.database + ".generate_lock"50lock_path = opts.database + ".lock"
51lock_f = open(lock_path, "wb")
52if report_tools.lock_file(lock_f) is None:
53 print "Another instance is already running"
54 sys.exit(0)
55
56lock_path = opts.database + ".generate_lock"
57lock_f = open(lock_path, "wb")51lock_f = open(lock_path, "wb")
58if report_tools.lock_file(lock_f) is None:52if report_tools.lock_file(lock_f) is None:
59 print "Another instance is already running"53 print "Another instance is already running"
@@ -80,14 +74,60 @@
80usersubdir = os.path.join(opts.output_dir, 'u')74usersubdir = os.path.join(opts.output_dir, 'u')
81try:75try:
82 os.mkdir(usersubdir)76 os.mkdir(usersubdir)
83except OSError:77except OSError as exc:
84 None78 if exc.errno == errno.EEXIST:
79 pass
80 else:
81 raise
8582
86groupssubdir = os.path.join(opts.output_dir, 'group')83groupssubdir = os.path.join(opts.output_dir, 'group')
87try:84try:
88 os.mkdir(groupssubdir)85 os.mkdir(groupssubdir)
89except OSError:86except OSError as exc:
90 None87 if exc.errno == errno.EEXIST:
88 pass
89 else:
90 raise
91
92lanessubdir = os.path.join(opts.output_dir, '..', 'lane')
93try:
94 os.mkdir(lanessubdir)
95except OSError as exc:
96 if exc.errno == errno.EEXIST:
97 pass
98 else:
99 raise
100
101cardssubdir = os.path.join(opts.output_dir, '..', 'card')
102try:
103 os.mkdir(cardssubdir)
104except OSError as exc:
105 if exc.errno == errno.EEXIST:
106 pass
107 else:
108 raise
109
110# roadmap lanes
111for lane in report_tools.lanes(store):
112 basename = os.path.join(lanessubdir, lane.name)
113 report_tools.roadmap_pages(my_path, opts.database, basename, opts.config,
114 lane, root=opts.root)
115
116# roadmap cards
117for card in report_tools.cards(store):
118 if card.roadmap_id != '':
119 page_name = card.roadmap_id
120 else:
121 page_name = str(card.card_id)
122 basename = os.path.join(cardssubdir, page_name)
123 report_tools.roadmap_cards(my_path, opts.database, basename, opts.config,
124 card, root=opts.root)
125
126# roadmap front page
127basename = os.path.join(lanessubdir, 'index')
128lane = report_tools.current_lane(store)
129report_tools.roadmap_pages(my_path, opts.database, basename, opts.config,
130 lane, root=opts.root)
91131
92for u in users:132for u in users:
93 for m in milestones:133 for m in milestones:
@@ -156,11 +196,10 @@
156 basename = os.path.join(opts.output_dir, status)196 basename = os.path.join(opts.output_dir, status)
157 report_tools.workitem_list(my_path, opts.database, basename, opts.config, status, root=opts.root)197 report_tools.workitem_list(my_path, opts.database, basename, opts.config, status, root=opts.root)
158198
159# front page199# cycle front page
160basename = os.path.join(opts.output_dir, 'index')200basename = os.path.join(opts.output_dir, 'index')
161report_tools.status_overview(my_path, opts.database, basename, opts.config, root=opts.root)201report_tools.status_overview(my_path, opts.database, basename, opts.config, root=opts.root)
162202
163
164def copy_files(source_dir):203def copy_files(source_dir):
165 for filename in os.listdir(source_dir):204 for filename in os.listdir(source_dir):
166 dest = open(os.path.join(opts.output_dir, filename), 'w')205 dest = open(os.path.join(opts.output_dir, filename), 'w')
167206
=== modified file 'html-report'
--- html-report 2012-06-20 19:54:52 +0000
+++ html-report 2012-10-09 09:20:30 +0000
@@ -10,6 +10,13 @@
1010
11from report_tools import escape_url11from report_tools import escape_url
12import report_tools12import report_tools
13from roadmap_health import (
14 card_health_checks,
15)
16from lpworkitems.models import (
17 ROADMAP_STATUSES_MAP,
18 ROADMAP_ORDERED_STATUSES,
19)
1320
1421
15class WorkitemTarget(object):22class WorkitemTarget(object):
@@ -453,6 +460,97 @@
453 print report_tools.fill_template(460 print report_tools.fill_template(
454 "workitem_list.html", data, theme=opts.theme)461 "workitem_list.html", data, theme=opts.theme)
455462
463 def roadmap_page(self, store, opts):
464 if opts.lane is None:
465 print "<h1>Error, no lane specified.</h1>"
466 if not opts.title:
467 title = opts.lane
468 else:
469 title = opts.title
470
471 data = self.template_data(store, opts)
472 lane = report_tools.lane(store, opts.lane)
473 lanes = report_tools.lanes(store)
474 statuses = []
475 bp_status_totals = {'Completed': 0, 'Total': 0, 'Percentage': 0}
476 for status, cards in report_tools.statuses(store, lane):
477 cards_with_bps = []
478 for card in cards:
479 report_tools.check_card_health(store, card_health_checks, card)
480 blueprint_status_counts = report_tools.card_bp_status_counts(
481 store, card.roadmap_id)
482 total = sum(blueprint_status_counts.values())
483 bp_percentages = dict.fromkeys(ROADMAP_ORDERED_STATUSES, 0)
484 bp_status_totals['Completed'] += \
485 blueprint_status_counts['Completed']
486 for key in ROADMAP_STATUSES_MAP:
487 bp_status_totals['Total'] += blueprint_status_counts[key]
488 if total > 0:
489 bp_percentages[key] = (
490 100.0 * blueprint_status_counts[key] / total)
491
492 cards_with_bps.append({'card': card,
493 'bp_statuses': blueprint_status_counts,
494 'bp_percentages': bp_percentages})
495 statuses.append(dict(name=status, cards=cards_with_bps))
496 if bp_status_totals['Total'] > 0:
497 bp_status_totals['Percentage'] = (100 * bp_status_totals['Completed'] /
498 bp_status_totals['Total'])
499
500 data.update(dict(statuses=statuses))
501 data.update(dict(bp_status_totals=bp_status_totals))
502 data.update(dict(status_order=ROADMAP_ORDERED_STATUSES))
503 data.update(dict(page_type="roadmap_lane"))
504 data.update(dict(lane_title=title))
505 data.update(dict(lanes=lanes))
506 data.update(dict(chart_url=opts.chart_url))
507 print report_tools.fill_template(
508 "roadmap_lane.html", data, theme=opts.theme)
509
510
511 def roadmap_card(self, store, opts):
512 if opts.card is None:
513 print "<h1>Error, no card specified.</h1>"
514
515 data = self.template_data(store, opts)
516 card = report_tools.card(store, int(opts.card)).one()
517 health_checks = report_tools.check_card_health(store, card_health_checks, card)
518 lane = report_tools.lane(store, None, id=card.lane_id)
519
520 if not opts.title:
521 title = card.name
522 else:
523 title = opts.title
524
525 blueprints = report_tools.card_blueprints_by_status(store, card.roadmap_id)
526 bp_status_totals = {'Completed': 0, 'Total': 0, 'Percentage': 0}
527 bp_status_totals['Total'] = (len(blueprints['Planned']) +
528 len(blueprints['Blocked']) +
529 len(blueprints['In Progress']) +
530 len(blueprints['Completed']))
531 bp_status_totals['Completed'] = len(blueprints['Completed'])
532 if bp_status_totals['Total'] > 0:
533 bp_status_totals['Percentage'] = (100 * bp_status_totals['Completed'] /
534 bp_status_totals['Total'])
535
536 card_has_blueprints = bp_status_totals['Total'] > 0
537
538 status_order = ROADMAP_ORDERED_STATUSES[:]
539 status_order.reverse()
540
541 data.update(dict(page_type="roadmap_card"))
542 data.update(dict(card_title=title))
543 data.update(dict(card=card))
544 data.update(dict(health_checks=health_checks))
545 data.update(dict(lane=lane.name))
546 data.update(dict(status_order=status_order))
547 data.update(dict(blueprints=blueprints))
548 data.update(dict(bp_status_totals=bp_status_totals))
549 data.update(dict(card_has_blueprints=card_has_blueprints))
550
551 print report_tools.fill_template(
552 "roadmap_card.html", data, theme=opts.theme)
553
456554
457class WorkitemsOnDate(object):555class WorkitemsOnDate(object):
458556
@@ -531,6 +629,10 @@
531 help="Include all milestones targetted to this date.")629 help="Include all milestones targetted to this date.")
532 optparser.add_option('--theme', dest="theme",630 optparser.add_option('--theme', dest="theme",
533 help="The theme to use.", default="linaro")631 help="The theme to use.", default="linaro")
632 optparser.add_option('--lane',
633 help='Roadmap lane', dest='lane')
634 optparser.add_option('--card',
635 help='Roadmap card', dest='card')
534636
535 (opts, args) = optparser.parse_args()637 (opts, args) = optparser.parse_args()
536 if not opts.database:638 if not opts.database:
537639
=== added file 'jira.py'
--- jira.py 1970-01-01 00:00:00 +0000
+++ jira.py 2012-10-09 09:20:30 +0000
@@ -0,0 +1,55 @@
1#!/usr/bin/python
2# -*- coding: UTF-8 -*-
3
4import base64
5import optparse
6import simplejson
7import urllib2
8
9
10def do_request(opts, relpathname, **kwargs):
11 request = urllib2.Request('%s/%s' % (opts.jira_api_url, relpathname))
12 if opts.jira_username and opts.jira_password:
13 base64string = base64.encodestring(
14 '%s:%s' % (opts.jira_username, opts.jira_password)
15 ).replace('\n', '')
16 request.add_header('Authorization', 'Basic %s' % base64string)
17 request_data = None
18 if kwargs.keys():
19 request.add_header('Content-Type', 'application/json')
20 request_data = simplejson.dumps(kwargs)
21 response_data = urllib2.urlopen(request, request_data)
22 return simplejson.load(response_data)
23
24
25def main():
26 parser = optparse.OptionParser(usage="%prog")
27 parser.add_option("--jira-api-url", dest="jira_api_url",
28 default="http://cards.linaro.org/rest/api/2")
29 parser.add_option("--jira-username", dest="jira_username",
30 default="robot")
31 parser.add_option("--jira-password", dest="jira_password",
32 default="cuf4moh2")
33 opts, args = parser.parse_args()
34
35 # simple search
36 print do_request(opts, 'search', maxResults=1, jql='project = CARD',
37 fields=['summary', 'status'])
38
39 # information about a project
40 #print do_request(opts, 'project/CARD')
41
42 # on creating issues
43 #print do_request(opts, 'issue/createmeta?projectIds=10000')
44
45 # on statuses
46 #print do_request(opts, 'status')
47
48 # on fields
49 #print do_request(opts, 'field')
50
51 # on a security level
52 #print do_request(opts, 'securitylevel/10000')
53
54if __name__ == "__main__":
55 main()
056
=== added file 'kanban-papyrs-to-jira'
--- kanban-papyrs-to-jira 1970-01-01 00:00:00 +0000
+++ kanban-papyrs-to-jira 2012-10-09 09:20:30 +0000
@@ -0,0 +1,397 @@
1#!/usr/bin/python
2# -*- coding: UTF-8 -*-
3# Copyright (C) 2012 Linaro Ltd.
4# Author: Loïc Minier <loic.minier@linaro.org>
5# License: GPL-3
6
7import jira
8
9from bs4 import BeautifulSoup
10import logging
11import optparse
12import os
13import re
14import simplejson
15import sys
16import urllib2
17
18logger = logging.getLogger("linaroroadmap")
19
20def dbg(msg):
21 '''Print out debugging message if debugging is enabled.'''
22 logger.debug(msg)
23
24class InMemCollector:
25 def __init__(self):
26 self.lanes = []
27 self.cards = []
28
29 def store_lane(self, lane):
30 self.lanes.append(lane)
31
32 def store_card(self, card):
33 self.cards.append(card)
34
35 def lane_is_collected(self, lane_id):
36 for l in self.lanes:
37 if l.lane_id == lane_id:
38 return True
39 return False
40
41def kanban_request(opts, relpathname, method='GET', **kwargs):
42 request = urllib2.Request(
43 '%s/%s.json?_m=%s' % (opts.kanban_api_url, relpathname, method))
44 if opts.kanban_token:
45 request.add_header('X-KanbanToolToken', opts.kanban_token)
46 request_data = None
47 if kwargs.keys():
48 request.add_header('Content-Type', 'application/json')
49 request_data = simplejson.dumps(kwargs)
50 print request_data
51 response_data = urllib2.urlopen(request, request_data)
52 return simplejson.load(response_data)
53
54def get_papyrs_page(papyrs_url, token):
55 url = '%s?json&auth_token=%s' % (papyrs_url, token)
56 return simplejson.load(urllib2.urlopen(url))
57
58def get_kanban_boards(opts):
59 return kanban_request(opts, 'boards')
60
61def get_kanban_board(opts, board_id):
62 return kanban_request(opts, 'boards/%s' % board_id)
63
64def get_kanban_tasks(opts, board_id):
65 return kanban_request(opts, 'boards/%s/tasks' % board_id)
66
67def get_kanban_task(opts, board_id, task_id):
68 return kanban_request(opts, 'boards/%s/tasks/%s' % (board_id, task_id))
69
70def put_kanban_task(opts, board_id, task_id, **kwargs):
71 return kanban_request(opts, 'boards/%s/tasks/%s' % (board_id, task_id), method='PUT', **kwargs)
72
73def main():
74 # TODO: add support for passing a card id or papyrs URL
75 parser = optparse.OptionParser(usage="%prog")
76 parser.add_option("--kanban-api-url", dest="kanban_api_url",
77 default="https://linaro.kanbantool.com/api/v1")
78 parser.add_option("--jira-api-url", dest="jira_api_url",
79 default="http://cards.linaro.org/rest/api/2")
80 # defaults are read-only ~linaro-infrastructure tokens
81 parser.add_option("--kanban-token", dest="kanban_token",
82 default="9F209W7Y84TE")
83 parser.add_option("--papyrs-token", dest="papyrs_token",
84 default="868e9088b53c")
85 parser.add_option("--jira-username", dest="jira_username",
86 default="robot")
87 parser.add_option("--jira-password", dest="jira_password",
88 default="cuf4moh2")
89 parser.add_option("--jira-project", dest="jira_project_name",
90 default="CARD")
91 parser.add_option("--jira-issuetype", dest="jira_issuetype_name",
92 default="Roadmap Card")
93 parser.add_option("--board-id", dest="board_id", default="10721")
94 parser.add_option('--debug', action='store_true', default=True,
95 help='Enable debugging output in parsing routines')
96 parser.add_option('--board',
97 help='Board id at Kanban Tool', dest='board', default='10721')
98 opts, args = parser.parse_args()
99
100 if os.environ.get("DEBUG", None) is not None:
101 opts.debug = True
102
103 if len(args) != 0:
104 parser.error("You can not pass any argument")
105
106 if opts.kanban_token is None:
107 sys.stderr.write("No Kanbantool API token given")
108 if opts.papyrs_token is None:
109 sys.stderr.write("No Papyrs API token given")
110
111 # logging setup
112 logger = logging.getLogger()
113 ch = logging.StreamHandler()
114 formatter = logging.Formatter("%(asctime)s %(message)s")
115 ch.setFormatter(formatter)
116 logger.addHandler(ch)
117 if opts.debug:
118 logger.setLevel(logging.DEBUG)
119
120 boards = get_kanban_boards(opts)
121 # dump
122 for board in boards:
123 board = board['board']
124 dbg('Found board "%s" with id %s' % (board['name'], board['id']))
125 dbg('')
126
127 assert 1 == len(filter(lambda b: str(b['board']['id']) == opts.board_id, boards)), \
128 'Expected exactly one board with id %s' % opts.board_id
129
130 board = get_kanban_board(opts, opts.board_id)
131 board = board['board']
132
133 workflow_stages = board['workflow_stages']
134 # ideally order wouldn't matter but the "position" field of our workflow stages
135 # is bogus (always 1) so we can't use it
136 leaf_workflow_stages = []
137 for workflow_stage in workflow_stages:
138 childs = filter(
139 lambda ws: ws['parent_id'] == workflow_stage['id'], workflow_stages)
140 if not childs:
141 # build a name list for leaf workflow stages
142 name = []
143 id = workflow_stage['id']
144 while True:
145 ws = filter(lambda ws: ws['id'] == id, workflow_stages)[0]
146 if ws['name'] is None:
147 break
148 name = [ws['name']] + name
149 id = ws['parent_id']
150 leaf_workflow_stages.append((workflow_stage['id'], name))
151 # dump
152 for id, name in leaf_workflow_stages:
153 pretty_name = "/".join(name)
154 dbg('Found leaf workflow stage %s with id %s' % (pretty_name, id))
155 dbg('')
156
157 card_types = board['card_types']
158 # dump
159 for card_type in card_types:
160 dbg('Found card type %s with id %s' % (card_type['name'], card_type['id']))
161 dbg('')
162
163 def get_leaf_workflow_stage_name(worfklow_stage_id):
164 return [name
165 for id, name
166 in leaf_workflow_stages
167 if id == worfklow_stage_id][0]
168
169 def get_card_type_name(card_type_id):
170 return [card_type['name']
171 for card_type
172 in card_types
173 if card_type['id'] == card_type_id][0]
174
175 def filter_tasks(task):
176 # ignore tasks in Legend and Deferred workflow stages
177 lwsn = get_leaf_workflow_stage_name(task['workflow_stage_id'])
178 if lwsn in (['Legend'], ['Deferred']):
179 dbg('Ignoring task %s in workflow stage %s'
180 % (task['external_id'], "/".join(lwsn)))
181 return False
182 # ignore tasks with Summit and Unknown card type names
183 card_type_name = get_card_type_name(task['card_type_id'])
184 if card_type_name in ('Summits', 'Unknown'):
185 dbg('Ignoring task %s with card type name %s'
186 % (task['external_id'], card_type_name))
187 return False
188 return True
189
190 tasks = get_kanban_tasks(opts, opts.board_id)
191 tasks = [t['task'] for t in tasks if filter_tasks(t['task'])]
192 # dump
193 for task in tasks:
194 dbg('Found task %s with id %s, workflow_stage_id %s, priority %s, '
195 'card_type_id %s, custom_field_2 %s, and external_id %s'
196 % (task['name'], task['id'], task['workflow_stage_id'],
197 task['priority'], task['card_type_id'],
198 task['custom_field_2'], task['external_id']))
199
200 CARD_TYPE_NAMES_TO_PREFIXES = {
201 'LAVA': 'LAVA',
202 'Android': 'ANDROID',
203 'Linux & Ubuntu': 'LINUX',
204 'TCWG': 'TCWG',
205 'GWG': 'GWG',
206 'MMWG': 'MMWG',
207 'KWG': 'KWG',
208 'PMWG': 'PMWG',
209 'OCTO': 'OCTO',
210 }
211
212 # check consistency of external_id with external_link and custom_field_2
213 # (papyrs URL), and of external_id with card_type name
214 for task in tasks:
215 external_id = task['external_id']
216 papyrs_url = task['custom_field_2']
217 external_link = task['external_link']
218 assert papyrs_url == 'https://linaro.papyrs.com/%s' % external_id, \
219 'Incorrect papyrs URL %s for task %s' % (papyrs_url, external_id)
220 assert external_link == 'http://status.linaro.org/card/%s' % external_id, \
221 'Incorrect external_link %s for task %s' % (external_link, external_id)
222 card_type_name = get_card_type_name(task['card_type_id'])
223 prefix = CARD_TYPE_NAMES_TO_PREFIXES[card_type_name]
224 assert external_id.startswith(prefix), \
225 'Incorrect card type prefix %s for task %s' % (prefix, external_id)
226
227 # verify papyrs pages
228 #for task in tasks:
229 for task in []:
230 external_id = task['external_id']
231 papyrs_url = task['custom_field_2']
232 dbg('Fetching card %s' % task['name'])
233 papyrs_json = get_papyrs_page(papyrs_url, opts.papyrs_token)
234
235 try:
236 # number of columns
237 ncols = len(papyrs_json)
238 assert ncols == 2, 'Expected exactly two columns but got %s' % len(ncols)
239
240 # first column
241 col0 = papyrs_json[0]
242 p0 = col0[0]
243 classname = p0['classname']
244 assert classname == 'Heading', \
245 "First paragraph of first column should be a a heading but is %s" % classname
246 assert p0['text'] == p0['html'], \
247 "Expected text (%s) and HTML (%s) to be identical for first heading" % (p0['text'], p0['html'])
248 assert p0['text'] == task['name'], \
249 'Mismatch between first heading (%s) and task (%s)' % (p0['text'], task['name'])
250 for p in col0[1:-2]:
251 assert p['classname'] in ('Heading', 'Paragraph'), \
252 'Got unexpected classname %s' % p['classname']
253 if p['classname'] == 'Heading':
254 assert p['text'] == p['html'], \
255 'Expected heading HTML (%s) to match text (%s)' % (p['html'], p['text'])
256 if p['classname'] == 'Paragraph':
257 soup = BeautifulSoup('<root>%s</root>' % p['html'], 'xml')
258 for tag in soup.root.find_all(True):
259 assert tag.name in ('font', 'b', 'a', 'ul', 'ol', 'li', 'br', 'p', 'span', 'div', 'u'), 'Unexpected tag %s' % tag.name
260
261 # second column
262 pm1 = col0[-1]
263 assert pm1['classname'] == 'Discuss', \
264 'Expect last classname to be Discuss but got %s' % pm1['classname']
265
266 col1 = papyrs_json[1]
267 skip_next_paragraph = False
268 nattachs = 0
269 for p in col1:
270 if p['classname'] in ('Checklist', 'Twitters', 'Navigation'):
271 pass
272 elif p['classname'] == 'Attachment':
273 nattachs += 1
274 elif p['classname'] == 'Heading' and p['text'] == 'Attachments':
275 pass
276 elif p['classname'] == 'Heading' and p['text'] == 'Metadata':
277 skip_next_paragraph = True
278 elif p['classname'] == 'Paragraph' and skip_next_paragraph:
279 skip_next_paragraph = False
280 else:
281 assert False, 'Unexpected paragraph %s' % p
282 if nattachs > 0:
283 dbg('Found %s attachment(s) on card %s' % (nattachs, task['name']))
284 except Exception, e:
285 dbg(e)
286
287 # query jira data
288 jira_project_result = jira.do_request(opts, 'project/%s' % opts.jira_project_name)
289 jira_statuses_result = jira.do_request(opts, 'status')
290 jira_fields_result = jira.do_request(opts, 'field')
291 # not allowed
292 #jira_securitylevels_result = jira.do_request(opts, 'securitylevel')
293 jira_priorities_result = jira.do_request(opts, 'priority')
294
295 def search_jira_id(jira_result, name):
296 return [r['id'] for r in jira_result if r['name'] == name][0]
297
298 # http://cards.linaro.org/rest/api/2/project/CARD has id 10000
299 #jira_project_id = 10000
300 jira_project_id = jira_project_result['id']
301 # issuetype for "Roadmap Card" http://cards.linaro.org/rest/api/2/issuetype/9
302 #jira_issuetype_id = 9
303 jira_issuetype_id = search_jira_id(jira_project_result['issueTypes'], opts.jira_issuetype_name)
304 dbg('Found id %s for %s issueType' % (jira_issuetype_id, opts.jira_issuetype_name))
305 for component in jira_project_result['components']:
306 dbg('Found component %s with id %s' % (component['name'], component['id']))
307 for version in jira_project_result['versions']:
308 dbg('Found version %s with id %s' % (version['name'], version['id']))
309 for status in jira_statuses_result:
310 dbg('Found status %s with id %s' % (status['name'], status['id']))
311
312 TYPE_TO_COMPONENT = {
313 'LAVA': 'LAVA',
314 'Android': 'Android',
315 'Linux & Ubuntu': 'Linux & Ubuntu',
316 'TCWG': 'Toolchain WG',
317 'GWG': 'Graphics WG',
318 'MMWG': 'Multimedia WG',
319 'KWG': 'Kernel WG',
320 'PMWG': 'Power Management WG',
321 'OCTO': 'OCTO',
322 }
323
324 STAGE_TO_STATUS = {
325 'New/Draft': 'New/Drafting',
326 'New/Needs Work': 'New/Drafting',
327 'New/TSC Reviewed': 'New/Reviewed',
328 '2012Q1/Done': 'Approved',
329 '2012Q1/Ready': 'Approved',
330 '2012Q2/Forecast': 'Approved',
331 '2012Q3/Forecast': 'Approved',
332 '2012H2/Forecast': 'Approved',
333 '2013/Forecast': 'Approved',
334 }
335
336 STAGE_TO_VERSION = {
337 'New/Draft': None,
338 'New/Needs Work': None,
339 'New/TSC Reviewed': None,
340 '2012Q1/Done': '2012Q1',
341 '2012Q1/Ready': '2012Q1',
342 '2012Q2/Forecast': '2012Q2',
343 '2012Q3/Forecast': '2012Q3',
344 '2012H2/Forecast': '2012H2',
345 '2013/Forecast': '2013',
346 }
347
348 PRIORITY_MAP = {
349 -1: 'Minor',
350 0: 'Major',
351 1: 'Critical',
352 }
353
354 # actual copy
355 for task in tasks:
356 print task['name']
357 print task['external_id']
358 external_id = task['external_id']
359 papyrs_url = task['custom_field_2']
360 papyrs_json = get_papyrs_page(papyrs_url, opts.papyrs_token)
361 # first column
362 col0 = papyrs_json[0]
363 p0 = col0[0]
364 # assemble HTML of description
365 html = ""
366 for p in col0[1:-1]:
367 if p['classname'] == 'Heading':
368 html += '<h1>%s</h1>\n' % p['text']
369 if p['classname'] == 'Paragraph':
370 html += '%s\n' % p['html']
371 html = '{html}\n%s{html}\n' % html
372
373 stage = "/".join(get_leaf_workflow_stage_name(task['workflow_stage_id']))
374 status = STAGE_TO_STATUS[stage]
375 version = STAGE_TO_VERSION[stage]
376 type_name = get_card_type_name(task['card_type_id'])
377 component = TYPE_TO_COMPONENT[type_name]
378 priority = PRIORITY_MAP[task['priority']]
379
380 fields = {'project': {'id': jira_project_id},
381 'summary': task['name'],
382 'issuetype': {'id': jira_issuetype_id},
383 'description': html,
384 'components': [{'id': search_jira_id(jira_project_result['components'], component)}],
385 search_jira_id(jira_fields_result, 'Alias Card ID'): task['external_id'],
386 # XXX hardcoded default security level; also, can't set security level to Public via API
387 #'security': {'id': search_jira_id(jira_securitylevels_result, 'Public')},
388 'priority': {'id': search_jira_id(jira_priorities_result, priority)},
389 }
390 #'status': search_jira_id(jira_statuses_result, status),
391 if version:
392 fields['fixVersions'] = [{'id': search_jira_id(jira_project_result['versions'], version)}]
393 dbg('Uploading card %s' % fields)
394 print jira.do_request(opts, 'issue', fields=fields)
395
396if __name__ == "__main__":
397 main()
0398
=== modified file 'lpworkitems/collect.py'
--- lpworkitems/collect.py 2011-12-06 15:20:43 +0000
+++ lpworkitems/collect.py 2012-10-09 09:20:30 +0000
@@ -26,6 +26,7 @@
26# "interesting")26# "interesting")
27workitem_precedence = [None, u'done', u'postponed', u'blocked', u'todo', u'inprogress']27workitem_precedence = [None, u'done', u'postponed', u'blocked', u'todo', u'inprogress']
2828
29
29class PersonCache(object):30class PersonCache(object):
30 """A cache of Launchpad accounts."""31 """A cache of Launchpad accounts."""
3132
@@ -91,31 +92,29 @@
91 project_name = self.lp.load(milestone.target.self_link).name92 project_name = self.lp.load(milestone.target.self_link).name
92 existing_milestone = self.store.find(93 existing_milestone = self.store.find(
93 models.Milestone,94 models.Milestone,
94 models.Milestone.name==milestone_name).any()95 models.Milestone.name == milestone_name).any()
95 if existing_milestone is not None:96 if existing_milestone is not None:
96 # We only store a milestone for the first project that we97 # TODO: We now allow for the same milestone in different projects
97 # see it in.98 # to have different due dates (within reasonable limits).
98 # Check that the dates match, otherwise it's very confusing99 # However, the old algorithm which relied on all due dates to match,
100 # only stores a single milestone with due date for the first project
101 # that we see it in (essentially, random one). This is expected to
102 # be elaborated shortly. Then this block can be removed completely,
103 # until then it is left as a reminder.
99 target_date = None104 target_date = None
100 if milestone.date_targeted is not None:105 if milestone.date_targeted is not None:
101 target_date = milestone.date_targeted.strftime("%Y-%m-%d")106 target_date = milestone.date_targeted.strftime("%Y-%m")
102 existing_target_date = existing_milestone.due_date107 existing_target_date = existing_milestone.due_date
103 if isinstance(existing_target_date, datetime.date):108 if isinstance(existing_target_date, datetime.date):
104 existing_target_date = existing_target_date.strftime("%Y-%m-%d")109 existing_target_date = existing_target_date.strftime("%Y-%m")
105 if (target_date and existing_target_date != target_date):110 if (target_date and existing_target_date != target_date):
106 error = MilestoneError(111 existing_milestone.due_date = milestone.date_targeted
107 milestone,112 else:
108 "Milestone %s (%s) has due_date %s but %s already has "113 db_milestone = models.Milestone()
109 "the due date as %s" % (milestone.name, project_name,114 db_milestone.name = milestone_name
110 target_date, existing_milestone.project,115 db_milestone.due_date = milestone.date_targeted
111 existing_target_date))116 db_milestone.project = project_name
112 self.error_collector.store_error(error)117 self.store.add(db_milestone)
113 return
114 db_milestone = models.Milestone()
115 db_milestone.name = milestone_name
116 db_milestone.due_date = milestone.date_targeted
117 db_milestone.project = project_name
118 self.store.add(db_milestone)
119118
120 def store_lp_milestones(self, milestones):119 def store_lp_milestones(self, milestones):
121 if self.store.find(models.Milestone).any() is not None:120 if self.store.find(models.Milestone).any() is not None:
@@ -270,7 +269,7 @@
270 if ']' in desc:269 if ']' in desc:
271 off = desc.index(']')270 off = desc.index(']')
272 assignee_name = desc[1:off]271 assignee_name = desc[1:off]
273 desc = desc[off+1:].strip()272 desc = desc[off + 1:].strip()
274 else:273 else:
275 self.error_collector.record_blueprint_error(274 self.error_collector.record_blueprint_error(
276 self.blueprint,275 self.blueprint,
@@ -314,6 +313,7 @@
314 def get_workitem_if_tracked(self, task, projects=None,313 def get_workitem_if_tracked(self, task, projects=None,
315 distro_release=None):314 distro_release=None):
316 target = self.lp.load(task.target.self_link)315 target = self.lp.load(task.target.self_link)
316
317 def get_rtype(obj):317 def get_rtype(obj):
318 return urllib.splittag(obj.resource_type_link)[1]318 return urllib.splittag(obj.resource_type_link)[1]
319 rtype = get_rtype(target)319 rtype = get_rtype(target)
320320
=== added file 'lpworkitems/collect_roadmap.py'
--- lpworkitems/collect_roadmap.py 1970-01-01 00:00:00 +0000
+++ lpworkitems/collect_roadmap.py 2012-10-09 09:20:30 +0000
@@ -0,0 +1,71 @@
1import datetime
2
3from lpworkitems import models_roadmap
4from utils import unicode_or_None
5
6
7class CollectorStore(object):
8
9 def __init__(self, store, base_url, error_collector):
10 self.store = store
11 self.base_url = base_url
12 self.error_collector = error_collector
13
14 def _clear_all(self, *find_args):
15 self.store.find(*find_args).remove()
16
17 def clear_lanes(self):
18 self._clear_all(models_roadmap.Lane)
19
20 def clear_cards(self):
21 self._clear_all(models_roadmap.Card)
22
23 def store_lane(self, lane):
24 self.store.add(lane)
25
26 def store_card(self, card):
27 self.store.add(card)
28
29 def clear_todays_blueprint_daily_count_per_state(self):
30 self._clear_all(
31 models_roadmap.BlueprintDailyCountPerState,
32 models_roadmap.BlueprintDailyCountPerState.day == datetime.date.today())
33
34 def store_roadmap_bp_count_per_state(self):
35 query = """
36 SELECT implementation, lane_id, count(*)
37 FROM specs
38 JOIN meta on spec = specs.name
39 JOIN card on roadmap_id = value
40 WHERE key = 'Roadmap id'
41 GROUP BY implementation, lane_id
42 """
43 day = datetime.date.today()
44 result = self.store.execute(query)
45 for status, lane_id, count in result:
46 obj = models_roadmap.BlueprintDailyCountPerState()
47 obj.day = day
48 obj.status = status
49 obj.lane_id = lane_id
50 obj.count = count
51 self.store.add(obj)
52
53 def lane_is_collected(self, lane_id):
54 return self.store.find(models_roadmap.Lane, models_roadmap.
55 Lane.lane_id == lane_id).one() is not None
56
57
58def get_json_item(data, item_name):
59 item = data[item_name]
60 if item is not None:
61 item = item.strip()
62 return unicode_or_None(item)
63
64
65def lookup_kanban_priority(numeric_priority):
66 priority_lookup = {-1: "low",
67 0: "normal",
68 1: "high"}
69 assert numeric_priority in priority_lookup, (
70 "Priority '%s' is unknown." % numeric_priority)
71 return unicode_or_None(priority_lookup[numeric_priority])
072
=== modified file 'lpworkitems/database.py'
--- lpworkitems/database.py 2011-06-24 19:09:26 +0000
+++ lpworkitems/database.py 2012-10-09 09:20:30 +0000
@@ -7,13 +7,13 @@
7 store.execute('''CREATE TABLE version (7 store.execute('''CREATE TABLE version (
8 db_layout_ref INT NOT NULL8 db_layout_ref INT NOT NULL
9 )''')9 )''')
10 store.execute('''INSERT INTO version VALUES (10)''')10 store.execute('''INSERT INTO version VALUES (15)''')
1111
12 store.execute('''CREATE TABLE specs (12 store.execute('''CREATE TABLE specs (
13 name VARCHAR(255) PRIMARY KEY,13 name VARCHAR(255) PRIMARY KEY,
14 url VARCHAR(1000) NOT NULL,14 url VARCHAR(1000) NOT NULL,
15 priority CHAR(20),15 priority CHAR(20),
16 implementation CHAR(30),16 implementation CHAR(30) NOT NULL,
17 assignee CHAR(50),17 assignee CHAR(50),
18 team CHAR(50),18 team CHAR(50),
19 status VARCHAR(5000) NOT NULL,19 status VARCHAR(5000) NOT NULL,
@@ -25,6 +25,13 @@
25 roadmap_notes VARCHAR(5000)25 roadmap_notes VARCHAR(5000)
26 )''')26 )''')
2727
28 store.execute('''CREATE TABLE spec_daily_count_per_state (
29 status VARCHAR(5000) NOT NULL,
30 day DATE NOT NULL,
31 lane_id REFERENCES lane(lane_id),
32 count INT NOT NULL
33 )''')
34
28 store.execute('''CREATE TABLE work_items (35 store.execute('''CREATE TABLE work_items (
29 description VARCHAR(1000) NOT NULL,36 description VARCHAR(1000) NOT NULL,
30 spec VARCHAR(255) REFERENCES specs(name),37 spec VARCHAR(255) REFERENCES specs(name),
@@ -90,6 +97,30 @@
90 display_name VARCHAR(50)97 display_name VARCHAR(50)
91 )''')98 )''')
9299
100 store.execute('''CREATE TABLE lane (
101 name VARCHAR(200) NOT NULL,
102 lane_id NOT NULL,
103 is_current BOOLEAN,
104 cards REFERENCES card(card_id)
105 )''')
106
107 store.execute('''CREATE TABLE card (
108 name VARCHAR(200) NOT NULL,
109 card_id NOT NULL,
110 url VARCHAR(200),
111 is_healthy BOOLEAN,
112 status VARCHAR(50),
113 team VARCHAR(50),
114 priority VARCHAR(50),
115 size VARCHAR(50),
116 sponsor VARCHAR(50),
117 contact VARCHAR(50),
118 description BLOB,
119 acceptance_criteria BLOB,
120 roadmap_id VARCHAR(50),
121 lane_id REFERENCES lane(lane_id)
122 )''')
123
93124
94def upgrade_if_needed(store):125def upgrade_if_needed(store):
95 # upgrade DB layout126 # upgrade DB layout
@@ -177,7 +208,47 @@
177 )''')208 )''')
178 store.execute('UPDATE version SET db_layout_ref = 10')209 store.execute('UPDATE version SET db_layout_ref = 10')
179 ver = 10210 ver = 10
211 if ver == 10:
212 store.execute('''CREATE TABLE lane (
213 name VARCHAR(200) NOT NULL,
214 lane_id NOT NULL,
215 cards REFERENCES card(card_id)
216 )''')
180217
218 store.execute('''CREATE TABLE card (
219 name VARCHAR(200) NOT NULL,
220 card_id NOT NULL,
221 status VARCHAR(50),
222 lane_id REFERENCES lane(lane_id)
223 )''')
224 store.execute('UPDATE version SET db_layout_ref = 11')
225 ver = 11
226 if ver == 11:
227 store.execute('ALTER TABLE card ADD COLUMN roadmap_id VARCHAR(50)')
228 store.execute('UPDATE version SET db_layout_ref = 12')
229 ver = 12
230 if ver == 12:
231 store.execute('ALTER TABLE card ADD COLUMN team VARCHAR(50)')
232 store.execute('ALTER TABLE card ADD COLUMN priority VARCHAR(50)')
233 store.execute('ALTER TABLE card ADD COLUMN size VARCHAR(50)')
234 store.execute('ALTER TABLE card ADD COLUMN sponsor VARCHAR(50)')
235 store.execute('ALTER TABLE card ADD COLUMN contact VARCHAR(50)')
236 store.execute('ALTER TABLE card ADD COLUMN description BLOB')
237 store.execute('ALTER TABLE card ADD COLUMN acceptance_criteria BLOB')
238 store.execute('ALTER TABLE lane ADD COLUMN is_current BOOLEAN')
239 store.execute('UPDATE version SET db_layout_ref = 13')
240 if ver == 13:
241 store.execute('ALTER TABLE card ADD COLUMN url VARCHAR(200)')
242 store.execute('ALTER TABLE card ADD COLUMN is_healthy BOOLEAN')
243 store.execute('UPDATE version SET db_layout_ref = 14')
244 if ver == 14:
245 store.execute('''CREATE TABLE spec_daily_count_per_state (
246 status VARCHAR(5000) NOT NULL,
247 day DATE NOT NULL,
248 lane_id REFERENCES lane(lane_id),
249 count INT NOT NULL
250 )''')
251 store.execute('UPDATE version SET db_layout_ref = 15')
181252
182def get_store(dbpath):253def get_store(dbpath):
183 '''Open/initialize database.254 '''Open/initialize database.
@@ -205,5 +276,6 @@
205 store.execute('''CREATE INDEX work_items_date_idx ON work_items (date)''')276 store.execute('''CREATE INDEX work_items_date_idx ON work_items (date)''')
206 store.execute('''CREATE INDEX work_items_status_idx ON work_items (status)''')277 store.execute('''CREATE INDEX work_items_status_idx ON work_items (status)''')
207278
279
208def create_v6_indexes(store):280def create_v6_indexes(store):
209 store.execute('''CREATE INDEX work_items_assignee_milestone_idx on work_items(assignee,milestone)''')281 store.execute('''CREATE INDEX work_items_assignee_milestone_idx on work_items(assignee,milestone)''')
210282
=== modified file 'lpworkitems/error_collector.py'
--- lpworkitems/error_collector.py 2011-06-08 19:30:24 +0000
+++ lpworkitems/error_collector.py 2012-10-09 09:20:30 +0000
@@ -48,6 +48,10 @@
48 """Get the name of the blueprint, or None if not a blueprint."""48 """Get the name of the blueprint, or None if not a blueprint."""
49 return None49 return None
5050
51 def get_project_name(self):
52 """Get the name of the project, or None if not a project."""
53 return None
54
51 def format_for_display(self):55 def format_for_display(self):
52 """Produce a string representation of the Error.56 """Produce a string representation of the Error.
5357
@@ -84,6 +88,9 @@
84 def get_blueprint_name(self):88 def get_blueprint_name(self):
85 return self.blueprint.name89 return self.blueprint.name
8690
91 def get_project_name(self):
92 return self.blueprint.url.split('/')[-3]
93
8794
88class BlueprintURLError(Error):95class BlueprintURLError(Error):
89 """A deprecated class for backwards-compatibility.96 """A deprecated class for backwards-compatibility.
@@ -101,6 +108,9 @@
101 def get_blueprint_name(self):108 def get_blueprint_name(self):
102 return self.blueprint_url.split('/')[-1]109 return self.blueprint_url.split('/')[-1]
103110
111 def get_project_name(self):
112 return self.blueprint_url.split('/')[-3]
113
104114
105class MilestoneError(Error):115class MilestoneError(Error):
106116
107117
=== modified file 'lpworkitems/factory.py'
--- lpworkitems/factory.py 2011-06-14 22:00:21 +0000
+++ lpworkitems/factory.py 2012-10-09 09:20:30 +0000
@@ -11,6 +11,7 @@
11 TeamStructure,11 TeamStructure,
12 Workitem,12 Workitem,
13 )13 )
14from lpworkitems.models_roadmap import BlueprintDailyCountPerState, Card
1415
1516
16class Factory(object):17class Factory(object):
@@ -63,6 +64,8 @@
63 url = self.getUniqueUnicode(prefix=name+"_url")64 url = self.getUniqueUnicode(prefix=name+"_url")
64 if status is None:65 if status is None:
65 status = self.getUniqueUnicode(prefix=name+"_status")66 status = self.getUniqueUnicode(prefix=name+"_status")
67 if implementation is None:
68 implementation = u'Unknown'
66 blueprint.name = name69 blueprint.name = name
67 blueprint.url = url70 blueprint.url = url
68 blueprint.status = status71 blueprint.status = status
@@ -109,8 +112,11 @@
109 self.store.add(workitem)112 self.store.add(workitem)
110 return workitem113 return workitem
111114
112 def make_meta(self, store=True):115 def make_meta(self, key=None, value=None, blueprint=None, store=True):
113 meta = Meta()116 meta = Meta()
117 meta.key = key
118 meta.value = value
119 meta.blueprint = blueprint
114 if store:120 if store:
115 self.store.add(meta)121 self.store.add(meta)
116 return meta122 return meta
@@ -155,3 +161,28 @@
155 if store:161 if store:
156 self.store.add(person)162 self.store.add(person)
157 return person163 return person
164
165 def make_blueprint_daily_count_per_state(self, status=None, count=1,
166 day=None, store=True):
167 if status is None:
168 status = self.getUniqueUnicode()
169 if day is None:
170 day = datetime.date.today()
171 obj = BlueprintDailyCountPerState()
172 obj.day = day
173 obj.status = status
174 obj.count = count
175 obj.lane_id = 1
176 if store:
177 self.store.add(obj)
178 return obj
179
180 def make_card(self, store=True):
181 name = self.getUniqueUnicode()
182 card_id = self.getUniqueInteger()
183 lane_id = self.getUniqueInteger()
184 roadmap_id = self.getUniqueUnicode()
185 card = Card(name, card_id, lane_id, roadmap_id)
186 if store:
187 self.store.add(card)
188 return card
158189
=== modified file 'lpworkitems/models.py'
--- lpworkitems/models.py 2011-12-06 15:20:43 +0000
+++ lpworkitems/models.py 2012-10-09 09:20:30 +0000
@@ -1,15 +1,20 @@
1import datetime1import datetime
2import re2import re
33from utils import unicode_or_None
4from storm.locals import Date, Reference, ReferenceSet, Unicode4
55from storm.locals import Date, Int, Reference, ReferenceSet, Unicode
66
7def unicode_or_None(attr):7ROADMAP_STATUSES_MAP = {
8 if attr is None:8 u'Completed': [u'Implemented'],
9 return attr9 u'Blocked': [u'Needs Infrastructure', u'Blocked', u'Deferred'],
10 if isinstance(attr, unicode):10 u'In Progress': [u'Deployment', u'Needs Code Review',
11 return attr11 u'Beta Available', u'Good progress',
12 return attr.decode("utf-8")12 u'Slow progress', u'Started'],
13 u'Planned': [u'Unknown', u'Not started', u'Informational']}
14
15ROADMAP_ORDERED_STATUSES = ['Completed', 'In Progress', 'Blocked', 'Planned']
16assert set(ROADMAP_ORDERED_STATUSES) == set(ROADMAP_STATUSES_MAP.keys()), (
17 'The roadmap statuses are incorrect: %s' % ROADMAP_ORDERED_STATUSES)
1318
1419
15def fill_blueprint_info_from_launchpad(model_bp, lp_bp):20def fill_blueprint_info_from_launchpad(model_bp, lp_bp):
@@ -55,6 +60,14 @@
55 project = Unicode()60 project = Unicode()
5661
5762
63def get_roadmap_status_for_bp_implementation_status(implementation):
64 for key in ROADMAP_STATUSES_MAP:
65 if implementation in ROADMAP_STATUSES_MAP[key]:
66 return key
67 # XXX: Is None the appropriate return value here?
68 return None
69
70
58class Blueprint(object):71class Blueprint(object):
5972
60 __storm_table__ = "specs"73 __storm_table__ = "specs"
@@ -88,6 +101,15 @@
88 lp_bp.whiteboard, "Roadmap\s+Notes")101 lp_bp.whiteboard, "Roadmap\s+Notes")
89 return model_bp102 return model_bp
90103
104 @property
105 def roadmap_status(self):
106 return get_roadmap_status_for_bp_implementation_status(
107 self.implementation)
108
109
110def current_date():
111 return datetime.date.today()
112
91113
92class Person(object):114class Person(object):
93115
@@ -115,10 +137,6 @@
115 superteam_name = Unicode(name="team")137 superteam_name = Unicode(name="team")
116138
117139
118def current_date():
119 return datetime.date.today()
120
121
122class Meta(object):140class Meta(object):
123141
124 __storm_table__ = "meta"142 __storm_table__ = "meta"
125143
=== added file 'lpworkitems/models_roadmap.py'
--- lpworkitems/models_roadmap.py 1970-01-01 00:00:00 +0000
+++ lpworkitems/models_roadmap.py 2012-10-09 09:20:30 +0000
@@ -0,0 +1,60 @@
1import datetime
2import re
3from utils import unicode_or_None
4
5from storm.locals import Date, Reference, ReferenceSet, Unicode, Int, Bool
6
7from lpworkitems import models
8
9
10class Card(object):
11
12 __storm_table__ = "card"
13
14 name = Unicode()
15 status = Unicode()
16 card_id = Int(primary=True)
17 lane_id = Int()
18 roadmap_id = Unicode()
19 team = Unicode()
20 priority = Unicode()
21 size = Unicode()
22 sponsor = Unicode()
23 contact = Unicode()
24 description = Unicode()
25 acceptance_criteria = Unicode()
26 url = Unicode()
27 is_healthy = Bool()
28
29 def __init__(self, name, card_id, lane_id, roadmap_id):
30 self.lane_id = lane_id
31 self.card_id = card_id
32 self.name = name
33 self.roadmap_id = roadmap_id
34
35
36class Lane(object):
37
38 __storm_table__ = "lane"
39
40 name = Unicode()
41 lane_id = Int(primary=True)
42 is_current = Bool()
43 cards = ReferenceSet(lane_id, Card.lane_id)
44
45 def __init__(self, name, lane_id):
46 self.lane_id = lane_id
47 self.name = name
48
49
50def current_date():
51 return datetime.date.today()
52
53
54class BlueprintDailyCountPerState(object):
55 __storm_table__ = 'spec_daily_count_per_state'
56 __storm_primary__ = 'status', 'day'
57 day = Date(default_factory=current_date)
58 status = Unicode()
59 lane_id = Int()
60 count = Int()
061
=== modified file 'lpworkitems/tests/test_collect.py'
--- lpworkitems/tests/test_collect.py 2011-06-14 22:00:21 +0000
+++ lpworkitems/tests/test_collect.py 2012-10-09 09:20:30 +0000
@@ -184,7 +184,6 @@
184 self.store.find(184 self.store.find(
185 Milestone, Milestone.name == name).one())185 Milestone, Milestone.name == name).one())
186186
187
188 def test_store_blueprint_stores_blueprint(self):187 def test_store_blueprint_stores_blueprint(self):
189 blueprint = self.factory.make_blueprint(store=False)188 blueprint = self.factory.make_blueprint(store=False)
190 ret = self.collector.store_blueprint(blueprint)189 ret = self.collector.store_blueprint(blueprint)
191190
=== added file 'lpworkitems/tests/test_collect_roadmap.py'
--- lpworkitems/tests/test_collect_roadmap.py 1970-01-01 00:00:00 +0000
+++ lpworkitems/tests/test_collect_roadmap.py 2012-10-09 09:20:30 +0000
@@ -0,0 +1,69 @@
1import datetime
2
3from lpworkitems.collect_roadmap import (
4 CollectorStore,
5 get_json_item,
6 )
7from lpworkitems.models_roadmap import BlueprintDailyCountPerState
8from lpworkitems.error_collector import (
9 ErrorCollector,
10 )
11from lpworkitems.testing import TestCaseWithFakeLaunchpad
12
13
14class CollectorTests(TestCaseWithFakeLaunchpad):
15
16 def setUp(self):
17 super(CollectorTests, self).setUp()
18 self.error_collector = ErrorCollector()
19 self.collector = CollectorStore(
20 self.store, self.lp, self.error_collector)
21
22 def assertClears(self, cls, fn):
23 self.assertTrue(self.store.find(cls).count() > 0)
24 fn()
25 self.assertEqual(0, self.store.find(cls).count())
26
27 def test_clear_todays_blueprint_daily_count_per_state(self):
28 self.factory.make_blueprint_daily_count_per_state(
29 day=datetime.date.today())
30 self.assertClears(
31 BlueprintDailyCountPerState,
32 self.collector.clear_todays_blueprint_daily_count_per_state)
33
34 def test_store_roadmap_bp_count_per_state(self):
35 bp = self.factory.make_blueprint()
36 card = self.factory.make_card()
37 meta = self.factory.make_meta(
38 key=u'Roadmap id', value=card.roadmap_id, blueprint=bp)
39 self.collector.store_roadmap_bp_count_per_state()
40 self.assertEqual(
41 1, self.store.find(BlueprintDailyCountPerState).count())
42 entry = self.store.find(BlueprintDailyCountPerState).one()
43 self.assertEqual(1, entry.count)
44 self.assertEqual(card.lane_id, entry.lane_id)
45 self.assertEqual(bp.implementation, entry.status)
46
47 # XXX Add tests for the roadmap classes.
48
49
50class RoadmapUtilsTests(TestCaseWithFakeLaunchpad):
51
52 def setUp(self):
53 super(RoadmapUtilsTests, self).setUp()
54 self.json_data = {"data": "Text",
55 "whitespace": " Text ",
56 "none": None
57 }
58
59 def test_get_json_data_unicode(self):
60 item = get_json_item(self.json_data, 'data')
61 self.assertEquals(item, u'Text')
62
63 def test_get_json_data_whitespace(self):
64 item = get_json_item(self.json_data, 'whitespace')
65 self.assertEquals(item, u'Text')
66
67 def test_get_json_data_none(self):
68 item = get_json_item(self.json_data, 'none')
69 self.assertEquals(item, None)
070
=== modified file 'lpworkitems/tests/test_factory.py'
--- lpworkitems/tests/test_factory.py 2011-06-04 18:48:23 +0000
+++ lpworkitems/tests/test_factory.py 2012-10-09 09:20:30 +0000
@@ -162,7 +162,7 @@
162 implementation = u"Implemented"162 implementation = u"Implemented"
163 self.assert_with_and_without(163 self.assert_with_and_without(
164 self.factory.make_blueprint, "implementation", implementation,164 self.factory.make_blueprint, "implementation", implementation,
165 Equals(None))165 Equals("Unknown"))
166166
167 def test_uses_assignee_name(self):167 def test_uses_assignee_name(self):
168 assignee_name = self.factory.getUniqueUnicode(168 assignee_name = self.factory.getUniqueUnicode(
169169
=== modified file 'lpworkitems/tests/test_models.py'
--- lpworkitems/tests/test_models.py 2011-12-06 15:20:43 +0000
+++ lpworkitems/tests/test_models.py 2012-10-09 09:20:30 +0000
@@ -6,8 +6,11 @@
6 extract_last_path_segment_from_url,6 extract_last_path_segment_from_url,
7 extract_user_name_from_url,7 extract_user_name_from_url,
8 get_whiteboard_section,8 get_whiteboard_section,
9 )9 ROADMAP_STATUSES_MAP,
10from lpworkitems.testing import TestCaseWithFakeLaunchpad10 )
11from lpworkitems.testing import (
12 TestCaseWithFakeLaunchpad,
13 )
1114
1215
13class GetWhiteboardSectionTests(TestCase):16class GetWhiteboardSectionTests(TestCase):
@@ -42,6 +45,18 @@
4245
43class BlueprintTests(TestCaseWithFakeLaunchpad):46class BlueprintTests(TestCaseWithFakeLaunchpad):
4447
48 def test_roadmap_status(self):
49 roadmap_status = "Completed"
50 bp_implementation = ROADMAP_STATUSES_MAP[roadmap_status][0]
51 bp_status = self.factory.make_blueprint(
52 implementation=bp_implementation)
53 self.assertEqual(roadmap_status, bp_status.roadmap_status)
54
55 def test_roadmap_status_unknown_status(self):
56 blueprint = self.factory.make_blueprint(
57 implementation=u"Not Expected")
58 self.assertEqual(None, blueprint.roadmap_status)
59
45 def test_from_launchpad_sets_name(self):60 def test_from_launchpad_sets_name(self):
46 name = self.factory.getUniqueUnicode(prefix="lpblueprint")61 name = self.factory.getUniqueUnicode(prefix="lpblueprint")
47 lp_bp = self.lp.make_blueprint(name=name)62 lp_bp = self.lp.make_blueprint(name=name)
4863
=== modified file 'report_tools.py'
--- report_tools.py 2012-07-17 06:00:48 +0000
+++ report_tools.py 2012-10-09 09:20:30 +0000
@@ -3,12 +3,25 @@
3# Tools for generating reports3# Tools for generating reports
44
5import datetime5import datetime
6import urllib, sys, os.path, re6<<<<<<< TREE
7import urllib, sys, os.path, re
8=======
9import urllib, sys, os.path, re
10from storm.locals import create_database, Store
11>>>>>>> MERGE-SOURCE
7from subprocess import Popen12from subprocess import Popen
8from cgi import escape13from cgi import escape
9from lpworkitems import database14from lpworkitems import database
10import errno15import errno
11import fcntl16import fcntl
17from lpworkitems.models_roadmap import (
18 Lane,
19 Card,
20)
21from lpworkitems.models import (
22 Meta, ROADMAP_STATUSES_MAP,
23 get_roadmap_status_for_bp_implementation_status,
24)
1225
13valid_states = [u'todo', u'blocked', u'inprogress', u'done', u'postponed']26valid_states = [u'todo', u'blocked', u'inprogress', u'done', u'postponed']
14state_labels = [u'Todo', u'Blocked', u'In Progress', u'Done', u'Postponed']27state_labels = [u'Todo', u'Blocked', u'In Progress', u'Done', u'Postponed']
@@ -180,6 +193,52 @@
180 fh.close()193 fh.close()
181194
182195
196def roadmap_pages(my_path, database, basename, config, lane, root=None):
197 cfg = load_config(config)
198 fh = open(basename + '.html', 'w')
199 chart_path, _ = os.path.split(basename)
200 chart_name = os.path.join(chart_path, 'current_quarter.svg')
201 try:
202 args = [os.path.join(my_path, 'html-report'), '-d', database]
203 args += ['--report-type', 'roadmap_page']
204 args += ['--lane', lane.name]
205 if root:
206 args += ['--root', root]
207 if lane.is_current:
208 args += ['--chart', chart_name]
209 report_args(args, theme=get_theme(cfg))
210 proc = Popen(args, stdout=fh)
211 print basename + '.html'
212 proc.wait()
213 finally:
214 fh.close()
215
216 if lane.is_current:
217 args = [os.path.join(my_path, 'roadmap-bp-chart'), '-d', database,
218 '-o', chart_name]
219 args += ['--inverted']
220 proc = Popen(args)
221 print chart_name
222 proc.wait()
223
224
225def roadmap_cards(my_path, database, basename, config, card, root=None):
226 cfg = load_config(config)
227 fh = open(basename + '.html', 'w')
228 try:
229 args = [os.path.join(my_path, 'html-report'), '-d', database]
230 args += ['--report-type', 'roadmap_card']
231 args += ['--card', '%s' % card.card_id]
232 if root:
233 args += ['--root', root]
234 report_args(args, theme=get_theme(cfg))
235 proc = Popen(args, stdout=fh)
236 print basename + '.html'
237 proc.wait()
238 finally:
239 fh.close()
240
241
183def run_reports(my_path, database, basename, config, milestone=None, team=None,242def run_reports(my_path, database, basename, config, milestone=None, team=None,
184 user=None, trend_starts=None, trend_override=None, burnup=False, root=None, date=None):243 user=None, trend_starts=None, trend_override=None, burnup=False, root=None, date=None):
185244
@@ -281,6 +340,34 @@
281 return escape(html, True)340 return escape(html, True)
282341
283342
343def blueprints_over_time(store):
344 '''Calculate blueprint development over time for the current lane.
345
346 We do not need to care about teams or groups since this is intended for the
347 roadmap overview.
348
349 Return date -> state -> count mapping. states are
350 {planned,inprogress,completed,blocked}.
351 '''
352 data = {}
353 result = store.execute("""
354 SELECT status, day, count
355 FROM spec_daily_count_per_state
356 JOIN lane on lane.lane_id = spec_daily_count_per_state.lane_id
357 WHERE lane.is_current = 1
358 """)
359 for status, day, count in result:
360 roadmap_status = get_roadmap_status_for_bp_implementation_status(
361 status)
362 assert roadmap_status is not None
363 if day not in data:
364 data[day] = {}
365 if roadmap_status not in data[day]:
366 data[day][roadmap_status] = 0
367 data[day][roadmap_status] += count
368 return data
369
370
284def workitems_over_time(store, team=None, group=None, milestone_collection=None):371def workitems_over_time(store, team=None, group=None, milestone_collection=None):
285 '''Calculate work item development over time.372 '''Calculate work item development over time.
286373
@@ -876,6 +963,80 @@
876 return rv963 return rv
877964
878965
966def lanes(store):
967 return store.find(Lane)
968
969
970def lane(store, name, id=None):
971 if id is None:
972 return store.find(Lane, Lane.name == unicode(name)).one()
973 else:
974 return store.find(Lane, Lane.lane_id == id).one()
975
976
977def current_lane(store):
978 return store.find(Lane, Lane.is_current).one()
979
980
981def lane_cards(store, lane):
982 return lane.cards
983
984
985def statuses(store, lane):
986 result = []
987 for status in store.find(Card.status,
988 Card.lane_id == lane.lane_id).config(distinct=True):
989 result.append((status, store.find(Card,
990 Card.lane_id == lane.lane_id,
991 Card.status == status)))
992 return result
993
994
995def cards(store):
996 return store.find(Card)
997
998
999def card(store, card_id):
1000 return store.find(Card, Card.card_id == card_id)
1001
1002
1003def card_blueprints(store, roadmap_id):
1004 metas = store.find(Meta,
1005 Meta.key == u'Roadmap id',
1006 Meta.value == roadmap_id)
1007 return [meta.blueprint for meta in metas]
1008
1009
1010def card_blueprints_by_status(store, roadmap_id):
1011 blueprints = card_blueprints(store, roadmap_id)
1012 bp_by_status = {}
1013 for key in ROADMAP_STATUSES_MAP:
1014 bp_by_status[key] = []
1015 for bp in blueprints:
1016 bp_by_status[bp.roadmap_status].append(bp)
1017 return bp_by_status
1018
1019
1020def card_bp_status_counts(store, roadmap_id):
1021 blueprints = card_blueprints(store, roadmap_id)
1022 total_by_status = dict([(key, 0) for key in ROADMAP_STATUSES_MAP])
1023 for bp in blueprints:
1024 total_by_status[bp.roadmap_status] += 1
1025 return total_by_status
1026
1027
1028def check_card_health(store, card_health_checks, card):
1029 performed_checks = []
1030 card.is_healthy = True
1031 for check in card_health_checks:
1032 result = check.execute(card, store)
1033 if result == check.NOT_OK:
1034 card.is_healthy = False
1035 performed_checks.append({'name': check.name,
1036 'result': result})
1037 return performed_checks
1038
1039
879def subteams(store, team):1040def subteams(store, team):
880 result = store.execute('SELECT name from team_structure where team = ?', (unicode(team),))1041 result = store.execute('SELECT name from team_structure where team = ?', (unicode(team),))
881 return [i[0] for i in result]1042 return [i[0] for i in result]
8821043
=== added file 'roadmap-bp-chart'
--- roadmap-bp-chart 1970-01-01 00:00:00 +0000
+++ roadmap-bp-chart 2012-10-09 09:20:30 +0000
@@ -0,0 +1,249 @@
1#!/usr/bin/python
2#
3# Create a blueprint tracking chart from a blueprint database.
4#
5# Copyright (C) 2010, 2011 Canonical Ltd.
6# License: GPL-3
7
8import optparse, datetime, sys
9import report_tools
10
11from pychart import *
12
13def date_to_ordinal(s):
14 '''Turn yyyy-mm-dd strings to ordinals'''
15 return report_tools.date_to_python(s).toordinal()
16
17
18def ordinal_to_date(ordinal):
19 '''Turn an ordinal date into a string'''
20 d = datetime.date.fromordinal(int(ordinal))
21 return d.strftime('%Y-%m-%d')
22
23def format_date(ordinal):
24 d = datetime.date.fromordinal(int(ordinal))
25 return '/a60{}' + d.strftime('%b %d, %y')
26
27def do_chart(data, start_date, end_date, trend_start, title, filename, only_weekdays, inverted):
28 #set up default values
29 format = 'svg'
30 height = 450
31 width = 1000
32 legend_x = 700
33 legend_y = 200
34 title_x = 300
35 title_y = 350
36
37 if inverted:
38 legend_x=200
39
40 # Tell pychart to use colors
41 theme.use_color = True
42 theme.default_font_size = 12
43 theme.reinitialize()
44
45 # turn into pychart data model and calculate maximum number of WIs
46 max_items = 1 # start at 1 to avoid zero div
47 lastactive = 0
48 pcdata = []
49
50 for date in xrange(date_to_ordinal(start_date), date_to_ordinal(end_date)+1):
51 if (not only_weekdays or datetime.date.fromordinal(date).weekday() < 5):
52 end_date = ordinal_to_date(date)
53 i = data.get(ordinal_to_date(date), {})
54 count = i.get('Completed', 0) + i.get('Planned', 0) + i.get('Blocked', 0) + i.get('In Progress', 0)
55 if max_items < count:
56 max_items = count
57 pcdata.append((date, i.get('Planned', 0),0,
58 i.get('Blocked', 0),0,
59 i.get('In Progress', 0),0,
60 i.get('Completed',0),0, count))
61 if count > 0:
62 lastactive = len(pcdata) - 1
63
64 # add some extra space to look nicer
65 max_items = int(max_items * 1.05)
66
67 x_interval = len(pcdata)/20
68 if max_items > 500:
69 y_interval = max_items/200*10
70 elif max_items < 20:
71 y_interval = 1
72 else:
73 y_interval = max_items/20
74
75 # create the chart object
76 chart_object.set_defaults(area.T, size=(width, height),
77 y_range=(0, None), x_coord=category_coord.T(pcdata, 0))
78
79 # tell the chart object it will use a bar chart, and will
80 # use the data list for it's model
81 chart_object.set_defaults(bar_plot.T, data=pcdata)
82
83 # create the chart area
84 # tell it to start at coords 0,0
85 # tell it the labels, and the tics, etc..
86 # HACK: to prevent 0 div
87 if max_items == 0:
88 max_items = 1
89 ar = area.T(legend=legend.T(loc=(legend_x,legend_y)), loc=(0,0),
90 x_axis=axis.X(label='Date', tic_interval=x_interval,format=format_date),
91 y_axis=axis.Y(label='Blueprints', tic_interval=y_interval),
92 y_range=(0, max_items))
93
94 #initialize the blar_plot fill styles
95 bar_plot.fill_styles.reset()
96
97 # create each set of data to plot
98 # note that index zero is the label col
99 # for each column of data, tell it what to use for the legend and
100 # what color to make the bar, no lines, and
101 # what plot to stack on
102
103 tlabel = ''
104
105 if inverted:
106 plot1 = bar_plot.T(label='Completed' + tlabel, hcol=7)
107 plot1.fill_style = fill_style.Plain(bgcolor=color.seagreen)
108
109 plot3 = bar_plot.T(label='In Progress' + tlabel, hcol=5, stack_on = plot1)
110 plot3.fill_style = fill_style.Plain(bgcolor=color.gray65)
111
112 plot5 = bar_plot.T(label='Blocked' + tlabel, hcol=3, stack_on = plot3)
113 plot5.fill_style = fill_style.Plain(bgcolor=color.red1)
114
115 plot7 = bar_plot.T(label='Planned' + tlabel, hcol=1, stack_on = plot5)
116 plot7.fill_style = fill_style.Plain(bgcolor=color.darkorange1)
117 else:
118 plot1 = bar_plot.T(label='Planned' + tlabel, hcol=1)
119 plot1.fill_style = fill_style.Plain(bgcolor=color.darkorange1)
120
121 plot3 = bar_plot.T(label='Blocked' + tlabel, hcol=3, stack_on = plot1)
122 plot3.fill_style = fill_style.Plain(bgcolor=color.red1)
123
124 plot5 = bar_plot.T(label='In Progress' + tlabel, hcol=5, stack_on = plot3)
125 plot5.fill_style = fill_style.Plain(bgcolor=color.gray65)
126
127 plot7 = bar_plot.T(label='Completed' + tlabel, hcol=7, stack_on = plot5)
128 plot7.fill_style = fill_style.Plain(bgcolor=color.seagreen)
129
130
131 plot1.line_style = None
132 plot3.line_style = None
133 plot5.line_style = None
134 plot7.line_style = None
135
136 plot11 = bar_plot.T(label='total', hcol=9)
137 plot11.fill_style = None
138 plot11.line_style = line_style.gray30
139
140 # create the canvas with the specified filename and file format
141 can = canvas.init(filename,format)
142
143 # add the data to the area and draw it
144 ar.add_plot(plot1, plot3, plot5, plot7)
145 ar.draw()
146
147 # title
148 tb = text_box.T(loc=(title_x, title_y), text=title, line_style=None)
149 tb.fill_style = None
150 tb.draw()
151
152#
153# main
154#
155
156# argv parsing
157optparser = optparse.OptionParser()
158optparser.add_option('-d', '--database',
159 help='Path to database', dest='database', metavar='PATH')
160optparser.add_option('-t', '--team',
161 help='Restrict report to a particular team', dest='team')
162optparser.add_option('-m', '--milestone',
163 help='Restrict report to a particular milestone', dest='milestone')
164optparser.add_option('-o', '--output',
165 help='Output file', dest='output')
166optparser.add_option('--trend-start', type='int',
167 help='Explicitly set start of trend line', dest='trendstart')
168optparser.add_option('-u', '--user',
169 help='Run for this user', dest='user')
170optparser.add_option('--only-weekdays', action='store_true',
171 help='Skip Saturdays and Sundays in the resulting graph', dest='only_weekdays')
172optparser.add_option('--inverted', action='store_true',
173 help='Generate an inverted burndown chart', dest='inverted')
174optparser.add_option('-s', '--start-date',
175 help='Explicitly set the start date of the burndown data', dest='start_date')
176optparser.add_option('-e', '--end-date',
177 help='Explicitly set the end date of the burndown data', dest='end_date')
178optparser.add_option('--no-foreign', action='store_true', default=False,
179 help='Do not show foreign totals separate', dest='noforeign')
180optparser.add_option('--group',
181 help='Run for this group', dest='group')
182optparser.add_option('--date',
183 help='Run for this date', dest='date')
184
185(opts, args) = optparser.parse_args()
186if not opts.database:
187 optparser.error('No database given')
188if not opts.output:
189 optparser.error('No output file given')
190
191if opts.user and opts.team:
192 optparser.error('team and user options are mutually exclusive')
193if opts.user and opts.group:
194 optparser.error('user and group options are mutually exclusive')
195if opts.team and opts.group:
196 optparser.error('team and group options are mutually exclusive')
197if opts.milestone and opts.date:
198 optparser.error('milestone and date options are mutually exclusive')
199
200# The typing allows polymorphic behavior
201if opts.user:
202 opts.team = report_tools.user_string(opts.user)
203elif opts.team:
204 opts.team = report_tools.team_string(opts.team)
205
206store = report_tools.get_store(opts.database)
207
208milestone_collection = None
209if opts.milestone:
210 milestone_collection = report_tools.get_milestone(store, opts.milestone)
211elif opts.date:
212 milestone_collection = report_tools.MilestoneGroup(
213 report_tools.date_to_python(opts.date))
214
215
216# get date -> state -> count mapping
217data = report_tools.blueprints_over_time(store)
218
219if len(data) == 0:
220 print 'WARNING: no blueprints, not generating chart (team: %s, group: %s, due date: %s)' % (
221 opts.team or 'all', opts.group or 'none', milestone_collection and milestone_collection.display_name or 'none')
222 sys.exit(0)
223
224# calculate start/end date if no dates are given
225if opts.start_date is None:
226 start_date = sorted(data.keys())[0]
227else:
228 start_date = opts.start_date
229
230if opts.end_date is None:
231 if milestone_collection is not None:
232 end_date = milestone_collection.due_date_str
233 else:
234 end_date = report_tools.milestone_due_date(store)
235else:
236 end_date = opts.end_date
237
238if not start_date or not end_date or date_to_ordinal(start_date) > date_to_ordinal(end_date):
239 print 'WARNING: empty date range, not generating chart (team: %s, group: %s, due date: %s)' % (
240 opts.team or 'all', opts.group or 'none', milestone_collection and milestone_collection.display_name or 'none')
241 sys.exit(0)
242
243# title
244 title = 'all quarters'
245
246if milestone_collection is not None:
247 title += ' (%s)' % milestone_collection.name
248
249do_chart(data, start_date, end_date, opts.trendstart, title, opts.output, opts.only_weekdays, opts.inverted)
0250
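For context on the chart script above: only -d/--database and -o/--output are mandatory, and blueprints_over_time() is described in the code as returning a date -> state -> count mapping. A minimal sketch of the assumed input shape follows; the dates, counts and status names are illustrative assumptions, not values taken from this proposal.

data = {
    '2012-10-01': {'Planned': 4, 'Blocked': 1, 'In Progress': 3, 'Completed': 2},
    '2012-10-02': {'Planned': 3, 'Blocked': 1, 'In Progress': 4, 'Completed': 3},
}
# A mapping of this shape is what do_chart() receives, together with the
# computed start/end dates, the trend start, the title, the output file and
# the only-weekdays/inverted flags, as in the final call above.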
=== added file 'roadmap_health.py'
--- roadmap_health.py 1970-01-01 00:00:00 +0000
+++ roadmap_health.py 2012-10-09 09:20:30 +0000
@@ -0,0 +1,102 @@
1from report_tools import (
2 card_blueprints,
3 card_blueprints_by_status,
4)
5
6card_health_checks = []
7
8
9def register_health_check(cls):
10 card_health_checks.append(cls)
11 return cls
12
13
14class HealthCheck(object):
15 NOT_APPLICABLE = 'n/a'
16 OK = 'OK'
17 NOT_OK = 'Not OK'
18 name = 'Base check, not to be used'
19
20 @classmethod
21 def applicable(cls, card, store=None):
22 raise NotImplementedError()
23
24 @classmethod
25 def check(cls, card, store=None):
26 raise NotImplementedError()
27
28 @classmethod
29 def execute(cls, card, store=None):
30 if cls.applicable(card, store):
31 if cls.check(card, store):
32 return cls.OK
33 else:
34 return cls.NOT_OK
35 else:
36 return cls.NOT_APPLICABLE
37
38
39@register_health_check
40class DescriptionHealthCheck(HealthCheck):
41 name = 'Has description'
42
43 @classmethod
44 def applicable(cls, card, store=None):
45 return True
46
47 @classmethod
48 def check(cls, card, store=None):
49 return card.description is not None
50
51
52@register_health_check
53class CriteriaHealthCheck(HealthCheck):
54 name = 'Has acceptance criteria'
55
56 @classmethod
57 def applicable(cls, card, store=None):
58 return True
59
60 @classmethod
61 def check(cls, card, store=None):
62 return card.acceptance_criteria is not None
63
64
65@register_health_check
66class BlueprintsHealthCheck(HealthCheck):
67 name = 'Has blueprints'
68
69 @classmethod
70 def applicable(cls, card, store):
71 return card.status == 'Ready'
72
73 @classmethod
74 def check(cls, card, store):
75 return len(card_blueprints(store, card.roadmap_id)) > 0
76
77
78@register_health_check
79class BlueprintsBlockedHealthCheck(HealthCheck):
80 name = 'Has no Blocked blueprints'
81
82 @classmethod
83 def applicable(cls, card, store):
84 return card.status != 'Ready'
85
86 @classmethod
87 def check(cls, card, store):
88 blueprints = card_blueprints_by_status(store, card.roadmap_id)
89 return len(blueprints['Blocked']) == 0
90
91
92@register_health_check
93class RoadmapIdHealthCheck(HealthCheck):
94 name = 'Has a roadmap id'
95
96 @classmethod
97 def applicable(cls, card, store=None):
98 return True
99
100 @classmethod
101 def check(cls, card, store=None):
102 return card.roadmap_id != ''
0103
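The decorator above collects every check class into card_health_checks, and templates/roadmap_card.html later in this diff renders rows that carry 'name' and 'result' keys. A minimal sketch of how the two could be glued together, assuming hypothetical helper names (the actual wiring lives in the report code, not shown here):

from roadmap_health import HealthCheck, card_health_checks

def run_health_checks(card, store):
    # Run every registered check; execute() returns 'OK', 'Not OK' or 'n/a'.
    return [{'name': check.name, 'result': check.execute(card, store)}
            for check in card_health_checks]

def card_is_healthy(card, store):
    # Treat a card as unhealthy as soon as any applicable check fails.
    return all(row['result'] != HealthCheck.NOT_OK
               for row in run_health_checks(card, store))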
=== modified file 'templates/base.html'
--- templates/base.html 2011-06-02 15:00:45 +0000
+++ templates/base.html 2012-10-09 09:20:30 +0000
@@ -127,6 +127,19 @@
127127
128 });128 });
129 </script>129 </script>
130 <script type="text/javascript">
131
132 var _gaq = _gaq || [];
133 _gaq.push(['_setAccount', 'UA-16756069-4']);
134 _gaq.push(['_trackPageview']);
135
136 (function() {
137 var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
138 ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
139 var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
140 })();
141
142 </script>
130</head>143</head>
131144
132${next.body()}145${next.body()}
133146
=== modified file 'templates/body.html'
--- templates/body.html 2011-05-27 20:07:25 +0000
+++ templates/body.html 2012-10-09 09:20:30 +0000
@@ -12,7 +12,7 @@
12% if page_type == "overview":12% if page_type == "overview":
13 active13 active
14% endif14% endif
15 " title="Overview" id="overview_nav"><a href="${util.url('')}">Overview</a></li>15 " title="Roadmap" id="overview_nav"><a href="${root}../lane/">Roadmap</a></li>
16 <li class="link16 <li class="link
17% if page_type == "about":17% if page_type == "about":
18 active18 active
@@ -39,6 +39,9 @@
39<div id="content_pane">39<div id="content_pane">
40<div id="main_content">40<div id="main_content">
41${next.body()}41${next.body()}
42
43<%namespace name="footer" file="footer.html"/>
44${footer.body()}
42</div>45</div>
43</div>46</div>
44</div>47</div>
4548
=== added file 'templates/footer.html'
=== added file 'templates/roadmap_card.html'
--- templates/roadmap_card.html 1970-01-01 00:00:00 +0000
+++ templates/roadmap_card.html 2012-10-09 09:20:30 +0000
@@ -0,0 +1,71 @@
1<%inherit file="body.html"/>
2<%namespace name="util" file="util.html"/>
3<%namespace name="terminology" file="terminology.html"/>
4
5<%!
6import report_tools
7%>
8
9<%def name="title()">
10${card_title}
11</%def>
12
13<h1>${title()}</h1>
14<div style="float: right">
15<h3>Health check</h3>
16
17 <table>${'<tr><td colspan="2"><font color="#FF0000"><b>Needs attention!</b></font></td></tr>' if not card.is_healthy else ''}
18% for result in health_checks:
19 <tr ${'bgcolor="#FFAAAA"' if result['result'] == 'Not OK' else 'bgcolor="#FFFFFF"'}>
20 <td>${result['name']}</td>
21 <td>${result['result']}</td>
22 </tr>
23% endfor
24</table>
25</div>
26<h2>${card.status} in <a href="../lane/${lane}.html">${lane}</a></h2>
27<p>
28<ul>
29 <li>Card ID: <a href="${card.url}">${card.roadmap_id}</a>
30 <li>Sponsor: ${card.sponsor}
31 <li>Contact: ${card.contact}
32 <li>Priority: ${card.priority}
33 <li>Size: ${card.size}
34 <li>Team: ${card.team}
35</ul>
36
37<div style="clear:both; text-align: center">Overall blueprint completion</div>
38% if card_has_blueprints:
39${util.roadmap_progress_bar(bp_status_totals)}
40% else:
41<center><i>Progress graph pending linked blueprints.</i></center>
42% endif
43
44<h3>Description</h3> ${card.description if card.description is not None else '<i>No description could be found.</i>'}
45<p><a href="${card.url}">Read the full description</a>.
46<h3>Acceptance criteria</h3> ${card.acceptance_criteria if card.acceptance_criteria is not None else '<i>No acceptance criteria could be found.</i>'}
47<p><a href="${card.url}">Read the full acceptance criteria</a>.
48<p>
49% if card_has_blueprints:
50<table>
51<thead>
52 <tr><th>Title</th>
53 <th>Assignee</th>
54 <th>Priority</th>
55 <th>Status</th>
56 <th>Expected milestone</th>
57 </tr>
58</thead>
59% for status in status_order:
60% for bp in sorted(blueprints[status], key=lambda bp: bp.milestone_name):
61 <tr><td><a href="${bp.url}">${bp.name}</a></td>
62 <td>${bp.assignee_name}</td>
63 <td>${bp.priority}</td>
64 <td>${status}</td>
65 <td>${bp.milestone_name}</td>
66 </tr>
67% endfor
68% endfor
69</table>
70% endif
71
072
=== added file 'templates/roadmap_lane.html'
--- templates/roadmap_lane.html 1970-01-01 00:00:00 +0000
+++ templates/roadmap_lane.html 2012-10-09 09:20:30 +0000
@@ -0,0 +1,60 @@
1<%inherit file="body.html"/>
2<%namespace name="util" file="util.html"/>
3<%namespace name="terminology" file="terminology.html"/>
4
5<%!
6import report_tools
7%>
8
9<%def name="title()">
10Progress for ${lane_title}
11</%def>
12
13<p style="text-align: right; color: green; font-size: 13pt; float: right">
14Lane:&nbsp;<select name="laneselect" onchange="window.location=this.value;">
15% for lane in lanes:
16<option value="${lane.name}.html"${' selected="selected"' if lane.name == lane_title else ''}>${lane.name}${' (current)' if lane.is_current else ''}</option>
17% endfor
18</select>
19<h1>${title()}</h1>
20${util.roadmap_progress_bar(bp_status_totals)}
21<p>
22<table width="100%">
23<thead><tr><th>Card</th><th>Status</th><th>Team</th><th>Priority</th><th>Blueprints</th><th>Health</th></tr></thead>
24% for status in statuses:
25% for card_dict in status['cards']:
26 <tr>
27 <td>
28 <a href="../card/${card_dict['card'].roadmap_id if card_dict['card'].roadmap_id != '' else card_dict['card'].card_id}.html">${card_dict['card'].name}</a>
29 </td>
30 <td>${status['name']}</td>
31 <td>${card_dict['card'].team}</td><td align=right>${card_dict['card'].priority}</td>
32 <td>
33 <div class="roadmap_wrap" title="
34% for bp_status in status_order:
35${bp_status}: ${card_dict['bp_statuses'][bp_status]}&nbsp;
36% endfor
37">
38% for bp_status in status_order:
39 <div class="roadmap_value" style="width:${card_dict['bp_percentages'][bp_status]}%">
40 <div class="${bp_status.replace(' ', '')}">&nbsp;</div>
41 </div>
42% endfor
43 </div>
44 </td>
45 <td>
46 ${'<font color="#FF0000">Needs attention!</font>' if not card_dict['card'].is_healthy else '' | n}
47 </td></tr>
48% endfor
49% endfor
50</table>
51
52% if chart_url != 'burndown.svg':
53<!-- The cli option defaults to burndown.svg! :( -->
54<div class="overview_graph">
55<h3>Blueprint progress</h3><p><a href="current_quarter.svg">(enlarge)</a></p>
56<object
57 height="500" width="833"
58 data="current_quarter.svg" type="image/svg+xml">Blueprint progress</object>
59</div>
60% endif
061
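The lane template above reads everything through two levels of dictionaries. Judging only from the expressions it uses ('cards', 'bp_statuses', 'bp_percentages' and a handful of card attributes), each entry of statuses is presumably shaped like the sketch below; the stand-in class and all values are illustrative, not the real model.

class _Card(object):
    # Stand-in for the real roadmap card object, exposing only the
    # attributes the template reads.
    name = 'Example card'
    roadmap_id = 'CARD-1'
    card_id = 1
    team = 'infrastructure'
    priority = 'High'
    is_healthy = True

statuses = [{
    'name': 'In Progress',
    'cards': [{
        'card': _Card(),
        'bp_statuses': {'Planned': 3, 'Blocked': 0, 'In Progress': 2, 'Completed': 5},
        'bp_percentages': {'Planned': 30, 'Blocked': 0, 'In Progress': 20, 'Completed': 50},
    }],
}]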
=== modified file 'templates/util.html'
--- templates/util.html 2011-06-01 20:55:59 +0000
+++ templates/util.html 2012-10-09 09:20:30 +0000
@@ -23,6 +23,15 @@
23 </div>23 </div>
24</%def>24</%def>
2525
26<%def name="roadmap_progress_bar(item)">
27<div class="roadmap_wrap" title="${item['Completed']} blueprints complete of ${item['Total']}">
28 <div class="roadmap_value" style="width:${item['Percentage']}%">
29 <div class="Completed">&nbsp;</div>
30 </div>
31 <div class="roadmap_progress_text">${item['Percentage']} % complete of ${item['Total']}</div>
32</div>
33</%def>
34
26<%def name="url(end)">${root}${end}</%def>35<%def name="url(end)">${root}${end}</%def>
2736
28<%def name="burndown_chart(chart_url, large=False)">37<%def name="burndown_chart(chart_url, large=False)">
2938
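The roadmap_progress_bar() macro added above reads exactly three keys from the mapping it is handed, so the bp_status_totals its callers pass in is presumably shaped as follows (numbers illustrative):

bp_status_totals = {
    'Completed': 12,   # blueprints finished
    'Total': 30,       # all linked blueprints
    'Percentage': 40,  # completion percentage, pre-computed by the caller
}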
=== modified file 'tests.py'
--- tests.py 2011-06-01 03:36:17 +0000
+++ tests.py 2012-10-09 09:20:30 +0000
@@ -168,6 +168,7 @@
168 loader = TestLoader()168 loader = TestLoader()
169 suite = loader.loadTestsFromName(__name__)169 suite = loader.loadTestsFromName(__name__)
170 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_collect"))170 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_collect"))
171 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_collect_roadmap"))
171 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_error_collector"))172 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_error_collector"))
172 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_factory"))173 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_factory"))
173 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_fake_launchpad"))174 suite.addTests(loader.loadTestsFromName("lpworkitems.tests.test_fake_launchpad"))
174175
=== added file 'themes/linaro/templates/footer.html'
--- themes/linaro/templates/footer.html 1970-01-01 00:00:00 +0000
+++ themes/linaro/templates/footer.html 2012-10-09 09:20:30 +0000
@@ -0,0 +1,10 @@
1<%!
2 import datetime
3%>
4<div id="footer">
5 Last updated: ${datetime.datetime.utcnow().strftime("%a %d %B %Y, %H:%M UTC")} |
6 <a href="https://bugs.launchpad.net/launchpad-work-items-tracker">Bugs</a> |
7 <a href="https://code.launchpad.net/~linaro-infrastructure/launchpad-work-items-tracker/linaro">Code</a> |
8 <a href="https://code.launchpad.net/~linaro-infrastructure/launchpad-work-items-tracker/linaro-config">Config</a> |
9 <a href="/update.log.txt">Update log</a> (<a href="/update.log.txt.1">yesterday</a>)
10</div>
011
=== added file 'utils.py'
--- utils.py 1970-01-01 00:00:00 +0000
+++ utils.py 2012-10-09 09:20:30 +0000
@@ -0,0 +1,6 @@
1def unicode_or_None(attr):
2 if attr is None:
3 return attr
4 if isinstance(attr, unicode):
5 return attr
6 return attr.decode("utf-8")
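unicode_or_None() is a small Python 2 helper; a quick illustration of its behaviour, using an illustrative UTF-8 byte string:

from utils import unicode_or_None

assert unicode_or_None(None) is None
assert unicode_or_None(u'caf\xe9') == u'caf\xe9'       # unicode passes through untouched
assert unicode_or_None('caf\xc3\xa9') == u'caf\xe9'    # byte strings are decoded as UTF-8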
