Merge lp:~sinzui/launchpad/base-layout-additions into lp:launchpad
Status: Merged
Merged at revision: not available
Proposed branch: lp:~sinzui/launchpad/base-layout-additions
Merge into: lp:launchpad
Diff against target: None lines
To merge this branch: bzr merge lp:~sinzui/launchpad/base-layout-additions
Related bugs:

Reviewer | Review Type | Date Requested | Status
---|---|---|---
Eleanor Berger (community) | | | Approve

Review via email: mp+9267@code.launchpad.net
Commit message
Description of the change
Curtis Hovey (sinzui) wrote:
Eleanor Berger (intellectronica) wrote:
Hi Curtis,
The diff from trunk to this branch is huge (more than 6k lines) and contains lots of stuff that can't possibly be relevant. I'm not sure how to get the correct diff; could you please add instructions or just paste the correct one?
Curtis Hovey (sinzui) wrote:
Hi Tom.
> correct diff, could you please add instructions or just paste the correct one?
db-devel has not merged into devel, I see. This is the real diff of my changes, which will be landed in devel after the branches are synced.
=== modified file 'lib/canonical/
--- lib/canonical/
+++ lib/canonical/
@@ -1225,6 +1225,38 @@
u''
+== CSS classes for public and private objects ==
+
+Users need to recognise private information as they are viewing it. This is
+accomplished with a CSS class.
+
+Any object can be converted to the 'public' CSS class. Th object does not need
+to have a private bool attributes.
+
+ >>> thing = object()
+ >>> print test_tales(
+ public
+
+The CSS class honors the state of the object's privacy if the object supports
+the private attribute. If the object is not private, the class is 'public'.
+
+ >>> bug = factory.
+ >>> print bug.private
+ False
+ >>> print test_tales(
+ public
+
+If the private attribute is True, the class is 'private'.
+
+ >>> owner = bug.bugtasks[
+ >>> login_person(owner)
+ >>> bug.setPrivate(
+ True
+ >>> print test_tales(
+ private
+ >>> login(ANONYMOUS)
+
+
== Formatting of private attributes on Teams ==
To protect privacy of teams, the formatter for teams will only show
=== modified file 'lib/canonical/
--- lib/canonical/
+++ lib/canonical/
@@ -189,7 +189,7 @@
width: auto;
margin-right: 24%;
}
-#footer {
+.footer {
margin-top: 2em;
border-top: 1px solid #cdd3dd;
padding-top: .5em;
@@ -234,9 +234,13 @@
font-family: bitstream vera sans, arial, helvetica, clean, sans-serif;
font-size: 97%; /* 2.0 backward compatability */
}
+body.private {
+ /* It must be obvious to the user that the context is private */
+ background: url("/@
+ }
h1 {
clear: none;
- padding-top: 6px; /* An offset fromt he logo. */
+ padding-top: 6px; /* An offset from the logo. */
font-size: 197%;
margin-bottom: .5em; /* 2.0 backward compatability */
}
@@ -252,10 +256,7 @@
p, li, dt, dd, blockquote {
max-width: 45em; /* Wrap the text before the eye gets lost. */
}
-pre, code, samp, tt,
-#bug-description,
-.bug-comment,
-.bug-activity {
+pre, code, samp, tt, .preformatted {
font-size: 116%;
}
ol {
@@ -356,7 +357,7 @@
/* Exceptions they may be common. */
-#contributors dt strong {
+.contributors dt strong {
padding-left: 1em;
}
@@ -383,17 +384,17 @@
background: #fbfbfb;
}
-#project-downloads {
+.downloads {
font-weight: bold;
}
-#project-downloads a {
+.downloads a {
color: #4f843c;
}
-#project-downloads li {
+.downloads li {
marg...
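The doctest hunk above exercises a TALES formatter that maps any object to a privacy CSS class. A minimal sketch of that behaviour, using a hypothetical function name since the real formatter path is elided in the truncated diff:

```python
def privacy_css_class(obj):
    """Return the CSS class for an object's privacy state.

    Hypothetical name; mirrors the doctest above: objects without a
    `private` attribute, and objects whose `private` is False, are
    'public'; only a true `private` attribute yields 'private'.
    """
    if getattr(obj, 'private', False):
        return 'private'
    return 'public'
```

With this sketch, `privacy_css_class(object())` is `'public'`, matching the first doctest example.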
Eleanor Berger (intellectronica) wrote:
Very nice branch.
There's one typo:
=== modified file 'lib/canonical/
--- lib/canonical/
+++ lib/canonical/
@@ -1225,6 +1225,38 @@
u''
+== CSS classes for public and private objects ==
+
+Users need to recognise private information as they are viewing it. This is
+accomplished with a CSS class.
+
+Any object can be converted to the 'public' CSS class. Th object does not need
+to have a private bool attributes.
Typo: Th --> The
And we've discussed on IRC using zope interfaces instead of reflection to check whether an object has a private attribute.
Curtis Hovey (sinzui) wrote:
Hi Tom.
Thank you for suggesting a long-term fix for the privacy support issue.
I implemented IPrivacy.private and updated IPerson, IBug, IBranch,
IArchive, and IHWSubmission to redefine it, because each has
read-write/guard conditions.
I rejected the IObjectPrivacy adapter because we now have 5 objects
implementing this attribute and a UI that expects a uniform
implementation. We need to commit to a privacy definition before we make
other objects private.
On Mon, 2009-07-27 at 13:50 +0000, Tom Berger wrote:
...
> And we've discusse on IRC using zope interfaces instead of reflection
> to check whether an object has a private attribute.
=== modified file 'lib/canonical/
--- lib/canonical/
+++ lib/canonical/
@@ -1230,8 +1230,8 @@
Users need to recognise private information as they are viewing it. This is
accomplished with a CSS class.
-Any object can be converted to the 'public' CSS class. Th object does not need
-to have a private bool attributes.
+Any object can be converted to the 'public' CSS class. The object does not
+need to implement IPrivacy.
>>> thing = object()
>>> print test_tales(
=== modified file 'lib/canonical/
--- lib/canonical/
+++ lib/canonical/
@@ -56,6 +56,7 @@
from lp.registry.
from lp.registry.
from lp.soyuz.
+from canonical.
from canonical.
from canonical.
from canonical.
@@ -122,7 +123,7 @@
VERSION_1 = DBItem(1, "Version 1")
-class IHWSubmission(
+class IHWSubmission(
"""Raw submission data for the hardware database.
See doc/hwdb.txt for details about the attributes.
@@ -143,6 +144,8 @@
Choice(
+ # This is redefined from IPrivacy.private because the attribute is
+ # is required.
private = exported(
Bool(
=== modified file 'lib/canonical/
--- lib/canonical/
+++ lib/canonical/
@@ -59,6 +59,7 @@
'IPassword
'IPrivateA
'IPrivateM
+ 'IPrivacy',
'IReadZODB
'IRosettaA
'IStructur
@@ -388,6 +389,16 @@
"""
+class IPrivacy(
+ """Something that can be private."""
+
+ private = Bool(
+ title=_("This is private"),
+ required=False,
+ ...
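The discussion above settles on declaring privacy through a zope interface rather than reflection. A pure-Python sketch of the contrast (the `IPrivacy` marker class here is a stand-in for the real zope interface, which would instead be checked with `IPrivacy.providedBy(obj)`):

```python
class IPrivacy:
    """Stand-in marker for the zope IPrivacy interface added above."""
    private = False

class Bug(IPrivacy):
    """Hypothetical implementer that redefines `private` as writable."""
    def __init__(self, private=False):
        self.private = private

def is_private_by_reflection(obj):
    # Duck typing: any object with a truthy `private` attribute counts.
    return bool(getattr(obj, 'private', False))

def is_private_by_declaration(obj):
    # Interface check: only declared IPrivacy implementers can be private.
    return isinstance(obj, IPrivacy) and obj.private
```

The declaration check refuses objects that merely happen to grow a `private` attribute, which is the uniform implementation the UI expects.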
Preview Diff
1 | === modified file 'database/replication/initialize.py' |
2 | --- database/replication/initialize.py 2009-06-24 21:17:33 +0000 |
3 | +++ database/replication/initialize.py 2009-07-19 04:41:14 +0000 |
4 | @@ -144,9 +144,11 @@ |
5 | comment='Launchpad tables and sequences'); |
6 | """) |
7 | |
8 | + script.append( |
9 | + "echo 'Adding %d tables to replication set @lpmain_set';" |
10 | + % len(lpmain_tables)) |
11 | for table in sorted(lpmain_tables): |
12 | script.append(""" |
13 | - echo 'Adding %(table)s to replication set @lpmain_set'; |
14 | set add table ( |
15 | set id=@lpmain_set, |
16 | origin=@master_node, |
17 | @@ -156,9 +158,11 @@ |
18 | entry_id += 1 |
19 | |
20 | entry_id = 200 |
21 | + script.append( |
22 | + "echo 'Adding %d sequences to replication set @lpmain_set';" |
23 | + % len(lpmain_sequences)) |
24 | for sequence in sorted(lpmain_sequences): |
25 | script.append(""" |
26 | - echo 'Adding %(sequence)s to replication set @lpmain_set'; |
27 | set add sequence ( |
28 | set id=@lpmain_set, |
29 | origin=@master_node, |
30 | |
31 | === modified file 'database/replication/report.py' |
32 | --- database/replication/report.py 2009-06-24 21:17:33 +0000 |
33 | +++ database/replication/report.py 2009-07-19 04:41:14 +0000 |
34 | @@ -39,7 +39,7 @@ |
35 | self.labels = labels[:] |
36 | self.rows = [] |
37 | |
38 | - |
39 | + |
40 | class HtmlReport: |
41 | |
42 | def alert(self, text): |
43 | @@ -157,10 +157,11 @@ |
44 | |
45 | cur.execute(""" |
46 | SELECT li_receiver, li_origin, li_provider |
47 | - FROM sl_listen ORDER BY li_receiver |
48 | + FROM sl_listen |
49 | + ORDER BY li_receiver, li_origin, li_provider |
50 | """) |
51 | for row in cur.fetchall(): |
52 | - table.rows.append('Node %s' % node for node in row) |
53 | + table.rows.append(['Node %s' % node for node in row]) |
54 | return report.table(table) |
55 | |
56 | |
57 | |
58 | === modified file 'database/sampledata/current-dev.sql' |
59 | --- database/sampledata/current-dev.sql 2009-07-17 00:26:05 +0000 |
60 | +++ database/sampledata/current-dev.sql 2009-07-19 04:41:14 +0000 |
61 | @@ -4756,7 +4756,8 @@ |
62 | INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243626, 'Commercial Subscription Approvers', 243623, NULL, 'commercial-approvers', NULL, NULL, NULL, NULL, 1, NULL, '2008-06-27 14:49:38.676264', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL); |
63 | INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243627, 'Ubuntu-branches-owner', NULL, NULL, 'ubuntu-branches-owner', NULL, NULL, NULL, NULL, 1, NULL, '2009-03-17 07:28:15.948042', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, 1, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, 243625); |
64 | INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243628, 'Ubuntu branches', 243627, 'Celebrity team that controls Ubuntu source package branches.', 'ubuntu-branches', NULL, NULL, NULL, NULL, 3, NULL, '2009-03-17 07:29:13.259033', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL); |
65 | -INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243629, 'HWDB Team', 16, NULL, 'hwdb-team', NULL, NULL, NULL, NULL, 3, NULL, '2009-07-09 09:12:39.400351', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL); |
66 | +INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243629, 'Ubuntu Security Team', 4, NULL, 'ubuntu-security', NULL, NULL, NULL, NULL, 2, NULL, '2009-07-14 20:23:59.698654', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL); |
67 | +INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243630, 'HWDB Team', 16, NULL, 'hwdb-team', NULL, NULL, NULL, NULL, 3, NULL, '2009-07-09 09:12:39.400351', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL); |
68 | |
69 | |
70 | ALTER TABLE person ENABLE TRIGGER ALL; |
71 | @@ -9142,23 +9143,24 @@ |
72 | INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (81, 16, 243620, 2, '2008-05-14 12:07:14.22745', NULL, NULL, NULL, 16, NULL, 16, '2008-05-14 12:07:14.22745', NULL, NULL, '2008-05-14 12:07:14.22745', NULL, NULL, NULL, '2008-05-14 12:07:14.140921'); |
73 | INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (83, 243622, 243621, 3, '2008-05-12 17:40:08.720578', NULL, NULL, NULL, 16, NULL, 16, '2008-05-12 17:40:08.720578', NULL, NULL, '2008-05-12 17:40:08.720578', NULL, NULL, NULL, '2008-05-12 17:40:08.637114'); |
74 | INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (85, 243627, 243628, 3, '2009-03-17 07:29:13.30381', NULL, NULL, NULL, 243627, NULL, 243627, '2009-03-17 07:29:13.30381', NULL, NULL, '2009-03-17 07:29:13.30381', NULL, NULL, NULL, '2009-03-17 07:29:13.259033'); |
75 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (86, 1, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
76 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (87, 12, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
77 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (88, 16, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
78 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (89, 22, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
79 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (90, 23, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
80 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (91, 26, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
81 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (92, 27, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
82 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (93, 28, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
83 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (94, 29, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
84 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (95, 38, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
85 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (96, 63, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
86 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (97, 70, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
87 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (98, 243610, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
88 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (99, 243611, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
89 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (100, 243617, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
90 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (101, 243622, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
91 | -INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (102, 243623, 243629, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
92 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (86, 4, 243629, 3, '2009-07-14 20:23:59.769346', NULL, NULL, NULL, 4, NULL, 4, '2009-07-14 20:23:59.769346', NULL, NULL, '2009-07-14 20:23:59.769346', NULL, NULL, NULL, '2009-07-14 20:23:59.698654'); |
93 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (87, 1, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
94 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (88, 12, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
95 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (89, 16, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
96 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (90, 22, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
97 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (91, 23, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
98 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (92, 26, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
99 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (93, 27, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
100 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (94, 28, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
101 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (95, 29, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
102 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (96, 38, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
103 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (97, 63, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
104 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (98, 70, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
105 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (99, 243610, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
106 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (100, 243611, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
107 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (101, 243617, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
108 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (102, 243622, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
109 | +INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (103, 243623, 243630, 2, '2009-07-09 11:58:46.481813', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:46.481813'); |
110 | |
111 | |
112 | ALTER TABLE teammembership ENABLE TRIGGER ALL; |
@@ -9342,23 +9344,25 @@
INSERT INTO teamparticipation (id, team, person) VALUES (195, 243628, 243628);
INSERT INTO teamparticipation (id, team, person) VALUES (196, 243628, 243627);
INSERT INTO teamparticipation (id, team, person) VALUES (197, 243629, 243629);
-INSERT INTO teamparticipation (id, team, person) VALUES (198, 243629, 1);
-INSERT INTO teamparticipation (id, team, person) VALUES (199, 243629, 12);
-INSERT INTO teamparticipation (id, team, person) VALUES (200, 243629, 16);
-INSERT INTO teamparticipation (id, team, person) VALUES (201, 243629, 22);
-INSERT INTO teamparticipation (id, team, person) VALUES (202, 243629, 23);
-INSERT INTO teamparticipation (id, team, person) VALUES (203, 243629, 26);
-INSERT INTO teamparticipation (id, team, person) VALUES (204, 243629, 27);
-INSERT INTO teamparticipation (id, team, person) VALUES (205, 243629, 28);
-INSERT INTO teamparticipation (id, team, person) VALUES (206, 243629, 29);
-INSERT INTO teamparticipation (id, team, person) VALUES (207, 243629, 38);
-INSERT INTO teamparticipation (id, team, person) VALUES (208, 243629, 63);
-INSERT INTO teamparticipation (id, team, person) VALUES (209, 243629, 70);
-INSERT INTO teamparticipation (id, team, person) VALUES (210, 243629, 243610);
-INSERT INTO teamparticipation (id, team, person) VALUES (211, 243629, 243611);
-INSERT INTO teamparticipation (id, team, person) VALUES (212, 243629, 243617);
-INSERT INTO teamparticipation (id, team, person) VALUES (213, 243629, 243622);
-INSERT INTO teamparticipation (id, team, person) VALUES (214, 243629, 243623);
+INSERT INTO teamparticipation (id, team, person) VALUES (198, 243629, 4);
+INSERT INTO teamparticipation (id, team, person) VALUES (199, 243630, 243630);
+INSERT INTO teamparticipation (id, team, person) VALUES (200, 243630, 1);
+INSERT INTO teamparticipation (id, team, person) VALUES (201, 243630, 12);
+INSERT INTO teamparticipation (id, team, person) VALUES (202, 243630, 16);
+INSERT INTO teamparticipation (id, team, person) VALUES (203, 243630, 22);
+INSERT INTO teamparticipation (id, team, person) VALUES (204, 243630, 23);
+INSERT INTO teamparticipation (id, team, person) VALUES (205, 243630, 26);
+INSERT INTO teamparticipation (id, team, person) VALUES (206, 243630, 27);
+INSERT INTO teamparticipation (id, team, person) VALUES (207, 243630, 28);
+INSERT INTO teamparticipation (id, team, person) VALUES (208, 243630, 29);
+INSERT INTO teamparticipation (id, team, person) VALUES (209, 243630, 38);
+INSERT INTO teamparticipation (id, team, person) VALUES (210, 243630, 63);
+INSERT INTO teamparticipation (id, team, person) VALUES (211, 243630, 70);
+INSERT INTO teamparticipation (id, team, person) VALUES (212, 243630, 243610);
+INSERT INTO teamparticipation (id, team, person) VALUES (213, 243630, 243611);
+INSERT INTO teamparticipation (id, team, person) VALUES (214, 243630, 243617);
+INSERT INTO teamparticipation (id, team, person) VALUES (215, 243630, 243622);
+INSERT INTO teamparticipation (id, team, person) VALUES (216, 243630, 243623);


ALTER TABLE teamparticipation ENABLE TRIGGER ALL;

=== modified file 'database/sampledata/current.sql'
--- database/sampledata/current.sql 2009-07-17 00:26:05 +0000
+++ database/sampledata/current.sql 2009-07-19 04:41:14 +0000
@@ -4754,7 +4754,8 @@
INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243626, 'Launchpad Users', 12, NULL, 'launchpad-users', NULL, NULL, NULL, NULL, 2, NULL, '2008-11-26 18:19:53.547918', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL);
INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243627, 'Ubuntu-branches-owner', NULL, NULL, 'ubuntu-branches-owner', NULL, NULL, NULL, NULL, 1, NULL, '2009-03-17 07:26:14.024613', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, 1, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, 2436242);
INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243628, 'Ubuntu branches', 243627, 'Celebrity team that controls official source package branches.', 'ubuntu-branches', NULL, NULL, NULL, NULL, 3, NULL, '2009-03-17 07:27:39.306182', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL);
-INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243629, 'HWDB Team', 16, NULL, 'hwdb-team', NULL, NULL, NULL, NULL, 3, NULL, '2009-07-09 09:12:39.400351', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL);
+INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243629, 'Ubuntu Security Team', 4, NULL, 'ubuntu-security', NULL, NULL, NULL, NULL, 2, NULL, '2009-07-14 20:23:59.698654', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL);
+INSERT INTO person (id, displayname, teamowner, teamdescription, name, language, fti, defaultmembershipperiod, defaultrenewalperiod, subscriptionpolicy, merged, datecreated, addressline1, addressline2, organization, city, province, country, postcode, phone, homepage_content, icon, mugshot, hide_email_addresses, creation_rationale, creation_comment, registrant, logo, renewal_policy, personal_standing, personal_standing_reason, mail_resumption_date, mailing_list_auto_subscribe_policy, mailing_list_receive_duplicates, visibility, verbose_bugnotifications, account) VALUES (243630, 'HWDB Team', 16, NULL, 'hwdb-team', NULL, NULL, NULL, NULL, 3, NULL, '2009-07-09 09:12:39.400351', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, false, NULL, NULL, NULL, NULL, 10, 0, NULL, NULL, 1, true, 1, true, NULL);


ALTER TABLE person ENABLE TRIGGER ALL;
@@ -9139,23 +9140,24 @@
INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (84, 243623, 243624, 3, '2008-06-27 14:49:38.698594', NULL, NULL, NULL, 243623, NULL, 243623, '2008-06-27 14:49:38.698594', NULL, NULL, '2008-06-27 14:49:38.698594', NULL, NULL, NULL, '2008-06-27 14:49:38.676264');
INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (85, 12, 243626, 3, '2008-11-26 18:19:53.849673', NULL, NULL, NULL, 12, NULL, 12, '2008-11-26 18:19:53.849673', NULL, NULL, '2008-11-26 18:19:53.849673', NULL, NULL, NULL, '2008-11-26 18:19:53.547918');
INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (86, 243627, 243628, 3, '2009-03-17 07:27:39.361471', NULL, NULL, NULL, 243627, NULL, 243627, '2009-03-17 07:27:39.361471', NULL, NULL, '2009-03-17 07:27:39.361471', NULL, NULL, NULL, '2009-03-17 07:27:39.306182');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (87, 1, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (88, 12, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (89, 16, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (90, 22, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (91, 23, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (92, 26, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (93, 27, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (94, 28, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (95, 29, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (96, 38, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (97, 63, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (98, 70, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (99, 243610, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (100, 243611, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (101, 243617, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (102, 243622, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
-INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (103, 243623, 243629, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (87, 4, 243629, 3, '2009-07-14 20:23:59.769346', NULL, NULL, NULL, 4, NULL, 4, '2009-07-14 20:23:59.769346', NULL, NULL, '2009-07-14 20:23:59.769346', NULL, NULL, NULL, '2009-07-14 20:23:59.698654');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (88, 1, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (89, 12, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (90, 16, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (91, 22, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (92, 23, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (93, 26, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (94, 27, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (95, 28, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (96, 29, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (97, 38, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (98, 63, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (99, 70, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (100, 243610, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (101, 243611, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (102, 243617, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (103, 243622, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');
+INSERT INTO teammembership (id, person, team, status, date_joined, date_expires, last_changed_by, last_change_comment, proposed_by, acknowledged_by, reviewed_by, date_proposed, date_last_changed, date_acknowledged, date_reviewed, proponent_comment, acknowledger_comment, reviewer_comment, date_created) VALUES (104, 243623, 243630, 2, '2009-07-09 11:58:38.122886', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '2009-07-09 11:58:38.122886');


ALTER TABLE teammembership ENABLE TRIGGER ALL;
@@ -9344,23 +9346,25 @@
INSERT INTO teamparticipation (id, team, person) VALUES (198, 243628, 243628);
INSERT INTO teamparticipation (id, team, person) VALUES (199, 243628, 243627);
INSERT INTO teamparticipation (id, team, person) VALUES (200, 243629, 243629);
-INSERT INTO teamparticipation (id, team, person) VALUES (201, 243629, 1);
-INSERT INTO teamparticipation (id, team, person) VALUES (202, 243629, 12);
-INSERT INTO teamparticipation (id, team, person) VALUES (203, 243629, 16);
-INSERT INTO teamparticipation (id, team, person) VALUES (204, 243629, 22);
-INSERT INTO teamparticipation (id, team, person) VALUES (205, 243629, 23);
-INSERT INTO teamparticipation (id, team, person) VALUES (206, 243629, 26);
-INSERT INTO teamparticipation (id, team, person) VALUES (207, 243629, 27);
-INSERT INTO teamparticipation (id, team, person) VALUES (208, 243629, 28);
-INSERT INTO teamparticipation (id, team, person) VALUES (209, 243629, 29);
-INSERT INTO teamparticipation (id, team, person) VALUES (210, 243629, 38);
-INSERT INTO teamparticipation (id, team, person) VALUES (211, 243629, 63);
-INSERT INTO teamparticipation (id, team, person) VALUES (212, 243629, 70);
-INSERT INTO teamparticipation (id, team, person) VALUES (213, 243629, 243610);
-INSERT INTO teamparticipation (id, team, person) VALUES (214, 243629, 243611);
-INSERT INTO teamparticipation (id, team, person) VALUES (215, 243629, 243617);
-INSERT INTO teamparticipation (id, team, person) VALUES (216, 243629, 243622);
-INSERT INTO teamparticipation (id, team, person) VALUES (217, 243629, 243623);
+INSERT INTO teamparticipation (id, team, person) VALUES (201, 243629, 4);
+INSERT INTO teamparticipation (id, team, person) VALUES (202, 243630, 243630);
+INSERT INTO teamparticipation (id, team, person) VALUES (203, 243630, 1);
+INSERT INTO teamparticipation (id, team, person) VALUES (204, 243630, 12);
+INSERT INTO teamparticipation (id, team, person) VALUES (205, 243630, 16);
+INSERT INTO teamparticipation (id, team, person) VALUES (206, 243630, 22);
+INSERT INTO teamparticipation (id, team, person) VALUES (207, 243630, 23);
+INSERT INTO teamparticipation (id, team, person) VALUES (208, 243630, 26);
+INSERT INTO teamparticipation (id, team, person) VALUES (209, 243630, 27);
+INSERT INTO teamparticipation (id, team, person) VALUES (210, 243630, 28);
+INSERT INTO teamparticipation (id, team, person) VALUES (211, 243630, 29);
+INSERT INTO teamparticipation (id, team, person) VALUES (212, 243630, 38);
+INSERT INTO teamparticipation (id, team, person) VALUES (213, 243630, 63);
+INSERT INTO teamparticipation (id, team, person) VALUES (214, 243630, 70);
+INSERT INTO teamparticipation (id, team, person) VALUES (215, 243630, 243610);
+INSERT INTO teamparticipation (id, team, person) VALUES (216, 243630, 243611);
+INSERT INTO teamparticipation (id, team, person) VALUES (217, 243630, 243617);
+INSERT INTO teamparticipation (id, team, person) VALUES (218, 243630, 243622);
+INSERT INTO teamparticipation (id, team, person) VALUES (219, 243630, 243623);


ALTER TABLE teamparticipation ENABLE TRIGGER ALL;

=== added file 'database/schema/patch-2109-55-2.sql'
--- database/schema/patch-2109-55-2.sql 1970-01-01 00:00:00 +0000
+++ database/schema/patch-2109-55-2.sql 2009-06-30 06:33:22 +0000
@@ -0,0 +1,9 @@
+SET client_min_messages=ERROR;
+
+CREATE INDEX revisionauthor__lower_email__idx ON RevisionAuthor(lower(email));
+CREATE INDEX HWSubmission__lower_raw_emailaddress__idx
+    ON HWSubmission(lower(raw_emailaddress));
+CREATE INDEX question__status__datecreated__idx
+    ON Question(status, datecreated);
+
+INSERT INTO LaunchpadDatabaseRevision VALUES (2109, 55, 2);

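The two `lower(...)` indexes in this patch exist so that case-insensitive e-mail lookups (`WHERE lower(email) = lower(?)`) can be answered from an expression index rather than a sequential scan. A minimal runnable sketch of the same idea, using SQLite (which also supports indexes on expressions) instead of the PostgreSQL schema above; the table and index names are borrowed from the patch purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revisionauthor (id INTEGER PRIMARY KEY, email TEXT)")
# Expression index, analogous to the index added by patch-2109-55-2.sql:
conn.execute(
    "CREATE INDEX revisionauthor__lower_email__idx "
    "ON revisionauthor (lower(email))"
)
conn.execute("INSERT INTO revisionauthor (email) VALUES ('Jane.Doe@Example.com')")

# A case-insensitive lookup whose predicate matches the indexed expression,
# so the planner can use the index:
row = conn.execute(
    "SELECT id FROM revisionauthor WHERE lower(email) = lower(?)",
    ("jane.doe@example.COM",),
).fetchone()
print(row)  # (1,)
```

Note that the index is only usable when the query writes the same expression, `lower(email)`; an index on the bare `email` column would not help this predicate.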
=== added file 'database/schema/patch-2109-61-1.sql'
--- database/schema/patch-2109-61-1.sql 1970-01-01 00:00:00 +0000
+++ database/schema/patch-2109-61-1.sql 2009-07-06 12:03:49 +0000
@@ -0,0 +1,71 @@
+SET client_min_messages=ERROR;
+
+DROP VIEW IF EXISTS POExport;
+
+CREATE VIEW POExport AS
+SELECT
+    ((COALESCE((potmsgset.id)::text, 'X'::text) || '.'::text) || COALESCE((translationmessage.id)::text, 'X'::text)) AS id,
+    POTemplate.productseries,
+    POTemplate.sourcepackagename,
+    POTemplate.distroseries,
+    POTemplate.id AS potemplate,
+    POTemplate.header AS template_header,
+    POTemplate.languagepack,
+    POFile.id AS pofile,
+    POFile.language,
+    POFile.variant,
+    POFile.topcomment AS translation_file_comment,
+    POFile.header AS translation_header,
+    POFile.fuzzyheader AS is_translation_header_fuzzy,
+    TranslationTemplateItem.sequence,
+    POTMsgSet.id AS potmsgset,
+    TranslationMessage.comment,
+    POTMsgSet.sourcecomment AS source_comment,
+    POTMsgSet.filereferences AS file_references,
+    POTMsgSet.flagscomment AS flags_comment,
+    POTMsgSet.context,
+    msgid_singular.msgid AS msgid_singular,
+    msgid_plural.msgid AS msgid_plural,
+    TranslationMessage.is_current,
+    TranslationMessage.is_imported,
+    TranslationMessage.potemplate AS diverged,
+    potranslation0.translation AS translation0,
+    potranslation1.translation AS translation1,
+    potranslation2.translation AS translation2,
+    potranslation3.translation AS translation3,
+    potranslation4.translation AS translation4,
+    potranslation5.translation AS translation5
+FROM POTMsgSet
+JOIN TranslationTemplateItem ON
+    TranslationTemplateItem.potmsgset = POTMsgSet.id
+JOIN POTemplate ON
+    POTemplate.id = TranslationTemplateItem.potemplate
+JOIN POFile ON
+    POTemplate.id = POFile.potemplate
+LEFT JOIN TranslationMessage ON
+    POTMsgSet.id = TranslationMessage.potmsgset AND
+    TranslationMessage.is_current IS TRUE AND
+    TranslationMessage.language = POFile.language AND
+    TranslationMessage.variant IS NOT DISTINCT FROM POFile.variant
+LEFT JOIN POMsgID AS msgid_singular ON
+    msgid_singular.id = POTMsgSet.msgid_singular
+LEFT JOIN POMsgID AS msgid_plural ON
+    msgid_plural.id = POTMsgSet.msgid_plural
327 | +LEFT JOIN POTranslation AS potranslation0 ON |
328 | + potranslation0.id = TranslationMessage.msgstr0 |
329 | +LEFT JOIN POTranslation AS potranslation1 ON |
330 | + potranslation1.id = TranslationMessage.msgstr1 |
331 | +LEFT JOIN POTranslation AS potranslation2 ON |
332 | + potranslation2.id = TranslationMessage.msgstr2 |
333 | +LEFT JOIN POTranslation AS potranslation3 ON |
334 | + potranslation3.id = TranslationMessage.msgstr3 |
335 | +LEFT JOIN POTranslation AS potranslation4 ON |
336 | + potranslation4.id = TranslationMessage.msgstr4 |
337 | +LEFT JOIN POTranslation potranslation5 ON |
338 | + potranslation5.id = TranslationMessage.msgstr5 |
339 | +WHERE |
340 | + TranslationMessage.potemplate IS NULL OR |
341 | + TranslationMessage.potemplate = POFile.potemplate; |
342 | + |
343 | + |
344 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2109, 61, 1); |
345 | |
346 | === added file 'database/schema/patch-2109-62-0.sql' |
347 | --- database/schema/patch-2109-62-0.sql 1970-01-01 00:00:00 +0000 |
348 | +++ database/schema/patch-2109-62-0.sql 2009-07-09 13:40:12 +0000 |
349 | @@ -0,0 +1,5 @@ |
350 | +SET client_min_messages=ERROR; |
351 | + |
352 | +DROP VIEW IF EXISTS POExport; |
353 | + |
354 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2109, 62, 0); |
355 | |
356 | === modified file 'database/schema/security.cfg' |
357 | --- database/schema/security.cfg 2009-07-17 18:46:25 +0000 |
358 | +++ database/schema/security.cfg 2009-07-19 04:41:14 +0000 |
359 | @@ -592,6 +592,7 @@ |
360 | public.revisioncache = SELECT, INSERT |
361 | public.revisionparent = SELECT, INSERT |
362 | public.revisionproperty = SELECT, INSERT |
363 | +public.seriessourcepackagebranch = SELECT |
364 | public.sourcepackagename = SELECT |
365 | public.staticdiff = SELECT, INSERT, DELETE |
366 | public.validpersoncache = SELECT |
367 | @@ -1116,7 +1117,7 @@ |
368 | public.packageupload = SELECT, UPDATE |
369 | public.packageuploadsource = SELECT |
370 | public.packageuploadbuild = SELECT |
371 | -public.packageuploadcustom = SELECT |
372 | +public.packageuploadcustom = SELECT, UPDATE |
373 | |
374 | # Distribution/Publishing stuff |
375 | public.archive = SELECT, UPDATE |
376 | @@ -1133,20 +1134,21 @@ |
377 | public.pocketchroot = SELECT |
378 | public.sourcepackagerelease = SELECT, UPDATE |
379 | public.binarypackagerelease = SELECT, UPDATE |
380 | -public.sourcepackagereleasefile = SELECT |
381 | -public.binarypackagefile = SELECT |
382 | +public.sourcepackagereleasefile = SELECT, UPDATE |
383 | +public.binarypackagefile = SELECT, UPDATE |
384 | public.sourcepackagename = SELECT |
385 | public.binarypackagename = SELECT |
386 | public.binarypackagepublishinghistory = SELECT |
387 | public.sourcepackagepublishinghistory = SELECT |
388 | public.sourcepackagefilepublishing = SELECT |
389 | public.binarypackagefilepublishing = SELECT |
390 | -public.securesourcepackagepublishinghistory = SELECT, INSERT |
391 | -public.securebinarypackagepublishinghistory = SELECT, INSERT |
392 | +public.securesourcepackagepublishinghistory = SELECT, INSERT, UPDATE |
393 | +public.securebinarypackagepublishinghistory = SELECT, INSERT, UPDATE |
394 | public.component = SELECT |
395 | public.section = SELECT |
396 | public.componentselection = SELECT |
397 | public.sectionselection = SELECT |
398 | +public.packagediff = SELECT, UPDATE |
399 | |
400 | # Librarian stuff |
401 | public.libraryfilealias = SELECT, INSERT |
402 | @@ -1602,6 +1604,7 @@ |
403 | public.product = SELECT |
404 | public.productseries = SELECT |
405 | public.revision = SELECT |
406 | +public.revisionauthor = SELECT, INSERT |
407 | public.sourcepackagename = SELECT |
408 | public.staticdiff = SELECT, INSERT |
409 | public.teammembership = SELECT |
410 | |
411 | === modified file 'database/schema/trusted.sql' |
412 | --- database/schema/trusted.sql 2009-07-17 00:26:05 +0000 |
413 | +++ database/schema/trusted.sql 2009-07-19 04:41:14 +0000 |
414 | @@ -409,13 +409,13 @@ |
415 | |
416 | IF v_trash_old THEN |
417 | -- Was this somebody's most-recently-changed message? |
418 | + -- If so, delete the entry for that change. |
419 | DELETE FROM POFileTranslator |
420 | WHERE latest_message = OLD.id; |
421 | - |
422 | IF FOUND THEN |
423 | - -- Delete old records. |
424 | - |
425 | - -- Insert a past record if there is one. |
426 | + -- We deleted the entry for somebody's latest contribution. |
427 | + -- Find that person's latest remaining contribution and |
428 | + -- create a new record for that. |
429 | INSERT INTO POFileTranslator ( |
430 | person, pofile, latest_message, date_last_touched |
431 | ) |
432 | @@ -427,25 +427,24 @@ |
433 | new_latest_message.date_reviewed) |
434 | FROM POFile |
435 | JOIN TranslationTemplateItem AS old_template_item |
436 | - ON (OLD.potmsgset = |
437 | - old_template_item.potmsgset) AND |
438 | - (old_template_item.potemplate = pofile.potemplate) AND |
439 | - (pofile.language |
440 | - IS NOT DISTINCT FROM OLD.language) AND |
441 | - (pofile.variant |
442 | - IS NOT DISTINCT FROM OLD.variant) |
443 | + ON OLD.potmsgset = old_template_item.potmsgset AND |
444 | + old_template_item.potemplate = pofile.potemplate AND |
445 | + pofile.language = OLD.language AND |
446 | + pofile.variant IS NOT DISTINCT FROM OLD.variant |
447 | JOIN TranslationTemplateItem AS new_template_item |
448 | ON (old_template_item.potemplate = |
449 | new_template_item.potemplate) |
450 | JOIN TranslationMessage AS new_latest_message |
451 | - ON (new_latest_message.potmsgset = |
452 | - new_template_item.potmsgset) AND |
453 | - (new_latest_message.language |
454 | - IS NOT DISTINCT FROM OLD.language AND |
455 | - (new_latest_message.variant) |
456 | - IS NOT DISTINCT FROM OLD.variant) |
457 | + ON new_latest_message.potmsgset = |
458 | + new_template_item.potmsgset AND |
459 | + new_latest_message.language = OLD.language AND |
460 | + new_latest_message.variant IS NOT DISTINCT FROM OLD.variant |
461 | + LEFT OUTER JOIN POfileTranslator AS ExistingEntry |
462 | + ON ExistingEntry.person = OLD.submitter AND |
463 | + ExistingEntry.pofile = POFile.id |
464 | WHERE |
465 | - new_latest_message.submitter=OLD.submitter |
466 | + new_latest_message.submitter = OLD.submitter AND |
467 | + ExistingEntry IS NULL |
468 | ORDER BY new_latest_message.submitter, pofile.id, |
469 | new_latest_message.date_created DESC, |
470 | new_latest_message.id DESC; |
471 | |
472 | === modified file 'lib/canonical/launchpad/doc/celebrities.txt' |
473 | --- lib/canonical/launchpad/doc/celebrities.txt 2009-05-06 16:10:16 +0000 |
474 | +++ lib/canonical/launchpad/doc/celebrities.txt 2009-07-15 02:58:08 +0000 |
475 | @@ -200,3 +200,14 @@ |
476 | >>> ubuntu_branches = personset.getByName('ubuntu-branches') |
477 | >>> celebs.ubuntu_branches == ubuntu_branches |
478 | True |
479 | + |
480 | + |
481 | +== Ubuntu security team == |
482 | + |
483 | +There is a celebrity representing the 'ubuntu-security' team, which is |
484 | +mainly used for granting special permissions on the ubuntu primary |
485 | +archive. |
486 | + |
487 | + >>> ubuntu_security = personset.getByName('ubuntu-security') |
488 | + >>> celebs.ubuntu_security == ubuntu_security |
489 | + True |
490 | |
491 | === modified file 'lib/canonical/launchpad/doc/tales.txt' |
492 | --- lib/canonical/launchpad/doc/tales.txt 2009-07-14 12:52:21 +0000 |
493 | +++ lib/canonical/launchpad/doc/tales.txt 2009-07-24 22:07:48 +0000 |
494 | @@ -1225,6 +1225,38 @@ |
495 | u'' |
496 | |
497 | |
498 | +== CSS classes for public and private objects == |
499 | + |
500 | +Users need to recognise private information as they are viewing it. This is |
501 | +accomplished with a CSS class. |
502 | + |
503 | +Any object can be converted to the 'public' CSS class. The object does not need |
504 | +to have a private boolean attribute. |
505 | + |
506 | + >>> thing = object() |
507 | + >>> print test_tales('context/fmt:public-private-css', context=thing) |
508 | + public |
509 | + |
510 | +The CSS class honours the object's privacy state if the object supports |
511 | +the private attribute. If the object is not private, the class is 'public'. |
512 | + |
513 | + >>> bug = factory.makeBug(title='public-and-private') |
514 | + >>> print bug.private |
515 | + False |
516 | + >>> print test_tales('context/fmt:public-private-css', context=bug) |
517 | + public |
518 | + |
519 | +If the private attribute is True, the class is 'private'. |
520 | + |
521 | + >>> owner = bug.bugtasks[0].target.owner |
522 | + >>> login_person(owner) |
523 | + >>> bug.setPrivate(True, owner) |
524 | + True |
525 | + >>> print test_tales('context/fmt:public-private-css', context=bug) |
526 | + private |
527 | + >>> login(ANONYMOUS) |
528 | + |
529 | + |
530 | == Formatting of private attributes on Teams == |
531 | |
532 | To protect privacy of teams, the formatter for teams will only show |
533 | |
534 | === modified file 'lib/canonical/launchpad/doc/vocabularies.txt' |
535 | --- lib/canonical/launchpad/doc/vocabularies.txt 2009-06-02 08:20:49 +0000 |
536 | +++ lib/canonical/launchpad/doc/vocabularies.txt 2009-07-03 11:34:14 +0000 |
537 | @@ -457,6 +457,31 @@ |
538 | BranchVocabulary with respect to the tokens and privacy awareness. |
539 | |
540 | |
541 | +=== HostedBranchRestrictedOnOwner === |
542 | + |
543 | +Here's a vocabulary for all hosted branches owned by the current user. |
544 | + |
545 | + >>> from lp.code.enums import BranchType |
546 | + |
547 | + >>> a_user = factory.makePerson(name='a-branching-user') |
548 | + >>> product1 = factory.makeProduct(name='product-one') |
549 | + >>> mirrored_branch = factory.makeBranch( |
550 | + ... owner=a_user, product=product1, name='mirrored', |
551 | + ... branch_type=BranchType.MIRRORED) |
552 | + >>> product2 = factory.makeProduct(name='product-two') |
553 | + >>> hosted_branch = factory.makeBranch( |
554 | + ... owner=a_user, product=product2, name='hosted') |
555 | + >>> foreign_branch = factory.makeBranch() |
556 | + |
557 | +It returns branches owned by the user, but not ones owned by others, nor |
558 | +ones that aren't hosted on Launchpad. |
559 | + |
560 | + >>> branch_vocabulary = vocabulary_registry.get( |
561 | + ... a_user, "HostedBranchRestrictedOnOwner") |
562 | + >>> print_vocab_branches(branch_vocabulary, None) |
563 | + ~a-branching-user/product-two/hosted |
564 | + |
565 | + |
566 | === Processor === |
567 | |
568 | All processors type available in Launchpad. |
569 | |
570 | === modified file 'lib/canonical/launchpad/icing/style-3-0.css' |
571 | --- lib/canonical/launchpad/icing/style-3-0.css 2009-07-15 16:05:25 +0000 |
572 | +++ lib/canonical/launchpad/icing/style-3-0.css 2009-07-24 22:18:22 +0000 |
573 | @@ -189,7 +189,7 @@ |
574 | width: auto; |
575 | margin-right: 24%; |
576 | } |
577 | -#footer { |
578 | +.footer { |
579 | margin-top: 2em; |
580 | border-top: 1px solid #cdd3dd; |
581 | padding-top: .5em; |
582 | @@ -234,9 +234,13 @@ |
583 | font-family: bitstream vera sans, arial, helvetica, clean, sans-serif; |
584 | font-size: 97%; /* 2.0 backward compatability */ |
585 | } |
586 | +body.private { |
587 | + /* It must be obvious to the user that the context is private */ |
588 | + background: url("/@@/private-y-bg") top left repeat-y; |
589 | + } |
590 | h1 { |
591 | clear: none; |
592 | - padding-top: 6px; /* An offset fromt he logo. */ |
593 | + padding-top: 6px; /* An offset from the logo. */ |
594 | font-size: 197%; |
595 | margin-bottom: .5em; /* 2.0 backward compatability */ |
596 | } |
597 | @@ -252,10 +256,7 @@ |
598 | p, li, dt, dd, blockquote { |
599 | max-width: 45em; /* Wrap the text before the eye gets lost. */ |
600 | } |
601 | -pre, code, samp, tt, |
602 | -#bug-description, |
603 | -.bug-comment, |
604 | -.bug-activity { |
605 | +pre, code, samp, tt, .console { |
606 | font-size: 116%; |
607 | } |
608 | ol { |
609 | @@ -356,7 +357,7 @@ |
610 | |
611 | |
612 | /* Exceptions they may be common. */ |
613 | -#contributors dt strong { |
614 | +.contributors dt strong { |
615 | padding-left: 1em; |
616 | } |
617 | |
618 | @@ -383,17 +384,17 @@ |
619 | background: #fbfbfb; |
620 | } |
621 | |
622 | -#project-downloads { |
623 | +.downloads { |
624 | font-weight: bold; |
625 | } |
626 | -#project-downloads a { |
627 | +.downloads a { |
628 | color: #4f843c; |
629 | } |
630 | -#project-downloads li { |
631 | +.downloads li { |
632 | margin: 0; |
633 | padding: 2px 0 0; |
634 | } |
635 | -#project-downloads li a { |
636 | +.downloads li a { |
637 | display: block; |
638 | margin: 0; |
639 | border: 1px solid #4f843c; |
640 | @@ -406,7 +407,7 @@ |
641 | font-size: 108%; |
642 | text-decoration: underline; |
643 | } |
644 | -#project-downloads .released { |
645 | +.downloads .released { |
646 | margin: 0 0 .5em 0; |
647 | background: #f3f3f3 url(../images/bg-project-released.gif) right bottom no-repeat; |
648 | padding: .4em .2em; |
649 | @@ -415,36 +416,36 @@ |
650 | text-align: right; |
651 | } |
652 | |
653 | -#get-involved ul { |
654 | +.involvement ul { |
655 | border-top: 1px solid #d0d0d0; |
656 | } |
657 | -#get-involved li { |
658 | +.involvement li { |
659 | border-bottom: 1px solid #d0d0d0; |
660 | padding: 0; |
661 | font-size: 108%; |
662 | font-weight: bold; |
663 | } |
664 | -#get-involved a { |
665 | +.involvement a { |
666 | display: block; |
667 | padding: .3em; |
668 | } |
669 | -#get-involved a.bugs { |
670 | +.involvement a.bugs { |
671 | color: #b9413e; |
672 | background: url(../images/red-arrow.gif) right center no-repeat; |
673 | } |
674 | -#get-involved a.question { |
675 | +.involvement a.question { |
676 | color: #5265b2; |
677 | background: url(../images/blue-arrow.gif) right center no-repeat; |
678 | } |
679 | -#get-involved a.translate { |
680 | +.involvement a.translate { |
681 | color: #c5458e; |
682 | background: url(../images/pink-arrow.gif) right center no-repeat; |
683 | } |
684 | -#get-involved a.code { |
685 | +.involvement a.code { |
686 | color: #d39f57; |
687 | background: url(../images/yellow-arrow.gif) right center no-repeat; |
688 | } |
689 | -#get-involved a.blueprint { |
690 | +.involvement a.blueprint { |
691 | color: #5ba4c6; |
692 | background: url(../images/lightblue-arrow.gif) right center no-repeat; |
693 | } |
694 | |
695 | === modified file 'lib/canonical/launchpad/icing/style.css' |
696 | --- lib/canonical/launchpad/icing/style.css 2009-07-16 04:17:24 +0000 |
697 | +++ lib/canonical/launchpad/icing/style.css 2009-07-21 03:23:28 +0000 |
698 | @@ -1618,9 +1618,15 @@ |
699 | font-size: 1.5em; |
700 | font-family: "URW Gothic L","MgOpen Moderna","Lucida Sans",sans-serif; |
701 | text-align: center; |
702 | - margin: 1em 0 1em 0; |
703 | - padding: 0.2em; |
704 | + margin: auto; |
705 | + margin-top: 1em; |
706 | + margin-bottom: 1em; |
707 | + padding: 0.5em; |
708 | background-color: #ededed; |
709 | + width: 90%; |
710 | +} |
711 | +#home-description .smaller { |
712 | + font-size: 80%; |
713 | } |
714 | #home-stats { |
715 | margin: auto; |
716 | |
717 | === added file 'lib/canonical/launchpad/images/edit-transparent.png' |
718 | Binary files lib/canonical/launchpad/images/edit-transparent.png 1970-01-01 00:00:00 +0000 and lib/canonical/launchpad/images/edit-transparent.png 2009-07-20 14:59:27 +0000 differ |
719 | === added file 'lib/canonical/launchpad/images/private-y-bg.png' |
720 | Binary files lib/canonical/launchpad/images/private-y-bg.png 1970-01-01 00:00:00 +0000 and lib/canonical/launchpad/images/private-y-bg.png 2009-07-24 22:07:48 +0000 differ |
721 | === modified file 'lib/canonical/launchpad/interfaces/launchpad.py' |
722 | --- lib/canonical/launchpad/interfaces/launchpad.py 2009-07-17 00:26:05 +0000 |
723 | +++ lib/canonical/launchpad/interfaces/launchpad.py 2009-07-19 04:41:14 +0000 |
724 | @@ -128,10 +128,10 @@ |
725 | ubuntu_branches = Attribute("The Ubuntu branches team") |
726 | ubuntu_bugzilla = Attribute("The Ubuntu Bugzilla.") |
727 | ubuntu_cdimage_mirror = Attribute("The main cdimage mirror for Ubuntu.") |
728 | + ubuntu_security = Attribute("The 'ubuntu-security' team.") |
729 | vcs_imports = Attribute("The 'vcs-imports' team.") |
730 | |
731 | |
732 | - |
733 | class ICrowd(Interface): |
734 | |
735 | def __contains__(person_or_team_or_anything): |
736 | |
737 | === modified file 'lib/canonical/launchpad/javascript/bugs/bugtask-index.js' |
738 | --- lib/canonical/launchpad/javascript/bugs/bugtask-index.js 2009-07-17 18:46:25 +0000 |
739 | +++ lib/canonical/launchpad/javascript/bugs/bugtask-index.js 2009-07-21 22:46:23 +0000 |
740 | @@ -259,11 +259,6 @@ |
741 | subscriber: new Y.lp.Subscriber({uri: LP.client.links.me}) |
742 | }); |
743 | |
744 | - // XXX deryck 2009-07-09 bug=397406 The classnames used to |
745 | - // determine direct vs. dupe subscriptions are not set |
746 | - // correctly and fix_subscription_link_classes works around |
747 | - // this bug. |
748 | - fix_subscription_link_classes(subscription); |
749 | var is_direct = subscription.get( |
750 | 'link').get('parentNode').hasClass('subscribed-true'); |
751 | var has_dupes = subscription.get( |
752 | @@ -298,27 +293,6 @@ |
753 | } |
754 | |
755 | /* |
756 | - * XXX deryck 2009-07-09 bug=397406 The classnames used to |
757 | - * determine direct vs. dupe subscriptions are not set |
758 | - * correctly and fix_subscription_link_classes works around |
759 | - * this bug. |
760 | - * |
761 | - * @method fix_subscription_link_classes |
762 | - * @param subscription {Object} A Y.lp.Subscription object. |
763 | - */ |
764 | -function fix_subscription_link_classes(subscription) { |
765 | - var subscriber = subscription.get('subscriber'); |
766 | - var subscription_link = subscription.get('link'); |
767 | - var me_nodes = Y.all('.subscriber-' + subscriber.get('name')); |
768 | - if (Y.Lang.isValue(me_nodes) && me_nodes.size() > 1) { |
769 | - set_subscription_link_parent_class(subscription_link, true, true); |
770 | - me_nodes.each(function(div) { |
771 | - set_subscription_link_parent_class(div.query('img'), true, true); |
772 | - }); |
773 | - } |
774 | -} |
775 | - |
776 | -/* |
777 | * Set click handlers for unsubscribe remove icons. |
778 | * |
779 | * @method setup_unsubscribe_icon_handlers |
780 | @@ -1233,7 +1207,7 @@ |
781 | edit_icon.setAttribute('src', '/@@/edit'); |
782 | }); |
783 | content.on('mouseout', function(e) { |
784 | - edit_icon.setAttribute('src', null); |
785 | + edit_icon.setAttribute('src', '/@@/edit-transparent'); |
786 | }); |
787 | content.setStyle('cursor', 'pointer'); |
788 | }; |
789 | @@ -1259,8 +1233,8 @@ |
790 |         // cancel clicks on the edit links. Users most likely don't |
791 | // want to edit the bugtasks. |
792 | if (Y.Lang.isValue(LP.client.cache.bug.duplicate_of_link)) { |
793 | - status_content.on('click', function(e) { e.halt() }); |
794 | - importance_content.on('click', function(e) { e.halt() }); |
795 | + status_content.on('click', function(e) { e.halt(); }); |
796 | + importance_content.on('click', function(e) { e.halt(); }); |
797 | return; |
798 | } |
799 | |
800 | |
801 | === modified file 'lib/canonical/launchpad/javascript/lp/lp.js' |
802 | --- lib/canonical/launchpad/javascript/lp/lp.js 2009-07-13 10:52:46 +0000 |
803 | +++ lib/canonical/launchpad/javascript/lp/lp.js 2009-07-21 17:14:29 +0000 |
804 | @@ -516,7 +516,17 @@ |
805 | for (var i = 0; i < nodes.length; i++) { |
806 | var node = nodes[i]; |
807 | if (node.focus) { |
808 | - node.focus(); |
809 | + try { |
810 | + // Trying to focus a hidden element throws an error in IE8. |
811 | + if (node.offsetHeight !== 0) { |
812 | + node.focus(); |
813 | + } |
814 | + } catch (e) { |
815 | + YUI().use('console', function(Y) { |
816 | + Y.log('In setFocusByName(<' + |
817 | + node.tagName + ' type=' + node.type + '>): ' + e); |
818 | + }); |
819 | + } |
820 | break; |
821 | } |
822 | } |
823 | |
824 | === modified file 'lib/canonical/launchpad/pagetitles.py' |
825 | --- lib/canonical/launchpad/pagetitles.py 2009-07-17 00:26:05 +0000 |
826 | +++ lib/canonical/launchpad/pagetitles.py 2009-07-19 04:41:14 +0000 |
827 | @@ -1136,6 +1136,9 @@ |
828 | |
829 | productseries_linkbranch = ContextTitle('Link an existing branch to %s') |
830 | |
831 | +productseries_link_translations_branch = ContextTitle( |
832 | + "Set translations export branch for %s") |
833 | + |
834 | productseries_index = ContextTitle('%s') |
835 | |
836 | productseries_delete = ContextTitle('Delete %s') |
837 | |
838 | === modified file 'lib/canonical/launchpad/security.py' |
839 | --- lib/canonical/launchpad/security.py 2009-07-19 02:09:34 +0000 |
840 | +++ lib/canonical/launchpad/security.py 2009-07-19 04:41:14 +0000 |
841 | @@ -1333,10 +1333,18 @@ |
842 | usedfor = IPackageUpload |
843 | |
844 | def checkAuthenticated(self, user): |
845 | - """Return True if user has an ArchivePermission or is an admin.""" |
846 | + """Return True if user has an ArchivePermission or is an admin. |
847 | + |
848 | + If it's a delayed-copy, check if the user can upload to its targeted |
849 | + archive. |
850 | + """ |
851 | if AdminByAdminsTeam.checkAuthenticated(self, user): |
852 | return True |
853 | |
854 | + if self.obj.is_delayed_copy: |
855 | + archive_append = AppendArchive(self.obj.archive) |
856 | + return archive_append.checkAuthenticated(user) |
857 | + |
858 | permission_set = getUtility(IArchivePermissionSet) |
859 | permissions = permission_set.componentsForQueueAdmin( |
860 | self.obj.archive, user) |
861 | @@ -1907,6 +1915,9 @@ |
862 | PPA upload rights are managed via `IArchive.canUpload`; |
863 | |
864 | Appending to PRIMARY, PARTNER or COPY archives is restricted to owners. |
865 | + |
866 | + Appending to ubuntu main archives can also be done by the |
867 | + 'ubuntu-security' celebrity. |
868 | """ |
869 | permission = 'launchpad.Append' |
870 | usedfor = IArchive |
871 | @@ -1918,6 +1929,12 @@ |
872 | if self.obj.is_ppa and self.obj.canUpload(user): |
873 | return True |
874 | |
875 | + celebrities = getUtility(ILaunchpadCelebrities) |
876 | + if (self.obj.is_main and |
877 | + self.obj.distribution == celebrities.ubuntu and |
878 | + user.inTeam(celebrities.ubuntu_security)): |
879 | + return True |
880 | + |
881 | return False |
882 | |
883 | |
884 | |
885 | === modified file 'lib/canonical/launchpad/templates/launchpad-librarianfailure.pt' |
886 | --- lib/canonical/launchpad/templates/launchpad-librarianfailure.pt 2009-07-17 17:59:07 +0000 |
887 | +++ lib/canonical/launchpad/templates/launchpad-librarianfailure.pt 2009-07-24 04:26:14 +0000 |
888 | @@ -3,14 +3,11 @@ |
889 | xmlns:tal="http://xml.zope.org/namespaces/tal" |
890 | xmlns:metal="http://xml.zope.org/namespaces/metal" |
891 | xmlns:i18n="http://xml.zope.org/namespaces/i18n" |
892 | - xml:lang="en" |
893 | - lang="en" |
894 | - dir="ltr" |
895 | - metal:use-macro="view/macro:page/freeform" |
896 | + metal:use-macro="view/macro:page/locationless" |
897 | i18n:domain="launchpad" |
898 | > |
899 | <body> |
900 | - <div metal:fill-slot="main"> |
901 | + <div class="top-portlet" metal:fill-slot="main"> |
902 | <h1 class="exception">Sorry, you can't do this right now</h1> |
903 | <p> |
904 | Sorry, you can't upload or download files from Launchpad at the moment, |
905 | |
906 | === modified file 'lib/canonical/launchpad/templates/launchpad-requestexpired.pt' |
907 | --- lib/canonical/launchpad/templates/launchpad-requestexpired.pt 2009-07-17 17:59:07 +0000 |
908 | +++ lib/canonical/launchpad/templates/launchpad-requestexpired.pt 2009-07-24 04:26:14 +0000 |
909 | @@ -3,14 +3,11 @@ |
910 | xmlns:tal="http://xml.zope.org/namespaces/tal" |
911 | xmlns:metal="http://xml.zope.org/namespaces/metal" |
912 | xmlns:i18n="http://xml.zope.org/namespaces/i18n" |
913 | - xml:lang="en" |
914 | - lang="en" |
915 | - dir="ltr" |
916 | - metal:use-macro="view/macro:page/freeform" |
917 | + metal:use-macro="view/macro:page/locationless" |
918 | i18n:domain="launchpad" |
919 | > |
920 | <body> |
921 | - <div metal:fill-slot="main"> |
922 | + <div class="top-portlet" metal:fill-slot="main"> |
923 | <h1 class="exception">Timeout error</h1> |
924 | <div |
925 | id="redirect_notice" style="display:none" class="informational message"> |
926 | |
927 | === modified file 'lib/canonical/launchpad/templates/root-index.pt' |
928 | --- lib/canonical/launchpad/templates/root-index.pt 2009-07-17 17:59:07 +0000 |
929 | +++ lib/canonical/launchpad/templates/root-index.pt 2009-07-21 06:42:41 +0000 |
930 | @@ -46,7 +46,6 @@ |
931 | alt="" |
932 | style="margin: 0 9em 1em 0"/> |
933 | <br /> |
934 | - |
935 | <input id="text" type="text" name="field.text" size="50" /> |
936 | <input type="submit" value="Search Launchpad" /> |
937 | </form> |
938 | @@ -59,11 +58,11 @@ |
939 | <strong tal:content="view/blueprint_count/fmt:intcomma">123</strong> blueprints, |
940 | and counting... |
941 | </div> |
942 | - <div id="home-description">Launchpad is a unique collaboration and |
943 | - <a href="http://bazaar-vcs.org/">Bazaar</a> |
944 | - code hosting platform for software projects. <a |
945 | - href="/+tour" |
946 | - >Read more...</a></div> |
947 | + <div id="home-description"> |
948 | + Launchpad is a code hosting and software collaboration platform.<br /> |
949 | + <span class="smaller">Launchpad is open source -- you can <a href="https://dev.launchpad.net/">join the community</a> of people |
950 | + who help improve it.</span> |
951 | + </div> |
952 | <div id="home-page" style="width:90%; max-width:80em; margin:auto;"> |
953 | <div class="three column left" id="featured-projects"> |
954 | <h2>Featured projects</h2> |
955 | |
956 | === modified file 'lib/canonical/launchpad/tour/bugs' |
957 | --- lib/canonical/launchpad/tour/bugs 2009-06-12 16:36:02 +0000 |
958 | +++ lib/canonical/launchpad/tour/bugs 2009-07-20 19:12:02 +0000 |
959 | @@ -66,7 +66,7 @@ |
960 | In Launchpad, you can share a bug report and its comment history with other communities interested in finding a fix. |
961 | Each project — or even different releases within a project — can track its own status, importance and assignee for that same bug report. |
962 | <br /><br /> |
963 | - Even if the bug is tracked elsewhere — such as in Trac, Sourceforge or Bugzilla — Launchpad can pull in it status. Using our <a href="<a href="https://help.launchpad.net/Bugs/PluginAPISpec">bug tracker plugins</a> for Bugzilla and Trac you can share a comment history for the same bug tracked both in Launchpad and an external tracker.<br /><br /> |
964 | + Even if the bug is tracked elsewhere — such as in Trac, Sourceforge or Bugzilla — Launchpad can pull in it status. Using our <a href="https://help.launchpad.net/Bugs/PluginAPISpec">bug tracker plugins</a> for Bugzilla and Trac you can share a comment history for the same bug tracked both in Launchpad and an external tracker.<br /><br /> |
965 | And to help find low-hanging fruit, there’s a “Bugs fixed elsewhere” report that shows which of your bugs are marked fixed in other communities. |
966 | |
967 | </p> |
968 | |
969 | === modified file 'lib/canonical/launchpad/utilities/celebrities.py' |
970 | --- lib/canonical/launchpad/utilities/celebrities.py 2009-07-17 00:26:05 +0000 |
971 | +++ lib/canonical/launchpad/utilities/celebrities.py 2009-07-19 04:41:14 +0000 |
972 | @@ -136,6 +136,7 @@ |
973 | ubuntu = CelebrityDescriptor(IDistributionSet, 'ubuntu') |
974 | ubuntu_branches = CelebrityDescriptor(IPersonSet, 'ubuntu-branches') |
975 | ubuntu_bugzilla = CelebrityDescriptor(IBugTrackerSet, 'ubuntu-bugzilla') |
976 | + ubuntu_security = CelebrityDescriptor(IPersonSet, 'ubuntu-security') |
977 | vcs_imports = CelebrityDescriptor(IPersonSet, 'vcs-imports') |
978 | |
979 | @property |
980 | |
981 | === modified file 'lib/canonical/launchpad/vocabularies/configure.zcml' |
982 | --- lib/canonical/launchpad/vocabularies/configure.zcml 2009-07-13 18:15:02 +0000 |
983 | +++ lib/canonical/launchpad/vocabularies/configure.zcml 2009-07-19 04:41:14 +0000 |
984 | @@ -17,6 +17,12 @@ |
985 | /> |
986 | |
987 | <utility |
988 | + name="HostedBranchRestrictedOnOwner" |
989 | + component="canonical.launchpad.vocabularies.HostedBranchRestrictedOnOwnerVocabulary" |
990 | + provides="zope.schema.interfaces.IVocabularyFactory" |
991 | + /> |
992 | + |
993 | + <utility |
994 | name="BranchRestrictedOnProduct" |
995 | component="canonical.launchpad.vocabularies.BranchRestrictedOnProductVocabulary" |
996 | provides="zope.schema.interfaces.IVocabularyFactory" |
997 | |
998 | === modified file 'lib/canonical/launchpad/vocabularies/dbobjects.py' |
999 | --- lib/canonical/launchpad/vocabularies/dbobjects.py 2009-07-17 00:26:05 +0000 |
1000 | +++ lib/canonical/launchpad/vocabularies/dbobjects.py 2009-07-19 04:41:14 +0000 |
1001 | @@ -11,6 +11,7 @@ |
1002 | |
1003 | __all__ = [ |
1004 | 'BountyVocabulary', |
1005 | + 'HostedBranchRestrictedOnOwnerVocabulary', |
1006 | 'BranchRestrictedOnProductVocabulary', |
1007 | 'BranchVocabulary', |
1008 | 'BugNominatableSeriesesVocabulary', |
1009 | @@ -76,11 +77,13 @@ |
1010 | CountableIterator, IHugeVocabulary, |
1011 | NamedSQLObjectVocabulary, SQLObjectVocabularyBase) |
1012 | |
1013 | +from lp.code.enums import BranchType |
1014 | from lp.code.interfaces.branch import IBranch |
1015 | from lp.code.interfaces.branchcollection import IAllBranches |
1016 | from lp.registry.interfaces.distribution import IDistribution |
1017 | from lp.registry.interfaces.distroseries import ( |
1018 | DistroSeriesStatus, IDistroSeries) |
1019 | +from lp.registry.interfaces.person import IPerson |
1020 | from lp.registry.interfaces.product import IProduct |
1021 | from lp.registry.interfaces.productseries import IProductSeries |
1022 | from lp.registry.interfaces.project import IProject |
1023 | @@ -186,6 +189,25 @@ |
1024 | return getUtility(IAllBranches).inProduct(self.product) |
1025 | |
1026 | |
1027 | +class HostedBranchRestrictedOnOwnerVocabulary(BranchVocabularyBase): |
1028 | + """A vocabulary for hosted branches owned by the current user. |
1029 | + |
1030 | + These are branches that the user is guaranteed to be able to push |
1031 | + to. |
1032 | + """ |
1033 | + def __init__(self, context=None): |
1034 | + """Pass a Person as context, or anything else for the current user.""" |
1035 | + super(HostedBranchRestrictedOnOwnerVocabulary, self).__init__(context) |
1036 | + if IPerson.providedBy(self.context): |
1037 | + self.user = context |
1038 | + else: |
1039 | + self.user = getUtility(ILaunchBag).user |
1040 | + |
1041 | + def _getCollection(self): |
1042 | + return getUtility(IAllBranches).ownedBy(self.user).withBranchType( |
1043 | + BranchType.HOSTED) |
1044 | + |
1045 | + |
1046 | class BugVocabulary(SQLObjectVocabularyBase): |
1047 | |
1048 | _table = Bug |
1049 | |
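The new `HostedBranchRestrictedOnOwnerVocabulary` above chains two collection filters: branches owned by the user, restricted to the hosted branch type. A self-contained sketch of that filtering, with `Branch`, the type constants, and the chainable collection as stand-ins for the real Launchpad classes (the real code goes through `getUtility(IAllBranches)` and Zope vocabularies):

```python
from dataclasses import dataclass

HOSTED, MIRRORED = "HOSTED", "MIRRORED"

@dataclass
class Branch:
    name: str
    owner: str
    branch_type: str

class BranchCollection:
    """Chainable filters, mirroring IAllBranches.ownedBy().withBranchType()."""
    def __init__(self, branches):
        self.branches = list(branches)

    def ownedBy(self, user):
        return BranchCollection(
            b for b in self.branches if b.owner == user)

    def withBranchType(self, branch_type):
        return BranchCollection(
            b for b in self.branches if b.branch_type == branch_type)

all_branches = BranchCollection([
    Branch("mine-hosted", "sinzui", HOSTED),
    Branch("mine-mirrored", "sinzui", MIRRORED),
    Branch("theirs-hosted", "intellectronica", HOSTED),
])

# Only hosted branches owned by the user are guaranteed pushable:
pushable = all_branches.ownedBy("sinzui").withBranchType(HOSTED).branches
print([b.name for b in pushable])  # ['mine-hosted']
```

The chaining is why `_getCollection` stays a one-liner: each filter returns a new collection, so the vocabulary only composes them.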
1050 | === modified file 'lib/canonical/launchpad/webapp/tales.py' |
1051 | --- lib/canonical/launchpad/webapp/tales.py 2009-07-17 00:26:05 +0000 |
1052 | +++ lib/canonical/launchpad/webapp/tales.py 2009-07-24 22:07:48 +0000 |
1053 | @@ -56,6 +56,7 @@ |
1054 | from canonical.launchpad.webapp.session import get_cookie_domain |
1055 | from canonical.lazr.canonicalurl import nearest_adapter |
1056 | from lp.soyuz.interfaces.build import BuildStatus |
1057 | +from canonical.lazr.utils import safe_hasattr |
1058 | |
1059 | |
1060 | def escape(text, quote=True): |
1061 | @@ -414,7 +415,7 @@ |
1062 | # The names which can be traversed further (e.g context/fmt:url/+edit). |
1063 | traversable_names = {'link': 'link', 'url': 'url', 'api_url': 'api_url'} |
1064 | # Names which are allowed but can't be traversed further. |
1065 | - final_traversable_names = {} |
1066 | + final_traversable_names = {'public-private-css': 'public_private_css',} |
1067 | |
1068 | def __init__(self, context): |
1069 | self._context = context |
1070 | @@ -473,6 +474,13 @@ |
1071 | "No link implementation for %r, IPathAdapter implementation " |
1072 | "for %r." % (self, self._context)) |
1073 | |
1074 | + def public_private_css(self): |
1075 | + """Return the CSS class that represents the object's privacy.""" |
1076 | + if safe_hasattr(self._context, 'private') and self._context.private: |
1077 | + return 'private' |
1078 | + else: |
1079 | + return 'public' |
1080 | + |
1081 | |
1082 | class ObjectImageDisplayAPI: |
1083 | """Base class for producing the HTML that presents objects |
1084 | @@ -911,6 +919,7 @@ |
1085 | } |
1086 | |
1087 | final_traversable_names = {'local-time': 'local_time'} |
1088 | + final_traversable_names.update(ObjectFormatterAPI.final_traversable_names) |
1089 | |
1090 | def traverse(self, name, furtherPath): |
1091 | """Special-case traversal for links with an optional rootsite.""" |
1092 | @@ -1492,6 +1501,7 @@ |
1093 | 'aliases': 'aliases', |
1094 | 'external-link': 'external_link', |
1095 | 'external-title-link': 'external_title_link'} |
1096 | + final_traversable_names.update(ObjectFormatterAPI.final_traversable_names) |
1097 | |
1098 | def link(self, view_name): |
1099 | """Return an HTML link to the bugtracker page. |
1100 | @@ -1556,6 +1566,7 @@ |
1101 | final_traversable_names = { |
1102 | 'external-link': 'external_link', |
1103 | 'external-link-short': 'external_link_short'} |
1104 | + final_traversable_names.update(ObjectFormatterAPI.final_traversable_names) |
1105 | |
1106 | def _make_external_link(self, summary=None): |
1107 | """Return an external HTML link to the target of the bug watch. |
1108 | @@ -1965,6 +1976,7 @@ |
1109 | 'icon-link': 'link', |
1110 | 'link-icon': 'link', |
1111 | } |
1112 | + final_traversable_names.update(ObjectFormatterAPI.final_traversable_names) |
1113 | |
1114 | def icon(self): |
1115 | """Return the icon representation of the link.""" |
1116 | |
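The `fmt:public-private-css` formatter added to `tales.py` above defaults to `'public'` for any object and only returns `'private'` when the object both has a `private` attribute and it is true. A standalone sketch of that behaviour, with `safe_hasattr` reimplemented locally (in Launchpad it comes from `canonical.lazr.utils`) and `Team` as a stand-in context:

```python
_marker = object()

def safe_hasattr(obj, name):
    """A hasattr that uses a sentinel default (local stand-in)."""
    return getattr(obj, name, _marker) is not _marker

def public_private_css(context):
    """Return the CSS class that represents the object's privacy."""
    if safe_hasattr(context, 'private') and context.private:
        return 'private'
    return 'public'

class Team:
    def __init__(self, private):
        self.private = private

print(public_private_css(object()))    # objects without .private -> public
print(public_private_css(Team(True)))  # private
print(public_private_css(Team(False)))  # public
```

This is what lets `base-layout.pt` apply the formatter to every context unconditionally: objects with no privacy concept simply render as `public`.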
1117 | === modified file 'lib/canonical/launchpad/windmill/tests/test_registry/test_plusnew_step2.py' |
1118 | --- lib/canonical/launchpad/windmill/tests/test_registry/test_plusnew_step2.py 2009-06-25 05:30:52 +0000 |
1119 | +++ lib/canonical/launchpad/windmill/tests/test_registry/test_plusnew_step2.py 2009-07-21 17:14:29 +0000 |
1120 | @@ -52,12 +52,17 @@ |
1121 | validator='className|unseen') |
1122 | # Clicking on the href expands the search results. |
1123 | client.click(id='search-results-expander') |
1124 | - client.waits.sleep(milliseconds=u'1000') |
1125 | + client.waits.forElement( |
1126 | + xpath='//*[@id="search-results" and contains(@class, "lazr-opened")]', |
1127 | + milliseconds=u'1000') |
1128 | client.asserts.assertProperty( |
1129 | id=u'search-results', |
1130 | validator='className|lazr-opened') |
1131 | # Clicking it again hides the results. |
1132 | client.click(id='search-results-expander') |
1133 | + client.waits.forElement( |
1134 | + xpath='//*[@id="search-results" and contains(@class, "lazr-closed")]', |
1135 | + milliseconds=u'1000') |
1136 | client.asserts.assertProperty( |
1137 | id=u'search-results', |
1138 | validator='className|lazr-closed') |
1139 | |
1140 | === modified file 'lib/canonical/lazr/interfaces/feed.py' |
1141 | --- lib/canonical/lazr/interfaces/feed.py 2009-06-25 05:30:52 +0000 |
1142 | +++ lib/canonical/lazr/interfaces/feed.py 2009-07-21 07:27:48 +0000 |
1143 | @@ -133,14 +133,7 @@ |
1144 | def getItems(): |
1145 | """Get the individual items for the feed. |
1146 | |
1147 | - For instance, get all announcements for a project. Each item should |
1148 | - be converted to a feed entry using itemToFeedEntry. |
1149 | - """ |
1150 | - |
1151 | - def itemToFeedEntry(item): |
1152 | - """Convert a single item to a formatted feed entry. |
1153 | - |
1154 | - An individual entry will be an instance providing `IFeedEntry`. |
1155 | + Individual items will provide `IFeedEntry`. |
1156 | """ |
1157 | |
1158 | def renderAtom(): |
1159 | |
1160 | === removed symlink 'lib/canonical/shipit' |
1161 | === target was u'../../sourcecode/shipit' |
1162 | === removed symlink 'lib/canonical/signon' |
1163 | === target was u'../../sourcecode/canonical-identity-provider' |
1164 | === modified file 'lib/canonical/widgets/product.py' |
1165 | --- lib/canonical/widgets/product.py 2009-07-17 00:26:05 +0000 |
1166 | +++ lib/canonical/widgets/product.py 2009-07-21 03:56:03 +0000 |
1167 | @@ -362,7 +362,9 @@ |
1168 | return self.template() |
1169 | |
1170 | def _renderTable(self, category, column_count=1): |
1171 | - html = ['<table id="%s">' % category] |
1172 | + # The tables are wrapped in divs, since IE8 does not respond |
1173 | + # to setting the table's height to zero. |
1174 | + html = ['<div id="%s"><table>' % category] |
1175 | rendered_items = self.items_by_category[category] |
1176 | row_count = int(math.ceil(len(rendered_items) / float(column_count))) |
1177 | for i in range(0, row_count): |
1178 | @@ -373,7 +375,7 @@ |
1179 | break |
1180 | html.append('<td>%s</td>' % rendered_items[index]) |
1181 | html.append('</tr>') |
1182 | - html.append('</table>') |
1183 | + html.append('</table></div>') |
1184 | return '\n'.join(html) |
1185 | |
1186 | |
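The `_renderTable` change above wraps each table in a div so the widget can be hidden on IE8, which ignores a zero height set directly on a table. A sketch of the method in isolation; the index computation inside the row loop is elided by the hunk, so the column-major `i + j * row_count` formula here is an assumption:

```python
import math

def render_table(category, rendered_items, column_count=1):
    """Render items for a category as a table wrapped in a div
    (IE8 does not respond to setting a table's height to zero)."""
    html = ['<div id="%s"><table>' % category]
    row_count = int(math.ceil(len(rendered_items) / float(column_count)))
    for i in range(row_count):
        html.append('<tr>')
        for j in range(column_count):
            # Assumed column-major layout; the diff elides this line.
            index = i + (j * row_count)
            if index >= len(rendered_items):
                break
            html.append('<td>%s</td>' % rendered_items[index])
        html.append('</tr>')
    html.append('</table></div>')
    return '\n'.join(html)

print(render_table('license', ['MIT', 'GPL'], column_count=2))
```

Scripts that previously targeted `table#category` now find `div#category` instead, since the id moved from the table to the wrapping div.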
1187 | |
1188 | === modified file 'lib/lp/app/browser/tests/base-layout.txt' |
1189 | --- lib/lp/app/browser/tests/base-layout.txt 2009-07-07 23:44:05 +0000 |
1190 | +++ lib/lp/app/browser/tests/base-layout.txt 2009-07-24 22:07:48 +0000 |
1191 | @@ -41,7 +41,7 @@ |
1192 | <title>Test base-layout: main_side</title> ... |
1193 | <!-- Extra head content --> |
1194 | </head> |
1195 | - <body id="document" class="tab-overview main_side yui-skin-sam"> |
1196 | + <body id="document" class="tab-overview main_side public yui-skin-sam"> |
1197 | <div class="yui-d0"> |
1198 | <div id="locationbar"> ... |
1199 | <form id="globalsearch" ... |
1200 | @@ -50,8 +50,9 @@ |
1201 | <div class="yui-t4"> |
1202 | <div id="maincontent" class="yui-main"> |
1203 | <div class="yui-b" dir="ltr"> |
1204 | + <h1>Heading</h1> |
1205 | + <!-- future breadcrumb rule --> |
1206 | <div class="top-portlet"> |
1207 | - <h1>Heading</h1> |
1208 | <p class="registered"> |
1209 | Registered on 2005-09-16 |
1210 | by <a class="sprite team" href="#">Illuminati team</a> |
1211 | @@ -68,7 +69,7 @@ |
1212 | </div> |
1213 | </div><!-- yui-b side --> |
1214 | </div><!-- yui-t4 --> |
1215 | - <div id="footer"> ... |
1216 | + <div id="footer" class="footer"> ... |
1217 | </div><!-- footer--> |
1218 | </div><!-- yui-d0--> |
1219 | <script>LP.client.cache['context'] ... |
1220 | @@ -94,7 +95,7 @@ |
1221 | <title>Test base-layout: main_only</title> ... |
1222 | <!-- Extra head content --> |
1223 | </head> |
1224 | - <body id="document" class="tab-overview main_only yui-skin-sam"> |
1225 | + <body id="document" class="tab-overview main_only public yui-skin-sam"> |
1226 | <div class="yui-d0"> |
1227 | <div id="locationbar"> ... |
1228 | <form id="globalsearch" ... |
1229 | @@ -102,8 +103,9 @@ |
1230 | <div id="lp-apps" ... |
1231 | <div id="maincontent" class="yui-main"> |
1232 | <div class="yui-b" dir="ltr"> |
1233 | + <h1>Heading</h1> |
1234 | + <!-- future breadcrumb rule --> |
1235 | <div class="top-portlet"> |
1236 | - <h1>Heading</h1> |
1237 | <p class="registered"> |
1238 | Registered on 2005-09-16 |
1239 | by <a class="sprite team" href="#">Illuminati team</a> |
1240 | @@ -116,7 +118,7 @@ |
1241 | </div><!-- yui-main --> |
1242 | <!-- yui-b side --> |
1243 | <!-- yui-t4 --> |
1244 | - <div id="footer"> ... |
1245 | + <div id="footer" class="footer"> ... |
1246 | </div><!-- footer--> |
1247 | </div><!-- yui-d0--> |
1248 | <script>LP.client.cache['context'] ... |
1249 | @@ -141,7 +143,7 @@ |
1250 | <title>Test base-layout: searchless</title> ... |
1251 | <!-- Extra head content --> |
1252 | </head> |
1253 | - <body id="document" class="tab-overview searchless yui-skin-sam"> |
1254 | + <body id="document" class="tab-overview searchless public yui-skin-sam"> |
1255 | <div class="yui-d0"> |
1256 | <div id="locationbar"> |
1257 | <div id="logincontrol"><a href="...">Log in / Register</a></div> |
1258 | @@ -150,8 +152,9 @@ |
1259 | </div><!--id="locationbar"--> |
1260 | <div id="maincontent" class="yui-main"> |
1261 | <div class="yui-b" dir="ltr"> |
1262 | + <h1>Heading</h1> |
1263 | + <!-- future breadcrumb rule --> |
1264 | <div class="top-portlet"> |
1265 | - <h1>Heading</h1> |
1266 | <p class="registered"> |
1267 | Registered on 2005-09-16 |
1268 | by <a class="sprite team" href="#">Illuminati team</a> |
1269 | @@ -164,7 +167,7 @@ |
1270 | </div><!-- yui-main --> |
1271 | <!-- yui-b side --> |
1272 | <!-- yui-t4 --> |
1273 | - <div id="footer"> ... |
1274 | + <div id="footer" class="footer"> ... |
1275 | </div><!-- footer--> |
1276 | </div><!-- yui-d0--> |
1277 | <script>LP.client.cache['context'] ... |
1278 | @@ -174,7 +177,8 @@ |
1279 | The locationless template is intended for pages that provide content outside |
1280 | of normal application use. It is for exceptions. The epilogue and main |
1281 | slots are rendered. Global search and the application tabs are not included |
1282 | -in the header. |
1283 | +in the header. The layout does not have a heading slot to provide the h1 |
1284 | +or breadcrumbs; the template must provide its own h1 element. |
1285 | |
1286 | >>> class LocationlessView(LaunchpadView): |
1287 | ... """A simple view to test base-layout.""" |
1288 | @@ -190,7 +194,7 @@ |
1289 | <title>Test base-layout: locationless</title> ... |
1290 | <!-- Extra head content --> |
1291 | </head> |
1292 | - <body id="document" class="tab-overview locationless yui-skin-sam"> |
1293 | + <body id="document" class="tab-overview locationless public yui-skin-sam"> |
1294 | <div class="yui-d0"> |
1295 | <div id="locationbar"> |
1296 | <div id="logincontrol"><a href="...">Log in / Register</a></div> |
1297 | @@ -213,9 +217,38 @@ |
1298 | </div><!-- yui-main --> |
1299 | <!-- yui-b side --> |
1300 | <!-- yui-t4 --> |
1301 | - <div id="footer"> ... |
1302 | + <div id="footer" class="footer"> ... |
1303 | </div><!-- footer--> |
1304 | </div><!-- yui-d0--> |
1305 | <script>LP.client.cache['context'] ... |
1306 | </body> |
1307 | </html> ... |
1308 | + |
1309 | + |
1310 | +Public and private presentation |
1311 | +------------------------------- |
1312 | + |
1313 | +The base-layout master templates uses the fmt:public-private-css formatter to |
1314 | +add the 'public' or 'private' CSS class to the body tag. When the context is |
1315 | +private, the 'private' class is added to the body's class attribute. |
1316 | + |
1317 | + >>> from lp.registry.interfaces.person import PersonVisibility |
1318 | + >>> from canonical.launchpad.testing.pages import find_tag_by_id |
1319 | + |
1320 | + >>> login('admin@canonical.com') |
1321 | + >>> team = factory.makeTeam( |
1322 | + ... owner=user, name='a-private-team', |
1323 | + ... visibility=PersonVisibility.PRIVATE) |
1324 | + >>> view = LocationlessView(team, request) |
1325 | + >>> body = find_tag_by_id(view.render(), 'document') |
1326 | + >>> print body['class'] |
1327 | + tab-overview locationless private yui-skin-sam |
1328 | + |
1329 | +When the context is public, the 'public' class is in the class attribute. |
1330 | + |
1331 | + >>> login(ANONYMOUS) |
1332 | + >>> team = factory.makeTeam(owner=user, name='a-public-team') |
1333 | + >>> view = LocationlessView(team, request) |
1334 | + >>> body = find_tag_by_id(view.render(), 'document') |
1335 | + >>> print body['class'] |
1336 | + tab-overview locationless public yui-skin-sam |
1337 | |
1338 | === modified file 'lib/lp/app/browser/tests/testfiles/main-only.pt' |
1339 | --- lib/lp/app/browser/tests/testfiles/main-only.pt 2009-07-17 17:59:07 +0000 |
1340 | +++ lib/lp/app/browser/tests/testfiles/main-only.pt 2009-07-24 04:26:14 +0000 |
1341 | @@ -12,9 +12,9 @@ |
1342 | </head> |
1343 | |
1344 | <body> |
1345 | + <tal:heading metal:fill-slot="heading">Heading</tal:heading> |
1346 | <tal:main metal:fill-slot="main"> |
1347 | <div class="top-portlet"> |
1348 | - <h1>Heading</h1> |
1349 | <p class="registered"> |
1350 | Registered on 2005-09-16 |
1351 | by <a class="sprite team" href="#">Illuminati team</a> |
1352 | |
1353 | === modified file 'lib/lp/app/browser/tests/testfiles/main-side.pt' |
1354 | --- lib/lp/app/browser/tests/testfiles/main-side.pt 2009-07-17 17:59:07 +0000 |
1355 | +++ lib/lp/app/browser/tests/testfiles/main-side.pt 2009-07-24 04:26:14 +0000 |
1356 | @@ -12,9 +12,9 @@ |
1357 | </head> |
1358 | |
1359 | <body> |
1360 | + <tal:heading metal:fill-slot="heading">Heading</tal:heading> |
1361 | <tal:main metal:fill-slot="main"> |
1362 | <div class="top-portlet"> |
1363 | - <h1>Heading</h1> |
1364 | <p class="registered"> |
1365 | Registered on 2005-09-16 |
1366 | by <a class="sprite team" href="#">Illuminati team</a> |
1367 | |
1368 | === modified file 'lib/lp/app/browser/tests/testfiles/searchless.pt' |
1369 | --- lib/lp/app/browser/tests/testfiles/searchless.pt 2009-07-17 17:59:07 +0000 |
1370 | +++ lib/lp/app/browser/tests/testfiles/searchless.pt 2009-07-24 04:26:14 +0000 |
1371 | @@ -12,9 +12,9 @@ |
1372 | </head> |
1373 | |
1374 | <body> |
1375 | + <tal:heading metal:fill-slot="heading">Heading</tal:heading> |
1376 | <tal:main metal:fill-slot="main"> |
1377 | <div class="top-portlet"> |
1378 | - <h1>Heading</h1> |
1379 | <p class="registered"> |
1380 | Registered on 2005-09-16 |
1381 | by <a class="sprite team" href="#">Illuminati team</a> |
1382 | |
1383 | === modified file 'lib/lp/app/templates/base-layout.pt' |
1384 | --- lib/lp/app/templates/base-layout.pt 2009-07-17 17:59:07 +0000 |
1385 | +++ lib/lp/app/templates/base-layout.pt 2009-07-24 22:07:48 +0000 |
1386 | @@ -74,7 +74,10 @@ |
1387 | </head> |
1388 | |
1389 | <body id="document" |
1390 | - tal:attributes="class string:tab-${view/menu:selectedfacetname} ${view/macro:pagetype} yui-skin-sam"> |
1391 | + tal:attributes="class string:tab-${view/menu:selectedfacetname} |
1392 | + ${view/macro:pagetype} |
1393 | + ${view/context/fmt:public-private-css} |
1394 | + yui-skin-sam"> |
1395 | <div class="yui-d0"> |
1396 | <div id="locationbar"> |
1397 | <tal:login replace="structure context/@@login_status" /> |
1398 | @@ -133,6 +136,12 @@ |
1399 | lang view/lang|default_language|default; |
1400 | xml:lang view/lang|default_language|default; |
1401 | dir view/dir|string:ltr"> |
1402 | + <tal:location condition="view/macro:pagehas/applicationtabs"> |
1403 | + <h1><metal:main define-slot="heading" /></h1> |
1404 | + <tal:breadcrumbs> |
1405 | + <!-- future breadcrumb rule --> |
1406 | + </tal:breadcrumbs> |
1407 | + </tal:location> |
1408 | <metal:main define-slot="main" /> |
1409 | </div><!-- yui-b --> |
1410 | </div><!-- yui-main --> |
1411 | @@ -143,7 +152,7 @@ |
1412 | </div><!-- yui-b side --> |
1413 | </div><!-- yui-t4 --> |
1414 | |
1415 | - <div id="footer"> |
1416 | + <div id="footer" class="footer"> |
1417 | <div id="lp-arcana"> |
1418 | © 2004-2009 <a |
1419 | href="http://canonical.com/">Canonical Ltd.</a> | |
1420 | |
1421 | === modified file 'lib/lp/bugs/browser/bug.py' |
1422 | --- lib/lp/bugs/browser/bug.py 2009-07-17 00:26:05 +0000 |
1423 | +++ lib/lp/bugs/browser/bug.py 2009-07-20 16:59:49 +0000 |
1424 | @@ -385,8 +385,15 @@ |
1425 | |
1426 | @cachedproperty |
1427 | def duplicate_subscribers(self): |
1428 | - """Caches the list of subscribers from duplicates.""" |
1429 | - return frozenset(self.context.getSubscribersFromDuplicates()) |
1430 | + """Caches the list of subscribers from duplicates. |
1431 | + |
1432 | + Don't use getSubscribersFromDuplicates here because that method |
1433 | + omits a user if the user is also a direct or indirect subscriber. |
1434 | + getSubscriptionsFromDuplicates doesn't, so find person objects via |
1435 | + this method. |
1436 | + """ |
1437 | + dupe_subscriptions = self.context.getSubscriptionsFromDuplicates() |
1438 | + return frozenset([sub.person for sub in dupe_subscriptions]) |
1439 | |
1440 | def subscription_class(self, subscribed_person): |
1441 | """Returns a set of CSS class names based on subscription status. |
1442 | |
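The `duplicate_subscribers` change above switches from `getSubscribersFromDuplicates` (which omits a user who is also a direct or indirect subscriber) to `getSubscriptionsFromDuplicates` and extracts the person from each subscription. A minimal sketch with stand-in `Subscription` and `Bug` classes:

```python
class Subscription:
    def __init__(self, person):
        self.person = person

class Bug:
    """Stand-in exposing only the method the property now uses."""
    def __init__(self, dupe_subscriptions):
        self._dupes = dupe_subscriptions

    def getSubscriptionsFromDuplicates(self):
        return list(self._dupes)

bug = Bug([Subscription('alice'), Subscription('bob'), Subscription('alice')])

# The frozenset collapses a person subscribed via several duplicates:
duplicate_subscribers = frozenset(
    sub.person for sub in bug.getSubscriptionsFromDuplicates())
print(sorted(duplicate_subscribers))  # ['alice', 'bob']
```

The frozenset keeps the old "only list a subscriber once" behaviour while no longer dropping people who also hold a direct subscription, which is what the reworked windmill test below exercises.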
1443 | === modified file 'lib/lp/bugs/model/bug.py' |
1444 | --- lib/lp/bugs/model/bug.py 2009-07-17 18:46:25 +0000 |
1445 | +++ lib/lp/bugs/model/bug.py 2009-07-21 20:33:31 +0000 |
1446 | @@ -68,7 +68,6 @@ |
1447 | from lp.bugs.interfaces.bugactivity import IBugActivitySet |
1448 | from lp.bugs.interfaces.bugattachment import ( |
1449 | BugAttachmentType, IBugAttachmentSet) |
1450 | -from lp.bugs.interfaces.bugbranch import IBugBranch |
1451 | from lp.bugs.interfaces.bugmessage import IBugMessageSet |
1452 | from lp.bugs.interfaces.bugnomination import ( |
1453 | NominationError, NominationSeriesObsoleteError) |
1454 | @@ -533,17 +532,6 @@ |
1455 | Bug.duplicateof = %d""" % self.id, |
1456 | prejoins=["person"], clauseTables=["Bug"])) |
1457 | |
1458 | - # Direct and "also notified" subscribers take precedence |
1459 | - # over subscribers from duplicates. |
1460 | - duplicate_subscriptions -= set(self.getDirectSubscriptions()) |
1461 | - also_notified_subscriptions = set() |
1462 | - for also_notified_subscriber in self.getAlsoNotifiedSubscribers(): |
1463 | - for duplicate_subscription in duplicate_subscriptions: |
1464 | - if also_notified_subscriber == duplicate_subscription.person: |
1465 | - also_notified_subscriptions.add(duplicate_subscription) |
1466 | - break |
1467 | - duplicate_subscriptions -= also_notified_subscriptions |
1468 | - |
1469 | # Only add a subscriber once to the list. |
1470 | duplicate_subscribers = set( |
1471 | sub.person for sub in duplicate_subscriptions) |
1472 | |
1473 | === modified file 'lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt' |
1474 | --- lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt 2009-07-17 18:46:25 +0000 |
1475 | +++ lib/lp/bugs/templates/bugtask-tasks-and-nominations-table-row.pt 2009-07-20 14:59:27 +0000 |
1476 | @@ -64,7 +64,7 @@ |
1477 | style="float: left" |
1478 | tal:content="context/status/title" /> |
1479 | <a href="+editstatus" style="margin-left: 3px"> |
1480 | - <img class="editicon" /> |
1481 | + <img class="editicon" src="/@@/edit-transparent" /> |
1482 | </a> |
1483 | </div> |
1484 | </td> |
1485 | @@ -78,7 +78,7 @@ |
1486 | style="float: left" |
1487 | tal:content="context/importance/title" /> |
1488 | <a href="+editstatus" style="margin-left: 3px"> |
1489 | - <img class="editicon" /> |
1490 | + <img class="editicon" src="/@@/edit-transparent" /> |
1491 | </a> |
1492 | </div> |
1493 | </td> |
1494 | |
1495 | === modified file 'lib/lp/bugs/windmill/tests/test_bugs/test_bug_inline_subscriber.py' |
1496 | --- lib/lp/bugs/windmill/tests/test_bugs/test_bug_inline_subscriber.py 2009-07-17 00:26:05 +0000 |
1497 | +++ lib/lp/bugs/windmill/tests/test_bugs/test_bug_inline_subscriber.py 2009-07-20 18:38:23 +0000 |
1498 | @@ -167,52 +167,81 @@ |
1499 | xpath=SUBSCRIPTION_LINK, validator=u'Subscribe') |
1500 | client.asserts.assertNotNode(classname=FOO_BAR_CLASS) |
1501 | |
1502 | - # A bit of a corner case here, but make sure that when |
1503 | - # a user is subscribed to both the main bug and the dupe |
1504 | - # that the user is unsubscribed correctly. |
1505 | - client.open(url=BUG_URL % 5) |
1506 | - client.waits.forPageLoad(timeout=PAGE_LOAD) |
1507 | - client.waits.forElement( |
1508 | - id=u'subscribers-links', timeout=FOR_ELEMENT) |
1509 | - # Subscribe to the main bug, bug 5. |
1510 | - client.click(xpath=SUBSCRIPTION_LINK) |
1511 | - client.waits.sleep(milliseconds=SLEEP) |
1512 | - client.asserts.assertText( |
1513 | - xpath=SUBSCRIPTION_LINK, validator=u'Unsubscribe') |
1514 | - # Go to bug 6, the dupe, and subscribe. |
1515 | - client.open(url=BUG_URL % 6) |
1516 | - client.waits.forPageLoad(timeout=PAGE_LOAD) |
1517 | - client.waits.forElement( |
1518 | - id=u'subscribers-links', timeout=FOR_ELEMENT) |
1519 | - client.click(xpath=SUBSCRIPTION_LINK) |
1520 | - # Now back to bug 5. The first unsubscribe should remove |
1521 | - # the current bug direct subscription. |
1522 | - client.open(url=BUG_URL % 5) |
1523 | - client.waits.forPageLoad(timeout=PAGE_LOAD) |
1524 | - client.waits.forElement( |
1525 | - id=u'subscribers-links', timeout=FOR_ELEMENT) |
1526 | - client.asserts.assertText( |
1527 | - xpath=SUBSCRIPTION_LINK, validator=u'Unsubscribe') |
1528 | - # Confirm there are 2 subscriber links: one in direct subscribers, |
1529 | - # and one in duplicate subscribers. |
1530 | - client.asserts.assertNode( |
1531 | - xpath=(u'//div[@id="subscribers-links"]' |
1532 | - '/div/a[@name="Foo Bar"]')) |
1533 | - client.asserts.assertNode( |
1534 | - xpath=(u'//div[@id="subscribers-from-duplicates"]' |
1535 | - '/div/a[@name="Foo Bar"]')) |
1536 | - # The first click unsubscribes the direct subscription, leaving the dupe. |
1537 | - client.click(xpath=SUBSCRIPTION_LINK) |
1538 | - client.waits.sleep(milliseconds=SLEEP) |
1539 | - client.asserts.assertNotNode( |
1540 | - xpath=(u'//div[@id="subscribers-links"]' |
1541 | - '/div/a[@name="Foo Bar"]')) |
1542 | - client.asserts.assertNode( |
1543 | - xpath=(u'//div[@id="subscribers-from-duplicates"]' |
1544 | - '/div/a[@name="Foo Bar"]')) |
1545 | - # The second unsubscribe removes the dupe/ |
1546 | - client.click(xpath=SUBSCRIPTION_LINK) |
1547 | - client.waits.sleep(milliseconds=SLEEP) |
1548 | - client.asserts.assertNotNode( |
1549 | - xpath=(u'//div[@id="subscribers-from-duplicates"]' |
1550 | - '/div/a[@name="Foo Bar"]')) |
1551 | + # Subscribe/Unsubscribe link handling when dealing |
1552 | + # with duplicates... |
1553 | + # |
1554 | + # First test case, ensure unsubscribing works when |
1555 | + # dealing with a duplicate and an indirect subscription. |
1556 | + lpuser.SAMPLE_PERSON.ensure_login(client) |
1557 | + # Go to bug 6, the dupe, and subscribe. |
1558 | + client.open(url=BUG_URL % 6) |
1559 | + client.waits.forPageLoad(timeout=PAGE_LOAD) |
1560 | + client.waits.forElement( |
1561 | + id=u'subscribers-links', timeout=FOR_ELEMENT) |
1562 | + client.click(xpath=SUBSCRIPTION_LINK) |
1563 | + client.waits.sleep(milliseconds=SLEEP) |
1564 | + client.asserts.assertText( |
1565 | + xpath=SUBSCRIPTION_LINK, validator=u'Unsubscribe') |
1566 | + # Now back to bug 5. |
1567 | + client.open(url=BUG_URL % 5) |
1568 | + client.waits.forPageLoad(timeout=PAGE_LOAD) |
1569 | + client.waits.forElement( |
1570 | + id=u'subscribers-links', timeout=FOR_ELEMENT) |
1571 | + # Confirm there are 2 subscriber links: one in duplicate subscribers, |
1572 | + # and one in indirect subscribers. |
1573 | + client.asserts.assertNode( |
1574 | + xpath=(u'//div[@id="subscribers-from-duplicates"]' |
1575 | + '/div/a[@name="Sample Person"]')) |
1576 | + client.asserts.assertNode( |
1577 | + xpath=(u'//div[@id="subscribers-indirect"]' |
1578 | + '/div/a[text() = "Sample Person"]')) |
1579 | + # Clicking "Unsubscribe" successfully removes the duplicate subscription, |
1580 | + # but the indirect subscription remains. |
1581 | + client.click(xpath=SUBSCRIPTION_LINK) |
1582 | + client.waits.sleep(milliseconds=SLEEP) |
1583 | + client.asserts.assertNotNode( |
1584 | + xpath=(u'//div[@id="subscribers-from-duplicates"]' |
1585 | + '/div/a[@name="Sample Person"]')) |
1586 | + client.asserts.assertNode( |
1587 | + xpath=(u'//div[@id="subscribers-indirect"]' |
1588 | + '/div/a[text() = "Sample Person"]')) |
1589 | + |
1590 | + # Second test case, confirm duplicate handling is correct between direct |
1591 | + # and duplicate subscriptions. Subscribe directly to bug 5. |
1592 | + client.click(xpath=SUBSCRIPTION_LINK) |
1593 | + client.waits.sleep(milliseconds=SLEEP) |
1594 | + client.asserts.assertText( |
1595 | + xpath=SUBSCRIPTION_LINK, validator=u'Unsubscribe') |
1596 | + # Go to bug 6, the dupe, and subscribe. |
1597 | + client.open(url=BUG_URL % 6) |
1598 | + client.waits.forPageLoad(timeout=PAGE_LOAD) |
1599 | + client.waits.forElement( |
1600 | + id=u'subscribers-links', timeout=FOR_ELEMENT) |
1601 | + client.click(xpath=SUBSCRIPTION_LINK) |
1602 | + client.waits.sleep(milliseconds=SLEEP) |
1603 | + client.asserts.assertText( |
1604 | + xpath=SUBSCRIPTION_LINK, validator=u'Unsubscribe') |
1605 | + # Now back to bug 5. Confirm there are 2 subscriptions. |
1606 | + client.open(url=BUG_URL % 5) |
1607 | + client.asserts.assertNode( |
1608 | + xpath=(u'//div[@id="subscribers-links"]' |
1609 | + '/div/a[@name="Sample Person"]')) |
1610 | + client.asserts.assertNode( |
1611 | + xpath=(u'//div[@id="subscribers-from-duplicates"]' |
1612 | + '/div/a[@name="Sample Person"]')) |
1613 | + # The first click unsubscribes the direct subscription, leaving |
1614 | + # the duplicate subscription. |
1615 | + client.click(xpath=SUBSCRIPTION_LINK) |
1616 | + client.waits.sleep(milliseconds=SLEEP) |
1617 | + client.asserts.assertNotNode( |
1618 | + xpath=(u'//div[@id="subscribers-links"]' |
1619 | + '/div/a[@name="Sample Person"]')) |
1620 | + client.asserts.assertNode( |
1621 | + xpath=(u'//div[@id="subscribers-from-duplicates"]' |
1622 | + '/div/a[@name="Sample Person"]')) |
1623 | + # The second unsubscribe removes the duplicate, too. |
1624 | + client.click(xpath=SUBSCRIPTION_LINK) |
1625 | + client.waits.sleep(milliseconds=SLEEP) |
1626 | + client.asserts.assertNotNode( |
1627 | + xpath=(u'//div[@id="subscribers-from-duplicates"]' |
1628 | + '/div/a[@name="Sample Person"]')) |
1629 | |
1630 | === modified file 'lib/lp/code/browser/branch.py' |
1631 | --- lib/lp/code/browser/branch.py 2009-07-17 00:26:05 +0000 |
1632 | +++ lib/lp/code/browser/branch.py 2009-07-19 04:41:14 +0000 |
1633 | @@ -520,6 +520,15 @@ |
1634 | return False |
1635 | return self.context.target.collection.getBranches().count() > 1 |
1636 | |
1637 | + def translations_sources(self): |
1638 | + """Anything that automatically exports its translations here. |
1639 | + |
1640 | + Produces a list, so that the template can easily check whether |
1641 | + there are any translations sources. |
1642 | + """ |
1643 | + # Actually only ProductSeries currently do that. |
1644 | + return list(self.context.getProductSeriesPushingTranslations()) |
1645 | + |
1646 | |
1647 | class DecoratedMergeProposal: |
1648 | """Provide some additional attributes to a normal branch merge proposal. |
1649 | |
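`translations_sources` above materialises the query into a list so the template can "easily check whether there are any translations sources": a TAL condition needs a value whose truthiness reflects emptiness, and a lazy iterator or result set is not reliably falsy when empty. A tiny illustration with the query faked out:

```python
def get_product_series_pushing_translations():
    """Stand-in for the real query; pretend nothing exports yet."""
    return iter([])

def translations_sources():
    # list() so emptiness is visible to a boolean test in the template.
    return list(get_product_series_pushing_translations())

print(bool(iter([])))              # True - a bare iterator is always truthy
print(bool(translations_sources()))  # False - an empty list is falsy
```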
1650 | === modified file 'lib/lp/code/browser/configure.zcml' |
1651 | --- lib/lp/code/browser/configure.zcml 2009-07-17 00:26:05 +0000 |
1652 | +++ lib/lp/code/browser/configure.zcml 2009-07-20 18:22:54 +0000 |
1653 | @@ -508,14 +508,14 @@ |
1654 | for="lp.code.interfaces.branchvisibilitypolicy.IHasBranchVisibilityPolicy" |
1655 | facet="overview" |
1656 | class="lp.code.browser.branchvisibilitypolicy.AddBranchVisibilityTeamPolicyView" |
1657 | - permission="launchpad.Admin" |
1658 | + permission="launchpad.Commercial" |
1659 | template="../templates/branch-visibility-edit.pt"/> |
1660 | <browser:page |
1661 | name="+removebranchvisibilitypolicy" |
1662 | for="lp.code.interfaces.branchvisibilitypolicy.IHasBranchVisibilityPolicy" |
1663 | facet="overview" |
1664 | class="lp.code.browser.branchvisibilitypolicy.RemoveBranchVisibilityTeamPolicyView" |
1665 | - permission="launchpad.Admin" |
1666 | + permission="launchpad.Commercial" |
1667 | template="../templates/branch-visibility-edit.pt"/> |
1668 | <browser:page |
1669 | name="+spark" |
1670 | |
1671 | === modified file 'lib/lp/code/browser/tests/test_branch.py' |
1672 | --- lib/lp/code/browser/tests/test_branch.py 2009-07-17 00:26:05 +0000 |
1673 | +++ lib/lp/code/browser/tests/test_branch.py 2009-07-19 04:41:14 +0000 |
1674 | @@ -208,6 +208,27 @@ |
1675 | view.initialize() |
1676 | self.assertFalse(view.show_merge_links) |
1677 | |
1678 | + def testNoProductSeriesPushingTranslations(self): |
1679 | + # By default, a branch view shows no product series pushing |
1680 | + # translations to the branch. |
1681 | + branch = self.factory.makeBranch() |
1682 | + |
1683 | + view = BranchView(branch, self.request) |
1684 | + view.initialize() |
1685 | + self.assertEqual(list(view.translations_sources()), []) |
1686 | + |
1687 | + def testProductSeriesPushingTranslations(self): |
1688 | + # If a product series exports its translations to the branch, |
1689 | + # the view shows it. |
1690 | + product = self.factory.makeProduct() |
1691 | + trunk = product.getSeries('trunk') |
1692 | + branch = self.factory.makeBranch(owner=product.owner) |
1693 | + removeSecurityProxy(trunk).translations_branch = branch |
1694 | + |
1695 | + view = BranchView(branch, self.request) |
1696 | + view.initialize() |
1697 | + self.assertEqual(list(view.translations_sources()), [trunk]) |
1698 | + |
1699 | |
1700 | class TestBranchReviewerEditView(TestCaseWithFactory): |
1701 | """Test the BranchReviewerEditView view.""" |
1702 | |
1703 | === modified file 'lib/lp/code/configure.zcml' |
1704 | --- lib/lp/code/configure.zcml 2009-07-17 00:26:05 +0000 |
1705 | +++ lib/lp/code/configure.zcml 2009-07-19 04:41:14 +0000 |
1706 | @@ -388,6 +388,7 @@ |
1707 | canBeDeleted |
1708 | deletionRequirements |
1709 | associatedProductSeries |
1710 | + getProductSeriesPushingTranslations |
1711 | associatedSuiteSourcePackages |
1712 | subscribe |
1713 | getSubscription |
1714 | |
1715 | === modified file 'lib/lp/code/feed/branch.py' |
1716 | --- lib/lp/code/feed/branch.py 2009-07-17 00:26:05 +0000 |
1717 | +++ lib/lp/code/feed/branch.py 2009-07-21 07:27:48 +0000 |
1718 | @@ -40,6 +40,12 @@ |
1719 | from lp.registry.interfaces.project import IProject |
1720 | |
1721 | |
1722 | +def revision_feed_id(revision): |
1723 | + """Return a consistent id for a revision to use as an id.""" |
1724 | + return "tag:launchpad.net,%s:/revision/%s" % ( |
1725 | + revision.revision_date.date().isoformat(), revision.revision_id) |
1726 | + |
1727 | + |
1728 | class BranchFeedEntry(FeedEntry): |
1729 | """See `IFeedEntry`.""" |
1730 | def construct_id(self): |
1731 | @@ -248,16 +254,35 @@ |
1732 | Called by getItems which may cache the results. |
1733 | """ |
1734 | cache = self._getRevisionCache() |
1735 | - revisions = cache.public().getRevisions().config(limit=self.quantity) |
1736 | + revisions = cache.public().getRevisions() |
1737 | # Convert the items into their feed entry representation. |
1738 | - items = [self.itemToFeedEntry(item) for item in revisions] |
1739 | + items = [] |
1740 | + for revision in revisions: |
1741 | + content_view = self._createView(revision) |
1742 | + if content_view is not None: |
1743 | + entry = self.createFeedEntry(content_view) |
1744 | + items.append(entry) |
1745 | + # If we've hit our limit, stop iterating the revisions. |
1746 | + if len(items) >= self.quantity: |
1747 | + break |
1748 | return items |
1749 | |
1750 | - def itemToFeedEntry(self, revision): |
1751 | - """See `IFeed`.""" |
1752 | - id = "tag:launchpad.net,%s:/revision/%s" % ( |
1753 | - revision.revision_date.date().isoformat(), revision.revision_id) |
1754 | + def _createView(self, revision): |
1755 | + """Make a view for this revision. |
1756 | + |
1757 | + :return: A view instance, or None. |
1758 | + """ |
1759 | content_view = RevisionFeedContentView(revision, self.request, self) |
1760 | + # If there is no longer an associated branch for this revision, |
1761 | + # return None, as we don't want to show it. |
1762 | + if content_view.branch is None: |
1763 | + return None |
1764 | + return content_view |
1765 | + |
1766 | + def createFeedEntry(self, content_view): |
1767 | + """Create the FeedEntry for the specified view.""" |
1768 | + revision = content_view.context |
1769 | + id = revision_feed_id(revision) |
1770 | content = content_view.render() |
1771 | content_data = FeedTypedData(content=content, |
1772 | content_type="html", |
1773 | |
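The `revision_feed_id` helper added above builds an Atom entry id as a tag URI from the revision date and revision id. A minimal standalone sketch of the same format, taking the date and id directly rather than a revision object (that calling convention is an assumption for illustration):

```python
from datetime import datetime

def revision_feed_id(revision_date, revision_id):
    # Tag URI: authority name, the date part of the revision date in ISO
    # format, and a revision-specific path. Note: simplified signature,
    # not Launchpad's actual helper, which takes a revision object.
    return "tag:launchpad.net,%s:/revision/%s" % (
        revision_date.date().isoformat(), revision_id)

print(revision_feed_id(datetime(2009, 7, 21, 12), "test_revision_id"))
# tag:launchpad.net,2009-07-21:/revision/test_revision_id
```

Because only the date part is used, all revisions from the same day share a tag date, while the revision id keeps each entry id unique.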
1774 | === added directory 'lib/lp/code/feed/tests' |
1775 | === added file 'lib/lp/code/feed/tests/__init__.py' |
1776 | --- lib/lp/code/feed/tests/__init__.py 1970-01-01 00:00:00 +0000 |
1777 | +++ lib/lp/code/feed/tests/__init__.py 2009-07-20 05:51:06 +0000 |
1778 | @@ -0,0 +1,4 @@ |
1779 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
1780 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
1781 | + |
1782 | +"""Test for feeds relating to Launchpad code live here.""" |
1783 | |
1784 | === added file 'lib/lp/code/feed/tests/test_revision.py' |
1785 | --- lib/lp/code/feed/tests/test_revision.py 1970-01-01 00:00:00 +0000 |
1786 | +++ lib/lp/code/feed/tests/test_revision.py 2009-07-21 03:30:59 +0000 |
1787 | @@ -0,0 +1,141 @@ |
1788 | +# Copyright 2009 Canonical Ltd. This software is licensed under the |
1789 | +# GNU Affero General Public License version 3 (see the file LICENSE). |
1790 | + |
1791 | +"""Tests for the revision feeds.""" |
1792 | + |
1793 | +__metaclass__ = type |
1794 | + |
1795 | +from datetime import datetime |
1796 | +import unittest |
1797 | + |
1798 | +from pytz import UTC |
1799 | +from zope.component import getUtility |
1800 | + |
1801 | +from canonical.launchpad.webapp.servers import LaunchpadTestRequest |
1802 | +from canonical.testing.layers import DatabaseFunctionalLayer |
1803 | +from lp.code.feed.branch import ( |
1804 | + ProductRevisionFeed, revision_feed_id, RevisionListingFeed) |
1805 | +from lp.code.interfaces.revision import IRevisionSet |
1806 | +from lp.testing import login_person, TestCaseWithFactory |
1807 | + |
1808 | + |
1809 | +class TestRevisionFeedId(TestCaseWithFactory): |
1810 | + """Test the revision_feed_id function.""" |
1811 | + |
1812 | + layer = DatabaseFunctionalLayer |
1813 | + |
1814 | + def test_format(self): |
1815 | + # The id contains the iso format of the date part of the revision |
1816 | + # date, and the revision id. |
1817 | + revision_date = datetime(2009, 07, 21, 12, tzinfo=UTC) |
1818 | + revision = self.factory.makeRevision( |
1819 | + revision_date=revision_date, rev_id="test_revision_id") |
1820 | + feed_id = revision_feed_id(revision) |
1821 | + self.assertEqual( |
1822 | + 'tag:launchpad.net,2009-07-21:/revision/test_revision_id', |
1823 | + feed_id) |
1824 | + |
1825 | + |
1826 | +class TestRevisionFeed(TestCaseWithFactory): |
1827 | + """Tests for the methods of the RevisionListingFeed base class.""" |
1828 | + |
1829 | + layer = DatabaseFunctionalLayer |
1830 | + |
1831 | + def _createBranchWithRevision(self): |
1832 | + """Create a branch with a linked, cached revision. |
1833 | + |
1834 | + :return: a tuple of (branch, revision) |
1835 | + """ |
1836 | + revision = self.factory.makeRevision() |
1837 | + branch = self.factory.makeBranch() |
1838 | + branch.createBranchRevision(1, revision) |
1839 | + getUtility(IRevisionSet).updateRevisionCacheForBranch(branch) |
1840 | + return branch, revision |
1841 | + |
1842 | + def _createFeed(self): |
1843 | + """Create and return a RevisionListingFeed instance.""" |
1844 | + # The FeedBase class determines the feed type by the end of the |
1845 | + # requested URL, so forcing .atom here. |
1846 | + return RevisionListingFeed( |
1847 | + None, LaunchpadTestRequest( |
1848 | + SERVER_URL="http://example.com/fake.atom")) |
1849 | + |
1850 | + def test_createView(self): |
1851 | + # Revisions that are linked to branches are shown in the feed. |
1852 | + |
1853 | + # Since we are calling into a base class that would normally take a |
1854 | + # context and a request, we need to give it something - None should be |
1855 | + # fine. |
1856 | + branch, revision = self._createBranchWithRevision() |
1857 | + revision_feed = self._createFeed() |
1858 | + view = revision_feed._createView(revision) |
1859 | + self.assertEqual(revision, view.context) |
1860 | + self.assertEqual(branch, view.branch) |
1861 | + |
1862 | + def test_createView_revision_not_in_branch(self): |
1863 | + # If a revision is in the RevisionCache table, but no longer |
1864 | + # associated with a public branch, then the createView call will |
1865 | + # return None to indicate not to show this revision. |
1866 | + branch, revision = self._createBranchWithRevision() |
1867 | + # Now delete the branch. |
1868 | + login_person(branch.owner) |
1869 | + branch.destroySelf() |
1870 | + revision_feed = self._createFeed() |
1871 | + view = revision_feed._createView(revision) |
1872 | + self.assertIs(None, view) |
1873 | + |
1874 | + |
1875 | +class TestProductRevisionFeed(TestCaseWithFactory): |
1876 | + """Tests for the ProductRevisionFeed.""" |
1877 | + |
1878 | + layer = DatabaseFunctionalLayer |
1879 | + |
1880 | + def _createBranchWithRevision(self, product): |
1881 | + """Create a branch with a linked, cached revision. |
1882 | + |
1883 | + :return: a tuple of (branch, revision) |
1884 | + """ |
1885 | + revision = self.factory.makeRevision() |
1886 | + branch = self.factory.makeProductBranch(product=product) |
1887 | + branch.createBranchRevision(1, revision) |
1888 | + getUtility(IRevisionSet).updateRevisionCacheForBranch(branch) |
1889 | + return branch, revision |
1890 | + |
1891 | + def _createFeed(self, product): |
1892 | + """Create and return a ProductRevisionFeed instance.""" |
1893 | + # The FeedBase class determines the feed type by the end of the |
1894 | + # requested URL, so forcing .atom here. |
1895 | + return ProductRevisionFeed( |
1896 | + product, LaunchpadTestRequest( |
1897 | + SERVER_URL="http://example.com/fake.atom")) |
1898 | + |
1899 | + def test_getItems_empty(self): |
1900 | + # If there are no revisions for a product, there are no items. |
1901 | + product = self.factory.makeProduct() |
1902 | + feed = self._createFeed(product) |
1903 | + self.assertEqual([], feed.getItems()) |
1904 | + |
1905 | + def test_getItems_revisions(self): |
1906 | + # If there are revisions in branches for the project, these are |
1907 | + # returned in the feeds items. |
1908 | + product = self.factory.makeProduct() |
1909 | + branch, revision = self._createBranchWithRevision(product) |
1910 | + feed = self._createFeed(product) |
1911 | + [item] = feed.getItems() |
1912 | + self.assertEqual(revision_feed_id(revision), item.id) |
1913 | + |
1914 | + def test_getItems_skips_revisions_not_in_branches(self): |
1915 | + # If a revision was added to a project, but the only branch that |
1916 | + # referred to that revision was subsequently removed, the revision |
1917 | + # does not show in the feed. |
1918 | + product = self.factory.makeProduct() |
1919 | + branch, revision = self._createBranchWithRevision(product) |
1920 | + # Now delete the branch. |
1921 | + login_person(branch.owner) |
1922 | + branch.destroySelf() |
1923 | + feed = self._createFeed(product) |
1924 | + self.assertEqual([], feed.getItems()) |
1925 | + |
1926 | + |
1927 | +def test_suite(): |
1928 | + return unittest.TestLoader().loadTestsFromName(__name__) |
1929 | |
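The reworked `getItems` loop tested above has a simple core: skip any revision whose view comes back as `None` (its branch is gone), and stop once the quantity limit is reached. A sketch of that logic independent of the feed classes, with hypothetical helper names:

```python
def collect_feed_items(revisions, make_view, make_entry, limit):
    """Build at most `limit` feed entries, skipping unviewable revisions."""
    items = []
    for revision in revisions:
        view = make_view(revision)
        if view is None:
            continue  # the revision's branch was deleted; don't show it
        items.append(make_entry(view))
        if len(items) >= limit:
            break  # we have enough entries; stop iterating
    return items

# Revisions 2 and 4 have no surviving branch, so they are skipped,
# and iteration stops after two entries.
views = {1: "v1", 3: "v3", 5: "v5"}
result = collect_feed_items(
    [1, 2, 3, 4, 5], views.get, lambda v: "entry-" + v, limit=2)
print(result)  # ['entry-v1', 'entry-v3']
```

This is why the diff replaces the earlier `.config(limit=self.quantity)` query limit: the limit must be applied after filtering, or deleted-branch revisions would eat into the quota.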
1930 | === modified file 'lib/lp/code/interfaces/branch.py' |
1931 | --- lib/lp/code/interfaces/branch.py 2009-07-17 00:26:05 +0000 |
1932 | +++ lib/lp/code/interfaces/branch.py 2009-07-19 04:41:14 +0000 |
1933 | @@ -780,6 +780,15 @@ |
1934 | series as a branch. |
1935 | """ |
1936 | |
1937 | + def getProductSeriesPushingTranslations(): |
1938 | + """Return sequence of product series pushing translations here. |
1939 | + |
1940 | + These are any `ProductSeries` that have this branch as their |
1941 | + translations_branch. It should normally be at most one, but |
1942 | + there's nothing stopping people from combining translations |
1943 | + branches. |
1944 | + """ |
1945 | + |
1946 | def associatedSuiteSourcePackages(): |
1947 | """Return the suite source packages that this branch is linked to.""" |
1948 | |
1949 | |
1950 | === modified file 'lib/lp/code/model/branch.py' |
1951 | --- lib/lp/code/model/branch.py 2009-07-17 00:26:05 +0000 |
1952 | +++ lib/lp/code/model/branch.py 2009-07-19 04:41:14 +0000 |
1953 | @@ -484,6 +484,9 @@ |
1954 | spec_link.destroySelf)) |
1955 | for series in self.associatedProductSeries(): |
1956 | alteration_operations.append(ClearSeriesBranch(series, self)) |
1957 | + for series in self.getProductSeriesPushingTranslations(): |
1958 | + alteration_operations.append( |
1959 | + ClearSeriesTranslationsBranch(series, self)) |
1960 | |
1961 | series_set = getUtility(IFindOfficialBranchLinks) |
1962 | alteration_operations.extend( |
1963 | @@ -529,6 +532,14 @@ |
1964 | ProductSeries, |
1965 | ProductSeries.branch == self) |
1966 | |
1967 | + def getProductSeriesPushingTranslations(self): |
1968 | + """See `IBranch`.""" |
1969 | + # Imported here to avoid circular import. |
1970 | + from lp.registry.model.productseries import ProductSeries |
1971 | + return Store.of(self).find( |
1972 | + ProductSeries, |
1973 | + ProductSeries.translations_branch == self) |
1974 | + |
1975 | def associatedSuiteSourcePackages(self): |
1976 | """See `IBranch`.""" |
1977 | series_set = getUtility(IFindOfficialBranchLinks) |
1978 | @@ -910,6 +921,21 @@ |
1979 | self.affected_object.syncUpdate() |
1980 | |
1981 | |
1982 | +class ClearSeriesTranslationsBranch(DeletionOperation): |
1983 | + """Deletion operation that clears a series' translations branch.""" |
1984 | + |
1985 | + def __init__(self, series, branch): |
1986 | + DeletionOperation.__init__( |
1987 | + self, series, |
1988 | + _('This series exports its translations to this branch.')) |
1989 | + self.branch = branch |
1990 | + |
1991 | + def __call__(self): |
1992 | + if self.affected_object.branch == self.branch: |
1993 | + self.affected_object.branch = None |
1994 | + self.affected_object.syncUpdate() |
1995 | + |
1996 | + |
1997 | class ClearOfficialPackageBranch(DeletionOperation): |
1998 | """Deletion operation that clears an official package branch.""" |
1999 | |
2000 | |
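The `ClearSeriesTranslationsBranch` operation above follows the branch-deletion pattern in this file: callable operations are collected and run to detach references before the branch is destroyed. A minimal sketch of that pattern with simplified stand-in classes (not Launchpad's actual implementations):

```python
class DeletionOperation:
    """A callable that severs one reference to the branch being deleted."""

    def __init__(self, affected_object, rationale):
        self.affected_object = affected_object
        self.rationale = rationale

    def __call__(self):
        raise NotImplementedError


class ClearSeriesTranslationsBranch(DeletionOperation):
    """Clear a series' translations branch if it points at our branch."""

    def __init__(self, series, branch):
        DeletionOperation.__init__(
            self, series,
            'This series exports its translations to this branch.')
        self.branch = branch

    def __call__(self):
        # Guard against the link having been changed since the operation
        # was constructed; only clear it if it still points at our branch.
        if self.affected_object.translations_branch == self.branch:
            self.affected_object.translations_branch = None


class Series:
    """Stand-in for ProductSeries, holding only the one attribute used."""

    def __init__(self, translations_branch):
        self.translations_branch = translations_branch


branch = object()
series = Series(translations_branch=branch)
ClearSeriesTranslationsBranch(series, branch)()
print(series.translations_branch)  # None
```

The rationale string doubles as the user-facing deletion requirement, which is why the real view can list "This series exports its translations to this branch" among the reasons a branch cannot be deleted.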
2001 | === modified file 'lib/lp/code/model/tests/test_branch.py' |
2002 | --- lib/lp/code/model/tests/test_branch.py 2009-07-17 00:26:05 +0000 |
2003 | +++ lib/lp/code/model/tests/test_branch.py 2009-07-19 04:41:14 +0000 |
2004 | @@ -558,6 +558,14 @@ |
2005 | " is not deletable.") |
2006 | self.assertRaises(CannotDeleteBranch, self.branch.destroySelf) |
2007 | |
2008 | + def test_productSeriesTranslationsBranchDisablesDeletion(self): |
2009 | + self.product.development_focus.translations_branch = self.branch |
2010 | + syncUpdate(self.product.development_focus) |
2011 | + self.assertEqual(self.branch.canBeDeleted(), False, |
2012 | + "A branch that is a translations branch for a " |
2013 | + "product series is not deletable.") |
2014 | + self.assertRaises(CannotDeleteBranch, self.branch.destroySelf) |
2015 | + |
2016 | def test_revisionsDeletable(self): |
2017 | """A branch that has some revisions can be deleted.""" |
2018 | revision = self.factory.makeRevision() |
2019 | |
2020 | === modified file 'lib/lp/code/model/tests/test_branchjob.py' |
2021 | --- lib/lp/code/model/tests/test_branchjob.py 2009-07-17 00:26:05 +0000 |
2022 | +++ lib/lp/code/model/tests/test_branchjob.py 2009-07-19 04:41:14 +0000 |
2023 | @@ -622,6 +622,8 @@ |
2024 | bmp = self.factory.makeBranchMergeProposal(target_branch=job.branch, |
2025 | registrant=hacker) |
2026 | bmp.source_branch.last_scanned_id = 'rev3-id' |
2027 | + transaction.commit() |
2028 | + self.layer.switchDbUser(config.sendbranchmail.dbuser) |
2029 | message = job.getRevisionMessage('rev2d-id', 1) |
2030 | self.assertEqual( |
2031 | 'Merge authors:\n' |
2032 | |
2033 | === modified file 'lib/lp/code/stories/branches/xx-branch-visibility-policy.txt' |
2034 | --- lib/lp/code/stories/branches/xx-branch-visibility-policy.txt 2009-07-09 19:54:12 +0000 |
2035 | +++ lib/lp/code/stories/branches/xx-branch-visibility-policy.txt 2009-07-20 18:22:54 +0000 |
2036 | @@ -1,7 +1,7 @@ |
2037 | = Branch Visibility Policy Pages = |
2038 | |
2039 | Controlling the branch visibility policies for products and projects is only |
2040 | -available to launchpad admins. |
2041 | +available to launchpad admins and launchpad commercial admins. |
2042 | |
2043 | Not to anonymous people. |
2044 | |
2045 | @@ -43,14 +43,19 @@ |
2046 | ... |
2047 | Unauthorized: ... |
2048 | |
2049 | -Launchpad admins however, can get to it. |
2050 | +Launchpad admins, however, can get to the branch visibility overview |
2051 | +page and to the page to actually modify the policies. |
2052 | |
2053 | >>> admin_browser.open('http://code.launchpad.dev/firefox') |
2054 | >>> admin_browser.getLink('Define branch visibility').click() |
2055 | >>> print admin_browser.url |
2056 | http://launchpad.dev/firefox/+branchvisibility |
2057 | |
2058 | -And members of the Launchpad Commercial team can view the page. |
2059 | + >>> admin_browser.getLink('Customise policy for Mozilla Firefox').click() |
2060 | + >>> print admin_browser.url |
2061 | + http://launchpad.dev/firefox/+addbranchvisibilitypolicy |
2062 | + |
2063 | +And members of the Launchpad Commercial team can view the pages. |
2064 | |
2065 | >>> commercial_browser = setupBrowser( |
2066 | ... auth='Basic commercial-member@canonical.com:test') |
2067 | @@ -59,6 +64,10 @@ |
2068 | >>> print commercial_browser.url |
2069 | http://launchpad.dev/firefox/+branchvisibility |
2070 | |
2071 | + >>> commercial_browser.getLink('Customise policy for Mozilla Firefox').click() |
2072 | + >>> print commercial_browser.url |
2073 | + http://launchpad.dev/firefox/+addbranchvisibilitypolicy |
2074 | + |
2075 | |
2076 | == Default policies == |
2077 | |
2078 | @@ -68,6 +77,8 @@ |
2079 | specify any branch visibility policy items and there is an inherited branch |
2080 | visibility policy, then that policy is used. |
2081 | |
2082 | + >>> admin_browser.open('http://launchpad.dev/firefox/+branchvisibility') |
2083 | + |
2084 | >>> print extract_text(find_tag_by_id(admin_browser.contents, 'inherited')) |
2085 | Using inherited policy from the Mozilla Project. |
2086 | |
2087 | @@ -192,6 +203,14 @@ |
2088 | |
2089 | See? All still there. |
2090 | |
2091 | +Before we remove them, let's ensure that the commercial admins can see |
2092 | +the removal page. |
2093 | + |
2094 | + >>> commercial_browser.open('http://launchpad.dev/firefox/+branchvisibility') |
2095 | + >>> commercial_browser.getLink('Remove policy items').click() |
2096 | + >>> print commercial_browser.url |
2097 | + http://launchpad.dev/firefox/+removebranchvisibilitypolicy |
2098 | + |
2099 | Now to remove two. The override for Everyone, and Launchpad Devs. |
2100 | |
2101 | >>> admin_browser.getControl('Everyone: Public').click() |
2102 | @@ -204,13 +223,15 @@ |
2103 | Ubuntu Gnome Team: Private |
2104 | |
2105 | As you can see there is still a default visibility of Public. This |
2106 | -is now implicit rather than explicit. So if we go bact to remove more |
2107 | +is now implicit rather than explicit. So if we go back to remove more |
2108 | items, there is only one more policy item to remove. Once that is removed |
2109 | -Firefox will go back to inheriting the polices of Mozilla. |
2110 | - |
2111 | - >>> admin_browser.getLink('Remove policy items').click() |
2112 | - >>> admin_browser.getControl('Ubuntu Gnome Team: Private').click() |
2113 | - >>> admin_browser.getControl('Remove Selected Policy Items').click() |
2114 | - |
2115 | - >>> print_tag_with_id(admin_browser.contents, 'inherited') |
2116 | +Firefox will go back to inheriting the polices of Mozilla. Let's let |
2117 | +the commercial admin do the removal to ensure he has the permission. |
2118 | + |
2119 | + >>> commercial_browser.open('http://launchpad.dev/firefox/+branchvisibility') |
2120 | + >>> commercial_browser.getLink('Remove policy items').click() |
2121 | + >>> commercial_browser.getControl('Ubuntu Gnome Team: Private').click() |
2122 | + >>> commercial_browser.getControl('Remove Selected Policy Items').click() |
2123 | + |
2124 | + >>> print_tag_with_id(commercial_browser.contents, 'inherited') |
2125 | Using inherited policy from the Mozilla Project. |
2126 | |
2127 | === modified file 'lib/lp/code/templates/branch-index.pt' |
2128 | --- lib/lp/code/templates/branch-index.pt 2009-07-17 17:59:07 +0000 |
2129 | +++ lib/lp/code/templates/branch-index.pt 2009-07-19 04:41:14 +0000 |
2130 | @@ -449,6 +449,17 @@ |
2131 | |
2132 | </div> |
2133 | |
2134 | + <div |
2135 | + id="translations-sources" |
2136 | + tal:define="translations_sources view/translations_sources" |
2137 | + tal:condition="translations_sources"> |
2138 | + <h2>Automatic translations commits</h2> |
2139 | + <ul> |
2140 | + <tal:sources-list repeat="source translations_sources"> |
2141 | + <li tal:content="structure source/fmt:link">~foo/example/branch</li> |
2142 | + </tal:sources-list> |
2143 | + </ul> |
2144 | + </div> |
2145 | |
2146 | <div id="recent-revisions"> |
2147 | <h2 style="clear: both">Recent revisions</h2> |
2148 | |
2149 | === modified file 'lib/lp/codehosting/scanner/tests/test_buglinks.py' |
2150 | --- lib/lp/codehosting/scanner/tests/test_buglinks.py 2009-07-17 00:26:05 +0000 |
2151 | +++ lib/lp/codehosting/scanner/tests/test_buglinks.py 2009-07-19 04:41:14 +0000 |
2152 | @@ -12,16 +12,16 @@ |
2153 | import zope.component.event |
2154 | from zope.component import getUtility |
2155 | |
2156 | +from canonical.config import config |
2157 | +from canonical.launchpad.interfaces import ( |
2158 | + IBugBranchSet, IBugSet, NotFoundError) |
2159 | +from canonical.testing.layers import LaunchpadZopelessLayer |
2160 | + |
2161 | from lp.codehosting.scanner.buglinks import got_new_revision, BugBranchLinker |
2162 | from lp.codehosting.scanner.fixture import make_zope_event_fixture |
2163 | from lp.codehosting.scanner.tests.test_bzrsync import BzrSyncTestCase |
2164 | -from canonical.config import config |
2165 | -from canonical.launchpad.interfaces import ( |
2166 | - IBugBranchSet, IBugSet, ILaunchpadCelebrities, |
2167 | - NotFoundError) |
2168 | +from lp.soyuz.interfaces.publishing import PackagePublishingPocket |
2169 | from lp.testing import TestCase |
2170 | -from lp.testing.factory import LaunchpadObjectFactory |
2171 | -from canonical.testing import LaunchpadZopelessLayer |
2172 | |
2173 | |
2174 | class RevisionPropertyParsing(TestCase): |
2175 | @@ -153,6 +153,27 @@ |
2176 | self.syncBazaarBranchToDatabase(self.bzr_branch, self.db_branch) |
2177 | self.assertBugBranchLinked(self.bug1, self.db_branch) |
2178 | |
2179 | + def makePackageBranch(self): |
2180 | + LaunchpadZopelessLayer.switchDbUser(self.lp_db_user) |
2181 | + try: |
2182 | + branch = self.factory.makePackageBranch() |
2183 | + branch.sourcepackage.setBranch( |
2184 | + PackagePublishingPocket.RELEASE, branch, branch.owner) |
2185 | + LaunchpadZopelessLayer.txn.commit() |
2186 | + finally: |
2187 | + LaunchpadZopelessLayer.switchDbUser(config.branchscanner.dbuser) |
2188 | + return branch |
2189 | + |
2190 | + def test_linking_bug_to_official_package_branch(self): |
2191 | + # We can link a bug to an official package branch. Test added to catch |
2192 | + # bug 391303. |
2193 | + self.commitRevision( |
2194 | + rev_id='rev1', |
2195 | + revprops={'bugs': '%s fixed' % self.getBugURL(self.bug1)}) |
2196 | + branch = self.makePackageBranch() |
2197 | + self.syncBazaarBranchToDatabase(self.bzr_branch, branch) |
2198 | + self.assertBugBranchLinked(self.bug1, branch) |
2199 | + |
2200 | def test_knownMainlineRevisionsDoesntMakeLink(self): |
2201 | """Don't add BugBranches for known mainline revision.""" |
2202 | self.commitRevision( |
2203 | |
2204 | === modified file 'lib/lp/registry/browser/configure.zcml' |
2205 | --- lib/lp/registry/browser/configure.zcml 2009-07-18 00:05:49 +0000 |
2206 | +++ lib/lp/registry/browser/configure.zcml 2009-07-19 04:41:14 +0000 |
2207 | @@ -1753,6 +1753,13 @@ |
2208 | permission="launchpad.Admin" |
2209 | template="../templates/productseries-review.pt"/> |
2210 | <browser:page |
2211 | + for="lp.registry.interfaces.productseries.IProductSeries" |
2212 | + name="+link-translations-branch" |
2213 | + class="lp.registry.browser.productseries.LinkTranslationsBranchView" |
2214 | + template="../templates/productseries-link-translations-branch.pt" |
2215 | + facet="translations" |
2216 | + permission="launchpad.Edit"/> |
2217 | + <browser:page |
2218 | name="+ask-a-question-button" |
2219 | for="lp.registry.interfaces.productseries.IProductSeries" |
2220 | class="canonical.launchpad.browser.AskAQuestionButtonView" |
2221 | |
2222 | === modified file 'lib/lp/registry/browser/productseries.py' |
2223 | --- lib/lp/registry/browser/productseries.py 2009-07-17 00:26:05 +0000 |
2224 | +++ lib/lp/registry/browser/productseries.py 2009-07-21 22:27:22 +0000 |
2225 | @@ -7,6 +7,7 @@ |
2226 | |
2227 | __all__ = [ |
2228 | 'get_series_branch_error', |
2229 | + 'LinkTranslationsBranchView', |
2230 | 'ProductSeriesBreadcrumbBuilder', |
2231 | 'ProductSeriesBugsMenu', |
2232 | 'ProductSeriesDeleteView', |
2233 | @@ -47,14 +48,13 @@ |
2234 | from lp.code.interfaces.codeimport import ( |
2235 | ICodeImportSet) |
2236 | from lp.services.worlddata.interfaces.country import ICountry |
2237 | -from lp.bugs.interfaces.bugtask import BugTaskSearchParams, IBugTaskSet |
2238 | +from lp.bugs.interfaces.bugtask import IBugTaskSet |
2239 | from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities |
2240 | from lp.registry.browser import StatusCount |
2241 | from lp.translations.interfaces.potemplate import IPOTemplateSet |
2242 | from lp.translations.interfaces.productserieslanguage import ( |
2243 | IProductSeriesLanguageSet) |
2244 | from lp.services.worlddata.interfaces.language import ILanguageSet |
2245 | -from canonical.launchpad.searchbuilder import any |
2246 | from canonical.launchpad.webapp import ( |
2247 | action, ApplicationMenu, canonical_url, custom_widget, |
2248 | enabled_with_permission, LaunchpadEditFormView, |
2249 | @@ -69,6 +69,7 @@ |
2250 | |
2251 | from lp.registry.browser import ( |
2252 | MilestoneOverlayMixin, RegistryDeleteViewMixin) |
2253 | +from lp.registry.interfaces.distroseries import DistroSeriesStatus |
2254 | from lp.registry.interfaces.productseries import IProductSeries |
2255 | from lp.registry.interfaces.sourcepackagename import ( |
2256 | ISourcePackageNameSet) |
2257 | @@ -417,6 +418,17 @@ |
2258 | return (branch is not None and |
2259 | check_permission('launchpad.View', branch)) |
2260 | |
2261 | + @property |
2262 | + def is_obsolete(self): |
2263 | + """Return True if the series is OBSOLETE" |
2264 | + |
2265 | + Obsolete series do not need to display as much information as other |
2266 | + series. Accessing private bugs is an expensive operation and showing |
2267 | + them for obsolete series can be a problem if many series are being |
2268 | + displayed. |
2269 | + """ |
2270 | + return self.context.status == DistroSeriesStatus.OBSOLETE |
2271 | + |
2272 | @cachedproperty |
2273 | def bugtask_status_counts(self): |
2274 | """A list StatusCounts summarising the targeted bugtasks.""" |
2275 | @@ -585,6 +597,27 @@ |
2276 | """Do nothing and go back to the product series page.""" |
2277 | |
2278 | |
2279 | +class LinkTranslationsBranchView(LaunchpadEditFormView): |
2280 | + """View to set the series' translations export branch.""" |
2281 | + |
2282 | + schema = IProductSeries |
2283 | + field_names = ['translations_branch'] |
2284 | + |
2285 | + @property |
2286 | + def next_url(self): |
2287 | + return canonical_url(self.context) + '/+translations-settings' |
2288 | + |
2289 | + @action(_('Update'), name='update') |
2290 | + def update_action(self, action, data): |
2291 | + self.updateContextFromData(data) |
2292 | + self.request.response.addInfoNotification( |
2293 | + 'Translations export branch updated.') |
2294 | + |
2295 | + @action('Cancel', name='cancel', validator='validate_cancel') |
2296 | + def cancel_action(self, action, data): |
2297 | + """Do nothing and go back to the settings page.""" |
2298 | + |
2299 | + |
2300 | class ProductSeriesLinkBranchFromCodeView(ProductSeriesLinkBranchView): |
2301 | """Set the branch link from the code overview page.""" |
2302 | |
2303 | |
2304 | === modified file 'lib/lp/registry/browser/tests/productseries-views.txt' |
2305 | --- lib/lp/registry/browser/tests/productseries-views.txt 2009-07-07 11:24:01 +0000 |
2306 | +++ lib/lp/registry/browser/tests/productseries-views.txt 2009-07-21 16:30:28 +0000 |
2307 | @@ -60,6 +60,22 @@ |
2308 | >>> print view.milestone_table_class |
2309 | listing |
2310 | |
2311 | +Obsolete series are less interesting than other series. The ProductSeriesView |
2312 | +has an is_obsolete property that templates can check when choosing the content |
2313 | +to display. |
2314 | + |
2315 | + >>> from lp.registry.interfaces.distroseries import DistroSeriesStatus |
2316 | + |
2317 | + >>> print series.status |
2318 | + Active Development |
2319 | + >>> view.is_obsolete |
2320 | + False |
2321 | + |
2322 | + >>> series.status = DistroSeriesStatus.OBSOLETE |
2323 | + >>> view = create_view(series, '+index') |
2324 | + >>> view.is_obsolete |
2325 | + True |
2326 | + |
2327 | |
2328 | == Delete ProductSeries == |
2329 | |
2330 | |
2331 | === modified file 'lib/lp/registry/configure.zcml' |
2332 | --- lib/lp/registry/configure.zcml 2009-07-17 00:26:05 +0000 |
2333 | +++ lib/lp/registry/configure.zcml 2009-07-19 04:41:14 +0000 |
2334 | @@ -1302,7 +1302,7 @@ |
2335 | <require |
2336 | permission="launchpad.Edit" |
2337 | set_attributes="product name owner driver summary branch |
2338 | - status releasefileglob |
2339 | + translations_branch status releasefileglob |
2340 | translations_autoimport_mode"/> |
2341 | <require |
2342 | permission="launchpad.AnyPerson" |
2343 | |
2344 | === modified file 'lib/lp/registry/doc/person.txt' |
2345 | --- lib/lp/registry/doc/person.txt 2009-07-16 22:17:03 +0000 |
2346 | +++ lib/lp/registry/doc/person.txt 2009-07-18 01:03:09 +0000 |
2347 | @@ -809,6 +809,7 @@ |
2348 | Ubuntu branches (ubuntu-branches): [] |
2349 | Ubuntu Doc Team (doc): [u'doc@lists.ubuntu.com'] |
2350 | Ubuntu Gnome Team (name18): [] |
2351 | + Ubuntu Security Team (ubuntu-security): [] |
2352 | Ubuntu Team (ubuntu-team): [u'support@ubuntu.com'] |
2353 | Ubuntu Translators (ubuntu-translators): [] |
2354 | Ubuntu-branches-owner (ubuntu-branches-owner): [u'ubuntu-branches-owner@example.com'] |
2355 | @@ -844,6 +845,7 @@ |
2356 | testing Spanish team (testing-spanish-team): [] |
2357 | Ubuntu Doc Team (doc): [u'doc@lists.ubuntu.com'] |
2358 | Ubuntu Gnome Team (name18): [] |
2359 | + Ubuntu Security Team (ubuntu-security): [] |
2360 | Ubuntu Team (ubuntu-team): [u'support@ubuntu.com'] |
2361 | Warty Gnome Team (warty-gnome): [] |
2362 | Warty Security Team (name20): [] |
2363 | @@ -864,6 +866,7 @@ |
2364 | testing Spanish team (testing-spanish-team): [] |
2365 | Ubuntu Doc Team (doc): [u'doc@lists.ubuntu.com'] |
2366 | Ubuntu Gnome Team (name18): [] |
2367 | + Ubuntu Security Team (ubuntu-security): [] |
2368 | Ubuntu Team (ubuntu-team): [u'support@ubuntu.com'] |
2369 | Warty Gnome Team (warty-gnome): [] |
2370 | Warty Security Team (name20): [] |
2371 | @@ -921,6 +924,7 @@ |
2372 | Simple Team (simple-team): [] |
2373 | testing Spanish team (testing-spanish-team): [] |
2374 | Ubuntu Gnome Team (name18): [] |
2375 | + Ubuntu Security Team (ubuntu-security): [] |
2376 | Ubuntu Team (ubuntu-team): [u'support@ubuntu.com'] |
2377 | Warty Gnome Team (warty-gnome): [] |
2378 | Warty Security Team (name20): [] |
2379 | @@ -939,6 +943,7 @@ |
2380 | Simple Team (simple-team): [] |
2381 | testing Spanish team (testing-spanish-team): [] |
2382 | Ubuntu Gnome Team (name18): [] |
2383 | + Ubuntu Security Team (ubuntu-security): [] |
2384 | Ubuntu Team (ubuntu-team): [u'support@ubuntu.com'] |
2385 | Warty Gnome Team (warty-gnome): [] |
2386 | Warty Security Team (name20): [] |
2387 | |
2388 | === modified file 'lib/lp/registry/doc/vocabularies.txt' |
2389 | --- lib/lp/registry/doc/vocabularies.txt 2009-07-17 16:22:12 +0000 |
2390 | +++ lib/lp/registry/doc/vocabularies.txt 2009-07-18 06:35:05 +0000 |
2391 | @@ -743,7 +743,8 @@ |
2392 | |
2393 | >>> [(p.name, getattr(p.teamowner, 'name', None)) |
2394 | ... for p in vocab.search('ubuntu-team')] |
2395 | - [(u'doc', None), (u'name18', u'sabdfl'), (u'ubuntu-team', u'sabdfl')] |
2396 | + [(u'doc', None), (u'name18', u'sabdfl'), |
2397 | + (u'ubuntu-security', u'kamion'), (u'ubuntu-team', u'sabdfl')] |
2398 | |
2399 | But it doesn't include merged accounts: |
2400 | |
2401 | @@ -780,7 +781,7 @@ |
2402 | >>> sorted(person.name for person in vocab.search('team')) |
2403 | [u'hwdb-team', u'name18', u'name20', u'name21', u'no-team-memberships', |
2404 | u'otherteam', u'simple-team', u'testing-spanish-team', |
2405 | - u'ubuntu-team', u'warty-gnome'] |
2406 | + u'ubuntu-security', u'ubuntu-team', u'warty-gnome'] |
2407 | |
2408 | Logging in as 'owner', who is a member of myteam shows that the token |
2409 | lookup still omits myteam. |
2410 | @@ -789,7 +790,7 @@ |
2411 | >>> sorted(person.name for person in vocab.search('team')) |
2412 | [u'hwdb-team', u'name18', u'name20', u'name21', u'no-team-memberships', |
2413 | u'otherteam', u'simple-team', u'testing-spanish-team', |
2414 | - u'ubuntu-team', u'warty-gnome'] |
2415 | + u'ubuntu-security', u'ubuntu-team', u'warty-gnome'] |
2416 | |
2417 | A PRIVATE team is displayed when the logged in user is a member of the team. |
2418 | |
2419 | @@ -802,17 +803,19 @@ |
2420 | ... owner=commercial, |
2421 | ... visibility=PersonVisibility.PRIVATE) |
2422 | >>> sorted(person.name for person in vocab.search('team')) |
2423 | - [u'hwdb-team', u'name18', u'name20', u'name21', u'no-team-memberships', |
2424 | - u'otherteam', u'private-team', u'simple-team', u'testing-spanish-team', |
2425 | + [u'hwdb-team', u'name18', u'name20', u'name21', |
2426 | + u'no-team-memberships', u'otherteam', u'private-team', |
2427 | + u'simple-team', u'testing-spanish-team', u'ubuntu-security', |
2428 | u'ubuntu-team', u'warty-gnome'] |
2429 | |
2430 | The PRIVATE team is also displayed for Launchpad admins. |
2431 | |
2432 | >>> login('foo.bar@canonical.com') |
2433 | >>> sorted(person.name for person in vocab.search('team')) |
2434 | - [u'hwdb-team', u'name18', u'name20', u'name21', u'no-team-memberships', |
2435 | - u'otherteam', u'private-team', u'simple-team', u'testing-spanish-team', |
2436 | - u'ubuntu-team', u'warty-gnome'] |
2437 | + [u'hwdb-team', u'name18', u'name20', u'name21', |
2438 | + u'no-team-memberships', u'otherteam', |
2439 | + u'private-team', u'simple-team', u'testing-spanish-team', |
2440 | + u'ubuntu-security', u'ubuntu-team', u'warty-gnome'] |
2441 | |
2442 | The PRIVATE team can be looked up via getTermByToken for a member of the team. |
2443 | |
2444 | @@ -826,7 +829,7 @@ |
2445 | >>> sorted(person.name for person in vocab.search('team')) |
2446 | [u'hwdb-team', u'name18', u'name20', u'name21', u'no-team-memberships', |
2447 | u'otherteam', u'simple-team', u'testing-spanish-team', |
2448 | - u'ubuntu-team', u'warty-gnome'] |
2449 | + u'ubuntu-security', u'ubuntu-team', u'warty-gnome'] |
2450 | |
2451 | The anonymous user will not see the private team either. |
2452 | |
2453 | @@ -834,7 +837,7 @@ |
2454 | >>> sorted(person.name for person in vocab.search('team')) |
2455 | [u'hwdb-team', u'name18', u'name20', u'name21', u'no-team-memberships', |
2456 | u'otherteam', u'simple-team', u'testing-spanish-team', |
2457 | - u'ubuntu-team', u'warty-gnome'] |
2458 | + u'ubuntu-security', u'ubuntu-team', u'warty-gnome'] |
2459 | |
2460 | Attempting to lookup the team via getTermByToken results in a |
2461 | LookupError, the same as if the team didn't exist, which is really |
2462 | @@ -854,7 +857,6 @@ |
2463 | >>> sorted(person.name for person in vocab.search('')) |
2464 | [...u'private-team'...] |
2465 | |
2466 | - |
2467 | A search for 'support' will give us only the persons which have support |
2468 | as part of their name or displayname, or the beginning of |
2469 | one of its email addresses. |
2470 | @@ -890,8 +892,9 @@ |
2471 | >>> login(ANONYMOUS) |
2472 | >>> [person.displayname for person in vocab.search('team')] |
2473 | [u'HWDB Team', u'Hoary Gnome Team', u'No Team Memberships', |
2474 | - u'Other Team', u'Simple Team', u'Ubuntu Gnome Team', u'Ubuntu Team', |
2475 | - u'Warty Gnome Team', u'Warty Security Team', u'testing Spanish team'] |
2476 | + u'Other Team', u'Simple Team', u'Ubuntu Gnome Team', |
2477 | + u'Ubuntu Security Team', u'Ubuntu Team', u'Warty Gnome Team', |
2478 | + u'Warty Security Team', u'testing Spanish team'] |
2479 | |
2480 | >>> login(ANONYMOUS) |
2481 | >>> vocab.LIMIT |
2482 | @@ -920,9 +923,10 @@ |
2483 | search for 'team' should give us some of them: |
2484 | |
2485 | >>> sorted(person.name for person in vocab.search('team')) |
2486 | - [u'hwdb-team', u'name18', u'name20', u'name21', u'no-team-memberships', |
2487 | - u'otherteam', u'simple-team', u'testing-spanish-team', u'ubuntu-team', |
2488 | - u'warty-gnome'] |
2489 | + [u'hwdb-team', u'name18', u'name20', u'name21', |
2490 | + u'no-team-memberships', u'otherteam', |
2491 | + u'simple-team', u'testing-spanish-team', u'ubuntu-security', |
2492 | + u'ubuntu-team', u'warty-gnome'] |
2493 | |
2494 | |
2495 | === ValidTeam === |
2496 | @@ -961,6 +965,7 @@ |
2497 | (u'ShipIt Administrators', u'Mark Shuttleworth'), |
2498 | (u'Simple Team', u'One Membership'), |
2499 | (u'Ubuntu Gnome Team', u'Mark Shuttleworth'), |
2500 | + (u'Ubuntu Security Team', u'Colin Watson'), |
2501 | (u'Ubuntu Team', u'Mark Shuttleworth'), |
2502 | (u'Ubuntu Translators', u'Rosetta Administrators'), |
2503 | (u'Ubuntu branches', u'Ubuntu-branches-owner'), |
2504 | @@ -980,6 +985,7 @@ |
2505 | ... for team in vocab.search('spanish | ubuntu')) |
2506 | [(u'Mirror Administrators', u'Mark Shuttleworth'), |
2507 | (u'Ubuntu Gnome Team', u'Mark Shuttleworth'), |
2508 | + (u'Ubuntu Security Team', u'Colin Watson'), |
2509 | (u'Ubuntu Team', u'Mark Shuttleworth'), |
2510 | (u'Ubuntu Translators', u'Rosetta Administrators'), |
2511 | (u'Ubuntu branches', u'Ubuntu-branches-owner'), |
2512 | @@ -992,6 +998,7 @@ |
2513 | (u'Other Team', u'Owner'), |
2514 | (u'Simple Team', u'One Membership'), |
2515 | (u'Ubuntu Gnome Team', u'Mark Shuttleworth'), |
2516 | + (u'Ubuntu Security Team', u'Colin Watson'), |
2517 | (u'Ubuntu Team', u'Mark Shuttleworth'), |
2518 | (u'Warty Gnome Team', u'Mark Shuttleworth'), |
2519 | (u'Warty Security Team', u'Mark Shuttleworth'), |
2520 | @@ -1008,6 +1015,7 @@ |
2521 | (u'Private Team', u'Commercial Member'), |
2522 | (u'Simple Team', u'One Membership'), |
2523 | (u'Ubuntu Gnome Team', u'Mark Shuttleworth'), |
2524 | + (u'Ubuntu Security Team', u'Colin Watson'), |
2525 | (u'Ubuntu Team', u'Mark Shuttleworth'), |
2526 | (u'Warty Gnome Team', u'Mark Shuttleworth'), |
2527 | (u'Warty Security Team', u'Mark Shuttleworth'), |
2528 | @@ -1060,7 +1068,7 @@ |
2529 | >>> team in vocab |
2530 | False |
2531 | >>> [person.name for person in vocab.search('ubuntu-team')] |
2532 | - [u'name18'] |
2533 | + [u'name18', u'ubuntu-security'] |
2534 | |
2535 | 'ubuntu-team' is a member of 'guadamen', so 'guadamen' can't be a member |
2536 | of 'ubuntu-team'. |
2537 | @@ -1103,7 +1111,7 @@ |
2538 | >>> team in vocab |
2539 | False |
2540 | >>> [person.name for person in vocab.search('ubuntu-team')] |
2541 | - [u'name18'] |
2542 | + [u'name18', u'ubuntu-security'] |
2543 | |
2544 | 'name16' is a valid owner for 'ubuntu-team'. |
2545 | |
2546 | |
2547 | === modified file 'lib/lp/registry/interfaces/productseries.py' |
2548 | --- lib/lp/registry/interfaces/productseries.py 2009-07-17 00:26:05 +0000 |
2549 | +++ lib/lp/registry/interfaces/productseries.py 2009-07-19 04:41:14 +0000 |
2550 | @@ -251,6 +251,15 @@ |
2551 | "A Bazaar branch to commit translation snapshots to. " |
2552 | "Leave blank to disable.")) |
2553 | |
2554 | + translations_branch = ReferenceChoice( |
2555 | + title=_("Translations export branch"), |
2556 | + vocabulary='HostedBranchRestrictedOnOwner', |
2557 | + schema=IBranch, |
2558 | + required=False, |
2559 | + description=_( |
2560 | + "A Bazaar branch to commit translation snapshots to. " |
2561 | + "Leave blank to disable.")) |
2562 | + |
2563 | def getRelease(version): |
2564 | """Get the release in this series that has the specified version. |
2565 | Return None is there is no such release. |
2566 | |
2567 | === modified file 'lib/lp/registry/stories/productseries/xx-productseries-series.txt' |
2568 | --- lib/lp/registry/stories/productseries/xx-productseries-series.txt 2009-06-29 23:15:46 +0000 |
2569 | +++ lib/lp/registry/stories/productseries/xx-productseries-series.txt 2009-07-21 16:30:28 +0000 |
2570 | @@ -54,12 +54,12 @@ |
2571 | >>> print series_1_0['class'] |
2572 | unhighlighted series |
2573 | |
2574 | -Any user can see that obsolete series are dimmed. |
2575 | +Any user can see that obsolete series are dimmed. Obsolete series do not |
2576 | +show bug status counts because it is expensive to retrieve the information. |
2577 | |
2578 | >>> series_xxx = find_tag_by_id(content, 'series-xxx') |
2579 | >>> print extract_text(series_xxx) |
2580 | xxx series Obsolete |
2581 | - Bugs targeted: None |
2582 | Blueprints targeted: None |
2583 | Use true GTK UI. |
2584 | |
2585 | |
2586 | === modified file 'lib/lp/registry/templates/productrelease-add-from-series.pt' |
2587 | --- lib/lp/registry/templates/productrelease-add-from-series.pt 2009-07-17 17:59:07 +0000 |
2588 | +++ lib/lp/registry/templates/productrelease-add-from-series.pt 2009-07-21 17:29:12 +0000 |
2589 | @@ -51,7 +51,7 @@ |
2590 | var select_menu = get_by_id('field.milestone_for_release'); |
2591 | var create_milestone_link = Y.Node.create( |
2592 | '<a href="+addmilestone" id="create-milestone-link" ' + |
2593 | - 'class="add js-action">Create milestone</a>'); |
2594 | + 'class="add js-action sprite">Create milestone</a>'); |
2595 | select_menu.ancestor().appendChild(create_milestone_link); |
2596 | var config = { |
2597 | milestone_form_uri: milestone_form_uri, |
2598 | |
2599 | === added file 'lib/lp/registry/templates/productseries-link-translations-branch.pt' |
2600 | --- lib/lp/registry/templates/productseries-link-translations-branch.pt 1970-01-01 00:00:00 +0000 |
2601 | +++ lib/lp/registry/templates/productseries-link-translations-branch.pt 2009-07-03 09:50:12 +0000 |
2602 | @@ -0,0 +1,25 @@ |
2603 | +<tal:root |
2604 | + xmlns:tal="http://xml.zope.org/namespaces/tal" |
2605 | + omit-tag=""> |
2606 | + |
2607 | +<html |
2608 | + xmlns="http://www.w3.org/1999/xhtml" |
2609 | + xmlns:metal="http://xml.zope.org/namespaces/metal" |
2610 | + xmlns:i18n="http://xml.zope.org/namespaces/i18n" |
2611 | + xml:lang="en" |
2612 | + lang="en" |
2613 | + dir="ltr" |
2614 | + metal:use-macro="view/macro:page/onecolumn" |
2615 | + i18n:domain="launchpad" |
2616 | +> |
2617 | + |
2618 | +<body> |
2619 | +<div metal:fill-slot="main"> |
2620 | + <div metal:use-macro="context/@@launchpad_form/form"> |
2621 | + <h1 metal:fill-slot="heading" |
2622 | + >Set translations export branch for this series</h1> |
2623 | + </div> |
2624 | +</div> |
2625 | +</body> |
2626 | +</html> |
2627 | +</tal:root> |
2628 | |
2629 | === modified file 'lib/lp/registry/templates/productseries-status.pt' |
2630 | --- lib/lp/registry/templates/productseries-status.pt 2009-07-17 17:59:07 +0000 |
2631 | +++ lib/lp/registry/templates/productseries-status.pt 2009-07-21 18:17:49 +0000 |
2632 | @@ -6,24 +6,27 @@ |
2633 | tal:define=" |
2634 | series context; |
2635 | is_focus context/is_development_focus; |
2636 | - bug_count_status view/bugtask_status_counts; |
2637 | spec_count_status view/specification_status_counts;" |
2638 | > |
2639 | <metal:series use-macro="series/@@+macros/detailed_display"> |
2640 | <div metal:fill-slot="extra"> |
2641 | <div> |
2642 | - Bugs targeted: |
2643 | - <tal:statuses repeat="count_status bug_count_status"> |
2644 | - <span tal:attributes="class string:status${count_status/status/name}"> |
2645 | - <strong tal:content="count_status/count">2</strong> |
2646 | - <tal:status replace="count_status/status/title" /><tal:comma |
2647 | - condition="not: repeat/count_status/end">,</tal:comma> |
2648 | - </span> |
2649 | - </tal:statuses> |
2650 | - <tal:no-statuses condition="not: bug_count_status"> |
2651 | - None |
2652 | - </tal:no-statuses> |
2653 | - <br /> |
2654 | + <tal:not-obsolete |
2655 | + condition="not: view/is_obsolete" |
2656 | + define="bug_count_status view/bugtask_status_counts;"> |
2657 | + Bugs targeted: |
2658 | + <tal:statuses repeat="count_status bug_count_status"> |
2659 | + <span tal:attributes="class string:status${count_status/status/name}"> |
2660 | + <strong tal:content="count_status/count">2</strong> |
2661 | + <tal:status replace="count_status/status/title" /><tal:comma |
2662 | + condition="not: repeat/count_status/end">,</tal:comma> |
2663 | + </span> |
2664 | + </tal:statuses> |
2665 | + <tal:no-statuses condition="not: bug_count_status"> |
2666 | + None |
2667 | + </tal:no-statuses> |
2668 | + <br /> |
2669 | + </tal:not-obsolete> |
2670 | Blueprints targeted: |
2671 | <tal:statuses repeat="count_status spec_count_status"> |
2672 | <span tal:attributes="class string:specdelivery${count_status/status/name}"> |
2673 | |
2674 | === modified file 'lib/lp/services/inlinehelp/javascript/inlinehelp.js' |
2675 | --- lib/lp/services/inlinehelp/javascript/inlinehelp.js 2009-06-30 21:06:27 +0000 |
2676 | +++ lib/lp/services/inlinehelp/javascript/inlinehelp.js 2009-07-20 23:17:14 +0000 |
2677 | @@ -24,8 +24,10 @@ |
2678 | page elements. |
2679 | */ |
2680 | // The button is inserted in the page dynamically: |
2681 | + // Changed from an <input type=button> to a <button> since |
2682 | + // IE8 doesn't handle style.css's input{visibility:inherit} correctly. |
2683 | $('help-close').innerHTML = |
2684 | - '<input id="help-close-btn" type="button" value="Continue">'; |
2685 | + '<button id="help-close-btn" type="button">Continue</button>'; |
2686 | forEach(findHelpLinks(), setupHelpTrigger); |
2687 | initHelpPane(); |
2688 | } |
2689 | |
2690 | === modified file 'lib/lp/soyuz/browser/queue.py' |
2691 | --- lib/lp/soyuz/browser/queue.py 2009-07-17 00:26:05 +0000 |
2692 | +++ lib/lp/soyuz/browser/queue.py 2009-07-19 04:41:14 +0000 |
2693 | @@ -186,7 +186,12 @@ |
2694 | if len(uploads) == 0: |
2695 | return None |
2696 | |
2697 | - upload_ids = [upload.id for upload in uploads] |
2698 | + # Operate only on regular uploads and processed delayed-copies. |
2699 | + upload_ids = [ |
2700 | + upload.id |
2701 | + for upload in uploads |
2702 | + if not (upload.is_delayed_copy and |
2703 | + upload.status != PackageUploadStatus.DONE)] |
2704 | binary_file_set = getUtility(IBinaryPackageFileSet) |
2705 | binary_files = binary_file_set.getByPackageUploadIDs(upload_ids) |
2706 | source_file_set = getUtility(ISourcePackageReleaseFileSet) |
2707 | @@ -456,3 +461,22 @@ |
2708 | self.sourcepackagerelease = self.sources[0].sourcepackagerelease |
2709 | else: |
2710 | self.sourcepackagerelease = None |
2711 | + |
2712 | + @property |
2713 | + def pending_delayed_copy(self): |
2714 | + """Whether the context is a delayed-copy pending processing.""" |
2715 | + return ( |
2716 | + self.is_delayed_copy and self.status != PackageUploadStatus.DONE) |
2717 | + |
2718 | + @property |
2719 | + def changesfile(self): |
2720 | + """Return the upload changesfile object, even for delayed-copies. |
2721 | + |
2722 | + If the context `PackageUpload` is a delayed-copy, which doesn't |
2723 | + have '.changesfile' by design, return the changesfile originally |
2724 | + used to upload the contained source. |
2725 | + """ |
2726 | + if self.is_delayed_copy: |
2727 | + return self.sources[0].sourcepackagerelease.upload_changesfile |
2728 | + return self.context.changesfile |
2729 | + |
2730 | |
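The `changesfile` fallback property added to the queue view above can be illustrated in isolation. This is a minimal sketch using hypothetical stand-in objects (plain `SimpleNamespace` contexts and a toy `UploadView` class), not Launchpad's actual classes:

```python
from types import SimpleNamespace

class UploadView:
    """Toy sketch of the changesfile fallback for delayed-copies.

    `context` is a hypothetical stand-in for PackageUpload with
    `is_delayed_copy`, `changesfile`, and `sources` attributes.
    """

    def __init__(self, context):
        self.context = context

    @property
    def changesfile(self):
        # Delayed-copies carry no changesfile of their own by design,
        # so fall back to the one used for the original source upload.
        if self.context.is_delayed_copy:
            return self.context.sources[0].upload_changesfile
        return self.context.changesfile


# Normal upload: the context's own changesfile is returned.
normal = SimpleNamespace(
    is_delayed_copy=False, changesfile='direct.changes', sources=[])
# Delayed-copy: the original source's changesfile is used instead.
source = SimpleNamespace(upload_changesfile='original.changes')
delayed = SimpleNamespace(
    is_delayed_copy=True, changesfile=None, sources=[source])

print(UploadView(normal).changesfile)   # direct.changes
print(UploadView(delayed).changesfile)  # original.changes
```

The same conditional shape drives the `pending_delayed_copy` property in the diff: a delayed-copy that has not yet reached DONE is treated specially by the view.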
2731 | === modified file 'lib/lp/soyuz/browser/tests/archive-views.txt' |
2732 | --- lib/lp/soyuz/browser/tests/archive-views.txt 2009-07-17 13:11:44 +0000 |
2733 | +++ lib/lp/soyuz/browser/tests/archive-views.txt 2009-07-18 01:03:09 +0000 |
2734 | @@ -907,11 +907,11 @@ |
2735 | WidgetInputError: ('destination_archive', u'Destination PPA', ) |
2736 | |
2737 | |
2738 | -=== Copy privacy mismatch === |
2739 | +=== Copy private files to public archives === |
2740 | |
2741 | -Users are only allowed to copy private sources into private PPAs, |
2742 | -otherwise builders won't be able to retrieve the files for |
2743 | -building. See `testCopyFromPrivateToPublicPPAs` for more information. |
2744 | +Users are allowed to copy private sources into public PPAs; however, |
2745 | +this happens via 'delayed-copies', not the usual direct copying |
2746 | +method. See scripts/packagecopier.py for more information. |
2747 | |
2748 | First we will make Celso's PPA private. |
2749 | |
2750 | @@ -954,14 +954,24 @@ |
2751 | ... 'field.actions.copy': 'Copy', |
2752 | ... }) |
2753 | |
2754 | -The action cannot be performed due to the 'private mismatch' |
2755 | -error. Nothing was copied to Ubuntu-team PPA. |
2756 | + >>> len(view.errors) |
2757 | + 0 |
2758 | + |
2759 | +The action is performed as a delayed-copy, and the user is informed of |
2760 | +it via a page notification. |
2761 | |
2762 | >>> from canonical.launchpad.testing.pages import extract_text |
2763 | - >>> for error in view.errors: |
2764 | - ... print extract_text(error) |
2765 | - The following source cannot be copied: |
2766 | - private 1.0 in hoary (cannot copy private files into public archives) |
2767 | - |
2768 | - >>> ubuntu_team_ppa.getPublishedSources().count() |
2769 | - 0 |
2770 | + >>> for notification in view.request.response.notifications: |
2771 | + ... print extract_text(notification.message) |
2772 | + Packages copied to PPA for Ubuntu Team: |
2773 | + Delayed copy of private - 1.0 (source) |
2774 | + |
2775 | +The delayed-copy request is waiting to be processed in the ACCEPTED |
2776 | +upload queue. |
2777 | + |
2778 | + >>> from lp.soyuz.interfaces.queue import IPackageUploadSet |
2779 | + >>> copy = getUtility(IPackageUploadSet).findSourceUpload( |
2780 | + ... 'private', '1.0', ubuntu_team_ppa, ubuntu) |
2781 | + |
2782 | + >>> print copy.status.name |
2783 | + ACCEPTED |
2784 | |
2785 | === modified file 'lib/lp/soyuz/doc/archive.txt' |
2786 | --- lib/lp/soyuz/doc/archive.txt 2009-07-11 14:46:40 +0000 |
2787 | +++ lib/lp/soyuz/doc/archive.txt 2009-07-20 19:13:31 +0000 |
2788 | @@ -2065,6 +2065,42 @@ |
2789 | >>> cprov_archive.buildd_secret = '' |
2790 | >>> cprov_archive.private = False |
2791 | |
2792 | +Another important aspect of the upload permission for ubuntu main |
2793 | +archives (PRIMARY, PARTNER and DEBUG) is that in addition to owners |
2794 | +and users who were specifically granted permissions, members of the |
2795 | +'ubuntu-security' team also have 'launchpad.Append' on them. |
2796 | + |
2797 | +In the sampledata, Carlos does not have permission to append contents |
2798 | +to the Ubuntu main archives. |
2799 | + |
2800 | + >>> primary, partner, debug = ubuntu.all_distro_archives |
2801 | + |
2802 | + >>> login('carlos@canonical.com') |
2803 | + >>> check_permission('launchpad.Append', primary) |
2804 | + False |
2805 | + >>> check_permission('launchpad.Append', partner) |
2806 | + False |
2807 | + >>> check_permission('launchpad.Append', debug) |
2808 | + False |
2809 | + |
2810 | +When Carlos becomes a member of the 'ubuntu-security' team he is |
2811 | +allowed to append to ubuntu main archives. In practice this means that |
2812 | +Carlos can now *copy* packages directly to ubuntu. |
2813 | + |
2814 | + # Make Carlos a member of the ubuntu-security team. |
2815 | + >>> login('foo.bar@canonical.com') |
2816 | + >>> ubuntu_security = getUtility(IPersonSet).getByName( |
2817 | + ... 'ubuntu-security') |
2818 | + >>> ubuntu_security.addMember(carlos, cprov) |
2819 | + |
2820 | + >>> login('carlos@canonical.com') |
2821 | + >>> check_permission('launchpad.Append', primary) |
2822 | + True |
2823 | + >>> check_permission('launchpad.Append', partner) |
2824 | + True |
2825 | + >>> check_permission('launchpad.Append', debug) |
2826 | + True |
2827 | + |
2828 | |
2829 | == Rebuild archives == |
2830 | |
2831 | @@ -2080,6 +2116,7 @@ |
2832 | Creating new COPY archive without passing a name results in an |
2833 | AssertionError. |
2834 | |
2835 | + >>> login('foo.bar@canonical.com') |
2836 | >>> rebuild_archive = getUtility(IArchiveSet).new( |
2837 | ... owner=cprov, purpose=ArchivePurpose.COPY, |
2838 | ... distribution=ubuntutest) |
2839 | @@ -2227,7 +2264,6 @@ |
2840 | ... |
2841 | DistroSeriesNotFound: badseries |
2842 | |
2843 | - |
2844 | We can also specify a single source to be copied with the `syncSource` |
2845 | call. This allows a version to be specified so older versions can be |
2846 | pulled. |
2847 | @@ -2312,6 +2348,38 @@ |
2848 | ... |
2849 | CannotCopy: Destination pocket must be 'release' for a PPA. |
2850 | |
2851 | +syncSource() will always use only the latest publication of the |
2852 | +specified source, ignoring the previous ones. Multiple publications |
2853 | +can result from copies and/or overrides of the copy candidates in the |
2854 | +source archive. |
2855 | + |
2856 | + # Create a copy candidate (overridden_1.0) in the ubuntu primary |
2857 | + # archive and override its section, resulting in 2 publications |
2858 | + # in the source archive. |
2859 | + >>> from lp.soyuz.interfaces.section import ISectionSet |
2860 | + >>> source_old = test_publisher.getPubSource( |
2861 | + ... sourcename="overridden", version="1.0") |
2862 | + >>> python_section = getUtility(ISectionSet).ensure('python') |
2863 | + >>> copy_candidate = source_old.changeOverride(new_section=python_section) |
2864 | + |
2865 | + >>> source_archive = copy_candidate.archive |
2866 | + >>> source_archive.getPublishedSources(name="overridden").count() |
2867 | + 2 |
2868 | + |
2869 | + >>> print copy_candidate.section.name |
2870 | + python |
2871 | + |
2872 | +When syncing 'overridden_1.0' to Mark's PPA, the latest publication, |
2873 | +the one published in the 'python' section, will be used. |
2874 | + |
2875 | + >>> sabdfl.archive.syncSource( |
2876 | + ... source_name='overridden', version='1.0', |
2877 | + ... from_archive=source_archive, to_pocket='release') |
2878 | + |
2879 | + >>> [copy] = sabdfl.archive.getPublishedSources(name="overridden") |
2880 | + >>> print copy.section.name |
2881 | + python |
2882 | + |
2883 | |
2884 | == Publish flag == |
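The "latest publication wins" behaviour of `syncSource()` documented above can be sketched in plain Python. This assumes, as the `archive.py` change does with its `[0]` index, that matching publications arrive ordered newest-first; `latest_publication` is a hypothetical helper, not Launchpad API:

```python
# Toy model: two publications of the same source, newest first, as
# the getPublishedSources(...)[0] pattern in the diff assumes.
publications = [
    {'source': 'overridden', 'version': '1.0', 'section': 'python'},
    {'source': 'overridden', 'version': '1.0', 'section': 'devel'},
]

def latest_publication(pubs, name, version):
    """Return the newest matching publication, mirroring the
    getPublishedSources(...)[0] selection in the diff above."""
    matches = [p for p in pubs
               if p['source'] == name and p['version'] == version]
    return matches[0]

print(latest_publication(publications, 'overridden', '1.0')['section'])
```

Here the overridden publication (section 'python') wins over the earlier one, matching the doctest's expected output.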
2885 | |
2886 | |
2887 | === modified file 'lib/lp/soyuz/doc/closing-bugs-from-changelogs.txt' |
2888 | --- lib/lp/soyuz/doc/closing-bugs-from-changelogs.txt 2009-05-06 20:53:05 +0000 |
2889 | +++ lib/lp/soyuz/doc/closing-bugs-from-changelogs.txt 2009-07-16 00:00:27 +0000 |
2890 | @@ -49,18 +49,18 @@ |
2891 | Launchpad-bugs-fixed header. This is required so that we have some data |
2892 | for close_bugs to operate on. |
2893 | |
2894 | - >>> from canonical.launchpad.interfaces import ( |
2895 | - ... PackagePublishingPocket, PackageUploadStatus) |
2896 | - |
2897 | + >>> from canonical.launchpad.interfaces import PackagePublishingPocket |
2898 | >>> def add_package_upload( |
2899 | ... source_release, fixing_text, |
2900 | ... pocket=PackagePublishingPocket.RELEASE, |
2901 | - ... archive=None): |
2902 | + ... archive=None, distroseries=None): |
2903 | ... """Create a PackageUpload record.""" |
2904 | ... changes = changes_template % fixing_text |
2905 | + ... if distroseries is None: |
2906 | + ... distroseries = ubuntu_hoary |
2907 | ... if archive is None: |
2908 | - ... archive = ubuntu_hoary.main_archive |
2909 | - ... queue_item = ubuntu_hoary.createQueueEntry( |
2910 | + ... archive = distroseries.main_archive |
2911 | + ... queue_item = distroseries.createQueueEntry( |
2912 | ... archive=archive, |
2913 | ... pocket=pocket, |
2914 | ... changesfilename='%s.changes' % source_release.name, |
2915 | @@ -68,7 +68,6 @@ |
2916 | ... source_queue = queue_item.addSource(source_release) |
2917 | ... return queue_item |
2918 | |
2919 | - |
2920 | Throughout this document we'll now create PackageUpload records for various |
2921 | packages in the sample data (e.g. pmount, cdrkit) using this helper function. |
2922 | |
2923 | @@ -215,7 +214,7 @@ |
2924 | >>> cdrkit_bug_id = cdrkit_ubuntu.createBug(bug_params).id |
2925 | |
2926 | >>> queue_item_id = add_package_upload( |
2927 | - ... cdrkit_release, cdrkit_bug_id, |
2928 | + ... cdrkit_release, cdrkit_bug_id, |
2929 | ... pocket=PackagePublishingPocket.PROPOSED).id |
2930 | |
2931 | >>> close_bugs_and_check_status([cdrkit_bug_id], queue_item_id) |
2932 | @@ -226,7 +225,7 @@ |
2933 | #295621). |
2934 | |
2935 | >>> queue_item_id = add_package_upload( |
2936 | - ... cdrkit_release, cdrkit_bug_id, |
2937 | + ... cdrkit_release, cdrkit_bug_id, |
2938 | ... pocket=PackagePublishingPocket.BACKPORTS).id |
2939 | |
2940 | >>> close_bugs_and_check_status([cdrkit_bug_id], queue_item_id) |
2941 | @@ -246,6 +245,28 @@ |
2942 | Before: NEW |
2943 | After: NEW |
2944 | |
2945 | +Delayed-copies to allowed archives and pockets will close bugs when |
2946 | +processed. |
2947 | + |
2948 | + # Create the cdrkit 'original upload' in ubuntu/breezy-autotest |
2949 | + # with an appropriate 'changelog_entry'. |
2950 | + >>> original_upload = add_package_upload( |
2951 | + ... cdrkit_release, cdrkit_bug_id, |
2952 | + ... distroseries=cdrkit_release.upload_distroseries) |
2953 | + >>> from zope.security.proxy import removeSecurityProxy |
2954 | + >>> removeSecurityProxy(cdrkit_release).changelog_entry = 'Something!' |
2955 | + |
2956 | + # Create a delayed-copy for cdrkit in ubuntu/hoary. |
2957 | + >>> from lp.soyuz.interfaces.queue import IPackageUploadSet |
2958 | + >>> delayed_copy = getUtility(IPackageUploadSet).createDelayedCopy( |
2959 | + ... archive=ubuntu.main_archive, distroseries=ubuntu_hoary, |
2960 | + ... pocket=PackagePublishingPocket.RELEASE, signing_key=None) |
2961 | + >>> unused = delayed_copy.addSource(cdrkit_release) |
2962 | + |
2963 | + >>> close_bugs_and_check_status([cdrkit_bug_id], delayed_copy.id) |
2964 | + Before: NEW |
2965 | + After: FIXRELEASED |
2966 | + |
2967 | It's possible to specify more than one bug in the Launchpad-bugs-fixed |
2968 | header, each will be marked as Fix Released. If a nonexistent bug, |
2969 | '666', is specified, it's ignored. |
2970 | |
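The bug-closing machinery above works off the Launchpad-bugs-fixed header of a changes file. As an illustrative sketch only (the real parser in Launchpad is more involved), extracting the fixed bug IDs might look like this; `parse_bugs_fixed` is a hypothetical helper:

```python
def parse_bugs_fixed(changes_text):
    """Extract bug IDs from a Launchpad-bugs-fixed header line.

    Illustrative sketch only: the header name comes from the doctest
    above, while the parsing details here are assumptions.
    """
    for line in changes_text.splitlines():
        if line.lower().startswith('launchpad-bugs-fixed:'):
            _, _, ids = line.partition(':')
            return [int(token) for token in ids.split()]
    return []

changes = "Format: 1.8\nLaunchpad-bugs-fixed: 6 666\n"
print(parse_bugs_fixed(changes))  # [6, 666]
```

As in the doctest, a nonexistent bug ID such as 666 would still be parsed here; ignoring it is the caller's job.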
2971 | === modified file 'lib/lp/soyuz/doc/distroseriesqueue.txt' |
2972 | --- lib/lp/soyuz/doc/distroseriesqueue.txt 2009-07-16 03:31:45 +0000 |
2973 | +++ lib/lp/soyuz/doc/distroseriesqueue.txt 2009-07-16 15:06:35 +0000 |
2974 | @@ -1093,16 +1093,16 @@ |
2975 | |
2976 | >>> unused = delayed_copy.addSource(a_source_release) |
2977 | |
2978 | -IPackageUpload.acceptFromCopy() checks and accepts a delayed-copy |
2979 | -record. It also closes related bug reports and grant karma to people |
2980 | -related with the upload, although it doesn't send any emails. |
2981 | +IPackageUpload.acceptFromCopy() simply checks and accepts a |
2982 | +delayed-copy record. Bugs mentioned in the changelog are closed by |
2983 | +`process-accepted` (the transition from ACCEPTED to DONE); see |
2984 | +closing-bugs-from-changelogs.txt for more information. |
2985 | + |
2986 | + >>> print delayed_copy.status.name |
2987 | + NEW |
2988 | |
2989 | >>> delayed_copy.acceptFromCopy() |
2990 | |
2991 | - >>> transaction.commit() |
2992 | - >>> pop_notifications() |
2993 | - [] |
2994 | - |
2995 | >>> print delayed_copy.status.name |
2996 | ACCEPTED |
2997 | |
2998 | |
2999 | === modified file 'lib/lp/soyuz/interfaces/queue.py' |
3000 | --- lib/lp/soyuz/interfaces/queue.py 2009-07-17 00:26:05 +0000 |
3001 | +++ lib/lp/soyuz/interfaces/queue.py 2009-07-19 04:41:14 +0000 |
3002 | @@ -274,19 +274,25 @@ |
3003 | * Publish and close bugs for 'single-source' uploads. |
3004 | * Skip bug-closing for PPA uploads. |
3005 | * Grant karma to people involved with the upload. |
3006 | + |
3007 | + :raises: AssertionError if the context is a delayed-copy. |
3008 | """ |
3009 | |
3010 | def acceptFromCopy(): |
3011 | """Perform upload acceptance for a delayed-copy record. |
3012 | |
3013 | * Move the upload to accepted queue in all cases. |
3014 | - * Close bugs for uploaded sources (skip imported ones). |
3015 | + |
3016 | + :raises: AssertionError if the context is not a delayed-copy or |
3017 | + has no sources associated with it. |
3018 | """ |
3019 | |
3020 | def acceptFromQueue(announce_list, logger=None, dry_run=False): |
3021 | """Call setAccepted, do a syncUpdate, and send notification email. |
3022 | |
3023 | * Grant karma to people involved with the upload. |
3024 | + |
3025 | + :raises: AssertionError if the context is a delayed-copy. |
3026 | """ |
3027 | |
3028 | def rejectFromQueue(logger=None, dry_run=False): |
3029 | |
3030 | === modified file 'lib/lp/soyuz/model/archive.py' |
3031 | --- lib/lp/soyuz/model/archive.py 2009-07-18 21:19:37 +0000 |
3032 | +++ lib/lp/soyuz/model/archive.py 2009-07-20 18:05:13 +0000 |
3033 | @@ -1082,9 +1082,9 @@ |
3034 | raise SourceNotFound(e) |
3035 | |
3036 | source = from_archive.getPublishedSources( |
3037 | - name=source_name, version=version, exact_match=True) |
3038 | + name=source_name, version=version, exact_match=True)[0] |
3039 | |
3040 | - self._copySources(source, to_pocket, to_series, include_binaries) |
3041 | + self._copySources([source], to_pocket, to_series, include_binaries) |
3042 | |
3043 | def _copySources(self, sources, to_pocket, to_series=None, |
3044 | include_binaries=False): |
3045 | |
3046 | === modified file 'lib/lp/soyuz/model/archivepermission.py' |
3047 | --- lib/lp/soyuz/model/archivepermission.py 2009-07-17 00:26:05 +0000 |
3048 | +++ lib/lp/soyuz/model/archivepermission.py 2009-07-21 08:27:00 +0000 |
3049 | @@ -121,10 +121,8 @@ |
3050 | clauses = [""" |
3051 | ArchivePermission.archive = %s AND |
3052 | ArchivePermission.permission = %s AND |
3053 | - EXISTS (SELECT TeamParticipation.person |
3054 | - FROM TeamParticipation |
3055 | - WHERE TeamParticipation.person = %s AND |
3056 | - TeamParticipation.team = ArchivePermission.person) |
3057 | + ArchivePermission.person = TeamParticipation.team AND |
3058 | + TeamParticipation.person = %s |
3059 | """ % sqlvalues(archive, permission, person) |
3060 | ] |
3061 | |
3062 | @@ -149,7 +147,7 @@ |
3063 | |
3064 | query = " AND ".join(clauses) |
3065 | auth = ArchivePermission.select( |
3066 | - query, clauseTables=["TeamParticipation"], distinct=True, |
3067 | + query, clauseTables=["TeamParticipation"], |
3068 | prejoins=prejoins) |
3069 | |
3070 | return auth |
3071 | @@ -337,11 +335,11 @@ |
3072 | SELECT ap.id |
3073 | FROM archivepermission ap, teamparticipation tp |
3074 | WHERE |
3075 | - (ap.person = ? OR (ap.person = tp.team AND tp.person = ?)) |
3076 | + ap.person = tp.team AND tp.person = ? |
3077 | AND ap.archive = ? |
3078 | AND ap.packageset IS NOT NULL |
3079 | ''' |
3080 | - query = SQL(query, (person.id, person.id, archive.id)) |
3081 | + query = SQL(query, (person.id, archive.id)) |
3082 | return store.find(ArchivePermission, In(ArchivePermission.id, query)) |
3083 | |
3084 | def uploadersForPackageset( |
3085 | @@ -375,10 +373,10 @@ |
3086 | SELECT ap.id |
3087 | FROM archivepermission ap, teamparticipation tp |
3088 | WHERE |
3089 | - (ap.person = ? OR (ap.person = tp.team AND tp.person = ?)) |
3090 | + ap.person = tp.team AND tp.person = ? |
3091 | AND ap.packageset = ? AND ap.archive = ? |
3092 | ''' |
3093 | - query = SQL(query, (person.id, person.id, packageset.id, archive.id)) |
3094 | + query = SQL(query, (person.id, packageset.id, archive.id)) |
3095 | permissions = list( |
3096 | store.find(ArchivePermission, In(ArchivePermission.id, query))) |
3097 | if len(permissions) > 0: |
3098 | @@ -444,14 +442,14 @@ |
3099 | archivepermission ap, teamparticipation tp, |
3100 | packagesetsources pss, flatpackagesetinclusion fpsi |
3101 | WHERE |
3102 | - (ap.person = ? OR (ap.person = tp.team AND tp.person = ?)) |
3103 | + ap.person = tp.team AND tp.person = ? |
3104 | AND ap.packageset = fpsi.parent |
3105 | AND pss.packageset = fpsi.child |
3106 | AND pss.sourcepackagename = ? |
3107 | AND ap.archive = ? |
3108 | ''' |
3109 | query = SQL( |
3110 | - query, (person.id, person.id, sourcepackagename.id, archive.id)) |
3111 | + query, (person.id, sourcepackagename.id, archive.id)) |
3112 | return store.find(ArchivePermission, In(ArchivePermission.id, query)) |
3113 | |
3114 | def packagesetsForSource( |
3115 | @@ -491,9 +489,9 @@ |
3116 | # Query parameters for the first WHERE clause. |
3117 | (archive.id, sourcepackagename.id) + |
3118 | # Query parameters for the second WHERE clause. |
3119 | - (sourcepackagename.id,) + (person.id,)*2 + archive_params + |
3120 | + (sourcepackagename.id,) + (person.id,) + archive_params + |
3121 | # Query parameters for the third WHERE clause. |
3122 | - (sourcepackagename.id,) + (person.id,)*2 + archive_params) |
3123 | + (sourcepackagename.id,) + (person.id,) + archive_params) |
3124 | |
3125 | query = ''' |
3126 | SELECT CASE |
3127 | @@ -511,7 +509,7 @@ |
3128 | teamparticipation tp |
3129 | WHERE |
3130 | pss.sourcepackagename = %s |
3131 | - AND (ap.person = %s OR (ap.person = tp.team AND tp.person = %s)) |
3132 | + AND ap.person = tp.team AND tp.person = %s |
3133 | AND pss.packageset = ap.packageset AND ap.explicit = TRUE |
3134 | AND ap.permission = %s AND ap.archive = %s) |
3135 | ELSE ( |
3136 | @@ -521,7 +519,7 @@ |
3137 | teamparticipation tp, flatpackagesetinclusion fpsi |
3138 | WHERE |
3139 | pss.sourcepackagename = %s |
3140 | - AND (ap.person = %s OR (ap.person = tp.team AND tp.person = %s)) |
3141 | + AND ap.person = tp.team AND tp.person = %s |
3142 | AND pss.packageset = fpsi.child AND fpsi.parent = ap.packageset |
3143 | AND ap.permission = %s AND ap.archive = %s) |
3144 | END AS number_of_permitted_package_sets; |
3145 | |
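The `archivepermission.py` changes above replace an EXISTS subquery (and a direct `ap.person = ?` match) with a plain join against TeamParticipation. This stays correct because TeamParticipation records each person as a participant of themselves, so direct grants still match through the self-participation row. A sqlite3 sketch of the join pattern, with a simplified two-column schema as an assumption:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript("""
    CREATE TABLE archivepermission (person INTEGER, archive INTEGER);
    CREATE TABLE teamparticipation (team INTEGER, person INTEGER);
    -- Grant to team 10 on archive 1; direct grant to person 3.
    INSERT INTO archivepermission VALUES (10, 1), (3, 1);
    -- Person 2 participates in team 10; everyone participates in
    -- themselves, as TeamParticipation does in Launchpad.
    INSERT INTO teamparticipation VALUES (10, 2), (2, 2), (3, 3);
""")

def permitted(person_id, archive_id):
    # The join form used after the change: a direct grant still
    # matches via the person's self-participation row.
    rows = conn.execute("""
        SELECT ap.person
        FROM archivepermission ap, teamparticipation tp
        WHERE ap.person = tp.team AND tp.person = ?
          AND ap.archive = ?
    """, (person_id, archive_id)).fetchall()
    return len(rows) > 0

print(permitted(2, 1))  # True, via team 10
print(permitted(3, 1))  # True, via self-participation
print(permitted(5, 1))  # False
```

Dropping the `OR ap.person = ?` branch also lets the query run without `DISTINCT`, as the diff's removal of `distinct=True` reflects.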
3146 | === modified file 'lib/lp/soyuz/model/queue.py' |
3147 | --- lib/lp/soyuz/model/queue.py 2009-07-17 00:26:05 +0000 |
3148 | +++ lib/lp/soyuz/model/queue.py 2009-07-19 04:41:14 +0000 |
3149 | @@ -57,7 +57,8 @@ |
3150 | PackageUploadStatus, PackageUploadCustomFormat) |
3151 | from lp.registry.interfaces.person import IPersonSet |
3152 | from lp.soyuz.interfaces.publishing import ( |
3153 | - PackagePublishingPocket, PackagePublishingStatus, pocketsuffix) |
3154 | + ISourcePackagePublishingHistory, PackagePublishingPocket, |
3155 | + PackagePublishingStatus, pocketsuffix) |
3156 | from lp.soyuz.interfaces.queue import ( |
3157 | IPackageUpload, IPackageUploadBuild, IPackageUploadCustom, |
3158 | IPackageUploadQueue, IPackageUploadSource, IPackageUploadSet, |
3159 | @@ -368,21 +369,8 @@ |
3160 | assert self.is_delayed_copy, 'Can only process delayed-copies.' |
3161 | assert self.sources.count() == 1, ( |
3162 | 'Source is mandatory for delayed copies.') |
3163 | - |
3164 | self.setAccepted() |
3165 | |
3166 | - # XXX cprov 2009-06-22 bug=390851: self.sourcepackagerelease |
3167 | - # is cached, we cannot rely on it. |
3168 | - sourcepackagerelease = self.sources[0].sourcepackagerelease |
3169 | - |
3170 | - # Close bugs if possible, skip imported sources. |
3171 | - original_changesfile = sourcepackagerelease.upload_changesfile |
3172 | - if original_changesfile is not None: |
3173 | - changesfile_object = StringIO.StringIO( |
3174 | - original_changesfile.read()) |
3175 | - close_bugs_for_queue_item( |
3176 | - self, changesfile_object=changesfile_object) |
3177 | - |
3178 | def rejectFromQueue(self, logger=None, dry_run=False): |
3179 | """See `IPackageUpload`.""" |
3180 | self.setRejected() |
3181 | @@ -538,6 +526,11 @@ |
3182 | for new_file in update_files_privacy(pub_record): |
3183 | debug(logger, |
3184 | "Re-uploaded %s to librarian" % new_file.filename) |
3185 | + if ISourcePackagePublishingHistory.providedBy(pub_record): |
3186 | + pas_verify = BuildDaemonPackagesArchSpecific( |
3187 | + config.builddmaster.root, self.distroseries) |
3188 | + pub_record.createMissingBuilds( |
3189 | + pas_verify=pas_verify, logger=logger) |
3190 | |
3191 | self.setDone() |
3192 | |
3193 | |
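The `queue.py` hunk above uses `ISourcePackagePublishingHistory.providedBy(pub_record)` so that only source publications trigger `createMissingBuilds`. A plain-Python analogue of that dispatch, with invented class names standing in for the zope.interface check:

```python
# Only source publications spawn builds; binary publications are
# published as-is. Launchpad tests this with providedBy() on a
# zope.interface interface; isinstance() plays that role here.

class SourcePublication:
    def create_missing_builds(self):
        # Stand-in for ISourcePackagePublishingHistory.createMissingBuilds.
        return ["build-i386", "build-hppa"]

class BinaryPublication:
    pass

def realise_upload(pub_records):
    created = []
    for record in pub_records:
        if isinstance(record, SourcePublication):
            created.extend(record.create_missing_builds())
    return created
```

The same shape explains why the hunk guards the `BuildDaemonPackagesArchSpecific` setup: it is only meaningful when builds are actually going to be created.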
3194 | === modified file 'lib/lp/soyuz/model/sourcepackagerelease.py' |
3195 | --- lib/lp/soyuz/model/sourcepackagerelease.py 2009-07-17 00:26:05 +0000 |
3196 | +++ lib/lp/soyuz/model/sourcepackagerelease.py 2009-07-20 15:05:42 +0000 |
3197 | @@ -35,7 +35,6 @@ |
3198 | from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities |
3199 | from lp.translations.interfaces.translationimportqueue import ( |
3200 | ITranslationImportQueue) |
3201 | -from canonical.librarian.interfaces import ILibrarianClient |
3202 | from canonical.launchpad.webapp.interfaces import NotFoundError |
3203 | from lp.soyuz.interfaces.archive import ( |
3204 | ArchivePurpose, IArchiveSet, MAIN_ARCHIVE_PURPOSES) |
3205 | @@ -574,12 +573,9 @@ |
3206 | return change |
3207 | |
3208 | def attachTranslationFiles(self, tarball_alias, is_published, |
3209 | - importer=None): |
3210 | + importer=None): |
3211 | """See ISourcePackageRelease.""" |
3212 | - client = getUtility(ILibrarianClient) |
3213 | - |
3214 | - tarball_file = client.getFileByAlias(tarball_alias.id) |
3215 | - tarball = tarball_file.read() |
3216 | + tarball = tarball_alias.read() |
3217 | |
3218 | if importer is None: |
3219 | importer = getUtility(ILaunchpadCelebrities).rosetta_experts |
3220 | |
3221 | === modified file 'lib/lp/soyuz/scripts/packagecopier.py' |
3222 | --- lib/lp/soyuz/scripts/packagecopier.py 2009-07-17 00:26:05 +0000 |
3223 | +++ lib/lp/soyuz/scripts/packagecopier.py 2009-07-19 04:41:14 +0000 |
3224 | @@ -30,11 +30,13 @@ |
3225 | build_package_location) |
3226 | from lp.soyuz.interfaces.archive import ( |
3227 | ArchivePurpose, CannotCopy) |
3228 | -from lp.soyuz.interfaces.build import BuildSetStatus |
3229 | +from lp.soyuz.interfaces.build import ( |
3230 | + BuildStatus, BuildSetStatus) |
3231 | from lp.soyuz.interfaces.publishing import ( |
3232 | IBinaryPackagePublishingHistory, ISourcePackagePublishingHistory, |
3233 | active_publishing_status) |
3234 | -from lp.soyuz.interfaces.queue import IPackageUploadSet |
3235 | +from lp.soyuz.interfaces.queue import ( |
3236 | + IPackageUpload, IPackageUploadSet) |
3237 | from lp.soyuz.scripts.ftpmasterbase import ( |
3238 | SoyuzScript, SoyuzScriptError) |
3239 | from lp.soyuz.scripts.processaccepted import ( |
3240 | @@ -74,8 +76,8 @@ |
3241 | |
3242 | return new_lfa |
3243 | |
3244 | -# XXX cprov 2009-06-12: These two functions could be incorporated in |
3245 | -# ISPPH and BPPH. I just don't see a clear benefit in doing that right now. |
3246 | +# XXX cprov 2009-06-12: this function should be incorporated in |
3247 | +# IPublishing. |
3248 | def update_files_privacy(pub_record): |
3249 | """Update file privacy according to the publishing destination |
3250 | |
3251 | @@ -133,23 +135,43 @@ |
3252 | return re_uploaded_files |
3253 | |
3254 | |
3255 | +# XXX cprov 2009-07-01: should be part of `ISourcePackagePublishingHistory`. |
3256 | +def has_restricted_files(source): |
3257 | + """Whether or not a given source has restricted files.""" |
3258 | + for source_file in source.sourcepackagerelease.files: |
3259 | + if source_file.libraryfile.restricted: |
3260 | + return True |
3261 | + |
3262 | + for binary in source.getBuiltBinaries(): |
3263 | + for binary_file in binary.binarypackagerelease.files: |
3264 | + if binary_file.libraryfile.restricted: |
3265 | + return True |
3266 | + |
3267 | + return False |
3268 | + |
3269 | + |
3270 | class CheckedCopy: |
3271 | """Representation of a copy that was checked and approved. |
3272 | |
3273 | Decorates `ISourcePackagePublishingHistory`, tweaking |
3274 | `getStatusSummaryForBuilds` to return `BuildSetStatus.NEEDSBUILD` |
3275 | for source-only copies. |
3276 | + |
3277 | + It also stores the 'delayed' boolean, which controls the way this source |
3278 | + should be copied to the destination archive (see `_do_delayed_copy` and |
3279 | + `_do_direct_copy`). |
3280 | """ |
3281 | delegates(ISourcePackagePublishingHistory) |
3282 | |
3283 | - def __init__(self, context, include_binaries): |
3284 | + def __init__(self, context, include_binaries, delayed): |
3285 | self.context = context |
3286 | self.include_binaries = include_binaries |
3287 | + self.delayed = delayed |
3288 | |
3289 | def getStatusSummaryForBuilds(self): |
3290 | """Always `BuildSetStatus.NEEDSBUILD` for source-only copies.""" |
3291 | if self.include_binaries: |
3292 | - self.context.getStatusSummaryForBuilds() |
3293 | + return self.context.getStatusSummaryForBuilds() |
3294 | else: |
3295 | return {'status': BuildSetStatus.NEEDSBUILD} |
3296 | |
3297 | @@ -160,9 +182,10 @@ |
3298 | Allows the checker function to identify conflicting copy candidates |
3299 | within the copying batch. |
3300 | """ |
3301 | - def __init__(self, archive, include_binaries): |
3302 | + def __init__(self, archive, include_binaries, allow_delayed_copies=True): |
3303 | self.archive = archive |
3304 | self.include_binaries = include_binaries |
3305 | + self.allow_delayed_copies = allow_delayed_copies |
3306 | self._inventory = {} |
3307 | |
3308 | def _getInventoryKey(self, candidate): |
3309 | @@ -174,11 +197,18 @@ |
3310 | return ( |
3311 | candidate.source_package_name, candidate.source_package_version) |
3312 | |
3313 | - def addCopy(self, source): |
3314 | + def addCopy(self, source, delayed): |
3315 | """Store a copy in the inventory as a `CheckedCopy` instance.""" |
3316 | inventory_key = self._getInventoryKey(source) |
3317 | + checked_copy = CheckedCopy(source, self.include_binaries, delayed) |
3318 | candidates = self._inventory.setdefault(inventory_key, []) |
3319 | - candidates.append(CheckedCopy(source, self.include_binaries)) |
3320 | + candidates.append(checked_copy) |
3321 | + |
3322 | + def getCheckedCopies(self): |
3323 | + """Yield the copies allowed to be performed.""" |
3324 | + for copies in self._inventory.values(): |
3325 | + for copy in copies: |
3326 | + yield copy |
3327 | |
3328 | def getConflicts(self, candidate): |
3329 | """Conflicting `CheckedCopy` objects in the inventory. |
3330 | @@ -356,34 +386,27 @@ |
3331 | "version older than the %s published in %s" % |
3332 | (ancestry.displayname, ancestry.distroseries.name)) |
3333 | |
3334 | - |
3335 | -def check_privacy_mismatch(source, archive): |
3336 | - """Whether or not source files match the archive privacy. |
3337 | - |
3338 | - Public source files can be copied to any archive, it does not |
3339 | - represent a 'privacy mismatch'. |
3340 | - |
3341 | - On the other hand, private source files can be copied to private |
3342 | - archives where builders will fetch it directly from the repository |
3343 | - and not from the restricted librarian. |
3344 | - """ |
3345 | - if archive.private: |
3346 | - return False |
3347 | - |
3348 | - for source_file in source.sourcepackagerelease.files: |
3349 | - if source_file.libraryfile.restricted: |
3350 | - return True |
3351 | - |
3352 | - for binary in source.getBuiltBinaries(): |
3353 | - for binary_file in binary.binarypackagerelease.files: |
3354 | - if binary_file.libraryfile.restricted: |
3355 | - return True |
3356 | - |
3357 | - return False |
3358 | + delayed = ( |
3359 | + self.allow_delayed_copies and |
3360 | + not self.archive.private and |
3361 | + has_restricted_files(source)) |
3362 | + |
3363 | + if delayed: |
3364 | + upload_conflict = getUtility(IPackageUploadSet).findSourceUpload( |
3365 | + name=source.sourcepackagerelease.name, |
3366 | + version=source.sourcepackagerelease.version, |
3367 | + archive=self.archive, distribution=series.distribution) |
3368 | + if upload_conflict is not None: |
3369 | + raise CannotCopy( |
3370 | + 'same version already uploaded and waiting in ' |
3371 | + 'ACCEPTED queue') |
3372 | + |
3373 | + # Copy is approved, update the copy inventory. |
3374 | + self.addCopy(source, delayed) |
3375 | |
3376 | |
3377 | def do_copy(sources, archive, series, pocket, include_binaries=False, |
3378 | - deny_privacy_mismatch=True): |
3379 | + allow_delayed_copies=True): |
3380 | """Perform the complete copy of the given sources incrementally. |
3381 | |
3382 | Verifies if each copy can be performed using `CopyChecker` and |
3383 | @@ -402,9 +425,9 @@ |
3384 | :param: include_binaries: optional boolean, controls whether or |
3385 | not the published binaries for each given source should be also |
3386 | copied along with the source. |
3387 | - :param deny_privacy_mismatch: boolean indicating whether or not private |
3388 | - sources can be copied to public archives. Defaults to True, only |
3389 | - set as False in the UnembargoPackage context. |
3390 | + :param allow_delayed_copies: boolean indicating whether or not private |
3391 | + sources can be copied to public archives using delayed_copies. |
3392 | + Defaults to True, only set as False in the UnembargoPackage context. |
3393 | |
3394 | :raise CannotCopy when one or more copies were not allowed. The error |
3395 | will contain the reason why each copy was denied. |
3396 | @@ -415,42 +438,35 @@ |
3397 | """ |
3398 | copies = [] |
3399 | errors = [] |
3400 | - copy_checker = CopyChecker(archive, include_binaries) |
3401 | + copy_checker = CopyChecker( |
3402 | + archive, include_binaries, allow_delayed_copies) |
3403 | |
3404 | for source in sources: |
3405 | if series is None: |
3406 | destination_series = source.distroseries |
3407 | else: |
3408 | destination_series = series |
3409 | - |
3410 | try: |
3411 | copy_checker.checkCopy(source, destination_series, pocket) |
3412 | except CannotCopy, reason: |
3413 | errors.append("%s (%s)" % (source.displayname, reason)) |
3414 | continue |
3415 | |
3416 | - # For now, deny copies implying in file privacy mismatch. |
3417 | - if (deny_privacy_mismatch and |
3418 | - check_privacy_mismatch(source, archive)): |
3419 | - errors.append( |
3420 | - "%s (cannot copy private files into public archives)" % |
3421 | - source.displayname) |
3422 | - continue |
3423 | - |
3424 | - # Update the copy inventory. |
3425 | - copy_checker.addCopy(source) |
3426 | - |
3427 | if len(errors) != 0: |
3428 | raise CannotCopy("\n".join(errors)) |
3429 | |
3430 | - for source in sources: |
3431 | + for source in copy_checker.getCheckedCopies(): |
3432 | if series is None: |
3433 | destination_series = source.distroseries |
3434 | else: |
3435 | destination_series = series |
3436 | - |
3437 | - sub_copies = _do_direct_copy( |
3438 | - source, archive, destination_series, pocket, include_binaries) |
3439 | + if source.delayed: |
3440 | + delayed_copy = _do_delayed_copy( |
3441 | + source, archive, destination_series, pocket, include_binaries) |
3442 | + sub_copies = [delayed_copy] |
3443 | + else: |
3444 | + sub_copies = _do_direct_copy( |
3445 | + source, archive, destination_series, pocket, include_binaries) |
3446 | |
3447 | copies.extend(sub_copies) |
3448 | |
3449 | @@ -534,6 +550,21 @@ |
3450 | return copies |
3451 | |
3452 | |
3453 | +class DelayedCopy: |
3454 | + """Decorates `IPackageUpload` with a more descriptive 'displayname'.""" |
3455 | + |
3456 | + delegates(IPackageUpload) |
3457 | + |
3458 | + def __init__(self, context): |
3459 | + self.context = context |
3460 | + |
3461 | + @property |
3462 | + def displayname(self): |
3463 | + return 'Delayed copy of %s (%s)' % ( |
3464 | + self.context.sourcepackagerelease.title, |
3465 | + self.context.displayarchs) |
3466 | + |
3467 | + |
3468 | def _do_delayed_copy(source, archive, series, pocket, include_binaries): |
3469 | """Schedule the given source for copy. |
3470 | |
3471 | @@ -571,6 +602,8 @@ |
3472 | # If binaries are included in the copy we include binary custom files. |
3473 | if include_binaries: |
3474 | for build in source.getBuilds(): |
3475 | + if build.buildstate != BuildStatus.FULLYBUILT: |
3476 | + continue |
3477 | delayed_copy.addBuild(build) |
3478 | original_build_upload = build.package_upload |
3479 | for custom in original_build_upload.customfiles: |
3480 | @@ -586,11 +619,7 @@ |
3481 | # the destination context. |
3482 | delayed_copy.acceptFromCopy() |
3483 | |
3484 | - # XXX cprov 2009-06-22 bug=390845: `IPackageUpload.displayname` |
3485 | - # implementation is very poor, if we can't fix in place we should |
3486 | - # build a decorated object implemented a more complete 'displayname' |
3487 | - # property. |
3488 | - return delayed_copy |
3489 | + return DelayedCopy(delayed_copy) |
3490 | |
3491 | |
3492 | class PackageCopier(SoyuzScript): |
3493 | @@ -609,7 +638,7 @@ |
3494 | |
3495 | usage = '%prog -s warty mozilla-firefox --to-suite hoary' |
3496 | description = 'MOVE or COPY a published package to another suite.' |
3497 | - deny_privacy_mismatch = True |
3498 | + allow_delayed_copies = True |
3499 | |
3500 | def add_my_options(self): |
3501 | |
3502 | @@ -707,11 +736,15 @@ |
3503 | copies = do_copy( |
3504 | sources, self.destination.archive, |
3505 | self.destination.distroseries, self.destination.pocket, |
3506 | - self.options.include_binaries, self.deny_privacy_mismatch) |
3507 | + self.options.include_binaries, self.allow_delayed_copies) |
3508 | except CannotCopy, error: |
3509 | self.logger.error(str(error)) |
3510 | return [] |
3511 | |
3512 | + self.logger.info("Copied:") |
3513 | + for copy in copies: |
3514 | + self.logger.info('\t%s' % copy.displayname) |
3515 | + |
3516 | if len(copies) == 1: |
3517 | self.logger.info( |
3518 | "%s package successfully copied." % len(copies)) |
3519 | @@ -772,7 +805,7 @@ |
3520 | description = ("Unembargo packages in a private PPA by copying to the " |
3521 | "specified location and re-uploading any files to the " |
3522 | "unrestricted librarian.") |
3523 | - deny_privacy_mismatch = False |
3524 | + allow_delayed_copies = False |
3525 | |
3526 | def add_my_options(self): |
3527 | """Add -d, -s, dry-run and confirmation options.""" |
3528 | |
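The `packagecopier.py` section introduces `DelayedCopy`, a decorator that delegates everything to the wrapped `IPackageUpload` except a friendlier `displayname`. A minimal stand-in for that pattern, using `__getattr__` fall-through where Launchpad uses lazr's `delegates()` (class and attribute names invented for illustration):

```python
# Delegate every attribute to the wrapped upload, but override
# 'displayname' with a more descriptive rendering, mirroring the
# DelayedCopy decorator added in this branch.

class Upload:
    def __init__(self, title, archs):
        self.title = title
        self.archs = archs
        self.status = "ACCEPTED"

class DelayedCopyView:
    def __init__(self, context):
        self.context = context

    def __getattr__(self, name):
        # Called only for attributes not found on the view itself,
        # so everything undecorated falls through to the upload.
        return getattr(self.context, name)

    @property
    def displayname(self):
        return "Delayed copy of %s (%s)" % (
            self.context.title, self.context.archs)
```

Because the property shadows the delegated lookup, callers such as the copy script's "Copied:" log line get the descriptive name while all other `IPackageUpload` behaviour is untouched.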
3529 | === modified file 'lib/lp/soyuz/scripts/processaccepted.py' |
3530 | --- lib/lp/soyuz/scripts/processaccepted.py 2009-06-25 04:06:00 +0000 |
3531 | +++ lib/lp/soyuz/scripts/processaccepted.py 2009-07-19 04:41:14 +0000 |
3532 | @@ -20,6 +20,8 @@ |
3533 | from lp.soyuz.interfaces.archive import ArchivePurpose |
3534 | from lp.soyuz.interfaces.publishing import PackagePublishingPocket |
3535 | from lp.soyuz.interfaces.queue import IPackageUploadSet |
3536 | + |
3537 | + |
3538 | def get_bugs_from_changes_file(changes_file): |
3539 | """Parse the changes file and return a list of bugs referenced by it. |
3540 | |
3541 | @@ -54,6 +56,7 @@ |
3542 | queue_item = getUtility(IPackageUploadSet).get(queue_id) |
3543 | close_bugs_for_queue_item(queue_item) |
3544 | |
3545 | + |
3546 | def can_close_bugs(target): |
3547 | """Whether or not bugs should be closed in the given target. |
3548 | |
3549 | @@ -74,6 +77,7 @@ |
3550 | |
3551 | return True |
3552 | |
3553 | + |
3554 | def close_bugs_for_queue_item(queue_item, changesfile_object=None): |
3555 | """Close bugs for a given queue item. |
3556 | |
3557 | @@ -96,12 +100,17 @@ |
3558 | return |
3559 | |
3560 | if changesfile_object is None: |
3561 | - changesfile_object = queue_item.changesfile |
3562 | + if queue_item.is_delayed_copy: |
3563 | + sourcepackagerelease = queue_item.sources[0].sourcepackagerelease |
3564 | + changesfile_object = sourcepackagerelease.upload_changesfile |
3565 | + else: |
3566 | + changesfile_object = queue_item.changesfile |
3567 | |
3568 | for source_queue_item in queue_item.sources: |
3569 | close_bugs_for_sourcepackagerelease( |
3570 | source_queue_item.sourcepackagerelease, changesfile_object) |
3571 | |
3572 | + |
3573 | def close_bugs_for_sourcepublication(source_publication): |
3574 | """Close bugs for a given sourcepublication. |
3575 | |
3576 | @@ -121,6 +130,7 @@ |
3577 | close_bugs_for_sourcepackagerelease( |
3578 | sourcepackagerelease, changesfile_object) |
3579 | |
3580 | + |
3581 | def close_bugs_for_sourcepackagerelease(source_release, changesfile_object): |
3582 | """Close bugs for a given source. |
3583 | |
3584 | |
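The `close_bugs_for_queue_item` hunk above selects which changes file to parse: a delayed copy has no changesfile of its own, so it falls back to the upload changesfile of its single source release. A hedged sketch of that selection, modelling queue items as plain dicts rather than Launchpad objects:

```python
# Choose the changes file used for bug closing. Delayed copies carry
# exactly one source, whose original upload changesfile is reused.
# Dict keys here are illustrative stand-ins for IPackageUpload attributes.

def pick_changesfile(queue_item):
    if queue_item.get("is_delayed_copy"):
        # Delayed copies have no changesfile; use the one attached
        # to the copied source release.
        return queue_item["sources"][0]["upload_changesfile"]
    return queue_item["changesfile"]
```

This is also why the old bug-closing call in `acceptFromCopy` could be removed: the fallback now lives at the single point where bugs are closed for any queue item.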
3585 | === modified file 'lib/lp/soyuz/scripts/tests/test_copypackage.py' |
3586 | --- lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-07-17 00:26:05 +0000 |
3587 | +++ lib/lp/soyuz/scripts/tests/test_copypackage.py 2009-07-19 04:41:14 +0000 |
3588 | @@ -28,7 +28,8 @@ |
3589 | from lp.soyuz.adapters.packagelocation import PackageLocationError |
3590 | from lp.soyuz.interfaces.archive import ( |
3591 | ArchivePurpose, CannotCopy) |
3592 | -from lp.soyuz.interfaces.build import BuildStatus |
3593 | +from lp.soyuz.interfaces.build import ( |
3594 | + BuildSetStatus, BuildStatus) |
3595 | from lp.soyuz.interfaces.component import IComponentSet |
3596 | from lp.soyuz.interfaces.publishing import ( |
3597 | IBinaryPackagePublishingHistory, ISourcePackagePublishingHistory, |
3598 | @@ -42,7 +43,7 @@ |
3599 | from lp.soyuz.model.processor import ProcessorFamily |
3600 | from lp.soyuz.scripts.ftpmasterbase import SoyuzScriptError |
3601 | from lp.soyuz.scripts.packagecopier import ( |
3602 | - CopyChecker, _do_delayed_copy, _do_direct_copy, PackageCopier, |
3603 | + CopyChecker, do_copy, _do_delayed_copy, _do_direct_copy, PackageCopier, |
3604 | re_upload_file, UnembargoSecurityPackage, update_files_privacy) |
3605 | from lp.soyuz.tests.test_publishing import SoyuzTestPublisher |
3606 | from lp.testing import ( |
3607 | @@ -387,33 +388,81 @@ |
3608 | class CopyCheckerHarness: |
3609 | """Basic checks common for all scenarios.""" |
3610 | |
3611 | - def assertCanCopySourceOnly(self): |
3612 | - """checkCopy() for source-only copy returns None.""" |
3613 | + def assertCanCopySourceOnly(self, delayed=False): |
3614 | + """Source-only copy is allowed. |
3615 | + |
3616 | + Initialise a `CopyChecker` and assert a `checkCopy` call returns |
3617 | + None (more importantly, doesn't raise `CannotCopy`) in the test |
3618 | + suite context. |
3619 | + |
3620 | + Also assert that: |
3621 | + * One `CheckedCopy` was allowed and stored as such. |
3622 | + * Since it was source-only, the `CheckedCopy` object is in |
3623 | + NEEDSBUILD state. |
3624 | + * Finally, check whether it is a delayed-copy or not according |
3625 | + to the given state. |
3626 | + """ |
3627 | copy_checker = CopyChecker(self.archive, include_binaries=False) |
3628 | self.assertIs( |
3629 | None, |
3630 | copy_checker.checkCopy(self.source, self.series, self.pocket)) |
3631 | - |
3632 | - def assertCanCopyBinaries(self): |
3633 | - """checkCopy() for copy including binaries returns None.""" |
3634 | + checked_copies = list(copy_checker.getCheckedCopies()) |
3635 | + self.assertEquals(1, len(checked_copies)) |
3636 | + [checked_copy] = checked_copies |
3637 | + self.assertEquals( |
3638 | + BuildSetStatus.NEEDSBUILD, |
3639 | + checked_copy.getStatusSummaryForBuilds()['status']) |
3640 | + self.assertEquals(delayed, checked_copy.delayed) |
3641 | + |
3642 | + def assertCanCopyBinaries(self, delayed=False): |
3643 | + """Source and binary copy is allowed. |
3644 | + |
3645 | + Initialise a `CopyChecker` and assert a `checkCopy` call returns |
3646 | + None (more importantly, doesn't raise `CannotCopy`) in the test |
3647 | + suite context. |
3648 | + |
3649 | + Also assert that: |
3650 | + * One `CheckedCopy` was allowed and stored as such. |
3651 | + * The `CheckedCopy` object is in FULLYBUILT_PENDING or FULLYBUILT |
3652 | + status, so there are binaries to be copied. |
3653 | + * Finally, check whether it is a delayed-copy or not according |
3654 | + to the given state. |
3655 | + """ |
3656 | copy_checker = CopyChecker(self.archive, include_binaries=True) |
3657 | self.assertIs( |
3658 | None, |
3659 | copy_checker.checkCopy(self.source, self.series, self.pocket)) |
3660 | + checked_copies = list(copy_checker.getCheckedCopies()) |
3661 | + self.assertEquals(1, len(checked_copies)) |
3662 | + [checked_copy] = checked_copies |
3663 | + self.assertTrue( |
3664 | + checked_copy.getStatusSummaryForBuilds()['status'] >= |
3665 | + BuildSetStatus.FULLYBUILT_PENDING) |
3666 | + self.assertEquals(delayed, checked_copy.delayed) |
3667 | |
3668 | def assertCannotCopySourceOnly(self, msg): |
3669 | - """checkCopy() for source-only copy raises CannotCopy.""" |
3670 | + """`CopyChecker.checkCopy()` for source-only copy raises CannotCopy. |
3671 | + |
3672 | + No `CheckedCopy` is stored. |
3673 | + """ |
3674 | copy_checker = CopyChecker(self.archive, include_binaries=False) |
3675 | self.assertRaisesWithContent( |
3676 | CannotCopy, msg, |
3677 | copy_checker.checkCopy, self.source, self.series, self.pocket) |
3678 | + checked_copies = list(copy_checker.getCheckedCopies()) |
3679 | + self.assertEquals(0, len(checked_copies)) |
3680 | |
3681 | def assertCannotCopyBinaries(self, msg): |
3682 | - """checkCopy() for copy including binaries raises CannotCopy.""" |
3683 | + """`CopyChecker.checkCopy()` including binaries raises CannotCopy. |
3684 | + |
3685 | + No `CheckedCopy` is stored. |
3686 | + """ |
3687 | copy_checker = CopyChecker(self.archive, include_binaries=True) |
3688 | self.assertRaisesWithContent( |
3689 | CannotCopy, msg, |
3690 | copy_checker.checkCopy, self.source, self.series, self.pocket) |
3691 | + checked_copies = list(copy_checker.getCheckedCopies()) |
3692 | + self.assertEquals(0, len(checked_copies)) |
3693 | |
3694 | def test_cannot_copy_binaries_from_building(self): |
3695 | [build] = self.source.createMissingBuilds() |
3696 | @@ -549,6 +598,32 @@ |
3697 | status=PackagePublishingStatus.PUBLISHED) |
3698 | self.assertCanCopySourceOnly() |
3699 | |
3700 | + def switchToAPrivateSource(self): |
3701 | + """Override the probing source with a private one.""" |
3702 | + private_archive = self.factory.makeArchive( |
3703 | + distribution=self.test_publisher.ubuntutest, |
3704 | + purpose=ArchivePurpose.PPA) |
3705 | + private_archive.buildd_secret = 'x' |
3706 | + private_archive.private = True |
3707 | + |
3708 | + self.source = self.test_publisher.getPubSource( |
3709 | + archive=private_archive) |
3710 | + |
3711 | + def test_can_copy_only_source_from_private_archives(self): |
3712 | + # Source-only copies from private archives to public ones |
3713 | + # are allowed and result in a delayed-copy. |
3714 | + self.switchToAPrivateSource() |
3715 | + self.assertCanCopySourceOnly(delayed=True) |
3716 | + |
3717 | + def test_can_copy_binaries_from_private_archives(self): |
3718 | + # Source and binary copies from private archives to public ones |
3719 | + # are allowed and result in a delayed-copy. |
3720 | + self.switchToAPrivateSource() |
3721 | + self.test_publisher.getPubBinaries( |
3722 | + pub_source=self.source, |
3723 | + status=PackagePublishingStatus.PUBLISHED) |
3724 | + self.assertCanCopyBinaries(delayed=True) |
3725 | + |
3726 | |
3727 | class CopyCheckerTestCase(TestCaseWithFactory): |
3728 | |
3729 | @@ -658,7 +733,6 @@ |
3730 | None, |
3731 | copy_checker.checkCopy( |
3732 | source, source.distroseries, source.pocket)) |
3733 | - copy_checker.addCopy(source) |
3734 | |
3735 | # The second source-only copy, for hoary-test, fails, since it |
3736 | # conflicts with the just-approved copy. |
3737 | @@ -669,6 +743,73 @@ |
3738 | copy_checker.checkCopy, |
3739 | copied_source, copied_source.distroseries, copied_source.pocket) |
3740 | |
3741 | + def test_checkCopy_identifies_delayed_copies_conflicts(self): |
3742 | + # checkCopy() detects copy conflicts in the upload queue for |
3743 | + # delayed-copies. This is mostly caused by previous delayed-copies |
3744 | + # that are waiting to be processed. |
3745 | + |
3746 | + # Create a private archive with a restricted source publication. |
3747 | + private_archive = self.factory.makeArchive( |
3748 | + distribution=self.test_publisher.ubuntutest, |
3749 | + purpose=ArchivePurpose.PPA) |
3750 | + private_archive.buildd_secret = 'x' |
3751 | + private_archive.private = True |
3752 | + source = self.test_publisher.getPubSource(archive=private_archive) |
3753 | + |
3754 | + archive = self.test_publisher.ubuntutest.main_archive |
3755 | + series = source.distroseries |
3756 | + pocket = source.pocket |
3757 | + |
3758 | + # Commit so the just-created files are accessible and perform |
3759 | + # the delayed-copy. |
3760 | + self.layer.txn.commit() |
3761 | + do_copy([source], archive, series, pocket, include_binaries=False) |
3762 | + |
3763 | + # Repeating the copy is denied. |
3764 | + copy_checker = CopyChecker(archive, include_binaries=False) |
3765 | + self.assertRaisesWithContent( |
3766 | + CannotCopy, |
3767 | + 'same version already uploaded and waiting in ACCEPTED queue', |
3768 | + copy_checker.checkCopy, source, series, pocket) |
3769 | + |
3770 | + def test_checkCopy_suppressing_delayed_copies(self): |
3771 | + # `CopyChecker` by default will request delayed-copies when it's |
3772 | + # the case (restricted files being copied to public archives). |
3773 | + # However this feature can be turned off, and the operation can |
3774 | + # be performed as a direct-copy by passing 'allow_delayed_copies' |
3775 | + # as False when initialising `CopyChecker`. |
3776 | + # This aspect is currently only used in `UnembargoSecurityPackage` |
3777 | + # script class, because it performs the file privacy fixes in |
3778 | + # place. |
3779 | + |
3780 | + # Create a private archive with a restricted source publication. |
3781 | + private_archive = self.factory.makeArchive( |
3782 | + distribution=self.test_publisher.ubuntutest, |
3783 | + purpose=ArchivePurpose.PPA) |
3784 | + private_archive.buildd_secret = 'x' |
3785 | + private_archive.private = True |
3786 | + source = self.test_publisher.getPubSource(archive=private_archive) |
3787 | + |
3788 | + archive = self.test_publisher.ubuntutest.main_archive |
3789 | + series = source.distroseries |
3790 | + pocket = source.pocket |
3791 | + |
3792 | + # Normally `CopyChecker` would store a delayed-copy representing |
3793 | + # this operation, since restricted files are being copied to |
3794 | + # public archives. |
3795 | + copy_checker = CopyChecker(archive, include_binaries=False) |
3796 | + copy_checker.checkCopy(source, series, pocket) |
3797 | + [checked_copy] = list(copy_checker.getCheckedCopies()) |
3798 | + self.assertTrue(checked_copy.delayed) |
3799 | + |
3800 | + # When 'allow_delayed_copies' is off, a direct-copy will be |
3801 | + # scheduled. |
3802 | + copy_checker = CopyChecker( |
3803 | + archive, include_binaries=False, allow_delayed_copies=False) |
3804 | + copy_checker.checkCopy(source, series, pocket) |
3805 | + [checked_copy] = list(copy_checker.getCheckedCopies()) |
3806 | + self.assertFalse(checked_copy.delayed) |
3807 | + |
3808 | |
3809 | class DoDirectCopyTestCase(TestCaseWithFactory): |
3810 | |
3811 | @@ -722,6 +863,7 @@ |
3812 | class DoDelayedCopyTestCase(TestCaseWithFactory): |
3813 | |
3814 | layer = LaunchpadZopelessLayer |
3815 | + dbuser = config.archivepublisher.dbuser |
3816 | |
3817 | def setUp(self): |
3818 | super(DoDelayedCopyTestCase, self).setUp() |
3819 | @@ -763,22 +905,32 @@ |
3820 | self.test_publisher.breezy_autotest.status = ( |
3821 | DistroSeriesStatus.CURRENT) |
3822 | |
3823 | + # Setup and execute the delayed copy procedure. |
3824 | + copy_archive = self.test_publisher.ubuntutest.main_archive |
3825 | + copy_series = source.distroseries |
3826 | + copy_pocket = PackagePublishingPocket.SECURITY |
3827 | + |
3828 | # Commit to make the just-created library files available. |
3829 | self.layer.txn.commit() |
3830 | - |
3831 | - # Setup and execute the delayed copy procedure. |
3832 | - copy_archive = self.test_publisher.ubuntutest.main_archive |
3833 | - copy_series = source.distroseries |
3834 | - copy_pocket = PackagePublishingPocket.SECURITY |
3835 | + self.layer.switchDbUser(self.dbuser) |
3836 | |
3837 | delayed_copy = _do_delayed_copy( |
3838 | source, copy_archive, copy_series, copy_pocket, True) |
3839 | |
3840 | + self.layer.txn.commit() |
3841 | + self.layer.switchDbUser('launchpad') |
3842 | + |
3843 | # A delayed-copy `IPackageUpload` record is returned. |
3844 | self.assertTrue(delayed_copy.is_delayed_copy) |
3845 | self.assertEquals( |
3846 | PackageUploadStatus.ACCEPTED, delayed_copy.status) |
3847 | |
3848 | + # The returned object has a more descriptive 'displayname' |
3849 | + # attribute than plain `IPackageUpload` instances. |
3850 | + self.assertEquals( |
3851 | + 'Delayed copy of foo - 666 (source, i386, raw-dist-upgrader)', |
3852 | + delayed_copy.displayname) |
3853 | + |
3854 | # It is targeted to the right publishing context. |
3855 | self.assertEquals(copy_archive, delayed_copy.archive) |
3856 | self.assertEquals(copy_series, delayed_copy.distroseries) |
3857 | @@ -801,6 +953,68 @@ |
3858 | [custom_file], |
3859 | [custom.libraryfilealias for custom in delayed_copy.customfiles]) |
3860 | |
3861 | + def createPartiallyBuiltDelayedCopyContext(self): |
3862 | + """Allow tests on delayed-copies of partially built sources. |
3863 | + |
3864 | + Create an architecture-specific source publication in a private PPA |
3865 | + capable of building for i386 and hppa architectures. |
3866 | + |
3867 | + Upload and publish only the i386 binary, letting the hppa build |
3868 | + in pending status. |
3869 | + """ |
3870 | + self.test_publisher.prepareBreezyAutotest() |
3871 | + |
3872 | + ppa = self.factory.makeArchive( |
3873 | + distribution=self.test_publisher.ubuntutest, |
3874 | + purpose=ArchivePurpose.PPA) |
3875 | + ppa.buildd_secret = 'x' |
3876 | + ppa.private = True |
3877 | + ppa.require_virtualized = False |
3878 | + |
3879 | + source = self.test_publisher.getPubSource( |
3880 | + archive=ppa, architecturehintlist='any') |
3881 | + |
3882 | + [build_hppa, build_i386] = source.createMissingBuilds() |
3883 | + lazy_bin = self.test_publisher.uploadBinaryForBuild( |
3884 | + build_i386, 'lazy-bin') |
3885 | + self.test_publisher.publishBinaryInArchive(lazy_bin, source.archive) |
3886 | + changes_file_name = '%s_%s_%s.changes' % ( |
3887 | + lazy_bin.name, lazy_bin.version, build_i386.arch_tag) |
3888 | + package_upload = self.test_publisher.addPackageUpload( |
3889 | + ppa, build_i386.distroarchseries.distroseries, |
3890 | + build_i386.pocket, changes_file_content='anything', |
3891 | + changes_file_name=changes_file_name) |
3892 | + package_upload.addBuild(build_i386) |
3893 | + |
3894 | + return source |
3895 | + |
3896 | + def test_do_delayed_copy_of_partially_built_sources(self): |
3897 | + # delayed-copies of partially built sources are allowed and only |
3898 | + # the FULLYBUILT builds are copied. |
3899 | + source = self.createPartiallyBuiltDelayedCopyContext() |
3900 | + |
3901 | + # Setup and execute the delayed copy procedure. |
3902 | + copy_archive = self.test_publisher.ubuntutest.main_archive |
3903 | + copy_series = source.distroseries |
3904 | + copy_pocket = PackagePublishingPocket.RELEASE |
3905 | + |
3906 | + # Make new libraryfiles available by committing the transaction. |
3907 | + self.layer.txn.commit() |
3908 | + |
3909 | + # Perform the delayed-copy including binaries. |
3910 | + delayed_copy = _do_delayed_copy( |
3911 | + source, copy_archive, copy_series, copy_pocket, True) |
3912 | + |
3913 | + # Only the i386 build is included in the delayed-copy. |
3914 | + # For the record, later on, when the delayed-copy gets processed, |
3915 | + # a new hppa build record will be created in the destination |
3916 | + # archive context. Also after this point, the same delayed-copy |
3917 | + # request will be denied by `CopyChecker`. |
3918 | + [build_hppa, build_i386] = source.getBuilds() |
3919 | + self.assertEquals( |
3920 | + [build_i386], |
3921 | + [pub.build for pub in delayed_copy.builds]) |
3922 | + |
3923 | |
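The rule exercised by the test above — only FULLYBUILT builds travel with a delayed copy, while the pending hppa build stays behind — can be sketched as a small standalone model. All names here are illustrative; this is not the real `_do_delayed_copy` API:

```python
from enum import Enum

class BuildStatus(Enum):
    NEEDSBUILD = "NEEDSBUILD"
    FULLYBUILT = "FULLYBUILT"

def builds_for_delayed_copy(builds):
    """Keep only finished builds, mirroring what the test asserts:
    the pending hppa build is excluded while i386 is copied."""
    return [arch for arch, status in builds
            if status is BuildStatus.FULLYBUILT]

builds = [("hppa", BuildStatus.NEEDSBUILD),
          ("i386", BuildStatus.FULLYBUILT)]
print(builds_for_delayed_copy(builds))  # ['i386']
```

When the delayed copy is later processed, a new hppa build record is created in the destination archive, which this toy model deliberately leaves out.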
3924 | class CopyPackageScriptTestCase(unittest.TestCase): |
3925 | """Test the copy-package.py script.""" |
3926 | @@ -1772,15 +1986,7 @@ |
3927 | copy_helper.mainTask) |
3928 | |
3929 | def testCopyFromPrivateToPublicPPAs(self): |
3930 | - """Check if copying private sources into public archives is denied. |
3931 | - |
3932 | - Private source files can only be published in private archives, |
3933 | - because builders do not have access to the restricted librarian. |
3934 | - |
3935 | - Builders only fetch the sources files from the repository itself |
3936 | - for private PPAs. If we copy a restricted file into a public PPA |
3937 | - builders will not be able to fetch it. |
3938 | - """ |
3939 | + """Copies from private to public archives are allowed.""" |
3940 | # Set up a private PPA. |
3941 | cprov = getUtility(IPersonSet).getByName("cprov") |
3942 | cprov.archive.buildd_secret = "secret" |
3943 | @@ -1794,6 +2000,7 @@ |
3944 | archive=cprov.archive, version='1.0', distroseries=hoary) |
3945 | ppa_binaries = test_publisher.getPubBinaries( |
3946 | pub_source=ppa_source, distroseries=hoary) |
3947 | + self.layer.txn.commit() |
3948 | |
3949 | # Run the copy package script storing the logged information. |
3950 | copy_helper = self.getCopier( |
3951 | @@ -1801,12 +2008,20 @@ |
3952 | from_suite='hoary', to_suite='hoary') |
3953 | copied = copy_helper.mainTask() |
3954 | |
3955 | - # Nothing was copied and an error message was printed explaining why. |
3956 | - self.assertEqual(len(copied), 0) |
3957 | + # The private files are copied via a delayed-copy request. |
3958 | + self.assertEqual(len(copied), 1) |
3959 | self.assertEqual( |
3960 | - copy_helper.logger.buffer.getvalue().splitlines()[-1], |
3961 | - 'ERROR: foo 1.0 in hoary ' |
3962 | - '(cannot copy private files into public archives)') |
3963 | + ['INFO: FROM: cprov: hoary-RELEASE', |
3964 | + 'INFO: TO: Primary Archive for Ubuntu Linux: hoary-RELEASE', |
3965 | + 'INFO: Copy candidates:', |
3966 | + 'INFO: \tfoo 1.0 in hoary', |
3967 | + 'INFO: \tfoo-bin 1.0 in hoary hppa', |
3968 | + 'INFO: \tfoo-bin 1.0 in hoary i386', |
3969 | + 'INFO: Copied:', |
3970 | + 'INFO: \tDelayed copy of foo - 1.0 (source, i386)', |
3971 | + 'INFO: 1 package successfully copied.', |
3972 | + ], |
3973 | + copy_helper.logger.buffer.getvalue().splitlines()) |
3974 | |
3975 | def testUnembargoing(self): |
3976 | """Test UnembargoSecurityPackage, which wraps PackagerCopier.""" |
3977 | |
3978 | === added file 'lib/lp/soyuz/stories/soyuz/xx-queue-pages-delayed-copies.txt' |
3979 | --- lib/lp/soyuz/stories/soyuz/xx-queue-pages-delayed-copies.txt 1970-01-01 00:00:00 +0000 |
3980 | +++ lib/lp/soyuz/stories/soyuz/xx-queue-pages-delayed-copies.txt 2009-07-16 18:07:13 +0000 |
3981 | @@ -0,0 +1,128 @@ |
3982 | +Displaying delayed-copies |
3983 | +========================= |
3984 | + |
3985 | +Delayed copies can be browsed in the UI as if they were normal uploads. |
3986 | + |
3987 | +We will create a testing delayed-copy for Ubuntu/breezy-autotest. |
3988 | + |
3989 | + # Create a delayed-copy in ubuntu/breezy-autotest. |
3990 | + >>> from zope.component import getUtility |
3991 | + >>> from lp.registry.interfaces.distribution import IDistributionSet |
3992 | + >>> from lp.registry.interfaces.person import IPersonSet |
3993 | + >>> from lp.soyuz.interfaces.publishing import PackagePublishingPocket |
3994 | + >>> from lp.soyuz.interfaces.queue import IPackageUploadSet |
3995 | + >>> from lp.soyuz.tests.test_publishing import SoyuzTestPublisher |
3996 | + >>> login('foo.bar@canonical.com') |
3997 | + >>> ubuntu = getUtility(IDistributionSet).getByName('ubuntu') |
3998 | + >>> cprov = getUtility(IPersonSet).getByName('cprov') |
3999 | + >>> cprov.archive.buildd_secret = 'x' |
4000 | + >>> cprov.archive.private = True |
4001 | + >>> cprov.archive.require_virtualized = False |
4002 | + >>> stp = SoyuzTestPublisher() |
4003 | + >>> stp.prepareBreezyAutotest() |
4004 | + >>> [bin_hppa, bin_i386] = stp.getPubBinaries(archive=cprov.archive) |
4005 | + >>> build = bin_hppa.binarypackagerelease.build |
4006 | + >>> breezy_autotest = ubuntu.getSeries('breezy-autotest') |
4007 | + >>> delayed_copy = getUtility(IPackageUploadSet).createDelayedCopy( |
4008 | + ... archive=ubuntu.main_archive, distroseries=breezy_autotest, |
4009 | + ... pocket=PackagePublishingPocket.RELEASE, signing_key=None) |
4010 | + >>> unused = delayed_copy.addSource(build.sourcepackagerelease) |
4011 | + >>> unused = delayed_copy.addBuild(build) |
4012 | + >>> transaction.commit() |
4013 | + >>> delayed_copy.acceptFromCopy() |
4014 | + >>> logout() |
4015 | + |
4016 | +Any user accessing the breezy-autotest ACCEPTED queue will notice the |
4017 | +delayed-copy. They show up in the ACCEPTED distro series queue while |
4018 | +they are pending processing. |
4019 | + |
4020 | + >>> anon_browser.open( |
4021 | + ... "http://launchpad.dev/ubuntu/breezy-autotest/+queue") |
4022 | + >>> anon_browser.getControl( |
4023 | + ... name="queue_state", index=0).displayValue = ['Accepted'] |
4024 | + >>> anon_browser.getControl("Update").click() |
4025 | + |
4026 | +It is listed as a normal upload; however, there is no link to the |
4027 | +'changesfile'. |
4028 | + |
4029 | + >>> for row in find_tags_by_class(anon_browser.contents, "queue-row"): |
4030 | + ... print extract_text(row) |
4031 | + Package Version Component Section Priority Pocket When |
4032 | + foo, foo (source, i386) 666 main base low Release ... |
4033 | + |
4034 | + >>> anon_browser.getLink('foo, foo') |
4035 | + Traceback (most recent call last): |
4036 | + ... |
4037 | + LinkNotFoundError |
4038 | + |
4039 | +On the corresponding expandable area, below the row, there is no file |
4040 | +information, since the delayed-copy is still pending processing. A |
4041 | +user can simply view where the delayed copy came from. |
4042 | + |
4043 | + >>> print extract_text( |
4044 | + ... first_tag_by_class(anon_browser.contents, |
4045 | + ... 'queue-%s' % delayed_copy.id)) |
4046 | + Copied from PPA for Celso Providelo |
4047 | + |
4048 | +The delayed-copy source archive is not linked, since the requester has |
4049 | +no permission to view it. |
4050 | + |
4051 | + >>> anon_browser.getLink('PPA for Celso Providelo') |
4052 | + Traceback (most recent call last): |
4053 | + ... |
4054 | + LinkNotFoundError |
4055 | + |
4056 | +While the delayed-copy is still in the ACCEPTED state, i.e. not yet |
4057 | +processed, authenticated users with permission to view the archive from |
4058 | +which the delayed-copy was issued can additionally access a link to its |
4059 | +original archive, and nothing else. |
4060 | + |
4061 | + >>> cprov_browser = setupBrowser( |
4062 | + ... auth="Basic celso.providelo@canonical.com:cprov") |
4063 | + >>> cprov_browser.open(anon_browser.url) |
4064 | + |
4065 | + >>> for row in find_tags_by_class(cprov_browser.contents, "queue-row"): |
4066 | + ... print extract_text(row) |
4067 | + Package Version Component Section Priority Pocket When |
4068 | + foo, foo (source, i386) 666 main base low Release ... |
4069 | + |
4070 | + >>> anon_browser.getLink('foo, foo') |
4071 | + Traceback (most recent call last): |
4072 | + ... |
4073 | + LinkNotFoundError |
4074 | + |
4075 | + >>> print extract_text( |
4076 | + ... first_tag_by_class(cprov_browser.contents, |
4077 | + ... 'queue-%s' % delayed_copy.id)) |
4078 | + Copied from PPA for Celso Providelo |
4079 | + |
4080 | + >>> print cprov_browser.getLink('PPA for Celso Providelo').url |
4081 | + http://launchpad.dev/~cprov/+archive/ppa |
4082 | + |
4083 | +When the delayed-copy is processed (moved to the DONE queue), its |
4084 | +contents become available to everyone. |
4085 | + |
4086 | + # Process the delayed-copy. |
4087 | + >>> login('foo.bar@canonical.com') |
4088 | + >>> stp.addFakeChroots(breezy_autotest) |
4089 | + >>> unused = delayed_copy.realiseUpload() |
4090 | + >>> transaction.commit() |
4091 | + >>> logout() |
4092 | + |
4093 | +Any user can access the DONE queue, follow the delayed-copy |
4094 | +'changesfile' link, and view its files in the expandable area. |
4095 | + |
4096 | + >>> anon_browser.getControl( |
4097 | + ... name="queue_state", index=0).displayValue = ['Done'] |
4098 | + >>> anon_browser.getControl("Update").click() |
4099 | + |
4100 | + >>> print anon_browser.getLink('foo, foo').url |
4101 | + http://localhost:58000/.../foo_666_source.changes |
4102 | + |
4103 | + >>> extra_information = find_tags_by_class( |
4104 | + ... anon_browser.contents, 'queue-%s' % delayed_copy.id) |
4105 | + >>> for info in extra_information: |
4106 | + ... print extract_text(info) |
4107 | + foo_666.dsc (28 bytes) |
4108 | + foo-bin_666_all.deb (18 bytes) 666 main base standard |
4109 | + |
4110 | |
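The permission-dependent "Copied from" cell shown in this story can be modelled as a plain function. This is a sketch of the assumed behaviour, not the actual TAL rendering:

```python
def render_copied_from(archive_name, archive_url, viewer_can_view):
    """Link the source archive only when the viewer may see it."""
    if viewer_can_view:
        return 'Copied from <a href="%s">%s</a>' % (
            archive_url, archive_name)
    return "Copied from %s" % archive_name

# Anonymous users see plain text; Celso sees a link to his own PPA.
print(render_copied_from(
    "PPA for Celso Providelo",
    "http://launchpad.dev/~cprov/+archive/ppa", False))
```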
4111 | === modified file 'lib/lp/soyuz/stories/soyuz/xx-queue-pages.txt' |
4112 | --- lib/lp/soyuz/stories/soyuz/xx-queue-pages.txt 2009-07-01 13:16:44 +0000 |
4113 | +++ lib/lp/soyuz/stories/soyuz/xx-queue-pages.txt 2009-07-16 00:31:36 +0000 |
4114 | @@ -135,15 +135,15 @@ |
4115 | >>> anon_browser.getControl(name="queue_text").value = '' |
4116 | >>> anon_browser.getControl("Update").click() |
4117 | |
4118 | - >>> print find_tag_by_id(anon_browser.contents, 'queue-alsa-utils-4-arrow') |
4119 | + >>> print find_tag_by_id(anon_browser.contents, 'queue-4-arrow') |
4120 | <img width="14" height="14" src="/@@/treeCollapsed" alt="view files" |
4121 | - id="queue-alsa-utils-4-arrow" /> |
4122 | + id="queue-4-arrow" /> |
4123 | |
4124 | The 'filelist' is expanded as one or more table rows, right below the |
4125 | clicked item: |
4126 | |
4127 | >>> filelist = find_tags_by_class( |
4128 | - ... anon_browser.contents, 'queue-alsa-utils-4') |
4129 | + ... anon_browser.contents, 'queue-4') |
4130 | |
4131 | It contains a list of files related to the queue item clicked, followed |
4132 | by its size, one file per line: |
4133 | @@ -164,7 +164,7 @@ |
4134 | candidates). The binary items will also individually show their |
4135 | version, component, section and priority. |
4136 | |
4137 | - >>> [filelist] = find_tags_by_class(anon_browser.contents, 'queue-pmount-2') |
4138 | + >>> [filelist] = find_tags_by_class(anon_browser.contents, 'queue-2') |
4139 | >>> print extract_text(filelist) |
4140 | pmount_1.0-1_all.deb (18 bytes) NEW 0.1-1 main base important |
4141 | |
4142 | @@ -409,7 +409,7 @@ |
4143 | values: |
4144 | |
4145 | >>> filelist = find_tags_by_class( |
4146 | - ... anon_browser.contents, 'queue-pmount-2') |
4147 | + ... anon_browser.contents, 'queue-2') |
4148 | >>> for row in filelist: |
4149 | ... print extract_text(row) |
4150 | pmount_1.0-1_all.deb (18 bytes) NEW 0.1-1 restricted admin extra |
4151 | |
4152 | === modified file 'lib/lp/soyuz/stories/webservice/xx-archive.txt' |
4153 | --- lib/lp/soyuz/stories/webservice/xx-archive.txt 2009-07-17 18:25:27 +0000 |
4154 | +++ lib/lp/soyuz/stories/webservice/xx-archive.txt 2009-07-18 11:51:59 +0000 |
4155 | @@ -834,33 +834,47 @@ |
4156 | <BLANKLINE> |
4157 | |
4158 | |
4159 | -=== Copy privacy mismatch === |
4160 | - |
4161 | -A CannotCopy error, giving the reason "Cannot copy private source into |
4162 | -public archives." is raised if such operation is requested. |
4163 | - |
4164 | -When we try to copy the private source to the primary archive, which |
4165 | -is public, the 'privacy mismatch' error is raised. The behaviour is |
4166 | -the same for `syncSource` or `syncSources` operations. |
4167 | +=== Copying private file to public archives === |
4168 | + |
4169 | +Copying private sources to public archives works fine with |
4170 | +`syncSource` or `syncSources` operations. |
4171 | + |
4172 | +We use `syncSource` to copy 'private - 1.0' source from Celso's |
4173 | +private PPA to the ubuntu primary archive. |
4174 | |
4175 | >>> print cprov_webservice.named_post( |
4176 | ... ubuntu['main_archive_link'], 'syncSource', {}, |
4177 | - ... source_name='private', version="1.0", to_pocket='release', |
4178 | + ... source_name='private', version='1.0', to_pocket='release', |
4179 | ... from_archive=cprov_archive['self_link'], |
4180 | ... to_series="hoary") |
4181 | - HTTP/1.1 400 Bad Request |
4182 | + HTTP/1.1 200 Ok |
4183 | ... |
4184 | - CannotCopy: private 1.0 in hoary |
4185 | - (cannot copy private files into public archives) |
4186 | - <BLANKLINE> |
4187 | + |
4188 | +In the same way, we can use 'syncSources' to sync a subsequent |
4189 | +version. |
4190 | + |
4191 | + >>> login('foo.bar@canonical.com') |
4192 | + >>> unused = test_publisher.getPubSource( |
4193 | + ... sourcename="private", version="1.1", archive=cprov.archive) |
4194 | + >>> logout() |
4195 | |
4196 | >>> print cprov_webservice.named_post( |
4197 | ... ubuntu['main_archive_link'], 'syncSources', {}, |
4198 | ... source_names=['private'], to_pocket='release', |
4199 | ... from_archive=cprov_archive['self_link'], |
4200 | ... to_series="hoary") |
4201 | + HTTP/1.1 200 Ok |
4202 | + ... |
4203 | + |
4204 | +However, if we try to copy an old version by repeating the copy, an |
4205 | +error is returned. |
4206 | + |
4207 | + >>> print cprov_webservice.named_post( |
4208 | + ... ubuntu['main_archive_link'], 'syncSource', {}, |
4209 | + ... source_name='private', version='1.1', to_pocket='release', |
4210 | + ... from_archive=cprov_archive['self_link'], |
4211 | + ... to_series="hoary") |
4212 | HTTP/1.1 400 Bad Request |
4213 | ... |
4214 | - CannotCopy: private 1.0 in hoary |
4215 | - (cannot copy private files into public archives) |
4216 | - <BLANKLINE> |
4217 | + CannotCopy: private 1.1 in hoary |
4218 | + (same version already uploaded and waiting in ACCEPTED queue) |
4219 | |
4220 | === modified file 'lib/lp/soyuz/templates/distroseries-queue.pt' |
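The refusal shown above — a repeated copy of a version already waiting in the ACCEPTED queue — can be sketched in isolation. This is illustrative only, not Launchpad's real `CopyChecker`:

```python
class CannotCopy(Exception):
    pass

def check_copy(name, version, series, accepted_queue):
    """Refuse a copy when the same version is already queued."""
    if (name, version) in accepted_queue:
        raise CannotCopy(
            "%s %s in %s (same version already uploaded and waiting "
            "in ACCEPTED queue)" % (name, version, series))

accepted = {("private", "1.1")}
check_copy("private", "1.2", "hoary", accepted)  # a newer version passes
try:
    check_copy("private", "1.1", "hoary", accepted)
except CannotCopy as err:
    print(err)
```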
4221 | --- lib/lp/soyuz/templates/distroseries-queue.pt 2009-07-17 17:59:07 +0000 |
4222 | +++ lib/lp/soyuz/templates/distroseries-queue.pt 2009-07-19 04:41:14 +0000 |
4223 | @@ -72,8 +72,7 @@ |
4224 | </thead> |
4225 | <tbody class="lesser"> |
4226 | <tal:batch repeat="packageupload batch"> |
4227 | - <tal:block |
4228 | - define="filelist_class string:queue-${packageupload/displayname}-${packageupload/id}"> |
4229 | + <tal:block define="filelist_class string:queue-${packageupload/id}"> |
4230 | <tr class="queue-row"> |
4231 | <tal:comment condition="nothing"> |
4232 | Every column is top-padded apart from the checkbox |
4233 | @@ -203,6 +202,28 @@ |
4234 | :packageupload: A PackageUpload record for which we display files. |
4235 | </tal:comment> |
4236 | |
4237 | + <tal:copy condition="packageupload/pending_delayed_copy"> |
4238 | + <tr tal:attributes="class string:${filelist_class}" |
4239 | + tal:define="archive |
4240 | + packageupload/sourcepackagerelease/upload_archive" |
4241 | + style="display:none"> |
4242 | + <td /> |
4243 | + <td tal:condition="view/availableActions" /> |
4244 | + <td>Copied from |
4245 | + <tal:linked condition="archive/required:launchpad.View"> |
4246 | + <a tal:attributes="href archive/fmt:url" |
4247 | + tal:content="archive/displayname" /> |
4248 | + </tal:linked> |
4249 | + <tal:not_linked |
4250 | + condition="not: archive/required:launchpad.View" |
4251 | + replace="archive/displayname"> |
4252 | + </tal:not_linked> |
4253 | + </td> |
4254 | + <td colspan="6" /> |
4255 | + </tr> |
4256 | + </tal:copy> |
4257 | + |
4258 | + <tal:upload condition="not: packageupload/pending_delayed_copy"> |
4259 | <tr tal:repeat="file packageupload/source_files" |
4260 | tal:attributes="class string:${filelist_class}" |
4261 | style="display:none"> |
4262 | @@ -230,7 +251,7 @@ |
4263 | <td tal:condition="view/availableActions"/> |
4264 | <td> |
4265 | <a tal:attributes="href file/libraryfile/http_url"> |
4266 | - <tal:filename content="file/libraryfile/filename"/> |
4267 | + <tal:filename replace="file/libraryfile/filename"/> |
4268 | </a> |
4269 | (<tal:size replace="file/libraryfile/content/filesize/fmt:bytes" />) |
4270 | <span style="color: red" tal:condition="is_new">NEW</span> |
4271 | @@ -260,6 +281,7 @@ |
4272 | <td colspan="6"/> |
4273 | </tr> |
4274 | </tal:custom> |
4275 | + </tal:upload> |
4276 | |
4277 | </metal:macro> |
4278 | |
4279 | @@ -287,12 +309,16 @@ |
4280 | alt="[Debian Description Translation Project Indexes]" |
4281 | src="/@@/ubuntu-icon" |
4282 | title="Debian Description Translation Project Indexes"/> |
4283 | - <a tal:attributes="href packageupload/changesfile/http_url; |
4284 | - title string:Changes file for ${packageupload/displayname}"> |
4285 | - <tal:name replace="string: ${packageupload/displayname}"/> |
4286 | + <a tal:condition="not: packageupload/pending_delayed_copy" |
4287 | + tal:content="packageupload/displayname" |
4288 | + tal:attributes=" |
4289 | + href packageupload/changesfile/http_url; |
4290 | + title string:Changes file for ${packageupload/displayname};"> |
4291 | </a> |
4292 | - <tal:version replace=" |
4293 | - string: (${packageupload/displayarchs})"/> |
4294 | + <tal:pending_delayed_copy_title |
4295 | + condition="packageupload/pending_delayed_copy" |
4296 | + replace="packageupload/displayname" /> |
4297 | + <tal:arches replace="string: (${packageupload/displayarchs})"/> |
4298 | </div> |
4299 | </metal:macro> |
4300 | |
4301 | |
4302 | === modified file 'lib/lp/soyuz/tests/test_packageupload.py' |
4303 | --- lib/lp/soyuz/tests/test_packageupload.py 2009-07-17 02:25:09 +0000 |
4304 | +++ lib/lp/soyuz/tests/test_packageupload.py 2009-07-19 04:41:14 +0000 |
4305 | @@ -16,6 +16,7 @@ |
4306 | from lp.registry.interfaces.distribution import IDistributionSet |
4307 | from lp.registry.interfaces.distroseries import DistroSeriesStatus |
4308 | from lp.soyuz.interfaces.archive import ArchivePurpose |
4309 | +from lp.soyuz.interfaces.build import BuildStatus |
4310 | from lp.soyuz.interfaces.publishing import ( |
4311 | PackagePublishingPocket, PackagePublishingStatus) |
4312 | from lp.soyuz.interfaces.queue import ( |
4313 | @@ -24,13 +25,13 @@ |
4314 | from lp.testing import TestCaseWithFactory |
4315 | |
4316 | |
4317 | -class TestPackageUpload(TestCaseWithFactory): |
4318 | +class PackageUploadTestCase(TestCaseWithFactory): |
4319 | |
4320 | layer = LaunchpadZopelessLayer |
4321 | dbuser = config.uploadqueue.dbuser |
4322 | |
4323 | def setUp(self): |
4324 | - super(TestPackageUpload, self).setUp() |
4325 | + super(PackageUploadTestCase, self).setUp() |
4326 | self.test_publisher = SoyuzTestPublisher() |
4327 | |
4328 | def createEmptyDelayedCopy(self): |
4329 | @@ -65,11 +66,15 @@ |
4330 | 'Source is mandatory for delayed copies.', |
4331 | delayed_copy.acceptFromCopy) |
4332 | |
4333 | - def createDelayedCopy(self): |
4334 | + def createDelayedCopy(self, source_only=False): |
4335 | """Return a delayed-copy targeted to ubuntutest/breezy-autotest. |
4336 | |
4337 | - The delayed-copy is target to the SECURITY pocket with: |
4338 | + The delayed-copy is targeted to the SECURITY pocket with: |
4339 | + |
4340 | * source foo - 1.1 |
4341 | + |
4342 | + And, if 'source_only' is False (the default behavior), also attach: |
4343 | + |
4344 | * binaries foo - 1.1 in i386 and hppa |
4345 | * a DIST_UPGRADER custom file |
4346 | |
4347 | @@ -83,16 +88,6 @@ |
4348 | ppa.private = True |
4349 | |
4350 | source = self.test_publisher.getPubSource(archive=ppa, version='1.1') |
4351 | - self.test_publisher.getPubBinaries(pub_source=source) |
4352 | - custom_path = datadir( |
4353 | - 'dist-upgrader/dist-upgrader_20060302.0120_all.tar.gz') |
4354 | - custom_file = self.factory.makeLibraryFileAlias( |
4355 | - filename='dist-upgrader_20060302.0120_all.tar.gz', |
4356 | - content=open(custom_path).read(), restricted=True) |
4357 | - [build] = source.getBuilds() |
4358 | - build.package_upload.addCustom( |
4359 | - custom_file, PackageUploadCustomFormat.DIST_UPGRADER) |
4360 | - |
4361 | delayed_copy = getUtility(IPackageUploadSet).createDelayedCopy( |
4362 | self.test_publisher.ubuntutest.main_archive, |
4363 | self.test_publisher.breezy_autotest, |
4364 | @@ -100,11 +95,21 @@ |
4365 | self.test_publisher.person.gpgkeys[0]) |
4366 | |
4367 | delayed_copy.addSource(source.sourcepackagerelease) |
4368 | - for build in source.getBuilds(): |
4369 | - delayed_copy.addBuild(build) |
4370 | - for custom in build.package_upload.customfiles: |
4371 | - delayed_copy.addCustom( |
4372 | - custom.libraryfilealias, custom.customformat) |
4373 | + if not source_only: |
4374 | + self.test_publisher.getPubBinaries(pub_source=source) |
4375 | + custom_path = datadir( |
4376 | + 'dist-upgrader/dist-upgrader_20060302.0120_all.tar.gz') |
4377 | + custom_file = self.factory.makeLibraryFileAlias( |
4378 | + filename='dist-upgrader_20060302.0120_all.tar.gz', |
4379 | + content=open(custom_path).read(), restricted=True) |
4380 | + [build] = source.getBuilds() |
4381 | + build.package_upload.addCustom( |
4382 | + custom_file, PackageUploadCustomFormat.DIST_UPGRADER) |
4383 | + for build in source.getBuilds(): |
4384 | + delayed_copy.addBuild(build) |
4385 | + for custom in build.package_upload.customfiles: |
4386 | + delayed_copy.addCustom( |
4387 | + custom.libraryfilealias, custom.customformat) |
4388 | |
4389 | # Commit for using just-created library files. |
4390 | self.layer.txn.commit() |
4391 | @@ -136,6 +141,10 @@ |
4392 | # and has their files privacy adjusted according test destination |
4393 | # context. |
4394 | |
4395 | + # Add a cleanup for removing the repository where the custom upload |
4396 | + # was published. |
4397 | + self.addCleanup(self.removeRepository) |
4398 | + |
4399 | # Create the default delayed-copy context. |
4400 | delayed_copy = self.createDelayedCopy() |
4401 | |
4402 | @@ -158,23 +167,27 @@ |
4403 | self.test_publisher.getPubBinaries( |
4404 | pub_source=ancestry_source, |
4405 | status=PackagePublishingStatus.PUBLISHED) |
4406 | + package_diff = ancestry_source.sourcepackagerelease.requestDiffTo( |
4407 | + requester=self.test_publisher.person, |
4408 | + to_sourcepackagerelease=delayed_copy.sourcepackagerelease) |
4409 | + package_diff.diff_content = self.factory.makeLibraryFileAlias( |
4410 | + restricted=True) |
4411 | |
4412 | # Accept and publish the delayed-copy. |
4413 | delayed_copy.acceptFromCopy() |
4414 | self.assertEquals( |
4415 | PackageUploadStatus.ACCEPTED, delayed_copy.status) |
4416 | |
4417 | + self.layer.txn.commit() |
4418 | + self.layer.switchDbUser(self.dbuser) |
4419 | + |
4420 | logger = BufferLogger() |
4421 | pub_records = delayed_copy.realiseUpload(logger=logger) |
4422 | self.assertEquals( |
4423 | PackageUploadStatus.DONE, delayed_copy.status) |
4424 | |
4425 | - # Commit for comparing objects correctly. |
4426 | self.layer.txn.commit() |
4427 | - |
4428 | - # Add a cleanup for removing the repository where the custom upload |
4429 | - # was published. |
4430 | - self.addCleanup(self.removeRepository) |
4431 | + self.layer.switchDbUser('launchpad') |
4432 | |
4433 | # One source and 2 binaries are pending publication. They all were |
4434 | # overridden to multiverse and had their files moved to the public |
4435 | @@ -192,6 +205,9 @@ |
4436 | pub_record, delayed_copy.archive, delayed_copy.pocket, |
4437 | ancestry_source.component, False) |
4438 | |
4439 | + # The package diff file is now public. |
4440 | + self.assertFalse(package_diff.diff_content.restricted) |
4441 | + |
4442 | # The custom file was also published. |
4443 | custom_path = os.path.join( |
4444 | config.archivepublisher.root, |
4445 | @@ -200,6 +216,30 @@ |
4446 | self.assertEquals( |
4447 | ['20060302.0120', 'current'], sorted(os.listdir(custom_path))) |
4448 | |
4449 | + def test_realiseUpload_for_source_only_delayed_copies(self): |
4450 | + # Source-only delayed-copies result in the source being published |
4451 | + # in the destination archive and its corresponding build |
4452 | + # records ready to be dispatched. |
4453 | + |
4454 | + # Create the default delayed-copy context. |
4455 | + delayed_copy = self.createDelayedCopy(source_only=True) |
4456 | + self.test_publisher.breezy_autotest.status = ( |
4457 | + DistroSeriesStatus.CURRENT) |
4458 | + self.layer.txn.commit() |
4459 | + |
4460 | + # Accept and publish the delayed-copy. |
4461 | + delayed_copy.acceptFromCopy() |
4462 | + logger = BufferLogger() |
4463 | + pub_records = delayed_copy.realiseUpload(logger=logger) |
4464 | + |
4465 | + # Only the source is published and the needed builds are created |
4466 | + # in the destination archive. |
4467 | + self.assertEquals(1, len(pub_records)) |
4468 | + [pub_record] = pub_records |
4469 | + [build] = pub_record.getBuilds() |
4470 | + self.assertEquals( |
4471 | + BuildStatus.NEEDSBUILD, build.buildstate) |
4472 | + |
4473 | |
4474 | def test_suite(): |
4475 | return unittest.TestLoader().loadTestsFromName(__name__) |
4476 | |
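The source-only case asserted above can be mimicked with a toy model: publishing the copied source yields fresh build records that start in NEEDSBUILD. Names are illustrative, not the real `realiseUpload`:

```python
from enum import Enum

class BuildStatus(Enum):
    NEEDSBUILD = "NEEDSBUILD"
    FULLYBUILT = "FULLYBUILT"

def realise_source_only_copy(source_name, architectures):
    """Return the publications and the pending builds they create."""
    publications = [source_name]
    builds = {arch: BuildStatus.NEEDSBUILD for arch in architectures}
    return publications, builds

pubs, builds = realise_source_only_copy("foo", ["i386"])
print(len(pubs), builds["i386"].name)  # 1 NEEDSBUILD
```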
4477 | === modified file 'lib/lp/translations/doc/sourcepackagerelease-translations.txt' |
4478 | --- lib/lp/translations/doc/sourcepackagerelease-translations.txt 2009-07-02 17:16:50 +0000 |
4479 | +++ lib/lp/translations/doc/sourcepackagerelease-translations.txt 2009-07-20 15:05:42 +0000 |
4480 | @@ -2,66 +2,85 @@ |
4481 | |
4482 | It's time to check the translation upload function. |
4483 | |
4484 | +We need a test tarball uploaded into librarian to run this test. We |
4485 | +will upload the same sampledata tarball twice, one public and one |
4486 | +restricted `LibraryFileAlias` objects. |
4487 | + |
4488 | >>> import os.path |
4489 | - >>> import transaction |
4490 | - >>> from canonical.launchpad.database import SourcePackageRelease |
4491 | - >>> from lp.translations.interfaces.translationimportqueue import ( |
4492 | - ... ITranslationImportQueue) |
4493 | - >>> from canonical.librarian.interfaces import ILibrarianClient |
4494 | - >>> translation_import_queue = getUtility(ITranslationImportQueue) |
4495 | - >>> client = getUtility(ILibrarianClient) |
4496 | - |
4497 | -We need a test tarball uploaded into librarian to run this test. |
4498 | - |
4499 | >>> import lp.translations |
4500 | - >>> test_file_name = os.path.join( |
4501 | + >>> tarball_path = os.path.join( |
4502 | ... os.path.dirname(lp.translations.__file__), |
4503 | ... 'doc/sourcepackagerelease-translations.tar.gz') |
4504 | - >>> file = open(test_file_name) |
4505 | - >>> size = len(file.read()) |
4506 | - >>> file.seek(0) |
4507 | - >>> alias = client.addFile( |
4508 | + >>> tarball = open(tarball_path) |
4509 | + >>> tarball_size = len(tarball.read()) |
4510 | + >>> tarball.seek(0) |
4511 | + |
4512 | + >>> from canonical.launchpad.interfaces.librarian import ( |
4513 | + ... ILibraryFileAliasSet) |
4514 | + >>> public_translation = getUtility(ILibraryFileAliasSet).create( |
4515 | ... name='test.tar.gz', |
4516 | - ... size=size, |
4517 | - ... file=file, |
4518 | + ... size=tarball_size, |
4519 | + ... file=tarball, |
4520 | ... contentType='application/x-gtar') |
4521 | |
4522 | -We need the commit to see the upload. |
4523 | + >>> tarball.seek(0) |
4524 | + >>> restricted_translation = getUtility(ILibraryFileAliasSet).create( |
4525 | + ... name='test.tar.gz', |
4526 | + ... size=tarball_size, |
4527 | + ... file=tarball, |
4528 | + ... contentType='application/x-gtar', |
4529 | + ... restricted=True) |
4530 | + |
4531 | +Commit, so uploaded contents are available in the current test. |
4532 | |
4533 | >>> transaction.commit() |
4534 | |
4535 | -Now we do the upload. It's necessary to retrive an ILibraryFileAlias |
4536 | -correspondent to the alias (long) we already have. |
4537 | - |
4538 | - >>> from canonical.launchpad.interfaces import ILibraryFileAliasSet |
4539 | - >>> file_alias = getUtility(ILibraryFileAliasSet)[alias] |
4540 | - |
4541 | +We will use an arbitrary source package release from the sampledata. |
4542 | + |
4543 | + >>> from canonical.launchpad.database import SourcePackageRelease |
4544 | >>> spr_test = SourcePackageRelease.get(20) |
4545 | - >>> spr_test.name |
4546 | - u'pmount' |
4547 | - |
4548 | -Before the final upload, the queue should be empty. |
4549 | - |
4550 | + >>> print spr_test.title |
4551 | + pmount - 0.1-1 |
4552 | + |
4553 | +And the 'katie' celebrity as the user responsible for the translation. |
4554 | + |
4555 | + >>> from canonical.launchpad.interfaces import ILaunchpadCelebrities |
4556 | + >>> katie = getUtility(ILaunchpadCelebrities).katie |
4557 | + |
4558 | +Before the final upload, we can see that the translation queue for the |
4559 | +testing source package is empty. |
4560 | + |
4561 | + >>> from lp.translations.interfaces.translationimportqueue import ( |
4562 | + ... ITranslationImportQueue) |
4563 | + >>> translation_import_queue = getUtility(ITranslationImportQueue) |
4564 | >>> translation_import_queue.getAllEntries( |
4565 | ... target=spr_test.sourcepackage).count() |
4566 | 0 |
4567 | |
4568 | - >>> from canonical.launchpad.interfaces import ILaunchpadCelebrities |
4569 | - >>> katie = getUtility(ILaunchpadCelebrities).katie |
4570 | - >>> spr_test.attachTranslationFiles(file_alias, True, katie) |
4571 | - |
4572 | -The commit is needed to see the new entries |
4573 | - |
4574 | - >>> transaction.commit() |
4575 | - |
4576 | -And the queue should have a new entry. |
4577 | - |
4578 | - >>> for entry in translation_import_queue.getAllEntries( |
4579 | - ... target=spr_test.sourcepackage): |
4580 | +Now we bind both uploaded translations, the public and the restricted |
4581 | +ones, to the testing source package. |
4582 | + |
4583 | + >>> spr_test.attachTranslationFiles(public_translation, True, katie) |
4584 | + |
4585 | + >>> spr_test.attachTranslationFiles(restricted_translation, True, katie) |
4586 | + |
4587 | +And the queue should have 2 entries with exactly the same contents. |
4588 | + |
4589 | + >>> queue_entries = translation_import_queue.getAllEntries( |
4590 | + ... target=spr_test.sourcepackage) |
4591 | + |
4592 | + >>> queue_entries.count() |
4593 | + 2 |
4594 | + |
4595 | + >>> for entry in queue_entries: |
4596 | ... print entry.path, entry.importer.name |
4597 | something/en-US.xpi katie |
4598 | po/es.po katie |
4599 | |
4600 | +Commit, so the uploaded translations become available to the scripts. |
4601 | + |
4602 | + >>> transaction.commit() |
4603 | + |
4604 | Now, we need to do the final import. It's done as a two steps procedure. |
4605 | |
4606 | The first one, approves the import. |
4607 | |
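The two attachment calls in the doctest above — one public and one restricted tarball, yielding two queue entries — can be mimicked with a toy queue. The classes here are illustrative stand-ins, not the real `ITranslationImportQueue` API:

```python
class FileAlias:
    """Minimal stand-in for a library file alias."""
    def __init__(self, name, restricted=False):
        self.name = name
        self.restricted = restricted

class ImportQueue:
    """Records one entry per attached translation tarball, whether or
    not the underlying file alias is restricted."""
    def __init__(self):
        self.entries = []

    def attach(self, alias, importer):
        self.entries.append((alias, importer))

queue = ImportQueue()
queue.attach(FileAlias("test.tar.gz"), "katie")
queue.attach(FileAlias("test.tar.gz", restricted=True), "katie")
print(len(queue.entries))  # 2
```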
4608 | === modified file 'lib/lp/translations/interfaces/potmsgset.py' |
4609 | --- lib/lp/translations/interfaces/potmsgset.py 2009-07-17 00:26:05 +0000 |
4610 | +++ lib/lp/translations/interfaces/potmsgset.py 2009-07-19 04:41:14 +0000 |
4611 | @@ -93,10 +93,16 @@ |
4612 | """ |
4613 | |
4614 | def getCurrentTranslationMessage(potemplate, language, variant=None): |
4615 | - """Returns a TranslationMessage marked as being currently used.""" |
4616 | + """Returns a TranslationMessage marked as being currently used. |
4617 | + |
4618 | + Diverged messages are preferred. |
4619 | + """ |
4620 | |
4621 | def getImportedTranslationMessage(potemplate, language, variant=None): |
4622 | - """Returns a TranslationMessage as imported from the package.""" |
4623 | + """Returns a TranslationMessage as imported from the package. |
4624 | + |
4625 | + Diverged messages are preferred. |
4626 | + """ |
4627 | |
4628 | def getSharedTranslationMessage(language, variant=None): |
4629 | """Returns a shared TranslationMessage.""" |
4630 | |
4631 | === modified file 'lib/lp/translations/model/potmsgset.py' |
4632 | --- lib/lp/translations/model/potmsgset.py 2009-07-17 00:26:05 +0000 |
4633 | +++ lib/lp/translations/model/potmsgset.py 2009-07-19 04:41:14 +0000 |
4634 | @@ -210,10 +210,13 @@ |
4635 | 'There is already a translation message in our database.') |
4636 | return DummyTranslationMessage(pofile, self) |
4637 | |
4638 | - def _getUsedTranslationMessage( |
4639 | - self, potemplate, language, variant, current=True): |
4640 | + def _getUsedTranslationMessage(self, potemplate, language, variant, |
4641 | + current=True): |
4642 | """Get a translation message which is either used in |
4643 | - Launchpad (current=True) or in an import (current=False).""" |
4644 | + Launchpad (current=True) or in an import (current=False). |
4645 | + |
4646 | + Prefers a diverged message if present. |
4647 | + """ |
4648 | # Change 'is_current IS TRUE' and 'is_imported IS TRUE' conditions |
4649 | # carefully: they need to match condition specified in indexes, |
4650 | # or Postgres may not pick them up (in complicated queries, |
4651 | @@ -239,17 +242,12 @@ |
4652 | clauses.append( |
4653 | 'TranslationMessage.variant=%s' % sqlvalues(variant)) |
4654 | |
4655 | - # This returns at most two messages: |
4656 | - # 1. a current translation for this particular potemplate. |
4657 | - # 2. a shared current translation for this. |
4658 | - messages = list(TranslationMessage.select( |
4659 | - ' AND '.join(clauses), |
4660 | - orderBy=['-COALESCE(potemplate, -1)'])) |
4661 | - if len(messages) > 0: |
4662 | - return messages[0] |
4663 | - else: |
4664 | - return None |
4665 | + order_by = '-COALESCE(potemplate, -1)' |
4666 | |
4667 | + # This should find at most two messages: zero or one shared |
4668 | + # message, and zero or one diverged one. |
4669 | + return TranslationMessage.selectFirst( |
4670 | + ' AND '.join(clauses), orderBy=[order_by]) |
4671 | |
4672 | def getCurrentTranslationMessage(self, potemplate, |
4673 | language, variant=None): |
4674 | @@ -481,12 +479,8 @@ |
4675 | translations[pluralform] is not None): |
4676 | translation = translations[pluralform] |
4677 | # Find or create a POTranslation for the specified text |
4678 | - try: |
4679 | - potranslations[pluralform] = ( |
4680 | - POTranslation.byTranslation(translation)) |
4681 | - except SQLObjectNotFound: |
4682 | - potranslations[pluralform] = ( |
4683 | - POTranslation(translation=translation)) |
4684 | + potranslations[pluralform] = ( |
4685 | + POTranslation.getOrCreateTranslation(translation)) |
4686 | else: |
4687 | potranslations[pluralform] = None |
4688 | return potranslations |
4689 | |
4690 | === modified file 'lib/lp/translations/model/translationmessage.py' |
4691 | --- lib/lp/translations/model/translationmessage.py 2009-07-17 00:26:05 +0000 |
4692 | +++ lib/lp/translations/model/translationmessage.py 2009-07-19 04:41:14 +0000 |
4693 | @@ -14,6 +14,7 @@ |
4694 | import pytz |
4695 | |
4696 | from sqlobject import BoolCol, ForeignKey, SQLObjectNotFound, StringCol |
4697 | +from storm.expr import And |
4698 | from storm.locals import SQL |
4699 | from storm.store import Store |
4700 | from zope.interface import implements |
4701 | @@ -85,7 +86,7 @@ |
4702 | return self.potmsgset.makeHTMLID('_'.join(elements)) |
4703 | |
4704 | def setPOFile(self, pofile): |
4705 | - """See `ITransationMessage`.""" |
4706 | + """See `ITranslationMessage`.""" |
4707 | self.browser_pofile = pofile |
4708 | |
4709 | |
4710 | @@ -449,6 +450,28 @@ |
4711 | # suggestions will always be shared. |
4712 | self.destroySelf() |
4713 | |
4714 | + def findIdenticalMessage(self, target_potmsgset, target_potemplate): |
4715 | + """See `ITranslationMessage`.""" |
4716 | + store = Store.of(self) |
4717 | + |
4718 | + forms_match = (TranslationMessage.msgstr0 == self.msgstr0) |
4719 | + for form in xrange(1, TranslationConstants.MAX_PLURAL_FORMS): |
4720 | + form_name = 'msgstr%d' % form |
4721 | + form_value = getattr(self, form_name) |
4722 | + forms_match = And( |
4723 | + forms_match, |
4724 | + getattr(TranslationMessage, form_name) == form_value) |
4725 | + |
4726 | + twins = store.find(TranslationMessage, And( |
4727 | + TranslationMessage.potmsgset == target_potmsgset, |
4728 | + TranslationMessage.potemplate == target_potemplate, |
4729 | + TranslationMessage.language == self.language, |
4730 | + TranslationMessage.variant == self.variant, |
4731 | + TranslationMessage.id != self.id, |
4732 | + forms_match)) |
4733 | + |
4734 | + return twins.order_by(TranslationMessage.id).first() |
4735 | + |
4736 | |
4737 | class TranslationMessageSet: |
4738 | """See `ITranslationMessageSet`.""" |
4739 | |
4740 | === modified file 'lib/lp/translations/scripts/message_sharing_migration.py' |
4741 | --- lib/lp/translations/scripts/message_sharing_migration.py 2009-07-17 00:26:05 +0000 |
4742 | +++ lib/lp/translations/scripts/message_sharing_migration.py 2009-07-19 04:41:14 +0000 |
4743 | @@ -2,11 +2,7 @@ |
4744 | # GNU Affero General Public License version 3 (see the file LICENSE). |
4745 | |
4746 | __metaclass__ = type |
4747 | -__all__ = [ |
4748 | - 'MessageSharingMerge', |
4749 | - 'merge_potmsgsets', |
4750 | - 'merge_translationmessages', |
4751 | - ] |
4752 | +__all__ = [ 'MessageSharingMerge' ] |
4753 | |
4754 | |
4755 | from zope.component import getUtility |
4756 | @@ -90,106 +86,62 @@ |
4757 | merge_pofiletranslators(item.potmsgset, representative_template) |
4758 | |
4759 | |
4760 | -def merge_potmsgsets(potemplates): |
4761 | - """Merge POTMsgSets for given sequence of sharing templates.""" |
4762 | - |
4763 | - # Map each POTMsgSet key (context, msgid, plural) to its |
4764 | - # representative POTMsgSet. |
4765 | - representatives = {} |
4766 | - |
4767 | - # Map each representative POTMsgSet to a list of subordinate |
4768 | - # POTMsgSets it represents. |
4769 | - subordinates = {} |
4770 | - |
4771 | - # Map each representative POTMsgSet to its representative |
4772 | - # POTemplate. |
4773 | - representative_templates = {} |
4774 | - |
4775 | - # Figure out representative potmsgsets and their subordinates. Go |
4776 | - # through the templates, starting at the most representative and |
4777 | - # moving towards the least representative. For any unique potmsgset |
4778 | - # key we find, the first POTMsgSet is the representative one. |
4779 | - order_check = OrderingCheck( |
4780 | - cmp=getUtility(IPOTemplateSet).compareSharingPrecedence) |
4781 | - for template in potemplates: |
4782 | - order_check.check(template) |
4783 | - for potmsgset in template.getPOTMsgSets(False): |
4784 | - key = get_potmsgset_key(potmsgset) |
4785 | - if key not in representatives: |
4786 | - representatives[key] = potmsgset |
4787 | - representative_templates[potmsgset] = template |
4788 | - representative = representatives[key] |
4789 | - if representative in subordinates: |
4790 | - subordinates[representative].append(potmsgset) |
4791 | - else: |
4792 | - subordinates[representative] = [] |
4793 | - |
4794 | - for representative, potmsgsets in subordinates.iteritems(): |
4795 | - # Merge each subordinate POTMsgSet into its representative. |
4796 | - seen_potmsgsets = set([representative]) |
4797 | - for subordinate in potmsgsets: |
4798 | - if subordinate in seen_potmsgsets: |
4799 | - # We've already done this one when we found it in a more |
4800 | - # representative template. |
4801 | - continue |
4802 | - |
4803 | - seen_potmsgsets.add(subordinate) |
4804 | - |
4805 | - original_template = None |
4806 | - |
4807 | - for message in subordinate.getAllTranslationMessages(): |
4808 | - message = removeSecurityProxy(message) |
4809 | - if message.potemplate is None: |
4810 | - # Guard against multiple shared current or imported |
4811 | - # messages. |
4812 | - if message.is_current: |
4813 | - clashing_shared_current = ( |
4814 | - representative.getCurrentTranslationMessage( |
4815 | - potemplate=None, language=message.language, |
4816 | - variant=message.variant)) |
4817 | - else: |
4818 | - clashing_shared_current = None |
4819 | - |
4820 | - if message.is_imported: |
4821 | - clashing_shared_imported = ( |
4822 | - representative.getImportedTranslationMessage( |
4823 | - potemplate=None, language=message.language, |
4824 | - variant=message.variant)) |
4825 | - else: |
4826 | - clashing_shared_imported = None |
4827 | - |
4828 | - if clashing_shared_current or clashing_shared_imported: |
4829 | - # This shared message can't cohabitate with one |
4830 | - # that was more representative. Make it diverged. |
4831 | - if original_template is None: |
4832 | - # Look up subordinate's original template if |
4833 | - # we haven't already. We can't just get its |
4834 | - # potemplate field because that field is |
4835 | - # being phased out and might not be set. |
4836 | - links = ( |
4837 | - subordinate.getAllTranslationTemplateItems()) |
4838 | - original_template = links[0].potemplate |
4839 | - |
4840 | - message.potemplate = original_template |
4841 | - |
4842 | - message.potmsgset = representative |
4843 | - |
4844 | - merge_translationtemplateitems( |
4845 | - subordinate, representative, |
4846 | - representative_templates[representative]) |
4847 | - |
4848 | - removeSecurityProxy(subordinate).destroySelf() |
4849 | - |
4850 | - |
4851 | -def merge_translationmessages(potemplates): |
4852 | - """Share `TranslationMessage`s between `potemplates` where possible.""" |
4853 | - order_check = OrderingCheck( |
4854 | - cmp=getUtility(IPOTemplateSet).compareSharingPrecedence) |
4855 | - for template in potemplates: |
4856 | - order_check.check(template) |
4857 | - for potmsgset in template.getPOTMsgSets(False): |
4858 | - for message in potmsgset.getAllTranslationMessages(): |
4859 | - removeSecurityProxy(message).shareIfPossible() |
4860 | +def filter_clashes(clashing_current, clashing_imported, twin): |
4861 | + """Filter clashes for harmless clashes with an identical message. |
4862 | + |
4863 | + Takes the three forms of clashes a message can have in a context |
4864 | + it's being merged into: |
4865 | + * Another message that also has the is_current flag. |
4866 | + * Another message that also has the is_imported flag. |
4867 | + * Another message with the same translations. |
4868 | + |
4869 | + If either of the first two clashes matches the third, that is not a |
4870 | + real clash since it can be resolved by merging the message into the |
4871 | + twin. |
4872 | + |
4873 | + This function returns the same tuple but with these "harmless" |
4874 | + clashes eliminated. |
4875 | + """ |
4876 | + if clashing_current == twin: |
4877 | + clashing_current = None |
4878 | + if clashing_imported == twin: |
4879 | + clashing_imported = None |
4880 | + return clashing_current, clashing_imported, twin |
4881 | + |
4882 | + |
4883 | +def sacrifice_flags(message, incumbents=None): |
4884 | + """Drop current/imported flags if held by any of `incumbents`. |
4885 | + |
4886 | + :param message: a `TranslationMessage` to drop flags on. |
4887 | + :param incumbents: a sequence of reference messages. If any of |
4888 | + these has either is_current or is_imported set, that same |
4889 | + flag will be dropped on message (if set). |
4890 | + """ |
4891 | + if incumbents: |
4892 | + for incumbent in incumbents: |
4893 | + if incumbent is not None and incumbent.is_current: |
4894 | + message.is_current = False |
4895 | + if incumbent is not None and incumbent.is_imported: |
4896 | + message.is_imported = False |
4897 | + |
4898 | + |
4899 | +def bequeathe_flags(source_message, target_message, incumbents=None): |
4900 | + """Destroy `source_message`, leaving flags to `target_message`. |
4901 | + |
4902 | + If `source_message` holds the is_current flag, and there are no |
4903 | + `incumbents` that hold the same flag, then `target_message` inherits |
4904 | + it. Similar for the is_imported flag. |
4905 | + """ |
4906 | + sacrifice_flags(source_message, incumbents) |
4907 | + |
4908 | + if source_message.is_current and not target_message.is_current: |
4909 | + source_message.is_current = False |
4910 | + target_message.is_current = True |
4911 | + if source_message.is_imported and not target_message.is_imported: |
4912 | + source_message.is_imported = False |
4913 | + target_message.is_imported = True |
4914 | + |
4915 | + source_message.destroySelf() |
4916 | |
4917 | |
4918 | class MessageSharingMerge(LaunchpadScript): |
4919 | @@ -280,16 +232,348 @@ |
4920 | self.logger.debug("Templates: %s" % str(templates)) |
4921 | |
4922 | if self.options.merge_potmsgsets: |
4923 | - merge_potmsgsets(templates) |
4924 | + self._mergePOTMsgSets(templates) |
4925 | |
4926 | if self.options.merge_translationmessages: |
4927 | - merge_translationmessages(templates) |
4928 | + self._mergeTranslationMessages(templates) |
4929 | |
4930 | self._endTransaction() |
4931 | |
4932 | + self.logger.info("Done.") |
4933 | + |
4934 | def _endTransaction(self): |
4935 | if self.options.dry_run: |
4936 | self.txn.abort() |
4937 | else: |
4938 | self.txn.commit() |
4939 | self.txn.begin() |
4940 | + |
4941 | + def _mergePOTMsgSets(self, potemplates): |
4942 | + """Merge POTMsgSets for given sequence of sharing templates.""" |
4943 | + |
4944 | + # Map each POTMsgSet key (context, msgid, plural) to its |
4945 | + # representative POTMsgSet. |
4946 | + representatives = {} |
4947 | + |
4948 | + # Map each representative POTMsgSet to a list of subordinate |
4949 | + # POTMsgSets it represents. |
4950 | + subordinates = {} |
4951 | + |
4952 | + # Map each representative POTMsgSet to its representative |
4953 | + # POTemplate. |
4954 | + representative_templates = {} |
4955 | + |
4956 | + # Figure out representative potmsgsets and their subordinates. Go |
4957 | + # through the templates, starting at the most representative and |
4958 | + # moving towards the least representative. For any unique potmsgset |
4959 | + # key we find, the first POTMsgSet is the representative one. |
4960 | + templateset = getUtility(IPOTemplateSet) |
4961 | + order_check = OrderingCheck(cmp=templateset.compareSharingPrecedence) |
4962 | + |
4963 | + for template in potemplates: |
4964 | + order_check.check(template) |
4965 | + for potmsgset in template.getPOTMsgSets(False): |
4966 | + key = get_potmsgset_key(potmsgset) |
4967 | + if key not in representatives: |
4968 | + representatives[key] = potmsgset |
4969 | + representative_templates[potmsgset] = template |
4970 | + representative = representatives[key] |
4971 | + if representative in subordinates: |
4972 | + subordinates[representative].append(potmsgset) |
4973 | + else: |
4974 | + subordinates[representative] = [] |
4975 | + |
4976 | + for representative, potmsgsets in subordinates.iteritems(): |
4977 | + # Scrub the representative POTMsgSet of any duplicate |
4978 | +            # translation messages. We don't need to do this for subordinates |
4979 | + # because the algorithm will refuse to add new duplicates to the |
4980 | + # representative POTMsgSet anyway. |
4981 | + existing_tms = self._scrubPOTMsgSetTranslations(representative) |
4982 | + |
4983 | + # Keep track of current/imported TranslationMessages for |
4984 | + # representative. These will matter when we try to merge |
4985 | + # subordinate TranslationMessages into the representative, some |
4986 | + # of which may be current or imported as well. |
4987 | + current_tms, imported_tms = self._findUsedMessages(existing_tms) |
4988 | + |
4989 | + seen_potmsgsets = set([representative]) |
4990 | + |
4991 | + # Merge each subordinate POTMsgSet into its representative. |
4992 | + for subordinate in potmsgsets: |
4993 | + if subordinate in seen_potmsgsets: |
4994 | + continue |
4995 | + |
4996 | + seen_potmsgsets.add(subordinate) |
4997 | + |
4998 | + for message in subordinate.getAllTranslationMessages(): |
4999 | + message = removeSecurityProxy(message) |
5000 | + |
This is my implementation to address base-layout requirement changes.

lp:~sinzui/launchpad/base-layout-additions
Diff size: 512
Launchpad bug: https://bugs.launchpad.net/bugs/403637
               https://bugs.launchpad.net/bugs/403638
               https://bugs.launchpad.net/bugs/403666
Test command: ./bin/test -vvt "tests/base-layout|xx-beta-testers-redirection|tales"
Pre-implementation: beuno, salgado
Target release: 2.2.8
= base-layout requirement changes =
Bug #403637 base-layout must support a heading slot
The 3.0 design places the <h1> in the page header above the breadcrumbs.
The base-layout template must support a heading slot.
Bug #403638 base-layout must set the public/private body class
To fix Bug #298152 [We should have a very visible indication for private
objects], base-layout must set the body class as public or private.
Bug #403666 Replace ids with classes in style-3-0.css
There are styles that are assigned to ids in style-3-0.css. This is
wrong because it discourages presentation reuse. Only elements and
classes may be used for presentation. Ids are reserved for testing
and scripts.
== Rules ==
Bug #403637 base-layout must support a heading slot
Support this notation in the template
    <tal:heading metal:fill-slot="heading">Heading</tal:heading>
to create
    <h1>Heading</h1>
Bug #403638 base-layout must set the public/private body class
Create a formatter that will work with any context. If the context is
private, return 'private', otherwise return 'public'. Use the formatter
in base-layout. Add a CSS body.private rule.
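A minimal sketch of such a formatter, assuming a hypothetical helper name
(the real formatter would sit alongside the other TALES formatters in
lib/canonical/launchpad/webapp/tales.py):

```python
def privacy_css_class(context):
    """Return the CSS class for the body tag: 'private' or 'public'.

    Works with any context object: one without a true `private`
    attribute is treated as public.
    """
    if getattr(context, 'private', False):
        return 'private'
    return 'public'


class PublicThing:
    """An object with no notion of privacy."""


class PrivateBug:
    """A stand-in for a private bug."""
    private = True


print(privacy_css_class(PublicThing()))  # public
print(privacy_css_class(PrivateBug()))   # private
```

base-layout would then write the result into the body tag, for example
<body class="public">, which the new body.private CSS rule can target.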
Bug #403666 Replace ids with classes in style-3-0.css
Replace the '#' with a '.'. Use names that are less specific so that they
can be used in similar cases.
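As a throwaway illustration (not part of the branch) of how mechanical the
rename is, an id selector such as #footer simply becomes the class
selector .footer:

```python
import re


def id_selectors_to_classes(css_text):
    """Rewrite simple id selectors (#name) as class selectors (.name).

    A rough sketch only: it rewrites a '#' at the start of a selector
    line and leaves mid-line '#' (e.g. colour values) alone.
    """
    return re.sub(r'(?m)^(\s*)#', r'\1.', css_text)


print(id_selectors_to_classes("#footer { border-top: 1px solid #999; }"))
# prints: .footer { border-top: 1px solid #999; }
```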
== QA ==
Bug #403637 base-layout must support a heading slot
The only pages using the new layout are the locationless pages that
remain unchanged. There is no visual confirmation that the heading
slot works.
I do not know how to visually verify a timeout error for edge or a
librarian failure. locationless is a 1-to-1 replacement for freeform.
The other pages, such as the launchpad-notfound.pt, did not require more
changes than the ones done here.
Bug #403638 base-layout must set the public/private body class
* Visit https://edge.launchpad.net/bnorked
* View the source of the not found page
* Verify that 'public' is among the body tag's classes
Bug #403666 Replace ids with classes in style-3-0.css
* Visit https://edge.launchpad.net/bnorked
* Verify the footer has a top border and is about 2 ems from the content.
== Lint ==
Checking for conflicts and issues in doctests and templates.
Running jslint, xmllint, pyflakes, and pylint.
Using normal rules.
Linting changed files:
  lib/canonical/launchpad/doc/tales.txt
  lib/canonical/launchpad/icing/style-3-0.css
  lib/canonical/launchpad/images/private-y-bg.png
  lib/canonical/launchpad/templates/launchpad-librarianfailure.pt
  lib/canonical/launchpad/templates/launchpad-requestexpired.pt
  lib/canonical/launchpad/webapp/tales.py
  lib/lp/app/browser/tests...