Merge lp:~jelmer/launchpad/bzr-code-imports into lp:launchpad
Status: Superseded
Proposed branch: lp:~jelmer/launchpad/bzr-code-imports
Merge into: lp:launchpad
Prerequisite: lp:~jelmer/launchpad/bzr-code-imports-prereq
Diff against target: 3364 lines (+1594/-323), 56 files modified

cronscripts/generate-contents-files.py (+1/-2)
cronscripts/publish-ftpmaster.py (+1/-2)
database/replication/helpers.py (+8/-2)
database/replication/slon_ctl.py (+1/-1)
database/sampledata/current-dev.sql (+11/-11)
database/sampledata/current.sql (+10/-10)
database/schema/Makefile (+0/-1)
database/schema/comments.sql (+3/-0)
database/schema/full-update.py (+15/-6)
database/schema/patch-2208-76-3.sql (+15/-0)
database/schema/patch-2208-76-4.sql (+159/-0)
database/schema/patch-2208-79-0.sql (+9/-0)
database/schema/patch-2208-79-1.sql (+26/-0)
database/schema/patch-2208-80-1.sql (+12/-0)
database/schema/preflight.py (+37/-18)
database/schema/security.cfg (+10/-3)
database/schema/security.py (+227/-105)
database/schema/testfuncs.sql (+0/-29)
database/schema/trusted.sql (+5/-2)
database/schema/upgrade.py (+40/-11)
lib/canonical/config/schema-lazr.conf (+5/-0)
lib/lp/code/mail/codeimport.py (+3/-1)
lib/lp/code/model/codeimport.py (+8/-3)
lib/lp/code/model/codeimportevent.py (+2/-1)
lib/lp/code/model/tests/test_codeimport.py (+44/-0)
lib/lp/codehosting/codeimport/tests/servers.py (+58/-2)
lib/lp/codehosting/codeimport/tests/test_worker.py (+102/-6)
lib/lp/codehosting/codeimport/tests/test_workermonitor.py (+20/-0)
lib/lp/codehosting/codeimport/worker.py (+97/-5)
lib/lp/codehosting/puller/tests/test_scheduler.py (+1/-1)
lib/lp/codehosting/puller/worker.py (+0/-1)
lib/lp/registry/browser/distribution.py (+1/-0)
lib/lp/registry/browser/productseries.py (+9/-17)
lib/lp/registry/browser/tests/distribution-views.txt (+2/-1)
lib/lp/registry/browser/tests/test_distribution_views.py (+21/-0)
lib/lp/registry/configure.zcml (+1/-0)
lib/lp/registry/interfaces/distribution.py (+7/-0)
lib/lp/registry/model/distribution.py (+1/-0)
lib/lp/registry/tests/test_distribution.py (+20/-0)
lib/lp/soyuz/adapters/notification.py (+3/-1)
lib/lp/soyuz/adapters/tests/test_notification.py (+38/-11)
lib/lp/testing/factory.py (+9/-3)
lib/lp/testing/pgsql.py (+5/-1)
lib/lp/translations/configure.zcml (+8/-0)
lib/lp/translations/interfaces/translationsharingjob.py (+3/-0)
lib/lp/translations/model/translationpackagingjob.py (+34/-1)
lib/lp/translations/model/translationsharingjob.py (+39/-3)
lib/lp/translations/scripts/tests/test_packaging_translations.py (+1/-0)
lib/lp/translations/tests/test_translationpackagingjob.py (+88/-11)
lib/lp/translations/tests/test_translationsplitter.py (+184/-7)
lib/lp/translations/translationmerger.py (+20/-1)
lib/lp/translations/utilities/translationsplitter.py (+158/-36)
scripts/code-import-worker.py (+7/-2)
utilities/sourcedeps.cache (+2/-2)
utilities/sourcedeps.conf (+2/-2)
versions.cfg (+1/-1)
To merge this branch: bzr merge lp:~jelmer/launchpad/bzr-code-imports
Related bugs:

| Reviewer | Review Type | Date Requested | Status |
|---|---|---|---|
| Robert Collins | code | Pending | |
| Gavin Panella | | Pending | |
| Michael Hudson-Doyle | | Pending | |

Review via email: mp+71625@code.launchpad.net
This proposal supersedes a proposal from 2011-08-16.
This proposal has been superseded by a proposal from 2011-08-24.
Commit message
Description of the change
Add support for importing code from Bazaar branches.
At the moment, mirrors of remote Bazaar branches are created with completely different infrastructure than the code imports. This is confusing for users (bug 611837) and duplicates a lot of code. Several features are only available for code imports (bug 362622, bug 193607, bug 371484) and vice versa. Having shared infrastructure would also make it easier to fix several open bugs that affect both code imports and code mirrors (bug 519159, bug 136939).
Code imports are currently somewhat heavier than mirrors, as they run on a separate machine and require an extra copy of the imported branch.
This branch only adds backend support for Bazaar branches; it does not yet add a UI for creating code imports of this kind, nor does it migrate any of the existing code mirrors.
Gavin Panella (allenap) wrote : Posted in a previous version of this proposal
Jelmer Vernooij (jelmer) wrote : Posted in a previous version of this proposal
> I don't know the ins and outs of bzrlib or codehosting - for that I
> guess that's why you've asked Michael to review - but the rest looks
> good.
Thanks :)
I'm pretty confident about this code change (especially since nothing actually triggers the new code yet), but I'd like to get feedback from more people (Michael, Jono?, Rob?) to confirm this is the right direction to go in.
> + def test_partial(self):
> + # Skip tests of partial tests, as they are disabled for native
> + # imports at the moment.
> + return
>
> You could use TestCase.skip() here:
>
> def test_partial(self):
> self.skip("Disabled for native imports at the moment.")
I looked for things that raised SkipTest or TestSkipped and couldn't find any. I didn't know about TestCase.skip - thanks, fixed.
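The skip mechanism discussed above can be illustrated with the standard library's unittest, where the equivalent of the Launchpad test helper mentioned here is `TestCase.skipTest()`. This is a minimal sketch (the test class name is hypothetical); skipping records the reason in the test result instead of silently passing:

```python
import unittest

class TestPartialImports(unittest.TestCase):
    # Hypothetical test case illustrating the review suggestion above:
    # mark the test as skipped instead of returning early, so the test
    # output shows that partial-import coverage is currently disabled.
    def test_partial(self):
        self.skipTest("Disabled for native imports at the moment.")

# Run the case and inspect the result object to see the skip recorded.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestPartialImports)
result = unittest.TestResult()
suite.run(result)
```

A bare `return` makes the test count as a pass, which hides the missing coverage; a recorded skip keeps it visible in reports.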
Michael Hudson-Doyle (mwhudson) wrote : Posted in a previous version of this proposal
Some random comments: it would have been nice to see the things you had to do wrt the bzr upgrade
The fact that you put this line in suggests that the tests aren't very well isolated:
Does something need to inherit from bzrlib's TestCase? It's all a complete tangle though.
Is simply prohibiting branch references the right thing to do? The branch puller goes to some lengths to support them safely -- being able to dump all that code would surely be very nice. Relatedly, and on thinking about it I think this is a bit more serious, I think you might need to be careful about stacking -- it's contrived but you might be able to construct a branch that was stacked on some DC-internal branch and have that be imported so you can grab it.
In terms of overall direction, I'm all for removing duplication. I think the UI will require some effort (e.g. do we want to count bzr mirrors as "imported branches" on the code.launchpad.net frontpage?) but engineering wise, this looks fine, modulo the above comments.
Michael Hudson-Doyle (mwhudson) wrote : Posted in a previous version of this proposal
"Some random comments: it would have been nice to see the things you had to do wrt the bzr upgrade " ... in a separate branch, I meant to say.
Jelmer Vernooij (jelmer) wrote : Posted in a previous version of this proposal
On 06/24/2011 05:35 AM, Michael Hudson-Doyle wrote:
> Review: Approve
> Some random comments: it would have been nice to see the things you had to do wrt the bzr upgrade
This branch has two prerequisite branches, but I can only set one in
Launchpad. Hopefully the diff will get smaller when the update to the
newer bzr lands on lp:launchpad...
>
> The fact that you put this line in suggests that the tests aren't very well isolated:
>
> branch.
>
> Does something need to inherit from bzrlib's TestCase? It's all a complete tangle though.
Perhaps; bzr's TestCase does a lot though, and I'm kind of worried that
mixing it in will break other things. It should do more than just this
ad-hoc override though. Perhaps reset the global bzr configuration in setUp?
>
> Is simply prohibiting branch references the right thing to do? The branch puller goes to some lengths to support them safely -- being able to dump all that code would surely be very nice. Relatedly, and on thinking about it I think this is a bit more serious, I think you might need to be careful about stacking -- it's contrived but you might be able to construct a branch that was stacked on some DC-internal branch and have that be imported so you can grab it.
Stacking is a very good point, and one that I had not considered -
thanks. I should probably also have another, closer, look at the branch
puller to see what it does and why.
For branch references, the easiest thing to do for the moment seemed to
be to just refuse to mirror them. If code mirrors support branch
references at the moment, we should keep that support.
>
> In terms of overall direction, I'm all for removing duplication. I think the UI will require some effort (e.g. do we want to count bzr mirrors as "imported branches" on the code.launchpad.net frontpage?) but engineering wise, this looks fine, modulo the above comments.
Thanks for having a look at this. I'm only just getting into this code
and am not very familiar with it yet.
Cheers,
Jelmer
Michael Hudson-Doyle (mwhudson) wrote : Posted in a previous version of this proposal
On Fri, 24 Jun 2011 19:55:45 +0200, Jelmer Vernooij <email address hidden> wrote:
> On 06/24/2011 05:35 AM, Michael Hudson-Doyle wrote:
> > Review: Approve
> > Some random comments: it would have been nice to see the things you had to do wrt the bzr upgrade
> This branch has two prerequisite branches, but I can only set one in
> Launchpad. Hopefully the diff will get smaller when the update to the
> newer bzr lands on lp:launchpad...
Ah! I guess you could have created a branch with both of the
prerequisites merged in, but that's getting pretty tedious...
> >
> > The fact that you put this line in suggests that the tests aren't very well isolated:
> >
> > branch.
> >
> > Does something need to inherit from bzrlib's TestCase? It's all a complete tangle though.
> Perhaps; bzr's TestCase does a lot though, and I'm kind of worried that
> mixing it in will break other things.
I'd be surprised if it broke other stuff on past experience, but you
might be right.
> It should do more than just this ad-hoc override though. Perhaps reset
> the global bzr configuration in setUp?
I guess what you want is a test fixture that just does the bzrlib
environment isolation, which you could reuse here... but yes, doing
something more generic than just clearing create_signatures would be better, I think.
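The fixture idea floated above can be sketched with nothing but the standard library. This is only an illustration: the class name, the variable list, and the idea of pointing `BZR_HOME` at a nonexistent directory are assumptions, not bzrlib's actual (much longer) isolation list:

```python
import os

class IsolatedBzrEnvironmentFixture:
    """Minimal sketch of a reusable environment-isolation fixture:
    snapshot a few bzr-related environment variables in setUp and
    restore them in cleanUp, so tests never see (or leak into) the
    developer's real configuration."""

    VARIABLES = ("BZR_HOME", "BZR_EMAIL", "EMAIL")

    def setUp(self):
        # Remember the original values so cleanUp can restore them.
        self._saved = {name: os.environ.get(name) for name in self.VARIABLES}
        for name in self.VARIABLES:
            os.environ.pop(name, None)
        # Point bzr at a config directory that does not exist, so global
        # settings such as create_signatures cannot leak into the test.
        os.environ["BZR_HOME"] = "/nonexistent"

    def cleanUp(self):
        for name, value in self._saved.items():
            if value is None:
                os.environ.pop(name, None)
            else:
                os.environ[name] = value
```

Clearing the whole relevant environment in one reusable place, rather than overriding individual settings ad hoc in each test, is exactly the "something more generic" being suggested.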
> >
> > Is simply prohibiting branch references the right thing to do? The branch puller goes to some lengths to support them safely -- being able to dump all that code would surely be very nice. Relatedly, and on thinking about it I think this is a bit more serious, I think you might need to be careful about stacking -- it's contrived but you might be able to construct a branch that was stacked on some DC-internal branch and have that be imported so you can grab it.
> Stacking is a very good point, and one that I had not considered -
> thanks. I should probably also have another, closer, look at the branch
> puller to see what it does and why.
For reasons I should probably be ashamed of, there seem to be two
implementations of 'safe opening' in Launchpad; one is around
lp.codehosting.
(looking at worker.py makes me realize how much code we can delete once
this branch is done, never mind how much can be deleted when the puller
is gone completely).
> For branch references, the easiest thing to do for the moment seemed to
> be to just refuse to mirror them. If code mirrors support branch
> references at the moment, we should keep that support.
Yeah, they do.
> >
> > In terms of overall direction, I'm all for removing duplication. I think the UI will require some effort (e.g. do we want to count bzr mirrors as "imported branches" on the code.launchpad.net frontpage?) but engineering wise, this looks fine, modulo the above comments.
> Thanks for having a look at this. I'm only just getting into this code
> and am not very familiar with it yet.
You seem to be doing fine :)
Cheers,
mwh
Michael Hudson-Doyle (mwhudson) wrote : Posted in a previous version of this proposal
I don't see anything to do with stacking safety and/or supporting branch references in here yet -- is this really ready for review again?
Jelmer Vernooij (jelmer) wrote : Posted in a previous version of this proposal
> I don't see anything to do with stacking safety and/or supporting branch
> references in here yet -- is this really ready for review again?
No, it indeed isn't - I was merely trying to generate a proper diff. Sorry.
Robert Collins (lifeless) wrote : Posted in a previous version of this proposal
Does this permit imports from lp-hosted branches? If so, that's likely to permit bypassing of private branch ACLs - can you check that that isn't the case, please?
Gavin Panella (allenap) : Posted in a previous version of this proposal
Jelmer Vernooij (jelmer) wrote : Posted in a previous version of this proposal
On Sun, 2011-08-07 at 22:41 +0000, Robert Collins wrote:
> Review: Needs Information
> Does this permit imports from lp-hosted branches? If so, that's likely to permit bypassing of private branch ACLs - can you check that that isn't the case, please?
It does not allow imports from lp-hosted branches, importing of branch
references to Launchpad branches, or even importing of branches stacked
on Launchpad branches.
It also limits imports to a specific whitelist of schemes, which
excludes bzr+ssh://, sftp:// and file://.
This is the same policy as is currently in place for code mirrors.
Cheers,
Jelmer
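The policy Jelmer summarises (a scheme whitelist, no branch references, no branches stacked on Launchpad) can be sketched as a single predicate. Everything here is illustrative: the function name, the exact whitelist contents, and the hostname check are assumptions, not the actual worker code:

```python
from urllib.parse import urlparse

# Illustrative whitelist; the real configuration lives in the Launchpad
# tree and may differ. Note that bzr+ssh, sftp and file are absent.
ALLOWED_SCHEMES = frozenset({"http", "https", "ftp", "bzr"})

def import_url_allowed(url, stacked_on_url=None, is_branch_reference=False):
    """Sketch of the import policy described above: accept only
    whitelisted schemes, refuse branch references outright, and refuse
    branches stacked on Launchpad-hosted locations."""
    scheme = urlparse(url).scheme
    if scheme not in ALLOWED_SCHEMES:
        return False
    if is_branch_reference:
        # Branch references can redirect anywhere, including inside the
        # data centre, so they are rejected wholesale for now.
        return False
    if stacked_on_url is not None:
        # A contrived branch could be stacked on a DC-internal branch;
        # refuse anything whose stacked-on location is on Launchpad.
        host = urlparse(stacked_on_url).hostname or ""
        if host == "launchpad.net" or host.endswith(".launchpad.net"):
            return False
    return True
```

For example, under this sketch `import_url_allowed("bzr+ssh://bazaar.launchpad.net/~a/b/trunk")` is rejected by the scheme check alone, which is the first line of defence against reaching private branches.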
Gavin Panella (allenap) : Posted in a previous version of this proposal
Preview Diff
=== modified file 'cronscripts/generate-contents-files.py'
--- cronscripts/generate-contents-files.py 2011-07-11 13:33:13 +0000
+++ cronscripts/generate-contents-files.py 2011-08-16 00:31:36 +0000
@@ -7,7 +7,6 @@

import _pythonpath

-from canonical.config import config
from lp.archivepublisher.scripts.generate_contents_files import (
GenerateContentsFiles,
)
@@ -15,5 +14,5 @@

if __name__ == '__main__':
script = GenerateContentsFiles(
- "generate-contents", dbuser=config.archivepublisher.dbuser)
+ "generate-contents", dbuser='generate_contents_files')
script.lock_and_run()

=== modified file 'cronscripts/publish-ftpmaster.py'
--- cronscripts/publish-ftpmaster.py 2011-03-31 06:29:09 +0000
+++ cronscripts/publish-ftpmaster.py 2011-08-16 00:31:36 +0000
@@ -7,11 +7,10 @@

import _pythonpath

-from canonical.config import config
from lp.archivepublisher.scripts.publish_ftpmaster import PublishFTPMaster


if __name__ == '__main__':
script = PublishFTPMaster(
- "publish-ftpmaster", dbuser=config.archivepublisher.dbuser)
+ "publish-ftpmaster", 'publish_ftpmaster')
script.lock_and_run()

=== modified file 'database/replication/helpers.py'
--- database/replication/helpers.py 2011-07-25 13:39:10 +0000
+++ database/replication/helpers.py 2011-08-16 00:31:36 +0000
@@ -145,7 +145,7 @@
self.table_id, self.replication_set_id, self.master_node_id = row


-def sync(timeout):
+def sync(timeout, exit_on_fail=True):
"""Generate a sync event and wait for it to complete on all nodes.

This means that all pending events have propagated and are in sync
@@ -154,8 +154,14 @@

:param timeout: Number of seconds to wait for the sync. 0 to block
indefinitely.
+
+ :param exit_on_fail: If True, on failure of the sync
+ SystemExit is raised using the slonik return code.
+
+ :returns: True if the sync completed successfully. False if
+ exit_on_fail is False and the script failed for any reason.
"""
- return execute_slonik("", sync=timeout)
+ return execute_slonik("", sync=timeout, exit_on_fail=exit_on_fail)


def execute_slonik(script, sync=None, exit_on_fail=True, auto_preamble=True):

=== modified file 'database/replication/slon_ctl.py'
--- database/replication/slon_ctl.py 2010-10-11 10:32:29 +0000
+++ database/replication/slon_ctl.py 2011-08-16 00:31:36 +0000
@@ -104,7 +104,7 @@
log.debug("Logging to %s" % logfile)
log.debug("PID file %s" % pidfile)
# Hard code suitable command line arguments for development.
- slon_args = "-d 2 -s 2000 -t 10000"
+ slon_args = "-d 2 -s 500 -t 2500"
if lag is not None:
slon_args = "%s -l '%s'" % (slon_args, lag)
cmd = [
=== modified file 'database/sampledata/current-dev.sql'
--- database/sampledata/current-dev.sql 2011-07-13 06:06:53 +0000
+++ database/sampledata/current-dev.sql 2011-08-16 00:31:36 +0000
@@ -1918,22 +1918,22 @@

ALTER TABLE distribution DISABLE TRIGGER ALL;

-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (1, 'ubuntu', 'Ubuntu Linux', 'Ubuntu is a new approach to Linux Distribution that includes regular releases, and a simplified single-CD installation system.', 'ubuntulinux.org', 17, 'Ubuntu', 'Ubuntu is a new approach to Linux Distribution that includes regular releases, and a simplified single-CD installation system.', 17, NULL, 1, NULL, true, true, NULL, NULL, 3, 59, NULL, NULL, '2006-10-16 18:31:43.415195', NULL, NULL, NULL, NULL, NULL, true, NULL, true, true, NULL, NULL, NULL, NULL, 20, 20, 20, 60);
-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (2, 'redhat', 'Redhat Advanced Server', 'Red Hat is a commercial distribution of the GNU/Linux Operating System.', 'redhat.com', 1, 'Red Hat', 'Red Hat is a commercial distribution of the GNU/Linux Operating System.', 1, NULL, 1, NULL, false, false, NULL, 8, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.417928', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60);
-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (3, 'debian', 'Debian GNU/Linux', 'Debian GNU/Linux is
+INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (1, 'ubuntu', 'Ubuntu Linux', 'Ubuntu is a new approach to Linux Distribution that includes regular releases, and a simplified single-CD installation system.', 'ubuntulinux.org', 17, 'Ubuntu', 'Ubuntu is a new approach to Linux Distribution that includes regular releases, and a simplified single-CD installation system.', 17, NULL, 1, NULL, true, true, NULL, NULL, 3, 59, NULL, NULL, '2006-10-16 18:31:43.415195', NULL, NULL, NULL, NULL, NULL, true, NULL, true, true, NULL, NULL, NULL, NULL, 20, 20, 20, 60, '{package_name}_derivatives@packages.qa.debian.org');
+INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (2, 'redhat', 'Redhat Advanced Server', 'Red Hat is a commercial distribution of the GNU/Linux Operating System.', 'redhat.com', 1, 'Red Hat', 'Red Hat is a commercial distribution of the GNU/Linux Operating System.', 1, NULL, 1, NULL, false, false, NULL, 8, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.417928', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL);
+INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (3, 'debian', 'Debian GNU/Linux', 'Debian GNU/Linux is
a non commercial distribution of a GNU/Linux Operating System for many
platforms.', 'debian.org', 1, 'Debian', 'Debian GNU/Linux is
a non commercial distribution of a GNU/Linux Operating System for many
-platforms.', 1, NULL, 1, NULL, false, false, NULL, NULL, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.418942', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60);
-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (4, 'gentoo', 'The Gentoo Linux', 'Gentoo is a very
+platforms.', 1, NULL, 1, NULL, false, false, NULL, NULL, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.418942', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL);
+INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (4, 'gentoo', 'The Gentoo Linux', 'Gentoo is a very
customizeable GNU/Linux Distribution that is designed to let you build every
-single package yourself, with your own preferences.', 'gentoo.org', 1, 'Gentoo', 'Gentoo is a very customizeable GNU/Linux Distribution that is designed to let you build every single package yourself, with your own preferences.', 1, NULL, 1, NULL, true, false, NULL, NULL, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.41974', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60);
-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (5, 'kubuntu', 'Kubuntu - Free KDE-based Linux', 'Kubuntu is an entirely free Linux distribution that uses the K Desktop
+single package yourself, with your own preferences.', 'gentoo.org', 1, 'Gentoo', 'Gentoo is a very customizeable GNU/Linux Distribution that is designed to let you build every single package yourself, with your own preferences.', 1, NULL, 1, NULL, true, false, NULL, NULL, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.41974', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL);
+INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (5, 'kubuntu', 'Kubuntu - Free KDE-based Linux', 'Kubuntu is an entirely free Linux distribution that uses the K Desktop
Environment as its default desktop after install.', 'kubuntu.org', 1, 'Kubuntu', 'Kubuntu is an entirely free Linux distribution that uses the K Desktop
-Environment as its default desktop after install.', 1, NULL, 1, NULL, false, false, NULL, 8, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.420551', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60);
-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (7, 'guadalinex', 'GuadaLinex: Linux for Andalucia', 'GuadaLinex is based on Ubuntu and adds full support for applications specific to the local environment in Andalucia.', 'guadalinex.es', 4, 'GuadaLinex', 'The GuadaLinex team produces a high quality linux for the Andalucian marketplace.', 32, NULL, 1, NULL, false, false, NULL, NULL, NULL, 4, NULL, NULL, '2006-10-16 18:31:43.421329', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60);
-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (8, 'ubuntutest', 'Ubuntu Test', 'Ubuntu Test', 'ubuntulinux.org', 17, 'ubuntutest', 'Ubuntu Test summary', 17, NULL, 1, NULL, false, false, NULL, NULL, NULL, 17, NULL, NULL, '2006-10-16 18:31:43.422162', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60);
-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (9, 'deribuntu', 'Deribuntu', 'Deribuntu', 'deribuntu', 16, 'Deribuntu', 'Deribuntu', 16, NULL, 1, NULL, false, false, NULL, NULL, NULL, 16, NULL, NULL, '2011-03-17 14:28:54.354337', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60);
+Environment as its default desktop after install.', 1, NULL, 1, NULL, false, false, NULL, 8, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.420551', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL);
+INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (7, 'guadalinex', 'GuadaLinex: Linux for Andalucia', 'GuadaLinex is based on Ubuntu and adds full support for applications specific to the local environment in Andalucia.', 'guadalinex.es', 4, 'GuadaLinex', 'The GuadaLinex team produces a high quality linux for the Andalucian marketplace.', 32, NULL, 1, NULL, false, false, NULL, NULL, NULL, 4, NULL, NULL, '2006-10-16 18:31:43.421329', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL);
+INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (8, 'ubuntutest', 'Ubuntu Test', 'Ubuntu Test', 'ubuntulinux.org', 17, 'ubuntutest', 'Ubuntu Test summary', 17, NULL, 1, NULL, false, false, NULL, NULL, NULL, 17, NULL, NULL, '2006-10-16 18:31:43.422162', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL);
+INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (9, 'deribuntu', 'Deribuntu', 'Deribuntu', 'deribuntu', 16, 'Deribuntu', 'Deribuntu', 16, NULL, 1, NULL, false, false, NULL, NULL, NULL, 16, NULL, NULL, '2011-03-17 14:28:54.354337', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL);


ALTER TABLE distribution ENABLE TRIGGER ALL;

=== modified file 'database/sampledata/current.sql'
--- database/sampledata/current.sql 2011-07-13 06:06:53 +0000
+++ database/sampledata/current.sql 2011-08-16 00:31:36 +0000
@@ -1917,21 +1917,21 @@

ALTER TABLE distribution DISABLE TRIGGER ALL;

-INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (1, 'ubuntu', 'Ubuntu Linux', 'Ubuntu is a new approach to Linux Distribution that includes regular releases, and a simplified single-CD installation system.', 'ubuntulinux.org', 17, 'Ubuntu', 'Ubuntu is a new approach to Linux Distribution that includes regular releases, and a simplified single-CD installation system.', 17, NULL, 1, NULL, true, true, NULL, NULL, 3, 59, NULL, NULL, '2006-10-16 18:31:43.415195', NULL, NULL, NULL, NULL, NULL, true, NULL, true, true, NULL, NULL, NULL, NULL, 10, 10, 10, 60);
125 | -INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (2, 'redhat', 'Redhat Advanced Server', 'Red Hat is a commercial distribution of the GNU/Linux Operating System.', 'redhat.com', 1, 'Red Hat', 'Red Hat is a commercial distribution of the GNU/Linux Operating System.', 1, NULL, 1, NULL, false, false, NULL, 8, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.417928', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60); |
126 | -INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (3, 'debian', 'Debian GNU/Linux', 'Debian GNU/Linux is |
127 | +INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (1, 'ubuntu', 'Ubuntu Linux', 'Ubuntu is a new approach to Linux Distribution that includes regular releases, and a simplified single-CD installation system.', 'ubuntulinux.org', 17, 'Ubuntu', 'Ubuntu is a new approach to Linux Distribution that includes regular releases, and a simplified single-CD installation system.', 17, NULL, 1, NULL, true, true, NULL, NULL, 3, 59, NULL, NULL, '2006-10-16 18:31:43.415195', NULL, NULL, NULL, NULL, NULL, true, NULL, true, true, NULL, NULL, NULL, NULL, 10, 10, 10, 60, '{package_name}_derivatives@packages.qa.debian.org'); |
128 | +INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (2, 'redhat', 'Redhat Advanced Server', 'Red Hat is a commercial distribution of the GNU/Linux Operating System.', 'redhat.com', 1, 'Red Hat', 'Red Hat is a commercial distribution of the GNU/Linux Operating System.', 1, NULL, 1, NULL, false, false, NULL, 8, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.417928', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL); |
129 | +INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (3, 'debian', 'Debian GNU/Linux', 'Debian GNU/Linux is |
130 | a non commercial distribution of a GNU/Linux Operating System for many |
131 | platforms.', 'debian.org', 1, 'Debian', 'Debian GNU/Linux is |
132 | a non commercial distribution of a GNU/Linux Operating System for many |
133 | -platforms.', 1, NULL, 1, NULL, false, false, NULL, NULL, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.418942', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60); |
134 | -INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (4, 'gentoo', 'The Gentoo Linux', 'Gentoo is a very |
135 | +platforms.', 1, NULL, 1, NULL, false, false, NULL, NULL, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.418942', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL); |
136 | +INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (4, 'gentoo', 'The Gentoo Linux', 'Gentoo is a very |
137 | customizeable GNU/Linux Distribution that is designed to let you build every |
138 | -single package yourself, with your own preferences.', 'gentoo.org', 1, 'Gentoo', 'Gentoo is a very customizeable GNU/Linux Distribution that is designed to let you build every single package yourself, with your own preferences.', 1, NULL, 1, NULL, true, false, NULL, NULL, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.41974', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60); |
139 | -INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (5, 'kubuntu', 'Kubuntu - Free KDE-based Linux', 'Kubuntu is an entirely free Linux distribution that uses the K Desktop |
140 | +single package yourself, with your own preferences.', 'gentoo.org', 1, 'Gentoo', 'Gentoo is a very customizeable GNU/Linux Distribution that is designed to let you build every single package yourself, with your own preferences.', 1, NULL, 1, NULL, true, false, NULL, NULL, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.41974', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL); |
141 | +INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (5, 'kubuntu', 'Kubuntu - Free KDE-based Linux', 'Kubuntu is an entirely free Linux distribution that uses the K Desktop |
142 | Environment as its default desktop after install.', 'kubuntu.org', 1, 'Kubuntu', 'Kubuntu is an entirely free Linux distribution that uses the K Desktop |
143 | -Environment as its default desktop after install.', 1, NULL, 1, NULL, false, false, NULL, 8, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.420551', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60); |
144 | -INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (7, 'guadalinex', 'GuadaLinex: Linux for Andalucia', 'GuadaLinex is based on Ubuntu and adds full support for applications specific to the local environment in Andalucia.', 'guadalinex.es', 4, 'GuadaLinex', 'The GuadaLinex team produces a high quality linux for the Andalucian marketplace.', 32, NULL, 1, NULL, false, false, NULL, NULL, NULL, 4, NULL, NULL, '2006-10-16 18:31:43.421329', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60); |
145 | -INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant) VALUES (8, 'ubuntutest', 'Ubuntu Test', 'Ubuntu Test', 'ubuntulinux.org', 17, 'ubuntutest', 'Ubuntu Test summary', 17, NULL, 1, NULL, false, false, NULL, NULL, NULL, 17, NULL, NULL, '2006-10-16 18:31:43.422162', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60); |
146 | +Environment as its default desktop after install.', 1, NULL, 1, NULL, false, false, NULL, 8, NULL, 1, NULL, NULL, '2006-10-16 18:31:43.420551', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL); |
147 | +INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (7, 'guadalinex', 'GuadaLinex: Linux for Andalucia', 'GuadaLinex is based on Ubuntu and adds full support for applications specific to the local environment in Andalucia.', 'guadalinex.es', 4, 'GuadaLinex', 'The GuadaLinex team produces a high quality linux for the Andalucian marketplace.', 32, NULL, 1, NULL, false, false, NULL, NULL, NULL, 4, NULL, NULL, '2006-10-16 18:31:43.421329', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL); |
148 | +INSERT INTO distribution (id, name, title, description, domainname, owner, displayname, summary, members, translationgroup, translationpermission, bug_supervisor, official_malone, official_rosetta, security_contact, driver, translation_focus, mirror_admin, upload_admin, upload_sender, date_created, homepage_content, icon, mugshot, logo, fti, official_answers, language_pack_admin, official_blueprints, enable_bug_expiration, bug_reporting_guidelines, reviewer_whiteboard, max_bug_heat, bug_reported_acknowledgement, answers_usage, blueprints_usage, translations_usage, registrant, package_derivatives_email) VALUES (8, 'ubuntutest', 'Ubuntu Test', 'Ubuntu Test', 'ubuntulinux.org', 17, 'ubuntutest', 'Ubuntu Test summary', 17, NULL, 1, NULL, false, false, NULL, NULL, NULL, 17, NULL, NULL, '2006-10-16 18:31:43.422162', NULL, NULL, NULL, NULL, NULL, false, NULL, false, false, NULL, NULL, NULL, NULL, 10, 10, 10, 60, NULL); |
149 | |
150 | |
151 | ALTER TABLE distribution ENABLE TRIGGER ALL; |
152 | |
153 | === modified file 'database/schema/Makefile' |
154 | --- database/schema/Makefile 2011-01-31 11:10:39 +0000 |
155 | +++ database/schema/Makefile 2011-08-16 00:31:36 +0000 |
156 | @@ -128,7 +128,6 @@ |
157 | @ psql -d ${EMPTY_DBNAME} -q -c "CREATE SCHEMA todrop;" |
158 | @ echo "* Creating functions" |
159 | @ psql -d ${EMPTY_DBNAME} -f trusted.sql | grep : | cat |
160 | - @ psql -d ${EMPTY_DBNAME} -f testfuncs.sql | grep : | cat |
161 | @ echo "* Installing tsearch2 into ts2 schema" |
162 | @ ${PYTHON} fti.py -q --setup-only -d ${EMPTY_DBNAME} |
163 | @ echo "* Loading base database schema" |
164 | |
165 | === modified file 'database/schema/comments.sql' |
166 | --- database/schema/comments.sql 2011-07-08 17:12:15 +0000 |
167 | +++ database/schema/comments.sql 2011-08-16 00:31:36 +0000 |
168 | @@ -1129,6 +1129,7 @@ |
169 | COMMENT ON COLUMN Distribution.max_bug_heat IS 'The highest heat value across bugs for this distribution.'; |
170 | COMMENT ON COLUMN Distribution.bug_reported_acknowledgement IS 'A message of acknowledgement to display to a bug reporter after they\'ve reported a new bug.'; |
171 | COMMENT ON COLUMN Distribution.registrant IS 'The person in launchpad who registered this distribution.'; |
172 | +COMMENT ON COLUMN Distribution.package_derivatives_email IS 'The optional email address template to use when sending emails about package updates in a distribution. The string {package_name} in the template will be replaced with the actual package name being updated.'; |
173 | |
174 | -- DistroSeries |
175 | |
176 | @@ -1676,6 +1677,7 @@ |
177 | COMMENT ON COLUMN DistroSeries.driver IS 'This is a person or team who can act as a driver for this specific release - note that the distribution drivers can also set goals for any release.'; |
178 | COMMENT ON COLUMN DistroSeries.changeslist IS 'The email address (name name) of the changes announcement list for this distroseries. If NULL, no announcement mail will be sent.'; |
179 | COMMENT ON COLUMN DistroSeries.defer_translation_imports IS 'Don''t accept PO imports for this release just now.'; |
180 | +COMMENT ON COLUMN DistroSeries.include_long_descriptions IS 'Include long descriptions in Packages rather than in Translation-en.'; |
181 | |
182 | |
183 | -- DistroArchSeries |
184 | @@ -1900,6 +1902,7 @@ |
185 | COMMENT ON COLUMN PackagingJob.productseries IS 'The productseries of the Packaging.'; |
186 | COMMENT ON COLUMN PackagingJob.sourcepackagename IS 'The sourcepackage of the Packaging.'; |
187 | COMMENT ON COLUMN PackagingJob.distroseries IS 'The distroseries of the Packaging.'; |
188 | +COMMENT ON COLUMN PackagingJob.potemplate IS 'A POTemplate to restrict the job to or NULL if all templates need to be handled.'; |
189 | |
190 | -- Translator / TranslationGroup |
191 | |
192 | |
193 | === modified file 'database/schema/full-update.py' |
194 | --- database/schema/full-update.py 2011-07-26 08:37:52 +0000 |
195 | +++ database/schema/full-update.py 2011-08-16 00:31:36 +0000 |
196 | @@ -6,7 +6,7 @@ |
197 | |
198 | import _pythonpath |
199 | |
200 | -import os.path |
201 | +from datetime import datetime |
202 | from optparse import OptionParser |
203 | import subprocess |
204 | import sys |
205 | @@ -105,16 +105,17 @@ |
206 | # work unattended. |
207 | # |
208 | |
209 | + # Confirm we can invoke PGBOUNCER_INITD |
210 | + log.debug("Confirming sudo access to pgbouncer startup script") |
211 | + pgbouncer_rc = run_pgbouncer(log, 'status') |
212 | + if pgbouncer_rc != 0: |
213 | + return pgbouncer_rc |
214 | + |
215 | # We initially ignore open connections, as they will shortly be |
216 | # killed. |
217 | if not NoConnectionCheckPreflight(log).check_all(): |
218 | return 99 |
219 | |
220 | - # Confirm we can invoke PGBOUNCER_INITD |
221 | - pgbouncer_rc = run_pgbouncer(log, 'status') |
222 | - if pgbouncer_rc != 0: |
223 | - return pgbouncer_rc |
224 | - |
225 | # |
226 | # Start the actual upgrade. Failures beyond this point need to |
227 | # generate informative messages to help with recovery. |
228 | @@ -125,8 +126,11 @@ |
229 | upgrade_run = False |
230 | security_run = False |
231 | |
232 | + outage_start = datetime.now() |
233 | + |
234 | try: |
235 | # Shutdown pgbouncer |
236 | + log.info("Outage starts. Shutting down pgbouncer.") |
237 | pgbouncer_rc = run_pgbouncer(log, 'stop') |
238 | if pgbouncer_rc != 0: |
239 | log.fatal("pgbouncer not shut down [%s]", pgbouncer_rc) |
240 | @@ -136,10 +140,12 @@ |
241 | if not KillConnectionsPreflight(log).check_all(): |
242 | return 100 |
243 | |
244 | + log.info("Preflight check succeeded. Starting upgrade.") |
245 | upgrade_rc = run_upgrade(options, log) |
246 | if upgrade_rc != 0: |
247 | return upgrade_rc |
248 | upgrade_run = True |
249 | + log.info("Database patches applied. Stored procedures updated.") |
250 | |
251 | security_rc = run_security(options, log) |
252 | if security_rc != 0: |
253 | @@ -148,11 +154,13 @@ |
254 | |
255 | log.info("All database upgrade steps completed") |
256 | |
257 | + log.info("Restarting pgbouncer") |
258 | pgbouncer_rc = run_pgbouncer(log, 'start') |
259 | if pgbouncer_rc != 0: |
260 | log.fatal("pgbouncer not restarted [%s]", pgbouncer_rc) |
261 | return pgbouncer_rc |
262 | pgbouncer_down = False |
263 | + log.info("Outage complete. %s", datetime.now() - outage_start) |
264 | |
265 | # We will start seeing connections as soon as pgbouncer is |
266 | # reenabled, so ignore them here. |
267 | @@ -180,6 +188,7 @@ |
268 | pgbouncer_rc = run_pgbouncer(log, 'start') |
269 | if pgbouncer_rc == 0: |
270 | log.info("Despite failures, pgbouncer restarted.") |
271 | + log.info("Outage complete. %s", datetime.now() - outage_start) |
272 | else: |
273 | log.fatal("pgbouncer is down and refuses to restart") |
274 | if not upgrade_run: |
275 | |
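The full-update.py hunks above move the pgbouncer sudo check ahead of the connection preflight and bracket the outage with timestamps so both the success path and the failure path report how long the outage lasted. A minimal sketch of that timing pattern (the `run_step` helper is hypothetical, standing in for `run_pgbouncer`/`run_upgrade`/`run_security`):

```python
from datetime import datetime
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("full-update")


def run_step(name):
    # Placeholder for the real upgrade steps; returns 0 on success.
    log.info("Running %s", name)
    return 0


def timed_outage(steps):
    # Record when the outage starts so every exit path can report
    # its duration, exactly as the patch does on success and failure.
    outage_start = datetime.now()
    try:
        for step in steps:
            rc = run_step(step)
            if rc != 0:
                return rc
        return 0
    finally:
        log.info("Outage complete. %s", datetime.now() - outage_start)


rc = timed_outage(["stop pgbouncer", "upgrade", "security", "start pgbouncer"])
```

The `finally` clause is what makes the "Outage complete" message appear on both branches, matching the two `log.info` calls added by the patch.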
276 | === added file 'database/schema/patch-2208-76-3.sql' |
277 | --- database/schema/patch-2208-76-3.sql 1970-01-01 00:00:00 +0000 |
278 | +++ database/schema/patch-2208-76-3.sql 2011-08-16 00:31:36 +0000 |
279 | @@ -0,0 +1,15 @@ |
280 | +-- Copyright 2011 Canonical Ltd. This software is licensed under the |
281 | +-- GNU Affero General Public License version 3 (see the file LICENSE). |
282 | + |
283 | +SET client_min_messages = ERROR; |
284 | + |
285 | +-- Drop old unused functions still lurking on production. |
286 | +DROP FUNCTION IF EXISTS is_blacklisted_name(text); |
287 | +DROP FUNCTION IF EXISTS name_blacklist_match(text); |
288 | +DROP FUNCTION IF EXISTS reverse(text); |
289 | +DROP FUNCTION IF EXISTS bug_summary_temp_journal_clean_row(bugsummary); |
290 | +DROP FUNCTION IF EXISTS valid_version(text); |
291 | +DROP FUNCTION IF EXISTS decendantrevision(integer); |
292 | +DROP FUNCTION IF EXISTS sleep_for_testing(float); |
293 | + |
294 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2208, 76, 3); |
295 | |
296 | === added file 'database/schema/patch-2208-76-4.sql' |
297 | --- database/schema/patch-2208-76-4.sql 1970-01-01 00:00:00 +0000 |
298 | +++ database/schema/patch-2208-76-4.sql 2011-08-16 00:31:36 +0000 |
299 | @@ -0,0 +1,159 @@ |
300 | +-- Copyright 2011 Canonical Ltd. This software is licensed under the |
301 | +-- GNU Affero General Public License version 3 (see the file LICENSE). |
302 | + |
303 | +SET client_min_messages=ERROR; |
304 | + |
305 | +DROP FUNCTION bugsummary_rollup_journal(); |
306 | + |
307 | +CREATE OR REPLACE FUNCTION bugsummary_rollup_journal(batchsize integer=NULL) |
308 | +RETURNS VOID |
309 | +LANGUAGE plpgsql VOLATILE |
310 | +CALLED ON NULL INPUT |
311 | +SECURITY DEFINER SET search_path TO public AS |
312 | +$$ |
313 | +DECLARE |
314 | + d bugsummary%ROWTYPE; |
315 | + max_id integer; |
316 | +BEGIN |
317 | + -- Lock so we don't contend with other invocations of this |
318 | + -- function. We can happily lock the BugSummary table for writes |
319 | + -- as this function is the only thing that updates that table. |
320 | + -- BugSummaryJournal remains unlocked so nothing should be blocked. |
321 | + LOCK TABLE BugSummary IN ROW EXCLUSIVE MODE; |
322 | + |
323 | + IF batchsize IS NULL THEN |
324 | + SELECT MAX(id) INTO max_id FROM BugSummaryJournal; |
325 | + ELSE |
326 | + SELECT MAX(id) INTO max_id FROM ( |
327 | + SELECT id FROM BugSummaryJournal ORDER BY id LIMIT batchsize |
328 | + ) AS Whatever; |
329 | + END IF; |
330 | + |
331 | + FOR d IN |
332 | + SELECT |
333 | + NULL as id, |
334 | + SUM(count), |
335 | + product, |
336 | + productseries, |
337 | + distribution, |
338 | + distroseries, |
339 | + sourcepackagename, |
340 | + viewed_by, |
341 | + tag, |
342 | + status, |
343 | + milestone, |
344 | + importance, |
345 | + has_patch, |
346 | + fixed_upstream |
347 | + FROM BugSummaryJournal |
348 | + WHERE id <= max_id |
349 | + GROUP BY |
350 | + product, productseries, distribution, distroseries, |
351 | + sourcepackagename, viewed_by, tag, status, milestone, |
352 | + importance, has_patch, fixed_upstream |
353 | + HAVING sum(count) <> 0 |
354 | + LOOP |
355 | + IF d.count < 0 THEN |
356 | + PERFORM bug_summary_dec(d); |
357 | + ELSIF d.count > 0 THEN |
358 | + PERFORM bug_summary_inc(d); |
359 | + END IF; |
360 | + END LOOP; |
361 | + |
362 | + -- Clean out any counts we reduced to 0. |
363 | + DELETE FROM BugSummary WHERE count=0; |
364 | + -- Clean out the journal entries we have handled. |
365 | + DELETE FROM BugSummaryJournal WHERE id <= max_id; |
366 | +END; |
367 | +$$; |
368 | + |
369 | +COMMENT ON FUNCTION bugsummary_rollup_journal(integer) IS |
370 | +'Collate and migrate rows from BugSummaryJournal to BugSummary'; |
371 | + |
372 | + |
373 | +CREATE OR REPLACE FUNCTION bug_summary_dec(bugsummary) RETURNS VOID |
374 | +LANGUAGE SQL AS |
375 | +$$ |
376 | + -- We own the row reference, so in the absence of bugs this cannot |
377 | + -- fail - just decrement the row. |
378 | + UPDATE BugSummary SET count = count + $1.count |
379 | + WHERE |
380 | + ((product IS NULL AND $1.product IS NULL) |
381 | + OR product = $1.product) |
382 | + AND ((productseries IS NULL AND $1.productseries IS NULL) |
383 | + OR productseries = $1.productseries) |
384 | + AND ((distribution IS NULL AND $1.distribution IS NULL) |
385 | + OR distribution = $1.distribution) |
386 | + AND ((distroseries IS NULL AND $1.distroseries IS NULL) |
387 | + OR distroseries = $1.distroseries) |
388 | + AND ((sourcepackagename IS NULL AND $1.sourcepackagename IS NULL) |
389 | + OR sourcepackagename = $1.sourcepackagename) |
390 | + AND ((viewed_by IS NULL AND $1.viewed_by IS NULL) |
391 | + OR viewed_by = $1.viewed_by) |
392 | + AND ((tag IS NULL AND $1.tag IS NULL) |
393 | + OR tag = $1.tag) |
394 | + AND status = $1.status |
395 | + AND ((milestone IS NULL AND $1.milestone IS NULL) |
396 | + OR milestone = $1.milestone) |
397 | + AND importance = $1.importance |
398 | + AND has_patch = $1.has_patch |
399 | + AND fixed_upstream = $1.fixed_upstream; |
400 | +$$; |
401 | + |
402 | +CREATE OR REPLACE FUNCTION bug_summary_inc(d bugsummary) RETURNS VOID |
403 | +LANGUAGE plpgsql AS |
404 | +$$ |
405 | +BEGIN |
406 | + -- Shamelessly adapted from the PostgreSQL manual |
407 | + LOOP |
408 | + -- first try to update the row |
409 | + UPDATE BugSummary SET count = count + d.count |
410 | + WHERE |
411 | + ((product IS NULL AND $1.product IS NULL) |
412 | + OR product = $1.product) |
413 | + AND ((productseries IS NULL AND $1.productseries IS NULL) |
414 | + OR productseries = $1.productseries) |
415 | + AND ((distribution IS NULL AND $1.distribution IS NULL) |
416 | + OR distribution = $1.distribution) |
417 | + AND ((distroseries IS NULL AND $1.distroseries IS NULL) |
418 | + OR distroseries = $1.distroseries) |
419 | + AND ((sourcepackagename IS NULL AND $1.sourcepackagename IS NULL) |
420 | + OR sourcepackagename = $1.sourcepackagename) |
421 | + AND ((viewed_by IS NULL AND $1.viewed_by IS NULL) |
422 | + OR viewed_by = $1.viewed_by) |
423 | + AND ((tag IS NULL AND $1.tag IS NULL) |
424 | + OR tag = $1.tag) |
425 | + AND status = $1.status |
426 | + AND ((milestone IS NULL AND $1.milestone IS NULL) |
427 | + OR milestone = $1.milestone) |
428 | + AND importance = $1.importance |
429 | + AND has_patch = $1.has_patch |
430 | + AND fixed_upstream = $1.fixed_upstream; |
431 | + IF found THEN |
432 | + RETURN; |
433 | + END IF; |
434 | + -- not there, so try to insert the key |
435 | + -- if someone else inserts the same key concurrently, |
436 | + -- we could get a unique-key failure |
437 | + BEGIN |
438 | + INSERT INTO BugSummary( |
439 | + count, product, productseries, distribution, |
440 | + distroseries, sourcepackagename, viewed_by, tag, |
441 | + status, milestone, |
442 | + importance, has_patch, fixed_upstream) |
443 | + VALUES ( |
444 | + d.count, d.product, d.productseries, d.distribution, |
445 | + d.distroseries, d.sourcepackagename, d.viewed_by, d.tag, |
446 | + d.status, d.milestone, |
447 | + d.importance, d.has_patch, d.fixed_upstream); |
448 | + RETURN; |
449 | + EXCEPTION WHEN unique_violation THEN |
450 | + -- do nothing, and loop to try the UPDATE again |
451 | + END; |
452 | + END LOOP; |
453 | +END; |
454 | +$$; |
455 | + |
456 | + |
457 | + |
458 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2208, 76, 4); |
459 | |
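The `bug_summary_inc` function above is the classic UPDATE-then-INSERT retry loop from the PostgreSQL manual, needed because this predates `INSERT ... ON CONFLICT`: try the UPDATE, fall back to INSERT, and if a concurrent writer triggers a unique violation, loop back to the UPDATE. A rough Python analogue of that control flow, with a dict standing in for the BugSummary table (the simulation of the unique index is an assumption for illustration):

```python
class UniqueViolation(Exception):
    """Stands in for PostgreSQL's unique_violation error."""


def insert_row(table, key, count):
    # Simulates a unique index: inserting an existing key fails.
    if key in table:
        raise UniqueViolation(key)
    table[key] = count


def bug_summary_inc(table, key, count):
    # UPDATE -> INSERT -> retry loop, mirroring bug_summary_inc().
    while True:
        if key in table:
            # The UPDATE branch: the row exists, so adjust its count.
            table[key] += count
            return
        try:
            # Not there, so try to insert the key.
            insert_row(table, key, count)
            return
        except UniqueViolation:
            # A concurrent writer inserted the same key first;
            # loop and take the UPDATE branch instead.
            continue


summary = {}
bug_summary_inc(summary, ("ubuntu", "high"), 1)
```

In the SQL version the loop can genuinely iterate because another transaction may commit the key between the UPDATE and the INSERT; the dict version only ever takes one pass but preserves the shape of the logic.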
460 | === added file 'database/schema/patch-2208-79-0.sql' |
461 | --- database/schema/patch-2208-79-0.sql 1970-01-01 00:00:00 +0000 |
462 | +++ database/schema/patch-2208-79-0.sql 2011-08-16 00:31:36 +0000 |
463 | @@ -0,0 +1,9 @@ |
464 | +-- Copyright 2011 Canonical Ltd. This software is licensed under the |
465 | +-- GNU Affero General Public License version 3 (see the file LICENSE). |
466 | + |
467 | +SET client_min_messages=ERROR; |
468 | + |
469 | +ALTER TABLE distroseries |
470 | + ADD COLUMN include_long_descriptions BOOLEAN NOT NULL DEFAULT TRUE; |
471 | + |
472 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2208, 79, 0); |
473 | |
474 | === added file 'database/schema/patch-2208-79-1.sql' |
475 | --- database/schema/patch-2208-79-1.sql 1970-01-01 00:00:00 +0000 |
476 | +++ database/schema/patch-2208-79-1.sql 2011-08-16 00:31:36 +0000 |
477 | @@ -0,0 +1,26 @@ |
478 | +-- Copyright 2011 Canonical Ltd. This software is licensed under the |
479 | +-- GNU Affero General Public License version 3 (see the file LICENSE). |
480 | +SET client_min_messages=ERROR; |
481 | + |
482 | +ALTER TABLE PackagingJob |
483 | + ADD COLUMN |
484 | + potemplate INTEGER DEFAULT NULL |
485 | + CONSTRAINT potemplate_fk REFERENCES POTemplate; |
486 | + |
487 | +ALTER TABLE PackagingJob |
488 | + ALTER COLUMN productseries DROP NOT NULL, |
489 | + ALTER COLUMN distroseries DROP NOT NULL, |
490 | + ALTER COLUMN sourcepackagename DROP NOT NULL, |
491 | + ADD CONSTRAINT translationtemplatejob_valid_link CHECK ( |
492 | + -- If there is a template, it is the template being moved. |
493 | + (potemplate IS NOT NULL AND productseries IS NULL AND |
494 | + distroseries IS NULL AND sourcepackagename IS NULL) OR |
495 | + -- If there is no template, we need all of productseries, distroseries |
496 | + -- and sourcepackagename because we are moving translations between |
497 | + -- a productseries and a source package. |
498 | + (potemplate IS NULL AND productseries IS NOT NULL AND |
499 | + distroseries IS NOT NULL AND sourcepackagename IS NOT NULL)); |
500 | + |
501 | +CREATE INDEX packagingjob__potemplate__idx ON PackagingJob (potemplate); |
502 | + |
503 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2208, 79, 1); |
504 | |
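The `translationtemplatejob_valid_link` CHECK constraint added above enforces an either/or shape: a row names a template, or the full (productseries, distroseries, sourcepackagename) triple, never a mixture. The same rule as a Python predicate (field names follow the patch; the function itself is illustrative):

```python
def valid_link(potemplate, productseries, distroseries, sourcepackagename):
    # Row moves a single template: no packaging triple allowed.
    template_move = (
        potemplate is not None and productseries is None
        and distroseries is None and sourcepackagename is None)
    # Row moves translations between a productseries and a source
    # package: the full triple is required and no template.
    packaging_move = (
        potemplate is None and productseries is not None
        and distroseries is not None and sourcepackagename is not None)
    return template_move or packaging_move
```

Dropping the three NOT NULL constraints is what makes the template-only rows representable; the CHECK then rules out every partially-filled combination.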
505 | === added file 'database/schema/patch-2208-80-1.sql' |
506 | --- database/schema/patch-2208-80-1.sql 1970-01-01 00:00:00 +0000 |
507 | +++ database/schema/patch-2208-80-1.sql 2011-08-16 00:31:36 +0000 |
508 | @@ -0,0 +1,12 @@ |
509 | +-- Copyright 2011 Canonical Ltd. This software is licensed under the |
510 | +-- GNU Affero General Public License version 3 (see the file LICENSE). |
511 | + |
512 | +SET client_min_messages=ERROR; |
513 | + |
514 | +ALTER TABLE distribution |
515 | + ADD COLUMN package_derivatives_email TEXT; |
516 | +UPDATE distribution |
517 | + SET package_derivatives_email = '{package_name}_derivatives@packages.qa.debian.org' |
518 | + WHERE name='ubuntu'; |
519 | + |
520 | +INSERT INTO LaunchpadDatabaseRevision VALUES (2208, 80, 1); |
521 | |
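The new `package_derivatives_email` column stores an address template rather than a literal address; the `{package_name}` placeholder in the Ubuntu value above is presumably expanded per package with `str.format`-style substitution. A sketch of how such a template expands:

```python
# Value set for Ubuntu by the patch above; '{package_name}' is a
# per-package placeholder, not a literal address component.
template = '{package_name}_derivatives@packages.qa.debian.org'

# Hypothetical expansion for a package named 'hello'.
address = template.format(package_name='hello')
# address == 'hello_derivatives@packages.qa.debian.org'
```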
522 | === modified file 'database/schema/preflight.py' |
523 | --- database/schema/preflight.py 2011-07-25 13:59:01 +0000 |
524 | +++ database/schema/preflight.py 2011-08-16 00:31:36 +0000 |
525 | @@ -15,7 +15,7 @@ |
526 | |
527 | from datetime import timedelta |
528 | from optparse import OptionParser |
529 | -import sys |
530 | +import time |
531 | |
532 | import psycopg2 |
533 | |
534 | @@ -226,7 +226,7 @@ |
535 | cluster to be quiescent. |
536 | """ |
537 | if self.is_replicated: |
538 | - success = replication.helpers.sync(30) |
539 | + success = replication.helpers.sync(30, exit_on_fail=False) |
540 | if success: |
541 | self.log.info( |
542 | "Replication events are being propagated.") |
543 | @@ -282,21 +282,40 @@ |
544 | |
545 | System users are defined by SYSTEM_USERS. |
546 | """ |
547 | - for node in self.lpmain_nodes: |
548 | - cur = node.con.cursor() |
549 | - cur.execute(""" |
550 | - SELECT |
551 | - procpid, datname, usename, pg_terminate_backend(procpid) |
552 | - FROM pg_stat_activity |
553 | - WHERE |
554 | - datname=current_database() |
555 | - AND procpid <> pg_backend_pid() |
556 | - AND usename NOT IN %s |
557 | - """ % sqlvalues(SYSTEM_USERS)) |
558 | - for procpid, datname, usename, ignored in cur.fetchall(): |
559 | - self.log.warning( |
560 | - "Killed %s [%s] on %s", usename, procpid, datname) |
561 | - return True |
562 | + # We keep trying to terminate connections every 0.5 seconds for |
563 | + # up to 10 seconds. |
564 | + num_tries = 20 |
565 | + seconds_to_pause = 0.5 |
566 | + for loop_count in range(num_tries): |
567 | + all_clear = True |
568 | + for node in self.lpmain_nodes: |
569 | + cur = node.con.cursor() |
570 | + cur.execute(""" |
571 | + SELECT |
572 | + procpid, datname, usename, |
573 | + pg_terminate_backend(procpid) |
574 | + FROM pg_stat_activity |
575 | + WHERE |
576 | + datname=current_database() |
577 | + AND procpid <> pg_backend_pid() |
578 | + AND usename NOT IN %s |
579 | + """ % sqlvalues(SYSTEM_USERS)) |
580 | + for procpid, datname, usename, ignored in cur.fetchall(): |
581 | + all_clear = False |
582 | + if loop_count == num_tries - 1: |
583 | + self.log.fatal( |
584 | + "Unable to kill %s [%s] on %s", |
585 | + usename, procpid, datname) |
586 | + else: |
587 | + self.log.warning( |
588 | + "Killed %s [%s] on %s", usename, procpid, datname) |
589 | + if all_clear: |
590 | + break |
591 | + |
592 | + # Wait a little for any terminated connections to actually |
593 | + # terminate. |
594 | + time.sleep(seconds_to_pause) |
595 | + return all_clear |
596 | |
597 | |
598 | def main(): |
599 | @@ -337,4 +356,4 @@ |
600 | |
601 | |
602 | if __name__ == '__main__': |
603 | - sys.exit(main()) |
604 | + raise SystemExit(main()) |
605 | |
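The preflight.py change above replaces a single kill pass with a retry loop: terminate non-system backends, pause half a second for them to actually exit, and repeat for up to 10 seconds before giving up. The same pattern, sketched with stand-in callables for the `pg_stat_activity` query and `pg_terminate_backend()` call (`list_backends` and `terminate` are hypothetical names, not Launchpad APIs):

```python
import time

def terminate_until_clear(list_backends, terminate, num_tries=20,
                          seconds_to_pause=0.5):
    """Keep terminating other backends until none remain, or give up
    after roughly num_tries * seconds_to_pause seconds.

    Returns True once no offending backends are left, False if some
    connections survived every attempt.
    """
    for loop_count in range(num_tries):
        backends = list_backends()
        if not backends:
            return True  # all clear
        for pid in backends:
            terminate(pid)
        # Give terminated connections a moment to actually go away
        # before re-checking.
        time.sleep(seconds_to_pause)
    return False
```

The pause matters because `pg_terminate_backend()` only signals the backend; the connection can still show up in `pg_stat_activity` for a moment afterwards.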
606 | === modified file 'database/schema/security.cfg' |
607 | --- database/schema/security.cfg 2011-08-10 15:57:35 +0000 |
608 | +++ database/schema/security.cfg 2011-08-16 00:31:36 +0000 |
609 | @@ -11,7 +11,6 @@ |
610 | |
611 | [public] |
612 | type=group |
613 | -public._killall_backends(text) = |
614 | public.activity() = EXECUTE |
615 | public.add_test_openid_identifier(integer) = EXECUTE |
616 | public.alllocks = |
617 | @@ -144,7 +143,7 @@ |
618 | public.bugnotificationrecipientarchive = SELECT, UPDATE |
619 | public.bugsummary = SELECT |
620 | public.bugsummaryjournal = SELECT |
621 | -public.bugsummary_rollup_journal() = EXECUTE |
622 | +public.bugsummary_rollup_journal(integer) = EXECUTE |
623 | public.bugtag = SELECT, INSERT, DELETE |
624 | public.bugtrackercomponent = SELECT, INSERT, UPDATE |
625 | public.bugtrackercomponentgroup = SELECT, INSERT, UPDATE |
626 | @@ -2163,7 +2162,7 @@ |
627 | public.bugsubscriptionfilterimportance = SELECT |
628 | public.bugsubscriptionfilterstatus = SELECT |
629 | public.bugsubscriptionfiltertag = SELECT |
630 | -public.bugsummary_rollup_journal() = EXECUTE |
631 | +public.bugsummary_rollup_journal(integer) = EXECUTE |
632 | public.bugtag = SELECT |
633 | public.bugwatch = SELECT, UPDATE |
634 | public.bugwatchactivity = SELECT, DELETE |
635 | @@ -2284,3 +2283,11 @@ |
636 | public.potemplate = SELECT |
637 | public.sourcepackagename = SELECT |
638 | type=user |
639 | + |
640 | +[generate_contents_files] |
641 | +type=user |
642 | +groups=archivepublisher |
643 | + |
644 | +[publish_ftpmaster] |
645 | +type=user |
646 | +groups=archivepublisher |
647 | |
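The two new security.cfg sections define database users that inherit the `archivepublisher` group's privileges via the `groups=` key. A rough sketch of reading such sections with the stdlib config parser (the real security.py predates Python 3 and handles many more keys; the sample below is a cut-down stand-in, not the actual file):

```python
from configparser import ConfigParser

# Cut-down stand-in for the new security.cfg sections above.
SAMPLE = """\
[generate_contents_files]
type=user
groups=archivepublisher

[publish_ftpmaster]
type=user
groups=archivepublisher
"""

config = ConfigParser()
config.read_string(SAMPLE)

# Collect user -> group memberships, the shape security.py needs in
# order to issue ALTER GROUP ... ADD USER statements.
memberships = {
    section: config.get(section, 'groups').split(',')
    for section in config.sections()
    if config.get(section, 'type') == 'user'
    and config.has_option(section, 'groups')
}
```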
648 | === modified file 'database/schema/security.py' |
649 | --- database/schema/security.py 2011-07-25 14:10:46 +0000 |
650 | +++ database/schema/security.py 2011-08-16 00:31:36 +0000 |
651 | @@ -26,22 +26,101 @@ |
652 | # The 'read' group does not get given select permission on the following |
653 | # tables. This is to stop the ro user being given access to secrurity |
654 | # sensitive information that interactive sessions don't need. |
655 | -SECURE_TABLES = [ |
656 | +SECURE_TABLES = set(( |
657 | 'public.accountpassword', |
658 | + 'public.accountpassword_id_seq', |
659 | 'public.oauthnonce', |
660 | + 'public.oauthnonce_id_seq', |
661 | 'public.openidnonce', |
662 | + 'public.openidnonce_id_seq', |
663 | 'public.openidconsumernonce', |
664 | - ] |
665 | + 'public.openidconsumernonce_id_seq', |
666 | + )) |
667 | + |
668 | +POSTGRES_ACL_MAP = { |
669 | + 'r': 'SELECT', |
670 | + 'w': 'UPDATE', |
671 | + 'a': 'INSERT', |
672 | + 'd': 'DELETE', |
673 | + 'D': 'TRUNCATE', |
674 | + 'x': 'REFERENCES', |
675 | + 't': 'TRIGGER', |
676 | + 'X': 'EXECUTE', |
677 | + 'U': 'USAGE', |
678 | + 'C': 'CREATE', |
679 | + 'c': 'CONNECT', |
680 | + 'T': 'TEMPORARY', |
681 | + } |
682 | + |
683 | + |
684 | +def _split_postgres_aclitem(aclitem): |
685 | + """Split a PostgreSQL aclitem textual representation. |
686 | + |
687 | + Returns the (grantee, privs, grantor), unquoted and separated. |
688 | + """ |
689 | + components = {'grantee': '', 'privs': '', 'grantor': ''} |
690 | + current_component = 'grantee' |
691 | + inside_quoted = False |
692 | + maybe_finished_quoted = False |
693 | + for char in aclitem: |
694 | + if inside_quoted: |
695 | + if maybe_finished_quoted: |
696 | + maybe_finished_quoted = False |
697 | + if char == '"': |
698 | + components[current_component] += '"' |
699 | + continue |
700 | + else: |
701 | + inside_quoted = False |
702 | + elif char == '"': |
703 | + maybe_finished_quoted = True |
704 | + continue |
705 | + # inside_quoted may have just been made False, so no else block |
706 | + # for you. |
707 | + if not inside_quoted: |
708 | + if char == '"': |
709 | + inside_quoted = True |
710 | + continue |
711 | + elif char == '=': |
712 | + current_component = 'privs' |
713 | + continue |
714 | + elif char == '/': |
715 | + current_component = 'grantor' |
716 | + continue |
717 | + components[current_component] += char |
718 | + return components['grantee'], components['privs'], components['grantor'] |
719 | + |
720 | + |
721 | +def parse_postgres_acl(acl): |
722 | + """Parse a PostgreSQL object ACL into a dict with permission names. |
723 | + |
724 | + The dict is of the form {user: {permission: grant option}}. |
725 | + """ |
726 | + parsed = {} |
727 | + if acl is None: |
728 | + return parsed |
729 | + for entry in acl: |
730 | + grantee, privs, grantor = _split_postgres_aclitem(entry) |
731 | + if grantee == '': |
732 | + grantee = 'public' |
733 | + parsed_privs = [] |
734 | + for priv in privs: |
735 | + if priv == '*': |
736 | + parsed_privs[-1] = (parsed_privs[-1][0], True) |
737 | + continue |
738 | + parsed_privs.append((POSTGRES_ACL_MAP[priv], False)) |
739 | + parsed[grantee] = dict(parsed_privs) |
740 | + return parsed |
741 | |
742 | |
743 | class DbObject(object): |
744 | |
745 | def __init__( |
746 | - self, schema, name, type_, owner, arguments=None, language=None): |
747 | + self, schema, name, type_, owner, acl, arguments=None, language=None): |
748 | self.schema = schema |
749 | self.name = name |
750 | self.type = type_ |
751 | self.owner = owner |
752 | + self.acl = acl |
753 | self.arguments = arguments |
754 | self.language = language |
755 | |
756 | @@ -80,7 +159,8 @@ |
757 | WHEN 'S' THEN 'sequence' |
758 | WHEN 's' THEN 'special' |
759 | END as "Type", |
760 | - u.usename as "Owner" |
761 | + u.usename as "Owner", |
762 | + c.relacl::text[] as "ACL" |
763 | FROM pg_catalog.pg_class c |
764 | LEFT JOIN pg_catalog.pg_user u ON u.usesysid = c.relowner |
765 | LEFT JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace |
766 | @@ -89,9 +169,10 @@ |
767 | AND pg_catalog.pg_table_is_visible(c.oid) |
768 | ORDER BY 1,2 |
769 | ''') |
770 | - for schema, name, type_, owner in cur.fetchall(): |
771 | + for schema, name, type_, owner, acl in cur.fetchall(): |
772 | key = '%s.%s' % (schema, name) |
773 | - self[key] = DbObject(schema, name, type_, owner) |
774 | + self[key] = DbObject( |
775 | + schema, name, type_, owner, parse_postgres_acl(acl)) |
776 | |
777 | cur.execute(r""" |
778 | SELECT |
779 | @@ -99,6 +180,7 @@ |
780 | p.proname as "name", |
781 | pg_catalog.oidvectortypes(p.proargtypes) as "Argument types", |
782 | u.usename as "owner", |
783 | + p.proacl::text[] as "acl", |
784 | l.lanname as "language" |
785 | FROM pg_catalog.pg_proc p |
786 | LEFT JOIN pg_catalog.pg_namespace n ON n.oid = p.pronamespace |
787 | @@ -110,9 +192,10 @@ |
788 | AND pg_catalog.pg_function_is_visible(p.oid) |
789 | AND n.nspname <> 'pg_catalog' |
790 | """) |
791 | - for schema, name, arguments, owner, language in cur.fetchall(): |
792 | + for schema, name, arguments, owner, acl, language in cur.fetchall(): |
793 | self['%s.%s(%s)' % (schema, name, arguments)] = DbObject( |
794 | - schema, name, 'function', owner, arguments, language) |
795 | + schema, name, 'function', owner, parse_postgres_acl(acl), |
796 | + arguments, language) |
797 | # Pull a list of groups |
798 | cur.execute("SELECT groname FROM pg_group") |
799 | self.groups = [r[0] for r in cur.fetchall()] |
800 | @@ -134,10 +217,10 @@ |
801 | def execute(self, cmd, params=None): |
802 | cmd = cmd.encode('utf8') |
803 | if params is None: |
804 | - log.debug2('%s' % (cmd, )) |
805 | + log.debug3('%s' % (cmd, )) |
806 | return self.__dict__['_cursor'].execute(cmd) |
807 | else: |
808 | - log.debug2('%s [%r]' % (cmd, params)) |
809 | + log.debug3('%s [%r]' % (cmd, params)) |
810 | return self.__dict__['_cursor'].execute(cmd, params) |
811 | |
812 | def __getattr__(self, key): |
813 | @@ -206,7 +289,7 @@ |
814 | self.entity_keyword = entity_keyword |
815 | self.permissions = defaultdict(dict) |
816 | |
817 | - def add(self, permission, entity, principal, is_group=False): |
818 | + def add(self, permission, entity, principal): |
819 | """Add a permission. |
820 | |
821 | Add all privileges you want to grant or revoke first, then use |
822 | @@ -217,14 +300,8 @@ |
823 | or revoke a privilege. |
824 | :param principal: User or group to which the privilege should |
825 | apply. |
826 | - :param is_group: Is `principal` a group? |
827 | """ |
828 | - if is_group: |
829 | - full_principal = "GROUP " + principal |
830 | - else: |
831 | - full_principal = principal |
832 | - self.permissions[permission].setdefault(entity, set()).add( |
833 | - full_principal) |
834 | + self.permissions[permission].setdefault(principal, set()).add(entity) |
835 | |
836 | def tabulate(self): |
837 | """Group privileges into single-statement work items. |
838 | @@ -239,9 +316,9 @@ |
839 | """ |
840 | result = [] |
841 | for permission, parties in self.permissions.iteritems(): |
842 | - for entity, principals in parties.iteritems(): |
843 | + for principal, entities in parties.iteritems(): |
844 | result.append( |
845 | - (permission, entity, ", ".join(principals))) |
846 | + (permission, ", ".join(entities), principal)) |
847 | return result |
848 | |
849 | def countPermissions(self): |
850 | @@ -250,17 +327,17 @@ |
851 | |
852 | def countEntities(self): |
853 | """Count the number of different entities.""" |
854 | - return len(set(sum([ |
855 | - entities.keys() |
856 | - for entities in self.permissions.itervalues()], []))) |
857 | + entities = set() |
858 | + for entities_and_entities in self.permissions.itervalues(): |
859 | + for extra_entities in entities_and_entities.itervalues(): |
860 | + entities.update(extra_entities) |
861 | + return len(entities) |
862 | |
863 | def countPrincipals(self): |
864 | """Count the number of different principals.""" |
865 | - principals = set() |
866 | - for entities_and_principals in self.permissions.itervalues(): |
867 | - for extra_principals in entities_and_principals.itervalues(): |
868 | - principals.update(extra_principals) |
869 | - return len(principals) |
870 | + return len(set(sum([ |
871 | + principals.keys() |
872 | + for principals in self.permissions.itervalues()], []))) |
873 | |
874 | def grant(self, cur): |
875 | """Grant all gathered permissions. |
876 | @@ -303,6 +380,30 @@ |
877 | log.debug("Issued %d REVOKE statement(s).", revoke_count) |
878 | |
879 | |
880 | +def alter_permissions(cur, which, revoke=False): |
881 | + """Efficiently apply a set of permission changes. |
882 | + |
883 | + :param cur: a database cursor |
884 | + :param which: an iterable of (object, role, permissions) |
885 | + :param revoke: whether to revoke or grant permissions |
886 | + """ |
887 | + gatherers = { |
888 | + 'table': PermissionGatherer("TABLE"), |
889 | + 'function': PermissionGatherer("FUNCTION"), |
890 | + 'sequence': PermissionGatherer("SEQUENCE"), |
891 | + } |
892 | + |
893 | + for obj, role, perms in which: |
894 | + gatherers.get(obj.type, gatherers['table']).add( |
895 | + ', '.join(perms), obj.fullname, quote_identifier(role)) |
896 | + |
897 | + for gatherer in gatherers.values(): |
898 | + if revoke: |
899 | + gatherer.revoke(cur) |
900 | + else: |
901 | + gatherer.grant(cur) |
902 | + |
903 | + |
904 | def reset_permissions(con, config, options): |
905 | schema = DbSchema(con) |
906 | all_users = list_identifiers(schema.users) |
907 | @@ -342,14 +443,14 @@ |
908 | if username in schema.principals: |
909 | if type_ == 'group': |
910 | if options.revoke: |
911 | - log.debug("Revoking membership of %s role", username) |
912 | + log.debug2("Revoking membership of %s role", username) |
913 | cur.execute("REVOKE %s FROM %s" % ( |
914 | quote_identifier(username), all_users)) |
915 | else: |
916 | # Note - we don't drop the user because it might own |
917 | # objects in other databases. We need to ensure they are |
918 | # not superusers though! |
919 | - log.debug("Resetting role options of %s role.", username) |
920 | + log.debug2("Resetting role options of %s role.", username) |
921 | cur.execute( |
922 | "ALTER ROLE %s WITH %s" % ( |
923 | quote_identifier(username), |
924 | @@ -380,12 +481,12 @@ |
925 | if user.endswith('_ro'): |
926 | groups = ['%s_ro' % group for group in groups] |
927 | if groups: |
928 | - log.debug("Adding %s to %s roles", user, ', '.join(groups)) |
929 | + log.debug2("Adding %s to %s roles", user, ', '.join(groups)) |
930 | for group in groups: |
931 | cur.execute(r"""ALTER GROUP %s ADD USER %s""" % ( |
932 | quote_identifier(group), quote_identifier(user))) |
933 | else: |
934 | - log.debug("%s not in any roles", user) |
935 | + log.debug2("%s not in any roles", user) |
936 | |
937 | if options.revoke: |
938 | # Change ownership of all objects to OWNER. |
939 | @@ -399,40 +500,14 @@ |
940 | log.info("Resetting ownership of %s", obj.fullname) |
941 | cur.execute("ALTER TABLE %s OWNER TO %s" % ( |
942 | obj.fullname, quote_identifier(options.owner))) |
943 | - |
944 | - # Revoke all privs from known groups. Don't revoke anything for |
945 | - # users or groups not defined in our security.cfg. |
946 | - table_revocations = PermissionGatherer("TABLE") |
947 | - function_revocations = PermissionGatherer("FUNCTION") |
948 | - sequence_revocations = PermissionGatherer("SEQUENCE") |
949 | - |
950 | - # Gather all revocations. |
951 | - for section_name in config.sections(): |
952 | - role = quote_identifier(section_name) |
953 | - if section_name == 'public': |
954 | - ro_role = None |
955 | - else: |
956 | - ro_role = quote_identifier(section_name + "_ro") |
957 | - |
958 | - for obj in schema.values(): |
959 | - if obj.type == 'function': |
960 | - gatherer = function_revocations |
961 | - else: |
962 | - gatherer = table_revocations |
963 | - |
964 | - gatherer.add("ALL", obj.fullname, role) |
965 | - |
966 | - if obj.seqname in schema: |
967 | - sequence_revocations.add("ALL", obj.seqname, role) |
968 | - if ro_role is not None: |
969 | - sequence_revocations.add("ALL", obj.seqname, ro_role) |
970 | - |
971 | - table_revocations.revoke(cur) |
972 | - function_revocations.revoke(cur) |
973 | - sequence_revocations.revoke(cur) |
974 | else: |
975 | log.info("Not resetting ownership of database objects") |
976 | - log.info("Not revoking permissions on database objects") |
977 | + |
978 | + managed_roles = set(['read', 'admin']) |
979 | + for section_name in config.sections(): |
980 | + managed_roles.add(section_name) |
981 | + if section_name != 'public': |
982 | + managed_roles.add(section_name + "_ro") |
983 | |
984 | # Set of all tables we have granted permissions on. After we have assigned |
985 | # permissions, we can use this to determine what tables have been |
986 | @@ -440,16 +515,26 @@ |
987 | found = set() |
988 | |
989 | # Set permissions as per config file |
990 | - |
991 | - table_permissions = PermissionGatherer("TABLE") |
992 | - function_permissions = PermissionGatherer("FUNCTION") |
993 | - sequence_permissions = PermissionGatherer("SEQUENCE") |
994 | + desired_permissions = defaultdict(lambda: defaultdict(set)) |
995 | + |
996 | + valid_objs = set(schema.iterkeys()) |
997 | + |
998 | + # Any object with permissions granted is accessible to the 'read' |
999 | + # role. Some (eg. the lp_* replicated tables and internal or trigger |
1000 | + # functions) aren't readable. |
1001 | + granted_objs = set() |
1002 | |
1003 | for username in config.sections(): |
1004 | + who = username |
1005 | + if username == 'public': |
1006 | + who_ro = who |
1007 | + else: |
1008 | + who_ro = '%s_ro' % username |
1009 | + |
1010 | for obj_name, perm in config.items(username): |
1011 | if '.' not in obj_name: |
1012 | continue |
1013 | - if obj_name not in schema.keys(): |
1014 | + if obj_name not in valid_objs: |
1015 | log.warn('Bad object name %r', obj_name) |
1016 | continue |
1017 | obj = schema[obj_name] |
1018 | @@ -461,46 +546,37 @@ |
1019 | # No perm means no rights. We can't grant no rights, so skip. |
1020 | continue |
1021 | |
1022 | - who = quote_identifier(username) |
1023 | - if username == 'public': |
1024 | - who_ro = who |
1025 | - else: |
1026 | - who_ro = quote_identifier('%s_ro' % username) |
1027 | + granted_objs.add(obj) |
1028 | |
1029 | - log.debug( |
1030 | - "Granting %s on %s to %s", perm, obj.fullname, who) |
1031 | if obj.type == 'function': |
1032 | - function_permissions.add(perm, obj.fullname, who) |
1033 | - function_permissions.add("EXECUTE", obj.fullname, who_ro) |
1034 | - function_permissions.add( |
1035 | - "EXECUTE", obj.fullname, "read", is_group=True) |
1036 | - function_permissions.add( |
1037 | - "ALL", obj.fullname, "admin", is_group=True) |
1038 | + desired_permissions[obj][who].update(perm.split(', ')) |
1039 | + if who_ro: |
1040 | + desired_permissions[obj][who_ro].add("EXECUTE") |
1041 | else: |
1042 | - table_permissions.add( |
1043 | - "ALL", obj.fullname, "admin", is_group=True) |
1044 | - table_permissions.add(perm, obj.fullname, who) |
1045 | - table_permissions.add("SELECT", obj.fullname, who_ro) |
1046 | - is_secure = (obj.fullname in SECURE_TABLES) |
1047 | - if not is_secure: |
1048 | - table_permissions.add( |
1049 | - "SELECT", obj.fullname, "read", is_group=True) |
1050 | - if obj.seqname in schema: |
1051 | + desired_permissions[obj][who].update(perm.split(', ')) |
1052 | + if who_ro: |
1053 | + desired_permissions[obj][who_ro].add("SELECT") |
1054 | + if obj.seqname in valid_objs: |
1055 | + seq = schema[obj.seqname] |
1056 | + granted_objs.add(seq) |
1057 | if 'INSERT' in perm: |
1058 | seqperm = 'USAGE' |
1059 | elif 'SELECT' in perm: |
1060 | seqperm = 'SELECT' |
1061 | - sequence_permissions.add(seqperm, obj.seqname, who) |
1062 | - if not is_secure: |
1063 | - sequence_permissions.add( |
1064 | - "SELECT", obj.seqname, "read", is_group=True) |
1065 | - sequence_permissions.add("SELECT", obj.seqname, who_ro) |
1066 | - sequence_permissions.add( |
1067 | - "ALL", obj.seqname, "admin", is_group=True) |
1068 | + else: |
1069 | + seqperm = None |
1070 | + if seqperm: |
1071 | + desired_permissions[seq][who].add(seqperm) |
1072 | + desired_permissions[seq][who_ro].add("SELECT") |
1073 | |
1074 | - function_permissions.grant(cur) |
1075 | - table_permissions.grant(cur) |
1076 | - sequence_permissions.grant(cur) |
1077 | + # read gets read access to all non-secure objects that we've granted |
1078 | + # anybody access to. |
1079 | + for obj in granted_objs: |
1080 | + if obj.type == 'function': |
1081 | + desired_permissions[obj]['read'].add("EXECUTE") |
1082 | + else: |
1083 | + if obj.fullname not in SECURE_TABLES: |
1084 | + desired_permissions[obj]['read'].add("SELECT") |
1085 | |
1086 | # Set permissions on public schemas |
1087 | public_schemas = [ |
1088 | @@ -516,10 +592,56 @@ |
1089 | continue |
1090 | found.add(obj) |
1091 | if obj.type == 'function': |
1092 | - cur.execute('GRANT EXECUTE ON FUNCTION %s TO PUBLIC' % |
1093 | - obj.fullname) |
1094 | - else: |
1095 | - cur.execute('GRANT SELECT ON TABLE %s TO PUBLIC' % obj.fullname) |
1096 | + desired_permissions[obj]['public'].add('EXECUTE') |
1097 | + else: |
1098 | + desired_permissions[obj]['public'].add('SELECT') |
1099 | + |
1100 | + # For every object in the DB, ensure that the privileges held by our |
1101 | + # managed roles match our expectations. If not, store the delta |
1102 | + # to be applied later. |
1103 | + # Also grants/revokes access by the admin role, which isn't a |
1104 | + # traditionally managed role. |
1105 | + unmanaged_roles = set() |
1106 | + required_grants = [] |
1107 | + required_revokes = [] |
1108 | + for obj in schema.values(): |
1109 | + # We only care about roles that are in either the desired or |
1110 | + # existing ACL, and are also our managed roles. But skip admin, |
1111 | + # because it's done at the end. |
1112 | + interesting_roles = set(desired_permissions[obj]).union(obj.acl) |
1113 | + unmanaged_roles.update(interesting_roles.difference(managed_roles)) |
1114 | + for role in managed_roles.intersection(interesting_roles): |
1115 | + if role == 'admin': |
1116 | + continue |
1117 | + new = desired_permissions[obj][role] |
1118 | + old_privs = obj.acl.get(role, {}) |
1119 | + old = set(old_privs) |
1120 | + if any(old_privs.itervalues()): |
1121 | + log.warning("%s has grant option on %s", role, obj.fullname) |
1122 | + if new == old: |
1123 | + continue |
1124 | + missing = new.difference(old) |
1125 | + extra = old.difference(new) |
1126 | + if missing: |
1127 | + required_grants.append((obj, role, missing)) |
1128 | + if extra: |
1129 | + required_revokes.append((obj, role, extra)) |
1130 | + |
1131 | + # admin get all privileges on anything with privileges granted |
1132 | + # in security.cfg. We don't have a mapping from ALL to real |
1133 | + # privileges for each object type, so we just grant or revoke ALL |
1134 | + # each time. |
1135 | + if obj in granted_objs: |
1136 | + required_grants.append((obj, "admin", ("ALL",))) |
1137 | + else: |
1138 | + if "admin" in obj.acl: |
1139 | + required_revokes.append((obj, "admin", ("ALL",))) |
1140 | + |
1141 | + log.debug("Unmanaged roles on managed objects: %r", list(unmanaged_roles)) |
1142 | + |
1143 | + alter_permissions(cur, required_grants) |
1144 | + if options.revoke: |
1145 | + alter_permissions(cur, required_revokes, revoke=True) |
1146 | |
1147 | # Raise an error if we have database objects lying around that have not |
1148 | # had permissions assigned. |
1149 | |
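The security.py rewrite above parses PostgreSQL `aclitem` strings such as `joe=r*w/postgres` into `{grantee: {privilege: has_grant_option}}` so that only the delta between current and desired privileges needs GRANT/REVOKE statements. A condensed sketch of the parsing step, omitting the quoted-identifier handling that the full `_split_postgres_aclitem` implements character by character:

```python
# Privilege-letter map, as in the patch's POSTGRES_ACL_MAP.
ACL_MAP = {
    'r': 'SELECT', 'w': 'UPDATE', 'a': 'INSERT', 'd': 'DELETE',
    'D': 'TRUNCATE', 'x': 'REFERENCES', 't': 'TRIGGER', 'X': 'EXECUTE',
    'U': 'USAGE', 'C': 'CREATE', 'c': 'CONNECT', 'T': 'TEMPORARY',
}

def parse_aclitem(aclitem):
    """Parse one unquoted aclitem like 'joe=r*w/postgres' into
    (grantee, {privilege: grant_option}).

    An empty grantee means the pseudo-role 'public'. Quoted role
    names (e.g. '"odd name"=r/postgres') are not handled here.
    """
    grantee, _, rest = aclitem.partition('=')
    privs, _, _grantor = rest.partition('/')
    parsed = []
    for char in privs:
        if char == '*':
            # '*' marks the preceding privilege as held WITH GRANT OPTION.
            parsed[-1] = (parsed[-1][0], True)
        else:
            parsed.append((ACL_MAP[char], False))
    return grantee or 'public', dict(parsed)
```

With the ACL in hand, the patch compares the privilege sets per managed role and emits only the missing grants and the extra revokes, instead of revoking and re-granting everything on every run.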
1150 | === removed file 'database/schema/testfuncs.sql' |
1151 | --- database/schema/testfuncs.sql 2009-06-24 21:17:33 +0000 |
1152 | +++ database/schema/testfuncs.sql 1970-01-01 00:00:00 +0000 |
1153 | @@ -1,29 +0,0 @@ |
1154 | -/* |
1155 | -Copyright 2009 Canonical Ltd. This software is licensed under the |
1156 | -GNU Affero General Public License version 3 (see the file LICENSE). |
1157 | - |
1158 | -Stored procedures designed for use only by the test suite. These |
1159 | -will not be loaded onto the production database |
1160 | -*/ |
1161 | - |
1162 | -CREATE OR REPLACE FUNCTION _killall_backends(text) |
1163 | -RETURNS Boolean AS $$ |
1164 | - import os |
1165 | - from signal import SIGTERM |
1166 | - |
1167 | - plan = plpy.prepare( |
1168 | - "SELECT procpid FROM pg_stat_activity WHERE datname=$1", ['text'] |
1169 | - ) |
1170 | - success = True |
1171 | - for row in plpy.execute(plan, args): |
1172 | - try: |
1173 | - plpy.info("Killing %d" % row['procpid']) |
1174 | - os.kill(row['procpid'], SIGTERM) |
1175 | - except OSError: |
1176 | - success = False |
1177 | - |
1178 | - return success |
1179 | -$$ LANGUAGE plpythonu; |
1180 | - |
1181 | -COMMENT ON FUNCTION _killall_backends(text) IS 'Kill all backend processes connected to the given database. Note that this is unlikely to work if you are connected to the database you are killing, as you are likely to kill your own connection before all the others have been killed.'; |
1182 | - |
1183 | |
1184 | === modified file 'database/schema/trusted.sql' |
1185 | --- database/schema/trusted.sql 2011-06-13 08:39:28 +0000 |
1186 | +++ database/schema/trusted.sql 2011-08-16 00:31:36 +0000 |
1187 | @@ -1854,11 +1854,11 @@ |
1188 | AFFECTED_USER = 4 |
1189 | SUBSCRIBER = 2 |
1190 | |
1191 | - |
1192 | def get_max_heat_for_bug(bug_id): |
1193 | results = plpy.execute(""" |
1194 | SELECT MAX( |
1195 | - GREATEST(Product.max_bug_heat, Distribution.max_bug_heat)) |
1196 | + GREATEST(Product.max_bug_heat, |
1197 | + DistributionSourcePackage.max_bug_heat)) |
1198 | AS max_heat |
1199 | FROM BugTask |
1200 | LEFT OUTER JOIN ProductSeries ON |
1201 | @@ -1871,6 +1871,9 @@ |
1202 | LEFT OUTER JOIN Distribution ON ( |
1203 | BugTask.distribution = Distribution.id |
1204 | OR DistroSeries.distribution = Distribution.id) |
1205 | + LEFT OUTER JOIN DistributionSourcePackage ON ( |
1206 | + BugTask.sourcepackagename = |
1207 | + DistributionSourcePackage.sourcepackagename) |
1208 | WHERE |
1209 | BugTask.bug = %s""" % bug_id) |
1210 | |
1211 | |
1212 | === modified file 'database/schema/upgrade.py' |
1213 | --- database/schema/upgrade.py 2011-07-25 13:39:10 +0000 |
1214 | +++ database/schema/upgrade.py 2011-08-16 00:31:36 +0000 |
1215 | @@ -178,7 +178,7 @@ |
1216 | # The first script applies the DB patches to all nodes. |
1217 | |
1218 | # First make sure the cluster is synced. |
1219 | - log.info("Waiting for cluster to sync.") |
1220 | + log.info("Waiting for cluster to sync, pre-update.") |
1221 | replication.helpers.sync(timeout=600) |
1222 | |
1223 | outf = StringIO() |
1224 | @@ -186,18 +186,15 @@ |
1225 | # Start a transaction block. |
1226 | print >> outf, "try {" |
1227 | |
1228 | + sql_to_run = [] |
1229 | + |
1230 | def run_sql(script): |
1231 | if os.path.isabs(script): |
1232 | full_path = script |
1233 | else: |
1234 | full_path = os.path.abspath(os.path.join(SCHEMA_DIR, script)) |
1235 | assert os.path.exists(full_path), "%s doesn't exist." % full_path |
1236 | - print >> outf, dedent("""\ |
1237 | - execute script ( |
1238 | - set id = @lpmain_set, event node = @master_node, |
1239 | - filename='%s' |
1240 | - ); |
1241 | - """ % full_path) |
1242 | + sql_to_run.append(full_path) |
1243 | |
1244 | # We are going to generate some temporary files using |
1245 | # NamedTempoararyFile. Store them here so we can control when |
1246 | @@ -236,6 +233,22 @@ |
1247 | combined_script.flush() |
1248 | run_sql(combined_script.name) |
1249 | |
1250 | + # Now combine all the written SQL (probably trusted.sql and |
1251 | + # patch*.sql) into one big file, which we execute with a single |
1252 | + # slonik execute_script statement to avoid multiple syncs. |
1253 | + single = NamedTemporaryFile(prefix='single', suffix='.sql') |
1254 | + for path in sql_to_run: |
1255 | + print >> single, open(path, 'r').read() |
1256 | + print >> single, "" |
1257 | + single.flush() |
1258 | + |
1259 | + print >> outf, dedent("""\ |
1260 | + execute script ( |
1261 | + set id = @lpmain_set, event node = @master_node, |
1262 | + filename='%s' |
1263 | + ); |
1264 | + """ % single.name) |
1265 | + |
1266 | # Close transaction block and abort on error. |
1267 | print >> outf, dedent("""\ |
1268 | } |
1269 | @@ -246,9 +259,11 @@ |
1270 | """) |
1271 | |
1272 | # Execute the script with slonik. |
1273 | + log.info("slonik(1) schema upgrade script generated. Invoking.") |
1274 | if not replication.helpers.execute_slonik(outf.getvalue()): |
1275 | log.fatal("Aborting.") |
1276 | raise SystemExit(4) |
1277 | + log.info("slonik(1) schema upgrade script completed.") |
1278 | |
1279 | # Cleanup our temporary files - they applied successfully. |
1280 | for temporary_file in temporary_files: |
1281 | @@ -256,6 +271,7 @@ |
1282 | del temporary_files |
1283 | |
1284 | # Wait for replication to sync. |
1285 | + log.info("Waiting for patches to apply to slaves and cluster to sync.") |
1286 | replication.helpers.sync(timeout=0) |
1287 | |
1288 | # The db patches have now been applied to all nodes, and we are now |
1289 | @@ -350,25 +366,33 @@ |
1290 | subscribe set ( |
1291 | id=@holding_set, provider=@master_node, |
1292 | receiver=@node%d_node, forward=yes); |
1293 | + wait for event ( |
1294 | + origin=@master_node, confirmed=all, |
1295 | + wait on=@master_node, timeout=0); |
1296 | echo 'Waiting for sync'; |
1297 | sync (id=@master_node); |
1298 | wait for event ( |
1299 | origin=@master_node, confirmed=ALL, |
1300 | - wait on=@master_node, timeout=0 |
1301 | - ); |
1302 | + wait on=@master_node, timeout=0); |
1303 | """ % (slave_node.node_id, slave_node.node_id)) |
1304 | |
1305 | print >> outf, dedent("""\ |
1306 | echo 'Merging holding set to lpmain'; |
1307 | merge set ( |
1308 | - id=@lpmain_set, add id=@holding_set, origin=@master_node |
1309 | - ); |
1310 | + id=@lpmain_set, add id=@holding_set, origin=@master_node); |
1311 | """) |
1312 | |
1313 | # Execute the script and sync. |
1314 | + log.info( |
1315 | + "Generated slonik(1) script to replicate new objects. Invoking.") |
1316 | if not replication.helpers.execute_slonik(outf.getvalue()): |
1317 | log.fatal("Aborting.") |
1318 | + log.info( |
1319 | + "slonik(1) script to replicate new objects completed.") |
1320 | + log.info("Waiting for sync.") |
1321 | replication.helpers.sync(timeout=0) |
1322 | + else: |
1323 | + log.info("No new tables or sequences to replicate.") |
1324 | |
1325 | # We also scan for tables and sequences we want to drop and do so using |
1326 | # a final slonik script. Instead of dropping tables in the DB patch, |
1327 | @@ -411,8 +435,10 @@ |
1328 | exit 1; |
1329 | } |
1330 | """ % sql.name) |
1331 | + log.info("Generated slonik(1) script to drop tables. Invoking.") |
1332 | if not replication.helpers.execute_slonik(sk.getvalue()): |
1333 | log.fatal("Aborting.") |
1334 | + log.info("slonik(1) script to drop tables completed.") |
1335 | sql.close() |
1336 | |
1337 | # Now drop sequences. We don't do this at the same time as the tables, |
1338 | @@ -455,8 +481,11 @@ |
1339 | exit 1; |
1340 | } |
1341 | """ % sql.name) |
1342 | + log.info("Generated slonik(1) script to drop sequences. Invoking.") |
1343 | if not replication.helpers.execute_slonik(sk.getvalue()): |
1344 | log.fatal("Aborting.") |
1345 | + log.info("slonik(1) script to drop sequences completed.") |
1346 | + log.info("Waiting for final sync.") |
1347 | replication.helpers.sync(timeout=0) |
1348 | |
1349 | |
1350 | |
1351 | === modified file 'lib/canonical/config/schema-lazr.conf' |
1352 | --- lib/canonical/config/schema-lazr.conf 2011-08-14 23:11:45 +0000 |
1353 | +++ lib/canonical/config/schema-lazr.conf 2011-08-16 00:31:36 +0000 |
1354 | @@ -476,6 +476,11 @@ |
1355 | # datatype: integer |
1356 | default_interval_cvs: 43200 |
1357 | |
1358 | +# The default value of the update interval of a code import from |
1359 | +# Bazaar, in seconds. |
1360 | +# datatype: integer |
1361 | +default_interval_bzr: 21600 |
1362 | + |
1363 | # Where the tarballs of foreign branches are uploaded for storage. |
1364 | # datatype: string |
1365 | foreign_tree_store: sftp://hoover@escudero/srv/importd/sources/ |
1366 | |
1367 | === modified file 'lib/lp/code/mail/codeimport.py' |
1368 | --- lib/lp/code/mail/codeimport.py 2011-08-12 14:39:51 +0000 |
1369 | +++ lib/lp/code/mail/codeimport.py 2011-08-16 00:31:36 +0000 |
1370 | @@ -51,6 +51,7 @@ |
1371 | RevisionControlSystems.BZR_SVN: 'subversion', |
1372 | RevisionControlSystems.GIT: 'git', |
1373 | RevisionControlSystems.HG: 'mercurial', |
1374 | + RevisionControlSystems.BZR: 'bazaar', |
1375 | } |
1376 | body = get_email_template('new-code-import.txt') % { |
1377 | 'person': code_import.registrant.displayname, |
1378 | @@ -123,7 +124,8 @@ |
1379 | elif code_import.rcs_type in (RevisionControlSystems.SVN, |
1380 | RevisionControlSystems.BZR_SVN, |
1381 | RevisionControlSystems.GIT, |
1382 | - RevisionControlSystems.HG): |
1383 | + RevisionControlSystems.HG, |
1384 | + RevisionControlSystems.BZR): |
1385 | if CodeImportEventDataType.OLD_URL in event_data: |
1386 | old_url = event_data[CodeImportEventDataType.OLD_URL] |
1387 | body.append( |
1388 | |
1389 | === modified file 'lib/lp/code/model/codeimport.py' |
1390 | --- lib/lp/code/model/codeimport.py 2011-04-27 01:42:46 +0000 |
1391 | +++ lib/lp/code/model/codeimport.py 2011-08-16 00:31:36 +0000 |
1392 | @@ -116,6 +116,8 @@ |
1393 | config.codeimport.default_interval_git, |
1394 | RevisionControlSystems.HG: |
1395 | config.codeimport.default_interval_hg, |
1396 | + RevisionControlSystems.BZR: |
1397 | + config.codeimport.default_interval_bzr, |
1398 | } |
1399 | seconds = default_interval_dict[self.rcs_type] |
1400 | return timedelta(seconds=seconds) |
1401 | @@ -133,7 +135,8 @@ |
1402 | RevisionControlSystems.SVN, |
1403 | RevisionControlSystems.GIT, |
1404 | RevisionControlSystems.BZR_SVN, |
1405 | - RevisionControlSystems.HG): |
1406 | + RevisionControlSystems.HG, |
1407 | + RevisionControlSystems.BZR): |
1408 | return self.url |
1409 | else: |
1410 | raise AssertionError( |
1411 | @@ -252,7 +255,8 @@ |
1412 | elif rcs_type in (RevisionControlSystems.SVN, |
1413 | RevisionControlSystems.BZR_SVN, |
1414 | RevisionControlSystems.GIT, |
1415 | - RevisionControlSystems.HG): |
1416 | + RevisionControlSystems.HG, |
1417 | + RevisionControlSystems.BZR): |
1418 | assert cvs_root is None and cvs_module is None |
1419 | assert url is not None |
1420 | else: |
1421 | @@ -262,7 +266,8 @@ |
1422 | if review_status is None: |
1423 | # Auto approve git and hg imports. |
1424 | if rcs_type in ( |
1425 | - RevisionControlSystems.GIT, RevisionControlSystems.HG): |
1426 | + RevisionControlSystems.GIT, RevisionControlSystems.HG, |
1427 | + RevisionControlSystems.BZR): |
1428 | review_status = CodeImportReviewStatus.REVIEWED |
1429 | else: |
1430 | review_status = CodeImportReviewStatus.NEW |
1431 | |
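The interval lookup extended in `codeimport.py` above can be sketched in isolation. The 21600-second figure matches the new `default_interval_bzr` setting from schema-lazr.conf; the dict below is a stand-in for the real `config.codeimport` section, not the actual Launchpad config API:

```python
from datetime import timedelta

# Stand-in for config.codeimport; values mirror schema-lazr.conf.
DEFAULT_INTERVAL_SECONDS = {
    'CVS': 43200,   # default_interval_cvs
    'BZR': 21600,   # default_interval_bzr (new in this branch)
}


def effective_update_interval(rcs_type, override_seconds=None):
    """Return the update interval, preferring a per-import override."""
    if override_seconds is not None:
        return timedelta(seconds=override_seconds)
    return timedelta(seconds=DEFAULT_INTERVAL_SECONDS[rcs_type])
```

The six-hour Bazaar default is half the CVS one, which fits natively-fetched branches being cheaper to update incrementally.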
1432 | === modified file 'lib/lp/code/model/codeimportevent.py' |
1433 | --- lib/lp/code/model/codeimportevent.py 2010-10-17 22:51:50 +0000 |
1434 | +++ lib/lp/code/model/codeimportevent.py 2011-08-16 00:31:36 +0000 |
1435 | @@ -269,7 +269,8 @@ |
1436 | if code_import.rcs_type in (RevisionControlSystems.SVN, |
1437 | RevisionControlSystems.BZR_SVN, |
1438 | RevisionControlSystems.GIT, |
1439 | - RevisionControlSystems.HG): |
1440 | + RevisionControlSystems.HG, |
1441 | + RevisionControlSystems.BZR): |
1442 | yield 'URL', code_import.url |
1443 | elif code_import.rcs_type == RevisionControlSystems.CVS: |
1444 | yield 'CVS_ROOT', code_import.cvs_root |
1445 | |
1446 | === modified file 'lib/lp/code/model/tests/test_codeimport.py' |
1447 | --- lib/lp/code/model/tests/test_codeimport.py 2011-08-12 11:37:08 +0000 |
1448 | +++ lib/lp/code/model/tests/test_codeimport.py 2011-08-16 00:31:36 +0000 |
1449 | @@ -70,6 +70,20 @@ |
1450 | # No job is created for the import. |
1451 | self.assertIs(None, code_import.import_job) |
1452 | |
1453 | + def test_new_svn_import_svn_scheme(self): |
1454 | + """A subversion import can use the svn:// scheme.""" |
1455 | + code_import = CodeImportSet().new( |
1456 | + registrant=self.factory.makePerson(), |
1457 | + target=IBranchTarget(self.factory.makeProduct()), |
1458 | + branch_name='imported', |
1459 | + rcs_type=RevisionControlSystems.SVN, |
1460 | + url=self.factory.getUniqueURL(scheme="svn")) |
1461 | + self.assertEqual( |
1462 | + CodeImportReviewStatus.NEW, |
1463 | + code_import.review_status) |
1464 | + # No job is created for the import. |
1465 | + self.assertIs(None, code_import.import_job) |
1466 | + |
1467 | def test_reviewed_svn_import(self): |
1468 | """A specific review status can be set for a new import.""" |
1469 | code_import = CodeImportSet().new( |
1470 | @@ -116,6 +130,21 @@ |
1471 | # A job is created for the import. |
1472 | self.assertIsNot(None, code_import.import_job) |
1473 | |
1474 | + def test_git_import_git_scheme(self): |
1475 | + """A git import can have a git:// style URL.""" |
1476 | + code_import = CodeImportSet().new( |
1477 | + registrant=self.factory.makePerson(), |
1478 | + target=IBranchTarget(self.factory.makeProduct()), |
1479 | + branch_name='imported', |
1480 | + rcs_type=RevisionControlSystems.GIT, |
1481 | + url=self.factory.getUniqueURL(scheme="git"), |
1482 | + review_status=None) |
1483 | + self.assertEqual( |
1484 | + CodeImportReviewStatus.REVIEWED, |
1485 | + code_import.review_status) |
1486 | + # A job is created for the import. |
1487 | + self.assertIsNot(None, code_import.import_job) |
1488 | + |
1489 | def test_git_import_reviewed(self): |
1490 | """A new git import is always reviewed by default.""" |
1491 | code_import = CodeImportSet().new( |
1492 | @@ -146,6 +175,21 @@ |
1493 | # A job is created for the import. |
1494 | self.assertIsNot(None, code_import.import_job) |
1495 | |
1496 | + def test_bzr_import_reviewed(self): |
1497 | + """A new bzr import is always reviewed by default.""" |
1498 | + code_import = CodeImportSet().new( |
1499 | + registrant=self.factory.makePerson(), |
1500 | + target=IBranchTarget(self.factory.makeProduct()), |
1501 | + branch_name='mirrored', |
1502 | + rcs_type=RevisionControlSystems.BZR, |
1503 | + url=self.factory.getUniqueURL(), |
1504 | + review_status=None) |
1505 | + self.assertEqual( |
1506 | + CodeImportReviewStatus.REVIEWED, |
1507 | + code_import.review_status) |
1508 | + # A job is created for the import. |
1509 | + self.assertIsNot(None, code_import.import_job) |
1510 | + |
1511 | def test_junk_code_import_rejected(self): |
1512 | """You are not allowed to create code imports targeting +junk.""" |
1513 | registrant = self.factory.makePerson() |
1514 | |
1515 | === modified file 'lib/lp/codehosting/codeimport/tests/servers.py' |
1516 | --- lib/lp/codehosting/codeimport/tests/servers.py 2011-08-16 00:31:28 +0000 |
1517 | +++ lib/lp/codehosting/codeimport/tests/servers.py 2011-08-16 00:31:36 +0000 |
1518 | @@ -4,6 +4,7 @@ |
1519 | """Server classes that know how to create various kinds of foreign archive.""" |
1520 | |
1521 | __all__ = [ |
1522 | + 'BzrServer', |
1523 | 'CVSServer', |
1524 | 'GitServer', |
1525 | 'MercurialServer', |
1526 | @@ -13,6 +14,7 @@ |
1527 | __metaclass__ = type |
1528 | |
1529 | from cStringIO import StringIO |
1530 | +import errno |
1531 | import os |
1532 | import shutil |
1533 | import signal |
1534 | @@ -22,7 +24,14 @@ |
1535 | import time |
1536 | import threading |
1537 | |
1538 | +from bzrlib.branch import Branch |
1539 | +from bzrlib.branchbuilder import BranchBuilder |
1540 | +from bzrlib.bzrdir import BzrDir |
1541 | from bzrlib.tests.treeshape import build_tree_contents |
1542 | +from bzrlib.tests.test_server import ( |
1543 | + ReadonlySmartTCPServer_for_testing, |
1544 | + TestServer, |
1545 | + ) |
1546 | from bzrlib.transport import Server |
1547 | from bzrlib.urlutils import ( |
1548 | escape, |
1549 | @@ -119,8 +128,8 @@ |
1550 | for i in range(10): |
1551 | try: |
1552 | ra = self._get_ra(self.get_url()) |
1553 | - except subvertpy.SubversionException, e: |
1554 | - if 'Connection refused' in str(e): |
1555 | + except OSError, e: |
1556 | + if e.errno == errno.ECONNREFUSED: |
1557 | time.sleep(delay) |
1558 | delay *= 1.5 |
1559 | continue |
1560 | @@ -343,3 +352,50 @@ |
1561 | f.close() |
1562 | repo[None].add([filename]) |
1563 | repo.commit(text='<The commit message>', user='jane Foo <joe@foo.com>') |
1564 | + |
1565 | + |
1566 | +class BzrServer(Server): |
1567 | + |
1568 | + def __init__(self, repository_path, use_server=False): |
1569 | + super(BzrServer, self).__init__() |
1570 | + self.repository_path = repository_path |
1571 | + self._use_server = use_server |
1572 | + |
1573 | + def createRepository(self, path): |
1574 | + BzrDir.create_branch_convenience(path) |
1575 | + |
1576 | + def makeRepo(self, tree_contents): |
1577 | + branch = Branch.open(self.repository_path) |
1578 | + branch.get_config().set_user_option("create_signatures", "never") |
1579 | + builder = BranchBuilder(branch=branch) |
1580 | + actions = [('add', ('', 'tree-root', 'directory', None))] |
1581 | + actions += [('add', (path, path + '-id', 'file', content)) |
1582 | + for (path, content) in tree_contents] |
1583 | + builder.build_snapshot(None, None, |
1584 | + actions, committer='Joe Foo <joe@foo.com>', |
1585 | + message=u'<The commit message>') |
1586 | + |
1587 | + def get_url(self): |
1588 | + if self._use_server: |
1589 | + return self._bzrserver.get_url() |
1590 | + else: |
1591 | + return local_path_to_url(self.repository_path) |
1592 | + |
1593 | + def start_server(self): |
1594 | + super(BzrServer, self).start_server() |
1595 | + self.createRepository(self.repository_path) |
1596 | + class LocalURLServer(TestServer): |
1597 | + def __init__(self, repository_path): |
1598 | + self.repository_path = repository_path |
1599 | + def start_server(self): pass |
1600 | + def get_url(self): |
1601 | + return local_path_to_url(self.repository_path) |
1602 | + if self._use_server: |
1603 | + self._bzrserver = ReadonlySmartTCPServer_for_testing() |
1604 | + self._bzrserver.start_server( |
1605 | + LocalURLServer(self.repository_path)) |
1606 | + |
1607 | + def stop_server(self): |
1608 | + super(BzrServer, self).stop_server() |
1609 | + if self._use_server: |
1610 | + self._bzrserver.stop_server() |
1611 | |
1612 | === modified file 'lib/lp/codehosting/codeimport/tests/test_worker.py' |
1613 | --- lib/lp/codehosting/codeimport/tests/test_worker.py 2011-08-16 00:31:28 +0000 |
1614 | +++ lib/lp/codehosting/codeimport/tests/test_worker.py 2011-08-16 00:31:36 +0000 |
1615 | @@ -16,6 +16,9 @@ |
1616 | Branch, |
1617 | BranchReferenceFormat, |
1618 | ) |
1619 | +from bzrlib.branchbuilder import ( |
1620 | + BranchBuilder, |
1621 | + ) |
1622 | from bzrlib.bzrdir import ( |
1623 | BzrDir, |
1624 | BzrDirFormat, |
1625 | @@ -41,11 +44,13 @@ |
1626 | from canonical.config import config |
1627 | from canonical.testing.layers import BaseLayer |
1628 | from lp.codehosting import load_optional_plugin |
1629 | +from lp.codehosting.safe_open import BadUrl |
1630 | from lp.codehosting.codeimport.tarball import ( |
1631 | create_tarball, |
1632 | extract_tarball, |
1633 | ) |
1634 | from lp.codehosting.codeimport.tests.servers import ( |
1635 | + BzrServer, |
1636 | CVSServer, |
1637 | GitServer, |
1638 | MercurialServer, |
1639 | @@ -53,7 +58,9 @@ |
1640 | ) |
1641 | from lp.codehosting.codeimport.worker import ( |
1642 | BazaarBranchStore, |
1643 | + BzrImportWorker, |
1644 | BzrSvnImportWorker, |
1645 | + CodeImportBranchOpenPolicy, |
1646 | CodeImportWorkerExitCode, |
1647 | CSCVSImportWorker, |
1648 | ForeignTreeStore, |
1649 | @@ -958,7 +965,7 @@ |
1650 | def makeSourceDetails(self, branch_name, files): |
1651 | """Make a SVN `CodeImportSourceDetails` pointing at a real SVN repo. |
1652 | """ |
1653 | - svn_server = SubversionServer(self.makeTemporaryDirectory()) |
1654 | + svn_server = SubversionServer(self.makeTemporaryDirectory(), True) |
1655 | svn_server.start_server() |
1656 | self.addCleanup(svn_server.stop_server) |
1657 | |
1658 | @@ -998,10 +1005,10 @@ |
1659 | # import should be rejected. |
1660 | args = {'rcstype': self.rcstype} |
1661 | reference_url = self.createBranchReference() |
1662 | - if self.rcstype in ('git', 'bzr-svn', 'hg'): |
1663 | + if self.rcstype in ('git', 'bzr-svn', 'hg', 'bzr'): |
1664 | args['url'] = reference_url |
1665 | else: |
1666 | - raise AssertionError("unexpected rcs_type %r" % self.rcs_type) |
1667 | + raise AssertionError("unexpected rcs_type %r" % self.rcstype) |
1668 | source_details = self.factory.makeCodeImportSourceDetails(**args) |
1669 | worker = self.makeImportWorker(source_details) |
1670 | self.assertEqual( |
1671 | @@ -1010,7 +1017,21 @@ |
1672 | def test_invalid(self): |
1673 | # If there is no branch in the target URL, exit with FAILURE_INVALID |
1674 | worker = self.makeImportWorker(self.factory.makeCodeImportSourceDetails( |
1675 | - rcstype=self.rcstype, url="file:///path/non/existant")) |
1676 | + rcstype=self.rcstype, url="http://localhost/path/non/existent")) |
1677 | + self.assertEqual( |
1678 | + CodeImportWorkerExitCode.FAILURE_INVALID, worker.run()) |
1679 | + |
1680 | + def test_bad_url(self): |
1681 | + # Local path URLs are not allowed |
1682 | + worker = self.makeImportWorker(self.factory.makeCodeImportSourceDetails( |
1683 | + rcstype=self.rcstype, url="file:///tmp/path/non/existent")) |
1684 | + self.assertEqual( |
1685 | + CodeImportWorkerExitCode.FAILURE_INVALID, worker.run()) |
1686 | + |
1687 | + def test_launchpad_url(self): |
1688 | + # Launchpad URLs are not allowed |
1689 | + worker = self.makeImportWorker(self.factory.makeCodeImportSourceDetails( |
1690 | + rcstype=self.rcstype, url="https://code.launchpad.net/linux/")) |
1691 | self.assertEqual( |
1692 | CodeImportWorkerExitCode.FAILURE_INVALID, worker.run()) |
1693 | |
1694 | @@ -1079,7 +1100,7 @@ |
1695 | """Make a Git `CodeImportSourceDetails` pointing at a real Git repo. |
1696 | """ |
1697 | repository_path = self.makeTemporaryDirectory() |
1698 | - git_server = GitServer(repository_path) |
1699 | + git_server = GitServer(repository_path, use_server=True) |
1700 | git_server.start_server() |
1701 | self.addCleanup(git_server.stop_server) |
1702 | |
1703 | @@ -1130,7 +1151,7 @@ |
1704 | """Make a Mercurial `CodeImportSourceDetails` pointing at a real repo. |
1705 | """ |
1706 | repository_path = self.makeTemporaryDirectory() |
1707 | - hg_server = MercurialServer(repository_path) |
1708 | + hg_server = MercurialServer(repository_path, use_server=True) |
1709 | hg_server.start_server() |
1710 | self.addCleanup(hg_server.stop_server) |
1711 | |
1712 | @@ -1156,3 +1177,78 @@ |
1713 | return BzrSvnImportWorker( |
1714 | source_details, self.get_transport('import_data'), |
1715 | self.bazaar_store, logging.getLogger()) |
1716 | + |
1717 | + |
1718 | +class TestBzrImport(WorkerTest, TestActualImportMixin, |
1719 | + PullingImportWorkerTests): |
1720 | + |
1721 | + rcstype = 'bzr' |
1722 | + |
1723 | + def setUp(self): |
1724 | + super(TestBzrImport, self).setUp() |
1725 | + self.setUpImport() |
1726 | + |
1727 | + def makeImportWorker(self, source_details): |
1728 | + """Make a new `ImportWorker`.""" |
1729 | + return BzrImportWorker( |
1730 | + source_details, self.get_transport('import_data'), |
1731 | + self.bazaar_store, logging.getLogger()) |
1732 | + |
1733 | + def makeForeignCommit(self, source_details): |
1734 | + """Change the foreign tree, generating exactly one commit.""" |
1735 | + branch = Branch.open(source_details.url) |
1736 | + builder = BranchBuilder(branch=branch) |
1737 | + builder.build_commit(message=self.factory.getUniqueString(), |
1738 | + committer="Joe Random Hacker <joe@example.com>") |
1739 | + self.foreign_commit_count += 1 |
1740 | + |
1741 | + def makeSourceDetails(self, branch_name, files): |
1742 | + """Make Bzr `CodeImportSourceDetails` pointing at a real Bzr repo. |
1743 | + """ |
1744 | + repository_path = self.makeTemporaryDirectory() |
1745 | + bzr_server = BzrServer(repository_path, use_server=True) |
1746 | + bzr_server.start_server() |
1747 | + self.addCleanup(bzr_server.stop_server) |
1748 | + |
1749 | + bzr_server.makeRepo(files) |
1750 | + self.foreign_commit_count = 1 |
1751 | + |
1752 | + return self.factory.makeCodeImportSourceDetails( |
1753 | + rcstype='bzr', url=bzr_server.get_url()) |
1754 | + |
1755 | + def test_partial(self): |
1756 | + self.skip( |
1757 | + "Partial fetching is not supported for native bzr branches " |
1758 | + "at the moment.") |
1759 | + |
1760 | + def test_unsupported_feature(self): |
1761 | + self.skip("All Bazaar features are supported by Bazaar.") |
1762 | + |
1763 | + |
1764 | +class CodeImportBranchOpenPolicyTests(TestCase): |
1765 | + |
1766 | + def setUp(self): |
1767 | + super(CodeImportBranchOpenPolicyTests, self).setUp() |
1768 | + self.policy = CodeImportBranchOpenPolicy() |
1769 | + |
1770 | + def test_follows_references(self): |
1771 | + self.assertEquals(True, self.policy.shouldFollowReferences()) |
1772 | + |
1773 | + def assertBadUrl(self, url): |
1774 | + self.assertRaises(BadUrl, self.policy.checkOneURL, url) |
1775 | + |
1776 | + def assertGoodUrl(self, url): |
1777 | + self.policy.checkOneURL(url) |
1778 | + |
1779 | + def test_checkOneURL(self): |
1780 | + self.assertBadUrl("https://launchpad.net/foo") |
1781 | + self.assertBadUrl("sftp://somehost/") |
1782 | + self.assertBadUrl("/etc/passwd") |
1783 | + self.assertBadUrl("bzr+ssh://devpad/") |
1785 | + self.assertBadUrl("unknown-scheme://devpad/") |
1786 | + self.assertGoodUrl("http://svn.example/branches/trunk") |
1787 | + self.assertGoodUrl("http://user:password@svn.example/branches/trunk") |
1788 | + self.assertGoodUrl("svn+ssh://svn.example.com/bla") |
1789 | + self.assertGoodUrl("git://git.example.com/repo") |
1790 | + self.assertGoodUrl("https://hg.example.com/hg/repo/branch") |
1791 | |
1792 | === modified file 'lib/lp/codehosting/codeimport/tests/test_workermonitor.py' |
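The URL assertions in the policy tests above hinge on `assertRaises` receiving a callable plus its arguments separately, rather than the result of already calling it. A minimal self-contained version of that pattern, with a stub standing in for `CodeImportBranchOpenPolicy` (all names here are illustrative, not Launchpad APIs):

```python
import unittest


class BadUrl(Exception):
    """Stand-in for lp.codehosting.safe_open.BadUrl."""


class StubPolicy:
    """Mirrors only the scheme check of the branch-open policy."""

    allowed_schemes = ['http', 'https', 'svn', 'git', 'ftp']

    def checkOneURL(self, url):
        scheme = url.split(':', 1)[0]
        if scheme not in self.allowed_schemes:
            raise BadUrl(url)


class PolicyTest(unittest.TestCase):

    def test_rejects_ssh(self):
        # Pass the callable and its argument separately; calling it inline
        # would raise BadUrl before assertRaises ever runs.
        policy = StubPolicy()
        self.assertRaises(BadUrl, policy.checkOneURL, "bzr+ssh://devpad/")

    def test_accepts_http(self):
        # A good URL must simply return without raising.
        StubPolicy().checkOneURL("http://svn.example/branches/trunk")
```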
1793 | --- lib/lp/codehosting/codeimport/tests/test_workermonitor.py 2011-08-16 00:31:28 +0000 |
1794 | +++ lib/lp/codehosting/codeimport/tests/test_workermonitor.py 2011-08-16 00:31:36 +0000 |
1795 | @@ -49,6 +49,7 @@ |
1796 | from lp.code.model.codeimportjob import CodeImportJob |
1797 | from lp.codehosting import load_optional_plugin |
1798 | from lp.codehosting.codeimport.tests.servers import ( |
1799 | + BzrServer, |
1800 | CVSServer, |
1801 | GitServer, |
1802 | MercurialServer, |
1803 | @@ -683,6 +684,16 @@ |
1804 | return self.factory.makeCodeImport( |
1805 | hg_repo_url=self.hg_server.get_url()) |
1806 | |
1807 | + def makeBzrCodeImport(self): |
1808 | + """Make a `CodeImport` that points to a real Bazaar branch.""" |
1809 | + self.bzr_server = BzrServer(self.repo_path) |
1810 | + self.bzr_server.start_server() |
1811 | + self.addCleanup(self.bzr_server.stop_server) |
1812 | + |
1813 | + self.bzr_server.makeRepo([('README', 'contents')]) |
1814 | + self.foreign_commit_count = 1 |
1815 | + return self.factory.makeCodeImport(bzr_branch_url=self.repo_path) |
1816 | + |
1817 | def getStartedJobForImport(self, code_import): |
1818 | """Get a started `CodeImportJob` for `code_import`. |
1819 | |
1820 | @@ -783,6 +794,15 @@ |
1821 | result = self.performImport(job_id) |
1822 | return result.addCallback(self.assertImported, code_import_id) |
1823 | |
1824 | + def test_import_bzr(self): |
1825 | + # Create a Bazaar CodeImport and import it. |
1826 | + job = self.getStartedJobForImport(self.makeBzrCodeImport()) |
1827 | + code_import_id = job.code_import.id |
1828 | + job_id = job.id |
1829 | + self.layer.txn.commit() |
1830 | + result = self.performImport(job_id) |
1831 | + return result.addCallback(self.assertImported, code_import_id) |
1832 | + |
1833 | # XXX 2010-03-24 MichaelHudson, bug=541526: This test fails intermittently |
1834 | # in EC2. |
1835 | def DISABLED_test_import_bzrsvn(self): |
1836 | |
1837 | === modified file 'lib/lp/codehosting/codeimport/worker.py' |
1838 | --- lib/lp/codehosting/codeimport/worker.py 2011-08-16 00:31:28 +0000 |
1839 | +++ lib/lp/codehosting/codeimport/worker.py 2011-08-16 00:31:36 +0000 |
1840 | @@ -6,8 +6,10 @@ |
1841 | __metaclass__ = type |
1842 | __all__ = [ |
1843 | 'BazaarBranchStore', |
1844 | + 'BzrImportWorker', |
1845 | 'BzrSvnImportWorker', |
1846 | 'CSCVSImportWorker', |
1847 | + 'CodeImportBranchOpenPolicy', |
1848 | 'CodeImportSourceDetails', |
1849 | 'CodeImportWorkerExitCode', |
1850 | 'ForeignTreeStore', |
1851 | @@ -49,11 +51,22 @@ |
1852 | import SCM |
1853 | |
1854 | from canonical.config import config |
1855 | + |
1856 | +from lazr.uri import ( |
1857 | + URI, |
1858 | + ) |
1859 | + |
1860 | from lp.code.enums import RevisionControlSystems |
1861 | +from lp.code.interfaces.branch import get_blacklisted_hostnames |
1862 | from lp.codehosting.codeimport.foreigntree import ( |
1863 | CVSWorkingTree, |
1864 | SubversionWorkingTree, |
1865 | ) |
1866 | +from lp.codehosting.safe_open import ( |
1867 | + BadUrl, |
1868 | + BranchOpenPolicy, |
1869 | + SafeBranchOpener, |
1870 | + ) |
1871 | from lp.codehosting.codeimport.tarball import ( |
1872 | create_tarball, |
1873 | extract_tarball, |
1874 | @@ -62,6 +75,52 @@ |
1875 | from lp.services.propertycache import cachedproperty |
1876 | |
1877 | |
1878 | +class CodeImportBranchOpenPolicy(BranchOpenPolicy): |
1879 | + """Branch open policy for code imports. |
1880 | + |
1881 | + In summary: |
1882 | + - follow references, |
1883 | + - only open non-Launchpad URLs |
1884 | + - only open the allowed schemes |
1885 | + """ |
1886 | + |
1887 | + allowed_schemes = ['http', 'https', 'svn', 'git', 'ftp'] |
1888 | + |
1889 | + def shouldFollowReferences(self): |
1890 | + """See `BranchOpenPolicy.shouldFollowReferences`. |
1891 | + |
1892 | + We traverse branch references for code imports because they |
1893 | + provide a useful redirection mechanism and we want to be consistent |
1894 | + with the bzr command line. |
1895 | + """ |
1896 | + return True |
1897 | + |
1898 | + def transformFallbackLocation(self, branch, url): |
1899 | + """See `BranchOpenPolicy.transformFallbackLocation`. |
1900 | + |
1901 | + For code imports, we stack on whatever the remote branch claims |
1902 | + to stack on, but this URL still needs to be checked. |
1903 | + """ |
1904 | + return urljoin(branch.base, url), True |
1905 | + |
1906 | + def checkOneURL(self, url): |
1907 | + """See `BranchOpenPolicy.checkOneURL`. |
1908 | + |
1909 | + We refuse to mirror from Launchpad or a ssh-like or file URL. |
1910 | + """ |
1911 | + uri = URI(url) |
1912 | + launchpad_domain = config.vhost.mainsite.hostname |
1913 | + if uri.underDomain(launchpad_domain): |
1914 | + raise BadUrl(url) |
1915 | + for hostname in get_blacklisted_hostnames(): |
1916 | + if uri.underDomain(hostname): |
1917 | + raise BadUrl(url) |
1918 | + if uri.scheme in ['sftp', 'bzr+ssh']: |
1919 | + raise BadUrl(url) |
1920 | + elif uri.scheme not in self.allowed_schemes: |
1921 | + raise BadUrl(url) |
1922 | + |
1923 | + |
1924 | class CodeImportWorkerExitCode: |
1925 | """Exit codes used by the code import worker script.""" |
1926 | |
1927 | @@ -187,9 +246,9 @@ |
1928 | |
1929 | :ivar branch_id: The id of the branch associated to this code import, used |
1930 | for locating the existing import and the foreign tree. |
1931 | - :ivar rcstype: 'svn' or 'cvs' as appropriate. |
1932 | - :ivar rcstype: 'svn', 'cvs', 'hg', 'git', 'bzr-svn' or 'bzr', as appropriate. |
1933 | :ivar url: The branch URL if rcstype in ['svn', 'bzr-svn', |
1934 | - 'git'], None otherwise. |
1935 | + 'git', 'hg', 'bzr'], None otherwise. |
1936 | :ivar cvs_root: The $CVSROOT if rcstype == 'cvs', None otherwise. |
1937 | :ivar cvs_module: The CVS module if rcstype == 'cvs', None otherwise. |
1938 | """ |
1939 | @@ -207,7 +266,7 @@ |
1940 | """Convert command line-style arguments to an instance.""" |
1941 | branch_id = int(arguments.pop(0)) |
1942 | rcstype = arguments.pop(0) |
1943 | - if rcstype in ['svn', 'bzr-svn', 'git', 'hg']: |
1944 | + if rcstype in ['svn', 'bzr-svn', 'git', 'hg', 'bzr']: |
1945 | [url] = arguments |
1946 | cvs_root = cvs_module = None |
1947 | elif rcstype == 'cvs': |
1948 | @@ -234,6 +293,8 @@ |
1949 | return cls(branch_id, 'git', str(code_import.url)) |
1950 | elif code_import.rcs_type == RevisionControlSystems.HG: |
1951 | return cls(branch_id, 'hg', str(code_import.url)) |
1952 | + elif code_import.rcs_type == RevisionControlSystems.BZR: |
1953 | + return cls(branch_id, 'bzr', str(code_import.url)) |
1954 | else: |
1955 | raise AssertionError("Unknown rcstype %r." % code_import.rcs_type) |
1956 | |
1957 | @@ -241,7 +302,7 @@ |
1958 | """Return a list of arguments suitable for passing to a child process. |
1959 | """ |
1960 | result = [str(self.branch_id), self.rcstype] |
1961 | - if self.rcstype in ['svn', 'bzr-svn', 'git', 'hg']: |
1962 | + if self.rcstype in ['svn', 'bzr-svn', 'git', 'hg', 'bzr']: |
1963 | result.append(self.url) |
1964 | elif self.rcstype == 'cvs': |
1965 | result.append(self.cvs_root) |
1966 | @@ -601,11 +662,18 @@ |
1967 | def _doImport(self): |
1968 | self._logger.info("Starting job.") |
1969 | saved_factory = bzrlib.ui.ui_factory |
1970 | + opener_policy = CodeImportBranchOpenPolicy() |
1971 | + opener = SafeBranchOpener(opener_policy) |
1972 | bzrlib.ui.ui_factory = LoggingUIFactory(logger=self._logger) |
1973 | try: |
1974 | self._logger.info( |
1975 | "Getting existing bzr branch from central store.") |
1976 | bazaar_branch = self.getBazaarBranch() |
1977 | + try: |
1978 | + opener_policy.checkOneURL(self.source_details.url) |
1979 | + except BadUrl, e: |
1980 | + self._logger.info("Invalid URL: %s" % e) |
1981 | + return CodeImportWorkerExitCode.FAILURE_INVALID |
1982 | transport = get_transport(self.source_details.url) |
1983 | for prober_kls in self.probers: |
1984 | prober = prober_kls() |
1985 | @@ -617,7 +685,13 @@ |
1986 | else: |
1987 | self._logger.info("No branch found at remote location.") |
1988 | return CodeImportWorkerExitCode.FAILURE_INVALID |
1989 | - remote_branch = format.open(transport).open_branch() |
1990 | + remote_dir = format.open(transport) |
1991 | + try: |
1992 | + remote_branch = opener.runWithTransformFallbackLocationHookInstalled( |
1993 | + remote_dir.open_branch) |
1994 | + except BadUrl, e: |
1995 | + self._logger.info("Invalid URL: %s" % e) |
1996 | + return CodeImportWorkerExitCode.FAILURE_INVALID |
1997 | remote_branch_tip = remote_branch.last_revision() |
1998 | inter_branch = InterBranch.get(remote_branch, bazaar_branch) |
1999 | self._logger.info("Importing branch.") |
2000 | @@ -820,3 +894,21 @@ |
2001 | """See `PullingImportWorker.probers`.""" |
2002 | from bzrlib.plugins.svn import SvnRemoteProber |
2003 | return [SvnRemoteProber] |
2004 | + |
2005 | + |
2006 | +class BzrImportWorker(PullingImportWorker): |
2007 | + """An import worker for importing Bazaar branches.""" |
2008 | + |
2009 | + invalid_branch_exceptions = [] |
2010 | + unsupported_feature_exceptions = [] |
2011 | + |
2012 | + def getRevisionLimit(self): |
2013 | + """See `PullingImportWorker.getRevisionLimit`.""" |
2014 | + # For now, just grab the whole branch at once |
2015 | + return None |
2016 | + |
2017 | + @property |
2018 | + def probers(self): |
2019 | + """See `PullingImportWorker.probers`.""" |
2020 | + from bzrlib.bzrdir import BzrProber, RemoteBzrProber |
2021 | + return [BzrProber, RemoteBzrProber] |
2022 | |
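The `checkOneURL` rules in the new policy can be paraphrased with the standard library's URL parsing in place of `lazr.uri`; this is a sketch of the same decision order (Launchpad domain, blacklist, then scheme), not the production code:

```python
from urllib.parse import urlparse

ALLOWED_SCHEMES = ['http', 'https', 'svn', 'git', 'ftp']
LAUNCHPAD_DOMAIN = 'launchpad.net'  # stands in for config.vhost.mainsite.hostname


class BadUrl(Exception):
    """Stand-in for lp.codehosting.safe_open.BadUrl."""


def under_domain(host, domain):
    """Rough equivalent of lazr.uri's URI.underDomain."""
    return host == domain or host.endswith('.' + domain)


def check_one_url(url, blacklisted_hostnames=()):
    """Reject Launchpad, blacklisted, ssh-like, and unknown-scheme URLs."""
    parts = urlparse(url)
    host = parts.hostname or ''
    if under_domain(host, LAUNCHPAD_DOMAIN):
        raise BadUrl(url)
    for hostname in blacklisted_hostnames:
        if under_domain(host, hostname):
            raise BadUrl(url)
    if parts.scheme not in ALLOWED_SCHEMES:
        raise BadUrl(url)
```

Local paths fall out here too: a bare `/etc/passwd` parses with an empty scheme, which is not in `ALLOWED_SCHEMES`. The real policy additionally names `sftp` and `bzr+ssh` explicitly, presumably so ssh-like URLs can be distinguished from unknown schemes if the error handling ever needs to.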
2023 | === modified file 'lib/lp/codehosting/puller/tests/test_scheduler.py' |
2024 | --- lib/lp/codehosting/puller/tests/test_scheduler.py 2011-08-12 11:37:08 +0000 |
2025 | +++ lib/lp/codehosting/puller/tests/test_scheduler.py 2011-08-16 00:31:36 +0000 |
2026 | @@ -767,7 +767,7 @@ |
2027 | branch.lock_write() |
2028 | protocol.mirrorFailed('a', 'b') |
2029 | protocol.sendEvent( |
2030 | - 'lock_id', branch.control_files._lock.peek()['user']) |
2031 | + 'lock_id', branch.control_files._lock.peek().get('user')) |
2032 | sys.stdout.flush() |
2033 | branch.unlock() |
2034 | """ |
2035 | |
2036 | === modified file 'lib/lp/codehosting/puller/worker.py' |
2037 | --- lib/lp/codehosting/puller/worker.py 2011-08-16 00:31:28 +0000 |
2038 | +++ lib/lp/codehosting/puller/worker.py 2011-08-16 00:31:36 +0000 |
2039 | @@ -15,7 +15,6 @@ |
2040 | urlutils, |
2041 | ) |
2042 | from bzrlib.branch import Branch |
2043 | -from bzrlib.bzrdir import BzrDir |
2044 | from bzrlib.plugins.weave_fmt.branch import BzrBranchFormat4 |
2045 | from bzrlib.plugins.weave_fmt.repository import ( |
2046 | RepositoryFormat4, |
2047 | |
2048 | === modified file 'lib/lp/registry/browser/distribution.py' |
2049 | --- lib/lp/registry/browser/distribution.py 2011-06-16 13:50:58 +0000 |
2050 | +++ lib/lp/registry/browser/distribution.py 2011-08-16 00:31:36 +0000 |
2051 | @@ -827,6 +827,7 @@ |
2052 | 'description', |
2053 | 'bug_reporting_guidelines', |
2054 | 'bug_reported_acknowledgement', |
2055 | + 'package_derivatives_email', |
2056 | 'icon', |
2057 | 'logo', |
2058 | 'mugshot', |
2059 | |
2060 | === modified file 'lib/lp/registry/browser/productseries.py' |
2061 | --- lib/lp/registry/browser/productseries.py 2011-05-27 21:12:25 +0000 |
2062 | +++ lib/lp/registry/browser/productseries.py 2011-08-16 00:31:36 +0000 |
2063 | @@ -827,15 +827,6 @@ |
2064 | )) |
2065 | |
2066 | |
2067 | -class RevisionControlSystemsExtended(RevisionControlSystems): |
2068 | - """External RCS plus Bazaar.""" |
2069 | - BZR = DBItem(99, """ |
2070 | - Bazaar |
2071 | - |
2072 | - External Bazaar branch. |
2073 | - """) |
2074 | - |
2075 | - |
2076 | class SetBranchForm(Interface): |
2077 | """The fields presented on the form for setting a branch.""" |
2078 | |
2079 | @@ -844,7 +835,7 @@ |
2080 | ['cvs_module']) |
2081 | |
2082 | rcs_type = Choice(title=_("Type of RCS"), |
2083 | - required=False, vocabulary=RevisionControlSystemsExtended, |
2084 | + required=False, vocabulary=RevisionControlSystems, |
2085 | description=_( |
2086 | "The version control system to import from. ")) |
2087 | |
2088 | @@ -908,7 +899,7 @@ |
2089 | @property |
2090 | def initial_values(self): |
2091 | return dict( |
2092 | - rcs_type=RevisionControlSystemsExtended.BZR, |
2093 | + rcs_type=RevisionControlSystems.BZR, |
2094 | branch_type=LINK_LP_BZR, |
2095 | branch_location=self.context.branch) |
2096 | |
2097 | @@ -989,7 +980,7 @@ |
2098 | self.setFieldError( |
2099 | 'rcs_type', |
2100 | 'You must specify the type of RCS for the remote host.') |
2101 | - elif rcs_type == RevisionControlSystemsExtended.CVS: |
2102 | + elif rcs_type == RevisionControlSystems.CVS: |
2103 | if 'cvs_module' not in data: |
2104 | self.setFieldError( |
2105 | 'cvs_module', |
2106 | @@ -1022,8 +1013,9 @@ |
2107 | # Extend the allowed schemes for the repository URL based on |
2108 | # rcs_type. |
2109 | extra_schemes = { |
2110 | - RevisionControlSystemsExtended.BZR_SVN: ['svn'], |
2111 | - RevisionControlSystemsExtended.GIT: ['git'], |
2112 | + RevisionControlSystems.BZR_SVN: ['svn'], |
2113 | + RevisionControlSystems.GIT: ['git'], |
2114 | + RevisionControlSystems.BZR: ['bzr'], |
2115 | } |
2116 | schemes.update(extra_schemes.get(rcs_type, [])) |
2117 | return schemes |
2118 | @@ -1050,7 +1042,7 @@ |
2119 | # The branch location is not required for validation. |
2120 | self._setRequired(['branch_location'], False) |
2121 | # The cvs_module is required if it is a CVS import. |
2122 | - if rcs_type == RevisionControlSystemsExtended.CVS: |
2123 | + if rcs_type == RevisionControlSystems.CVS: |
2124 | self._setRequired(['cvs_module'], True) |
2125 | else: |
2126 | raise AssertionError("Unknown branch type %s" % branch_type) |
2127 | @@ -1110,7 +1102,7 @@ |
2128 | # Either create an externally hosted bzr branch |
2129 | # (a.k.a. 'mirrored') or create a new code import. |
2130 | rcs_type = data.get('rcs_type') |
2131 | - if rcs_type == RevisionControlSystemsExtended.BZR: |
2132 | + if rcs_type == RevisionControlSystems.BZR: |
2133 | branch = self._createBzrBranch( |
2134 | BranchType.MIRRORED, branch_name, branch_owner, |
2135 | data['repo_url']) |
2136 | @@ -1123,7 +1115,7 @@ |
2137 | 'the series.') |
2138 | else: |
2139 | # We need to create an import request. |
2140 | - if rcs_type == RevisionControlSystemsExtended.CVS: |
2141 | + if rcs_type == RevisionControlSystems.CVS: |
2142 | cvs_root = data.get('repo_url') |
2143 | cvs_module = data.get('cvs_module') |
2144 | url = None |
2145 | |
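The hunks above drop the view-local `RevisionControlSystemsExtended` vocabulary in favour of the shared `RevisionControlSystems` enum (which now includes `BZR` directly) and teach the URL validator one more scheme. A minimal sketch of that per-RCS-type scheme lookup, with illustrative names rather than Launchpad's actual enum values:

```python
# Base schemes allowed for any import URL; extra schemes depend on RCS type.
BASE_SCHEMES = ['http', 'https']

EXTRA_SCHEMES = {
    'BZR_SVN': ['svn'],
    'GIT': ['git'],
    'BZR': ['bzr'],  # newly allowed now that Bazaar code imports exist
}

def allowed_schemes(rcs_type):
    """Return the set of URL schemes accepted for a given RCS type."""
    return set(BASE_SCHEMES) | set(EXTRA_SCHEMES.get(rcs_type, []))
```

As in the diff, types without an entry (e.g. CVS) simply fall back to the base schemes via `dict.get` with a default.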
2146 | === modified file 'lib/lp/registry/browser/tests/distribution-views.txt' |
2147 | --- lib/lp/registry/browser/tests/distribution-views.txt 2011-04-08 13:04:24 +0000 |
2148 | +++ lib/lp/registry/browser/tests/distribution-views.txt 2011-08-16 00:31:36 +0000 |
2149 | @@ -127,7 +127,8 @@ |
2150 | |
2151 | >>> view.field_names |
2152 | ['displayname', 'title', 'summary', 'description', |
2153 | - 'bug_reporting_guidelines', 'bug_reported_acknowledgement', 'icon', |
2154 | + 'bug_reporting_guidelines', 'bug_reported_acknowledgement', |
2155 | + 'package_derivatives_email', 'icon', |
2156 | 'logo', 'mugshot', 'official_malone', 'enable_bug_expiration', |
2157 | 'blueprints_usage', 'official_rosetta', 'answers_usage', |
2158 | 'translation_focus'] |
2159 | |
2160 | === modified file 'lib/lp/registry/browser/tests/test_distribution_views.py' |
2161 | --- lib/lp/registry/browser/tests/test_distribution_views.py 2011-03-16 17:14:26 +0000 |
2162 | +++ lib/lp/registry/browser/tests/test_distribution_views.py 2011-08-16 00:31:36 +0000 |
2163 | @@ -14,6 +14,7 @@ |
2164 | from lp.registry.interfaces.distribution import IDistributionSet |
2165 | from lp.testing import ( |
2166 | login_celebrity, |
2167 | + person_logged_in, |
2168 | TestCaseWithFactory, |
2169 | ) |
2170 | from lp.testing.sampledata import LAUNCHPAD_ADMIN |
2171 | @@ -126,6 +127,26 @@ |
2172 | self.assertEqual(distribution.registrant, admin) |
2173 | |
2174 | |
2175 | +class TestDistroEditView(TestCaseWithFactory): |
2176 | + """Test the +edit page for a distro.""" |
2177 | + |
2178 | + layer = DatabaseFunctionalLayer |
2179 | + |
2180 | + def test_package_derivatives_email(self): |
2181 | + # Test that the edit form allows changing package_derivatives_email. |
2182 | + distro = self.factory.makeDistribution() |
2183 | + email = '{package_name}_thing@foo.com' |
2184 | + form = { |
2185 | + 'field.package_derivatives_email': email, |
2186 | + 'field.actions.change': 'Change', |
2187 | + } |
2188 | + with person_logged_in(distro.owner): |
2189 | + create_initialized_view( |
2190 | + distro, '+edit', principal=distro.owner, method="POST", |
2191 | + form=form) |
2192 | + self.assertEqual(distro.package_derivatives_email, email) |
2193 | + |
2194 | + |
2195 | class TestDistroReassignView(TestCaseWithFactory): |
2196 | """Test the +reassign page for a new distribution.""" |
2197 | |
2198 | |
2199 | === modified file 'lib/lp/registry/configure.zcml' |
2200 | --- lib/lp/registry/configure.zcml 2011-08-09 04:09:33 +0000 |
2201 | +++ lib/lp/registry/configure.zcml 2011-08-16 00:31:36 +0000 |
2202 | @@ -1539,6 +1539,7 @@ |
2203 | official_blueprints |
2204 | official_malone |
2205 | owner |
2206 | + package_derivatives_email |
2207 | security_contact |
2208 | summary |
2209 | title"/> |
2210 | |
2211 | === modified file 'lib/lp/registry/interfaces/distribution.py' |
2212 | --- lib/lp/registry/interfaces/distribution.py 2011-07-22 21:47:34 +0000 |
2213 | +++ lib/lp/registry/interfaces/distribution.py 2011-08-16 00:31:36 +0000 |
2214 | @@ -282,6 +282,13 @@ |
2215 | uploaders = Attribute(_( |
2216 | "ArchivePermission records for uploaders with rights to upload to " |
2217 | "this distribution.")) |
2218 | + package_derivatives_email = TextLine( |
2219 | + title=_("Package Derivatives Email Address"), |
2220 | + description=_( |
2221 | + "The email address to send information about updates to packages " |
2222 | + "that are derived from another distribution. The sequence " |
2223 | + "{package_name} is replaced with the actual package name."), |
2224 | + required=False) |
2225 | |
2226 | # properties |
2227 | currentseries = exported( |
2228 | |
2229 | === modified file 'lib/lp/registry/model/distribution.py' |
2230 | --- lib/lp/registry/model/distribution.py 2011-07-22 21:47:34 +0000 |
2231 | +++ lib/lp/registry/model/distribution.py 2011-08-16 00:31:36 +0000 |
2232 | @@ -276,6 +276,7 @@ |
2233 | schema=TranslationPermission, default=TranslationPermission.OPEN) |
2234 | active = True |
2235 | max_bug_heat = Int() |
2236 | + package_derivatives_email = StringCol(notNull=False, default=None) |
2237 | |
2238 | def __repr__(self): |
2239 | displayname = self.displayname.encode('ASCII', 'backslashreplace') |
2240 | |
2241 | === modified file 'lib/lp/registry/tests/test_distribution.py' |
2242 | --- lib/lp/registry/tests/test_distribution.py 2011-08-03 11:00:11 +0000 |
2243 | +++ lib/lp/registry/tests/test_distribution.py 2011-08-16 00:31:36 +0000 |
2244 | @@ -7,12 +7,14 @@ |
2245 | |
2246 | from lazr.lifecycle.snapshot import Snapshot |
2247 | import soupmatchers |
2248 | +from storm.store import Store |
2249 | from testtools import ExpectedException |
2250 | from testtools.matchers import ( |
2251 | MatchesAny, |
2252 | Not, |
2253 | ) |
2254 | from zope.component import getUtility |
2255 | +from zope.security.interfaces import Unauthorized |
2256 | from zope.security.proxy import removeSecurityProxy |
2257 | |
2258 | from canonical.database.constants import UTC_NOW |
2259 | @@ -40,6 +42,7 @@ |
2260 | ) |
2261 | from lp.testing import ( |
2262 | login_person, |
2263 | + person_logged_in, |
2264 | TestCaseWithFactory, |
2265 | ) |
2266 | from lp.testing.matchers import Provides |
2267 | @@ -186,6 +189,23 @@ |
2268 | sourcepackage.distribution.guessPublishedSourcePackageName( |
2269 | 'my-package').name) |
2270 | |
2271 | + def test_derivatives_email(self): |
2272 | + # Make sure the package_derivatives_email column stores data |
2273 | + # correctly. |
2274 | + email = "thingy@foo.com" |
2275 | + distro = self.factory.makeDistribution() |
2276 | + with person_logged_in(distro.owner): |
2277 | + distro.package_derivatives_email = email |
2278 | + Store.of(distro).flush() |
2279 | + self.assertEqual(email, distro.package_derivatives_email) |
2280 | + |
2281 | + def test_derivatives_email_permissions(self): |
2282 | + # package_derivatives_email requires lp.edit to set/change. |
2283 | + distro = self.factory.makeDistribution() |
2284 | + self.assertRaises( |
2285 | + Unauthorized, |
2286 | + setattr, distro, "package_derivatives_email", "foo") |
2287 | + |
2288 | |
2289 | class TestDistributionCurrentSourceReleases( |
2290 | TestDistroSeriesCurrentSourceReleases): |
2291 | |
2292 | === modified file 'lib/lp/soyuz/adapters/notification.py' |
2293 | --- lib/lp/soyuz/adapters/notification.py 2011-07-29 15:17:22 +0000 |
2294 | +++ lib/lp/soyuz/adapters/notification.py 2011-08-16 00:31:36 +0000 |
2295 | @@ -251,7 +251,9 @@ |
2296 | elif bprs: |
2297 | name = bprs[0].build.source_package_release.name |
2298 | if name: |
2299 | - bcc_addr = '%s_derivatives@packages.qa.debian.org' % name |
2300 | + email_base = distroseries.distribution.package_derivatives_email |
2301 | + if email_base: |
2302 | + bcc_addr = email_base.format(package_name=name) |
2303 | |
2304 | build_and_send_mail( |
2305 | 'announcement', [str(distroseries.changeslist)], from_addr, |
2306 | |
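The notification change above replaces the hard-coded `_derivatives@packages.qa.debian.org` address with a per-distribution template string in which `{package_name}` is expanded via `str.format`, and skips the BCC entirely when no template is configured. A small sketch of that guard-plus-expansion (the helper name is illustrative, not Launchpad's API):

```python
def derivatives_bcc(email_base, package_name):
    """Expand a distribution's package_derivatives_email template.

    Returns None when the distribution has no template configured,
    mirroring the `if email_base:` guard in the diff above.
    """
    if not email_base:
        return None
    return email_base.format(package_name=package_name)
```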
2307 | === modified file 'lib/lp/soyuz/adapters/tests/test_notification.py' |
2308 | --- lib/lp/soyuz/adapters/tests/test_notification.py 2011-07-29 15:17:22 +0000 |
2309 | +++ lib/lp/soyuz/adapters/tests/test_notification.py 2011-08-16 00:31:36 +0000 |
2310 | @@ -38,7 +38,10 @@ |
2311 | from lp.soyuz.model.distroseriessourcepackagerelease import ( |
2312 | DistroSeriesSourcePackageRelease, |
2313 | ) |
2314 | -from lp.testing import TestCaseWithFactory |
2315 | +from lp.testing import ( |
2316 | + person_logged_in, |
2317 | + TestCaseWithFactory, |
2318 | + ) |
2319 | from lp.testing.mail_helpers import pop_notifications |
2320 | |
2321 | |
2322 | @@ -83,20 +86,28 @@ |
2323 | 'accepted') |
2324 | self.assertEqual(expected_subject, subject) |
2325 | |
2326 | + def _setup_notification(self, from_person=None, distroseries=None, |
2327 | + spr=None): |
2328 | + if spr is None: |
2329 | + spr = self.factory.makeSourcePackageRelease() |
2330 | + self.factory.makeSourcePackageReleaseFile(sourcepackagerelease=spr) |
2331 | + archive = self.factory.makeArchive(purpose=ArchivePurpose.PRIMARY) |
2332 | + pocket = PackagePublishingPocket.RELEASE |
2333 | + if distroseries is None: |
2334 | + distroseries = self.factory.makeDistroSeries() |
2335 | + distroseries.changeslist = "blah@example.com" |
2336 | + blamer = self.factory.makePerson() |
2337 | + if from_person is None: |
2338 | + from_person = self.factory.makePerson() |
2339 | + notify( |
2340 | + blamer, spr, [], [], archive, distroseries, pocket, |
2341 | + action='accepted', announce_from_person=from_person) |
2342 | + |
2343 | def test_notify_from_person_override(self): |
2344 | # notify() takes an optional from_person to override the calculated |
2345 | # From: address in announcement emails. |
2346 | - spr = self.factory.makeSourcePackageRelease() |
2347 | - self.factory.makeSourcePackageReleaseFile(sourcepackagerelease=spr) |
2348 | - archive = self.factory.makeArchive(purpose=ArchivePurpose.PRIMARY) |
2349 | - pocket = PackagePublishingPocket.RELEASE |
2350 | - distroseries = self.factory.makeDistroSeries() |
2351 | - distroseries.changeslist = "blah@example.com" |
2352 | - blamer = self.factory.makePerson() |
2353 | from_person = self.factory.makePerson() |
2354 | - notify( |
2355 | - blamer, spr, [], [], archive, distroseries, pocket, |
2356 | - action='accepted', announce_from_person=from_person) |
2357 | + self._setup_notification(from_person=from_person) |
2358 | notifications = pop_notifications() |
2359 | self.assertEqual(2, len(notifications)) |
2360 | # The first notification is to the blamer, |
2361 | @@ -105,6 +116,22 @@ |
2362 | self.assertEqual( |
2363 | from_person.preferredemail.email, notifications[1]["From"]) |
2364 | |
2365 | + def test_notify_bcc_to_derivatives_list(self): |
2366 | + # notify() will BCC the announcement email to the address defined in |
2367 | + # Distribution.package_derivatives_email if it's defined. |
2368 | + email = "{package_name}_thing@foo.com" |
2369 | + distroseries = self.factory.makeDistroSeries() |
2370 | + with person_logged_in(distroseries.distribution.owner): |
2371 | + distroseries.distribution.package_derivatives_email = email |
2372 | + spr = self.factory.makeSourcePackageRelease() |
2373 | + self._setup_notification(distroseries=distroseries, spr=spr) |
2374 | + |
2375 | + notifications = pop_notifications() |
2376 | + self.assertEqual(2, len(notifications)) |
2377 | + bcc_address = notifications[1]["Bcc"] |
2378 | + expected_email = email.format(package_name=spr.sourcepackagename.name) |
2379 | + self.assertIn(expected_email, bcc_address) |
2380 | + |
2381 | |
2382 | class TestNotification(TestCaseWithFactory): |
2383 | |
2384 | |
2385 | === modified file 'lib/lp/testing/factory.py' |
2386 | --- lib/lp/testing/factory.py 2011-08-10 07:02:59 +0000 |
2387 | +++ lib/lp/testing/factory.py 2011-08-16 00:31:36 +0000 |
2388 | @@ -474,7 +474,7 @@ |
2389 | branch_id = self.getUniqueInteger() |
2390 | if rcstype is None: |
2391 | rcstype = 'svn' |
2392 | - if rcstype in ['svn', 'bzr-svn', 'hg']: |
2393 | + if rcstype in ['svn', 'bzr-svn', 'hg', 'bzr']: |
2394 | assert cvs_root is cvs_module is None |
2395 | if url is None: |
2396 | url = self.getUniqueURL() |
2397 | @@ -2123,7 +2123,8 @@ |
2398 | |
2399 | def makeCodeImport(self, svn_branch_url=None, cvs_root=None, |
2400 | cvs_module=None, target=None, branch_name=None, |
2401 | - git_repo_url=None, hg_repo_url=None, registrant=None, |
2402 | + git_repo_url=None, hg_repo_url=None, |
2403 | + bzr_branch_url=None, registrant=None, |
2404 | rcs_type=None, review_status=None): |
2405 | """Create and return a new, arbitrary code import. |
2406 | |
2407 | @@ -2132,7 +2133,7 @@ |
2408 | unique URL. |
2409 | """ |
2410 | if (svn_branch_url is cvs_root is cvs_module is git_repo_url is |
2411 | - hg_repo_url is None): |
2412 | + hg_repo_url is bzr_branch_url is None): |
2413 | svn_branch_url = self.getUniqueURL() |
2414 | |
2415 | if target is None: |
2416 | @@ -2163,6 +2164,11 @@ |
2417 | registrant, target, branch_name, |
2418 | rcs_type=RevisionControlSystems.HG, |
2419 | url=hg_repo_url) |
2420 | + elif bzr_branch_url is not None: |
2421 | + code_import = code_import_set.new( |
2422 | + registrant, target, branch_name, |
2423 | + rcs_type=RevisionControlSystems.BZR, |
2424 | + url=bzr_branch_url) |
2425 | else: |
2426 | assert rcs_type in (None, RevisionControlSystems.CVS) |
2427 | code_import = code_import_set.new( |
2428 | |
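`makeCodeImport` above gains a `bzr_branch_url` keyword and dispatches on whichever URL argument is non-None, defaulting to a unique SVN URL when none is given. A hedged sketch of that dispatch pattern; the names and fallback URL here are hypothetical, not the factory's real API:

```python
def pick_rcs(svn_branch_url=None, git_repo_url=None, hg_repo_url=None,
             bzr_branch_url=None):
    """Map the first non-None URL argument to an (rcs_type, url) pair."""
    candidates = [
        ('SVN', svn_branch_url),
        ('GIT', git_repo_url),
        ('HG', hg_repo_url),
        ('BZR', bzr_branch_url),
    ]
    for rcs_type, url in candidates:
        if url is not None:
            return rcs_type, url
    # No URL given: fall back to SVN, as the factory defaults to.
    return 'SVN', 'http://svn.example.com/branch'
```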
2429 | === modified file 'lib/lp/testing/pgsql.py' |
2430 | --- lib/lp/testing/pgsql.py 2011-07-01 07:11:27 +0000 |
2431 | +++ lib/lp/testing/pgsql.py 2011-08-16 00:31:36 +0000 |
2432 | @@ -399,7 +399,11 @@ |
2433 | # always having this is a problem. |
2434 | try: |
2435 | cur = con.cursor() |
2436 | - cur.execute('SELECT _killall_backends(%s)', [self.dbname]) |
2437 | + cur.execute(""" |
2438 | + SELECT pg_terminate_backend(procpid) |
2439 | + FROM pg_stat_activity |
2440 | + WHERE procpid <> pg_backend_pid() AND datname=%s |
2441 | + """, [self.dbname]) |
2442 | except psycopg2.DatabaseError: |
2443 | pass |
2444 | |
2445 | |
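The `pgsql.py` hunk above inlines what the dropped `_killall_backends` stored procedure did: terminate every backend connected to the test database except the current one. Note it queries `pg_stat_activity.procpid`, the column name on PostgreSQL versions before 9.2 (later versions renamed it to `pid`). A sketch that builds the same query, parameterising that column name:

```python
def terminate_backends_sql(pid_column='procpid'):
    """Build the backend-termination query used by the test harness.

    pid_column is 'procpid' on PostgreSQL < 9.2 (as in the diff above)
    and 'pid' on 9.2 and later.
    """
    return (
        "SELECT pg_terminate_backend({col}) "
        "FROM pg_stat_activity "
        "WHERE {col} <> pg_backend_pid() AND datname = %s"
    ).format(col=pid_column)
```

Usage would mirror the diff: `cur.execute(terminate_backends_sql(), [self.dbname])`, with the database name passed as a bound parameter rather than interpolated.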
2446 | === modified file 'lib/lp/translations/configure.zcml' |
2447 | --- lib/lp/translations/configure.zcml 2011-07-30 14:21:23 +0000 |
2448 | +++ lib/lp/translations/configure.zcml 2011-08-16 00:31:36 +0000 |
2449 | @@ -154,6 +154,10 @@ |
2450 | for="lp.registry.interfaces.packaging.IPackaging |
2451 | lazr.lifecycle.interfaces.IObjectEvent" |
2452 | handler=".model.translationsharingjob.schedule_packaging_job" /> |
2453 | + <subscriber |
2454 | + for="lp.translations.interfaces.potemplate.IPOTemplate |
2455 | + lazr.lifecycle.interfaces.IObjectModifiedEvent" |
2456 | + handler=".model.translationsharingjob.schedule_potemplate_job" /> |
2457 | <facet |
2458 | facet="translations"> |
2459 | |
2460 | @@ -643,6 +647,10 @@ |
2461 | class="lp.translations.model.translationpackagingjob.TranslationSplitJob"> |
2462 | <allow interface='lp.services.job.interfaces.job.IRunnableJob'/> |
2463 | </class> |
2464 | + <class |
2465 | + class="lp.translations.model.translationpackagingjob.TranslationTemplateChangeJob"> |
2466 | + <allow interface='lp.services.job.interfaces.job.IRunnableJob'/> |
2467 | + </class> |
2468 | <utility |
2469 | component="lp.translations.model.translationtemplatesbuildjob.TranslationTemplatesBuildJob" |
2470 | provides="lp.buildmaster.interfaces.buildfarmjob.IBuildFarmJob" |
2471 | |
2472 | === modified file 'lib/lp/translations/interfaces/translationsharingjob.py' |
2473 | --- lib/lp/translations/interfaces/translationsharingjob.py 2011-07-30 14:05:25 +0000 |
2474 | +++ lib/lp/translations/interfaces/translationsharingjob.py 2011-08-16 00:31:36 +0000 |
2475 | @@ -19,3 +19,6 @@ |
2476 | |
2477 | sourcepackagename = Attribute( |
2478 | _("The sourcepackagename of the Packaging.")) |
2479 | + |
2480 | + potemplate = Attribute( |
2481 | + _("The POTemplate to pass around as the relevant template.")) |
2482 | |
2483 | === modified file 'lib/lp/translations/model/translationpackagingjob.py' |
2484 | --- lib/lp/translations/model/translationpackagingjob.py 2011-07-30 14:18:38 +0000 |
2485 | +++ lib/lp/translations/model/translationpackagingjob.py 2011-08-16 00:31:36 +0000 |
2486 | @@ -10,6 +10,7 @@ |
2487 | __all__ = [ |
2488 | 'TranslationMergeJob', |
2489 | 'TranslationSplitJob', |
2490 | + 'TranslationTemplateChangeJob', |
2491 | ] |
2492 | |
2493 | import logging |
2494 | @@ -17,6 +18,7 @@ |
2495 | from lazr.lifecycle.interfaces import ( |
2496 | IObjectCreatedEvent, |
2497 | IObjectDeletedEvent, |
2498 | + IObjectModifiedEvent, |
2499 | ) |
2500 | import transaction |
2501 | from zope.interface import ( |
2502 | @@ -40,7 +42,10 @@ |
2503 | TransactionManager, |
2504 | TranslationMerger, |
2505 | ) |
2506 | -from lp.translations.utilities.translationsplitter import TranslationSplitter |
2507 | +from lp.translations.utilities.translationsplitter import ( |
2508 | + TranslationSplitter, |
2509 | + TranslationTemplateSplitter, |
2510 | + ) |
2511 | |
2512 | |
2513 | class TranslationPackagingJob(TranslationSharingJobDerived, BaseRunnableJob): |
2514 | @@ -117,3 +122,31 @@ |
2515 | 'Splitting %s and %s', self.productseries.displayname, |
2516 | self.sourcepackage.displayname) |
2517 | TranslationSplitter(self.productseries, self.sourcepackage).split() |
2518 | + |
2519 | + |
2520 | +class TranslationTemplateChangeJob(TranslationPackagingJob): |
2521 | + """Job for merging/splitting translations when template is changed.""" |
2522 | + |
2523 | + implements(IRunnableJob) |
2524 | + |
2525 | + class_job_type = TranslationSharingJobType.TEMPLATE_CHANGE |
2526 | + |
2527 | + create_on_event = IObjectModifiedEvent |
2528 | + |
2529 | + @classmethod |
2530 | + def forPOTemplate(cls, potemplate): |
2531 | + """Create a TranslationTemplateChangeJob for a POTemplate. |
2532 | + |
2533 | + :param potemplate: The `POTemplate` to create the job for. |
2534 | + :return: A `TranslationTemplateChangeJob`. |
2535 | + """ |
2536 | + return cls.create(potemplate=potemplate) |
2537 | + |
2538 | + def run(self): |
2539 | + """See `IRunnableJob`.""" |
2540 | + logger = logging.getLogger() |
2541 | + logger.info("Sanitizing translations for '%s'" % ( |
2542 | + self.potemplate.displayname)) |
2543 | + TranslationTemplateSplitter(self.potemplate).split() |
2544 | + tm = TransactionManager(transaction.manager, False) |
2545 | + TranslationMerger.mergeModifiedTemplates(self.potemplate, tm) |
2546 | |
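`TranslationTemplateChangeJob.run()` above performs two phases: `TranslationTemplateSplitter` detaches the renamed template's messages from its old sharing set, then `TranslationMerger.mergeModifiedTemplates` merges them into the set matching its new name. A toy sketch of that split-then-merge sequence on plain data structures, purely to illustrate the flow (none of these names are Launchpad's):

```python
def rehome_template(sharing_sets, template, new_name):
    """Move `template` from the set keyed by its old name to the set
    keyed by `new_name`, creating the target set if needed."""
    old_set = sharing_sets.get(template['name'])
    if old_set and template in old_set:
        old_set.remove(template)          # the "split" phase
    template['name'] = new_name
    # The "merge" phase: join whichever set shares the new name.
    sharing_sets.setdefault(new_name, []).append(template)
    return sharing_sets
```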
2547 | === modified file 'lib/lp/translations/model/translationsharingjob.py' |
2548 | --- lib/lp/translations/model/translationsharingjob.py 2011-07-30 14:30:41 +0000 |
2549 | +++ lib/lp/translations/model/translationsharingjob.py 2011-08-16 00:31:36 +0000 |
2550 | @@ -37,6 +37,7 @@ |
2551 | from lp.translations.interfaces.translationsharingjob import ( |
2552 | ITranslationSharingJob, |
2553 | ) |
2554 | +from lp.translations.model.potemplate import POTemplate |
2555 | |
2556 | |
2557 | class TranslationSharingJobType(DBEnumeratedType): |
2558 | @@ -54,6 +55,12 @@ |
2559 | Split translations between productseries and sourcepackage. |
2560 | """) |
2561 | |
2562 | + TEMPLATE_CHANGE = DBItem(2, """ |
2563 | + Split/merge translations for a single translation template. |
2564 | + |
2565 | + Split/merge translations for a single translation template. |
2566 | + """) |
2567 | + |
2568 | |
2569 | class TranslationSharingJob(StormBase): |
2570 | """Base class for jobs related to a packaging.""" |
2571 | @@ -82,8 +89,12 @@ |
2572 | |
2573 | sourcepackagename = Reference(sourcepackagename_id, SourcePackageName.id) |
2574 | |
2575 | + potemplate_id = Int('potemplate') |
2576 | + |
2577 | + potemplate = Reference(potemplate_id, POTemplate.id) |
2578 | + |
2579 | def __init__(self, job, job_type, productseries, distroseries, |
2580 | - sourcepackagename): |
2581 | + sourcepackagename, potemplate=None): |
2582 | """"Constructor. |
2583 | |
2584 | :param job: The `Job` to use for storing basic job state. |
2585 | @@ -96,6 +107,7 @@ |
2586 | self.distroseries = distroseries |
2587 | self.sourcepackagename = sourcepackagename |
2588 | self.productseries = productseries |
2589 | + self.potemplate = potemplate |
2590 | |
2591 | |
2592 | class RegisteredSubclass(type): |
2593 | @@ -143,16 +155,18 @@ |
2594 | self.job = job |
2595 | |
2596 | @classmethod |
2597 | - def create(cls, productseries, distroseries, sourcepackagename): |
2598 | + def create(cls, productseries=None, distroseries=None, |
2599 | + sourcepackagename=None, potemplate=None): |
2600 | """"Create a TranslationPackagingJob backed by TranslationSharingJob. |
2601 | |
2602 | :param productseries: The ProductSeries side of the Packaging. |
2603 | :param distroseries: The distroseries of the Packaging sourcepackage. |
2604 | :param sourcepackagename: The name of the Packaging sourcepackage. |
2605 | + :param potemplate: POTemplate to restrict to (if any). |
2606 | """ |
2607 | context = TranslationSharingJob( |
2608 | Job(), cls.class_job_type, productseries, |
2609 | - distroseries, sourcepackagename) |
2610 | + distroseries, sourcepackagename, potemplate) |
2611 | return cls(context) |
2612 | |
2613 | @classmethod |
2614 | @@ -170,6 +184,27 @@ |
2615 | job_class.forPackaging(packaging) |
2616 | |
2617 | @classmethod |
2618 | + def schedulePOTemplateJob(cls, potemplate, event): |
2619 | + """Event subscriber to create a TranslationSharingJob on events. |
2620 | + |
2621 | + :param potemplate: The `POTemplate` to create |
2622 | + a `TranslationSharingJob` for. |
2623 | + :param event: The event itself. |
2624 | + """ |
2625 | + if ('name' not in event.edited_fields and |
2626 | + 'productseries' not in event.edited_fields and |
2627 | + 'distroseries' not in event.edited_fields and |
2628 | + 'sourcepackagename' not in event.edited_fields): |
2629 | + # Ignore changes to POTemplates that are neither renames, |
2630 | + # nor moves to a different package/project. |
2631 | + return |
2632 | + for event_type, job_classes in cls._event_types.iteritems(): |
2633 | + if not event_type.providedBy(event): |
2634 | + continue |
2635 | + for job_class in job_classes: |
2636 | + job_class.forPOTemplate(potemplate) |
2637 | + |
2638 | + @classmethod |
2639 | def iterReady(cls, extra_clauses): |
2640 | """See `IJobSource`. |
2641 | |
2642 | @@ -207,3 +242,4 @@ |
2643 | |
2644 | #make accessible to zcml |
2645 | schedule_packaging_job = TranslationSharingJobDerived.schedulePackagingJob |
2646 | +schedule_potemplate_job = TranslationSharingJobDerived.schedulePOTemplateJob |
2647 | |
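The new `schedulePOTemplateJob` subscriber above returns early unless the `ObjectModifiedEvent`'s `edited_fields` touches a field that affects translation sharing. The four-way `not in` conjunction in the diff is equivalent to a simple set-intersection predicate, sketched here:

```python
# Only renames and moves to another package/project should trigger a job.
RELEVANT_FIELDS = frozenset(
    ['name', 'productseries', 'distroseries', 'sourcepackagename'])

def needs_sharing_job(edited_fields):
    """Return True if any edited field affects translation sharing."""
    return bool(RELEVANT_FIELDS & set(edited_fields))
```

Under this predicate the subscriber would dispatch to each job class registered for the event's type, exactly as the `_event_types` loop in the diff does.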
2648 | === modified file 'lib/lp/translations/scripts/tests/test_packaging_translations.py' |
2649 | --- lib/lp/translations/scripts/tests/test_packaging_translations.py 2011-08-12 13:55:02 +0000 |
2650 | +++ lib/lp/translations/scripts/tests/test_packaging_translations.py 2011-08-16 00:31:36 +0000 |
2651 | @@ -39,6 +39,7 @@ |
2652 | INFO Deleted POTMsgSets: 1. TranslationMessages: 1. |
2653 | INFO Running TranslationSplitJob \(ID .*\) in status Waiting |
2654 | INFO Splitting .* and .* in Ubuntu Distroseries.* |
2655 | + INFO 1 entries split. |
2656 | INFO Ran 1 TranslationMergeJob jobs. |
2657 | INFO Ran 1 TranslationSplitJob jobs. |
2658 | """)) |
2659 | |
2660 | === modified file 'lib/lp/translations/tests/test_translationpackagingjob.py' |
2661 | --- lib/lp/translations/tests/test_translationpackagingjob.py 2011-08-01 14:32:07 +0000 |
2662 | +++ lib/lp/translations/tests/test_translationpackagingjob.py 2011-08-16 00:31:36 +0000 |
2663 | @@ -8,12 +8,17 @@ |
2664 | |
2665 | import transaction |
2666 | from zope.component import getUtility |
2667 | +from zope.event import notify |
2668 | + |
2669 | +from lazr.lifecycle.event import ObjectModifiedEvent |
2670 | +from lazr.lifecycle.snapshot import Snapshot |
2671 | |
2672 | from canonical.launchpad.webapp.testing import verifyObject |
2673 | from canonical.testing.layers import ( |
2674 | LaunchpadZopelessLayer, |
2675 | ) |
2676 | from lp.registry.interfaces.packaging import IPackagingUtil |
2677 | +from lp.translations.interfaces.potemplate import IPOTemplate |
2678 | from lp.translations.model.translationsharingjob import ( |
2679 | TranslationSharingJob, |
2680 | TranslationSharingJobDerived, |
2681 | @@ -36,6 +41,7 @@ |
2682 | TranslationMergeJob, |
2683 | TranslationPackagingJob, |
2684 | TranslationSplitJob, |
2685 | + TranslationTemplateChangeJob, |
2686 | ) |
2687 | from lp.translations.tests.test_translationsplitter import ( |
2688 | make_shared_potmsgset, |
2689 | @@ -101,20 +107,32 @@ |
2690 | |
2691 | class JobFinder: |
2692 | |
2693 | - def __init__(self, productseries, sourcepackage, job_class): |
2694 | - self.productseries = productseries |
2695 | - self.sourcepackagename = sourcepackage.sourcepackagename |
2696 | - self.distroseries = sourcepackage.distroseries |
2697 | + def __init__(self, productseries, sourcepackage, job_class, |
2698 | + potemplate=None): |
2699 | + if potemplate is None: |
2700 | + self.productseries = productseries |
2701 | + self.sourcepackagename = sourcepackage.sourcepackagename |
2702 | + self.distroseries = sourcepackage.distroseries |
2703 | + self.potemplate = None |
2704 | + else: |
2705 | + self.potemplate = potemplate |
2706 | self.job_type = job_class.class_job_type |
2707 | |
2708 | def find(self): |
2709 | - return list(TranslationSharingJobDerived.iterReady([ |
2710 | - TranslationSharingJob.productseries_id == self.productseries.id, |
2711 | - (TranslationSharingJob.sourcepackagename_id == |
2712 | - self.sourcepackagename.id), |
2713 | - TranslationSharingJob.distroseries_id == self.distroseries.id, |
2714 | - TranslationSharingJob.job_type == self.job_type, |
2715 | - ])) |
2716 | + if self.potemplate is None: |
2717 | + return list(TranslationSharingJobDerived.iterReady([ |
2718 | + TranslationSharingJob.productseries_id == self.productseries.id, |
2719 | + (TranslationSharingJob.sourcepackagename_id == |
2720 | + self.sourcepackagename.id), |
2721 | + TranslationSharingJob.distroseries_id == self.distroseries.id, |
2722 | + TranslationSharingJob.job_type == self.job_type, |
2723 | + ])) |
2724 | + else: |
2725 | + return list( |
2726 | + TranslationSharingJobDerived.iterReady([ |
2727 | + TranslationSharingJob.potemplate_id == self.potemplate.id, |
2728 | + TranslationSharingJob.job_type == self.job_type, |
2729 | + ])) |
2730 | |
2731 | |
2732 | class TestTranslationPackagingJob(TestCaseWithFactory): |
2733 | @@ -273,3 +291,62 @@ |
2734 | packaging.distroseries) |
2735 | (job,) = finder.find() |
2736 | self.assertIsInstance(job, TranslationSplitJob) |
2737 | + |
2738 | + |
2739 | +class TestTranslationTemplateChangeJob(TestCaseWithFactory): |
2740 | + |
2741 | + layer = LaunchpadZopelessLayer |
2742 | + |
2743 | + def test_modifyPOTemplate_makes_job(self): |
2744 | """Modifying a POTemplate should make a TranslationTemplateChangeJob.""" |
2745 | + potemplate = self.factory.makePOTemplate() |
2746 | + finder = JobFinder( |
2747 | + None, None, TranslationTemplateChangeJob, potemplate) |
2748 | + self.assertEqual([], finder.find()) |
2749 | + with person_logged_in(potemplate.owner): |
2750 | + snapshot = Snapshot(potemplate, providing=IPOTemplate) |
2751 | + potemplate.name = self.factory.getUniqueString() |
2752 | + notify(ObjectModifiedEvent(potemplate, snapshot, ["name"])) |
2753 | + |
2754 | + (job,) = finder.find() |
2755 | + self.assertIsInstance(job, TranslationTemplateChangeJob) |
2756 | + |
2757 | + def test_splits_and_merges(self): |
2758 | + """Changing a template makes the translations split and then |
2759 | + re-merged in the new target sharing set.""" |
2760 | + potemplate = self.factory.makePOTemplate(name='template') |
2761 | + other_ps = self.factory.makeProductSeries( |
2762 | + product=potemplate.productseries.product) |
2763 | + old_shared = self.factory.makePOTemplate(name='template', |
2764 | + productseries=other_ps) |
2765 | + new_shared = self.factory.makePOTemplate(name='renamed', |
2766 | + productseries=other_ps) |
2767 | + |
2768 | + # Set up shared POTMsgSets and translations. |
2769 | + potmsgset = self.factory.makePOTMsgSet(potemplate, sequence=1) |
2770 | + potmsgset.setSequence(old_shared, 1) |
2771 | + self.factory.makeCurrentTranslationMessage(potmsgset=potmsgset) |
2772 | + |
2773 | + # This is the identical English message in the new_shared template. |
2774 | + target_potmsgset = self.factory.makePOTMsgSet( |
2775 | + new_shared, sequence=1, singular=potmsgset.singular_text) |
2776 | + |
2777 | + # Rename the template and confirm that messages are now shared |
2778 | + # with new_shared instead of old_shared. |
2779 | + potemplate.name = 'renamed' |
2780 | + job = TranslationTemplateChangeJob.create(potemplate=potemplate) |
2781 | + |
2782 | + self.becomeDbUser('rosettaadmin') |
2783 | + job.run() |
2784 | + |
2785 | + # New POTMsgSet is now different from the old one (it's been split), |
2786 | + # but matches the target potmsgset (it's been merged into it). |
2787 | + new_potmsgset = potemplate.getPOTMsgSets()[0] |
2788 | + self.assertNotEqual(potmsgset, new_potmsgset) |
2789 | + self.assertEqual(target_potmsgset, new_potmsgset) |
2790 | + |
2791 | + # Translations have been merged as well. |
2792 | + self.assertContentEqual( |
2793 | + [tm.translations for tm in potmsgset.getAllTranslationMessages()], |
2794 | + [tm.translations |
2795 | + for tm in new_potmsgset.getAllTranslationMessages()]) |
2796 | |
2797 | === modified file 'lib/lp/translations/tests/test_translationsplitter.py' |
2798 | --- lib/lp/translations/tests/test_translationsplitter.py 2011-02-25 20:23:40 +0000 |
2799 | +++ lib/lp/translations/tests/test_translationsplitter.py 2011-08-16 00:31:36 +0000 |
2800 | @@ -13,6 +13,7 @@ |
2801 | ) |
2802 | from lp.translations.utilities.translationsplitter import ( |
2803 | TranslationSplitter, |
2804 | + TranslationTemplateSplitter, |
2805 | ) |
2806 | |
2807 | |
@@ -77,8 +78,6 @@
         splitter = make_translation_splitter(self.factory)
         upstream_item, ubuntu_item = make_shared_potmsgset(
             self.factory, splitter)
-        ubuntu_template = ubuntu_item.potemplate
-        ubuntu_sequence = ubuntu_item.sequence
         new_potmsgset = splitter.splitPOTMsgSet(ubuntu_item)
         self.assertEqual(new_potmsgset, ubuntu_item.potmsgset)

@@ -91,8 +90,7 @@
             potmsgset=upstream_item.potmsgset,
             potemplate=upstream_item.potemplate, diverged=True)
         splitter.splitPOTMsgSet(ubuntu_item)
-        upstream_translation = splitter.migrateTranslations(
-            upstream_item.potmsgset, ubuntu_item)
+        splitter.migrateTranslations(upstream_item.potmsgset, ubuntu_item)
         self.assertEqual(
             upstream_message,
             upstream_item.potmsgset.getAllTranslationMessages().one())
@@ -108,8 +106,7 @@
             potmsgset=ubuntu_item.potmsgset,
             potemplate=ubuntu_item.potemplate, diverged=True)
         splitter.splitPOTMsgSet(ubuntu_item)
-        upstream_translation = splitter.migrateTranslations(
-            upstream_item.potmsgset, ubuntu_item)
+        splitter.migrateTranslations(upstream_item.potmsgset, ubuntu_item)
         self.assertEqual(
             ubuntu_message,
             ubuntu_item.potmsgset.getAllTranslationMessages().one())
@@ -139,7 +136,7 @@
         splitter = make_translation_splitter(self.factory)
         upstream_item, ubuntu_item = make_shared_potmsgset(
             self.factory, splitter)
-        upstream_message = self.factory.makeCurrentTranslationMessage(
+        self.factory.makeCurrentTranslationMessage(
             potmsgset=upstream_item.potmsgset,
             potemplate=upstream_item.potemplate)
         splitter.split()
@@ -153,3 +150,183 @@
             upstream_item.potmsgset.getAllTranslationMessages().count(),
             ubuntu_item.potmsgset.getAllTranslationMessages().count(),
             )
+
+
+class TestTranslationTemplateSplitterBase:
+
+    layer = ZopelessDatabaseLayer
+
+    def getPOTMsgSetAndTemplateToSplit(self, splitter):
+        return [(tti1.potmsgset, tti1.potemplate)
+                for tti1, tti2 in splitter.findShared()]
+
+    def setUpSharingTemplates(self, other_side=False):
+        """Sets up two sharing templates with one sharing message and
+        one non-sharing message in each template."""
+        template1 = self.makePOTemplate()
+        template2 = self.makeSharingTemplate(template1, other_side)
+
+        shared_potmsgset = self.factory.makePOTMsgSet(template1, sequence=1)
+        shared_potmsgset.setSequence(template2, 1)
+
+        # POTMsgSets appearing in only one of the templates are not returned.
+        self.factory.makePOTMsgSet(template1, sequence=2)
+        self.factory.makePOTMsgSet(template2, sequence=2)
+        return template1, template2, shared_potmsgset
+
+    def makePOTemplate(self):
+        raise NotImplementedError('Subclasses should implement this.')
+
+    def makeSharingTemplate(self, template, other_side=False):
+        raise NotImplementedError('Subclasses should implement this.')
+
+    def test_findShared_renamed(self):
+        """Shared POTMsgSets are included for a renamed template."""
+        template1, template2, shared_potmsgset = self.setUpSharingTemplates()
+
+        splitter = TranslationTemplateSplitter(template2)
+        self.assertContentEqual([], splitter.findShared())
+
+        template2.name = 'renamed'
+        self.assertContentEqual(
+            [(shared_potmsgset, template1)],
+            self.getPOTMsgSetAndTemplateToSplit(splitter))
+
+    def test_findShared_moved_product(self):
+        """Moving a template to a different product splits its messages."""
+        template1, template2, shared_potmsgset = self.setUpSharingTemplates()
+
+        splitter = TranslationTemplateSplitter(template2)
+        self.assertContentEqual([], splitter.findShared())
+
+        # Move the template to a different product entirely.
+        template2.productseries = self.factory.makeProduct().development_focus
+        template2.distroseries = None
+        template2.sourcepackagename = None
+        self.assertContentEqual(
+            [(shared_potmsgset, template1)],
+            self.getPOTMsgSetAndTemplateToSplit(splitter))
+
+    def test_findShared_moved_distribution(self):
+        """Moving a template to a different distribution gets it split."""
+        template1, template2, shared_potmsgset = self.setUpSharingTemplates()
+
+        splitter = TranslationTemplateSplitter(template2)
+        self.assertContentEqual([], splitter.findShared())
+
+        # Move the template to a different distribution entirely.
+        sourcepackage = self.factory.makeSourcePackage()
+        template2.distroseries = sourcepackage.distroseries
+        template2.sourcepackagename = sourcepackage.sourcepackagename
+        template2.productseries = None
+        self.assertContentEqual(
+            [(shared_potmsgset, template1)],
+            self.getPOTMsgSetAndTemplateToSplit(splitter))
+
+    def test_findShared_moved_to_nonsharing_target(self):
+        """Moving a template to a target not sharing with the existing
+        upstreams and source package gets it split."""
+        template1, template2, shared_potmsgset = self.setUpSharingTemplates(
+            other_side=True)
+
+        splitter = TranslationTemplateSplitter(template2)
+        self.assertContentEqual([], splitter.findShared())
+
+        # Move the template to a different distribution entirely.
+        sourcepackage = self.factory.makeSourcePackage()
+        template2.distroseries = sourcepackage.distroseries
+        template2.sourcepackagename = sourcepackage.sourcepackagename
+        template2.productseries = None
+        self.assertContentEqual(
+            [(shared_potmsgset, template1)],
+            self.getPOTMsgSetAndTemplateToSplit(splitter))
+
+    def test_split_messages(self):
+        """Splitting messages works properly."""
+        template1, template2, shared_potmsgset = self.setUpSharingTemplates()
+
+        splitter = TranslationTemplateSplitter(template2)
+        self.assertContentEqual([], splitter.findShared())
+
+        # Move the template to a different product entirely.
+        template2.productseries = self.factory.makeProduct().development_focus
+        template2.distroseries = None
+        template2.sourcepackagename = None
+
+        other_item, this_item = splitter.findShared()[0]
+
+        splitter.split()
+
+        self.assertNotEqual(other_item.potmsgset, this_item.potmsgset)
+        self.assertEqual(shared_potmsgset, other_item.potmsgset)
+        self.assertNotEqual(shared_potmsgset, this_item.potmsgset)
+
+
+class TestProductTranslationTemplateSplitter(
+        TestCaseWithFactory, TestTranslationTemplateSplitterBase):
+    """Templates in a product get split appropriately."""
+
+    def makePOTemplate(self):
+        return self.factory.makePOTemplate(
+            name='template',
+            side=TranslationSide.UPSTREAM)
+
+    def makeSharingTemplate(self, template, other_side=False):
+        if other_side:
+            template2 = self.factory.makePOTemplate(
+                name='template',
+                side=TranslationSide.UBUNTU)
+            self.factory.makePackagingLink(
+                productseries=template.productseries,
+                distroseries=template2.distroseries,
+                sourcepackagename=template2.sourcepackagename)
+            return template2
+        else:
+            product = template.productseries.product
+            other_series = self.factory.makeProductSeries(product=product)
+            return self.factory.makePOTemplate(name='template',
+                                               productseries=other_series)
+
+
+class TestDistributionTranslationTemplateSplitter(
+        TestCaseWithFactory, TestTranslationTemplateSplitterBase):
+    """Templates in a distribution get split appropriately."""
+
+    def makePOTemplate(self):
+        return self.factory.makePOTemplate(
+            name='template',
+            side=TranslationSide.UBUNTU)
+
+    def makeSharingTemplate(self, template, other_side=False):
+        if other_side:
+            template2 = self.factory.makePOTemplate(
+                name='template',
+                side=TranslationSide.UPSTREAM)
+            self.factory.makePackagingLink(
+                productseries=template2.productseries,
+                distroseries=template.distroseries,
+                sourcepackagename=template.sourcepackagename)
+            return template2
+        else:
+            distro = template.distroseries.distribution
+            other_series = self.factory.makeDistroSeries(distribution=distro)
+            return self.factory.makePOTemplate(
+                name='template',
+                distroseries=other_series,
+                sourcepackagename=template.sourcepackagename)
+
+    def test_findShared_moved_sourcepackage(self):
+        """Moving a template to a different source package gets it split."""
+        template1, template2, shared_potmsgset = self.setUpSharingTemplates()
+
+        splitter = TranslationTemplateSplitter(template2)
+        self.assertContentEqual([], splitter.findShared())
+
+        # Move the template to a different source package inside the
+        # same distroseries.
+        sourcepackage = self.factory.makeSourcePackage(
+            distroseries=template2.distroseries)
+        template2.sourcepackagename = sourcepackage.sourcepackagename
+        self.assertContentEqual(
+            [(shared_potmsgset, template1)],
+            self.getPOTMsgSetAndTemplateToSplit(splitter))

=== modified file 'lib/lp/translations/translationmerger.py'
--- lib/lp/translations/translationmerger.py 2011-05-27 21:12:25 +0000
+++ lib/lp/translations/translationmerger.py 2011-08-16 00:31:36 +0000
@@ -387,6 +387,26 @@
         merger = cls(templates, tm)
         merger.mergePOTMsgSets()

+    @classmethod
+    def mergeModifiedTemplates(cls, potemplate, tm):
+        subset = getUtility(IPOTemplateSet).getSharingSubset(
+            distribution=potemplate.distribution,
+            sourcepackagename=potemplate.sourcepackagename,
+            product=potemplate.product)
+        templates = list(subset.getSharingPOTemplates(potemplate.name))
+        templates.sort(key=methodcaller('sharingKey'), reverse=True)
+        merger = cls(templates, tm)
+        merger.mergeAll()
+
+    def mergeAll(self):
+        """Properly merge POTMsgSets and TranslationMessages."""
+        self._removeDuplicateMessages()
+        self.tm.endTransaction(intermediate=True)
+        self.mergePOTMsgSets()
+        self.tm.endTransaction(intermediate=True)
+        self.mergeTranslationMessages()
+        self.tm.endTransaction()
+
     def __init__(self, potemplates, tm):
         """Constructor.

@@ -548,7 +568,6 @@
         deletions = 0
         order_check.check(template)
         potmsgset_ids = self._getPOTMsgSetIds(template)
-        total_ids = len(potmsgset_ids)
         for potmsgset_id in potmsgset_ids:
             potmsgset = POTMsgSet.get(potmsgset_id)

=== modified file 'lib/lp/translations/utilities/translationsplitter.py'
--- lib/lp/translations/utilities/translationsplitter.py 2011-05-12 20:21:58 +0000
+++ lib/lp/translations/utilities/translationsplitter.py 2011-08-16 00:31:36 +0000
@@ -6,50 +6,30 @@

 import logging

-from storm.locals import ClassAlias, Store
+from storm.expr import (
+    And,
+    Join,
+    LeftJoin,
+    Not,
+    Or,
+    )
+from storm.locals import (
+    ClassAlias,
+    Store,
+    )
 import transaction

+from lp.registry.model.distroseries import DistroSeries
+from lp.registry.model.packaging import Packaging
+from lp.registry.model.productseries import ProductSeries
 from lp.translations.model.potemplate import POTemplate
 from lp.translations.model.translationtemplateitem import (
     TranslationTemplateItem,
     )


-class TranslationSplitter:
-    """Split translations for a productseries, sourcepackage pair.
-
-    If a productseries and sourcepackage were linked in error, and then
-    unlinked, they may still share some translations. This class breaks those
-    associations.
-    """
-
-    def __init__(self, productseries, sourcepackage):
-        """Constructor.
-
-        :param productseries: The `ProductSeries` to split from.
-        :param sourcepackage: The `SourcePackage` to split from.
-        """
-        self.productseries = productseries
-        self.sourcepackage = sourcepackage
-
-    def findShared(self):
-        """Provide tuples of upstream, ubuntu for each shared POTMsgSet."""
-        store = Store.of(self.productseries)
-        UpstreamItem = ClassAlias(TranslationTemplateItem, 'UpstreamItem')
-        UpstreamTemplate = ClassAlias(POTemplate, 'UpstreamTemplate')
-        UbuntuItem = ClassAlias(TranslationTemplateItem, 'UbuntuItem')
-        UbuntuTemplate = ClassAlias(POTemplate, 'UbuntuTemplate')
-        return store.find(
-            (UpstreamItem, UbuntuItem),
-            UpstreamItem.potmsgsetID == UbuntuItem.potmsgsetID,
-            UbuntuItem.potemplateID == UbuntuTemplate.id,
-            UbuntuTemplate.sourcepackagenameID ==
-                self.sourcepackage.sourcepackagename.id,
-            UbuntuTemplate.distroseriesID ==
-                self.sourcepackage.distroseries.id,
-            UpstreamItem.potemplateID == UpstreamTemplate.id,
-            UpstreamTemplate.productseriesID == self.productseries.id,
-            )
+class TranslationSplitterBase:
+    """Base class for translation splitting jobs."""

     @staticmethod
     def splitPOTMsgSet(ubuntu_item):
@@ -86,9 +66,151 @@
         """Split the translations for the ProductSeries and SourcePackage."""
         logger = logging.getLogger()
         shared = enumerate(self.findShared(), 1)
+        total = 0
         for num, (upstream_item, ubuntu_item) in shared:
             self.splitPOTMsgSet(ubuntu_item)
             self.migrateTranslations(upstream_item.potmsgset, ubuntu_item)
             if num % 100 == 0:
                 logger.info('%d entries split. Committing...', num)
                 transaction.commit()
+            total = num
+
+        if total % 100 != 0 or total == 0:
+            transaction.commit()
+            logger.info('%d entries split.', total)
+
+
+class TranslationSplitter(TranslationSplitterBase):
+    """Split translations for a productseries, sourcepackage pair.
+
+    If a productseries and sourcepackage were linked in error, and then
+    unlinked, they may still share some translations. This class breaks those
+    associations.
+    """
+
+    def __init__(self, productseries, sourcepackage):
+        """Constructor.
+
+        :param productseries: The `ProductSeries` to split from.
+        :param sourcepackage: The `SourcePackage` to split from.
+        """
+        self.productseries = productseries
+        self.sourcepackage = sourcepackage
+
+    def findShared(self):
+        """Provide tuples of upstream, ubuntu for each shared POTMsgSet."""
+        store = Store.of(self.productseries)
+        UpstreamItem = ClassAlias(TranslationTemplateItem, 'UpstreamItem')
+        UpstreamTemplate = ClassAlias(POTemplate, 'UpstreamTemplate')
+        UbuntuItem = ClassAlias(TranslationTemplateItem, 'UbuntuItem')
+        UbuntuTemplate = ClassAlias(POTemplate, 'UbuntuTemplate')
+        return store.find(
+            (UpstreamItem, UbuntuItem),
+            UpstreamItem.potmsgsetID == UbuntuItem.potmsgsetID,
+            UbuntuItem.potemplateID == UbuntuTemplate.id,
+            UbuntuTemplate.sourcepackagenameID ==
+                self.sourcepackage.sourcepackagename.id,
+            UbuntuTemplate.distroseriesID ==
+                self.sourcepackage.distroseries.id,
+            UpstreamItem.potemplateID == UpstreamTemplate.id,
+            UpstreamTemplate.productseriesID == self.productseries.id,
+            )
+
+
+class TranslationTemplateSplitter(TranslationSplitterBase):
+    """Split translations for an extracted potemplate.
+
+    When a POTemplate is removed from a set of sharing templates,
+    it keeps sharing POTMsgSets with other templates. This class
+    removes those associations.
+    """
+
+    def __init__(self, potemplate):
+        """Constructor.
+
+        :param potemplate: The `POTemplate` to sanitize.
+        """
+        self.potemplate = potemplate
+
+    def findShared(self):
+        """Provide tuples of (other, this) items for each shared POTMsgSet.
+
+        Only return those that are shared but shouldn't be because they
+        are now in non-sharing templates.
+        """
+        store = Store.of(self.potemplate)
+        ThisItem = ClassAlias(TranslationTemplateItem, 'ThisItem')
+        OtherItem = ClassAlias(TranslationTemplateItem, 'OtherItem')
+        OtherTemplate = ClassAlias(POTemplate, 'OtherTemplate')
+
+        tables = [
+            OtherTemplate,
+            Join(OtherItem, OtherItem.potemplateID == OtherTemplate.id),
+            Join(ThisItem,
+                 And(ThisItem.potmsgsetID == OtherItem.potmsgsetID,
+                     ThisItem.potemplateID == self.potemplate.id)),
+            ]
+
+        if self.potemplate.productseries is not None:
+            # If the template is now in a product, we look for all
+            # effectively sharing templates that are in *different*
+            # products, or that are in a sourcepackage which is not
+            # linked (through Packaging table) with this product.
+            ps = self.potemplate.productseries
+            productseries_join = LeftJoin(
+                ProductSeries,
+                ProductSeries.id == OtherTemplate.productseriesID)
+            packaging_join = LeftJoin(
+                Packaging,
+                And(Packaging.productseriesID == ps.id,
+                    (Packaging.sourcepackagenameID ==
+                     OtherTemplate.sourcepackagenameID),
+                    Packaging.distroseriesID == OtherTemplate.distroseriesID
+                    ))
+            tables.extend([productseries_join, packaging_join])
+            # Template should not be sharing if...
+            other_clauses = Or(
+                # The name is different, or...
+                OtherTemplate.name != self.potemplate.name,
+                # It's in a different product, or...
+                And(Not(ProductSeries.id == None),
+                    ProductSeries.productID != ps.productID),
+                # There is no link between this product series and
+                # a source package the template is in.
+                And(Not(OtherTemplate.distroseriesID == None),
+                    Packaging.id == None))
+        else:
+            # If the template is now in a source package, we look for all
+            # effectively sharing templates that are in *different*
+            # distributions or source packages, or that are in a product
+            # which is not linked with this source package.
+            ds = self.potemplate.distroseries
+            spn = self.potemplate.sourcepackagename
+            distroseries_join = LeftJoin(
+                DistroSeries,
+                DistroSeries.id == OtherTemplate.distroseriesID)
+            packaging_join = LeftJoin(
+                Packaging,
+                And(Packaging.distroseriesID == ds.id,
+                    Packaging.sourcepackagenameID == spn.id,
+                    Packaging.productseriesID == OtherTemplate.productseriesID
+                    ))
+            tables.extend([distroseries_join, packaging_join])
+            # Template should not be sharing if...
+            other_clauses = Or(
+                # The name is different, or...
+                OtherTemplate.name != self.potemplate.name,
+                # It's in a different distribution or source package, or...
+                And(Not(DistroSeries.id == None),
+                    Or(DistroSeries.distributionID != ds.distributionID,
+                       OtherTemplate.sourcepackagenameID != spn.id)),
+                # There is no link between this source package and
+                # a product the template is in.
+                And(Not(OtherTemplate.productseriesID == None),
+                    Packaging.id == None))
+
+        return store.using(*tables).find(
+            (OtherItem, ThisItem),
+            OtherTemplate.id != self.potemplate.id,
+            other_clauses,
+            )

=== modified file 'scripts/code-import-worker.py'
--- scripts/code-import-worker.py 2011-06-16 23:43:04 +0000
+++ scripts/code-import-worker.py 2011-08-16 00:31:36 +0000
@@ -26,8 +26,9 @@
 from canonical.config import config
 from lp.codehosting import load_optional_plugin
 from lp.codehosting.codeimport.worker import (
-    BzrSvnImportWorker, CSCVSImportWorker, CodeImportSourceDetails,
-    GitImportWorker, HgImportWorker, get_default_bazaar_branch_store)
+    BzrImportWorker, BzrSvnImportWorker, CSCVSImportWorker,
+    CodeImportSourceDetails, GitImportWorker, HgImportWorker,
+    get_default_bazaar_branch_store)
 from canonical.launchpad import scripts


@@ -65,6 +66,10 @@
     elif source_details.rcstype == 'hg':
         load_optional_plugin('hg')
         import_worker_cls = HgImportWorker
+    elif source_details.rcstype == 'bzr':
+        load_optional_plugin('loom')
+        load_optional_plugin('weave_fmt')
+        import_worker_cls = BzrImportWorker
     elif source_details.rcstype in ['cvs', 'svn']:
         import_worker_cls = CSCVSImportWorker
     else:

=== modified file 'utilities/sourcedeps.cache'
--- utilities/sourcedeps.cache 2011-08-16 00:31:28 +0000
+++ utilities/sourcedeps.cache 2011-08-16 00:31:36 +0000
@@ -56,8 +56,8 @@
         "aaron@aaronbentley.com-20100715135013-uoi3q430urx9gwb8"
     ],
     "bzr-svn": [
-        2715,
-        "launchpad@pqm.canonical.com-20110810144016-m5f5pbnrpkbz04v3"
+        2716,
+        "launchpad@pqm.canonical.com-20110813142415-1izlitsieztuzkly"
     ],
     "bzr-hg": [
         287,

=== modified file 'utilities/sourcedeps.conf'
--- utilities/sourcedeps.conf 2011-08-16 00:31:28 +0000
+++ utilities/sourcedeps.conf 2011-08-16 00:31:36 +0000
@@ -2,11 +2,11 @@
 bzr-git lp:~launchpad-pqm/bzr-git/devel;revno=259
 bzr-hg lp:~launchpad-pqm/bzr-hg/devel;revno=287
 bzr-loom lp:~launchpad-pqm/bzr-loom/trunk;revno=50
-bzr-svn lp:~launchpad-pqm/bzr-svn/devel;revno=2715
+bzr-svn lp:~launchpad-pqm/bzr-svn/devel;revno=2716
 cscvs lp:~launchpad-pqm/launchpad-cscvs/devel;revno=432
 dulwich lp:~launchpad-pqm/dulwich/devel;revno=426
 difftacular lp:difftacular;revno=6
-loggerhead lp:~loggerhead-team/loggerhead/trunk-rich;revno=452
+loggerhead lp:~loggerhead-team/loggerhead/trunk-rich;revno=454
 lpreview lp:~launchpad-pqm/bzr-lpreview/devel;revno=23
 mailman lp:~launchpad-pqm/mailman/2.1;revno=976
 old_xmlplus lp:~launchpad-pqm/dtdparser/trunk;revno=4

=== modified file 'versions.cfg'
--- versions.cfg 2011-08-16 00:31:28 +0000
+++ versions.cfg 2011-08-16 00:31:36 +0000
@@ -7,7 +7,7 @@
 ampoule = 0.2.0
 amqplib = 0.6.1
 BeautifulSoup = 3.1.0.1
-bzr = 2.4b5
+bzr = 2.4.1dev-r6032
 chameleon.core = 1.0b35
 chameleon.zpt = 1.0b17
 ClientForm = 0.2.10
I don't know the ins and outs of bzrlib or codehosting - I guess
that's why you've asked Michael to review - but the rest looks good.
[1]
+    def test_partial(self):
+        # Skip tests of partial tests, as they are disabled for native imports
+        # at the moment.
+        return
You could use TestCase.skip() here:

    def test_partial(self):
        self.skip("Disabled for native imports at the moment.")
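For reference, a minimal, self-contained sketch of that suggestion using the standard library's unittest (the stdlib spelling is skipTest; testtools' TestCase.skip is assumed here to behave the same way, and the class name and message are illustrative, not from the branch):

```python
import unittest


class PartialImportTests(unittest.TestCase):
    """Hypothetical stand-in for the code import worker tests."""

    def test_partial(self):
        # Raises unittest.SkipTest, so the runner reports the test as
        # "skipped" instead of letting it pass silently via a bare return.
        self.skipTest("Disabled for native imports at the moment.")


if __name__ == "__main__":
    unittest.main()
```

The advantage over a bare `return` is that the disabled test still appears in the runner's summary, so it is harder to forget to re-enable it later.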