Merge lp:~lifeless/launchpad/bug-727560 into lp:launchpad

Proposed by Robert Collins
Status: Merged
Approved by: Robert Collins
Approved revision: no longer in the source branch.
Merged at revision: 12535
Proposed branch: lp:~lifeless/launchpad/bug-727560
Merge into: lp:launchpad
Diff against target: 1300 lines (+271/-300)
30 files modified
lib/canonical/launchpad/doc/publishing-security.txt (+6/-10)
lib/canonical/launchpad/testing/fakepackager.py (+2/-4)
lib/lp/archivepublisher/tests/archive-signing.txt (+1/-1)
lib/lp/archiveuploader/nascentupload.py (+3/-1)
lib/lp/archiveuploader/nascentuploadfile.py (+5/-3)
lib/lp/archiveuploader/tests/nascentupload-ddebs.txt (+2/-2)
lib/lp/archiveuploader/tests/test_ppauploadprocessor.py (+8/-11)
lib/lp/archiveuploader/tests/test_uploadprocessor.py (+5/-5)
lib/lp/registry/model/distroseriesdifference.py (+5/-6)
lib/lp/soyuz/browser/archive.py (+9/-15)
lib/lp/soyuz/browser/tests/publishing-views.txt (+1/-1)
lib/lp/soyuz/browser/tests/test_archive_packages.py (+2/-2)
lib/lp/soyuz/doc/archive.txt (+21/-156)
lib/lp/soyuz/doc/distribution.txt (+1/-1)
lib/lp/soyuz/doc/publishing.txt (+3/-3)
lib/lp/soyuz/doc/sourcepackagerelease-build-lookup.txt (+3/-3)
lib/lp/soyuz/interfaces/publishing.py (+4/-0)
lib/lp/soyuz/interfaces/sourcepackagerelease.py (+1/-0)
lib/lp/soyuz/model/archive.py (+76/-49)
lib/lp/soyuz/model/packagecloner.py (+0/-4)
lib/lp/soyuz/scripts/add_missing_builds.py (+2/-1)
lib/lp/soyuz/scripts/ftpmasterbase.py (+3/-3)
lib/lp/soyuz/scripts/packagecopier.py (+3/-3)
lib/lp/soyuz/scripts/tests/test_copypackage.py (+6/-6)
lib/lp/soyuz/stories/ppa/xx-delete-packages.txt (+1/-1)
lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt (+2/-2)
lib/lp/soyuz/stories/soyuz/xx-package-diff.txt (+1/-1)
lib/lp/soyuz/tests/test_archive.py (+92/-2)
lib/lp/soyuz/tests/test_processaccepted.py (+2/-2)
lib/lp/soyuz/tests/test_syncpackagejob.py (+1/-2)
To merge this branch: bzr merge lp:~lifeless/launchpad/bug-727560
Reviewer Review Type Date Requested Status
Tim Penhey (community) code Approve
Ian Booth (community) *code Approve
Review via email: mp+51989@code.launchpad.net

Commit message

[r=thumper,wallyworld][ui=none][bug=727560] Convert Archive.getPublishedSources to Storm and use DecoratedResultSet rather than prejoins for eager loading.

Description of the change

Hopefully fix bug 727560:
 - query plan analysis showed the prejoins causing pessimistic query plans.
 - converted to Storm so that DecoratedResultSet can be used (this caused the edits scattered all over the place).
 - used DecoratedResultSet to load just one copy of the related objects.
 - also fixed a half-dozen bugs of the form 'if foo: do something with foo[0]', which wastes a query for no benefit.
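The eager-loading change described above can be sketched as follows. This is a minimal, self-contained illustration of the pattern only: the class and hook names mirror Launchpad/Storm usage (`DecoratedResultSet`, `pre_iter_hook`), but this is not the real API, and `SPPH`/`RELEASES` are invented stand-ins.

```python
# Sketch of the eager-loading pattern the branch adopts: instead of
# prejoining related rows into every result tuple, fetch the plain rows
# and run one bulk hook over each batch to load related objects once.


class DecoratedResultSet:
    """Wraps a result set; runs a bulk pre-iteration hook before yielding."""

    def __init__(self, result_set, pre_iter_hook=None):
        self.result_set = result_set
        self.pre_iter_hook = pre_iter_hook

    def __iter__(self):
        rows = list(self.result_set)
        if self.pre_iter_hook is not None:
            self.pre_iter_hook(rows)  # one bulk load, not one query per row
        return iter(rows)

    def first(self):
        for row in self:
            return row
        return None  # empty result set


class SPPH:
    """Stand-in for SourcePackagePublishingHistory (illustrative only)."""

    def __init__(self, spr_id):
        self.sourcepackagerelease_id = spr_id
        self.sourcepackagerelease = None


# Simulated table of related objects keyed by id.
RELEASES = {1: 'iceweasel - 1.0', 2: 'pmount - 0.1-1'}


def load_releases(rows):
    # Simulates a single IN-clause query for all related objects at once.
    wanted = {row.sourcepackagerelease_id for row in rows}
    loaded = {key: RELEASES[key] for key in wanted}
    for row in rows:
        row.sourcepackagerelease = loaded[row.sourcepackagerelease_id]


pubs = DecoratedResultSet([SPPH(1), SPPH(2)], pre_iter_hook=load_releases)
print([pub.sourcepackagerelease for pub in pubs])
```

The point of the decoration is that callers still see ordinary publishing-history rows, so `order_by`, slicing, and `first()` keep working, while the related-object load happens once per batch rather than being baked into the main query plan as a prejoin.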

Revision history for this message
Ian Booth (wallyworld) wrote :

Hooray for removing a doctest \o/

I have some questions:

1. Use of candidates[0] vs candidates.first()

    try:
        return candidates[0]
    except IndexError:
        pass

return None

Looking at the ResultSet code candidates.first() returns None if there's nothing available. So to me it's preferable to just do

return candidates.first()

and not have the ugly (imho) exception handling. Not having to raise and catch an exception is also perhaps more efficient?

2. This construct:

    try:
        first_source = published_sources[0]
    except IndexError:
        pass
    else:
        sources.append(first_source)

Would it be better as:

    try:
        first_source = published_sources[0]
        sources.append(first_source)
    except IndexError:
        pass

Or even:

    first_source = published_sources.first()
    if first_source:
        sources.append(first_source)

3. This comment:

        # Its not clear that this eager load is necessary or sufficient, it
        # replaces a prejoin that had pathological query plans.

I would prefer that it be established whether it is necessary before committing code that could produce unnecessary queries.

review: Needs Information
Revision history for this message
Robert Collins (lifeless) wrote :

On Thu, Mar 3, 2011 at 3:07 PM, Ian Booth <email address hidden> wrote:
> I have some questions:
>
> 1. Use of candidates[0] vs candidates.first()
>
>    try:
>        return candidates[0]
>    except IndexError:
>        pass
>
> return None
>
> Looking at the ResultSet code candidates.first() returns None if there's nothing available. So to me it's preferable to just do
>
> return candidates.first()
>
> and not have the ugly (imho) exception handling. Not having to raise and catch an exception is also perhaps more efficient?

The efficiency gain of avoiding exception handling will be lost in the
noise for lp - it's µs scale. However, note that returning None in the
example you give wouldn't work. We're in a loop - we need the flow
control.
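The flow-control point can be sketched like this; it is a hypothetical stand-in for the lookup under discussion (the `PUBS` data and function names are invented, not the actual nascentupload.py code):

```python
# Two pockets to search, in order; the first has no publications.
PUBS = {'release': [], 'updates': ['pmount 0.1-1']}


def get_published_sources(pocket):
    """Stand-in for the per-pocket publication lookup."""
    return PUBS[pocket]


def find_ancestry(pockets):
    for pocket in pockets:
        candidates = get_published_sources(pocket)
        # An unconditional "return candidates.first()" here would return
        # None for an empty pocket and never try the remaining pockets;
        # the try/except (or an explicit emptiness check) lets the loop
        # continue to the next candidate.
        try:
            return candidates[0]
        except IndexError:
            pass
    return None


print(find_ancestry(['release', 'updates']))  # prints: pmount 0.1-1
```

Outside a loop, where there is no next iteration to fall through to, `first()` and the try/except are equivalent and `first()` reads better.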

> 2. This construct:
>
>    try:
>        first_source = published_sources[0]
>    except IndexError:
>        pass
>    else:
>        sources.append(first_source)
>
> Would it be better as:
>
>    try:
>        first_source = published_sources[0]
>        sources.append(first_source)
>    except IndexError:
>        pass

No, because only the thing expected to raise the exception you catch
should be in the try: clause.

> Or even:
>
>    first_source = published_sources.first()
>    if first_source:
>        sources.append(first_source)

That's nicer, I can certainly update to that for this case.

> 3. This comment:
>
>        # Its not clear that this eager load is necessary or sufficient, it
>        # replaces a prejoin that had pathological query plans.
>
> I would prefer that it be established if it is necessary before committing code that would produce unnecessary queries.

Removing a prior eager load without careful auditing is a serious
performance regression risk. It would be a mistake to just delete the
eager loads and hope that it's OK.

If we need them (and we probably do for at least some callers) it's
better to migrate and then tune, rather than drop it and put 30 or 40
pages out by causing late evaluation.

-Rob

Revision history for this message
Ian Booth (wallyworld) wrote :

> On Thu, Mar 3, 2011 at 3:07 PM, Ian Booth <email address hidden> wrote:
> > I have some questions:
> >
> > 1. Use of candidates[0] vs candidates.first()
> >
> >    try:
> >        return candidates[0]
> >    except IndexError:
> >        pass
> >
> > return None
> >
> > Looking at the ResultSet code candidates.first() returns None if there's
> nothing available. So to me it's preferable to just do
> >
> > return candidates.first()
> >
> > and not have the ugly (imho) exception handling. Not having to raise and
> catch an exception is also perhaps more efficient?
>
> The efficiency gain of avoiding exception handling will be lost in the
> noise for lp - it's µs scale. However, note that returning None in the
> example you give wouldn't work. We're in a loop - we need the flow
> control.
>

Sorry, I wasn't clear:

candidate = candidates.first()
if candidate:
    return candidate

Depends on your preference - my personal preference is not to use exceptions where a simple check will suffice.

>
> > Or even:
> >
> >    first_source = published_sources.first()
> >    if first_source:
> >        sources.append(first_source)
>
> That's nicer, I can certainly update to that for this case.
>

Ok. And any others that may exist :-). I just picked out one example. Removing the use of xxx[0] also ensures better consistency with the other places where xxx.first() is used.

> > 3. This comment:
> >
> >        # Its not clear that this eager load is necessary or sufficient, it
> >        # replaces a prejoin that had pathological query plans.
> >
> > I would prefer that it be established if it is necessary before committing
> code that would produce unnecessary queries.
>
> Removing a prior eager load without careful auditing is a serious
> performance regression risk. It would be a mistake to just delete the
> eager loads and hope that it's OK.
>
> If we need them (and we probably do for at least some callers) it's
> better to migrate and then tune, rather than drop it and put 30 or 40
> pages out by causing late evaluation.
>

My point was that we should not simply do the eager load "just in case" but figure out what's necessary. If it's needed to prevent gobs of late evaluation, fine. If not, we are doing an unnecessary db query. But migrating now and tuning later is OK, so long as it's not forgotten, which is often the case in general :-)

review: Approve (*code)
Revision history for this message
Tim Penhey (thumper) wrote :

I agree with Ian's comments. It would have been nicer for this to have been broken into two different pipes, or dependent branches with the mechanical changes separate.

Revision history for this message
Robert Collins (lifeless) wrote :

On Thu, Mar 3, 2011 at 4:26 PM, Tim Penhey <email address hidden> wrote:
> I agree with Ian's comments.  It would have been nicer for this to have been broken into two different pipes, or dependent branches with the mechanical changes separate.

We can't do it separately, because without changing to storm we can't
make the mechanical changes, and if we change to storm we need the
mechanical changes.

Revision history for this message
Robert Collins (lifeless) wrote :

On Thu, Mar 3, 2011 at 4:22 PM, Ian Booth <email address hidden> wrote:
> Review: Approve *code
>> On Thu, Mar 3, 2011 at 3:07 PM, Ian Booth <email address hidden> wrote:
>> > I have some questions:
>> >
>> > 1. Use of candidates[0] vs candidates.first()
>> >
>> >    try:
>> >        return candidates[0]
>> >    except IndexError:
>> >        pass
>> >
>> > return None
>> >
>> > Looking at the ResultSet code candidates.first() returns None if there's
>> nothing available. So to me it's preferable to just do
>> >
>> > return candidates.first()
>> >
>> > and not have the ugly (imho) exception handling. Not having to raise and
>> catch an exception is also perhaps more efficient?
>>
>> The efficiency gain of avoiding exception handling will be lost in the
>> noise for lp - it's µs scale. However, note that returning None in the
>> example you give wouldn't work. We're in a loop - we need the flow
>> control.
>>
>
> Sorry, I wasn't clear:
>
> candidate = candidates.first()
> if candidate:
>    return candidate
>
> Depends on your preference - my personal preference is not to use exceptions where a simple check will suffice.

You were clear, but it won't work in this loop. Note the indentation.
I've updated the other one where it will work.

>> If we need them (and we probably do for at least some callers) it's
>> better to migrate and then tune, rather than drop it and put 30 or 40
>> pages out by causing late evaluation.
>>
>
> My point was that we should not simply do the eager load "just in case" but figure out what's necessary. If it's needed to prevent gobs of late evaluation, fine. If not, we are doing an unnecessary db query. But migrating now and tuning later is OK, so long as it's not forgotten, which is often the case in general :-)

This was previously eager loaded, so figuring out whether it's needed
means trawling through years of bug reports. That's not useful: if this
fix is insufficient, we'll see it in the page performance reports. If it
is sufficient, even if there is a little waste, we can come back once
we reach the point where the incremental win is worth hours of work.

-Rob

Revision history for this message
Tim Penhey (thumper) wrote :

You can have separate pipes, review separately, and land the last. We have done this often.

review: Approve (code)

Preview Diff

1=== modified file 'lib/canonical/launchpad/doc/publishing-security.txt'
2--- lib/canonical/launchpad/doc/publishing-security.txt 2010-03-02 22:33:24 +0000
3+++ lib/canonical/launchpad/doc/publishing-security.txt 2011-03-06 11:23:52 +0000
4@@ -25,8 +25,7 @@
5 the public PPA:
6
7 >>> login(ANONYMOUS)
8- >>> source_pub = public_ppa.getPublishedSources()[0]
9- >>> print source_pub.displayname
10+ >>> print public_ppa.getPublishedSources().first().displayname
11 foo 666 in breezy-autotest
12
13 >>> binary_pub = public_ppa.getAllPublishedBinaries()[0]
14@@ -36,8 +35,7 @@
15 A regular user can see them too:
16
17 >>> login('no-priv@canonical.com')
18- >>> source_pub = public_ppa.getPublishedSources()[0]
19- >>> print source_pub.displayname
20+ >>> print public_ppa.getPublishedSources().first().displayname
21 foo 666 in breezy-autotest
22
23 >>> binary_pub = public_ppa.getAllPublishedBinaries()[0]
24@@ -47,7 +45,7 @@
25 But when querying the private PPA, anonymous access will be refused:
26
27 >>> login(ANONYMOUS)
28- >>> source_pub = private_ppa.getPublishedSources()[0]
29+ >>> source_pub = private_ppa.getPublishedSources().first()
30 Traceback (most recent call last):
31 ...
32 Unauthorized:...
33@@ -60,7 +58,7 @@
34 As is for a regular user.
35
36 >>> login('no-priv@canonical.com')
37- >>> source_pub = private_ppa.getPublishedSources()[0]
38+ >>> source_pub = private_ppa.getPublishedSources().first()
39 Traceback (most recent call last):
40 ...
41 Unauthorized:...
42@@ -73,8 +71,7 @@
43 But the owner can see them.
44
45 >>> login_person(private_ppa.owner)
46- >>> source_pub = private_ppa.getPublishedSources()[0]
47- >>> print source_pub.displayname
48+ >>> print private_ppa.getPublishedSources().first().displayname
49 foo 666 in breezy-autotest
50
51 >>> binary_pub = private_ppa.getAllPublishedBinaries()[0]
52@@ -84,8 +81,7 @@
53 As can an administrator.
54
55 >>> login('admin@canonical.com')
56- >>> source_pub = private_ppa.getPublishedSources()[0]
57- >>> print source_pub.displayname
58+ >>> print private_ppa.getPublishedSources().first().displayname
59 foo 666 in breezy-autotest
60
61 >>> binary_pub = private_ppa.getAllPublishedBinaries()[0]
62
63=== modified file 'lib/canonical/launchpad/testing/fakepackager.py'
64--- lib/canonical/launchpad/testing/fakepackager.py 2010-12-20 03:21:03 +0000
65+++ lib/canonical/launchpad/testing/fakepackager.py 2011-03-06 11:23:52 +0000
66@@ -421,8 +421,6 @@
67 if queue_record.status in needs_acceptance_statuses:
68 queue_record.acceptFromUploader(changesfile_path, logger)
69
70- pub_record = queue_record.archive.getPublishedSources(
71- name=self.name, version=version, exact_match=True)[0]
72-
73- return pub_record
74+ return queue_record.archive.getPublishedSources(
75+ name=self.name, version=version, exact_match=True).first()
76
77
78=== modified file 'lib/lp/archivepublisher/tests/archive-signing.txt'
79--- lib/lp/archivepublisher/tests/archive-signing.txt 2011-02-17 15:52:05 +0000
80+++ lib/lp/archivepublisher/tests/archive-signing.txt 2011-03-06 11:23:52 +0000
81@@ -65,7 +65,7 @@
82 'No Privileges' PPA will be considered for signing_key creation when
83 we copy an arbitrary source into it.
84
85- >>> a_source = cprov.archive.getPublishedSources()[0]
86+ >>> a_source = cprov.archive.getPublishedSources().first()
87 >>> copied_sources = a_source.copyTo(
88 ... a_source.distroseries, a_source.pocket, no_priv.archive)
89
90
91=== modified file 'lib/lp/archiveuploader/nascentupload.py'
92--- lib/lp/archiveuploader/nascentupload.py 2010-11-19 12:47:32 +0000
93+++ lib/lp/archiveuploader/nascentupload.py 2011-03-06 11:23:52 +0000
94@@ -566,8 +566,10 @@
95 candidates = self.policy.distroseries.getPublishedSources(
96 source_name, include_pending=True, pocket=pocket,
97 archive=archive)
98- if candidates:
99+ try:
100 return candidates[0]
101+ except IndexError:
102+ pass
103
104 return None
105
106
107=== modified file 'lib/lp/archiveuploader/nascentuploadfile.py'
108--- lib/lp/archiveuploader/nascentuploadfile.py 2011-02-17 17:02:54 +0000
109+++ lib/lp/archiveuploader/nascentuploadfile.py 2011-03-06 11:23:52 +0000
110@@ -796,13 +796,15 @@
111 spphs = distroseries.getPublishedSources(
112 self.source_name, version=self.source_version,
113 include_pending=True, archive=self.policy.archive)
114-
115- if spphs.count() == 0:
116+ # Workaround storm bug in EmptyResultSet.
117+ spphs = list(spphs[:1])
118+ try:
119+ return spphs[0].sourcepackagerelease
120+ except IndexError:
121 raise UploadError(
122 "Unable to find source package %s/%s in %s" % (
123 self.source_name, self.source_version, distroseries.name))
124
125- return spphs[0].sourcepackagerelease
126
127 def verifySourcePackageRelease(self, sourcepackagerelease):
128 """Check if the given ISourcePackageRelease matches the context."""
129
130=== modified file 'lib/lp/archiveuploader/tests/nascentupload-ddebs.txt'
131--- lib/lp/archiveuploader/tests/nascentupload-ddebs.txt 2010-10-04 19:50:45 +0000
132+++ lib/lp/archiveuploader/tests/nascentupload-ddebs.txt 2011-03-06 11:23:52 +0000
133@@ -43,8 +43,8 @@
134 >>> print src.queue_root.status.name
135 DONE
136
137- >>> [src_pub] = src.queue_root.archive.getPublishedSources(
138- ... name='debug', version='1.0-1', exact_match=True)
139+ >>> src_pub = src.queue_root.archive.getPublishedSources(
140+ ... name='debug', version='1.0-1', exact_match=True).one()
141
142 >>> print src_pub.displayname, src_pub.status.name
143 debug 1.0-1 in hoary PENDING
144
145=== modified file 'lib/lp/archiveuploader/tests/test_ppauploadprocessor.py'
146--- lib/lp/archiveuploader/tests/test_ppauploadprocessor.py 2011-01-26 17:22:39 +0000
147+++ lib/lp/archiveuploader/tests/test_ppauploadprocessor.py 2011-03-06 11:23:52 +0000
148@@ -209,8 +209,7 @@
149 self.assertEqual(pending_ppas.count(), 1)
150 self.assertEqual(pending_ppas[0], self.name16.archive)
151
152- pub_sources = self.name16.archive.getPublishedSources(name='bar')
153- [pub_bar] = pub_sources
154+ pub_bar = self.name16.archive.getPublishedSources(name='bar').one()
155
156 self.assertEqual(pub_bar.sourcepackagerelease.version, u'1.0-1')
157 self.assertEqual(pub_bar.status, PackagePublishingStatus.PENDING)
158@@ -340,8 +339,7 @@
159 _from_addr, _to_addrs, _raw_msg = stub.test_emails.pop()
160
161 # The SourcePackageRelease still has a component of universe:
162- pub_sources = self.name16.archive.getPublishedSources(name="bar")
163- [pub_foo] = pub_sources
164+ pub_foo = self.name16.archive.getPublishedSources(name="bar").one()
165 self.assertEqual(
166 pub_foo.sourcepackagerelease.component.name, "universe")
167
168@@ -387,8 +385,7 @@
169 # Source publication and build record for breezy-i386
170 # distroarchseries were created as expected. The source is ready
171 # to receive the binary upload.
172- pub_sources = self.name16.archive.getPublishedSources(name='bar')
173- [pub_bar] = pub_sources
174+ pub_bar = self.name16.archive.getPublishedSources(name='bar').one()
175 self.assertEqual(pub_bar.sourcepackagerelease.version, u'1.0-1')
176 self.assertEqual(pub_bar.status, PackagePublishingStatus.PENDING)
177 self.assertEqual(pub_bar.component.name, 'main')
178@@ -439,8 +436,8 @@
179 PackageUploadStatus.DONE)
180
181 # Copy source uploaded to name16 PPA to cprov's PPA.
182- pub_sources = self.name16.archive.getPublishedSources(name='bar')
183- [name16_pub_bar] = pub_sources
184+ name16_pub_bar = self.name16.archive.getPublishedSources(
185+ name='bar').one()
186 cprov = getUtility(IPersonSet).getByName("cprov")
187 cprov_pub_bar = name16_pub_bar.copyTo(
188 self.breezy, PackagePublishingPocket.RELEASE, cprov.archive)
189@@ -635,8 +632,8 @@
190 # the main archive later where it would be published using the
191 # source's component if the standard auto-overrides don't match
192 # an existing publication.
193- pub_sources = self.name16.archive.getPublishedSources(name='foocomm')
194- [pub_foocomm] = pub_sources
195+ pub_foocomm = self.name16.archive.getPublishedSources(
196+ name='foocomm').one()
197 self.assertEqual(
198 pub_foocomm.sourcepackagerelease.component.name, 'partner')
199 self.assertEqual(pub_foocomm.component.name, 'main')
200@@ -1125,7 +1122,7 @@
201 PackageUploadStatus.DONE)
202
203 # Delete the published file.
204- [bar_src] = self.name16.archive.getPublishedSources(name="bar")
205+ bar_src = self.name16.archive.getPublishedSources(name="bar").one()
206 bar_src.requestDeletion(self.name16)
207 bar_src.dateremoved = UTC_NOW
208 self.layer.txn.commit()
209
210=== modified file 'lib/lp/archiveuploader/tests/test_uploadprocessor.py'
211--- lib/lp/archiveuploader/tests/test_uploadprocessor.py 2011-02-01 02:59:05 +0000
212+++ lib/lp/archiveuploader/tests/test_uploadprocessor.py 2011-03-06 11:23:52 +0000
213@@ -351,7 +351,7 @@
214 bar = archive.getPublishedSources(
215 name='bar', version="1.0-1", exact_match=True)
216 changes_lfa = getUtility(IPublishingSet).getChangesFileLFA(
217- bar[0].sourcepackagerelease)
218+ bar.first().sourcepackagerelease)
219 changes_file = changes_lfa.read()
220 self.assertTrue(
221 "Format: " in changes_file, "Does not look like a changes file")
222@@ -823,8 +823,8 @@
223 # Upload 'bar-1.0-1' source and binary to ubuntu/breezy.
224 upload_dir = self.queueUpload("bar_1.0-2")
225 self.processUpload(uploadprocessor, upload_dir)
226- [bar_source_pub] = self.ubuntu.main_archive.getPublishedSources(
227- name='bar', version='1.0-2', exact_match=True)
228+ bar_source_pub = self.ubuntu.main_archive.getPublishedSources(
229+ name='bar', version='1.0-2', exact_match=True).one()
230 [bar_original_build] = bar_source_pub.getBuilds()
231
232 self.options.context = 'buildd'
233@@ -1096,8 +1096,8 @@
234 # Single source uploads also get their corrsponding builds created
235 # at upload-time. 'foocomm' only builds in 'i386', thus only one
236 # build gets created.
237- [foocomm_source] = partner_archive.getPublishedSources(
238- name='foocomm', version='1.0-2')
239+ foocomm_source = partner_archive.getPublishedSources(
240+ name='foocomm', version='1.0-2').one()
241 [build] = foocomm_source.sourcepackagerelease.builds
242 self.assertEqual(
243 build.title,
244
245=== modified file 'lib/lp/registry/model/distroseriesdifference.py'
246--- lib/lp/registry/model/distroseriesdifference.py 2010-10-26 16:22:57 +0000
247+++ lib/lp/registry/model/distroseriesdifference.py 2011-03-06 11:23:52 +0000
248@@ -157,10 +157,9 @@
249 name=self.source_package_name.name,
250 version=self.base_version,
251 distroseries=self.derived_series)
252- # As we know there is a base version published in the
253- # distroseries' main archive, we don't check (equivalent
254- # of calling .one() for a storm resultset.
255- return pubs[0]
256+ # We know there is a base version published in the distroseries'
257+ # main archive.
258+ return pubs.first()
259
260 return None
261
262@@ -211,9 +210,9 @@
263 self.source_package_name, include_pending=True)
264
265 # The most recent published source is the first one.
266- if pubs:
267+ try:
268 return pubs[0]
269- else:
270+ except IndexError:
271 return None
272
273 def update(self):
274
275=== modified file 'lib/lp/soyuz/browser/archive.py'
276--- lib/lp/soyuz/browser/archive.py 2011-02-21 21:10:45 +0000
277+++ lib/lp/soyuz/browser/archive.py 2011-03-06 11:23:52 +0000
278@@ -35,6 +35,7 @@
279
280 import pytz
281 from sqlobject import SQLObjectNotFound
282+from storm.expr import Desc
283 from storm.zope.interfaces import IResultSet
284 from zope.app.form.browser import TextAreaWidget
285 from zope.component import getUtility
286@@ -150,6 +151,7 @@
287 IPublishingSet,
288 )
289 from lp.soyuz.model.archive import Archive
290+from lp.soyuz.model.publishing import SourcePackagePublishingHistory
291 from lp.soyuz.scripts.packagecopier import do_copy
292
293
294@@ -573,7 +575,7 @@
295 the view to determine whether to display "This PPA does not yet
296 have any published sources" or "No sources matching 'blah'."
297 """
298- return bool(self.context.getPublishedSources())
299+ return not self.context.getPublishedSources().is_empty()
300
301 @cachedproperty
302 def repository_usage(self):
303@@ -924,13 +926,8 @@
304 """Return the last five published sources for this archive."""
305 sources = self.context.getPublishedSources(
306 status=PackagePublishingStatus.PUBLISHED)
307-
308- # We adapt the ISQLResultSet into a normal storm IResultSet so we
309- # can re-order and limit the results (orderBy is not included on
310- # the ISQLResultSet interface). Because this query contains
311- # pre-joins, the result of the adaption is a set of tuples.
312- result_tuples = IResultSet(sources)
313- result_tuples = result_tuples.order_by('datepublished DESC')[:5]
314+ sources.order_by(Desc(SourcePackagePublishingHistory.datepublished))
315+ result_tuples = sources[:5]
316
317 # We want to return a list of dicts for easy template rendering.
318 latest_updates_list = []
319@@ -948,7 +945,7 @@
320 }
321
322 now = datetime.now(tz=pytz.UTC)
323- source_ids = [result_tuple[0].id for result_tuple in result_tuples]
324+ source_ids = [result_tuple.id for result_tuple in result_tuples]
325 summaries = getUtility(
326 IPublishingSet).getBuildStatusSummariesForSourceIdsAndArchive(
327 source_ids, self.context)
328@@ -981,11 +978,8 @@
329 """Return the number of updates over the past days."""
330 now = datetime.now(tz=pytz.UTC)
331 created_since = now - timedelta(num_days)
332-
333- sources = self.context.getPublishedSources(
334- created_since_date=created_since)
335-
336- return sources.count()
337+ return self.context.getPublishedSources(
338+ created_since_date=created_since).count()
339
340 @property
341 def num_pkgs_building(self):
342@@ -1941,7 +1935,7 @@
343
344 if data.get('private') != self.context.private:
345 # The privacy is being switched.
346- if bool(self.context.getPublishedSources()):
347+ if not self.context.getPublishedSources().is_empty():
348 self.setFieldError(
349 'private',
350 'This archive already has published sources. It is '
351
352=== modified file 'lib/lp/soyuz/browser/tests/publishing-views.txt'
353--- lib/lp/soyuz/browser/tests/publishing-views.txt 2010-10-09 16:36:22 +0000
354+++ lib/lp/soyuz/browser/tests/publishing-views.txt 2011-03-06 11:23:52 +0000
355@@ -78,7 +78,7 @@
356 >>> cprov = getUtility(IPersonSet).getByName("cprov")
357
358 >>> iceweasel_source_pub = cprov.archive.getPublishedSources(
359- ... 'iceweasel')[0]
360+ ... 'iceweasel').first()
361
362 >>> ppa_source_view = getMultiAdapter(
363 ... (iceweasel_source_pub, LaunchpadTestRequest()),
364
365=== modified file 'lib/lp/soyuz/browser/tests/test_archive_packages.py'
366--- lib/lp/soyuz/browser/tests/test_archive_packages.py 2011-02-24 01:12:02 +0000
367+++ lib/lp/soyuz/browser/tests/test_archive_packages.py 2011-03-06 11:23:52 +0000
368@@ -187,7 +187,7 @@
369 self.assertIs(None, view.specified_name_filter)
370
371 def test_source_query_counts(self):
372- query_baseline = 43
373+ query_baseline = 47
374 # Assess the baseline.
375 collector = QueryCollector()
376 collector.register()
377@@ -228,7 +228,7 @@
378 self.assertThat(collector, HasQueryCount(LessThan(expected_count)))
379
380 def test_binary_query_counts(self):
381- query_baseline = 40
382+ query_baseline = 43
383 # Assess the baseline.
384 collector = QueryCollector()
385 collector.register()
386
387=== modified file 'lib/lp/soyuz/doc/archive.txt'
388--- lib/lp/soyuz/doc/archive.txt 2011-02-01 16:41:18 +0000
389+++ lib/lp/soyuz/doc/archive.txt 2011-03-06 11:23:52 +0000
390@@ -299,11 +299,14 @@
391 >>> cprov_archive.getAllPublishedBinaries().count()
392 4
393
394-This lookup also supports optional filters:
395-
396- * 'name': as in SQL "LIKE '%%' || NAME || '%%'");
397- * 'version': exact version string matching;
398- * 'status': a item or a list of PackagePublishingStatus.
399+This lookup also supports various filters - see the api docs for more info.
400+
401+Binary publication lookups
402+--------------------------
403+
404+'getPublishedOnDiskBinaries' returns only unique publications, i.e., it
405+excludes architecture-independent duplications which is necessary for
406+having correct publication counters and archive size.
407
408 >>> from lp.soyuz.enums import PackagePublishingStatus
409
410@@ -313,146 +316,10 @@
411 >>> inactive_status = [PackagePublishingStatus.SUPERSEDED,
412 ... PackagePublishingStatus.DELETED]
413
414-Let's inspect source publications in Cprov PPA:
415-
416- >>> all_sources = cd_lookup = cprov_archive.getPublishedSources()
417- >>> for pub in all_sources:
418- ... title = pub.sourcepackagerelease.title
419- ... pub_ds = pub.distroseries.name
420- ... print "%s -> %s" % (title, pub_ds)
421- cdrkit - 1.0 -> breezy-autotest
422- iceweasel - 1.0 -> warty
423- pmount - 0.1-1 -> warty
424-
425-Using 'name' filter:
426-
427- >>> cprov_archive.getPublishedSources(name='cd').count()
428- 1
429-
430- >>> cprov_archive.getPublishedSources(name='ice').count()
431- 1
432-
433-Combining 'name' filter and 'exact_match' flag:
434-
435- >>> cprov_archive.getPublishedSources(
436- ... name='iceweasel', exact_match=True).count()
437- 1
438- >>> cprov_archive.getPublishedSources(
439- ... name='ice', exact_match=True).count()
440- 0
441-
442-Using 'version' filter:
443-
444- >>> ice_version_lookup = cprov_archive.getPublishedSources(
445- ... version='1.0')
446- Traceback (most recent call last):
447- ...
448- VersionRequiresName: The 'version' parameter can be used only together
449- with the 'name' parameter.
450-
451- >>> ice_version_lookup = cprov_archive.getPublishedSources(
452- ... name='ice', version='1.0')
453- >>> ice_version_lookup.count()
454- 1
455-
456- >>> cprov_archive.getPublishedSources(
457- ... name='ice', version='666').count()
458- 0
459-
460-Using 'status' filter:
461-
462- >>> cprov_archive.getPublishedSources(
463- ... status=PackagePublishingStatus.PUBLISHED).count()
464- 3
465-
466- >>> cprov_archive.getPublishedSources(
467- ... status=active_status).count()
468- 3
469-
470- >>> cprov_archive.getPublishedSources(
471- ... status=inactive_status).count()
472- 0
473-
474-Using 'distroseries' filter:
475-
476 >>> warty = cprov_archive.distribution['warty']
477 >>> hoary = cprov_archive.distribution['hoary']
478 >>> breezy_autotest = cprov_archive.distribution['breezy-autotest']
479-
480- >>> cprov_archive.getPublishedSources(
481- ... distroseries=warty).count()
482- 2
483- >>> cprov_archive.getPublishedSources(
484- ... distroseries=hoary).count()
485- 0
486- >>> cprov_archive.getPublishedSources(
487- ... distroseries=breezy_autotest).count()
488- 1
489-
490-Using 'pocket' filter:
491-
492 >>> from lp.registry.interfaces.pocket import PackagePublishingPocket
493- >>> cprov_archive.getPublishedSources(
494- ... distroseries=warty,
495- ... pocket=PackagePublishingPocket.RELEASE).count()
496- 2
497-
498- >>> cprov_archive.getPublishedSources(
499- ... distroseries=warty,
500- ... pocket=PackagePublishingPocket.UPDATES).count()
501- 0
502-
503-Combining 'name' and 'distroseries' filters:
504-
505- >>> cprov_archive.getPublishedSources(
506- ... name='ice', distroseries=warty).count()
507- 1
508- >>> cprov_archive.getPublishedSources(
509- ... name='ice', distroseries=breezy_autotest).count()
510- 0
511-
512- >>> cprov_archive.getPublishedSources(
513- ... name='cd', distroseries=warty).count()
514- 0
515- >>> cprov_archive.getPublishedSources(
516- ... name='cd', distroseries=breezy_autotest).count()
517- 1
518-
519-Using the 'created_since_date' filter. This argument accept both,
520-iso-timestamp strings and datetime objects.
521-
522- >>> cprov_archive.getPublishedSources(
523- ... created_since_date='2007-07-09 14:00:00').count()
524- 0
525-
526- >>> import datetime
527- >>> instant = datetime.datetime(year=2007, month=7, day=9, hour=14)
528-
529- >>> cprov_archive.getPublishedSources(
530- ... created_since_date=instant).count()
531- 0
532-
533-As we move the given 'create_since_date' instant to the past the
534-publications in Celso's PPA start to show up.
535-
536- >>> one_hour_step = datetime.timedelta(hours=1)
537- >>> one_hour_earlier = instant - one_hour_step
538- >>> cprov_archive.getPublishedSources(
539- ... created_since_date=one_hour_earlier).count()
540- 1
541-
542- >>> two_hours_earlier = one_hour_earlier - one_hour_step
543- >>> cprov_archive.getPublishedSources(
544- ... created_since_date=two_hours_earlier).count()
545- 3
546-
547-
548-Binary publication lookups
549---------------------------
550-
551-'getPublishedOnDiskBinaries' returns only unique publications, i.e., it
552-excludes architecture-independent duplications, which is necessary for
553-keeping publication counters and archive sizes correct.
554
555 >>> def check_bin_pubs(pubs):
556 ... """Print binary publication details."""
557@@ -785,8 +652,7 @@
558
559 # Grab some source IDs from the archive that we can use for calls to
560 # getBuildSummariesForSourceIds():
561- >>> sources = cprov_private_ppa.getPublishedSources()
562- >>> source_ids = [sources[0].id]
563+ >>> source_ids = [cprov_private_ppa.getPublishedSources()[0].id]
564
565 Then verify that an admin can see the counters and build summaries:
566
567@@ -872,8 +738,7 @@
568 files as it happens in the archive filesystem (pool/):
569
570 >>> def print_published_files(archive):
571- ... pub_sources = archive.getPublishedSources()
572- ... for pub_source in pub_sources:
573+ ... for pub_source in archive.getPublishedSources():
574 ... for src_file in pub_source.sourcepackagerelease.files:
575 ... print '%s: %s (%s, %d bytes)' % (
576 ... src_file.sourcepackagerelease.title,
577@@ -896,7 +761,7 @@
578 ... )
579 >>> huge_firefox_orig_file = getUtility(ILibraryFileAliasSet)[3]
580 >>> cprov_cdrkit_src = cprov_archive.getPublishedSources(
581- ... name='cdrkit')[0]
582+ ... name='cdrkit').first()
583 >>> unused_src_file = cprov_cdrkit_src.sourcepackagerelease.addFile(
584 ... huge_firefox_orig_file)
585
586@@ -926,7 +791,7 @@
587 >>> cprov_archive.number_of_sources
588 3
589 >>> cprov_archive.getPublishedSources(
590- ... name='cdrkit')[0].supersede()
591+ ... name='cdrkit').first().supersede()
592
593 >>> cprov_archive.number_of_sources
594 2
595@@ -987,7 +852,7 @@
596 it continues to be considered 'available for deletion'.
597
598 >>> removal_candidate = cprov_archive.getPublishedSources(
599- ... name='ice')[0]
600+ ... name='ice').first()
601 >>> len(removal_candidate.getPublishedBinaries())
602 1
603
604@@ -1820,7 +1685,7 @@
605 >>> name12_archive = archive_set.new(
606 ... owner=name12, distribution=None, purpose=ArchivePurpose.PPA)
607
608- >>> a_pub = cprov_archive.getPublishedSources()[0]
609+ >>> a_pub = cprov_archive.getPublishedSources().first()
610 >>> def create_activity(where, how_many):
611 ... for i in range(how_many):
612 ... a_pub.copyTo(
613@@ -2418,10 +2283,10 @@
614
615 >>> mark.archive.syncSources(sources, cprov.archive, "release")
616
617- >>> [mark_one] = mark.archive.getPublishedSources(name="package1")
618+ >>> mark_one = mark.archive.getPublishedSources(name="package1").one()
619 >>> print mark_one.sourcepackagerelease.version
620 1.1
621- >>> [mark_two] = mark.archive.getPublishedSources(name="package2")
622+ >>> mark_two = mark.archive.getPublishedSources(name="package2").one()
623 >>> print mark_two.sourcepackagerelease.version
624 1.0
625
626@@ -2501,15 +2366,15 @@
627
628 >>> login("mark@example.com")
629 >>> mark.archive.syncSource("pack", "1.0", cprov.archive, "release")
630- >>> [pack] = mark.archive.getPublishedSources(
631- ... name="pack", exact_match=True)
632+ >>> pack = mark.archive.getPublishedSources(
633+ ... name="pack", exact_match=True).one()
634 >>> print pack.sourcepackagerelease.version
635 1.0
636
637 Copy package3 1.0 explicitly:
638
639 >>> mark.archive.syncSource("package3", "1.0", cprov.archive, "release")
640- >>> [mark_three] = mark.archive.getPublishedSources(name="package3")
641+ >>> mark_three = mark.archive.getPublishedSources(name="package3").one()
642 >>> print mark_three.sourcepackagerelease.version
643 1.0
644
645@@ -2539,7 +2404,7 @@
646
647 Now, Mark's PPA has the 'built-source' source and its 2 binaries.
648
649- >>> [copy] = mark.archive.getPublishedSources(name="built-source")
650+ >>> copy = mark.archive.getPublishedSources(name="built-source").one()
651 >>> len(copy.getPublishedBinaries())
652 2
653
654@@ -2580,7 +2445,7 @@
655 ... source_name='overridden', version='1.0',
656 ... from_archive=source_archive, to_pocket='release')
657
658- >>> [copy] = mark.archive.getPublishedSources(name="overridden")
659+ >>> copy = mark.archive.getPublishedSources(name="overridden").one()
660 >>> print copy.section.name
661 python
662
663
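The archive.txt changes above repeatedly swap `[x] = getPublishedSources(...)` unpacking and `[0]` indexing for `.one()` and `.first()`. A minimal stand-in sketch of the result-set semantics being relied on (`FakeResultSet` is a hypothetical helper for illustration, not a Launchpad or Storm class):

```python
class FakeResultSet:
    """Illustrates .first()/.one()/.is_empty() result-set semantics."""

    def __init__(self, rows):
        self._rows = list(rows)

    def first(self):
        # First row in result order, or None when the set is empty.
        return self._rows[0] if self._rows else None

    def one(self):
        # Exactly one row expected; anything else is an error.
        if len(self._rows) != 1:
            raise ValueError(
                "expected exactly one row, got %d" % len(self._rows))
        return self._rows[0]

    def is_empty(self):
        # Cheap emptiness check; no full count needed.
        return not self._rows
```

Unpacking with `[pub] = ...` forces the whole result to materialise just to assert "exactly one"; `.one()` states the same expectation directly, and `.first()` replaces `[0]` where only the top row matters.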
664=== modified file 'lib/lp/soyuz/doc/distribution.txt'
665--- lib/lp/soyuz/doc/distribution.txt 2010-10-18 22:24:59 +0000
666+++ lib/lp/soyuz/doc/distribution.txt 2011-03-06 11:23:52 +0000
667@@ -383,7 +383,7 @@
668 We can make Celso's PPA pending publication by copying a published
669 source to another location within the PPA.
670
671- >>> cprov_src = cprov.archive.getPublishedSources()[0]
672+ >>> cprov_src = cprov.archive.getPublishedSources().first()
673
674 >>> warty = ubuntu['warty']
675 >>> pocket_release = PackagePublishingPocket.RELEASE
676
677=== modified file 'lib/lp/soyuz/doc/publishing.txt'
678--- lib/lp/soyuz/doc/publishing.txt 2010-11-10 23:55:07 +0000
679+++ lib/lp/soyuz/doc/publishing.txt 2011-03-06 11:23:52 +0000
680@@ -112,7 +112,7 @@
681
682 The iceweasel source has good data:
683
684- >>> pub = spph.archive.getPublishedSources(name="iceweasel")[0]
685+ >>> pub = spph.archive.getPublishedSources(name="iceweasel").first()
686 >>> print pub.changesFileUrl()
687 http://launchpad.dev/ubuntu/+archive/primary/+files/mozilla-firefox_0.9_i386.changes
688
689@@ -1432,9 +1432,9 @@
690 Last but not least the publishing set class allows for the bulk deletion
691 of publishing history records.
692
693- >>> cprov_sources = sorted(list(
694+ >>> cprov_sources = sorted(
695 ... cprov.archive.getPublishedSources(
696- ... status=PackagePublishingStatus.PUBLISHED)),
697+ ... status=PackagePublishingStatus.PUBLISHED),
698 ... key=operator.attrgetter('id'))
699 >>> print len(cprov_sources)
700 6
701
702=== modified file 'lib/lp/soyuz/doc/sourcepackagerelease-build-lookup.txt'
703--- lib/lp/soyuz/doc/sourcepackagerelease-build-lookup.txt 2010-10-18 22:24:59 +0000
704+++ lib/lp/soyuz/doc/sourcepackagerelease-build-lookup.txt 2011-03-06 11:23:52 +0000
705@@ -288,7 +288,7 @@
706 same source published in hoary to another suite, let's say
707 breezy-autotest in his PPA.
708
709- >>> cprov_foo_src = cprov.archive.getPublishedSources(name='foo')[0]
710+ >>> cprov_foo_src = cprov.archive.getPublishedSources(name='foo').first()
711
712 >>> breezy_autotest = ubuntu.getSeries('breezy-autotest')
713 >>> copy = cprov_foo_src.copyTo(
714@@ -310,7 +310,7 @@
715 suite in Celso's PPA.
716
717 >>> cprov_foo_src = cprov.archive.getPublishedSources(
718- ... name='foo', distroseries=hoary)[0]
719+ ... name='foo', distroseries=hoary).first()
720 >>> cprov_foo_bin = cprov_foo_src.getPublishedBinaries()[0]
721
722 >>> warty = ubuntu.getSeries('warty')
723@@ -340,7 +340,7 @@
724 to Mark Shuttleworth's PPA.
725
726 >>> cprov_foo_src = cprov.archive.getPublishedSources(
727- ... name='foo', distroseries=hoary)[0]
728+ ... name='foo', distroseries=hoary).first()
729 >>> cprov_foo_bin = cprov_foo_src.getPublishedBinaries()[0]
730
731 >>> mark = getUtility(IPersonSet).getByName('mark')
732
733=== modified file 'lib/lp/soyuz/interfaces/publishing.py'
734--- lib/lp/soyuz/interfaces/publishing.py 2011-02-22 22:14:32 +0000
735+++ lib/lp/soyuz/interfaces/publishing.py 2011-03-06 11:23:52 +0000
736@@ -327,6 +327,8 @@
737 id = Int(
738 title=_('ID'), required=True, readonly=True,
739 )
740+ sourcepackagereleaseID = Attribute(
741+ "The DB id for the sourcepackagerelease.")
742 sourcepackagerelease = Int(
743 title=_('The source package release being published'),
744 required=False, readonly=False,
745@@ -338,6 +340,7 @@
746 vocabulary=PackagePublishingStatus,
747 required=False, readonly=False,
748 ))
749+ distroseriesID = Attribute("DB ID for distroseries.")
750 distroseries = exported(
751 Reference(
752 IDistroSeries,
753@@ -349,6 +352,7 @@
754 title=_('The component being published into'),
755 required=False, readonly=False,
756 )
757+ sectionID = Attribute("DB ID for the section")
758 section = Int(
759 title=_('The section being published into'),
760 required=False, readonly=False,
761
762=== modified file 'lib/lp/soyuz/interfaces/sourcepackagerelease.py'
763--- lib/lp/soyuz/interfaces/sourcepackagerelease.py 2011-03-03 06:24:47 +0000
764+++ lib/lp/soyuz/interfaces/sourcepackagerelease.py 2011-03-06 11:23:52 +0000
765@@ -38,6 +38,7 @@
766 version = Attribute("A version string")
767 dateuploaded = Attribute("Date of Upload")
768 urgency = Attribute("Source Package Urgency")
769+ dscsigningkeyID = Attribute("DB ID of the DSC Signing Key")
770 dscsigningkey = Attribute("DSC Signing Key")
771 component = Attribute("Source Package Component")
772 format = Attribute("The Source Package Format")
773
774=== modified file 'lib/lp/soyuz/model/archive.py'
775--- lib/lp/soyuz/model/archive.py 2011-02-24 15:30:54 +0000
776+++ lib/lp/soyuz/model/archive.py 2011-03-06 11:23:52 +0000
777@@ -9,6 +9,7 @@
778
779 __all__ = ['Archive', 'ArchiveSet']
780
781+from operator import attrgetter
782 import re
783
784 from lazr.lifecycle.event import ObjectCreatedEvent
785@@ -25,6 +26,7 @@
786 Or,
787 Select,
788 Sum,
789+ SQL,
790 )
791 from storm.locals import (
792 Count,
793@@ -50,6 +52,9 @@
794 SQLBase,
795 sqlvalues,
796 )
797+from canonical.launchpad.components.decoratedresultset import (
798+ DecoratedResultSet,
799+ )
800 from canonical.launchpad.components.tokens import (
801 create_unique_token_for_table,
802 )
803@@ -81,6 +86,7 @@
804 from lp.buildmaster.model.packagebuild import PackageBuild
805 from lp.registry.interfaces.distroseries import IDistroSeriesSet
806 from lp.registry.interfaces.person import (
807+ IPersonSet,
808 OPEN_TEAM_POLICY,
809 PersonVisibility,
810 validate_person,
811@@ -88,6 +94,7 @@
812 from lp.registry.interfaces.pocket import PackagePublishingPocket
813 from lp.registry.interfaces.role import IHasOwner
814 from lp.registry.interfaces.sourcepackagename import ISourcePackageNameSet
815+from lp.registry.model.sourcepackagename import SourcePackageName
816 from lp.registry.model.teammembership import TeamParticipation
817 from lp.services.job.interfaces.job import JobStatus
818 from lp.services.propertycache import (
819@@ -182,6 +189,7 @@
820 PackageUpload,
821 PackageUploadSource,
822 )
823+from lp.soyuz.model.section import Section
824 from lp.soyuz.model.sourcepackagerelease import SourcePackageRelease
825 from lp.soyuz.scripts.packagecopier import do_copy
826
827@@ -458,76 +466,94 @@
828 distroseries=None, pocket=None,
829 exact_match=False, created_since_date=None):
830 """See `IArchive`."""
831- clauses = ["""
832- SourcePackagePublishingHistory.archive = %s AND
833- SourcePackagePublishingHistory.sourcepackagerelease =
834- SourcePackageRelease.id AND
835- SourcePackageRelease.sourcepackagename =
+ # clauses contains literal SQL expressions for things that don't work
+ # easily in Storm: this method was migrated from SQLObject, but some
+ # callers are problematic. (Migrate them and test to see.)
839+ clauses = []
840+ storm_clauses = [
841+ SourcePackagePublishingHistory.archiveID==self.id,
842+ SourcePackagePublishingHistory.sourcepackagereleaseID==
843+ SourcePackageRelease.id,
844+ SourcePackageRelease.sourcepackagenameID==
845 SourcePackageName.id
846- """ % sqlvalues(self)]
847- clauseTables = ['SourcePackageRelease', 'SourcePackageName']
848- orderBy = ['SourcePackageName.name',
849- '-SourcePackagePublishingHistory.id']
850+ ]
851+ orderBy = [SourcePackageName.name,
852+ Desc(SourcePackagePublishingHistory.id)]
853
854 if name is not None:
855 if exact_match:
856- clauses.append("""
857- SourcePackageName.name=%s
858- """ % sqlvalues(name))
859+ storm_clauses.append(SourcePackageName.name==name)
860 else:
861- clauses.append("""
862- SourcePackageName.name LIKE '%%' || %s || '%%'
863- """ % quote_like(name))
864+ clauses.append(
865+ "SourcePackageName.name LIKE '%%%%' || %s || '%%%%'"
866+ % quote_like(name))
867
868 if version is not None:
869 if name is None:
870 raise VersionRequiresName(
871 "The 'version' parameter can be used only together with"
872 " the 'name' parameter.")
873- clauses.append("""
874- SourcePackageRelease.version = %s
875- """ % sqlvalues(version))
876+ storm_clauses.append(SourcePackageRelease.version==version)
877 else:
878- order_const = "SourcePackageRelease.version"
879- desc_version_order = SQLConstant(order_const+" DESC")
880- orderBy.insert(1, desc_version_order)
881+ orderBy.insert(1, Desc(SourcePackageRelease.version))
882
883 if status is not None:
884 try:
885 status = tuple(status)
886 except TypeError:
887 status = (status, )
888- clauses.append("""
889- SourcePackagePublishingHistory.status IN %s
890- """ % sqlvalues(status))
891+ clauses.append(
892+ "SourcePackagePublishingHistory.status IN %s "
893+ % sqlvalues(status))
894
895 if distroseries is not None:
896- clauses.append("""
897- SourcePackagePublishingHistory.distroseries = %s
898- """ % sqlvalues(distroseries))
899+ storm_clauses.append(
900+ SourcePackagePublishingHistory.distroseriesID==distroseries.id)
901
902 if pocket is not None:
903- clauses.append("""
904- SourcePackagePublishingHistory.pocket = %s
905- """ % sqlvalues(pocket))
906+ storm_clauses.append(
907+ SourcePackagePublishingHistory.pocket==pocket)
908
909 if created_since_date is not None:
910- clauses.append("""
911- SourcePackagePublishingHistory.datecreated >= %s
912- """ % sqlvalues(created_since_date))
913-
914- preJoins = [
915- 'sourcepackagerelease.creator',
916- 'sourcepackagerelease.dscsigningkey',
917- 'distroseries',
918- 'section',
919- ]
920-
921- sources = SourcePackagePublishingHistory.select(
922- ' AND '.join(clauses), clauseTables=clauseTables, orderBy=orderBy,
923- prejoins=preJoins)
924-
925- return sources
926+ clauses.append(
927+ "SourcePackagePublishingHistory.datecreated >= %s"
928+ % sqlvalues(created_since_date))
929+
930+ store = Store.of(self)
931+ if clauses:
932+ storm_clauses.append(SQL(' AND '.join(clauses)))
933+ resultset = store.find(SourcePackagePublishingHistory,
934+ *storm_clauses).order_by(
935+ *orderBy)
+ # It's not clear that this eager load is necessary or sufficient; it
+ # replaces a prejoin that had pathological query plans.
938+ def eager_load(rows):
+ # Imported here to avoid circular imports.
940+ from lp.registry.model.distroseries import DistroSeries
941+ from lp.registry.model.gpgkey import GPGKey
942+ ids = set(map(attrgetter('distroseriesID'), rows))
943+ ids.discard(None)
944+ if ids:
945+ list(store.find(DistroSeries, DistroSeries.id.is_in(ids)))
946+ ids = set(map(attrgetter('sectionID'), rows))
947+ ids.discard(None)
948+ if ids:
949+ list(store.find(Section, Section.id.is_in(ids)))
950+ ids = set(map(attrgetter('sourcepackagereleaseID'), rows))
951+ ids.discard(None)
952+ if not ids:
953+ return
954+ releases = list(store.find(
955+ SourcePackageRelease, SourcePackageRelease.id.is_in(ids)))
956+ ids = set(map(attrgetter('creatorID'), releases))
957+ ids.discard(None)
958+ if ids:
959+ list(getUtility(IPersonSet).getPrecachedPersonsFromIDs(ids))
960+ ids = set(map(attrgetter('dscsigningkeyID'), releases))
961+ ids.discard(None)
962+ if ids:
963+ list(store.find(GPGKey, GPGKey.id.is_in(ids)))
964+ return DecoratedResultSet(resultset, pre_iter_hook=eager_load)
965
966 def getSourcesForDeletion(self, name=None, status=None,
967 distroseries=None):
968@@ -1419,7 +1445,7 @@
969 getUtility(ISourcePackageNameSet)[source_name]
970 # Find and validate the source package version required.
971 source = from_archive.getPublishedSources(
972- name=source_name, version=version, exact_match=True)[0]
973+ name=source_name, version=version, exact_match=True).first()
974
975 self._copySources([source], to_pocket, to_series, include_binaries)
976
977@@ -1441,8 +1467,9 @@
978 published_sources = from_archive.getPublishedSources(
979 name=name, exact_match=True,
980 status=PackagePublishingStatus.PUBLISHED)
981- if published_sources.count() > 0:
982- sources.append(published_sources[0])
983+ first_source = published_sources.first()
984+ if first_source is not None:
985+ sources.append(first_source)
986 return sources
987
988 def _copySources(self, sources, to_pocket, to_series=None,
989
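The archive.py hunk replaces per-row prejoins with a `DecoratedResultSet` whose `pre_iter_hook` bulk-loads related objects before iteration. A self-contained sketch of that pattern, assuming simplified stand-ins (`DecoratedResultSet` here is a toy re-implementation, and the dict rows and `CACHE` stand in for Storm rows and the store):

```python
class DecoratedResultSet:
    """Toy version: run a hook over the row batch before handing it out."""

    def __init__(self, rows, pre_iter_hook=None):
        self._rows = list(rows)
        self._hook = pre_iter_hook

    def __iter__(self):
        if self._hook is not None:
            # One batched load for the whole result, not one per row.
            self._hook(self._rows)
        return iter(self._rows)


CACHE = {}


def eager_load(rows):
    # Collect foreign keys once, discard NULLs, then load in a single pass
    # (stands in for a single store.find(..., id.is_in(ids)) query).
    ids = {row["distroseries_id"] for row in rows}
    ids.discard(None)
    for i in ids:
        CACHE[i] = "distroseries-%d" % i


rows = [
    {"distroseries_id": 1},
    {"distroseries_id": 2},
    {"distroseries_id": None},
]
loaded = list(DecoratedResultSet(rows, pre_iter_hook=eager_load))
```

The payoff is the same as in the hunk: per-row attribute access after iteration hits a warm cache instead of issuing N follow-up queries.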
990=== modified file 'lib/lp/soyuz/model/packagecloner.py'
991--- lib/lp/soyuz/model/packagecloner.py 2011-01-27 22:17:07 +0000
992+++ lib/lp/soyuz/model/packagecloner.py 2011-03-06 11:23:52 +0000
993@@ -135,10 +135,6 @@
994 sources_published = archive.getPublishedSources(
995 distroseries=distroseries, status=active_publishing_status)
996
997- def get_spn(pub):
998- """Return the source package name for a publishing record."""
999- return pub.sourcepackagerelease.sourcepackagename.name
1000-
1001 for pubrec in sources_published:
1002 builds = pubrec.createMissingBuilds(
1003 architectures_available=architectures)
1004
1005=== modified file 'lib/lp/soyuz/scripts/add_missing_builds.py'
1006--- lib/lp/soyuz/scripts/add_missing_builds.py 2011-03-02 10:00:36 +0000
1007+++ lib/lp/soyuz/scripts/add_missing_builds.py 2011-03-06 11:23:52 +0000
1008@@ -60,7 +60,8 @@
1009 distroseries=distroseries,
1010 pocket=pocket,
1011 status=PackagePublishingStatus.PUBLISHED)
1012- if not bool(sources):
1013+ sources = list(sources)
1014+ if not sources:
1015 self.logger.info("No sources published, nothing to do.")
1016 return
1017
1018
1019=== modified file 'lib/lp/soyuz/scripts/ftpmasterbase.py'
1020--- lib/lp/soyuz/scripts/ftpmasterbase.py 2010-08-23 16:51:11 +0000
1021+++ lib/lp/soyuz/scripts/ftpmasterbase.py 2011-03-06 11:23:52 +0000
1022@@ -153,12 +153,12 @@
1023 pocket=self.location.pocket,
1024 exact_match=True)
1025
1026- if not published_sources:
1027+ try:
1028+ latest_source = published_sources[0]
1029+ except IndexError:
1030 raise SoyuzScriptError(
1031 "Could not find source '%s/%s' in %s" % (
1032 name, self.options.version, self.location))
1033-
1034- latest_source = published_sources[0]
1035 self._validatePublishing(latest_source)
1036 return latest_source
1037
1038
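The ftpmasterbase.py change above swaps an up-front emptiness check for an EAFP lookup: try the indexed access and translate `IndexError` into the domain error. A hedged sketch of that shape (`SoyuzScriptError` is reproduced only as a local stand-in, and `latest_published_source` is an illustrative helper, not the real method):

```python
class SoyuzScriptError(Exception):
    """Stand-in for the script-level error raised by the real code."""


def latest_published_source(published_sources, description):
    # EAFP: attempt the lookup rather than pre-checking emptiness, which
    # on a lazy result set could cost an extra query.
    try:
        return published_sources[0]
    except IndexError:
        raise SoyuzScriptError(
            "Could not find source %s" % description)
```

This keeps a single code path for the common case and confines the empty-result handling to the `except` clause.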
1039=== modified file 'lib/lp/soyuz/scripts/packagecopier.py'
1040--- lib/lp/soyuz/scripts/packagecopier.py 2011-03-03 08:02:29 +0000
1041+++ lib/lp/soyuz/scripts/packagecopier.py 2011-03-06 11:23:52 +0000
1042@@ -265,7 +265,7 @@
1043
1044 # If there are no conflicts with the same version, we can skip the
1045 # rest of the checks, but we still want to check conflicting files
1046- if (not bool(destination_archive_conflicts) and
1047+ if (destination_archive_conflicts.is_empty() and
1048 len(inventory_conflicts) == 0):
1049 self._checkConflictingFiles(source)
1050 return
1051@@ -556,12 +556,12 @@
1052 version=source.sourcepackagerelease.version,
1053 status=active_publishing_status,
1054 distroseries=series, pocket=pocket)
1055- if not bool(source_in_destination):
1056+ if source_in_destination.is_empty():
1057 source_copy = source.copyTo(series, pocket, archive)
1058 close_bugs_for_sourcepublication(source_copy)
1059 copies.append(source_copy)
1060 else:
1061- source_copy = source_in_destination[0]
1062+ source_copy = source_in_destination.first()
1063
1064 if not include_binaries:
1065 source_copy.createMissingBuilds()
1066
1067=== modified file 'lib/lp/soyuz/scripts/tests/test_copypackage.py'
1068--- lib/lp/soyuz/scripts/tests/test_copypackage.py 2011-01-25 00:44:23 +0000
1069+++ lib/lp/soyuz/scripts/tests/test_copypackage.py 2011-03-06 11:23:52 +0000
1070@@ -1813,8 +1813,8 @@
1071 target_archive = copy_helper.destination.archive
1072 self.checkCopies(copied, target_archive, 3)
1073
1074- [copied_source] = ubuntu.main_archive.getPublishedSources(
1075- name='boing')
1076+ copied_source = ubuntu.main_archive.getPublishedSources(
1077+ name='boing').one()
1078 self.assertEqual(copied_source.displayname, 'boing 1.0 in hoary')
1079 self.assertEqual(len(copied_source.getPublishedBinaries()), 2)
1080 self.assertEqual(len(copied_source.getBuilds()), 1)
1081@@ -1891,8 +1891,8 @@
1082 # The source and the only existing binary were correctly copied.
1083 # No build was created, but the architecture independent binary
1084 # was propagated to the new architecture (hoary/amd64).
1085- [copied_source] = ubuntu.main_archive.getPublishedSources(
1086- name='boing', distroseries=hoary)
1087+ copied_source = ubuntu.main_archive.getPublishedSources(
1088+ name='boing', distroseries=hoary).one()
1089 self.assertEqual(copied_source.displayname, 'boing 1.0 in hoary')
1090
1091 self.assertEqual(len(copied_source.getBuilds()), 0)
1092@@ -1925,8 +1925,8 @@
1093
1094 # The source and the only existing binary were correctly copied.
1095 hoary = ubuntu.getSeries('hoary')
1096- [copied_source] = ubuntu.main_archive.getPublishedSources(
1097- name='boing', distroseries=hoary)
1098+ copied_source = ubuntu.main_archive.getPublishedSources(
1099+ name='boing', distroseries=hoary).one()
1100 self.assertEqual(copied_source.displayname, 'boing 1.0 in hoary')
1101
1102 [copied_binary] = copied_source.getPublishedBinaries()
1103
1104=== modified file 'lib/lp/soyuz/stories/ppa/xx-delete-packages.txt'
1105--- lib/lp/soyuz/stories/ppa/xx-delete-packages.txt 2010-10-18 22:24:59 +0000
1106+++ lib/lp/soyuz/stories/ppa/xx-delete-packages.txt 2011-03-06 11:23:52 +0000
1107@@ -507,7 +507,7 @@
1108 >>> login('foo.bar@canonical.com')
1109 >>> no_priv = getUtility(IPersonSet).getByName('no-priv')
1110 >>> deleted_pub = no_priv.archive.getPublishedSources(
1111- ... status=PackagePublishingStatus.DELETED)[0]
1112+ ... status=PackagePublishingStatus.DELETED).first()
1113 >>> deleted_pub.dateremoved = deleted_pub.datecreated
1114 >>> logout()
1115
1116
1117=== modified file 'lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt'
1118--- lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt 2010-10-18 22:24:59 +0000
1119+++ lib/lp/soyuz/stories/ppa/xx-ppa-packages.txt 2011-03-06 11:23:52 +0000
1120@@ -276,14 +276,14 @@
1121 >>> login('foo.bar@canonical.com')
1122 >>> cprov = getUtility(IPersonSet).getByName('cprov')
1123 >>> iceweasel_pub = cprov.archive.getPublishedSources(
1124- ... name='iceweasel')[0]
1125+ ... name='iceweasel').first()
1126 >>> bpr = test_publisher.uploadBinaryForBuild(
1127 ... iceweasel_pub.getBuilds()[0], 'bar-bin')
1128 >>> pub_bins = test_publisher.publishBinaryInArchive(
1129 ... bpr, cprov.archive, status=PackagePublishingStatus.PUBLISHED)
1130 >>> iceweasel_pub.status = (
1131 ... PackagePublishingStatus.SUPERSEDED)
1132- >>> pmount_pub = cprov.archive.getPublishedSources(name='pmount')[0]
1133+ >>> pmount_pub = cprov.archive.getPublishedSources(name='pmount').first()
1134 >>> pmount_pub.status = PackagePublishingStatus.DELETED
1135 >>> pmount_pub.removed_by = cprov
1136 >>> pmount_pub.removal_comment = 'nhack !'
1137
1138=== modified file 'lib/lp/soyuz/stories/soyuz/xx-package-diff.txt'
1139--- lib/lp/soyuz/stories/soyuz/xx-package-diff.txt 2010-10-18 22:24:59 +0000
1140+++ lib/lp/soyuz/stories/soyuz/xx-package-diff.txt 2011-03-06 11:23:52 +0000
1141@@ -202,7 +202,7 @@
1142 >>> anon_browser.open(
1143 ... 'http://launchpad.dev/~name16/+archive/ppa/+packages')
1144 >>> login('foo.bar@canonical.com')
1145- >>> biscuit_ppa = name16.archive.getPublishedSources()[0]
1146+ >>> biscuit_ppa = name16.archive.getPublishedSources().first()
1147 >>> biscuit_ppa_id = biscuit_ppa.id
1148 >>> diff_three.date_fulfilled = None
1149 >>> diff_three.status = PackageDiffStatus.PENDING
1150
1151=== modified file 'lib/lp/soyuz/tests/test_archive.py'
1152--- lib/lp/soyuz/tests/test_archive.py 2011-01-20 18:27:00 +0000
1153+++ lib/lp/soyuz/tests/test_archive.py 2011-03-06 11:23:52 +0000
1154@@ -3,7 +3,11 @@
1155
1156 """Test Archive features."""
1157
1158-from datetime import date
1159+from datetime import (
1160+ date,
1161+ datetime,
1162+ timedelta,
1163+ )
1164
1165 import transaction
1166 from zope.component import getUtility
1167@@ -23,7 +27,10 @@
1168 )
1169 from lp.app.errors import NotFoundError
1170 from lp.buildmaster.enums import BuildStatus
1171-from lp.registry.interfaces.person import TeamSubscriptionPolicy
1172+from lp.registry.interfaces.person import (
1173+ IPersonSet,
1174+ TeamSubscriptionPolicy,
1175+ )
1176 from lp.registry.interfaces.pocket import PackagePublishingPocket
1177 from lp.registry.interfaces.series import SeriesStatus
1178 from lp.services.job.interfaces.job import JobStatus
1179@@ -45,6 +52,7 @@
1180 InvalidPocketForPPA,
1181 NoRightsForArchive,
1182 NoRightsForComponent,
1183+ VersionRequiresName,
1184 )
1185 from lp.soyuz.interfaces.archivearch import IArchiveArchSet
1186 from lp.soyuz.interfaces.archivepermission import IArchivePermissionSet
1187@@ -1635,3 +1643,85 @@
1188 new_dsc = self.factory.makeLibraryFileAlias(filename=dsc.filename)
1189 pub.sourcepackagerelease.addFile(new_dsc)
1190 self.assertEquals(new_dsc, self.archive.getFileByName(dsc.filename))
1191+
1192+
1193+class TestGetPublishedSources(TestCaseWithFactory):
1194+
1195+ layer = DatabaseFunctionalLayer
1196+
1197+ def test_getPublishedSources_comprehensive(self):
+ # The getPublishedSources checks were migrated from a doctest for
+ # better testing.
1200+ cprov = getUtility(IPersonSet).getByName('cprov')
1201+ cprov_archive = cprov.archive
1202+ # There are three published sources by default - no args returns all
1203+ # publications.
1204+ self.assertEqual(3, cprov_archive.getPublishedSources().count())
1205+ # Various filters.
1206+ active_status = [PackagePublishingStatus.PENDING,
1207+ PackagePublishingStatus.PUBLISHED]
1208+ inactive_status = [PackagePublishingStatus.SUPERSEDED,
1209+ PackagePublishingStatus.DELETED]
1210+ warty = cprov_archive.distribution['warty']
1211+ hoary = cprov_archive.distribution['hoary']
1212+ breezy_autotest = cprov_archive.distribution['breezy-autotest']
1213+ all_sources = cprov_archive.getPublishedSources()
1214+ expected = [('cdrkit - 1.0', 'breezy-autotest'),
1215+ ('iceweasel - 1.0', 'warty'),
1216+ ('pmount - 0.1-1', 'warty'),
1217+ ]
1218+ found = []
1219+ for pub in all_sources:
1220+ title = pub.sourcepackagerelease.title
1221+ pub_ds = pub.distroseries.name
1222+ found.append((title, pub_ds))
1223+ self.assertEqual(expected, found)
1224+ self.assertEqual(1,
1225+ cprov_archive.getPublishedSources(name='cd').count())
1226+ self.assertEqual(1,
1227+ cprov_archive.getPublishedSources(name='ice').count())
1228+ self.assertEqual(1, cprov_archive.getPublishedSources(
1229+ name='iceweasel', exact_match=True).count())
1230+ self.assertEqual(0, cprov_archive.getPublishedSources(
1231+ name='ice', exact_match=True).count())
1232+ self.assertRaises(VersionRequiresName,
1233+ cprov_archive.getPublishedSources,
1234+ version='1.0')
1235+ self.assertEqual(1, cprov_archive.getPublishedSources(
1236+ name='ice', version='1.0').count())
1237+ self.assertEqual(0, cprov_archive.getPublishedSources(
1238+ name='ice', version='666').count())
1239+ self.assertEqual(3, cprov_archive.getPublishedSources(
1240+ status=PackagePublishingStatus.PUBLISHED).count())
1241+ self.assertEqual(3, cprov_archive.getPublishedSources(
1242+ status=active_status).count())
1243+ self.assertEqual(0, cprov_archive.getPublishedSources(
1244+ status=inactive_status).count())
1245+ self.assertEqual(2, cprov_archive.getPublishedSources(
1246+ distroseries=warty).count())
1247+ self.assertEqual(0, cprov_archive.getPublishedSources(
1248+ distroseries=hoary).count())
1249+ self.assertEqual(1, cprov_archive.getPublishedSources(
1250+ distroseries=breezy_autotest).count())
1251+ self.assertEqual(2, cprov_archive.getPublishedSources(
1252+ distroseries=warty,
1253+ pocket=PackagePublishingPocket.RELEASE).count())
1254+ self.assertEqual(0, cprov_archive.getPublishedSources(
1255+ distroseries=warty,
1256+ pocket=PackagePublishingPocket.UPDATES).count())
1257+ self.assertEqual(1, cprov_archive.getPublishedSources(
1258+ name='ice', distroseries=warty).count())
1259+ self.assertEqual(0, cprov_archive.getPublishedSources(
1260+ name='ice', distroseries=breezy_autotest).count())
1261+ self.assertEqual(0, cprov_archive.getPublishedSources(
1262+ created_since_date='2007-07-09 14:00:00').count())
1263+ mid_2007 = datetime(year=2007, month=7, day=9, hour=14)
1264+ self.assertEqual(0, cprov_archive.getPublishedSources(
1265+ created_since_date=mid_2007).count())
1266+ one_hour_step = timedelta(hours=1)
1267+ one_hour_earlier = mid_2007 - one_hour_step
1268+ self.assertEqual(1, cprov_archive.getPublishedSources(
1269+ created_since_date=one_hour_earlier).count())
1270+ two_hours_earlier = one_hour_earlier - one_hour_step
1271+ self.assertEqual(3, cprov_archive.getPublishedSources(
1272+ created_since_date=two_hours_earlier).count())
1273
1274=== modified file 'lib/lp/soyuz/tests/test_processaccepted.py'
1275--- lib/lp/soyuz/tests/test_processaccepted.py 2010-12-20 03:21:03 +0000
1276+++ lib/lp/soyuz/tests/test_processaccepted.py 2011-03-06 11:23:52 +0000
1277@@ -125,8 +125,8 @@
1278 self.assertEqual(published_main.count(), 0)
1279
1280 # Check the copy archive source was accepted.
1281- [published_copy] = copy_archive.getPublishedSources(
1282- name=self.test_package_name)
1283+ published_copy = copy_archive.getPublishedSources(
1284+ name=self.test_package_name).one()
1285 self.assertEqual(
1286 published_copy.status, PackagePublishingStatus.PENDING)
1287 self.assertEqual(copy_source, published_copy.sourcepackagerelease)
1288
1289=== modified file 'lib/lp/soyuz/tests/test_syncpackagejob.py'
1290--- lib/lp/soyuz/tests/test_syncpackagejob.py 2010-11-15 16:25:05 +0000
1291+++ lib/lp/soyuz/tests/test_syncpackagejob.py 2011-03-06 11:23:52 +0000
1292@@ -105,8 +105,7 @@
1293 job.run()
1294
1295 published_sources = archive2.getPublishedSources()
1296- self.assertEquals(1, published_sources.count())
1297- spr = published_sources[0].sourcepackagerelease
1298+ spr = published_sources.one().sourcepackagerelease
1299 self.assertEquals("libc", spr.name)
1300 self.assertEquals("2.8-1", spr.version)
1301