Merge lp:~jtv/launchpad/db-bug-786970 into lp:launchpad/db-devel
Status: Merged
Approved by: Jeroen T. Vermeulen
Approved revision: no longer in the source branch
Merged at revision: 10617
Proposed branch: lp:~jtv/launchpad/db-bug-786970
Merge into: lp:launchpad/db-devel
Diff against target: 559 lines (+211/-115), 10 files modified:
  database/schema/security.cfg (+6/-0)
  lib/lp/registry/browser/distroseries.py (+8/-11)
  lib/lp/registry/browser/tests/test_distroseries.py (+7/-3)
  lib/lp/registry/interfaces/distroseriesdifference.py (+0/-12)
  lib/lp/registry/model/distroseriesdifference.py (+0/-11)
  lib/lp/registry/tests/test_distroseriesdifference.py (+0/-18)
  lib/lp/soyuz/browser/archive.py (+3/-16)
  lib/lp/soyuz/browser/tests/test_package_copying_mixin.py (+0/-25)
  lib/lp/soyuz/model/packagecopyjob.py (+62/-0)
  lib/lp/soyuz/tests/test_packagecopyjob.py (+125/-19)
To merge this branch: bzr merge lp:~jtv/launchpad/db-bug-786970
Related bugs:
Reviewer: Gavin Panella (community), Approve
Review via email: mp+62292@code.launchpad.net
Commit message
[r=allenap][bug=786970,787462] Register CannotCopy failures as DSDComments.
Description of the change
= Summary =
When a PlainPackageCopyJob fails with a CannotCopy error, that's not really meant to be an oops. It's meant to be an error that the user should address. But how do we tell the user? This job is something you "see going on" in the UI, so we want the errors in the UI as well — email is not an answer.
== Proposed fix ==
Julian and I (with feedback from Colin) decided on the following error-reporting mechanism: any CannotCopy failure in a PlainPackageCopyJob is registered as a comment on the DistroSeriesDifferences that the job might have resolved, so the error shows up on the derived series' +localpackagediffs page.
Next we'll make the page poll, so that these messages can show up on the fly.
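The flow can be sketched with stand-in classes (these are illustrative names only, not the real Launchpad classes; the actual job attaches DistroSeriesDifference comments via the model layer):

```python
# A minimal sketch of the error-reporting mechanism described above.
# All names here are stand-ins, not Launchpad code.

class CannotCopy(Exception):
    """A copy failure the requesting user is expected to fix."""

class FakeDSD(object):
    def __init__(self):
        self.comments = []

class FakeCopyJob(object):
    def __init__(self, attempt, matching_dsds):
        self.attempt = attempt              # callable doing the copy
        self.matching_dsds = matching_dsds
        self.aborted = False

    def run(self):
        try:
            self.attempt()
        except CannotCopy as e:
            self.abort()                    # roll back partial work
            self.report_failure(e)          # surface the error in the UI

    def abort(self):
        self.aborted = True

    def report_failure(self, error):
        # Attach the message to every difference this job might have
        # resolved, so it shows on the +localpackagediffs page.
        for dsd in self.matching_dsds:
            dsd.comments.append(str(error))

def failing_copy():
    raise CannotCopy("version newer in target archive")

dsd = FakeDSD()
job = FakeCopyJob(failing_copy, [dsd])
job.run()
print(dsd.comments)  # ['version newer in target archive']
```

The key property is that a CannotCopy failure never escapes run(), so it never becomes an oops; it becomes a user-visible comment instead.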
== Pre-implementation notes ==
Multiple package copies for multiple DistroSeriesDifferences would ideally all be bundled into a single job, with each difference reporting only its own failures.
But alas. There are complications: it takes some relatively complicated tracking, and because the checks are to some extent phased, you might see your package failing because some other package broke the whole job. In the end we decided on a simple fix: only handle one package per job. This affects two callsites.
Another question is: what Launchpad user will the error messages be attributed to? An attempt to start a discussion about having a user identity to represent Launchpad in the database for this sort of thing failed miserably. We thought it might be easy to impersonate the requesting user (which would be a little disconcerting, because automatically generated messages meant primarily for the user would look as if they'd been entered by that same user). But the only current place for that information is Job.requester, which… isn't accessible in Python. In the end Julian gave me the green light to use the Janitor, which fulfills many of the roles that I had wanted a "Launchpad persona" user for.
== Implementation details ==
For once I remembered to test the new code for database privileges. Which revealed a few that were needed, hopefully saving us some agony later on.
Sadly gone are very similar bits of code that Gavin and I developed in parallel for collating packages by archive, so as to bundle them into the minimum number of PlainPackageCopyJobs.
== Tests ==
{{{
./bin/test -vvc lp.soyuz.
}}}
== Demo and Q/A ==
Request asynchronous package copies on a derived distroseries' +localpackagediffs page. (Do this by requesting more copies than the max_synchronous_syncs limit allows.)
= Launchpad lint =
No lint, but it's complicated. I touched a file that had pre-existing lint, which I did not fix, and then I undid my changes to that file.
Jeroen T. Vermeulen (jtv) wrote:
Thanks Gavin.
> However, I have a major beef with it: PlainPackageCopyJob is no longer
> a plain copy job; it's now heavily intertwined with derivative
> distributions. I think PlainPackageCopyJob should be restored, and a
> new subclass created, something like DerivativePackageCopyJob, that
> knows about DSDs and so forth. It's unfortunate that that involves a
> lot of boilerplate code.
That seems like a job for a separate branch though; it will affect a lot more than what's in this branch.
Do I understand that correctly? If so, it seems easier to land this branch first and then make a separate card of splitting up the various usages of PPCJ. Would be nice to get the present branch off my hands!
> Other than that, I think it's a shame that packages can only be copied
> one at a time, but I can see this is a far from trivial problem to
> solve. It does mean that there's no longer an all-or-nothing guarantee
> when copying multiple packages. A *big* downside is that users must
> check all DSD comments for errors when they request multiple copies. I
> assume both of those are acceptable compromises for now.
Yes, we considered that, but it turns out that multi-parent breaks that anyway. You could be selecting DSDs from different parent archives, which have to go into separate jobs regardless!
So in the final analysis we can blame this on the post-last-minute change in requirements.
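The point about multi-parent can be sketched as a grouping problem: DSDs with different parent archives need separate copy jobs, so a multi-package selection was never a single all-or-nothing unit. The names below are illustrative only:

```python
# Sketch: selected DSDs must be grouped by their parent archive,
# and each group needs its own copy job.  Illustrative names only.

from collections import defaultdict

class DSD(object):
    def __init__(self, name, parent_archive):
        self.name = name
        self.parent_archive = parent_archive

def jobs_for_selection(dsds):
    """Group selected DSDs by parent archive; one job per group."""
    by_archive = defaultdict(list)
    for dsd in dsds:
        by_archive[dsd.parent_archive].append(dsd)
    return dict(by_archive)

selection = [DSD("a", "ubuntu"), DSD("b", "debian"), DSD("c", "ubuntu")]
groups = jobs_for_selection(selection)
print(len(groups))  # 2 separate jobs, one per parent archive
```

Once the selection can split into several jobs anyway, going all the way to one package per job costs little extra in guarantees.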
> +        try:
> +            self.attemptCopy()
> +        except CannotCopy, e:
> +            self.abort()
> +            self.reportFailure(e)
> +            self.commit()
>
> Fwiw, the job runner commits after run(), so only the abort() is
> needed.
Nice one! That commit was there from a time when the exception was still propagated, which I preserved in this branch initially but was then rightly asked to change.
> +    def findMatchingDSDs(self):
> [...]
> +        source_distro = self.source_archive.distribution
> +        return [
> +            dsd
> +            for dsd in candidates
> +            if dsd.parent_series.distribution == source_distro]
>
> Can you do this with IDs instead?
Sure. Silly thing that the ORM needs to fetch the distribution there, really.
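The reviewer's suggestion can be illustrated with hypothetical Storm-like stand-ins: comparing the foreign-key ID columns directly means no referenced Distribution rows ever have to be fetched.

```python
# Comparing foreign-key ID columns avoids making the ORM load the
# referenced objects.  Hypothetical stand-ins, not Launchpad code.

class Series(object):
    def __init__(self, distribution_id):
        self.distributionID = distribution_id  # plain column value

class Archive(object):
    def __init__(self, distribution_id):
        self.distributionID = distribution_id

class DSD(object):
    def __init__(self, parent_distribution_id):
        self.parent_series = Series(parent_distribution_id)

def matching_dsds(candidates, source_archive):
    # Integer comparison only; no Distribution objects are loaded.
    source_distro_id = source_archive.distributionID
    return [
        dsd for dsd in candidates
        if dsd.parent_series.distributionID == source_distro_id]

dsds = [DSD(1), DSD(2), DSD(1)]
print(len(matching_dsds(dsds, Archive(1))))  # 2
```

This is exactly the shape the landed code takes, using distributionID on both sides of the comparison.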
> +    def test_findMatchingDSDs(self):
> +        # findMatchingDSDs finds matching DSDs for any of the packages
> +        # in the job.
> +        dsd = self.factory.makeDistroSeriesDifference()
> +        job = removeSecurityProxy(self.makeJob(dsd))
>
> This should be naked_job.
>
> Also in test_findMatchi
Fixed.
Jeroen
Gavin Panella (allenap) wrote:
> Thanks Gavin.
>
> > However, I have a major beef with it: PlainPackageCopyJob is no longer
> > a plain copy job; it's now heavily intertwined with derivative
> > distributions. I think PlainPackageCopyJob should be restored, and a
> > new subclass created, something like DerivativePackageCopyJob, that
> > knows about DSDs and so forth. It's unfortunate that that involves a
> > lot of boilerplate code.
>
> That seems like a job for a separate branch though; it will affect a lot more
> than what's in this branch.
>
> Do I understand that correctly? If so, it seems easier to land this branch
> first and then make a separate card of splitting up the various usages of
> PPCJ. Would be nice to get the present branch off my hands!
As agreed otp just now (between bigjools, jtv and allenap), it's cool
to leave this branch as it is and things can be split out again if and
when a solid requirement is there.
>
>
> > Other than that, I think it's a shame that packages can only be copied
> > one at a time, but I can see this is a far from trivial problem to
> > solve. It does mean that there's no longer an all-or-nothing guarantee
> > when copying multiple packages. A *big* downside is that users must
> > check all DSD comments for errors when they request multiple copies. I
> > assume both of those are acceptable compromises for now.
>
> Yes, we considered that, but it turns out that multi-parent breaks that
> anyway. You could be selecting DSDs from different parent archives, which
> have to go into separate jobs regardless!
Of course, that's very true, I didn't think of that.
>
> So in the final analysis we can blame this on the post-last-minute change in
> requirements.
Yes! Grumble, grumble.
Preview Diff
=== modified file 'database/schema/security.cfg'
--- database/schema/security.cfg	2011-05-25 17:29:39 +0000
+++ database/schema/security.cfg	2011-05-28 06:24:35 +0000
@@ -984,6 +984,7 @@
 
 [sync_packages]
 groups=script
+public.account = SELECT
 public.archive = SELECT
 public.archivepermission = SELECT, INSERT
 public.binarypackagebuild = SELECT, INSERT
@@ -1001,12 +1002,17 @@
 public.distributionsourcepackage = SELECT, INSERT, UPDATE
 public.distroarchseries = SELECT, INSERT
 public.distroseries = SELECT, UPDATE
+public.distroseriesdifference = SELECT
+public.distroseriesdifferencemessage = SELECT, INSERT, UPDATE
 public.distroseriesparent = SELECT
+public.emailaddress = SELECT
 public.flatpackagesetinclusion = SELECT, INSERT
 public.gpgkey = SELECT
 public.job = SELECT, INSERT, UPDATE, DELETE
 public.libraryfilealias = SELECT, INSERT, UPDATE, DELETE
 public.libraryfilecontent = SELECT, INSERT
+public.message = SELECT, INSERT, UPDATE
+public.messagechunk = SELECT, INSERT, UPDATE
 public.packagebuild = SELECT, INSERT
 public.packagecopyjob = SELECT
 public.packageset = SELECT, INSERT

=== modified file 'lib/lp/registry/browser/distroseries.py'
--- lib/lp/registry/browser/distroseries.py	2011-05-27 08:02:37 +0000
+++ lib/lp/registry/browser/distroseries.py	2011-05-28 06:24:35 +0000
@@ -1078,17 +1078,14 @@
         """Request sync of packages that can be easily upgraded."""
         target_distroseries = self.context
         target_archive = target_distroseries.main_archive
-        differences_by_archive = (
-            getUtility(IDistroSeriesDifferenceSource)
-            .collateDifferencesByParentArchive(self.getUpgrades()))
-        for source_archive, differences in differences_by_archive.iteritems():
-            source_package_info = [
-                (difference.source_package_name.name,
-                 difference.parent_source_version)
-                for difference in differences]
-            getUtility(IPlainPackageCopyJobSource).create(
-                source_package_info, source_archive, target_archive,
-                target_distroseries, PackagePublishingPocket.UPDATES)
+        job_source = getUtility(IPlainPackageCopyJobSource)
+
+        for dsd in self.getUpgrades():
+            job_source.create(
+                [specify_dsd_package(dsd)], dsd.parent_series.main_archive,
+                target_archive, target_distroseries,
+                PackagePublishingPocket.UPDATES)
+
         self.request.response.addInfoNotification(
             (u"Upgrades of {context.displayname} packages have been "
              u"requested. Please give Launchpad some time to complete "

=== modified file 'lib/lp/registry/browser/tests/test_distroseries.py'
--- lib/lp/registry/browser/tests/test_distroseries.py	2011-05-27 07:50:38 +0000
+++ lib/lp/registry/browser/tests/test_distroseries.py	2011-05-28 06:24:35 +0000
@@ -1090,13 +1090,17 @@
         with StormStatementRecorder() as recorder1:
             self.makeView(derived_series).requestUpgrades()
         self.assertThat(recorder1, HasQueryCount(LessThan(10)))
-        # The query count does not increase with more differences.
-        for index in xrange(3):
+        # Creating Jobs and DistributionJobs takes 2 extra queries per
+        # requested sync.
+        requested_syncs = 3
+        for index in xrange(requested_syncs):
             self.makePackageUpgrade(derived_series=derived_series)
         flush_database_caches()
         with StormStatementRecorder() as recorder2:
             self.makeView(derived_series).requestUpgrades()
-        self.assertThat(recorder2, HasQueryCount(Equals(recorder1.count)))
+        self.assertThat(
+            recorder2,
+            HasQueryCount(Equals(recorder1.count + 2 * requested_syncs)))
 
 
 class TestDistroSeriesLocalDifferencesFunctional(TestCaseWithFactory,

=== modified file 'lib/lp/registry/interfaces/distroseriesdifference.py'
--- lib/lp/registry/interfaces/distroseriesdifference.py	2011-05-24 15:36:35 +0000
+++ lib/lp/registry/interfaces/distroseriesdifference.py	2011-05-28 06:24:35 +0000
@@ -338,15 +338,3 @@
 
         Blacklisted items are excluded.
         """
-
-    def collateDifferencesByParentArchive(differences):
-        """Collate the given differences by parent archive.
-
-        The given `IDistroSeriesDifference`s are returned in a `dict`, with
-        the parent `Archive` as keys.
-
-        :param differences: An iterable sequence of `IDistroSeriesDifference`.
-
-        :return: A `dict` of iterable sequences of `IDistroSeriesDifference`
-            keyed by their parent `IArchive`.
-        """

=== modified file 'lib/lp/registry/model/distroseriesdifference.py'
--- lib/lp/registry/model/distroseriesdifference.py	2011-05-24 11:32:34 +0000
+++ lib/lp/registry/model/distroseriesdifference.py	2011-05-28 06:24:35 +0000
@@ -494,17 +494,6 @@
             DistroSeriesDifference.source_package_name_id)
         return DecoratedResultSet(differences, itemgetter(0))
 
-    @staticmethod
-    def collateDifferencesByParentArchive(differences):
-        by_archive = dict()
-        for difference in differences:
-            archive = difference.parent_series.main_archive
-            if archive in by_archive:
-                by_archive[archive].append(difference)
-            else:
-                by_archive[archive] = [difference]
-        return by_archive
-
     @cachedproperty
     def source_pub(self):
         """See `IDistroSeriesDifference`."""

=== modified file 'lib/lp/registry/tests/test_distroseriesdifference.py'
--- lib/lp/registry/tests/test_distroseriesdifference.py	2011-05-24 15:36:35 +0000
+++ lib/lp/registry/tests/test_distroseriesdifference.py	2011-05-28 06:24:35 +0000
@@ -1129,24 +1129,6 @@
         self.assertContentEqual(
             [], dsd_source.getSimpleUpgrades(dsd.derived_series))
 
-    def test_collateDifferencesByParentArchive(self):
-        dsp1 = self.factory.makeDistroSeriesParent()
-        dsp2 = self.factory.makeDistroSeriesParent()
-        differences = [
-            self.factory.makeDistroSeriesDifference(dsp1.derived_series),
-            self.factory.makeDistroSeriesDifference(dsp2.derived_series),
-            self.factory.makeDistroSeriesDifference(dsp1.derived_series),
-            self.factory.makeDistroSeriesDifference(dsp2.derived_series),
-            ]
-        dsd_source = getUtility(IDistroSeriesDifferenceSource)
-        observed = (
-            dsd_source.collateDifferencesByParentArchive(differences))
-        expected = {
-            dsp1.parent_series.main_archive: differences[0::2],
-            dsp2.parent_series.main_archive: differences[1::2],
-            }
-        self.assertEqual(observed, expected)
-
 
 class TestMostRecentComments(TestCaseWithFactory):

=== modified file 'lib/lp/soyuz/browser/archive.py'
--- lib/lp/soyuz/browser/archive.py	2011-05-26 15:06:44 +0000
+++ lib/lp/soyuz/browser/archive.py	2011-05-28 06:24:35 +0000
@@ -1285,19 +1285,6 @@
         dest_display_name)
 
 
-def partition_pubs_by_archive(source_pubs):
-    """Group `source_pubs` by archive.
-
-    :param source_pubs: A sequence of `SourcePackagePublishingHistory`.
-    :return: A dict mapping `Archive`s to the list of entries from
-        `source_pubs` that are in that archive.
-    """
-    by_source_archive = {}
-    for spph in source_pubs:
-        by_source_archive.setdefault(spph.archive, []).append(spph)
-    return by_source_archive
-
-
 def name_pubs_with_versions(source_pubs):
     """Annotate each entry from `source_pubs` with its version.
 
@@ -1328,11 +1315,11 @@
             person, dest_archive, dest_series, dest_pocket, spns)
 
         job_source = getUtility(IPlainPackageCopyJobSource)
-        archive_pubs = partition_pubs_by_archive(source_pubs)
-        for source_archive, spphs in archive_pubs.iteritems():
+        for spph in source_pubs:
             job_source.create(
-                name_pubs_with_versions(spphs), source_archive, dest_archive,
-                dest_series, dest_pocket, include_binaries=include_binaries)
+                name_pubs_with_versions([spph]), spph.archive, dest_archive,
+                dest_series, dest_pocket, include_binaries=include_binaries)
+
         return structured("""
             <p>Requested sync of %s packages.</p>
             <p>Please allow some time for these to be processed.</p>

=== modified file 'lib/lp/soyuz/browser/tests/test_package_copying_mixin.py'
--- lib/lp/soyuz/browser/tests/test_package_copying_mixin.py	2011-05-20 20:40:19 +0000
+++ lib/lp/soyuz/browser/tests/test_package_copying_mixin.py	2011-05-28 06:24:35 +0000
@@ -19,7 +19,6 @@
     FEATURE_FLAG_MAX_SYNCHRONOUS_SYNCS,
     name_pubs_with_versions,
     PackageCopyingMixin,
-    partition_pubs_by_archive,
     render_cannotcopy_as_html,
     )
 from lp.soyuz.interfaces.archive import CannotCopy
@@ -104,30 +103,6 @@
         packages = [self.getUniqueString() for counter in range(300)]
         self.assertFalse(PackageCopyingMixin().canCopySynchronously(packages))
 
-    def test_partition_pubs_by_archive_maps_archives_to_pubs(self):
-        # partition_pubs_by_archive returns a dict mapping each archive
-        # to a list of SPPHs on that archive.
-        spph = FakeSPPH()
-        self.assertEqual(
-            {spph.archive: [spph]}, partition_pubs_by_archive([spph]))
-
-    def test_partition_pubs_by_archive_splits_by_archive(self):
-        # partition_pubs_by_archive keeps SPPHs for different archives
-        # separate.
-        spphs = [FakeSPPH() for counter in xrange(2)]
-        mapping = partition_pubs_by_archive(spphs)
-        self.assertEqual(
-            dict((spph.archive, [spph]) for spph in spphs), mapping)
-
-    def test_partition_pubs_by_archive_clusters_by_archive(self):
-        # partition_pubs_by_archive bundles SPPHs for the same archive
-        # into a single dict entry.
-        archive = FakeArchive()
-        spphs = [FakeSPPH(archive=archive) for counter in xrange(2)]
-        mapping = partition_pubs_by_archive(spphs)
-        self.assertEqual([archive], mapping.keys())
-        self.assertContentEqual(spphs, mapping[archive])
-
     def test_name_pubs_with_versions_lists_packages_and_versions(self):
         # name_pubs_with_versions returns a list of tuples of source
         # package name and source package version, one per SPPH.

=== modified file 'lib/lp/soyuz/model/packagecopyjob.py'
--- lib/lp/soyuz/model/packagecopyjob.py	2011-05-19 17:20:02 +0000
+++ lib/lp/soyuz/model/packagecopyjob.py	2011-05-28 06:24:35 +0000
@@ -17,6 +17,8 @@
     Reference,
     Unicode,
     )
+import transaction
+from zope.component import getUtility
 from zope.interface import (
     classProvides,
     implements,
@@ -26,13 +28,20 @@
 from canonical.launchpad.components.decoratedresultset import (
     DecoratedResultSet,
     )
+from canonical.launchpad.interfaces.launchpad import ILaunchpadCelebrities
 from canonical.launchpad.interfaces.lpstorm import (
     IMasterStore,
     IStore,
     )
 from lp.app.errors import NotFoundError
+from lp.registry.interfaces.distroseriesdifference import (
+    IDistroSeriesDifferenceSource,
+    )
 from lp.registry.interfaces.pocket import PackagePublishingPocket
 from lp.registry.model.distroseries import DistroSeries
+from lp.registry.interfaces.distroseriesdifferencecomment import (
+    IDistroSeriesDifferenceCommentSource,
+    )
 from lp.services.database.stormbase import StormBase
 from lp.services.job.interfaces.job import JobStatus
 from lp.services.job.model.job import Job
@@ -153,6 +162,11 @@
 
 class PlainPackageCopyJob(PackageCopyJobDerived):
     """Job that copies packages between archives."""
+    # This job type serves in different places: it supports copying
+    # packages between archives, but also the syncing of packages from
+    # parents into a derived distroseries.  We may split these into
+    # separate types at some point, but for now we (allenap, bigjools,
+    # jtv) chose to keep it as one.
 
     implements(IPlainPackageCopyJob)
 
@@ -232,6 +246,18 @@
 
     def run(self):
         """See `IRunnableJob`."""
+        try:
+            self.attemptCopy()
+        except CannotCopy, e:
+            self.abort()
+            self.reportFailure(e)
+
+    def attemptCopy(self):
+        """Attempt to perform the copy.
+
+        :raise CannotCopy: If the copy fails for a reason that the user
+            can deal with.
+        """
         if self.target_archive.is_ppa:
             if self.target_pocket != PackagePublishingPocket.RELEASE:
                 raise CannotCopy(
@@ -250,6 +276,42 @@
             series=self.target_distroseries, pocket=self.target_pocket,
             include_binaries=self.include_binaries, check_permissions=False)
 
+    def abort(self):
+        """Abort work."""
+        transaction.abort()
+
+    def findMatchingDSDs(self):
+        """Find any `DistroSeriesDifference`s that this job might resolve."""
+        dsd_source = getUtility(IDistroSeriesDifferenceSource)
+        target_series = self.target_distroseries
+        candidates = dsd_source.getForDistroSeries(
+            distro_series=target_series)
+        # The job doesn't know what distroseries a given package is
+        # coming from, and the version number in the DSD may have
+        # changed.  We can however filter out DSDs that are from
+        # different distributions, based on the job's target archive.
+        source_distro_id = self.source_archive.distributionID
+        return [
+            dsd
+            for dsd in candidates
+            if dsd.parent_series.distributionID == source_distro_id]
+
+    def reportFailure(self, cannotcopy_exception):
+        """Attempt to report failure to the user."""
+        message = unicode(cannotcopy_exception)
+        dsds = self.findMatchingDSDs()
+        comment_source = getUtility(IDistroSeriesDifferenceCommentSource)
+
+        # Register the error comment in the name of the Janitor.  Not a
+        # great choice, but we have no user identity to represent
+        # Launchpad; it's far too costly to create one; and
+        # impersonating the requester can be misleading and would also
+        # involve extra bookkeeping.
+        reporting_persona = getUtility(ILaunchpadCelebrities).janitor
+
+        for dsd in dsds:
+            comment_source.new(dsd, reporting_persona, message)
+
     def __repr__(self):
         """Returns an informative representation of the job."""
         parts = ["%s to copy" % self.__class__.__name__]

=== modified file 'lib/lp/soyuz/tests/test_packagecopyjob.py'
--- lib/lp/soyuz/tests/test_packagecopyjob.py	2011-05-24 14:29:51 +0000
+++ lib/lp/soyuz/tests/test_packagecopyjob.py	2011-05-28 06:24:35 +0000
@@ -8,7 +8,12 @@
 from zope.component import getUtility
 from zope.security.proxy import removeSecurityProxy
 
+from canonical.config import config
+from canonical.launchpad.interfaces.lpstorm import IStore
 from canonical.testing import LaunchpadZopelessLayer
+from lp.registry.model.distroseriesdifferencecomment import (
+    DistroSeriesDifferenceComment,
+    )
 from lp.registry.interfaces.pocket import PackagePublishingPocket
 from lp.services.features.testing import FeatureFixture
 from lp.services.job.interfaces.job import JobStatus
@@ -34,15 +39,23 @@
     run_script,
     TestCaseWithFactory,
     )
-
-
-class PlainPackageCopyJobTests(TestCaseWithFactory):
-    """Test case for PlainPackageCopyJob."""
-
-    layer = LaunchpadZopelessLayer
-
-    def makeJob(self, dsd):
+from lp.testing.fakemethod import FakeMethod
+
+
+def get_dsd_comments(dsd):
+    """Retrieve `DistroSeriesDifferenceComment`s for `dsd`."""
+    return IStore(dsd).find(
+        DistroSeriesDifferenceComment,
+        DistroSeriesDifferenceComment.distro_series_difference == dsd)
+
+
+class LocalTestHelper:
+    """Put test helpers that want to be in the test classes here."""
+
+    def makeJob(self, dsd=None):
         """Create a `PlainPackageCopyJob` that would resolve `dsd`."""
+        if dsd is None:
+            dsd = self.factory.makeDistroSeriesDifference()
         source_packages = [specify_dsd_package(dsd)]
         source_archive = dsd.parent_series.main_archive
         target_archive = dsd.derived_series.main_archive
@@ -58,6 +71,12 @@
         self.layer.switchDbUser('sync_packages')
         job.run()
 
+
+class PlainPackageCopyJobTests(TestCaseWithFactory, LocalTestHelper):
+    """Test case for PlainPackageCopyJob."""
+
+    layer = LaunchpadZopelessLayer
+
     def test_create(self):
         # A PackageCopyJob can be created and stores its arguments.
         distroseries = self.factory.makeDistroSeries()
@@ -105,23 +124,59 @@
 
     def test_getActiveJobs_only_returns_waiting_jobs(self):
         # getActiveJobs ignores jobs that aren't in the WAITING state.
-        job = self.makeJob(self.factory.makeDistroSeriesDifference())
+        job = self.makeJob()
         removeSecurityProxy(job).job._status = JobStatus.RUNNING
         source = getUtility(IPlainPackageCopyJobSource)
416 | 111 | self.assertContentEqual([], source.getActiveJobs(job.target_archive)) | 130 | self.assertContentEqual([], source.getActiveJobs(job.target_archive)) |
417 | 112 | 131 | ||
420 | 113 | def test_run_unknown_package(self): | 132 | def test_run_raises_errors(self): |
421 | 114 | # A job properly records failure. | 133 | # A job reports unexpected errors as exceptions. |
422 | 134 | class Boom(Exception): | ||
423 | 135 | pass | ||
424 | 136 | |||
425 | 137 | job = self.makeJob() | ||
426 | 138 | removeSecurityProxy(job).attemptCopy = FakeMethod(failure=Boom()) | ||
427 | 139 | |||
428 | 140 | self.assertRaises(Boom, job.run) | ||
429 | 141 | |||
430 | 142 | def test_run_posts_copy_failure_as_comment(self): | ||
431 | 143 | # If the job fails with a CannotCopy exception, it swallows the | ||
432 | 144 | # exception and posts a DistroSeriesDifferenceComment with the | ||
433 | 145 | # failure message. | ||
434 | 146 | dsd = self.factory.makeDistroSeriesDifference() | ||
435 | 147 | self.factory.makeArchive(distribution=dsd.derived_series.distribution) | ||
436 | 148 | job = self.makeJob(dsd) | ||
437 | 149 | removeSecurityProxy(job).attemptCopy = FakeMethod( | ||
438 | 150 | failure=CannotCopy("Server meltdown")) | ||
439 | 151 | |||
440 | 152 | # The job's error handling will abort, so commit the objects we | ||
441 | 153 | # created as would have happened in real life. | ||
442 | 154 | transaction.commit() | ||
443 | 155 | |||
444 | 156 | job.run() | ||
445 | 157 | |||
446 | 158 | self.assertEqual( | ||
447 | 159 | ["Server meltdown"], | ||
448 | 160 | [comment.body_text for comment in get_dsd_comments(dsd)]) | ||
449 | 161 | |||
450 | 162 | def test_run_cannot_copy_unknown_package(self): | ||
451 | 163 | # Attempting to copy an unknown package is reported as a | ||
452 | 164 | # failure. | ||
453 | 115 | distroseries = self.factory.makeDistroSeries() | 165 | distroseries = self.factory.makeDistroSeries() |
454 | 116 | archive1 = self.factory.makeArchive(distroseries.distribution) | 166 | archive1 = self.factory.makeArchive(distroseries.distribution) |
455 | 117 | archive2 = self.factory.makeArchive(distroseries.distribution) | 167 | archive2 = self.factory.makeArchive(distroseries.distribution) |
458 | 118 | source = getUtility(IPlainPackageCopyJobSource) | 168 | job_source = getUtility(IPlainPackageCopyJobSource) |
459 | 119 | job = source.create( | 169 | job = job_source.create( |
460 | 120 | source_packages=[("foo", "1.0-1")], source_archive=archive1, | 170 | source_packages=[("foo", "1.0-1")], source_archive=archive1, |
461 | 121 | target_archive=archive2, target_distroseries=distroseries, | 171 | target_archive=archive2, target_distroseries=distroseries, |
462 | 122 | target_pocket=PackagePublishingPocket.RELEASE, | 172 | target_pocket=PackagePublishingPocket.RELEASE, |
463 | 123 | include_binaries=False) | 173 | include_binaries=False) |
465 | 124 | self.assertRaises(CannotCopy, self.runJob, job) | 174 | naked_job = removeSecurityProxy(job) |
466 | 175 | naked_job.reportFailure = FakeMethod() | ||
467 | 176 | |||
468 | 177 | job.run() | ||
469 | 178 | |||
470 | 179 | self.assertEqual(1, naked_job.reportFailure.call_count) | ||
471 | 125 | 180 | ||
472 | 126 | def test_target_ppa_non_release_pocket(self): | 181 | def test_target_ppa_non_release_pocket(self): |
473 | 127 | # When copying to a PPA archive the target must be the release pocket. | 182 | # When copying to a PPA archive the target must be the release pocket. |
474 | @@ -134,7 +189,13 @@ | |||
475 | 134 | target_archive=archive2, target_distroseries=distroseries, | 189 | target_archive=archive2, target_distroseries=distroseries, |
476 | 135 | target_pocket=PackagePublishingPocket.UPDATES, | 190 | target_pocket=PackagePublishingPocket.UPDATES, |
477 | 136 | include_binaries=False) | 191 | include_binaries=False) |
479 | 137 | self.assertRaises(CannotCopy, self.runJob, job) | 192 | |
480 | 193 | naked_job = removeSecurityProxy(job) | ||
481 | 194 | naked_job.reportFailure = FakeMethod() | ||
482 | 195 | |||
483 | 196 | job.run() | ||
484 | 197 | |||
485 | 198 | self.assertEqual(1, naked_job.reportFailure.call_count) | ||
486 | 138 | 199 | ||
487 | 139 | def test_run(self): | 200 | def test_run(self): |
488 | 140 | # A proper test run synchronizes packages. | 201 | # A proper test run synchronizes packages. |
489 | @@ -164,8 +225,9 @@ | |||
490 | 164 | 225 | ||
491 | 165 | source = getUtility(IPlainPackageCopyJobSource) | 226 | source = getUtility(IPlainPackageCopyJobSource) |
492 | 166 | job = source.create( | 227 | job = source.create( |
495 | 167 | source_packages=[("libc", "2.8-1")], source_archive=breezy_archive, | 228 | source_packages=[("libc", "2.8-1")], |
496 | 168 | target_archive=target_archive, target_distroseries=target_series, | 229 | source_archive=breezy_archive, target_archive=target_archive, |
497 | 230 | target_distroseries=target_series, | ||
498 | 169 | target_pocket=PackagePublishingPocket.RELEASE, | 231 | target_pocket=PackagePublishingPocket.RELEASE, |
499 | 170 | include_binaries=False) | 232 | include_binaries=False) |
500 | 171 | self.assertContentEqual( | 233 | self.assertContentEqual( |
501 | @@ -275,8 +337,7 @@ | |||
502 | 275 | def test_getPendingJobsPerPackage_ignores_other_distroseries(self): | 337 | def test_getPendingJobsPerPackage_ignores_other_distroseries(self): |
503 | 276 | # getPendingJobsPerPackage only looks for jobs on the indicated | 338 | # getPendingJobsPerPackage only looks for jobs on the indicated |
504 | 277 | # distroseries. | 339 | # distroseries. |
507 | 278 | dsd = self.factory.makeDistroSeriesDifference() | 340 | self.makeJob() |
506 | 279 | self.makeJob(dsd) | ||
508 | 280 | other_series = self.factory.makeDistroSeries() | 341 | other_series = self.factory.makeDistroSeries() |
509 | 281 | job_source = getUtility(IPlainPackageCopyJobSource) | 342 | job_source = getUtility(IPlainPackageCopyJobSource) |
510 | 282 | self.assertEqual( | 343 | self.assertEqual( |
511 | @@ -333,3 +394,48 @@ | |||
512 | 333 | job_source = getUtility(IPlainPackageCopyJobSource) | 394 | job_source = getUtility(IPlainPackageCopyJobSource) |
513 | 334 | self.assertEqual( | 395 | self.assertEqual( |
514 | 335 | {}, job_source.getPendingJobsPerPackage(dsd.derived_series)) | 396 | {}, job_source.getPendingJobsPerPackage(dsd.derived_series)) |
515 | 397 | |||
516 | 398 | def test_findMatchingDSDs_matches_all_DSDs_for_job(self): | ||
517 | 399 | # findMatchingDSDs finds matching DSDs for any of the packages | ||
518 | 400 | # in the job. | ||
519 | 401 | dsd = self.factory.makeDistroSeriesDifference() | ||
520 | 402 | naked_job = removeSecurityProxy(self.makeJob(dsd)) | ||
521 | 403 | self.assertContentEqual([dsd], naked_job.findMatchingDSDs()) | ||
522 | 404 | |||
523 | 405 | def test_findMatchingDSDs_ignores_other_source_series(self): | ||
524 | 406 | # findMatchingDSDs tries to ignore DSDs that are for different | ||
525 | 407 | # parent series than the job's source series. (This can't be | ||
526 | 408 | # done with perfect precision because the job doesn't keep track | ||
527 | 409 | # of source distroseries, but in practice it should be good | ||
528 | 410 | # enough). | ||
529 | 411 | dsd = self.factory.makeDistroSeriesDifference() | ||
530 | 412 | naked_job = removeSecurityProxy(self.makeJob(dsd)) | ||
531 | 413 | |||
532 | 414 | # If the dsd differs only in parent series, that's enough to | ||
533 | 415 | # make it a non-match. | ||
534 | 416 | removeSecurityProxy(dsd).parent_series = ( | ||
535 | 417 | self.factory.makeDistroSeries()) | ||
536 | 418 | |||
537 | 419 | self.assertContentEqual([], naked_job.findMatchingDSDs()) | ||
538 | 420 | |||
539 | 421 | |||
540 | 422 | class TestPlainPackageCopyJobPrivileges(TestCaseWithFactory, LocalTestHelper): | ||
541 | 423 | """Test that `PlainPackageCopyJob` has the privileges it needs. | ||
542 | 424 | |||
543 | 425 | This test looks for errors, not failures. It's here only to see that | ||
544 | 426 | these operations don't run into any privilege limitations. | ||
545 | 427 | """ | ||
546 | 428 | |||
547 | 429 | layer = LaunchpadZopelessLayer | ||
548 | 430 | |||
549 | 431 | def test_findMatchingDSDs(self): | ||
550 | 432 | job = self.makeJob() | ||
551 | 433 | transaction.commit() | ||
552 | 434 | self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser) | ||
553 | 435 | removeSecurityProxy(job).findMatchingDSDs() | ||
554 | 436 | |||
555 | 437 | def test_reportFailure(self): | ||
556 | 438 | job = self.makeJob() | ||
557 | 439 | transaction.commit() | ||
558 | 440 | self.layer.switchDbUser(config.IPlainPackageCopyJobSource.dbuser) | ||
559 | 441 | removeSecurityProxy(job).reportFailure(CannotCopy("Mommy it hurts")) |
This looks really good.
However, I have a major beef with it: PlainPackageCopyJob is no longer
a plain copy job; it's now heavily intertwined with derivative
distributions. I think PlainPackageCopyJob should be restored, and a
new subclass created, something like DerivativePackageCopyJob, that
knows about DSDs and so forth. It's unfortunate that that involves a
lot of boilerplate code.
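A minimal sketch of the split being suggested here (the class and method names other than PlainPackageCopyJob and CannotCopy are hypothetical stand-ins; the real job talks to the database and uses FakeMethod-style test doubles rather than the toy lists used below):

```python
class CannotCopy(Exception):
    """A copy failed for a reason the user should address."""


class PlainPackageCopyJob(object):
    """Copies packages between archives; knows nothing about DSDs."""

    def attemptCopy(self):
        raise NotImplementedError("subclasses or tests supply the copy")

    def run(self):
        # Unexpected errors simply propagate and become OOPSes.
        self.attemptCopy()


class DerivativePackageCopyJob(PlainPackageCopyJob):
    """A copy job that reports CannotCopy failures as DSD comments."""

    def __init__(self, dsds):
        self.dsds = dsds      # stand-in for findMatchingDSDs()
        self.comments = []    # stand-in for DistroSeriesDifferenceComments

    def reportFailure(self, error):
        # In Launchpad this would create a DistroSeriesDifferenceComment
        # on each matching DSD; here we just record the message.
        for dsd in self.dsds:
            self.comments.append((dsd, str(error)))

    def run(self):
        try:
            self.attemptCopy()
        except CannotCopy as e:
            self.reportFailure(e)
```

With a split like this, callers that want plain archive-to-archive copies keep the old behaviour, and only the derivative-distribution path pays for the DSD lookup and comment posting.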
Other than that, I think it's a shame that packages can only be copied
one at a time, but I can see this is a far from trivial problem to
solve. It does mean that there's no longer an all-or-nothing guarantee
when copying multiple packages. A *big* downside is that users must
check all DSD comments for errors when they request multiple copies. I
assume both of those are acceptable compromises for now.
[1]

+        try:
+            self.attemptCopy()
+        except CannotCopy, e:
+            self.abort()
+            self.reportFailure(e)
+            self.commit()
Fwiw, the job runner commits after run(), so only the abort() is
needed.
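For illustration, the flow this comment implies can be sketched with the runner and transaction machinery faked out (run_job, FakeTransaction, and the method names are stand-ins, not the real Launchpad job-runner API):

```python
class CannotCopy(Exception):
    pass


class FakeTransaction:
    """Records abort/commit calls so we can inspect the ordering."""

    def __init__(self):
        self.log = []

    def abort(self):
        self.log.append("abort")

    def commit(self):
        self.log.append("commit")


def run_job(job, txn):
    """Minimal stand-in for the job runner: it commits after run()."""
    job.run(txn)
    txn.commit()


class CopyJob:
    def __init__(self, failure=None):
        self.failure = failure
        self.reported = []

    def attemptCopy(self):
        if self.failure is not None:
            raise self.failure

    def run(self, txn):
        try:
            self.attemptCopy()
        except CannotCopy as e:
            txn.abort()  # roll back the failed copy's partial work
            self.reported.append(str(e))
            # No txn.commit() here: the runner commits after run(),
            # which is what persists the failure report.
```

The abort is still needed so the failed copy's partial changes are discarded before the failure comment is written; the explicit commit is redundant because the runner's commit lands right after run() returns.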
[2]

+    def findMatchingDSDs(self):
[...]
+        source_distro = self.source_archive.distribution
+        return [
+            dsd
+            for dsd in candidates
+            if dsd.parent_series.distribution == source_distro]

Can you do this with IDs instead?

    return [
        dsd
        for dsd in candidates
        if dsd.parent_series.distributionID == source_distro.id]
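A toy version of the ID-based comparison being asked for (the attribute names mimic Storm conventions such as distributionID but are stand-ins here): comparing the stored foreign-key value avoids dereferencing each candidate's Distribution object, which in the real ORM can mean one less query per candidate.

```python
class Series:
    """Stand-in for a DistroSeries row; distributionID is the raw FK."""

    def __init__(self, distributionID):
        self.distributionID = distributionID


class DSD:
    """Stand-in for a DistroSeriesDifference."""

    def __init__(self, parent_series):
        self.parent_series = parent_series


def find_matching(candidates, source_distro_id):
    # Filter by the foreign-key value rather than a loaded object.
    return [
        dsd
        for dsd in candidates
        if dsd.parent_series.distributionID == source_distro_id]
```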
[3]

+    def test_findMatchingDSDs_matches_all_DSDs_for_job(self):
+        # findMatchingDSDs finds matching DSDs for any of the packages
+        # in the job.
+        dsd = self.factory.makeDistroSeriesDifference()
+        job = removeSecurityProxy(self.makeJob(dsd))

This should be naked_job.
Also in test_findMatchingDSDs_ignores_other_source_series.