Merge ~ilasc/launchpad:populate-store-upload-revision into launchpad:master

Proposed by Ioana Lasc
Status: Merged
Approved by: Ioana Lasc
Approved revision: 87014a2cd2a230179585b1d3db9b26aad4f21014
Merge reported by: Otto Co-Pilot
Merged at revision: not available
Proposed branch: ~ilasc/launchpad:populate-store-upload-revision
Merge into: launchpad:master
Diff against target: 252 lines (+149/-5)
4 files modified
database/schema/security.cfg (+1/-0)
lib/lp/scripts/garbo.py (+51/-2)
lib/lp/scripts/tests/test_garbo.py (+83/-1)
lib/lp/snappy/model/snapbuild.py (+14/-2)
Reviewer Review Type Date Requested Status
Colin Watson (community) Approve
Review via email: mp+407679@code.launchpad.net

Commit message

Populate store_upload_revision

Description of the change

This populates the new DB column store_upload_revision on SnapBuild.
We're renaming the existing "store_upload_revision" property on SnapBuild to "store_upload_revision_property". We will use this property to populate the new DB column "store_upload_revision" in PopulateSnapBuildStoreRevision.

This work will be followed by a second MP for LP-349 to start updating the new DB column "store_upload_revision" when uploading the snap to the store.

Revision history for this message
Ioana Lasc (ilasc) :
Revision history for this message
Colin Watson (cjwatson) :
review: Needs Fixing
Revision history for this message
Ioana Lasc (ilasc) wrote :

Thanks Colin! That helped me debug interactively and get to the bottom of it. It turns out the script was in fact being triggered, but it was failing with a "hidden" SilentLaunchpadScriptFailure caused by garbo_daily not having enough permissions to update SnapBuild.

In terms of breaking API clients that expect the `store_upload_revision` attribute of a snap build to contain useful data, I believe we do have a problem there:
I might be misunderstanding this, but it looks like we now have a collision between the DB column name and the property, which I didn't realize was exposed through the API before we landed the DB patch. I don't think we'll be able to define a getter with the same name and have it exposed through the API as before, along the lines of:

    @property
    def store_upload_revision(self):
        if self.store_upload_revision is None:  # the new DB column
            return self.store_upload_revision_property  # the former property

because that will interfere with the query used to populate the DB column: the new getter would not return None when it should (so we could tell which builds still need backfilling); it would return the value from the former property instead.

I could propose a patch to alter the name of the currently empty column from "store_upload_revision" to "store_revision".

It's entirely possible I misunderstood what's possible / what's required (for backward compatibility) here though...

Revision history for this message
Colin Watson (cjwatson) wrote :

There's certainly no need to rename the DB column; for this sort of thing it's always easier to handle any necessary renamings in Python, and there are various primitives available for that. The underlying DB column name isn't required to match the name of the Storm column (it's set using the 'name' parameter to the Storm variable constructor), and there are quite a few cases where we have something like this:

    _foo = Int(name='foo')

    @property
    def foo(self):
        if self._foo is not None:
            return self._foo
        else:
            # compute and return something

Of course anything that sets the actual DB column needs to use the Storm variable name, unless we also use a setter property (which can sometimes be useful), and Storm queries typically need to be written to refer to the name of the Storm variable rather than the wrapping property. The important thing is that any names used in API-exported attributes need to be defined in a way that preserves compatibility.
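
For illustration only (the class, table, and fallback names here are made up, not Launchpad's actual model code), the full pattern with a setter property and a query against the raw Storm column might look roughly like this:

    from storm.locals import Int, Storm

    class Widget(Storm):
        __storm_table__ = 'widget'
        id = Int(primary=True)
        # The DB column keeps its old name; only the Python attribute changes.
        _foo = Int(name='foo', allow_none=True)

        @property
        def foo(self):
            if self._foo is not None:
                return self._foo
            # Hypothetical fallback; stands in for "compute it the old way".
            return 0

        @foo.setter
        def foo(self, value):
            self._foo = value

    # A backfill query must refer to the Storm variable, not the property:
    #     store.find(Widget, Widget._foo == None)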

Happy to help with getting the details of this right tomorrow if you need it.

Revision history for this message
Ioana Lasc (ilasc) wrote :

Thanks Colin! Updates done, MP ready for review.

Revision history for this message
Colin Watson (cjwatson) wrote :

It's a bit strange to land the backfill job before starting to populate the new column in the "ordinary" inline way (in this case having `SnapStoreUploadJob` populate it) - we'd normally do it the other way round. I don't think it's actually a problem here, though, since everything should still catch up eventually.

When you do start populating the new column in `SnapStoreUploadJob`, be careful to run your garbo job tests; it looks like they'll probably need rearranging at that point to manually simulate an old job that's run but that hasn't populated the new column.
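
A rough sketch of what that rearrangement might look like (hypothetical test code, just to illustrate the idea): after running the upload job, clear the column again so the build looks like one created before the inline population existed, e.g.

    build1 = removeSecurityProxy(build1)
    build1._store_upload_revision = None
    transaction.commit()
    # The garbo job should then pick build1 up again and backfill the
    # column from the completed SnapStoreUploadJob.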

review: Approve
Revision history for this message
Colin Watson (cjwatson) :
review: Approve

Preview Diff

diff --git a/database/schema/security.cfg b/database/schema/security.cfg
index 7f39796..8567419 100644
--- a/database/schema/security.cfg
+++ b/database/schema/security.cfg
@@ -2530,6 +2530,7 @@ public.revisionauthor = SELECT, UPDATE
 public.revisioncache = SELECT, DELETE
 public.snap = SELECT, UPDATE
 public.snapbase = SELECT
+public.snapbuild = SELECT, UPDATE
 public.snapfile = SELECT, DELETE
 public.snappydistroseries = SELECT
 public.snappyseries = SELECT
diff --git a/lib/lp/scripts/garbo.py b/lib/lp/scripts/garbo.py
index fba6c29..f416b3f 100644
--- a/lib/lp/scripts/garbo.py
+++ b/lib/lp/scripts/garbo.py
@@ -37,6 +37,7 @@ from storm.expr import (
     Except,
     In,
     Join,
+    LeftJoin,
     Max,
     Min,
     Or,
@@ -134,8 +135,14 @@ from lp.services.verification.model.logintoken import LoginToken
 from lp.services.webapp.publisher import canonical_url
 from lp.services.webhooks.interfaces import IWebhookJobSource
 from lp.services.webhooks.model import WebhookJob
-from lp.snappy.model.snapbuild import SnapFile
-from lp.snappy.model.snapbuildjob import SnapBuildJobType
+from lp.snappy.model.snapbuild import (
+    SnapBuild,
+    SnapFile,
+    )
+from lp.snappy.model.snapbuildjob import (
+    SnapBuildJob,
+    SnapBuildJobType,
+    )
 from lp.soyuz.enums import ArchiveSubscriberStatus
 from lp.soyuz.interfaces.publishing import active_publishing_status
 from lp.soyuz.model.archive import Archive
@@ -1717,6 +1724,47 @@ class ArchiveAuthTokenDeactivator(BulkPruner):
         transaction.commit()


+class PopulateSnapBuildStoreRevision(TunableLoop):
47+ """Populates snapbuild.store_revision if not set."""
+
+    maximum_chunk_size = 5000
+
+    def __init__(self, log, abort_time=None):
+        super(PopulateSnapBuildStoreRevision, self).__init__(log, abort_time)
+        self.start_at = 1
+        self.store = IMasterStore(SnapBuild)
+
+    def findSnapBuilds(self):
+        origin = [
+            SnapBuild,
+            LeftJoin(
+                SnapBuildJob,
+                SnapBuildJob.snapbuild_id == SnapBuild.id),
+            LeftJoin(
+                Job,
+                Job.id == SnapBuildJob.job_id)
+            ]
+        builds = self.store.using(*origin).find(
+            SnapBuild,
+            SnapBuild.id >= self.start_at,
+            SnapBuild._store_upload_revision == None,
+            SnapBuildJob.job_type == SnapBuildJobType.STORE_UPLOAD,
+            Job._status == JobStatus.COMPLETED)
+
+        return builds.order_by(SnapBuild.id)
+
+    def isDone(self):
+        return self.findSnapBuilds().is_empty()
+
+    def __call__(self, chunk_size):
+        builds = list(self.findSnapBuilds()[:chunk_size])
+        for build in builds:
+            build._store_upload_revision = build.store_upload_revision
+        if len(builds):
+            self.start_at = builds[-1].id + 1
+        transaction.commit()
+
+
 class BaseDatabaseGarbageCollector(LaunchpadCronScript):
     """Abstract base class to run a collection of TunableLoops."""
     script_name = None # Script name for locking and database user. Override.
@@ -2008,6 +2056,7 @@ class DailyDatabaseGarbageCollector(BaseDatabaseGarbageCollector):
         OCIFilePruner,
         ObsoleteBugAttachmentPruner,
         OldTimeLimitedTokenDeleter,
+        PopulateSnapBuildStoreRevision,
         POTranslationPruner,
         PreviewDiffPruner,
         ProductVCSPopulator,
diff --git a/lib/lp/scripts/tests/test_garbo.py b/lib/lp/scripts/tests/test_garbo.py
index 0d34fce..3a43045 100644
--- a/lib/lp/scripts/tests/test_garbo.py
+++ b/lib/lp/scripts/tests/test_garbo.py
@@ -97,6 +97,7 @@ from lp.scripts.garbo import (
     load_garbo_job_state,
     LoginTokenPruner,
     OpenIDConsumerAssociationPruner,
+    PopulateSnapBuildStoreRevision,
     ProductVCSPopulator,
     save_garbo_job_state,
     UnusedPOTMsgSetPruner,
@@ -130,11 +131,17 @@ from lp.services.verification.interfaces.authtoken import LoginTokenType
 from lp.services.verification.model.logintoken import LoginToken
 from lp.services.worlddata.interfaces.language import ILanguageSet
 from lp.snappy.interfaces.snap import SNAP_TESTING_FLAGS
+from lp.snappy.interfaces.snapbuildjob import ISnapStoreUploadJobSource
+from lp.snappy.interfaces.snapstoreclient import ISnapStoreClient
 from lp.snappy.model.snapbuild import SnapFile
 from lp.snappy.model.snapbuildjob import (
     SnapBuildJob,
     SnapStoreUploadJob,
     )
+from lp.snappy.tests.test_snapbuildjob import (
+    FakeSnapStoreClient,
+    run_isolated_jobs,
+    )
 from lp.soyuz.enums import (
     ArchiveSubscriberStatus,
     PackagePublishingStatus,
@@ -153,7 +160,11 @@ from lp.testing import (
     TestCase,
     TestCaseWithFactory,
     )
-from lp.testing.dbuser import switch_dbuser
+from lp.testing.dbuser import (
+    dbuser,
+    switch_dbuser,
+    )
+from lp.testing.fixture import ZopeUtilityFixture
 from lp.testing.layers import (
     DatabaseLayer,
     LaunchpadScriptLayer,
@@ -1978,6 +1989,77 @@ class TestGarbo(FakeAdapterMixin, TestCaseWithFactory):
         self.assertIsNotNone(token.date_deactivated)
         self.assertEmailQueueLength(0)

+    def test_PopulateSnapBuildStoreRevision(self):
+        switch_dbuser('testadmin')
+        snap1 = self.factory.makeSnap()
+        build1 = self.factory.makeSnapBuild(
+            snap=snap1,
+            status=BuildStatus.FULLYBUILT)
+
+        # test that build1 does not get picked up
+        # as it is a build without a store upload
+        populator = PopulateSnapBuildStoreRevision(None)
+        rs = populator.findSnapBuilds()
+        self.assertEqual(0, rs.count())
+
+        # Upload build
+        job = getUtility(ISnapStoreUploadJobSource).create(build1)
+        client = FakeSnapStoreClient()
+        client.upload.result = (
+            "http://sca.example/dev/api/snaps/1/builds/1/status")
+        client.checkStatus.result = (
+            "http://sca.example/dev/click-apps/1/rev/1/", 1)
+        self.useFixture(ZopeUtilityFixture(client, ISnapStoreClient))
+        with dbuser(config.ISnapStoreUploadJobSource.dbuser):
+            run_isolated_jobs([job])
+
+        # this mimics what we have in the DB right now:
+        # uploaded snaps that do not have the new DB column
+        # _store_upload_revision populated yet
+        populator = PopulateSnapBuildStoreRevision(None)
+        filter = populator.findSnapBuilds()
+        build1 = removeSecurityProxy(build1)
+        self.assertEqual(1, filter.count())
+        self.assertEqual(build1, filter.one())
+        self.assertEqual(build1._store_upload_revision, None)
+
+        # run the garbo job and verify _store_upload_revision
+        # is now populated with the value assigned to the build during upload
+        self.runDaily()
+        switch_dbuser('testadmin')
+        self.assertEqual(build1._store_upload_revision, 1)
+
+        # Tests that of all builds for the same snap only those that have
+        # been uploaded to the store will get
+        # their new _store_upload_revision DB column updated
+        build2 = self.factory.makeSnapBuild(
+            snap=snap1,
+            status=BuildStatus.FULLYBUILT)
+        build3 = self.factory.makeSnapBuild(
+            snap=snap1,
+            status=BuildStatus.FULLYBUILT)
+        job = getUtility(ISnapStoreUploadJobSource).create(build2)
+        client = FakeSnapStoreClient()
+        client.upload.result = (
+            "http://sca.example/dev/api/snaps/1/builds/2/status")
+        client.checkStatus.result = (
+            "http://sca.example/dev/click-apps/1/rev/2/", 1)
+        self.useFixture(ZopeUtilityFixture(client, ISnapStoreClient))
+        with dbuser(config.ISnapStoreUploadJobSource.dbuser):
+            run_isolated_jobs([job])
+
+        populator = PopulateSnapBuildStoreRevision(None)
+        filter = populator.findSnapBuilds()
+        self.assertEqual(1, filter.count())
+        self.assertEqual(build2, filter.one())
+
+        self.runDaily()
+        switch_dbuser('testadmin')
+        build2 = removeSecurityProxy(build2)
+        self.assertEqual(build2._store_upload_revision, 1)
+        build3 = removeSecurityProxy(build3)
+        self.assertIsNone(build3._store_upload_revision)
+

 class TestGarboTasks(TestCaseWithFactory):
     layer = LaunchpadZopelessLayer
diff --git a/lib/lp/snappy/model/snapbuild.py b/lib/lp/snappy/model/snapbuild.py
index ca5285a..23864c9 100644
--- a/lib/lp/snappy/model/snapbuild.py
+++ b/lib/lp/snappy/model/snapbuild.py
@@ -191,6 +191,8 @@ class SnapBuild(PackageBuildMixin, Storm):

     failure_count = Int(name='failure_count', allow_none=False)

+    _store_upload_revision = Int(name='store_upload_revision', allow_none=True)
+
     store_upload_metadata = JSON('store_upload_json_data', allow_none=True)

     def __init__(self, build_farm_job, requester, snap, archive,
@@ -527,8 +529,18 @@ class SnapBuild(PackageBuildMixin, Storm):

     @property
     def store_upload_revision(self):
-        job = self.last_store_upload_job
-        return job and job.store_revision
+        # We are now persisting the revision assigned by the store
+        # on package upload in the new DB column _store_upload_revision.
+        # We backfill _store_upload_revision with
+        # PopulateSnapBuildStoreRevision.
+        # If the persisted field (_store_upload_revision)
+        # is not populated yet we return the old way of computing this
+        # value so that we don't break existing API clients.
+        if self._store_upload_revision:
+            return self._store_upload_revision
+        else:
+            job = self.last_store_upload_job
+            return job and job.store_revision

     @property
     def store_upload_error_message(self):
