Merge ~ilasc/launchpad:populate-store-upload-revision into launchpad:master

Proposed by Ioana Lasc
Status: Merged
Approved by: Ioana Lasc
Approved revision: 87014a2cd2a230179585b1d3db9b26aad4f21014
Merge reported by: Otto Co-Pilot
Merged at revision: not available
Proposed branch: ~ilasc/launchpad:populate-store-upload-revision
Merge into: launchpad:master
Diff against target: 252 lines (+149/-5)
4 files modified
database/schema/security.cfg (+1/-0)
lib/lp/scripts/garbo.py (+51/-2)
lib/lp/scripts/tests/test_garbo.py (+83/-1)
lib/lp/snappy/model/snapbuild.py (+14/-2)
Reviewer Review Type Date Requested Status
Colin Watson (community) Approve
Review via email: mp+407679@code.launchpad.net

Commit message

Populate store_upload_revision

Description of the change

This populates the new DB column store_upload_revision on SnapBuild.
We're renaming the existing "store_upload_revision" property on SnapBuild to "store_upload_revision_property", and we will use that property to populate the new DB column "store_upload_revision" in PopulateSnapBuildStoreRevision.

This work will be followed by a second MP for LP-349 to start updating the new DB column "store_upload_revision" when uploading the snap to the store.

Revision history for this message
Ioana Lasc (ilasc) :
Revision history for this message
Colin Watson (cjwatson) :
review: Needs Fixing
Revision history for this message
Ioana Lasc (ilasc) wrote :

Thanks Colin! That helped me debug interactively and get to the bottom of it. It turns out the script was in fact getting triggered, but it was failing with a "hidden" SilentLaunchpadScriptFailure caused by garbo_daily not having enough permissions to update SnapBuild.

In terms of breaking API clients that expect the `store_upload_revision` attribute of a snap build to contain useful data, I believe we do have a problem there:
I might be misunderstanding this, but it looks like we now have a collision between the DB column name and the property, which I didn't realize was exposed through the API before we landed the DB patch. I don't think we'll be able to define a getter with the same name and have it exposed through the API as before, along the lines of:

    def store_upload_revision(self):
        if self.store_upload_revision is None:
            return self._store_upload_revision

because that will interfere with the query that populates the DB column (the new getter would not return None when it should for the backfill; it would return the value from the former property).
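The masking effect described above can be sketched with a plain-Python mock (hypothetical attribute names, not the real Storm model): once a fallback getter sits under the column's public name, reading that name never yields None for an uploaded build, so "needs backfill" cannot be detected through it.

```python
class SnapBuildMock:
    """Hypothetical stand-in for SnapBuild: the new DB column is
    empty, but the old job-derived revision exists."""

    def __init__(self, job_revision):
        self._db_store_upload_revision = None  # new, unpopulated column
        self._job_revision = job_revision      # old property's source

    @property
    def store_upload_revision(self):
        # Fallback getter under the column's public name.
        if self._db_store_upload_revision is not None:
            return self._db_store_upload_revision
        return self._job_revision


build = SnapBuildMock(job_revision=7)
# The getter masks the empty column: callers always see 7, never None,
# so a backfill query phrased against the public name finds no rows.
assert build.store_upload_revision == 7
assert build._db_store_upload_revision is None
```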

I could propose a patch to alter the name of the currently empty column from "store_upload_revision" to "store_revision".

It's entirely possible I misunderstood what's possible / what's required (for backward compatibility) here though...

Revision history for this message
Colin Watson (cjwatson) wrote :

There's certainly no need to rename the DB column; for this sort of thing it's always easier to handle any necessary renamings in Python, and there are various primitives available for that. The underlying DB column name isn't required to match the name of the Storm column (it's set using the 'name' parameter to the Storm variable constructor), and there are quite a few cases where we have something like this:

    _foo = Int(name='foo')

    @property
    def foo(self):
        if self._foo is not None:
            return self._foo
        else:
            # compute and return something

Of course anything that sets the actual DB column needs to use the Storm variable name, unless we also use a setter property (which can sometimes be useful), and Storm queries typically need to be written to refer to the name of the Storm variable rather than the wrapping property. The important thing is that any names used in API-exported attributes need to be defined in a way that preserves compatibility.
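A minimal plain-Python sketch of that pattern (standing in for Storm's `Int(name=...)` machinery; all names here are illustrative): the private attribute plays the role of the DB column, queries and writes go through `_foo`, and the public property keeps the API-visible name stable, optionally with a write-through setter.

```python
class Model:
    """Sketch: _foo stands in for Int(name='foo'); foo wraps it."""

    def __init__(self, stored=None):
        self._foo = stored  # the "DB column"

    @property
    def foo(self):
        if self._foo is not None:
            return self._foo
        # compute and return something when the column is unset
        return self._compute_foo()

    def _compute_foo(self):
        # placeholder for the legacy computation
        return 42

    @foo.setter
    def foo(self, value):
        # optional setter property writing through to the column
        self._foo = value


m = Model()
assert m.foo == 42   # falls back to the computed value
m.foo = 7
assert m._foo == 7   # writes land in the "column"
```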

Happy to help with getting the details of this right tomorrow if you need it.

Revision history for this message
Ioana Lasc (ilasc) wrote :

Thanks Colin! Updates done, MP ready for review.

Revision history for this message
Colin Watson (cjwatson) wrote :

It's a bit strange to land the backfill job before starting to populate the new column in the "ordinary" inline way (in this case having `SnapStoreUploadJob` populate it) - we'd normally do it the other way round. I don't think it's actually a problem here, though, since everything should still catch up eventually.

When you do start populating the new column in `SnapStoreUploadJob`, be careful to run your garbo job tests; it looks like they'll probably need rearranging at that point to manually simulate an old job that's run but that hasn't populated the new column.

review: Approve
Revision history for this message
Colin Watson (cjwatson) :
review: Approve

Preview Diff

diff --git a/database/schema/security.cfg b/database/schema/security.cfg
index 7f39796..8567419 100644
--- a/database/schema/security.cfg
+++ b/database/schema/security.cfg
@@ -2530,6 +2530,7 @@ public.revisionauthor = SELECT, UPDATE
 public.revisioncache = SELECT, DELETE
 public.snap = SELECT, UPDATE
 public.snapbase = SELECT
+public.snapbuild = SELECT, UPDATE
 public.snapfile = SELECT, DELETE
 public.snappydistroseries = SELECT
 public.snappyseries = SELECT
diff --git a/lib/lp/scripts/garbo.py b/lib/lp/scripts/garbo.py
index fba6c29..f416b3f 100644
--- a/lib/lp/scripts/garbo.py
+++ b/lib/lp/scripts/garbo.py
@@ -37,6 +37,7 @@ from storm.expr import (
     Except,
     In,
     Join,
+    LeftJoin,
     Max,
     Min,
     Or,
@@ -134,8 +135,14 @@ from lp.services.verification.model.logintoken import LoginToken
 from lp.services.webapp.publisher import canonical_url
 from lp.services.webhooks.interfaces import IWebhookJobSource
 from lp.services.webhooks.model import WebhookJob
-from lp.snappy.model.snapbuild import SnapFile
-from lp.snappy.model.snapbuildjob import SnapBuildJobType
+from lp.snappy.model.snapbuild import (
+    SnapBuild,
+    SnapFile,
+    )
+from lp.snappy.model.snapbuildjob import (
+    SnapBuildJob,
+    SnapBuildJobType,
+    )
 from lp.soyuz.enums import ArchiveSubscriberStatus
 from lp.soyuz.interfaces.publishing import active_publishing_status
 from lp.soyuz.model.archive import Archive
@@ -1717,6 +1724,47 @@ class ArchiveAuthTokenDeactivator(BulkPruner):
         transaction.commit()
 
 
+class PopulateSnapBuildStoreRevision(TunableLoop):
+    """Populates snapbuild.store_revision if not set."""
+
+    maximum_chunk_size = 5000
+
+    def __init__(self, log, abort_time=None):
+        super(PopulateSnapBuildStoreRevision, self).__init__(log, abort_time)
+        self.start_at = 1
+        self.store = IMasterStore(SnapBuild)
+
+    def findSnapBuilds(self):
+        origin = [
+            SnapBuild,
+            LeftJoin(
+                SnapBuildJob,
+                SnapBuildJob.snapbuild_id == SnapBuild.id),
+            LeftJoin(
+                Job,
+                Job.id == SnapBuildJob.job_id)
+            ]
+        builds = self.store.using(*origin).find(
+            SnapBuild,
+            SnapBuild.id >= self.start_at,
+            SnapBuild._store_upload_revision == None,
+            SnapBuildJob.job_type == SnapBuildJobType.STORE_UPLOAD,
+            Job._status == JobStatus.COMPLETED)
+
+        return builds.order_by(SnapBuild.id)
+
+    def isDone(self):
+        return self.findSnapBuilds().is_empty()
+
+    def __call__(self, chunk_size):
+        builds = list(self.findSnapBuilds()[:chunk_size])
+        for build in builds:
+            build._store_upload_revision = build.store_upload_revision
+        if len(builds):
+            self.start_at = builds[-1].id + 1
+        transaction.commit()
+
+
 class BaseDatabaseGarbageCollector(LaunchpadCronScript):
     """Abstract base class to run a collection of TunableLoops."""
     script_name = None  # Script name for locking and database user. Override.
@@ -2008,6 +2056,7 @@ class DailyDatabaseGarbageCollector(BaseDatabaseGarbageCollector):
         OCIFilePruner,
         ObsoleteBugAttachmentPruner,
         OldTimeLimitedTokenDeleter,
+        PopulateSnapBuildStoreRevision,
         POTranslationPruner,
         PreviewDiffPruner,
         ProductVCSPopulator,
diff --git a/lib/lp/scripts/tests/test_garbo.py b/lib/lp/scripts/tests/test_garbo.py
index 0d34fce..3a43045 100644
--- a/lib/lp/scripts/tests/test_garbo.py
+++ b/lib/lp/scripts/tests/test_garbo.py
@@ -97,6 +97,7 @@ from lp.scripts.garbo import (
     load_garbo_job_state,
     LoginTokenPruner,
     OpenIDConsumerAssociationPruner,
+    PopulateSnapBuildStoreRevision,
     ProductVCSPopulator,
     save_garbo_job_state,
     UnusedPOTMsgSetPruner,
@@ -130,11 +131,17 @@ from lp.services.verification.interfaces.authtoken import LoginTokenType
 from lp.services.verification.model.logintoken import LoginToken
 from lp.services.worlddata.interfaces.language import ILanguageSet
 from lp.snappy.interfaces.snap import SNAP_TESTING_FLAGS
+from lp.snappy.interfaces.snapbuildjob import ISnapStoreUploadJobSource
+from lp.snappy.interfaces.snapstoreclient import ISnapStoreClient
 from lp.snappy.model.snapbuild import SnapFile
 from lp.snappy.model.snapbuildjob import (
     SnapBuildJob,
     SnapStoreUploadJob,
     )
+from lp.snappy.tests.test_snapbuildjob import (
+    FakeSnapStoreClient,
+    run_isolated_jobs,
+    )
 from lp.soyuz.enums import (
     ArchiveSubscriberStatus,
     PackagePublishingStatus,
@@ -153,7 +160,11 @@ from lp.testing import (
     TestCase,
     TestCaseWithFactory,
     )
-from lp.testing.dbuser import switch_dbuser
+from lp.testing.dbuser import (
+    dbuser,
+    switch_dbuser,
+    )
+from lp.testing.fixture import ZopeUtilityFixture
 from lp.testing.layers import (
     DatabaseLayer,
     LaunchpadScriptLayer,
@@ -1978,6 +1989,77 @@ class TestGarbo(FakeAdapterMixin, TestCaseWithFactory):
         self.assertIsNotNone(token.date_deactivated)
         self.assertEmailQueueLength(0)
 
+    def test_PopulateSnapBuildStoreRevision(self):
+        switch_dbuser('testadmin')
+        snap1 = self.factory.makeSnap()
+        build1 = self.factory.makeSnapBuild(
+            snap=snap1,
+            status=BuildStatus.FULLYBUILT)
+
+        # test that build1 does not get picked up
+        # as it is a build without a store upload
+        populator = PopulateSnapBuildStoreRevision(None)
+        rs = populator.findSnapBuilds()
+        self.assertEqual(0, rs.count())
+
+        # Upload build
+        job = getUtility(ISnapStoreUploadJobSource).create(build1)
+        client = FakeSnapStoreClient()
+        client.upload.result = (
+            "http://sca.example/dev/api/snaps/1/builds/1/status")
+        client.checkStatus.result = (
+            "http://sca.example/dev/click-apps/1/rev/1/", 1)
+        self.useFixture(ZopeUtilityFixture(client, ISnapStoreClient))
+        with dbuser(config.ISnapStoreUploadJobSource.dbuser):
+            run_isolated_jobs([job])
+
+        # this mimics what we have in the DB right now:
+        # uploaded snaps that do not have the new DB column
+        # _store_upload_revision populated yet
+        populator = PopulateSnapBuildStoreRevision(None)
+        filter = populator.findSnapBuilds()
+        build1 = removeSecurityProxy(build1)
+        self.assertEqual(1, filter.count())
+        self.assertEqual(build1, filter.one())
+        self.assertEqual(build1._store_upload_revision, None)
+
+        # run the garbo job and verify _store_upload_revision
+        # is now populated with the value assigned to the build during upload
+        self.runDaily()
+        switch_dbuser('testadmin')
+        self.assertEqual(build1._store_upload_revision, 1)
+
+        # Tests that of all builds for the same snap only those that have
+        # been uploaded to the store will get
+        # their new _store_upload_revision DB column updated
+        build2 = self.factory.makeSnapBuild(
+            snap=snap1,
+            status=BuildStatus.FULLYBUILT)
+        build3 = self.factory.makeSnapBuild(
+            snap=snap1,
+            status=BuildStatus.FULLYBUILT)
+        job = getUtility(ISnapStoreUploadJobSource).create(build2)
+        client = FakeSnapStoreClient()
+        client.upload.result = (
+            "http://sca.example/dev/api/snaps/1/builds/2/status")
+        client.checkStatus.result = (
+            "http://sca.example/dev/click-apps/1/rev/2/", 1)
+        self.useFixture(ZopeUtilityFixture(client, ISnapStoreClient))
+        with dbuser(config.ISnapStoreUploadJobSource.dbuser):
+            run_isolated_jobs([job])
+
+        populator = PopulateSnapBuildStoreRevision(None)
+        filter = populator.findSnapBuilds()
+        self.assertEqual(1, filter.count())
+        self.assertEqual(build2, filter.one())
+
+        self.runDaily()
+        switch_dbuser('testadmin')
+        build2 = removeSecurityProxy(build2)
+        self.assertEqual(build2._store_upload_revision, 1)
+        build3 = removeSecurityProxy(build3)
+        self.assertIsNone(build3._store_upload_revision)
+
 
 class TestGarboTasks(TestCaseWithFactory):
     layer = LaunchpadZopelessLayer
diff --git a/lib/lp/snappy/model/snapbuild.py b/lib/lp/snappy/model/snapbuild.py
index ca5285a..23864c9 100644
--- a/lib/lp/snappy/model/snapbuild.py
+++ b/lib/lp/snappy/model/snapbuild.py
@@ -191,6 +191,8 @@ class SnapBuild(PackageBuildMixin, Storm):
 
     failure_count = Int(name='failure_count', allow_none=False)
 
+    _store_upload_revision = Int(name='store_upload_revision', allow_none=True)
+
     store_upload_metadata = JSON('store_upload_json_data', allow_none=True)
 
     def __init__(self, build_farm_job, requester, snap, archive,
@@ -527,8 +529,18 @@ class SnapBuild(PackageBuildMixin, Storm):
 
     @property
     def store_upload_revision(self):
-        job = self.last_store_upload_job
-        return job and job.store_revision
+        # We are now persisting the revision assigned by the store
+        # on package upload in the new DB column _store_upload_revision.
+        # We backfill _store_upload_revision with
+        # PopulateSnapBuildStoreRevision.
+        # If the persisted field (_store_upload_revision) is not
+        # populated yet, we fall back to the old way of computing this
+        # value so that we don't break existing API clients.
+        if self._store_upload_revision:
+            return self._store_upload_revision
+        else:
+            job = self.last_store_upload_job
+            return job and job.store_revision
 
     @property
     def store_upload_error_message(self):
