Merge lp:~jtv/launchpad/bug-526462 into lp:launchpad
Status: Merged
Approved by: Jeroen T. Vermeulen
Approved revision: not available
Merged at revision: not available
Proposed branch: lp:~jtv/launchpad/bug-526462
Merge into: lp:launchpad
Diff against target: 347 lines (+338/-0), 2 files modified
  lib/lp/soyuz/doc/sampledata-cleanup.txt (+22/-0)
  utilities/soyuz-sampledata-cleanup.py (+316/-0)
To merge this branch: bzr merge lp:~jtv/launchpad/bug-526462
Related bugs:
Reviewer: Michael Nelson (community), review type: code, status: Approve
Review via email: mp+20052@code.launchpad.net
Commit message
Integrate make-ubuntu-sane.py sample-data cleanup script into Launchpad.
Description of the change
Jeroen T. Vermeulen (jtv) wrote:
Michael Nelson (michael.nelson) wrote:
Hi Jeroen,
This is great! Thanks for getting this in with a test to ensure it'll stay in sync with the code-base. I think it would also be worthwhile to add a test to ensure it won't run in other envs (you never know who will come along and modify your code with unintended effects).
As mentioned on IRC, I realise this is the suspenders and not the belt, but I still think we should *only* allow this to run when LPCONFIG == 'development'. If someone wants to run it in another env. (or for some reason, hasn't set LPCONFIG), they'll know enough to modify the script for their needs. I mean, looking at lp-production-
Thoughts?
Michael Nelson (michael.nelson) wrote:
As discussed, we have other damaging scripts intended for local development that do not even do any checks. I do think it would be a simpler (and sufficient) implementation to just check LPCONFIG == 'development', but I'll leave that up to you.
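The stricter guard the reviewer proposes could look something like the sketch below. This is not the implementation the branch uses; the helper name and the choice to read from an explicit environment mapping are ours, for illustration only.

```python
import os


class DoNotRunOnProduction(Exception):
    """Raised when the environment does not look like a dev system."""


def require_development_config(environ=os.environ):
    """Abort unless LPCONFIG is exactly 'development'.

    Anyone running under a customized config would need to edit the
    script, which is the point of the stricter check.
    """
    config = environ.get('LPCONFIG', 'an unknown config')
    if config != 'development':
        raise DoNotRunOnProduction(
            "Refusing to run under LPCONFIG=%s; edit the script if you "
            "really mean it." % config)
```

The trade-off jtv raises below is exactly this: the check is simple, but it also rejects legitimate customized dev configs.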
16:42 < jtv> noodles775: thanks for the review!
16:43 < jtv> noodles775: I'm a bit worried though that being this strict would stop people from playing with customized configs.
16:43 < jtv> I realize it's always a tradeoff, but there's also the "do we have a really large production-like table" test.
16:44 < noodles775> jtv: well, if they're playing with customized configs, they're not going to have a problem spending 10 seconds to adjust the script if
16:45 < jtv> noodles775: true, but this script already has another check to ensure it's not running on production; how many scripts do we have (without ever any trouble) that are just as dangerous without anyone ever adding checks like this?
16:45 < jtv> I mean, if I hadn't added the check, would the average reviewer have thought of adding it?
16:45 < noodles775> jtv: which scripts? make schema?
16:46 < jtv> noodles775: I guess, though I've no idea whether that has any checks.
16:46 < noodles775> jtv: yeah, I'm not sure. All the other scripts that I can think of *add* info, not delete.
16:46 < jtv> launchpad-
16:46 * noodles775 looks
16:47 < jtv> mock-code-import ("warning! run make schema first!")
16:48 < jtv> We also have a script now, apparently, to remove private data. There may be more.
16:49 < noodles775> Yep, you're right.
16:50 < jtv> So I don't want to spend my time guarding against the admin who accidentally goes through the rigmarole for running scripts against a
16:51 < noodles775> yeah, I agree. I was more worried about the situation where a person runs it against a smaller DB with a different config.
16:53 < jtv> noodles775: there's a good chance that the script might fail, and I'd estimate the risk of them running "make schema" by accident considerably
16:53 < jtv> (from shell history, f'rinstance)
16:54 < noodles775> jtv: I just thought it would have been a simpler implementation, not more difficult (ie. LPCONFIG == 'development'), but yep, I'm
Jeroen T. Vermeulen (jtv) wrote:
Thanks! I did start out thinking that one check should be enough. I ended up with one that might produce false positives, plus one that might produce false negatives. For that situation, a --force against false positives plus a final just-in-case safeguard seemed the right approach. In any case the check is easy; maybe someday we'll build reusable helpers for this.
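The two-layer design described here (a heuristic that may false-positive and is overridable with --force, plus an absolute config blacklist that is not) can be sketched in isolation. The function name and parameters below are hypothetical; the real check lives in the script's check_preconditions and queries the database for the production-likeness signal.

```python
import re


class DoNotRunOnProduction(Exception):
    """Raised when it is not clearly safe to proceed."""


# Configs on which the script must never run, --force or not.
FORBIDDEN_CONFIGS = re.compile('(edge|lpnet|production)')


def check_safety(config_name, looks_like_production, force=False):
    """Belt-and-suspenders safety check (standalone sketch).

    The config blacklist is absolute: clear-cut cases get no override.
    The production-likeness heuristic may yield false positives on
    unusual dev databases, so --force can override that layer only.
    """
    if FORBIDDEN_CONFIGS.match(config_name):
        raise DoNotRunOnProduction(
            "Refusing to run on %s; --force won't help." % config_name)
    if looks_like_production and not force:
        raise DoNotRunOnProduction(
            "Database looks production-like; pass --force to override.")
```

Ordering matters: the blacklist is tested first so that --force can never reach past it.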
Preview Diff
=== added file 'lib/lp/soyuz/doc/sampledata-cleanup.txt'
--- lib/lp/soyuz/doc/sampledata-cleanup.txt	1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/doc/sampledata-cleanup.txt	2010-02-24 12:36:31 +0000
@@ -0,0 +1,22 @@
+= Sample data cleanup =
+
+In order to run Soyuz locally on a development system, the sample data
+must be cleaned up and customized a bit.  This is done by the script
+utilities/soyuz-sampledata-cleanup.py.
+
+We only need this script for the playground sample data, so there's
+little point in inspecting what it does to the test database in detail.
+
+    >>> from canonical.launchpad.ftests.script import run_script
+
+The --dry-run option makes the script roll back its changes.
+
+    >>> return_code, output, error = run_script(
+    ...     'utilities/soyuz-sampledata-cleanup.py', args=['--dry-run'])
+
+    >>> print return_code
+    0
+
+    >>> print error
+    INFO ...
+    INFO Done.

=== added file 'utilities/soyuz-sampledata-cleanup.py'
--- utilities/soyuz-sampledata-cleanup.py	1970-01-01 00:00:00 +0000
+++ utilities/soyuz-sampledata-cleanup.py	2010-02-24 12:36:31 +0000
@@ -0,0 +1,316 @@
+#!/usr/bin/python2.5
+# pylint: disable-msg=W0403
+
+# Copyright 2010 Canonical Ltd.  This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+#
+# This code is based on William Grant's make-ubuntu-sane.py script, but
+# reorganized to fit Launchpad coding guidelines, and extended.  The
+# code is included under Canonical copyright with his permission
+# (2010-02-24).
+
+"""Clean up sample data so it will allow Soyuz to run locally.
+
+DO NOT RUN ON PRODUCTION SYSTEMS.  This script deletes lots of
+Ubuntu-related data.
+"""
+
+__metaclass__ = type
+
+import _pythonpath
+
+from optparse import OptionParser
+from os import getenv
+import re
+import sys
+
+from zope.component import getUtility
+from zope.event import notify
+from zope.lifecycleevent import ObjectCreatedEvent
+from zope.security.proxy import removeSecurityProxy
+
+from storm.store import Store
+
+from canonical.database.sqlbase import sqlvalues
+
+from canonical.lp import initZopeless
+
+from canonical.launchpad.interfaces.launchpad import (
+    ILaunchpadCelebrities)
+from canonical.launchpad.scripts import execute_zcml_for_scripts
+from canonical.launchpad.scripts.logger import logger, logger_options
+from canonical.launchpad.webapp.interfaces import (
+    IStoreSelector, MAIN_STORE, SLAVE_FLAVOR)
+
+from lp.registry.interfaces.series import SeriesStatus
+from lp.soyuz.interfaces.component import IComponentSet
+from lp.soyuz.interfaces.section import ISectionSet
+from lp.soyuz.interfaces.sourcepackageformat import (
+    ISourcePackageFormatSelectionSet, SourcePackageFormat)
+from lp.soyuz.model.section import SectionSelection
+from lp.soyuz.model.component import ComponentSelection
+
+
+class DoNotRunOnProduction(Exception):
+    """Error: do not run this script on production (-like) systems."""
+
+
+def get_max_id(store, table_name):
+    """Find highest assigned id in given table."""
+    max_id = store.execute("SELECT max(id) FROM %s" % table_name).get_one()
+    if max_id is None:
+        return None
+    else:
+        return max_id[0]
+
+
+def check_preconditions(options):
+    """Try to ensure that it's safe to run.
+
+    This script must not run on a production server, or anything
+    remotely like it.
+    """
+    store = getUtility(IStoreSelector).get(MAIN_STORE, SLAVE_FLAVOR)
+
+    # Just a guess, but dev systems aren't likely to have ids this high
+    # in this table.  Production data does.
+    real_data = (get_max_id(store, "TranslationMessage") >= 1000000)
+    if real_data and not options.force:
+        raise DoNotRunOnProduction(
+            "Refusing to delete Ubuntu data unless you --force me.")
+
+    # For some configs it's just absolutely clear this script shouldn't
+    # run.  Don't even accept --force there.
+    forbidden_configs = re.compile('(edge|lpnet|production)')
+    current_config = getenv('LPCONFIG', 'an unknown config')
+    if forbidden_configs.match(current_config):
+        raise DoNotRunOnProduction(
+            "I won't delete Ubuntu data on %s and you can't --force me."
+            % current_config)
+
+
+def parse_args(arguments):
+    """Parse command-line arguments.
+
+    :return: (options, args, logger)
+    """
+    parser = OptionParser(
+        description="Delete existing Ubuntu releases and set up new ones.")
+    parser.add_option('-f', '--force', action='store_true', dest='force',
+        help="DANGEROUS: run even if the database looks production-like.")
+    parser.add_option('-n', '--dry-run', action='store_true', dest='dry_run',
+        help="Do not commit changes.")
+    logger_options(parser)
+
+    options, args = parser.parse_args(arguments)
+    return options, args, logger(options)
+
+
+def get_person(name):
+    """Return the person of the given name, from the `IPersonSet` utility."""
+    # Avoid circular import.
+    from lp.registry.interfaces.person import IPersonSet
+    return getUtility(IPersonSet).getByName(name)
+
+
+def retire_series(distribution):
+    """Mark all `DistroSeries` for `distribution` as obsolete."""
+    for series in distribution.series:
+        series.status = SeriesStatus.OBSOLETE
+
+
+def retire_active_publishing_histories(histories, requester):
+    """Retire all active publishing histories in the given collection."""
+    # Avoid circular import.
+    from lp.soyuz.interfaces.publishing import active_publishing_status
+    for history in histories(status=active_publishing_status):
+        history.requestDeletion(
+            requester, "Cleaned up because of missing Librarian files.")
+
+
+def retire_distro_archives(distribution, culprit):
+    """Retire all items in `distribution`'s archives."""
+    for archive in distribution.all_distro_archives:
+        retire_active_publishing_histories(
+            archive.getPublishedSources, culprit)
+        retire_active_publishing_histories(
+            archive.getAllPublishedBinaries, culprit)
+
+
+def retire_ppas(distribution):
+    """Disable all PPAs for `distribution`."""
+    for ppa in distribution.getAllPPAs():
+        removeSecurityProxy(ppa).publish = False
+
+
+def set_lucille_config(distribution):
+    """Set lucilleconfig on all series of `distribution`."""
+    for series in distribution.series:
+        removeSecurityProxy(series).lucilleconfig = '''[publishing]
+components = main restricted universe multiverse'''
+
+
+def create_sections(distroseries):
+    """Set up some sections for `distroseries`."""
+    section_names = (
+        'admin', 'cli-mono', 'comm', 'database', 'devel', 'debug', 'doc',
+        'editors', 'electronics', 'embedded', 'fonts', 'games', 'gnome',
+        'graphics', 'gnu-r', 'gnustep', 'hamradio', 'haskell', 'httpd',
+        'interpreters', 'java', 'kde', 'kernel', 'libs', 'libdevel', 'lisp',
+        'localization', 'mail', 'math', 'misc', 'net', 'news', 'ocaml',
+        'oldlibs', 'otherosfs', 'perl', 'php', 'python', 'ruby', 'science',
+        'shells', 'sound', 'tex', 'text', 'utils', 'vcs', 'video', 'web',
+        'x11', 'xfce', 'zope')
+    store = Store.of(distroseries)
+    for section_name in section_names:
+        section = getUtility(ISectionSet).ensure(section_name)
+        if section not in distroseries.sections:
+            store.add(
+                SectionSelection(distroseries=distroseries, section=section))
+
+
+def create_components(distroseries, uploader):
+    """Set up some components for `distroseries`."""
+    component_names = ('main', 'restricted', 'universe', 'multiverse')
+    store = Store.of(distroseries)
+    main_archive = distroseries.distribution.main_archive
+    for component_name in component_names:
+        component = getUtility(IComponentSet).ensure(component_name)
+        if component not in distroseries.components:
+            store.add(
+                ComponentSelection(
+                    distroseries=distroseries, component=component))
+        main_archive.newComponentUploader(uploader, component)
+        main_archive.newQueueAdmin(uploader, component)
+
+
+def create_series(parent, full_name, version, status):
+    """Set up a `DistroSeries`."""
+    distribution = parent.distribution
+    owner = parent.owner
+    name = full_name.split()[0].lower()
+    title = "The " + full_name
+    displayname = full_name.split()[0]
+    new_series = distribution.newSeries(name=name, title=title,
+        displayname=displayname, summary='Ubuntu %s is good.' % version,
+        description='%s is awesome.' % version, version=version,
+        parent_series=parent, owner=owner)
+    new_series.status = status
+    notify(ObjectCreatedEvent(new_series))
+
+    # This bit copied from scripts/ftpmaster-tools/initialise-from-parent.py.
+    assert new_series.architectures.count() == 0, (
+        "Cannot copy distroarchseries from parent; this series already has "
+        "distroarchseries.")
+
+    store = Store.of(parent)
+    store.execute("""
+        INSERT INTO DistroArchSeries
+        (distroseries, processorfamily, architecturetag, owner, official)
+        SELECT %s, processorfamily, architecturetag, %s, official
+        FROM DistroArchSeries WHERE distroseries = %s
+        """ % sqlvalues(new_series, owner, parent))
+
+    i386 = new_series.getDistroArchSeries('i386')
+    i386.supports_virtualized = True
+    new_series.nominatedarchindep = i386
+
+    new_series.initialiseFromParent()
+    return new_series
+
+
+def create_sample_series(original_series, log):
+    """Set up sample `DistroSeries`.
+
+    :param original_series: The parent for the first new series to be
+        created.  The second new series will have the first as a parent,
+        and so on.
+    """
+    series_descriptions = [
+        ('Dapper Drake', SeriesStatus.SUPPORTED, '6.06'),
+        ('Edgy Eft', SeriesStatus.OBSOLETE, '6.10'),
+        ('Feisty Fawn', SeriesStatus.OBSOLETE, '7.04'),
+        ('Gutsy Gibbon', SeriesStatus.OBSOLETE, '7.10'),
+        ('Hardy Heron', SeriesStatus.SUPPORTED, '8.04'),
+        ('Intrepid Ibex', SeriesStatus.SUPPORTED, '8.10'),
+        ('Jaunty Jackalope', SeriesStatus.SUPPORTED, '9.04'),
+        ('Karmic Koala', SeriesStatus.CURRENT, '9.10'),
+        ('Lucid Lynx', SeriesStatus.DEVELOPMENT, '10.04'),
+        ]
+
+    parent = original_series
+    for full_name, status, version in series_descriptions:
+        log.info('Creating %s...' % full_name)
+        parent = create_series(parent, full_name, version, status)
+
+
+def clean_up(distribution, log):
+    # First we eliminate all active publishings in the Ubuntu main archives.
+    # None of the librarian files exist, so it kills the publisher.
+
+    # Could use IPublishingSet.requestDeletion() on the published sources to
+    # get rid of the binaries too, but I don't trust that there aren't
+    # published binaries without corresponding sources.
+
+    log.info("Deleting all items in official archives...")
+    retire_distro_archives(distribution, get_person('name16'))
+
+    # Disable publishing of all PPAs, as they probably have broken
+    # publishings too.
+    log.info("Disabling all PPAs...")
+    retire_ppas(distribution)
+
+    retire_series(distribution)
+
+
+def set_source_package_format(distroseries):
+    """Register a series' source package format selection."""
+    utility = getUtility(ISourcePackageFormatSelectionSet)
+    format = SourcePackageFormat.FORMAT_1_0
+    if utility.getBySeriesAndFormat(distroseries, format) is None:
+        utility.add(distroseries, format)
+
+
+def populate(distribution, parent_series_name, uploader_name, log):
+    """Set up sample data on `distribution`."""
+    parent_series = distribution.getSeries(parent_series_name)
+
+    # Set up lucilleconfig on all series.  The sample data lacks this.
+    log.info("Setting lucilleconfig...")
+    set_lucille_config(distribution)
+
+    log.info("Configuring sections...")
+    create_sections(parent_series)
+
+    log.info("Configuring components and permissions...")
+    create_components(parent_series, get_person(uploader_name))
+
+    set_source_package_format(parent_series)
+
+    create_sample_series(parent_series, log)
+
+
+def main(argv):
+    options, args, log = parse_args(argv[1:])
+
+    execute_zcml_for_scripts()
+    txn = initZopeless(dbuser='launchpad')
+
+    check_preconditions(options)
+
+    ubuntu = getUtility(ILaunchpadCelebrities).ubuntu
+    clean_up(ubuntu, log)
+
+    # Use Hoary as the root, as Breezy and Grumpy are broken.
+    populate(ubuntu, 'hoary', 'ubuntu-team', log)
+
+    if options.dry_run:
+        txn.abort()
+    else:
+        txn.commit()
+
+    log.info("Done.")
+
+
+if __name__ == "__main__":
+    main(sys.argv)
= Bug 526462 =
To run Soyuz locally on a development machine, one needs to run a few scripts that are not being maintained, and are kept outside of the Launchpad source tree. See https://dev.launchpad.net/Soyuz/HowToUseSoyuzLocally
This branch takes the make-ubuntu-sane.py script from that page and integrates it into the Launchpad source tree as utilities/soyuz-sampledata-cleanup.py. The original was written by William Grant, but I discussed with him on IRC today and he has no objection to this being included under Canonical's copyright with a "based on code by wgrant" notice.
As you'll see, the script replaces the existing Ubuntu test series in the playground sample data with a series mirroring real-world Ubuntu releases. This is meant for the playground sample data that you would run on your local machine while testing manually. We discussed updating the sample data as included in the source code, but that would involve uploading tarballs to the local librarian which would have complicated the job and bloated the branch.
Since the script is not meant to run on production databases, it attempts to check for that condition. Our procedures should prevent this, however, so it's a pair of suspenders in addition to the belt we already have. There is no guarantee that the check will prevent any attempt at disastrously wrong use, and I don't think it's worth the hassle of automated testing. If deviating from the "happy path" in this way should break the script, then the script is still dealing with the situation properly: die before it deletes any Ubuntu data.
A doctest now verifies that the script executes properly. The test would force a dirty database and run into test isolation problems, were it not for the --dry-run option (not present in the original script), which rolls back all changes to the database.
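The dry-run idiom the doctest relies on, run all the destructive work inside one transaction and abort it instead of committing, can be sketched independently of Launchpad. Here sqlite3 stands in for the initZopeless transaction manager the real script uses, and the table and column are made up for illustration.

```python
import sqlite3


def destructive_cleanup(conn, dry_run=False):
    """Do destructive work in one transaction; a dry run rolls it back.

    Mirrors the txn.abort() / txn.commit() choice at the end of the
    real script's main().
    """
    conn.execute("DELETE FROM sampledata WHERE obsolete = 1")
    if dry_run:
        conn.rollback()   # txn.abort() in the real script
    else:
        conn.commit()     # txn.commit()


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sampledata (id INTEGER, obsolete INTEGER)")
conn.execute("INSERT INTO sampledata VALUES (1, 1)")
conn.commit()

destructive_cleanup(conn, dry_run=True)   # the row survives a dry run
destructive_cleanup(conn, dry_run=False)  # and is gone after a real run
```

This works because the entire cleanup happens in a single transaction; a script that committed partway through could not offer a safe dry-run mode this way.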
No lint. To Q/A, just use it on your local dev system. To test:
{{{
./bin/test -vv -t sampledata-cleanup
}}}
Jeroen