Merge lp:~jtv/launchpad/db-bug-752279 into lp:launchpad/db-devel

Proposed by Jeroen T. Vermeulen
Status: Merged
Approved by: Jeroen T. Vermeulen
Approved revision: no longer in the source branch.
Merged at revision: 10430
Proposed branch: lp:~jtv/launchpad/db-bug-752279
Merge into: lp:launchpad/db-devel
Diff against target: 826 lines (+281/-255)
5 files modified
cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases (+0/-1)
cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings (+1/-1)
cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt (+2/-2)
lib/lp/archivepublisher/scripts/publish_ftpmaster.py (+181/-126)
lib/lp/archivepublisher/tests/test_publish_ftpmaster.py (+97/-125)
To merge this branch: bzr merge lp:~jtv/launchpad/db-bug-752279
Reviewer Review Type Date Requested Status
William Grant code* Approve
Review via email: mp+57277@code.launchpad.net

Commit message

[r=wgrant][bug=752279,754453] Simplify and harden directory movements in publish-ftpmaster.

Description of the change

= Summary =

The cron.publish-ftpmaster script, which we're replacing with a Python rewrite, handles "dists" directories in quite a complicated way. Trying to mimic this shuffle in the Python script led to various bugs. The "happy path" (the least complicated one to analyze) shuffles them around like so: http://people.canonical.com/~jtv/publish-ftparchive-dists.png

What actually goes on is not all that difficult. Two directories are perpetually maintained as copies of each other, using hard-linking rsync, with one holding the "currently published" state of a distribution and the other (the "backup" dists directory) a slightly outdated copy. The latter is effectively used as a working directory to create a new version of the current one, but by maintaining it as a reasonably complete copy, the initial rsync that populates the working directory can be kept nice and fast.
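Concretely, refreshing the backup copy comes down to one rsync invocation per archive. A minimal sketch of how that command line is composed (the helper name and example paths are illustrative, not part of the branch):

```python
def compose_rsync_command(dists, backup_dists):
    """Compose the rsync call that refreshes the backup dists copy.

    -a preserves permissions and timestamps, -H preserves hard links,
    and --delete clears out files that no longer exist in the source
    (bug 58835).  The trailing slash on the source path makes rsync
    copy the directory's contents rather than the directory itself.
    """
    return ["rsync", "-aH", "--delete", dists + "/", backup_dists]
```

In practice this would be handed to something like subprocess.check_call; the script itself runs the equivalent command through a shell.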

With this clearer picture in hand, I simplified the directory dance to this one: http://people.canonical.com/~jtv/simplified-publish-ftparchive-dists.png

Actually that's not entirely truthful: at one point we need the backup dists directory to be in the archive root directory, where it doesn't normally live. So we still need to move it temporarily, but with a few changes:

 * The initial rsync is done before moving anything.

 * Only very small, isolated stretches of code move dists directories to this temporary location.

 * The script recognizes and recovers dists directories in these temporary locations.

 * At most one directory is in a transient state at one time.

 * Re-using the same temporary locations for different needs reduces the recovery to one simple loop.
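The recovery mentioned in the last bullet can be sketched as a single loop, modelled on the branch's recoverWorkingDists (the directory-name suffixes match the branch; the helper itself is illustrative):

```python
import os


def recover_working_dists(archive_roots):
    """Move any dists directory left at the transient working
    location back to its permanent backup location.

    A working dists only exists while publish-distro is running, so
    finding one here means a previous run died mid-way.  The backup
    location is free at that point, since that is where the working
    directory came from.
    """
    for root in archive_roots:
        working = os.path.join(root, "dists.in-progress")
        backup = os.path.join(root + "-distscopy", "dists")
        if os.path.isdir(working):
            os.rename(working, backup)
```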

Lots more has changed. I introduced a --post-rsync option that copies the current dists directory to the backup copy at the end, so that they'll be identical. This will speed up the initial rsync on the next run, and so help reduce the turnaround time for security updates. The cost is more work overall, but at least some of it will be out of the critical path.

== Tests ==

This is a massively slow test, and some day we'll want to try to speed it up. Right now, however, it's the only test we have for an absolutely vital piece of infrastructure, so I largely ignored performance considerations.

{{{
./bin/test -vvc lp.archivepublisher.tests.test_publish_ftpmaster
}}}

== Demo and Q/A ==

With this in place, we can do a new round of full Q/A on the new publish-ftpmaster script. This includes verifying that the run-parts plugin scripts all work properly (insofar as that can be verified on test servers).

= Launchpad lint =

Checking for conflicts and issues in changed files.

Linting changed files:
  cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt
  cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings
  lib/lp/archivepublisher/scripts/publish_ftpmaster.py
  cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases
  lib/lp/archivepublisher/tests/test_publish_ftpmaster.py

Jeroen

Revision history for this message
William Grant (wgrant) wrote :

Looks like a great simplification. A few nitpicks:

45 +def get_backup_dists(archive_config):
46 + """Return the path of an archive's backup dists directory."""
47 + return os.path.join(archive_config.archiveroot + "-distscopy", "dists")

Is archiveroot guaranteed to not have a trailing separator?

95 + The publishable files are kept in the filesystem. Most of the work
96 + is done in a working "dists" directory in each archive's dists copy
97 + root, which then replaces the current "dists" in the archive root.

Isn't that false now? Work is done in archiveroot, not distscopyroot.

331 + For each archive, this switches the dists directory and the
332 + backup dists directory around.

Lies! It rotates them through the three dirs.

409 + def publish(self, security_only=False):
410 + """Do the main publishing work.
[...]

Could the except/recoverWorkingDists/raise be a finally instead? It also seems like rsyncBackupDists might belong in this method, as installDists is there.

513 + """Write a marker file for checking direction movements.

s/direction/directory/?

review: Approve (code*)
Revision history for this message
Jeroen T. Vermeulen (jtv) wrote :

Thanks! You raise some good questions here.

> 45 +def get_backup_dists(archive_config):
> 46 + """Return the path of an archive's backup dists directory."""
> 47 + return os.path.join(archive_config.archiveroot + "-distscopy",
> "dists")
>
> Is archiveroot guaranteed to not have a trailing separator?

Internally, Launchpad both constructs archiveroot in a way that can't produce a trailing separator, and relies on there not being one in the same way that this code does.
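A quick demonstration of what a trailing separator would do to that concatenation (paths are made up):

```python
import os

# Without a trailing separator, the "-distscopy" suffix names a
# sibling of the archive root, which is what get_backup_dists wants:
assert os.path.join("/srv/ubuntu" + "-distscopy", "dists") == \
    "/srv/ubuntu-distscopy/dists"

# With one, the suffix would name a child of the archive root
# instead -- a different, wrong location:
assert os.path.join("/srv/ubuntu/" + "-distscopy", "dists") == \
    "/srv/ubuntu/-distscopy/dists"
```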

> 95 + The publishable files are kept in the filesystem. Most of the
> work
> 96 + is done in a working "dists" directory in each archive's dists
> copy
> 97 + root, which then replaces the current "dists" in the archive
> root.
>
> Isn't that false now? Work is done in archiveroot, not distscopyroot.

You are correct. This text predated that change.

> 331 + For each archive, this switches the dists directory and the
> 332 + backup dists directory around.
>
> Lies! It rotates them through the three dirs.

That is implementation, and therefore suitable for a comment. This, however, is docstring and switching the two around is what the method does for the caller.
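For reference, the rotation under discussion is three renames that, from the caller's point of view, simply swap two directories (a sketch; the function name is illustrative, the rename sequence matches the branch):

```python
import os


def swap_dists(dists, backup_dists, working):
    """Swap the current and backup dists directories.

    The swap rotates through a transient third location so that at
    most one directory is ever away from a permanent location; if the
    process dies mid-swap, the recovery loop knows where to look.
    """
    os.rename(dists, working)         # current -> transient
    os.rename(backup_dists, dists)    # backup becomes current
    os.rename(working, backup_dists)  # old current becomes backup
```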

> 409 + def publish(self, security_only=False):
> 410 + """Do the main publishing work.
> [...]
>
> Could the except/recoverWorkingDists/raise be a finally instead? It also seems
> like rsyncBackupDists might belong in this method, as installDists is there.

It could be, but the advantage of these working directories is that they are transient. The recoverWorkingDists method is for failure paths only. The normal execution path already has a "finally" cleanup. I considered unifying the two, but that would mean either postponing the cleanup (turning the temporary directories back into state that needs managing) or hiding the "rename" cleanup that matches the opening "rename" in a less symmetric, and much less precise, call to recoverWorkingDists.
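A toy sketch of the distinction: with except/re-raise, the recovery step runs only on the failure path, whereas a finally would also run it after a clean pass, where there is nothing left to recover (names are illustrative):

```python
def publish(work, recover):
    """Run the publishing work, recovering transient state on failure.

    A ``finally`` clause would invoke ``recover`` on success too, but
    by then the working directories have already been renamed back,
    so the recovery would be a misleading no-op at best.
    """
    try:
        work()
    except Exception:
        recover()
        raise
```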

> 513 + """Write a marker file for checking direction movements.
>
> s/direction/directory/?

s.

Jeroen

Preview Diff

1=== modified file 'cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases'
2--- cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases 2011-04-08 18:29:54 +0000
3+++ cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases 2011-04-13 10:26:57 +0000
4@@ -6,7 +6,6 @@
5 GNUPGHOME=/srv/launchpad.net/ubuntu-archive/gnupg-home
6
7 RELEASE_FILES=`find $DISTSROOT -maxdepth 2 -name Release`
8-
9 DIST_UPGRADER_TARBALLS=`
10 find $DISTSROOT/*/*/dist-upgrader* -name "*.tar.gz" || true`
11
12
13=== modified file 'cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings'
14--- cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings 2011-04-07 06:30:02 +0000
15+++ cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings 2011-04-13 10:26:57 +0000
16@@ -8,4 +8,4 @@
17 # It's safe to do this since the uncompressed MD5 hashes have already been
18 # computed for inclusion in the Release files.
19
20-find $DISTSROOT.new \( -name -o -name Sources \) -exec rm -f -- "{}" \;
21+find $DISTSROOT \( -name Packages -o -name Sources \) -exec rm -f -- "{}" \;
22
23=== modified file 'cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt'
24--- cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt 2011-04-08 18:59:20 +0000
25+++ cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt 2011-04-13 10:26:57 +0000
26@@ -15,8 +15,8 @@
27 ARCHIVEROOT - the archive's root directory
28 (e.g. /srv/launchpad.net/ubuntu-archive/ubuntu/ )
29
30-DISTSROOT - the archive's dists root directory
31-(e.g. /srv/launchpad.net/ubuntu-archive/ubuntu/dists )
32+DISTSROOT - a working copy of the archive's dists root directory
33+(e.g. /srv/launchpad.net/ubuntu-archive/ubuntu/dists.new )
34
35 OVERRIDEROOT - the archive's overrides root directory (primary archive only)
36 (e.g. /srv/launchpad.net/ubuntu-overrides, or the empty string for partner)
37
38=== modified file 'lib/lp/archivepublisher/scripts/publish_ftpmaster.py'
39--- lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-04-08 18:59:20 +0000
40+++ lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-04-13 10:26:57 +0000
41@@ -71,6 +71,29 @@
42 return ' '.join(['='.join(pair) for pair in env.iteritems()])
43
44
45+def get_backup_dists(archive_config):
46+ """Return the path of an archive's backup dists directory."""
47+ return os.path.join(archive_config.archiveroot + "-distscopy", "dists")
48+
49+
50+def get_dists(archive_config):
51+ """Return the path of an archive's dists directory.
52+
53+ :param archive_config: Configuration for the archive in question.
54+ """
55+ return archive_config.distsroot
56+
57+
58+def get_working_dists(archive_config):
59+ """Return the working path for an archive's dists directory.
60+
61+ In order for publish-distro to operate on an archive, its dists
62+ directory must be in the archive root. So we move the backup
63+ dists directory to a working location below the archive root just
64+ for publish-distro. This method composes the temporary path.
65+ """
66+ return get_dists(archive_config) + ".in-progress"
67+
68 def extend_PATH():
69 """Produce env dict for extending $PATH.
70
71@@ -83,11 +106,6 @@
72 return {"PATH": '"$PATH":%s' % shell_quote(scripts_dir)}
73
74
75-def get_distscopyroot(archive_config):
76- """Return the distscopy root directory for `archive_config`."""
77- return archive_config.archiveroot + "-distscopy"
78-
79-
80 class StoreArgument:
81 """Helper class: receive argument and store it."""
82
83@@ -116,21 +134,54 @@
84
85
86 class PublishFTPMaster(LaunchpadCronScript):
87- """Publish a distro (update)."""
88-
89- # Has the publication been done? This indicates that the distsroots
90- # have been replaced with newly generated ones. It has implications
91- # for cleanup.
92- done_pub = False
93+ """Publish a distro (update).
94+
95+ The publishable files are kept in the filesystem. Most of the work
96+ is done in a working "dists" directory, which then replaces the
97+ current "dists" in the archive root.
98+
99+ For performance reasons, the old "dists" is not discarded. It is
100+ kept as the dists-copy version for the next run. Its contents
101+ don't matter in detail; an rsync updates it based on the currently
102+ published dists directory before we start working with it.
103+
104+ At the end of one pass of the script, the "dists" directory in the
105+ archive root and its backup copy in the dists-copy root will have
106+ traded places.
107+
108+ However the script normally does 2 passes: once just for security
109+ updates, to expedite publication of critical fixes, and once for the
110+ whole distribution. At the end of this, the directories will be
111+ back in their original places (though with updated contents).
112+ """
113
114 def add_my_options(self):
115+ """See `LaunchpadScript`."""
116 self.parser.add_option(
117 '-d', '--distribution', dest='distribution', default=None,
118 help="Distribution to publish.")
119 self.parser.add_option(
120+ '-p', '--post-rsync', dest='post_rsync', action='store_true',
121+ default=False,
122+ help="When done, rsync backup dists to speed up the next run.")
123+ self.parser.add_option(
124 '-s', '--security-only', dest='security_only',
125 action='store_true', default=False, help="Security upload only.")
126
127+ def processOptions(self):
128+ """Handle command-line options.
129+
130+ Sets `self.distribution` to the `Distribution` to publish.
131+ """
132+ if self.options.distribution is None:
133+ raise LaunchpadScriptFailure("Specify a distribution.")
134+
135+ self.distribution = getUtility(IDistributionSet).getByName(
136+ self.options.distribution)
137+ if self.distribution is None:
138+ raise LaunchpadScriptFailure(
139+ "Distribution %s not found." % self.options.distribution)
140+
141 def executeShell(self, command_line, failure=None):
142 """Run `command_line` through a shell.
143
144@@ -161,7 +212,7 @@
145 for archive in self.distribution.all_distro_archives
146 if archive.purpose in ARCHIVES_TO_PUBLISH]
147
148- def makeConfigs(self):
149+ def getConfigs(self):
150 """Set up configuration objects for archives to be published.
151
152 The configs dict maps the archive purposes that are relevant for
153@@ -171,24 +222,6 @@
154 (archive.purpose, getPubConfig(archive))
155 for archive in self.archives)
156
157- def cleanUp(self):
158- """Post-publishing cleanup."""
159- self.logger.debug("Cleaning up.")
160- for purpose, archive_config in self.configs.iteritems():
161- self.logger.debug(
162- "Moving %s dists backup to safe keeping for next time.",
163- purpose.title)
164- distscopyroot = get_distscopyroot(archive_config)
165- dists = os.path.join(distscopyroot, "dists")
166- if self.done_pub:
167- replacement_dists = archive_config.distsroot + ".old"
168- else:
169- replacement_dists = archive_config.distsroot + ".new"
170- if file_exists(replacement_dists):
171- self.logger.debug(
172- "Renaming %s to %s.", replacement_dists, dists)
173- os.rename(replacement_dists, dists)
174-
175 def processAccepted(self):
176 """Run the process-accepted script."""
177 self.logger.debug(
178@@ -212,50 +245,54 @@
179 suites = self.getDirtySuites()
180 return [suite for suite in suites if suite.endswith('-security')]
181
182- def rsyncNewDists(self, archive_purpose):
183- """Populate dists.new with a copy of distsroot.
184+ def rsyncBackupDists(self):
185+ """Populate the backup dists with a copy of distsroot.
186
187 Uses "rsync -aH --delete" so that any obsolete files that may
188- still be in dists.new are cleaned up (bug 58835).
189+ still be in the backup dists are cleaned out (bug 58835).
190
191 :param archive_purpose: The (purpose of the) archive to copy.
192 """
193- archive_config = self.configs[archive_purpose]
194- self.executeShell(
195- "rsync -aH --delete '%s/' '%s/dists.new'"
196- % (archive_config.distsroot, archive_config.archiveroot),
197- failure=LaunchpadScriptFailure(
198- "Failed to rsync dists.new for %s." % archive_purpose.title))
199+ for purpose, archive_config in self.configs.iteritems():
200+ dists = get_dists(archive_config)
201+ backup_dists = get_backup_dists(archive_config)
202+ self.executeShell(
203+ "rsync -aH --delete '%s/' '%s'" % (dists, backup_dists),
204+ failure=LaunchpadScriptFailure(
205+ "Failed to rsync new dists for %s." % purpose.title))
206+
207+ def recoverWorkingDists(self):
208+ """Look for and recover any dists left in transient working state.
209+
210+ An archive's dists directory is temporarily moved into the
211+ archive root for running publish-distro. If a previous script
212+ run died while in this state, restore the directory to its
213+ permanent location.
214+ """
215+ for archive_config in self.configs.itervalues():
216+ working_location = get_working_dists(archive_config)
217+ if file_exists(working_location):
218+ self.logger.info(
219+ "Recovering working directory %s from failed run.",
220+ working_location)
221+ os.rename(working_location, get_backup_dists(archive_config))
222
223 def setUpDirs(self):
224- """Copy the dists tree ready for publishing into.
225-
226- We do this so that we don't get an inconsistent dists tree at
227- any point during the publishing cycle (which would cause buildds
228- to explode).
229-
230- This is now done through maintaining a persistent backup copy of
231- the dists directory, which we move into place and bring up to
232- date with rsync. Should achieve the same effect as copying, but
233- faster.
234- """
235- for archive_config in self.configs.itervalues():
236+ """Create archive roots and such if they did not yet exist."""
237+ for archive_purpose, archive_config in self.configs.iteritems():
238 archiveroot = archive_config.archiveroot
239 if not file_exists(archiveroot):
240 self.logger.debug("Creating archive root %s.", archiveroot)
241 os.makedirs(archiveroot)
242- distsroot = archive_config.distsroot
243- if not file_exists(distsroot):
244- self.logger.debug("Creating dists root %s.", distsroot)
245- os.makedirs(distsroot)
246-
247- for purpose, archive_config in self.configs.iteritems():
248- dists = os.path.join(get_distscopyroot(archive_config), "dists")
249- dists_new = os.path.join(archive_config.archiveroot, "dists.new")
250+ dists = get_dists(archive_config)
251 if not file_exists(dists):
252+ self.logger.debug("Creating dists root %s.", dists)
253 os.makedirs(dists)
254- os.rename(dists, dists_new)
255- self.rsyncNewDists(purpose)
256+ distscopy = get_backup_dists(archive_config)
257+ if not file_exists(distscopy):
258+ self.logger.debug(
259+ "Creating backup dists directory %s", distscopy)
260+ os.makedirs(distscopy)
261
262 def publishDistroArchive(self, archive, security_suites=None):
263 """Publish the results for an archive.
264@@ -265,13 +302,19 @@
265 the publishing to.
266 """
267 purpose = archive.purpose
268+ archive_config = self.configs[purpose]
269 self.logger.debug(
270 "Publishing the %s %s...", self.distribution.name, purpose.title)
271- archive_config = self.configs[purpose]
272+
273+ # For reasons unknown, publishdistro only seems to work with a
274+ # directory that's inside the archive root. So we move it there
275+ # for the duration.
276+ temporary_dists = get_working_dists(archive_config)
277+
278 arguments = [
279 '-v', '-v',
280 '-d', self.distribution.name,
281- '-R', archive_config.distsroot + '.new',
282+ '-R', temporary_dists,
283 ]
284
285 if archive.purpose == ArchivePurpose.PARTNER:
286@@ -282,8 +325,14 @@
287
288 parser = OptionParser()
289 publishdistro.add_options(parser)
290- options, args = parser.parse_args(arguments)
291- publishdistro.run_publisher(options, txn=self.txn, log=self.logger)
292+
293+ os.rename(get_backup_dists(archive_config), temporary_dists)
294+ try:
295+ options, args = parser.parse_args(arguments)
296+ publishdistro.run_publisher(
297+ options, txn=self.txn, log=self.logger)
298+ finally:
299+ os.rename(temporary_dists, get_backup_dists(archive_config))
300
301 self.runPublishDistroParts(archive)
302
303@@ -292,41 +341,31 @@
304 archive_config = self.configs[archive.purpose]
305 env = {
306 'ARCHIVEROOT': shell_quote(archive_config.archiveroot),
307- 'DISTSROOT': shell_quote(archive_config.distsroot + ".new"),
308+ 'DISTSROOT': shell_quote(get_backup_dists(archive_config)),
309 }
310 if archive_config.overrideroot is not None:
311 env["OVERRIDEROOT"] = shell_quote(archive_config.overrideroot)
312 self.runParts('publish-distro.d', env)
313
314 def installDists(self):
315- """Put the new dists into place, as near-atomically as possible."""
316- # Before we start moving directories around, make as nearly
317- # sure as possible that we can do either none or all of them.
318- self.logger.debug("Looking for impediments to publication.")
319- for purpose, archive_config in self.configs.iteritems():
320- old_distsroot = archive_config.distsroot + '.old'
321- if file_exists(old_distsroot):
322- raise LaunchpadScriptFailure(
323- "Old %s distsroot %s is in the way."
324- % (purpose.title, old_distsroot))
325+ """Put the new dists into place, as near-atomically as possible.
326
327- # Move the existing distsroots out of the way, and move the new
328- # ones in their place.
329- self.logger.debug("Placing the new dists into place...")
330+ For each archive, this switches the dists directory and the
331+ backup dists directory around.
332+ """
333+ self.logger.debug("Moving the new dists into place...")
334 for archive_config in self.configs.itervalues():
335- distsroot = archive_config.distsroot
336- os.rename(distsroot, distsroot + ".old")
337- os.rename(distsroot + ".new", distsroot)
338-
339- # Yay, we did it! Mark the fact because it makes a difference
340- # to the cleanup procedure.
341- self.done_pub = True
342-
343- for purpose, archive_config in self.configs.iteritems():
344- dists = os.path.join(get_distscopyroot(archive_config), "dists")
345- self.logger.debug(
346- "Renaming old %s distsroot to %s." % (purpose.title, dists))
347- os.rename(archive_config.distsroot + ".old", dists)
348+ # Use the dists "working location" as a temporary place to
349+ # move the current dists out of the way for the switch. If
350+ # we die in this state, the next run will know to move the
351+ # temporary directory to the backup location.
352+ dists = get_dists(archive_config)
353+ temp_dists = get_working_dists(archive_config)
354+ backup_dists = get_backup_dists(archive_config)
355+
356+ os.rename(dists, temp_dists)
357+ os.rename(backup_dists, dists)
358+ os.rename(temp_dists, backup_dists)
359
360 def runCommercialCompat(self):
361 """Generate the -commercial pocket.
362@@ -371,20 +410,6 @@
363 "find '%s' -type d -empty | xargs -r rmdir"
364 % archive_config.archiveroot)
365
366- def processOptions(self):
367- """Handle command-line options.
368-
369- Sets `self.distribution` to the `Distribution` to publish.
370- """
371- if self.options.distribution is None:
372- raise LaunchpadScriptFailure("Specify a distribution.")
373-
374- self.distribution = getUtility(IDistributionSet).getByName(
375- self.options.distribution)
376- if self.distribution is None:
377- raise LaunchpadScriptFailure(
378- "Distribution %s not found." % self.options.distribution)
379-
380 def runParts(self, parts, env):
381 """Execute run-parts.
382
383@@ -422,14 +447,9 @@
384 if len(security_suites) == 0:
385 self.logger.debug("Nothing to do for security publisher.")
386 return
387- partner_archive = self.distribution.getArchive("partner")
388- if partner_archive is not None:
389- self.publishDistroArchive(partner_archive)
390+
391 self.publishDistroArchive(
392 self.distribution.main_archive, security_suites=security_suites)
393- self.installDists()
394- self.runCommercialCompat()
395- self.runFinalizeParts(security_only=True)
396
397 def publishAllUploads(self):
398 """Publish the distro's complete uploads."""
399@@ -439,26 +459,61 @@
400 # most of its time.
401 self.publishDistroArchive(archive)
402
403- self.installDists()
404- self.runCommercialCompat()
405- self.generateListings()
406- self.clearEmptyDirs()
407- self.runFinalizeParts()
408+ def publish(self, security_only=False):
409+ """Do the main publishing work.
410+
411+ :param security_only: If True, limit publication to security
412+ updates on the main archive. This is much faster, so it
413+ makes sense to do a security-only run before the main
414+ event to expedite critical fixes.
415+ """
416+ try:
417+ if security_only:
418+ self.publishSecurityUploads()
419+ else:
420+ self.publishAllUploads()
421+
422+ # Swizzle the now-updated backup dists and the current dists
423+ # around.
424+ self.installDists()
425+ except:
426+ # If we failed here, there's a chance that we left a
427+ # working dists directory in its temporary location. If so,
428+ # recover it. The next script run would do that anyway, but
429+ # doing it now is easier on admins trying to recover from
430+ # system problems.
431+ self.recoverWorkingDists()
432+ raise
433
434 def setUp(self):
435 """Process options, and set up internal state."""
436 self.processOptions()
437 self.archives = self.getArchives()
438- self.configs = self.makeConfigs()
439+ self.configs = self.getConfigs()
440
441 def main(self):
442 """See `LaunchpadScript`."""
443 self.setUp()
444- try:
445- self.processAccepted()
446- self.setUpDirs()
447- self.publishSecurityUploads()
448- if not self.options.security_only:
449- self.publishAllUploads()
450- finally:
451- self.cleanUp()
452+ self.recoverWorkingDists()
453+ self.processAccepted()
454+ self.setUpDirs()
455+
456+ self.rsyncBackupDists()
457+ self.publish(security_only=True)
458+ self.runCommercialCompat()
459+ self.runFinalizeParts(security_only=True)
460+
461+ if not self.options.security_only:
462+ self.rsyncBackupDists()
463+ self.publish(security_only=False)
464+ self.runCommercialCompat()
465+ self.generateListings()
466+ self.clearEmptyDirs()
467+ self.runFinalizeParts(security_only=False)
468+
469+ if self.options.post_rsync:
470+ # Update the backup dists with the published changes. The
471+ # initial rsync on the next run will not need to make any
472+ # changes, and so it'll take the next run a little less
473+ # time to publish its security updates.
474+ self.rsyncBackupDists()
475
476=== modified file 'lib/lp/archivepublisher/tests/test_publish_ftpmaster.py'
477--- lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-04-08 18:29:54 +0000
478+++ lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-04-13 10:26:57 +0000
479@@ -6,6 +6,7 @@
480 __metaclass__ = type
481
482 from apt_pkg import TagFile
483+import logging
484 import os
485 from textwrap import dedent
486 from zope.component import getUtility
487@@ -22,7 +23,10 @@
488 PackagePublishingPocket,
489 pocketsuffix,
490 )
491-from lp.services.log.logger import DevNullLogger
492+from lp.services.log.logger import (
493+ BufferLogger,
494+ DevNullLogger,
495+ )
496 from lp.services.scripts.base import LaunchpadScriptFailure
497 from lp.services.utils import file_exists
498 from lp.soyuz.enums import (
499@@ -33,6 +37,7 @@
500 compose_env_string,
501 compose_shell_boolean,
502 find_run_parts_dir,
503+ get_working_dists,
504 PublishFTPMaster,
505 shell_quote,
506 )
507@@ -80,6 +85,26 @@
508 return os.path.join("cronscripts", "publishing", "distro-parts")
509
510
511+def write_marker_file(path, contents):
512+ """Write a marker file for checking directory movements.
513+
514+ :param path: A list of path components.
515+ :param contents: Text to write into the file.
516+ """
517+ marker = file(os.path.join(*path), "w")
518+ marker.write(contents)
519+ marker.flush()
520+ marker.close()
521+
522+
523+def read_marker_file(path):
524+ """Read the contents of a marker file.
525+
526+ :param return: Contents of the marker file.
527+ """
528+ return file(os.path.join(*path)).read()
529+
530+
531 class HelpersMixin:
532 """Helpers for the PublishFTPMaster tests."""
533
534@@ -205,11 +230,11 @@
535 self.setUpForScriptRun(ubuntu)
536 return ubuntu
537
538- def makeScript(self, distro=None):
539+ def makeScript(self, distro=None, extra_args=[]):
540 """Produce instance of the `PublishFTPMaster` script."""
541 if distro is None:
542 distro = self.makeDistro()
543- script = PublishFTPMaster(test_args=["-d", distro.name])
544+ script = PublishFTPMaster(test_args=["-d", distro.name] + extra_args)
545 script.txn = self.layer.txn
546 script.logger = DevNullLogger()
547 return script
548@@ -220,24 +245,6 @@
549 self.assertEqual(1, len(sections))
550 return dict(sections[0])
551
552- def writeMarkerFile(self, path, contents):
553- """Write a marker file for checking direction movements.
554-
555- :param path: A list of path components.
556- :param contents: Text to write into the file.
557- """
558- marker = file(os.path.join(*path), "w")
559- marker.write(contents)
560- marker.flush()
561- marker.close()
562-
563- def readMarkerFile(self, path):
564- """Read the contents of a marker file.
565-
566- :param return: Contents of the marker file.
567- """
568- return file(os.path.join(*path)).read()
569-
570 def enableCommercialCompat(self):
571 """Enable commercial-compat.sh runs for the duration of the test."""
572 config.push("commercial-compat", dedent("""\
573@@ -349,41 +356,6 @@
574 self.assertEqual(distro.displayname, main_release["Label"])
575 self.assertEqual("source", main_release["Architecture"])
576
577- def test_cleanup_moves_dists_to_new_if_not_published(self):
578- distro = self.makeDistro()
579- pub_config = get_pub_config(distro)
580- dists_root = get_dists_root(pub_config)
581- dists_copy_root = get_distscopy_root(pub_config)
582- new_distsroot = dists_root + ".new"
583- os.makedirs(new_distsroot)
584- self.writeMarkerFile([new_distsroot, "marker"], "dists.new")
585- os.makedirs(dists_copy_root)
586-
587- script = self.makeScript(distro)
588- script.setUp()
589- script.cleanUp()
590- self.assertEqual(
591- "dists.new",
592- self.readMarkerFile([dists_copy_root, "dists", "marker"]))
593-
594- def test_cleanup_moves_dists_to_old_if_published(self):
595- distro = self.makeDistro()
596- pub_config = get_pub_config(distro)
597- dists_root = get_dists_root(pub_config)
598- old_distsroot = dists_root + ".old"
599- dists_copy_root = get_distscopy_root(pub_config)
600- os.makedirs(old_distsroot)
601- self.writeMarkerFile([old_distsroot, "marker"], "dists.old")
602- os.makedirs(dists_copy_root)
603-
604- script = self.makeScript(distro)
605- script.setUp()
606- script.done_pub = True
607- script.cleanUp()
608- self.assertEqual(
609- "dists.old",
610- self.readMarkerFile([dists_copy_root, "dists", "marker"]))
611-
612 def test_getDirtySuites_returns_suite_with_pending_publication(self):
613 spph = self.factory.makeSourcePackagePublishingHistory()
614 script = self.makeScript(spph.distroseries.distribution)
615@@ -446,24 +418,26 @@
616 script = self.makeScript(distro)
617 script.setUp()
618 dists_root = get_dists_root(get_pub_config(distro))
619+ dists_backup = os.path.join(
620+ get_distscopy_root(get_pub_config(distro)), "dists")
621+ os.makedirs(dists_backup)
622 os.makedirs(dists_root)
623- os.makedirs(dists_root + ".new")
624- self.writeMarkerFile([dists_root, "new-file"], "New file")
625- script.rsyncNewDists(ArchivePurpose.PRIMARY)
626+ write_marker_file([dists_root, "new-file"], "New file")
627+ script.rsyncBackupDists()
628 self.assertEqual(
629- "New file",
630- self.readMarkerFile([dists_root + ".new", "new-file"]))
631+ "New file", read_marker_file([dists_backup, "new-file"]))
632
633 def test_rsync_cleans_up_obsolete_files(self):
634 distro = self.makeDistro()
635 script = self.makeScript(distro)
636 script.setUp()
637- dists_root = get_dists_root(get_pub_config(distro))
638- os.makedirs(dists_root)
639- os.makedirs(dists_root + ".new")
640- old_file = [dists_root + ".new", "old-file"]
641- self.writeMarkerFile(old_file, "old-file")
642- script.rsyncNewDists(ArchivePurpose.PRIMARY)
643+ dists_backup = os.path.join(
644+ get_distscopy_root(get_pub_config(distro)), "dists")
645+ os.makedirs(dists_backup)
646+ old_file = [dists_backup, "old-file"]
647+ write_marker_file(old_file, "old-file")
648+ os.makedirs(get_dists_root(get_pub_config(distro)))
649+ script.rsyncBackupDists()
650 self.assertFalse(path_exists(*old_file))
651
652 def test_setUpDirs_creates_directory_structure(self):
653@@ -480,9 +454,9 @@
654
655 self.assertTrue(file_exists(archive_root))
656 self.assertTrue(file_exists(dists_root))
657- self.assertTrue(file_exists(dists_root + ".new"))
658+ self.assertTrue(file_exists(get_distscopy_root(pub_config)))
659
660- def test_setUpDirs_does_not_mind_if_directories_already_exist(self):
661+ def test_setUpDirs_does_not_mind_if_dist_directories_already_exist(self):
662 distro = self.makeDistro()
663 script = self.makeScript(distro)
664 script.setUp()
665@@ -490,17 +464,6 @@
666 script.setUpDirs()
667 self.assertTrue(file_exists(get_archive_root(get_pub_config(distro))))
668
669- def test_setUpDirs_moves_dists_to_dists_new(self):
670- distro = self.makeDistro()
671- dists_root = get_dists_root(get_pub_config(distro))
672- script = self.makeScript(distro)
673- script.setUp()
674- script.setUpDirs()
675- self.writeMarkerFile([dists_root, "marker"], "X")
676- script.setUpDirs()
677- self.assertEqual(
678- "X", self.readMarkerFile([dists_root + ".new", "marker"]))
679-
680 def test_publishDistroArchive_runs_parts(self):
681 distro = self.makeDistro()
682 script = self.makeScript(distro)
683@@ -527,31 +490,6 @@
684 missing_parameters = required_parameters.difference(set(env.keys()))
685 self.assertEqual(set(), missing_parameters)
686
687- def test_installDists_sets_done_pub(self):
688- script = self.makeScript()
689- script.setUp()
690- script.setUpDirs()
691- self.assertFalse(script.done_pub)
692- script.installDists()
693- self.assertTrue(script.done_pub)
694-
695- def test_installDists_replaces_distsroot(self):
696- distro = self.makeDistro()
697- script = self.makeScript(distro)
698- script.setUp()
699- script.setUpDirs()
700- pub_config = get_pub_config(distro)
701- dists_root = get_dists_root(pub_config)
702-
703- self.writeMarkerFile([dists_root, "marker"], "old")
704- self.writeMarkerFile([dists_root + ".new", "marker"], "new")
705-
706- script.installDists()
707-
708- self.assertEqual("new", self.readMarkerFile([dists_root, "marker"]))
709- self.assertEqual("old", self.readMarkerFile(
710- [get_distscopy_root(pub_config), "dists", "marker"]))
711-
712 def test_runCommercialCompat_runs_commercial_compat_script(self):
713 # XXX JeroenVermeulen 2011-03-29 bug=741683: Retire
714 # runCommercialCompat as soon as Dapper support ends.
715@@ -611,7 +549,7 @@
716 nonempty_dir = os.path.join(
717 get_dists_root(get_pub_config(distro)), 'nonempty-dir')
718 os.makedirs(nonempty_dir)
719- self.writeMarkerFile([nonempty_dir, "placeholder"], "Data here!")
720+ write_marker_file([nonempty_dir, "placeholder"], "Data here!")
721 script.clearEmptyDirs()
722 self.assertTrue(file_exists(nonempty_dir))
723
724@@ -690,27 +628,13 @@
725 self.assertEqual(set(), missing_parameters)
726
727 def test_publishSecurityUploads_skips_pub_if_no_security_updates(self):
728- script = self.makeScript(self.makeDistro())
729+ script = self.makeScript()
730 script.setUp()
731 script.setUpDirs()
732 script.installDists = FakeMethod()
733 script.publishSecurityUploads()
734 self.assertEqual(0, script.installDists.call_count)
735
736- def test_publishSecurityUploads_runs_finalize_parts(self):
737- distro = self.makeDistro()
738- self.factory.makeSourcePackagePublishingHistory(
739- distroseries=self.factory.makeDistroSeries(distribution=distro),
740- pocket=PackagePublishingPocket.SECURITY)
741- script = self.makeScript(distro)
742- script.setUp()
743- script.setUpDirs()
744- script.runFinalizeParts = FakeMethod()
745- script.publishSecurityUploads()
746- self.assertEqual(1, script.runFinalizeParts.call_count)
747- args, kwargs = script.runFinalizeParts.calls[0]
748- self.assertTrue(kwargs["security_only"])
749-
750 def test_publishAllUploads_publishes_all_distro_archives(self):
751 distro = self.makeDistro()
752 distroseries = self.factory.makeDistroSeries(distribution=distro)
753@@ -733,14 +657,62 @@
754 self.assertIn(distro.main_archive, published_archives)
755 self.assertIn(partner_archive, published_archives)
756
757- def test_publishAllUploads_runs_finalize_parts(self):
758+ def test_recoverWorkingDists_is_quiet_normally(self):
759+ script = self.makeScript()
760+ script.setUp()
761+ script.logger = BufferLogger()
762+ script.logger.setLevel(logging.INFO)
763+ script.recoverWorkingDists()
764+ self.assertEqual('', script.logger.getLogBuffer())
765+
766+ def test_recoverWorkingDists_recovers_working_directory(self):
767 distro = self.makeDistro()
768 script = self.makeScript(distro)
769 script.setUp()
770+ script.logger = BufferLogger()
771+ script.logger.setLevel(logging.INFO)
772 script.setUpDirs()
773- script.runFinalizeParts = FakeMethod()
774+ archive_config = script.configs[ArchivePurpose.PRIMARY]
775+ backup_dists = os.path.join(
776+ archive_config.archiveroot + "-distscopy", "dists")
777+ working_dists = get_working_dists(archive_config)
778+ os.rename(backup_dists, working_dists)
779+ write_marker_file([working_dists, "marker"], "Recovered")
780+ script.recoverWorkingDists()
781+ self.assertEqual(
782+ "Recovered", read_marker_file([backup_dists, "marker"]))
783+ self.assertNotEqual('', script.logger.getLogBuffer())
784+
785+ def test_publishes_first_security_updates_then_all_updates(self):
786+ script = self.makeScript()
787+ script.publish = FakeMethod()
788+ script.main()
789+ self.assertEqual(2, script.publish.call_count)
790+ args, kwargs = script.publish.calls[0]
791+ self.assertEqual({'security_only': True}, kwargs)
792+ args, kwargs = script.publish.calls[1]
793+ self.assertEqual(False, kwargs.get('security_only', False))
794+
795+ def test_security_run_publishes_only_security_updates(self):
796+ script = self.makeScript(extra_args=['--security-only'])
797+ script.publish = FakeMethod()
798+ script.main()
799+ self.assertEqual(1, script.publish.call_count)
800+ args, kwargs = script.publish.calls[0]
801+ self.assertEqual({'security_only': True}, kwargs)
802+
803+ def test_publishAllUploads_processes_all_archives(self):
804+ distro = self.makeDistro()
805+ partner_archive = self.factory.makeArchive(
806+ distribution=distro, purpose=ArchivePurpose.PARTNER)
807+ script = self.makeScript(distro)
808+ script.publishDistroArchive = FakeMethod()
809+ script.setUp()
810 script.publishAllUploads()
811- self.assertEqual(1, script.runFinalizeParts.call_count)
812+ published_archives = [
813+ args[0] for args, kwargs in script.publishDistroArchive.calls]
814+ self.assertContentEqual(
815+ [distro.main_archive, partner_archive], published_archives)
816
817 def test_runFinalizeParts_quotes_archiveroots(self):
818 # Passing ARCHIVEROOTS to the finalize.d scripts is a bit
819@@ -784,6 +756,6 @@
820 archive_root = getPubConfig(archive).archiveroot
821 self.assertEqual(
822 "This is an archive root.",
823- self.readMarkerFile([archive_root, "marker file"]).rstrip(),
824+ read_marker_file([archive_root, "marker file"]).rstrip(),
825 "Did not find expected marker for %s."
826 % archive.purpose.title)