Merge lp:~spiv/bzr/hpss-second-push-error-bug-437626 into lp:bzr/2.0

Proposed by Andrew Bennetts
Status: Merged
Approved by: John A Meinel
Approved revision: no longer in the source branch.
Merged at revision: not available
Proposed branch: lp:~spiv/bzr/hpss-second-push-error-bug-437626
Merge into: lp:bzr/2.0
Diff against target: 81 lines
3 files modified
NEWS (+4/-0)
bzrlib/knit.py (+4/-2)
bzrlib/tests/per_versionedfile.py (+37/-0)
To merge this branch: bzr merge lp:~spiv/bzr/hpss-second-push-error-bug-437626
Reviewer Review Type Date Requested Status
John A Meinel Approve
Review via email: mp+13680@code.launchpad.net
Revision history for this message
Andrew Bennetts (spiv) wrote :

This fixes bug 437626. The problem was that the fairly subtle requirements of add_records(..., missing_compression_parents=True) were not being met by the way insert_record_stream was calling it multiple times (in effectively random order) for the one set of buffered index entries. The correct thing to do is to call add_records just once with all the buffered entries together.

The result of the bug was that for C->B and B->A (where -> means the left side depends on the right side as a compression parent), get_missing_compression_parents would sometimes incorrectly report [B, A] rather than just [A], which then led fetch to think that keys were still missing.

I feel a bit like a docstring or comment ought to be clearer about this, but really if you read carefully this requirement is already implicit, and the comments/docstrings for this part of the code are already getting towards the "eyes glaze over" point. On the other hand I've added a test so the chances of this regressing are pretty low.
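The interaction described above can be illustrated with a toy model. This is a hypothetical `ToyIndex`, not bzrlib's real `_KnitGraphIndex`; it is a minimal sketch, assuming only that an index records which keys are present and which referenced compression parents are unsatisfied:

```python
# Toy model (not bzrlib's real index) of why calling add_records once per
# buffered key mis-reports missing compression parents, while one call with
# all buffered entries reports them correctly.

class ToyIndex:
    """Tracks inserted keys and the compression parents they still need."""

    def __init__(self):
        self._present = set()
        self._missing = set()

    def add_records(self, records, missing_compression_parents=False):
        # records: iterable of (key, compression_parent_or_None)
        self._present.update(key for key, _ in records)
        if missing_compression_parents:
            # Record every parent referenced by this batch that nothing
            # inserted so far (including this batch) satisfies.
            for _, parent in records:
                if parent is not None and parent not in self._present:
                    self._missing.add(parent)

    def get_missing_compression_parents(self):
        return set(self._missing)

# Chain: C depends on B, B depends on A; A itself is never inserted.
# Per-key calls in an unlucky order (C before B) record B as missing,
# even though B arrives in the very next call.
bad = ToyIndex()
bad.add_records([('C', 'B')], missing_compression_parents=True)
bad.add_records([('B', 'A')], missing_compression_parents=True)
print(bad.get_missing_compression_parents())  # {'B', 'A'} - wrong

# A single call with all buffered entries lets B satisfy C's dependency.
good = ToyIndex()
good.add_records([('C', 'B'), ('B', 'A')], missing_compression_parents=True)
print(good.get_missing_compression_parents())  # {'A'} - correct
```

The batched call works because every key in the batch is marked present before any parent is checked, so dependencies satisfied within the batch are never reported as missing.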

Revision history for this message
John A Meinel (jameinel) wrote :

Looks good to me.

review: Approve

Preview Diff

=== modified file 'NEWS'
--- NEWS 2009-10-19 05:38:30 +0000
+++ NEWS 2009-10-21 01:20:22 +0000
@@ -25,6 +25,10 @@
 
 * Diff parsing handles "Binary files differ" hunks. (Aaron Bentley, #436325)
 
+* Fetching from stacked pre-2a repository via a smart server no longer
+  fails intermittently with "second push failed to complete".
+  (Andrew Bennetts, #437626)
+
 * PreviewTree file names are not limited by the encoding of the temp
   directory's filesystem. (Aaron Bentley, #436794)
 
=== modified file 'bzrlib/knit.py'
--- bzrlib/knit.py 2009-09-09 08:48:29 +0000
+++ bzrlib/knit.py 2009-10-21 01:20:22 +0000
@@ -1710,10 +1710,12 @@
             # There were index entries buffered at the end of the stream,
             # So these need to be added (if the index supports holding such
             # entries for later insertion)
+            all_entries = []
             for key in buffered_index_entries:
                 index_entries = buffered_index_entries[key]
-                self._index.add_records(index_entries,
-                    missing_compression_parents=True)
+                all_entries.extend(index_entries)
+            self._index.add_records(
+                all_entries, missing_compression_parents=True)
 
     def get_missing_compression_parent_keys(self):
         """Return an iterable of keys of missing compression parents.
=== modified file 'bzrlib/tests/per_versionedfile.py'
--- bzrlib/tests/per_versionedfile.py 2009-08-26 16:44:27 +0000
+++ bzrlib/tests/per_versionedfile.py 2009-10-21 01:20:22 +0000
@@ -2434,6 +2434,43 @@
         else:
             self.assertIdenticalVersionedFile(source, files)
 
+    def test_insert_record_stream_long_parent_chain_out_of_order(self):
+        """An out of order stream can either error or work."""
+        if not self.graph:
+            raise TestNotApplicable('ancestry info only relevant with graph.')
+        # Create a reasonably long chain of records based on each other, where
+        # most will be deltas.
+        source = self.get_versionedfiles('source')
+        parents = ()
+        keys = []
+        content = [('same same %d\n' % n) for n in range(500)]
+        for letter in 'abcdefghijklmnopqrstuvwxyz':
+            key = ('key-' + letter,)
+            if self.key_length == 2:
+                key = ('prefix',) + key
+            content.append('content for ' + letter + '\n')
+            source.add_lines(key, parents, content)
+            keys.append(key)
+            parents = (key,)
+        # Create a stream of these records, excluding the first record that the
+        # rest ultimately depend upon, and insert it into a new vf.
+        streams = []
+        for key in reversed(keys):
+            streams.append(source.get_record_stream([key], 'unordered', False))
+        deltas = chain(*streams[:-1])
+        files = self.get_versionedfiles()
+        try:
+            files.insert_record_stream(deltas)
+        except RevisionNotPresent:
+            # Must not have corrupted the file.
+            files.check()
+        else:
+            # Must report either just the first key as a missing parent, or
+            # no key as missing (for nodelta scenarios).
+            missing = set(files.get_missing_compression_parent_keys())
+            missing.discard(keys[0])
+            self.assertEqual(set(), missing)
+
     def get_knit_delta_source(self):
         """Get a source that can produce a stream with knit delta records,
         regardless of this test's scenario.
