Code review comment for lp:~spiv/bzr/inventory-delta

Revision history for this message
Andrew Bennetts (spiv) wrote :

Robert Collins wrote:
> On Fri, 2009-08-07 at 05:39 +0000, Andrew Bennetts wrote:
[...]
> [debug flags]
> > I hope we don't need to ask people to use them, but they are a cheap insurance
> > policy.
> >
> > More usefully, they are helpful for benchmarking and testing. I'm ok with
> > hiding them if you like.
>
> I don't have a strong opinion. The flags are in our docs though as it
> stands, so they should be clear to people reading them rather than
> perhaps causing confusion.

I'll leave them in, assuming I can find a way to make the ReST markup cope with
them...

[inventory_delta.py API]
> Anyhow, at the moment it seems unclear and confusing, rather than
> clearer. I would find it clearer as either being on the __init__, I
> think, or perhaps with two separate constructors? No biggy, not enough
> to block the patch on, but it is a very awkward halfway house at the
> moment - and I wouldn't want to see it left that way - so I'm concerned
> that you might switch focus away from this straight after landing it.

I've now split the class into separate serializer and deserializer, which gives
“two separate constructors”. I think it's an improvement, I hope you'll think
so too!
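
For reference, the split ends up with roughly this shape (a minimal sketch of
just the constructors; the real classes in the attached diff also carry the
actual serialization logic):

```python
class InventoryDeltaSerializer(object):
    """Serializes inventory deltas; flags are fixed at construction time."""

    def __init__(self, versioned_root, tree_references):
        # Previously these were set after construction via require_flags();
        # making them __init__ arguments means a serializer is always fully
        # configured once it exists.
        self._versioned_root = versioned_root
        self._tree_references = tree_references


class InventoryDeltaDeserializer(object):
    """Deserializes inventory deltas; permissive by default."""

    def __init__(self, allow_versioned_root=True, allow_tree_references=True):
        self._allow_versioned_root = allow_versioned_root
        self._allow_tree_references = allow_tree_references
```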

[repacking single pack bug]
> > I don't understand why that case isn't covered. Perhaps you should file the bug
> > instead of me?
>
> I'd like to make sure I explain it well enough first; once you
> understand either of us can file it - and I'll be happy enough to be
> that person.
[...]
> What you need to do is change the test from len(self.packs) == 1, to
> len(packs being combined) == 1 and that_pack has the same hash.

But AIUI “self.packs” on a Packer *is* “packs being combined”. If it's not then
your explanation makes sense, but my reading of the code says otherwise.

[...]
> > > > - if not hint or pack.name in hint:
> > > > + if hint is None or pack.name in hint:
> > >
> > > ^- the original form is actually faster, AFAIK. because it skips the in
> > > test for a hint of []. I'd rather we didn't change it, for all that its
> > > not in a common code path.
> >
> > But if hint is None, then “pack.name in hint” will fail.
>
> It would, but it won't execute. As we discussed, if you could change it
> back, but add a comment - it's clearly worthy of expanding on (in either
> form the necessity for testing hint isn't obvious, apparently).

Ok. (I still find it cleaner to have the hint is None case handled distinctly
from the hint is a list case, it seems to express the intent more clearly to me,
rather than making it appear to simply be a micro-optimisation. But I
understand that tastes vary.)

Oh, actually, it does matter. When hint == [], no packs should be repacked.
When hint is None, all packs should be repacked.

I've added this comment:

                # Either no hint was provided (so we are packing everything),
                # or this pack was included in the hint.
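
To spell out the distinction (a self-contained sketch of the selection logic;
plain strings stand in for the real Pack objects):

```python
def select_packs(all_pack_names, hint):
    """Select the packs to combine.

    hint=None means no hint was given: repack everything.
    hint=[] means nothing needs repacking: select no packs.
    So "hint is None" and "not hint" are not interchangeable here.
    """
    selected = []
    for name in all_pack_names:
        # Either no hint was provided (so we are packing everything),
        # or this pack was included in the hint.
        if hint is None or name in hint:
            selected.append(name)
    return selected
```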

[...]
> > > > + def _should_fake_unknown(self):
[...]
>
> As we discussed, I think the method should say what it does; the fact
> that the base class encodes more complex policy than a child is a smell
> (not one that you need to fix), but it would be very nice to have pydoc
> help make it more obvious that the base class does this special
> handling.

Made a docstring.

> > > > +class SmartServerRepositoryInsertStream_1_18(SmartServerRepositoryInsertStreamLocked):
[...]
>
> So, if the lock_token was required, there would be:
> - just one verb
> - no backwards compatibility issues
>
> Up to you; I do think less code is better, all things considered.

I'll leave it as is. I don't think the costs of doing it this way are
significant, and it doesn't lock us into assuming that future clients/formats
must lock the repository in advance before inserting a stream.
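
For context, the client copes with multiple verbs by probing before sending
the real stream (per the comment added to remote.py in the attached diff); a
simplified sketch, where the verb names, version tuples, and FakeMedium stub
are illustrative assumptions and only _is_remote_before comes from the diff:

```python
class FakeMedium(object):
    """Illustrative stand-in for a smart client medium."""

    def __init__(self, remote_version):
        self._remote_version = remote_version

    def _is_remote_before(self, version):
        # True if the server is known to be older than `version`.
        return self._remote_version < version


def choose_insert_stream_verb(medium, candidate_calls):
    """Pick the newest insert_stream verb the server can support.

    candidate_calls is ordered newest-first as (verb, required_version)
    pairs.  Probing up front matters because the stream itself cannot be
    buffered, rewound, or restarted once we start sending it.
    """
    for verb, required_version in candidate_calls:
        if medium._is_remote_before(required_version):
            continue
        return verb
    return None
```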

[interrepo tests]
> > I considered leaving it there simply for test selection, but that's not the
> > right way to do that, because it can be (and was!) out of sync with what the
> > test was actually doing.
>
> You proposed putting class names in literally, that would be fine by me.
> It's no less likely to skew, but I think the issue goes deeper than 'it
> can skew' - we have many potentially expensive tests being executed in
> the same situation, AIUI. So we may well want to revisit this.

I've put the class names in literally. I agree that the test structure here
doesn't seem optimal, and is probably worth revisiting some day.

> > > > + def assertCanStreamRevision(self, repo, revision_id):
> > > > + exclude_keys = set(repo.all_revision_ids()) - set([revision_id])
> > > > + search = SearchResult([revision_id], exclude_keys, 1, [revision_id])
> > > > + source = repo._get_source(repo._format)
> > > > + for substream_kind, substream in source.get_stream(search):
> > > > + # Consume the substream
> > > > + list(substream)
> >
> > > ^ - this assertion, while named nice, isn't strong enough for me. I'd
> > > like it to assert that either a delta matching the local changes, or a
> > > full inventory, are streamed - and that the newly referenced file texts
> > > are present.
> >
> > Hmm, fair enough. That may happen after landing, though, but not long after.
>
> If this is opening a hole, I'd really rather it be done before landing;
> especially if you agree it's worth doing. If it's an existing hole that
> refactoring made obvious, then leaving it till later is fine with me.

Well, before it was only checking some has_revisions and .inventories.keys()
results, which turned out to be insufficient to notice some bugs, e.g. not
copying all the chk_bytes for the inventories. This assertion caught those.
So I think it is an incremental improvement, not opening a hole.

I think John has made some overlapping changes in bzr.dev based on the bugs I
noticed; I'll make sure this change fits with his. Oh ho! Merging bzr.dev gets
some failures from assertCanStreamRevision that the assertions already in
bzr.dev don't catch. Hmm... oh! It appears that when insert_stream tries a
pack, and it turns out the single pack is already optimal, it was accidentally
causing that pack to be removed from the pack-names list! (Because it removes
the unneeded new pack, but that pack is unneeded precisely because it has the
same name as the original pack, so it forgets the original pack too...) Ok,
I've fixed that now, but again it's something this assertion catches that none
of the assertions in this test in bzr.dev did :)
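
The shape of the fix, in sketch form (the function and callback names here are
invented for illustration; only the pack()-returns-None convention comes from
the actual change to groupcompress_repo.py in the attached diff):

```python
def autopack(packer, old_packs, remove_pack, add_pack):
    """Run a packer, retiring the old packs only if a new pack was made.

    packer.pack() returning None signals that the single old pack was
    already optimally packed (the would-be new pack has the same name),
    so nothing may be removed from the pack-names list.
    """
    new_pack = packer.pack()
    if new_pack is None:
        # Already optimal: leave the existing pack alone.  Removing the
        # "unneeded" new pack here was the bug: it shared its name with
        # the original, so the original pack was forgotten too.
        return None
    for pack in old_packs:
        remove_pack(pack)
    add_pack(new_pack)
    return new_pack
```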

And for now I've kept all the assertions in these tests from bzr.dev, so the
changes here strictly increase coverage; definitely no new hole added.

A tangent: when we transfer parent inventories to a stacked repo, should we
transfer the texts introduced by those inventories or not? I'm seeing a
worrisome half-and-half behaviour from the streaming code, where the
synthesised rich-root texts, but no other texts, are transferred for
inventories requested via get_stream_for_missing_keys. I think the intention
is not to transfer them (if those parent revisions are ever filled in later,
then at that time the corresponding texts would be backfilled too). I've now
fixed this too.

The net result of this is that the stacking tests in
per_interrepository/test_fetch.py are starting to get pretty large. I suspect
we can do better here... (but after this branch lands, I hope!)

> > > > === modified file 'bzrlib/tests/test_remote.py'
> > > > --- bzrlib/tests/test_remote.py 2009-07-30 04:27:05 +0000
> > > > +++ bzrlib/tests/test_remote.py 2009-08-05 01:41:03 +0000
> > > >
> > > > + """Tests for using Repository.insert_stream verb when the _1.18 variant is
> > > > + not available.
> > > > +
> > > > + This test case is very similar to TestRepositoryInsertStream_1_18.
> > > > + """
> > >
> > > Perhaps a base class would make this smaller - it is nearly identical.
> >
> > I don't see any elegant ways to cut the duplication at this point. I haven't
> > tried very hard though, because it's just small enough that it fits under my
> > threshold for tolerating the ugly...
>
> It went past mine :). TemplateMethod perhaps?

The thing is that the important part of the tests is the expected network
conversation, and that's also the bulk of the near-duplication. So I'm finding
it hard to see a way to reduce that duplication without obscuring the key part
of the test.

I've refactored out a bit of duplication that isn't the network conversation,
and it does help a bit.

> > > I"m not sure how to say 'needs fixing' in review mail, but thats what
> > > I'd say.

(It's “review needs-fixing”, btw. Not sure when that got added.)

I've attached a pseudo-incremental diff (I merged latest bzr.dev into the tip of
this branch, and separately bzr.dev into the last reviewed revision, and then
produced the diff of those two). So if anything looks funny in the diff you
might want to check the actual branch.

-Andrew.

1=== modified file 'NEWS'
2--- NEWS 2009-08-13 12:51:59 +0000
3+++ NEWS 2009-08-13 03:29:52 +0000
4@@ -158,6 +158,10 @@
5 lots of backtraces about ``UnknownSmartMethod``, ``do_chunk`` or
6 ``do_end``. (Andrew Bennetts, #338561)
7
8+* ``RemoteStreamSource.get_stream_for_missing_keys`` will fetch CHK
9+ inventory pages when appropriate (by falling back to the vfs stream
10+ source). (Andrew Bennetts, #406686)
11+
12 * Streaming from bzr servers where there is a chain of stacked branches
13 (A stacked on B stacked on C) will now work. (Robert Collins, #406597)
14
15@@ -249,8 +253,10 @@
16 * ``CHKMap.apply_delta`` now raises ``InconsistentDelta`` if a delta adds
17 as new a key which was already mapped. (Robert Collins)
18
19-* InterDifferingSerializer has been removed. The transformations it
20- provided are now done automatically by StreamSource. (Andrew Bennetts)
21+* InterDifferingSerializer is now only used locally. Other fetches that
22+ would have used InterDifferingSerializer now use the more network
23+ friendly StreamSource, which now automatically does the same
24+ transformations as InterDifferingSerializer. (Andrew Bennetts)
25
26 * Inventory delta application catches more cases of corruption and can
27 prevent corrupt deltas from affecting consistency of data structures on
28
29=== modified file 'bzrlib/fetch.py'
30--- bzrlib/fetch.py 2009-07-29 07:08:54 +0000
31+++ bzrlib/fetch.py 2009-08-07 04:27:02 +0000
32@@ -260,6 +260,19 @@
33
34 def _new_root_data_stream(
35 root_keys_to_create, rev_id_to_root_id_map, parent_map, repo, graph=None):
36+ """Generate a texts substream of synthesised root entries.
37+
38+ Used in fetches that do rich-root upgrades.
39+
40+ :param root_keys_to_create: iterable of (root_id, rev_id) pairs describing
41+ the root entries to create.
42+ :param rev_id_to_root_id_map: dict of known rev_id -> root_id mappings for
43+ calculating the parents. If a parent rev_id is not found here then it
44+ will be recalculated.
45+ :param parent_map: a parent map for all the revisions in
46+ root_keys_to_create.
47+ :param graph: a graph to use instead of repo.get_graph().
48+ """
49 for root_key in root_keys_to_create:
50 root_id, rev_id = root_key
51 parent_keys = _parent_keys_for_root_version(
52@@ -270,7 +283,10 @@
53
54 def _parent_keys_for_root_version(
55 root_id, rev_id, rev_id_to_root_id_map, parent_map, repo, graph=None):
56- """Get the parent keys for a given root id."""
57+ """Get the parent keys for a given root id.
58+
59+ A helper function for _new_root_data_stream.
60+ """
61 # Include direct parents of the revision, but only if they used the same
62 # root_id and are heads.
63 rev_parents = parent_map[rev_id]
64
65=== modified file 'bzrlib/inventory_delta.py'
66--- bzrlib/inventory_delta.py 2009-08-05 02:30:11 +0000
67+++ bzrlib/inventory_delta.py 2009-08-11 08:40:32 +0000
68@@ -29,6 +29,25 @@
69 from bzrlib import inventory
70 from bzrlib.revision import NULL_REVISION
71
72+FORMAT_1 = 'bzr inventory delta v1 (bzr 1.14)'
73+
74+
75+class InventoryDeltaError(errors.BzrError):
76+ """An error when serializing or deserializing an inventory delta."""
77+
78+ # Most errors when serializing and deserializing are due to bugs, although
79+ # damaged input (i.e. a bug in a different process) could cause
80+ # deserialization errors too.
81+ internal_error = True
82+
83+
84+class IncompatibleInventoryDelta(errors.BzrError):
85+ """The delta could not be deserialised because its contents conflict with
86+ the allow_versioned_root or allow_tree_references flags of the
87+ deserializer.
88+ """
89+ internal_error = False
90+
91
92 def _directory_content(entry):
93 """Serialize the content component of entry which is a directory.
94@@ -49,7 +68,7 @@
95 exec_bytes = ''
96 size_exec_sha = (entry.text_size, exec_bytes, entry.text_sha1)
97 if None in size_exec_sha:
98- raise errors.BzrError('Missing size or sha for %s' % entry.file_id)
99+ raise InventoryDeltaError('Missing size or sha for %s' % entry.file_id)
100 return "file\x00%d\x00%s\x00%s" % size_exec_sha
101
102
103@@ -60,7 +79,7 @@
104 """
105 target = entry.symlink_target
106 if target is None:
107- raise errors.BzrError('Missing target for %s' % entry.file_id)
108+ raise InventoryDeltaError('Missing target for %s' % entry.file_id)
109 return "link\x00%s" % target.encode('utf8')
110
111
112@@ -71,7 +90,8 @@
113 """
114 tree_revision = entry.reference_revision
115 if tree_revision is None:
116- raise errors.BzrError('Missing reference revision for %s' % entry.file_id)
117+ raise InventoryDeltaError(
118+ 'Missing reference revision for %s' % entry.file_id)
119 return "tree\x00%s" % tree_revision
120
121
122@@ -116,39 +136,22 @@
123
124
125 class InventoryDeltaSerializer(object):
126- """Serialize and deserialize inventory deltas."""
127-
128- # XXX: really, the serializer and deserializer should be two separate
129- # classes.
130-
131- FORMAT_1 = 'bzr inventory delta v1 (bzr 1.14)'
132-
133- def __init__(self):
134- """Create an InventoryDeltaSerializer."""
135- self._versioned_root = None
136- self._tree_references = None
137+ """Serialize inventory deltas."""
138+
139+ def __init__(self, versioned_root, tree_references):
140+ """Create an InventoryDeltaSerializer.
141+
142+ :param versioned_root: If True, any root entry that is seen is expected
143+ to be versioned, and root entries can have any fileid.
144+ :param tree_references: If True support tree-reference entries.
145+ """
146+ self._versioned_root = versioned_root
147+ self._tree_references = tree_references
148 self._entry_to_content = {
149 'directory': _directory_content,
150 'file': _file_content,
151 'symlink': _link_content,
152 }
153-
154- def require_flags(self, versioned_root=None, tree_references=None):
155- """Set the versioned_root and/or tree_references flags for this
156- (de)serializer.
157-
158- :param versioned_root: If True, any root entry that is seen is expected
159- to be versioned, and root entries can have any fileid.
160- :param tree_references: If True support tree-reference entries.
161- """
162- if versioned_root is not None and self._versioned_root is not None:
163- raise AssertionError(
164- "require_flags(versioned_root=...) already called.")
165- if tree_references is not None and self._tree_references is not None:
166- raise AssertionError(
167- "require_flags(tree_references=...) already called.")
168- self._versioned_root = versioned_root
169- self._tree_references = tree_references
170 if tree_references:
171 self._entry_to_content['tree-reference'] = _reference_content
172
173@@ -167,10 +170,6 @@
174 takes.
175 :return: The serialized delta as lines.
176 """
177- if self._versioned_root is None or self._tree_references is None:
178- raise AssertionError(
179- "Cannot serialise unless versioned_root/tree_references flags "
180- "are both set.")
181 if type(old_name) is not str:
182 raise TypeError('old_name should be str, got %r' % (old_name,))
183 if type(new_name) is not str:
184@@ -180,11 +179,11 @@
185 for delta_item in delta_to_new:
186 line = to_line(delta_item, new_name)
187 if line.__class__ != str:
188- raise errors.BzrError(
189+ raise InventoryDeltaError(
190 'to_line generated non-str output %r' % lines[-1])
191 lines.append(line)
192 lines.sort()
193- lines[0] = "format: %s\n" % InventoryDeltaSerializer.FORMAT_1
194+ lines[0] = "format: %s\n" % FORMAT_1
195 lines[1] = "parent: %s\n" % old_name
196 lines[2] = "version: %s\n" % new_name
197 lines[3] = "versioned_root: %s\n" % self._serialize_bool(
198@@ -234,23 +233,37 @@
199 # file-ids other than TREE_ROOT, e.g. repo formats that use the
200 # xml5 serializer.
201 if last_modified != new_version:
202- raise errors.BzrError(
203+ raise InventoryDeltaError(
204 'Version present for / in %s (%s != %s)'
205 % (file_id, last_modified, new_version))
206 if last_modified is None:
207- raise errors.BzrError("no version for fileid %s" % file_id)
208+ raise InventoryDeltaError("no version for fileid %s" % file_id)
209 content = self._entry_to_content[entry.kind](entry)
210 return ("%s\x00%s\x00%s\x00%s\x00%s\x00%s\n" %
211 (oldpath_utf8, newpath_utf8, file_id, parent_id, last_modified,
212 content))
213
214+
215+class InventoryDeltaDeserializer(object):
216+ """Deserialize inventory deltas."""
217+
218+ def __init__(self, allow_versioned_root=True, allow_tree_references=True):
219+ """Create an InventoryDeltaDeserializer.
220+
221+ :param versioned_root: If True, any root entry that is seen is expected
222+ to be versioned, and root entries can have any fileid.
223+ :param tree_references: If True support tree-reference entries.
224+ """
225+ self._allow_versioned_root = allow_versioned_root
226+ self._allow_tree_references = allow_tree_references
227+
228 def _deserialize_bool(self, value):
229 if value == "true":
230 return True
231 elif value == "false":
232 return False
233 else:
234- raise errors.BzrError("value %r is not a bool" % (value,))
235+ raise InventoryDeltaError("value %r is not a bool" % (value,))
236
237 def parse_text_bytes(self, bytes):
238 """Parse the text bytes of a serialized inventory delta.
239@@ -266,32 +279,24 @@
240 """
241 if bytes[-1:] != '\n':
242 last_line = bytes.rsplit('\n', 1)[-1]
243- raise errors.BzrError('last line not empty: %r' % (last_line,))
244+ raise InventoryDeltaError('last line not empty: %r' % (last_line,))
245 lines = bytes.split('\n')[:-1] # discard the last empty line
246- if not lines or lines[0] != 'format: %s' % InventoryDeltaSerializer.FORMAT_1:
247- raise errors.BzrError('unknown format %r' % lines[0:1])
248+ if not lines or lines[0] != 'format: %s' % FORMAT_1:
249+ raise InventoryDeltaError('unknown format %r' % lines[0:1])
250 if len(lines) < 2 or not lines[1].startswith('parent: '):
251- raise errors.BzrError('missing parent: marker')
252+ raise InventoryDeltaError('missing parent: marker')
253 delta_parent_id = lines[1][8:]
254 if len(lines) < 3 or not lines[2].startswith('version: '):
255- raise errors.BzrError('missing version: marker')
256+ raise InventoryDeltaError('missing version: marker')
257 delta_version_id = lines[2][9:]
258 if len(lines) < 4 or not lines[3].startswith('versioned_root: '):
259- raise errors.BzrError('missing versioned_root: marker')
260+ raise InventoryDeltaError('missing versioned_root: marker')
261 delta_versioned_root = self._deserialize_bool(lines[3][16:])
262 if len(lines) < 5 or not lines[4].startswith('tree_references: '):
263- raise errors.BzrError('missing tree_references: marker')
264+ raise InventoryDeltaError('missing tree_references: marker')
265 delta_tree_references = self._deserialize_bool(lines[4][17:])
266- if (self._versioned_root is not None and
267- delta_versioned_root != self._versioned_root):
268- raise errors.BzrError(
269- "serialized versioned_root flag is wrong: %s" %
270- (delta_versioned_root,))
271- if (self._tree_references is not None
272- and delta_tree_references != self._tree_references):
273- raise errors.BzrError(
274- "serialized tree_references flag is wrong: %s" %
275- (delta_tree_references,))
276+ if (not self._allow_versioned_root and delta_versioned_root):
277+ raise IncompatibleInventoryDelta("versioned_root not allowed")
278 result = []
279 seen_ids = set()
280 line_iter = iter(lines)
281@@ -302,24 +307,30 @@
282 content) = line.split('\x00', 5)
283 parent_id = parent_id or None
284 if file_id in seen_ids:
285- raise errors.BzrError(
286+ raise InventoryDeltaError(
287 "duplicate file id in inventory delta %r" % lines)
288 seen_ids.add(file_id)
289 if (newpath_utf8 == '/' and not delta_versioned_root and
290 last_modified != delta_version_id):
291- # Delta claims to be not rich root, yet here's a root entry
292- # with either non-default version, i.e. it's rich...
293- raise errors.BzrError("Versioned root found: %r" % line)
294+ # Delta claims to be not have a versioned root, yet here's
295+ # a root entry with a non-default version.
296+ raise InventoryDeltaError("Versioned root found: %r" % line)
297 elif newpath_utf8 != 'None' and last_modified[-1] == ':':
298 # Deletes have a last_modified of null:, but otherwise special
299 # revision ids should not occur.
300- raise errors.BzrError('special revisionid found: %r' % line)
301- if delta_tree_references is False and content.startswith('tree\x00'):
302- raise errors.BzrError("Tree reference found: %r" % line)
303+ raise InventoryDeltaError('special revisionid found: %r' % line)
304+ if content.startswith('tree\x00'):
305+ if delta_tree_references is False:
306+ raise InventoryDeltaError(
307+ "Tree reference found (but header said "
308+ "tree_references: false): %r" % line)
309+ elif not self._allow_tree_references:
310+ raise IncompatibleInventoryDelta(
311+ "Tree reference not allowed")
312 if oldpath_utf8 == 'None':
313 oldpath = None
314 elif oldpath_utf8[:1] != '/':
315- raise errors.BzrError(
316+ raise InventoryDeltaError(
317 "oldpath invalid (does not start with /): %r"
318 % (oldpath_utf8,))
319 else:
320@@ -328,7 +339,7 @@
321 if newpath_utf8 == 'None':
322 newpath = None
323 elif newpath_utf8[:1] != '/':
324- raise errors.BzrError(
325+ raise InventoryDeltaError(
326 "newpath invalid (does not start with /): %r"
327 % (newpath_utf8,))
328 else:
329@@ -340,15 +351,14 @@
330 entry = None
331 else:
332 entry = _parse_entry(
333- newpath_utf8, file_id, parent_id, last_modified,
334- content_tuple)
335+ newpath, file_id, parent_id, last_modified, content_tuple)
336 delta_item = (oldpath, newpath, file_id, entry)
337 result.append(delta_item)
338 return (delta_parent_id, delta_version_id, delta_versioned_root,
339 delta_tree_references, result)
340
341
342-def _parse_entry(utf8_path, file_id, parent_id, last_modified, content):
343+def _parse_entry(path, file_id, parent_id, last_modified, content):
344 entry_factory = {
345 'dir': _dir_to_entry,
346 'file': _file_to_entry,
347@@ -356,10 +366,10 @@
348 'tree': _tree_to_entry,
349 }
350 kind = content[0]
351- if utf8_path.startswith('/'):
352+ if path.startswith('/'):
353 raise AssertionError
354- path = utf8_path.decode('utf8')
355 name = basename(path)
356 return entry_factory[content[0]](
357 content, name, parent_id, file_id, last_modified)
358
359+
360
361=== modified file 'bzrlib/remote.py'
362--- bzrlib/remote.py 2009-08-13 12:51:59 +0000
363+++ bzrlib/remote.py 2009-08-13 08:21:02 +0000
364@@ -587,11 +587,6 @@
365 self._ensure_real()
366 return self._custom_format._serializer
367
368- @property
369- def repository_class(self):
370- self._ensure_real()
371- return self._custom_format.repository_class
372-
373
374 class RemoteRepository(_RpcHelper):
375 """Repository accessed over rpc.
376@@ -1684,9 +1679,6 @@
377
378 class RemoteStreamSink(repository.StreamSink):
379
380- def __init__(self, target_repo):
381- repository.StreamSink.__init__(self, target_repo)
382-
383 def _insert_real(self, stream, src_format, resume_tokens):
384 self.target_repo._ensure_real()
385 sink = self.target_repo._real_repository._get_sink()
386@@ -1708,6 +1700,10 @@
387 client = target._client
388 medium = client._medium
389 path = target.bzrdir._path_for_remote_call(client)
390+ # Probe for the verb to use with an empty stream before sending the
391+ # real stream to it. We do this both to avoid the risk of sending a
392+ # large request that is then rejected, and because we don't want to
393+ # implement a way to buffer, rewind, or restart the stream.
394 found_verb = False
395 for verb, required_version in candidate_calls:
396 if medium._is_remote_before(required_version):
397
398=== modified file 'bzrlib/repofmt/groupcompress_repo.py'
399--- bzrlib/repofmt/groupcompress_repo.py 2009-08-13 12:51:59 +0000
400+++ bzrlib/repofmt/groupcompress_repo.py 2009-08-13 08:20:28 +0000
401@@ -484,6 +484,8 @@
402 old_pack = self.packs[0]
403 if old_pack.name == self.new_pack._hash.hexdigest():
404 # The single old pack was already optimally packed.
405+ trace.mutter('single pack %s was already optimally packed',
406+ old_pack.name)
407 self.new_pack.abort()
408 return None
409 self.pb.update('finishing repack', 6, 7)
410@@ -600,7 +602,7 @@
411 packer = GCCHKPacker(self, packs, '.autopack',
412 reload_func=reload_func)
413 try:
414- packer.pack()
415+ result = packer.pack()
416 except errors.RetryWithNewPacks:
417 # An exception is propagating out of this context, make sure
418 # this packer has cleaned up. Packer() doesn't set its new_pack
419@@ -609,6 +611,8 @@
420 if packer.new_pack is not None:
421 packer.new_pack.abort()
422 raise
423+ if result is None:
424+ return
425 for pack in packs:
426 self._remove_pack_from_memory(pack)
427 # record the newly available packs and stop advertising the old
428@@ -792,6 +796,8 @@
429
430 def _iter_inventories(self, revision_ids, ordering):
431 """Iterate over many inventory objects."""
432+ if ordering is None:
433+ ordering = 'unordered'
434 keys = [(revision_id,) for revision_id in revision_ids]
435 stream = self.inventories.get_record_stream(keys, ordering, True)
436 texts = {}
437@@ -903,9 +909,7 @@
438
439 def _get_source(self, to_format):
440 """Return a source for streaming from this repository."""
441- if (to_format.supports_chks and
442- self._format.repository_class is to_format.repository_class and
443- self._format._serializer == to_format._serializer):
444+ if self._format._serializer == to_format._serializer:
445 # We must be exactly the same format, otherwise stuff like the chk
446 # page layout might be different.
447 # Actually, this test is just slightly looser than exact so that
448
449=== modified file 'bzrlib/repofmt/pack_repo.py'
450--- bzrlib/repofmt/pack_repo.py 2009-08-13 12:51:59 +0000
451+++ bzrlib/repofmt/pack_repo.py 2009-08-13 07:26:29 +0000
452@@ -1575,6 +1575,8 @@
453 pack_operations = [[0, []]]
454 for pack in self.all_packs():
455 if hint is None or pack.name in hint:
456+ # Either no hint was provided (so we are packing everything),
457+ # or this pack was included in the hint.
458 pack_operations[-1][0] += pack.get_revision_count()
459 pack_operations[-1][1].append(pack)
460 self._execute_pack_operations(pack_operations, OptimisingPacker)
461
462=== modified file 'bzrlib/repository.py'
463--- bzrlib/repository.py 2009-08-13 12:51:59 +0000
464+++ bzrlib/repository.py 2009-08-13 08:56:51 +0000
465@@ -1537,6 +1537,8 @@
466 """Commit the contents accrued within the current write group.
467
468 :seealso: start_write_group.
469+
470+ :return: it may return an opaque hint that can be passed to 'pack'.
471 """
472 if self._write_group is not self.get_transaction():
473 # has an unlock or relock occured ?
474@@ -2348,7 +2350,7 @@
475 """Get Inventory object by revision id."""
476 return self.iter_inventories([revision_id]).next()
477
478- def iter_inventories(self, revision_ids, ordering='unordered'):
479+ def iter_inventories(self, revision_ids, ordering=None):
480 """Get many inventories by revision_ids.
481
482 This will buffer some or all of the texts used in constructing the
483@@ -2356,7 +2358,9 @@
484 time.
485
486 :param revision_ids: The expected revision ids of the inventories.
487- :param ordering: optional ordering, e.g. 'topological'.
488+ :param ordering: optional ordering, e.g. 'topological'. If not
489+ specified, the order of revision_ids will be preserved (by
490+ buffering if necessary).
491 :return: An iterator of inventories.
492 """
493 if ((None in revision_ids)
494@@ -2370,29 +2374,41 @@
495 for text, revision_id in inv_xmls:
496 yield self.deserialise_inventory(revision_id, text)
497
498- def _iter_inventory_xmls(self, revision_ids, ordering='unordered'):
499+ def _iter_inventory_xmls(self, revision_ids, ordering):
500+ if ordering is None:
501+ order_as_requested = True
502+ ordering = 'unordered'
503+ else:
504+ order_as_requested = False
505 keys = [(revision_id,) for revision_id in revision_ids]
506 if not keys:
507 return
508- key_iter = iter(keys)
509- next_key = key_iter.next()
510+ if order_as_requested:
511+ key_iter = iter(keys)
512+ next_key = key_iter.next()
513 stream = self.inventories.get_record_stream(keys, ordering, True)
514 text_chunks = {}
515 for record in stream:
516 if record.storage_kind != 'absent':
517- text_chunks[record.key] = record.get_bytes_as('chunked')
518+ chunks = record.get_bytes_as('chunked')
519+ if order_as_requested:
520+ text_chunks[record.key] = chunks
521+ else:
522+ yield ''.join(chunks), record.key[-1]
523 else:
524 raise errors.NoSuchRevision(self, record.key)
525- while next_key in text_chunks:
526- chunks = text_chunks.pop(next_key)
527- yield ''.join(chunks), next_key[-1]
528- try:
529- next_key = key_iter.next()
530- except StopIteration:
531- # We still want to fully consume the get_record_stream,
532- # just in case it is not actually finished at this point
533- next_key = None
534- break
535+ if order_as_requested:
536+ # Yield as many results as we can while preserving order.
537+ while next_key in text_chunks:
538+ chunks = text_chunks.pop(next_key)
539+ yield ''.join(chunks), next_key[-1]
540+ try:
541+ next_key = key_iter.next()
542+ except StopIteration:
543+ # We still want to fully consume the get_record_stream,
544+ # just in case it is not actually finished at this point
545+ next_key = None
546+ break
547
548 def deserialise_inventory(self, revision_id, xml):
549 """Transform the xml into an inventory object.
550@@ -4224,20 +4240,14 @@
551 for record in substream:
552 # Insert the delta directly
553 inventory_delta_bytes = record.get_bytes_as('fulltext')
554- deserialiser = inventory_delta.InventoryDeltaSerializer()
555- parse_result = deserialiser.parse_text_bytes(inventory_delta_bytes)
556+ deserialiser = inventory_delta.InventoryDeltaDeserializer()
557+ try:
558+ parse_result = deserialiser.parse_text_bytes(
559+ inventory_delta_bytes)
560+ except inventory_delta.IncompatibleInventoryDelta, err:
561+ trace.mutter("Incompatible delta: %s", err.msg)
562+ raise errors.IncompatibleRevision(self.target_repo._format)
563 basis_id, new_id, rich_root, tree_refs, inv_delta = parse_result
564- # Make sure the delta is compatible with the target
565- if rich_root and not target_rich_root:
566- raise errors.IncompatibleRevision(self.target_repo._format)
567- if tree_refs and not target_tree_refs:
568- # The source supports tree refs and the target doesn't. Check
569- # the delta for tree refs; if it has any we can't insert it.
570- for delta_item in inv_delta:
571- entry = delta_item[3]
572- if entry.kind == 'tree-reference':
573- raise errors.IncompatibleRevision(
574- self.target_repo._format)
575 revision_id = new_id
576 parents = [key[0] for key in record.parents]
577 self.target_repo.add_inventory_by_delta(
578@@ -4404,10 +4414,6 @@
579 # Some missing keys are genuinely ghosts, filter those out.
580 present = self.from_repository.inventories.get_parent_map(keys)
581 revs = [key[0] for key in present]
582- # As with the original stream, we may need to generate root
583- # texts for the inventories we're about to stream.
584- for _ in self._generate_root_texts(revs):
585- yield _
586 # Get the inventory stream more-or-less as we do for the
587 # original stream; there's no reason to assume that records
588 # direct from the source will be suitable for the sink. (Think
589@@ -4474,7 +4480,7 @@
590
591 def _get_convertable_inventory_stream(self, revision_ids,
592 delta_versus_null=False):
593- # The source is using CHKs, but the target either doesn't or is has a
594+ # The source is using CHKs, but the target either doesn't or it has a
595 # different serializer. The StreamSink code expects to be able to
596 # convert on the target, so we need to put bytes-on-the-wire that can
597 # be converted. That means inventory deltas (if the remote is <1.18,
598@@ -4499,17 +4505,17 @@
599 # method...
600 inventories = self.from_repository.iter_inventories(
601 revision_ids, 'topological')
602- # XXX: ideally these flags would be per-revision, not per-repo (e.g.
603- # streaming a non-rich-root revision out of a rich-root repo back into
604- # a non-rich-root repo ought to be allowed)
605 format = from_repo._format
606- flags = (format.rich_root_data, format.supports_tree_reference)
607 invs_sent_so_far = set([_mod_revision.NULL_REVISION])
608 inventory_cache = lru_cache.LRUCache(50)
609 null_inventory = from_repo.revision_tree(
610 _mod_revision.NULL_REVISION).inventory
611- serializer = inventory_delta.InventoryDeltaSerializer()
612- serializer.require_flags(*flags)
613+ # XXX: ideally the rich-root/tree-refs flags would be per-revision, not
614+ # per-repo (e.g. streaming a non-rich-root revision out of a rich-root
615+ # repo back into a non-rich-root repo ought to be allowed)
616+ serializer = inventory_delta.InventoryDeltaSerializer(
617+ versioned_root=format.rich_root_data,
618+ tree_references=format.supports_tree_reference)
619 for inv in inventories:
620 key = (inv.revision_id,)
621 parent_keys = parent_map.get(key, ())
622
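As discussed above, the single InventoryDeltaSerializer class is now split into a serializer and a deserializer, with the flags moved from a post-construction `require_flags()` call into the constructors, and incompatibility reported as a distinct exception the stream sink can catch. A minimal sketch of that shape — toy stand-in classes whose names mirror the diff but whose bodies and wire format are hypothetical, not the real bzrlib implementation:

```python
class IncompatibleInventoryDelta(Exception):
    """Raised when a delta uses flags this deserializer does not allow."""


class InventoryDeltaSerializer(object):
    """Writes deltas; flags are fixed at construction, not via require_flags()."""

    def __init__(self, versioned_root, tree_references):
        self.versioned_root = versioned_root
        self.tree_references = tree_references

    def delta_to_lines(self, items):
        # Toy format: a one-line flag header followed by one line per item.
        header = 'versioned_root: %s\n' % str(self.versioned_root).lower()
        return [header] + ['%s\n' % item for item in items]


class InventoryDeltaDeserializer(object):
    """Reads deltas; may refuse versioned roots, like the new class in the diff."""

    def __init__(self, allow_versioned_root=True):
        self.allow_versioned_root = allow_versioned_root

    def parse_text_bytes(self, text):
        lines = text.splitlines()
        versioned_root = (lines[0] == 'versioned_root: true')
        if versioned_root and not self.allow_versioned_root:
            # The sink catches this and raises IncompatibleRevision, rather
            # than scanning the parsed delta for incompatible entries itself.
            raise IncompatibleInventoryDelta('versioned_root not allowed')
        return versioned_root, lines[1:]
```

Having two classes means each constructor takes only the arguments its direction needs, which is the "two separate constructors" option from the review thread.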
623=== modified file 'bzrlib/smart/repository.py'
624--- bzrlib/smart/repository.py 2009-08-04 00:51:24 +0000
625+++ bzrlib/smart/repository.py 2009-08-13 08:20:53 +0000
626@@ -424,18 +424,21 @@
627 return None # Signal that we want a body.
628
629 def _should_fake_unknown(self):
630- # This is a workaround for bugs in pre-1.18 clients that claim to
631- # support receiving streams of CHK repositories. The pre-1.18 client
632- # expects inventory records to be serialized in the format defined by
633- # to_network_name, but in pre-1.18 (at least) that format definition
634- # tries to use the xml5 serializer, which does not correctly handle
635- # rich-roots. After 1.18 the client can also accept inventory-deltas
636- # (which avoids this issue), and those clients will use the
637- # Repository.get_stream_1.18 verb instead of this one.
638- # So: if this repository is CHK, and the to_format doesn't match,
639- # we should just fake an UnknownSmartMethod error so that the client
640- # will fallback to VFS, rather than sending it a stream we know it
641- # cannot handle.
642+ """Return True if we should return UnknownMethod to the client.
643+
644+ This is a workaround for bugs in pre-1.18 clients that claim to
645+ support receiving streams of CHK repositories. The pre-1.18 client
646+ expects inventory records to be serialized in the format defined by
647+ to_network_name, but in pre-1.18 (at least) that format definition
648+ tries to use the xml5 serializer, which does not correctly handle
649+ rich-roots. After 1.18 the client can also accept inventory-deltas
650+ (which avoids this issue), and those clients will use the
651+ Repository.get_stream_1.18 verb instead of this one.
652+ So: if this repository is CHK, and the to_format doesn't match,
653+ we should just fake an UnknownSmartMethod error so that the client
654+ will fallback to VFS, rather than sending it a stream we know it
655+ cannot handle.
656+ """
657 from_format = self._repository._format
658 to_format = self._to_format
659 if not from_format.supports_chks:
660@@ -489,8 +492,7 @@
661 class SmartServerRepositoryGetStream_1_18(SmartServerRepositoryGetStream):
662
663 def _should_fake_unknown(self):
664- # The client is at least 1.18, so we don't need to work around any
665- # bugs.
666+ """Return False; we don't need to work around bugs in 1.18+ clients."""
667 return False
668
669
670
671=== modified file 'bzrlib/tests/per_interrepository/__init__.py'
672--- bzrlib/tests/per_interrepository/__init__.py 2009-08-13 12:51:59 +0000
673+++ bzrlib/tests/per_interrepository/__init__.py 2009-08-13 03:30:41 +0000
674@@ -46,12 +46,12 @@
675 """Transform the input formats to a list of scenarios.
676
677 :param formats: A list of tuples:
678- (interrepo_class, repository_format, repository_format_to).
679+ (label, repository_format, repository_format_to).
680 """
681 result = []
682- for repository_format, repository_format_to in formats:
683- id = '%s,%s' % (repository_format.__class__.__name__,
684- repository_format_to.__class__.__name__)
685+ for label, repository_format, repository_format_to in formats:
686+ id = '%s,%s,%s' % (label, repository_format.__class__.__name__,
687+ repository_format_to.__class__.__name__)
688 scenario = (id,
689 {"transport_server": transport_server,
690 "transport_readonly_server": transport_readonly_server,
691@@ -71,8 +71,8 @@
692 weaverepo,
693 )
694 result = []
695- def add_combo(from_format, to_format):
696- result.append((from_format, to_format))
697+ def add_combo(label, from_format, to_format):
698+ result.append((label, from_format, to_format))
699 # test the default InterRepository between format 6 and the current
700 # default format.
701 # XXX: robertc 20060220 reinstate this when there are two supported
702@@ -83,32 +83,47 @@
703 for optimiser_class in InterRepository._optimisers:
704 format_to_test = optimiser_class._get_repo_format_to_test()
705 if format_to_test is not None:
706- add_combo(format_to_test, format_to_test)
707+ add_combo(optimiser_class.__name__, format_to_test, format_to_test)
708 # if there are specific combinations we want to use, we can add them
709 # here. We want to test rich root upgrading.
710- add_combo(weaverepo.RepositoryFormat5(),
711- knitrepo.RepositoryFormatKnit3())
712- add_combo(knitrepo.RepositoryFormatKnit1(),
713- knitrepo.RepositoryFormatKnit3())
714- add_combo(knitrepo.RepositoryFormatKnit1(),
715+ # XXX: although we attach InterRepository class names to these scenarios,
716+ # there's nothing asserting that these labels correspond to what is
717+ # actually used.
718+ add_combo('InterRepository',
719+ weaverepo.RepositoryFormat5(),
720+ knitrepo.RepositoryFormatKnit3())
721+ add_combo('InterRepository',
722+ knitrepo.RepositoryFormatKnit1(),
723+ knitrepo.RepositoryFormatKnit3())
724+ add_combo('InterKnitRepo',
725+ knitrepo.RepositoryFormatKnit1(),
726 pack_repo.RepositoryFormatKnitPack1())
727- add_combo(pack_repo.RepositoryFormatKnitPack1(),
728+ add_combo('InterKnitRepo',
729+ pack_repo.RepositoryFormatKnitPack1(),
730 knitrepo.RepositoryFormatKnit1())
731- add_combo(knitrepo.RepositoryFormatKnit3(),
732+ add_combo('InterKnitRepo',
733+ knitrepo.RepositoryFormatKnit3(),
734 pack_repo.RepositoryFormatKnitPack3())
735- add_combo(pack_repo.RepositoryFormatKnitPack3(),
736+ add_combo('InterKnitRepo',
737+ pack_repo.RepositoryFormatKnitPack3(),
738 knitrepo.RepositoryFormatKnit3())
739- add_combo(pack_repo.RepositoryFormatKnitPack3(),
740+ add_combo('InterKnitRepo',
741+ pack_repo.RepositoryFormatKnitPack3(),
742 pack_repo.RepositoryFormatKnitPack4())
743- add_combo(pack_repo.RepositoryFormatKnitPack1(),
744- pack_repo.RepositoryFormatKnitPack6RichRoot())
745- add_combo(pack_repo.RepositoryFormatKnitPack6RichRoot(),
746- groupcompress_repo.RepositoryFormat2a())
747- add_combo(groupcompress_repo.RepositoryFormat2a(),
748- pack_repo.RepositoryFormatKnitPack6RichRoot())
749- add_combo(groupcompress_repo.RepositoryFormatCHK2(),
750- groupcompress_repo.RepositoryFormat2a())
751- add_combo(groupcompress_repo.RepositoryFormatCHK1(),
752+ add_combo('InterDifferingSerializer',
753+ pack_repo.RepositoryFormatKnitPack1(),
754+ pack_repo.RepositoryFormatKnitPack6RichRoot())
755+ add_combo('InterDifferingSerializer',
756+ pack_repo.RepositoryFormatKnitPack6RichRoot(),
757+ groupcompress_repo.RepositoryFormat2a())
758+ add_combo('InterDifferingSerializer',
759+ groupcompress_repo.RepositoryFormat2a(),
760+ pack_repo.RepositoryFormatKnitPack6RichRoot())
761+ add_combo('InterRepository',
762+ groupcompress_repo.RepositoryFormatCHK2(),
763+ groupcompress_repo.RepositoryFormat2a())
764+ add_combo('InterDifferingSerializer',
765+ groupcompress_repo.RepositoryFormatCHK1(),
766 groupcompress_repo.RepositoryFormat2a())
767 return result
768
769
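The per_interrepository change above widens the scenario tuples from (from_format, to_format) to (label, from_format, to_format), so that each generated test id names the InterRepository optimiser the combination is expected to exercise. A small sketch of that scenario-id construction — the helper and format classes here are hypothetical stand-ins, not the real bzrlib test infrastructure:

```python
def make_scenarios(formats):
    # Each entry is a 3-tuple (label, from_format, to_format); the label
    # names the InterRepository optimiser the combination should exercise,
    # so a failing test id identifies the code path as well as the formats.
    result = []
    for label, from_format, to_format in formats:
        scenario_id = '%s,%s,%s' % (label,
            from_format.__class__.__name__,
            to_format.__class__.__name__)
        result.append((scenario_id, {
            'repository_format': from_format,
            'repository_format_to': to_format,
        }))
    return result


# Hypothetical format classes standing in for the real bzrlib ones.
class RepositoryFormatKnit1(object):
    pass


class RepositoryFormatKnitPack1(object):
    pass


scenarios = make_scenarios(
    [('InterKnitRepo', RepositoryFormatKnit1(), RepositoryFormatKnitPack1())])
```

As the XXX comment in the diff notes, nothing yet asserts that the label matches the optimiser actually selected at fetch time; the label is purely descriptive.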
770=== modified file 'bzrlib/tests/per_interrepository/test_fetch.py'
771--- bzrlib/tests/per_interrepository/test_fetch.py 2009-08-13 12:51:59 +0000
772+++ bzrlib/tests/per_interrepository/test_fetch.py 2009-08-13 08:55:59 +0000
773@@ -191,8 +191,19 @@
774 ['left', 'right'])
775 self.assertEqual(left_tree.inventory, stacked_left_tree.inventory)
776 self.assertEqual(right_tree.inventory, stacked_right_tree.inventory)
777-
778+
779+ # Finally, it's not enough to see that the basis inventories are
780+ # present. The texts introduced in merge (and only those) should be
781+ # present, and also generating a stream should succeed without blowing
782+ # up.
783 self.assertTrue(unstacked_repo.has_revision('merge'))
784+ expected_texts = set([('file-id', 'merge')])
785+ if stacked_branch.repository.texts.get_parent_map([('root-id',
786+ 'merge')]):
787+ # If a (root-id,merge) text exists, it should be in the stacked
788+ # repo.
789+ expected_texts.add(('root-id', 'merge'))
790+ self.assertEqual(expected_texts, unstacked_repo.texts.keys())
791 self.assertCanStreamRevision(unstacked_repo, 'merge')
792
793 def assertCanStreamRevision(self, repo, revision_id):
794@@ -241,6 +252,19 @@
795 self.addCleanup(stacked_branch.unlock)
796 stacked_second_tree = stacked_branch.repository.revision_tree('second')
797 self.assertEqual(second_tree.inventory, stacked_second_tree.inventory)
798+ # Finally, it's not enough to see that the basis inventories are
799+ # present. The texts introduced in 'third' (and only those) should be
800+ # present, and also generating a stream should succeed without blowing
801+ # up.
802+ self.assertTrue(unstacked_repo.has_revision('third'))
803+ expected_texts = set([('file-id', 'third')])
804+ if stacked_branch.repository.texts.get_parent_map([('root-id',
805+ 'third')]):
806+ # If a (root-id,third) text exists, it should be in the stacked
807+ # repo.
808+ expected_texts.add(('root-id', 'third'))
809+ self.assertEqual(expected_texts, unstacked_repo.texts.keys())
810+ self.assertCanStreamRevision(unstacked_repo, 'third')
811
812 def test_fetch_missing_basis_text(self):
813 """If fetching a delta, we should die if a basis is not present."""
814
815=== modified file 'bzrlib/tests/per_pack_repository.py'
816--- bzrlib/tests/per_pack_repository.py 2009-08-12 22:28:28 +0000
817+++ bzrlib/tests/per_pack_repository.py 2009-08-13 03:29:52 +0000
818@@ -1051,8 +1051,8 @@
819 tree.branch.push(remote_branch)
820 autopack_calls = len([call for call in self.hpss_calls if call ==
821 'PackRepository.autopack'])
822- streaming_calls = len([call for call in self.hpss_calls if call ==
823- 'Repository.insert_stream'])
824+ streaming_calls = len([call for call in self.hpss_calls if call in
825+ ('Repository.insert_stream', 'Repository.insert_stream_1.18')])
826 if autopack_calls:
827 # Non streaming server
828 self.assertEqual(1, autopack_calls)
829
830=== modified file 'bzrlib/tests/test_inventory_delta.py'
831--- bzrlib/tests/test_inventory_delta.py 2009-08-05 02:30:11 +0000
832+++ bzrlib/tests/test_inventory_delta.py 2009-08-11 06:52:07 +0000
833@@ -26,6 +26,7 @@
834 inventory,
835 inventory_delta,
836 )
837+from bzrlib.inventory_delta import InventoryDeltaError
838 from bzrlib.inventory import Inventory
839 from bzrlib.revision import NULL_REVISION
840 from bzrlib.tests import TestCase
841@@ -93,34 +94,34 @@
842 """Test InventoryDeltaSerializer.parse_text_bytes."""
843
844 def test_parse_no_bytes(self):
845- serializer = inventory_delta.InventoryDeltaSerializer()
846+ deserializer = inventory_delta.InventoryDeltaDeserializer()
847 err = self.assertRaises(
848- errors.BzrError, serializer.parse_text_bytes, '')
849+ InventoryDeltaError, deserializer.parse_text_bytes, '')
850 self.assertContainsRe(str(err), 'last line not empty')
851
852 def test_parse_bad_format(self):
853- serializer = inventory_delta.InventoryDeltaSerializer()
854- err = self.assertRaises(errors.BzrError,
855- serializer.parse_text_bytes, 'format: foo\n')
856+ deserializer = inventory_delta.InventoryDeltaDeserializer()
857+ err = self.assertRaises(InventoryDeltaError,
858+ deserializer.parse_text_bytes, 'format: foo\n')
859 self.assertContainsRe(str(err), 'unknown format')
860
861 def test_parse_no_parent(self):
862- serializer = inventory_delta.InventoryDeltaSerializer()
863- err = self.assertRaises(errors.BzrError,
864- serializer.parse_text_bytes,
865+ deserializer = inventory_delta.InventoryDeltaDeserializer()
866+ err = self.assertRaises(InventoryDeltaError,
867+ deserializer.parse_text_bytes,
868 'format: bzr inventory delta v1 (bzr 1.14)\n')
869 self.assertContainsRe(str(err), 'missing parent: marker')
870
871 def test_parse_no_version(self):
872- serializer = inventory_delta.InventoryDeltaSerializer()
873- err = self.assertRaises(errors.BzrError,
874- serializer.parse_text_bytes,
875+ deserializer = inventory_delta.InventoryDeltaDeserializer()
876+ err = self.assertRaises(InventoryDeltaError,
877+ deserializer.parse_text_bytes,
878 'format: bzr inventory delta v1 (bzr 1.14)\n'
879 'parent: null:\n')
880 self.assertContainsRe(str(err), 'missing version: marker')
881
882 def test_parse_duplicate_key_errors(self):
883- serializer = inventory_delta.InventoryDeltaSerializer()
884+ deserializer = inventory_delta.InventoryDeltaDeserializer()
885 double_root_lines = \
886 """format: bzr inventory delta v1 (bzr 1.14)
887 parent: null:
888@@ -130,14 +131,13 @@
889 None\x00/\x00an-id\x00\x00a@e\xc3\xa5ample.com--2004\x00dir\x00\x00
890 None\x00/\x00an-id\x00\x00a@e\xc3\xa5ample.com--2004\x00dir\x00\x00
891 """
892- err = self.assertRaises(errors.BzrError,
893- serializer.parse_text_bytes, double_root_lines)
894+ err = self.assertRaises(InventoryDeltaError,
895+ deserializer.parse_text_bytes, double_root_lines)
896 self.assertContainsRe(str(err), 'duplicate file id')
897
898 def test_parse_versioned_root_only(self):
899- serializer = inventory_delta.InventoryDeltaSerializer()
900- serializer.require_flags(versioned_root=True, tree_references=True)
901- parse_result = serializer.parse_text_bytes(root_only_lines)
902+ deserializer = inventory_delta.InventoryDeltaDeserializer()
903+ parse_result = deserializer.parse_text_bytes(root_only_lines)
904 expected_entry = inventory.make_entry(
905 'directory', u'', None, 'an-id')
906 expected_entry.revision = 'a@e\xc3\xa5ample.com--2004'
907@@ -147,8 +147,7 @@
908 parse_result)
909
910 def test_parse_special_revid_not_valid_last_mod(self):
911- serializer = inventory_delta.InventoryDeltaSerializer()
912- serializer.require_flags(versioned_root=False, tree_references=True)
913+ deserializer = inventory_delta.InventoryDeltaDeserializer()
914 root_only_lines = """format: bzr inventory delta v1 (bzr 1.14)
915 parent: null:
916 version: null:
917@@ -156,13 +155,12 @@
918 tree_references: true
919 None\x00/\x00TREE_ROOT\x00\x00null:\x00dir\x00\x00
920 """
921- err = self.assertRaises(errors.BzrError,
922- serializer.parse_text_bytes, root_only_lines)
923+ err = self.assertRaises(InventoryDeltaError,
924+ deserializer.parse_text_bytes, root_only_lines)
925 self.assertContainsRe(str(err), 'special revisionid found')
926
927 def test_parse_versioned_root_versioned_disabled(self):
928- serializer = inventory_delta.InventoryDeltaSerializer()
929- serializer.require_flags(versioned_root=False, tree_references=True)
930+ deserializer = inventory_delta.InventoryDeltaDeserializer()
931 root_only_lines = """format: bzr inventory delta v1 (bzr 1.14)
932 parent: null:
933 version: null:
934@@ -170,13 +168,12 @@
935 tree_references: true
936 None\x00/\x00TREE_ROOT\x00\x00a@e\xc3\xa5ample.com--2004\x00dir\x00\x00
937 """
938- err = self.assertRaises(errors.BzrError,
939- serializer.parse_text_bytes, root_only_lines)
940+ err = self.assertRaises(InventoryDeltaError,
941+ deserializer.parse_text_bytes, root_only_lines)
942 self.assertContainsRe(str(err), 'Versioned root found')
943
944 def test_parse_unique_root_id_root_versioned_disabled(self):
945- serializer = inventory_delta.InventoryDeltaSerializer()
946- serializer.require_flags(versioned_root=False, tree_references=True)
947+ deserializer = inventory_delta.InventoryDeltaDeserializer()
948 root_only_lines = """format: bzr inventory delta v1 (bzr 1.14)
949 parent: parent-id
950 version: a@e\xc3\xa5ample.com--2004
951@@ -184,29 +181,38 @@
952 tree_references: true
953 None\x00/\x00an-id\x00\x00parent-id\x00dir\x00\x00
954 """
955- err = self.assertRaises(errors.BzrError,
956- serializer.parse_text_bytes, root_only_lines)
957+ err = self.assertRaises(InventoryDeltaError,
958+ deserializer.parse_text_bytes, root_only_lines)
959 self.assertContainsRe(str(err), 'Versioned root found')
960
961 def test_parse_unversioned_root_versioning_enabled(self):
962- serializer = inventory_delta.InventoryDeltaSerializer()
963- serializer.require_flags(versioned_root=True, tree_references=True)
964- err = self.assertRaises(errors.BzrError,
965- serializer.parse_text_bytes, root_only_unversioned)
966- self.assertContainsRe(
967- str(err), 'serialized versioned_root flag is wrong: False')
968+ deserializer = inventory_delta.InventoryDeltaDeserializer()
969+ parse_result = deserializer.parse_text_bytes(root_only_unversioned)
970+ expected_entry = inventory.make_entry(
971+ 'directory', u'', None, 'TREE_ROOT')
972+ expected_entry.revision = 'entry-version'
973+ self.assertEqual(
974+ ('null:', 'entry-version', False, False,
975+ [(None, u'', 'TREE_ROOT', expected_entry)]),
976+ parse_result)
977+
978+ def test_parse_versioned_root_when_disabled(self):
979+ deserializer = inventory_delta.InventoryDeltaDeserializer(
980+ allow_versioned_root=False)
981+ err = self.assertRaises(inventory_delta.IncompatibleInventoryDelta,
982+ deserializer.parse_text_bytes, root_only_lines)
983+ self.assertEquals("versioned_root not allowed", str(err))
984
985 def test_parse_tree_when_disabled(self):
986- serializer = inventory_delta.InventoryDeltaSerializer()
987- serializer.require_flags(versioned_root=True, tree_references=False)
988- err = self.assertRaises(errors.BzrError,
989- serializer.parse_text_bytes, reference_lines)
990- self.assertContainsRe(
991- str(err), 'serialized tree_references flag is wrong: True')
992+ deserializer = inventory_delta.InventoryDeltaDeserializer(
993+ allow_tree_references=False)
994+ err = self.assertRaises(inventory_delta.IncompatibleInventoryDelta,
995+ deserializer.parse_text_bytes, reference_lines)
996+ self.assertEquals("Tree reference not allowed", str(err))
997
998 def test_parse_tree_when_header_disallows(self):
999 # A deserializer that allows tree_references to be set or unset.
1000- serializer = inventory_delta.InventoryDeltaSerializer()
1001+ deserializer = inventory_delta.InventoryDeltaDeserializer()
1002 # A serialised inventory delta with a header saying no tree refs, but
1003 # that has a tree ref in its content.
1004 lines = """format: bzr inventory delta v1 (bzr 1.14)
1005@@ -216,13 +222,13 @@
1006 tree_references: false
1007 None\x00/foo\x00id\x00TREE_ROOT\x00changed\x00tree\x00subtree-version
1008 """
1009- err = self.assertRaises(errors.BzrError,
1010- serializer.parse_text_bytes, lines)
1011+ err = self.assertRaises(InventoryDeltaError,
1012+ deserializer.parse_text_bytes, lines)
1013 self.assertContainsRe(str(err), 'Tree reference found')
1014
1015 def test_parse_versioned_root_when_header_disallows(self):
1016 # A deserializer that allows tree_references to be set or unset.
1017- serializer = inventory_delta.InventoryDeltaSerializer()
1018+ deserializer = inventory_delta.InventoryDeltaDeserializer()
1019 # A serialised inventory delta with a header saying no tree refs, but
1020 # that has a tree ref in its content.
1021 lines = """format: bzr inventory delta v1 (bzr 1.14)
1022@@ -232,35 +238,35 @@
1023 tree_references: false
1024 None\x00/\x00TREE_ROOT\x00\x00a@e\xc3\xa5ample.com--2004\x00dir
1025 """
1026- err = self.assertRaises(errors.BzrError,
1027- serializer.parse_text_bytes, lines)
1028+ err = self.assertRaises(InventoryDeltaError,
1029+ deserializer.parse_text_bytes, lines)
1030 self.assertContainsRe(str(err), 'Versioned root found')
1031
1032 def test_parse_last_line_not_empty(self):
1033 """newpath must start with / if it is not None."""
1034 # Trim the trailing newline from a valid serialization
1035 lines = root_only_lines[:-1]
1036- serializer = inventory_delta.InventoryDeltaSerializer()
1037- err = self.assertRaises(errors.BzrError,
1038- serializer.parse_text_bytes, lines)
1039+ deserializer = inventory_delta.InventoryDeltaDeserializer()
1040+ err = self.assertRaises(InventoryDeltaError,
1041+ deserializer.parse_text_bytes, lines)
1042 self.assertContainsRe(str(err), 'last line not empty')
1043
1044 def test_parse_invalid_newpath(self):
1045 """newpath must start with / if it is not None."""
1046 lines = empty_lines
1047 lines += "None\x00bad\x00TREE_ROOT\x00\x00version\x00dir\n"
1048- serializer = inventory_delta.InventoryDeltaSerializer()
1049- err = self.assertRaises(errors.BzrError,
1050- serializer.parse_text_bytes, lines)
1051+ deserializer = inventory_delta.InventoryDeltaDeserializer()
1052+ err = self.assertRaises(InventoryDeltaError,
1053+ deserializer.parse_text_bytes, lines)
1054 self.assertContainsRe(str(err), 'newpath invalid')
1055
1056 def test_parse_invalid_oldpath(self):
1057 """oldpath must start with / if it is not None."""
1058 lines = root_only_lines
1059 lines += "bad\x00/new\x00file-id\x00\x00version\x00dir\n"
1060- serializer = inventory_delta.InventoryDeltaSerializer()
1061- err = self.assertRaises(errors.BzrError,
1062- serializer.parse_text_bytes, lines)
1063+ deserializer = inventory_delta.InventoryDeltaDeserializer()
1064+ err = self.assertRaises(InventoryDeltaError,
1065+ deserializer.parse_text_bytes, lines)
1066 self.assertContainsRe(str(err), 'oldpath invalid')
1067
1068 def test_parse_new_file(self):
1069@@ -270,8 +276,8 @@
1070 lines += (
1071 "None\x00/new\x00file-id\x00an-id\x00version\x00file\x00123\x00" +
1072 "\x00" + fake_sha + "\n")
1073- serializer = inventory_delta.InventoryDeltaSerializer()
1074- parse_result = serializer.parse_text_bytes(lines)
1075+ deserializer = inventory_delta.InventoryDeltaDeserializer()
1076+ parse_result = deserializer.parse_text_bytes(lines)
1077 expected_entry = inventory.make_entry(
1078 'file', u'new', 'an-id', 'file-id')
1079 expected_entry.revision = 'version'
1080@@ -285,8 +291,8 @@
1081 lines = root_only_lines
1082 lines += (
1083 "/old-file\x00None\x00deleted-id\x00\x00null:\x00deleted\x00\x00\n")
1084- serializer = inventory_delta.InventoryDeltaSerializer()
1085- parse_result = serializer.parse_text_bytes(lines)
1086+ deserializer = inventory_delta.InventoryDeltaDeserializer()
1087+ parse_result = deserializer.parse_text_bytes(lines)
1088 delta = parse_result[4]
1089 self.assertEqual(
1090 (u'old-file', None, 'deleted-id', None), delta[-1])
1091@@ -299,8 +305,8 @@
1092 old_inv = Inventory(None)
1093 new_inv = Inventory(None)
1094 delta = new_inv._make_delta(old_inv)
1095- serializer = inventory_delta.InventoryDeltaSerializer()
1096- serializer.require_flags(True, True)
1097+ serializer = inventory_delta.InventoryDeltaSerializer(
1098+ versioned_root=True, tree_references=True)
1099 self.assertEqual(StringIO(empty_lines).readlines(),
1100 serializer.delta_to_lines(NULL_REVISION, NULL_REVISION, delta))
1101
1102@@ -311,8 +317,8 @@
1103 root.revision = 'a@e\xc3\xa5ample.com--2004'
1104 new_inv.add(root)
1105 delta = new_inv._make_delta(old_inv)
1106- serializer = inventory_delta.InventoryDeltaSerializer()
1107- serializer.require_flags(versioned_root=True, tree_references=True)
1108+ serializer = inventory_delta.InventoryDeltaSerializer(
1109+ versioned_root=True, tree_references=True)
1110 self.assertEqual(StringIO(root_only_lines).readlines(),
1111 serializer.delta_to_lines(NULL_REVISION, 'entry-version', delta))
1112
1113@@ -324,15 +330,16 @@
1114 root.revision = 'entry-version'
1115 new_inv.add(root)
1116 delta = new_inv._make_delta(old_inv)
1117- serializer = inventory_delta.InventoryDeltaSerializer()
1118- serializer.require_flags(False, False)
1119+ serializer = inventory_delta.InventoryDeltaSerializer(
1120+ versioned_root=False, tree_references=False)
1121 serialized_lines = serializer.delta_to_lines(
1122 NULL_REVISION, 'entry-version', delta)
1123 self.assertEqual(StringIO(root_only_unversioned).readlines(),
1124 serialized_lines)
1125+ deserializer = inventory_delta.InventoryDeltaDeserializer()
1126 self.assertEqual(
1127 (NULL_REVISION, 'entry-version', False, False, delta),
1128- serializer.parse_text_bytes(''.join(serialized_lines)))
1129+ deserializer.parse_text_bytes(''.join(serialized_lines)))
1130
1131 def test_unversioned_non_root_errors(self):
1132 old_inv = Inventory(None)
1133@@ -343,9 +350,9 @@
1134 non_root = new_inv.make_entry('directory', 'foo', root.file_id, 'id')
1135 new_inv.add(non_root)
1136 delta = new_inv._make_delta(old_inv)
1137- serializer = inventory_delta.InventoryDeltaSerializer()
1138- serializer.require_flags(versioned_root=True, tree_references=True)
1139- err = self.assertRaises(errors.BzrError,
1140+ serializer = inventory_delta.InventoryDeltaSerializer(
1141+ versioned_root=True, tree_references=True)
1142+ err = self.assertRaises(InventoryDeltaError,
1143 serializer.delta_to_lines, NULL_REVISION, 'entry-version', delta)
1144 self.assertEqual(str(err), 'no version for fileid id')
1145
1146@@ -355,9 +362,9 @@
1147 root = new_inv.make_entry('directory', '', None, 'TREE_ROOT')
1148 new_inv.add(root)
1149 delta = new_inv._make_delta(old_inv)
1150- serializer = inventory_delta.InventoryDeltaSerializer()
1151- serializer.require_flags(versioned_root=True, tree_references=True)
1152- err = self.assertRaises(errors.BzrError,
1153+ serializer = inventory_delta.InventoryDeltaSerializer(
1154+ versioned_root=True, tree_references=True)
1155+ err = self.assertRaises(InventoryDeltaError,
1156 serializer.delta_to_lines, NULL_REVISION, 'entry-version', delta)
1157 self.assertEqual(str(err), 'no version for fileid TREE_ROOT')
1158
1159@@ -368,9 +375,9 @@
1160 root.revision = 'a@e\xc3\xa5ample.com--2004'
1161 new_inv.add(root)
1162 delta = new_inv._make_delta(old_inv)
1163- serializer = inventory_delta.InventoryDeltaSerializer()
1164- serializer.require_flags(versioned_root=False, tree_references=True)
1165- err = self.assertRaises(errors.BzrError,
1166+ serializer = inventory_delta.InventoryDeltaSerializer(
1167+ versioned_root=False, tree_references=True)
1168+ err = self.assertRaises(InventoryDeltaError,
1169 serializer.delta_to_lines, NULL_REVISION, 'entry-version', delta)
1170 self.assertStartsWith(str(err), 'Version present for / in TREE_ROOT')
1171
1172@@ -385,8 +392,8 @@
1173 non_root.kind = 'strangelove'
1174 new_inv.add(non_root)
1175 delta = new_inv._make_delta(old_inv)
1176- serializer = inventory_delta.InventoryDeltaSerializer()
1177- serializer.require_flags(True, True)
1178+ serializer = inventory_delta.InventoryDeltaSerializer(
1179+ versioned_root=True, tree_references=True)
1180 # we expect keyerror because there is little value wrapping this.
1181 # This test aims to prove that it errors more than how it errors.
1182 err = self.assertRaises(KeyError,
1183@@ -405,8 +412,8 @@
1184 non_root.reference_revision = 'subtree-version'
1185 new_inv.add(non_root)
1186 delta = new_inv._make_delta(old_inv)
1187- serializer = inventory_delta.InventoryDeltaSerializer()
1188- serializer.require_flags(versioned_root=True, tree_references=False)
1189+ serializer = inventory_delta.InventoryDeltaSerializer(
1190+ versioned_root=True, tree_references=False)
1191 # we expect keyerror because there is little value wrapping this.
1192 # This test aims to prove that it errors more than how it errors.
1193 err = self.assertRaises(KeyError,
1194@@ -425,8 +432,8 @@
1195 non_root.reference_revision = 'subtree-version'
1196 new_inv.add(non_root)
1197 delta = new_inv._make_delta(old_inv)
1198- serializer = inventory_delta.InventoryDeltaSerializer()
1199- serializer.require_flags(versioned_root=True, tree_references=True)
1200+ serializer = inventory_delta.InventoryDeltaSerializer(
1201+ versioned_root=True, tree_references=True)
1202 self.assertEqual(StringIO(reference_lines).readlines(),
1203 serializer.delta_to_lines(NULL_REVISION, 'entry-version', delta))
1204
1205@@ -434,19 +441,19 @@
1206 root_entry = inventory.make_entry('directory', '', None, 'TREE_ROOT')
1207 root_entry.revision = 'some-version'
1208 delta = [(None, '', 'TREE_ROOT', root_entry)]
1209- serializer = inventory_delta.InventoryDeltaSerializer()
1210- serializer.require_flags(versioned_root=False, tree_references=True)
1211+ serializer = inventory_delta.InventoryDeltaSerializer(
1212+ versioned_root=False, tree_references=True)
1213 self.assertRaises(
1214- errors.BzrError, serializer.delta_to_lines, 'old-version',
1215+ InventoryDeltaError, serializer.delta_to_lines, 'old-version',
1216 'new-version', delta)
1217
1218 def test_to_inventory_root_id_not_versioned(self):
1219 delta = [(None, '', 'an-id', inventory.make_entry(
1220 'directory', '', None, 'an-id'))]
1221- serializer = inventory_delta.InventoryDeltaSerializer()
1222- serializer.require_flags(versioned_root=True, tree_references=True)
1223+ serializer = inventory_delta.InventoryDeltaSerializer(
1224+ versioned_root=True, tree_references=True)
1225 self.assertRaises(
1226- errors.BzrError, serializer.delta_to_lines, 'old-version',
1227+ InventoryDeltaError, serializer.delta_to_lines, 'old-version',
1228 'new-version', delta)
1229
1230 def test_to_inventory_has_tree_not_meant_to(self):
1231@@ -459,9 +466,9 @@
1232 (None, 'foo', 'ref-id', tree_ref)
1233 # a file that followed the root move
1234 ]
1235- serializer = inventory_delta.InventoryDeltaSerializer()
1236- serializer.require_flags(versioned_root=True, tree_references=True)
1237- self.assertRaises(errors.BzrError, serializer.delta_to_lines,
1238+ serializer = inventory_delta.InventoryDeltaSerializer(
1239+ versioned_root=True, tree_references=True)
1240+ self.assertRaises(InventoryDeltaError, serializer.delta_to_lines,
1241 'old-version', 'new-version', delta)
1242
1243 def test_to_inventory_torture(self):
1244@@ -511,8 +518,8 @@
1245 executable=True, text_size=30, text_sha1='some-sha',
1246 revision='old-rev')),
1247 ]
1248- serializer = inventory_delta.InventoryDeltaSerializer()
1249- serializer.require_flags(versioned_root=True, tree_references=True)
1250+ serializer = inventory_delta.InventoryDeltaSerializer(
1251+ versioned_root=True, tree_references=True)
1252 lines = serializer.delta_to_lines(NULL_REVISION, 'something', delta)
1253 expected = """format: bzr inventory delta v1 (bzr 1.14)
1254 parent: null:
1255@@ -565,13 +572,13 @@
1256 def test_file_without_size(self):
1257 file_entry = inventory.make_entry('file', 'a file', None, 'file-id')
1258 file_entry.text_sha1 = 'foo'
1259- self.assertRaises(errors.BzrError,
1260+ self.assertRaises(InventoryDeltaError,
1261 inventory_delta._file_content, file_entry)
1262
1263 def test_file_without_sha1(self):
1264 file_entry = inventory.make_entry('file', 'a file', None, 'file-id')
1265 file_entry.text_size = 10
1266- self.assertRaises(errors.BzrError,
1267+ self.assertRaises(InventoryDeltaError,
1268 inventory_delta._file_content, file_entry)
1269
1270 def test_link_empty_target(self):
1271@@ -594,7 +601,7 @@
1272
1273 def test_link_no_target(self):
1274 entry = inventory.make_entry('symlink', 'a link', None)
1275- self.assertRaises(errors.BzrError,
1276+ self.assertRaises(InventoryDeltaError,
1277 inventory_delta._link_content, entry)
1278
1279 def test_reference_null(self):
1280@@ -611,5 +618,5 @@
1281
1282 def test_reference_no_reference(self):
1283 entry = inventory.make_entry('tree-reference', 'a tree', None)
1284- self.assertRaises(errors.BzrError,
1285+ self.assertRaises(InventoryDeltaError,
1286 inventory_delta._reference_content, entry)
1287
1288=== modified file 'bzrlib/tests/test_remote.py'
1289--- bzrlib/tests/test_remote.py 2009-08-13 12:51:59 +0000
1290+++ bzrlib/tests/test_remote.py 2009-08-13 03:29:52 +0000
1291@@ -2213,7 +2213,26 @@
1292 self.assertEqual([], client._calls)
1293
1294
1295-class TestRepositoryInsertStream(TestRemoteRepository):
1296+class TestRepositoryInsertStreamBase(TestRemoteRepository):
1297+ """Base class for Repository.insert_stream and .insert_stream_1.18
1298+ tests.
1299+ """
1300+
1301+ def checkInsertEmptyStream(self, repo, client):
1302+ """Insert an empty stream, checking the result.
1303+
1304+ This checks that there are no resume_tokens or missing_keys, and that
1305+ the client is finished.
1306+ """
1307+ sink = repo._get_sink()
1308+ fmt = repository.RepositoryFormat.get_default_format()
1309+ resume_tokens, missing_keys = sink.insert_stream([], fmt, [])
1310+ self.assertEqual([], resume_tokens)
1311+ self.assertEqual(set(), missing_keys)
1312+ self.assertFinished(client)
1313+
1314+
1315+class TestRepositoryInsertStream(TestRepositoryInsertStreamBase):
1316 """Tests for using Repository.insert_stream verb when the _1.18 variant is
1317 not available.
1318
1319@@ -2236,12 +2255,7 @@
1320 client.add_expected_call(
1321 'Repository.insert_stream', ('quack/', ''),
1322 'success', ('ok',))
1323- sink = repo._get_sink()
1324- fmt = repository.RepositoryFormat.get_default_format()
1325- resume_tokens, missing_keys = sink.insert_stream([], fmt, [])
1326- self.assertEqual([], resume_tokens)
1327- self.assertEqual(set(), missing_keys)
1328- self.assertFinished(client)
1329+ self.checkInsertEmptyStream(repo, client)
1330
1331 def test_locked_repo_with_no_lock_token(self):
1332 transport_path = 'quack'
1333@@ -2259,12 +2273,7 @@
1334 'Repository.insert_stream', ('quack/', ''),
1335 'success', ('ok',))
1336 repo.lock_write()
1337- sink = repo._get_sink()
1338- fmt = repository.RepositoryFormat.get_default_format()
1339- resume_tokens, missing_keys = sink.insert_stream([], fmt, [])
1340- self.assertEqual([], resume_tokens)
1341- self.assertEqual(set(), missing_keys)
1342- self.assertFinished(client)
1343+ self.checkInsertEmptyStream(repo, client)
1344
1345 def test_locked_repo_with_lock_token(self):
1346 transport_path = 'quack'
1347@@ -2282,18 +2291,13 @@
1348 'Repository.insert_stream_locked', ('quack/', '', 'a token'),
1349 'success', ('ok',))
1350 repo.lock_write()
1351- sink = repo._get_sink()
1352- fmt = repository.RepositoryFormat.get_default_format()
1353- resume_tokens, missing_keys = sink.insert_stream([], fmt, [])
1354- self.assertEqual([], resume_tokens)
1355- self.assertEqual(set(), missing_keys)
1356- self.assertFinished(client)
1357+ self.checkInsertEmptyStream(repo, client)
1358
1359 def test_stream_with_inventory_deltas(self):
1360- """'inventory-deltas' substreams can't be sent to the
1361- Repository.insert_stream verb. So when one is encountered the
1362- RemoteSink immediately stops using that verb and falls back to VFS
1363- insert_stream.
1364+ """'inventory-deltas' substreams cannot be sent to the
1365+ Repository.insert_stream verb, because not all servers that implement
1366+ that verb will accept them. So when one is encountered the RemoteSink
1367+ immediately stops using that verb and falls back to VFS insert_stream.
1368 """
1369 transport_path = 'quack'
1370 repo, client = self.setup_fake_client_and_repository(transport_path)
1371@@ -2368,8 +2372,8 @@
1372 'directory', 'newdir', inv.root.file_id, 'newdir-id')
1373 entry.revision = 'ghost'
1374 delta = [(None, 'newdir', 'newdir-id', entry)]
1375- serializer = inventory_delta.InventoryDeltaSerializer()
1376- serializer.require_flags(True, False)
1377+ serializer = inventory_delta.InventoryDeltaSerializer(
1378+ versioned_root=True, tree_references=False)
1379 lines = serializer.delta_to_lines('rev1', 'rev2', delta)
1380 yield versionedfile.ChunkedContentFactory(
1381 ('rev2',), (('rev1',)), None, lines)
1382@@ -2380,7 +2384,7 @@
1383 return stream_with_inv_delta()
1384
1385
1386-class TestRepositoryInsertStream_1_18(TestRemoteRepository):
1387+class TestRepositoryInsertStream_1_18(TestRepositoryInsertStreamBase):
1388
1389 def test_unlocked_repo(self):
1390 transport_path = 'quack'
1391@@ -2391,12 +2395,7 @@
1392 client.add_expected_call(
1393 'Repository.insert_stream_1.18', ('quack/', ''),
1394 'success', ('ok',))
1395- sink = repo._get_sink()
1396- fmt = repository.RepositoryFormat.get_default_format()
1397- resume_tokens, missing_keys = sink.insert_stream([], fmt, [])
1398- self.assertEqual([], resume_tokens)
1399- self.assertEqual(set(), missing_keys)
1400- self.assertFinished(client)
1401+ self.checkInsertEmptyStream(repo, client)
1402
1403 def test_locked_repo_with_no_lock_token(self):
1404 transport_path = 'quack'
1405@@ -2411,12 +2410,7 @@
1406 'Repository.insert_stream_1.18', ('quack/', ''),
1407 'success', ('ok',))
1408 repo.lock_write()
1409- sink = repo._get_sink()
1410- fmt = repository.RepositoryFormat.get_default_format()
1411- resume_tokens, missing_keys = sink.insert_stream([], fmt, [])
1412- self.assertEqual([], resume_tokens)
1413- self.assertEqual(set(), missing_keys)
1414- self.assertFinished(client)
1415+ self.checkInsertEmptyStream(repo, client)
1416
1417 def test_locked_repo_with_lock_token(self):
1418 transport_path = 'quack'
1419@@ -2431,12 +2425,7 @@
1420 'Repository.insert_stream_1.18', ('quack/', '', 'a token'),
1421 'success', ('ok',))
1422 repo.lock_write()
1423- sink = repo._get_sink()
1424- fmt = repository.RepositoryFormat.get_default_format()
1425- resume_tokens, missing_keys = sink.insert_stream([], fmt, [])
1426- self.assertEqual([], resume_tokens)
1427- self.assertEqual(set(), missing_keys)
1428- self.assertFinished(client)
1429+ self.checkInsertEmptyStream(repo, client)
1430
1431
1432 class TestRepositoryTarball(TestRemoteRepository):
1433
1434=== modified file 'bzrlib/tests/test_smart.py'
1435--- bzrlib/tests/test_smart.py 2009-07-23 07:37:05 +0000
1436+++ bzrlib/tests/test_smart.py 2009-08-07 02:26:45 +0000
1437@@ -1242,6 +1242,7 @@
1438 SmartServerResponse(('history-incomplete', 2, r2)),
1439 request.execute('stacked', 1, (3, r3)))
1440
1441+
1442 class TestSmartServerRepositoryGetStream(tests.TestCaseWithMemoryTransport):
1443
1444 def make_two_commit_repo(self):
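
The API change discussed above (and visible in the hunks replacing `serializer.require_flags(...)` with constructor keyword arguments) is the classic move from two-phase initialization to constructor injection. The following sketch is illustrative only, not bzrlib code — `TwoPhaseSerializer` and `OnePhaseSerializer` are hypothetical names — but it shows why the new style is less error-prone: an instance can never exist in a half-configured state.

```python
# Illustrative sketch (hypothetical classes, not bzrlib's actual implementation):
# contrast the old two-phase pattern with the constructor-argument pattern
# adopted in this patch for InventoryDeltaSerializer.

class TwoPhaseSerializer:
    """Old style: require_flags() must be called before the object is usable."""

    def __init__(self):
        self._flags = None  # invalid until require_flags() is called

    def require_flags(self, versioned_root, tree_references):
        self._flags = (versioned_root, tree_references)

    def delta_to_lines(self, delta):
        if self._flags is None:
            # A forgotten require_flags() call only fails here, at use time.
            raise RuntimeError("require_flags() was never called")
        return ["flags: %r %r" % self._flags] + list(delta)


class OnePhaseSerializer:
    """New style: flags are constructor arguments, so every instance is valid."""

    def __init__(self, versioned_root, tree_references):
        self._flags = (versioned_root, tree_references)

    def delta_to_lines(self, delta):
        return ["flags: %r %r" % self._flags] + list(delta)
```

With constructor injection, a misconfiguration surfaces immediately at the call site (a `TypeError` for missing arguments) rather than later, deep inside `delta_to_lines`; splitting serializer and deserializer into separate classes then gives the "two separate constructors" mentioned in the review.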
