Merge lp:~spiv/bzr/fetch-spec-everything-not-in-other into lp:bzr

Proposed by Andrew Bennetts
Status: Superseded
Proposed branch: lp:~spiv/bzr/fetch-spec-everything-not-in-other
Merge into: lp:bzr
Diff against target: 981 lines (+488/-74)
13 files modified
bzrlib/fetch.py (+36/-11)
bzrlib/graph.py (+174/-2)
bzrlib/remote.py (+30/-9)
bzrlib/repofmt/knitrepo.py (+19/-11)
bzrlib/repofmt/weaverepo.py (+19/-11)
bzrlib/repository.py (+94/-18)
bzrlib/smart/repository.py (+16/-0)
bzrlib/tests/per_interrepository/test_interrepository.py (+9/-2)
bzrlib/tests/per_repository_reference/test_fetch.py (+40/-1)
bzrlib/tests/test_remote.py (+27/-2)
bzrlib/tests/test_repository.py (+1/-1)
bzrlib/tests/test_smart.py (+18/-6)
doc/en/release-notes/bzr-2.3.txt (+5/-0)
To merge this branch: bzr merge lp:~spiv/bzr/fetch-spec-everything-not-in-other
Reviewer Review Type Date Requested Status
John A Meinel Needs Fixing
Review via email: mp+42078@code.launchpad.net

This proposal has been superseded by a proposal from 2010-12-06.

Commit message

Add EverythingResult, EverythingNotInOther, and NotInOtherForRevs.

Description of the change

Like <https://code.launchpad.net/~spiv/bzr/sprout-does-not-reopen-repo/+merge/41037>, this is a step towards bug 309682 (i.e. it is a pre-req for <lp:~spiv/bzr/fetch-all-tags-309682>), as well as being a general improvement.

This patch defines some new “fetch specs” to be passed as the fetch_spec argument of fetch: EverythingResult, EmptySearchResult, EverythingNotInOther, and NotInOtherForRevs. I think the naming is a bit off: the original fetch spec was called “SearchResult”, which made some sense, but there's also “PendingAncestryResult”, which isn't really a “result”. So perhaps we should rename some or all of these things. As an added wrinkle, EverythingNotInOther and NotInOtherForRevs aren't directly “results” at all, but factories that can be turned into a result. Suggestions for clearer naming would be very welcome!

A nice thing about adding these objects is that they shift the logic for handling these different cases out of the innards of fetch.py to somewhere more modular. This in turn means we can support these things better over HPSS, and so this patch does that: the 'everything' search is now added to the network protocol. Happily, no new verb was needed: we can safely interpret a BadSearch error in this case as meaning we need to fall back.

Another change in this patch is the deprecation of search_missing_revision_ids' revision_id argument in favour of the new revision_ids argument (plural).
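In outline, the dispatch this patch introduces in fetch.py works like this. The sketch below uses stand-in stub classes (not bzrlib's real ones) purely to illustrate the control flow: a fetch spec is either already a concrete search result, or a factory exposing get_search_result():

```python
# Stubbed sketch of the dispatch in RepoFetcher._revids_to_fetch.
NULL_REVISION = 'null:'

class EmptySearchResult(object):
    def is_empty(self):
        return True

class NotInOtherForRevs(object):
    """Stand-in factory: resolves lazily into a search result."""
    def __init__(self, to_repo, from_repo, required_ids):
        self.required_ids = required_ids
    def get_search_result(self):
        return 'search-for:%s' % ','.join(self.required_ids)

def revids_to_fetch(fetch_spec, last_revision, to_repo=None, from_repo=None):
    get_search_result = getattr(fetch_spec, 'get_search_result', None)
    if get_search_result is not None:
        # A factory (EverythingNotInOther or similar): resolve it now.
        return get_search_result()
    elif fetch_spec is not None:
        # Already a concrete search result.
        return fetch_spec
    elif last_revision == NULL_REVISION:
        # Explicitly nothing to fetch.
        return EmptySearchResult()
    else:
        return NotInOtherForRevs(
            to_repo, from_repo, [last_revision]).get_search_result()

print(revids_to_fetch(None, 'rev-1'))  # -> search-for:rev-1
```

The point of the getattr check is that callers never need to know which kind of spec they were handed; both kinds end up as a search result.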

Revision history for this message
John A Meinel (jameinel) wrote :

+ if not isinstance(search, graph.EverythingResult):
^- can we add an attribute, rather than an isinstance check?

if not search.is_everything():
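For comparison, the attribute-based check John suggests could look roughly like this (hypothetical names; this is a sketch of the suggestion, not what the patch does):

```python
class AbstractSearchResult(object):
    def is_everything(self):
        # Default: most search results describe a bounded set of revisions.
        return False

class SearchResult(AbstractSearchResult):
    pass

class EverythingResult(AbstractSearchResult):
    def is_everything(self):
        return True

def needs_parent_inventories(search):
    # Call sites ask the object instead of checking its concrete type.
    return not search.is_everything()
```

The trade-off, discussed below, is that this requires a common base class (or duck-typed method) on every search result.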

490 + def _present_source_revisions_for(self, revision_ids):
491 + """Returns set of all revisions in ancestry of revision_ids present in
492 + the source repo.
493 +
494 + :param revision_ids: if None, all revisions in source are returned.
495 + """
496 + if revision_ids is not None:
497 + # First, ensure all specified revisions exist. Callers expect
498 + # NoSuchRevision when they pass absent revision_ids here.
499 + revision_ids = set(revision_ids)
500 + graph = self.source.get_graph()
501 + present_revs = set(graph.get_parent_map(revision_ids))
502 + missing = revision_ids.difference(present_revs)
503 + if missing:
504 + raise errors.NoSuchRevision(self.source, missing.pop())
505 + source_ids = [rev_id for (rev_id, parents) in
506 + self.source.get_graph().iter_ancestry(revision_ids)
507 + if rev_id != _mod_revision.NULL_REVISION
508 + and parents is not None]
^- you have "graph = self.source.get_graph()" and then you inline "self.source.get_graph()" into the source_ids list comprehension. Better to re-use the graph object.

145 + def get_recipe(self):
146 + raise NotImplementedError(self.get_recipe)
147 +
148 + def get_network_struct(self):
149 + return ('everything',)

Why don't EverythingNotInOther and NotInOtherForRevs implement these functions?

Overall, I think these changes seem good, but I don't really see how this gets us closer to computing a search without having one end doing a step-by-step search. (Certainly you're still using the same 'search_missing_revision_ids', and you don't seem to be serializing anything over the wire...)

review: Needs Fixing
Revision history for this message
Andrew Bennetts (spiv) wrote :

John A Meinel wrote:
> Review: Needs Fixing
> + if not isinstance(search, graph.EverythingResult):
> ^- can we add an attribute, rather than an isinstance check?
>
> if not search.is_everything():

Hmm, I can do that, although currently there's no common base class for these
objects, and my gut feeling is that it would be good to keep their interface as
small and simple as possible. So I'm unsure about whether or not to do this. I
think it's probably easier to add this later than it is to remove it later.

> 490 + def _present_source_revisions_for(self, revision_ids):
> 491 + """Returns set of all revisions in ancestry of revision_ids present in
> 492 + the source repo.
> 493 +
> 494 + :param revision_ids: if None, all revisions in source are returned.
> 495 + """
> 496 + if revision_ids is not None:
> 497 + # First, ensure all specified revisions exist. Callers expect
> 498 + # NoSuchRevision when they pass absent revision_ids here.
> 499 + revision_ids = set(revision_ids)
> 500 + graph = self.source.get_graph()
> 501 + present_revs = set(graph.get_parent_map(revision_ids))
> 502 + missing = revision_ids.difference(present_revs)
> 503 + if missing:
> 504 + raise errors.NoSuchRevision(self.source, missing.pop())
> 505 + source_ids = [rev_id for (rev_id, parents) in
> 506 + self.source.get_graph().iter_ancestry(revision_ids)
> 507 + if rev_id != _mod_revision.NULL_REVISION
> 508 + and parents is not None]
> ^- you have "graph = self.source.get_graph()" and then you inline "self.source.get_graph()" into the source_ids list comprehension. Better to re-use the graph object.

Fixed, thanks for spotting that. I think it occurred because I initially
copy-and-pasted the list comprehension from graph.py.

> 145 + def get_recipe(self):
> 146 + raise NotImplementedError(self.get_recipe)
> 147 +
> 148 + def get_network_struct(self):
> 149 + return ('everything',)
>
> Why don't EverythingNotInOther and NotInOtherForRevs implement these functions?

Because they are a different kind of object. Obviously that's not clear enough
in the current code!

The names could stand to be improved, but there are basically two kinds of
“fetch spec” in this patch:

 * a network-ready search result: SearchResult, PendingAncestryResult,
   EverythingResult. They implement all the methods SearchResult implements
   (I'm not certain that's the perfect design, but it's not too far off.)
 * a lightweight object that can be resolved into a network-ready search result
   later: EverythingNotInOther, NotInOtherForRevs. They implement a get_search
   method that inspects the repository/ies to generate the search result.
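The split above can be sketched as two tiny interfaces (simplified stand-ins for the AbstractSearchResult/AbstractSearch pair the patch adds to graph.py; the stub get_search_result just returns a ready-made result rather than querying repositories):

```python
class AbstractSearchResult(object):
    """Network-ready: knows its keys and its wire encoding."""
    def get_network_struct(self):
        raise NotImplementedError(self.get_network_struct)

class AbstractSearch(object):
    """Lazy: inspects repositories later to produce a search result."""
    def get_search_result(self):
        raise NotImplementedError(self.get_search_result)

class EverythingResult(AbstractSearchResult):
    def get_network_struct(self):
        return ('everything',)

class EverythingNotInOther(AbstractSearch):
    def __init__(self, to_repo, from_repo):
        self.to_repo, self.from_repo = to_repo, from_repo
    def get_search_result(self):
        # In bzrlib this would call to_repo.search_missing_revision_ids();
        # here we hand back a ready-made result for illustration.
        return EverythingResult()

spec = EverythingNotInOther('to-repo', 'from-repo')
result = spec.get_search_result()   # resolve the lazy spec...
wire = result.get_network_struct()  # ...then it is network-ready
```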

> Overall, I think these changes seem good, but I don't really see how this gets
> us closer to computing a search without having one end doing a step-by-step
> search. (Certainly you're still using the same 'search_missing_revision_ids',
> and you don't seem to be serializing anything over the wire...)

Right, this change isn't meant to solve that problem. The concrete performance
issue these objects help with is in the next patch, fetch-all-tags-309682:
they make it easy for the sprout code to request a fetch of all tags at the
same time as the tip.
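Concretely, that use case leans on the required_ids/if_present_ids split in NotInOtherForRevs. The stub below (hypothetical, modelling the source repo as a plain set of revision ids) illustrates the semantics: required heads must exist, while if_present heads ride along only when the source actually has them:

```python
class NotInOtherForRevs(object):
    """Stub mirroring the patch's semantics, not bzrlib's class."""
    def __init__(self, to_repo, from_repo, required_ids, if_present_ids=None):
        self.required_ids = set(required_ids)
        self.if_present_ids = set(if_present_ids or ())
        self.from_repo = from_repo  # stub: a set of revision ids

    def get_search_result(self):
        missing_required = self.required_ids - self.from_repo
        if missing_required:
            # bzrlib raises NoSuchRevision here.
            raise KeyError('NoSuchRevision: %r' % sorted(missing_required))
        # Present optional heads (e.g. tag targets) ride along with the tip.
        return self.required_ids | (self.if_present_ids & self.from_repo)

source_revs = {'tip-rev', 'tag-a-rev'}  # 'tag-b-rev' is absent in the source
spec = NotInOtherForRevs(None, source_revs,
                         required_ids=['tip-rev'],
                         if_present_ids=['tag-a-rev', 'tag-b-rev'])
print(sorted(spec.get_search_result()))  # -> ['tag-a-rev', 'tip-rev']
```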

Separately...


Preview Diff

=== modified file 'bzrlib/fetch.py'
--- bzrlib/fetch.py	2010-09-17 04:35:23 +0000
+++ bzrlib/fetch.py	2010-12-03 07:07:15 +0000
@@ -28,6 +28,7 @@
 from bzrlib.lazy_import import lazy_import
 lazy_import(globals(), """
 from bzrlib import (
+    graph,
     tsort,
     versionedfile,
     )
@@ -93,7 +94,8 @@
         try:
             pb.update("Finding revisions", 0, 2)
             search = self._revids_to_fetch()
-            if search is None:
+            mutter('fetching: %s', search)
+            if search.is_empty():
                 return
             pb.update("Fetching revisions", 1, 2)
             self._fetch_everything_for_search(search)
@@ -126,8 +128,15 @@
             resume_tokens, missing_keys = self.sink.insert_stream(
                 stream, from_format, [])
             if self.to_repository._fallback_repositories:
-                missing_keys.update(
-                    self._parent_inventories(search.get_keys()))
+                if not isinstance(search, graph.EverythingResult):
+                    # If search is EverythingResult this is be unnecessary,
+                    # so we can skip this step. The source will send us
+                    # every revision it has, and their parent inventories.
+                    # (Unless the source is damaged! but not really worth
+                    # optimising for that case. The pack code will reject bad
+                    # streams anyway.)
+                    missing_keys.update(
+                        self._parent_inventories(search.get_keys()))
             if missing_keys:
                 pb.update("Missing keys")
                 stream = source.get_stream_for_missing_keys(missing_keys)
@@ -151,17 +160,33 @@
         """Determines the exact revisions needed from self.from_repository to
         install self._last_revision in self.to_repository.
 
-        If no revisions need to be fetched, then this just returns None.
+        :returns: A SearchResult of some sort. (Possibly a
+            PendingAncestryResult, EmptySearchResult, etc.)
         """
-        if self._fetch_spec is not None:
+        mutter("self._fetch_spec, self._last_revision: %r, %r",
+               self._fetch_spec, self._last_revision)
+        get_search_result = getattr(self._fetch_spec, 'get_search_result', None)
+        if get_search_result is not None:
+            mutter(
+                'resolving fetch_spec into search result: %s', self._fetch_spec)
+            # This is EverythingNotInOther or a similar kind of fetch_spec.
+            # Turn it into a search result.
+            return get_search_result()
+        elif self._fetch_spec is not None:
+            # The fetch spec is already a concrete search result.
             return self._fetch_spec
-        mutter('fetch up to rev {%s}', self._last_revision)
-        if self._last_revision is NULL_REVISION:
+        elif self._last_revision == NULL_REVISION:
+            # fetch_spec is None + last_revision is null => empty fetch.
             # explicit limit of no revisions needed
-            return None
-        return self.to_repository.search_missing_revision_ids(
-            self.from_repository, self._last_revision,
-            find_ghosts=self.find_ghosts)
+            return graph.EmptySearchResult()
+        elif self._last_revision is not None:
+            return graph.NotInOtherForRevs(self.to_repository,
+                self.from_repository, [self._last_revision],
+                find_ghosts=self.find_ghosts).get_search_result()
+        else: # self._last_revision is None:
+            return graph.EverythingNotInOther(self.to_repository,
+                self.from_repository,
+                find_ghosts=self.find_ghosts).get_search_result()
 
     def _parent_inventories(self, revision_ids):
         # Find all the parent revisions referenced by the stream, but
=== modified file 'bzrlib/graph.py'
--- bzrlib/graph.py	2010-08-15 15:20:14 +0000
+++ bzrlib/graph.py	2010-12-03 07:07:15 +0000
@@ -1536,7 +1536,57 @@
         return revs, ghosts
 
 
-class SearchResult(object):
+class AbstractSearchResult(object):
+
+    def get_recipe(self):
+        """Return a recipe that can be used to replay this search.
+
+        The recipe allows reconstruction of the same results at a later date.
+
+        :return: A tuple of (search_kind_str, *details). The details vary by
+            kind of search result.
+        """
+        raise NotImplementedError(self.get_recipe)
+
+    def get_network_struct(self):
+        """Return a tuple that can be transmitted via the HPSS protocol."""
+        raise NotImplementedError(self.get_network_struct)
+
+    def get_keys(self):
+        """Return the keys found in this search.
+
+        :return: A set of keys.
+        """
+        raise NotImplementedError(self.get_keys)
+
+    def is_empty(self):
+        """Return false if the search lists 1 or more revisions."""
+        raise NotImplementedError(self.is_empty)
+
+    def refine(self, seen, referenced):
+        """Create a new search by refining this search.
+
+        :param seen: Revisions that have been satisfied.
+        :param referenced: Revision references observed while satisfying some
+            of this search.
+        :return: A search result.
+        """
+        raise NotImplementedError(self.refine)
+
+
+class AbstractSearch(object):
+
+    def get_search_result(self):
+        """Construct a network-ready search result from this search description.
+
+        This may take some time to search repositories, etc.
+
+        :return: A search result.
+        """
+        raise NotImplementedError(self.get_search_result)
+
+
+class SearchResult(AbstractSearchResult):
     """The result of a breadth first search.
 
     A SearchResult provides the ability to reconstruct the search or access a
@@ -1557,6 +1607,19 @@
         self._recipe = ('search', start_keys, exclude_keys, key_count)
         self._keys = frozenset(keys)
 
+    def __repr__(self):
+        kind, start_keys, exclude_keys, key_count = self._recipe
+        if len(start_keys) > 5:
+            start_keys_repr = repr(list(start_keys)[:5])[:-1] + ', ...]'
+        else:
+            start_keys_repr = repr(start_keys)
+        if len(exclude_keys) > 5:
+            exclude_keys_repr = repr(list(exclude_keys)[:5])[:-1] + ', ...]'
+        else:
+            exclude_keys_repr = repr(exclude_keys)
+        return '<%s %s:(%s, %s, %d)>' % (self.__class__.__name__,
+            kind, start_keys_repr, exclude_keys_repr, key_count)
+
     def get_recipe(self):
         """Return a recipe that can be used to replay this search.
 
@@ -1580,6 +1643,12 @@
         """
         return self._recipe
 
+    def get_network_struct(self):
+        start_keys = ' '.join(self._recipe[1])
+        stop_keys = ' '.join(self._recipe[2])
+        count = str(self._recipe[3])
+        return (self._recipe[0], '\n'.join((start_keys, stop_keys, count)))
+
     def get_keys(self):
         """Return the keys found in this search.
 
@@ -1617,7 +1686,7 @@
         return SearchResult(pending_refs, exclude, count, keys)
 
 
-class PendingAncestryResult(object):
+class PendingAncestryResult(AbstractSearchResult):
     """A search result that will reconstruct the ancestry for some graph heads.
 
     Unlike SearchResult, this doesn't hold the complete search result in
@@ -1647,6 +1716,11 @@
         """
         return ('proxy-search', self.heads, set(), -1)
 
+    def get_network_struct(self):
+        parts = ['ancestry-of']
+        parts.extend(self.heads)
+        return parts
+
     def get_keys(self):
         """See SearchResult.get_keys.
 
@@ -1679,6 +1753,104 @@
         return PendingAncestryResult(referenced - seen, self.repo)
 
 
+class EmptySearchResult(AbstractSearchResult):
+    """An empty search result."""
+
+    def is_empty(self):
+        return True
+
+
+class EverythingResult(AbstractSearchResult):
+    """A search result that simply requests everything in the repository."""
+
+    def __init__(self, repo):
+        self._repo = repo
+
+    def __repr__(self):
+        return '%s(%r)' % (self.__class__.__name__, self._repo)
+
+    def get_recipe(self):
+        raise NotImplementedError(self.get_recipe)
+
+    def get_network_struct(self):
+        return ('everything',)
+
+    def get_keys(self):
+        if 'evil' in debug.debug_flags:
+            from bzrlib import remote
+            if isinstance(self._repo, remote.RemoteRepository):
+                # warn developers (not users) not to do this
+                trace.mutter_callsite(
+                    2, "EverythingResult(RemoteRepository).get_keys() is slow.")
+        return self._repo.all_revision_ids()
+
+    def is_empty(self):
+        # It's ok for this to wrongly return False: the worst that can happen
+        # is that RemoteStreamSource will initiate a get_stream on an empty
+        # repository. And almost all repositories are non-empty.
+        return False
+
+    def refine(self, seen, referenced):
+        heads = set(self._repo.all_revision_ids())
+        heads.difference_update(seen)
+        heads.update(referenced)
+        return PendingAncestryResult(heads, self._repo)
+
+
+class EverythingNotInOther(AbstractSearch):
+    """Find all revisions in that are in one repo but not the other."""
+
+    def __init__(self, to_repo, from_repo, find_ghosts=False):
+        self.to_repo = to_repo
+        self.from_repo = from_repo
+        self.find_ghosts = find_ghosts
+
+    def get_search_result(self):
+        return self.to_repo.search_missing_revision_ids(
+            self.from_repo, find_ghosts=self.find_ghosts)
+
+
+class NotInOtherForRevs(AbstractSearch):
+    """Find all revisions missing in one repo for a some specific heads."""
+
+    def __init__(self, to_repo, from_repo, required_ids, if_present_ids=None,
+            find_ghosts=False):
+        """Constructor.
+
+        :param required_ids: revision IDs of heads that must be found, or else
+            the search will fail with NoSuchRevision. All revisions in their
+            ancestry not already in the other repository will be included in
+            the search result.
+        :param if_present_ids: revision IDs of heads that may be absent in the
+            source repository. If present, then their ancestry not already
+            found in other will be included in the search result.
+        """
+        self.to_repo = to_repo
+        self.from_repo = from_repo
+        self.find_ghosts = find_ghosts
+        self.required_ids = required_ids
+        self.if_present_ids = if_present_ids
+
+    def __repr__(self):
+        if len(self.required_ids) > 5:
+            reqd_revs_repr = repr(list(self.required_ids)[:5])[:-1] + ', ...]'
+        else:
+            reqd_revs_repr = repr(self.required_ids)
+        if self.if_present_ids and len(self.if_present_ids) > 5:
+            ifp_revs_repr = repr(list(self.if_present_ids)[:5])[:-1] + ', ...]'
+        else:
+            ifp_revs_repr = repr(self.if_present_ids)
+
+        return "<%s from:%r to:%r find_ghosts:%r req'd:%r if-present:%r>" % (
+            self.__class__.__name__, self.from_repo, self.to_repo,
+            self.find_ghosts, reqd_revs_repr, ifp_revs_repr)
+
+    def get_search_result(self):
+        return self.to_repo.search_missing_revision_ids(
+            self.from_repo, revision_ids=self.required_ids,
+            if_present_ids=self.if_present_ids, find_ghosts=self.find_ghosts)
+
+
 def collapse_linear_regions(parent_map):
     """Collapse regions of the graph that are 'linear'.
 
=== modified file 'bzrlib/remote.py'
--- bzrlib/remote.py	2010-12-02 10:41:05 +0000
+++ bzrlib/remote.py	2010-12-03 07:07:15 +0000
@@ -1344,15 +1344,29 @@
         return result
 
     @needs_read_lock
-    def search_missing_revision_ids(self, other, revision_id=None, find_ghosts=True):
+    def search_missing_revision_ids(self, other,
+            revision_id=symbol_versioning.DEPRECATED_PARAMETER,
+            find_ghosts=True, revision_ids=None, if_present_ids=None):
         """Return the revision ids that other has that this does not.
 
         These are returned in topological order.
 
        revision_id: only return revision ids included by revision_id.
        """
-        return repository.InterRepository.get(
-            other, self).search_missing_revision_ids(revision_id, find_ghosts)
+        if symbol_versioning.deprecated_passed(revision_id):
+            symbol_versioning.warn(
+                'search_missing_revision_ids(revision_id=...) was '
+                'deprecated in 2.3. Use revision_ids=[...] instead.',
+                DeprecationWarning, stacklevel=2)
+            if revision_ids is not None:
+                raise AssertionError(
+                    'revision_ids is mutually exclusive with revision_id')
+            if revision_id is not None:
+                revision_ids = [revision_id]
+        inter_repo = repository.InterRepository.get(other, self)
+        return inter_repo.search_missing_revision_ids(
+            find_ghosts=find_ghosts, revision_ids=revision_ids,
+            if_present_ids=if_present_ids)
 
     def fetch(self, source, revision_id=None, pb=None, find_ghosts=False,
               fetch_spec=None):
@@ -1759,12 +1773,7 @@
         return '\n'.join((start_keys, stop_keys, count))
 
     def _serialise_search_result(self, search_result):
-        if isinstance(search_result, graph.PendingAncestryResult):
-            parts = ['ancestry-of']
-            parts.extend(search_result.heads)
-        else:
-            recipe = search_result.get_recipe()
-            parts = [recipe[0], self._serialise_search_recipe(recipe)]
+        parts = search_result.get_network_struct()
         return '\n'.join(parts)
 
     def autopack(self):
@@ -1964,6 +1973,7 @@
         candidate_verbs = [
             ('Repository.get_stream_1.19', (1, 19)),
             ('Repository.get_stream', (1, 13))]
+
         found_verb = False
         for verb, version in candidate_verbs:
             if medium._is_remote_before(version):
@@ -1973,6 +1983,17 @@
                     verb, args, search_bytes)
             except errors.UnknownSmartMethod:
                 medium._remember_remote_is_before(version)
+            except errors.UnknownErrorFromSmartServer, e:
+                if isinstance(search, graph.EverythingResult):
+                    error_verb = e.error_from_smart_server.error_verb
+                    if error_verb == 'BadSearch':
+                        # Pre-2.3 servers don't support this sort of search.
+                        # XXX: perhaps falling back to VFS on BadSearch is a
+                        # good idea in general? It might provide a little bit
+                        # of protection against client-side bugs.
+                        medium._remember_remote_is_before((2, 3))
+                        break
+                raise
             else:
                 response_tuple, response_handler = response
                 found_verb = True
=== modified file 'bzrlib/repofmt/knitrepo.py'
--- bzrlib/repofmt/knitrepo.py	2010-11-20 21:41:05 +0000
+++ bzrlib/repofmt/knitrepo.py	2010-12-03 07:07:15 +0000
@@ -43,6 +43,7 @@
     RepositoryFormat,
     RootCommitBuilder,
     )
+from bzrlib import symbol_versioning
 
 
 class _KnitParentsProvider(object):
@@ -534,16 +535,23 @@
         return are_knits and InterRepository._same_model(source, target)
 
     @needs_read_lock
-    def search_missing_revision_ids(self, revision_id=None, find_ghosts=True):
-        """See InterRepository.missing_revision_ids()."""
-        if revision_id is not None:
-            source_ids = self.source.get_ancestry(revision_id)
-            if source_ids[0] is not None:
-                raise AssertionError()
-            source_ids.pop(0)
-        else:
-            source_ids = self.source.all_revision_ids()
-        source_ids_set = set(source_ids)
+    def search_missing_revision_ids(self,
+            revision_id=symbol_versioning.DEPRECATED_PARAMETER,
+            find_ghosts=True, revision_ids=None, if_present_ids=None):
+        """See InterRepository.searcH_missing_revision_ids()."""
+        if symbol_versioning.deprecated_passed(revision_id):
+            symbol_versioning.warn(
+                'search_missing_revision_ids(revision_id=...) was '
+                'deprecated in 2.3. Use revision_ids=[...] instead.',
+                DeprecationWarning, stacklevel=2)
+            if revision_ids is not None:
+                raise AssertionError(
+                    'revision_ids is mutually exclusive with revision_id')
+            if revision_id is not None:
+                revision_ids = [revision_id]
+            del revision_id
+        source_ids_set = self._present_source_revisions_for(
+            revision_ids, if_present_ids)
         # source_ids is the worst possible case we may need to pull.
         # now we want to filter source_ids against what we actually
         # have in target, but don't try to check for existence where we know
@@ -553,7 +561,7 @@
         actually_present_revisions = set(
             self.target._eliminate_revisions_not_present(possibly_present_revisions))
         required_revisions = source_ids_set.difference(actually_present_revisions)
-        if revision_id is not None:
+        if revision_ids is not None:
             # we used get_ancestry to determine source_ids then we are assured all
             # revisions referenced are present as they are installed in topological order.
             # and the tip revision was validated by get_ancestry.
=== modified file 'bzrlib/repofmt/weaverepo.py'
--- bzrlib/repofmt/weaverepo.py	2010-11-20 21:41:05 +0000
+++ bzrlib/repofmt/weaverepo.py	2010-12-03 07:07:15 +0000
@@ -39,6 +39,7 @@
     lockable_files,
     lockdir,
     osutils,
+    symbol_versioning,
     trace,
     urlutils,
     versionedfile,
@@ -803,8 +804,10 @@
         self.target.fetch(self.source, revision_id=revision_id)
 
     @needs_read_lock
-    def search_missing_revision_ids(self, revision_id=None, find_ghosts=True):
-        """See InterRepository.missing_revision_ids()."""
+    def search_missing_revision_ids(self,
+            revision_id=symbol_versioning.DEPRECATED_PARAMETER,
+            find_ghosts=True, revision_ids=None, if_present_ids=None):
+        """See InterRepository.search_missing_revision_ids()."""
         # we want all revisions to satisfy revision_id in source.
         # but we don't want to stat every file here and there.
         # we want then, all revisions other needs to satisfy revision_id
@@ -816,14 +819,19 @@
         # disk format scales terribly for push anyway due to rewriting
         # inventory.weave, this is considered acceptable.
         # - RBC 20060209
-        if revision_id is not None:
-            source_ids = self.source.get_ancestry(revision_id)
-            if source_ids[0] is not None:
-                raise AssertionError()
-            source_ids.pop(0)
-        else:
-            source_ids = self.source._all_possible_ids()
-        source_ids_set = set(source_ids)
+        if symbol_versioning.deprecated_passed(revision_id):
+            symbol_versioning.warn(
+                'search_missing_revision_ids(revision_id=...) was '
+                'deprecated in 2.3. Use revision_ids=[...] instead.',
+                DeprecationWarning, stacklevel=2)
+            if revision_ids is not None:
+                raise AssertionError(
+                    'revision_ids is mutually exclusive with revision_id')
+            if revision_id is not None:
+                revision_ids = [revision_id]
+            del revision_id
+        source_ids_set = self._present_source_revisions_for(
+            revision_ids, if_present_ids)
         # source_ids is the worst possible case we may need to pull.
         # now we want to filter source_ids against what we actually
         # have in target, but don't try to check for existence where we know
@@ -833,7 +841,7 @@
         actually_present_revisions = set(
             self.target._eliminate_revisions_not_present(possibly_present_revisions))
         required_revisions = source_ids_set.difference(actually_present_revisions)
-        if revision_id is not None:
+        if revision_ids is not None:
             # we used get_ancestry to determine source_ids then we are assured all
             # revisions referenced are present as they are installed in topological order.
             # and the tip revision was validated by get_ancestry.
=== modified file 'bzrlib/repository.py'
--- bzrlib/repository.py 2010-12-02 10:41:05 +0000
+++ bzrlib/repository.py 2010-12-03 07:07:15 +0000
@@ -42,7 +42,6 @@
42 pyutils,42 pyutils,
43 revision as _mod_revision,43 revision as _mod_revision,
44 static_tuple,44 static_tuple,
45 symbol_versioning,
46 trace,45 trace,
47 tsort,46 tsort,
48 versionedfile,47 versionedfile,
@@ -57,6 +56,7 @@
57from bzrlib import (56from bzrlib import (
58 errors,57 errors,
59 registry,58 registry,
59 symbol_versioning,
60 ui,60 ui,
61 )61 )
62from bzrlib.decorators import needs_read_lock, needs_write_lock, only_raises62from bzrlib.decorators import needs_read_lock, needs_write_lock, only_raises
@@ -1561,15 +1561,28 @@
         return ret
 
     @needs_read_lock
-    def search_missing_revision_ids(self, other, revision_id=None, find_ghosts=True):
+    def search_missing_revision_ids(self, other,
+            revision_id=symbol_versioning.DEPRECATED_PARAMETER,
+            find_ghosts=True, revision_ids=None, if_present_ids=None):
         """Return the revision ids that other has that this does not.
 
         These are returned in topological order.
 
         revision_id: only return revision ids included by revision_id.
         """
+        if symbol_versioning.deprecated_passed(revision_id):
+            symbol_versioning.warn(
+                'search_missing_revision_ids(revision_id=...) was '
+                'deprecated in 2.3. Use revision_ids=[...] instead.',
+                DeprecationWarning, stacklevel=3)
+            if revision_ids is not None:
+                raise AssertionError(
+                    'revision_ids is mutually exclusive with revision_id')
+            if revision_id is not None:
+                revision_ids = [revision_id]
         return InterRepository.get(other, self).search_missing_revision_ids(
-            revision_id, find_ghosts)
+            find_ghosts=find_ghosts, revision_ids=revision_ids,
+            if_present_ids=if_present_ids)
 
     @staticmethod
     def open(base):
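The hunk above is the standard bzrlib deprecation dance: a unique sentinel default lets the method distinguish "caller did not pass revision_id" from "caller passed revision_id=None". A standalone sketch, using the stdlib warnings module as a stand-in for bzrlib's symbol_versioning helpers (all names here are illustrative, not the bzrlib API):

```python
import warnings

# Unique sentinel: "the caller did not supply this argument at all".
DEPRECATED_PARAMETER = object()

def deprecated_passed(value):
    # True only when the caller explicitly supplied the old argument.
    return value is not DEPRECATED_PARAMETER

def search_missing_revision_ids(revision_id=DEPRECATED_PARAMETER,
                                revision_ids=None):
    if deprecated_passed(revision_id):
        warnings.warn(
            'search_missing_revision_ids(revision_id=...) was deprecated '
            'in 2.3. Use revision_ids=[...] instead.',
            DeprecationWarning, stacklevel=2)
        if revision_ids is not None:
            raise AssertionError(
                'revision_ids is mutually exclusive with revision_id')
        if revision_id is not None:
            # Fold the legacy single id into the new list form.
            revision_ids = [revision_id]
    return revision_ids
```

The sentinel is what makes the warning fire only for callers still using the old keyword, while new-style and no-argument callers pass through silently.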
@@ -3430,7 +3443,7 @@
             fetch_spec=fetch_spec,
             find_ghosts=find_ghosts)
 
-    def _walk_to_common_revisions(self, revision_ids):
+    def _walk_to_common_revisions(self, revision_ids, if_present_ids=None):
         """Walk out from revision_ids in source to revisions target has.
 
         :param revision_ids: The start point for the search.
@@ -3438,10 +3451,14 @@
         """
         target_graph = self.target.get_graph()
         revision_ids = frozenset(revision_ids)
+        if if_present_ids:
+            all_wanted_revs = revision_ids.union(if_present_ids)
+        else:
+            all_wanted_revs = revision_ids
         missing_revs = set()
         source_graph = self.source.get_graph()
         # ensure we don't pay silly lookup costs.
-        searcher = source_graph._make_breadth_first_searcher(revision_ids)
+        searcher = source_graph._make_breadth_first_searcher(all_wanted_revs)
         null_set = frozenset([_mod_revision.NULL_REVISION])
         searcher_exhausted = False
         while True:
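The effect of the new if_present_ids argument on the walk can be sketched with a plain-dict parent map (a toy model, not bzrlib's graph or breadth-first searcher): the required ids and the optional ids seed the search together, absent optional ids simply contribute nothing, and the walk stops at revisions the target already has.

```python
def walk_to_common(parent_map, target_revs, revision_ids, if_present_ids=None):
    """Collect revisions reachable from the start set that target lacks.

    parent_map: dict mapping revision id -> tuple of parent ids (the source
    repository's graph, in miniature).
    """
    start = set(revision_ids)
    if if_present_ids:
        start.update(if_present_ids)
    missing, pending = set(), set(start)
    while pending:
        rev = pending.pop()
        if rev in target_revs or rev in missing or rev not in parent_map:
            # Already in the target, already visited, or absent in the
            # source (e.g. an if_present id that does not exist).
            continue
        missing.add(rev)
        pending.update(parent_map[rev])
    return missing
```

Seeding one search with both sets is what lets a single graph walk serve the "required revisions plus any tags that happen to exist" case.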
@@ -3483,30 +3500,79 @@
         return searcher.get_result()
 
     @needs_read_lock
-    def search_missing_revision_ids(self, revision_id=None, find_ghosts=True):
+    def search_missing_revision_ids(self,
+            revision_id=symbol_versioning.DEPRECATED_PARAMETER,
+            find_ghosts=True, revision_ids=None, if_present_ids=None):
         """Return the revision ids that source has that target does not.
 
         :param revision_id: only return revision ids included by this
             revision_id.
+        :param revision_ids: return revision ids included by these
+            revision_ids. NoSuchRevision will be raised if any of these
+            revisions are not present.
+        :param if_present_ids: like revision_ids, but will not cause
+            NoSuchRevision if any of these are absent; instead they will
+            simply not be in the result. This is useful for e.g. finding
+            revisions to fetch for tags, which may reference absent revisions.
         :param find_ghosts: If True find missing revisions in deep history
             rather than just finding the surface difference.
         :return: A bzrlib.graph.SearchResult.
         """
+        if symbol_versioning.deprecated_passed(revision_id):
+            symbol_versioning.warn(
+                'search_missing_revision_ids(revision_id=...) was '
+                'deprecated in 2.3. Use revision_ids=[...] instead.',
+                DeprecationWarning, stacklevel=2)
+            if revision_ids is not None:
+                raise AssertionError(
+                    'revision_ids is mutually exclusive with revision_id')
+            if revision_id is not None:
+                revision_ids = [revision_id]
+        del revision_id
         # stop searching at found target revisions.
-        if not find_ghosts and revision_id is not None:
-            return self._walk_to_common_revisions([revision_id])
+        if not find_ghosts and (revision_ids is not None or if_present_ids is
+                not None):
+            return self._walk_to_common_revisions(revision_ids,
+                if_present_ids=if_present_ids)
         # generic, possibly worst case, slow code path.
         target_ids = set(self.target.all_revision_ids())
-        if revision_id is not None:
-            source_ids = self.source.get_ancestry(revision_id)
-            if source_ids[0] is not None:
-                raise AssertionError()
-            source_ids.pop(0)
-        else:
-            source_ids = self.source.all_revision_ids()
+        source_ids = self._present_source_revisions_for(
+            revision_ids, if_present_ids)
         result_set = set(source_ids).difference(target_ids)
         return self.source.revision_ids_to_search_result(result_set)
 
+    def _present_source_revisions_for(self, revision_ids, if_present_ids=None):
+        """Returns set of all revisions in ancestry of revision_ids present in
+        the source repo.
+
+        :param revision_ids: if None, all revisions in source are returned.
+        :param if_present_ids: like revision_ids, but if any/all of these are
+            absent no error is raised.
+        """
+        if revision_ids is not None or if_present_ids is not None:
+            # First, ensure all specified revisions exist. Callers expect
+            # NoSuchRevision when they pass absent revision_ids here.
+            if revision_ids is None:
+                revision_ids = set()
+            if if_present_ids is None:
+                if_present_ids = set()
+            revision_ids = set(revision_ids)
+            if_present_ids = set(if_present_ids)
+            all_wanted_ids = revision_ids.union(if_present_ids)
+            graph = self.source.get_graph()
+            present_revs = set(graph.get_parent_map(all_wanted_ids))
+            missing = revision_ids.difference(present_revs)
+            if missing:
+                raise errors.NoSuchRevision(self.source, missing.pop())
+            found_ids = all_wanted_ids.intersection(present_revs)
+            source_ids = [rev_id for (rev_id, parents) in
+                graph.iter_ancestry(found_ids)
+                if rev_id != _mod_revision.NULL_REVISION
+                and parents is not None]
+        else:
+            source_ids = self.source.all_revision_ids()
+        return set(source_ids)
+
     @staticmethod
     def _same_model(source, target):
         """True if source and target have the same data representation.
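The contract of `_present_source_revisions_for` — required ids must exist (else an error), if_present ids are dropped silently when absent, and the result is the full present ancestry of whatever was found — can be modelled on a plain dict. This is toy code: the dict stands in for the source graph, and KeyError stands in for NoSuchRevision.

```python
def present_source_revisions_for(parent_map, revision_ids, if_present_ids=None):
    """Return the present ancestry of the requested revisions.

    revision_ids: required; raise if any are absent. None means "everything".
    if_present_ids: optional; silently skipped when absent.
    """
    if revision_ids is None and if_present_ids is None:
        return set(parent_map)  # everything in the source
    required = set(revision_ids or ())
    optional = set(if_present_ids or ())
    wanted = required | optional
    present = set(rev for rev in wanted if rev in parent_map)
    missing = required - present
    if missing:
        raise KeyError('no such revision: %s' % missing.pop())
    # Walk the ancestry of every revision that was actually found.
    result, pending = set(), set(present)
    while pending:
        rev = pending.pop()
        if rev in result or rev not in parent_map:
            continue  # already visited, or a ghost: not returned
        result.add(rev)
        pending.update(parent_map[rev])
    return result
```

The asymmetry between the two parameters is the whole point: tag lookups can use the forgiving path while explicit user requests keep their strict error behaviour.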
@@ -3830,7 +3896,13 @@
             fetch_spec=None):
         """See InterRepository.fetch()."""
         if fetch_spec is not None:
-            raise AssertionError("Not implemented yet...")
+            if (isinstance(fetch_spec, graph.NotInOtherForRevs) and
+                    len(fetch_spec.required_ids) == 1 and not
+                    fetch_spec.if_present_ids):
+                revision_id = list(fetch_spec.required_ids)[0]
+                del fetch_spec
+            else:
+                raise AssertionError("Not implemented yet...")
         ui.ui_factory.warn_experimental_format_fetch(self)
         if (not self.source.supports_rich_root()
             and self.target.supports_rich_root()):
@@ -3843,8 +3915,12 @@
             ui.ui_factory.show_user_warning('cross_format_fetch',
                 from_format=self.source._format,
                 to_format=self.target._format)
+        if revision_id:
+            search_revision_ids = [revision_id]
+        else:
+            search_revision_ids = None
         revision_ids = self.target.search_missing_revision_ids(self.source,
-            revision_id, find_ghosts=find_ghosts).get_keys()
+            revision_ids=search_revision_ids, find_ghosts=find_ghosts).get_keys()
         if not revision_ids:
             return 0, 0
         revision_ids = tsort.topo_sort(
 
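The fetch_spec unwrapping added to InterDifferingSerializer.fetch reduces a one-revision NotInOtherForRevs to the legacy single-revision_id code path, and still refuses anything more complex. A minimal sketch with a stand-in class exposing the same two attributes (not the bzrlib graph.NotInOtherForRevs itself):

```python
class NotInOtherForRevs(object):
    """Stand-in for the fetch-spec factory: ids to fetch that the other
    repository lacks, split into required and merely-if-present ids."""

    def __init__(self, required_ids, if_present_ids=()):
        self.required_ids = set(required_ids)
        self.if_present_ids = set(if_present_ids)

def unwrap_single_revision(fetch_spec):
    """Degrade a trivial fetch spec to a bare revision id, else refuse."""
    if (isinstance(fetch_spec, NotInOtherForRevs) and
            len(fetch_spec.required_ids) == 1 and
            not fetch_spec.if_present_ids):
        return list(fetch_spec.required_ids)[0]
    raise AssertionError("Not implemented yet...")
```

This keeps the cross-format fetcher working for the common case while leaving a clear failure for specs it cannot yet honour.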
=== modified file 'bzrlib/smart/repository.py'
--- bzrlib/smart/repository.py 2010-11-16 06:06:11 +0000
+++ bzrlib/smart/repository.py 2010-12-03 07:07:15 +0000
@@ -81,6 +81,8 @@
         recreate_search trusts that clients will look for missing things
         they expected and get it from elsewhere.
         """
+        if search_bytes == 'everything':
+            return graph.EverythingResult(repository), None
         lines = search_bytes.split('\n')
         if lines[0] == 'ancestry-of':
             heads = lines[1:]
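The server-side special case above short-circuits before any line-based parsing of the search body. The dispatch order can be sketched with illustrative names (parse_search is not the bzrlib API; it only mirrors the shape of recreate_search):

```python
def parse_search(search_bytes):
    """Classify a serialised fetch spec as the server would.

    Returns a (kind, payload) pair; payload is None for 'everything'.
    """
    # New in 2.3: the whole body is the literal token 'everything'.
    if search_bytes == 'everything':
        return ('everything', None)
    # Otherwise fall through to the older line-oriented formats.
    lines = search_bytes.split('\n')
    if lines[0] == 'ancestry-of':
        return ('ancestry-of', lines[1:])
    return ('search', lines[1:])
```

Checking the literal token first is what lets old-format bodies keep parsing exactly as before.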
@@ -412,6 +414,13 @@
     def do_repository_request(self, repository, to_network_name):
         """Get a stream for inserting into a to_format repository.
 
+        The request body is 'search_bytes', a description of the revisions
+        being requested.
+
+        In 2.3 this verb added support for search_bytes == 'everything'. Older
+        implementations will respond with a BadSearch error, and clients
+        should catch this and fall back appropriately.
+
         :param repository: The repository to stream from.
         :param to_network_name: The network name of the format of the target
             repository.
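The fallback behaviour described in that docstring looks roughly like the following on the client side. BadSearch and both callables are stand-ins: the real client retries the fetch through an older code path rather than a generic callback.

```python
class BadSearch(Exception):
    """Stand-in for the server's rejection of an unrecognised search body."""

def fetch_everything(send_request, fallback):
    """Try the 2.3-style 'everything' search; degrade gracefully if the
    server predates it."""
    try:
        return send_request('everything')
    except BadSearch:
        # Pre-2.3 server: it does not understand 'everything', so take
        # the slower route that older servers do support.
        return fallback()
```

Reusing BadSearch this way is why no new verb was needed: the error itself signals "too old, fall back".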
@@ -489,6 +498,13 @@
 
 
 class SmartServerRepositoryGetStream_1_19(SmartServerRepositoryGetStream):
+    """The same as Repository.get_stream, but will return stream CHK formats
+    to clients.
+
+    See SmartServerRepositoryGetStream._should_fake_unknown.
+
+    New in 1.19.
+    """
 
     def _should_fake_unknown(self):
         """Returns False; we don't need to workaround bugs in 1.19+ clients."""
 
=== modified file 'bzrlib/tests/per_interrepository/test_interrepository.py'
--- bzrlib/tests/per_interrepository/test_interrepository.py 2009-07-10 06:46:10 +0000
+++ bzrlib/tests/per_interrepository/test_interrepository.py 2010-12-03 07:07:15 +0000
@@ -139,9 +139,15 @@
         self.assertFalse(repo_b.has_revision('pizza'))
         # Asking specifically for an absent revision errors.
         self.assertRaises(errors.NoSuchRevision,
-            repo_b.search_missing_revision_ids, repo_a, revision_id='pizza',
+            repo_b.search_missing_revision_ids, repo_a, revision_ids=['pizza'],
             find_ghosts=True)
         self.assertRaises(errors.NoSuchRevision,
+            repo_b.search_missing_revision_ids, repo_a, revision_ids=['pizza'],
+            find_ghosts=False)
+        self.callDeprecated(
+            ['search_missing_revision_ids(revision_id=...) was deprecated in '
+             '2.3. Use revision_ids=[...] instead.'],
+            self.assertRaises, errors.NoSuchRevision,
             repo_b.search_missing_revision_ids, repo_a, revision_id='pizza',
             find_ghosts=False)
 
@@ -151,7 +157,8 @@
         # make a repository to compare against that is empty
         repo_b = self.make_to_repository('empty')
         repo_a = self.bzrdir.open_repository()
-        result = repo_b.search_missing_revision_ids(repo_a, revision_id='rev1')
+        result = repo_b.search_missing_revision_ids(
+            repo_a, revision_ids=['rev1'])
         self.assertEqual(set(['rev1']), result.get_keys())
         self.assertEqual(('search', set(['rev1']), set([NULL_REVISION]), 1),
             result.get_recipe())
 
=== modified file 'bzrlib/tests/per_repository_reference/test_fetch.py'
--- bzrlib/tests/per_repository_reference/test_fetch.py 2009-06-01 18:13:46 +0000
+++ bzrlib/tests/per_repository_reference/test_fetch.py 2010-12-03 07:07:15 +0000
@@ -18,6 +18,7 @@
 from bzrlib import (
     branch,
     errors,
+    graph,
     )
 from bzrlib.smart import (
     server,
@@ -25,7 +26,7 @@
 from bzrlib.tests.per_repository import TestCaseWithRepository
 
 
-class TestFetch(TestCaseWithRepository):
+class TestFetchBase(TestCaseWithRepository):
 
     def make_source_branch(self):
         # It would be nice if there was a way to force this to be memory-only
@@ -51,6 +52,9 @@
         self.addCleanup(source_b.unlock)
         return content, source_b
 
+
+class TestFetch(TestFetchBase):
+
     def test_sprout_from_stacked_with_short_history(self):
         content, source_b = self.make_source_branch()
         # Split the generated content into a base branch, and a stacked branch
@@ -149,3 +153,38 @@
         source_b.lock_read()
         self.addCleanup(source_b.unlock)
         stacked.pull(source_b, stop_revision='B-id')
+
+
+class TestFetchFromRepoWithUnconfiguredFallbacks(TestFetchBase):
+
+    def make_stacked_source_repo(self):
+        _, source_b = self.make_source_branch()
+        # Use 'make_branch' which gives us a bzr:// branch when appropriate,
+        # rather than creating a branch-on-disk
+        stack_b = self.make_branch('stack-on')
+        stack_b.pull(source_b, stop_revision='B-id')
+        stacked_b = self.make_branch('stacked')
+        stacked_b.set_stacked_on_url('../stack-on')
+        stacked_b.pull(source_b, stop_revision='C-id')
+        return stacked_b.repository
+
+    def test_fetch_everything_includes_parent_invs(self):
+        stacked = self.make_stacked_source_repo()
+        repo_missing_fallbacks = stacked.bzrdir.open_repository()
+        self.addCleanup(repo_missing_fallbacks.lock_read().unlock)
+        target = self.make_repository('target')
+        self.addCleanup(target.lock_write().unlock)
+        target.fetch(
+            repo_missing_fallbacks,
+            fetch_spec=graph.EverythingResult(repo_missing_fallbacks))
+        self.assertEqual(repo_missing_fallbacks.revisions.keys(),
+            target.revisions.keys())
+        self.assertEqual(repo_missing_fallbacks.inventories.keys(),
+            target.inventories.keys())
+        self.assertEqual(['C-id'],
+            sorted(k[-1] for k in target.revisions.keys()))
+        self.assertEqual(['B-id', 'C-id'],
+            sorted(k[-1] for k in target.inventories.keys()))
+
 
=== modified file 'bzrlib/tests/test_remote.py'
--- bzrlib/tests/test_remote.py 2010-12-02 10:41:05 +0000
+++ bzrlib/tests/test_remote.py 2010-12-03 07:07:15 +0000
@@ -3182,11 +3182,36 @@
 
     def test_copy_content_into_avoids_revision_history(self):
         local = self.make_branch('local')
-        remote_backing_tree = self.make_branch_and_tree('remote')
-        remote_backing_tree.commit("Commit.")
+        builder = self.make_branch_builder('remote')
+        builder.build_commit(message="Commit.")
         remote_branch_url = self.smart_server.get_url() + 'remote'
         remote_branch = bzrdir.BzrDir.open(remote_branch_url).open_branch()
         local.repository.fetch(remote_branch.repository)
         self.hpss_calls = []
         remote_branch.copy_content_into(local)
         self.assertFalse('Branch.revision_history' in self.hpss_calls)
+
+    def test_fetch_everything_needs_just_one_call(self):
+        local = self.make_branch('local')
+        builder = self.make_branch_builder('remote')
+        builder.build_commit(message="Commit.")
+        remote_branch_url = self.smart_server.get_url() + 'remote'
+        remote_branch = bzrdir.BzrDir.open(remote_branch_url).open_branch()
+        self.hpss_calls = []
+        local.repository.fetch(remote_branch.repository,
+            fetch_spec=graph.EverythingResult(remote_branch.repository))
+        self.assertEqual(['Repository.get_stream_1.19'], self.hpss_calls)
+
+    def test_fetch_everything_backwards_compat(self):
+        """Can fetch with EverythingResult even when the server does not have
+        the Repository.get_stream_2.3 verb.
+        """
+        local = self.make_branch('local')
+        builder = self.make_branch_builder('remote')
+        builder.build_commit(message="Commit.")
+        remote_branch_url = self.smart_server.get_url() + 'remote'
+        remote_branch = bzrdir.BzrDir.open(remote_branch_url).open_branch()
+        self.hpss_calls = []
+        local.repository.fetch(remote_branch.repository,
+            fetch_spec=graph.EverythingResult(remote_branch.repository))
+
 
=== modified file 'bzrlib/tests/test_repository.py'
--- bzrlib/tests/test_repository.py 2010-11-22 22:27:58 +0000
+++ bzrlib/tests/test_repository.py 2010-12-03 07:07:15 +0000
@@ -1659,7 +1659,7 @@
         self.orig_pack = target.pack
         target.pack = self.log_pack
         search = target.search_missing_revision_ids(
-            source_tree.branch.repository, tip)
+            source_tree.branch.repository, revision_ids=[tip])
         stream = source.get_stream(search)
         from_format = source_tree.branch.repository._format
         sink = target._get_sink()
 
=== modified file 'bzrlib/tests/test_smart.py'
--- bzrlib/tests/test_smart.py 2010-05-13 16:17:54 +0000
+++ bzrlib/tests/test_smart.py 2010-12-03 07:07:15 +0000
@@ -25,15 +25,11 @@
 """
 
 import bz2
-from cStringIO import StringIO
-import tarfile
 
 from bzrlib import (
-    bencode,
     branch as _mod_branch,
     bzrdir,
     errors,
-    pack,
     tests,
     transport,
     urlutils,
@@ -45,7 +41,6 @@
     repository as smart_repo,
     packrepository as smart_packrepo,
     request as smart_req,
-    server,
     vfs,
     )
 from bzrlib.tests import test_server
@@ -1474,7 +1469,7 @@
             request.execute('stacked', 1, (3, r3)))
 
 
-class TestSmartServerRepositoryGetStream(tests.TestCaseWithMemoryTransport):
+class GetStreamTestBase(tests.TestCaseWithMemoryTransport):
 
     def make_two_commit_repo(self):
         tree = self.make_branch_and_memory_tree('.')
@@ -1486,6 +1481,9 @@
         repo = tree.branch.repository
         return repo, r1, r2
 
+
+class TestSmartServerRepositoryGetStream(GetStreamTestBase):
+
     def test_ancestry_of(self):
         """The search argument may be a 'ancestry-of' some heads'."""
         backing = self.get_transport()
@@ -1512,6 +1510,18 @@
         stream_bytes = ''.join(response.body_stream)
         self.assertStartsWith(stream_bytes, 'Bazaar pack format 1')
 
+    def test_search_everything(self):
+        """A search of 'everything' returns a stream."""
+        backing = self.get_transport()
+        request = smart_repo.SmartServerRepositoryGetStream_1_19(backing)
+        repo, r1, r2 = self.make_two_commit_repo()
+        serialised_fetch_spec = 'everything'
+        request.execute('', repo._format.network_name())
+        response = request.do_body(serialised_fetch_spec)
+        self.assertEqual(('ok',), response.args)
+        stream_bytes = ''.join(response.body_stream)
+        self.assertStartsWith(stream_bytes, 'Bazaar pack format 1')
+
 
 class TestSmartServerRequestHasRevision(tests.TestCaseWithMemoryTransport):
 
@@ -1911,6 +1921,8 @@
             smart_repo.SmartServerRepositoryGetRevisionGraph)
         self.assertHandlerEqual('Repository.get_stream',
             smart_repo.SmartServerRepositoryGetStream)
+        self.assertHandlerEqual('Repository.get_stream_1.19',
+            smart_repo.SmartServerRepositoryGetStream_1_19)
         self.assertHandlerEqual('Repository.has_revision',
             smart_repo.SmartServerRequestHasRevision)
         self.assertHandlerEqual('Repository.insert_stream',
 
=== modified file 'doc/en/release-notes/bzr-2.3.txt'
--- doc/en/release-notes/bzr-2.3.txt 2010-12-02 16:24:54 +0000
+++ doc/en/release-notes/bzr-2.3.txt 2010-12-03 07:07:15 +0000
@@ -128,6 +128,11 @@
   crashes when encountering private bugs (they are just displayed as such).
   (Vincent Ladeuil, #354985)
 
+* The ``revision_id`` parameter of
+  ``Repository.search_missing_revision_ids`` and
+  ``InterRepository.search_missing_revision_ids`` is deprecated. It is
+  replaced by the ``revision_ids`` parameter. (Andrew Bennetts)
+
 Internals
 *********
 