Merge lp:~blair/ubuntu/precise/rbtools/rbtools-fix-886436 into lp:ubuntu/precise/rbtools
Status: Work in progress
Proposed branch: lp:~blair/ubuntu/precise/rbtools/rbtools-fix-886436
Merge into: lp:ubuntu/precise/rbtools
Diff against target: 4678 lines (+2655/-828), 22 files modified
  .pc/.version (+0/-1), .pc/applied-patches (+0/-1), .pc/fix_tarball/setup.cfg (+0/-16), .reviewboardrc (+0/-1), AUTHORS (+16/-0), MANIFEST.in (+2/-0), PKG-INFO (+2/-2), RBTools.egg-info/PKG-INFO (+0/-18), RBTools.egg-info/SOURCES.txt (+0/-18), RBTools.egg-info/dependency_links.txt (+0/-1), RBTools.egg-info/entry_points.txt (+0/-3), RBTools.egg-info/top_level.txt (+0/-1), contrib/internal/release.py (+121/-6), debian/changelog (+9/-0), debian/patches/fix_tarball (+0/-73), debian/patches/series (+0/-1), debian/watch (+1/-2), ez_setup.py (+20/-8), rbtools/__init__.py (+1/-1), rbtools/postreview.py (+1755/-591), rbtools/tests.py (+724/-82), setup.cfg (+4/-2)
To merge this branch: bzr merge lp:~blair/ubuntu/precise/rbtools/rbtools-fix-886436
Related bugs:

Reviewer | Review Type | Date Requested | Status
---|---|---|---
Daniel Holbach | community | | Needs Fixing
Ubuntu branches | | | Pending

Review via email: mp+81388@code.launchpad.net
Commit message
Description of the change
Hello,
This is a commit updating rbtools to the latest upstream release.
This is my first time working with bzr and committing into Launchpad, so some questions and notes on the work:
1) The initial checkout of rbtools had a .pc directory with two files in it, applied-patches and fix_tarball. Since upstream's source tarball and Debian's source didn't have this, I deleted it as part of the commit.
2) I rsynced my copy of the build that successfully did a pbuilder run into the bzr checkout to make it identical.
3) I didn't bump the policy version to the latest, since I'm not familiar with the changes. I did do a diff between 0.9.2 and 0.9.3 in git://git.
4) Given that Debian is at 0.2, should this package have "ubuntu" in the version number?
Thanks,
Blair
Daniel Holbach (dholbach) wrote:
I get an error message trying to merge this. Did you use "bzr merge-upstream"? (http://
Blair Zajac (blair) wrote:
Hi Daniel,
What's the error message?
No, I didn't follow the instructions on that page; I didn't know it existed. I followed the instructions here:
https:/
https:/
which suggests to manually copy the upstream tarball into the bzr checkout.
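For reference, the `bzr merge-upstream` workflow Daniel mentions (provided by the bzr-builddeb plugin) records the new upstream tarball with pristine-tar data rather than copying files in by hand. A rough sketch, with the version number and tarball path taken from this proposal but the exact invocation left to the builddeb documentation:

```shell
# Illustrative sketch of the merge-upstream workflow; requires the
# bzr-builddeb plugin and access to the Launchpad packaging branch.

# Branch the current Ubuntu packaging branch.
bzr branch lp:ubuntu/precise/rbtools
cd rbtools

# Merge the new upstream release tarball; this stores pristine-tar
# data so reviewers can regenerate the orig tarball when merging.
bzr merge-upstream --version 0.3.4 ../RBTools-0.3.4.tar.gz

# Resolve any conflicts, update debian/changelog, then commit.
bzr commit -m "Merge new upstream release 0.3.4"
```

The pristine-tar step is what a manual tarball copy skips, which would explain a reviewer's merge failing to find the upstream source.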
Blair
Blair Zajac (blair) wrote:
Could the merge issue be due to the fact that I branched from lp:ubuntu/precise/rbtools? What branch should I merge from?
Blair Zajac (blair) wrote:
Also, could you tell me which branch you're merging into? Is it possible to attempt the merge in my own bzr checkout? That would help.
Daniel Holbach (dholbach) wrote:
This is the output I get when I try to review your branch:
daniel@daydream:~$ bzr branch lp:ubuntu/rbtools
Most recent Ubuntu version: 0.2-1
Packaging branch status: CURRENT
Branched 2 revision(s).
daniel@daydream:~$ cd rbtools/
daniel@
bzr: ERROR: None 0.3.4 was not found in <PristineTarSource at bzr+ssh:
daniel@
Stéphane Graber (stgraber) wrote:
Based on the comments above, I'm marking this as work in progress; please switch it back to needs review once the merge issue is solved.
Thanks for your work!
Unmerged revisions
- 3. By Blair Zajac
-
* New upstream release. Closes: #613669, LP #886436.
* debian/watch: update URL.
* debian/patches: delete since upstream fixed missing files from
tarball.
Preview Diff
1 | === removed directory '.pc' |
2 | === removed file '.pc/.version' |
3 | --- .pc/.version 2010-07-31 18:31:05 +0000 |
4 | +++ .pc/.version 1970-01-01 00:00:00 +0000 |
5 | @@ -1,1 +0,0 @@ |
6 | -2 |
7 | |
8 | === removed file '.pc/applied-patches' |
9 | --- .pc/applied-patches 2010-07-31 18:31:05 +0000 |
10 | +++ .pc/applied-patches 1970-01-01 00:00:00 +0000 |
11 | @@ -1,1 +0,0 @@ |
12 | -fix_tarball |
13 | |
14 | === removed directory '.pc/fix_tarball' |
15 | === removed file '.pc/fix_tarball/.reviewboardrc' |
16 | === removed file '.pc/fix_tarball/COPYING' |
17 | === removed file '.pc/fix_tarball/INSTALL' |
18 | === removed file '.pc/fix_tarball/setup.cfg' |
19 | --- .pc/fix_tarball/setup.cfg 2010-07-31 18:31:05 +0000 |
20 | +++ .pc/fix_tarball/setup.cfg 1970-01-01 00:00:00 +0000 |
21 | @@ -1,16 +0,0 @@ |
22 | -[egg_info] |
23 | -tag_build = |
24 | -tag_date = 0 |
25 | -tag_svn_revision = 0 |
26 | - |
27 | -[aliases] |
28 | -alpha2 = egg_info -DRb alpha2 |
29 | -alpha1 = egg_info -DRb alpha1 |
30 | -rc1 = egg_info -DRb rc1 |
31 | -rc2 = egg_info -DRb rc2 |
32 | -nightly = egg_info -dR |
33 | -snapshot = egg_info -Dr |
34 | -beta2 = egg_info -DRb beta2 |
35 | -beta1 = egg_info -DRb beta1 |
36 | -release = egg_info -DRb '' |
37 | - |
38 | |
39 | === removed file '.reviewboardrc' |
40 | --- .reviewboardrc 2010-07-31 18:31:05 +0000 |
41 | +++ .reviewboardrc 1970-01-01 00:00:00 +0000 |
42 | @@ -1,1 +0,0 @@ |
43 | -REVIEWBOARD_URL = "http://reviews.reviewboard.org" |
44 | |
45 | === modified file 'AUTHORS' |
46 | --- AUTHORS 2010-07-31 18:31:05 +0000 |
47 | +++ AUTHORS 2011-11-06 07:43:24 +0000 |
48 | @@ -6,13 +6,23 @@ |
49 | |
50 | Contributors: |
51 | |
52 | + * Andrew Stitcher |
53 | * Anthony Cruz |
54 | + * Ben Hollis |
55 | * Bryan Halter |
56 | * Chris Clark |
57 | * Dan Savilonis |
58 | * Dana Lacoste |
59 | + * Daniel Cestari |
60 | + * Daniel LaMotte |
61 | + * David Gardner |
62 | + * Dick Porter |
63 | * Eric Huss |
64 | * Flavio Castelli |
65 | + * Gyula Faller |
66 | + * Holden Karau |
67 | + * Ian Monroe |
68 | + * Jan Koprowski |
69 | * Jason Felice |
70 | * Jeremy Bettis |
71 | * Laurent Nicolas |
72 | @@ -21,14 +31,20 @@ |
73 | * Luke Robison |
74 | * Matthew Woehlke |
75 | * Mike Crute |
76 | + * Nathan Dimmock |
77 | * Nathan Heijermans |
78 | + * Noah Kantrowitz |
79 | * Paul Scott |
80 | + * Peter Ward |
81 | * Petr Novák |
82 | * Raghu Kaippully |
83 | * Ravi Kondamuru |
84 | * Ryan Oblak |
85 | + * Ryan Shelley |
86 | + * Severin Gehwolf |
87 | * Stacey Sheldon |
88 | * Stefan Ring |
89 | + * Steven Ihde |
90 | * Steven Russell |
91 | * Thilo-Alexander Ginkel |
92 | * Tom Saeger |
93 | |
94 | === modified file 'MANIFEST.in' |
95 | --- MANIFEST.in 2010-07-31 18:31:05 +0000 |
96 | +++ MANIFEST.in 2011-11-06 07:43:24 +0000 |
97 | @@ -1,5 +1,7 @@ |
98 | recursive-include contrib *.py *.txt README* |
99 | include ez_setup.py |
100 | include AUTHORS |
101 | +include COPYING |
102 | +include INSTALL |
103 | include NEWS |
104 | include README |
105 | |
106 | === modified file 'PKG-INFO' |
107 | --- PKG-INFO 2010-07-31 18:31:05 +0000 |
108 | +++ PKG-INFO 2011-11-06 07:43:24 +0000 |
109 | @@ -1,12 +1,12 @@ |
110 | Metadata-Version: 1.0 |
111 | Name: RBTools |
112 | -Version: 0.2 |
113 | +Version: 0.3.4 |
114 | Summary: Command line tools for use with Review Board |
115 | Home-page: http://www.reviewboard.org/ |
116 | Author: Christian Hammond |
117 | Author-email: chipx86@chipx86.com |
118 | License: MIT |
119 | -Download-URL: http://downloads.reviewboard.org/releases/RBTools/0.2/ |
120 | +Download-URL: http://downloads.reviewboard.org/releases/RBTools/0.3/ |
121 | Description: UNKNOWN |
122 | Platform: UNKNOWN |
123 | Classifier: Development Status :: 4 - Beta |
124 | |
125 | === removed file 'RBTools.egg-info/PKG-INFO' |
126 | --- RBTools.egg-info/PKG-INFO 2010-07-31 18:31:05 +0000 |
127 | +++ RBTools.egg-info/PKG-INFO 1970-01-01 00:00:00 +0000 |
128 | @@ -1,18 +0,0 @@ |
129 | -Metadata-Version: 1.0 |
130 | -Name: RBTools |
131 | -Version: 0.2 |
132 | -Summary: Command line tools for use with Review Board |
133 | -Home-page: http://www.reviewboard.org/ |
134 | -Author: Christian Hammond |
135 | -Author-email: chipx86@chipx86.com |
136 | -License: MIT |
137 | -Download-URL: http://downloads.reviewboard.org/releases/RBTools/0.2/ |
138 | -Description: UNKNOWN |
139 | -Platform: UNKNOWN |
140 | -Classifier: Development Status :: 4 - Beta |
141 | -Classifier: Environment :: Console |
142 | -Classifier: Intended Audience :: Developers |
143 | -Classifier: License :: OSI Approved :: MIT License |
144 | -Classifier: Operating System :: OS Independent |
145 | -Classifier: Programming Language :: Python |
146 | -Classifier: Topic :: Software Development |
147 | |
148 | === removed file 'RBTools.egg-info/SOURCES.txt' |
149 | --- RBTools.egg-info/SOURCES.txt 2010-07-31 18:31:05 +0000 |
150 | +++ RBTools.egg-info/SOURCES.txt 1970-01-01 00:00:00 +0000 |
151 | @@ -1,18 +0,0 @@ |
152 | -AUTHORS |
153 | -MANIFEST.in |
154 | -NEWS |
155 | -README |
156 | -ez_setup.py |
157 | -setup.cfg |
158 | -setup.py |
159 | -RBTools.egg-info/PKG-INFO |
160 | -RBTools.egg-info/SOURCES.txt |
161 | -RBTools.egg-info/dependency_links.txt |
162 | -RBTools.egg-info/entry_points.txt |
163 | -RBTools.egg-info/top_level.txt |
164 | -contrib/P4Tool.txt |
165 | -contrib/README.P4Tool |
166 | -contrib/internal/release.py |
167 | -rbtools/__init__.py |
168 | -rbtools/postreview.py |
169 | -rbtools/tests.py |
170 | \ No newline at end of file |
171 | |
172 | === removed file 'RBTools.egg-info/dependency_links.txt' |
173 | --- RBTools.egg-info/dependency_links.txt 2010-07-31 18:31:05 +0000 |
174 | +++ RBTools.egg-info/dependency_links.txt 1970-01-01 00:00:00 +0000 |
175 | @@ -1,1 +0,0 @@ |
176 | -http://downloads.reviewboard.org/releases/RBTools/0.2/ |
177 | |
178 | === removed file 'RBTools.egg-info/entry_points.txt' |
179 | --- RBTools.egg-info/entry_points.txt 2010-07-31 18:31:05 +0000 |
180 | +++ RBTools.egg-info/entry_points.txt 1970-01-01 00:00:00 +0000 |
181 | @@ -1,3 +0,0 @@ |
182 | -[console_scripts] |
183 | -post-review = rbtools.postreview:main |
184 | - |
185 | |
186 | === removed file 'RBTools.egg-info/top_level.txt' |
187 | --- RBTools.egg-info/top_level.txt 2010-07-31 18:31:05 +0000 |
188 | +++ RBTools.egg-info/top_level.txt 1970-01-01 00:00:00 +0000 |
189 | @@ -1,1 +0,0 @@ |
190 | -rbtools |
191 | |
192 | === modified file 'contrib/internal/release.py' |
193 | --- contrib/internal/release.py 2010-07-31 18:31:05 +0000 |
194 | +++ contrib/internal/release.py 2011-11-06 07:43:24 +0000 |
195 | @@ -4,17 +4,20 @@ |
196 | # developers with release permissions. |
197 | # |
198 | |
199 | +import hashlib |
200 | +import mimetools |
201 | import os |
202 | -import re |
203 | import shutil |
204 | +import subprocess |
205 | import sys |
206 | import tempfile |
207 | +import urllib2 |
208 | |
209 | sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", "..")) |
210 | from rbtools import __version__, __version_info__, is_release |
211 | |
212 | |
213 | -PY_VERSIONS = ["2.4", "2.5", "2.6"] |
214 | +PY_VERSIONS = ["2.4", "2.5", "2.6", "2.7"] |
215 | |
216 | LATEST_PY_VERSION = PY_VERSIONS[-1] |
217 | |
218 | @@ -26,17 +29,65 @@ |
219 | __version_info__[0], |
220 | __version_info__[1]) |
221 | |
222 | +RBWEBSITE_API_URL = 'http://www.reviewboard.org/api/' |
223 | +RELEASES_API_URL = '%sproducts/rbtools/releases/' % RBWEBSITE_API_URL |
224 | + |
225 | |
226 | built_files = [] |
227 | |
228 | |
229 | +def load_config(): |
230 | + filename = os.path.join(os.path.expanduser('~'), '.rbwebsiterc') |
231 | + |
232 | + if not os.path.exists(filename): |
233 | + sys.stderr.write("A .rbwebsiterc file must exist in the form of:\n") |
234 | + sys.stderr.write("\n") |
235 | + sys.stderr.write("USERNAME = '<username>'\n") |
236 | + sys.stderr.write("PASSWORD = '<password>'\n") |
237 | + sys.exit(1) |
238 | + |
239 | + user_config = {} |
240 | + |
241 | + try: |
242 | + execfile(filename, user_config) |
243 | + except SyntaxError, e: |
244 | + sys.stderr.write('Syntax error in config file: %s\n' |
245 | + 'Line %i offset %i\n' % (filename, e.lineno, e.offset)) |
246 | + sys.exit(1) |
247 | + |
248 | + auth_handler = urllib2.HTTPBasicAuthHandler() |
249 | + auth_handler.add_password(realm='Web API', |
250 | + uri=RBWEBSITE_API_URL, |
251 | + user=user_config['USERNAME'], |
252 | + passwd=user_config['PASSWORD']) |
253 | + opener = urllib2.build_opener(auth_handler) |
254 | + urllib2.install_opener(opener) |
255 | + |
256 | + |
257 | def execute(cmdline): |
258 | - print ">>> %s" % cmdline |
259 | - |
260 | - if os.system(cmdline) != 0: |
261 | + if isinstance(cmdline, list): |
262 | + print ">>> %s" % subprocess.list2cmdline(cmdline) |
263 | + else: |
264 | + print ">>> %s" % cmdline |
265 | + |
266 | + p = subprocess.Popen(cmdline, |
267 | + shell=True, |
268 | + stdout=subprocess.PIPE) |
269 | + |
270 | + s = '' |
271 | + |
272 | + for data in p.stdout.readlines(): |
273 | + s += data |
274 | + sys.stdout.write(data) |
275 | + |
276 | + rc = p.wait() |
277 | + |
278 | + if rc != 0: |
279 | print "!!! Error invoking command." |
280 | sys.exit(1) |
281 | |
282 | + return s |
283 | + |
284 | |
285 | def run_setup(target, pyver = LATEST_PY_VERSION): |
286 | execute("python%s ./setup.py release %s" % (pyver, target)) |
287 | @@ -62,6 +113,23 @@ |
288 | (PACKAGE_NAME, __version__)) |
289 | |
290 | |
291 | +def build_checksums(): |
292 | + sha_filename = 'dist/%s-%s.sha256sum' % (PACKAGE_NAME, __version__) |
293 | + out_f = open(sha_filename, 'w') |
294 | + |
295 | + for filename in built_files: |
296 | + m = hashlib.sha256() |
297 | + |
298 | + in_f = open(filename, 'r') |
299 | + m.update(in_f.read()) |
300 | + in_f.close() |
301 | + |
302 | + out_f.write('%s %s\n' % (m.hexdigest(), os.path.basename(filename))) |
303 | + |
304 | + out_f.close() |
305 | + built_files.append(sha_filename) |
306 | + |
307 | + |
308 | def upload_files(): |
309 | execute("scp %s %s" % (" ".join(built_files), RELEASES_URL)) |
310 | |
311 | @@ -71,7 +139,51 @@ |
312 | |
313 | |
314 | def register_release(): |
315 | - run_setup("register") |
316 | + if __version_info__[4] == 'final': |
317 | + run_setup("register") |
318 | + |
319 | + scm_revision = execute(['git rev-parse', 'release-%s' % __version__]) |
320 | + |
321 | + data = { |
322 | + 'major_version': __version_info__[0], |
323 | + 'minor_version': __version_info__[1], |
324 | + 'micro_version': __version_info__[2], |
325 | + 'release_type': __version_info__[3], |
326 | + 'release_num': __version_info__[4], |
327 | + 'scm_revision': scm_revision, |
328 | + } |
329 | + |
330 | + boundary = mimetools.choose_boundary() |
331 | + content = '' |
332 | + |
333 | + for key, value in data.iteritems(): |
334 | + content += '--%s\r\n' % boundary |
335 | + content += 'Content-Disposition: form-data; name="%s"\r\n' % key |
336 | + content += '\r\n' |
337 | + content += str(value) + '\r\n' |
338 | + |
339 | + content += '--%s--\r\n' % boundary |
340 | + content += '\r\n' |
341 | + |
342 | + headers = { |
343 | + 'Content-Type': 'multipart/form-data; boundary=%s' % boundary, |
344 | + 'Content-Length': str(len(content)), |
345 | + } |
346 | + |
347 | + print 'Posting release to reviewboard.org' |
348 | + try: |
349 | + f = urllib2.urlopen(urllib2.Request(url=RELEASES_API_URL, data=content, |
350 | + headers=headers)) |
351 | + f.read() |
352 | + except urllib2.HTTPError, e: |
353 | + print "Error uploading. Got HTTP code %d:" % e.code |
354 | + print e.read() |
355 | + except urllib2.URLError, e: |
356 | + try: |
357 | + print "Error uploading. Got URL error:" % e.code |
358 | + print e.read() |
359 | + except AttributeError: |
360 | + pass |
361 | |
362 | |
363 | def main(): |
364 | @@ -80,6 +192,8 @@ |
365 | "Djblets tree.\n") |
366 | sys.exit(1) |
367 | |
368 | + load_config() |
369 | + |
370 | if not is_release(): |
371 | sys.stderr.write('This has not been marked as a release in ' |
372 | 'rbtools/__init__.py\n') |
373 | @@ -89,6 +203,7 @@ |
374 | git_dir = clone_git_tree(cur_dir) |
375 | |
376 | build_targets() |
377 | + build_checksums() |
378 | upload_files() |
379 | |
380 | os.chdir(cur_dir) |
381 | |
382 | === modified file 'debian/changelog' |
383 | --- debian/changelog 2010-07-31 18:31:05 +0000 |
384 | +++ debian/changelog 2011-11-06 07:43:24 +0000 |
385 | @@ -1,3 +1,12 @@ |
386 | +rbtools (0.3.4-1) precise; urgency=low |
387 | + |
388 | + * New upstream release. Closes: #613669, LP #886436. |
389 | + * debian/watch: update URL. |
390 | + * debian/patches: delete since upstream fixed missing files from |
391 | + tarball. |
392 | + |
393 | + -- Blair Zajac <blair@orcaware.com> Sun, 06 Nov 2011 00:16:52 -0700 |
394 | + |
395 | rbtools (0.2-1) unstable; urgency=low |
396 | |
397 | * Initial release. (Closes: #573485) |
398 | |
399 | === removed directory 'debian/patches' |
400 | === removed file 'debian/patches/fix_tarball' |
401 | --- debian/patches/fix_tarball 2010-07-31 18:31:05 +0000 |
402 | +++ debian/patches/fix_tarball 1970-01-01 00:00:00 +0000 |
403 | @@ -1,73 +0,0 @@ |
404 | -Description: Fix missing files in the tarball |
405 | - Some files that are missing from the tarball, but are included in |
406 | - the upstream release-0.2 branch. |
407 | -Author: Christian Hammond <chipx86@chipx86.com> |
408 | - |
409 | ---- |
410 | - |
411 | ---- /dev/null |
412 | -+++ rbtools-0.2/INSTALL |
413 | -@@ -0,0 +1,11 @@ |
414 | -+Installation |
415 | -+============ |
416 | -+ |
417 | -+To install rbtools, simply run the following as root: |
418 | -+ |
419 | -+ $ python setup.py install |
420 | -+ |
421 | -+ |
422 | -+Or to automatically download and install the latest version, you can run: |
423 | -+ |
424 | -+ $ easy_install -U RBTools |
425 | ---- rbtools-0.2.orig/setup.cfg |
426 | -+++ rbtools-0.2/setup.cfg |
427 | -@@ -1,16 +1,14 @@ |
428 | - [egg_info] |
429 | --tag_build = |
430 | --tag_date = 0 |
431 | --tag_svn_revision = 0 |
432 | -+tag_build = .dev |
433 | -+tag_svn_revision = 1 |
434 | - |
435 | - [aliases] |
436 | -+snapshot = egg_info -Dr |
437 | -+nightly = egg_info -dR |
438 | - alpha2 = egg_info -DRb alpha2 |
439 | - alpha1 = egg_info -DRb alpha1 |
440 | --rc1 = egg_info -DRb rc1 |
441 | --rc2 = egg_info -DRb rc2 |
442 | --nightly = egg_info -dR |
443 | --snapshot = egg_info -Dr |
444 | - beta2 = egg_info -DRb beta2 |
445 | - beta1 = egg_info -DRb beta1 |
446 | -+rc1 = egg_info -DRb rc1 |
447 | -+rc2 = egg_info -DRb rc2 |
448 | - release = egg_info -DRb '' |
449 | -- |
450 | ---- /dev/null |
451 | -+++ rbtools-0.2/.reviewboardrc |
452 | -@@ -0,0 +1 @@ |
453 | -+REVIEWBOARD_URL = "http://reviews.reviewboard.org" |
454 | ---- /dev/null |
455 | -+++ rbtools-0.2/COPYING |
456 | -@@ -0,0 +1,20 @@ |
457 | -+Copyright (c) 2007-2010 Christian Hammond |
458 | -+Copyright (c) 2007-2010 David Trowbridge |
459 | -+ |
460 | -+Permission is hereby granted, free of charge, to any person obtaining a copy of |
461 | -+this software and associated documentation files (the "Software"), to deal in |
462 | -+the Software without restriction, including without limitation the rights to |
463 | -+use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies |
464 | -+of the Software, and to permit persons to whom the Software is furnished to do |
465 | -+so, subject to the following conditions: |
466 | -+ |
467 | -+The above copyright notice and this permission notice shall be included in all |
468 | -+copies or substantial portions of the Software. |
469 | -+ |
470 | -+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR |
471 | -+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, |
472 | -+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE |
473 | -+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER |
474 | -+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, |
475 | -+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE |
476 | -+SOFTWARE. |
477 | |
478 | === removed file 'debian/patches/series' |
479 | --- debian/patches/series 2010-07-31 18:31:05 +0000 |
480 | +++ debian/patches/series 1970-01-01 00:00:00 +0000 |
481 | @@ -1,1 +0,0 @@ |
482 | -fix_tarball |
483 | |
484 | === modified file 'debian/watch' |
485 | --- debian/watch 2010-07-31 18:31:05 +0000 |
486 | +++ debian/watch 2011-11-06 07:43:24 +0000 |
487 | @@ -1,5 +1,4 @@ |
488 | version=3 |
489 | |
490 | -# FIXME: Does not produce the latest |
491 | opts="dversionmangle=s/~rc/rc/,uversionmangle=s/rc/~rc/" \ |
492 | - http://downloads.reviewboard.org/releases/RBTools/0.2/ RBTools-(.*)\.tar\.gz |
493 | + http://downloads.reviewboard.org/releases/RBTools/0.3/ RBTools-(.*)\.tar\.gz |
494 | |
495 | === modified file 'ez_setup.py' |
496 | --- ez_setup.py 2010-07-31 18:31:05 +0000 |
497 | +++ ez_setup.py 2011-11-06 07:43:24 +0000 |
498 | @@ -14,7 +14,7 @@ |
499 | This file can also be run as a script to install or upgrade setuptools. |
500 | """ |
501 | import sys |
502 | -DEFAULT_VERSION = "0.6c8" |
503 | +DEFAULT_VERSION = "0.6c11" |
504 | DEFAULT_URL = "http://pypi.python.org/packages/%s/s/setuptools/" % sys.version[:3] |
505 | |
506 | md5_data = { |
507 | @@ -28,6 +28,14 @@ |
508 | 'setuptools-0.6b4-py2.4.egg': '4cb2a185d228dacffb2d17f103b3b1c4', |
509 | 'setuptools-0.6c1-py2.3.egg': 'b3f2b5539d65cb7f74ad79127f1a908c', |
510 | 'setuptools-0.6c1-py2.4.egg': 'b45adeda0667d2d2ffe14009364f2a4b', |
511 | + 'setuptools-0.6c10-py2.3.egg': 'ce1e2ab5d3a0256456d9fc13800a7090', |
512 | + 'setuptools-0.6c10-py2.4.egg': '57d6d9d6e9b80772c59a53a8433a5dd4', |
513 | + 'setuptools-0.6c10-py2.5.egg': 'de46ac8b1c97c895572e5e8596aeb8c7', |
514 | + 'setuptools-0.6c10-py2.6.egg': '58ea40aef06da02ce641495523a0b7f5', |
515 | + 'setuptools-0.6c11-py2.3.egg': '2baeac6e13d414a9d28e7ba5b5a596de', |
516 | + 'setuptools-0.6c11-py2.4.egg': 'bd639f9b0eac4c42497034dec2ec0c2b', |
517 | + 'setuptools-0.6c11-py2.5.egg': '64c94f3bf7a72a13ec83e0b24f2749b2', |
518 | + 'setuptools-0.6c11-py2.6.egg': 'bfa92100bd772d5a213eedd356d64086', |
519 | 'setuptools-0.6c2-py2.3.egg': 'f0064bf6aa2b7d0f3ba0b43f20817c27', |
520 | 'setuptools-0.6c2-py2.4.egg': '616192eec35f47e8ea16cd6a122b7277', |
521 | 'setuptools-0.6c3-py2.3.egg': 'f181fa125dfe85a259c9cd6f1d7b78fa', |
522 | @@ -48,13 +56,18 @@ |
523 | 'setuptools-0.6c8-py2.3.egg': '50759d29b349db8cfd807ba8303f1902', |
524 | 'setuptools-0.6c8-py2.4.egg': 'cba38d74f7d483c06e9daa6070cce6de', |
525 | 'setuptools-0.6c8-py2.5.egg': '1721747ee329dc150590a58b3e1ac95b', |
526 | + 'setuptools-0.6c9-py2.3.egg': 'a83c4020414807b496e4cfbe08507c03', |
527 | + 'setuptools-0.6c9-py2.4.egg': '260a2be2e5388d66bdaee06abec6342a', |
528 | + 'setuptools-0.6c9-py2.5.egg': 'fe67c3e5a17b12c0e7c541b7ea43a8e6', |
529 | + 'setuptools-0.6c9-py2.6.egg': 'ca37b1ff16fa2ede6e19383e7b59245a', |
530 | } |
531 | |
532 | import sys, os |
533 | +try: from hashlib import md5 |
534 | +except ImportError: from md5 import md5 |
535 | |
536 | def _validate_md5(egg_name, data): |
537 | if egg_name in md5_data: |
538 | - from md5 import md5 |
539 | digest = md5(data).hexdigest() |
540 | if digest != md5_data[egg_name]: |
541 | print >>sys.stderr, ( |
542 | @@ -64,7 +77,6 @@ |
543 | sys.exit(2) |
544 | return data |
545 | |
546 | - |
547 | def use_setuptools( |
548 | version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir, |
549 | download_delay=15 |
550 | @@ -100,11 +112,11 @@ |
551 | "\n\n(Currently using %r)" |
552 | ) % (version, e.args[0]) |
553 | sys.exit(2) |
554 | - else: |
555 | - del pkg_resources, sys.modules['pkg_resources'] # reload ok |
556 | - return do_download() |
557 | except pkg_resources.DistributionNotFound: |
558 | - return do_download() |
559 | + pass |
560 | + |
561 | + del pkg_resources, sys.modules['pkg_resources'] # reload ok |
562 | + return do_download() |
563 | |
564 | def download_setuptools( |
565 | version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir, |
566 | @@ -233,7 +245,6 @@ |
567 | """Update our built-in md5 registry""" |
568 | |
569 | import re |
570 | - from md5 import md5 |
571 | |
572 | for name in filenames: |
573 | base = os.path.basename(name) |
574 | @@ -270,3 +281,4 @@ |
575 | |
576 | |
577 | |
578 | + |
579 | |
580 | === modified file 'rbtools/__init__.py' |
581 | --- rbtools/__init__.py 2010-07-31 18:31:05 +0000 |
582 | +++ rbtools/__init__.py 2011-11-06 07:43:24 +0000 |
583 | @@ -31,7 +31,7 @@ |
584 | # |
585 | # (Major, Minor, Micro, alpha/beta/rc/final, Release Number, Released) |
586 | # |
587 | -VERSION = (0, 2, 0, 'final', 0, True) |
588 | +VERSION = (0, 3, 4, 'final', 0, True) |
589 | |
590 | |
591 | def get_version_string(): |
592 | |
593 | === modified file 'rbtools/postreview.py' |
594 | --- rbtools/postreview.py 2010-07-31 18:31:05 +0000 |
595 | +++ rbtools/postreview.py 2011-11-06 07:43:24 +0000 |
596 | @@ -1,10 +1,9 @@ |
597 | #!/usr/bin/env python |
598 | +import base64 |
599 | import cookielib |
600 | -import difflib |
601 | import getpass |
602 | import marshal |
603 | import mimetools |
604 | -import ntpath |
605 | import os |
606 | import re |
607 | import socket |
608 | @@ -14,7 +13,9 @@ |
609 | import tempfile |
610 | import urllib |
611 | import urllib2 |
612 | +from datetime import datetime |
613 | from optparse import OptionParser |
614 | +from pkg_resources import parse_version |
615 | from tempfile import mkstemp |
616 | from urlparse import urljoin, urlparse |
617 | |
618 | @@ -25,9 +26,11 @@ |
619 | from md5 import md5 |
620 | |
621 | try: |
622 | - import json |
623 | + # Specifically import json_loads, to work around some issues with |
624 | + # installations containing incompatible modules named "json". |
625 | + from json import loads as json_loads |
626 | except ImportError: |
627 | - import simplejson as json |
628 | + from simplejson import loads as json_loads |
629 | |
630 | # This specific import is necessary to handle the paths for |
631 | # cygwin enabled machines. |
632 | @@ -111,6 +114,7 @@ |
633 | user_config = None |
634 | tempfiles = [] |
635 | options = None |
636 | +configs = [] |
637 | |
638 | ADD_REPOSITORY_DOCS_URL = \ |
639 | 'http://www.reviewboard.org/docs/manual/dev/admin/management/repositories/' |
640 | @@ -136,6 +140,15 @@ |
641 | return code_str |
642 | |
643 | |
644 | +class HTTPRequest(urllib2.Request): |
645 | + def __init__(self, url, body='', headers={}, method="PUT"): |
646 | + urllib2.Request.__init__(self, url, body, headers) |
647 | + self.method = method |
648 | + |
649 | + def get_method(self): |
650 | + return self.method |
651 | + |
652 | + |
653 | class RepositoryInfo: |
654 | """ |
655 | A representation of a source code repository. |
656 | @@ -229,7 +242,7 @@ |
657 | |
658 | # If one of the directories doesn't match, then path is not relative |
659 | # to root. |
660 | - if rootdirs != pathdirs: |
661 | + if rootdirs != pathdirs[:len(rootdirs)]: |
662 | return None |
663 | |
664 | # All the directories matched, so the relative path is whatever |
665 | @@ -238,7 +251,7 @@ |
666 | if len(pathdirs) == len(rootdirs): |
667 | return '/' |
668 | else: |
669 | - return '/'.join(pathdirs[len(rootdirs):]) |
670 | + return '/' + '/'.join(pathdirs[len(rootdirs):]) |
671 | |
672 | def _split_on_slash(self, path): |
673 | # Split on slashes, but ignore multiple slashes and throw away any |
674 | @@ -248,6 +261,168 @@ |
675 | split = split[0:-1] |
676 | return split |
677 | |
678 | +class ClearCaseRepositoryInfo(RepositoryInfo): |
679 | + """ |
680 | + A representation of a ClearCase source code repository. This version knows |
681 | + how to find a matching repository on the server even if the URLs differ. |
682 | + """ |
683 | + |
684 | + def __init__(self, path, base_path, vobstag, supports_parent_diffs=False): |
685 | + RepositoryInfo.__init__(self, path, base_path, |
686 | + supports_parent_diffs=supports_parent_diffs) |
687 | + self.vobstag = vobstag |
688 | + |
689 | + def find_server_repository_info(self, server): |
690 | + """ |
691 | + The point of this function is to find a repository on the server that |
692 | + matches self, even if the paths aren't the same. (For example, if self |
693 | + uses an 'http' path, but the server uses a 'file' path for the same |
694 | + repository.) It does this by comparing VOB's name. If the |
695 | + repositories use the same path, you'll get back self, otherwise you'll |
696 | + get a different ClearCaseRepositoryInfo object (with a different path). |
697 | + """ |
698 | + |
699 | + # Find VOB's family uuid based on VOB's tag |
700 | + uuid = self._get_vobs_uuid(self.vobstag) |
701 | + debug("Repositorie's %s uuid is %r" % (self.vobstag, uuid)) |
702 | + |
703 | + repositories = server.get_repositories() |
704 | + for repository in repositories: |
705 | + if repository['tool'] != 'ClearCase': |
706 | + continue |
707 | + |
708 | + info = self._get_repository_info(server, repository) |
709 | + |
710 | + if not info or uuid != info['uuid']: |
711 | + continue |
712 | + |
713 | + debug('Matching repository uuid:%s with path:%s' %(uuid, |
714 | + info['repopath'])) |
715 | + return ClearCaseRepositoryInfo(info['repopath'], |
716 | + info['repopath'], uuid) |
717 | + |
718 | + # We didn't found uuid but if version is >= 1.5.3 |
719 | + # we can try to use VOB's name hoping it is better |
720 | + # than current VOB's path. |
721 | + if server.rb_version >= '1.5.3': |
722 | + self.path = cpath.split(self.vobstag)[1] |
723 | + |
724 | + # We didn't find a matching repository on the server. |
725 | + # We'll just return self and hope for the best. |
726 | + return self |
727 | + |
728 | + def _get_vobs_uuid(self, vobstag): |
729 | + """Return family uuid of VOB.""" |
730 | + |
731 | + property_lines = execute(["cleartool", "lsvob", "-long", vobstag], |
732 | + split_lines=True) |
733 | + for line in property_lines: |
734 | + if line.startswith('Vob family uuid:'): |
735 | + return line.split(' ')[-1].rstrip() |
736 | + |
737 | + def _get_repository_info(self, server, repository): |
738 | + try: |
739 | + return server.get_repository_info(repository['id']) |
740 | + except APIError, e: |
741 | + # If the server couldn't fetch the repository info, it will return |
742 | + # code 210. Ignore those. |
743 | + # Other more serious errors should still be raised, though. |
744 | + if e.error_code == 210: |
745 | + return None |
746 | + |
747 | + raise e |
748 | + |
749 | + |
750 | +class PresetHTTPAuthHandler(urllib2.BaseHandler): |
751 | + """urllib2 handler that conditionally presets the use of HTTP Basic Auth. |
752 | + |
753 | + This is used when specifying --username= on the command line. It will |
754 | + force an HTTP_AUTHORIZATION header with the user info, asking the user |
755 | + for any missing info beforehand. It will then try this header for that |
756 | + first request. |
757 | + |
758 | + It will only do this once. |
759 | + """ |
760 | + handler_order = 480 # After Basic auth |
761 | + |
762 | + def __init__(self, url, password_mgr): |
763 | + self.url = url |
764 | + self.password_mgr = password_mgr |
765 | + self.used = False |
766 | + |
767 | + def reset(self): |
768 | + self.password_mgr.rb_user = options.http_username |
769 | + self.password_mgr.rb_pass = options.http_password |
770 | + self.used = False |
771 | + |
772 | + def http_request(self, request): |
773 | + if options.username and not self.used: |
774 | + # Note that we call password_mgr.find_user_password to get the |
775 | + # username and password we're working with. This allows us to |
776 | + # prompt if, say, --username was specified but --password was not. |
777 | + username, password = \ |
778 | + self.password_mgr.find_user_password('Web API', self.url) |
779 | + raw = '%s:%s' % (username, password) |
780 | + request.add_header( |
781 | + urllib2.HTTPBasicAuthHandler.auth_header, |
782 | + 'Basic %s' % base64.b64encode(raw).strip()) |
783 | + self.used = True |
784 | + |
785 | + return request |
786 | + |
787 | + https_request = http_request |
788 | + |
789 | + |
790 | +class ReviewBoardHTTPErrorProcessor(urllib2.HTTPErrorProcessor): |
791 | + """Processes HTTP error codes. |
792 | + |
793 | + Python 2.6 gets HTTP error code processing right, but 2.4 and 2.5 only |
794 | + accepts HTTP 200 and 206 as success codes. This handler ensures that |
795 | + anything in the 200 range is a success. |
796 | + """ |
797 | + def http_response(self, request, response): |
798 | + if not (200 <= response.code < 300): |
799 | + response = self.parent.error('http', request, response, |
800 | + response.code, response.msg, |
801 | + response.info()) |
802 | + |
803 | + return response |
804 | + |
805 | + https_response = http_response |
806 | + |
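The success test the error processor applies is simply "any 2xx status". A trivial sketch of that predicate (illustrative only):

```python
def is_success(code):
    # Anything in the 2xx range counts as success, matching the
    # ReviewBoardHTTPErrorProcessor check above; older urllib2 on
    # Python 2.4/2.5 only treated 200 and 206 this way.
    return 200 <= code < 300


print(is_success(201), is_success(404))
```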
807 | + |
808 | +class ReviewBoardHTTPBasicAuthHandler(urllib2.HTTPBasicAuthHandler): |
809 | + """Custom Basic Auth handler that doesn't retry excessively. |
810 | + |
811 | + urllib2's HTTPBasicAuthHandler retries over and over, which is useless. |
812 | + This subclass only retries once to make sure we've attempted with a |
813 | + valid username and password. It will then fail so we can use |
814 | + tempt_fate's retry handler. |
815 | + """ |
816 | + def __init__(self, *args, **kwargs): |
817 | + urllib2.HTTPBasicAuthHandler.__init__(self, *args, **kwargs) |
818 | + self._retried = False |
819 | + self._lasturl = "" |
820 | + |
821 | + def retry_http_basic_auth(self, *args, **kwargs): |
822 | + if self._lasturl != args[0]: |
823 | + self._retried = False |
824 | + |
825 | + self._lasturl = args[0] |
826 | + |
827 | + if not self._retried: |
828 | + self._retried = True |
829 | + self.retried = 0 |
830 | + response = urllib2.HTTPBasicAuthHandler.retry_http_basic_auth( |
831 | + self, *args, **kwargs) |
832 | + |
833 | + if response.code != 401: |
834 | + self._retried = False |
835 | + |
836 | + return response |
837 | + else: |
838 | + return None |
839 | + |
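The retry logic above amounts to a per-URL "retry exactly once" guard: the flag resets when the URL changes or when a retry succeeds. The core of that pattern can be sketched on its own (hypothetical class name, not from RBTools):

```python
class OneRetryGuard:
    """Allow exactly one retry per URL, mirroring the Basic auth handler
    above (a sketch of the pattern, not the handler itself)."""

    def __init__(self):
        self._retried = False
        self._last_url = None

    def should_retry(self, url):
        # A new URL gets a fresh retry budget.
        if self._last_url != url:
            self._retried = False

        self._last_url = url

        if self._retried:
            return False

        self._retried = True
        return True


g = OneRetryGuard()
print(g.should_retry('http://a'), g.should_retry('http://a'),
      g.should_retry('http://b'))
```

Without such a guard, urllib2's stock handler keeps re-answering the 401 challenge with the same bad credentials.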
840 | |
841 | class ReviewBoardHTTPPasswordMgr(urllib2.HTTPPasswordMgr): |
842 | """ |
843 | @@ -260,11 +435,11 @@ |
844 | |
845 | See: http://bugs.python.org/issue974757 |
846 | """ |
847 | - def __init__(self, reviewboard_url): |
848 | + def __init__(self, reviewboard_url, rb_user=None, rb_pass=None): |
849 | self.passwd = {} |
850 | self.rb_url = reviewboard_url |
851 | - self.rb_user = None |
852 | - self.rb_pass = None |
853 | + self.rb_user = rb_user |
854 | + self.rb_pass = rb_pass |
855 | |
856 | def find_user_password(self, realm, uri): |
857 | if uri.startswith(self.rb_url): |
858 | @@ -274,10 +449,14 @@ |
859 | 'used with --diff-filename=-') |
860 | |
861 | print "==> HTTP Authentication Required" |
862 | - print 'Enter username and password for "%s" at %s' % \ |
863 | + print 'Enter authorization information for "%s" at %s' % \ |
864 | (realm, urlparse(uri)[1]) |
865 | - self.rb_user = raw_input('Username: ') |
866 | - self.rb_pass = getpass.getpass('Password: ') |
867 | + |
868 | + if not self.rb_user: |
869 | + self.rb_user = raw_input('Username: ') |
870 | + |
871 | + if not self.rb_pass: |
872 | + self.rb_pass = getpass.getpass('Password: ') |
873 | |
874 | return self.rb_user, self.rb_pass |
875 | else: |
876 | @@ -296,59 +475,110 @@ |
877 | self.url += '/' |
878 | self._info = info |
879 | self._server_info = None |
880 | + self.root_resource = None |
881 | + self.deprecated_api = False |
882 | self.cookie_file = cookie_file |
883 | self.cookie_jar = cookielib.MozillaCookieJar(self.cookie_file) |
884 | |
885 | + if self.cookie_file: |
886 | + try: |
887 | + self.cookie_jar.load(self.cookie_file, ignore_expires=True) |
888 | + except IOError: |
889 | + pass |
890 | + |
891 | # Set up the HTTP libraries to support all of the features we need. |
892 | cookie_handler = urllib2.HTTPCookieProcessor(self.cookie_jar) |
893 | - password_mgr = ReviewBoardHTTPPasswordMgr(self.url) |
894 | - basic_auth_handler = urllib2.HTTPBasicAuthHandler(password_mgr) |
895 | + password_mgr = ReviewBoardHTTPPasswordMgr(self.url, |
896 | + options.username, |
897 | + options.password) |
898 | + basic_auth_handler = ReviewBoardHTTPBasicAuthHandler(password_mgr) |
899 | digest_auth_handler = urllib2.HTTPDigestAuthHandler(password_mgr) |
900 | + self.preset_auth_handler = PresetHTTPAuthHandler(self.url, password_mgr) |
901 | + http_error_processor = ReviewBoardHTTPErrorProcessor() |
902 | |
903 | opener = urllib2.build_opener(cookie_handler, |
904 | basic_auth_handler, |
905 | - digest_auth_handler) |
906 | + digest_auth_handler, |
907 | + self.preset_auth_handler, |
908 | + http_error_processor) |
909 | opener.addheaders = [('User-agent', 'RBTools/' + get_package_version())] |
910 | urllib2.install_opener(opener) |
911 | |
912 | + def check_api_version(self): |
913 | + """Checks the API version on the server to determine which to use.""" |
914 | + try: |
915 | + root_resource = self.api_get('api/') |
916 | + rsp = self.api_get(root_resource['links']['info']['href']) |
917 | + |
918 | + self.rb_version = rsp['info']['product']['package_version'] |
919 | + |
920 | + if parse_version(self.rb_version) >= parse_version('1.5.2'): |
921 | + self.deprecated_api = False |
922 | + self.root_resource = root_resource |
923 | + debug('Using the new web API') |
924 | + return True |
925 | + except APIError, e: |
926 | + if e.http_status not in (401, 404): |
927 | + # We shouldn't reach this. If there's a permission denied |
928 | + # from lack of logging in, then the basic auth handler |
929 | + # should have hit it. |
930 | + # |
931 | + # However, some versions require you to be logged in |
932 | + # and return a 401 from the application even after |
933 | + # HTTP basic auth has succeeded. |
934 | + die("Unable to access the root /api/ URL on the server.") |
935 | + |
936 | + return False |
937 | + |
938 | + # This is an older Review Board server with the old API. |
939 | + self.deprecated_api = True |
940 | + debug('Using the deprecated Review Board 1.0 web API') |
941 | + return True |
942 | + |
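The `parse_version` comparison in `check_api_version` is how the client decides between the new and deprecated APIs: servers at 1.5.2 or later get the new one. RBTools uses setuptools' `parse_version`, which also understands pre-release tags; for plain dotted versions the comparison reduces to a tuple compare (a naive sketch, not the setuptools implementation):

```python
def version_tuple(s):
    # Naive numeric-only parse; setuptools' parse_version additionally
    # handles suffixes like "1.5.2rc1", which this sketch does not.
    return tuple(int(p) for p in s.split('.'))


# The cutoff used above: 1.5.2 and newer speak the new web API.
print(version_tuple('1.6.0') >= version_tuple('1.5.2'))
```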
943 | def login(self, force=False): |
944 | """ |
945 | Logs in to a Review Board server, prompting the user for login |
946 | information if needed. |
947 | """ |
948 | - if not force and self.has_valid_cookie(): |
949 | - return |
950 | - |
951 | if (options.diff_filename == '-' and |
952 | not options.username and not options.submit_as and |
953 | not options.password): |
954 | die('Authentication information needs to be provided on ' |
955 | 'the command line when using --diff-filename=-') |
956 | |
957 | - print "==> Review Board Login Required" |
958 | - print "Enter username and password for Review Board at %s" % self.url |
959 | - if options.username: |
960 | - username = options.username |
961 | - elif options.submit_as: |
962 | - username = options.submit_as |
963 | - else: |
964 | - username = raw_input('Username: ') |
965 | - |
966 | - if not options.password: |
967 | - password = getpass.getpass('Password: ') |
968 | - else: |
969 | - password = options.password |
970 | - |
971 | - debug('Logging in with username "%s"' % username) |
972 | - try: |
973 | - self.api_post('api/json/accounts/login/', { |
974 | - 'username': username, |
975 | - 'password': password, |
976 | - }) |
977 | - except APIError, e: |
978 | - die("Unable to log in: %s" % e) |
979 | - |
980 | - debug("Logged in.") |
981 | + if self.deprecated_api: |
982 | + print "==> Review Board Login Required" |
983 | + print "Enter username and password for Review Board at %s" % \ |
984 | + self.url |
985 | + |
986 | + if options.username: |
987 | + username = options.username |
988 | + elif options.submit_as: |
989 | + username = options.submit_as |
990 | + elif not force and self.has_valid_cookie(): |
991 | + # We delay the check for a valid cookie until after looking |
992 | + # at args, so that it doesn't override the command line. |
993 | + return |
994 | + else: |
995 | + username = raw_input('Username: ') |
996 | + |
997 | + if not options.password: |
998 | + password = getpass.getpass('Password: ') |
999 | + else: |
1000 | + password = options.password |
1001 | + |
1002 | + debug('Logging in with username "%s"' % username) |
1003 | + try: |
1004 | + self.api_post('api/json/accounts/login/', { |
1005 | + 'username': username, |
1006 | + 'password': password, |
1007 | + }) |
1008 | + except APIError, e: |
1009 | + die("Unable to log in: %s" % e) |
1010 | + |
1011 | + debug("Logged in.") |
1012 | + elif force: |
1013 | + self.preset_auth_handler.reset() |
1014 | |
1015 | def has_valid_cookie(self): |
1016 | """ |
1017 | @@ -365,9 +595,12 @@ |
1018 | # get rid of the port number if it's present. |
1019 | host = host.split(":")[0] |
1020 | |
1021 | + # Cookie files also append .local to bare hostnames |
1022 | + if '.' not in host: |
1023 | + host += '.local' |
1024 | + |
1025 | debug("Looking for '%s %s' cookie in %s" % \ |
1026 | (host, path, self.cookie_file)) |
1027 | - self.cookie_jar.load(self.cookie_file, ignore_expires=True) |
1028 | |
1029 | try: |
1030 | cookie = self.cookie_jar._cookies[host][path]['rbsessionid'] |
1031 | @@ -384,6 +617,13 @@ |
1032 | |
1033 | return False |
1034 | |
1035 | + def get_configured_repository(self): |
1036 | + for config in configs: |
1037 | + if 'REPOSITORY' in config: |
1038 | + return config['REPOSITORY'] |
1039 | + |
1040 | + return None |
1041 | + |
1042 | def new_review_request(self, changenum, submit_as=None): |
1043 | """ |
1044 | Creates a review request on a Review Board server, updating an |
1045 | @@ -407,10 +647,39 @@ |
1046 | self.info.path = repository['path'] |
1047 | break |
1048 | |
1049 | + if isinstance(self.info.path, list): |
1050 | + sys.stderr.write('\n') |
1051 | + sys.stderr.write('There was an error creating this review ' |
1052 | + 'request.\n') |
1053 | + sys.stderr.write('\n') |
1054 | + sys.stderr.write('There was no matching repository path ' |
1055 | + 'found on the server.\n') |
1056 | + sys.stderr.write('List of configured repositories:\n') |
1057 | + |
1058 | + for repository in repositories: |
1059 | + sys.stderr.write('\t%s\n' % repository['path']) |
1060 | + |
1061 | + sys.stderr.write('Unknown repository paths found:\n') |
1062 | + |
1063 | + for foundpath in self.info.path: |
1064 | + sys.stderr.write('\t%s\n' % foundpath) |
1065 | + |
1066 | + sys.stderr.write('Ask the administrator to add one of ' |
1067 | + 'these repositories\n') |
1068 | + sys.stderr.write('to the Review Board server.\n') |
1069 | + sys.stderr.write('For information on adding repositories, ' |
1070 | + 'please read\n') |
1071 | + sys.stderr.write(ADD_REPOSITORY_DOCS_URL + '\n') |
1072 | + die() |
1073 | + |
1074 | + repository = options.repository_url \ |
1075 | + or self.get_configured_repository() \ |
1076 | + or self.info.path |
1077 | + |
1078 | try: |
1079 | debug("Attempting to create review request on %s for %s" % |
1080 | - (self.info.path, changenum)) |
1081 | - data = { 'repository_path': self.info.path } |
1082 | + (repository, changenum)) |
1083 | + data = {} |
1084 | |
1085 | if changenum: |
1086 | data['changenum'] = changenum |
1087 | @@ -419,7 +688,16 @@ |
1088 | debug("Submitting the review request as %s" % submit_as) |
1089 | data['submit_as'] = submit_as |
1090 | |
1091 | - rsp = self.api_post('api/json/reviewrequests/new/', data) |
1092 | + if self.deprecated_api: |
1093 | + data['repository_path'] = repository |
1094 | + rsp = self.api_post('api/json/reviewrequests/new/', data) |
1095 | + else: |
1096 | + data['repository'] = repository |
1097 | + |
1098 | + links = self.root_resource['links'] |
1099 | + assert 'review_requests' in links |
1100 | + review_request_href = links['review_requests']['href'] |
1101 | + rsp = self.api_post(review_request_href, data) |
1102 | except APIError, e: |
1103 | if e.error_code == 204: # Change number in use |
1104 | rsp = e.rsp |
1105 | @@ -429,9 +707,8 @@ |
1106 | debug("Review request already exists.") |
1107 | else: |
1108 | debug("Review request already exists. Updating it...") |
1109 | - rsp = self.api_post( |
1110 | - 'api/json/reviewrequests/%s/update_from_changenum/' % |
1111 | - rsp['review_request']['id']) |
1112 | + self.update_review_request_from_changenum( |
1113 | + changenum, rsp['review_request']) |
1114 | elif e.error_code == 206: # Invalid repository |
1115 | sys.stderr.write('\n') |
1116 | sys.stderr.write('There was an error creating this review ' |
1117 | @@ -454,6 +731,16 @@ |
1118 | |
1119 | return rsp['review_request'] |
1120 | |
1121 | + def update_review_request_from_changenum(self, changenum, review_request): |
1122 | + if self.deprecated_api: |
1123 | + self.api_post( |
1124 | + 'api/json/reviewrequests/%s/update_from_changenum/' |
1125 | + % review_request['id']) |
1126 | + else: |
1127 | + self.api_put(review_request['links']['self']['href'], { |
1128 | + 'changenum': review_request['changenum'], |
1129 | + }) |
1130 | + |
1131 | def set_review_request_field(self, review_request, field, value): |
1132 | """ |
1133 | Sets a field in a review request to the specified value. |
1134 | @@ -463,37 +750,75 @@ |
1135 | debug("Attempting to set field '%s' to '%s' for review request '%s'" % |
1136 | (field, value, rid)) |
1137 | |
1138 | - self.api_post('api/json/reviewrequests/%s/draft/set/' % rid, { |
1139 | - field: value, |
1140 | - }) |
1141 | + if self.deprecated_api: |
1142 | + self.api_post('api/json/reviewrequests/%s/draft/set/' % rid, { |
1143 | + field: value, |
1144 | + }) |
1145 | + else: |
1146 | + self.api_put(review_request['links']['draft']['href'], { |
1147 | + field: value, |
1148 | + }) |
1149 | |
1150 | def get_review_request(self, rid): |
1151 | """ |
1152 | Returns the review request with the specified ID. |
1153 | """ |
1154 | - rsp = self.api_get('api/json/reviewrequests/%s/' % rid) |
1155 | + if self.deprecated_api: |
1156 | + url = 'api/json/reviewrequests/%s/' % rid |
1157 | + else: |
1158 | + url = '%s%s/' % ( |
1159 | + self.root_resource['links']['review_requests']['href'], rid) |
1160 | + |
1161 | + rsp = self.api_get(url) |
1162 | + |
1163 | return rsp['review_request'] |
1164 | |
1165 | def get_repositories(self): |
1166 | """ |
1167 | Returns the list of repositories on this server. |
1168 | """ |
1169 | - rsp = self.api_get('/api/json/repositories/') |
1170 | - return rsp['repositories'] |
1171 | + if self.deprecated_api: |
1172 | + rsp = self.api_get('api/json/repositories/') |
1173 | + repositories = rsp['repositories'] |
1174 | + else: |
1175 | + rsp = self.api_get( |
1176 | + self.root_resource['links']['repositories']['href']) |
1177 | + repositories = rsp['repositories'] |
1178 | + |
1179 | + while 'next' in rsp['links']: |
1180 | + rsp = self.api_get(rsp['links']['next']['href']) |
1181 | + repositories.extend(rsp['repositories']) |
1182 | + |
1183 | + return repositories |
1184 | |
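The new-API branch of `get_repositories` pages through results by following each response's `next` link until none remains. That link-following loop generalizes to any paginated resource; a sketch with a stand-in for `api_get` (the two-page payload below is hypothetical):

```python
def collect_paginated(fetch, first_url, key):
    # Follow 'next' links until exhausted, as get_repositories does.
    # `fetch` stands in for api_get and returns a decoded JSON payload.
    rsp = fetch(first_url)
    items = list(rsp[key])

    while 'next' in rsp['links']:
        rsp = fetch(rsp['links']['next']['href'])
        items.extend(rsp[key])

    return items


# Hypothetical two-page payload for illustration:
pages = {
    '/api/repositories/': {'repositories': [1, 2],
                           'links': {'next': {'href': '/page2'}}},
    '/page2': {'repositories': [3], 'links': {}},
}
print(collect_paginated(pages.get, '/api/repositories/', 'repositories'))
```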
1185 | def get_repository_info(self, rid): |
1186 | """ |
1187 | Returns detailed information about a specific repository. |
1188 | """ |
1189 | - rsp = self.api_get('/api/json/repositories/%s/info/' % rid) |
1190 | + if self.deprecated_api: |
1191 | + url = 'api/json/repositories/%s/info/' % rid |
1192 | + else: |
1193 | + rsp = self.api_get( |
1194 | + '%s%s/' % (self.root_resource['links']['repositories']['href'], |
1195 | + rid)) |
1196 | + url = rsp['repository']['links']['info']['href'] |
1197 | + |
1198 | + rsp = self.api_get(url) |
1199 | + |
1200 | return rsp['info'] |
1201 | |
1202 | def save_draft(self, review_request): |
1203 | """ |
1204 | Saves a draft of a review request. |
1205 | """ |
1206 | - self.api_post("api/json/reviewrequests/%s/draft/save/" % |
1207 | - review_request['id']) |
1208 | + if self.deprecated_api: |
1209 | + self.api_post('api/json/reviewrequests/%s/draft/save/' % \ |
1210 | + review_request['id']) |
1211 | + else: |
1212 | + self.api_put(review_request['links']['draft']['href'], { |
1213 | + 'public': 1, |
1214 | + }) |
1215 | + |
1216 | debug("Review request draft saved") |
1217 | |
1218 | def upload_diff(self, review_request, diff_content, parent_diff_content): |
1219 | @@ -522,16 +847,40 @@ |
1220 | 'content': parent_diff_content |
1221 | } |
1222 | |
1223 | - self.api_post('api/json/reviewrequests/%s/diff/new/' % |
1224 | - review_request['id'], fields, files) |
1225 | + if self.deprecated_api: |
1226 | + self.api_post('api/json/reviewrequests/%s/diff/new/' % |
1227 | + review_request['id'], fields, files) |
1228 | + else: |
1229 | + self.api_post(review_request['links']['diffs']['href'], |
1230 | + fields, files) |
1231 | + |
1232 | + def reopen(self, review_request): |
1233 | + """ |
1234 | + Reopens a discarded review request. |
1235 | + """ |
1236 | + debug("Reopening") |
1237 | + |
1238 | + if self.deprecated_api: |
1239 | + self.api_post('api/json/reviewrequests/%s/reopen/' % |
1240 | + review_request['id']) |
1241 | + else: |
1242 | + self.api_put(review_request['links']['self']['href'], { |
1243 | + 'status': 'pending', |
1244 | + }) |
1245 | |
1246 | def publish(self, review_request): |
1247 | """ |
1248 | Publishes a review request. |
1249 | """ |
1250 | debug("Publishing") |
1251 | - self.api_post('api/json/reviewrequests/%s/publish/' % |
1252 | - review_request['id']) |
1253 | + |
1254 | + if self.deprecated_api: |
1255 | + self.api_post('api/json/reviewrequests/%s/publish/' % |
1256 | + review_request['id']) |
1257 | + else: |
1258 | + self.api_put(review_request['links']['draft']['href'], { |
1259 | + 'public': 1, |
1260 | + }) |
1261 | |
1262 | def _get_server_info(self): |
1263 | if not self._server_info: |
1264 | @@ -546,9 +895,12 @@ |
1265 | Loads in a JSON file and returns the data if successful. On failure, |
1266 | APIError is raised. |
1267 | """ |
1268 | - rsp = json.loads(data) |
1269 | + rsp = json_loads(data) |
1270 | |
1271 | if rsp['stat'] == 'fail': |
1272 | + # With the new API, we should get something other than HTTP |
1273 | + # 200 for errors, in which case we wouldn't get this far. |
1274 | + assert self.deprecated_api |
1275 | self.process_error(200, data) |
1276 | |
1277 | return rsp |
1278 | @@ -556,7 +908,7 @@ |
1279 | def process_error(self, http_status, data): |
1280 | """Processes an error, raising an APIError with the information.""" |
1281 | try: |
1282 | - rsp = json.loads(data) |
1283 | + rsp = json_loads(data) |
1284 | |
1285 | assert rsp['stat'] == 'fail' |
1286 | |
1287 | @@ -578,12 +930,21 @@ |
1288 | |
1289 | url = self._make_url(path) |
1290 | rsp = urllib2.urlopen(url).read() |
1291 | - self.cookie_jar.save(self.cookie_file) |
1292 | + |
1293 | + try: |
1294 | + self.cookie_jar.save(self.cookie_file) |
1295 | + except IOError, e: |
1296 | + debug('Failed to write cookie file: %s' % e) |
1297 | return rsp |
1298 | |
1299 | def _make_url(self, path): |
1300 | """Given a path on the server returns a full http:// style url""" |
1301 | + if path.startswith('http'): |
1302 | + # This is already a full path. |
1303 | + return path |
1304 | + |
1305 | app = urlparse(self.url)[2] |
1306 | + |
1307 | if path[0] == '/': |
1308 | url = urljoin(self.url, app[:-1] + path) |
1309 | else: |
1310 | @@ -624,7 +985,66 @@ |
1311 | } |
1312 | |
1313 | try: |
1314 | - r = urllib2.Request(url, body, headers) |
1315 | + r = urllib2.Request(str(url), body, headers) |
1316 | + data = urllib2.urlopen(r).read() |
1317 | + try: |
1318 | + self.cookie_jar.save(self.cookie_file) |
1319 | + except IOError, e: |
1320 | + debug('Failed to write cookie file: %s' % e) |
1321 | + return data |
1322 | + except urllib2.HTTPError, e: |
1323 | + # Re-raise so callers can interpret it. |
1324 | + raise e |
1325 | + except urllib2.URLError, e: |
1326 | + try: |
1327 | + debug(e.read()) |
1328 | + except AttributeError: |
1329 | + pass |
1330 | + |
1331 | + die("Unable to access %s. The host path may be invalid\n%s" % \ |
1332 | + (url, e)) |
1333 | + |
1334 | + def http_put(self, path, fields): |
1335 | + """ |
1336 | + Performs an HTTP PUT on the specified path, storing any cookies that |
1337 | + were set. |
1338 | + """ |
1339 | + url = self._make_url(path) |
1340 | + debug('HTTP PUTting to %s: %s' % (url, fields)) |
1341 | + |
1342 | + content_type, body = self._encode_multipart_formdata(fields, None) |
1343 | + headers = { |
1344 | + 'Content-Type': content_type, |
1345 | + 'Content-Length': str(len(body)) |
1346 | + } |
1347 | + |
1348 | + try: |
1349 | + r = HTTPRequest(url, body, headers, method='PUT') |
1350 | + data = urllib2.urlopen(r).read() |
1351 | + self.cookie_jar.save(self.cookie_file) |
1352 | + return data |
1353 | + except urllib2.HTTPError, e: |
1354 | + # Re-raise so callers can interpret it. |
1355 | + raise e |
1356 | + except urllib2.URLError, e: |
1357 | + try: |
1358 | + debug(e.read()) |
1359 | + except AttributeError: |
1360 | + pass |
1361 | + |
1362 | + die("Unable to access %s. The host path may be invalid\n%s" % \ |
1363 | + (url, e)) |
1364 | + |
1365 | + def http_delete(self, path): |
1366 | + """ |
1367 | + Performs an HTTP DELETE on the specified path, storing any cookies that |
1368 | + were set. |
1369 | + """ |
1370 | + url = self._make_url(path) |
1371 | + debug('HTTP DELETing %s' % url) |
1372 | + |
1373 | + try: |
1374 | + r = HTTPRequest(url, method='DELETE') |
1375 | data = urllib2.urlopen(r).read() |
1376 | self.cookie_jar.save(self.cookie_file) |
1377 | return data |
1378 | @@ -649,6 +1069,24 @@ |
1379 | except urllib2.HTTPError, e: |
1380 | self.process_error(e.code, e.read()) |
1381 | |
1382 | + def api_put(self, path, fields=None): |
1383 | + """ |
1384 | + Performs an API call using HTTP PUT at the specified path. |
1385 | + """ |
1386 | + try: |
1387 | + return self.process_json(self.http_put(path, fields)) |
1388 | + except urllib2.HTTPError, e: |
1389 | + self.process_error(e.code, e.read()) |
1390 | + |
1391 | + def api_delete(self, path): |
1392 | + """ |
1393 | + Performs an API call using HTTP DELETE at the specified path. |
1394 | + """ |
1395 | + try: |
1396 | + return self.process_json(self.http_delete(path)) |
1397 | + except urllib2.HTTPError, e: |
1398 | + self.process_error(e.code, e.read()) |
1399 | + |
1400 | def _encode_multipart_formdata(self, fields, files): |
1401 | """ |
1402 | Encodes data for use in an HTTP POST. |
1403 | @@ -663,7 +1101,7 @@ |
1404 | content += "--" + BOUNDARY + "\r\n" |
1405 | content += "Content-Disposition: form-data; name=\"%s\"\r\n" % key |
1406 | content += "\r\n" |
1407 | - content += fields[key] + "\r\n" |
1408 | + content += str(fields[key]) + "\r\n" |
1409 | |
1410 | for key in files: |
1411 | filename = files[key]['filename'] |
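The `str(fields[key])` change above matters because the new API passes non-string values (such as `'public': 1`) into the multipart encoder. The field-encoding portion can be sketched as follows (simplified; the real method also emits file parts and a generated boundary):

```python
def encode_multipart(fields, boundary='boundary123'):
    # Minimal multipart/form-data body for plain fields, mirroring the
    # str() coercion fix above: non-string values are stringified.
    lines = []

    for key, value in fields.items():
        lines.append('--' + boundary)
        lines.append('Content-Disposition: form-data; name="%s"' % key)
        lines.append('')
        lines.append(str(value))

    lines.append('--' + boundary + '--')
    return '\r\n'.join(lines) + '\r\n'


body = encode_multipart({'changenum': 42})
print(body)
```

Without the coercion, concatenating an `int` field value into the body raises a `TypeError` under Python 2.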
1412 | @@ -690,25 +1128,29 @@ |
1413 | def get_repository_info(self): |
1414 | return None |
1415 | |
1416 | + def check_options(self): |
1417 | + pass |
1418 | + |
1419 | def scan_for_server(self, repository_info): |
1420 | """ |
1421 | Scans the current directory on up to find a .reviewboard file |
1422 | containing the server path. |
1423 | """ |
1424 | - server_url = self._get_server_from_config(user_config, repository_info) |
1425 | - if server_url: |
1426 | - return server_url |
1427 | - |
1428 | - for path in walk_parents(os.getcwd()): |
1429 | - filename = os.path.join(path, ".reviewboardrc") |
1430 | - if os.path.exists(filename): |
1431 | - config = load_config_file(filename) |
1432 | + server_url = None |
1433 | + |
1434 | + if user_config: |
1435 | + server_url = self._get_server_from_config(user_config, |
1436 | + repository_info) |
1437 | + |
1438 | + if not server_url: |
1439 | + for config in configs: |
1440 | server_url = self._get_server_from_config(config, |
1441 | repository_info) |
1442 | + |
1443 | if server_url: |
1444 | - return server_url |
1445 | + break |
1446 | |
1447 | - return None |
1448 | + return server_url |
1449 | |
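The reworked `scan_for_server` above establishes a precedence: the user's own config wins, then each `.reviewboardrc` found while walking up from the working directory is consulted in order. The lookup order reduces to this (hypothetical helper; the real code delegates to `_get_server_from_config`):

```python
def server_from_configs(user_config, configs, key='REVIEWBOARD_URL'):
    # User config wins; otherwise the first matching config found while
    # walking up from the working directory (sketch of scan_for_server).
    if user_config and key in user_config:
        return user_config[key]

    for config in configs:
        if key in config:
            return config[key]

    return None


print(server_from_configs(
    {}, [{}, {'REVIEWBOARD_URL': 'https://rb.example.com/'}]))
```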
1450 | def diff(self, args): |
1451 | """ |
1452 | @@ -723,7 +1165,7 @@ |
1453 | """ |
1454 | Returns the generated diff between revisions in the repository. |
1455 | """ |
1456 | - return None |
1457 | + return (None, None) |
1458 | |
1459 | def _get_server_from_config(self, config, repository_info): |
1460 | if 'REVIEWBOARD_URL' in config: |
1461 | @@ -773,7 +1215,7 @@ |
1462 | if i != -1: |
1463 | repository_path = repository_path[i + 1:] |
1464 | |
1465 | - i = repository_path.find(":") |
1466 | + i = repository_path.rfind(":") |
1467 | if i != -1: |
1468 | host = repository_path[:i] |
1469 | try: |
1470 | @@ -804,7 +1246,7 @@ |
1471 | for rev in revision_range.split(":"): |
1472 | revs += ["-r", rev] |
1473 | |
1474 | - return self.do_diff(revs) |
1475 | + return (self.do_diff(revs + args), None) |
1476 | |
1477 | def do_diff(self, params): |
1478 | """ |
1479 | @@ -822,308 +1264,324 @@ |
1480 | information and generates compatible diffs. |
1481 | This client assumes that cygwin is installed on windows. |
1482 | """ |
1483 | - ccroot_path = "/view/reviewboard.diffview/vobs/" |
1484 | - viewinfo = "" |
1485 | - viewtype = "snapshot" |
1486 | - |
1487 | - def get_filename_hash(self, fname): |
1488 | - # Hash the filename string so its easy to find the file later on. |
1489 | - return md5(fname).hexdigest() |
1490 | + viewtype = None |
1491 | |
1492 | def get_repository_info(self): |
1493 | + """Returns information on the Clear Case repository. |
1494 | + |
1495 | + This will first check if the cleartool command is |
1496 | + installed and in the path, and post-review was run |
1497 | + from inside of the view. |
1498 | + """ |
1499 | if not check_install('cleartool help'): |
1500 | return None |
1501 | |
1502 | - # We must be running this from inside a view. |
1503 | - # Otherwise it doesn't make sense. |
1504 | - self.viewinfo = execute(["cleartool", "pwv", "-short"]) |
1505 | - if self.viewinfo.startswith('\*\* NONE'): |
1506 | + viewname = execute(["cleartool", "pwv", "-short"]).strip() |
1507 | + if viewname.startswith('** NONE'): |
1508 | return None |
1509 | |
1510 | - # Returning the hardcoded clearcase root path to match the server |
1511 | - # respository path. |
1512 | - # There is no reason to have a dynamic path unless you have |
1513 | - # multiple clearcase repositories. This should be implemented. |
1514 | - return RepositoryInfo(path=self.ccroot_path, |
1515 | - base_path=self.ccroot_path, |
1516 | + # Now that we know it's ClearCase, make sure we have GNU diff installed, |
1517 | + # and error out if we don't. |
1518 | + check_gnu_diff() |
1519 | + |
1520 | + property_lines = execute(["cleartool", "lsview", "-full", "-properties", |
1521 | + "-cview"], split_lines=True) |
1522 | + for line in property_lines: |
1523 | + properties = line.split(' ') |
1524 | + if properties[0] == 'Properties:': |
1525 | + # Determine the view type and check if it's supported. |
1526 | + # |
1527 | + # Specifically check if webview was listed in properties |
1528 | + # because webview types also list the 'snapshot' |
1529 | + # entry in properties. |
1530 | + if 'webview' in properties: |
1531 | + die("Webviews are not supported. You can use post-review" |
1532 | + " only in a dynamic or snapshot view.") |
1533 | + if 'dynamic' in properties: |
1534 | + self.viewtype = 'dynamic' |
1535 | + else: |
1536 | + self.viewtype = 'snapshot' |
1537 | + |
1538 | + break |
1539 | + |
1540 | + # Find current VOB's tag |
1541 | + vobstag = execute(["cleartool", "describe", "-short", "vob:."], |
1542 | + ignore_errors=True).strip() |
1543 | + if "Error: " in vobstag: |
1544 | + die("To generate a diff, run post-review inside a vob.") |
1545 | + |
1546 | + # From the current working directory, cut the path down to the VOB. |
1547 | + # A VOB's tag contains a backslash character before the VOB's name. |
1548 | + # The first character of a VOB tag like '\new_proj' must not be |
1549 | + # treated as a newline escape, but as two separate characters: |
1550 | + # a backslash and the letter 'n'. |
1551 | + cwd = os.getcwd() |
1552 | + base_path = cwd[:cwd.find(vobstag) + len(vobstag)] |
1553 | + |
1554 | + return ClearCaseRepositoryInfo(path=base_path, |
1555 | + base_path=base_path, |
1556 | + vobstag=vobstag, |
1557 | supports_parent_diffs=False) |
1558 | |
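The `base_path` computation above trims the working directory down to the VOB root by locating the vobstag inside it. A standalone sketch with a hypothetical view path:

```python
def view_base_path(cwd, vobstag):
    # Cut the working directory down to the VOB root, as the ClearCase
    # client above does: keep everything up to and including the tag.
    return cwd[:cwd.find(vobstag) + len(vobstag)]


print(view_base_path('/view/myview/vobs/proj/src', '/vobs/proj'))
```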
1559 | - def get_previous_version(self, files): |
1560 | - file = [] |
1561 | - curdir = os.getcwd() |
1562 | - |
1563 | - # Cygwin case must transform a linux-like path to windows like path |
1564 | - # including drive letter. |
1565 | - if 'cygdrive' in curdir: |
1566 | - where = curdir.index('cygdrive') + 9 |
1567 | - drive_letter = curdir[where:where+1] |
1568 | - curdir = drive_letter + ":\\" + curdir[where+2:len(curdir)] |
1569 | - |
1570 | - for key in files: |
1571 | - # Sometimes there is a quote in the filename. It must be removed. |
1572 | - key = key.replace('\'', '') |
1573 | - elem_path = cpath.normpath(os.path.join(curdir, key)) |
1574 | - |
1575 | - # Removing anything before the last /vobs |
1576 | - # because it may be repeated. |
1577 | - elem_path_idx = elem_path.rfind("/vobs") |
1578 | - if elem_path_idx != -1: |
1579 | - elem_path = elem_path[elem_path_idx:len(elem_path)].strip("\"") |
1580 | - |
1581 | - # Call cleartool to get this version and the previous version |
1582 | - # of the element. |
1583 | - curr_version, pre_version = execute( |
1584 | - ["cleartool", "desc", "-pre", elem_path], split_lines=True) |
1585 | - curr_version = cpath.normpath(curr_version) |
1586 | - pre_version = pre_version.split(':')[1].strip() |
1587 | - |
1588 | - # If a specific version was given, remove it from the path |
1589 | - # to avoid version duplication |
1590 | - if "@@" in elem_path: |
1591 | - elem_path = elem_path[:elem_path.rfind("@@")] |
1592 | - file.append(elem_path + "@@" + pre_version) |
1593 | - file.append(curr_version) |
1594 | - |
1595 | - # Determnine if the view type is snapshot or dynamic. |
1596 | - if os.path.exists(file[0]): |
1597 | - self.viewtype = "dynamic" |
1598 | - |
1599 | - return file |
1600 | - |
1601 | - def get_extended_namespace(self, files): |
1602 | - """ |
1603 | - Parses the file path to get the extended namespace |
1604 | - """ |
1605 | - versions = self.get_previous_version(files) |
1606 | - |
1607 | - evfiles = [] |
1608 | - hlist = [] |
1609 | - |
1610 | - for vkey in versions: |
1611 | - # Verify if it is a checkedout file. |
1612 | - if "CHECKEDOUT" in vkey: |
1613 | - # For checkedout files just add it to the file list |
1614 | - # since it cannot be accessed outside the view. |
1615 | - splversions = vkey[:vkey.rfind("@@")] |
1616 | - evfiles.append(splversions) |
1617 | - else: |
1618 | - # For checkedin files. |
1619 | - ext_path = [] |
1620 | - ver = [] |
1621 | - fname = "" # fname holds the file name without the version. |
1622 | - (bpath, fpath) = cpath.splitdrive(vkey) |
1623 | - if bpath : |
1624 | - # Windows. |
1625 | - # The version (if specified like file.c@@/main/1) |
1626 | - # should be kept as a single string |
1627 | - # so split the path and concat the file name |
1628 | - # and version in the last position of the list. |
1629 | - ver = fpath.split("@@") |
1630 | - splversions = fpath[:vkey.rfind("@@")].split("\\") |
1631 | - fname = splversions.pop() |
1632 | - splversions.append(fname + ver[1]) |
1633 | - else : |
1634 | - # Linux. |
1635 | - bpath = vkey[:vkey.rfind("vobs")+4] |
1636 | - fpath = vkey[vkey.rfind("vobs")+5:] |
1637 | - ver = fpath.split("@@") |
1638 | - splversions = ver[0][:vkey.rfind("@@")].split("/") |
1639 | - fname = splversions.pop() |
1640 | - splversions.append(fname + ver[1]) |
1641 | - |
1642 | - filename = splversions.pop() |
1643 | - bpath = cpath.normpath(bpath + "/") |
1644 | - elem_path = bpath |
1645 | - |
1646 | - for key in splversions: |
1647 | - # For each element (directory) in the path, |
1648 | - # get its version from clearcase. |
1649 | - elem_path = cpath.join(elem_path, key) |
1650 | - |
1651 | - # This is the version to be appended to the extended |
1652 | - # path list. |
1653 | - this_version = execute( |
1654 | - ["cleartool", "desc", "-fmt", "%Vn", |
1655 | - cpath.normpath(elem_path)]) |
1656 | - if this_version: |
1657 | - ext_path.append(key + "/@@" + this_version + "/") |
1658 | - else: |
1659 | - ext_path.append(key + "/") |
1660 | - |
1661 | - # This must be done in case we haven't specified |
1662 | - # the version on the command line. |
1663 | - ext_path.append(cpath.normpath(fname + "/@@" + |
1664 | - vkey[vkey.rfind("@@")+2:len(vkey)])) |
1665 | - epstr = cpath.join(bpath, cpath.normpath(''.join(ext_path))) |
1666 | - evfiles.append(epstr) |
1667 | - |
1668 | - """ |
1669 | - In windows, there is a problem with long names(> 254). |
1670 | - In this case, we hash the string and copy the unextended |
1671 | - filename to a temp file whose name is the hash. |
1672 | - This way we can get the file later on for diff. |
1673 | - The same problem applies to snapshot views where the |
1674 | - extended name isn't available. |
1675 | - The previous file must be copied from the CC server |
1676 | - to a local dir. |
1677 | - """ |
1678 | - if cpath.exists(epstr) : |
1679 | - pass |
1680 | - else: |
1681 | - if len(epstr) > 254 or self.viewtype == "snapshot": |
1682 | - name = self.get_filename_hash(epstr) |
1683 | - # Check if this hash is already in the list |
1684 | - try: |
1685 | - i = hlist.index(name) |
1686 | - die("ERROR: duplicate value %s : %s" % |
1687 | - (name, epstr)) |
1688 | - except ValueError: |
1689 | - hlist.append(name) |
1690 | - |
1691 | - normkey = cpath.normpath(vkey) |
1692 | - td = tempfile.gettempdir() |
1693 | - # Cygwin case must transform a linux-like path to |
1694 | - # windows like path including drive letter |
1695 | - if 'cygdrive' in td: |
1696 | - where = td.index('cygdrive') + 9 |
1697 | - drive_letter = td[where:where+1] + ":" |
1698 | - td = cpath.join(drive_letter, td[where+1:]) |
1699 | - tf = cpath.normpath(cpath.join(td, name)) |
1700 | - if cpath.exists(tf): |
1701 | - debug("WARNING: FILE EXISTS") |
1702 | - os.unlink(tf) |
1703 | - execute(["cleartool", "get", "-to", tf, normkey]) |
1704 | - else: |
1705 | - die("ERROR: FILE NOT FOUND : %s" % epstr) |
1706 | - |
1707 | - return evfiles |
1708 | - |
1709 | - def get_files_from_label(self, label): |
1710 | - voblist=[] |
1711 | - # Get the list of vobs for the current view |
1712 | - allvoblist = execute(["cleartool", "lsvob", "-short"]).split() |
1713 | - # For each vob, find if the label is present |
1714 | - for vob in allvoblist: |
1715 | - try: |
1716 | - execute(["cleartool", "describe", "-local", |
1717 | - "lbtype:%s@%s" % (label, vob)]).split() |
1718 | - voblist.append(vob) |
1719 | - except: |
1720 | - pass |
1721 | - |
1722 | - filelist=[] |
1723 | - # For each vob containing the label, get the file list |
1724 | - for vob in voblist: |
1725 | - try: |
1726 | - res = execute(["cleartool", "find", vob, "-all", "-version", |
1727 | - "lbtype(%s)" % label, "-print"]) |
1728 | - filelist.extend(res.split()) |
1729 | - except : |
1730 | - pass |
1731 | - |
1732 | - # Return only the unique itens |
1733 | - return set(filelist) |
1734 | + def check_options(self): |
1735 | + if ((options.revision_range or options.tracking) |
1736 | + and self.viewtype != "dynamic"): |
1737 | + die("To generate diff using parent branch or by passing revision " |
1738 | + "ranges, you must use a dynamic view.") |
1739 | + |
1740 | + def _determine_version(self, version_path): |
1741 | + """Determine numeric version of revision. |
1742 | + |
1743 | + CHECKEDOUT is marked as infinity so it is always |
1744 | + treated as the highest possible version of a file. |
1745 | + CHECKEDOUT, in ClearCase, is something like HEAD. |
1746 | + """ |
1747 | + branch, number = cpath.split(version_path) |
1748 | + if number == 'CHECKEDOUT': |
1749 | + return float('inf') |
1750 | + return int(number) |
1751 | + |
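The version-parsing rule in `_determine_version` can be exercised on its own with plain strings (a standalone sketch; the free-function name is illustrative, rbtools keeps this as a method):

```python
import os.path

def determine_version(version_path):
    """Map a ClearCase version path such as '/main/5' to a sortable
    number; CHECKEDOUT maps to infinity so it always sorts highest."""
    branch, number = os.path.split(version_path)
    if number == 'CHECKEDOUT':
        return float('inf')
    return int(number)
```

Because `float('inf')` compares greater than any integer, a checked-out element always wins the "highest version" comparison done later in `_sanitize_branch_changeset`.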
1752 | + def _construct_extended_path(self, path, version): |
1753 | + """Combine extended_path from path and version. |
1754 | + |
1755 | + CHECKEDOUT must be removed because this version |
1756 | + doesn't exist in MVFS (the ClearCase dynamic view file |
1757 | + system). The only way to get the content of a checked-out |
1758 | + file is to use the filename alone.""" |
1759 | + if not version or version.endswith('CHECKEDOUT'): |
1760 | + return path |
1761 | + |
1762 | + return "%s@@%s" % (path, version) |
1763 | + |
1764 | + def _sanitize_branch_changeset(self, changeset): |
1765 | + """Return changeset containing non-binary, branched file versions. |
1766 | + |
1767 | + The changeset contains only the first and last version of each file changed on the branch. |
1768 | + """ |
1769 | + changelist = {} |
1770 | + |
1771 | + for path, previous, current in changeset: |
1772 | + version_number = self._determine_version(current) |
1773 | + |
1774 | + if path not in changelist: |
1775 | + changelist[path] = { |
1776 | + 'highest': version_number, |
1777 | + 'current': current, |
1778 | + 'previous': previous |
1779 | + } |
1780 | + |
1781 | + if version_number == 0: |
1782 | + # The previous version of version 0 on a branch is the base version |
1783 | + changelist[path]['previous'] = previous |
1784 | + elif version_number > changelist[path]['highest']: |
1785 | + changelist[path]['highest'] = version_number |
1786 | + changelist[path]['current'] = current |
1787 | + |
1788 | + # Convert to list |
1789 | + changeranges = [] |
1790 | + for path, version in changelist.iteritems(): |
1791 | + changeranges.append( |
1792 | + (self._construct_extended_path(path, version['previous']), |
1793 | + self._construct_extended_path(path, version['current'])) |
1794 | + ) |
1795 | + |
1796 | + return changeranges |
1797 | + |
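The first/last-version collapse above can be tried outside of ClearCase with plain tuples. This is a simplified sketch in Python 3 idiom (rbtools itself targets Python 2 and uses `iteritems()`), with the CHECKEDOUT special case of `_construct_extended_path` omitted for brevity:

```python
def sanitize_branch_changeset(changeset):
    """Collapse (path, previous, current) rows so each file keeps only
    the version it had before the branch and its latest branch version."""
    changelist = {}
    for path, previous, current in changeset:
        number = current.rsplit('/', 1)[-1]
        number = float('inf') if number == 'CHECKEDOUT' else int(number)
        if path not in changelist:
            changelist[path] = {'highest': number, 'current': current,
                                'previous': previous}
        if number == 0:
            # Version 0 on the branch: its predecessor is the base version.
            changelist[path]['previous'] = previous
        elif number > changelist[path]['highest']:
            changelist[path]['highest'] = number
            changelist[path]['current'] = current
    # Emit version-extended paths, "path@@version".
    return [('%s@@%s' % (p, v['previous']), '%s@@%s' % (p, v['current']))
            for p, v in changelist.items()]
```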
1798 | + def _sanitize_checkedout_changeset(self, changeset): |
1799 | + """Return changeset containing non-binary, checked-out file versions.""" |
1800 | + |
1801 | + changeranges = [] |
1802 | + for path, previous, current in changeset: |
1803 | + version_number = self._determine_version(current) |
1804 | + changeranges.append( |
1805 | + (self._construct_extended_path(path, previous), |
1806 | + self._construct_extended_path(path, current)) |
1807 | + ) |
1808 | + |
1809 | + return changeranges |
1810 | + |
1811 | + def _directory_content(self, path): |
1812 | + """Return directory content ready for saving to tempfile.""" |
1813 | + |
1814 | + return ''.join([ |
1815 | + '%s\n' % s |
1816 | + for s in sorted(os.listdir(path)) |
1817 | + ]) |
1818 | + |
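The directory "content" that later gets diffed is just a sorted listing, one name per line, so two directory versions can be compared as ordinary text files; roughly:

```python
import os

def directory_content(path):
    # One sorted entry per line; this text stands in for the directory
    # when the two versions are handed to GNU diff.
    return ''.join('%s\n' % name for name in sorted(os.listdir(path)))
```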
1819 | + def _construct_changeset(self, output): |
1820 | + return [ |
1821 | + info.split('\t') |
1822 | + for info in output.strip().split('\n') |
1823 | + ] |
1824 | + |
1825 | + def get_checkedout_changeset(self): |
1826 | + """Return information about the checked out changeset. |
1827 | + |
1828 | + This function returns the path to each file along with |
1829 | + its previous and current versions. |
1830 | + """ |
1831 | + changeset = [] |
1832 | + # We ignore return code 1 in order to |
1833 | + # omit files that ClearCase can't read. |
1834 | + output = execute([ |
1835 | + "cleartool", |
1836 | + "lscheckout", |
1837 | + "-all", |
1838 | + "-cview", |
1839 | + "-me", |
1840 | + "-fmt", |
1841 | + r"%En\t%PVn\t%Vn\n"], |
1842 | + extra_ignore_errors=(1,), |
1843 | + with_errors=False) |
1844 | + |
1845 | + if output: |
1846 | + changeset = self._construct_changeset(output) |
1847 | + |
1848 | + return self._sanitize_checkedout_changeset(changeset) |
1849 | + |
1850 | + def get_branch_changeset(self, branch): |
1851 | + """Returns information about the versions changed on a branch. |
1852 | + |
1853 | + This takes into account the changes on the branch owned by the |
1854 | + current user in all vobs of the current view. |
1855 | + """ |
1856 | + changeset = [] |
1857 | + |
1858 | + # We ignore return code 1 in order to |
1859 | + # omit files that ClearCase can't read. |
1860 | + if sys.platform.startswith('win'): |
1861 | + CLEARCASE_XPN = '%CLEARCASE_XPN%' |
1862 | + else: |
1863 | + CLEARCASE_XPN = '$CLEARCASE_XPN' |
1864 | + |
1865 | + output = execute([ |
1866 | + "cleartool", |
1867 | + "find", |
1868 | + "-all", |
1869 | + "-version", |
1870 | + "brtype(%s)" % branch, |
1871 | + "-exec", |
1872 | + 'cleartool descr -fmt ' \ |
1873 | + r'"%En\t%PVn\t%Vn\n" ' \ |
1874 | + + CLEARCASE_XPN], |
1875 | + extra_ignore_errors=(1,), |
1876 | + with_errors=False) |
1877 | + |
1878 | + if output: |
1879 | + changeset = self._construct_changeset(output) |
1880 | + |
1881 | + return self._sanitize_branch_changeset(changeset) |
1882 | |
1883 | def diff(self, files): |
1884 | - """ |
1885 | - Performs a diff of the specified file and its previous version. |
1886 | - """ |
1887 | - # We must be running this from inside a view. |
1888 | - # Otherwise it doesn't make sense. |
1889 | - return self.do_diff(self.get_extended_namespace(files)) |
1890 | - |
1891 | - def diff_label(self, label): |
1892 | - """ |
1893 | - Get the files that are attached to a label and diff them |
1894 | - TODO |
1895 | - """ |
1896 | - return self.diff(self.get_files_from_label(label)) |
1897 | + """Performs a diff of the specified file and its previous version.""" |
1898 | + |
1899 | + if options.tracking: |
1900 | + changeset = self.get_branch_changeset(options.tracking) |
1901 | + else: |
1902 | + changeset = self.get_checkedout_changeset() |
1903 | + |
1904 | + return self.do_diff(changeset) |
1905 | |
1906 | def diff_between_revisions(self, revision_range, args, repository_info): |
1907 | - """ |
1908 | - Performs a diff between 2 revisions of a CC repository. |
1909 | - """ |
1910 | - rev_str = '' |
1911 | - |
1912 | - for rev in revision_range.split(":"): |
1913 | - rev_str += "-r %s " % rev |
1914 | - |
1915 | - return self.do_diff(rev_str) |
1916 | - |
1917 | - def do_diff(self, params): |
1918 | - # Diff returns "1" if differences were found. |
1919 | - # Add the view name and view type to the description |
1920 | - if options.description: |
1921 | - options.description = ("VIEW: " + self.viewinfo + |
1922 | - "VIEWTYPE: " + self.viewtype + "\n" + options.description) |
1923 | + """Performs a diff between passed revisions or branch.""" |
1924 | + |
1925 | + # Convert revision range to list of: |
1926 | + # (previous version, current version) tuples |
1927 | + revision_range = revision_range.split(';') |
1928 | + changeset = zip(revision_range[0::2], revision_range[1::2]) |
1929 | + |
1930 | + return (self.do_diff(changeset)[0], None) |
1931 | + |
1932 | + def diff_files(self, old_file, new_file): |
1933 | + """Return unified diff for file. |
1934 | + |
1935 | + The most effective and reliable way is to use GNU diff. |
1936 | + """ |
1937 | + diff_cmd = ["diff", "-uN", old_file, new_file] |
1938 | + dl = execute(diff_cmd, extra_ignore_errors=(1,2), |
1939 | + translate_newlines=False) |
1940 | + |
1941 | + # If the input file has ^M characters at end of line, ignore them. |
1942 | + dl = dl.replace('\r\r\n', '\r\n') |
1943 | + dl = dl.splitlines(True) |
1944 | + |
1945 | + # Special handling for the output of the diff tool on binary files: |
1946 | + # diff outputs "Files a and b differ" |
1947 | + # and the code below expects the output to start with |
1948 | + # "Binary files " |
1949 | + if (len(dl) == 1 and |
1950 | + dl[0].startswith('Files %s and %s differ' % (old_file, new_file))): |
1951 | + dl = ['Binary files %s and %s differ\n' % (old_file, new_file)] |
1952 | + |
1953 | + # We need the files' OIDs to translate them to paths in the Review Board repository |
1954 | + old_oid = execute(["cleartool", "describe", "-fmt", "%On", old_file]) |
1955 | + new_oid = execute(["cleartool", "describe", "-fmt", "%On", new_file]) |
1956 | + |
1957 | + if dl == [] or dl[0].startswith("Binary files "): |
1958 | + if dl == []: |
1959 | + dl = ["File %s in your changeset is unmodified\n" % new_file] |
1960 | + |
1961 | + dl.insert(0, "==== %s %s ====\n" % (old_oid, new_oid)) |
1962 | + dl.append('\n') |
1963 | else: |
1964 | - options.description = (self.viewinfo + |
1965 | - "VIEWTYPE: " + self.viewtype + "\n") |
1966 | - |
1967 | - o = [] |
1968 | - Feol = False |
1969 | - while len(params) > 0: |
1970 | - # Read both original and modified files. |
1971 | - onam = params.pop(0) |
1972 | - mnam = params.pop(0) |
1973 | - file_data = [] |
1974 | - do_rem = False |
1975 | - # If the filename length is greater than 254 char for windows, |
1976 | - # we copied the file to a temp file |
1977 | - # because the open will not work for path greater than 254. |
1978 | - # This is valid for the original and |
1979 | - # modified files if the name size is > 254. |
1980 | - for filenam in (onam, mnam) : |
1981 | - if cpath.exists(filenam) and self.viewtype == "dynamic": |
1982 | - do_rem = False |
1983 | - fn = filenam |
1984 | - elif len(filenam) > 254 or self.viewtype == "snapshot": |
1985 | - fn = self.get_filename_hash(filenam) |
1986 | - fn = cpath.join(tempfile.gettempdir(), fn) |
1987 | - do_rem = True |
1988 | - fd = open(cpath.normpath(fn)) |
1989 | - fdata = fd.readlines() |
1990 | - fd.close() |
1991 | - file_data.append(fdata) |
1992 | - # If the file was temp, it should be removed. |
1993 | - if do_rem: |
1994 | - os.remove(filenam) |
1995 | - |
1996 | - modi = file_data.pop() |
1997 | - orig = file_data.pop() |
1998 | - |
1999 | - # For snapshot views, the local directories must be removed because |
2000 | - # they will break the diff on the server. Just replacing |
2001 | - # everything before the view name (including the view name) for |
2002 | - # vobs do the work. |
2003 | - if (self.viewtype == "snapshot" |
2004 | - and (sys.platform.startswith('win') |
2005 | - or sys.platform.startswith('cygwin'))): |
2006 | - vinfo = self.viewinfo.rstrip("\r\n") |
2007 | - mnam = "c:\\\\vobs" + mnam[mnam.rfind(vinfo) + len(vinfo):] |
2008 | - onam = "c:\\\\vobs" + onam[onam.rfind(vinfo) + len(vinfo):] |
2009 | - # Call the diff lib to generate a diff. |
2010 | - # The dates are bogus, since they don't natter anyway. |
2011 | - # The only thing is that two spaces are needed to the server |
2012 | - # so it can identify the heades correctly. |
2013 | - diff = difflib.unified_diff(orig, modi, onam, mnam, |
2014 | - ' 2002-02-21 23:30:39.942229878 -0800', |
2015 | - ' 2002-02-21 23:30:50.442260588 -0800', lineterm=' \n') |
2016 | - # Transform the generator output into a string output |
2017 | - # Use a comprehension instead of a generator, |
2018 | - # so 2.3.x doesn't fail to interpret. |
2019 | - diffstr = ''.join([str(l) for l in diff]) |
2020 | - # Workaround for the difflib no new line at end of file |
2021 | - # problem. |
2022 | - if not diffstr.endswith('\n'): |
2023 | - diffstr = diffstr + ("\n\\ No newline at end of file\n") |
2024 | - o.append(diffstr) |
2025 | - |
2026 | - ostr = ''.join(o) |
2027 | - return (ostr, None) # diff, parent_diff (not supported) |
2028 | + dl.insert(2, "==== %s %s ====\n" % (old_oid, new_oid)) |
2029 | + |
2030 | + return dl |
2031 | + |
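The binary-file special case above exists because GNU diff prints `Files a and b differ` while the server-side parser expects the `Binary files ...` wording. In isolation (the helper name is hypothetical):

```python
def normalize_binary_diff(dl, old_file, new_file):
    """Rewrite GNU diff's binary-file notice into the 'Binary files'
    form the Review Board diff parser looks for."""
    if (len(dl) == 1 and
        dl[0].startswith('Files %s and %s differ' % (old_file, new_file))):
        return ['Binary files %s and %s differ\n' % (old_file, new_file)]
    return dl
```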
2032 | + def diff_directories(self, old_dir, new_dir): |
2033 | + """Return a unified diff between the contents of two directories. |
2034 | + |
2035 | + The function saves the content of each directory version to a temp |
2036 | + file and diffs the two as ordinary files. |
2037 | + """ |
2038 | + old_content = self._directory_content(old_dir) |
2039 | + new_content = self._directory_content(new_dir) |
2040 | + |
2041 | + old_tmp = make_tempfile(content=old_content) |
2042 | + new_tmp = make_tempfile(content=new_content) |
2043 | + |
2044 | + diff_cmd = ["diff", "-uN", old_tmp, new_tmp] |
2045 | + dl = execute(diff_cmd, |
2046 | + extra_ignore_errors=(1,2), |
2047 | + translate_newlines=False, |
2048 | + split_lines=True) |
2049 | + |
2050 | + # Replace the temporary filenames with the |
2051 | + # real directory names and add the OIDs |
2052 | + if dl: |
2053 | + dl[0] = dl[0].replace(old_tmp, old_dir) |
2054 | + dl[1] = dl[1].replace(new_tmp, new_dir) |
2055 | + old_oid = execute(["cleartool", "describe", "-fmt", "%On", old_dir]) |
2056 | + new_oid = execute(["cleartool", "describe", "-fmt", "%On", new_dir]) |
2057 | + dl.insert(2, "==== %s %s ====\n" % (old_oid, new_oid)) |
2058 | + |
2059 | + return dl |
2060 | + |
2061 | + def do_diff(self, changeset): |
2062 | + """Generates a unified diff for all files in the changeset.""" |
2063 | + |
2064 | + diff = [] |
2065 | + for old_file, new_file in changeset: |
2066 | + dl = [] |
2067 | + if cpath.isdir(new_file): |
2068 | + dl = self.diff_directories(old_file, new_file) |
2069 | + elif cpath.exists(new_file): |
2070 | + dl = self.diff_files(old_file, new_file) |
2071 | + else: |
2072 | + debug("File %s does not exist or access is denied." % new_file) |
2073 | + continue |
2074 | + |
2075 | + if dl: |
2076 | + diff.append(''.join(dl)) |
2077 | + |
2078 | + return (''.join(diff), None) |
2079 | |
2080 | |
2081 | class SVNClient(SCMClient): |
2082 | + # Match the diff control lines generated by 'svn diff'. |
2083 | + DIFF_ORIG_FILE_LINE_RE = re.compile(r'^---\s+.*\s+\(.*\)') |
2084 | + DIFF_NEW_FILE_LINE_RE = re.compile(r'^\+\+\+\s+.*\s+\(.*\)') |
2085 | + |
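These patterns are stricter than the old `startswith('--- ')` checks: they also require the trailing `(revision N)` / `(working copy)` annotation that `svn diff` puts on its control lines, so ordinary diff-body lines starting with dashes no longer match. A quick demonstration (the regexes are assumed copies of the ones added above):

```python
import re

# Match the diff control lines generated by 'svn diff'.
DIFF_ORIG_FILE_LINE_RE = re.compile(r'^---\s+.*\s+\(.*\)')
DIFF_NEW_FILE_LINE_RE = re.compile(r'^\+\+\+\s+.*\s+\(.*\)')
```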
2086 | """ |
2087 | A wrapper around the svn Subversion tool that fetches repository |
2088 | information and generates compatible diffs. |
2089 | @@ -1164,6 +1622,15 @@ |
2090 | |
2091 | return SvnRepositoryInfo(path, base_path, m.group(1)) |
2092 | |
2093 | + def check_options(self): |
2094 | + if (options.repository_url and |
2095 | + not options.revision_range and |
2096 | + not options.diff_filename): |
2097 | + sys.stderr.write("The --repository-url option requires either the " |
2098 | + "--revision-range option or the --diff-filename " |
2099 | + "option.\n") |
2100 | + sys.exit(1) |
2101 | + |
2102 | def scan_for_server(self, repository_info): |
2103 | # Scan first for dot files, since it's faster and will cover the |
2104 | # user's $HOME/.reviewboardrc |
2105 | @@ -1199,6 +1666,13 @@ |
2106 | return (self.do_diff(["svn", "diff", "--diff-cmd=diff"] + files), |
2107 | None) |
2108 | |
2109 | + def diff_changelist(self, changelist): |
2110 | + """ |
2111 | + Performs a diff for a local changelist. |
2112 | + """ |
2113 | + return (self.do_diff(["svn", "diff", "--changelist", changelist]), |
2114 | + None) |
2115 | + |
2116 | def diff_between_revisions(self, revision_range, args, repository_info): |
2117 | """ |
2118 | Performs a diff between 2 revisions of a Subversion repository. |
2119 | @@ -1233,14 +1707,14 @@ |
2120 | |
2121 | old_url = url + '@' + revisions[0] |
2122 | |
2123 | - return self.do_diff(["svn", "diff", "--diff-cmd=diff", old_url, |
2124 | - new_url] + files, |
2125 | - repository_info) |
2126 | + return (self.do_diff(["svn", "diff", "--diff-cmd=diff", old_url, |
2127 | + new_url] + files, |
2128 | + repository_info), None) |
2129 | # Otherwise, perform the revision range diff using a working copy |
2130 | else: |
2131 | - return self.do_diff(["svn", "diff", "--diff-cmd=diff", "-r", |
2132 | - revision_range], |
2133 | - repository_info) |
2134 | + return (self.do_diff(["svn", "diff", "--diff-cmd=diff", "-r", |
2135 | + revision_range], |
2136 | + repository_info), None) |
2137 | |
2138 | def do_diff(self, cmd, repository_info=None): |
2139 | """ |
2140 | @@ -1272,12 +1746,12 @@ |
2141 | |
2142 | from_line = "" |
2143 | for line in diff_content: |
2144 | - if line.startswith('--- '): |
2145 | + if self.DIFF_ORIG_FILE_LINE_RE.match(line): |
2146 | from_line = line |
2147 | continue |
2148 | |
2149 | # This is where we decide how mangle the previous '--- ' |
2150 | - if line.startswith('+++ '): |
2151 | + if self.DIFF_NEW_FILE_LINE_RE.match(line): |
2152 | to_file, _ = self.parse_filename_header(line[4:]) |
2153 | info = self.svn_info(to_file) |
2154 | if info.has_key("Copied From URL"): |
2155 | @@ -1306,7 +1780,9 @@ |
2156 | |
2157 | for line in diff_content: |
2158 | front = None |
2159 | - if line.startswith('+++ ') or line.startswith('--- ') or line.startswith('Index: '): |
2160 | + if (self.DIFF_NEW_FILE_LINE_RE.match(line) |
2161 | + or self.DIFF_ORIG_FILE_LINE_RE.match(line) |
2162 | + or line.startswith('Index: ')): |
2163 | front, line = line.split(" ", 1) |
2164 | |
2165 | if front: |
2166 | @@ -1451,15 +1927,20 @@ |
2167 | return None |
2168 | |
2169 | def get_changenum(self, args): |
2170 | - if len(args) == 1: |
2171 | + if len(args) == 0: |
2172 | + return "default" |
2173 | + elif len(args) == 1: |
2174 | if args[0] == "default": |
2175 | return "default" |
2176 | |
2177 | try: |
2178 | return str(int(args[0])) |
2179 | except ValueError: |
2180 | - pass |
2181 | - return None |
2182 | + # (if it isn't a number, it can't be a cln) |
2183 | + return None |
2184 | + # there are multiple args (not a cln) |
2185 | + else: |
2186 | + return None |
2187 | |
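The changelist resolution above, as a standalone sketch (illustrative free function; rbtools keeps this as a method):

```python
def get_changenum(args):
    """Resolve positional args to a Perforce changelist: no args means
    the default changelist, one numeric arg is a CLN, and anything
    else is treated as file paths rather than a changelist."""
    if len(args) == 0:
        return "default"
    elif len(args) == 1:
        if args[0] == "default":
            return "default"
        try:
            return str(int(args[0]))
        except ValueError:
            # Not a number, so it can't be a CLN.
            return None
    # Multiple args: not a CLN.
    return None
```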
2188 | def diff(self, args): |
2189 | """ |
2190 | @@ -1474,6 +1955,9 @@ |
2191 | if options.p4_port: |
2192 | os.environ['P4PORT'] = options.p4_port |
2193 | |
2194 | + if options.p4_passwd: |
2195 | + os.environ['P4PASSWD'] = options.p4_passwd |
2196 | + |
2197 | changenum = self.get_changenum(args) |
2198 | if changenum is None: |
2199 | return self._path_diff(args) |
2200 | @@ -1539,7 +2023,7 @@ |
2201 | # 'depotFile': '...' |
2202 | # 'change': '123456' |
2203 | for record in records: |
2204 | - if record['action'] != 'delete': |
2205 | + if record['action'] not in ('delete', 'move/delete'): |
2206 | if revision2: |
2207 | files[record['depotFile']] = [record, None] |
2208 | else: |
2209 | @@ -1550,7 +2034,7 @@ |
2210 | second_rev_path = m.group('path') + revision2[1:] |
2211 | records = self._run_p4(['files', second_rev_path]) |
2212 | for record in records: |
2213 | - if record['action'] != 'delete': |
2214 | + if record['action'] not in ('delete', 'move/delete'): |
2215 | try: |
2216 | m = files[record['depotFile']] |
2217 | m[1] = record |
2218 | @@ -1574,6 +2058,11 @@ |
2219 | old_file = tmp_diff_from_filename |
2220 | changetype_short = 'D' |
2221 | base_revision = int(first_record['rev']) |
2222 | + elif first_record['rev'] == second_record['rev']: |
2223 | + # When we know the revisions are the same, we don't need |
2224 | + # to do any diffing. This speeds up large revision-range |
2225 | + # diffs quite a bit. |
2226 | + continue |
2227 | else: |
2228 | self._write_file(depot_path + '#' + first_record['rev'], |
2229 | tmp_diff_from_filename) |
2230 | @@ -1644,8 +2133,15 @@ |
2231 | v = self.p4d_version |
2232 | |
2233 | if v[0] < 2002 or (v[0] == "2002" and v[1] < 2): |
2234 | - description = execute(["p4", "describe", "-s", changenum], |
2235 | - split_lines=True) |
2236 | + describeCmd = ["p4"] |
2237 | + |
2238 | + if options.p4_passwd: |
2239 | + describeCmd.append("-P") |
2240 | + describeCmd.append(options.p4_passwd) |
2241 | + |
2242 | + describeCmd = describeCmd + ["describe", "-s", changenum] |
2243 | + |
2244 | + description = execute(describeCmd, split_lines=True) |
2245 | |
2246 | if '*pending*' in description[0]: |
2247 | return None |
2248 | @@ -1672,8 +2168,18 @@ |
2249 | if changenum == "default": |
2250 | cl_is_pending = True |
2251 | else: |
2252 | - description = execute(["p4", "describe", "-s", changenum], |
2253 | - split_lines=True) |
2254 | + describeCmd = ["p4"] |
2255 | + |
2256 | + if options.p4_passwd: |
2257 | + describeCmd.append("-P") |
2258 | + describeCmd.append(options.p4_passwd) |
2259 | + |
2260 | + describeCmd = describeCmd + ["describe", "-s", changenum] |
2261 | + |
2262 | + description = execute(describeCmd, split_lines=True) |
2263 | + |
2264 | + if re.search("no such changelist", description[0]): |
2265 | + die("CLN %s does not exist." % changenum) |
2266 | |
2267 | + # Some P4 wrappers are adding an extra line before the description |
2268 | if '*pending*' in description[0] or '*pending*' in description[1]: |
2269 | @@ -1689,6 +2195,9 @@ |
2270 | info = execute(["p4", "opened", "-c", str(changenum)], |
2271 | split_lines=True) |
2272 | |
2273 | + if len(info) == 1 and info[0].startswith("File(s) not opened on this client."): |
2274 | + die("Couldn't find any affected files for this change.") |
2275 | + |
2276 | for line in info: |
2277 | data = line.split(" ") |
2278 | description.append("... %s %s" % (data[0], data[2])) |
2279 | @@ -1912,126 +2421,273 @@ |
2280 | A wrapper around the hg Mercurial tool that fetches repository |
2281 | information and generates compatible diffs. |
2282 | """ |
2283 | + |
2284 | + def __init__(self): |
2285 | + self.hgrc = {} |
2286 | + self._type = 'hg' |
2287 | + self._hg_root = '' |
2288 | + self._remote_path = () |
2289 | + self._hg_env = { |
2290 | + 'HGRCPATH': os.devnull, |
2291 | + 'HGPLAIN': '1', |
2292 | + } |
2293 | + |
2294 | + # `self._remote_path_candidates` is an ordered set of hgrc |
2295 | + # paths that are checked if `parent_branch` option is not given |
2296 | + # explicitly. The first candidate found to exist will be used, |
2297 | + # falling back to `default` (the last member). |
2298 | + self._remote_path_candidates = ['reviewboard', 'origin', 'parent', |
2299 | + 'default'] |
2300 | + |
2301 | def get_repository_info(self): |
2302 | if not check_install('hg --help'): |
2303 | return None |
2304 | |
2305 | - data = execute(["hg", "root"], ignore_errors=True) |
2306 | - if data.startswith('abort:'): |
2307 | + self._load_hgrc() |
2308 | + |
2309 | + if not self.hg_root: |
2310 | # hg aborted => no mercurial repository here. |
2311 | return None |
2312 | |
2313 | - # Elsewhere, hg root output give us the repository path. |
2314 | - |
2315 | - # We save data here to use it as a fallback. See below |
2316 | - local_data = data.strip() |
2317 | - |
2318 | - svn = execute(["hg", "svn", "info", ], ignore_errors=True) |
2319 | - |
2320 | - if (not svn.startswith('abort:') and |
2321 | - not svn.startswith("hg: unknown command")): |
2322 | - self.type = 'svn' |
2323 | - m = re.search(r'^Repository Root: (.+)$', svn, re.M) |
2324 | - |
2325 | - if not m: |
2326 | - return None |
2327 | - |
2328 | - path = m.group(1) |
2329 | - m2 = re.match(r'^(svn\+ssh|http|https)://([-a-zA-Z0-9.]*@)(.*)$', |
2330 | - path) |
2331 | - if m2: |
2332 | - path = '%s://%s' % (m2.group(1), m2.group(3)) |
2333 | - |
2334 | - m = re.search(r'^URL: (.+)$', svn, re.M) |
2335 | - |
2336 | - if not m: |
2337 | - return None |
2338 | - |
2339 | - base_path = m.group(1)[len(path):] or "/" |
2340 | - return RepositoryInfo(path=path, |
2341 | - base_path=base_path, |
2342 | - supports_parent_diffs=True) |
2343 | - |
2344 | - self.type = 'hg' |
2345 | - |
2346 | - # We are going to search .hg/hgrc for the default path. |
2347 | - file_name = os.path.join(local_data,'.hg', 'hgrc') |
2348 | - |
2349 | - if not os.path.exists(file_name): |
2350 | - return RepositoryInfo(path=local_data, base_path='/', |
2351 | - supports_parent_diffs=True) |
2352 | - |
2353 | - f = open(file_name) |
2354 | - data = f.read() |
2355 | - f.close() |
2356 | - |
2357 | - m = re.search(r'^default\s+=\s+(.+)$', data, re.M) |
2358 | + svn_info = execute(["hg", "svn", "info"], ignore_errors=True) |
2359 | + |
2360 | + if (not svn_info.startswith('abort:') and |
2361 | + not svn_info.startswith("hg: unknown command") and |
2362 | + not svn_info.lower().startswith('not a child of')): |
2363 | + return self._calculate_hgsubversion_repository_info(svn_info) |
2364 | + |
2365 | + self._type = 'hg' |
2366 | + |
2367 | + path = self.hg_root |
2368 | + base_path = '/' |
2369 | + |
2370 | + if self.hgrc: |
2371 | + self._calculate_remote_path() |
2372 | + |
2373 | + if self._remote_path: |
2374 | + path = self._remote_path[1] |
2375 | + base_path = '' |
2376 | + |
2377 | + return RepositoryInfo(path=path, base_path=base_path, |
2378 | + supports_parent_diffs=True) |
2379 | + |
2380 | + def _calculate_remote_path(self): |
2381 | + for candidate in self._remote_path_candidates: |
2382 | + |
2383 | + rc_key = 'paths.%s' % candidate |
2384 | + |
2385 | + if (not self._remote_path and self.hgrc.get(rc_key)): |
2386 | + self._remote_path = (candidate, self.hgrc.get(rc_key)) |
2387 | + debug('Using candidate path %r: %r' % self._remote_path) |
2388 | + |
2389 | + return |
2390 | + |
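The candidate fallback implemented above boils down to "first configured `[paths]` entry wins"; it can be sketched with a plain dict standing in for the parsed hgrc:

```python
def calculate_remote_path(hgrc, candidates=('reviewboard', 'origin',
                                            'parent', 'default')):
    """Return (name, url) for the first [paths] entry found among the
    ordered candidates, or None if none is configured."""
    for candidate in candidates:
        value = hgrc.get('paths.%s' % candidate)
        if value:
            return (candidate, value)
    return None
```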
2391 | + def _calculate_hgsubversion_repository_info(self, svn_info): |
2392 | + self._type = 'svn' |
2393 | + m = re.search(r'^Repository Root: (.+)$', svn_info, re.M) |
2394 | |
2395 | if not m: |
2396 | - # Return the local path, if no default value is found. |
2397 | - return RepositoryInfo(path=local_data, base_path='/', |
2398 | - supports_parent_diffs=True) |
2399 | + return None |
2400 | |
2401 | - path = m.group(1).strip() |
2402 | - m2 = re.match(r'^(svn\+ssh|http|https)://([-a-zA-Z0-9.]*@)(.*)$', |
2403 | - path) |
2404 | + path = m.group(1) |
2405 | + m2 = re.match(r'^(svn\+ssh|http|https|svn)://([-a-zA-Z0-9.]*@)(.*)$', |
2406 | + path) |
2407 | if m2: |
2408 | path = '%s://%s' % (m2.group(1), m2.group(3)) |
2409 | |
2410 | - return RepositoryInfo(path=path, base_path='', |
2411 | + m = re.search(r'^URL: (.+)$', svn_info, re.M) |
2412 | + |
2413 | + if not m: |
2414 | + return None |
2415 | + |
2416 | + base_path = m.group(1)[len(path):] or "/" |
2417 | + return RepositoryInfo(path=path, base_path=base_path, |
2418 | supports_parent_diffs=True) |
2419 | |
2420 | + @property |
2421 | + def hg_root(self): |
2422 | + if not self._hg_root: |
2423 | + root = execute(['hg', 'root'], env=self._hg_env, |
2424 | + ignore_errors=True) |
2425 | + |
2426 | + if not root.startswith('abort:'): |
2427 | + self._hg_root = root.strip() |
2428 | + else: |
2429 | + return None |
2430 | + |
2431 | + return self._hg_root |
2432 | + |
2433 | + def _load_hgrc(self): |
2434 | + for line in execute(['hg', 'showconfig'], split_lines=True): |
2435 | + key, value = line.split('=', 1) |
2436 | + self.hgrc[key] = value.strip() |
2437 | + |
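Each `hg showconfig` line has the shape `section.key=value`, and the parser splits on the first `=` only, since values (URLs, usernames) may themselves contain `=`; roughly:

```python
def load_hgrc(showconfig_lines):
    """Parse `hg showconfig` output into a flat dict keyed by
    'section.key', splitting on the first '=' only."""
    hgrc = {}
    for line in showconfig_lines:
        key, value = line.split('=', 1)
        hgrc[key] = value.strip()
    return hgrc
```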
2438 | def extract_summary(self, revision): |
2439 | """ |
2440 | Extracts the first line from the description of the given changeset. |
2441 | """ |
2442 | return execute(['hg', 'log', '-r%s' % revision, '--template', |
2443 | - r'{desc|firstline}\n']) |
2444 | + r'{desc|firstline}\n'], env=self._hg_env) |
2445 | |
2446 | def extract_description(self, rev1, rev2): |
2447 | """ |
2448 | Extracts all descriptions in the given revision range and concatenates |
2449 | them, most recent ones going first. |
2450 | """ |
2451 | - numrevs = len(execute(['hg', 'log', '-r%s:%s' % (rev2, rev1), |
2452 | - '--follow', '--template', |
2453 | - r'{rev}\n']).strip().split('\n')) |
2454 | + numrevs = len(execute([ |
2455 | + 'hg', 'log', '-r%s:%s' % (rev2, rev1), |
2456 | + '--follow', '--template', r'{rev}\n'], env=self._hg_env |
2457 | + ).strip().split('\n')) |
2458 | + |
2459 | return execute(['hg', 'log', '-r%s:%s' % (rev2, rev1), |
2460 | '--follow', '--template', |
2461 | r'{desc}\n\n', '--limit', |
2462 | - str(numrevs - 1)]).strip() |
2463 | + str(numrevs - 1)], env=self._hg_env).strip() |
2464 | |
2465 | def diff(self, files): |
2466 | """ |
2467 | Performs a diff across all modified files in a Mercurial repository. |
2468 | """ |
2469 | - # We don't support parent diffs with Mercurial yet, so we always |
2470 | - # return None for the parent diff. |
2471 | - if self.type == 'svn': |
2472 | - parent = execute(['hg', 'parent', '--svn', '--template', |
2473 | - '{node}\n']).strip() |
2474 | - |
2475 | - if options.parent_branch: |
2476 | - parent = options.parent_branch |
2477 | - |
2478 | - if options.guess_summary and not options.summary: |
2479 | - options.summary = self.extract_summary(".") |
2480 | - |
2481 | - if options.guess_description and not options.description: |
2482 | - options.description = self.extract_description(parent, ".") |
2483 | - |
2484 | - return (execute(["hg", "diff", "--svn", '-r%s:.' % parent]), None) |
2485 | - |
2486 | - return (execute(["hg", "diff"] + files), None) |
2487 | + files = files or [] |
2488 | + |
2489 | + if self._type == 'svn': |
2490 | + return self._get_hgsubversion_diff(files) |
2491 | + else: |
2492 | + return self._get_outgoing_diff(files) |
2493 | + |
2494 | + def _get_hgsubversion_diff(self, files): |
2495 | + parent = execute(['hg', 'parent', '--svn', '--template', |
2496 | + '{node}\n']).strip() |
2497 | + |
2498 | + if options.parent_branch: |
2499 | + parent = options.parent_branch |
2500 | + |
2501 | + if options.guess_summary and not options.summary: |
2502 | + options.summary = self.extract_summary(".") |
2503 | + |
2504 | + if options.guess_description and not options.description: |
2505 | + options.description = self.extract_description(parent, ".") |
2506 | + |
2507 | + return (execute(["hg", "diff", "--svn", '-r%s:.' % parent]), None) |
2508 | + |
2509 | + def _get_outgoing_diff(self, files): |
2510 | + """ |
2511 | + When working with a clone of a Mercurial remote, we need to find |
2512 | + out what the outgoing revisions are for a given branch. It would |
2513 | + be nice if we could just do `hg outgoing --patch <remote>`, but |
2514 | + there are a couple of problems with this. |
2515 | + |
2516 | + For one, the server-side diff parser isn't yet equipped to filter out |
2517 | + diff headers such as "comparing with..." and "changeset: <rev>:<hash>". |
2518 | + Another problem is that the output of `outgoing` potentially includes |
2519 | + changesets across multiple branches. |
2520 | + |
2521 | + In order to provide the most accurate comparison between one's local |
2522 | + clone and a given remote -- something akin to git's diff command syntax |
2523 | + `git diff <treeish>..<treeish>` -- we have to do the following: |
2524 | + |
2525 | + - get the name of the current branch |
2526 | + - get a list of outgoing changesets, specifying a custom format |
2527 | + - filter outgoing changesets by the current branch name |
2528 | + - get the "top" and "bottom" outgoing changesets |
2529 | + - use these changesets as arguments to `hg diff -r <rev> -r <rev>` |
2530 | + |
2531 | + |
2532 | + Future modifications may need to be made to account for odd cases like |
2533 | + having multiple diverged branches which share partial history -- or we |
2534 | + can just punish developers for doing such nonsense :) |
2535 | + """ |
2536 | + files = files or [] |
2537 | + |
2538 | + remote = self._remote_path[0] |
2539 | + |
2540 | + if not remote and options.parent_branch: |
2541 | + remote = options.parent_branch |
2542 | + |
2543 | + current_branch = execute(['hg', 'branch'], env=self._hg_env).strip() |
2544 | + |
2545 | + outgoing_changesets = \ |
2546 | + self._get_outgoing_changesets(current_branch, remote) |
2547 | + |
2548 | + top_rev, bottom_rev = \ |
2549 | + self._get_top_and_bottom_outgoing_revs(outgoing_changesets) |
2550 | + |
2551 | + if options.guess_summary and not options.summary: |
2552 | + options.summary = self.extract_summary(top_rev).rstrip("\n") |
2553 | + |
2554 | + if options.guess_description and not options.description: |
2555 | + options.description = self.extract_description(bottom_rev, top_rev) |
2556 | + |
2557 | + full_command = ['hg', 'diff', '-r', str(bottom_rev), '-r', |
2558 | + str(top_rev)] + files |
2559 | + |
2560 | + return (execute(full_command, env=self._hg_env), None) |
2561 | + |
2562 | + def _get_outgoing_changesets(self, current_branch, remote): |
2563 | + """ |
2564 | + Given the current branch name and a remote path, return a list |
2565 | + of outgoing changeset numbers. |
2566 | + """ |
2567 | + outgoing_changesets = [] |
2568 | + raw_outgoing = execute(['hg', '-q', 'outgoing', '--template', |
2569 | + 'b:{branches}\nr:{rev}\n\n', remote], |
2570 | + env=self._hg_env) |
2571 | + |
2572 | + for pair in raw_outgoing.split('\n\n'): |
2573 | + if not pair.strip(): |
2574 | + continue |
2575 | + |
2576 | + branch, rev = pair.strip().split('\n') |
2577 | + |
2578 | + branch_name = branch[len('b:'):].strip() |
2579 | + branch_name = branch_name or 'default' |
2580 | + revno = rev[len('r:'):] |
2581 | + |
2582 | + if branch_name == current_branch and revno.isdigit(): |
2583 | + debug('Found outgoing changeset %s for branch %r' |
2584 | + % (revno, branch_name)) |
2585 | + outgoing_changesets.append(int(revno)) |
2586 | + |
2587 | + return outgoing_changesets |
2588 | + |
2589 | + def _get_top_and_bottom_outgoing_revs(self, outgoing_changesets): |
2590 | +        # This is a method rather than a module-level function mostly |
2591 | +        # just to keep the module namespace clean. (Pylint suggested it.) |
2592 | + top_rev = max(outgoing_changesets) |
2593 | + bottom_rev = min(outgoing_changesets) |
2594 | + |
2595 | + parents = execute(["hg", "log", "-r", str(bottom_rev), |
2596 | + "--template", "{parents}"], |
2597 | + env=self._hg_env) |
2598 | + parents = parents.rstrip("\n").split(":") |
2599 | + |
2600 | + if len(parents) > 1: |
2601 | + bottom_rev = parents[0] |
2602 | + else: |
2603 | + bottom_rev = bottom_rev - 1 |
2604 | + |
2605 | + bottom_rev = max(0, bottom_rev) |
2606 | + |
2607 | + return top_rev, bottom_rev |
2608 | |
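As an aside for reviewers, the branch-filtering step in `_get_outgoing_changesets` above can be exercised in isolation. The following is a hypothetical standalone sketch (the function name `filter_outgoing` is ours, not rbtools'); it parses the `b:{branches}\nr:{rev}\n\n` template output the same way the new method does:

```python
def filter_outgoing(raw_outgoing, current_branch):
    """Parse `hg -q outgoing --template 'b:{branches}\\nr:{rev}\\n\\n'`
    output and keep only revision numbers on the given branch."""
    revs = []
    for pair in raw_outgoing.split('\n\n'):
        if not pair.strip():
            continue
        branch, rev = pair.strip().split('\n')
        # An empty {branches} field means the default branch.
        branch_name = branch[len('b:'):].strip() or 'default'
        revno = rev[len('r:'):]
        if branch_name == current_branch and revno.isdigit():
            revs.append(int(revno))
    return revs

print(filter_outgoing('b:\nr:4\n\nb:stable\nr:5\n\n', 'default'))  # [4]
```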
2609 | def diff_between_revisions(self, revision_range, args, repository_info): |
2610 | """ |
2611 | Performs a diff between 2 revisions of a Mercurial repository. |
2612 | """ |
2613 | - if self.type != 'hg': |
2614 | + if self._type != 'hg': |
2615 | raise NotImplementedError |
2616 | |
2617 | - r1, r2 = revision_range.split(':') |
2618 | + if ':' in revision_range: |
2619 | + r1, r2 = revision_range.split(':') |
2620 | + else: |
2621 | + # If only 1 revision is given, we find the first parent and use |
2622 | + # that as the second revision. |
2623 | + # |
2624 | + # We could also use "hg diff -c r1", but then we couldn't reuse the |
2625 | + # code for extracting descriptions. |
2626 | + r2 = revision_range |
2627 | + r1 = execute(["hg", "parents", "-r", r2, |
2628 | + "--template", "{rev}\n"]).split()[0] |
2629 | |
2630 | if options.guess_summary and not options.summary: |
2631 | options.summary = self.extract_summary(r2) |
2632 | @@ -2039,7 +2695,27 @@ |
2633 | if options.guess_description and not options.description: |
2634 | options.description = self.extract_description(r1, r2) |
2635 | |
2636 | - return execute(["hg", "diff", "-r", r1, "-r", r2]) |
2637 | + return (execute(["hg", "diff", "-r", r1, "-r", r2], |
2638 | + env=self._hg_env), None) |
2639 | + |
2640 | + def scan_for_server(self, repository_info): |
2641 | + # Scan first for dot files, since it's faster and will cover the |
2642 | + # user's $HOME/.reviewboardrc |
2643 | + server_url = \ |
2644 | + super(MercurialClient, self).scan_for_server(repository_info) |
2645 | + |
2646 | + if not server_url and self.hgrc.get('reviewboard.url'): |
2647 | + server_url = self.hgrc.get('reviewboard.url').strip() |
2648 | + |
2649 | + if not server_url and self._type == "svn": |
2650 | + # Try using the reviewboard:url property on the SVN repo, if it |
2651 | + # exists. |
2652 | + prop = SVNClient().scan_for_server_property(repository_info) |
2653 | + |
2654 | + if prop: |
2655 | + return prop |
2656 | + |
2657 | + return server_url |
2658 | |
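The new `scan_for_server` override establishes a three-step lookup order: dotfiles first, then the `reviewboard.url` hgrc setting, then (for hgsubversion clones) the SVN repository property. A minimal sketch of that precedence, with a hypothetical helper name and plain-dict stand-ins for the real config objects:

```python
def find_server_url(dotfile_url, hgrc, svn_prop, repo_type):
    """Return the first configured Review Board URL, mirroring the
    precedence in scan_for_server: dotfiles, then hgrc, then the
    SVN property (hgsubversion clones only)."""
    if dotfile_url:
        return dotfile_url
    url = (hgrc.get('reviewboard.url') or '').strip()
    if url:
        return url
    if repo_type == 'svn' and svn_prop:
        return svn_prop
    return None

print(find_server_url(None, {'reviewboard.url': ' http://rb.example.com '},
                      None, 'hg'))
```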
2659 | |
2660 | class GitClient(SCMClient): |
2661 | @@ -2048,93 +2724,143 @@ |
2662 | compatible diffs. This will attempt to generate a diff suitable for the |
2663 | remote repository, whether git, SVN or Perforce. |
2664 | """ |
2665 | + def __init__(self): |
2666 | + SCMClient.__init__(self) |
2667 | + # Store the 'correct' way to invoke git, just plain old 'git' by default |
2668 | + self.git = 'git' |
2669 | + |
2670 | + def _strip_heads_prefix(self, ref): |
2671 | + """ Strips prefix from ref name, if possible """ |
2672 | + return re.sub(r'^refs/heads/', '', ref) |
2673 | + |
2674 | def get_repository_info(self): |
2675 | if not check_install('git --help'): |
2676 | - return None |
2677 | + # CreateProcess (launched via subprocess, used by check_install) |
2678 | + # does not automatically append .cmd for things it finds in PATH. |
2679 | + # If we're on Windows, and this works, save it for further use. |
2680 | + if sys.platform.startswith('win') and check_install('git.cmd --help'): |
2681 | + self.git = 'git.cmd' |
2682 | + else: |
2683 | + return None |
2684 | |
2685 | - git_dir = execute(["git", "rev-parse", "--git-dir"], |
2686 | - ignore_errors=True).strip() |
2687 | + git_dir = execute([self.git, "rev-parse", "--git-dir"], |
2688 | + ignore_errors=True).rstrip("\n") |
2689 | |
2690 | if git_dir.startswith("fatal:") or not os.path.isdir(git_dir): |
2691 | return None |
2692 | + self.bare = execute([self.git, "config", "core.bare"]).strip() == 'true' |
2693 | |
2694 |         # post-review in directories other than the top level |
2695 |         # of a work-tree would result in broken diffs on the server |
2696 | - os.chdir(os.path.dirname(os.path.abspath(git_dir))) |
2697 | + if not self.bare: |
2698 | + os.chdir(os.path.dirname(os.path.abspath(git_dir))) |
2699 | |
2700 | - self.head_ref = execute(['git', 'symbolic-ref', '-q', 'HEAD']).strip() |
2701 | + self.head_ref = execute([self.git, 'symbolic-ref', '-q', 'HEAD']).strip() |
2702 | |
2703 | # We know we have something we can work with. Let's find out |
2704 | - # what it is. We'll try SVN first. |
2705 | - data = execute(["git", "svn", "info"], ignore_errors=True) |
2706 | - |
2707 | - m = re.search(r'^Repository Root: (.+)$', data, re.M) |
2708 | - if m: |
2709 | - path = m.group(1) |
2710 | - m = re.search(r'^URL: (.+)$', data, re.M) |
2711 | + # what it is. We'll try SVN first, but only if there's a .git/svn |
2712 | + # directory. Otherwise, it may attempt to create one and scan |
2713 | + # revisions, which can be slow. |
2714 | + git_svn_dir = os.path.join(git_dir, 'svn') |
2715 | + |
2716 | + if os.path.isdir(git_svn_dir) and len(os.listdir(git_svn_dir)) > 0: |
2717 | + data = execute([self.git, "svn", "info"], ignore_errors=True) |
2718 | + |
2719 | + m = re.search(r'^Repository Root: (.+)$', data, re.M) |
2720 | |
2721 | if m: |
2722 | - base_path = m.group(1)[len(path):] or "/" |
2723 | - m = re.search(r'^Repository UUID: (.+)$', data, re.M) |
2724 | + path = m.group(1) |
2725 | + m = re.search(r'^URL: (.+)$', data, re.M) |
2726 | |
2727 | if m: |
2728 | - uuid = m.group(1) |
2729 | - self.type = "svn" |
2730 | - self.upstream_branch = options.parent_branch or 'master' |
2731 | - |
2732 | - return SvnRepositoryInfo(path=path, base_path=base_path, |
2733 | - uuid=uuid, |
2734 | - supports_parent_diffs=True) |
2735 | - else: |
2736 | - # Versions of git-svn before 1.5.4 don't (appear to) support |
2737 | - # 'git svn info'. If we fail because of an older git install, |
2738 | - # here, figure out what version of git is installed and give |
2739 | - # the user a hint about what to do next. |
2740 | - version = execute(["git", "svn", "--version"], ignore_errors=True) |
2741 | - version_parts = re.search('version (\d+)\.(\d+)\.(\d+)', |
2742 | - version) |
2743 | - svn_remote = execute(["git", "config", "--get", |
2744 | - "svn-remote.svn.url"], ignore_errors=True) |
2745 | - |
2746 | - if (version_parts and |
2747 | - not self.is_valid_version((int(version_parts.group(1)), |
2748 | - int(version_parts.group(2)), |
2749 | - int(version_parts.group(3))), |
2750 | - (1, 5, 4)) and |
2751 | - svn_remote): |
2752 | - die("Your installation of git-svn must be upgraded to " + \ |
2753 | - "version 1.5.4 or later") |
2754 | + base_path = m.group(1)[len(path):] or "/" |
2755 | + m = re.search(r'^Repository UUID: (.+)$', data, re.M) |
2756 | + |
2757 | + if m: |
2758 | + uuid = m.group(1) |
2759 | + self.type = "svn" |
2760 | + |
2761 | + # Get SVN tracking branch |
2762 | + if options.parent_branch: |
2763 | + self.upstream_branch = options.parent_branch |
2764 | + else: |
2765 | + data = execute([self.git, "svn", "rebase", "-n"], |
2766 | + ignore_errors=True) |
2767 | + m = re.search(r'^Remote Branch:\s*(.+)$', data, re.M) |
2768 | + |
2769 | + if m: |
2770 | + self.upstream_branch = m.group(1) |
2771 | + else: |
2772 | + sys.stderr.write('Failed to determine SVN tracking ' |
2773 | + 'branch. Defaulting to "master"\n') |
2774 | + self.upstream_branch = 'master' |
2775 | + |
2776 | + return SvnRepositoryInfo(path=path, |
2777 | + base_path=base_path, |
2778 | + uuid=uuid, |
2779 | + supports_parent_diffs=True) |
2780 | + else: |
2781 | + # Versions of git-svn before 1.5.4 don't (appear to) support |
2782 | + # 'git svn info'. If we fail because of an older git install, |
2783 | + # here, figure out what version of git is installed and give |
2784 | + # the user a hint about what to do next. |
2785 | + version = execute([self.git, "svn", "--version"], |
2786 | + ignore_errors=True) |
2787 | + version_parts = re.search('version (\d+)\.(\d+)\.(\d+)', |
2788 | + version) |
2789 | + svn_remote = execute([self.git, "config", "--get", |
2790 | + "svn-remote.svn.url"], |
2791 | + ignore_errors=True) |
2792 | + |
2793 | + if (version_parts and |
2794 | + not self.is_valid_version((int(version_parts.group(1)), |
2795 | + int(version_parts.group(2)), |
2796 | + int(version_parts.group(3))), |
2797 | + (1, 5, 4)) and |
2798 | + svn_remote): |
2799 | + die("Your installation of git-svn must be upgraded to " |
2800 | + "version 1.5.4 or later") |
2801 | |
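The git-svn version gate above parses `git svn --version` output and compares component tuples against (1, 5, 4). A minimal sketch of that check (the helper name is ours); the regular expression matches what the diff passes to `is_valid_version`:

```python
import re

def parse_git_svn_version(output):
    """Extract (major, minor, patch) from `git svn --version` output,
    using the same pattern as the version check above."""
    m = re.search(r'version (\d+)\.(\d+)\.(\d+)', output)
    return tuple(int(g) for g in m.groups()) if m else None

version = parse_git_svn_version('git-svn version 1.7.9.5 (svn 1.6.17)')
print(version)                  # (1, 7, 9)
print(version >= (1, 5, 4))    # True: new enough for `git svn info`
```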
2802 | # Okay, maybe Perforce. |
2803 | # TODO |
2804 | |
2805 | # Nope, it's git then. |
2806 | # Check for a tracking branch and determine merge-base |
2807 | - short_head = self.head_ref.split('/')[-1] |
2808 | - merge = execute(['git', 'config', '--get', |
2809 | + short_head = self._strip_heads_prefix(self.head_ref) |
2810 | + merge = execute([self.git, 'config', '--get', |
2811 | 'branch.%s.merge' % short_head], |
2812 | ignore_errors=True).strip() |
2813 | - remote = execute(['git', 'config', '--get', |
2814 | + remote = execute([self.git, 'config', '--get', |
2815 | 'branch.%s.remote' % short_head], |
2816 | ignore_errors=True).strip() |
2817 | |
2818 | - HEADS_PREFIX = 'refs/heads/' |
2819 | - |
2820 | - if merge.startswith(HEADS_PREFIX): |
2821 | - merge = merge[len(HEADS_PREFIX):] |
2822 | - |
2823 | + merge = self._strip_heads_prefix(merge) |
2824 | self.upstream_branch = '' |
2825 | |
2826 | if remote and remote != '.' and merge: |
2827 | self.upstream_branch = '%s/%s' % (remote, merge) |
2828 | |
2829 | - self.upstream_branch, origin_url = self.get_origin(self.upstream_branch, |
2830 | - True) |
2831 | - |
2832 | - if not origin_url or origin_url.startswith("fatal:"): |
2833 | - self.upstream_branch, origin_url = self.get_origin() |
2834 | - |
2835 | + url = None |
2836 | + if options.repository_url: |
2837 | + url = options.repository_url |
2838 | + else: |
2839 | + self.upstream_branch, origin_url = \ |
2840 | + self.get_origin(self.upstream_branch, True) |
2841 | + |
2842 | + if not origin_url or origin_url.startswith("fatal:"): |
2843 | + self.upstream_branch, origin_url = self.get_origin() |
2844 | + |
2845 | + url = origin_url.rstrip('/') |
2846 | + |
2847 | + # Central bare repositories don't have origin URLs. |
2848 | + # We return git_dir instead and hope for the best. |
2849 | url = origin_url.rstrip('/') |
2850 | + |
2851 | + if not url: |
2852 | + url = os.path.abspath(git_dir) |
2853 | + |
2854 | + # There is no remote, so skip this part of upstream_branch. |
2855 | + self.upstream_branch = self.upstream_branch.split('/')[-1] |
2856 | if url: |
2857 | self.type = "git" |
2858 | return RepositoryInfo(path=url, base_path='', |
2859 | @@ -2150,10 +2876,10 @@ |
2860 | upstream_branch = options.tracking or default_upstream_branch or \ |
2861 | 'origin/master' |
2862 | upstream_remote = upstream_branch.split('/')[0] |
2863 | - origin_url = execute(["git", "config", "remote.%s.url" % upstream_remote], |
2864 | - ignore_errors=ignore_errors) |
2865 | - |
2866 | - return (upstream_branch, origin_url.rstrip('\n')) |
2867 | + origin_url = execute([self.git, "config", "--get", |
2868 | + "remote.%s.url" % upstream_remote], |
2869 | + ignore_errors=True).rstrip("\n") |
2870 | + return (upstream_branch, origin_url) |
2871 | |
2872 | def is_valid_version(self, actual, expected): |
2873 | """ |
2874 | @@ -2176,7 +2902,7 @@ |
2875 | return server_url |
2876 | |
2877 | # TODO: Maybe support a server per remote later? Is that useful? |
2878 | - url = execute(["git", "config", "--get", "reviewboard.url"], |
2879 | + url = execute([self.git, "config", "--get", "reviewboard.url"], |
2880 | ignore_errors=True).strip() |
2881 | if url: |
2882 | return url |
2883 | @@ -2198,7 +2924,7 @@ |
2884 | """ |
2885 | parent_branch = options.parent_branch |
2886 | |
2887 | - self.merge_base = execute(["git", "merge-base", self.upstream_branch, |
2888 | + self.merge_base = execute([self.git, "merge-base", self.upstream_branch, |
2889 | self.head_ref]).strip() |
2890 | |
2891 | if parent_branch: |
2892 | @@ -2209,12 +2935,13 @@ |
2893 | parent_diff_lines = None |
2894 | |
2895 | if options.guess_summary and not options.summary: |
2896 | - options.summary = execute(["git", "log", "--pretty=format:%s", |
2897 | - "HEAD^.."], ignore_errors=True).strip() |
2898 | + s = execute([self.git, "log", "--pretty=format:%s", "HEAD^.."], |
2899 | + ignore_errors=True) |
2900 | + options.summary = s.replace('\n', ' ').strip() |
2901 | |
2902 | if options.guess_description and not options.description: |
2903 | options.description = execute( |
2904 | - ["git", "log", "--pretty=format:%s%n%n%b", |
2905 | + [self.git, "log", "--pretty=format:%s%n%n%b", |
2906 | (parent_branch or self.merge_base) + ".."], |
2907 | ignore_errors=True).strip() |
2908 | |
2909 | @@ -2224,16 +2951,19 @@ |
2910 | """ |
2911 | Performs a diff on a particular branch range. |
2912 | """ |
2913 | - rev_range = "%s..%s" % (ancestor, commit) |
2914 | + if commit: |
2915 | + rev_range = "%s..%s" % (ancestor, commit) |
2916 | + else: |
2917 | + rev_range = ancestor |
2918 | |
2919 | if self.type == "svn": |
2920 | - diff_lines = execute(["git", "diff", "--no-color", "--no-prefix", |
2921 | - "-r", "-u", rev_range], |
2922 | + diff_lines = execute([self.git, "diff", "--no-color", "--no-prefix", |
2923 | + "--no-ext-diff", "-r", "-u", rev_range], |
2924 | split_lines=True) |
2925 | return self.make_svn_diff(ancestor, diff_lines) |
2926 | elif self.type == "git": |
2927 | - return execute(["git", "diff", "--no-color", "--full-index", |
2928 | - rev_range]) |
2929 | + return execute([self.git, "diff", "--no-color", "--full-index", |
2930 | + "--no-ext-diff", rev_range]) |
2931 | |
2932 | return None |
2933 | |
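The `commit`-optional range handling introduced in `make_diff` above reduces to a one-liner; sketched here with a hypothetical helper name for clarity:

```python
def build_rev_range(ancestor, commit=None):
    """Build the revision argument passed to `git diff`: a two-dot
    range when a commit is given, otherwise the ancestor alone."""
    return "%s..%s" % (ancestor, commit) if commit else ancestor

print(build_rev_range('origin/master', 'HEAD'))  # origin/master..HEAD
print(build_rev_range('abc123'))                 # abc123
```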
2934 | @@ -2243,14 +2973,13 @@ |
2935 | svn diff would generate. This is needed so the SVNTool in Review |
2936 | Board can properly parse this diff. |
2937 | """ |
2938 | - rev = execute(["git", "svn", "find-rev", parent_branch]).strip() |
2939 | + rev = execute([self.git, "svn", "find-rev", parent_branch]).strip() |
2940 | |
2941 | if not rev: |
2942 | return None |
2943 | |
2944 | diff_data = "" |
2945 | filename = "" |
2946 | - revision = "" |
2947 | newfile = False |
2948 | |
2949 | for line in diff_lines: |
2950 | @@ -2295,33 +3024,400 @@ |
2951 | |
2952 | def diff_between_revisions(self, revision_range, args, repository_info): |
2953 | """Perform a diff between two arbitrary revisions""" |
2954 | + |
2955 | + # Make a parent diff to the first of the revisions so that we |
2956 | + # never end up with broken patches: |
2957 | + self.merge_base = execute([self.git, "merge-base", self.upstream_branch, |
2958 | + self.head_ref]).strip() |
2959 | + |
2960 | if ":" not in revision_range: |
2961 | # only one revision is specified |
2962 | + |
2963 | + # Check if parent contains the first revision and make a |
2964 | + # parent diff if not: |
2965 | + pdiff_required = execute([self.git, "branch", "-r", |
2966 | + "--contains", revision_range]) |
2967 | + parent_diff_lines = None |
2968 | + |
2969 | + if not pdiff_required: |
2970 | + parent_diff_lines = self.make_diff(self.merge_base, revision_range) |
2971 | + |
2972 | if options.guess_summary and not options.summary: |
2973 | - options.summary = execute( |
2974 | - ["git", "log", "--pretty=format:%s", revision_range + ".."], |
2975 | - ignore_errors=True).strip() |
2976 | + s = execute([self.git, "log", "--pretty=format:%s", |
2977 | + revision_range + ".."], |
2978 | + ignore_errors=True) |
2979 | + options.summary = s.replace('\n', ' ').strip() |
2980 | |
2981 | if options.guess_description and not options.description: |
2982 | options.description = execute( |
2983 | - ["git", "log", "--pretty=format:%s%n%n%b", revision_range + ".."], |
2984 | + [self.git, "log", "--pretty=format:%s%n%n%b", revision_range + ".."], |
2985 | ignore_errors=True).strip() |
2986 | |
2987 | - return self.make_diff(revision_range) |
2988 | + return (self.make_diff(revision_range), parent_diff_lines) |
2989 | else: |
2990 | r1, r2 = revision_range.split(":") |
2991 | + # Check if parent contains the first revision and make a |
2992 | + # parent diff if not: |
2993 | + pdiff_required = execute([self.git, "branch", "-r", |
2994 | + "--contains", r1]) |
2995 | + parent_diff_lines = None |
2996 | + |
2997 | + if not pdiff_required: |
2998 | + parent_diff_lines = self.make_diff(self.merge_base, r1) |
2999 | |
3000 | if options.guess_summary and not options.summary: |
3001 | - options.summary = execute( |
3002 | - ["git", "log", "--pretty=format:%s", "%s..%s" % (r1, r2)], |
3003 | - ignore_errors=True).strip() |
3004 | + s = execute([self.git, "log", "--pretty=format:%s", |
3005 | + "%s..%s" % (r1, r2)], |
3006 | + ignore_errors=True) |
3007 | + options.summary = s.replace('\n', ' ').strip() |
3008 | |
3009 | if options.guess_description and not options.description: |
3010 | options.description = execute( |
3011 | - ["git", "log", "--pretty=format:%s%n%n%b", "%s..%s" % (r1, r2)], |
3012 | + [self.git, "log", "--pretty=format:%s%n%n%b", "%s..%s" % (r1, r2)], |
3013 | ignore_errors=True).strip() |
3014 | |
3015 | - return self.make_diff(r1, r2) |
3016 | + return (self.make_diff(r1, r2), parent_diff_lines) |
3017 | + |
3018 | + |
3019 | +class PlasticClient(SCMClient): |
3020 | + """ |
3021 | + A wrapper around the cm Plastic tool that fetches repository |
3022 | + information and generates compatible diffs |
3023 | + """ |
3024 | + def get_repository_info(self): |
3025 | + if not check_install('cm version'): |
3026 | + return None |
3027 | + |
3028 | + # Get the repository that the current directory is from. If there |
3029 | + # is more than one repository mounted in the current directory, |
3030 | +        # bail out for now (in future, we should probably create a review |
3031 | +        # request for each repository.) |
3032 | + split = execute(["cm", "ls", "--format={8}"], split_lines=True, |
3033 | + ignore_errors=True) |
3034 | + m = re.search(r'^rep:(.+)$', split[0], re.M) |
3035 | + |
3036 | + if not m: |
3037 | + return None |
3038 | + |
3039 | + # Make sure the repository list contains only one unique entry |
3040 | + if len(split) != split.count(split[0]): |
3041 | + # Not unique! |
3042 | + die('Directory contains more than one mounted repository') |
3043 | + |
3044 | + path = m.group(1) |
3045 | + |
3046 | + # Get the workspace directory, so we can strip it from the diff output |
3047 | + self.workspacedir = execute(["cm", "gwp", ".", "--format={1}"], |
3048 | + split_lines=False, |
3049 | + ignore_errors=True).strip() |
3050 | + |
3051 | + debug("Workspace is %s" % self.workspacedir) |
3052 | + |
3053 | + return RepositoryInfo(path, |
3054 | + supports_changesets=True, |
3055 | + supports_parent_diffs=False) |
3056 | + |
3057 | + def get_changenum(self, args): |
3058 | + """ Extract the integer value from a changeset ID (cs:1234) """ |
3059 | + if len(args) == 1 and args[0].startswith("cs:"): |
3060 | + try: |
3061 | + return str(int(args[0][3:])) |
3062 | + except ValueError: |
3063 | + pass |
3064 | + |
3065 | + return None |
3066 | + |
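For reference, the changeset-ID parsing added here can be tried standalone; this copy behaves the same as the method above:

```python
def get_changenum(args):
    """Extract the integer value from a changeset ID like 'cs:1234';
    return None for anything else."""
    if len(args) == 1 and args[0].startswith("cs:"):
        try:
            return str(int(args[0][3:]))
        except ValueError:
            pass
    return None

print(get_changenum(["cs:1234"]))   # 1234
print(get_changenum(["cs:abc"]))    # None
```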
3067 | + def sanitize_changenum(self, changenum): |
3068 | + """ Return a "sanitized" change number. Currently a no-op """ |
3069 | + return changenum |
3070 | + |
3071 | + def diff(self, args): |
3072 | + """ |
3073 | + Performs a diff across all modified files in a Plastic workspace |
3074 | + |
3075 | +        Parent diffs are not supported (the second value of the returned tuple is always None). |
3076 | + """ |
3077 | + changenum = self.get_changenum(args) |
3078 | + |
3079 | + if changenum is None: |
3080 | + return self.branch_diff(args), None |
3081 | + else: |
3082 | + return self.changenum_diff(changenum), None |
3083 | + |
3084 | + def diff_between_revisions(self, revision_range, args, repository_info): |
3085 | + """ |
3086 | + Performs a diff between 2 revisions of a Plastic repository. |
3087 | + |
3088 | + Assume revision_range is a branch specification (br:/main/task001) |
3089 | + and hand over to branch_diff |
3090 | + """ |
3091 | + return (self.branch_diff(revision_range), None) |
3092 | + |
3093 | + def changenum_diff(self, changenum): |
3094 | + debug("changenum_diff: %s" % (changenum)) |
3095 | + files = execute(["cm", "log", "cs:" + changenum, |
3096 | + "--csFormat={items}", |
3097 | + "--itemFormat={shortstatus} {path} " |
3098 | + "rev:revid:{revid} rev:revid:{parentrevid} " |
3099 | + "src:{srccmpath} rev:revid:{srcdirrevid} " |
3100 | + "dst:{dstcmpath} rev:revid:{dstdirrevid}{newline}"], |
3101 | + split_lines = True) |
3102 | + |
3103 | + debug("got files: %s" % (files)) |
3104 | + |
3105 | + # Diff generation based on perforce client |
3106 | + diff_lines = [] |
3107 | + |
3108 | + empty_filename = make_tempfile() |
3109 | + tmp_diff_from_filename = make_tempfile() |
3110 | + tmp_diff_to_filename = make_tempfile() |
3111 | + |
3112 | + for f in files: |
3113 | + f = f.strip() |
3114 | + |
3115 | + if not f: |
3116 | + continue |
3117 | + |
3118 | + m = re.search(r'(?P<type>[ACIMR]) (?P<file>.*) ' |
3119 | + r'(?P<revspec>rev:revid:[-\d]+) ' |
3120 | + r'(?P<parentrevspec>rev:revid:[-\d]+) ' |
3121 | + r'src:(?P<srcpath>.*) ' |
3122 | + r'(?P<srcrevspec>rev:revid:[-\d]+) ' |
3123 | + r'dst:(?P<dstpath>.*) ' |
3124 | + r'(?P<dstrevspec>rev:revid:[-\d]+)$', |
3125 | + f) |
3126 | + if not m: |
3127 | + die("Could not parse 'cm log' response: %s" % f) |
3128 | + |
3129 | + changetype = m.group("type") |
3130 | + filename = m.group("file") |
3131 | + |
3132 | + if changetype == "M": |
3133 | + # Handle moved files as a delete followed by an add. |
3134 | + # Clunky, but at least it works |
3135 | + oldfilename = m.group("srcpath") |
3136 | + oldspec = m.group("srcrevspec") |
3137 | + newfilename = m.group("dstpath") |
3138 | + newspec = m.group("dstrevspec") |
3139 | + |
3140 | + self.write_file(oldfilename, oldspec, tmp_diff_from_filename) |
3141 | + dl = self.diff_files(tmp_diff_from_filename, empty_filename, |
3142 | + oldfilename, "rev:revid:-1", oldspec, |
3143 | + changetype) |
3144 | + diff_lines += dl |
3145 | + |
3146 | + self.write_file(newfilename, newspec, tmp_diff_to_filename) |
3147 | + dl = self.diff_files(empty_filename, tmp_diff_to_filename, |
3148 | + newfilename, newspec, "rev:revid:-1", |
3149 | + changetype) |
3150 | + diff_lines += dl |
3151 | + else: |
3152 | + newrevspec = m.group("revspec") |
3153 | + parentrevspec = m.group("parentrevspec") |
3154 | + |
3155 | + debug("Type %s File %s Old %s New %s" % (changetype, |
3156 | + filename, |
3157 | + parentrevspec, |
3158 | + newrevspec)) |
3159 | + |
3160 | + old_file = new_file = empty_filename |
3161 | + |
3162 | + if (changetype in ['A'] or |
3163 | + (changetype in ['C', 'I'] and |
3164 | + parentrevspec == "rev:revid:-1")): |
3165 | + # File was Added, or a Change or Merge (type I) and there |
3166 | + # is no parent revision |
3167 | + self.write_file(filename, newrevspec, tmp_diff_to_filename) |
3168 | + new_file = tmp_diff_to_filename |
3169 | + elif changetype in ['C', 'I']: |
3170 | + # File was Changed or Merged (type I) |
3171 | + self.write_file(filename, parentrevspec, |
3172 | + tmp_diff_from_filename) |
3173 | + old_file = tmp_diff_from_filename |
3174 | + self.write_file(filename, newrevspec, tmp_diff_to_filename) |
3175 | + new_file = tmp_diff_to_filename |
3176 | + elif changetype in ['R']: |
3177 | + # File was Removed |
3178 | + self.write_file(filename, parentrevspec, |
3179 | + tmp_diff_from_filename) |
3180 | + old_file = tmp_diff_from_filename |
3181 | + else: |
3182 | + die("Don't know how to handle change type '%s' for %s" % |
3183 | + (changetype, filename)) |
3184 | + |
3185 | + dl = self.diff_files(old_file, new_file, filename, |
3186 | + newrevspec, parentrevspec, changetype) |
3187 | + diff_lines += dl |
3188 | + |
3189 | + os.unlink(empty_filename) |
3190 | + os.unlink(tmp_diff_from_filename) |
3191 | + os.unlink(tmp_diff_to_filename) |
3192 | + |
3193 | + return ''.join(diff_lines) |
3194 | + |
3195 | + def branch_diff(self, args): |
3196 | + debug("branch diff: %s" % (args)) |
3197 | + |
3198 | + if len(args) > 0: |
3199 | + branch = args[0] |
3200 | + else: |
3201 | + branch = args |
3202 | + |
3203 | + if not branch.startswith("br:"): |
3204 | + return None |
3205 | + |
3206 | + if not options.branch: |
3207 | + options.branch = branch |
3208 | + |
3209 | + files = execute(["cm", "fbc", branch, "--format={3} {4}"], |
3210 | + split_lines = True) |
3211 | + debug("got files: %s" % (files)) |
3212 | + |
3213 | + diff_lines = [] |
3214 | + |
3215 | + empty_filename = make_tempfile() |
3216 | + tmp_diff_from_filename = make_tempfile() |
3217 | + tmp_diff_to_filename = make_tempfile() |
3218 | + |
3219 | + for f in files: |
3220 | + f = f.strip() |
3221 | + |
3222 | + if not f: |
3223 | + continue |
3224 | + |
3225 | + m = re.search(r'^(?P<branch>.*)#(?P<revno>\d+) (?P<file>.*)$', f) |
3226 | + |
3227 | + if not m: |
3228 | + die("Could not parse 'cm fbc' response: %s" % f) |
3229 | + |
3230 | + filename = m.group("file") |
3231 | + branch = m.group("branch") |
3232 | + revno = m.group("revno") |
3233 | + |
3234 | + # Get the base revision with a cm find |
3235 | + basefiles = execute(["cm", "find", "revs", "where", |
3236 | + "item='" + filename + "'", "and", |
3237 | + "branch='" + branch + "'", "and", |
3238 | + "revno=" + revno, |
3239 | + "--format={item} rev:revid:{id} " |
3240 | + "rev:revid:{parent}", "--nototal"], |
3241 | + split_lines = True) |
3242 | + |
3243 | + # We only care about the first line |
3244 | + m = re.search(r'^(?P<filename>.*) ' |
3245 | + r'(?P<revspec>rev:revid:[-\d]+) ' |
3246 | + r'(?P<parentrevspec>rev:revid:[-\d]+)$', |
3247 | + basefiles[0]) |
3248 | + basefilename = m.group("filename") |
3249 | + newrevspec = m.group("revspec") |
3250 | + parentrevspec = m.group("parentrevspec") |
3251 | + |
3252 | + # Cope with adds/removes |
3253 | + changetype = "C" |
3254 | + |
3255 | + if parentrevspec == "rev:revid:-1": |
3256 | + changetype = "A" |
3257 | + elif newrevspec == "rev:revid:-1": |
3258 | + changetype = "R" |
3259 | + |
3260 | + debug("Type %s File %s Old %s New %s" % (changetype, |
3261 | + basefilename, |
3262 | + parentrevspec, |
3263 | + newrevspec)) |
3264 | + |
3265 | + old_file = new_file = empty_filename |
3266 | + |
3267 | + if changetype == "A": |
3268 | + # File Added |
3269 | + self.write_file(basefilename, newrevspec, |
3270 | + tmp_diff_to_filename) |
3271 | + new_file = tmp_diff_to_filename |
3272 | + elif changetype == "R": |
3273 | + # File Removed |
3274 | + self.write_file(basefilename, parentrevspec, |
3275 | + tmp_diff_from_filename) |
3276 | + old_file = tmp_diff_from_filename |
3277 | + else: |
3278 | + self.write_file(basefilename, parentrevspec, |
3279 | + tmp_diff_from_filename) |
3280 | + old_file = tmp_diff_from_filename |
3281 | + |
3282 | + self.write_file(basefilename, newrevspec, |
3283 | + tmp_diff_to_filename) |
3284 | + new_file = tmp_diff_to_filename |
3285 | + |
3286 | + dl = self.diff_files(old_file, new_file, basefilename, |
3287 | + newrevspec, parentrevspec, changetype) |
3288 | + diff_lines += dl |
3289 | + |
3290 | + os.unlink(empty_filename) |
3291 | + os.unlink(tmp_diff_from_filename) |
3292 | + os.unlink(tmp_diff_to_filename) |
3293 | + |
3294 | + return ''.join(diff_lines) |
3295 | + |
3296 | + def diff_files(self, old_file, new_file, filename, newrevspec, |
3297 | + parentrevspec, changetype, ignore_unmodified=False): |
3298 | + """ |
3299 | + Do the work of producing a diff for Plastic (based on the Perforce one) |
3300 | + |
3301 | + old_file - The absolute path to the "old" file. |
3302 | + new_file - The absolute path to the "new" file. |
3303 | + filename - The file in the Plastic workspace |
3304 | + newrevspec - The revid spec of the changed file |
3305 | +        parentrevspec - The revision spec of the "old" file |
3306 | + changetype - The change type as a single character string |
3307 | + ignore_unmodified - If true, will return an empty list if the file |
3308 | + is not changed. |
3309 | + |
3310 | + Returns a list of strings of diff lines. |
3311 | + """ |
3312 | + if filename.startswith(self.workspacedir): |
3313 | + filename = filename[len(self.workspacedir):] |
3314 | + |
3315 | + diff_cmd = ["diff", "-urN", old_file, new_file] |
3316 | + # Diff returns "1" if differences were found. |
3317 | + dl = execute(diff_cmd, extra_ignore_errors=(1,2), |
3318 | + translate_newlines = False) |
3319 | + |
3320 | + # If the input file has ^M characters at end of line, let's ignore them. |
3321 | + dl = dl.replace('\r\r\n', '\r\n') |
3322 | + dl = dl.splitlines(True) |
3323 | + |
3324 | + # Special handling for the output of the diff tool on binary files: |
3325 | + # diff outputs "Files a and b differ" |
3326 | + # and the code below expects the output to start with |
3327 | + # "Binary files " |
3328 | + if (len(dl) == 1 and |
3329 | + dl[0].startswith('Files %s and %s differ' % (old_file, new_file))): |
3330 | + dl = ['Binary files %s and %s differ\n' % (old_file, new_file)] |
3331 | + |
3332 | + if dl == [] or dl[0].startswith("Binary files "): |
3333 | + if dl == []: |
3334 | + if ignore_unmodified: |
3335 | + return [] |
3336 | + else: |
3337 | + print "Warning: %s in your changeset is unmodified" % \ |
3338 | + filename |
3339 | + |
3340 | + dl.insert(0, "==== %s (%s) ==%s==\n" % (filename, newrevspec, |
3341 | + changetype)) |
3342 | + dl.append('\n') |
3343 | + else: |
3344 | + dl[0] = "--- %s\t%s\n" % (filename, parentrevspec) |
3345 | + dl[1] = "+++ %s\t%s\n" % (filename, newrevspec) |
3346 | + |
3347 | + # Not everybody has files that end in a newline. This ensures |
3348 | + # that the resulting diff file isn't broken. |
3349 | + if dl[-1][-1] != '\n': |
3350 | + dl.append('\n') |
3351 | + |
3352 | + return dl |
3353 | + |
3354 | + def write_file(self, filename, filespec, tmpfile): |
3355 | + """ Grabs a file from Plastic and writes it to a temp file """ |
3356 | + debug("Writing '%s' (rev %s) to '%s'" % (filename, filespec, tmpfile)) |
3357 | + execute(["cm", "cat", filespec, "--file=" + tmpfile]) |
3358 | |
3359 | |
3360 | SCMCLIENTS = ( |
3361 | @@ -2331,6 +3427,7 @@ |
3362 | MercurialClient(), |
3363 | PerforceClient(), |
3364 | ClearCaseClient(), |
3365 | + PlasticClient(), |
3366 | ) |
3367 | |
3368 | def debug(s): |
3369 | @@ -2341,12 +3438,14 @@ |
3370 | print ">>> %s" % s |
3371 | |
3372 | |
3373 | -def make_tempfile(): |
3374 | +def make_tempfile(content=None): |
3375 | """ |
3376 | Creates a temporary file and returns the path. The path is stored |
3377 | in an array for later cleanup. |
3378 | """ |
3379 | fd, tmpfile = mkstemp() |
3380 | + if content: |
3381 | + os.write(fd, content) |
3382 | os.close(fd) |
3383 | tempfiles.append(tmpfile) |
3384 | return tmpfile |
3385 | @@ -2360,10 +3459,10 @@ |
3386 | instance, 'svn help' or 'git --version'). |
3387 | """ |
3388 | try: |
3389 | - p = subprocess.Popen(command.split(' '), |
3390 | - stdin=subprocess.PIPE, |
3391 | - stdout=subprocess.PIPE, |
3392 | - stderr=subprocess.PIPE) |
3393 | + subprocess.Popen(command.split(' '), |
3394 | + stdin=subprocess.PIPE, |
3395 | + stdout=subprocess.PIPE, |
3396 | + stderr=subprocess.PIPE) |
3397 | return True |
3398 | except OSError: |
3399 | return False |
3400 | @@ -2395,7 +3494,7 @@ |
3401 | |
3402 | |
3403 | def execute(command, env=None, split_lines=False, ignore_errors=False, |
3404 | - extra_ignore_errors=(), translate_newlines=True): |
3405 | + extra_ignore_errors=(), translate_newlines=True, with_errors=True): |
3406 | """ |
3407 | Utility function to execute a command and return the output. |
3408 | """ |
3409 | @@ -2412,11 +3511,16 @@ |
3410 | env['LC_ALL'] = 'en_US.UTF-8' |
3411 | env['LANGUAGE'] = 'en_US.UTF-8' |
3412 | |
3413 | + if with_errors: |
3414 | + errors_output = subprocess.STDOUT |
3415 | + else: |
3416 | + errors_output = subprocess.PIPE |
3417 | + |
3418 | if sys.platform.startswith('win'): |
3419 | p = subprocess.Popen(command, |
3420 | stdin=subprocess.PIPE, |
3421 | stdout=subprocess.PIPE, |
3422 | - stderr=subprocess.STDOUT, |
3423 | + stderr=errors_output, |
3424 | shell=False, |
3425 | universal_newlines=translate_newlines, |
3426 | env=env) |
3427 | @@ -2424,7 +3528,7 @@ |
3428 | p = subprocess.Popen(command, |
3429 | stdin=subprocess.PIPE, |
3430 | stdout=subprocess.PIPE, |
3431 | - stderr=subprocess.STDOUT, |
3432 | + stderr=errors_output, |
3433 | shell=False, |
3434 | close_fds=True, |
3435 | universal_newlines=translate_newlines, |
3436 | @@ -2466,22 +3570,33 @@ |
3437 | path = os.path.dirname(path) |
3438 | |
3439 | |
3440 | -def load_config_file(filename): |
3441 | - """ |
3442 | - Loads data from a config file. |
3443 | - """ |
3444 | - config = { |
3445 | - 'TREES': {}, |
3446 | - } |
3447 | - |
3448 | - if os.path.exists(filename): |
3449 | - try: |
3450 | - execfile(filename, config) |
3451 | - except SyntaxError, e: |
3452 | - die('Syntax error in config file: %s\n' |
3453 | - 'Line %i offset %i\n' % (filename, e.lineno, e.offset)) |
3454 | - |
3455 | - return config |
3456 | +def load_config_files(homepath): |
3457 | + """Loads data from .reviewboardrc files""" |
3458 | + def _load_config(path): |
3459 | + config = { |
3460 | + 'TREES': {}, |
3461 | + } |
3462 | + |
3463 | + filename = os.path.join(path, '.reviewboardrc') |
3464 | + |
3465 | + if os.path.exists(filename): |
3466 | + try: |
3467 | + execfile(filename, config) |
3468 | + except SyntaxError, e: |
3469 | + die('Syntax error in config file: %s\n' |
3470 | + 'Line %i offset %i\n' % (filename, e.lineno, e.offset)) |
3471 | + |
3472 | + return config |
3473 | + |
3474 | + return None |
3475 | + |
3476 | + for path in walk_parents(os.getcwd()): |
3477 | + config = _load_config(path) |
3478 | + |
3479 | + if config: |
3480 | + configs.append(config) |
3481 | + |
3482 | + globals()['user_config'] = _load_config(homepath) |
3483 | |
3484 | |
3485 | def tempt_fate(server, tool, changenum, diff_content=None, |
3486 | @@ -2527,6 +3642,10 @@ |
3487 | if options.testing_done: |
3488 | server.set_review_request_field(review_request, 'testing_done', |
3489 | options.testing_done) |
3490 | + |
3491 | + if options.change_description: |
3492 | + server.set_review_request_field(review_request, 'changedescription', |
3493 | + options.change_description) |
3494 | except APIError, e: |
3495 | if e.error_code == 103: # Not logged in |
3496 | retries = retries - 1 |
3497 | @@ -2569,10 +3688,13 @@ |
3498 | die("Your review request still exists, but the diff is not " + |
3499 | "attached.") |
3500 | |
3501 | + if options.reopen: |
3502 | + server.reopen(review_request) |
3503 | + |
3504 | if options.publish: |
3505 | server.publish(review_request) |
3506 | |
3507 | - request_url = 'r/' + str(review_request['id']) |
3508 | + request_url = 'r/' + str(review_request['id']) + '/' |
3509 | review_url = urljoin(server.url, request_url) |
3510 | |
3511 | if not review_url.startswith('http'): |
3512 | @@ -2614,6 +3736,10 @@ |
3513 | dest="diff_only", action="store_true", default=False, |
3514 | help="uploads a new diff, but does not update " |
3515 | "info from changelist") |
3516 | + parser.add_option("--reopen", |
3517 | + dest="reopen", action="store_true", default=False, |
3518 | + help="reopen discarded review request " |
3519 | + "after update") |
3520 | parser.add_option("--target-groups", |
3521 | dest="target_groups", default=TARGET_GROUPS, |
3522 | help="names of the groups who will perform " |
3523 | @@ -2635,12 +3761,12 @@ |
3524 | dest="guess_summary", action="store_true", |
3525 | default=False, |
3526 | help="guess summary from the latest commit (git/" |
3527 | - "hgsubversion only)") |
3528 | + "hg/hgsubversion only)") |
3529 | parser.add_option("--guess-description", |
3530 | dest="guess_description", action="store_true", |
3531 | default=False, |
3532 | help="guess description based on commits on this branch " |
3533 | - "(git/hgsubversion only)") |
3534 | + "(git/hg/hgsubversion only)") |
3535 | parser.add_option("--testing-done", |
3536 | dest="testing_done", default=None, |
3537 | help="details of testing done ") |
3538 | @@ -2653,13 +3779,13 @@ |
3539 | parser.add_option("--bugs-closed", |
3540 | dest="bugs_closed", default=None, |
3541 | help="list of bugs closed ") |
3542 | + parser.add_option("--change-description", default=None, |
3543 | + help="description of what changed in this revision of " |
3544 | + "the review request when updating an existing request") |
3545 | parser.add_option("--revision-range", |
3546 | dest="revision_range", default=None, |
3547 | help="generate the diff for review based on given " |
3548 | "revision range") |
3549 | - parser.add_option("--label", |
3550 | - dest="label", default=None, |
3551 | - help="label (ClearCase Only) ") |
3552 | parser.add_option("--submit-as", |
3553 | dest="submit_as", default=SUBMIT_AS, metavar="USERNAME", |
3554 | help="user name to be recorded as the author of the " |
3555 | @@ -2693,12 +3819,21 @@ |
3556 | parser.add_option("--p4-port", |
3557 | dest="p4_port", default=None, |
3558 | help="the Perforce servers IP address that the review is on") |
3559 | + parser.add_option("--p4-passwd", |
3560 | + dest="p4_passwd", default=None, |
3561 | + help="the Perforce password or ticket of the user in the P4USER environment variable") |
3562 | + parser.add_option('--svn-changelist', dest='svn_changelist', default=None, |
3563 | + help='generate the diff for review based on a local SVN ' |
3564 | + 'changelist') |
3565 | parser.add_option("--repository-url", |
3566 | dest="repository_url", default=None, |
3567 | help="the url for a repository for creating a diff " |
3568 | - "outside of a working copy (currently only supported " |
3569 | - "by Subversion). Requires either --revision-range" |
3570 | - "or --diff-filename options") |
3571 | + "outside of a working copy (currently only " |
3572 | + "supported by Subversion with --revision-range or " |
3573 | + "--diff-filename and ClearCase with relative " |
3574 | + "paths outside the view). For git, this specifies " |
3575 | + "the origin url of the current repository, " |
3576 | + "overriding the origin url supplied by the git client.") |
3577 | parser.add_option("-d", "--debug", |
3578 | action="store_true", dest="debug", default=DEBUG, |
3579 | help="display debug output") |
3580 | @@ -2706,6 +3841,12 @@ |
3581 | dest="diff_filename", default=None, |
3582 | help='upload an existing diff file, instead of ' |
3583 | 'generating a new diff') |
3584 | + parser.add_option('--http-username', |
3585 | + dest='http_username', default=None, metavar='USERNAME', |
3586 | + help='username for HTTP Basic authentication') |
3587 | + parser.add_option('--http-password', |
3588 | + dest='http_password', default=None, metavar='PASSWORD', |
3589 | + help='password for HTTP Basic authentication') |
3590 | |
3591 | (globals()["options"], args) = parser.parse_args(args) |
3592 | |
3593 | @@ -2739,12 +3880,14 @@ |
3594 | options.testing_file) |
3595 | sys.exit(1) |
3596 | |
3597 | - if (options.repository_url and |
3598 | - not options.revision_range and |
3599 | - not options.diff_filename): |
3600 | - sys.stderr.write("The --repository-url option requires either the " |
3601 | - "--revision-range option or the --diff-filename " |
3602 | - "option.\n") |
3603 | + if options.reopen and not options.rid: |
3604 | + sys.stderr.write("The --reopen option requires " |
3605 | + "the --review-request-id option.\n") |
3606 | + sys.exit(1) |
3607 | + |
3608 | + if options.change_description and not options.rid: |
3609 | + sys.stderr.write("--change-description may only be used " |
3610 | + "when updating an existing review-request\n") |
3611 | sys.exit(1) |
3612 | |
3613 | return args |
3614 | @@ -2800,14 +3943,19 @@ |
3615 | homepath = '' |
3616 | |
3617 | # Load the config and cookie files |
3618 | - globals()['user_config'] = \ |
3619 | - load_config_file(os.path.join(homepath, ".reviewboardrc")) |
3620 | cookie_file = os.path.join(homepath, ".post-review-cookies.txt") |
3621 | + load_config_files(homepath) |
3622 | |
3623 | args = parse_options(sys.argv[1:]) |
3624 | |
3625 | + debug('RBTools %s' % get_version_string()) |
3626 | + debug('Home = %s' % homepath) |
3627 | + |
3628 | repository_info, tool = determine_client() |
3629 | |
3630 | + # Verify that options specific to an SCM Client have not been mis-used. |
3631 | + tool.check_options() |
3632 | + |
3633 | # Try to find a valid Review Board server to use. |
3634 | if options.server: |
3635 | server_url = options.server |
3636 | @@ -2820,17 +3968,20 @@ |
3637 | |
3638 | server = ReviewBoardServer(server_url, repository_info, cookie_file) |
3639 | |
3640 | + # Handle the case where /api/ requires authorization (RBCommons). |
3641 | + if not server.check_api_version(): |
3642 | + die("Unable to log in with the supplied username and password.") |
3643 | + |
3644 | if repository_info.supports_changesets: |
3645 | changenum = tool.get_changenum(args) |
3646 | else: |
3647 | changenum = None |
3648 | |
3649 | if options.revision_range: |
3650 | - diff = tool.diff_between_revisions(options.revision_range, args, |
3651 | - repository_info) |
3652 | - parent_diff = None |
3653 | - elif options.label and isinstance(tool, ClearCaseClient): |
3654 | - diff, parent_diff = tool.diff_label(options.label) |
3655 | + diff, parent_diff = tool.diff_between_revisions(options.revision_range, args, |
3656 | + repository_info) |
3657 | + elif options.svn_changelist: |
3658 | + diff, parent_diff = tool.diff_changelist(options.svn_changelist) |
3659 | elif options.diff_filename: |
3660 | parent_diff = None |
3661 | |
3662 | @@ -2846,11 +3997,24 @@ |
3663 | else: |
3664 | diff, parent_diff = tool.diff(args) |
3665 | |
3666 | - if isinstance(tool, PerforceClient) and changenum is not None: |
3667 | + if len(diff) == 0: |
3668 | + die("There don't seem to be any diffs!") |
3669 | + |
3670 | + if (isinstance(tool, PerforceClient) or |
3671 | + isinstance(tool, PlasticClient)) and changenum is not None: |
3672 | changenum = tool.sanitize_changenum(changenum) |
3673 | |
3674 | + # NOTE: In Review Board 1.5.2 through 1.5.3.1, the changenum support |
3675 | + # is broken, so we have to force the deprecated API. |
3676 | + if (parse_version(server.rb_version) >= parse_version('1.5.2') and |
3677 | + parse_version(server.rb_version) <= parse_version('1.5.3.1')): |
3678 | + debug('Using changenums on Review Board %s, which is broken. ' |
3679 | + 'Falling back to the deprecated 1.0 API' % server.rb_version) |
3680 | + server.deprecated_api = True |
3681 | + |
3682 | if options.output_diff_only: |
3683 | - print diff |
3684 | + # The comma here isn't a typo, but rather suppresses the extra newline |
3685 | + print diff, |
3686 | sys.exit(0) |
3687 | |
3688 | # Let's begin. |
3689 | |
3690 | === modified file 'rbtools/tests.py' |
3691 | --- rbtools/tests.py 2010-07-31 18:31:05 +0000 |
3692 | +++ rbtools/tests.py 2011-11-06 07:43:24 +0000 |
3693 | @@ -1,10 +1,13 @@ |
3694 | -import nose |
3695 | import os |
3696 | +import re |
3697 | import shutil |
3698 | import sys |
3699 | import tempfile |
3700 | +import time |
3701 | import unittest |
3702 | import urllib2 |
3703 | +from random import randint |
3704 | +from textwrap import dedent |
3705 | |
3706 | try: |
3707 | from cStringIO import StringIO |
3708 | @@ -16,66 +19,17 @@ |
3709 | except ImportError: |
3710 | import simplejson as json |
3711 | |
3712 | -from rbtools.postreview import execute, load_config_file |
3713 | -from rbtools.postreview import APIError, GitClient, RepositoryInfo, \ |
3714 | - ReviewBoardServer |
3715 | +import nose |
3716 | + |
3717 | +from rbtools.postreview import execute, load_config_files |
3718 | +from rbtools.postreview import APIError, GitClient, MercurialClient, \ |
3719 | + RepositoryInfo, ReviewBoardServer, \ |
3720 | + SvnRepositoryInfo |
3721 | import rbtools.postreview |
3722 | |
3723 | |
3724 | -FOO = """\ |
3725 | -ARMA virumque cano, Troiae qui primus ab oris |
3726 | -Italiam, fato profugus, Laviniaque venit |
3727 | -litora, multum ille et terris iactatus et alto |
3728 | -vi superum saevae memorem Iunonis ob iram; |
3729 | -multa quoque et bello passus, dum conderet urbem, |
3730 | -inferretque deos Latio, genus unde Latinum, |
3731 | -Albanique patres, atque altae moenia Romae. |
3732 | -Musa, mihi causas memora, quo numine laeso, |
3733 | -quidve dolens, regina deum tot volvere casus |
3734 | -insignem pietate virum, tot adire labores |
3735 | -impulerit. Tantaene animis caelestibus irae? |
3736 | - |
3737 | -""" |
3738 | - |
3739 | -FOO1 = """\ |
3740 | -ARMA virumque cano, Troiae qui primus ab oris |
3741 | -Italiam, fato profugus, Laviniaque venit |
3742 | -litora, multum ille et terris iactatus et alto |
3743 | -vi superum saevae memorem Iunonis ob iram; |
3744 | -multa quoque et bello passus, dum conderet urbem, |
3745 | -inferretque deos Latio, genus unde Latinum, |
3746 | -Albanique patres, atque altae moenia Romae. |
3747 | -Musa, mihi causas memora, quo numine laeso, |
3748 | - |
3749 | -""" |
3750 | - |
3751 | -FOO2 = """\ |
3752 | -ARMA virumque cano, Troiae qui primus ab oris |
3753 | -ARMA virumque cano, Troiae qui primus ab oris |
3754 | -ARMA virumque cano, Troiae qui primus ab oris |
3755 | -Italiam, fato profugus, Laviniaque venit |
3756 | -litora, multum ille et terris iactatus et alto |
3757 | -vi superum saevae memorem Iunonis ob iram; |
3758 | -multa quoque et bello passus, dum conderet urbem, |
3759 | -inferretque deos Latio, genus unde Latinum, |
3760 | -Albanique patres, atque altae moenia Romae. |
3761 | -Musa, mihi causas memora, quo numine laeso, |
3762 | - |
3763 | -""" |
3764 | - |
3765 | -FOO3 = """\ |
3766 | -ARMA virumque cano, Troiae qui primus ab oris |
3767 | -ARMA virumque cano, Troiae qui primus ab oris |
3768 | -Italiam, fato profugus, Laviniaque venit |
3769 | -litora, multum ille et terris iactatus et alto |
3770 | -vi superum saevae memorem Iunonis ob iram; |
3771 | -dum conderet urbem, |
3772 | -inferretque deos Latio, genus unde Latinum, |
3773 | -Albanique patres, atque altae moenia Romae. |
3774 | -Albanique patres, atque altae moenia Romae. |
3775 | -Musa, mihi causas memora, quo numine laeso, |
3776 | - |
3777 | -""" |
3778 | +TEMPDIR_SUFFIX = '__' + __name__.replace('.', '_') |
3779 | + |
3780 | |
3781 | def is_exe_in_path(name): |
3782 | """Checks whether an executable is in the user's search path. |
3783 | @@ -99,9 +53,17 @@ |
3784 | return False |
3785 | |
3786 | |
3787 | +def _get_tmpdir(): |
3788 | + return tempfile.mkdtemp(TEMPDIR_SUFFIX) |
3789 | + |
3790 | + |
3791 | class MockHttpUnitTest(unittest.TestCase): |
3792 | + deprecated_api = False |
3793 | + |
3794 | def setUp(self): |
3795 | # Save the old http_get and http_post |
3796 | + rbtools.postreview.options = OptionsStub() |
3797 | + |
3798 | self.saved_http_get = ReviewBoardServer.http_get |
3799 | self.saved_http_post = ReviewBoardServer.http_post |
3800 | |
3801 | @@ -110,19 +72,23 @@ |
3802 | ReviewBoardServer.http_get = self._http_method |
3803 | ReviewBoardServer.http_post = self._http_method |
3804 | |
3805 | - self.http_response = "" |
3806 | - |
3807 | - rbtools.postreview.options = OptionsStub() |
3808 | + self.server.deprecated_api = self.deprecated_api |
3809 | + self.http_response = {} |
3810 | |
3811 | def tearDown(self): |
3812 | ReviewBoardServer.http_get = self.saved_http_get |
3813 | ReviewBoardServer.http_post = self.saved_http_post |
3814 | |
3815 | - def _http_method(self, *args, **kwargs): |
3816 | - if isinstance(self.http_response, Exception): |
3817 | - raise self.http_response |
3818 | - else: |
3819 | - return self.http_response |
3820 | + def _http_method(self, path, *args, **kwargs): |
3821 | + if isinstance(self.http_response, dict): |
3822 | + http_response = self.http_response[path] |
3823 | + else: |
3824 | + http_response = self.http_response |
3825 | + |
3826 | + if isinstance(http_response, Exception): |
3827 | + raise http_response |
3828 | + else: |
3829 | + return http_response |
3830 | |
3831 | |
3832 | class OptionsStub(object): |
3833 | @@ -131,6 +97,9 @@ |
3834 | self.guess_summary = False |
3835 | self.guess_description = False |
3836 | self.tracking = None |
3837 | + self.username = None |
3838 | + self.password = None |
3839 | + self.repository_url = None |
3840 | |
3841 | |
3842 | class GitClientTests(unittest.TestCase): |
3843 | @@ -165,7 +134,7 @@ |
3844 | |
3845 | self.orig_dir = os.getcwd() |
3846 | |
3847 | - self.git_dir = tempfile.mkdtemp() |
3848 | + self.git_dir = _get_tmpdir() |
3849 | os.chdir(self.git_dir) |
3850 | self._gitcmd(['init'], git_dir=self.git_dir) |
3851 | foo = open(os.path.join(self.git_dir, 'foo.txt'), 'w') |
3852 | @@ -175,13 +144,14 @@ |
3853 | self._gitcmd(['add', 'foo.txt']) |
3854 | self._gitcmd(['commit', '-m', 'initial commit']) |
3855 | |
3856 | - self.clone_dir = tempfile.mkdtemp() |
3857 | + self.clone_dir = _get_tmpdir() |
3858 | os.rmdir(self.clone_dir) |
3859 | self._gitcmd(['clone', self.git_dir, self.clone_dir]) |
3860 | self.client = GitClient() |
3861 | os.chdir(self.orig_dir) |
3862 | |
3863 | - rbtools.postreview.user_config = load_config_file('') |
3864 | + rbtools.postreview.user_config = {} |
3865 | + rbtools.postreview.configs = [] |
3866 | rbtools.postreview.options = OptionsStub() |
3867 | rbtools.postreview.options.parent_branch = None |
3868 | |
3869 | @@ -214,13 +184,14 @@ |
3870 | rc = open(os.path.join(self.clone_dir, '.reviewboardrc'), 'w') |
3871 | rc.write('REVIEWBOARD_URL = "%s"' % self.TESTSERVER) |
3872 | rc.close() |
3873 | + rbtools.postreview.user_config = load_config_files(self.clone_dir) |
3874 | |
3875 | ri = self.client.get_repository_info() |
3876 | server = self.client.scan_for_server(ri) |
3877 | self.assertEqual(server, self.TESTSERVER) |
3878 | |
3879 | def test_scan_for_server_property(self): |
3880 | - """Test GitClientscan_for_server using repo property""" |
3881 | + """Test GitClient scan_for_server using repo property""" |
3882 | os.chdir(self.clone_dir) |
3883 | self._gitcmd(['config', 'reviewboard.url', self.TESTSERVER]) |
3884 | ri = self.client.get_repository_info() |
3885 | @@ -243,7 +214,7 @@ |
3886 | " \n" |
3887 | |
3888 | os.chdir(self.clone_dir) |
3889 | - ri = self.client.get_repository_info() |
3890 | + self.client.get_repository_info() |
3891 | |
3892 | self._git_add_file_commit('foo.txt', FOO1, 'delete and modify stuff') |
3893 | |
3894 | @@ -273,7 +244,7 @@ |
3895 | " \n" |
3896 | |
3897 | os.chdir(self.clone_dir) |
3898 | - ri = self.client.get_repository_info() |
3899 | + self.client.get_repository_info() |
3900 | |
3901 | self._git_add_file_commit('foo.txt', FOO1, 'commit 1') |
3902 | self._git_add_file_commit('foo.txt', FOO2, 'commit 1') |
3903 | @@ -323,11 +294,11 @@ |
3904 | self._gitcmd(['checkout', '-b', 'mybranch', '--track', 'origin/master']) |
3905 | self._git_add_file_commit('foo.txt', FOO2, 'commit 2') |
3906 | |
3907 | - ri = self.client.get_repository_info() |
3908 | + self.client.get_repository_info() |
3909 | self.assertEqual(self.client.diff(None), (diff1, None)) |
3910 | |
3911 | self._gitcmd(['checkout', 'master']) |
3912 | - ri = self.client.get_repository_info() |
3913 | + self.client.get_repository_info() |
3914 | self.assertEqual(self.client.diff(None), (diff2, None)) |
3915 | |
3916 | def test_diff_tracking_no_origin(self): |
3917 | @@ -352,7 +323,7 @@ |
3918 | self._gitcmd(['checkout', '-b', 'mybranch', '--track', 'quux/master']) |
3919 | self._git_add_file_commit('foo.txt', FOO1, 'delete and modify stuff') |
3920 | |
3921 | - ri = self.client.get_repository_info() |
3922 | + self.client.get_repository_info() |
3923 | |
3924 | self.assertEqual(self.client.diff(None), (diff, None)) |
3925 | |
3926 | @@ -385,7 +356,7 @@ |
3927 | self._gitcmd(['checkout', '-b', 'mybranch', '--track', 'master']) |
3928 | self._git_add_file_commit('foo.txt', FOO2, 'commit 2') |
3929 | |
3930 | - ri = self.client.get_repository_info() |
3931 | + self.client.get_repository_info() |
3932 | self.assertEqual(self.client.diff(None), (diff, None)) |
3933 | |
3934 | def test_diff_tracking_override(self): |
3935 | @@ -412,11 +383,478 @@ |
3936 | |
3937 | self._git_add_file_commit('foo.txt', FOO1, 'commit 1') |
3938 | |
3939 | - ri = self.client.get_repository_info() |
3940 | - self.assertEqual(self.client.diff(None), (diff, None)) |
3941 | + self.client.get_repository_info() |
3942 | + self.assertEqual(self.client.diff(None), (diff, None)) |
3943 | + |
3944 | + def test_diff_slash_tracking(self): |
3945 | + """Test GitClient diff with tracking branch that has slash in its name""" |
3946 | + diff = "diff --git a/foo.txt b/foo.txt\n" \ |
3947 | + "index 5e98e9540e1b741b5be24fcb33c40c1c8069c1fb..e619c1387f5feb91f0ca83194650bfe4f6c2e347 100644\n" \ |
3948 | + "--- a/foo.txt\n" \ |
3949 | + "+++ b/foo.txt\n" \ |
3950 | + "@@ -1,4 +1,6 @@\n" \ |
3951 | + " ARMA virumque cano, Troiae qui primus ab oris\n" \ |
3952 | + "+ARMA virumque cano, Troiae qui primus ab oris\n" \ |
3953 | + "+ARMA virumque cano, Troiae qui primus ab oris\n" \ |
3954 | + " Italiam, fato profugus, Laviniaque venit\n" \ |
3955 | + " litora, multum ille et terris iactatus et alto\n" \ |
3956 | + " vi superum saevae memorem Iunonis ob iram;\n" |
3957 | + |
3958 | + os.chdir(self.git_dir) |
3959 | + self._gitcmd(['checkout', '-b', 'not-master']) |
3960 | + self._git_add_file_commit('foo.txt', FOO1, 'commit 1') |
3961 | + |
3962 | + os.chdir(self.clone_dir) |
3963 | + self._gitcmd(['fetch', 'origin']) |
3964 | + self._gitcmd(['checkout', '-b', 'my/branch', '--track', 'origin/not-master']) |
3965 | + self._git_add_file_commit('foo.txt', FOO2, 'commit 2') |
3966 | + |
3967 | + self.client.get_repository_info() |
3968 | + self.assertEqual(self.client.diff(None), (diff, None)) |
3969 | + |
3970 | + |
3971 | +class MercurialTestBase(unittest.TestCase): |
3972 | + |
3973 | + def setUp(self): |
3974 | + self._hg_env = {} |
3975 | + |
3976 | + def _hgcmd(self, command, split_lines=False, |
3977 | + ignore_errors=False, extra_ignore_errors=(), |
3978 | + translate_newlines=True, hg_dir=None): |
3979 | + if hg_dir: |
3980 | + full_command = ['hg', '--cwd', hg_dir] |
3981 | + else: |
3982 | + full_command = ['hg'] |
3983 | + |
3984 | + # We're *not* doing `env = env or {}` here because |
3985 | + # we want the caller to be able to *enable* reading |
3986 | + # of user and system-level hgrc configuration. |
3987 | + env = self._hg_env.copy() |
3988 | + |
3989 | + if not env: |
3990 | + env = { |
3991 | + 'HGRCPATH': os.devnull, |
3992 | + 'HGPLAIN': '1', |
3993 | + } |
3994 | + |
3995 | + full_command.extend(command) |
3996 | + |
3997 | + return execute(full_command, env, split_lines, ignore_errors, |
3998 | + extra_ignore_errors, translate_newlines) |
3999 | + |
4000 | + def _hg_add_file_commit(self, filename, data, msg): |
4001 | + outfile = open(filename, 'w') |
4002 | + outfile.write(data) |
4003 | + outfile.close() |
4004 | + self._hgcmd(['add', filename]) |
4005 | + self._hgcmd(['commit', '-m', msg]) |
4006 | + |
4007 | + |
4008 | +class MercurialClientTests(MercurialTestBase): |
4009 | + TESTSERVER = 'http://127.0.0.1:8080' |
4010 | + CLONE_HGRC = dedent(""" |
4011 | + [paths] |
4012 | + default = %(hg_dir)s |
4013 | + cloned = %(clone_dir)s |
4014 | + |
4015 | + [reviewboard] |
4016 | + url = %(test_server)s |
4017 | + |
4018 | + [diff] |
4019 | + git = true |
4020 | + """).rstrip() |
4021 | + |
4022 | + def setUp(self): |
4023 | + MercurialTestBase.setUp(self) |
4024 | + if not is_exe_in_path('hg'): |
4025 | + raise nose.SkipTest('hg not found in path') |
4026 | + |
4027 | + self.orig_dir = os.getcwd() |
4028 | + |
4029 | + self.hg_dir = _get_tmpdir() |
4030 | + os.chdir(self.hg_dir) |
4031 | + self._hgcmd(['init'], hg_dir=self.hg_dir) |
4032 | + foo = open(os.path.join(self.hg_dir, 'foo.txt'), 'w') |
4033 | + foo.write(FOO) |
4034 | + foo.close() |
4035 | + |
4036 | + self._hgcmd(['add', 'foo.txt']) |
4037 | + self._hgcmd(['commit', '-m', 'initial commit']) |
4038 | + |
4039 | + self.clone_dir = _get_tmpdir() |
4040 | + os.rmdir(self.clone_dir) |
4041 | + self._hgcmd(['clone', self.hg_dir, self.clone_dir]) |
4042 | + os.chdir(self.clone_dir) |
4043 | + self.client = MercurialClient() |
4044 | + |
4045 | + clone_hgrc = open(self.clone_hgrc_path, 'wb') |
4046 | + clone_hgrc.write(self.CLONE_HGRC % { |
4047 | + 'hg_dir': self.hg_dir, |
4048 | + 'clone_dir': self.clone_dir, |
4049 | + 'test_server': self.TESTSERVER, |
4050 | + }) |
4051 | + clone_hgrc.close() |
4052 | + |
4053 | + self.client.get_repository_info() |
4054 | + rbtools.postreview.user_config = {} |
4055 | + rbtools.postreview.options = OptionsStub() |
4056 | + rbtools.postreview.options.parent_branch = None |
4057 | + os.chdir(self.clone_dir) |
4058 | + |
4059 | + @property |
4060 | + def clone_hgrc_path(self): |
4061 | + return os.path.join(self.clone_dir, '.hg', 'hgrc') |
4062 | + |
4063 | + @property |
4064 | + def hgrc_path(self): |
4065 | + return os.path.join(self.hg_dir, '.hg', 'hgrc') |
4066 | + |
4067 | + def tearDown(self): |
4068 | + os.chdir(self.orig_dir) |
4069 | + shutil.rmtree(self.hg_dir) |
4070 | + shutil.rmtree(self.clone_dir) |
4071 | + |
4072 | + def testGetRepositoryInfoSimple(self): |
4073 | + """Test MercurialClient get_repository_info, simple case""" |
4074 | + ri = self.client.get_repository_info() |
4075 | + |
4076 | + self.assertTrue(isinstance(ri, RepositoryInfo)) |
4077 | + self.assertEqual('', ri.base_path) |
4078 | + |
4079 | + hgpath = ri.path |
4080 | + |
4081 | + if os.path.basename(hgpath) == '.hg': |
4082 | + hgpath = os.path.dirname(hgpath) |
4083 | + |
4084 | + self.assertEqual(self.hg_dir, hgpath) |
4085 | + self.assertTrue(ri.supports_parent_diffs) |
4086 | + self.assertFalse(ri.supports_changesets) |
4087 | + |
4088 | + def testScanForServerSimple(self): |
4089 | + """Test MercurialClient scan_for_server, simple case""" |
4090 | + os.rename(self.clone_hgrc_path, |
4091 | + os.path.join(self.clone_dir, '._disabled_hgrc')) |
4092 | + |
4093 | + self.client.hgrc = {} |
4094 | + self.client._load_hgrc() |
4095 | + ri = self.client.get_repository_info() |
4096 | + |
4097 | + server = self.client.scan_for_server(ri) |
4098 | + self.assertTrue(server is None) |
4099 | + |
4100 | + def testScanForServerWhenPresentInHgrc(self): |
4101 | + """Test MercurialClient scan_for_server when present in hgrc""" |
4102 | + ri = self.client.get_repository_info() |
4103 | + |
4104 | + server = self.client.scan_for_server(ri) |
4105 | + self.assertEqual(self.TESTSERVER, server) |
4106 | + |
4107 | + def testScanForServerReviewboardrc(self): |
4108 | + """Test MercurialClient scan_for_server when in .reviewboardrc""" |
4109 | + rc = open(os.path.join(self.clone_dir, '.reviewboardrc'), 'w') |
4110 | + rc.write('REVIEWBOARD_URL = "%s"' % self.TESTSERVER) |
4111 | + rc.close() |
4112 | + |
4113 | + ri = self.client.get_repository_info() |
4114 | + server = self.client.scan_for_server(ri) |
4115 | + self.assertEqual(self.TESTSERVER, server) |
4116 | + |
4117 | + def testDiffSimple(self): |
4118 | + """Test MercurialClient diff, simple case""" |
4119 | + self.client.get_repository_info() |
4120 | + |
4121 | + self._hg_add_file_commit('foo.txt', FOO1, 'delete and modify stuff') |
4122 | + |
4123 | + diff_result = self.client.diff(None) |
4124 | + self.assertEqual((EXPECTED_HG_DIFF_0, None), diff_result) |
4125 | + |
4126 | + def testDiffSimpleMultiple(self): |
4127 | + """Test MercurialClient diff with multiple commits""" |
4128 | + self.client.get_repository_info() |
4129 | + |
4130 | + self._hg_add_file_commit('foo.txt', FOO1, 'commit 1') |
4131 | + self._hg_add_file_commit('foo.txt', FOO2, 'commit 2') |
4132 | + self._hg_add_file_commit('foo.txt', FOO3, 'commit 3') |
4133 | + |
4134 | + diff_result = self.client.diff(None) |
4135 | + |
4136 | + self.assertEqual((EXPECTED_HG_DIFF_1, None), diff_result) |
4137 | + |
4138 | + def testDiffBranchDiverge(self): |
4139 | + """Test MercurialClient diff with diverged branch""" |
4140 | + self._hg_add_file_commit('foo.txt', FOO1, 'commit 1') |
4141 | + |
4142 | + self._hgcmd(['branch', 'diverged']) |
4143 | + self._hg_add_file_commit('foo.txt', FOO2, 'commit 2') |
4144 | + self.client.get_repository_info() |
4145 | + |
4146 | + self.assertEqual((EXPECTED_HG_DIFF_2, None), self.client.diff(None)) |
4147 | + |
4148 | + self._hgcmd(['update', '-C', 'default']) |
4149 | + self.client.get_repository_info() |
4150 | + |
4151 | + self.assertEqual((EXPECTED_HG_DIFF_3, None), self.client.diff(None)) |
4152 | + |
4153 | + |
4154 | +class MercurialSubversionClientTests(MercurialTestBase): |
4155 | + TESTSERVER = "http://127.0.0.1:8080" |
4156 | + |
4157 | + def __init__(self, *args, **kwargs): |
4158 | + self._tmpbase = '' |
4159 | + self.clone_dir = '' |
4160 | + self.svn_repo = '' |
4161 | + self.svn_checkout = '' |
4162 | + self.client = None |
4163 | + self._svnserve_pid = 0 |
4164 | + self._max_svnserve_pid_tries = 12 |
4165 | + self._svnserve_port = os.environ.get('SVNSERVE_PORT') |
4166 | + self._required_exes = ('svnadmin', 'svnserve', 'svn') |
4167 | + MercurialTestBase.__init__(self, *args, **kwargs) |
4168 | + |
4169 | + def setUp(self): |
4170 | + MercurialTestBase.setUp(self) |
4171 | + self._hg_env = {'FOO': 'BAR'} |
4172 | + |
4173 | + for exe in self._required_exes: |
4174 | + if not is_exe_in_path(exe): |
4175 | + raise nose.SkipTest('missing svn stuff! giving up!') |
4176 | + |
4177 | + if not self._has_hgsubversion(): |
4178 | + raise nose.SkipTest('unable to use `hgsubversion` extension! ' |
4179 | + 'giving up!') |
4180 | + |
4181 | + if not self._tmpbase: |
4182 | + self._tmpbase = _get_tmpdir() |
4183 | + |
4184 | + self._create_svn_repo() |
4185 | + self._fire_up_svnserve() |
4186 | + self._fill_in_svn_repo() |
4187 | + |
4188 | + try: |
4189 | + self._get_testing_clone() |
4190 | + except (OSError, IOError): |
4191 | + msg = 'could not clone from svn repo! skipping...' |
4192 | + raise nose.SkipTest(msg), None, sys.exc_info()[2] |
4193 | + |
4194 | + self._spin_up_client() |
4195 | + self._stub_in_config_and_options() |
4196 | + os.chdir(self.clone_dir) |
4197 | + |
4198 | + def _has_hgsubversion(self): |
4199 | + output = self._hgcmd(['svn', '--help'], |
4200 | + ignore_errors=True, extra_ignore_errors=(255)) |
4201 | + |
4202 | + return not re.search("unknown command ['\"]svn['\"]", output, re.I) |
4203 | + |
4204 | + def tearDown(self): |
4205 | + shutil.rmtree(self.clone_dir) |
4206 | + os.kill(self._svnserve_pid, 9) |
4207 | + |
4208 | + if self._tmpbase: |
4209 | + shutil.rmtree(self._tmpbase) |
4210 | + |
4211 | + def _svn_add_file_commit(self, filename, data, msg): |
4212 | + outfile = open(filename, 'w') |
4213 | + outfile.write(data) |
4214 | + outfile.close() |
4215 | + execute(['svn', 'add', filename]) |
4216 | + execute(['svn', 'commit', '-m', msg]) |
4217 | + |
4218 | + def _create_svn_repo(self): |
4219 | + self.svn_repo = os.path.join(self._tmpbase, 'svnrepo') |
4220 | + execute(['svnadmin', 'create', self.svn_repo]) |
4221 | + |
4222 | + def _fire_up_svnserve(self): |
4223 | + if not self._svnserve_port: |
4224 | + self._svnserve_port = str(randint(30000, 40000)) |
4225 | + |
4226 | + pid_file = os.path.join(self._tmpbase, 'svnserve.pid') |
4227 | + execute(['svnserve', '--pid-file', pid_file, '-d', |
4228 | + '--listen-port', self._svnserve_port, '-r', self._tmpbase]) |
4229 | + |
4230 | + for i in range(0, self._max_svnserve_pid_tries): |
4231 | + try: |
4232 | + self._svnserve_pid = int(open(pid_file).read().strip()) |
4233 | + return |
4234 | + |
4235 | + except (IOError, OSError): |
4236 | + time.sleep(0.25) |
4237 | + |
4238 | + # This will re-raise the last exception, which will be either |
4239 | + # IOError or OSError if the above fails and this branch is reached |
4240 | + raise |
4241 | + |
4242 | + def _fill_in_svn_repo(self): |
4243 | + self.svn_checkout = os.path.join(self._tmpbase, 'checkout.svn') |
4244 | + execute(['svn', 'checkout', 'file://%s' % self.svn_repo, |
4245 | + self.svn_checkout]) |
4246 | + os.chdir(self.svn_checkout) |
4247 | + |
4248 | + for subtree in ('trunk', 'branches', 'tags'): |
4249 | + execute(['svn', 'mkdir', subtree]) |
4250 | + |
4251 | + execute(['svn', 'commit', '-m', 'filling in T/b/t']) |
4252 | + os.chdir(os.path.join(self.svn_checkout, 'trunk')) |
4253 | + |
4254 | + for i, data in enumerate([FOO, FOO1, FOO2]): |
4255 | + self._svn_add_file_commit('foo.txt', data, 'foo commit %s' % i) |
4256 | + |
4257 | + def _get_testing_clone(self): |
4258 | + self.clone_dir = os.path.join(self._tmpbase, 'checkout.hg') |
4259 | + self._hgcmd([ |
4260 | + 'clone', 'svn://127.0.0.1:%s/svnrepo' % self._svnserve_port, |
4261 | + self.clone_dir, |
4262 | + ]) |
4263 | + |
4264 | + def _spin_up_client(self): |
4265 | + os.chdir(self.clone_dir) |
4266 | + self.client = MercurialClient() |
4267 | + |
4268 | + def _stub_in_config_and_options(self): |
4269 | + rbtools.postreview.user_config = {} |
4270 | + rbtools.postreview.options = OptionsStub() |
4271 | + rbtools.postreview.options.parent_branch = None |
4272 | + |
4273 | + def testGetRepositoryInfoSimple(self): |
4274 | + """Test MercurialClient (+svn) get_repository_info, simple case""" |
4275 | + ri = self.client.get_repository_info() |
4276 | + |
4277 | + self.assertEqual('svn', self.client._type) |
4278 | + self.assertEqual('/trunk', ri.base_path) |
4279 | + self.assertEqual('svn://127.0.0.1:%s/svnrepo' % self._svnserve_port, |
4280 | + ri.path) |
4281 | + |
4282 | + def testScanForServerSimple(self): |
4283 | + """Test MercurialClient (+svn) scan_for_server, simple case""" |
4284 | + ri = self.client.get_repository_info() |
4285 | + server = self.client.scan_for_server(ri) |
4286 | + |
4287 | + self.assertTrue(server is None) |
4288 | + |
4289 | + def testScanForServerReviewboardrc(self): |
4290 | + """Test MercurialClient (+svn) scan_for_server in .reviewboardrc""" |
4291 | + rc_filename = os.path.join(self.clone_dir, '.reviewboardrc') |
4292 | + rc = open(rc_filename, 'w') |
4293 | + rc.write('REVIEWBOARD_URL = "%s"' % self.TESTSERVER) |
4294 | + rc.close() |
4295 | + |
4296 | + ri = self.client.get_repository_info() |
4297 | + server = self.client.scan_for_server(ri) |
4298 | + |
4299 | + self.assertEqual(self.TESTSERVER, server) |
4300 | + |
4301 | + def testScanForServerProperty(self): |
4302 | + """Test MercurialClient (+svn) scan_for_server in svn property""" |
4303 | + os.chdir(self.svn_checkout) |
4304 | + execute(['svn', 'update']) |
4305 | + execute(['svn', 'propset', 'reviewboard:url', self.TESTSERVER, |
4306 | + self.svn_checkout]) |
4307 | + execute(['svn', 'commit', '-m', 'adding reviewboard:url property']) |
4308 | + |
4309 | + os.chdir(self.clone_dir) |
4310 | + self._hgcmd(['pull']) |
4311 | + self._hgcmd(['update', '-C']) |
4312 | + |
4313 | + ri = self.client.get_repository_info() |
4314 | + |
4315 | + self.assertEqual(self.TESTSERVER, self.client.scan_for_server(ri)) |
4316 | + |
4317 | + def testDiffSimple(self): |
4318 | + """Test MercurialClient (+svn) diff, simple case""" |
4319 | + self.client.get_repository_info() |
4320 | + |
4321 | + self._hg_add_file_commit('foo.txt', FOO4, 'edit 4') |
4322 | + |
4323 | + self.assertEqual(EXPECTED_HG_SVN_DIFF_0, self.client.diff(None)[0]) |
4324 | + |
4325 | + def testDiffSimpleMultiple(self): |
4326 | + """Test MercurialClient (+svn) diff with multiple commits""" |
4327 | + self.client.get_repository_info() |
4328 | + |
4329 | + self._hg_add_file_commit('foo.txt', FOO4, 'edit 4') |
4330 | + self._hg_add_file_commit('foo.txt', FOO5, 'edit 5') |
4331 | + self._hg_add_file_commit('foo.txt', FOO6, 'edit 6') |
4332 | + |
4333 | + self.assertEqual(EXPECTED_HG_SVN_DIFF_1, self.client.diff(None)[0]) |
4334 | + |
4335 | + |
4336 | +class SVNClientTests(unittest.TestCase): |
4337 | + def test_relative_paths(self): |
4338 | + """Testing SvnRepositoryInfo._get_relative_path""" |
4339 | + info = SvnRepositoryInfo('http://svn.example.com/svn/', '/', '') |
4340 | + self.assertEqual(info._get_relative_path('/foo', '/bar'), None) |
4341 | + self.assertEqual(info._get_relative_path('/', '/trunk/myproject'), |
4342 | + None) |
4343 | + self.assertEqual(info._get_relative_path('/trunk/myproject', '/'), |
4344 | + '/trunk/myproject') |
4345 | + self.assertEqual( |
4346 | + info._get_relative_path('/trunk/myproject', ''), |
4347 | + '/trunk/myproject') |
4348 | + self.assertEqual( |
4349 | + info._get_relative_path('/trunk/myproject', '/trunk'), |
4350 | + '/myproject') |
4351 | + self.assertEqual( |
4352 | + info._get_relative_path('/trunk/myproject', '/trunk/myproject'), |
4353 | + '/') |
4354 | |
4355 | |
4356 | class ApiTests(MockHttpUnitTest): |
4357 | + def setUp(self): |
4358 | + super(ApiTests, self).setUp() |
4359 | + |
4360 | + self.http_response = { |
4361 | + 'api/': json.dumps({ |
4362 | + 'stat': 'ok', |
4363 | + 'links': { |
4364 | + 'info': { |
4365 | + 'href': 'api/info/', |
4366 | + 'method': 'GET', |
4367 | + }, |
4368 | + }, |
4369 | + }), |
4370 | + } |
4371 | + |
4372 | + def test_check_api_version_1_5_2_higher(self): |
4373 | + """Testing checking the API version compatibility (RB >= 1.5.2)""" |
4374 | + self.http_response.update(self._build_info_resource('1.5.2')) |
4375 | + self.server.check_api_version() |
4376 | + self.assertFalse(self.server.deprecated_api) |
4377 | + |
4378 | + self.http_response.update(self._build_info_resource('1.5.3alpha0')) |
4379 | + self.server.check_api_version() |
4380 | + self.assertFalse(self.server.deprecated_api) |
4381 | + |
4382 | + def test_check_api_version_1_5_1_lower(self): |
4383 | + """Testing checking the API version compatibility (RB < 1.5.2)""" |
4384 | + self.http_response.update(self._build_info_resource('1.5.1')) |
4385 | + self.server.check_api_version() |
4386 | + self.assertTrue(self.server.deprecated_api) |
4387 | + |
4388 | + def test_check_api_version_old_api(self): |
4389 | + """Testing checking the API version compatibility (RB < 1.5.0)""" |
4390 | + self.http_response = { |
4391 | + 'api/': APIError(404, 0), |
4392 | + } |
4393 | + |
4394 | + self.server.check_api_version() |
4395 | + self.assertTrue(self.server.deprecated_api) |
4396 | + |
4397 | + def _build_info_resource(self, package_version): |
4398 | + return { |
4399 | + 'api/info/': json.dumps({ |
4400 | + 'stat': 'ok', |
4401 | + 'info': { |
4402 | + 'product': { |
4403 | + 'package_version': package_version, |
4404 | + }, |
4405 | + }, |
4406 | + }), |
4407 | + } |
4408 | + |
4409 | + |
4410 | +class DeprecatedApiTests(MockHttpUnitTest): |
4411 | + deprecated_api = True |
4412 | + |
4413 | SAMPLE_ERROR_STR = json.dumps({ |
4414 | 'stat': 'fail', |
4415 | 'err': { |
4416 | @@ -429,7 +867,7 @@ |
4417 | self.http_response = self.SAMPLE_ERROR_STR |
4418 | |
4419 | try: |
4420 | - data = self.server.api_get('/foo/') |
4421 | + self.server.api_get('/foo/') |
4422 | |
4423 | # Shouldn't be reached |
4424 | self._assert(False) |
4425 | @@ -444,7 +882,7 @@ |
4426 | self.http_response = self.SAMPLE_ERROR_STR |
4427 | |
4428 | try: |
4429 | - data = self.server.api_post('/foo/') |
4430 | + self.server.api_post('/foo/') |
4431 | |
4432 | # Shouldn't be reached |
4433 | self._assert(False) |
4434 | @@ -460,7 +898,7 @@ |
4435 | self.SAMPLE_ERROR_STR) |
4436 | |
4437 | try: |
4438 | - data = self.server.api_get('/foo/') |
4439 | + self.server.api_get('/foo/') |
4440 | |
4441 | # Shouldn't be reached |
4442 | self._assert(False) |
4443 | @@ -476,7 +914,7 @@ |
4444 | self.SAMPLE_ERROR_STR) |
4445 | |
4446 | try: |
4447 | - data = self.server.api_post('/foo/') |
4448 | + self.server.api_post('/foo/') |
4449 | |
4450 | # Shouldn't be reached |
4451 | self._assert(False) |
4452 | @@ -489,3 +927,207 @@ |
4453 | |
4454 | def _make_http_error(self, url, code, body): |
4455 | return urllib2.HTTPError(url, code, body, {}, StringIO(body)) |
4456 | + |
4457 | + |
4458 | +FOO = """\ |
4459 | +ARMA virumque cano, Troiae qui primus ab oris |
4460 | +Italiam, fato profugus, Laviniaque venit |
4461 | +litora, multum ille et terris iactatus et alto |
4462 | +vi superum saevae memorem Iunonis ob iram; |
4463 | +multa quoque et bello passus, dum conderet urbem, |
4464 | +inferretque deos Latio, genus unde Latinum, |
4465 | +Albanique patres, atque altae moenia Romae. |
4466 | +Musa, mihi causas memora, quo numine laeso, |
4467 | +quidve dolens, regina deum tot volvere casus |
4468 | +insignem pietate virum, tot adire labores |
4469 | +impulerit. Tantaene animis caelestibus irae? |
4470 | + |
4471 | +""" |
4472 | + |
4473 | +FOO1 = """\ |
4474 | +ARMA virumque cano, Troiae qui primus ab oris |
4475 | +Italiam, fato profugus, Laviniaque venit |
4476 | +litora, multum ille et terris iactatus et alto |
4477 | +vi superum saevae memorem Iunonis ob iram; |
4478 | +multa quoque et bello passus, dum conderet urbem, |
4479 | +inferretque deos Latio, genus unde Latinum, |
4480 | +Albanique patres, atque altae moenia Romae. |
4481 | +Musa, mihi causas memora, quo numine laeso, |
4482 | + |
4483 | +""" |
4484 | + |
4485 | +FOO2 = """\ |
4486 | +ARMA virumque cano, Troiae qui primus ab oris |
4487 | +ARMA virumque cano, Troiae qui primus ab oris |
4488 | +ARMA virumque cano, Troiae qui primus ab oris |
4489 | +Italiam, fato profugus, Laviniaque venit |
4490 | +litora, multum ille et terris iactatus et alto |
4491 | +vi superum saevae memorem Iunonis ob iram; |
4492 | +multa quoque et bello passus, dum conderet urbem, |
4493 | +inferretque deos Latio, genus unde Latinum, |
4494 | +Albanique patres, atque altae moenia Romae. |
4495 | +Musa, mihi causas memora, quo numine laeso, |
4496 | + |
4497 | +""" |
4498 | + |
4499 | +FOO3 = """\ |
4500 | +ARMA virumque cano, Troiae qui primus ab oris |
4501 | +ARMA virumque cano, Troiae qui primus ab oris |
4502 | +Italiam, fato profugus, Laviniaque venit |
4503 | +litora, multum ille et terris iactatus et alto |
4504 | +vi superum saevae memorem Iunonis ob iram; |
4505 | +dum conderet urbem, |
4506 | +inferretque deos Latio, genus unde Latinum, |
4507 | +Albanique patres, atque altae moenia Romae. |
4508 | +Albanique patres, atque altae moenia Romae. |
4509 | +Musa, mihi causas memora, quo numine laeso, |
4510 | + |
4511 | +""" |
4512 | + |
4513 | +FOO4 = """\ |
4514 | +Italiam, fato profugus, Laviniaque venit |
4515 | +litora, multum ille et terris iactatus et alto |
4516 | +vi superum saevae memorem Iunonis ob iram; |
4517 | +dum conderet urbem, |
4518 | + |
4519 | + |
4520 | + |
4521 | + |
4522 | + |
4523 | +inferretque deos Latio, genus unde Latinum, |
4524 | +Albanique patres, atque altae moenia Romae. |
4525 | +Musa, mihi causas memora, quo numine laeso, |
4526 | + |
4527 | +""" |
4528 | + |
4529 | +FOO5 = """\ |
4530 | +litora, multum ille et terris iactatus et alto |
4531 | +Italiam, fato profugus, Laviniaque venit |
4532 | +vi superum saevae memorem Iunonis ob iram; |
4533 | +dum conderet urbem, |
4534 | +Albanique patres, atque altae moenia Romae. |
4535 | +Albanique patres, atque altae moenia Romae. |
4536 | +Musa, mihi causas memora, quo numine laeso, |
4537 | +inferretque deos Latio, genus unde Latinum, |
4538 | + |
4539 | +ARMA virumque cano, Troiae qui primus ab oris |
4540 | +ARMA virumque cano, Troiae qui primus ab oris |
4541 | +""" |
4542 | + |
4543 | +FOO6 = """\ |
4544 | +ARMA virumque cano, Troiae qui primus ab oris |
4545 | +ARMA virumque cano, Troiae qui primus ab oris |
4546 | +Italiam, fato profugus, Laviniaque venit |
4547 | +litora, multum ille et terris iactatus et alto |
4548 | +vi superum saevae memorem Iunonis ob iram; |
4549 | +dum conderet urbem, inferretque deos Latio, genus |
4550 | +unde Latinum, Albanique patres, atque altae |
4551 | +moenia Romae. Albanique patres, atque altae |
4552 | +moenia Romae. Musa, mihi causas memora, quo numine laeso, |
4553 | + |
4554 | +""" |
4555 | + |
4556 | +EXPECTED_HG_DIFF_0 = """\ |
4557 | +diff --git a/foo.txt b/foo.txt |
4558 | +--- a/foo.txt |
4559 | ++++ b/foo.txt |
4560 | +@@ -6,7 +6,4 @@ |
4561 | + inferretque deos Latio, genus unde Latinum, |
4562 | + Albanique patres, atque altae moenia Romae. |
4563 | + Musa, mihi causas memora, quo numine laeso, |
4564 | +-quidve dolens, regina deum tot volvere casus |
4565 | +-insignem pietate virum, tot adire labores |
4566 | +-impulerit. Tantaene animis caelestibus irae? |
4567 | + |
4568 | +""" |
4569 | + |
4570 | +EXPECTED_HG_DIFF_1 = """\ |
4571 | +diff --git a/foo.txt b/foo.txt |
4572 | +--- a/foo.txt |
4573 | ++++ b/foo.txt |
4574 | +@@ -1,12 +1,11 @@ |
4575 | ++ARMA virumque cano, Troiae qui primus ab oris |
4576 | + ARMA virumque cano, Troiae qui primus ab oris |
4577 | + Italiam, fato profugus, Laviniaque venit |
4578 | + litora, multum ille et terris iactatus et alto |
4579 | + vi superum saevae memorem Iunonis ob iram; |
4580 | +-multa quoque et bello passus, dum conderet urbem, |
4581 | ++dum conderet urbem, |
4582 | + inferretque deos Latio, genus unde Latinum, |
4583 | + Albanique patres, atque altae moenia Romae. |
4584 | ++Albanique patres, atque altae moenia Romae. |
4585 | + Musa, mihi causas memora, quo numine laeso, |
4586 | +-quidve dolens, regina deum tot volvere casus |
4587 | +-insignem pietate virum, tot adire labores |
4588 | +-impulerit. Tantaene animis caelestibus irae? |
4589 | + |
4590 | +""" |
4591 | + |
4592 | +EXPECTED_HG_DIFF_2 = """\ |
4593 | +diff --git a/foo.txt b/foo.txt |
4594 | +--- a/foo.txt |
4595 | ++++ b/foo.txt |
4596 | +@@ -1,3 +1,5 @@ |
4597 | ++ARMA virumque cano, Troiae qui primus ab oris |
4598 | ++ARMA virumque cano, Troiae qui primus ab oris |
4599 | + ARMA virumque cano, Troiae qui primus ab oris |
4600 | + Italiam, fato profugus, Laviniaque venit |
4601 | + litora, multum ille et terris iactatus et alto |
4602 | +""" |
4603 | + |
4604 | +EXPECTED_HG_DIFF_3 = """\ |
4605 | +diff --git a/foo.txt b/foo.txt |
4606 | +--- a/foo.txt |
4607 | ++++ b/foo.txt |
4608 | +@@ -6,7 +6,4 @@ |
4609 | + inferretque deos Latio, genus unde Latinum, |
4610 | + Albanique patres, atque altae moenia Romae. |
4611 | + Musa, mihi causas memora, quo numine laeso, |
4612 | +-quidve dolens, regina deum tot volvere casus |
4613 | +-insignem pietate virum, tot adire labores |
4614 | +-impulerit. Tantaene animis caelestibus irae? |
4615 | + |
4616 | +""" |
4617 | + |
4618 | +EXPECTED_HG_SVN_DIFF_0 = """\ |
4619 | +Index: foo.txt |
4620 | +=================================================================== |
4621 | +--- foo.txt\t(revision 4) |
4622 | ++++ foo.txt\t(working copy) |
4623 | +@@ -1,4 +1,1 @@ |
4624 | +-ARMA virumque cano, Troiae qui primus ab oris |
4625 | +-ARMA virumque cano, Troiae qui primus ab oris |
4626 | +-ARMA virumque cano, Troiae qui primus ab oris |
4627 | + Italiam, fato profugus, Laviniaque venit |
4628 | +@@ -6,3 +3,8 @@ |
4629 | + vi superum saevae memorem Iunonis ob iram; |
4630 | +-multa quoque et bello passus, dum conderet urbem, |
4631 | ++dum conderet urbem, |
4632 | ++ |
4633 | ++ |
4634 | ++ |
4635 | ++ |
4636 | ++ |
4637 | + inferretque deos Latio, genus unde Latinum, |
4638 | +""" |
4639 | + |
4640 | +EXPECTED_HG_SVN_DIFF_1 = """\ |
4641 | +Index: foo.txt |
4642 | +=================================================================== |
4643 | +--- foo.txt\t(revision 4) |
4644 | ++++ foo.txt\t(working copy) |
4645 | +@@ -1,2 +1,1 @@ |
4646 | +-ARMA virumque cano, Troiae qui primus ab oris |
4647 | + ARMA virumque cano, Troiae qui primus ab oris |
4648 | +@@ -6,6 +5,6 @@ |
4649 | + vi superum saevae memorem Iunonis ob iram; |
4650 | +-multa quoque et bello passus, dum conderet urbem, |
4651 | +-inferretque deos Latio, genus unde Latinum, |
4652 | +-Albanique patres, atque altae moenia Romae. |
4653 | +-Musa, mihi causas memora, quo numine laeso, |
4654 | ++dum conderet urbem, inferretque deos Latio, genus |
4655 | ++unde Latinum, Albanique patres, atque altae |
4656 | ++moenia Romae. Albanique patres, atque altae |
4657 | ++moenia Romae. Musa, mihi causas memora, quo numine laeso, |
4658 | + |
4659 | +""" |
4660 | |
4661 | === modified file 'setup.cfg' |
4662 | --- setup.cfg 2010-07-31 18:31:05 +0000 |
4663 | +++ setup.cfg 2011-11-06 07:43:24 +0000 |
4664 | @@ -1,6 +1,7 @@ |
4665 | [egg_info] |
4666 | -tag_build = .dev |
4667 | -tag_svn_revision = 1 |
4668 | +tag_build = |
4669 | +tag_svn_revision = 0 |
4670 | +tag_date = 0 |
4671 | |
4672 | [aliases] |
4673 | snapshot = egg_info -Dr |
4674 | @@ -12,3 +13,4 @@ |
4675 | rc1 = egg_info -DRb rc1 |
4676 | rc2 = egg_info -DRb rc2 |
4677 | release = egg_info -DRb '' |
4678 | + |
One other thing: I put in the changelog that this resolves a Debian bug for the same issue. Is that appropriate, given that getting this change into Debian is a separate step?