Merge lp:~mandel/ubuntuone-windows-installer/include_u1sync_msi into lp:ubuntuone-windows-installer/beta
Proposed by: Manuel de la Peña
Status: Merged
Approved by: John Lenton
Approved revision: 57
Merged at revision: 73
Proposed branch: lp:~mandel/ubuntuone-windows-installer/include_u1sync_msi
Merge into: lp:ubuntuone-windows-installer/beta
Diff against target: 3092 lines (+2969/-4), 16 files modified
  .bzrignore (+2/-0)
  README.txt (+15/-2)
  install/UbuntuOne.wxs (+562/-0)
  main.build (+19/-2)
  src/setup.py (+56/-0)
  src/u1sync/__init__.py (+14/-0)
  src/u1sync/client.py (+754/-0)
  src/u1sync/constants.py (+30/-0)
  src/u1sync/genericmerge.py (+88/-0)
  src/u1sync/main.py (+360/-0)
  src/u1sync/merge.py (+186/-0)
  src/u1sync/metadata.py (+145/-0)
  src/u1sync/scan.py (+102/-0)
  src/u1sync/sync.py (+384/-0)
  src/u1sync/ubuntuone_optparse.py (+202/-0)
  src/u1sync/utils.py (+50/-0)
To merge this branch: bzr merge lp:~mandel/ubuntuone-windows-installer/include_u1sync_msi
Related bugs:
Reviewers:
  John Lenton (community): Approve
  Rick McBride (community): Approve
Review via email: mp+33957@code.launchpad.net
Commit message
Description of the change
Added the embedded Python runtime to the MSI.
Revision history for this message
Rick McBride (rmcbride):
review: Approve
Revision history for this message
John Lenton (chipaca):
review: Approve
Preview Diff
1 | === modified file '.bzrignore' | |||
2 | --- .bzrignore 2010-08-24 18:18:27 +0000 | |||
3 | +++ .bzrignore 2010-08-27 20:21:13 +0000 | |||
4 | @@ -31,3 +31,5 @@ | |||
5 | 31 | install/UbuntuOne.msi | 31 | install/UbuntuOne.msi |
6 | 32 | src/Canonical.UbuntuOne.SyncDaemon/obj | 32 | src/Canonical.UbuntuOne.SyncDaemon/obj |
7 | 33 | src/Canonical.UbuntuOne.SyncDaemon/bin | 33 | src/Canonical.UbuntuOne.SyncDaemon/bin |
8 | 34 | src/build | ||
9 | 35 | src/dist | ||
10 | 34 | 36 | ||
11 | === modified file 'README.txt' | |||
12 | --- README.txt 2010-08-19 11:13:02 +0000 | |||
13 | +++ README.txt 2010-08-27 20:21:13 +0000 | |||
14 | @@ -7,8 +7,21 @@ | |||
15 | 7 | * UbuntuOne Windows Service: Windows service that allows to start, stop, pause and resume the sync daemon. | 7 | * UbuntuOne Windows Service: Windows service that allows to start, stop, pause and resume the sync daemon. |
16 | 8 | * UbuntuOne Windows Service Broadcaster: Provides a WCF service hosted in a windows service that allows .Net languages | 8 | * UbuntuOne Windows Service Broadcaster: Provides a WCF service hosted in a windows service that allows .Net languages |
17 | 9 | to communicate with the sync daemon and interact with it. | 9 | to communicate with the sync daemon and interact with it. |
20 | 10 | 10 | ||
21 | 11 | 2. Build | 11 | 2. Environment setup |
22 | 12 | |||
23 | 13 | The ubuntuone windows port provides a port of the Python code that is used on Linux to perform the u1 sync operations. Because we did | ||
24 | 14 | not want users to have to hunt down the Python runtime plus the different modules that are used, and to simplify the life of Windows | ||
25 | 15 | users, we have opted to use py2exe to create an executable that carries all the different Python dependencies of the code. As you | ||
26 | 16 | may already know, py2exe is not perfect and does not support egg files. To make sure that easy_install extracts the egg files after | ||
27 | 17 | the packages are installed, please use the following command: | ||
28 | 18 | |||
29 | 19 | easy_install -Z %package_name% | ||
30 | 20 | |||
31 | 21 | In order to build the solution you will need Python for Windows, the win32 Python extensions, and the Ubuntu One storage protocol installed on your system: | ||
32 | 22 | |||
33 | 23 | |||
34 | 24 | 3. Build | ||
35 | 12 | 25 | ||
36 | 13 | To simplify the build as much as possible, all the required tools for the compilation of the project are provided in the source tree. The | 26 | To simplify the build as much as possible, all the required tools for the compilation of the project are provided in the source tree. The
37 | 14 | compilation uses a nant project that allows running the following targets: | 27 | compilation uses a nant project that allows running the following targets: |
38 | 15 | 28 | ||
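The README changes above pair with the new src/setup.py (+56 lines), which the diff does not reproduce. A minimal py2exe configuration along these lines would bundle u1sync as described; the package and module names below are illustrative assumptions, not the branch's actual setup.py:

```python
# Sketch of a py2exe-style setup script. The branch's real src/setup.py is
# not shown in the diff, so the 'packages'/'includes' values here are
# assumptions based on the files listed in the UbuntuOne.wxs diff.
def build_options():
    """Return the py2exe options used to bundle u1sync and its dependencies."""
    return {
        "py2exe": {
            "packages": ["u1sync"],
            "includes": ["twisted", "OpenSSL", "bzrlib"],
            # Pure-Python modules go into library.zip next to main.exe,
            # matching the file list in the installer diff.
            "zipfile": "library.zip",
        }
    }

if __name__ == "__main__":
    try:
        import py2exe  # noqa: F401 -- Windows-only build dependency
        from distutils.core import setup
        setup(console=[{"script": "u1sync/main.py"}],
              options=build_options())
    except ImportError:
        # py2exe (and distutils on newer Pythons) may be unavailable;
        # just show the configuration in that case.
        print(build_options())
```

Running `python setup.py py2exe` with a configuration like this produces the main.exe, library.zip, and .pyd/.dll files that the installer diff below packages.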
39 | === modified file 'install/UbuntuOne.wxs' | |||
40 | --- install/UbuntuOne.wxs 2010-08-24 16:09:12 +0000 | |||
41 | +++ install/UbuntuOne.wxs 2010-08-27 20:21:13 +0000 | |||
42 | @@ -404,6 +404,505 @@ | |||
43 | 404 | KeyPath="yes"/> | 404 | KeyPath="yes"/> |
44 | 405 | </Component> | 405 | </Component> |
45 | 406 | </Directory> | 406 | </Directory> |
46 | 407 | <Directory Id="U1SyncExecutable" | ||
47 | 408 | Name="U1Sync"> | ||
48 | 409 | <Component Id="CTypesComponent" | ||
49 | 410 | Guid="cc878310-b1ee-11df-94e2-0800200c9a66"> | ||
50 | 411 | <File Id="_ctypes.pyd" | ||
51 | 412 | Name="_ctypes.pyd" | ||
52 | 413 | DiskId="1" | ||
53 | 414 | Source="build_results\u1sync\_ctypes.pyd" | ||
54 | 415 | KeyPath="yes"/> | ||
55 | 416 | </Component> | ||
56 | 417 | <Component Id="ElemtTreeComponent" | ||
57 | 418 | Guid="efe6b010-b1ee-11df-94e2-0800200c9a66"> | ||
58 | 419 | <File Id="_elementtree.pyd" | ||
59 | 420 | Name="_elementtree.pyd" | ||
60 | 421 | DiskId="1" | ||
61 | 422 | Source="build_results\u1sync\_elementtree.pyd" | ||
62 | 423 | KeyPath="yes"/> | ||
63 | 424 | </Component> | ||
64 | 425 | <Component Id="HLibComponent" | ||
65 | 426 | Guid="15848270-b1ef-11df-94e2-0800200c9a66"> | ||
66 | 427 | <File Id="_hashlib.pyd" | ||
67 | 428 | Name="_hashlib.pyd" | ||
68 | 429 | DiskId="1" | ||
69 | 430 | Source="build_results\u1sync\_hashlib.pyd" | ||
70 | 431 | KeyPath="yes"/> | ||
71 | 432 | </Component> | ||
72 | 433 | <Component Id="SocketComponent" | ||
73 | 434 | Guid="37e5a060-b1ef-11df-94e2-0800200c9a66"> | ||
74 | 435 | <File Id="_socket.pyd" | ||
75 | 436 | Name="_socket.pyd" | ||
76 | 437 | DiskId="1" | ||
77 | 438 | Source="build_results\u1sync\_socket.pyd" | ||
78 | 439 | KeyPath="yes"/> | ||
79 | 440 | </Component> | ||
80 | 441 | <Component Id="SSLComponent" | ||
81 | 442 | Guid="5df98d20-b1ef-11df-94e2-0800200c9a66"> | ||
82 | 443 | <File Id="_ssl.pyd" | ||
83 | 444 | Name="_ssl.pyd" | ||
84 | 445 | DiskId="1" | ||
85 | 446 | Source="build_results\u1sync\_ssl.pyd" | ||
86 | 447 | KeyPath="yes"/> | ||
87 | 448 | </Component> | ||
88 | 449 | <Component Id="SysLoaderComponent" | ||
89 | 450 | Guid="85d149f0-b1ef-11df-94e2-0800200c9a66"> | ||
90 | 451 | <File Id="_win32sysloader.pyd" | ||
91 | 452 | Name="_win32sysloader.pyd" | ||
92 | 453 | DiskId="1" | ||
93 | 454 | Source="build_results\u1sync\_win32sysloader.pyd" | ||
94 | 455 | KeyPath="yes"/> | ||
95 | 456 | </Component> | ||
96 | 457 | <Component Id="WinCoreDelayComponent" | ||
97 | 458 | Guid="a5ae2e00-b1ef-11df-94e2-0800200c9a66"> | ||
98 | 459 | <File Id="API_MS_Win_Core_DelayLoad_L1_1_0.dll" | ||
99 | 460 | Name="API-MS-Win-Core-DelayLoad-L1-1-0.dll" | ||
100 | 461 | DiskId="1" | ||
101 | 462 | Source="build_results\u1sync\API-MS-Win-Core-DelayLoad-L1-1-0.dll" | ||
102 | 463 | KeyPath="yes"/> | ||
103 | 464 | </Component> | ||
104 | 465 | <Component Id="WinCoreErrorHandlingComponent" | ||
105 | 466 | Guid="f2ae6620-b1ef-11df-94e2-0800200c9a66"> | ||
106 | 467 | <File Id="API_MS_Win_Core_ErrorHandling_L1_1_0.dll" | ||
107 | 468 | Name="API-MS-Win-Core-ErrorHandling-L1-1-0.dll" | ||
108 | 469 | DiskId="1" | ||
109 | 470 | Source="build_results\u1sync\API-MS-Win-Core-ErrorHandling-L1-1-0.dll" | ||
110 | 471 | KeyPath="yes"/> | ||
111 | 472 | </Component> | ||
112 | 473 | <Component Id="WinCoreHandleComponent" | ||
113 | 474 | Guid="252ce1d0-b1f0-11df-94e2-0800200c9a66"> | ||
114 | 475 | <File Id="API_MS_Win_Core_Handle_L1_1_0.dll" | ||
115 | 476 | Name="API-MS-Win-Core-Handle-L1-1-0.dll" | ||
116 | 477 | DiskId="1" | ||
117 | 478 | Source="build_results\u1sync\API-MS-Win-Core-Handle-L1-1-0.dll" | ||
118 | 479 | KeyPath="yes"/> | ||
119 | 480 | </Component> | ||
120 | 481 | <Component Id="WinCoreInterlockedComponent" | ||
121 | 482 | Guid="252ce1d1-b1f0-11df-94e2-0800200c9a66"> | ||
122 | 483 | <File Id="API_MS_Win_Core_Interlocked_L1_1_0.dll" | ||
123 | 484 | Name="API-MS-Win-Core-Interlocked-L1-1-0.dll" | ||
124 | 485 | DiskId="1" | ||
125 | 486 | Source="build_results\u1sync\API-MS-Win-Core-Interlocked-L1-1-0.dll" | ||
126 | 487 | KeyPath="yes"/> | ||
127 | 488 | </Component> | ||
128 | 489 | <Component Id="WinCoreIOComponent" | ||
129 | 490 | Guid="252ce1d2-b1f0-11df-94e2-0800200c9a66"> | ||
130 | 491 | <File Id="API_MS_Win_Core_IO_L1_1_0.dll" | ||
131 | 492 | Name="API-MS-Win-Core-IO-L1-1-0.dll" | ||
132 | 493 | DiskId="1" | ||
133 | 494 | Source="build_results\u1sync\API-MS-Win-Core-IO-L1-1-0.dll" | ||
134 | 495 | KeyPath="yes"/> | ||
135 | 496 | </Component> | ||
136 | 497 | <Component Id="WinCoreLoaderComponent" | ||
137 | 498 | Guid="252ce1d3-b1f0-11df-94e2-0800200c9a66"> | ||
138 | 499 | <File Id="API_MS_Win_Core_LibraryLoader_L1_1_0.dll" | ||
139 | 500 | Name="API-MS-Win-Core-LibraryLoader-L1-1-0.dll" | ||
140 | 501 | DiskId="1" | ||
141 | 502 | Source="build_results\u1sync\API-MS-Win-Core-LibraryLoader-L1-1-0.dll" | ||
142 | 503 | KeyPath="yes"/> | ||
143 | 504 | </Component> | ||
144 | 505 | <Component Id="WinCoreLocalizationComponent" | ||
145 | 506 | Guid="252ce1d4-b1f0-11df-94e2-0800200c9a66"> | ||
146 | 507 | <File Id="API_MS_Win_Core_Localization_L1_1_0.dll" | ||
147 | 508 | Name="API-MS-Win-Core-Localization-L1-1-0.dll" | ||
148 | 509 | DiskId="1" | ||
149 | 510 | Source="build_results\u1sync\API-MS-Win-Core-Localization-L1-1-0.dll" | ||
150 | 511 | KeyPath="yes"/> | ||
151 | 512 | </Component> | ||
152 | 513 | <Component Id="WinCoreLocalRegistryComponent" | ||
153 | 514 | Guid="252ce1d5-b1f0-11df-94e2-0800200c9a66"> | ||
154 | 515 | <File Id="API_MS_Win_Core_LocalRegistry_L1_1_0.dll" | ||
155 | 516 | Name="API-MS-Win-Core-LocalRegistry-L1-1-0.dll" | ||
156 | 517 | DiskId="1" | ||
157 | 518 | Source="build_results\u1sync\API-MS-Win-Core-LocalRegistry-L1-1-0.dll" | ||
158 | 519 | KeyPath="yes"/> | ||
159 | 520 | </Component> | ||
160 | 521 | <Component Id="WinCoreMemoryComponent" | ||
161 | 522 | Guid="252ce1d6-b1f0-11df-94e2-0800200c9a66"> | ||
162 | 523 | <File Id="API_MS_Win_Core_Memory_L1_1_0.dll" | ||
163 | 524 | Name="API-MS-Win-Core-Memory-L1-1-0.dll" | ||
164 | 525 | DiskId="1" | ||
165 | 526 | Source="build_results\u1sync\API-MS-Win-Core-Memory-L1-1-0.dll" | ||
166 | 527 | KeyPath="yes"/> | ||
167 | 528 | </Component> | ||
168 | 529 | <Component Id="WinCoreMiscComponent" | ||
169 | 530 | Guid="252ce1d8-b1f0-11df-94e2-0800200c9a66"> | ||
170 | 531 | <File Id="API_MS_Win_Core_Misc_L1_1_0.dll" | ||
171 | 532 | Name="API-MS-Win-Core-Misc-L1-1-0.dll" | ||
172 | 533 | DiskId="1" | ||
173 | 534 | Source="build_results\u1sync\API-MS-Win-Core-Misc-L1-1-0.dll" | ||
174 | 535 | KeyPath="yes"/> | ||
175 | 536 | </Component> | ||
176 | 537 | <Component Id="WinCoreProcessEnvComponent" | ||
177 | 538 | Guid="252ce1d9-b1f0-11df-94e2-0800200c9a66"> | ||
178 | 539 | <File Id="API_MS_Win_Core_ProcessEnvironment_L1_1_0.dll" | ||
179 | 540 | Name="API-MS-Win-Core-ProcessEnvironment-L1-1-0.dll" | ||
180 | 541 | DiskId="1" | ||
181 | 542 | Source="build_results\u1sync\API-MS-Win-Core-ProcessEnvironment-L1-1-0.dll" | ||
182 | 543 | KeyPath="yes"/> | ||
183 | 544 | </Component> | ||
184 | 545 | <Component Id="WinCoreProcessThreadsComponent" | ||
185 | 546 | Guid="252ce1da-b1f0-11df-94e2-0800200c9a66"> | ||
186 | 547 | <File Id="API_MS_Win_Core_ProcessThreads_L1_1_0.dll" | ||
187 | 548 | Name="API-MS-Win-Core-ProcessThreads-L1-1-0.dll" | ||
188 | 549 | DiskId="1" | ||
189 | 550 | Source="build_results\u1sync\API-MS-Win-Core-ProcessThreads-L1-1-0.dll" | ||
190 | 551 | KeyPath="yes"/> | ||
191 | 552 | </Component> | ||
192 | 553 | <Component Id="WinCoreProfileComponent" | ||
193 | 554 | Guid="252ce1db-b1f0-11df-94e2-0800200c9a66"> | ||
194 | 555 | <File Id="API_MS_Win_Core_Profile_L1_1_0.dll" | ||
195 | 556 | Name="API-MS-Win-Core-Profile-L1-1-0.dll" | ||
196 | 557 | DiskId="1" | ||
197 | 558 | Source="build_results\u1sync\API-MS-Win-Core-Profile-L1-1-0.dll" | ||
198 | 559 | KeyPath="yes"/> | ||
199 | 560 | </Component> | ||
200 | 561 | <Component Id="WinCoreStringComponent" | ||
201 | 562 | Guid="252ce1dc-b1f0-11df-94e2-0800200c9a66"> | ||
202 | 563 | <File Id="API_MS_Win_Core_String_L1_1_0.dll" | ||
203 | 564 | Name="API-MS-Win-Core-String-L1-1-0.dll" | ||
204 | 565 | DiskId="1" | ||
205 | 566 | Source="build_results\u1sync\API-MS-Win-Core-String-L1-1-0.dll" | ||
206 | 567 | KeyPath="yes"/> | ||
207 | 568 | </Component> | ||
208 | 569 | <Component Id="WinCoreSynchComponent" | ||
209 | 570 | Guid="252ce1dd-b1f0-11df-94e2-0800200c9a66"> | ||
210 | 571 | <File Id="API_MS_Win_Core_Synch_L1_1_0.dll" | ||
211 | 572 | Name="API-MS-Win-Core-Synch-L1-1-0.dll" | ||
212 | 573 | DiskId="1" | ||
213 | 574 | Source="build_results\u1sync\API-MS-Win-Core-Synch-L1-1-0.dll" | ||
214 | 575 | KeyPath="yes"/> | ||
215 | 576 | </Component> | ||
216 | 577 | <Component Id="WinCoreSysInfoComponent" | ||
217 | 578 | Guid="252ce1de-b1f0-11df-94e2-0800200c9a66"> | ||
218 | 579 | <File Id="API_MS_Win_Core_SysInfo_L1_1_0.dll" | ||
219 | 580 | Name="API-MS-Win-Core-SysInfo-L1-1-0.dll" | ||
220 | 581 | DiskId="1" | ||
221 | 582 | Source="build_results\u1sync\API-MS-Win-Core-SysInfo-L1-1-0.dll" | ||
222 | 583 | KeyPath="yes"/> | ||
223 | 584 | </Component> | ||
224 | 585 | <Component Id="WinCoreSecurityBaseComponent" | ||
225 | 586 | Guid="252ce1df-b1f0-11df-94e2-0800200c9a66"> | ||
226 | 587 | <File Id="API_MS_Win_Security_Base_L1_1_0.dll" | ||
227 | 588 | Name="API-MS-Win-Security-Base-L1-1-0.dll" | ||
228 | 589 | DiskId="1" | ||
229 | 590 | Source="build_results\u1sync\API-MS-Win-Security-Base-L1-1-0.dll" | ||
230 | 591 | KeyPath="yes"/> | ||
231 | 592 | </Component> | ||
232 | 593 | <Component Id="Bz2Component" | ||
233 | 594 | Guid="252ce1e0-b1f0-11df-94e2-0800200c9a66"> | ||
234 | 595 | <File Id="bz2.pyd" | ||
235 | 596 | Name="bz2.pyd" | ||
236 | 597 | DiskId="1" | ||
237 | 598 | Source="build_results\u1sync\bz2.pyd" | ||
238 | 599 | KeyPath="yes"/> | ||
239 | 600 | </Component> | ||
240 | 601 | <Component Id="BzrAnnotatorComponent" | ||
241 | 602 | Guid="252d08e0-b1f0-11df-94e2-0800200c9a66"> | ||
242 | 603 | <File Id="bzrlib._annotator_pyx.pyd" | ||
243 | 604 | Name="bzrlib._annotator_pyx.pyd" | ||
244 | 605 | DiskId="1" | ||
245 | 606 | Source="build_results\u1sync\bzrlib._annotator_pyx.pyd" | ||
246 | 607 | KeyPath="yes"/> | ||
247 | 608 | </Component> | ||
248 | 609 | <Component Id="BzrBencodeComponent" | ||
249 | 610 | Guid="252d08e1-b1f0-11df-94e2-0800200c9a66"> | ||
250 | 611 | <File Id="bzrlib._bencode_pyx.pyd" | ||
251 | 612 | Name="bzrlib._bencode_pyx.pyd" | ||
252 | 613 | DiskId="1" | ||
253 | 614 | Source="build_results\u1sync\bzrlib._bencode_pyx.pyd" | ||
254 | 615 | KeyPath="yes"/> | ||
255 | 616 | </Component> | ||
256 | 617 | <Component Id="BzrChkComponent" | ||
257 | 618 | Guid="252d08e2-b1f0-11df-94e2-0800200c9a66"> | ||
258 | 619 | <File Id="bzrlib._chk_map_pyx.pyd" | ||
259 | 620 | Name="bzrlib._chk_map_pyx.pyd" | ||
260 | 621 | DiskId="1" | ||
261 | 622 | Source="build_results\u1sync\bzrlib._chk_map_pyx.pyd" | ||
262 | 623 | KeyPath="yes"/> | ||
263 | 624 | </Component> | ||
264 | 625 | <Component Id="BzrChunkComponent" | ||
265 | 626 | Guid="252d08e3-b1f0-11df-94e2-0800200c9a66"> | ||
266 | 627 | <File Id="bzrlib._chunks_to_lines_pyx.pyd" | ||
267 | 628 | Name="bzrlib._chunks_to_lines_pyx.pyd" | ||
268 | 629 | DiskId="1" | ||
269 | 630 | Source="build_results\u1sync\bzrlib._chunks_to_lines_pyx.pyd" | ||
270 | 631 | KeyPath="yes"/> | ||
271 | 632 | </Component> | ||
272 | 633 | <Component Id="BzrPatienceDiffComponent" | ||
273 | 634 | Guid="252d08e5-b1f0-11df-94e2-0800200c9a66"> | ||
274 | 635 | <File Id="bzrlib._patiencediff_c.pyd" | ||
275 | 636 | Name="bzrlib._patiencediff_c.pyd" | ||
276 | 637 | DiskId="1" | ||
277 | 638 | Source="build_results\u1sync\bzrlib._patiencediff_c.pyd" | ||
278 | 639 | KeyPath="yes"/> | ||
279 | 640 | </Component> | ||
280 | 641 | <Component Id="BzrRioComponent" | ||
281 | 642 | Guid="252d08e6-b1f0-11df-94e2-0800200c9a66"> | ||
282 | 643 | <File Id="bzrlib._rio_pyx.pyd" | ||
283 | 644 | Name="bzrlib._rio_pyx.pyd" | ||
284 | 645 | DiskId="1" | ||
285 | 646 | Source="build_results\u1sync\bzrlib._rio_pyx.pyd" | ||
286 | 647 | KeyPath="yes"/> | ||
287 | 648 | </Component> | ||
288 | 649 | <Component Id="BzrStaticTupleComponent" | ||
289 | 650 | Guid="252d08e7-b1f0-11df-94e2-0800200c9a66"> | ||
290 | 651 | <File Id="bzrlib._static_tuple_c.pyd" | ||
291 | 652 | Name="bzrlib._static_tuple_c.pyd" | ||
292 | 653 | DiskId="1" | ||
293 | 654 | Source="build_results\u1sync\bzrlib._static_tuple_c.pyd" | ||
294 | 655 | KeyPath="yes"/> | ||
295 | 656 | </Component> | ||
296 | 657 | <Component Id="KernelBaseComponent" | ||
297 | 658 | Guid="252d08e8-b1f0-11df-94e2-0800200c9a66"> | ||
298 | 659 | <File Id="KERNELBASE.dll" | ||
299 | 660 | Name="KERNELBASE.dll" | ||
300 | 661 | DiskId="1" | ||
301 | 662 | Source="build_results\u1sync\KERNELBASE.dll" | ||
302 | 663 | KeyPath="yes"/> | ||
303 | 664 | </Component> | ||
304 | 665 | <Component Id="Libeay32Component" | ||
305 | 666 | Guid="252d08e9-b1f0-11df-94e2-0800200c9a66"> | ||
306 | 667 | <File Id="LIBEAY32.dll" | ||
307 | 668 | Name="LIBEAY32.dll" | ||
308 | 669 | DiskId="1" | ||
309 | 670 | Source="build_results\u1sync\LIBEAY32.dll" | ||
310 | 671 | KeyPath="yes"/> | ||
311 | 672 | </Component> | ||
312 | 673 | <Component Id="LibraryComponent" | ||
313 | 674 | Guid="252d08ea-b1f0-11df-94e2-0800200c9a66"> | ||
314 | 675 | <File Id="library.zip" | ||
315 | 676 | Name="library.zip" | ||
316 | 677 | DiskId="1" | ||
317 | 678 | Source="build_results\u1sync\library.zip" | ||
318 | 679 | KeyPath="yes"/> | ||
319 | 680 | </Component> | ||
320 | 681 | <Component Id="U1SyncMainComponent" | ||
321 | 682 | Guid="252d08eb-b1f0-11df-94e2-0800200c9a66"> | ||
322 | 683 | <File Id="main.exe" | ||
323 | 684 | Name="main.exe" | ||
324 | 685 | DiskId="1" | ||
325 | 686 | Source="build_results\u1sync\main.exe" | ||
326 | 687 | KeyPath="yes"/> | ||
327 | 688 | </Component> | ||
328 | 689 | <Component Id="MprComponent" | ||
329 | 690 | Guid="252d08ec-b1f0-11df-94e2-0800200c9a66"> | ||
330 | 691 | <File Id="MPR.dll" | ||
331 | 692 | Name="MPR.dll" | ||
332 | 693 | DiskId="1" | ||
333 | 694 | Source="build_results\u1sync\MPR.dll" | ||
334 | 695 | KeyPath="yes"/> | ||
335 | 696 | </Component> | ||
336 | 697 | <Component Id="MswsockComponent" | ||
337 | 698 | Guid="bb15b4b0-b1f5-11df-94e2-0800200c9a66"> | ||
338 | 699 | <File Id="MSWSOCK.dll" | ||
339 | 700 | Name="MSWSOCK.dll" | ||
340 | 701 | DiskId="1" | ||
341 | 702 | Source="build_results\u1sync\MSWSOCK.dll" | ||
342 | 703 | KeyPath="yes"/> | ||
343 | 704 | </Component> | ||
344 | 705 | <Component Id="OpenSSLCryptoComponent" | ||
345 | 706 | Guid="bb15b4b1-b1f5-11df-94e2-0800200c9a66"> | ||
346 | 707 | <File Id="OpenSSL.crypto.pyd" | ||
347 | 708 | Name="OpenSSL.crypto.pyd" | ||
348 | 709 | DiskId="1" | ||
349 | 710 | Source="build_results\u1sync\OpenSSL.crypto.pyd" | ||
350 | 711 | KeyPath="yes"/> | ||
351 | 712 | </Component> | ||
352 | 713 | <Component Id="OpenSSLRandComponent" | ||
353 | 714 | Guid="bb15b4b2-b1f5-11df-94e2-0800200c9a66"> | ||
354 | 715 | <File Id="OpenSSL.rand.pyd" | ||
355 | 716 | Name="OpenSSL.rand.pyd" | ||
356 | 717 | DiskId="1" | ||
357 | 718 | Source="build_results\u1sync\OpenSSL.rand.pyd" | ||
358 | 719 | KeyPath="yes"/> | ||
359 | 720 | </Component> | ||
360 | 721 | <Component Id="OpenSSLSSLComponent" | ||
361 | 722 | Guid="bb15b4b3-b1f5-11df-94e2-0800200c9a66"> | ||
362 | 723 | <File Id="OpenSSL.SSL.pyd" | ||
363 | 724 | Name="OpenSSL.SSL.pyd" | ||
364 | 725 | DiskId="1" | ||
365 | 726 | Source="build_results\u1sync\OpenSSL.SSL.pyd" | ||
366 | 727 | KeyPath="yes"/> | ||
367 | 728 | </Component> | ||
368 | 729 | <Component Id="PowrprofComponent" | ||
369 | 730 | Guid="bb15b4b4-b1f5-11df-94e2-0800200c9a66"> | ||
370 | 731 | <File Id="POWRPROF.dll" | ||
371 | 732 | Name="POWRPROF.dll" | ||
372 | 733 | DiskId="1" | ||
373 | 734 | Source="build_results\u1sync\POWRPROF.dll" | ||
374 | 735 | KeyPath="yes"/> | ||
375 | 736 | </Component> | ||
376 | 737 | <Component Id="PoyexpactComponent" | ||
377 | 738 | Guid="bb15b4b5-b1f5-11df-94e2-0800200c9a66"> | ||
378 | 739 | <File Id="pyexpat.pyd" | ||
379 | 740 | Name="pyexpat.pyd" | ||
380 | 741 | DiskId="1" | ||
381 | 742 | Source="build_results\u1sync\pyexpat.pyd" | ||
382 | 743 | KeyPath="yes"/> | ||
383 | 744 | </Component> | ||
384 | 745 | <Component Id="Python26Component" | ||
385 | 746 | Guid="bb15dbc0-b1f5-11df-94e2-0800200c9a66"> | ||
386 | 747 | <File Id="python26.dll" | ||
387 | 748 | Name="python26.dll" | ||
388 | 749 | DiskId="1" | ||
389 | 750 | Source="build_results\u1sync\python26.dll" | ||
390 | 751 | KeyPath="yes"/> | ||
391 | 752 | </Component> | ||
392 | 753 | <Component Id="PythonCom26Component" | ||
393 | 754 | Guid="bb15dbc1-b1f5-11df-94e2-0800200c9a66"> | ||
394 | 755 | <File Id="pythoncom26.dll" | ||
395 | 756 | Name="pythoncom26.dll" | ||
396 | 757 | DiskId="1" | ||
397 | 758 | Source="build_results\u1sync\pythoncom26.dll" | ||
398 | 759 | KeyPath="yes"/> | ||
399 | 760 | </Component> | ||
400 | 761 | <Component Id="PyWinTypeComponent" | ||
401 | 762 | Guid="bb15dbc2-b1f5-11df-94e2-0800200c9a66"> | ||
402 | 763 | <File Id="pywintypes26.dll" | ||
403 | 764 | Name="pywintypes26.dll" | ||
404 | 765 | DiskId="1" | ||
405 | 766 | Source="build_results\u1sync\pywintypes26.dll" | ||
406 | 767 | KeyPath="yes"/> | ||
407 | 768 | </Component> | ||
408 | 769 | <Component Id="SelectComponent" | ||
409 | 770 | Guid="bb15dbc3-b1f5-11df-94e2-0800200c9a66"> | ||
410 | 771 | <File Id="select.pyd" | ||
411 | 772 | Name="select.pyd" | ||
412 | 773 | DiskId="1" | ||
413 | 774 | Source="build_results\u1sync\select.pyd" | ||
414 | 775 | KeyPath="yes"/> | ||
415 | 776 | </Component> | ||
416 | 777 | <Component Id="Ssleay32Component" | ||
417 | 778 | Guid="bb15dbc4-b1f5-11df-94e2-0800200c9a66"> | ||
418 | 779 | <File Id="SSLEAY32.dll" | ||
419 | 780 | Name="SSLEAY32.dll" | ||
420 | 781 | DiskId="1" | ||
421 | 782 | Source="build_results\u1sync\SSLEAY32.dll" | ||
422 | 783 | KeyPath="yes"/> | ||
423 | 784 | </Component> | ||
424 | 785 | <Component Id="TwistedComponent" | ||
425 | 786 | Guid="bb15dbc5-b1f5-11df-94e2-0800200c9a66"> | ||
426 | 787 | <File Id="twisted.python._initgroups.pyd" | ||
427 | 788 | Name="twisted.python._initgroups.pyd" | ||
428 | 789 | DiskId="1" | ||
429 | 790 | Source="build_results\u1sync\twisted.python._initgroups.pyd" | ||
430 | 791 | KeyPath="yes"/> | ||
431 | 792 | </Component> | ||
432 | 793 | <Component Id="UnicodeDataComponent" | ||
433 | 794 | Guid="bb15dbc6-b1f5-11df-94e2-0800200c9a66"> | ||
434 | 795 | <File Id="unicodedata.pyd" | ||
435 | 796 | Name="unicodedata.pyd" | ||
436 | 797 | DiskId="1" | ||
437 | 798 | Source="build_results\u1sync\unicodedata.pyd" | ||
438 | 799 | KeyPath="yes"/> | ||
439 | 800 | </Component> | ||
440 | 801 | <Component Id="W9xpopenComponent" | ||
441 | 802 | Guid="bb15dbc7-b1f5-11df-94e2-0800200c9a66"> | ||
442 | 803 | <File Id="w9xpopen.exe" | ||
443 | 804 | Name="w9xpopen.exe" | ||
444 | 805 | DiskId="1" | ||
445 | 806 | Source="build_results\u1sync\w9xpopen.exe" | ||
446 | 807 | KeyPath="yes"/> | ||
447 | 808 | </Component> | ||
448 | 809 | <Component Id="Win32ApiComponent" | ||
449 | 810 | Guid="bb15dbc8-b1f5-11df-94e2-0800200c9a66"> | ||
450 | 811 | <File Id="win32api.pyd" | ||
451 | 812 | Name="win32api.pyd" | ||
452 | 813 | DiskId="1" | ||
453 | 814 | Source="build_results\u1sync\win32api.pyd" | ||
454 | 815 | KeyPath="yes"/> | ||
455 | 816 | </Component> | ||
456 | 817 | <Component Id="Win32ComShellComponent" | ||
457 | 818 | Guid="bb15dbc9-b1f5-11df-94e2-0800200c9a66"> | ||
458 | 819 | <File Id="win32com.shell.shell.pyd" | ||
459 | 820 | Name="win32com.shell.shell.pyd" | ||
460 | 821 | DiskId="1" | ||
461 | 822 | Source="build_results\u1sync\win32com.shell.shell.pyd" | ||
462 | 823 | KeyPath="yes"/> | ||
463 | 824 | </Component> | ||
464 | 825 | <Component Id="Win32EventComponent" | ||
465 | 826 | Guid="bb15dbca-b1f5-11df-94e2-0800200c9a66"> | ||
466 | 827 | <File Id="win32event.pyd" | ||
467 | 828 | Name="win32event.pyd" | ||
468 | 829 | DiskId="1" | ||
469 | 830 | Source="build_results\u1sync\win32event.pyd" | ||
470 | 831 | KeyPath="yes"/> | ||
471 | 832 | </Component> | ||
472 | 833 | <Component Id="Win32EventLogComponent" | ||
473 | 834 | Guid="bb15dbcb-b1f5-11df-94e2-0800200c9a66"> | ||
474 | 835 | <File Id="win32evtlog.pyd" | ||
475 | 836 | Name="win32evtlog.pyd" | ||
476 | 837 | DiskId="1" | ||
477 | 838 | Source="build_results\u1sync\win32evtlog.pyd" | ||
478 | 839 | KeyPath="yes"/> | ||
479 | 840 | </Component> | ||
480 | 841 | <Component Id="Win32FileComponent" | ||
481 | 842 | Guid="bb15dbcc-b1f5-11df-94e2-0800200c9a66"> | ||
482 | 843 | <File Id="win32file.pyd" | ||
483 | 844 | Name="win32file.pyd" | ||
484 | 845 | DiskId="1" | ||
485 | 846 | Source="build_results\u1sync\win32file.pyd" | ||
486 | 847 | KeyPath="yes"/> | ||
487 | 848 | </Component> | ||
488 | 849 | <Component Id="Win32PipeComponent" | ||
489 | 850 | Guid="bb15dbcd-b1f5-11df-94e2-0800200c9a66"> | ||
490 | 851 | <File Id="win32pipe.pyd" | ||
491 | 852 | Name="win32pipe.pyd" | ||
492 | 853 | DiskId="1" | ||
493 | 854 | Source="build_results\u1sync\win32pipe.pyd" | ||
494 | 855 | KeyPath="yes"/> | ||
495 | 856 | </Component> | ||
496 | 857 | <Component Id="Win32ProcessComponent" | ||
497 | 858 | Guid="bb15dbce-b1f5-11df-94e2-0800200c9a66"> | ||
498 | 859 | <File Id="win32process.pyd" | ||
499 | 860 | Name="win32process.pyd" | ||
500 | 861 | DiskId="1" | ||
501 | 862 | Source="build_results\u1sync\win32process.pyd" | ||
502 | 863 | KeyPath="yes"/> | ||
503 | 864 | </Component> | ||
504 | 865 | <Component Id="Win32SecurityComponent" | ||
505 | 866 | Guid="bb15dbcf-b1f5-11df-94e2-0800200c9a66"> | ||
506 | 867 | <File Id="win32security.pyd" | ||
507 | 868 | Name="win32security.pyd" | ||
508 | 869 | DiskId="1" | ||
509 | 870 | Source="build_results\u1sync\win32security.pyd" | ||
510 | 871 | KeyPath="yes"/> | ||
511 | 872 | </Component> | ||
512 | 873 | <Component Id="Win32UIComponent" | ||
513 | 874 | Guid="bb15dbd0-b1f5-11df-94e2-0800200c9a66"> | ||
514 | 875 | <File Id="win32ui.pyd" | ||
515 | 876 | Name="win32ui.pyd" | ||
516 | 877 | DiskId="1" | ||
517 | 878 | Source="build_results\u1sync\win32ui.pyd" | ||
518 | 879 | KeyPath="yes"/> | ||
519 | 880 | </Component> | ||
520 | 881 | <Component Id="Win32WnetComponent" | ||
521 | 882 | Guid="bb15dbd1-b1f5-11df-94e2-0800200c9a66"> | ||
522 | 883 | <File Id="win32wnet.pyd" | ||
523 | 884 | Name="win32wnet.pyd" | ||
524 | 885 | DiskId="1" | ||
525 | 886 | Source="build_results\u1sync\win32wnet.pyd" | ||
526 | 887 | KeyPath="yes"/> | ||
527 | 888 | </Component> | ||
528 | 889 | <Component Id="ZLibComponent" | ||
529 | 890 | Guid="bb15dbd2-b1f5-11df-94e2-0800200c9a66"> | ||
530 | 891 | <File Id="zlib1.dll" | ||
531 | 892 | Name="zlib1.dll" | ||
532 | 893 | DiskId="1" | ||
533 | 894 | Source="build_results\u1sync\zlib1.dll" | ||
534 | 895 | KeyPath="yes"/> | ||
535 | 896 | </Component> | ||
536 | 897 | <Component Id="ZopeInterfaceComponent" | ||
537 | 898 | Guid="bb15dbd3-b1f5-11df-94e2-0800200c9a66"> | ||
538 | 899 | <File Id="zope.interface._zope_interface_coptimizations.pyd" | ||
539 | 900 | Name="zope.interface._zope_interface_coptimizations.pyd" | ||
540 | 901 | DiskId="1" | ||
541 | 902 | Source="build_results\u1sync\zope.interface._zope_interface_coptimizations.pyd" | ||
542 | 903 | KeyPath="yes"/> | ||
543 | 904 | </Component> | ||
544 | 905 | </Directory> | ||
545 | 407 | </Directory> | 906 | </Directory> |
546 | 408 | </Directory> | 907 | </Directory> |
547 | 409 | </Directory> | 908 | </Directory> |
548 | @@ -459,6 +958,69 @@ | |||
549 | 459 | <ComponentRef Id="ClientConfigLog4Net" /> | 958 | <ComponentRef Id="ClientConfigLog4Net" /> |
550 | 460 | <!-- Client auto start --> | 959 | <!-- Client auto start --> |
551 | 461 | <ComponentRef Id="UbuntuOneClietnAutostart" /> | 960 | <ComponentRef Id="UbuntuOneClietnAutostart" /> |
552 | 961 | <!-- U1Sync package --> | ||
553 | 962 | <ComponentRef Id="CTypesComponent" /> | ||
554 | 963 | <ComponentRef Id="ElemtTreeComponent" /> | ||
555 | 964 | <ComponentRef Id="HLibComponent" /> | ||
556 | 965 | <ComponentRef Id="SocketComponent" /> | ||
557 | 966 | <ComponentRef Id="SSLComponent" /> | ||
558 | 967 | <ComponentRef Id="SysLoaderComponent" /> | ||
559 | 968 | <ComponentRef Id="WinCoreDelayComponent" /> | ||
560 | 969 | <ComponentRef Id="WinCoreErrorHandlingComponent" /> | ||
561 | 970 | <ComponentRef Id="WinCoreHandleComponent" /> | ||
562 | 971 | <ComponentRef Id="WinCoreInterlockedComponent" /> | ||
563 | 972 | <ComponentRef Id="WinCoreIOComponent" /> | ||
564 | 973 | <ComponentRef Id="WinCoreLoaderComponent" /> | ||
565 | 974 | <ComponentRef Id="WinCoreLocalizationComponent" /> | ||
566 | 975 | <ComponentRef Id="WinCoreLocalRegistryComponent" /> | ||
567 | 976 | <ComponentRef Id="WinCoreMemoryComponent" /> | ||
568 | 977 | <ComponentRef Id="WinCoreMiscComponent" /> | ||
569 | 978 | <ComponentRef Id="WinCoreProcessEnvComponent" /> | ||
570 | 979 | <ComponentRef Id="WinCoreProcessThreadsComponent" /> | ||
571 | 980 | <ComponentRef Id="WinCoreProfileComponent" /> | ||
572 | 981 | <ComponentRef Id="WinCoreStringComponent" /> | ||
573 | 982 | <ComponentRef Id="WinCoreSynchComponent" /> | ||
574 | 983 | <ComponentRef Id="WinCoreSysInfoComponent" /> | ||
575 | 984 | <ComponentRef Id="WinCoreSecurityBaseComponent" /> | ||
576 | 985 | <ComponentRef Id="Bz2Component" /> | ||
577 | 986 | <ComponentRef Id="BzrAnnotatorComponent" /> | ||
578 | 987 | <ComponentRef Id="BzrBencodeComponent" /> | ||
579 | 988 | <ComponentRef Id="BzrChkComponent" /> | ||
580 | 989 | <ComponentRef Id="BzrChunkComponent" /> | ||
581 | 990 | <ComponentRef Id="BzrPatienceDiffComponent" /> | ||
582 | 991 | <ComponentRef Id="BzrRioComponent" /> | ||
583 | 992 | <ComponentRef Id="BzrStaticTupleComponent" /> | ||
584 | 993 | <ComponentRef Id="KernelBaseComponent" /> | ||
585 | 994 | <ComponentRef Id="Libeay32Component" /> | ||
586 | 995 | <ComponentRef Id="LibraryComponent" /> | ||
587 | 996 | <ComponentRef Id="U1SyncMainComponent" /> | ||
588 | 997 | <ComponentRef Id="MprComponent" /> | ||
589 | 998 | <ComponentRef Id="MswsockComponent" /> | ||
590 | 999 | <ComponentRef Id="OpenSSLCryptoComponent" /> | ||
591 | 1000 | <ComponentRef Id="OpenSSLRandComponent" /> | ||
592 | 1001 | <ComponentRef Id="OpenSSLSSLComponent" /> | ||
593 | 1002 | <ComponentRef Id="PowrprofComponent" /> | ||
594 | 1003 | <ComponentRef Id="PoyexpactComponent" /> | ||
595 | 1004 | <ComponentRef Id="Python26Component" /> | ||
596 | 1005 | <ComponentRef Id="PythonCom26Component" /> | ||
597 | 1006 | <ComponentRef Id="PyWinTypeComponent" /> | ||
598 | 1007 | <ComponentRef Id="SelectComponent" /> | ||
599 | 1008 | <ComponentRef Id="Ssleay32Component" /> | ||
600 | 1009 | <ComponentRef Id="TwistedComponent" /> | ||
601 | 1010 | <ComponentRef Id="UnicodeDataComponent" /> | ||
602 | 1011 | <ComponentRef Id="W9xpopenComponent" /> | ||
603 | 1012 | <ComponentRef Id="Win32ApiComponent" /> | ||
604 | 1013 | <ComponentRef Id="Win32ComShellComponent" /> | ||
605 | 1014 | <ComponentRef Id="Win32EventComponent" /> | ||
606 | 1015 | <ComponentRef Id="Win32EventLogComponent" /> | ||
607 | 1016 | <ComponentRef Id="Win32FileComponent" /> | ||
608 | 1017 | <ComponentRef Id="Win32PipeComponent" /> | ||
609 | 1018 | <ComponentRef Id="Win32ProcessComponent" /> | ||
610 | 1019 | <ComponentRef Id="Win32SecurityComponent" /> | ||
611 | 1020 | <ComponentRef Id="Win32UIComponent" /> | ||
612 | 1021 | <ComponentRef Id="Win32WnetComponent" /> | ||
613 | 1022 | <ComponentRef Id="ZLibComponent" /> | ||
614 | 1023 | <ComponentRef Id="ZopeInterfaceComponent" /> | ||
615 | 462 | </Feature> | 1024 | </Feature> |
616 | 463 | 1025 | ||
617 | 464 | <!-- Provide the UI extensions to be used --> | 1026 | <!-- Provide the UI extensions to be used --> |
618 | 465 | 1027 | ||
619 | === modified file 'main.build' | |||
620 | --- main.build 2010-08-24 18:32:48 +0000 | |||
621 | +++ main.build 2010-08-27 20:21:13 +0000 | |||
622 | @@ -176,10 +176,20 @@ | |||
623 | 176 | program="nunit-console.exe" | 176 | program="nunit-console.exe" |
624 | 177 | commandline="UbuntuOneClient.Tests.dll /xml=../../../../test-results/UbuntuOneClient.Tests-Result.xml" /> | 177 | commandline="UbuntuOneClient.Tests.dll /xml=../../../../test-results/UbuntuOneClient.Tests-Result.xml" /> |
625 | 178 | </target> | 178 | </target> |
627 | 179 | 179 | <target name="package_python" | |
628 | 180 | description="Creates the exe binary that embeds the python libs that will be used to perform the sync operation on the Windows platform">
629 | 181 | |||
630 | 182 | <exec basedir="${python_path}" | ||
631 | 183 | managed="true" | ||
632 | 184 | workingdir="src" | ||
633 | 185 | program="python.exe" | ||
634 | 186 | commandline="setup.py py2exe" /> | ||
635 | 187 | |||
636 | 188 | </target> | ||
637 | 189 | |||
638 | 180 | <target name="installer" | 190 | <target name="installer" |
639 | 181 | description="Compiles the solution and create a merge installer that allows to install the solution and other related apps." | 191 | description="Compiles the solution and create a merge installer that allows to install the solution and other related apps." |
641 | 182 | depends="tests"> | 192 | depends="tests, package_python"> |
642 | 183 | 193 | ||
643 | 184 | <mkdir dir="${build_results}" /> | 194 | <mkdir dir="${build_results}" /> |
644 | 185 | 195 | ||
645 | @@ -216,6 +226,13 @@ | |||
646 | 216 | </fileset> | 226 | </fileset> |
647 | 217 | </copy> | 227 | </copy> |
648 | 218 | 228 | ||
649 | 229 | <!-- copy the results of the package_python to the install dir --> | ||
650 | 230 | <copy todir="${build_results}/u1sync" flatten="true"> | ||
651 | 231 | <fileset basedir="src/dist"> | ||
652 | 232 | <include name="*.*" /> | ||
653 | 233 | </fileset> | ||
654 | 234 | </copy> | ||
655 | 235 | |||
656 | 219 | <!-- copy the correct views lib --> | 236 | <!-- copy the correct views lib --> |
657 | 220 | 237 | ||
658 | 221 | <copy todir="${build_results}/Client" flatten="true"> | 238 | <copy todir="${build_results}/Client" flatten="true"> |
659 | 222 | 239 | ||
660 | === added file 'src/setup.py' | |||
661 | --- src/setup.py 1970-01-01 00:00:00 +0000 | |||
662 | +++ src/setup.py 2010-08-27 20:21:13 +0000 | |||
663 | @@ -0,0 +1,56 @@ | |||
664 | 1 | #!/usr/bin/env python | ||
665 | 2 | # Copyright (C) 2010 Canonical - All Rights Reserved | ||
666 | 3 | |||
667 | 4 | """ """ | ||
668 | 5 | import sys | ||
669 | 6 | |||
670 | 7 | # ModuleFinder can't handle runtime changes to __path__, but win32com uses them | ||
671 | 8 | try: | ||
672 | 9 | # py2exe 0.6.4 introduced a replacement modulefinder. | ||
673 | 10 | # This means we have to add package paths there, not to the built-in | ||
674 | 11 | # one. If this new modulefinder gets integrated into Python, then | ||
675 | 12 | # we might be able to revert this some day. | ||
676 | 13 | # if this doesn't work, try import modulefinder | ||
677 | 14 | try: | ||
678 | 15 | import py2exe.mf as modulefinder | ||
679 | 16 | except ImportError: | ||
680 | 17 | import modulefinder | ||
681 | 18 | import win32com | ||
682 | 19 | for p in win32com.__path__[1:]: | ||
683 | 20 | modulefinder.AddPackagePath("win32com", p) | ||
684 | 21 | for extra in ["win32com.shell"]: #,"win32com.mapi" | ||
685 | 22 | __import__(extra) | ||
686 | 23 | m = sys.modules[extra] | ||
687 | 24 | for p in m.__path__[1:]: | ||
688 | 25 | modulefinder.AddPackagePath(extra, p) | ||
689 | 26 | except ImportError: | ||
690 | 27 | # no build path setup, no worries. | ||
691 | 28 | pass | ||
692 | 29 | |||
693 | 30 | from distutils.core import setup | ||
694 | 31 | import py2exe | ||
695 | 32 | |||
696 | 33 | if __name__ == '__main__': | ||
697 | 34 | |||
698 | 35 | setup( | ||
699 | 36 | options = { | ||
700 | 37 | "py2exe": { | ||
701 | 38 | "compressed": 1, | ||
702 | 39 | "optimize": 2}}, | ||
703 | 40 | name='u1sync', | ||
704 | 41 | version='0.0.1', | ||
705 | 42 | author = "Canonical Online Services Hackers", | ||
706 | 43 | description="""u1sync is a utility of Ubuntu One. | ||
707 | 44 | Ubuntu One is a suite of on-line | ||
708 | 45 | services. This package contains a synchronization client for the | ||
709 | 46 | Ubuntu One file sharing service.""", | ||
710 | 47 | license='GPLv3', | ||
711 | 48 | console=['u1sync\\main.py'], | ||
712 | 49 | requires=[ | ||
713 | 50 | 'python (>= 2.5)', | ||
714 | 51 | 'oauth', | ||
715 | 52 | 'twisted.names', | ||
716 | 53 | 'twisted.web', | ||
717 | 54 | 'ubuntuone.storageprotocol (>= 1.3.0)', | ||
718 | 55 | ], | ||
719 | 56 | ) | ||
720 | 0 | 57 | ||
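The try/except dance in setup.py exists because win32com extends its `__path__` at runtime, which a static module finder cannot see, so the extra directories must be registered explicitly. The same registration hook exists in the standard library's `modulefinder`; a minimal sketch of the idea (the directory path below is a hypothetical placeholder, not one from this branch):

```python
import modulefinder

# Register an extra directory for a package whose __path__ is extended at
# runtime; ModuleFinder consults this map rather than the live __path__.
# The directory below is a hypothetical placeholder.
modulefinder.AddPackagePath("win32com", "/path/to/win32comext")

finder = modulefinder.ModuleFinder()
# finder.run_script("some_script.py") would now also search the registered
# directory when it encounters imports from the win32com package.
```

py2exe ships its own modulefinder with the same interface, which is why the setup script tries `py2exe.mf` first and falls back to the stdlib module.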
721 | === added directory 'src/u1sync' | |||
722 | === added file 'src/u1sync/__init__.py' | |||
723 | --- src/u1sync/__init__.py 1970-01-01 00:00:00 +0000 | |||
724 | +++ src/u1sync/__init__.py 2010-08-27 20:21:13 +0000 | |||
725 | @@ -0,0 +1,14 @@ | |||
726 | 1 | # Copyright 2009 Canonical Ltd. | ||
727 | 2 | # | ||
728 | 3 | # This program is free software: you can redistribute it and/or modify it | ||
729 | 4 | # under the terms of the GNU General Public License version 3, as published | ||
730 | 5 | # by the Free Software Foundation. | ||
731 | 6 | # | ||
732 | 7 | # This program is distributed in the hope that it will be useful, but | ||
733 | 8 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
734 | 9 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
735 | 10 | # PURPOSE. See the GNU General Public License for more details. | ||
736 | 11 | # | ||
737 | 12 | # You should have received a copy of the GNU General Public License along | ||
738 | 13 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
739 | 14 | """The guts of the u1sync tool.""" | ||
740 | 0 | 15 | ||
741 | === added file 'src/u1sync/client.py' | |||
742 | --- src/u1sync/client.py 1970-01-01 00:00:00 +0000 | |||
743 | +++ src/u1sync/client.py 2010-08-27 20:21:13 +0000 | |||
744 | @@ -0,0 +1,754 @@ | |||
745 | 1 | # ubuntuone.u1sync.client | ||
746 | 2 | # | ||
747 | 3 | # Client/protocol end of u1sync | ||
748 | 4 | # | ||
749 | 5 | # Author: Lucio Torre <lucio.torre@canonical.com> | ||
750 | 6 | # Author: Tim Cole <tim.cole@canonical.com> | ||
751 | 7 | # | ||
752 | 8 | # Copyright 2009 Canonical Ltd. | ||
753 | 9 | # | ||
754 | 10 | # This program is free software: you can redistribute it and/or modify it | ||
755 | 11 | # under the terms of the GNU General Public License version 3, as published | ||
756 | 12 | # by the Free Software Foundation. | ||
757 | 13 | # | ||
758 | 14 | # This program is distributed in the hope that it will be useful, but | ||
759 | 15 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
760 | 16 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
761 | 17 | # PURPOSE. See the GNU General Public License for more details. | ||
762 | 18 | # | ||
763 | 19 | # You should have received a copy of the GNU General Public License along | ||
764 | 20 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
765 | 21 | """Pretty API for protocol client.""" | ||
766 | 22 | |||
767 | 23 | from __future__ import with_statement | ||
768 | 24 | |||
769 | 25 | import os | ||
770 | 26 | import sys | ||
771 | 27 | import shutil | ||
772 | 28 | from Queue import Queue | ||
773 | 29 | from threading import Lock | ||
774 | 30 | import zlib | ||
775 | 31 | import urlparse | ||
776 | 32 | import ConfigParser | ||
777 | 33 | from cStringIO import StringIO | ||
778 | 34 | from twisted.internet import reactor, defer | ||
779 | 35 | from twisted.internet.defer import inlineCallbacks, returnValue | ||
780 | 36 | from ubuntuone.logger import LOGFOLDER | ||
781 | 37 | from ubuntuone.storageprotocol.content_hash import crc32 | ||
782 | 38 | from ubuntuone.storageprotocol.context import get_ssl_context | ||
783 | 39 | from u1sync.genericmerge import MergeNode | ||
784 | 40 | from u1sync.utils import should_sync | ||
785 | 41 | |||
786 | 42 | CONSUMER_KEY = "ubuntuone" | ||
787 | 43 | |||
788 | 44 | from oauth.oauth import OAuthConsumer | ||
789 | 45 | from ubuntuone.storageprotocol.client import ( | ||
790 | 46 | StorageClientFactory, StorageClient) | ||
791 | 47 | from ubuntuone.storageprotocol import request, volumes | ||
792 | 48 | from ubuntuone.storageprotocol.dircontent_pb2 import \ | ||
793 | 49 | DirectoryContent, DIRECTORY | ||
794 | 50 | import uuid | ||
795 | 51 | import logging | ||
796 | 52 | from logging.handlers import RotatingFileHandler | ||
797 | 53 | import time | ||
798 | 54 | |||
799 | 55 | def share_str(share_uuid): | ||
800 | 56 | """Converts a share UUID to a form the protocol likes.""" | ||
801 | 57 | return str(share_uuid) if share_uuid is not None else request.ROOT | ||
802 | 58 | |||
803 | 59 | LOGFILENAME = os.path.join(LOGFOLDER, 'u1sync.log') | ||
804 | 60 | u1_logger = logging.getLogger("u1sync.timing.log") | ||
805 | 61 | handler = RotatingFileHandler(LOGFILENAME) | ||
806 | 62 | u1_logger.addHandler(handler) | ||
807 | 63 | |||
808 | 64 | def log_timing(func): | ||
809 | 65 | def wrapper(*arg, **kwargs): | ||
810 | 66 | start = time.time() | ||
811 | 67 | ent = func(*arg, **kwargs) | ||
812 | 68 | stop = time.time() | ||
813 | 69 | u1_logger.debug('for %s %0.5f ms elapsed' % (func.func_name, \ | ||
814 | 70 | (stop-start)*1000.0)) | ||
815 | 71 | return ent | ||
816 | 72 | return wrapper | ||
817 | 73 | |||
818 | 74 | |||
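The `log_timing` decorator above measures wall-clock time around each call and logs it in milliseconds. A Python 3 sketch of the same pattern; unlike the original it also applies `functools.wraps`, so the wrapped function keeps its name and docstring:

```python
import functools
import logging
import time

u1_logger = logging.getLogger("u1sync.timing.log")

def log_timing(func):
    """Log the elapsed wall-clock time of each call to func."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        elapsed_ms = (time.time() - start) * 1000.0
        u1_logger.debug('for %s %0.5f ms elapsed', func.__name__, elapsed_ms)
        return result
    return wrapper

@log_timing
def add(a, b):
    return a + b
```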
819 | 75 | class ForcedShutdown(Exception): | ||
820 | 76 | """Client shutdown forced.""" | ||
821 | 77 | |||
822 | 78 | |||
823 | 79 | class Waiter(object): | ||
824 | 80 | """Wait object for blocking waits.""" | ||
825 | 81 | |||
826 | 82 | def __init__(self): | ||
827 | 83 | """Initializes the wait object.""" | ||
828 | 84 | self.queue = Queue() | ||
829 | 85 | |||
830 | 86 | def wake(self, result): | ||
831 | 87 | """Wakes the waiter with a result.""" | ||
832 | 88 | self.queue.put((result, None)) | ||
833 | 89 | |||
834 | 90 | def wakeAndRaise(self, exc_info): | ||
835 | 91 | """Wakes the waiter, raising the given exception in it.""" | ||
836 | 92 | self.queue.put((None, exc_info)) | ||
837 | 93 | |||
838 | 94 | def wakeWithResult(self, func, *args, **kw): | ||
839 | 95 | """Wakes the waiter with the result of the given function.""" | ||
840 | 96 | try: | ||
841 | 97 | result = func(*args, **kw) | ||
842 | 98 | except Exception: | ||
843 | 99 | self.wakeAndRaise(sys.exc_info()) | ||
844 | 100 | else: | ||
845 | 101 | self.wake(result) | ||
846 | 102 | |||
847 | 103 | def wait(self): | ||
848 | 104 | """Waits for wakeup.""" | ||
849 | 105 | (result, exc_info) = self.queue.get() | ||
850 | 106 | if exc_info: | ||
851 | 107 | try: | ||
852 | 108 | raise exc_info[0], exc_info[1], exc_info[2] | ||
853 | 109 | finally: | ||
854 | 110 | exc_info = None | ||
855 | 111 | else: | ||
856 | 112 | return result | ||
857 | 113 | |||
858 | 114 | |||
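`Waiter` is the bridge between the Twisted reactor thread and blocking callers: one thread parks on `Queue.get()` until another delivers either a result or an exception to re-raise. A self-contained Python 3 sketch of the same idea:

```python
import threading
from queue import Queue

class Waiter:
    """Block one thread until another wakes it with a result or an error."""

    def __init__(self):
        self.queue = Queue()

    def wake(self, result):
        """Deliver a normal result to the waiting thread."""
        self.queue.put((result, None))

    def wake_and_raise(self, exc):
        """Deliver an exception to be raised in the waiting thread."""
        self.queue.put((None, exc))

    def wait(self):
        """Block until woken; return the result or raise the exception."""
        result, exc = self.queue.get()
        if exc is not None:
            raise exc
        return result

waiter = Waiter()
threading.Thread(target=lambda: waiter.wake(42)).start()
```

Here `waiter.wait()` then returns 42 in the calling thread; `wakeWithResult` in the original is just a convenience that routes a function's return value or exception through the same two paths.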
859 | 115 | class SyncStorageClient(StorageClient): | ||
860 | 116 | """Simple client that calls a callback on connection.""" | ||
861 | 117 | |||
862 | 118 | @log_timing | ||
863 | 119 | def connectionMade(self): | ||
864 | 120 | """Setup and call callback.""" | ||
865 | 121 | StorageClient.connectionMade(self) | ||
866 | 122 | if self.factory.current_protocol not in (None, self): | ||
867 | 123 | self.factory.current_protocol.transport.loseConnection() | ||
868 | 124 | self.factory.current_protocol = self | ||
869 | 125 | self.factory.observer.connected() | ||
870 | 126 | |||
871 | 127 | @log_timing | ||
872 | 128 | def connectionLost(self, reason=None): | ||
873 | 129 | """Callback for established connection lost""" | ||
874 | 130 | if self.factory.current_protocol is self: | ||
875 | 131 | self.factory.current_protocol = None | ||
876 | 132 | self.factory.observer.disconnected(reason) | ||
877 | 133 | |||
878 | 134 | |||
879 | 135 | class SyncClientFactory(StorageClientFactory): | ||
880 | 136 | """A cmd protocol factory.""" | ||
881 | 137 | # no init: pylint: disable-msg=W0232 | ||
882 | 138 | |||
883 | 139 | protocol = SyncStorageClient | ||
884 | 140 | |||
885 | 141 | @log_timing | ||
886 | 142 | def __init__(self, observer): | ||
887 | 143 | """Create the factory""" | ||
888 | 144 | self.observer = observer | ||
889 | 145 | self.current_protocol = None | ||
890 | 146 | |||
891 | 147 | @log_timing | ||
892 | 148 | def clientConnectionFailed(self, connector, reason): | ||
893 | 149 | """We failed at connecting.""" | ||
894 | 150 | self.current_protocol = None | ||
895 | 151 | self.observer.connection_failed(reason) | ||
896 | 152 | |||
897 | 153 | |||
898 | 154 | class UnsupportedOperationError(Exception): | ||
899 | 155 | """The operation is unsupported by the protocol version.""" | ||
900 | 156 | |||
901 | 157 | |||
902 | 158 | class ConnectionError(Exception): | ||
903 | 159 | """A connection error.""" | ||
904 | 160 | |||
905 | 161 | |||
906 | 162 | class AuthenticationError(Exception): | ||
907 | 163 | """An authentication error.""" | ||
908 | 164 | |||
909 | 165 | |||
910 | 166 | class NoSuchShareError(Exception): | ||
911 | 167 | """Error when there is no such share available.""" | ||
912 | 168 | |||
913 | 169 | |||
914 | 170 | class CapabilitiesError(Exception): | ||
915 | 171 | """A capabilities set/query related error.""" | ||
916 | 172 | |||
917 | 173 | class Client(object): | ||
918 | 174 | """U1 storage client facade.""" | ||
919 | 175 | required_caps = frozenset(["no-content", "fix462230"]) | ||
920 | 176 | |||
921 | 177 | def __init__(self, realm, reactor=reactor): | ||
922 | 178 | """Create the instance.""" | ||
923 | 179 | |||
924 | 180 | self.reactor = reactor | ||
925 | 181 | self.factory = SyncClientFactory(self) | ||
926 | 182 | |||
927 | 183 | self._status_lock = Lock() | ||
928 | 184 | self._status = "disconnected" | ||
929 | 185 | self._status_reason = None | ||
930 | 186 | self._status_waiting = [] | ||
931 | 187 | self._active_waiters = set() | ||
932 | 188 | self.consumer_key = CONSUMER_KEY | ||
933 | 189 | self.consumer_secret = "hammertime" | ||
934 | 190 | |||
935 | 191 | def force_shutdown(self): | ||
936 | 192 | """Forces the client to shut itself down.""" | ||
937 | 193 | with self._status_lock: | ||
938 | 194 | self._status = "forced_shutdown" | ||
939 | 195 | self._reason = None | ||
940 | 196 | for waiter in self._active_waiters: | ||
941 | 197 | waiter.wakeAndRaise((ForcedShutdown("Forced shutdown"), | ||
942 | 198 | None, None)) | ||
943 | 199 | self._active_waiters.clear() | ||
944 | 200 | |||
945 | 201 | def _get_waiter_locked(self): | ||
946 | 202 | """Gets a wait object for blocking waits. Should be called with the | ||
947 | 203 | status lock held. | ||
948 | 204 | """ | ||
949 | 205 | waiter = Waiter() | ||
950 | 206 | if self._status == "forced_shutdown": | ||
951 | 207 | raise ForcedShutdown("Forced shutdown") | ||
952 | 208 | self._active_waiters.add(waiter) | ||
953 | 209 | return waiter | ||
954 | 210 | |||
955 | 211 | def _get_waiter(self): | ||
956 | 212 | """Get a wait object for blocking waits. Acquires the status lock.""" | ||
957 | 213 | with self._status_lock: | ||
958 | 214 | return self._get_waiter_locked() | ||
959 | 215 | |||
960 | 216 | def _wait(self, waiter): | ||
961 | 217 | """Waits for the waiter.""" | ||
962 | 218 | try: | ||
963 | 219 | return waiter.wait() | ||
964 | 220 | finally: | ||
965 | 221 | with self._status_lock: | ||
966 | 222 | if waiter in self._active_waiters: | ||
967 | 223 | self._active_waiters.remove(waiter) | ||
968 | 224 | |||
969 | 225 | @log_timing | ||
970 | 226 | def _change_status(self, status, reason=None): | ||
971 | 227 | """Changes the client status. Usually called from the reactor | ||
972 | 228 | thread. | ||
973 | 229 | |||
974 | 230 | """ | ||
975 | 231 | with self._status_lock: | ||
976 | 232 | if self._status == "forced_shutdown": | ||
977 | 233 | return | ||
978 | 234 | self._status = status | ||
979 | 235 | self._status_reason = reason | ||
980 | 236 | waiting = self._status_waiting | ||
981 | 237 | self._status_waiting = [] | ||
982 | 238 | for waiter in waiting: | ||
983 | 239 | waiter.wake((status, reason)) | ||
984 | 240 | |||
985 | 241 | @log_timing | ||
986 | 242 | def _await_status_not(self, *ignore_statuses): | ||
987 | 243 | """Blocks until the client status changes, returning the new status. | ||
988 | 244 | Should never be called from the reactor thread. | ||
989 | 245 | |||
990 | 246 | """ | ||
991 | 247 | with self._status_lock: | ||
992 | 248 | status = self._status | ||
993 | 249 | reason = self._status_reason | ||
994 | 250 | while status in ignore_statuses: | ||
995 | 251 | waiter = self._get_waiter_locked() | ||
996 | 252 | self._status_waiting.append(waiter) | ||
997 | 253 | self._status_lock.release() | ||
998 | 254 | try: | ||
999 | 255 | status, reason = self._wait(waiter) | ||
1000 | 256 | finally: | ||
1001 | 257 | self._status_lock.acquire() | ||
1002 | 258 | if status == "forced_shutdown": | ||
1003 | 259 | raise ForcedShutdown("Forced shutdown.") | ||
1004 | 260 | return (status, reason) | ||
1005 | 261 | |||
1006 | 262 | def connection_failed(self, reason): | ||
1007 | 263 | """Notification that connection failed.""" | ||
1008 | 264 | self._change_status("disconnected", reason) | ||
1009 | 265 | |||
1010 | 266 | def connected(self): | ||
1011 | 267 | """Notification that connection succeeded.""" | ||
1012 | 268 | self._change_status("connected") | ||
1013 | 269 | |||
1014 | 270 | def disconnected(self, reason): | ||
1015 | 271 | """Notification that we were disconnected.""" | ||
1016 | 272 | self._change_status("disconnected", reason) | ||
1017 | 273 | |||
1018 | 274 | def defer_from_thread(self, function, *args, **kwargs): | ||
1019 | 275 | """Do twisted defer magic to get results and show exceptions.""" | ||
1020 | 276 | waiter = self._get_waiter() | ||
1021 | 277 | @log_timing | ||
1022 | 278 | def runner(): | ||
1023 | 279 | """inner.""" | ||
1024 | 280 | # we do want to catch all | ||
1025 | 281 | # no init: pylint: disable-msg=W0703 | ||
1026 | 282 | try: | ||
1027 | 283 | d = function(*args, **kwargs) | ||
1028 | 284 | if isinstance(d, defer.Deferred): | ||
1029 | 285 | d.addCallbacks(lambda r: waiter.wake((r, None, None)), | ||
1030 | 286 | lambda f: waiter.wake((None, None, f))) | ||
1031 | 287 | else: | ||
1032 | 288 | waiter.wake((d, None, None)) | ||
1033 | 289 | except Exception: | ||
1034 | 290 | waiter.wake((None, sys.exc_info(), None)) | ||
1035 | 291 | |||
1036 | 292 | self.reactor.callFromThread(runner) | ||
1037 | 293 | result, exc_info, failure = self._wait(waiter) | ||
1038 | 294 | if exc_info: | ||
1039 | 295 | try: | ||
1040 | 296 | raise exc_info[0], exc_info[1], exc_info[2] | ||
1041 | 297 | finally: | ||
1042 | 298 | exc_info = None | ||
1043 | 299 | elif failure: | ||
1044 | 300 | failure.raiseException() | ||
1045 | 301 | else: | ||
1046 | 302 | return result | ||
1047 | 303 | |||
1048 | 304 | @log_timing | ||
1049 | 305 | def connect(self, host, port): | ||
1050 | 306 | """Connect to host/port.""" | ||
1051 | 307 | def _connect(): | ||
1052 | 308 | """Deferred part.""" | ||
1053 | 309 | self.reactor.connectTCP(host, port, self.factory) | ||
1054 | 310 | self._connect_inner(_connect) | ||
1055 | 311 | |||
1056 | 312 | @log_timing | ||
1057 | 313 | def connect_ssl(self, host, port, no_verify): | ||
1058 | 314 | """Connect to host/port using ssl.""" | ||
1059 | 315 | def _connect(): | ||
1060 | 316 | """deferred part.""" | ||
1061 | 317 | ctx = get_ssl_context(no_verify) | ||
1062 | 318 | self.reactor.connectSSL(host, port, self.factory, ctx) | ||
1063 | 319 | self._connect_inner(_connect) | ||
1064 | 320 | |||
1065 | 321 | @log_timing | ||
1066 | 322 | def _connect_inner(self, _connect): | ||
1067 | 323 | """Helper function for connecting.""" | ||
1068 | 324 | self._change_status("connecting") | ||
1069 | 325 | self.reactor.callFromThread(_connect) | ||
1070 | 326 | status, reason = self._await_status_not("connecting") | ||
1071 | 327 | if status != "connected": | ||
1072 | 328 | raise ConnectionError(reason.value) | ||
1073 | 329 | |||
1074 | 330 | @log_timing | ||
1075 | 331 | def disconnect(self): | ||
1076 | 332 | """Disconnect.""" | ||
1077 | 333 | if self.factory.current_protocol is not None: | ||
1078 | 334 | self.reactor.callFromThread( | ||
1079 | 335 | self.factory.current_protocol.transport.loseConnection) | ||
1080 | 336 | self._await_status_not("connecting", "connected", "authenticated") | ||
1081 | 337 | |||
1082 | 338 | @log_timing | ||
1083 | 339 | def set_capabilities(self): | ||
1084 | 340 | """Set the capabilities with the server""" | ||
1085 | 341 | |||
1086 | 342 | client = self.factory.current_protocol | ||
1087 | 343 | @log_timing | ||
1088 | 344 | def set_caps_callback(req): | ||
1089 | 345 | "Caps query succeeded" | ||
1090 | 346 | if not req.accepted: | ||
1091 | 347 | de = defer.fail("The server denied setting %s capabilities" % \ | ||
1092 | 348 | req.caps) | ||
1093 | 349 | return de | ||
1094 | 350 | |||
1095 | 351 | @log_timing | ||
1096 | 352 | def query_caps_callback(req): | ||
1097 | 353 | "Caps query succeeded" | ||
1098 | 354 | if req.accepted: | ||
1099 | 355 | set_d = client.set_caps(self.required_caps) | ||
1100 | 356 | set_d.addCallback(set_caps_callback) | ||
1101 | 357 | return set_d | ||
1102 | 358 | else: | ||
1103 | 359 | # the server doesn't have the requested capabilities. | ||
1104 | 360 | # return a failure for now, in the future we might want | ||
1105 | 361 | # to reconnect to another server | ||
1106 | 362 | de = defer.fail("The server don't have the requested" | ||
1107 | 363 | " capabilities: %s" % str(req.caps)) | ||
1108 | 364 | return de | ||
1109 | 365 | |||
1110 | 366 | @log_timing | ||
1111 | 367 | def _wrapped_set_capabilities(): | ||
1112 | 368 | """Wrapped set_capabilities """ | ||
1113 | 369 | d = client.query_caps(self.required_caps) | ||
1114 | 370 | d.addCallback(query_caps_callback) | ||
1115 | 371 | return d | ||
1116 | 372 | |||
1117 | 373 | try: | ||
1118 | 374 | self.defer_from_thread(_wrapped_set_capabilities) | ||
1119 | 375 | except request.StorageProtocolError, e: | ||
1120 | 376 | raise CapabilitiesError(e) | ||
1121 | 377 | |||
1122 | 378 | @log_timing | ||
1123 | 379 | def get_root_info(self, volume_uuid): | ||
1124 | 380 | """Returns the UUID of the applicable share root.""" | ||
1125 | 381 | if volume_uuid is None: | ||
1126 | 382 | _get_root = self.factory.current_protocol.get_root | ||
1127 | 383 | root = self.defer_from_thread(_get_root) | ||
1128 | 384 | return (uuid.UUID(root), True) | ||
1129 | 385 | else: | ||
1130 | 386 | str_volume_uuid = str(volume_uuid) | ||
1131 | 387 | volume = self._match_volume(lambda v: \ | ||
1132 | 388 | str(v.volume_id) == str_volume_uuid) | ||
1133 | 389 | if isinstance(volume, volumes.ShareVolume): | ||
1134 | 390 | modify = volume.access_level == "Modify" | ||
1135 | 391 | if isinstance(volume, volumes.UDFVolume): | ||
1136 | 392 | modify = True | ||
1137 | 393 | return (uuid.UUID(str(volume.node_id)), modify) | ||
1138 | 394 | |||
1139 | 395 | @log_timing | ||
1140 | 396 | def resolve_path(self, share_uuid, root_uuid, path): | ||
1141 | 397 | """Resolve path relative to the given root node.""" | ||
1142 | 398 | |||
1143 | 399 | @inlineCallbacks | ||
1144 | 400 | def _resolve_worker(): | ||
1145 | 401 | """Path resolution worker.""" | ||
1146 | 402 | node_uuid = root_uuid | ||
1147 | 403 | local_path = path.strip('/') | ||
1148 | 404 | |||
1149 | 405 | while local_path != '': | ||
1150 | 406 | local_path, name = os.path.split(local_path) | ||
1151 | 407 | hashes = yield self._get_node_hashes(share_uuid, [root_uuid]) | ||
1152 | 408 | content_hash = hashes.get(root_uuid, None) | ||
1153 | 409 | if content_hash is None: | ||
1154 | 410 | raise KeyError, "Content hash not available" | ||
1155 | 411 | entries = yield self._get_raw_dir_entries(share_uuid, | ||
1156 | 412 | root_uuid, | ||
1157 | 413 | content_hash) | ||
1158 | 414 | match_name = name.decode('utf-8') | ||
1159 | 415 | match = None | ||
1160 | 416 | for entry in entries: | ||
1161 | 417 | if match_name == entry.name: | ||
1162 | 418 | match = entry | ||
1163 | 419 | break | ||
1164 | 420 | |||
1165 | 421 | if match is None: | ||
1166 | 422 | raise KeyError, "Path not found" | ||
1167 | 423 | |||
1168 | 424 | node_uuid = uuid.UUID(match.node) | ||
1169 | 425 | |||
1170 | 426 | returnValue(node_uuid) | ||
1171 | 427 | |||
1172 | 428 | return self.defer_from_thread(_resolve_worker) | ||
1173 | 429 | |||
1174 | 430 | @log_timing | ||
1175 | 431 | def oauth_from_token(self, token): | ||
1176 | 432 | """Perform OAuth authorisation using an existing token.""" | ||
1177 | 433 | |||
1178 | 434 | consumer = OAuthConsumer(self.consumer_key, self.consumer_secret) | ||
1179 | 435 | |||
1180 | 436 | def _auth_successful(value): | ||
1181 | 437 | """Callback for successful auth. Changes status to | ||
1182 | 438 | authenticated.""" | ||
1183 | 439 | self._change_status("authenticated") | ||
1184 | 440 | return value | ||
1185 | 441 | |||
1186 | 442 | def _auth_failed(value): | ||
1187 | 443 | """Callback for failed auth. Disconnects.""" | ||
1188 | 444 | self.factory.current_protocol.transport.loseConnection() | ||
1189 | 445 | return value | ||
1190 | 446 | |||
1191 | 447 | def _wrapped_authenticate(): | ||
1192 | 448 | """Wrapped authenticate.""" | ||
1193 | 449 | d = self.factory.current_protocol.oauth_authenticate(consumer, | ||
1194 | 450 | token) | ||
1195 | 451 | d.addCallbacks(_auth_successful, _auth_failed) | ||
1196 | 452 | return d | ||
1197 | 453 | |||
1198 | 454 | try: | ||
1199 | 455 | self.defer_from_thread(_wrapped_authenticate) | ||
1200 | 456 | except request.StorageProtocolError, e: | ||
1201 | 457 | raise AuthenticationError(e) | ||
1202 | 458 | status, reason = self._await_status_not("connected") | ||
1203 | 459 | if status != "authenticated": | ||
1204 | 460 | raise AuthenticationError(reason.value) | ||
1205 | 461 | |||
1206 | 462 | @log_timing | ||
1207 | 463 | def find_volume(self, volume_spec): | ||
1208 | 464 | """Finds a share matching the given UUID. Looks at both share UUIDs | ||
1209 | 465 | and root node UUIDs.""" | ||
1210 | 466 | volume = self._match_volume(lambda s: \ | ||
1211 | 467 | str(s.volume_id) == volume_spec or \ | ||
1212 | 468 | str(s.node_id) == volume_spec) | ||
1213 | 469 | return uuid.UUID(str(volume.volume_id)) | ||
1214 | 470 | |||
1215 | 471 | @log_timing | ||
1216 | 472 | def _match_volume(self, predicate): | ||
1217 | 473 | """Finds a volume matching the given predicate.""" | ||
1218 | 474 | _list_shares = self.factory.current_protocol.list_volumes | ||
1219 | 475 | r = self.defer_from_thread(_list_shares) | ||
1220 | 476 | for volume in r.volumes: | ||
1221 | 477 | if predicate(volume): | ||
1222 | 478 | return volume | ||
1223 | 479 | raise NoSuchShareError() | ||
1224 | 480 | |||
1225 | 481 | @log_timing | ||
1226 | 482 | def build_tree(self, share_uuid, root_uuid): | ||
1227 | 483 | """Builds and returns a tree representing the metadata for the given | ||
1228 | 484 | subtree in the given share. | ||
1229 | 485 | |||
1230 | 486 | @param share_uuid: the share UUID or None for the user's volume | ||
1231 | 487 | @param root_uuid: the root UUID of the subtree (must be a directory) | ||
1232 | 488 | @return: a MergeNode tree | ||
1233 | 489 | |||
1234 | 490 | """ | ||
1235 | 491 | root = MergeNode(node_type=DIRECTORY, uuid=root_uuid) | ||
1236 | 492 | |||
1237 | 493 | @log_timing | ||
1238 | 494 | @inlineCallbacks | ||
1239 | 495 | def _get_root_content_hash(): | ||
1240 | 496 | """Obtain the content hash for the root node.""" | ||
1241 | 497 | result = yield self._get_node_hashes(share_uuid, [root_uuid]) | ||
1242 | 498 | returnValue(result.get(root_uuid, None)) | ||
1243 | 499 | |||
1244 | 500 | root.content_hash = self.defer_from_thread(_get_root_content_hash) | ||
1245 | 501 | if root.content_hash is None: | ||
1246 | 502 | raise ValueError("No content available for node %s" % root_uuid) | ||
1247 | 503 | |||
1248 | 504 | @log_timing | ||
1249 | 505 | @inlineCallbacks | ||
1250 | 506 | def _get_children(parent_uuid, parent_content_hash): | ||
1251 | 507 | """Obtain a sequence of MergeNodes corresponding to a node's | ||
1252 | 508 | immediate children. | ||
1253 | 509 | |||
1254 | 510 | """ | ||
1255 | 511 | entries = yield self._get_raw_dir_entries(share_uuid, | ||
1256 | 512 | parent_uuid, | ||
1257 | 513 | parent_content_hash) | ||
1258 | 514 | children = {} | ||
1259 | 515 | for entry in entries: | ||
1260 | 516 | if should_sync(entry.name): | ||
1261 | 517 | child = MergeNode(node_type=entry.node_type, | ||
1262 | 518 | uuid=uuid.UUID(entry.node)) | ||
1263 | 519 | children[entry.name] = child | ||
1264 | 520 | |||
1265 | 521 | child_uuids = [child.uuid for child in children.itervalues()] | ||
1266 | 522 | content_hashes = yield self._get_node_hashes(share_uuid, | ||
1267 | 523 | child_uuids) | ||
1268 | 524 | for child in children.itervalues(): | ||
1269 | 525 | child.content_hash = content_hashes.get(child.uuid, None) | ||
1270 | 526 | |||
1271 | 527 | returnValue(children) | ||
1272 | 528 | |||
1273 | 529 | need_children = [root] | ||
1274 | 530 | while need_children: | ||
1275 | 531 | node = need_children.pop() | ||
1276 | 532 | if node.content_hash is not None: | ||
1277 | 533 | children = self.defer_from_thread(_get_children, node.uuid, | ||
1278 | 534 | node.content_hash) | ||
1279 | 535 | node.children = children | ||
1280 | 536 | for child in children.itervalues(): | ||
1281 | 537 | if child.node_type == DIRECTORY: | ||
1282 | 538 | need_children.append(child) | ||
1283 | 539 | |||
1284 | 540 | return root | ||
1285 | 541 | |||
1286 | 542 | @log_timing | ||
1287 | 543 | def _get_raw_dir_entries(self, share_uuid, node_uuid, content_hash): | ||
1288 | 544 | """Gets raw dir entries for the given directory.""" | ||
1289 | 545 | d = self.factory.current_protocol.get_content(share_str(share_uuid), | ||
1290 | 546 | str(node_uuid), | ||
1291 | 547 | content_hash) | ||
1292 | 548 | d.addCallback(lambda c: zlib.decompress(c.data)) | ||
1293 | 549 | |||
1294 | 550 | def _parse_content(raw_content): | ||
1295 | 551 | """Parses directory content into a list of entry objects.""" | ||
1296 | 552 | unserialized_content = DirectoryContent() | ||
1297 | 553 | unserialized_content.ParseFromString(raw_content) | ||
1298 | 554 | return list(unserialized_content.entries) | ||
1299 | 555 | |||
1300 | 556 | d.addCallback(_parse_content) | ||
1301 | 557 | return d | ||
1302 | 558 | |||
1303 | 559 | @log_timing | ||
1304 | 560 | def download_string(self, share_uuid, node_uuid, content_hash): | ||
1305 | 561 | """Reads a file from the server into a string.""" | ||
1306 | 562 | output = StringIO() | ||
1307 | 563 | self._download_inner(share_uuid=share_uuid, node_uuid=node_uuid, | ||
1308 | 564 | content_hash=content_hash, output=output) | ||
1309 | 565 | return output.getvalue() | ||
1310 | 566 | |||
1311 | 567 | @log_timing | ||
1312 | 568 | def download_file(self, share_uuid, node_uuid, content_hash, filename): | ||
1313 | 569 | """Downloads a file from the server.""" | ||
1314 | 570 | # write to a .u1partial temporary file first so that a failed or | ||
1315 | 571 | # interrupted transfer never leaves a half-written file in place | ||
1316 | 572 | partial_filename = "%s.u1partial" % filename | ||
1317 | 573 | output = open(partial_filename, "w") | ||
1318 | 574 | |||
1319 | 575 | @log_timing | ||
1320 | 576 | def rename_file(): | ||
1321 | 577 | """Renames the temporary file to the final name.""" | ||
1322 | 578 | output.close() | ||
1323 | 579 | print "Finished downloading %s" % filename | ||
1324 | 580 | os.rename(partial_filename, filename) | ||
1325 | 581 | |||
1326 | 582 | @log_timing | ||
1327 | 583 | def delete_file(): | ||
1328 | 584 | """Deletes the temporary file.""" | ||
1329 | 585 | output.close() | ||
1330 | 586 | os.unlink(partial_filename) | ||
1331 | 587 | |||
1332 | 588 | self._download_inner(share_uuid=share_uuid, node_uuid=node_uuid, | ||
1333 | 589 | content_hash=content_hash, output=output, | ||
1334 | 590 | on_success=rename_file, on_failure=delete_file) | ||
1335 | 591 | |||
1336 | 592 | @log_timing | ||
1337 | 593 | def _download_inner(self, share_uuid, node_uuid, content_hash, output, | ||
1338 | 594 | on_success=lambda: None, on_failure=lambda: None): | ||
1339 | 595 | """Helper function for content downloads.""" | ||
1340 | 596 | dec = zlib.decompressobj() | ||
1341 | 597 | |||
1342 | 598 | @log_timing | ||
1343 | 599 | def write_data(data): | ||
1344 | 600 | """Helper which writes data to the output file.""" | ||
1345 | 601 | uncompressed_data = dec.decompress(data) | ||
1346 | 602 | output.write(uncompressed_data) | ||
1347 | 603 | |||
1348 | 604 | @log_timing | ||
1349 | 605 | def finish_download(value): | ||
1350 | 606 | """Helper which finishes the download.""" | ||
1351 | 607 | uncompressed_data = dec.flush() | ||
1352 | 608 | output.write(uncompressed_data) | ||
1353 | 609 | on_success() | ||
1354 | 610 | return value | ||
1355 | 611 | |||
1356 | 612 | @log_timing | ||
1357 | 613 | def abort_download(value): | ||
1358 | 614 | """Helper which aborts the download.""" | ||
1359 | 615 | on_failure() | ||
1360 | 616 | return value | ||
1361 | 617 | |||
1362 | 618 | @log_timing | ||
1363 | 619 | def _download(): | ||
1364 | 620 | """Async helper.""" | ||
1365 | 621 | _get_content = self.factory.current_protocol.get_content | ||
1366 | 622 | d = _get_content(share_str(share_uuid), str(node_uuid), | ||
1367 | 623 | content_hash, callback=write_data) | ||
1368 | 624 | d.addCallbacks(finish_download, abort_download) | ||
1369 | 625 | return d | ||
1370 | 626 | |||
1371 | 627 | self.defer_from_thread(_download) | ||
1372 | 628 | |||
1373 | 629 | @log_timing | ||
1374 | 630 | def create_directory(self, share_uuid, parent_uuid, name): | ||
1375 | 631 | """Creates a directory on the server.""" | ||
1376 | 632 | r = self.defer_from_thread(self.factory.current_protocol.make_dir, | ||
1377 | 633 | share_str(share_uuid), str(parent_uuid), | ||
1378 | 634 | name) | ||
1379 | 635 | return uuid.UUID(r.new_id) | ||
1380 | 636 | |||
1381 | 637 | @log_timing | ||
1382 | 638 | def create_file(self, share_uuid, parent_uuid, name): | ||
1383 | 639 | """Creates a file on the server.""" | ||
1384 | 640 | r = self.defer_from_thread(self.factory.current_protocol.make_file, | ||
1385 | 641 | share_str(share_uuid), str(parent_uuid), | ||
1386 | 642 | name) | ||
1387 | 643 | return uuid.UUID(r.new_id) | ||
1388 | 644 | |||
1389 | 645 | @log_timing | ||
1390 | 646 | def create_symlink(self, share_uuid, parent_uuid, name, target): | ||
1391 | 647 | """Creates a symlink on the server.""" | ||
1392 | 648 | raise UnsupportedOperationError("Protocol does not support symlinks") | ||
1393 | 649 | |||
1394 | 650 | @log_timing | ||
1395 | 651 | def upload_string(self, share_uuid, node_uuid, old_content_hash, | ||
1396 | 652 | content_hash, content): | ||
1397 | 653 | """Uploads a string to the server as file content.""" | ||
1398 | 654 | crc = crc32(content, 0) | ||
1399 | 655 | compressed_content = zlib.compress(content, 9) | ||
1400 | 656 | compressed = StringIO(compressed_content) | ||
1401 | 657 | self.defer_from_thread(self.factory.current_protocol.put_content, | ||
1402 | 658 | share_str(share_uuid), str(node_uuid), | ||
1403 | 659 | old_content_hash, content_hash, | ||
1404 | 660 | crc, len(content), len(compressed_content), | ||
1405 | 661 | compressed) | ||
1406 | 662 | |||
1407 | 663 | @log_timing | ||
1408 | 664 | def upload_file(self, share_uuid, node_uuid, old_content_hash, | ||
1409 | 665 | content_hash, filename): | ||
1410 | 666 | """Uploads a file to the server.""" | ||
1411 | 667 | parent_dir = os.path.split(filename)[0] | ||
1412 | 668 | unique_filename = os.path.join(parent_dir, "." + str(uuid.uuid4())) | ||
1413 | 669 | |||
1414 | 670 | |||
1415 | 671 | class StagingFile(object): | ||
1416 | 672 | """An object which tracks data being compressed for staging.""" | ||
1417 | 673 | def __init__(self, stream): | ||
1418 | 674 | """Initialize a compression object.""" | ||
1419 | 675 | self.crc32 = 0 | ||
1420 | 676 | self.enc = zlib.compressobj(9) | ||
1421 | 677 | self.size = 0 | ||
1422 | 678 | self.compressed_size = 0 | ||
1423 | 679 | self.stream = stream | ||
1424 | 680 | |||
1425 | 681 | def write(self, bytes): | ||
1426 | 682 | """Compress bytes, keeping track of length and crc32.""" | ||
1427 | 683 | self.size += len(bytes) | ||
1428 | 684 | self.crc32 = crc32(bytes, self.crc32) | ||
1429 | 685 | compressed_bytes = self.enc.compress(bytes) | ||
1430 | 686 | self.compressed_size += len(compressed_bytes) | ||
1431 | 687 | self.stream.write(compressed_bytes) | ||
1432 | 688 | |||
1433 | 689 | def finish(self): | ||
1434 | 690 | """Finish staging compressed data.""" | ||
1435 | 691 | compressed_bytes = self.enc.flush() | ||
1436 | 692 | self.compressed_size += len(compressed_bytes) | ||
1437 | 693 | self.stream.write(compressed_bytes) | ||
1438 | 694 | |||
1439 | 695 | with open(unique_filename, "w+") as compressed: | ||
1440 | 696 | os.unlink(unique_filename) | ||
1441 | 697 | with open(filename, "r") as original: | ||
1442 | 698 | staging = StagingFile(compressed) | ||
1443 | 699 | shutil.copyfileobj(original, staging) | ||
1444 | 700 | staging.finish() | ||
1445 | 701 | compressed.seek(0) | ||
1446 | 702 | self.defer_from_thread(self.factory.current_protocol.put_content, | ||
1447 | 703 | share_str(share_uuid), str(node_uuid), | ||
1448 | 704 | old_content_hash, content_hash, | ||
1449 | 705 | staging.crc32, | ||
1450 | 706 | staging.size, staging.compressed_size, | ||
1451 | 707 | compressed) | ||
1452 | 708 | |||
1453 | 709 | @log_timing | ||
1454 | 710 | def move(self, share_uuid, parent_uuid, name, node_uuid): | ||
1455 | 711 | """Moves a file on the server.""" | ||
1456 | 712 | self.defer_from_thread(self.factory.current_protocol.move, | ||
1457 | 713 | share_str(share_uuid), str(node_uuid), | ||
1458 | 714 | str(parent_uuid), name) | ||
1459 | 715 | |||
1460 | 716 | @log_timing | ||
1461 | 717 | def unlink(self, share_uuid, node_uuid): | ||
1462 | 718 | """Unlinks a file on the server.""" | ||
1463 | 719 | self.defer_from_thread(self.factory.current_protocol.unlink, | ||
1464 | 720 | share_str(share_uuid), str(node_uuid)) | ||
1465 | 721 | |||
1466 | 722 | @log_timing | ||
1467 | 723 | def _get_node_hashes(self, share_uuid, node_uuids): | ||
1468 | 724 | """Fetches hashes for the given nodes.""" | ||
1469 | 725 | share = share_str(share_uuid) | ||
1470 | 726 | queries = [(share, str(node_uuid), request.UNKNOWN_HASH) \ | ||
1471 | 727 | for node_uuid in node_uuids] | ||
1472 | 728 | d = self.factory.current_protocol.query(queries) | ||
1473 | 729 | |||
1474 | 730 | @log_timing | ||
1475 | 731 | def _collect_hashes(multi_result): | ||
1476 | 732 | """Accumulate hashes from query replies.""" | ||
1477 | 733 | hashes = {} | ||
1478 | 734 | for (success, value) in multi_result: | ||
1479 | 735 | if success: | ||
1480 | 736 | for node_state in value.response: | ||
1481 | 737 | node_uuid = uuid.UUID(node_state.node) | ||
1482 | 738 | hashes[node_uuid] = node_state.hash | ||
1483 | 739 | return hashes | ||
1484 | 740 | |||
1485 | 741 | d.addCallback(_collect_hashes) | ||
1486 | 742 | return d | ||
1487 | 743 | |||
1488 | 744 | @log_timing | ||
1489 | 745 | def get_incoming_shares(self): | ||
1490 | 746 | """Returns a list of incoming shares as (name, uuid, | ||
1491 | 747 | other_visible_name, accepted, access_level) tuples. | ||
1492 | 748 | |||
1493 | 749 | """ | ||
1494 | 750 | _list_shares = self.factory.current_protocol.list_shares | ||
1495 | 751 | r = self.defer_from_thread(_list_shares) | ||
1496 | 752 | return [(s.name, s.id, s.other_visible_name, | ||
1497 | 753 | s.accepted, s.access_level) \ | ||
1498 | 754 | for s in r.shares if s.direction == "to_me"] | ||
1499 | 0 | 755 | ||
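The `StagingFile` helper in `client.py` above compresses data on the fly while tracking the CRC32 and both the raw and compressed sizes that the server's `put_content` call needs. A standalone sketch of the same pattern, with no protocol dependencies (the round-trip check at the end is illustrative, not part of the original class):

```python
import io
import zlib
from zlib import crc32


class StagingFile(object):
    """Compress bytes into a stream while tracking crc32 and sizes."""

    def __init__(self, stream):
        self.crc32 = 0
        self.enc = zlib.compressobj(9)
        self.size = 0
        self.compressed_size = 0
        self.stream = stream

    def write(self, data):
        """Compress data, keeping track of length and crc32."""
        self.size += len(data)
        self.crc32 = crc32(data, self.crc32)
        compressed = self.enc.compress(data)
        self.compressed_size += len(compressed)
        self.stream.write(compressed)

    def finish(self):
        """Flush any buffered compressed data to the stream."""
        compressed = self.enc.flush()
        self.compressed_size += len(compressed)
        self.stream.write(compressed)


# Stage some content and verify it round-trips through zlib.
payload = b"hello " * 1000
buf = io.BytesIO()
staging = StagingFile(buf)
staging.write(payload)
staging.finish()
assert staging.size == len(payload)
assert zlib.decompress(buf.getvalue()) == payload
assert staging.crc32 == crc32(payload, 0)
```

Writing through a file-like wrapper like this is what lets `upload_file` reuse `shutil.copyfileobj` to stream the original file into the staging area.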
1500 | === added file 'src/u1sync/constants.py' | |||
1501 | --- src/u1sync/constants.py 1970-01-01 00:00:00 +0000 | |||
1502 | +++ src/u1sync/constants.py 2010-08-27 20:21:13 +0000 | |||
1503 | @@ -0,0 +1,30 @@ | |||
1504 | 1 | # ubuntuone.u1sync.constants | ||
1505 | 2 | # | ||
1506 | 3 | # u1sync constants | ||
1507 | 4 | # | ||
1508 | 5 | # Author: Tim Cole <tim.cole@canonical.com> | ||
1509 | 6 | # | ||
1510 | 7 | # Copyright 2009 Canonical Ltd. | ||
1511 | 8 | # | ||
1512 | 9 | # This program is free software: you can redistribute it and/or modify it | ||
1513 | 10 | # under the terms of the GNU General Public License version 3, as published | ||
1514 | 11 | # by the Free Software Foundation. | ||
1515 | 12 | # | ||
1516 | 13 | # This program is distributed in the hope that it will be useful, but | ||
1517 | 14 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
1518 | 15 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
1519 | 16 | # PURPOSE. See the GNU General Public License for more details. | ||
1520 | 17 | # | ||
1521 | 18 | # You should have received a copy of the GNU General Public License along | ||
1522 | 19 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
1523 | 20 | """Assorted constant definitions which don't fit anywhere else.""" | ||
1524 | 21 | |||
1525 | 22 | import re | ||
1526 | 23 | |||
1527 | 24 | # the name of the directory u1sync uses to keep metadata about a mirror | ||
1528 | 25 | METADATA_DIR_NAME = u".ubuntuone-sync" | ||
1529 | 26 | |||
1530 | 27 | # filenames to ignore | ||
1531 | 28 | SPECIAL_FILE_RE = re.compile(".*\\.(" | ||
1532 | 29 | "(u1)?partial|part|" | ||
1533 | 30 | "(u1)?conflict(\\.[0-9]+)?)$") | ||
1534 | 0 | 31 | ||
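The `SPECIAL_FILE_RE` pattern above is what lets u1sync skip its own temporary and conflict artifacts during a scan. A quick check of which names it catches (the sample filenames are made up for illustration):

```python
import re

# Same pattern as in src/u1sync/constants.py
SPECIAL_FILE_RE = re.compile(".*\\.("
                             "(u1)?partial|part|"
                             "(u1)?conflict(\\.[0-9]+)?)$")

# Temporary download and conflict artifacts are matched...
assert SPECIAL_FILE_RE.match("photo.jpg.u1partial")
assert SPECIAL_FILE_RE.match("notes.txt.conflict")
assert SPECIAL_FILE_RE.match("notes.txt.u1conflict.3")
# ...while ordinary file names are not.
assert SPECIAL_FILE_RE.match("archive.part2") is None
assert SPECIAL_FILE_RE.match("photo.jpg") is None
```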
1535 | === added file 'src/u1sync/genericmerge.py' | |||
1536 | --- src/u1sync/genericmerge.py 1970-01-01 00:00:00 +0000 | |||
1537 | +++ src/u1sync/genericmerge.py 2010-08-27 20:21:13 +0000 | |||
1538 | @@ -0,0 +1,88 @@ | |||
1539 | 1 | # ubuntuone.u1sync.genericmerge | ||
1540 | 2 | # | ||
1541 | 3 | # Generic merge function | ||
1542 | 4 | # | ||
1543 | 5 | # Author: Tim Cole <tim.cole@canonical.com> | ||
1544 | 6 | # | ||
1545 | 7 | # Copyright 2009 Canonical Ltd. | ||
1546 | 8 | # | ||
1547 | 9 | # This program is free software: you can redistribute it and/or modify it | ||
1548 | 10 | # under the terms of the GNU General Public License version 3, as published | ||
1549 | 11 | # by the Free Software Foundation. | ||
1550 | 12 | # | ||
1551 | 13 | # This program is distributed in the hope that it will be useful, but | ||
1552 | 14 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
1553 | 15 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
1554 | 16 | # PURPOSE. See the GNU General Public License for more details. | ||
1555 | 17 | # | ||
1556 | 18 | # You should have received a copy of the GNU General Public License along | ||
1557 | 19 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
1558 | 20 | """A generic abstraction for merge operations on directory trees.""" | ||
1559 | 21 | |||
1560 | 22 | from itertools import chain | ||
1561 | 23 | from ubuntuone.storageprotocol.dircontent_pb2 import DIRECTORY | ||
1562 | 24 | |||
1563 | 25 | class MergeNode(object): | ||
1564 | 26 | """A filesystem node. Should generally be treated as immutable.""" | ||
1565 | 27 | def __init__(self, node_type, content_hash=None, uuid=None, children=None, | ||
1566 | 28 | conflict_info=None): | ||
1567 | 29 | """Initializes a node instance.""" | ||
1568 | 30 | self.node_type = node_type | ||
1569 | 31 | self.children = children | ||
1570 | 32 | self.uuid = uuid | ||
1571 | 33 | self.content_hash = content_hash | ||
1572 | 34 | self.conflict_info = conflict_info | ||
1573 | 35 | |||
1574 | 36 | def __eq__(self, other): | ||
1575 | 37 | """Equality test.""" | ||
1576 | 38 | if type(other) is not type(self): | ||
1577 | 39 | return False | ||
1578 | 40 | return self.node_type == other.node_type and \ | ||
1579 | 41 | self.children == other.children and \ | ||
1580 | 42 | self.uuid == other.uuid and \ | ||
1581 | 43 | self.content_hash == other.content_hash and \ | ||
1582 | 44 | self.conflict_info == other.conflict_info | ||
1583 | 45 | |||
1584 | 46 | def __ne__(self, other): | ||
1585 | 47 | """Non-equality test.""" | ||
1586 | 48 | return not self.__eq__(other) | ||
1587 | 49 | |||
1588 | 50 | |||
1589 | 51 | def show_tree(tree, indent="", name="/"): | ||
1590 | 52 | """Prints a tree.""" | ||
1591 | 53 | # TODO: return a string instead of printing | ||
1592 | 54 | if tree.node_type == DIRECTORY: | ||
1593 | 55 | type_str = "DIR " | ||
1594 | 56 | else: | ||
1595 | 57 | type_str = "FILE" | ||
1596 | 58 | print "%s%-36s %s %s %s" % (indent, tree.uuid, type_str, name, | ||
1597 | 59 | tree.content_hash) | ||
1598 | 60 | if tree.node_type == DIRECTORY and tree.children is not None: | ||
1599 | 61 | for name in sorted(tree.children.keys()): | ||
1600 | 62 | subtree = tree.children[name] | ||
1601 | 63 | show_tree(subtree, indent=" " + indent, name=name) | ||
1602 | 64 | |||
1603 | 65 | def generic_merge(trees, pre_merge, post_merge, partial_parent, name): | ||
1604 | 66 | """Generic tree merging function.""" | ||
1605 | 67 | |||
1606 | 68 | partial_result = pre_merge(nodes=trees, name=name, | ||
1607 | 69 | partial_parent=partial_parent) | ||
1608 | 70 | |||
1609 | 71 | def tree_children(tree): | ||
1610 | 72 | """Returns children if tree is not None""" | ||
1611 | 73 | return tree.children if tree is not None else None | ||
1612 | 74 | |||
1613 | 75 | child_dicts = [tree_children(t) or {} for t in trees] | ||
1614 | 76 | child_names = set(chain(*[cs.iterkeys() for cs in child_dicts])) | ||
1615 | 77 | child_results = {} | ||
1616 | 78 | for child_name in child_names: | ||
1617 | 79 | subtrees = [cs.get(child_name, None) for cs in child_dicts] | ||
1618 | 80 | child_result = generic_merge(trees=subtrees, | ||
1619 | 81 | pre_merge=pre_merge, | ||
1620 | 82 | post_merge=post_merge, | ||
1621 | 83 | partial_parent=partial_result, | ||
1622 | 84 | name=child_name) | ||
1623 | 85 | child_results[child_name] = child_result | ||
1624 | 86 | |||
1625 | 87 | return post_merge(nodes=trees, partial_result=partial_result, | ||
1626 | 88 | child_results=child_results) | ||
1627 | 0 | 89 | ||
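`generic_merge` above walks any number of trees in lockstep, calling `pre_merge` top-down and `post_merge` bottom-up. A self-contained sketch of the same traversal using plain dicts as directories and a `"file"` sentinel as leaves (`iterkeys` swapped for `keys()` so it also runs on Python 3); it counts, per path, how many trees contain that node:

```python
from itertools import chain


def generic_merge(trees, pre_merge, post_merge, partial_parent, name):
    """Walk several trees (dicts, leaves, or None for absent) in lockstep."""
    partial_result = pre_merge(nodes=trees, name=name,
                               partial_parent=partial_parent)

    def children_of(tree):
        """Directories are dicts; leaves and absent nodes have no children."""
        return tree if isinstance(tree, dict) else {}

    child_dicts = [children_of(t) for t in trees]
    child_names = set(chain(*[list(cs.keys()) for cs in child_dicts]))
    child_results = {}
    for child_name in sorted(child_names):
        subtrees = [cs.get(child_name) for cs in child_dicts]
        child_results[child_name] = generic_merge(
            trees=subtrees, pre_merge=pre_merge, post_merge=post_merge,
            partial_parent=partial_result, name=child_name)
    return post_merge(nodes=trees, partial_result=partial_result,
                      child_results=child_results)


# Two toy trees standing in for local and remote state.
local = {"a.txt": "file", "docs": {"readme": "file"}}
remote = {"docs": {"readme": "file", "extra": "file"}}


def pre_merge(nodes, name, partial_parent):
    """Build this node's path and count the trees that contain it."""
    parent_path = partial_parent[0] if partial_parent else ""
    path = parent_path + "/" + name if name else ""
    return (path, sum(n is not None for n in nodes))


def post_merge(nodes, partial_result, child_results):
    """Aggregate per-path presence counts bottom-up."""
    path, present = partial_result
    report = {path: present}
    for child_report in child_results.values():
        report.update(child_report)
    return report


report = generic_merge(trees=[local, remote], pre_merge=pre_merge,
                       post_merge=post_merge, partial_parent=None, name="")
assert report["/a.txt"] == 1        # only in the local tree
assert report["/docs/readme"] == 2  # present in both trees
assert report["/docs/extra"] == 1   # only in the remote tree
```

The real module uses the same shape with `MergeNode` objects, which is why `do_diff` in `main.py` can implement its whole comparison as a `pre_merge`/`post_merge` pair.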
1628 | === added file 'src/u1sync/main.py' | |||
1629 | --- src/u1sync/main.py 1970-01-01 00:00:00 +0000 | |||
1630 | +++ src/u1sync/main.py 2010-08-27 20:21:13 +0000 | |||
1631 | @@ -0,0 +1,360 @@ | |||
1632 | 1 | # ubuntuone.u1sync.main | ||
1633 | 2 | # | ||
1634 | 3 | # Prototype directory sync client | ||
1635 | 4 | # | ||
1636 | 5 | # Author: Lucio Torre <lucio.torre@canonical.com> | ||
1637 | 6 | # Author: Tim Cole <tim.cole@canonical.com> | ||
1638 | 7 | # | ||
1639 | 8 | # Copyright 2009 Canonical Ltd. | ||
1640 | 9 | # | ||
1641 | 10 | # This program is free software: you can redistribute it and/or modify it | ||
1642 | 11 | # under the terms of the GNU General Public License version 3, as published | ||
1643 | 12 | # by the Free Software Foundation. | ||
1644 | 13 | # | ||
1645 | 14 | # This program is distributed in the hope that it will be useful, but | ||
1646 | 15 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
1647 | 16 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
1648 | 17 | # PURPOSE. See the GNU General Public License for more details. | ||
1649 | 18 | # | ||
1650 | 19 | # You should have received a copy of the GNU General Public License along | ||
1651 | 20 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
1652 | 21 | """A prototype directory sync client.""" | ||
1653 | 22 | |||
1654 | 23 | from __future__ import with_statement | ||
1655 | 24 | |||
1656 | 25 | import os | ||
1657 | 26 | import sys | ||
1658 | 27 | import uuid | ||
1659 | 28 | import signal | ||
1660 | 29 | import logging | ||
1661 | 30 | from Queue import Queue | ||
1662 | 31 | from errno import EEXIST | ||
1663 | 32 | from twisted.internet import reactor | ||
1664 | 33 | # import the storage protocol to allow communication with the server | ||
1665 | 34 | import ubuntuone.storageprotocol.dircontent_pb2 as dircontent_pb2 | ||
1666 | 35 | from ubuntuone.storageprotocol.dircontent_pb2 import DIRECTORY, SYMLINK | ||
1667 | 36 | # import helper modules | ||
1668 | 37 | from u1sync.genericmerge import ( | ||
1669 | 38 | show_tree, generic_merge) | ||
1670 | 39 | from u1sync.client import ( | ||
1671 | 40 | ConnectionError, AuthenticationError, NoSuchShareError, | ||
1672 | 41 | ForcedShutdown, Client) | ||
1673 | 42 | from u1sync.scan import scan_directory | ||
1674 | 43 | from u1sync.merge import ( | ||
1675 | 44 | SyncMerge, ClobberServerMerge, ClobberLocalMerge, merge_trees) | ||
1676 | 45 | from u1sync.sync import download_tree, upload_tree | ||
1677 | 46 | from u1sync.utils import safe_mkdir | ||
1678 | 47 | from u1sync import metadata | ||
1679 | 48 | from u1sync.constants import METADATA_DIR_NAME | ||
1680 | 49 | from u1sync.ubuntuone_optparse import UbuntuOneOptionsParser | ||
1681 | 50 | |||
1682 | 51 | # pylint: disable-msg=W0212 | ||
1683 | 52 | NODE_TYPE_ENUM = dircontent_pb2._NODETYPE | ||
1684 | 53 | # pylint: enable-msg=W0212 | ||
1685 | 54 | def node_type_str(node_type): | ||
1686 | 55 | """Converts a numeric node type to a human-readable string.""" | ||
1687 | 56 | return NODE_TYPE_ENUM.values_by_number[node_type].name | ||
1688 | 57 | |||
1689 | 58 | |||
1690 | 59 | class ReadOnlyShareError(Exception): | ||
1691 | 60 | """Share is read-only.""" | ||
1692 | 61 | |||
1693 | 62 | |||
1694 | 63 | class DirectoryAlreadyInitializedError(Exception): | ||
1695 | 64 | """The directory has already been initialized.""" | ||
1696 | 65 | |||
1697 | 66 | |||
1698 | 67 | class DirectoryNotInitializedError(Exception): | ||
1699 | 68 | """The directory has not been initialized.""" | ||
1700 | 69 | |||
1701 | 70 | |||
1702 | 71 | class NoParentError(Exception): | ||
1703 | 72 | """A node has no parent.""" | ||
1704 | 73 | |||
1705 | 74 | |||
1706 | 75 | class TreesDiffer(Exception): | ||
1707 | 76 | """Raised when diff tree differs.""" | ||
1708 | 76 | """Raised when the compared trees differ.""" | ||
1709 | 78 | self.quiet = quiet | ||
1710 | 79 | |||
1711 | 80 | |||
1712 | 81 | MERGE_ACTIONS = { | ||
1713 | 82 | # action: (merge_class, should_upload, should_download) | ||
1714 | 83 | 'sync': (SyncMerge, True, True), | ||
1715 | 84 | 'clobber-server': (ClobberServerMerge, True, False), | ||
1716 | 85 | 'clobber-local': (ClobberLocalMerge, False, True), | ||
1717 | 86 | 'upload': (SyncMerge, True, False), | ||
1718 | 87 | 'download': (SyncMerge, False, True), | ||
1719 | 88 | 'auto': None # special case | ||
1720 | 89 | } | ||
1721 | 90 | |||
1722 | 91 | DEFAULT_MERGE_ACTION = 'auto' | ||
1723 | 92 | |||
1724 | 93 | def do_init(client, share_spec, directory, quiet, subtree_path, | ||
1725 | 94 | metadata=metadata): | ||
1726 | 95 | """Initializes a directory for syncing, and syncs it.""" | ||
1727 | 96 | info = metadata.Metadata() | ||
1728 | 97 | |||
1729 | 98 | if share_spec is not None: | ||
1730 | 99 | info.share_uuid = client.find_volume(share_spec) | ||
1731 | 100 | else: | ||
1732 | 101 | info.share_uuid = None | ||
1733 | 102 | |||
1734 | 103 | if subtree_path is not None: | ||
1735 | 104 | info.path = subtree_path | ||
1736 | 105 | else: | ||
1737 | 106 | info.path = "/" | ||
1738 | 107 | |||
1739 | 108 | logging.info("Initializing directory...") | ||
1740 | 109 | safe_mkdir(directory) | ||
1741 | 110 | |||
1742 | 111 | metadata_dir = os.path.join(directory, METADATA_DIR_NAME) | ||
1743 | 112 | try: | ||
1744 | 113 | os.mkdir(metadata_dir) | ||
1745 | 114 | except OSError, e: | ||
1746 | 115 | if e.errno == EEXIST: | ||
1747 | 116 | raise DirectoryAlreadyInitializedError(directory) | ||
1748 | 117 | else: | ||
1749 | 118 | raise | ||
1750 | 119 | |||
1751 | 120 | logging.info("Writing mirror metadata...") | ||
1752 | 121 | metadata.write(metadata_dir, info) | ||
1753 | 122 | |||
1754 | 123 | logging.info("Done.") | ||
1755 | 124 | |||
1756 | 125 | def do_sync(client, directory, action, dry_run, quiet): | ||
1757 | 126 | """Synchronizes a directory with the given share.""" | ||
1758 | 127 | absolute_path = os.path.abspath(directory) | ||
1759 | 128 | while True: | ||
1760 | 129 | metadata_dir = os.path.join(absolute_path, METADATA_DIR_NAME) | ||
1761 | 130 | if os.path.exists(metadata_dir): | ||
1762 | 131 | break | ||
1763 | 132 | if absolute_path == "/": | ||
1764 | 133 | raise DirectoryNotInitializedError(directory) | ||
1765 | 134 | absolute_path = os.path.split(absolute_path)[0] | ||
1766 | 135 | |||
1767 | 136 | logging.info("Reading mirror metadata...") | ||
1768 | 137 | info = metadata.read(metadata_dir) | ||
1769 | 138 | |||
1770 | 139 | top_uuid, writable = client.get_root_info(info.share_uuid) | ||
1771 | 140 | |||
1772 | 141 | if info.root_uuid is None: | ||
1773 | 142 | info.root_uuid = client.resolve_path(info.share_uuid, top_uuid, | ||
1774 | 143 | info.path) | ||
1775 | 144 | |||
1776 | 145 | if action == 'auto': | ||
1777 | 146 | if writable: | ||
1778 | 147 | action = 'sync' | ||
1779 | 148 | else: | ||
1780 | 149 | action = 'download' | ||
1781 | 150 | merge_type, should_upload, should_download = MERGE_ACTIONS[action] | ||
1782 | 151 | if should_upload and not writable: | ||
1783 | 152 | raise ReadOnlyShareError(info.share_uuid) | ||
1784 | 153 | |||
1785 | 154 | logging.info("Scanning directory...") | ||
1786 | 155 | |||
1787 | 156 | local_tree = scan_directory(absolute_path, quiet=quiet) | ||
1788 | 157 | |||
1789 | 158 | logging.info("Fetching metadata...") | ||
1790 | 159 | |||
1791 | 160 | remote_tree = client.build_tree(info.share_uuid, info.root_uuid) | ||
1792 | 161 | if not quiet: | ||
1793 | 162 | show_tree(remote_tree) | ||
1794 | 163 | |||
1795 | 164 | logging.info("Merging trees...") | ||
1796 | 165 | merged_tree = merge_trees(old_local_tree=info.local_tree, | ||
1797 | 166 | local_tree=local_tree, | ||
1798 | 167 | old_remote_tree=info.remote_tree, | ||
1799 | 168 | remote_tree=remote_tree, | ||
1800 | 169 | merge_action=merge_type()) | ||
1801 | 170 | if not quiet: | ||
1802 | 171 | show_tree(merged_tree) | ||
1803 | 172 | |||
1804 | 173 | logging.info("Syncing content...") | ||
1805 | 174 | if should_download: | ||
1806 | 175 | info.local_tree = download_tree(merged_tree=merged_tree, | ||
1807 | 176 | local_tree=local_tree, | ||
1808 | 177 | client=client, | ||
1809 | 178 | share_uuid=info.share_uuid, | ||
1810 | 179 | path=absolute_path, dry_run=dry_run, | ||
1811 | 180 | quiet=quiet) | ||
1812 | 181 | else: | ||
1813 | 182 | info.local_tree = local_tree | ||
1814 | 183 | if should_upload: | ||
1815 | 184 | info.remote_tree = upload_tree(merged_tree=merged_tree, | ||
1816 | 185 | remote_tree=remote_tree, | ||
1817 | 186 | client=client, | ||
1818 | 187 | share_uuid=info.share_uuid, | ||
1819 | 188 | path=absolute_path, dry_run=dry_run, | ||
1820 | 189 | quiet=quiet) | ||
1821 | 190 | else: | ||
1822 | 191 | info.remote_tree = remote_tree | ||
1823 | 192 | |||
1824 | 193 | if not dry_run: | ||
1825 | 194 | logging.info("Updating mirror metadata...") | ||
1826 | 195 | metadata.write(metadata_dir, info) | ||
1827 | 196 | |||
1828 | 197 | logging.info("Done.") | ||
1829 | 198 | |||
1830 | 199 | def do_list_shares(client): | ||
1831 | 200 | """Lists available (incoming) shares.""" | ||
1832 | 201 | shares = client.get_incoming_shares() | ||
1833 | 202 | for (name, id, user, accepted, access) in shares: | ||
1834 | 203 | if not accepted: | ||
1835 | 204 | status = " [not accepted]" | ||
1836 | 205 | else: | ||
1837 | 206 | status = "" | ||
1838 | 207 | name = name.encode("utf-8") | ||
1839 | 208 | user = user.encode("utf-8") | ||
1840 | 209 | print "%s %s (from %s) [%s]%s" % (id, name, user, access, status) | ||
1841 | 210 | |||
1842 | 211 | def do_diff(client, share_spec, directory, quiet, subtree_path, | ||
1843 | 212 | ignore_symlinks=True): | ||
1844 | 213 | """Diffs a local directory with the server.""" | ||
1845 | 214 | if share_spec is not None: | ||
1846 | 215 | share_uuid = client.find_volume(share_spec) | ||
1847 | 216 | else: | ||
1848 | 217 | share_uuid = None | ||
1849 | 218 | if subtree_path is None: | ||
1850 | 219 | subtree_path = '/' | ||
1851 | 220 | # pylint: disable-msg=W0612 | ||
1852 | 221 | root_uuid, writable = client.get_root_info(share_uuid) | ||
1853 | 222 | subtree_uuid = client.resolve_path(share_uuid, root_uuid, subtree_path) | ||
1854 | 223 | local_tree = scan_directory(directory, quiet=True) | ||
1855 | 224 | remote_tree = client.build_tree(share_uuid, subtree_uuid) | ||
1856 | 225 | |||
1857 | 226 | def pre_merge(nodes, name, partial_parent): | ||
1858 | 227 | """Compares nodes and prints differences.""" | ||
1859 | 228 | (local_node, remote_node) = nodes | ||
1860 | 229 | # pylint: disable-msg=W0612 | ||
1861 | 230 | (parent_display_path, parent_differs) = partial_parent | ||
1862 | 231 | display_path = os.path.join(parent_display_path, name.encode("UTF-8")) | ||
1863 | 232 | differs = True | ||
1864 | 233 | if local_node is None: | ||
1865 | 234 | logging.info("%s missing from client", display_path) | ||
1866 | 235 | elif remote_node is None: | ||
1867 | 236 | if ignore_symlinks and local_node.node_type == SYMLINK: | ||
1868 | 237 | differs = False | ||
1869 | 238 | logging.info("%s missing from server", display_path) | ||
1870 | 239 | elif local_node.node_type != remote_node.node_type: | ||
1871 | 240 | local_type = node_type_str(local_node.node_type) | ||
1872 | 241 | remote_type = node_type_str(remote_node.node_type) | ||
1873 | 242 | logging.info("%s node types differ (client: %s, server: %s)", | ||
1874 | 243 | display_path, local_type, remote_type) | ||
1875 | 244 | elif local_node.node_type != DIRECTORY and \ | ||
1876 | 245 | local_node.content_hash != remote_node.content_hash: | ||
1877 | 246 | local_content = local_node.content_hash | ||
1878 | 247 | remote_content = remote_node.content_hash | ||
1879 | 248 | logging.info("%s has different content (client: %s, server: %s)", | ||
1880 | 249 | display_path, local_content, remote_content) | ||
1881 | 250 | else: | ||
1882 | 251 | differs = False | ||
1883 | 252 | return (display_path, differs) | ||
1884 | 253 | |||
1885 | 254 | def post_merge(nodes, partial_result, child_results): | ||
1886 | 255 | """Aggregates 'differs' flags.""" | ||
1887 | 256 | # pylint: disable-msg=W0612 | ||
1888 | 257 | (display_path, differs) = partial_result | ||
1889 | 258 | return differs or any(child_results.itervalues()) | ||
1890 | 259 | |||
1891 | 260 | differs = generic_merge(trees=[local_tree, remote_tree], | ||
1892 | 261 | pre_merge=pre_merge, post_merge=post_merge, | ||
1893 | 262 | partial_parent=("", False), name=u"") | ||
1894 | 263 | if differs: | ||
1895 | 264 | raise TreesDiffer(quiet=quiet) | ||
1896 | 265 | |||
1897 | 266 | def do_main(argv, options_parser): | ||
1898 | 267 | """The main user-facing portion of the script.""" | ||
1899 | 268 | # pass the arguments to ensure that the code works correctly | ||
1900 | 268 | # parse the command-line arguments | ||
1901 | 270 | client = Client(realm=options_parser.options.realm, reactor=reactor) | ||
1902 | 272 | # set the logging level to INFO if the user passed --quiet | ||
1903 | 272 | if options_parser.options.quiet: | ||
1904 | 273 | logging.basicConfig(level=logging.INFO) | ||
1905 | 274 | |||
1906 | 275 | signal.signal(signal.SIGINT, lambda s, f: client.force_shutdown()) | ||
1907 | 276 | signal.signal(signal.SIGTERM, lambda s, f: client.force_shutdown()) | ||
1908 | 277 | |||
1909 | 278 | def run_client(): | ||
1910 | 279 | """Run the blocking client.""" | ||
1911 | 280 | token = options_parser.options.token | ||
1912 | 281 | |||
1913 | 282 | client.connect_ssl(options_parser.options.host, | ||
1914 | 283 | int(options_parser.options.port), | ||
1915 | 284 | options_parser.options.no_ssl_verify) | ||
1916 | 285 | |||
1917 | 286 | try: | ||
1918 | 287 | client.set_capabilities() | ||
1919 | 288 | client.oauth_from_token(options_parser.options.token) | ||
1920 | 289 | |||
1921 | 290 | if options_parser.options.mode == "sync": | ||
1922 | 291 | do_sync(client=client, directory=options_parser.options.directory, | ||
1923 | 292 | action=options_parser.options.action, | ||
1924 | 293 | dry_run=options_parser.options.dry_run, | ||
1925 | 294 | quiet=options_parser.options.quiet) | ||
1926 | 295 | elif options_parser.options.mode == "init": | ||
1927 | 296 | do_init(client=client, share_spec=options_parser.options.share, | ||
1928 | 297 | directory=options_parser.options.directory, | ||
1929 | 298 | quiet=options_parser.options.quiet, subtree_path=options_parser.options.subtree) | ||
1930 | 299 | elif options_parser.options.mode == "list-shares": | ||
1931 | 300 | do_list_shares(client=client) | ||
1932 | 301 | elif options_parser.options.mode == "diff": | ||
1933 | 302 | do_diff(client=client, share_spec=options_parser.options.share, | ||
1934 | 303 | directory=options_parser.options.directory, | ||
1935 | 304 | quiet=options_parser.options.quiet, | ||
1936 | 305 | subtree_path=options_parser.options.subtree, | ||
1937 | 306 | ignore_symlinks=False) | ||
1938 | 307 | elif options_parser.options.mode == "authorize": | ||
1939 | 308 | if not options_parser.options.quiet: | ||
1940 | 309 | print "Authorized." | ||
1941 | 310 | finally: | ||
1942 | 311 | client.disconnect() | ||
1943 | 312 | |||
1944 | 313 | def capture_exception(queue, func): | ||
1945 | 314 | """Capture the exception from calling func.""" | ||
1946 | 315 | try: | ||
1947 | 316 | func() | ||
1948 | 317 | except Exception: | ||
1949 | 318 | queue.put(sys.exc_info()) | ||
1950 | 319 | else: | ||
1951 | 320 | queue.put(None) | ||
1952 | 321 | finally: | ||
1953 | 322 | reactor.callWhenRunning(reactor.stop) | ||
1954 | 323 | |||
1955 | 324 | queue = Queue() | ||
1956 | 325 | reactor.callInThread(capture_exception, queue, run_client) | ||
1957 | 326 | reactor.run(installSignalHandlers=False) | ||
1958 | 327 | exc_info = queue.get(True, 0.1) | ||
1959 | 328 | if exc_info: | ||
1960 | 329 | raise exc_info[0], exc_info[1], exc_info[2] | ||
1961 | 330 | |||
1962 | 331 | def main(argv): | ||
1963 | 332 | """Top-level main function.""" | ||
1964 | 333 | try: | ||
1965 | 334 | do_main(argv, UbuntuOneOptionsParser()) | ||
1966 | 335 | except AuthenticationError, e: | ||
1967 | 336 | print "Authentication failed: %s" % e | ||
1968 | 337 | except ConnectionError, e: | ||
1969 | 338 | print "Connection failed: %s" % e | ||
1970 | 339 | except DirectoryNotInitializedError: | ||
1971 | 340 | print "Directory not initialized; " \ | ||
1972 | 341 | "use --init [DIRECTORY] to initialize it." | ||
1973 | 342 | except DirectoryAlreadyInitializedError: | ||
1974 | 343 | print "Directory already initialized." | ||
1975 | 344 | except NoSuchShareError: | ||
1976 | 345 | print "No matching share found." | ||
1977 | 346 | except ReadOnlyShareError: | ||
1978 | 347 | print "The selected action isn't possible on a read-only share." | ||
1979 | 348 | except (ForcedShutdown, KeyboardInterrupt): | ||
1980 | 349 | print "Interrupted!" | ||
1981 | 350 | except TreesDiffer, e: | ||
1982 | 351 | if not e.quiet: | ||
1983 | 352 | print "Trees differ." | ||
1984 | 353 | else: | ||
1985 | 354 | return 0 | ||
1986 | 355 | return 1 | ||
1987 | 356 | |||
1988 | 357 | if __name__ == "__main__": | ||
1989 | 358 | # set the name of the process; this only works on Windows | ||
1990 | 359 | sys.argv[0] = "Ubuntuone" | ||
1991 | 360 | main(sys.argv[1:]) | ||
1992 | 0 | \ No newline at end of file | 361 | \ No newline at end of file |
1993 | 1 | 362 | ||
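The `capture_exception` helper above runs the client on a Twisted reactor thread and hands any exception back to the main thread through a `Queue`, where it is re-raised. A minimal Python 3 sketch of the same hand-off pattern, using plain `threading` instead of the Twisted reactor to keep it self-contained (the names here are illustrative, not from u1sync):

```python
import sys
import threading
from queue import Queue

def run_capturing(func):
    """Run func on a worker thread; re-raise any exception in the caller.

    Mirrors the queue-based exception hand-off in u1sync's
    capture_exception, with plain threading standing in for
    reactor.callInThread (an assumption to keep the sketch runnable).
    """
    queue = Queue()

    def capture_exception():
        # Capture the exception info rather than letting it die
        # silently on the worker thread.
        try:
            func()
        except Exception:
            queue.put(sys.exc_info())
        else:
            queue.put(None)

    worker = threading.Thread(target=capture_exception)
    worker.start()
    worker.join()
    exc_info = queue.get(True, 0.1)
    if exc_info:
        # Python 3 spelling of "raise exc_info[0], exc_info[1], exc_info[2]"
        raise exc_info[1].with_traceback(exc_info[2])
```

This is why `main` can wrap `do_main` in ordinary `except` clauses even though the real work happens on another thread.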
1994 | === added file 'src/u1sync/merge.py' | |||
1995 | --- src/u1sync/merge.py 1970-01-01 00:00:00 +0000 | |||
1996 | +++ src/u1sync/merge.py 2010-08-27 20:21:13 +0000 | |||
1997 | @@ -0,0 +1,186 @@ | |||
1998 | 1 | # ubuntuone.u1sync.merge | ||
1999 | 2 | # | ||
2000 | 3 | # Tree state merging | ||
2001 | 4 | # | ||
2002 | 5 | # Author: Tim Cole <tim.cole@canonical.com> | ||
2003 | 6 | # | ||
2004 | 7 | # Copyright 2009 Canonical Ltd. | ||
2005 | 8 | # | ||
2006 | 9 | # This program is free software: you can redistribute it and/or modify it | ||
2007 | 10 | # under the terms of the GNU General Public License version 3, as published | ||
2008 | 11 | # by the Free Software Foundation. | ||
2009 | 12 | # | ||
2010 | 13 | # This program is distributed in the hope that it will be useful, but | ||
2011 | 14 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
2012 | 15 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
2013 | 16 | # PURPOSE. See the GNU General Public License for more details. | ||
2014 | 17 | # | ||
2015 | 18 | # You should have received a copy of the GNU General Public License along | ||
2016 | 19 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
2017 | 20 | """Code for merging changes between modified trees.""" | ||
2018 | 21 | |||
2019 | 22 | from __future__ import with_statement | ||
2020 | 23 | |||
2021 | 24 | import os | ||
2022 | 25 | |||
2023 | 26 | from ubuntuone.storageprotocol.dircontent_pb2 import DIRECTORY | ||
2024 | 27 | from u1sync.genericmerge import ( | ||
2025 | 28 | MergeNode, generic_merge) | ||
2026 | 29 | import uuid | ||
2027 | 30 | |||
2028 | 31 | |||
2029 | 32 | class NodeTypeMismatchError(Exception): | ||
2030 | 33 | """Node types don't match.""" | ||
2031 | 34 | |||
2032 | 35 | |||
2033 | 36 | def merge_trees(old_local_tree, local_tree, old_remote_tree, remote_tree, | ||
2034 | 37 | merge_action): | ||
2035 | 38 | """Performs a tree merge using the given merge action.""" | ||
2036 | 39 | |||
2037 | 40 | def pre_merge(nodes, name, partial_parent): | ||
2038 | 41 | """Accumulates path and determines merged node type.""" | ||
2039 | 42 | old_local_node, local_node, old_remote_node, remote_node = nodes | ||
2040 | 43 | # pylint: disable-msg=W0612 | ||
2041 | 44 | (parent_path, parent_type) = partial_parent | ||
2042 | 45 | path = os.path.join(parent_path, name.encode("utf-8")) | ||
2043 | 46 | node_type = merge_action.get_node_type(old_local_node=old_local_node, | ||
2044 | 47 | local_node=local_node, | ||
2045 | 48 | old_remote_node=old_remote_node, | ||
2046 | 49 | remote_node=remote_node, | ||
2047 | 50 | path=path) | ||
2048 | 51 | return (path, node_type) | ||
2049 | 52 | |||
2050 | 53 | def post_merge(nodes, partial_result, child_results): | ||
2051 | 54 | """Drops deleted children and merges node.""" | ||
2052 | 55 | old_local_node, local_node, old_remote_node, remote_node = nodes | ||
2053 | 56 | # pylint: disable-msg=W0612 | ||
2054 | 57 | (path, node_type) = partial_result | ||
2055 | 58 | if node_type == DIRECTORY: | ||
2056 | 59 | merged_children = dict([(name, child) for (name, child) | ||
2057 | 60 | in child_results.iteritems() | ||
2058 | 61 | if child is not None]) | ||
2059 | 62 | else: | ||
2060 | 63 | merged_children = None | ||
2061 | 64 | return merge_action.merge_node(old_local_node=old_local_node, | ||
2062 | 65 | local_node=local_node, | ||
2063 | 66 | old_remote_node=old_remote_node, | ||
2064 | 67 | remote_node=remote_node, | ||
2065 | 68 | node_type=node_type, | ||
2066 | 69 | merged_children=merged_children) | ||
2067 | 70 | |||
2068 | 71 | return generic_merge(trees=[old_local_tree, local_tree, | ||
2069 | 72 | old_remote_tree, remote_tree], | ||
2070 | 73 | pre_merge=pre_merge, post_merge=post_merge, | ||
2071 | 74 | name=u"", partial_parent=("", None)) | ||
2072 | 75 | |||
2073 | 76 | |||
2074 | 77 | class SyncMerge(object): | ||
2075 | 78 | """Performs a bidirectional sync merge.""" | ||
2076 | 79 | |||
2077 | 80 | def get_node_type(self, old_local_node, local_node, | ||
2078 | 81 | old_remote_node, remote_node, path): | ||
2079 | 82 | """Requires that all node types match.""" | ||
2080 | 83 | node_type = None | ||
2081 | 84 | for node in (old_local_node, local_node, remote_node): | ||
2082 | 85 | if node is not None: | ||
2083 | 86 | if node_type is not None: | ||
2084 | 87 | if node.node_type != node_type: | ||
2085 | 88 | message = "Node types don't match for %s" % path | ||
2086 | 89 | raise NodeTypeMismatchError(message) | ||
2087 | 90 | else: | ||
2088 | 91 | node_type = node.node_type | ||
2089 | 92 | return node_type | ||
2090 | 93 | |||
2091 | 94 | def merge_node(self, old_local_node, local_node, | ||
2092 | 95 | old_remote_node, remote_node, node_type, merged_children): | ||
2093 | 96 | """Performs bidirectional merge of node state.""" | ||
2094 | 97 | |||
2095 | 98 | def node_content_hash(node): | ||
2096 | 99 | """Returns node content hash if node is not None""" | ||
2097 | 100 | return node.content_hash if node is not None else None | ||
2098 | 101 | |||
2099 | 102 | old_local_content_hash = node_content_hash(old_local_node) | ||
2100 | 103 | local_content_hash = node_content_hash(local_node) | ||
2101 | 104 | old_remote_content_hash = node_content_hash(old_remote_node) | ||
2102 | 105 | remote_content_hash = node_content_hash(remote_node) | ||
2103 | 106 | |||
2104 | 107 | locally_deleted = old_local_node is not None and local_node is None | ||
2105 | 108 | deleted_on_server = old_remote_node is not None and remote_node is None | ||
2106 | 109 | # updated means modified or created | ||
2107 | 110 | locally_updated = not locally_deleted and \ | ||
2108 | 111 | old_local_content_hash != local_content_hash | ||
2109 | 112 | updated_on_server = not deleted_on_server and \ | ||
2110 | 113 | old_remote_content_hash != remote_content_hash | ||
2111 | 114 | |||
2112 | 115 | has_merged_children = merged_children is not None and \ | ||
2113 | 116 | len(merged_children) > 0 | ||
2114 | 117 | |||
2115 | 118 | either_node_exists = local_node is not None or remote_node is not None | ||
2116 | 119 | should_delete = (locally_deleted and not updated_on_server) or \ | ||
2117 | 120 | (deleted_on_server and not locally_updated) | ||
2118 | 121 | |||
2119 | 122 | if (either_node_exists and not should_delete) or has_merged_children: | ||
2120 | 123 | if node_type != DIRECTORY and \ | ||
2121 | 124 | locally_updated and updated_on_server and \ | ||
2122 | 125 | local_content_hash != remote_content_hash: | ||
2123 | 126 | # local_content_hash will become the merged content_hash; | ||
2124 | 127 | # save remote_content_hash in conflict info | ||
2125 | 128 | conflict_info = (str(uuid.uuid4()), remote_content_hash) | ||
2126 | 129 | else: | ||
2127 | 130 | conflict_info = None | ||
2128 | 131 | node_uuid = remote_node.uuid if remote_node is not None else None | ||
2129 | 132 | if locally_updated: | ||
2130 | 133 | content_hash = local_content_hash or remote_content_hash | ||
2131 | 134 | else: | ||
2132 | 135 | content_hash = remote_content_hash or local_content_hash | ||
2133 | 136 | return MergeNode(node_type=node_type, uuid=node_uuid, | ||
2134 | 137 | children=merged_children, content_hash=content_hash, | ||
2135 | 138 | conflict_info=conflict_info) | ||
2136 | 139 | else: | ||
2137 | 140 | return None | ||
2138 | 141 | |||
2139 | 142 | |||
2140 | 143 | class ClobberServerMerge(object): | ||
2141 | 144 | """Clobber server to match local state.""" | ||
2142 | 145 | |||
2143 | 146 | def get_node_type(self, old_local_node, local_node, | ||
2144 | 147 | old_remote_node, remote_node, path): | ||
2145 | 148 | """Return local node type.""" | ||
2146 | 149 | if local_node is not None: | ||
2147 | 150 | return local_node.node_type | ||
2148 | 151 | else: | ||
2149 | 152 | return None | ||
2150 | 153 | |||
2151 | 154 | def merge_node(self, old_local_node, local_node, | ||
2152 | 155 | old_remote_node, remote_node, node_type, merged_children): | ||
2153 | 156 | """Copy local node and associate with remote uuid (if applicable).""" | ||
2154 | 157 | if local_node is None: | ||
2155 | 158 | return None | ||
2156 | 159 | if remote_node is not None: | ||
2157 | 160 | node_uuid = remote_node.uuid | ||
2158 | 161 | else: | ||
2159 | 162 | node_uuid = None | ||
2160 | 163 | return MergeNode(node_type=local_node.node_type, uuid=node_uuid, | ||
2161 | 164 | content_hash=local_node.content_hash, | ||
2162 | 165 | children=merged_children) | ||
2163 | 166 | |||
2164 | 167 | |||
2165 | 168 | class ClobberLocalMerge(object): | ||
2166 | 169 | """Clobber local state to match server.""" | ||
2167 | 170 | |||
2168 | 171 | def get_node_type(self, old_local_node, local_node, | ||
2169 | 172 | old_remote_node, remote_node, path): | ||
2170 | 173 | """Return remote node type.""" | ||
2171 | 174 | if remote_node is not None: | ||
2172 | 175 | return remote_node.node_type | ||
2173 | 176 | else: | ||
2174 | 177 | return None | ||
2175 | 178 | |||
2176 | 179 | def merge_node(self, old_local_node, local_node, | ||
2177 | 180 | old_remote_node, remote_node, node_type, merged_children): | ||
2178 | 181 | """Copy the remote node.""" | ||
2179 | 182 | if remote_node is None: | ||
2180 | 183 | return None | ||
2181 | 184 | return MergeNode(node_type=node_type, uuid=remote_node.uuid, | ||
2182 | 185 | content_hash=remote_node.content_hash, | ||
2183 | 186 | children=merged_children) | ||
2184 | 0 | 187 | ||
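The heart of `SyncMerge.merge_node` is a four-way comparison of content hashes: each side's old and new state determines whether the node was deleted, updated, or left alone, and a conflict exists only when both sides changed the content to different values. A distilled sketch of that decision table (the function name and string labels are illustrative, not from u1sync; `None` means the node does not exist on that side):

```python
def resolve(old_local, local, old_remote, remote):
    """Classify a node's merge outcome from four content hashes."""
    locally_deleted = old_local is not None and local is None
    deleted_on_server = old_remote is not None and remote is None
    # "updated" covers both modified and newly created
    locally_updated = not locally_deleted and old_local != local
    updated_on_server = not deleted_on_server and old_remote != remote

    # A deletion wins only if the other side did not touch the node.
    if (locally_deleted and not updated_on_server) or \
       (deleted_on_server and not locally_updated):
        return "delete"
    # Both sides changed the content, and to different values: conflict.
    if locally_updated and updated_on_server and local != remote:
        return "conflict"
    return "keep-local" if locally_updated else "keep-remote"
```

In the real code the "conflict" branch additionally records `(uuid4, remote_content_hash)` so the remote version can later be written to a `conflict-*` file.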
2185 | === added file 'src/u1sync/metadata.py' | |||
2186 | --- src/u1sync/metadata.py 1970-01-01 00:00:00 +0000 | |||
2187 | +++ src/u1sync/metadata.py 2010-08-27 20:21:13 +0000 | |||
2188 | @@ -0,0 +1,145 @@ | |||
2189 | 1 | # ubuntuone.u1sync.metadata | ||
2190 | 2 | # | ||
2191 | 3 | # u1sync metadata routines | ||
2192 | 4 | # | ||
2193 | 5 | # Author: Tim Cole <tim.cole@canonical.com> | ||
2194 | 6 | # | ||
2195 | 7 | # Copyright 2009 Canonical Ltd. | ||
2196 | 8 | # | ||
2197 | 9 | # This program is free software: you can redistribute it and/or modify it | ||
2198 | 10 | # under the terms of the GNU General Public License version 3, as published | ||
2199 | 11 | # by the Free Software Foundation. | ||
2200 | 12 | # | ||
2201 | 13 | # This program is distributed in the hope that it will be useful, but | ||
2202 | 14 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
2203 | 15 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
2204 | 16 | # PURPOSE. See the GNU General Public License for more details. | ||
2205 | 17 | # | ||
2206 | 18 | # You should have received a copy of the GNU General Public License along | ||
2207 | 19 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
2208 | 20 | """Routines for loading/storing u1sync mirror metadata.""" | ||
2209 | 21 | |||
2210 | 22 | from __future__ import with_statement | ||
2211 | 23 | |||
2212 | 24 | import os | ||
2213 | 25 | import cPickle as pickle | ||
2214 | 26 | from errno import ENOENT | ||
2215 | 27 | from contextlib import contextmanager | ||
2216 | 28 | from ubuntuone.storageprotocol.dircontent_pb2 import DIRECTORY | ||
2217 | 29 | from u1sync.merge import MergeNode | ||
2218 | 30 | from u1sync.utils import safe_unlink | ||
2219 | 31 | import uuid | ||
2220 | 32 | |||
2221 | 33 | class Metadata(object): | ||
2222 | 34 | """Object representing mirror metadata.""" | ||
2223 | 35 | def __init__(self, local_tree=None, remote_tree=None, share_uuid=None, | ||
2224 | 36 | root_uuid=None, path=None): | ||
2225 | 37 | """Populate fields.""" | ||
2226 | 38 | self.local_tree = local_tree | ||
2227 | 39 | self.remote_tree = remote_tree | ||
2228 | 40 | self.share_uuid = share_uuid | ||
2229 | 41 | self.root_uuid = root_uuid | ||
2230 | 42 | self.path = path | ||
2231 | 43 | |||
2232 | 44 | def read(metadata_dir): | ||
2233 | 45 | """Read metadata for a mirror rooted at directory.""" | ||
2234 | 46 | index_file = os.path.join(metadata_dir, "local-index") | ||
2235 | 47 | share_uuid_file = os.path.join(metadata_dir, "share-uuid") | ||
2236 | 48 | root_uuid_file = os.path.join(metadata_dir, "root-uuid") | ||
2237 | 49 | path_file = os.path.join(metadata_dir, "path") | ||
2238 | 50 | |||
2239 | 51 | index = read_pickle_file(index_file, {}) | ||
2240 | 52 | share_uuid = read_uuid_file(share_uuid_file) | ||
2241 | 53 | root_uuid = read_uuid_file(root_uuid_file) | ||
2242 | 54 | path = read_string_file(path_file, '/') | ||
2243 | 55 | |||
2244 | 56 | local_tree = index.get("tree", None) | ||
2245 | 57 | remote_tree = index.get("remote_tree", None) | ||
2246 | 58 | |||
2247 | 59 | if local_tree is None: | ||
2248 | 60 | local_tree = MergeNode(node_type=DIRECTORY, children={}) | ||
2249 | 61 | if remote_tree is None: | ||
2250 | 62 | remote_tree = MergeNode(node_type=DIRECTORY, children={}) | ||
2251 | 63 | |||
2252 | 64 | return Metadata(local_tree=local_tree, remote_tree=remote_tree, | ||
2253 | 65 | share_uuid=share_uuid, root_uuid=root_uuid, | ||
2254 | 66 | path=path) | ||
2255 | 67 | |||
2256 | 68 | def write(metadata_dir, info): | ||
2257 | 69 | """Writes all metadata for the mirror rooted at directory.""" | ||
2258 | 70 | share_uuid_file = os.path.join(metadata_dir, "share-uuid") | ||
2259 | 71 | root_uuid_file = os.path.join(metadata_dir, "root-uuid") | ||
2260 | 72 | index_file = os.path.join(metadata_dir, "local-index") | ||
2261 | 73 | path_file = os.path.join(metadata_dir, "path") | ||
2262 | 74 | if info.share_uuid is not None: | ||
2263 | 75 | write_uuid_file(share_uuid_file, info.share_uuid) | ||
2264 | 76 | else: | ||
2265 | 77 | safe_unlink(share_uuid_file) | ||
2266 | 78 | if info.root_uuid is not None: | ||
2267 | 79 | write_uuid_file(root_uuid_file, info.root_uuid) | ||
2268 | 80 | else: | ||
2269 | 81 | safe_unlink(root_uuid_file) | ||
2270 | 82 | write_string_file(path_file, info.path) | ||
2271 | 83 | write_pickle_file(index_file, {"tree": info.local_tree, | ||
2272 | 84 | "remote_tree": info.remote_tree}) | ||
2273 | 85 | |||
2274 | 86 | def write_pickle_file(filename, value): | ||
2275 | 87 | """Writes a pickled python object to a file.""" | ||
2276 | 88 | with atomic_update_file(filename) as stream: | ||
2277 | 89 | pickle.dump(value, stream, 2) | ||
2278 | 90 | |||
2279 | 91 | def write_string_file(filename, value): | ||
2280 | 92 | """Writes a string to a file with an added line feed, or | ||
2281 | 93 | deletes the file if value is None. | ||
2282 | 94 | """ | ||
2283 | 95 | if value is not None: | ||
2284 | 96 | with atomic_update_file(filename) as stream: | ||
2285 | 97 | stream.write(value) | ||
2286 | 98 | stream.write('\n') | ||
2287 | 99 | else: | ||
2288 | 100 | safe_unlink(filename) | ||
2289 | 101 | |||
2290 | 102 | def write_uuid_file(filename, value): | ||
2291 | 103 | """Writes a UUID to a file.""" | ||
2292 | 104 | write_string_file(filename, str(value)) | ||
2293 | 105 | |||
2294 | 106 | def read_pickle_file(filename, default_value=None): | ||
2295 | 107 | """Reads a pickled python object from a file.""" | ||
2296 | 108 | try: | ||
2297 | 109 | with open(filename, "rb") as stream: | ||
2298 | 110 | return pickle.load(stream) | ||
2299 | 111 | except IOError, e: | ||
2300 | 112 | if e.errno != ENOENT: | ||
2301 | 113 | raise | ||
2302 | 114 | return default_value | ||
2303 | 115 | |||
2304 | 116 | def read_string_file(filename, default_value=None): | ||
2305 | 117 | """Reads a string from a file, discarding the trailing newline.""" | ||
2306 | 118 | try: | ||
2307 | 119 | with open(filename, "r") as stream: | ||
2308 | 120 | return stream.read()[:-1] | ||
2309 | 121 | except IOError, e: | ||
2310 | 122 | if e.errno != ENOENT: | ||
2311 | 123 | raise | ||
2312 | 124 | return default_value | ||
2313 | 125 | |||
2314 | 126 | def read_uuid_file(filename, default_value=None): | ||
2315 | 127 | """Reads a UUID from a file.""" | ||
2316 | 128 | try: | ||
2317 | 129 | with open(filename, "r") as stream: | ||
2318 | 130 | return uuid.UUID(stream.read()[:-1]) | ||
2319 | 131 | except IOError, e: | ||
2320 | 132 | if e.errno != ENOENT: | ||
2321 | 133 | raise | ||
2322 | 134 | return default_value | ||
2323 | 135 | |||
2324 | 136 | @contextmanager | ||
2325 | 137 | def atomic_update_file(filename): | ||
2326 | 138 | """Returns a context manager for atomically updating a file.""" | ||
2327 | 139 | temp_filename = "%s.%s" % (filename, uuid.uuid4()) | ||
2328 | 140 | try: | ||
2329 | 141 | with open(temp_filename, "w") as stream: | ||
2330 | 142 | yield stream | ||
2331 | 143 | os.rename(temp_filename, filename) | ||
2332 | 144 | finally: | ||
2333 | 145 | safe_unlink(temp_filename) | ||
2334 | 0 | 146 | ||
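All of the metadata writers above go through `atomic_update_file`: the new content is written to a uniquely named temp file which is then renamed over the target, so a crash mid-write never leaves a half-written index behind. A self-contained Python 3 sketch of the same pattern (the original is Python 2; this version uses `os.replace`, which unlike the original's `os.rename` also overwrites an existing target on Windows):

```python
import os
import uuid
from contextlib import contextmanager

@contextmanager
def atomic_update_file(filename):
    """Context manager that atomically replaces filename's contents."""
    # Unique temp name next to the target, so the rename stays on one
    # filesystem and is therefore atomic on POSIX.
    temp_filename = "%s.%s" % (filename, uuid.uuid4())
    try:
        with open(temp_filename, "w") as stream:
            yield stream
        os.replace(temp_filename, filename)
    finally:
        # Clean up the temp file if the replace never happened.
        try:
            os.unlink(temp_filename)
        except OSError:
            pass
```

Readers either see the old file or the new one, never a partial write.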
2335 | === added file 'src/u1sync/scan.py' | |||
2336 | --- src/u1sync/scan.py 1970-01-01 00:00:00 +0000 | |||
2337 | +++ src/u1sync/scan.py 2010-08-27 20:21:13 +0000 | |||
2338 | @@ -0,0 +1,102 @@ | |||
2339 | 1 | # ubuntuone.u1sync.scan | ||
2340 | 2 | # | ||
2341 | 3 | # Directory scanning | ||
2342 | 4 | # | ||
2343 | 5 | # Author: Tim Cole <tim.cole@canonical.com> | ||
2344 | 6 | # | ||
2345 | 7 | # Copyright 2009 Canonical Ltd. | ||
2346 | 8 | # | ||
2347 | 9 | # This program is free software: you can redistribute it and/or modify it | ||
2348 | 10 | # under the terms of the GNU General Public License version 3, as published | ||
2349 | 11 | # by the Free Software Foundation. | ||
2350 | 12 | # | ||
2351 | 13 | # This program is distributed in the hope that it will be useful, but | ||
2352 | 14 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
2353 | 15 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
2354 | 16 | # PURPOSE. See the GNU General Public License for more details. | ||
2355 | 17 | # | ||
2356 | 18 | # You should have received a copy of the GNU General Public License along | ||
2357 | 19 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
2358 | 20 | """Code for scanning local directory state.""" | ||
2359 | 21 | |||
2360 | 22 | from __future__ import with_statement | ||
2361 | 23 | |||
2362 | 24 | import os | ||
2363 | 25 | import hashlib | ||
2364 | 26 | import shutil | ||
2365 | 27 | from errno import ENOTDIR, EINVAL | ||
2366 | 28 | import sys | ||
2367 | 29 | |||
2368 | 30 | EMPTY_HASH = "sha1:%s" % hashlib.sha1().hexdigest() | ||
2369 | 31 | |||
2370 | 32 | from ubuntuone.storageprotocol.dircontent_pb2 import \ | ||
2371 | 33 | DIRECTORY, FILE, SYMLINK | ||
2372 | 34 | from u1sync.genericmerge import MergeNode | ||
2373 | 35 | from u1sync.utils import should_sync | ||
2374 | 36 | |||
2375 | 37 | def scan_directory(path, display_path="", quiet=False): | ||
2376 | 38 | """Scans a local directory and builds an in-memory tree from it.""" | ||
2377 | 39 | if display_path != "" and not quiet: | ||
2378 | 40 | print display_path | ||
2379 | 41 | |||
2380 | 42 | link_target = None | ||
2381 | 43 | child_names = None | ||
2382 | 44 | try: | ||
2383 | 45 | print "Path is " + str(path) | ||
2384 | 46 | if sys.platform == "win32": | ||
2385 | 47 | if path.endswith(".lnk") or path.endswith(".url"): | ||
2386 | 48 | import win32com.client | ||
2387 | 49 | import pythoncom | ||
2388 | 50 | pythoncom.CoInitialize() | ||
2389 | 51 | shell = win32com.client.Dispatch("WScript.Shell") | ||
2390 | 52 | shortcut = shell.CreateShortCut(path) | ||
2391 | 53 | print(shortcut.Targetpath) | ||
2392 | 54 | link_target = shortcut.Targetpath | ||
2393 | 55 | else: | ||
2394 | 56 | link_target = None | ||
2395 | 57 | if os.path.isdir(path): | ||
2396 | 58 | child_names = os.listdir(path) | ||
2397 | 59 | else: | ||
2398 | 60 | link_target = os.readlink(path) | ||
2399 | 61 | except OSError, e: | ||
2400 | 62 | if e.errno != EINVAL: | ||
2401 | 63 | raise | ||
2402 | 64 | try: | ||
2403 | 65 | child_names = os.listdir(path) | ||
2404 | 66 | except OSError, e: | ||
2405 | 67 | if e.errno != ENOTDIR: | ||
2406 | 68 | raise | ||
2407 | 69 | |||
2408 | 70 | if link_target is not None: | ||
2409 | 71 | # symlink | ||
2410 | 72 | sum = hashlib.sha1() | ||
2411 | 73 | sum.update(link_target) | ||
2412 | 74 | content_hash = "sha1:%s" % sum.hexdigest() | ||
2413 | 75 | return MergeNode(node_type=SYMLINK, content_hash=content_hash) | ||
2414 | 76 | elif child_names is not None: | ||
2415 | 77 | # directory | ||
2416 | 78 | child_names = [n for n in child_names if should_sync(n.decode("utf-8"))] | ||
2417 | 79 | child_paths = [(os.path.join(path, child_name), | ||
2418 | 80 | os.path.join(display_path, child_name)) \ | ||
2419 | 81 | for child_name in child_names] | ||
2420 | 82 | children = [scan_directory(child_path, child_display_path, quiet) \ | ||
2421 | 83 | for (child_path, child_display_path) in child_paths] | ||
2422 | 84 | unicode_child_names = [n.decode("utf-8") for n in child_names] | ||
2423 | 85 | children = dict(zip(unicode_child_names, children)) | ||
2424 | 86 | return MergeNode(node_type=DIRECTORY, children=children) | ||
2425 | 87 | else: | ||
2426 | 88 | # regular file | ||
2427 | 89 | sum = hashlib.sha1() | ||
2428 | 90 | |||
2429 | 91 | |||
2430 | 92 | class HashStream(object): | ||
2431 | 93 | """Stream that computes hashes.""" | ||
2432 | 94 | def write(self, bytes): | ||
2433 | 95 | """Accumulate bytes.""" | ||
2434 | 96 | sum.update(bytes) | ||
2435 | 97 | |||
2436 | 98 | |||
2437 | 99 | with open(path, "r") as stream: | ||
2438 | 100 | shutil.copyfileobj(stream, HashStream()) | ||
2439 | 101 | content_hash = "sha1:%s" % sum.hexdigest() | ||
2440 | 102 | return MergeNode(node_type=FILE, content_hash=content_hash) | ||
2441 | 0 | 103 | ||
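For regular files, `scan_directory` computes the content hash without reading the whole file into memory: it wraps a SHA-1 digest in a write-only "stream" and lets `shutil.copyfileobj` pump the file through it in chunks. The same trick as a standalone Python 3 sketch (function name is illustrative; the `"sha1:<hex>"` prefix matches the format used throughout u1sync):

```python
import hashlib
import shutil

class HashStream(object):
    """File-like sink that feeds everything written to it into a hash."""
    def __init__(self):
        self.sha1 = hashlib.sha1()

    def write(self, data):
        self.sha1.update(data)

def content_hash(path):
    """Return a u1sync-style "sha1:<hex>" content hash for a file."""
    sink = HashStream()
    # copyfileobj reads in chunks, so large files never sit in memory.
    with open(path, "rb") as stream:
        shutil.copyfileobj(stream, sink)
    return "sha1:%s" % sink.sha1.hexdigest()
```

This is also why the empty-file hash constant at the top of `scan.py` is just `hashlib.sha1()` of nothing.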
2442 | === added file 'src/u1sync/sync.py' | |||
2443 | --- src/u1sync/sync.py 1970-01-01 00:00:00 +0000 | |||
2444 | +++ src/u1sync/sync.py 2010-08-27 20:21:13 +0000 | |||
2445 | @@ -0,0 +1,384 @@ | |||
2446 | 1 | # ubuntuone.u1sync.sync | ||
2447 | 2 | # | ||
2448 | 3 | # State update | ||
2449 | 4 | # | ||
2450 | 5 | # Author: Tim Cole <tim.cole@canonical.com> | ||
2451 | 6 | # | ||
2452 | 7 | # Copyright 2009 Canonical Ltd. | ||
2453 | 8 | # | ||
2454 | 9 | # This program is free software: you can redistribute it and/or modify it | ||
2455 | 10 | # under the terms of the GNU General Public License version 3, as published | ||
2456 | 11 | # by the Free Software Foundation. | ||
2457 | 12 | # | ||
2458 | 13 | # This program is distributed in the hope that it will be useful, but | ||
2459 | 14 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
2460 | 15 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
2461 | 16 | # PURPOSE. See the GNU General Public License for more details. | ||
2462 | 17 | # | ||
2463 | 18 | # You should have received a copy of the GNU General Public License along | ||
2464 | 19 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
2465 | 20 | """After merging, these routines are used to synchronize state locally and on | ||
2466 | 21 | the server to correspond to the merged result.""" | ||
2467 | 22 | |||
2468 | 23 | from __future__ import with_statement | ||
2469 | 24 | |||
2470 | 25 | import os | ||
2471 | 26 | |||
2472 | 27 | EMPTY_HASH = "" | ||
2473 | 28 | UPLOAD_SYMBOL = u"\u25b2".encode("utf-8") | ||
2474 | 29 | DOWNLOAD_SYMBOL = u"\u25bc".encode("utf-8") | ||
2475 | 30 | CONFLICT_SYMBOL = "!" | ||
2476 | 31 | DELETE_SYMBOL = "X" | ||
2477 | 32 | |||
2478 | 33 | from ubuntuone.storageprotocol import request | ||
2479 | 34 | from ubuntuone.storageprotocol.dircontent_pb2 import ( | ||
2480 | 35 | DIRECTORY, SYMLINK) | ||
2481 | 36 | from u1sync.genericmerge import ( | ||
2482 | 37 | MergeNode, generic_merge) | ||
2483 | 38 | from u1sync.utils import safe_mkdir | ||
2484 | 39 | from u1sync.client import UnsupportedOperationError | ||
2485 | 40 | |||
2486 | 41 | def get_conflict_path(path, conflict_info): | ||
2487 | 42 | """Returns path for conflict file corresponding to path.""" | ||
2488 | 43 | dir, name = os.path.split(path) | ||
2489 | 44 | unique_id = conflict_info[0] | ||
2490 | 45 | return os.path.join(dir, "conflict-%s-%s" % (unique_id, name)) | ||
2491 | 46 | |||
2492 | 47 | def name_from_path(path): | ||
2493 | 48 | """Returns unicode name from last path component.""" | ||
2494 | 49 | return os.path.split(path)[1].decode("UTF-8") | ||
2495 | 50 | |||
2496 | 51 | |||
2497 | 52 | class NodeSyncError(Exception): | ||
2498 | 53 | """Error syncing node.""" | ||
2499 | 54 | |||
2500 | 55 | |||
2501 | 56 | class NodeCreateError(NodeSyncError): | ||
2502 | 57 | """Error creating node.""" | ||
2503 | 58 | |||
2504 | 59 | |||
2505 | 60 | class NodeUpdateError(NodeSyncError): | ||
2506 | 61 | """Error updating node.""" | ||
2507 | 62 | |||
2508 | 63 | |||
2509 | 64 | class NodeDeleteError(NodeSyncError): | ||
2510 | 65 | """Error deleting node.""" | ||
2511 | 66 | |||
2512 | 67 | |||
2513 | 68 | def sync_tree(merged_tree, original_tree, sync_mode, path, quiet): | ||
2514 | 69 | """Performs actual synchronization.""" | ||
2515 | 70 | |||
2516 | 71 | def pre_merge(nodes, name, partial_parent): | ||
2517 | 72 | """Create nodes and write content as required.""" | ||
2518 | 73 | (merged_node, original_node) = nodes | ||
2519 | 74 | # pylint: disable-msg=W0612 | ||
2520 | 75 | (parent_path, parent_display_path, parent_uuid, parent_synced) \ | ||
2521 | 76 | = partial_parent | ||
2522 | 77 | |||
2523 | 78 | utf8_name = name.encode("utf-8") | ||
2524 | 79 | path = os.path.join(parent_path, utf8_name) | ||
2525 | 80 | display_path = os.path.join(parent_display_path, utf8_name) | ||
2526 | 81 | node_uuid = None | ||
2527 | 82 | |||
2528 | 83 | synced = False | ||
2529 | 84 | if merged_node is not None: | ||
2530 | 85 | if merged_node.node_type == DIRECTORY: | ||
2531 | 86 | if original_node is not None: | ||
2532 | 87 | synced = True | ||
2533 | 88 | node_uuid = original_node.uuid | ||
2534 | 89 | else: | ||
2535 | 90 | if not quiet: | ||
2536 | 91 | print "%s %s" % (sync_mode.symbol, display_path) | ||
2537 | 92 | try: | ||
2538 | 93 | create_dir = sync_mode.create_directory | ||
2539 | 94 | node_uuid = create_dir(parent_uuid=parent_uuid, | ||
2540 | 95 | path=path) | ||
2541 | 96 | synced = True | ||
2542 | 97 | except NodeCreateError, e: | ||
2543 | 98 | print e | ||
2544 | 99 | elif merged_node.content_hash is None: | ||
2545 | 100 | if not quiet: | ||
2546 | 101 | print "? %s" % display_path | ||
2547 | 102 | elif original_node is None or \ | ||
2548 | 103 | original_node.content_hash != merged_node.content_hash or \ | ||
2549 | 104 | merged_node.conflict_info is not None: | ||
2550 | 105 | conflict_info = merged_node.conflict_info | ||
2551 | 106 | if conflict_info is not None: | ||
2552 | 107 | conflict_symbol = CONFLICT_SYMBOL | ||
2553 | 108 | else: | ||
2554 | 109 | conflict_symbol = " " | ||
2555 | 110 | if not quiet: | ||
2556 | 111 | print "%s %s %s" % (sync_mode.symbol, conflict_symbol, | ||
2557 | 112 | display_path) | ||
2558 | 113 | if original_node is not None: | ||
2559 | 114 | node_uuid = original_node.uuid or merged_node.uuid | ||
2560 | 115 | original_hash = original_node.content_hash or EMPTY_HASH | ||
2561 | 116 | else: | ||
2562 | 117 | node_uuid = merged_node.uuid | ||
2563 | 118 | original_hash = EMPTY_HASH | ||
2564 | 119 | try: | ||
2565 | 120 | sync_mode.write_file(node_uuid=node_uuid, | ||
2566 | 121 | content_hash= | ||
2567 | 122 | merged_node.content_hash, | ||
2568 | 123 | old_content_hash=original_hash, | ||
2569 | 124 | path=path, | ||
2570 | 125 | parent_uuid=parent_uuid, | ||
2571 | 126 | conflict_info=conflict_info, | ||
2572 | 127 | node_type=merged_node.node_type) | ||
2573 | 128 | synced = True | ||
2574 | 129 | except NodeSyncError, e: | ||
2575 | 130 | print e | ||
2576 | 131 | else: | ||
2577 | 132 | synced = True | ||
2578 | 133 | |||
2579 | 134 | return (path, display_path, node_uuid, synced) | ||
2580 | 135 | |||
2581 | 136 | def post_merge(nodes, partial_result, child_results): | ||
2582 | 137 | """Delete nodes.""" | ||
2583 | 138 | (merged_node, original_node) = nodes | ||
2584 | 139 | # pylint: disable-msg=W0612 | ||
2585 | 140 | (path, display_path, node_uuid, synced) = partial_result | ||
2586 | 141 | |||
2587 | 142 | if merged_node is None: | ||
2588 | 143 | assert original_node is not None | ||
2589 | 144 | if not quiet: | ||
2590 | 145 | print "%s %s %s" % (sync_mode.symbol, DELETE_SYMBOL, | ||
2591 | 146 | display_path) | ||
2592 | 147 | try: | ||
2593 | 148 | if original_node.node_type == DIRECTORY: | ||
2594 | 149 | sync_mode.delete_directory(node_uuid=original_node.uuid, | ||
2595 | 150 | path=path) | ||
2596 | 151 | else: | ||
2597 | 152 | # files or symlinks | ||
2598 | 153 | sync_mode.delete_file(node_uuid=original_node.uuid, | ||
2599 | 154 | path=path) | ||
2600 | 155 | synced = True | ||
2601 | 156 | except NodeDeleteError, e: | ||
2602 | 157 | print e | ||
2603 | 158 | |||
2604 | 159 | if synced: | ||
2605 | 160 | model_node = merged_node | ||
2606 | 161 | else: | ||
2607 | 162 | model_node = original_node | ||
2608 | 163 | |||
2609 | 164 | if model_node is not None: | ||
2610 | 165 | if model_node.node_type == DIRECTORY: | ||
2611 | 166 | child_iter = child_results.iteritems() | ||
2612 | 167 | merged_children = dict([(name, child) for (name, child) | ||
2613 | 168 | in child_iter | ||
2614 | 169 | if child is not None]) | ||
2615 | 170 | else: | ||
2616 | 171 | # if there are children here it's because they failed to delete | ||
2617 | 172 | merged_children = None | ||
2618 | 173 | return MergeNode(node_type=model_node.node_type, | ||
2619 | 174 | uuid=model_node.uuid, | ||
2620 | 175 | children=merged_children, | ||
2621 | 176 | content_hash=model_node.content_hash) | ||
2622 | 177 | else: | ||
2623 | 178 | return None | ||
2624 | 179 | |||
2625 | 180 | return generic_merge(trees=[merged_tree, original_tree], | ||
2626 | 181 | pre_merge=pre_merge, post_merge=post_merge, | ||
2627 | 182 | partial_parent=(path, "", None, True), name=u"") | ||
2628 | 183 | |||
2629 | 184 | def download_tree(merged_tree, local_tree, client, share_uuid, path, dry_run, | ||
2630 | 185 | quiet): | ||
2631 | 186 | """Downloads a directory.""" | ||
2632 | 187 | if dry_run: | ||
2633 | 188 | downloader = DryRun(symbol=DOWNLOAD_SYMBOL) | ||
2634 | 189 | else: | ||
2635 | 190 | downloader = Downloader(client=client, share_uuid=share_uuid) | ||
2636 | 191 | return sync_tree(merged_tree=merged_tree, original_tree=local_tree, | ||
2637 | 192 | sync_mode=downloader, path=path, quiet=quiet) | ||
2638 | 193 | |||
2639 | 194 | def upload_tree(merged_tree, remote_tree, client, share_uuid, path, dry_run, | ||
2640 | 195 | quiet): | ||
2641 | 196 | """Uploads a directory.""" | ||
2642 | 197 | if dry_run: | ||
2643 | 198 | uploader = DryRun(symbol=UPLOAD_SYMBOL) | ||
2644 | 199 | else: | ||
2645 | 200 | uploader = Uploader(client=client, share_uuid=share_uuid) | ||
2646 | 201 | return sync_tree(merged_tree=merged_tree, original_tree=remote_tree, | ||
2647 | 202 | sync_mode=uploader, path=path, quiet=quiet) | ||
2648 | 203 | |||
2649 | 204 | |||
2650 | 205 | class DryRun(object): | ||
2651 | 206 | """A class which implements the sync interface but does nothing.""" | ||
2652 | 207 | def __init__(self, symbol): | ||
2653 | 208 | """Initializes a DryRun instance.""" | ||
2654 | 209 | self.symbol = symbol | ||
2655 | 210 | |||
2656 | 211 | def create_directory(self, parent_uuid, path): | ||
2657 | 212 | """Doesn't create a directory.""" | ||
2658 | 213 | return None | ||
2659 | 214 | |||
2660 | 215 | def write_file(self, node_uuid, old_content_hash, content_hash, | ||
2661 | 216 | parent_uuid, path, conflict_info, node_type): | ||
2662 | 217 | """Doesn't write a file.""" | ||
2663 | 218 | return None | ||
2664 | 219 | |||
2665 | 220 | def delete_directory(self, node_uuid, path): | ||
2666 | 221 | """Doesn't delete a directory.""" | ||
2667 | 222 | |||
2668 | 223 | def delete_file(self, node_uuid, path): | ||
2669 | 224 | """Doesn't delete a file.""" | ||
2670 | 225 | |||
2671 | 226 | |||
2672 | 227 | class Downloader(object): | ||
2673 | 228 | """A class which implements the download half of syncing.""" | ||
2674 | 229 | def __init__(self, client, share_uuid): | ||
2675 | 230 | """Initializes a Downloader instance.""" | ||
2676 | 231 | self.client = client | ||
2677 | 232 | self.share_uuid = share_uuid | ||
2678 | 233 | self.symbol = DOWNLOAD_SYMBOL | ||
2679 | 234 | |||
2680 | 235 | def create_directory(self, parent_uuid, path): | ||
2681 | 236 | """Creates a directory.""" | ||
2682 | 237 | try: | ||
2683 | 238 | safe_mkdir(path) | ||
2684 | 239 | except OSError, e: | ||
2685 | 240 | raise NodeCreateError("Error creating local directory %s: %s" % \ | ||
2686 | 241 | (path, e)) | ||
2687 | 242 | return None | ||
2688 | 243 | |||
2689 | 244 | def write_file(self, node_uuid, old_content_hash, content_hash, | ||
2690 | 245 | parent_uuid, path, conflict_info, node_type): | ||
2691 | 246 | """Creates a file and downloads new content for it.""" | ||
2692 | 247 | if conflict_info: | ||
2693 | 248 | # download to conflict file rather than overwriting local changes | ||
2694 | 249 | path = get_conflict_path(path, conflict_info) | ||
2695 | 250 | content_hash = conflict_info[1] | ||
2696 | 251 | try: | ||
2697 | 252 | if node_type == SYMLINK: | ||
2698 | 253 | self.client.download_string(share_uuid= | ||
2699 | 254 | self.share_uuid, | ||
2700 | 255 | node_uuid=node_uuid, | ||
2701 | 256 | content_hash=content_hash) | ||
2702 | 257 | else: | ||
2703 | 258 | self.client.download_file(share_uuid=self.share_uuid, | ||
2704 | 259 | node_uuid=node_uuid, | ||
2705 | 260 | content_hash=content_hash, | ||
2706 | 261 | filename=path) | ||
2707 | 262 | except (request.StorageRequestError, UnsupportedOperationError), e: | ||
2708 | 263 | if os.path.exists(path): | ||
2709 | 264 | raise NodeUpdateError("Error downloading content for %s: %s" %\ | ||
2710 | 265 | (path, e)) | ||
2711 | 266 | else: | ||
2712 | 267 | raise NodeCreateError("Error locally creating %s: %s" % \ | ||
2713 | 268 | (path, e)) | ||
2714 | 269 | |||
2715 | 270 | def delete_directory(self, node_uuid, path): | ||
2716 | 271 | """Deletes a directory.""" | ||
2717 | 272 | try: | ||
2718 | 273 | os.rmdir(path) | ||
2719 | 274 | except OSError, e: | ||
2720 | 275 | raise NodeDeleteError("Error locally deleting %s: %s" % (path, e)) | ||
2721 | 276 | |||
2722 | 277 | def delete_file(self, node_uuid, path): | ||
2723 | 278 | """Deletes a file.""" | ||
2724 | 279 | try: | ||
2725 | 280 | os.unlink(path) | ||
2726 | 281 | except OSError, e: | ||
2727 | 282 | raise NodeDeleteError("Error locally deleting %s: %s" % (path, e)) | ||
2728 | 283 | |||
2729 | 284 | |||
2730 | 285 | class Uploader(object): | ||
2731 | 286 | """A class which implements the upload half of syncing.""" | ||
2732 | 287 | def __init__(self, client, share_uuid): | ||
2733 | 288 | """Initializes an uploader instance.""" | ||
2734 | 289 | self.client = client | ||
2735 | 290 | self.share_uuid = share_uuid | ||
2736 | 291 | self.symbol = UPLOAD_SYMBOL | ||
2737 | 292 | |||
2738 | 293 | def create_directory(self, parent_uuid, path): | ||
2739 | 294 | """Creates a directory on the server.""" | ||
2740 | 295 | name = name_from_path(path) | ||
2741 | 296 | try: | ||
2742 | 297 | return self.client.create_directory(share_uuid=self.share_uuid, | ||
2743 | 298 | parent_uuid=parent_uuid, | ||
2744 | 299 | name=name) | ||
2745 | 300 | except (request.StorageRequestError, UnsupportedOperationError), e: | ||
2746 | 301 | raise NodeCreateError("Error remotely creating %s: %s" % \ | ||
2747 | 302 | (path, e)) | ||
2748 | 303 | |||
2749 | 304 | def write_file(self, node_uuid, old_content_hash, content_hash, | ||
2750 | 305 | parent_uuid, path, conflict_info, node_type): | ||
2751 | 306 | """Creates a file on the server and uploads new content for it.""" | ||
2752 | 307 | |||
2753 | 308 | if conflict_info: | ||
2754 | 309 | # move conflicting file out of the way on the server | ||
2755 | 310 | conflict_path = get_conflict_path(path, conflict_info) | ||
2756 | 311 | conflict_name = name_from_path(conflict_path) | ||
2757 | 312 | try: | ||
2758 | 313 | self.client.move(share_uuid=self.share_uuid, | ||
2759 | 314 | parent_uuid=parent_uuid, | ||
2760 | 315 | name=conflict_name, | ||
2761 | 316 | node_uuid=node_uuid) | ||
2762 | 317 | except (request.StorageRequestError, UnsupportedOperationError), e: | ||
2763 | 318 | raise NodeUpdateError("Error remotely renaming %s to %s: %s" %\ | ||
2764 | 319 | (path, conflict_path, e)) | ||
2765 | 320 | node_uuid = None | ||
2766 | 321 | old_content_hash = EMPTY_HASH | ||
2767 | 322 | |||
2768 | 323 | if node_type == SYMLINK: | ||
2769 | 324 | try: | ||
2770 | 325 | target = os.readlink(path) | ||
2771 | 326 | except OSError, e: | ||
2772 | 327 | raise NodeCreateError("Error retrieving link target " \ | ||
2773 | 328 | "for %s: %s" % (path, e)) | ||
2774 | 329 | else: | ||
2775 | 330 | target = None | ||
2776 | 331 | |||
2777 | 332 | name = name_from_path(path) | ||
2778 | 333 | if node_uuid is None: | ||
2779 | 334 | try: | ||
2780 | 335 | if node_type == SYMLINK: | ||
2781 | 336 | node_uuid = self.client.create_symlink(share_uuid= | ||
2782 | 337 | self.share_uuid, | ||
2783 | 338 | parent_uuid= | ||
2784 | 339 | parent_uuid, | ||
2785 | 340 | name=name, | ||
2786 | 341 | target=target) | ||
2787 | 342 | old_content_hash = content_hash | ||
2788 | 343 | else: | ||
2789 | 344 | node_uuid = self.client.create_file(share_uuid= | ||
2790 | 345 | self.share_uuid, | ||
2791 | 346 | parent_uuid= | ||
2792 | 347 | parent_uuid, | ||
2793 | 348 | name=name) | ||
2794 | 349 | except (request.StorageRequestError, UnsupportedOperationError), e: | ||
2795 | 350 | raise NodeCreateError("Error remotely creating %s: %s" % \ | ||
2796 | 351 | (path, e)) | ||
2797 | 352 | |||
2798 | 353 | if old_content_hash != content_hash: | ||
2799 | 354 | try: | ||
2800 | 355 | if node_type == SYMLINK: | ||
2801 | 356 | self.client.upload_string(share_uuid=self.share_uuid, | ||
2802 | 357 | node_uuid=node_uuid, | ||
2803 | 358 | content_hash=content_hash, | ||
2804 | 359 | old_content_hash= | ||
2805 | 360 | old_content_hash, | ||
2806 | 361 | content=target) | ||
2807 | 362 | else: | ||
2808 | 363 | self.client.upload_file(share_uuid=self.share_uuid, | ||
2809 | 364 | node_uuid=node_uuid, | ||
2810 | 365 | content_hash=content_hash, | ||
2811 | 366 | old_content_hash=old_content_hash, | ||
2812 | 367 | filename=path) | ||
2813 | 368 | except (request.StorageRequestError, UnsupportedOperationError), e: | ||
2814 | 369 | raise NodeUpdateError("Error uploading content for %s: %s" % \ | ||
2815 | 370 | (path, e)) | ||
2816 | 371 | |||
2817 | 372 | def delete_directory(self, node_uuid, path): | ||
2818 | 373 | """Deletes a directory.""" | ||
2819 | 374 | try: | ||
2820 | 375 | self.client.unlink(share_uuid=self.share_uuid, node_uuid=node_uuid) | ||
2821 | 376 | except (request.StorageRequestError, UnsupportedOperationError), e: | ||
2822 | 377 | raise NodeDeleteError("Error remotely deleting %s: %s" % (path, e)) | ||
2823 | 378 | |||
2824 | 379 | def delete_file(self, node_uuid, path): | ||
2825 | 380 | """Deletes a file.""" | ||
2826 | 381 | try: | ||
2827 | 382 | self.client.unlink(share_uuid=self.share_uuid, node_uuid=node_uuid) | ||
2828 | 383 | except (request.StorageRequestError, UnsupportedOperationError), e: | ||
2829 | 384 | raise NodeDeleteError("Error remotely deleting %s: %s" % (path, e)) | ||
2830 | 0 | 385 | ||
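The sync.py hunk above hinges on one idea: `sync_tree` never talks to the network or the filesystem directly, only to a `sync_mode` object exposing `create_directory`/`write_file`/`delete_directory`/`delete_file`, so `DryRun` can be swapped in for `Downloader` or `Uploader` to preview a plan without side effects. A minimal sketch of that strategy pattern (in modern Python; the names `apply_plan` and the logging list are illustrative, not the real u1sync API, which also threads node UUIDs and content hashes):

```python
# Strategy-pattern sketch: the planner dispatches operations through a
# narrow interface, so a no-op implementation yields a dry run for free.
class DryRun:
    """Implements the sync interface but records instead of acting."""
    def __init__(self, symbol):
        self.symbol = symbol
        self.log = []  # what *would* have happened

    def create_directory(self, path):
        self.log.append((self.symbol, "mkdir", path))

    def delete_file(self, path):
        self.log.append((self.symbol, "rm", path))


def apply_plan(sync_mode, plan):
    """Dispatch each planned (operation, path) pair to the active mode."""
    for op, path in plan:
        getattr(sync_mode, op)(path)


mode = DryRun(symbol="v")
apply_plan(mode, [("create_directory", "photos"), ("delete_file", "old.txt")])
```

A real `Downloader`/`Uploader` would implement the same methods with filesystem and server calls; the planner code stays identical.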
2831 | === added file 'src/u1sync/ubuntuone_optparse.py' | |||
2832 | --- src/u1sync/ubuntuone_optparse.py 1970-01-01 00:00:00 +0000 | |||
2833 | +++ src/u1sync/ubuntuone_optparse.py 2010-08-27 20:21:13 +0000 | |||
2834 | @@ -0,0 +1,202 @@ | |||
2835 | 1 | # ubuntuone.u1sync.ubuntuone_optparse | ||
2836 | 2 | # | ||
2837 | 3 | # Prototype directory sync client | ||
2838 | 4 | # | ||
2839 | 5 | # Author: Manuel de la Pena <manuel.delapena@canonical.com> | ||
2840 | 6 | # | ||
2841 | 7 | # Copyright 2010 Canonical Ltd. | ||
2842 | 8 | # | ||
2843 | 9 | # This program is free software: you can redistribute it and/or modify it | ||
2844 | 10 | # under the terms of the GNU General Public License version 3, as published | ||
2845 | 11 | # by the Free Software Foundation. | ||
2846 | 12 | # | ||
2847 | 13 | # This program is distributed in the hope that it will be useful, but | ||
2848 | 14 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
2849 | 15 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
2850 | 16 | # PURPOSE. See the GNU General Public License for more details. | ||
2851 | 17 | # | ||
2852 | 18 | # You should have received a copy of the GNU General Public License along | ||
2853 | 19 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
2854 | 20 | import uuid | ||
2855 | 21 | from oauth.oauth import OAuthToken | ||
2856 | 22 | from optparse import OptionParser, SUPPRESS_HELP | ||
2857 | 23 | from u1sync.merge import ( | ||
2858 | 24 | SyncMerge, ClobberServerMerge, ClobberLocalMerge) | ||
2859 | 25 | |||
2860 | 26 | MERGE_ACTIONS = { | ||
2861 | 27 | # action: (merge_class, should_upload, should_download) | ||
2862 | 28 | 'sync': (SyncMerge, True, True), | ||
2863 | 29 | 'clobber-server': (ClobberServerMerge, True, False), | ||
2864 | 30 | 'clobber-local': (ClobberLocalMerge, False, True), | ||
2865 | 31 | 'upload': (SyncMerge, True, False), | ||
2866 | 32 | 'download': (SyncMerge, False, True), | ||
2867 | 33 | 'auto': None # special case | ||
2868 | 34 | } | ||
2869 | 35 | |||
2870 | 36 | DEFAULT_MERGE_ACTION = 'auto' | ||
2871 | 37 | |||
2872 | 38 | class NotParsedOptionsError(Exception): | ||
2873 | 39 | """Exception thrown when the options have not been parsed.""" | ||
2874 | 40 | |||
2875 | 41 | class NotValidatedOptionsError(Exception): | ||
2876 | 42 | """Exception thrown when the options have not been validated.""" | ||
2877 | 43 | |||
2878 | 44 | class UbuntuOneOptionsParser(OptionParser): | ||
2879 | 45 | """Parser for the options passed on the command line.""" | ||
2880 | 46 | |||
2881 | 47 | def __init__(self): | ||
2882 | 48 | usage = "Usage: %prog [options] [DIRECTORY]\n" \ | ||
2883 | 49 | " %prog --authorize [options]\n" \ | ||
2884 | 50 | " %prog --list-shares [options]\n" \ | ||
2885 | 51 | " %prog --init [--share=SHARE_UUID] [options] DIRECTORY\n" \ | ||
2886 | 52 | " %prog --diff [--share=SHARE_UUID] [options] DIRECTORY" | ||
2887 | 53 | OptionParser.__init__(self, usage=usage) | ||
2888 | 54 | self._was_validated = False | ||
2889 | 55 | self._args = None | ||
2890 | 56 | # add the different options to be used | ||
2891 | 57 | self.add_option("--port", dest="port", metavar="PORT", | ||
2892 | 58 | default=443, | ||
2893 | 59 | help="The port on which to connect to the server") | ||
2894 | 60 | self.add_option("--host", dest="host", metavar="HOST", | ||
2895 | 61 | default='fs-1.one.ubuntu.com', | ||
2896 | 62 | help="The server address") | ||
2897 | 63 | self.add_option("--realm", dest="realm", metavar="REALM", | ||
2898 | 64 | default='https://ubuntuone.com', | ||
2899 | 65 | help="The oauth realm") | ||
2900 | 66 | self.add_option("--oauth", dest="oauth", metavar="KEY:SECRET", | ||
2901 | 67 | default=None, | ||
2902 | 68 | help="Explicitly provide OAuth credentials " | ||
2903 | 69 | "(default is to query keyring)") | ||
2904 | 70 | |||
2905 | 71 | action_list = ", ".join(sorted(MERGE_ACTIONS.keys())) | ||
2906 | 72 | self.add_option("--action", dest="action", metavar="ACTION", | ||
2907 | 73 | default=None, | ||
2908 | 74 | help="Select a sync action (%s; default is %s)" % \ | ||
2909 | 75 | (action_list, DEFAULT_MERGE_ACTION)) | ||
2910 | 76 | |||
2911 | 77 | self.add_option("--dry-run", action="store_true", dest="dry_run", | ||
2912 | 78 | default=False, help="Do a dry run without actually " | ||
2913 | 79 | "making changes") | ||
2914 | 80 | self.add_option("--quiet", action="store_true", dest="quiet", | ||
2915 | 81 | default=False, help="Produces less output") | ||
2916 | 82 | self.add_option("--authorize", action="store_const", dest="mode", | ||
2917 | 83 | const="authorize", | ||
2918 | 84 | help="Authorize this machine") | ||
2919 | 85 | self.add_option("--list-shares", action="store_const", dest="mode", | ||
2920 | 86 | const="list-shares", default="sync", | ||
2921 | 87 | help="List available shares") | ||
2922 | 88 | self.add_option("--init", action="store_const", dest="mode", | ||
2923 | 89 | const="init", | ||
2924 | 90 | help="Initialize a local directory for syncing") | ||
2925 | 91 | self.add_option("--no-ssl-verify", action="store_true", | ||
2926 | 92 | dest="no_ssl_verify", | ||
2927 | 93 | default=False, help=SUPPRESS_HELP) | ||
2928 | 94 | self.add_option("--diff", action="store_const", dest="mode", | ||
2929 | 95 | const="diff", | ||
2930 | 96 | help="Compare tree on server with local tree " \ | ||
2931 | 97 | "(does not require previous --init)") | ||
2932 | 98 | self.add_option("--share", dest="share", metavar="SHARE_UUID", | ||
2933 | 99 | default=None, | ||
2934 | 100 | help="Sync the directory with a share rather than the " \ | ||
2935 | 101 | "user's own volume") | ||
2936 | 102 | self.add_option("--subtree", dest="subtree", metavar="PATH", | ||
2937 | 103 | default=None, | ||
2938 | 104 | help="Mirror a subset of the share or volume") | ||
2939 | 105 | |||
2940 | 106 | def get_options(self, arguments): | ||
2941 | 107 | """Parses the arguments from the command line.""" | ||
2942 | 108 | (self.options, self._args) = \ | ||
2943 | 109 | self.parse_args(arguments) | ||
2944 | 110 | self._validate_args() | ||
2945 | 111 | |||
2946 | 112 | def _validate_args(self): | ||
2947 | 113 | """Validates the args that have been parsed.""" | ||
2948 | 114 | self._is_only_share() | ||
2949 | 115 | self._get_directory() | ||
2950 | 116 | self._validate_action_usage() | ||
2951 | 117 | self._validate_authorize_usage() | ||
2952 | 118 | self._validate_subtree_usage() | ||
2953 | 119 | self._validate_action() | ||
2954 | 120 | self._validate_oauth() | ||
2955 | 121 | |||
2956 | 122 | def _is_only_share(self): | ||
2957 | 123 | """Ensures that the share option is not combined with any other.""" | ||
2958 | 124 | if self.options.share is not None and \ | ||
2959 | 125 | self.options.mode != "init" and \ | ||
2960 | 126 | self.options.mode != "diff": | ||
2961 | 127 | self.error("--share is only valid with --init or --diff") | ||
2962 | 128 | |||
2963 | 129 | def _get_directory(self): | ||
2964 | 130 | """Gets the directory to be used according to the parameters.""" | ||
2965 | 131 | print self._args | ||
2966 | 132 | if self.options.mode == "sync" or self.options.mode == "init" or \ | ||
2967 | 133 | self.options.mode == "diff": | ||
2968 | 134 | if len(self._args) > 2: | ||
2969 | 135 | self.error("Too many arguments") | ||
2970 | 136 | elif len(self._args) < 1: | ||
2971 | 137 | if self.options.mode == "init" or self.options.mode == "diff": | ||
2972 | 138 | self.error("--%s requires a directory to " | ||
2973 | 139 | "be specified" % self.options.mode) | ||
2974 | 140 | else: | ||
2975 | 141 | self.options.directory = "." | ||
2976 | 142 | else: | ||
2977 | 143 | self.options.directory = self._args[0] | ||
2978 | 144 | |||
2979 | 145 | def _validate_action_usage(self): | ||
2980 | 146 | """Ensures that the --action option is correctly used""" | ||
2981 | 147 | if self.options.mode == "init" or \ | ||
2982 | 148 | self.options.mode == "list-shares" or \ | ||
2983 | 149 | self.options.mode == "diff" or \ | ||
2984 | 150 | self.options.mode == "authorize": | ||
2985 | 151 | if self.options.action is not None: | ||
2986 | 152 | self.error("--%s does not take the --action parameter" % \ | ||
2987 | 153 | self.options.mode) | ||
2988 | 154 | if self.options.dry_run: | ||
2989 | 155 | self.error("--%s does not take the --dry-run parameter" % \ | ||
2990 | 156 | self.options.mode) | ||
2991 | 157 | |||
2992 | 158 | def _validate_authorize_usage(self): | ||
2993 | 159 | """Validates the usage of the authorize option.""" | ||
2994 | 160 | if self.options.mode == "authorize": | ||
2995 | 161 | if self.options.oauth is not None: | ||
2996 | 162 | self.error("--authorize does not take the --oauth parameter") | ||
2997 | 163 | if self.options.mode == "list-shares" or \ | ||
2998 | 164 | self.options.mode == "authorize": | ||
2999 | 165 | if len(self._args) != 0: | ||
3000 | 166 | self.error("--%s does not take a directory" % self.options.mode) | ||
3001 | 167 | |||
3002 | 168 | def _validate_subtree_usage(self): | ||
3003 | 169 | """Validates the usage of the subtree option""" | ||
3004 | 170 | if self.options.mode != "init" and self.options.mode != "diff": | ||
3005 | 171 | if self.options.subtree is not None: | ||
3006 | 172 | self.error("--%s does not take the --subtree parameter" % \ | ||
3007 | 173 | self.options.mode) | ||
3008 | 174 | |||
3009 | 175 | def _validate_action(self): | ||
3010 | 176 | """Validates the actions passed to the options.""" | ||
3011 | 177 | if self.options.action is not None and \ | ||
3012 | 178 | self.options.action not in MERGE_ACTIONS: | ||
3013 | 179 | self.error("--action: Unknown action %s" % self.options.action) | ||
3014 | 180 | |||
3015 | 181 | if self.options.action is None: | ||
3016 | 182 | self.options.action = DEFAULT_MERGE_ACTION | ||
3017 | 183 | |||
3018 | 184 | def _validate_oauth(self): | ||
3019 | 185 | """Validates that the oauth token was passed.""" | ||
3020 | 186 | if self.options.oauth is None: | ||
3021 | 187 | self.error("--oauth is currently compulsory.") | ||
3022 | 188 | else: | ||
3023 | 189 | try: | ||
3024 | 190 | (key, secret) = self.options.oauth.split(':', 2) | ||
3025 | 191 | except ValueError: | ||
3026 | 192 | self.error("--oauth requires a key and secret together in the " | ||
3027 | 193 | "form KEY:SECRET") | ||
3028 | 194 | self.options.token = OAuthToken(key, secret) | ||
3029 | 195 | |||
3030 | 196 | def _validate_share(self): | ||
3031 | 197 | """Validates the share option""" | ||
3032 | 198 | if self.options.share is not None: | ||
3033 | 199 | try: | ||
3034 | 200 | uuid.UUID(self.options.share) | ||
3035 | 201 | except ValueError, e: | ||
3036 | 202 | self.error("Invalid --share argument: %s" % e) | ||
3037 | 0 | \ No newline at end of file | 203 | \ No newline at end of file |
3038 | 1 | 204 | ||
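The `MERGE_ACTIONS` table in ubuntuone_optparse.py maps each action name to a `(merge_class, should_upload, should_download)` tuple, so adding a new mode is a single dictionary entry rather than another `if` chain. A self-contained sketch of that dispatch-table pattern (the empty classes are stand-ins for the real `SyncMerge`/`ClobberServerMerge`/`ClobberLocalMerge` strategies, and `resolve_action` is a hypothetical helper mirroring `_validate_action`'s unknown-action check):

```python
# Dispatch-table sketch: action names resolve to a strategy plus two
# direction flags, replacing per-action conditionals.
class SyncMerge: pass
class ClobberServerMerge: pass
class ClobberLocalMerge: pass

MERGE_ACTIONS = {
    # action: (merge_class, should_upload, should_download)
    'sync': (SyncMerge, True, True),
    'clobber-server': (ClobberServerMerge, True, False),
    'clobber-local': (ClobberLocalMerge, False, True),
    'upload': (SyncMerge, True, False),
    'download': (SyncMerge, False, True),
}

def resolve_action(name):
    """Look up an action, rejecting unknown names like _validate_action."""
    if name not in MERGE_ACTIONS:
        raise ValueError("Unknown action %s" % name)
    merge_class, should_upload, should_download = MERGE_ACTIONS[name]
    return merge_class(), should_upload, should_download

merge, up, down = resolve_action('upload')
```

Note that 'upload' and 'download' reuse `SyncMerge` and differ only in which direction flags are set.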
3039 | === added file 'src/u1sync/utils.py' | |||
3040 | --- src/u1sync/utils.py 1970-01-01 00:00:00 +0000 | |||
3041 | +++ src/u1sync/utils.py 2010-08-27 20:21:13 +0000 | |||
3042 | @@ -0,0 +1,50 @@ | |||
3043 | 1 | # ubuntuone.u1sync.utils | ||
3044 | 2 | # | ||
3045 | 3 | # Miscellaneous utility functions | ||
3046 | 4 | # | ||
3047 | 5 | # Author: Tim Cole <tim.cole@canonical.com> | ||
3048 | 6 | # | ||
3049 | 7 | # Copyright 2009 Canonical Ltd. | ||
3050 | 8 | # | ||
3051 | 9 | # This program is free software: you can redistribute it and/or modify it | ||
3052 | 10 | # under the terms of the GNU General Public License version 3, as published | ||
3053 | 11 | # by the Free Software Foundation. | ||
3054 | 12 | # | ||
3055 | 13 | # This program is distributed in the hope that it will be useful, but | ||
3056 | 14 | # WITHOUT ANY WARRANTY; without even the implied warranties of | ||
3057 | 15 | # MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR | ||
3058 | 16 | # PURPOSE. See the GNU General Public License for more details. | ||
3059 | 17 | # | ||
3060 | 18 | # You should have received a copy of the GNU General Public License along | ||
3061 | 19 | # with this program. If not, see <http://www.gnu.org/licenses/>. | ||
3062 | 20 | """Miscellaneous utility functions.""" | ||
3063 | 21 | |||
3064 | 22 | import os | ||
3065 | 23 | from errno import EEXIST, ENOENT | ||
3066 | 24 | from u1sync.constants import ( | ||
3067 | 25 | METADATA_DIR_NAME, SPECIAL_FILE_RE) | ||
3068 | 26 | |||
3069 | 27 | def should_sync(filename): | ||
3070 | 28 | """Returns True if the filename should be synced. | ||
3071 | 29 | |||
3072 | 30 | @param filename: a unicode filename | ||
3073 | 31 | |||
3074 | 32 | """ | ||
3075 | 33 | return filename != METADATA_DIR_NAME and \ | ||
3076 | 34 | not SPECIAL_FILE_RE.match(filename) | ||
3077 | 35 | |||
3078 | 36 | def safe_mkdir(path): | ||
3079 | 37 | """Creates a directory iff it does not already exist.""" | ||
3080 | 38 | try: | ||
3081 | 39 | os.mkdir(path) | ||
3082 | 40 | except OSError, e: | ||
3083 | 41 | if e.errno != EEXIST: | ||
3084 | 42 | raise | ||
3085 | 43 | |||
3086 | 44 | def safe_unlink(path): | ||
3087 | 45 | """Unlinks a file iff it exists.""" | ||
3088 | 46 | try: | ||
3089 | 47 | os.unlink(path) | ||
3090 | 48 | except OSError, e: | ||
3091 | 49 | if e.errno != ENOENT: | ||
3092 | 50 | raise |
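utils.py's helpers follow the EAFP idiom: attempt the operation and swallow only the one errno that means "already in the desired state" (EEXIST for mkdir, ENOENT for unlink), re-raising everything else such as permission errors. The module above is Python 2 (`except OSError, e`); here is the same logic as a runnable modern-Python sketch with a small usage demo:

```python
# EAFP sketch of safe_mkdir/safe_unlink: tolerate exactly one expected
# failure mode, propagate all others.
import os
import errno
import tempfile

def safe_mkdir(path):
    """Create a directory only if it does not already exist."""
    try:
        os.mkdir(path)
    except OSError as e:
        if e.errno != errno.EEXIST:  # anything but "already exists" is real
            raise

def safe_unlink(path):
    """Unlink a file only if it exists."""
    try:
        os.unlink(path)
    except OSError as e:
        if e.errno != errno.ENOENT:  # anything but "no such file" is real
            raise

with tempfile.TemporaryDirectory() as tmp:
    target = os.path.join(tmp, "sub")
    safe_mkdir(target)
    safe_mkdir(target)                          # second call: silent no-op
    safe_unlink(os.path.join(tmp, "missing"))   # absent file: silent no-op
    created = os.path.isdir(target)
```

Checking `e.errno` rather than catching broadly avoids masking genuine failures, which matters when these helpers run inside a sync pass over many nodes.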