Merge lp:~mandel/ubuntuone-windows-installer/include_u1sync_msi into lp:ubuntuone-windows-installer/beta
Proposed by: Manuel de la Peña
Status: Merged
Approved by: John Lenton
Approved revision: 57
Merged at revision: 73
Proposed branch: lp:~mandel/ubuntuone-windows-installer/include_u1sync_msi
Merge into: lp:ubuntuone-windows-installer/beta
Diff against target: 3092 lines (+2969/-4), 16 files modified:
  .bzrignore (+2/-0)
  README.txt (+15/-2)
  install/UbuntuOne.wxs (+562/-0)
  main.build (+19/-2)
  src/setup.py (+56/-0)
  src/u1sync/__init__.py (+14/-0)
  src/u1sync/client.py (+754/-0)
  src/u1sync/constants.py (+30/-0)
  src/u1sync/genericmerge.py (+88/-0)
  src/u1sync/main.py (+360/-0)
  src/u1sync/merge.py (+186/-0)
  src/u1sync/metadata.py (+145/-0)
  src/u1sync/scan.py (+102/-0)
  src/u1sync/sync.py (+384/-0)
  src/u1sync/ubuntuone_optparse.py (+202/-0)
  src/u1sync/utils.py (+50/-0)
To merge this branch: bzr merge lp:~mandel/ubuntuone-windows-installer/include_u1sync_msi
Related bugs: (none)
Reviewer | Review Type | Date Requested | Status
---|---|---|---
John Lenton (community) | | | Approve
Rick McBride (community) | | | Approve
Commit message
Description of the change
Added the embedded python runtime to the msi.
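This branch adds a `package_python` nant target that shells out to py2exe and then packs the resulting `src/dist` output into the MSI (see the `main.build` and `src/setup.py` hunks in the diff below). As a rough sketch, the manual steps the target automates would look like the following, assuming a Windows machine with Python 2.6, py2exe, and the dependencies named in `setup.py` already installed:

```shell
# Hypothetical manual equivalent of the new build target (Windows-only;
# requires py2exe and the dependencies listed in src/setup.py).

# Install a dependency unzipped, since py2exe cannot bundle zipped eggs
# (package name taken from the setup.py requires list):
easy_install -Z ubuntuone-storageprotocol

# Build the self-contained u1sync executable; output lands in src/dist,
# which main.build then copies into build_results/u1sync for the MSI:
cd src
python setup.py py2exe
```

These commands are environment-specific and are shown only to illustrate the build flow wired into the nant `installer` target.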
Revision history for this message
Rick McBride (rmcbride):
review: Approve
John Lenton (chipaca):
review: Approve
Preview Diff
1 | === modified file '.bzrignore' |
2 | --- .bzrignore 2010-08-24 18:18:27 +0000 |
3 | +++ .bzrignore 2010-08-27 20:21:13 +0000 |
4 | @@ -31,3 +31,5 @@ |
5 | install/UbuntuOne.msi |
6 | src/Canonical.UbuntuOne.SyncDaemon/obj |
7 | src/Canonical.UbuntuOne.SyncDaemon/bin |
8 | +src/build |
9 | +src/dist |
10 | |
11 | === modified file 'README.txt' |
12 | --- README.txt 2010-08-19 11:13:02 +0000 |
13 | +++ README.txt 2010-08-27 20:21:13 +0000 |
14 | @@ -7,8 +7,21 @@ |
15 | * UbuntuOne Windows Service: Windows service that allows starting, stopping, pausing and resuming the sync daemon. |
16 | * UbuntuOne Windows Service Broadcaster: Provides a WCF service hosted in a windows service that allows .Net languages |
17 | to communicate with the sync daemon and interact with it. |
18 | - |
19 | -2. Build |
20 | + |
21 | +2. Environment setup |
22 | + |
23 | +The ubuntuone windows port provides a port of the python code that is used on Linux to perform the u1 sync operations. We did not |
24 | +want users to have to hunt down the python runtime plus the different modules that are used, so to simplify the lives of |
25 | +windows users we opted to use py2exe to create an executable that carries all the python dependencies of the code. As you |
26 | +may already know, py2exe is not perfect and does not support egg files. To make sure that easy_install extracts the egg files after |
27 | +the packages are installed, please use the following command: |
28 | + |
29 | +easy_install -Z %package_name% |
30 | + |
31 | +In order to build the solution you will need to have python, the win32 python extensions and the ubuntu one storage protocol on your system: |
32 | + |
33 | + |
34 | +3. Build |
35 | |
36 | In order to simplify the build as much as possible, all the required tools for the compilation of the project are provided in the source tree. The |
37 | compilation uses a nant project that allows running the following targets: |
38 | |
39 | === modified file 'install/UbuntuOne.wxs' |
40 | --- install/UbuntuOne.wxs 2010-08-24 16:09:12 +0000 |
41 | +++ install/UbuntuOne.wxs 2010-08-27 20:21:13 +0000 |
42 | @@ -404,6 +404,505 @@ |
43 | KeyPath="yes"/> |
44 | </Component> |
45 | </Directory> |
46 | + <Directory Id="U1SyncExecutable" |
47 | + Name="U1Sync"> |
48 | + <Component Id="CTypesComponent" |
49 | + Guid="cc878310-b1ee-11df-94e2-0800200c9a66"> |
50 | + <File Id="_ctypes.pyd" |
51 | + Name="_ctypes.pyd" |
52 | + DiskId="1" |
53 | + Source="build_results\u1sync\_ctypes.pyd" |
54 | + KeyPath="yes"/> |
55 | + </Component> |
56 | + <Component Id="ElemtTreeComponent" |
57 | + Guid="efe6b010-b1ee-11df-94e2-0800200c9a66"> |
58 | + <File Id="_elementtree.pyd" |
59 | + Name="_elementtree.pyd" |
60 | + DiskId="1" |
61 | + Source="build_results\u1sync\_elementtree.pyd" |
62 | + KeyPath="yes"/> |
63 | + </Component> |
64 | + <Component Id="HLibComponent" |
65 | + Guid="15848270-b1ef-11df-94e2-0800200c9a66"> |
66 | + <File Id="_hashlib.pyd" |
67 | + Name="_hashlib.pyd" |
68 | + DiskId="1" |
69 | + Source="build_results\u1sync\_hashlib.pyd" |
70 | + KeyPath="yes"/> |
71 | + </Component> |
72 | + <Component Id="SocketComponent" |
73 | + Guid="37e5a060-b1ef-11df-94e2-0800200c9a66"> |
74 | + <File Id="_socket.pyd" |
75 | + Name="_socket.pyd" |
76 | + DiskId="1" |
77 | + Source="build_results\u1sync\_socket.pyd" |
78 | + KeyPath="yes"/> |
79 | + </Component> |
80 | + <Component Id="SSLComponent" |
81 | + Guid="5df98d20-b1ef-11df-94e2-0800200c9a66"> |
82 | + <File Id="_ssl.pyd" |
83 | + Name="_ssl.pyd" |
84 | + DiskId="1" |
85 | + Source="build_results\u1sync\_ssl.pyd" |
86 | + KeyPath="yes"/> |
87 | + </Component> |
88 | + <Component Id="SysLoaderComponent" |
89 | + Guid="85d149f0-b1ef-11df-94e2-0800200c9a66"> |
90 | + <File Id="_win32sysloader.pyd" |
91 | + Name="_win32sysloader.pyd" |
92 | + DiskId="1" |
93 | + Source="build_results\u1sync\_win32sysloader.pyd" |
94 | + KeyPath="yes"/> |
95 | + </Component> |
96 | + <Component Id="WinCoreDelayComponent" |
97 | + Guid="a5ae2e00-b1ef-11df-94e2-0800200c9a66"> |
98 | + <File Id="API_MS_Win_Core_DelayLoad_L1_1_0.dll" |
99 | + Name="API-MS-Win-Core-DelayLoad-L1-1-0.dll" |
100 | + DiskId="1" |
101 | + Source="build_results\u1sync\API-MS-Win-Core-DelayLoad-L1-1-0.dll" |
102 | + KeyPath="yes"/> |
103 | + </Component> |
104 | + <Component Id="WinCoreErrorHandlingComponent" |
105 | + Guid="f2ae6620-b1ef-11df-94e2-0800200c9a66"> |
106 | + <File Id="API_MS_Win_Core_ErrorHandling_L1_1_0.dll" |
107 | + Name="API-MS-Win-Core-ErrorHandling-L1-1-0.dll" |
108 | + DiskId="1" |
109 | + Source="build_results\u1sync\API-MS-Win-Core-ErrorHandling-L1-1-0.dll" |
110 | + KeyPath="yes"/> |
111 | + </Component> |
112 | + <Component Id="WinCoreHandleComponent" |
113 | + Guid="252ce1d0-b1f0-11df-94e2-0800200c9a66"> |
114 | + <File Id="API_MS_Win_Core_Handle_L1_1_0.dll" |
115 | + Name="API-MS-Win-Core-Handle-L1-1-0.dll" |
116 | + DiskId="1" |
117 | + Source="build_results\u1sync\API-MS-Win-Core-Handle-L1-1-0.dll" |
118 | + KeyPath="yes"/> |
119 | + </Component> |
120 | + <Component Id="WinCoreInterlockedComponent" |
121 | + Guid="252ce1d1-b1f0-11df-94e2-0800200c9a66"> |
122 | + <File Id="API_MS_Win_Core_Interlocked_L1_1_0.dll" |
123 | + Name="API-MS-Win-Core-Interlocked-L1-1-0.dll" |
124 | + DiskId="1" |
125 | + Source="build_results\u1sync\API-MS-Win-Core-Interlocked-L1-1-0.dll" |
126 | + KeyPath="yes"/> |
127 | + </Component> |
128 | + <Component Id="WinCoreIOComponent" |
129 | + Guid="252ce1d2-b1f0-11df-94e2-0800200c9a66"> |
130 | + <File Id="API_MS_Win_Core_IO_L1_1_0.dll" |
131 | + Name="API-MS-Win-Core-IO-L1-1-0.dll" |
132 | + DiskId="1" |
133 | + Source="build_results\u1sync\API-MS-Win-Core-IO-L1-1-0.dll" |
134 | + KeyPath="yes"/> |
135 | + </Component> |
136 | + <Component Id="WinCoreLoaderComponent" |
137 | + Guid="252ce1d3-b1f0-11df-94e2-0800200c9a66"> |
138 | + <File Id="API_MS_Win_Core_LibraryLoader_L1_1_0.dll" |
139 | + Name="API-MS-Win-Core-LibraryLoader-L1-1-0.dll" |
140 | + DiskId="1" |
141 | + Source="build_results\u1sync\API-MS-Win-Core-LibraryLoader-L1-1-0.dll" |
142 | + KeyPath="yes"/> |
143 | + </Component> |
144 | + <Component Id="WinCoreLocalizationComponent" |
145 | + Guid="252ce1d4-b1f0-11df-94e2-0800200c9a66"> |
146 | + <File Id="API_MS_Win_Core_Localization_L1_1_0.dll" |
147 | + Name="API-MS-Win-Core-Localization-L1-1-0.dll" |
148 | + DiskId="1" |
149 | + Source="build_results\u1sync\API-MS-Win-Core-Localization-L1-1-0.dll" |
150 | + KeyPath="yes"/> |
151 | + </Component> |
152 | + <Component Id="WinCoreLocalRegistryComponent" |
153 | + Guid="252ce1d5-b1f0-11df-94e2-0800200c9a66"> |
154 | + <File Id="API_MS_Win_Core_LocalRegistry_L1_1_0.dll" |
155 | + Name="API-MS-Win-Core-LocalRegistry-L1-1-0.dll" |
156 | + DiskId="1" |
157 | + Source="build_results\u1sync\API-MS-Win-Core-LocalRegistry-L1-1-0.dll" |
158 | + KeyPath="yes"/> |
159 | + </Component> |
160 | + <Component Id="WinCoreMemoryComponent" |
161 | + Guid="252ce1d6-b1f0-11df-94e2-0800200c9a66"> |
162 | + <File Id="API_MS_Win_Core_Memory_L1_1_0.dll" |
163 | + Name="API-MS-Win-Core-Memory-L1-1-0.dll" |
164 | + DiskId="1" |
165 | + Source="build_results\u1sync\API-MS-Win-Core-Memory-L1-1-0.dll" |
166 | + KeyPath="yes"/> |
167 | + </Component> |
168 | + <Component Id="WinCoreMiscComponent" |
169 | + Guid="252ce1d8-b1f0-11df-94e2-0800200c9a66"> |
170 | + <File Id="API_MS_Win_Core_Misc_L1_1_0.dll" |
171 | + Name="API-MS-Win-Core-Misc-L1-1-0.dll" |
172 | + DiskId="1" |
173 | + Source="build_results\u1sync\API-MS-Win-Core-Misc-L1-1-0.dll" |
174 | + KeyPath="yes"/> |
175 | + </Component> |
176 | + <Component Id="WinCoreProcessEnvComponent" |
177 | + Guid="252ce1d9-b1f0-11df-94e2-0800200c9a66"> |
178 | + <File Id="API_MS_Win_Core_ProcessEnvironment_L1_1_0.dll" |
179 | + Name="API-MS-Win-Core-ProcessEnvironment-L1-1-0.dll" |
180 | + DiskId="1" |
181 | + Source="build_results\u1sync\API-MS-Win-Core-ProcessEnvironment-L1-1-0.dll" |
182 | + KeyPath="yes"/> |
183 | + </Component> |
184 | + <Component Id="WinCoreProcessThreadsComponent" |
185 | + Guid="252ce1da-b1f0-11df-94e2-0800200c9a66"> |
186 | + <File Id="API_MS_Win_Core_ProcessThreads_L1_1_0.dll" |
187 | + Name="API-MS-Win-Core-ProcessThreads-L1-1-0.dll" |
188 | + DiskId="1" |
189 | + Source="build_results\u1sync\API-MS-Win-Core-ProcessThreads-L1-1-0.dll" |
190 | + KeyPath="yes"/> |
191 | + </Component> |
192 | + <Component Id="WinCoreProfileComponent" |
193 | + Guid="252ce1db-b1f0-11df-94e2-0800200c9a66"> |
194 | + <File Id="API_MS_Win_Core_Profile_L1_1_0.dll" |
195 | + Name="API-MS-Win-Core-Profile-L1-1-0.dll" |
196 | + DiskId="1" |
197 | + Source="build_results\u1sync\API-MS-Win-Core-Profile-L1-1-0.dll" |
198 | + KeyPath="yes"/> |
199 | + </Component> |
200 | + <Component Id="WinCoreStringComponent" |
201 | + Guid="252ce1dc-b1f0-11df-94e2-0800200c9a66"> |
202 | + <File Id="API_MS_Win_Core_String_L1_1_0.dll" |
203 | + Name="API-MS-Win-Core-String-L1-1-0.dll" |
204 | + DiskId="1" |
205 | + Source="build_results\u1sync\API-MS-Win-Core-String-L1-1-0.dll" |
206 | + KeyPath="yes"/> |
207 | + </Component> |
208 | + <Component Id="WinCoreSynchComponent" |
209 | + Guid="252ce1dd-b1f0-11df-94e2-0800200c9a66"> |
210 | + <File Id="API_MS_Win_Core_Synch_L1_1_0.dll" |
211 | + Name="API-MS-Win-Core-Synch-L1-1-0.dll" |
212 | + DiskId="1" |
213 | + Source="build_results\u1sync\API-MS-Win-Core-Synch-L1-1-0.dll" |
214 | + KeyPath="yes"/> |
215 | + </Component> |
216 | + <Component Id="WinCoreSysInfoComponent" |
217 | + Guid="252ce1de-b1f0-11df-94e2-0800200c9a66"> |
218 | + <File Id="API_MS_Win_Core_SysInfo_L1_1_0.dll" |
219 | + Name="API-MS-Win-Core-SysInfo-L1-1-0.dll" |
220 | + DiskId="1" |
221 | + Source="build_results\u1sync\API-MS-Win-Core-SysInfo-L1-1-0.dll" |
222 | + KeyPath="yes"/> |
223 | + </Component> |
224 | + <Component Id="WinCoreSecurityBaseComponent" |
225 | + Guid="252ce1df-b1f0-11df-94e2-0800200c9a66"> |
226 | + <File Id="API_MS_Win_Security_Base_L1_1_0.dll" |
227 | + Name="API-MS-Win-Security-Base-L1-1-0.dll" |
228 | + DiskId="1" |
229 | + Source="build_results\u1sync\API-MS-Win-Security-Base-L1-1-0.dll" |
230 | + KeyPath="yes"/> |
231 | + </Component> |
232 | + <Component Id="Bz2Component" |
233 | + Guid="252ce1e0-b1f0-11df-94e2-0800200c9a66"> |
234 | + <File Id="bz2.pyd" |
235 | + Name="bz2.pyd" |
236 | + DiskId="1" |
237 | + Source="build_results\u1sync\bz2.pyd" |
238 | + KeyPath="yes"/> |
239 | + </Component> |
240 | + <Component Id="BzrAnnotatorComponent" |
241 | + Guid="252d08e0-b1f0-11df-94e2-0800200c9a66"> |
242 | + <File Id="bzrlib._annotator_pyx.pyd" |
243 | + Name="bzrlib._annotator_pyx.pyd" |
244 | + DiskId="1" |
245 | + Source="build_results\u1sync\bzrlib._annotator_pyx.pyd" |
246 | + KeyPath="yes"/> |
247 | + </Component> |
248 | + <Component Id="BzrBencodeComponent" |
249 | + Guid="252d08e1-b1f0-11df-94e2-0800200c9a66"> |
250 | + <File Id="bzrlib._bencode_pyx.pyd" |
251 | + Name="bzrlib._bencode_pyx.pyd" |
252 | + DiskId="1" |
253 | + Source="build_results\u1sync\bzrlib._bencode_pyx.pyd" |
254 | + KeyPath="yes"/> |
255 | + </Component> |
256 | + <Component Id="BzrChkComponent" |
257 | + Guid="252d08e2-b1f0-11df-94e2-0800200c9a66"> |
258 | + <File Id="bzrlib._chk_map_pyx.pyd" |
259 | + Name="bzrlib._chk_map_pyx.pyd" |
260 | + DiskId="1" |
261 | + Source="build_results\u1sync\bzrlib._chk_map_pyx.pyd" |
262 | + KeyPath="yes"/> |
263 | + </Component> |
264 | + <Component Id="BzrChunkComponent" |
265 | + Guid="252d08e3-b1f0-11df-94e2-0800200c9a66"> |
266 | + <File Id="bzrlib._chunks_to_lines_pyx.pyd" |
267 | + Name="bzrlib._chunks_to_lines_pyx.pyd" |
268 | + DiskId="1" |
269 | + Source="build_results\u1sync\bzrlib._chunks_to_lines_pyx.pyd" |
270 | + KeyPath="yes"/> |
271 | + </Component> |
272 | + <Component Id="BzrPatienceDiffComponent" |
273 | + Guid="252d08e5-b1f0-11df-94e2-0800200c9a66"> |
274 | + <File Id="bzrlib._patiencediff_c.pyd" |
275 | + Name="bzrlib._patiencediff_c.pyd" |
276 | + DiskId="1" |
277 | + Source="build_results\u1sync\bzrlib._patiencediff_c.pyd" |
278 | + KeyPath="yes"/> |
279 | + </Component> |
280 | + <Component Id="BzrRioComponent" |
281 | + Guid="252d08e6-b1f0-11df-94e2-0800200c9a66"> |
282 | + <File Id="bzrlib._rio_pyx.pyd" |
283 | + Name="bzrlib._rio_pyx.pyd" |
284 | + DiskId="1" |
285 | + Source="build_results\u1sync\bzrlib._rio_pyx.pyd" |
286 | + KeyPath="yes"/> |
287 | + </Component> |
288 | + <Component Id="BzrStaticTupleComponent" |
289 | + Guid="252d08e7-b1f0-11df-94e2-0800200c9a66"> |
290 | + <File Id="bzrlib._static_tuple_c.pyd" |
291 | + Name="bzrlib._static_tuple_c.pyd" |
292 | + DiskId="1" |
293 | + Source="build_results\u1sync\bzrlib._static_tuple_c.pyd" |
294 | + KeyPath="yes"/> |
295 | + </Component> |
296 | + <Component Id="KernelBaseComponent" |
297 | + Guid="252d08e8-b1f0-11df-94e2-0800200c9a66"> |
298 | + <File Id="KERNELBASE.dll" |
299 | + Name="KERNELBASE.dll" |
300 | + DiskId="1" |
301 | + Source="build_results\u1sync\KERNELBASE.dll" |
302 | + KeyPath="yes"/> |
303 | + </Component> |
304 | + <Component Id="Libeay32Component" |
305 | + Guid="252d08e9-b1f0-11df-94e2-0800200c9a66"> |
306 | + <File Id="LIBEAY32.dll" |
307 | + Name="LIBEAY32.dll" |
308 | + DiskId="1" |
309 | + Source="build_results\u1sync\LIBEAY32.dll" |
310 | + KeyPath="yes"/> |
311 | + </Component> |
312 | + <Component Id="LibraryComponent" |
313 | + Guid="252d08ea-b1f0-11df-94e2-0800200c9a66"> |
314 | + <File Id="library.zip" |
315 | + Name="library.zip" |
316 | + DiskId="1" |
317 | + Source="build_results\u1sync\library.zip" |
318 | + KeyPath="yes"/> |
319 | + </Component> |
320 | + <Component Id="U1SyncMainComponent" |
321 | + Guid="252d08eb-b1f0-11df-94e2-0800200c9a66"> |
322 | + <File Id="main.exe" |
323 | + Name="main.exe" |
324 | + DiskId="1" |
325 | + Source="build_results\u1sync\main.exe" |
326 | + KeyPath="yes"/> |
327 | + </Component> |
328 | + <Component Id="MprComponent" |
329 | + Guid="252d08ec-b1f0-11df-94e2-0800200c9a66"> |
330 | + <File Id="MPR.dll" |
331 | + Name="MPR.dll" |
332 | + DiskId="1" |
333 | + Source="build_results\u1sync\MPR.dll" |
334 | + KeyPath="yes"/> |
335 | + </Component> |
336 | + <Component Id="MswsockComponent" |
337 | + Guid="bb15b4b0-b1f5-11df-94e2-0800200c9a66"> |
338 | + <File Id="MSWSOCK.dll" |
339 | + Name="MSWSOCK.dll" |
340 | + DiskId="1" |
341 | + Source="build_results\u1sync\MSWSOCK.dll" |
342 | + KeyPath="yes"/> |
343 | + </Component> |
344 | + <Component Id="OpenSSLCryptoComponent" |
345 | + Guid="bb15b4b1-b1f5-11df-94e2-0800200c9a66"> |
346 | + <File Id="OpenSSL.crypto.pyd" |
347 | + Name="OpenSSL.crypto.pyd" |
348 | + DiskId="1" |
349 | + Source="build_results\u1sync\OpenSSL.crypto.pyd" |
350 | + KeyPath="yes"/> |
351 | + </Component> |
352 | + <Component Id="OpenSSLRandComponent" |
353 | + Guid="bb15b4b2-b1f5-11df-94e2-0800200c9a66"> |
354 | + <File Id="OpenSSL.rand.pyd" |
355 | + Name="OpenSSL.rand.pyd" |
356 | + DiskId="1" |
357 | + Source="build_results\u1sync\OpenSSL.rand.pyd" |
358 | + KeyPath="yes"/> |
359 | + </Component> |
360 | + <Component Id="OpenSSLSSLComponent" |
361 | + Guid="bb15b4b3-b1f5-11df-94e2-0800200c9a66"> |
362 | + <File Id="OpenSSL.SSL.pyd" |
363 | + Name="OpenSSL.SSL.pyd" |
364 | + DiskId="1" |
365 | + Source="build_results\u1sync\OpenSSL.SSL.pyd" |
366 | + KeyPath="yes"/> |
367 | + </Component> |
368 | + <Component Id="PowrprofComponent" |
369 | + Guid="bb15b4b4-b1f5-11df-94e2-0800200c9a66"> |
370 | + <File Id="POWRPROF.dll" |
371 | + Name="POWRPROF.dll" |
372 | + DiskId="1" |
373 | + Source="build_results\u1sync\POWRPROF.dll" |
374 | + KeyPath="yes"/> |
375 | + </Component> |
376 | + <Component Id="PoyexpactComponent" |
377 | + Guid="bb15b4b5-b1f5-11df-94e2-0800200c9a66"> |
378 | + <File Id="pyexpat.pyd" |
379 | + Name="pyexpat.pyd" |
380 | + DiskId="1" |
381 | + Source="build_results\u1sync\pyexpat.pyd" |
382 | + KeyPath="yes"/> |
383 | + </Component> |
384 | + <Component Id="Python26Component" |
385 | + Guid="bb15dbc0-b1f5-11df-94e2-0800200c9a66"> |
386 | + <File Id="python26.dll" |
387 | + Name="python26.dll" |
388 | + DiskId="1" |
389 | + Source="build_results\u1sync\python26.dll" |
390 | + KeyPath="yes"/> |
391 | + </Component> |
392 | + <Component Id="PythonCom26Component" |
393 | + Guid="bb15dbc1-b1f5-11df-94e2-0800200c9a66"> |
394 | + <File Id="pythoncom26.dll" |
395 | + Name="pythoncom26.dll" |
396 | + DiskId="1" |
397 | + Source="build_results\u1sync\pythoncom26.dll" |
398 | + KeyPath="yes"/> |
399 | + </Component> |
400 | + <Component Id="PyWinTypeComponent" |
401 | + Guid="bb15dbc2-b1f5-11df-94e2-0800200c9a66"> |
402 | + <File Id="pywintypes26.dll" |
403 | + Name="pywintypes26.dll" |
404 | + DiskId="1" |
405 | + Source="build_results\u1sync\pywintypes26.dll" |
406 | + KeyPath="yes"/> |
407 | + </Component> |
408 | + <Component Id="SelectComponent" |
409 | + Guid="bb15dbc3-b1f5-11df-94e2-0800200c9a66"> |
410 | + <File Id="select.pyd" |
411 | + Name="select.pyd" |
412 | + DiskId="1" |
413 | + Source="build_results\u1sync\select.pyd" |
414 | + KeyPath="yes"/> |
415 | + </Component> |
416 | + <Component Id="Ssleay32Component" |
417 | + Guid="bb15dbc4-b1f5-11df-94e2-0800200c9a66"> |
418 | + <File Id="SSLEAY32.dll" |
419 | + Name="SSLEAY32.dll" |
420 | + DiskId="1" |
421 | + Source="build_results\u1sync\SSLEAY32.dll" |
422 | + KeyPath="yes"/> |
423 | + </Component> |
424 | + <Component Id="TwistedComponent" |
425 | + Guid="bb15dbc5-b1f5-11df-94e2-0800200c9a66"> |
426 | + <File Id="twisted.python._initgroups.pyd" |
427 | + Name="twisted.python._initgroups.pyd" |
428 | + DiskId="1" |
429 | + Source="build_results\u1sync\twisted.python._initgroups.pyd" |
430 | + KeyPath="yes"/> |
431 | + </Component> |
432 | + <Component Id="UnicodeDataComponent" |
433 | + Guid="bb15dbc6-b1f5-11df-94e2-0800200c9a66"> |
434 | + <File Id="unicodedata.pyd" |
435 | + Name="unicodedata.pyd" |
436 | + DiskId="1" |
437 | + Source="build_results\u1sync\unicodedata.pyd" |
438 | + KeyPath="yes"/> |
439 | + </Component> |
440 | + <Component Id="W9xpopenComponent" |
441 | + Guid="bb15dbc7-b1f5-11df-94e2-0800200c9a66"> |
442 | + <File Id="w9xpopen.exe" |
443 | + Name="w9xpopen.exe" |
444 | + DiskId="1" |
445 | + Source="build_results\u1sync\w9xpopen.exe" |
446 | + KeyPath="yes"/> |
447 | + </Component> |
448 | + <Component Id="Win32ApiComponent" |
449 | + Guid="bb15dbc8-b1f5-11df-94e2-0800200c9a66"> |
450 | + <File Id="win32api.pyd" |
451 | + Name="win32api.pyd" |
452 | + DiskId="1" |
453 | + Source="build_results\u1sync\win32api.pyd" |
454 | + KeyPath="yes"/> |
455 | + </Component> |
456 | + <Component Id="Win32ComShellComponent" |
457 | + Guid="bb15dbc9-b1f5-11df-94e2-0800200c9a66"> |
458 | + <File Id="win32com.shell.shell.pyd" |
459 | + Name="win32com.shell.shell.pyd" |
460 | + DiskId="1" |
461 | + Source="build_results\u1sync\win32com.shell.shell.pyd" |
462 | + KeyPath="yes"/> |
463 | + </Component> |
464 | + <Component Id="Win32EventComponent" |
465 | + Guid="bb15dbca-b1f5-11df-94e2-0800200c9a66"> |
466 | + <File Id="win32event.pyd" |
467 | + Name="win32event.pyd" |
468 | + DiskId="1" |
469 | + Source="build_results\u1sync\win32event.pyd" |
470 | + KeyPath="yes"/> |
471 | + </Component> |
472 | + <Component Id="Win32EventLogComponent" |
473 | + Guid="bb15dbcb-b1f5-11df-94e2-0800200c9a66"> |
474 | + <File Id="win32evtlog.pyd" |
475 | + Name="win32evtlog.pyd" |
476 | + DiskId="1" |
477 | + Source="build_results\u1sync\win32evtlog.pyd" |
478 | + KeyPath="yes"/> |
479 | + </Component> |
480 | + <Component Id="Win32FileComponent" |
481 | + Guid="bb15dbcc-b1f5-11df-94e2-0800200c9a66"> |
482 | + <File Id="win32file.pyd" |
483 | + Name="win32file.pyd" |
484 | + DiskId="1" |
485 | + Source="build_results\u1sync\win32file.pyd" |
486 | + KeyPath="yes"/> |
487 | + </Component> |
488 | + <Component Id="Win32PipeComponent" |
489 | + Guid="bb15dbcd-b1f5-11df-94e2-0800200c9a66"> |
490 | + <File Id="win32pipe.pyd" |
491 | + Name="win32pipe.pyd" |
492 | + DiskId="1" |
493 | + Source="build_results\u1sync\win32pipe.pyd" |
494 | + KeyPath="yes"/> |
495 | + </Component> |
496 | + <Component Id="Win32ProcessComponent" |
497 | + Guid="bb15dbce-b1f5-11df-94e2-0800200c9a66"> |
498 | + <File Id="win32process.pyd" |
499 | + Name="win32process.pyd" |
500 | + DiskId="1" |
501 | + Source="build_results\u1sync\win32process.pyd" |
502 | + KeyPath="yes"/> |
503 | + </Component> |
504 | + <Component Id="Win32SecurityComponent" |
505 | + Guid="bb15dbcf-b1f5-11df-94e2-0800200c9a66"> |
506 | + <File Id="win32security.pyd" |
507 | + Name="win32security.pyd" |
508 | + DiskId="1" |
509 | + Source="build_results\u1sync\win32security.pyd" |
510 | + KeyPath="yes"/> |
511 | + </Component> |
512 | + <Component Id="Win32UIComponent" |
513 | + Guid="bb15dbd0-b1f5-11df-94e2-0800200c9a66"> |
514 | + <File Id="win32ui.pyd" |
515 | + Name="win32ui.pyd" |
516 | + DiskId="1" |
517 | + Source="build_results\u1sync\win32ui.pyd" |
518 | + KeyPath="yes"/> |
519 | + </Component> |
520 | + <Component Id="Win32WnetComponent" |
521 | + Guid="bb15dbd1-b1f5-11df-94e2-0800200c9a66"> |
522 | + <File Id="win32wnet.pyd" |
523 | + Name="win32wnet.pyd" |
524 | + DiskId="1" |
525 | + Source="build_results\u1sync\win32wnet.pyd" |
526 | + KeyPath="yes"/> |
527 | + </Component> |
528 | + <Component Id="ZLibComponent" |
529 | + Guid="bb15dbd2-b1f5-11df-94e2-0800200c9a66"> |
530 | + <File Id="zlib1.dll" |
531 | + Name="zlib1.dll" |
532 | + DiskId="1" |
533 | + Source="build_results\u1sync\zlib1.dll" |
534 | + KeyPath="yes"/> |
535 | + </Component> |
536 | + <Component Id="ZopeInterfaceComponent" |
537 | + Guid="bb15dbd3-b1f5-11df-94e2-0800200c9a66"> |
538 | + <File Id="zope.interface._zope_interface_coptimizations.pyd" |
539 | + Name="zope.interface._zope_interface_coptimizations.pyd" |
540 | + DiskId="1" |
541 | + Source="build_results\u1sync\zope.interface._zope_interface_coptimizations.pyd" |
542 | + KeyPath="yes"/> |
543 | + </Component> |
544 | + </Directory> |
545 | </Directory> |
546 | </Directory> |
547 | </Directory> |
548 | @@ -459,6 +958,69 @@ |
549 | <ComponentRef Id="ClientConfigLog4Net" /> |
550 | <!-- Client auto start --> |
551 | <ComponentRef Id="UbuntuOneClietnAutostart" /> |
552 | + <!-- U1Sync package --> |
553 | + <ComponentRef Id="CTypesComponent" /> |
554 | + <ComponentRef Id="ElemtTreeComponent" /> |
555 | + <ComponentRef Id="HLibComponent" /> |
556 | + <ComponentRef Id="SocketComponent" /> |
557 | + <ComponentRef Id="SSLComponent" /> |
558 | + <ComponentRef Id="SysLoaderComponent" /> |
559 | + <ComponentRef Id="WinCoreDelayComponent" /> |
560 | + <ComponentRef Id="WinCoreErrorHandlingComponent" /> |
561 | + <ComponentRef Id="WinCoreHandleComponent" /> |
562 | + <ComponentRef Id="WinCoreInterlockedComponent" /> |
563 | + <ComponentRef Id="WinCoreIOComponent" /> |
564 | + <ComponentRef Id="WinCoreLoaderComponent" /> |
565 | + <ComponentRef Id="WinCoreLocalizationComponent" /> |
566 | + <ComponentRef Id="WinCoreLocalRegistryComponent" /> |
567 | + <ComponentRef Id="WinCoreMemoryComponent" /> |
568 | + <ComponentRef Id="WinCoreMiscComponent" /> |
569 | + <ComponentRef Id="WinCoreProcessEnvComponent" /> |
570 | + <ComponentRef Id="WinCoreProcessThreadsComponent" /> |
571 | + <ComponentRef Id="WinCoreProfileComponent" /> |
572 | + <ComponentRef Id="WinCoreStringComponent" /> |
573 | + <ComponentRef Id="WinCoreSynchComponent" /> |
574 | + <ComponentRef Id="WinCoreSysInfoComponent" /> |
575 | + <ComponentRef Id="WinCoreSecurityBaseComponent" /> |
576 | + <ComponentRef Id="Bz2Component" /> |
577 | + <ComponentRef Id="BzrAnnotatorComponent" /> |
578 | + <ComponentRef Id="BzrBencodeComponent" /> |
579 | + <ComponentRef Id="BzrChkComponent" /> |
580 | + <ComponentRef Id="BzrChunkComponent" /> |
581 | + <ComponentRef Id="BzrPatienceDiffComponent" /> |
582 | + <ComponentRef Id="BzrRioComponent" /> |
583 | + <ComponentRef Id="BzrStaticTupleComponent" /> |
584 | + <ComponentRef Id="KernelBaseComponent" /> |
585 | + <ComponentRef Id="Libeay32Component" /> |
586 | + <ComponentRef Id="LibraryComponent" /> |
587 | + <ComponentRef Id="U1SyncMainComponent" /> |
588 | + <ComponentRef Id="MprComponent" /> |
589 | + <ComponentRef Id="MswsockComponent" /> |
590 | + <ComponentRef Id="OpenSSLCryptoComponent" /> |
591 | + <ComponentRef Id="OpenSSLRandComponent" /> |
592 | + <ComponentRef Id="OpenSSLSSLComponent" /> |
593 | + <ComponentRef Id="PowrprofComponent" /> |
594 | + <ComponentRef Id="PoyexpactComponent" /> |
595 | + <ComponentRef Id="Python26Component" /> |
596 | + <ComponentRef Id="PythonCom26Component" /> |
597 | + <ComponentRef Id="PyWinTypeComponent" /> |
598 | + <ComponentRef Id="SelectComponent" /> |
599 | + <ComponentRef Id="Ssleay32Component" /> |
600 | + <ComponentRef Id="TwistedComponent" /> |
601 | + <ComponentRef Id="UnicodeDataComponent" /> |
602 | + <ComponentRef Id="W9xpopenComponent" /> |
603 | + <ComponentRef Id="Win32ApiComponent" /> |
604 | + <ComponentRef Id="Win32ComShellComponent" /> |
605 | + <ComponentRef Id="Win32EventComponent" /> |
606 | + <ComponentRef Id="Win32EventLogComponent" /> |
607 | + <ComponentRef Id="Win32FileComponent" /> |
608 | + <ComponentRef Id="Win32PipeComponent" /> |
609 | + <ComponentRef Id="Win32ProcessComponent" /> |
610 | + <ComponentRef Id="Win32SecurityComponent" /> |
611 | + <ComponentRef Id="Win32UIComponent" /> |
612 | + <ComponentRef Id="Win32WnetComponent" /> |
613 | + <ComponentRef Id="ZLibComponent" /> |
614 | + <ComponentRef Id="ZopeInterfaceComponent" /> |
615 | </Feature> |
616 | |
617 | <!-- Provide the UI extensions to be used --> |
618 | |
619 | === modified file 'main.build' |
620 | --- main.build 2010-08-24 18:32:48 +0000 |
621 | +++ main.build 2010-08-27 20:21:13 +0000 |
622 | @@ -176,10 +176,20 @@ |
623 | program="nunit-console.exe" |
624 | commandline="UbuntuOneClient.Tests.dll /xml=../../../../test-results/UbuntuOneClient.Tests-Result.xml" /> |
625 | </target> |
626 | - |
627 | + <target name="package_python" |
628 | + description="Creates the exe binary that embeds the python libs that will be used to perform the sync operation in the windows platform"> |
629 | + |
630 | + <exec basedir="${python_path}" |
631 | + managed="true" |
632 | + workingdir="src" |
633 | + program="python.exe" |
634 | + commandline="setup.py py2exe" /> |
635 | + |
636 | + </target> |
637 | + |
638 | <target name="installer" |
639 | description="Compiles the solution and creates a merge installer that allows installing the solution and other related apps." |
640 | - depends="tests"> |
641 | + depends="tests, package_python"> |
642 | |
643 | <mkdir dir="${build_results}" /> |
644 | |
645 | @@ -216,6 +226,13 @@ |
646 | </fileset> |
647 | </copy> |
648 | |
649 | + <!-- copy the results of the package_python to the install dir --> |
650 | + <copy todir="${build_results}/u1sync" flatten="true"> |
651 | + <fileset basedir="src/dist"> |
652 | + <include name="*.*" /> |
653 | + </fileset> |
654 | + </copy> |
655 | + |
656 | <!-- copy the correct views lib --> |
657 | |
658 | <copy todir="${build_results}/Client" flatten="true"> |
659 | |
660 | === added file 'src/setup.py' |
661 | --- src/setup.py 1970-01-01 00:00:00 +0000 |
662 | +++ src/setup.py 2010-08-27 20:21:13 +0000 |
663 | @@ -0,0 +1,56 @@ |
664 | +#!/usr/bin/env python |
665 | +# Copyright (C) 2010 Canonical - All Rights Reserved |
666 | + |
667 | +""" """ |
668 | +import sys |
669 | + |
670 | +# ModuleFinder can't handle runtime changes to __path__, but win32com uses them |
671 | +try: |
672 | + # py2exe 0.6.4 introduced a replacement modulefinder. |
673 | + # This means we have to add package paths there, not to the built-in |
674 | + # one. If this new modulefinder gets integrated into Python, then |
675 | + # we might be able to revert this some day. |
676 | + # if this doesn't work, try import modulefinder |
677 | + try: |
678 | + import py2exe.mf as modulefinder |
679 | + except ImportError: |
680 | + import modulefinder |
681 | + import win32com |
682 | + for p in win32com.__path__[1:]: |
683 | + modulefinder.AddPackagePath("win32com", p) |
684 | + for extra in ["win32com.shell"]: #,"win32com.mapi" |
685 | + __import__(extra) |
686 | + m = sys.modules[extra] |
687 | + for p in m.__path__[1:]: |
688 | + modulefinder.AddPackagePath(extra, p) |
689 | +except ImportError: |
690 | + # no build path setup, no worries. |
691 | + pass |
692 | + |
693 | +from distutils.core import setup |
694 | +import py2exe |
695 | + |
696 | +if __name__ == '__main__': |
697 | + |
698 | + setup( |
699 | + options = { |
700 | + "py2exe": { |
701 | + "compressed": 1, |
702 | + "optimize": 2}}, |
703 | + name='u1sync', |
704 | + version='0.0.1', |
705 | + author = "Canonical Online Services Hackers", |
706 | + description="""u1sync is a utility of Ubuntu One. |
707 | + Ubuntu One is a suite of on-line |
708 | + services. This package contains a synchronization client for the |
709 | + Ubuntu One file sharing service.""", |
710 | + license='GPLv3', |
711 | + console=['u1sync\\main.py'], |
712 | + requires=[ |
713 | + 'python (>= 2.5)', |
714 | + 'oauth', |
715 | + 'twisted.names', |
716 | + 'twisted.web', |
717 | + 'ubuntuone.storageprotocol (>= 1.3.0)', |
718 | + ], |
719 | + ) |
720 | |
721 | === added directory 'src/u1sync' |
722 | === added file 'src/u1sync/__init__.py' |
723 | --- src/u1sync/__init__.py 1970-01-01 00:00:00 +0000 |
724 | +++ src/u1sync/__init__.py 2010-08-27 20:21:13 +0000 |
725 | @@ -0,0 +1,14 @@ |
726 | +# Copyright 2009 Canonical Ltd. |
727 | +# |
728 | +# This program is free software: you can redistribute it and/or modify it |
729 | +# under the terms of the GNU General Public License version 3, as published |
730 | +# by the Free Software Foundation. |
731 | +# |
732 | +# This program is distributed in the hope that it will be useful, but |
733 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
734 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
735 | +# PURPOSE. See the GNU General Public License for more details. |
736 | +# |
737 | +# You should have received a copy of the GNU General Public License along |
738 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
739 | +"""The guts of the u1sync tool.""" |
740 | |
741 | === added file 'src/u1sync/client.py' |
742 | --- src/u1sync/client.py 1970-01-01 00:00:00 +0000 |
743 | +++ src/u1sync/client.py 2010-08-27 20:21:13 +0000 |
744 | @@ -0,0 +1,754 @@ |
745 | +# ubuntuone.u1sync.client |
746 | +# |
747 | +# Client/protocol end of u1sync |
748 | +# |
749 | +# Author: Lucio Torre <lucio.torre@canonical.com> |
750 | +# Author: Tim Cole <tim.cole@canonical.com> |
751 | +# |
752 | +# Copyright 2009 Canonical Ltd. |
753 | +# |
754 | +# This program is free software: you can redistribute it and/or modify it |
755 | +# under the terms of the GNU General Public License version 3, as published |
756 | +# by the Free Software Foundation. |
757 | +# |
758 | +# This program is distributed in the hope that it will be useful, but |
759 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
760 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
761 | +# PURPOSE. See the GNU General Public License for more details. |
762 | +# |
763 | +# You should have received a copy of the GNU General Public License along |
764 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
765 | +"""Pretty API for protocol client.""" |
766 | + |
767 | +from __future__ import with_statement |
768 | + |
769 | +import os |
770 | +import sys |
771 | +import shutil |
772 | +from Queue import Queue |
773 | +from threading import Lock |
774 | +import zlib |
775 | +import urlparse |
776 | +import ConfigParser |
777 | +from cStringIO import StringIO |
778 | +from twisted.internet import reactor, defer |
779 | +from twisted.internet.defer import inlineCallbacks, returnValue |
780 | +from ubuntuone.logger import LOGFOLDER |
781 | +from ubuntuone.storageprotocol.content_hash import crc32 |
782 | +from ubuntuone.storageprotocol.context import get_ssl_context |
783 | +from u1sync.genericmerge import MergeNode |
784 | +from u1sync.utils import should_sync |
785 | + |
786 | +CONSUMER_KEY = "ubuntuone" |
787 | + |
788 | +from oauth.oauth import OAuthConsumer |
789 | +from ubuntuone.storageprotocol.client import ( |
790 | + StorageClientFactory, StorageClient) |
791 | +from ubuntuone.storageprotocol import request, volumes |
792 | +from ubuntuone.storageprotocol.dircontent_pb2 import \ |
793 | + DirectoryContent, DIRECTORY |
794 | +import uuid |
795 | +import logging |
796 | +from logging.handlers import RotatingFileHandler |
797 | +import time |
798 | + |
799 | +def share_str(share_uuid): |
800 | + """Converts a share UUID to a form the protocol likes.""" |
801 | + return str(share_uuid) if share_uuid is not None else request.ROOT |
802 | + |
803 | +LOGFILENAME = os.path.join(LOGFOLDER, 'u1sync.log') |
804 | +u1_logger = logging.getLogger("u1sync.timing.log") |
805 | +handler = RotatingFileHandler(LOGFILENAME) |
806 | +u1_logger.addHandler(handler) |
807 | + |
808 | +def log_timing(func): |
809 | + def wrapper(*arg, **kwargs): |
810 | + start = time.time() |
811 | + ent = func(*arg, **kwargs) |
812 | + stop = time.time() |
813 | + u1_logger.debug('for %s %0.5f ms elapsed' % (func.func_name, \ |
814 | + (stop-start)*1000.0)) |
815 | + return ent |
816 | + return wrapper |
817 | + |
818 | + |
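As an aside on the decorator just added: the wrap-and-time pattern can be exercised on its own. The sketch below is a Python 3 re-statement (the diff itself is Python 2), and the `functools.wraps` call is an addition not present in the diff, included so the wrapped function keeps its name:

```python
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.DEBUG)
u1_logger = logging.getLogger("u1sync.timing.log")

def log_timing(func):
    """Wrap func, log its wall-clock duration, and return its result
    unchanged. functools.wraps preserves the wrapped function's name."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        elapsed_ms = (time.time() - start) * 1000.0
        u1_logger.debug('for %s %0.5f ms elapsed', func.__name__, elapsed_ms)
        return result
    return wrapper

@log_timing
def add(a, b):
    # any return value passes through the wrapper untouched
    return a + b
```

Because the wrapper returns `result` unchanged, decorating a function alters only its timing side effects, not its behaviour.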
819 | +class ForcedShutdown(Exception): |
820 | + """Client shutdown forced.""" |
821 | + |
822 | + |
823 | +class Waiter(object): |
824 | + """Wait object for blocking waits.""" |
825 | + |
826 | + def __init__(self): |
827 | + """Initializes the wait object.""" |
828 | + self.queue = Queue() |
829 | + |
830 | + def wake(self, result): |
831 | + """Wakes the waiter with a result.""" |
832 | + self.queue.put((result, None)) |
833 | + |
834 | + def wakeAndRaise(self, exc_info): |
835 | + """Wakes the waiter, raising the given exception in it.""" |
836 | + self.queue.put((None, exc_info)) |
837 | + |
838 | + def wakeWithResult(self, func, *args, **kw): |
839 | + """Wakes the waiter with the result of the given function.""" |
840 | + try: |
841 | + result = func(*args, **kw) |
842 | + except Exception: |
843 | + self.wakeAndRaise(sys.exc_info()) |
844 | + else: |
845 | + self.wake(result) |
846 | + |
847 | + def wait(self): |
848 | + """Waits for wakeup.""" |
849 | + (result, exc_info) = self.queue.get() |
850 | + if exc_info: |
851 | + try: |
852 | + raise exc_info[0], exc_info[1], exc_info[2] |
853 | + finally: |
854 | + exc_info = None |
855 | + else: |
856 | + return result |
857 | + |
858 | + |
859 | +class SyncStorageClient(StorageClient): |
860 | + """Simple client that calls a callback on connection.""" |
861 | + |
862 | + @log_timing |
863 | + def connectionMade(self): |
864 | + """Setup and call callback.""" |
865 | + StorageClient.connectionMade(self) |
866 | + if self.factory.current_protocol not in (None, self): |
867 | + self.factory.current_protocol.transport.loseConnection() |
868 | + self.factory.current_protocol = self |
869 | + self.factory.observer.connected() |
870 | + |
871 | + @log_timing |
872 | + def connectionLost(self, reason=None): |
873 | + """Callback for established connection lost""" |
874 | + if self.factory.current_protocol is self: |
875 | + self.factory.current_protocol = None |
876 | + self.factory.observer.disconnected(reason) |
877 | + |
878 | + |
879 | +class SyncClientFactory(StorageClientFactory): |
880 | + """A cmd protocol factory.""" |
881 | + # no init: pylint: disable-msg=W0232 |
882 | + |
883 | + protocol = SyncStorageClient |
884 | + |
885 | + @log_timing |
886 | + def __init__(self, observer): |
887 | + """Create the factory""" |
888 | + self.observer = observer |
889 | + self.current_protocol = None |
890 | + |
891 | + @log_timing |
892 | + def clientConnectionFailed(self, connector, reason): |
893 | + """We failed at connecting.""" |
894 | + self.current_protocol = None |
895 | + self.observer.connection_failed(reason) |
896 | + |
897 | + |
898 | +class UnsupportedOperationError(Exception): |
899 | + """The operation is unsupported by the protocol version.""" |
900 | + |
901 | + |
902 | +class ConnectionError(Exception): |
903 | + """A connection error.""" |
904 | + |
905 | + |
906 | +class AuthenticationError(Exception): |
907 | + """An authentication error.""" |
908 | + |
909 | + |
910 | +class NoSuchShareError(Exception): |
911 | + """Error when there is no such share available.""" |
912 | + |
913 | + |
914 | +class CapabilitiesError(Exception): |
915 | + """A capabilities set/query related error.""" |
916 | + |
917 | +class Client(object): |
918 | + """U1 storage client facade.""" |
919 | + required_caps = frozenset(["no-content", "fix462230"]) |
920 | + |
921 | + def __init__(self, realm, reactor=reactor): |
922 | + """Create the instance.""" |
923 | + |
924 | + self.reactor = reactor |
925 | + self.factory = SyncClientFactory(self) |
926 | + |
927 | + self._status_lock = Lock() |
928 | + self._status = "disconnected" |
929 | + self._status_reason = None |
930 | + self._status_waiting = [] |
931 | + self._active_waiters = set() |
932 | + self.consumer_key = CONSUMER_KEY |
933 | + self.consumer_secret = "hammertime" |
934 | + |
935 | + def force_shutdown(self): |
936 | + """Forces the client to shut itself down.""" |
937 | + with self._status_lock: |
938 | + self._status = "forced_shutdown" |
939 |	+            self._status_reason = None	|
940 | + for waiter in self._active_waiters: |
941 | + waiter.wakeAndRaise((ForcedShutdown("Forced shutdown"), |
942 | + None, None)) |
943 | + self._active_waiters.clear() |
944 | + |
945 | + def _get_waiter_locked(self): |
946 | + """Gets a wait object for blocking waits. Should be called with the |
947 | + status lock held. |
948 | + """ |
949 | + waiter = Waiter() |
950 | + if self._status == "forced_shutdown": |
951 | + raise ForcedShutdown("Forced shutdown") |
952 | + self._active_waiters.add(waiter) |
953 | + return waiter |
954 | + |
955 | + def _get_waiter(self): |
956 | + """Get a wait object for blocking waits. Acquires the status lock.""" |
957 | + with self._status_lock: |
958 | + return self._get_waiter_locked() |
959 | + |
960 | + def _wait(self, waiter): |
961 | + """Waits for the waiter.""" |
962 | + try: |
963 | + return waiter.wait() |
964 | + finally: |
965 | + with self._status_lock: |
966 | + if waiter in self._active_waiters: |
967 | + self._active_waiters.remove(waiter) |
968 | + |
969 | + @log_timing |
970 | + def _change_status(self, status, reason=None): |
971 | + """Changes the client status. Usually called from the reactor |
972 | + thread. |
973 | + |
974 | + """ |
975 | + with self._status_lock: |
976 | + if self._status == "forced_shutdown": |
977 | + return |
978 | + self._status = status |
979 | + self._status_reason = reason |
980 | + waiting = self._status_waiting |
981 | + self._status_waiting = [] |
982 | + for waiter in waiting: |
983 | + waiter.wake((status, reason)) |
984 | + |
985 | + @log_timing |
986 | + def _await_status_not(self, *ignore_statuses): |
987 | + """Blocks until the client status changes, returning the new status. |
988 | + Should never be called from the reactor thread. |
989 | + |
990 | + """ |
991 | + with self._status_lock: |
992 | + status = self._status |
993 | + reason = self._status_reason |
994 | + while status in ignore_statuses: |
995 | + waiter = self._get_waiter_locked() |
996 | + self._status_waiting.append(waiter) |
997 | + self._status_lock.release() |
998 | + try: |
999 | + status, reason = self._wait(waiter) |
1000 | + finally: |
1001 | + self._status_lock.acquire() |
1002 | + if status == "forced_shutdown": |
1003 | + raise ForcedShutdown("Forced shutdown.") |
1004 | + return (status, reason) |
1005 | + |
1006 | + def connection_failed(self, reason): |
1007 | + """Notification that connection failed.""" |
1008 | + self._change_status("disconnected", reason) |
1009 | + |
1010 | + def connected(self): |
1011 | + """Notification that connection succeeded.""" |
1012 | + self._change_status("connected") |
1013 | + |
1014 | + def disconnected(self, reason): |
1015 | + """Notification that we were disconnected.""" |
1016 | + self._change_status("disconnected", reason) |
1017 | + |
1018 | + def defer_from_thread(self, function, *args, **kwargs): |
1019 | + """Do twisted defer magic to get results and show exceptions.""" |
1020 | + waiter = self._get_waiter() |
1021 | + @log_timing |
1022 | + def runner(): |
1023 | + """inner.""" |
1024 | + # we do want to catch all |
1025 |	+            # broad except: pylint: disable-msg=W0703	|
1026 | + try: |
1027 | + d = function(*args, **kwargs) |
1028 | + if isinstance(d, defer.Deferred): |
1029 | + d.addCallbacks(lambda r: waiter.wake((r, None, None)), |
1030 | + lambda f: waiter.wake((None, None, f))) |
1031 | + else: |
1032 | + waiter.wake((d, None, None)) |
1033 | + except Exception: |
1034 | + waiter.wake((None, sys.exc_info(), None)) |
1035 | + |
1036 | + self.reactor.callFromThread(runner) |
1037 | + result, exc_info, failure = self._wait(waiter) |
1038 | + if exc_info: |
1039 | + try: |
1040 | + raise exc_info[0], exc_info[1], exc_info[2] |
1041 | + finally: |
1042 | + exc_info = None |
1043 | + elif failure: |
1044 | + failure.raiseException() |
1045 | + else: |
1046 | + return result |
1047 | + |
1048 | + @log_timing |
1049 | + def connect(self, host, port): |
1050 | + """Connect to host/port.""" |
1051 | + def _connect(): |
1052 | + """Deferred part.""" |
1053 | + self.reactor.connectTCP(host, port, self.factory) |
1054 | + self._connect_inner(_connect) |
1055 | + |
1056 | + @log_timing |
1057 | + def connect_ssl(self, host, port, no_verify): |
1058 | + """Connect to host/port using ssl.""" |
1059 | + def _connect(): |
1060 | + """deferred part.""" |
1061 | + ctx = get_ssl_context(no_verify) |
1062 | + self.reactor.connectSSL(host, port, self.factory, ctx) |
1063 | + self._connect_inner(_connect) |
1064 | + |
1065 | + @log_timing |
1066 | + def _connect_inner(self, _connect): |
1067 | + """Helper function for connecting.""" |
1068 | + self._change_status("connecting") |
1069 | + self.reactor.callFromThread(_connect) |
1070 | + status, reason = self._await_status_not("connecting") |
1071 | + if status != "connected": |
1072 | + raise ConnectionError(reason.value) |
1073 | + |
1074 | + @log_timing |
1075 | + def disconnect(self): |
1076 | + """Disconnect.""" |
1077 | + if self.factory.current_protocol is not None: |
1078 | + self.reactor.callFromThread( |
1079 | + self.factory.current_protocol.transport.loseConnection) |
1080 | + self._await_status_not("connecting", "connected", "authenticated") |
1081 | + |
1082 | + @log_timing |
1083 | + def set_capabilities(self): |
1084 | + """Set the capabilities with the server""" |
1085 | + |
1086 | + client = self.factory.current_protocol |
1087 | + @log_timing |
1088 | + def set_caps_callback(req): |
1089 |	+            "Caps set succeeded"	|
1090 | + if not req.accepted: |
1091 | + de = defer.fail("The server denied setting %s capabilities" % \ |
1092 | + req.caps) |
1093 | + return de |
1094 | + |
1095 | + @log_timing |
1096 | + def query_caps_callback(req): |
1097 | + "Caps query succeeded" |
1098 | + if req.accepted: |
1099 | + set_d = client.set_caps(self.required_caps) |
1100 | + set_d.addCallback(set_caps_callback) |
1101 | + return set_d |
1102 | + else: |
1103 |	+                # the server doesn't have the requested capabilities.	|
1104 | + # return a failure for now, in the future we might want |
1105 | + # to reconnect to another server |
1106 |	+                de = defer.fail("The server doesn't have the requested"	|
1107 | + " capabilities: %s" % str(req.caps)) |
1108 | + return de |
1109 | + |
1110 | + @log_timing |
1111 | + def _wrapped_set_capabilities(): |
1112 | + """Wrapped set_capabilities """ |
1113 | + d = client.query_caps(self.required_caps) |
1114 | + d.addCallback(query_caps_callback) |
1115 | + return d |
1116 | + |
1117 | + try: |
1118 | + self.defer_from_thread(_wrapped_set_capabilities) |
1119 | + except request.StorageProtocolError, e: |
1120 | + raise CapabilitiesError(e) |
1121 | + |
1122 | + @log_timing |
1123 | + def get_root_info(self, volume_uuid): |
1124 | + """Returns the UUID of the applicable share root.""" |
1125 | + if volume_uuid is None: |
1126 | + _get_root = self.factory.current_protocol.get_root |
1127 | + root = self.defer_from_thread(_get_root) |
1128 | + return (uuid.UUID(root), True) |
1129 | + else: |
1130 | + str_volume_uuid = str(volume_uuid) |
1131 | + volume = self._match_volume(lambda v: \ |
1132 | + str(v.volume_id) == str_volume_uuid) |
1133 | + if isinstance(volume, volumes.ShareVolume): |
1134 | + modify = volume.access_level == "Modify" |
1135 | + if isinstance(volume, volumes.UDFVolume): |
1136 | + modify = True |
1137 | + return (uuid.UUID(str(volume.node_id)), modify) |
1138 | + |
1139 | + @log_timing |
1140 | + def resolve_path(self, share_uuid, root_uuid, path): |
1141 | + """Resolve path relative to the given root node.""" |
1142 | + |
1143 | + @inlineCallbacks |
1144 | + def _resolve_worker(): |
1145 | + """Path resolution worker.""" |
1146 | + node_uuid = root_uuid |
1147 | + local_path = path.strip('/') |
1148 | + |
1149 | + while local_path != '': |
1150 | + local_path, name = os.path.split(local_path) |
1151 | + hashes = yield self._get_node_hashes(share_uuid, [root_uuid]) |
1152 | + content_hash = hashes.get(root_uuid, None) |
1153 | + if content_hash is None: |
1154 | + raise KeyError, "Content hash not available" |
1155 | + entries = yield self._get_raw_dir_entries(share_uuid, |
1156 | + root_uuid, |
1157 | + content_hash) |
1158 | + match_name = name.decode('utf-8') |
1159 | + match = None |
1160 | + for entry in entries: |
1161 | + if match_name == entry.name: |
1162 | + match = entry |
1163 | + break |
1164 | + |
1165 | + if match is None: |
1166 | + raise KeyError, "Path not found" |
1167 | + |
1168 | + node_uuid = uuid.UUID(match.node) |
1169 | + |
1170 | + returnValue(node_uuid) |
1171 | + |
1172 | + return self.defer_from_thread(_resolve_worker) |
1173 | + |
1174 | + @log_timing |
1175 | + def oauth_from_token(self, token): |
1176 | + """Perform OAuth authorisation using an existing token.""" |
1177 | + |
1178 | + consumer = OAuthConsumer(self.consumer_key, self.consumer_secret) |
1179 | + |
1180 | + def _auth_successful(value): |
1181 | + """Callback for successful auth. Changes status to |
1182 | + authenticated.""" |
1183 | + self._change_status("authenticated") |
1184 | + return value |
1185 | + |
1186 | + def _auth_failed(value): |
1187 | + """Callback for failed auth. Disconnects.""" |
1188 | + self.factory.current_protocol.transport.loseConnection() |
1189 | + return value |
1190 | + |
1191 | + def _wrapped_authenticate(): |
1192 | + """Wrapped authenticate.""" |
1193 | + d = self.factory.current_protocol.oauth_authenticate(consumer, |
1194 | + token) |
1195 | + d.addCallbacks(_auth_successful, _auth_failed) |
1196 | + return d |
1197 | + |
1198 | + try: |
1199 | + self.defer_from_thread(_wrapped_authenticate) |
1200 | + except request.StorageProtocolError, e: |
1201 | + raise AuthenticationError(e) |
1202 | + status, reason = self._await_status_not("connected") |
1203 | + if status != "authenticated": |
1204 | + raise AuthenticationError(reason.value) |
1205 | + |
1206 | + @log_timing |
1207 | + def find_volume(self, volume_spec): |
1208 | + """Finds a share matching the given UUID. Looks at both share UUIDs |
1209 | + and root node UUIDs.""" |
1210 | + volume = self._match_volume(lambda s: \ |
1211 | + str(s.volume_id) == volume_spec or \ |
1212 | + str(s.node_id) == volume_spec) |
1213 | + return uuid.UUID(str(volume.volume_id)) |
1214 | + |
1215 | + @log_timing |
1216 | + def _match_volume(self, predicate): |
1217 | + """Finds a volume matching the given predicate.""" |
1218 | + _list_shares = self.factory.current_protocol.list_volumes |
1219 | + r = self.defer_from_thread(_list_shares) |
1220 | + for volume in r.volumes: |
1221 | + if predicate(volume): |
1222 | + return volume |
1223 | + raise NoSuchShareError() |
1224 | + |
1225 | + @log_timing |
1226 | + def build_tree(self, share_uuid, root_uuid): |
1227 | + """Builds and returns a tree representing the metadata for the given |
1228 | + subtree in the given share. |
1229 | + |
1230 | + @param share_uuid: the share UUID or None for the user's volume |
1231 | + @param root_uuid: the root UUID of the subtree (must be a directory) |
1232 | + @return: a MergeNode tree |
1233 | + |
1234 | + """ |
1235 | + root = MergeNode(node_type=DIRECTORY, uuid=root_uuid) |
1236 | + |
1237 | + @log_timing |
1238 | + @inlineCallbacks |
1239 | + def _get_root_content_hash(): |
1240 | + """Obtain the content hash for the root node.""" |
1241 | + result = yield self._get_node_hashes(share_uuid, [root_uuid]) |
1242 | + returnValue(result.get(root_uuid, None)) |
1243 | + |
1244 | + root.content_hash = self.defer_from_thread(_get_root_content_hash) |
1245 | + if root.content_hash is None: |
1246 | + raise ValueError("No content available for node %s" % root_uuid) |
1247 | + |
1248 | + @log_timing |
1249 | + @inlineCallbacks |
1250 | + def _get_children(parent_uuid, parent_content_hash): |
1251 | + """Obtain a sequence of MergeNodes corresponding to a node's |
1252 | + immediate children. |
1253 | + |
1254 | + """ |
1255 | + entries = yield self._get_raw_dir_entries(share_uuid, |
1256 | + parent_uuid, |
1257 | + parent_content_hash) |
1258 | + children = {} |
1259 | + for entry in entries: |
1260 | + if should_sync(entry.name): |
1261 | + child = MergeNode(node_type=entry.node_type, |
1262 | + uuid=uuid.UUID(entry.node)) |
1263 | + children[entry.name] = child |
1264 | + |
1265 | + child_uuids = [child.uuid for child in children.itervalues()] |
1266 | + content_hashes = yield self._get_node_hashes(share_uuid, |
1267 | + child_uuids) |
1268 | + for child in children.itervalues(): |
1269 | + child.content_hash = content_hashes.get(child.uuid, None) |
1270 | + |
1271 | + returnValue(children) |
1272 | + |
1273 | + need_children = [root] |
1274 | + while need_children: |
1275 | + node = need_children.pop() |
1276 | + if node.content_hash is not None: |
1277 | + children = self.defer_from_thread(_get_children, node.uuid, |
1278 | + node.content_hash) |
1279 | + node.children = children |
1280 | + for child in children.itervalues(): |
1281 | + if child.node_type == DIRECTORY: |
1282 | + need_children.append(child) |
1283 | + |
1284 | + return root |
1285 | + |
1286 | + @log_timing |
1287 | + def _get_raw_dir_entries(self, share_uuid, node_uuid, content_hash): |
1288 | + """Gets raw dir entries for the given directory.""" |
1289 | + d = self.factory.current_protocol.get_content(share_str(share_uuid), |
1290 | + str(node_uuid), |
1291 | + content_hash) |
1292 | + d.addCallback(lambda c: zlib.decompress(c.data)) |
1293 | + |
1294 | + def _parse_content(raw_content): |
1295 | + """Parses directory content into a list of entry objects.""" |
1296 | + unserialized_content = DirectoryContent() |
1297 | + unserialized_content.ParseFromString(raw_content) |
1298 | + return list(unserialized_content.entries) |
1299 | + |
1300 | + d.addCallback(_parse_content) |
1301 | + return d |
1302 | + |
1303 | + @log_timing |
1304 | + def download_string(self, share_uuid, node_uuid, content_hash): |
1305 | + """Reads a file from the server into a string.""" |
1306 | + output = StringIO() |
1307 | + self._download_inner(share_uuid=share_uuid, node_uuid=node_uuid, |
1308 | + content_hash=content_hash, output=output) |
1309 |	+        return output.getvalue()	|
1310 | + |
1311 | + @log_timing |
1312 | + def download_file(self, share_uuid, node_uuid, content_hash, filename): |
1313 | + """Downloads a file from the server.""" |
1314 |	+        # download to a temporary ".u1partial" file first, then rename	|
1315 |	+        # it into place once the transfer completes successfully	|
1316 | + partial_filename = "%s.u1partial" % filename |
1317 | + output = open(partial_filename, "w") |
1318 | + |
1319 | + @log_timing |
1320 | + def rename_file(): |
1321 | + """Renames the temporary file to the final name.""" |
1322 | + output.close() |
1323 | + print "Finished downloading %s" % filename |
1324 | + os.rename(partial_filename, filename) |
1325 | + |
1326 | + @log_timing |
1327 | + def delete_file(): |
1328 | + """Deletes the temporary file.""" |
1329 | + output.close() |
1330 | + os.unlink(partial_filename) |
1331 | + |
1332 | + self._download_inner(share_uuid=share_uuid, node_uuid=node_uuid, |
1333 | + content_hash=content_hash, output=output, |
1334 | + on_success=rename_file, on_failure=delete_file) |
1335 | + |
1336 | + @log_timing |
1337 | + def _download_inner(self, share_uuid, node_uuid, content_hash, output, |
1338 | + on_success=lambda: None, on_failure=lambda: None): |
1339 | + """Helper function for content downloads.""" |
1340 | + dec = zlib.decompressobj() |
1341 | + |
1342 | + @log_timing |
1343 | + def write_data(data): |
1344 | + """Helper which writes data to the output file.""" |
1345 | + uncompressed_data = dec.decompress(data) |
1346 | + output.write(uncompressed_data) |
1347 | + |
1348 | + @log_timing |
1349 | + def finish_download(value): |
1350 | + """Helper which finishes the download.""" |
1351 | + uncompressed_data = dec.flush() |
1352 | + output.write(uncompressed_data) |
1353 | + on_success() |
1354 | + return value |
1355 | + |
1356 | + @log_timing |
1357 | + def abort_download(value): |
1358 | + """Helper which aborts the download.""" |
1359 | + on_failure() |
1360 | + return value |
1361 | + |
1362 | + @log_timing |
1363 | + def _download(): |
1364 | + """Async helper.""" |
1365 | + _get_content = self.factory.current_protocol.get_content |
1366 | + d = _get_content(share_str(share_uuid), str(node_uuid), |
1367 | + content_hash, callback=write_data) |
1368 | + d.addCallbacks(finish_download, abort_download) |
1369 | + return d |
1370 | + |
1371 | + self.defer_from_thread(_download) |
1372 | + |
1373 | + @log_timing |
1374 | + def create_directory(self, share_uuid, parent_uuid, name): |
1375 | + """Creates a directory on the server.""" |
1376 | + r = self.defer_from_thread(self.factory.current_protocol.make_dir, |
1377 | + share_str(share_uuid), str(parent_uuid), |
1378 | + name) |
1379 | + return uuid.UUID(r.new_id) |
1380 | + |
1381 | + @log_timing |
1382 | + def create_file(self, share_uuid, parent_uuid, name): |
1383 | + """Creates a file on the server.""" |
1384 | + r = self.defer_from_thread(self.factory.current_protocol.make_file, |
1385 | + share_str(share_uuid), str(parent_uuid), |
1386 | + name) |
1387 | + return uuid.UUID(r.new_id) |
1388 | + |
1389 | + @log_timing |
1390 | + def create_symlink(self, share_uuid, parent_uuid, name, target): |
1391 | + """Creates a symlink on the server.""" |
1392 | + raise UnsupportedOperationError("Protocol does not support symlinks") |
1393 | + |
1394 | + @log_timing |
1395 | + def upload_string(self, share_uuid, node_uuid, old_content_hash, |
1396 | + content_hash, content): |
1397 | + """Uploads a string to the server as file content.""" |
1398 | + crc = crc32(content, 0) |
1399 | + compressed_content = zlib.compress(content, 9) |
1400 | + compressed = StringIO(compressed_content) |
1401 | + self.defer_from_thread(self.factory.current_protocol.put_content, |
1402 | + share_str(share_uuid), str(node_uuid), |
1403 | + old_content_hash, content_hash, |
1404 | + crc, len(content), len(compressed_content), |
1405 | + compressed) |
1406 | + |
1407 | + @log_timing |
1408 | + def upload_file(self, share_uuid, node_uuid, old_content_hash, |
1409 | + content_hash, filename): |
1410 | + """Uploads a file to the server.""" |
1411 | + parent_dir = os.path.split(filename)[0] |
1412 | + unique_filename = os.path.join(parent_dir, "." + str(uuid.uuid4())) |
1413 | + |
1414 | + |
1415 | + class StagingFile(object): |
1416 | + """An object which tracks data being compressed for staging.""" |
1417 | + def __init__(self, stream): |
1418 | + """Initialize a compression object.""" |
1419 | + self.crc32 = 0 |
1420 | + self.enc = zlib.compressobj(9) |
1421 | + self.size = 0 |
1422 | + self.compressed_size = 0 |
1423 | + self.stream = stream |
1424 | + |
1425 | + def write(self, bytes): |
1426 | + """Compress bytes, keeping track of length and crc32.""" |
1427 | + self.size += len(bytes) |
1428 | + self.crc32 = crc32(bytes, self.crc32) |
1429 | + compressed_bytes = self.enc.compress(bytes) |
1430 | + self.compressed_size += len(compressed_bytes) |
1431 | + self.stream.write(compressed_bytes) |
1432 | + |
1433 | + def finish(self): |
1434 | + """Finish staging compressed data.""" |
1435 | + compressed_bytes = self.enc.flush() |
1436 | + self.compressed_size += len(compressed_bytes) |
1437 | + self.stream.write(compressed_bytes) |
1438 | + |
1439 | + with open(unique_filename, "w+") as compressed: |
1440 | + os.unlink(unique_filename) |
1441 | + with open(filename, "r") as original: |
1442 | + staging = StagingFile(compressed) |
1443 | + shutil.copyfileobj(original, staging) |
1444 | + staging.finish() |
1445 | + compressed.seek(0) |
1446 | + self.defer_from_thread(self.factory.current_protocol.put_content, |
1447 | + share_str(share_uuid), str(node_uuid), |
1448 | + old_content_hash, content_hash, |
1449 | + staging.crc32, |
1450 | + staging.size, staging.compressed_size, |
1451 | + compressed) |
1452 | + |
1453 | + @log_timing |
1454 | + def move(self, share_uuid, parent_uuid, name, node_uuid): |
1455 | + """Moves a file on the server.""" |
1456 | + self.defer_from_thread(self.factory.current_protocol.move, |
1457 | + share_str(share_uuid), str(node_uuid), |
1458 | + str(parent_uuid), name) |
1459 | + |
1460 | + @log_timing |
1461 | + def unlink(self, share_uuid, node_uuid): |
1462 | + """Unlinks a file on the server.""" |
1463 | + self.defer_from_thread(self.factory.current_protocol.unlink, |
1464 | + share_str(share_uuid), str(node_uuid)) |
1465 | + |
1466 | + @log_timing |
1467 | + def _get_node_hashes(self, share_uuid, node_uuids): |
1468 | + """Fetches hashes for the given nodes.""" |
1469 | + share = share_str(share_uuid) |
1470 | + queries = [(share, str(node_uuid), request.UNKNOWN_HASH) \ |
1471 | + for node_uuid in node_uuids] |
1472 | + d = self.factory.current_protocol.query(queries) |
1473 | + |
1474 | + @log_timing |
1475 | + def _collect_hashes(multi_result): |
1476 | + """Accumulate hashes from query replies.""" |
1477 | + hashes = {} |
1478 | + for (success, value) in multi_result: |
1479 | + if success: |
1480 | + for node_state in value.response: |
1481 | + node_uuid = uuid.UUID(node_state.node) |
1482 | + hashes[node_uuid] = node_state.hash |
1483 | + return hashes |
1484 | + |
1485 | + d.addCallback(_collect_hashes) |
1486 | + return d |
1487 | + |
1488 | + @log_timing |
1489 | + def get_incoming_shares(self): |
1490 |	+        """Returns a list of incoming shares as (name, uuid,	|
1491 |	+        other_visible_name, accepted, access_level) tuples.	|
1492 | + |
1493 | + """ |
1494 | + _list_shares = self.factory.current_protocol.list_shares |
1495 | + r = self.defer_from_thread(_list_shares) |
1496 | + return [(s.name, s.id, s.other_visible_name, |
1497 | + s.accepted, s.access_level) \ |
1498 | + for s in r.shares if s.direction == "to_me"] |
1499 | |
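The `client.py` added above funnels every reactor result through a `Waiter`/`Queue` pair so that calling threads can block on asynchronous work. The blocking-wait pattern can be sketched in isolation; this is a simplified Python 3 re-statement (the method names loosely follow the diff, but the code is not the diff's exact Python 2 implementation):

```python
from queue import Queue
from threading import Thread

class Waiter:
    """A thread blocks in wait() until another thread wakes it with
    either a result or an exception."""
    def __init__(self):
        self.queue = Queue()

    def wake(self, result):
        """Wake the waiter with a result."""
        self.queue.put((result, None))

    def wake_and_raise(self, exc):
        """Wake the waiter, raising the given exception in it."""
        self.queue.put((None, exc))

    def wait(self):
        """Block until woken; return the result or re-raise the exception."""
        result, exc = self.queue.get()
        if exc is not None:
            raise exc
        return result

waiter = Waiter()
worker = Thread(target=lambda: waiter.wake(42))
worker.start()
value = waiter.wait()  # blocks until the worker thread calls wake()
worker.join()
```

The same shape underlies `defer_from_thread` in the diff: the reactor thread fires the callback that wakes the waiter, while the calling thread sits in `wait()`.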
1500 | === added file 'src/u1sync/constants.py' |
1501 | --- src/u1sync/constants.py 1970-01-01 00:00:00 +0000 |
1502 | +++ src/u1sync/constants.py 2010-08-27 20:21:13 +0000 |
1503 | @@ -0,0 +1,30 @@ |
1504 | +# ubuntuone.u1sync.constants |
1505 | +# |
1506 | +# u1sync constants |
1507 | +# |
1508 | +# Author: Tim Cole <tim.cole@canonical.com> |
1509 | +# |
1510 | +# Copyright 2009 Canonical Ltd. |
1511 | +# |
1512 | +# This program is free software: you can redistribute it and/or modify it |
1513 | +# under the terms of the GNU General Public License version 3, as published |
1514 | +# by the Free Software Foundation. |
1515 | +# |
1516 | +# This program is distributed in the hope that it will be useful, but |
1517 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
1518 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
1519 | +# PURPOSE. See the GNU General Public License for more details. |
1520 | +# |
1521 | +# You should have received a copy of the GNU General Public License along |
1522 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
1523 | +"""Assorted constant definitions which don't fit anywhere else.""" |
1524 | + |
1525 | +import re |
1526 | + |
1527 | +# the name of the directory u1sync uses to keep metadata about a mirror |
1528 | +METADATA_DIR_NAME = u".ubuntuone-sync" |
1529 | + |
1530 | +# filenames to ignore |
1531 | +SPECIAL_FILE_RE = re.compile(".*\\.(" |
1532 | + "(u1)?partial|part|" |
1533 | + "(u1)?conflict(\\.[0-9]+)?)$") |
1534 | |
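The `SPECIAL_FILE_RE` pattern added in `constants.py` above can be checked directly. The pattern below is copied from the diff; `is_special_file` is a hypothetical helper for illustration, not part of the branch:

```python
import re

# The exact pattern the diff adds in src/u1sync/constants.py.
SPECIAL_FILE_RE = re.compile(".*\\.("
                             "(u1)?partial|part|"
                             "(u1)?conflict(\\.[0-9]+)?)$")

def is_special_file(name):
    """True when the filename is a u1sync working file (partial download
    or conflict copy) that sync logic should skip."""
    return SPECIAL_FILE_RE.match(name) is not None
```

Note that `.*\.` requires at least one dot before the suffix, so a bare name like `partial` is not treated as special, while `photo.jpg.u1partial` and numbered conflict copies such as `doc.odt.conflict.3` are.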
1535 | === added file 'src/u1sync/genericmerge.py' |
1536 | --- src/u1sync/genericmerge.py 1970-01-01 00:00:00 +0000 |
1537 | +++ src/u1sync/genericmerge.py 2010-08-27 20:21:13 +0000 |
1538 | @@ -0,0 +1,88 @@ |
1539 | +# ubuntuone.u1sync.genericmerge |
1540 | +# |
1541 | +# Generic merge function |
1542 | +# |
1543 | +# Author: Tim Cole <tim.cole@canonical.com> |
1544 | +# |
1545 | +# Copyright 2009 Canonical Ltd. |
1546 | +# |
1547 | +# This program is free software: you can redistribute it and/or modify it |
1548 | +# under the terms of the GNU General Public License version 3, as published |
1549 | +# by the Free Software Foundation. |
1550 | +# |
1551 | +# This program is distributed in the hope that it will be useful, but |
1552 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
1553 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
1554 | +# PURPOSE. See the GNU General Public License for more details. |
1555 | +# |
1556 | +# You should have received a copy of the GNU General Public License along |
1557 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
1558 | +"""A generic abstraction for merge operations on directory trees.""" |
1559 | + |
1560 | +from itertools import chain |
1561 | +from ubuntuone.storageprotocol.dircontent_pb2 import DIRECTORY |
1562 | + |
1563 | +class MergeNode(object): |
1564 | + """A filesystem node. Should generally be treated as immutable.""" |
1565 | + def __init__(self, node_type, content_hash=None, uuid=None, children=None, |
1566 | + conflict_info=None): |
1567 | + """Initializes a node instance.""" |
1568 | + self.node_type = node_type |
1569 | + self.children = children |
1570 | + self.uuid = uuid |
1571 | + self.content_hash = content_hash |
1572 | + self.conflict_info = conflict_info |
1573 | + |
1574 | + def __eq__(self, other): |
1575 | + """Equality test.""" |
1576 | + if type(other) is not type(self): |
1577 | + return False |
1578 | + return self.node_type == other.node_type and \ |
1579 | + self.children == other.children and \ |
1580 | + self.uuid == other.uuid and \ |
1581 | + self.content_hash == other.content_hash and \ |
1582 | + self.conflict_info == other.conflict_info |
1583 | + |
1584 | + def __ne__(self, other): |
1585 | + """Non-equality test.""" |
1586 | + return not self.__eq__(other) |
1587 | + |
1588 | + |
1589 | +def show_tree(tree, indent="", name="/"): |
1590 | + """Prints a tree.""" |
1591 | +    # TODO: return a string instead of printing |
1592 | + if tree.node_type == DIRECTORY: |
1593 | + type_str = "DIR " |
1594 | + else: |
1595 | + type_str = "FILE" |
1596 | + print "%s%-36s %s %s %s" % (indent, tree.uuid, type_str, name, |
1597 | + tree.content_hash) |
1598 | + if tree.node_type == DIRECTORY and tree.children is not None: |
1599 | + for name in sorted(tree.children.keys()): |
1600 | + subtree = tree.children[name] |
1601 | + show_tree(subtree, indent=" " + indent, name=name) |
1602 | + |
1603 | +def generic_merge(trees, pre_merge, post_merge, partial_parent, name): |
1604 | + """Generic tree merging function.""" |
1605 | + |
1606 | + partial_result = pre_merge(nodes=trees, name=name, |
1607 | + partial_parent=partial_parent) |
1608 | + |
1609 | + def tree_children(tree): |
1610 | + """Returns children if tree is not None""" |
1611 | + return tree.children if tree is not None else None |
1612 | + |
1613 | + child_dicts = [tree_children(t) or {} for t in trees] |
1614 | + child_names = set(chain(*[cs.iterkeys() for cs in child_dicts])) |
1615 | + child_results = {} |
1616 | + for child_name in child_names: |
1617 | + subtrees = [cs.get(child_name, None) for cs in child_dicts] |
1618 | + child_result = generic_merge(trees=subtrees, |
1619 | + pre_merge=pre_merge, |
1620 | + post_merge=post_merge, |
1621 | + partial_parent=partial_result, |
1622 | + name=child_name) |
1623 | + child_results[child_name] = child_result |
1624 | + |
1625 | + return post_merge(nodes=trees, partial_result=partial_result, |
1626 | + child_results=child_results) |
1627 | |
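The recursion in `generic_merge` walks several trees in lockstep, calling `pre_merge` on the way down and `post_merge` on the way up. A self-contained Python 3 sketch of the same traversal, using plain nested dicts in place of `MergeNode` (the example trees and callbacks are made up for illustration):

```python
from itertools import chain

def generic_merge(trees, pre_merge, post_merge, partial_parent, name):
    """Walk parallel trees (dicts of name -> subtree dict, None = absent);
    pre_merge runs top-down, post_merge runs bottom-up."""
    partial = pre_merge(nodes=trees, name=name, partial_parent=partial_parent)
    child_dicts = [t if t is not None else {} for t in trees]
    child_names = set(chain(*[d.keys() for d in child_dicts]))
    child_results = {}
    for child_name in child_names:
        subtrees = [d.get(child_name) for d in child_dicts]
        child_results[child_name] = generic_merge(
            subtrees, pre_merge, post_merge, partial, child_name)
    return post_merge(nodes=trees, partial_result=partial,
                      child_results=child_results)

# Example: detect whether two trees differ, much like do_diff does.
left = {"a": {}, "dir": {"x": {}}}
right = {"a": {}, "dir": {"y": {}}}

def pre(nodes, name, partial_parent):
    l, r = nodes
    return (l is None) != (r is None)   # present on only one side?

def post(nodes, partial_result, child_results):
    return partial_result or any(child_results.values())

print(generic_merge([left, right], pre, post, False, ""))  # True
```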
1628 | === added file 'src/u1sync/main.py' |
1629 | --- src/u1sync/main.py 1970-01-01 00:00:00 +0000 |
1630 | +++ src/u1sync/main.py 2010-08-27 20:21:13 +0000 |
1631 | @@ -0,0 +1,360 @@ |
1632 | +# ubuntuone.u1sync.main |
1633 | +# |
1634 | +# Prototype directory sync client |
1635 | +# |
1636 | +# Author: Lucio Torre <lucio.torre@canonical.com> |
1637 | +# Author: Tim Cole <tim.cole@canonical.com> |
1638 | +# |
1639 | +# Copyright 2009 Canonical Ltd. |
1640 | +# |
1641 | +# This program is free software: you can redistribute it and/or modify it |
1642 | +# under the terms of the GNU General Public License version 3, as published |
1643 | +# by the Free Software Foundation. |
1644 | +# |
1645 | +# This program is distributed in the hope that it will be useful, but |
1646 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
1647 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
1648 | +# PURPOSE. See the GNU General Public License for more details. |
1649 | +# |
1650 | +# You should have received a copy of the GNU General Public License along |
1651 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
1652 | +"""A prototype directory sync client.""" |
1653 | + |
1654 | +from __future__ import with_statement |
1655 | + |
1656 | +import os |
1657 | +import sys |
1658 | +import uuid |
1659 | +import signal |
1660 | +import logging |
1661 | +from Queue import Queue |
1662 | +from errno import EEXIST |
1663 | +from twisted.internet import reactor |
1664 | +# import the storage protocol to allow communication with the server |
1665 | +import ubuntuone.storageprotocol.dircontent_pb2 as dircontent_pb2 |
1666 | +from ubuntuone.storageprotocol.dircontent_pb2 import DIRECTORY, SYMLINK |
1667 | +# import helper modules |
1668 | +from u1sync.genericmerge import ( |
1669 | + show_tree, generic_merge) |
1670 | +from u1sync.client import ( |
1671 | + ConnectionError, AuthenticationError, NoSuchShareError, |
1672 | + ForcedShutdown, Client) |
1673 | +from u1sync.scan import scan_directory |
1674 | +from u1sync.merge import ( |
1675 | + SyncMerge, ClobberServerMerge, ClobberLocalMerge, merge_trees) |
1676 | +from u1sync.sync import download_tree, upload_tree |
1677 | +from u1sync.utils import safe_mkdir |
1678 | +from u1sync import metadata |
1679 | +from u1sync.constants import METADATA_DIR_NAME |
1680 | +from u1sync.ubuntuone_optparse import UbuntuOneOptionsParser |
1681 | + |
1682 | +# pylint: disable-msg=W0212 |
1683 | +NODE_TYPE_ENUM = dircontent_pb2._NODETYPE |
1684 | +# pylint: enable-msg=W0212 |
1685 | +def node_type_str(node_type): |
1686 | + """Converts a numeric node type to a human-readable string.""" |
1687 | + return NODE_TYPE_ENUM.values_by_number[node_type].name |
1688 | + |
1689 | + |
1690 | +class ReadOnlyShareError(Exception): |
1691 | + """Share is read-only.""" |
1692 | + |
1693 | + |
1694 | +class DirectoryAlreadyInitializedError(Exception): |
1695 | + """The directory has already been initialized.""" |
1696 | + |
1697 | + |
1698 | +class DirectoryNotInitializedError(Exception): |
1699 | + """The directory has not been initialized.""" |
1700 | + |
1701 | + |
1702 | +class NoParentError(Exception): |
1703 | + """A node has no parent.""" |
1704 | + |
1705 | + |
1706 | +class TreesDiffer(Exception): |
1707 | + """Raised when diff tree differs.""" |
1708 | + def __init__(self, quiet): |
1709 | + self.quiet = quiet |
1710 | + |
1711 | + |
1712 | +MERGE_ACTIONS = { |
1713 | + # action: (merge_class, should_upload, should_download) |
1714 | + 'sync': (SyncMerge, True, True), |
1715 | + 'clobber-server': (ClobberServerMerge, True, False), |
1716 | + 'clobber-local': (ClobberLocalMerge, False, True), |
1717 | + 'upload': (SyncMerge, True, False), |
1718 | + 'download': (SyncMerge, False, True), |
1719 | + 'auto': None # special case |
1720 | +} |
1721 | + |
1722 | +DEFAULT_MERGE_ACTION = 'auto' |
1723 | + |
1724 | +def do_init(client, share_spec, directory, quiet, subtree_path, |
1725 | + metadata=metadata): |
1726 | + """Initializes a directory for syncing, and syncs it.""" |
1727 | + info = metadata.Metadata() |
1728 | + |
1729 | + if share_spec is not None: |
1730 | + info.share_uuid = client.find_volume(share_spec) |
1731 | + else: |
1732 | + info.share_uuid = None |
1733 | + |
1734 | + if subtree_path is not None: |
1735 | + info.path = subtree_path |
1736 | + else: |
1737 | + info.path = "/" |
1738 | + |
1739 | + logging.info("Initializing directory...") |
1740 | + safe_mkdir(directory) |
1741 | + |
1742 | + metadata_dir = os.path.join(directory, METADATA_DIR_NAME) |
1743 | + try: |
1744 | + os.mkdir(metadata_dir) |
1745 | + except OSError, e: |
1746 | + if e.errno == EEXIST: |
1747 | + raise DirectoryAlreadyInitializedError(directory) |
1748 | + else: |
1749 | + raise |
1750 | + |
1751 | + logging.info("Writing mirror metadata...") |
1752 | + metadata.write(metadata_dir, info) |
1753 | + |
1754 | + logging.info("Done.") |
1755 | + |
1756 | +def do_sync(client, directory, action, dry_run, quiet): |
1757 | + """Synchronizes a directory with the given share.""" |
1758 | + absolute_path = os.path.abspath(directory) |
1759 | + while True: |
1760 | + metadata_dir = os.path.join(absolute_path, METADATA_DIR_NAME) |
1761 | + if os.path.exists(metadata_dir): |
1762 | + break |
1763 | + if absolute_path == "/": |
1764 | + raise DirectoryNotInitializedError(directory) |
1765 | + absolute_path = os.path.split(absolute_path)[0] |
1766 | + |
1767 | + logging.info("Reading mirror metadata...") |
1768 | + info = metadata.read(metadata_dir) |
1769 | + |
1770 | + top_uuid, writable = client.get_root_info(info.share_uuid) |
1771 | + |
1772 | + if info.root_uuid is None: |
1773 | + info.root_uuid = client.resolve_path(info.share_uuid, top_uuid, |
1774 | + info.path) |
1775 | + |
1776 | + if action == 'auto': |
1777 | + if writable: |
1778 | + action = 'sync' |
1779 | + else: |
1780 | + action = 'download' |
1781 | + merge_type, should_upload, should_download = MERGE_ACTIONS[action] |
1782 | + if should_upload and not writable: |
1783 | + raise ReadOnlyShareError(info.share_uuid) |
1784 | + |
1785 | + logging.info("Scanning directory...") |
1786 | + |
1787 | + local_tree = scan_directory(absolute_path, quiet=quiet) |
1788 | + |
1789 | + logging.info("Fetching metadata...") |
1790 | + |
1791 | + remote_tree = client.build_tree(info.share_uuid, info.root_uuid) |
1792 | + if not quiet: |
1793 | + show_tree(remote_tree) |
1794 | + |
1795 | + logging.info("Merging trees...") |
1796 | + merged_tree = merge_trees(old_local_tree=info.local_tree, |
1797 | + local_tree=local_tree, |
1798 | + old_remote_tree=info.remote_tree, |
1799 | + remote_tree=remote_tree, |
1800 | + merge_action=merge_type()) |
1801 | + if not quiet: |
1802 | + show_tree(merged_tree) |
1803 | + |
1804 | + logging.info("Syncing content...") |
1805 | + if should_download: |
1806 | + info.local_tree = download_tree(merged_tree=merged_tree, |
1807 | + local_tree=local_tree, |
1808 | + client=client, |
1809 | + share_uuid=info.share_uuid, |
1810 | + path=absolute_path, dry_run=dry_run, |
1811 | + quiet=quiet) |
1812 | + else: |
1813 | + info.local_tree = local_tree |
1814 | + if should_upload: |
1815 | + info.remote_tree = upload_tree(merged_tree=merged_tree, |
1816 | + remote_tree=remote_tree, |
1817 | + client=client, |
1818 | + share_uuid=info.share_uuid, |
1819 | + path=absolute_path, dry_run=dry_run, |
1820 | + quiet=quiet) |
1821 | + else: |
1822 | + info.remote_tree = remote_tree |
1823 | + |
1824 | + if not dry_run: |
1825 | + logging.info("Updating mirror metadata...") |
1826 | + metadata.write(metadata_dir, info) |
1827 | + |
1828 | + logging.info("Done.") |
1829 | + |
1830 | +def do_list_shares(client): |
1831 | + """Lists available (incoming) shares.""" |
1832 | + shares = client.get_incoming_shares() |
1833 | + for (name, id, user, accepted, access) in shares: |
1834 | + if not accepted: |
1835 | + status = " [not accepted]" |
1836 | + else: |
1837 | + status = "" |
1838 | + name = name.encode("utf-8") |
1839 | + user = user.encode("utf-8") |
1840 | + print "%s %s (from %s) [%s]%s" % (id, name, user, access, status) |
1841 | + |
1842 | +def do_diff(client, share_spec, directory, quiet, subtree_path, |
1843 | + ignore_symlinks=True): |
1844 | + """Diffs a local directory with the server.""" |
1845 | + if share_spec is not None: |
1846 | + share_uuid = client.find_volume(share_spec) |
1847 | + else: |
1848 | + share_uuid = None |
1849 | + if subtree_path is None: |
1850 | + subtree_path = '/' |
1851 | + # pylint: disable-msg=W0612 |
1852 | + root_uuid, writable = client.get_root_info(share_uuid) |
1853 | + subtree_uuid = client.resolve_path(share_uuid, root_uuid, subtree_path) |
1854 | + local_tree = scan_directory(directory, quiet=True) |
1855 | + remote_tree = client.build_tree(share_uuid, subtree_uuid) |
1856 | + |
1857 | + def pre_merge(nodes, name, partial_parent): |
1858 | + """Compares nodes and prints differences.""" |
1859 | + (local_node, remote_node) = nodes |
1860 | + # pylint: disable-msg=W0612 |
1861 | + (parent_display_path, parent_differs) = partial_parent |
1862 | + display_path = os.path.join(parent_display_path, name.encode("UTF-8")) |
1863 | + differs = True |
1864 | + if local_node is None: |
1865 | +            logging.info("%s missing from client", display_path) |
1866 | + elif remote_node is None: |
1867 | + if ignore_symlinks and local_node.node_type == SYMLINK: |
1868 | + differs = False |
1869 | +            logging.info("%s missing from server", display_path) |
1870 | + elif local_node.node_type != remote_node.node_type: |
1871 | + local_type = node_type_str(local_node.node_type) |
1872 | + remote_type = node_type_str(remote_node.node_type) |
1873 | + logging.info("%s node types differ (client: %s, server: %s)", |
1874 | + display_path, local_type, remote_type) |
1875 | + elif local_node.node_type != DIRECTORY and \ |
1876 | + local_node.content_hash != remote_node.content_hash: |
1877 | + local_content = local_node.content_hash |
1878 | + remote_content = remote_node.content_hash |
1879 | + logging.info("%s has different content (client: %s, server: %s)", |
1880 | + display_path, local_content, remote_content) |
1881 | + else: |
1882 | + differs = False |
1883 | + return (display_path, differs) |
1884 | + |
1885 | + def post_merge(nodes, partial_result, child_results): |
1886 | + """Aggregates 'differs' flags.""" |
1887 | + # pylint: disable-msg=W0612 |
1888 | + (display_path, differs) = partial_result |
1889 | + return differs or any(child_results.itervalues()) |
1890 | + |
1891 | + differs = generic_merge(trees=[local_tree, remote_tree], |
1892 | + pre_merge=pre_merge, post_merge=post_merge, |
1893 | + partial_parent=("", False), name=u"") |
1894 | + if differs: |
1895 | + raise TreesDiffer(quiet=quiet) |
1896 | + |
1897 | +def do_main(argv, options_parser): |
1898 | + """The main user-facing portion of the script.""" |
1899 | +    # parse and validate the command-line arguments |
1900 | + options_parser.get_options(argv) |
1901 | + client = Client(realm=options_parser.options.realm, reactor=reactor) |
1902 | +    # set the logging level to INFO when the user passes --quiet |
1903 | + if options_parser.options.quiet: |
1904 | + logging.basicConfig(level=logging.INFO) |
1905 | + |
1906 | + signal.signal(signal.SIGINT, lambda s, f: client.force_shutdown()) |
1907 | + signal.signal(signal.SIGTERM, lambda s, f: client.force_shutdown()) |
1908 | + |
1909 | + def run_client(): |
1910 | + """Run the blocking client.""" |
1911 | + token = options_parser.options.token |
1912 | + |
1913 | + client.connect_ssl(options_parser.options.host, |
1914 | + int(options_parser.options.port), |
1915 | + options_parser.options.no_ssl_verify) |
1916 | + |
1917 | + try: |
1918 | + client.set_capabilities() |
1919 | + client.oauth_from_token(options_parser.options.token) |
1920 | + |
1921 | + if options_parser.options.mode == "sync": |
1922 | + do_sync(client=client, directory=options_parser.options.directory, |
1923 | + action=options_parser.options.action, |
1924 | + dry_run=options_parser.options.dry_run, |
1925 | + quiet=options_parser.options.quiet) |
1926 | + elif options_parser.options.mode == "init": |
1927 | + do_init(client=client, share_spec=options_parser.options.share, |
1928 | + directory=options_parser.options.directory, |
1929 | + quiet=options_parser.options.quiet, subtree_path=options_parser.options.subtree) |
1930 | +        elif options_parser.options.mode == "list-shares": |
1931 | + do_list_shares(client=client) |
1932 | +        elif options_parser.options.mode == "diff": |
1933 | +            do_diff(client=client, share_spec=options_parser.options.share, |
1934 | +                directory=options_parser.options.directory, |
1935 | + quiet=options_parser.options.quiet, |
1936 | + subtree_path=options_parser.options.subtree, |
1937 | + ignore_symlinks=False) |
1938 | + elif options_parser.options.mode == "authorize": |
1939 | + if not options_parser.options.quiet: |
1940 | + print "Authorized." |
1941 | + finally: |
1942 | + client.disconnect() |
1943 | + |
1944 | + def capture_exception(queue, func): |
1945 | + """Capture the exception from calling func.""" |
1946 | + try: |
1947 | + func() |
1948 | + except Exception: |
1949 | + queue.put(sys.exc_info()) |
1950 | + else: |
1951 | + queue.put(None) |
1952 | + finally: |
1953 | + reactor.callWhenRunning(reactor.stop) |
1954 | + |
1955 | + queue = Queue() |
1956 | + reactor.callInThread(capture_exception, queue, run_client) |
1957 | + reactor.run(installSignalHandlers=False) |
1958 | + exc_info = queue.get(True, 0.1) |
1959 | + if exc_info: |
1960 | + raise exc_info[0], exc_info[1], exc_info[2] |
1961 | + |
1962 | +def main(argv): |
1963 | + """Top-level main function.""" |
1964 | + try: |
1965 | + do_main(argv, UbuntuOneOptionsParser()) |
1966 | + except AuthenticationError, e: |
1967 | + print "Authentication failed: %s" % e |
1968 | + except ConnectionError, e: |
1969 | + print "Connection failed: %s" % e |
1970 | + except DirectoryNotInitializedError: |
1971 | + print "Directory not initialized; " \ |
1972 | + "use --init [DIRECTORY] to initialize it." |
1973 | + except DirectoryAlreadyInitializedError: |
1974 | + print "Directory already initialized." |
1975 | + except NoSuchShareError: |
1976 | + print "No matching share found." |
1977 | + except ReadOnlyShareError: |
1978 | + print "The selected action isn't possible on a read-only share." |
1979 | + except (ForcedShutdown, KeyboardInterrupt): |
1980 | + print "Interrupted!" |
1981 | + except TreesDiffer, e: |
1982 | + if not e.quiet: |
1983 | + print "Trees differ." |
1984 | + else: |
1985 | + return 0 |
1986 | + return 1 |
1987 | + |
1988 | +if __name__ == "__main__": |
1989 | + # set the name of the process, this just works on windows |
1990 | + sys.argv[0] = "Ubuntuone" |
1991 | + main(sys.argv[1:]) |
1992 | \ No newline at end of file |
1993 | |
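`do_sync` above resolves the `auto` action from the share's writability before consulting the `MERGE_ACTIONS` table. That dispatch can be sketched on its own (merge classes replaced by their names here, since only the tuple logic matters):

```python
# (merge class name, should_upload, should_download) -- mirrors MERGE_ACTIONS
MERGE_ACTIONS = {
    'sync':           ('SyncMerge',          True,  True),
    'clobber-server': ('ClobberServerMerge', True,  False),
    'clobber-local':  ('ClobberLocalMerge',  False, True),
    'upload':         ('SyncMerge',          True,  False),
    'download':       ('SyncMerge',          False, True),
}

def resolve_action(action, writable):
    """Turn an action name plus share writability into a concrete plan,
    rejecting uploads to read-only shares as do_sync does."""
    if action == 'auto':
        action = 'sync' if writable else 'download'
    merge, should_upload, should_download = MERGE_ACTIONS[action]
    if should_upload and not writable:
        raise PermissionError("share is read-only")
    return merge, should_upload, should_download

print(resolve_action('auto', writable=False))  # ('SyncMerge', False, True)
```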
1994 | === added file 'src/u1sync/merge.py' |
1995 | --- src/u1sync/merge.py 1970-01-01 00:00:00 +0000 |
1996 | +++ src/u1sync/merge.py 2010-08-27 20:21:13 +0000 |
1997 | @@ -0,0 +1,186 @@ |
1998 | +# ubuntuone.u1sync.merge |
1999 | +# |
2000 | +# Tree state merging |
2001 | +# |
2002 | +# Author: Tim Cole <tim.cole@canonical.com> |
2003 | +# |
2004 | +# Copyright 2009 Canonical Ltd. |
2005 | +# |
2006 | +# This program is free software: you can redistribute it and/or modify it |
2007 | +# under the terms of the GNU General Public License version 3, as published |
2008 | +# by the Free Software Foundation. |
2009 | +# |
2010 | +# This program is distributed in the hope that it will be useful, but |
2011 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
2012 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
2013 | +# PURPOSE. See the GNU General Public License for more details. |
2014 | +# |
2015 | +# You should have received a copy of the GNU General Public License along |
2016 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
2017 | +"""Code for merging changes between modified trees.""" |
2018 | + |
2019 | +from __future__ import with_statement |
2020 | + |
2021 | +import os |
2022 | + |
2023 | +from ubuntuone.storageprotocol.dircontent_pb2 import DIRECTORY |
2024 | +from u1sync.genericmerge import ( |
2025 | + MergeNode, generic_merge) |
2026 | +import uuid |
2027 | + |
2028 | + |
2029 | +class NodeTypeMismatchError(Exception): |
2030 | + """Node types don't match.""" |
2031 | + |
2032 | + |
2033 | +def merge_trees(old_local_tree, local_tree, old_remote_tree, remote_tree, |
2034 | + merge_action): |
2035 | + """Performs a tree merge using the given merge action.""" |
2036 | + |
2037 | + def pre_merge(nodes, name, partial_parent): |
2038 | + """Accumulates path and determines merged node type.""" |
2039 | + old_local_node, local_node, old_remote_node, remote_node = nodes |
2040 | + # pylint: disable-msg=W0612 |
2041 | + (parent_path, parent_type) = partial_parent |
2042 | + path = os.path.join(parent_path, name.encode("utf-8")) |
2043 | + node_type = merge_action.get_node_type(old_local_node=old_local_node, |
2044 | + local_node=local_node, |
2045 | + old_remote_node=old_remote_node, |
2046 | + remote_node=remote_node, |
2047 | + path=path) |
2048 | + return (path, node_type) |
2049 | + |
2050 | + def post_merge(nodes, partial_result, child_results): |
2051 | + """Drops deleted children and merges node.""" |
2052 | + old_local_node, local_node, old_remote_node, remote_node = nodes |
2053 | + # pylint: disable-msg=W0612 |
2054 | + (path, node_type) = partial_result |
2055 | + if node_type == DIRECTORY: |
2056 | + merged_children = dict([(name, child) for (name, child) |
2057 | + in child_results.iteritems() |
2058 | + if child is not None]) |
2059 | + else: |
2060 | + merged_children = None |
2061 | + return merge_action.merge_node(old_local_node=old_local_node, |
2062 | + local_node=local_node, |
2063 | + old_remote_node=old_remote_node, |
2064 | + remote_node=remote_node, |
2065 | + node_type=node_type, |
2066 | + merged_children=merged_children) |
2067 | + |
2068 | + return generic_merge(trees=[old_local_tree, local_tree, |
2069 | + old_remote_tree, remote_tree], |
2070 | + pre_merge=pre_merge, post_merge=post_merge, |
2071 | + name=u"", partial_parent=("", None)) |
2072 | + |
2073 | + |
2074 | +class SyncMerge(object): |
2075 | + """Performs a bidirectional sync merge.""" |
2076 | + |
2077 | + def get_node_type(self, old_local_node, local_node, |
2078 | + old_remote_node, remote_node, path): |
2079 | + """Requires that all node types match.""" |
2080 | + node_type = None |
2081 | + for node in (old_local_node, local_node, remote_node): |
2082 | + if node is not None: |
2083 | + if node_type is not None: |
2084 | + if node.node_type != node_type: |
2085 | + message = "Node types don't match for %s" % path |
2086 | + raise NodeTypeMismatchError(message) |
2087 | + else: |
2088 | + node_type = node.node_type |
2089 | + return node_type |
2090 | + |
2091 | + def merge_node(self, old_local_node, local_node, |
2092 | + old_remote_node, remote_node, node_type, merged_children): |
2093 | + """Performs bidirectional merge of node state.""" |
2094 | + |
2095 | + def node_content_hash(node): |
2096 | + """Returns node content hash if node is not None""" |
2097 | + return node.content_hash if node is not None else None |
2098 | + |
2099 | + old_local_content_hash = node_content_hash(old_local_node) |
2100 | + local_content_hash = node_content_hash(local_node) |
2101 | + old_remote_content_hash = node_content_hash(old_remote_node) |
2102 | + remote_content_hash = node_content_hash(remote_node) |
2103 | + |
2104 | + locally_deleted = old_local_node is not None and local_node is None |
2105 | + deleted_on_server = old_remote_node is not None and remote_node is None |
2106 | + # updated means modified or created |
2107 | + locally_updated = not locally_deleted and \ |
2108 | + old_local_content_hash != local_content_hash |
2109 | + updated_on_server = not deleted_on_server and \ |
2110 | + old_remote_content_hash != remote_content_hash |
2111 | + |
2112 | + has_merged_children = merged_children is not None and \ |
2113 | + len(merged_children) > 0 |
2114 | + |
2115 | + either_node_exists = local_node is not None or remote_node is not None |
2116 | + should_delete = (locally_deleted and not updated_on_server) or \ |
2117 | + (deleted_on_server and not locally_updated) |
2118 | + |
2119 | + if (either_node_exists and not should_delete) or has_merged_children: |
2120 | + if node_type != DIRECTORY and \ |
2121 | + locally_updated and updated_on_server and \ |
2122 | + local_content_hash != remote_content_hash: |
2123 | + # local_content_hash will become the merged content_hash; |
2124 | + # save remote_content_hash in conflict info |
2125 | + conflict_info = (str(uuid.uuid4()), remote_content_hash) |
2126 | + else: |
2127 | + conflict_info = None |
2128 | + node_uuid = remote_node.uuid if remote_node is not None else None |
2129 | + if locally_updated: |
2130 | + content_hash = local_content_hash or remote_content_hash |
2131 | + else: |
2132 | + content_hash = remote_content_hash or local_content_hash |
2133 | + return MergeNode(node_type=node_type, uuid=node_uuid, |
2134 | + children=merged_children, content_hash=content_hash, |
2135 | + conflict_info=conflict_info) |
2136 | + else: |
2137 | + return None |
2138 | + |
2139 | + |
2140 | +class ClobberServerMerge(object): |
2141 | + """Clobber server to match local state.""" |
2142 | + |
2143 | + def get_node_type(self, old_local_node, local_node, |
2144 | + old_remote_node, remote_node, path): |
2145 | + """Return local node type.""" |
2146 | + if local_node is not None: |
2147 | + return local_node.node_type |
2148 | + else: |
2149 | + return None |
2150 | + |
2151 | + def merge_node(self, old_local_node, local_node, |
2152 | + old_remote_node, remote_node, node_type, merged_children): |
2153 | + """Copy local node and associate with remote uuid (if applicable).""" |
2154 | + if local_node is None: |
2155 | + return None |
2156 | + if remote_node is not None: |
2157 | + node_uuid = remote_node.uuid |
2158 | + else: |
2159 | + node_uuid = None |
2160 | + return MergeNode(node_type=local_node.node_type, uuid=node_uuid, |
2161 | + content_hash=local_node.content_hash, |
2162 | + children=merged_children) |
2163 | + |
2164 | + |
2165 | +class ClobberLocalMerge(object): |
2166 | + """Clobber local state to match server.""" |
2167 | + |
2168 | + def get_node_type(self, old_local_node, local_node, |
2169 | + old_remote_node, remote_node, path): |
2170 | + """Return remote node type.""" |
2171 | + if remote_node is not None: |
2172 | + return remote_node.node_type |
2173 | + else: |
2174 | + return None |
2175 | + |
2176 | + def merge_node(self, old_local_node, local_node, |
2177 | + old_remote_node, remote_node, node_type, merged_children): |
2178 | + """Copy the remote node.""" |
2179 | + if remote_node is None: |
2180 | + return None |
2181 | + return MergeNode(node_type=node_type, uuid=remote_node.uuid, |
2182 | + content_hash=remote_node.content_hash, |
2183 | + children=merged_children) |
2184 | |
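`SyncMerge.merge_node` boils down to a handful of booleans derived from the four content hashes. Isolating that decision for a single file node makes it easier to follow; `merge_file` below is a hypothetical helper (not part of the module) where each argument is a content hash string or None for "absent":

```python
def merge_file(old_local, local, old_remote, remote):
    """Decide the fate of one file from its four content hashes.
    Returns None (delete), ('keep', hash) or ('conflict', local, remote)."""
    locally_deleted = old_local is not None and local is None
    deleted_on_server = old_remote is not None and remote is None
    # "updated" covers both modified and newly created, as in SyncMerge
    locally_updated = not locally_deleted and old_local != local
    updated_on_server = not deleted_on_server and old_remote != remote

    should_delete = (locally_deleted and not updated_on_server) or \
                    (deleted_on_server and not locally_updated)
    if (local is None and remote is None) or should_delete:
        return None
    if locally_updated and updated_on_server and local != remote:
        # both sides changed to different content: keep local, record the
        # server version as a conflict (SyncMerge tags it with a uuid)
        return ('conflict', local, remote)
    if locally_updated:
        return ('keep', local or remote)
    return ('keep', remote or local)

print(merge_file('h1', None, 'h1', 'h1'))  # None: deleted locally, unchanged remotely
print(merge_file('h1', 'h2', 'h1', 'h3'))  # ('conflict', 'h2', 'h3')
```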
2185 | === added file 'src/u1sync/metadata.py' |
2186 | --- src/u1sync/metadata.py 1970-01-01 00:00:00 +0000 |
2187 | +++ src/u1sync/metadata.py 2010-08-27 20:21:13 +0000 |
2188 | @@ -0,0 +1,145 @@ |
2189 | +# ubuntuone.u1sync.metadata |
2190 | +# |
2191 | +# u1sync metadata routines |
2192 | +# |
2193 | +# Author: Tim Cole <tim.cole@canonical.com> |
2194 | +# |
2195 | +# Copyright 2009 Canonical Ltd. |
2196 | +# |
2197 | +# This program is free software: you can redistribute it and/or modify it |
2198 | +# under the terms of the GNU General Public License version 3, as published |
2199 | +# by the Free Software Foundation. |
2200 | +# |
2201 | +# This program is distributed in the hope that it will be useful, but |
2202 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
2203 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
2204 | +# PURPOSE. See the GNU General Public License for more details. |
2205 | +# |
2206 | +# You should have received a copy of the GNU General Public License along |
2207 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
2208 | +"""Routines for loading/storing u1sync mirror metadata.""" |
2209 | + |
2210 | +from __future__ import with_statement |
2211 | + |
2212 | +import os |
2213 | +import cPickle as pickle |
2214 | +from errno import ENOENT |
2215 | +from contextlib import contextmanager |
2216 | +from ubuntuone.storageprotocol.dircontent_pb2 import DIRECTORY |
2217 | +from u1sync.merge import MergeNode |
2218 | +from u1sync.utils import safe_unlink |
2219 | +import uuid |
2220 | + |
2221 | +class Metadata(object): |
2222 | + """Object representing mirror metadata.""" |
2223 | + def __init__(self, local_tree=None, remote_tree=None, share_uuid=None, |
2224 | + root_uuid=None, path=None): |
2225 | + """Populate fields.""" |
2226 | + self.local_tree = local_tree |
2227 | + self.remote_tree = remote_tree |
2228 | + self.share_uuid = share_uuid |
2229 | + self.root_uuid = root_uuid |
2230 | + self.path = path |
2231 | + |
2232 | +def read(metadata_dir): |
2233 | + """Read metadata for a mirror rooted at directory.""" |
2234 | + index_file = os.path.join(metadata_dir, "local-index") |
2235 | + share_uuid_file = os.path.join(metadata_dir, "share-uuid") |
2236 | + root_uuid_file = os.path.join(metadata_dir, "root-uuid") |
2237 | + path_file = os.path.join(metadata_dir, "path") |
2238 | + |
2239 | + index = read_pickle_file(index_file, {}) |
2240 | + share_uuid = read_uuid_file(share_uuid_file) |
2241 | + root_uuid = read_uuid_file(root_uuid_file) |
2242 | + path = read_string_file(path_file, '/') |
2243 | + |
2244 | + local_tree = index.get("tree", None) |
2245 | + remote_tree = index.get("remote_tree", None) |
2246 | + |
2247 | + if local_tree is None: |
2248 | + local_tree = MergeNode(node_type=DIRECTORY, children={}) |
2249 | + if remote_tree is None: |
2250 | + remote_tree = MergeNode(node_type=DIRECTORY, children={}) |
2251 | + |
2252 | + return Metadata(local_tree=local_tree, remote_tree=remote_tree, |
2253 | + share_uuid=share_uuid, root_uuid=root_uuid, |
2254 | + path=path) |
2255 | + |
2256 | +def write(metadata_dir, info): |
2257 | + """Writes all metadata for the mirror rooted at directory.""" |
2258 | + share_uuid_file = os.path.join(metadata_dir, "share-uuid") |
2259 | + root_uuid_file = os.path.join(metadata_dir, "root-uuid") |
2260 | + index_file = os.path.join(metadata_dir, "local-index") |
2261 | + path_file = os.path.join(metadata_dir, "path") |
2262 | + if info.share_uuid is not None: |
2263 | + write_uuid_file(share_uuid_file, info.share_uuid) |
2264 | + else: |
2265 | + safe_unlink(share_uuid_file) |
2266 | + if info.root_uuid is not None: |
2267 | + write_uuid_file(root_uuid_file, info.root_uuid) |
2268 | + else: |
2269 | + safe_unlink(root_uuid_file) |
2270 | + write_string_file(path_file, info.path) |
2271 | + write_pickle_file(index_file, {"tree": info.local_tree, |
2272 | + "remote_tree": info.remote_tree}) |
2273 | + |
2274 | +def write_pickle_file(filename, value): |
2275 | + """Writes a pickled python object to a file.""" |
2276 | + with atomic_update_file(filename) as stream: |
2277 | + pickle.dump(value, stream, 2) |
2278 | + |
2279 | +def write_string_file(filename, value): |
2280 | + """Writes a string to a file with an added line feed, or |
2281 | + deletes the file if value is None. |
2282 | + """ |
2283 | + if value is not None: |
2284 | + with atomic_update_file(filename) as stream: |
2285 | + stream.write(value) |
2286 | + stream.write('\n') |
2287 | + else: |
2288 | + safe_unlink(filename) |
2289 | + |
2290 | +def write_uuid_file(filename, value): |
2291 | + """Writes a UUID to a file.""" |
2292 | + write_string_file(filename, str(value)) |
2293 | + |
2294 | +def read_pickle_file(filename, default_value=None): |
2295 | + """Reads a pickled python object from a file.""" |
2296 | + try: |
2297 | + with open(filename, "rb") as stream: |
2298 | + return pickle.load(stream) |
2299 | + except IOError, e: |
2300 | + if e.errno != ENOENT: |
2301 | + raise |
2302 | + return default_value |
2303 | + |
2304 | +def read_string_file(filename, default_value=None): |
2305 | + """Reads a string from a file, discarding the final character.""" |
2306 | + try: |
2307 | + with open(filename, "r") as stream: |
2308 | + return stream.read()[:-1] |
2309 | + except IOError, e: |
2310 | + if e.errno != ENOENT: |
2311 | + raise |
2312 | + return default_value |
2313 | + |
2314 | +def read_uuid_file(filename, default_value=None): |
2315 | + """Reads a UUID from a file.""" |
2316 | + try: |
2317 | + with open(filename, "r") as stream: |
2318 | + return uuid.UUID(stream.read()[:-1]) |
2319 | + except IOError, e: |
2320 | + if e.errno != ENOENT: |
2321 | + raise |
2322 | + return default_value |
2323 | + |
2324 | +@contextmanager |
2325 | +def atomic_update_file(filename): |
2326 | + """Returns a context manager for atomically updating a file.""" |
2327 | + temp_filename = "%s.%s" % (filename, uuid.uuid4()) |
2328 | + try: |
2329 | + with open(temp_filename, "w") as stream: |
2330 | + yield stream |
2331 | + os.rename(temp_filename, filename) |
2332 | + finally: |
2333 | + safe_unlink(temp_filename) |
2334 | |
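`atomic_update_file` above is the standard write-to-temp-then-rename pattern: readers always see either the old or the new file, never a half-written one. A standalone Python 3 equivalent (using `os.replace`, which overwrites atomically on both POSIX and Windows, unlike the `os.rename` in the Python 2 original):

```python
import os
import tempfile
import uuid
from contextlib import contextmanager

@contextmanager
def atomic_update_file(filename):
    """Yield a stream for a temp file; rename it over filename on success."""
    temp_filename = "%s.%s" % (filename, uuid.uuid4())
    try:
        with open(temp_filename, "w") as stream:
            yield stream
        os.replace(temp_filename, filename)  # atomic swap
    finally:
        try:
            os.remove(temp_filename)  # drop the temp file if the rename failed
        except FileNotFoundError:
            pass

target = os.path.join(tempfile.mkdtemp(), "path")
with atomic_update_file(target) as stream:
    stream.write("/some/subtree\n")
print(open(target).read())  # prints: /some/subtree
```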
2335 | === added file 'src/u1sync/scan.py' |
2336 | --- src/u1sync/scan.py 1970-01-01 00:00:00 +0000 |
2337 | +++ src/u1sync/scan.py 2010-08-27 20:21:13 +0000 |
2338 | @@ -0,0 +1,102 @@ |
2339 | +# ubuntuone.u1sync.scan |
2340 | +# |
2341 | +# Directory scanning |
2342 | +# |
2343 | +# Author: Tim Cole <tim.cole@canonical.com> |
2344 | +# |
2345 | +# Copyright 2009 Canonical Ltd. |
2346 | +# |
2347 | +# This program is free software: you can redistribute it and/or modify it |
2348 | +# under the terms of the GNU General Public License version 3, as published |
2349 | +# by the Free Software Foundation. |
2350 | +# |
2351 | +# This program is distributed in the hope that it will be useful, but |
2352 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
2353 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
2354 | +# PURPOSE. See the GNU General Public License for more details. |
2355 | +# |
2356 | +# You should have received a copy of the GNU General Public License along |
2357 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
2358 | +"""Code for scanning local directory state.""" |
2359 | + |
2360 | +from __future__ import with_statement |
2361 | + |
2362 | +import os |
2363 | +import hashlib |
2364 | +import shutil |
2365 | +from errno import ENOTDIR, EINVAL |
2366 | +import sys |
2367 | + |
2368 | +EMPTY_HASH = "sha1:%s" % hashlib.sha1().hexdigest() |
2369 | + |
2370 | +from ubuntuone.storageprotocol.dircontent_pb2 import \ |
2371 | + DIRECTORY, FILE, SYMLINK |
2372 | +from u1sync.genericmerge import MergeNode |
2373 | +from u1sync.utils import should_sync |
2374 | + |
2375 | +def scan_directory(path, display_path="", quiet=False): |
2376 | + """Scans a local directory and builds an in-memory tree from it.""" |
2377 | + if display_path != "" and not quiet: |
2378 | + print display_path |
2379 | + |
2380 | + link_target = None |
2381 | + child_names = None |
2382 | + try: |
2383 | + print "Path is " + str(path) |
2384 | + if sys.platform == "win32": |
2385 | + if path.endswith(".lnk") or path.endswith(".url"): |
2386 | + import win32com.client |
2387 | + import pythoncom |
2388 | + pythoncom.CoInitialize() |
2389 | + shell = win32com.client.Dispatch("WScript.Shell") |
2390 | + shortcut = shell.CreateShortCut(path) |
2391 | + print(shortcut.Targetpath) |
2392 | + link_target = shortcut.Targetpath |
2393 | + else: |
2394 | + link_target = None |
2395 | + if os.path.isdir(path): |
2396 | + child_names = os.listdir(path) |
2397 | + else: |
2398 | + link_target = os.readlink(path) |
2399 | + except OSError, e: |
2400 | + if e.errno != EINVAL: |
2401 | + raise |
2402 | + try: |
2403 | + child_names = os.listdir(path) |
2404 | + except OSError, e: |
2405 | + if e.errno != ENOTDIR: |
2406 | + raise |
2407 | + |
2408 | + if link_target is not None: |
2409 | + # symlink |
2410 | + sum = hashlib.sha1() |
2411 | + sum.update(link_target) |
2412 | + content_hash = "sha1:%s" % sum.hexdigest() |
2413 | + return MergeNode(node_type=SYMLINK, content_hash=content_hash) |
2414 | + elif child_names is not None: |
2415 | + # directory |
2416 | + child_names = [n for n in child_names if should_sync(n.decode("utf-8"))] |
2417 | + child_paths = [(os.path.join(path, child_name), |
2418 | + os.path.join(display_path, child_name)) \ |
2419 | + for child_name in child_names] |
2420 | + children = [scan_directory(child_path, child_display_path, quiet) \ |
2421 | + for (child_path, child_display_path) in child_paths] |
2422 | + unicode_child_names = [n.decode("utf-8") for n in child_names] |
2423 | + children = dict(zip(unicode_child_names, children)) |
2424 | + return MergeNode(node_type=DIRECTORY, children=children) |
2425 | + else: |
2426 | + # regular file |
2427 | + sum = hashlib.sha1() |
2428 | + |
2429 | + |
2430 | + class HashStream(object): |
2431 | + """Stream that computes hashes.""" |
2432 | + def write(self, bytes): |
2433 | + """Accumulate bytes.""" |
2434 | + sum.update(bytes) |
2435 | + |
2436 | + |
2437 | + with open(path, "r") as stream: |
2438 | + shutil.copyfileobj(stream, HashStream()) |
2439 | + content_hash = "sha1:%s" % sum.hexdigest() |
2440 | + return MergeNode(node_type=FILE, content_hash=content_hash) |
2441 | |
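The regular-file branch of `scan_directory` above hashes file contents by streaming them through a tiny write-only object with `shutil.copyfileobj`, producing the same `"sha1:<hexdigest>"` strings the server uses. A standalone sketch of that technique (`file_content_hash` is an illustrative name, not part of u1sync):

```python
import hashlib
import shutil

class HashStream(object):
    """File-like sink that feeds everything written to it into a hash."""
    def __init__(self, hash_obj):
        self.hash_obj = hash_obj

    def write(self, data):
        """Accumulate bytes into the underlying hash object."""
        self.hash_obj.update(data)

def file_content_hash(path):
    """Return a "sha1:<hexdigest>" content hash over a file's bytes."""
    sha = hashlib.sha1()
    with open(path, "rb") as stream:
        # copyfileobj reads in chunks, so large files never sit in memory.
        shutil.copyfileobj(stream, HashStream(sha))
    return "sha1:%s" % sha.hexdigest()

# Hash of zero bytes, matching EMPTY_HASH in scan.py above.
EMPTY_HASH = "sha1:%s" % hashlib.sha1().hexdigest()
```

An empty file hashes to `EMPTY_HASH`, which is how an untouched new file and an empty remote node compare equal during the merge.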
2442 | === added file 'src/u1sync/sync.py' |
2443 | --- src/u1sync/sync.py 1970-01-01 00:00:00 +0000 |
2444 | +++ src/u1sync/sync.py 2010-08-27 20:21:13 +0000 |
2445 | @@ -0,0 +1,384 @@ |
2446 | +# ubuntuone.u1sync.sync |
2447 | +# |
2448 | +# State update |
2449 | +# |
2450 | +# Author: Tim Cole <tim.cole@canonical.com> |
2451 | +# |
2452 | +# Copyright 2009 Canonical Ltd. |
2453 | +# |
2454 | +# This program is free software: you can redistribute it and/or modify it |
2455 | +# under the terms of the GNU General Public License version 3, as published |
2456 | +# by the Free Software Foundation. |
2457 | +# |
2458 | +# This program is distributed in the hope that it will be useful, but |
2459 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
2460 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
2461 | +# PURPOSE. See the GNU General Public License for more details. |
2462 | +# |
2463 | +# You should have received a copy of the GNU General Public License along |
2464 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
2465 | +"""After merging, these routines are used to synchronize state locally and on |
2466 | +the server to correspond to the merged result.""" |
2467 | + |
2468 | +from __future__ import with_statement |
2469 | + |
2470 | +import os |
2471 | + |
2472 | +EMPTY_HASH = "" |
2473 | +UPLOAD_SYMBOL = u"\u25b2".encode("utf-8") |
2474 | +DOWNLOAD_SYMBOL = u"\u25bc".encode("utf-8") |
2475 | +CONFLICT_SYMBOL = "!" |
2476 | +DELETE_SYMBOL = "X" |
2477 | + |
2478 | +from ubuntuone.storageprotocol import request |
2479 | +from ubuntuone.storageprotocol.dircontent_pb2 import ( |
2480 | + DIRECTORY, SYMLINK) |
2481 | +from u1sync.genericmerge import ( |
2482 | + MergeNode, generic_merge) |
2483 | +from u1sync.utils import safe_mkdir |
2484 | +from u1sync.client import UnsupportedOperationError |
2485 | + |
2486 | +def get_conflict_path(path, conflict_info): |
2487 | + """Returns path for conflict file corresponding to path.""" |
2488 | + dir, name = os.path.split(path) |
2489 | + unique_id = conflict_info[0] |
2490 | + return os.path.join(dir, "conflict-%s-%s" % (unique_id, name)) |
2491 | + |
2492 | +def name_from_path(path): |
2493 | + """Returns unicode name from last path component.""" |
2494 | + return os.path.split(path)[1].decode("UTF-8") |
2495 | + |
2496 | + |
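`get_conflict_path` above derives the conflict filename purely from string manipulation: the unique id from the conflict tuple is spliced between a `conflict-` prefix and the original basename, in the same directory. A quick illustration (the id and hash values are made up):

```python
import os

def get_conflict_path(path, conflict_info):
    """Build a sibling "conflict-<id>-<name>" path for a conflicted node."""
    parent, name = os.path.split(path)
    unique_id = conflict_info[0]
    return os.path.join(parent, "conflict-%s-%s" % (unique_id, name))

conflict = get_conflict_path(os.path.join("photos", "cat.jpg"),
                             ("ab12", "sha1:deadbeef"))
```

Because only the basename changes, the conflict copy always lands next to the file it conflicts with.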
2497 | +class NodeSyncError(Exception): |
2498 | + """Error syncing node.""" |
2499 | + |
2500 | + |
2501 | +class NodeCreateError(NodeSyncError): |
2502 | + """Error creating node.""" |
2503 | + |
2504 | + |
2505 | +class NodeUpdateError(NodeSyncError): |
2506 | + """Error updating node.""" |
2507 | + |
2508 | + |
2509 | +class NodeDeleteError(NodeSyncError): |
2510 | + """Error deleting node.""" |
2511 | + |
2512 | + |
2513 | +def sync_tree(merged_tree, original_tree, sync_mode, path, quiet): |
2514 | + """Performs actual synchronization.""" |
2515 | + |
2516 | + def pre_merge(nodes, name, partial_parent): |
2517 | + """Create nodes and write content as required.""" |
2518 | + (merged_node, original_node) = nodes |
2519 | + # pylint: disable-msg=W0612 |
2520 | + (parent_path, parent_display_path, parent_uuid, parent_synced) \ |
2521 | + = partial_parent |
2522 | + |
2523 | + utf8_name = name.encode("utf-8") |
2524 | + path = os.path.join(parent_path, utf8_name) |
2525 | + display_path = os.path.join(parent_display_path, utf8_name) |
2526 | + node_uuid = None |
2527 | + |
2528 | + synced = False |
2529 | + if merged_node is not None: |
2530 | + if merged_node.node_type == DIRECTORY: |
2531 | + if original_node is not None: |
2532 | + synced = True |
2533 | + node_uuid = original_node.uuid |
2534 | + else: |
2535 | + if not quiet: |
2536 | + print "%s %s" % (sync_mode.symbol, display_path) |
2537 | + try: |
2538 | + create_dir = sync_mode.create_directory |
2539 | + node_uuid = create_dir(parent_uuid=parent_uuid, |
2540 | + path=path) |
2541 | + synced = True |
2542 | + except NodeCreateError, e: |
2543 | + print e |
2544 | + elif merged_node.content_hash is None: |
2545 | + if not quiet: |
2546 | + print "? %s" % display_path |
2547 | + elif original_node is None or \ |
2548 | + original_node.content_hash != merged_node.content_hash or \ |
2549 | + merged_node.conflict_info is not None: |
2550 | + conflict_info = merged_node.conflict_info |
2551 | + if conflict_info is not None: |
2552 | + conflict_symbol = CONFLICT_SYMBOL |
2553 | + else: |
2554 | + conflict_symbol = " " |
2555 | + if not quiet: |
2556 | + print "%s %s %s" % (sync_mode.symbol, conflict_symbol, |
2557 | + display_path) |
2558 | + if original_node is not None: |
2559 | + node_uuid = original_node.uuid or merged_node.uuid |
2560 | + original_hash = original_node.content_hash or EMPTY_HASH |
2561 | + else: |
2562 | + node_uuid = merged_node.uuid |
2563 | + original_hash = EMPTY_HASH |
2564 | + try: |
2565 | + sync_mode.write_file(node_uuid=node_uuid, |
2566 | + content_hash= |
2567 | + merged_node.content_hash, |
2568 | + old_content_hash=original_hash, |
2569 | + path=path, |
2570 | + parent_uuid=parent_uuid, |
2571 | + conflict_info=conflict_info, |
2572 | + node_type=merged_node.node_type) |
2573 | + synced = True |
2574 | + except NodeSyncError, e: |
2575 | + print e |
2576 | + else: |
2577 | + synced = True |
2578 | + |
2579 | + return (path, display_path, node_uuid, synced) |
2580 | + |
2581 | + def post_merge(nodes, partial_result, child_results): |
2582 | + """Delete nodes.""" |
2583 | + (merged_node, original_node) = nodes |
2584 | + # pylint: disable-msg=W0612 |
2585 | + (path, display_path, node_uuid, synced) = partial_result |
2586 | + |
2587 | + if merged_node is None: |
2588 | + assert original_node is not None |
2589 | + if not quiet: |
2590 | + print "%s %s %s" % (sync_mode.symbol, DELETE_SYMBOL, |
2591 | + display_path) |
2592 | + try: |
2593 | + if original_node.node_type == DIRECTORY: |
2594 | + sync_mode.delete_directory(node_uuid=original_node.uuid, |
2595 | + path=path) |
2596 | + else: |
2597 | + # files or symlinks |
2598 | + sync_mode.delete_file(node_uuid=original_node.uuid, |
2599 | + path=path) |
2600 | + synced = True |
2601 | + except NodeDeleteError, e: |
2602 | + print e |
2603 | + |
2604 | + if synced: |
2605 | + model_node = merged_node |
2606 | + else: |
2607 | + model_node = original_node |
2608 | + |
2609 | + if model_node is not None: |
2610 | + if model_node.node_type == DIRECTORY: |
2611 | + child_iter = child_results.iteritems() |
2612 | + merged_children = dict([(name, child) for (name, child) |
2613 | + in child_iter |
2614 | + if child is not None]) |
2615 | + else: |
2616 | + # if there are children here it's because they failed to delete |
2617 | + merged_children = None |
2618 | + return MergeNode(node_type=model_node.node_type, |
2619 | + uuid=model_node.uuid, |
2620 | + children=merged_children, |
2621 | + content_hash=model_node.content_hash) |
2622 | + else: |
2623 | + return None |
2624 | + |
2625 | + return generic_merge(trees=[merged_tree, original_tree], |
2626 | + pre_merge=pre_merge, post_merge=post_merge, |
2627 | + partial_parent=(path, "", None, True), name=u"") |
2628 | + |
2629 | +def download_tree(merged_tree, local_tree, client, share_uuid, path, dry_run, |
2630 | + quiet): |
2631 | + """Downloads a directory.""" |
2632 | + if dry_run: |
2633 | + downloader = DryRun(symbol=DOWNLOAD_SYMBOL) |
2634 | + else: |
2635 | + downloader = Downloader(client=client, share_uuid=share_uuid) |
2636 | + return sync_tree(merged_tree=merged_tree, original_tree=local_tree, |
2637 | + sync_mode=downloader, path=path, quiet=quiet) |
2638 | + |
2639 | +def upload_tree(merged_tree, remote_tree, client, share_uuid, path, dry_run, |
2640 | + quiet): |
2641 | + """Uploads a directory.""" |
2642 | + if dry_run: |
2643 | + uploader = DryRun(symbol=UPLOAD_SYMBOL) |
2644 | + else: |
2645 | + uploader = Uploader(client=client, share_uuid=share_uuid) |
2646 | + return sync_tree(merged_tree=merged_tree, original_tree=remote_tree, |
2647 | + sync_mode=uploader, path=path, quiet=quiet) |
2648 | + |
2649 | + |
2650 | +class DryRun(object): |
2651 | + """A class which implements the sync interface but does nothing.""" |
2652 | + def __init__(self, symbol): |
2653 | + """Initializes a DryRun instance.""" |
2654 | + self.symbol = symbol |
2655 | + |
2656 | + def create_directory(self, parent_uuid, path): |
2657 | + """Doesn't create a directory.""" |
2658 | + return None |
2659 | + |
2660 | + def write_file(self, node_uuid, old_content_hash, content_hash, |
2661 | + parent_uuid, path, conflict_info, node_type): |
2662 | + """Doesn't write a file.""" |
2663 | + return None |
2664 | + |
2665 | + def delete_directory(self, node_uuid, path): |
2666 | + """Doesn't delete a directory.""" |
2667 | + |
2668 | + def delete_file(self, node_uuid, path): |
2669 | + """Doesn't delete a file.""" |
2670 | + |
2671 | + |
2672 | +class Downloader(object): |
2673 | + """A class which implements the download half of syncing.""" |
2674 | + def __init__(self, client, share_uuid): |
2675 | + """Initializes a Downloader instance.""" |
2676 | + self.client = client |
2677 | + self.share_uuid = share_uuid |
2678 | + self.symbol = DOWNLOAD_SYMBOL |
2679 | + |
2680 | + def create_directory(self, parent_uuid, path): |
2681 | + """Creates a directory.""" |
2682 | + try: |
2683 | + safe_mkdir(path) |
2684 | + except OSError, e: |
2685 | + raise NodeCreateError("Error creating local directory %s: %s" % \ |
2686 | + (path, e)) |
2687 | + return None |
2688 | + |
2689 | + def write_file(self, node_uuid, old_content_hash, content_hash, |
2690 | + parent_uuid, path, conflict_info, node_type): |
2691 | + """Creates a file and downloads new content for it.""" |
2692 | + if conflict_info: |
2693 | + # download to conflict file rather than overwriting local changes |
2694 | + path = get_conflict_path(path, conflict_info) |
2695 | + content_hash = conflict_info[1] |
2696 | + try: |
2697 | + if node_type == SYMLINK: |
2698 | + self.client.download_string(share_uuid= |
2699 | + self.share_uuid, |
2700 | + node_uuid=node_uuid, |
2701 | + content_hash=content_hash) |
2702 | + else: |
2703 | + self.client.download_file(share_uuid=self.share_uuid, |
2704 | + node_uuid=node_uuid, |
2705 | + content_hash=content_hash, |
2706 | + filename=path) |
2707 | + except (request.StorageRequestError, UnsupportedOperationError), e: |
2708 | + if os.path.exists(path): |
2709 | + raise NodeUpdateError("Error downloading content for %s: %s" %\ |
2710 | + (path, e)) |
2711 | + else: |
2712 | + raise NodeCreateError("Error locally creating %s: %s" % \ |
2713 | + (path, e)) |
2714 | + |
2715 | + def delete_directory(self, node_uuid, path): |
2716 | + """Deletes a directory.""" |
2717 | + try: |
2718 | + os.rmdir(path) |
2719 | + except OSError, e: |
2720 | + raise NodeDeleteError("Error locally deleting %s: %s" % (path, e)) |
2721 | + |
2722 | + def delete_file(self, node_uuid, path): |
2723 | + """Deletes a file.""" |
2724 | + try: |
2725 | + os.unlink(path) |
2726 | + except OSError, e: |
2727 | + raise NodeDeleteError("Error locally deleting %s: %s" % (path, e)) |
2728 | + |
2729 | + |
2730 | +class Uploader(object): |
2731 | + """A class which implements the upload half of syncing.""" |
2732 | + def __init__(self, client, share_uuid): |
2733 | + """Initializes an uploader instance.""" |
2734 | + self.client = client |
2735 | + self.share_uuid = share_uuid |
2736 | + self.symbol = UPLOAD_SYMBOL |
2737 | + |
2738 | + def create_directory(self, parent_uuid, path): |
2739 | + """Creates a directory on the server.""" |
2740 | + name = name_from_path(path) |
2741 | + try: |
2742 | + return self.client.create_directory(share_uuid=self.share_uuid, |
2743 | + parent_uuid=parent_uuid, |
2744 | + name=name) |
2745 | + except (request.StorageRequestError, UnsupportedOperationError), e: |
2746 | + raise NodeCreateError("Error remotely creating %s: %s" % \ |
2747 | + (path, e)) |
2748 | + |
2749 | + def write_file(self, node_uuid, old_content_hash, content_hash, |
2750 | + parent_uuid, path, conflict_info, node_type): |
2751 | + """Creates a file on the server and uploads new content for it.""" |
2752 | + |
2753 | + if conflict_info: |
2754 | + # move conflicting file out of the way on the server |
2755 | + conflict_path = get_conflict_path(path, conflict_info) |
2756 | + conflict_name = name_from_path(conflict_path) |
2757 | + try: |
2758 | + self.client.move(share_uuid=self.share_uuid, |
2759 | + parent_uuid=parent_uuid, |
2760 | + name=conflict_name, |
2761 | + node_uuid=node_uuid) |
2762 | + except (request.StorageRequestError, UnsupportedOperationError), e: |
2763 | + raise NodeUpdateError("Error remotely renaming %s to %s: %s" %\ |
2764 | + (path, conflict_path, e)) |
2765 | + node_uuid = None |
2766 | + old_content_hash = EMPTY_HASH |
2767 | + |
2768 | + if node_type == SYMLINK: |
2769 | + try: |
2770 | + target = os.readlink(path) |
2771 | + except OSError, e: |
2772 | + raise NodeCreateError("Error retrieving link target " \ |
2773 | + "for %s: %s" % (path, e)) |
2774 | + else: |
2775 | + target = None |
2776 | + |
2777 | + name = name_from_path(path) |
2778 | + if node_uuid is None: |
2779 | + try: |
2780 | + if node_type == SYMLINK: |
2781 | + node_uuid = self.client.create_symlink(share_uuid= |
2782 | + self.share_uuid, |
2783 | + parent_uuid= |
2784 | + parent_uuid, |
2785 | + name=name, |
2786 | + target=target) |
2787 | + old_content_hash = content_hash |
2788 | + else: |
2789 | + node_uuid = self.client.create_file(share_uuid= |
2790 | + self.share_uuid, |
2791 | + parent_uuid= |
2792 | + parent_uuid, |
2793 | + name=name) |
2794 | + except (request.StorageRequestError, UnsupportedOperationError), e: |
2795 | + raise NodeCreateError("Error remotely creating %s: %s" % \ |
2796 | + (path, e)) |
2797 | + |
2798 | + if old_content_hash != content_hash: |
2799 | + try: |
2800 | + if node_type == SYMLINK: |
2801 | + self.client.upload_string(share_uuid=self.share_uuid, |
2802 | + node_uuid=node_uuid, |
2803 | + content_hash=content_hash, |
2804 | + old_content_hash= |
2805 | + old_content_hash, |
2806 | + content=target) |
2807 | + else: |
2808 | + self.client.upload_file(share_uuid=self.share_uuid, |
2809 | + node_uuid=node_uuid, |
2810 | + content_hash=content_hash, |
2811 | + old_content_hash=old_content_hash, |
2812 | + filename=path) |
2813 | + except (request.StorageRequestError, UnsupportedOperationError), e: |
2814 | + raise NodeUpdateError("Error uploading content for %s: %s" % \ |
2815 | + (path, e)) |
2816 | + |
2817 | + def delete_directory(self, node_uuid, path): |
2818 | + """Deletes a directory.""" |
2819 | + try: |
2820 | + self.client.unlink(share_uuid=self.share_uuid, node_uuid=node_uuid) |
2821 | + except (request.StorageRequestError, UnsupportedOperationError), e: |
2822 | + raise NodeDeleteError("Error remotely deleting %s: %s" % (path, e)) |
2823 | + |
2824 | + def delete_file(self, node_uuid, path): |
2825 | + """Deletes a file.""" |
2826 | + try: |
2827 | + self.client.unlink(share_uuid=self.share_uuid, node_uuid=node_uuid) |
2828 | + except (request.StorageRequestError, UnsupportedOperationError), e: |
2829 | + raise NodeDeleteError("Error remotely deleting %s: %s" % (path, e)) |
2830 | |
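`sync_tree` never checks the concrete type of its `sync_mode` argument: `DryRun`, `Downloader`, and `Uploader` are interchangeable because they duck-type the same five-member interface (`symbol`, `create_directory`, `write_file`, `delete_directory`, `delete_file`). As an illustration of that contract, here is a hypothetical mode (not part of u1sync) that records operations instead of performing them:

```python
class RecordingMode(object):
    """A sync mode that records the operations it is asked to perform.

    Any object exposing this method set plus a `symbol` attribute could
    be passed to sync_tree() as sync_mode, just like DryRun, Downloader,
    and Uploader.
    """
    def __init__(self):
        self.symbol = "*"
        self.ops = []

    def create_directory(self, parent_uuid, path):
        self.ops.append(("mkdir", path))
        return None  # no uuid is known for an unperformed create

    def write_file(self, node_uuid, old_content_hash, content_hash,
                   parent_uuid, path, conflict_info, node_type):
        self.ops.append(("write", path))

    def delete_directory(self, node_uuid, path):
        self.ops.append(("rmdir", path))

    def delete_file(self, node_uuid, path):
        self.ops.append(("unlink", path))

mode = RecordingMode()
mode.create_directory(None, "a/b")
mode.write_file(None, "", "sha1:x", None, "a/b/f.txt", None, None)
```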
2831 | === added file 'src/u1sync/ubuntuone_optparse.py' |
2832 | --- src/u1sync/ubuntuone_optparse.py 1970-01-01 00:00:00 +0000 |
2833 | +++ src/u1sync/ubuntuone_optparse.py 2010-08-27 20:21:13 +0000 |
2834 | @@ -0,0 +1,202 @@ |
2835 | +# ubuntuone.u1sync.ubuntuone_optparse |
2836 | +# |
2837 | +# Prototype directory sync client |
2838 | +# |
2839 | +# Author: Manuel de la Pena <manuel.delapena@canonical.com> |
2840 | +# |
2841 | +# Copyright 2010 Canonical Ltd. |
2842 | +# |
2843 | +# This program is free software: you can redistribute it and/or modify it |
2844 | +# under the terms of the GNU General Public License version 3, as published |
2845 | +# by the Free Software Foundation. |
2846 | +# |
2847 | +# This program is distributed in the hope that it will be useful, but |
2848 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
2849 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
2850 | +# PURPOSE. See the GNU General Public License for more details. |
2851 | +# |
2852 | +# You should have received a copy of the GNU General Public License along |
2853 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
2854 | +import uuid |
2855 | +from oauth.oauth import OAuthToken |
2856 | +from optparse import OptionParser, SUPPRESS_HELP |
2857 | +from u1sync.merge import ( |
2858 | + SyncMerge, ClobberServerMerge, ClobberLocalMerge) |
2859 | + |
2860 | +MERGE_ACTIONS = { |
2861 | + # action: (merge_class, should_upload, should_download) |
2862 | + 'sync': (SyncMerge, True, True), |
2863 | + 'clobber-server': (ClobberServerMerge, True, False), |
2864 | + 'clobber-local': (ClobberLocalMerge, False, True), |
2865 | + 'upload': (SyncMerge, True, False), |
2866 | + 'download': (SyncMerge, False, True), |
2867 | + 'auto': None # special case |
2868 | +} |
2869 | + |
2870 | +DEFAULT_MERGE_ACTION = 'auto' |
2871 | + |
2872 | +class NotParsedOptionsError(Exception): |
2873 | +    """Exception thrown when the options have not been parsed."""
2874 | + |
2875 | +class NotValidatedOptionsError(Exception): |
2876 | + """Exception thrown when the options have not been validated.""" |
2877 | + |
2878 | +class UbuntuOneOptionsParser(OptionParser): |
2879 | +    """Parser for the options passed on the command line."""
2880 | + |
2881 | + def __init__(self): |
2882 | + usage = "Usage: %prog [options] [DIRECTORY]\n" \ |
2883 | + " %prog --authorize [options]\n" \ |
2884 | + " %prog --list-shares [options]\n" \ |
2885 | + " %prog --init [--share=SHARE_UUID] [options] DIRECTORY\n" \ |
2886 | + " %prog --diff [--share=SHARE_UUID] [options] DIRECTORY" |
2887 | + OptionParser.__init__(self, usage=usage) |
2888 | + self._was_validated = False |
2889 | + self._args = None |
2890 | + # add the different options to be used |
2891 | + self.add_option("--port", dest="port", metavar="PORT", |
2892 | + default=443, |
2893 | + help="The port on which to connect to the server") |
2894 | + self.add_option("--host", dest="host", metavar="HOST", |
2895 | + default='fs-1.one.ubuntu.com', |
2896 | + help="The server address") |
2897 | + self.add_option("--realm", dest="realm", metavar="REALM", |
2898 | + default='https://ubuntuone.com', |
2899 | + help="The oauth realm") |
2900 | + self.add_option("--oauth", dest="oauth", metavar="KEY:SECRET", |
2901 | + default=None, |
2902 | + help="Explicitly provide OAuth credentials " |
2903 | + "(default is to query keyring)") |
2904 | + |
2905 | + action_list = ", ".join(sorted(MERGE_ACTIONS.keys())) |
2906 | + self.add_option("--action", dest="action", metavar="ACTION", |
2907 | + default=None, |
2908 | + help="Select a sync action (%s; default is %s)" % \ |
2909 | + (action_list, DEFAULT_MERGE_ACTION)) |
2910 | + |
2911 | + self.add_option("--dry-run", action="store_true", dest="dry_run", |
2912 | + default=False, help="Do a dry run without actually " |
2913 | + "making changes") |
2914 | + self.add_option("--quiet", action="store_true", dest="quiet", |
2915 | + default=False, help="Produces less output") |
2916 | + self.add_option("--authorize", action="store_const", dest="mode", |
2917 | + const="authorize", |
2918 | + help="Authorize this machine") |
2919 | + self.add_option("--list-shares", action="store_const", dest="mode", |
2920 | + const="list-shares", default="sync", |
2921 | + help="List available shares") |
2922 | + self.add_option("--init", action="store_const", dest="mode", |
2923 | + const="init", |
2924 | + help="Initialize a local directory for syncing") |
2925 | + self.add_option("--no-ssl-verify", action="store_true", |
2926 | + dest="no_ssl_verify", |
2927 | + default=False, help=SUPPRESS_HELP) |
2928 | + self.add_option("--diff", action="store_const", dest="mode", |
2929 | + const="diff", |
2930 | + help="Compare tree on server with local tree " \ |
2931 | + "(does not require previous --init)") |
2932 | + self.add_option("--share", dest="share", metavar="SHARE_UUID", |
2933 | + default=None, |
2934 | + help="Sync the directory with a share rather than the " \ |
2935 | + "user's own volume") |
2936 | + self.add_option("--subtree", dest="subtree", metavar="PATH", |
2937 | + default=None, |
2938 | + help="Mirror a subset of the share or volume") |
2939 | + |
2940 | + def get_options(self, arguments): |
2941 | +        """Parses the arguments from the command line."""
2942 | + (self.options, self._args) = \ |
2943 | + self.parse_args(arguments) |
2944 | + self._validate_args() |
2945 | + |
2946 | + def _validate_args(self): |
2947 | + """Validates the args that have been parsed.""" |
2948 | + self._is_only_share() |
2949 | + self._get_directory() |
2950 | + self._validate_action_usage() |
2951 | + self._validate_authorize_usage() |
2952 | + self._validate_subtree_usage() |
2953 | + self._validate_action() |
2954 | + self._validate_oauth() |
2955 | + |
2956 | + def _is_only_share(self): |
2957 | +        """Ensures that the share option is not combined with any other."""
2958 | + if self.options.share is not None and \ |
2959 | + self.options.mode != "init" and \ |
2960 | + self.options.mode != "diff": |
2961 | + self.error("--share is only valid with --init or --diff") |
2962 | + |
2963 | + def _get_directory(self): |
2964 | +        """Gets the directory to be used according to the parameters."""
2965 | + print self._args |
2966 | + if self.options.mode == "sync" or self.options.mode == "init" or \ |
2967 | + self.options.mode == "diff": |
2968 | + if len(self._args) > 2: |
2969 | + self.error("Too many arguments") |
2970 | + elif len(self._args) < 1: |
2971 | + if self.options.mode == "init" or self.options.mode == "diff": |
2972 | + self.error("--%s requires a directory to " |
2973 | + "be specified" % self.options.mode) |
2974 | + else: |
2975 | + self.options.directory = "." |
2976 | + else: |
2977 | + self.options.directory = self._args[0] |
2978 | + |
2979 | + def _validate_action_usage(self): |
2980 | + """Ensures that the --action option is correctly used""" |
2981 | + if self.options.mode == "init" or \ |
2982 | + self.options.mode == "list-shares" or \ |
2983 | + self.options.mode == "diff" or \ |
2984 | + self.options.mode == "authorize": |
2985 | + if self.options.action is not None: |
2986 | + self.error("--%s does not take the --action parameter" % \ |
2987 | + self.options.mode) |
2988 | + if self.options.dry_run: |
2989 | + self.error("--%s does not take the --dry-run parameter" % \ |
2990 | + self.options.mode) |
2991 | + |
2992 | + def _validate_authorize_usage(self): |
2993 | + """Validates the usage of the authorize option.""" |
2994 | + if self.options.mode == "authorize": |
2995 | + if self.options.oauth is not None: |
2996 | + self.error("--authorize does not take the --oauth parameter") |
2997 | + if self.options.mode == "list-shares" or \ |
2998 | + self.options.mode == "authorize": |
2999 | + if len(self._args) != 0: |
3000 | +                self.error("--%s does not take a directory" % \
3000 | +                           self.options.mode)
3001 | + |
3002 | + def _validate_subtree_usage(self): |
3003 | + """Validates the usage of the subtree option""" |
3004 | + if self.options.mode != "init" and self.options.mode != "diff": |
3005 | + if self.options.subtree is not None: |
3006 | + self.error("--%s does not take the --subtree parameter" % \ |
3007 | + self.options.mode) |
3008 | + |
3009 | + def _validate_action(self): |
3010 | + """Validates the actions passed to the options.""" |
3011 | + if self.options.action is not None and \ |
3012 | + self.options.action not in MERGE_ACTIONS: |
3013 | + self.error("--action: Unknown action %s" % self.options.action) |
3014 | + |
3015 | + if self.options.action is None: |
3016 | + self.options.action = DEFAULT_MERGE_ACTION |
3017 | + |
3018 | + def _validate_oauth(self): |
3019 | +        """Validates that the oauth credentials were passed."""
3020 | + if self.options.oauth is None: |
3021 | +            self.error("--oauth is currently compulsory.")
3022 | + else: |
3023 | + try: |
3024 | + (key, secret) = self.options.oauth.split(':', 2) |
3025 | + except ValueError: |
3026 | + self.error("--oauth requires a key and secret together in the " |
3027 | +                           "form KEY:SECRET")
3028 | + self.options.token = OAuthToken(key, secret) |
3029 | + |
3030 | + def _validate_share(self): |
3031 | + """Validates the share option""" |
3032 | + if self.options.share is not None: |
3033 | + try: |
3034 | + uuid.UUID(self.options.share) |
3035 | + except ValueError, e: |
3036 | + self.error("Invalid --share argument: %s" % e) |
3037 | \ No newline at end of file |
3038 | |
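The parser above selects its operating mode by pointing several flags at the same destination with `action="store_const"`: whichever of `--authorize`, `--list-shares`, `--init`, or `--diff` appears last wins, and the `default="sync"` on one of them covers the no-flag case. A minimal sketch of that optparse pattern with two of the flags:

```python
from optparse import OptionParser

# Several flags all store a constant into the shared `mode` destination;
# the default declared on any one of them applies when no flag is given.
parser = OptionParser(usage="%prog [options] [DIRECTORY]")
parser.add_option("--list-shares", action="store_const", dest="mode",
                  const="list-shares", default="sync",
                  help="List available shares")
parser.add_option("--init", action="store_const", dest="mode",
                  const="init",
                  help="Initialize a local directory for syncing")

options, args = parser.parse_args(["--init", "some/dir"])
```

With no mode flag at all, `parser.parse_args([])` leaves `options.mode` at `"sync"`, which is why plain `u1sync DIRECTORY` performs a sync.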
3039 | === added file 'src/u1sync/utils.py' |
3040 | --- src/u1sync/utils.py 1970-01-01 00:00:00 +0000 |
3041 | +++ src/u1sync/utils.py 2010-08-27 20:21:13 +0000 |
3042 | @@ -0,0 +1,50 @@ |
3043 | +# ubuntuone.u1sync.utils |
3044 | +# |
3045 | +# Miscellaneous utility functions |
3046 | +# |
3047 | +# Author: Tim Cole <tim.cole@canonical.com> |
3048 | +# |
3049 | +# Copyright 2009 Canonical Ltd. |
3050 | +# |
3051 | +# This program is free software: you can redistribute it and/or modify it |
3052 | +# under the terms of the GNU General Public License version 3, as published |
3053 | +# by the Free Software Foundation. |
3054 | +# |
3055 | +# This program is distributed in the hope that it will be useful, but |
3056 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
3057 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
3058 | +# PURPOSE. See the GNU General Public License for more details. |
3059 | +# |
3060 | +# You should have received a copy of the GNU General Public License along |
3061 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
3062 | +"""Miscellaneous utility functions.""" |
3063 | + |
3064 | +import os |
3065 | +from errno import EEXIST, ENOENT |
3066 | +from u1sync.constants import ( |
3067 | + METADATA_DIR_NAME, SPECIAL_FILE_RE) |
3068 | + |
3069 | +def should_sync(filename): |
3070 | + """Returns True if the filename should be synced. |
3071 | + |
3072 | + @param filename: a unicode filename |
3073 | + |
3074 | + """ |
3075 | + return filename != METADATA_DIR_NAME and \ |
3076 | + not SPECIAL_FILE_RE.match(filename) |
3077 | + |
3078 | +def safe_mkdir(path): |
3079 | + """Creates a directory iff it does not already exist.""" |
3080 | + try: |
3081 | + os.mkdir(path) |
3082 | + except OSError, e: |
3083 | + if e.errno != EEXIST: |
3084 | + raise |
3085 | + |
3086 | +def safe_unlink(path): |
3087 | + """Unlinks a file iff it exists.""" |
3088 | + try: |
3089 | + os.unlink(path) |
3090 | + except OSError, e: |
3091 | + if e.errno != ENOENT: |
3092 | + raise |
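Both helpers in utils.py make a filesystem call unconditionally and then swallow only the one errno that means "already in the desired state" (`EEXIST` for mkdir, `ENOENT` for unlink), re-raising everything else. This avoids the check-then-act race of `if not os.path.exists(...)`. A runnable sketch, using the modern `except ... as e` spelling rather than the Python 2 comma form in the diff:

```python
import os
import tempfile
from errno import EEXIST, ENOENT

def safe_mkdir(path):
    """Create a directory, tolerating the case where it already exists."""
    try:
        os.mkdir(path)
    except OSError as e:
        if e.errno != EEXIST:
            raise

def safe_unlink(path):
    """Remove a file, tolerating the case where it is already gone."""
    try:
        os.unlink(path)
    except OSError as e:
        if e.errno != ENOENT:
            raise

base = tempfile.mkdtemp()
target = os.path.join(base, "subdir")
safe_mkdir(target)
safe_mkdir(target)                           # second call is a no-op
safe_unlink(os.path.join(base, "missing"))   # missing file is a no-op
```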