Merge lp:~nataliabidart/ubuntuone-client/stable-3-0-update into lp:ubuntuone-client/stable-3-0

Proposed by Natalia Bidart
Status: Merged
Approved by: Natalia Bidart
Approved revision: 1163
Merged at revision: 1163
Proposed branch: lp:~nataliabidart/ubuntuone-client/stable-3-0-update
Merge into: lp:ubuntuone-client/stable-3-0
Diff against target: 2828 lines (+913/-218)
43 files modified
configure.ac (+1/-1)
contrib/testing/testcase.py (+1/-1)
run-tests.bat (+1/-1)
tests/platform/linux/eventlog/test_zg_listener.py (+14/-14)
tests/platform/linux/test_credentials.py (+1/-1)
tests/platform/linux/test_dbus.py (+1/-1)
tests/platform/linux/test_session.py (+1/-1)
tests/platform/linux/test_vm.py (+17/-13)
tests/platform/test_os_helper.py (+54/-3)
tests/platform/test_tools.py (+3/-0)
tests/platform/windows/test_os_helper.py (+122/-2)
tests/platform/windows/test_tools.py (+21/-6)
tests/status/test_aggregator.py (+20/-0)
tests/syncdaemon/test_config.py (+14/-20)
tests/syncdaemon/test_eq_inotify.py (+3/-5)
tests/syncdaemon/test_fileshelf.py (+2/-2)
tests/syncdaemon/test_interaction_interfaces.py (+4/-0)
tests/syncdaemon/test_localrescan.py (+6/-13)
tests/syncdaemon/test_logger.py (+1/-1)
tests/syncdaemon/test_main.py (+3/-2)
tests/syncdaemon/test_sync.py (+1/-1)
tests/syncdaemon/test_vm.py (+152/-77)
tests/syncdaemon/test_vm_helper.py (+2/-2)
ubuntuone/platform/__init__.py (+86/-3)
ubuntuone/platform/constants.py (+27/-0)
ubuntuone/platform/event_logging.py (+30/-0)
ubuntuone/platform/launcher.py (+31/-0)
ubuntuone/platform/linux/__init__.py (+1/-0)
ubuntuone/platform/linux/messaging.py (+0/-2)
ubuntuone/platform/linux/session.py (+4/-4)
ubuntuone/platform/messaging.py (+42/-0)
ubuntuone/platform/notification.py (+34/-0)
ubuntuone/platform/os_helper.py (+57/-0)
ubuntuone/platform/session.py (+47/-0)
ubuntuone/platform/tools/windows.py (+12/-0)
ubuntuone/platform/windows/__init__.py (+10/-0)
ubuntuone/platform/windows/messaging.py (+0/-2)
ubuntuone/platform/windows/os_helper.py (+49/-8)
ubuntuone/status/aggregator.py (+8/-6)
ubuntuone/syncdaemon/config.py (+4/-2)
ubuntuone/syncdaemon/interaction_interfaces.py (+3/-8)
ubuntuone/syncdaemon/vm_helper.py (+4/-5)
ubuntuone/syncdaemon/volume_manager.py (+19/-11)
To merge this branch: bzr merge lp:~nataliabidart/ubuntuone-client/stable-3-0-update
Reviewer: Roberto Alsina (community), status: Approve
Review via email: mp+86294@code.launchpad.net

Commit message

[ Manuel de la Pena <email address hidden> ]
  - Ensure that the folders returned by listdir and walk are not system folders
    (LP: #845659); see the sketch below.
  - Added more details to the raised exception (LP: #817582).
  - Ensure that if the path is missing we let the client know that we did not
    manage to start the syncdaemon (LP: #881186).
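
A minimal sketch of the system-folder filtering idea, for readers skimming the
diff below. It reuses the names that appear in the Windows os_helper tests in
this branch (native_is_system_path, GetFileAttributesW, FILE_ATTRIBUTE_SYSTEM);
everything else is illustrative, not the exact implementation:

    import os
    from win32file import FILE_ATTRIBUTE_SYSTEM, GetFileAttributesW

    def native_is_system_path(path):
        """Return True if the path carries the Windows 'system' attribute."""
        attrs = GetFileAttributesW(path)
        return attrs & FILE_ATTRIBUTE_SYSTEM == FILE_ATTRIBUTE_SYSTEM

    def listdir(directory):
        """List a directory, skipping system folders such as 'My Music'."""
        return [name for name in os.listdir(directory)
                if not native_is_system_path(os.path.join(directory, name))]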

[ Guillermo Gonzalez <email address hidden> ]
  - Fix creation of volumes (with content) by adding a new "volatile" variable
    to signal when the volume is being processed by local rescan
    (LP: #869920).
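
A rough illustration of the "volatile" flag, assuming (as the updated tests
suggest) that the attribute is called local_rescanning; the persistence detail
is a guess, not taken from this diff:

    class Volume(object):
        """Hypothetical volume object with a non-persisted 'volatile' flag."""

        def __init__(self, volume_id, path):
            self.volume_id = volume_id
            self.path = path
            # volatile: True only while local rescan is processing this volume
            self.local_rescanning = False

        def __getstate__(self):
            # assumption: volatile attributes are dropped when the metadata
            # is written to disk, so they always start fresh on load
            state = self.__dict__.copy()
            state.pop('local_rescanning', None)
            return state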

[ Eric Casteleijn <email address hidden> ]
  - Uploading filename is now reset on every message display (LP: #807005).
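
The aggregator change is easiest to see as a sketch of what the new test in
tests/status/test_aggregator.py exercises, assuming transfer commands expose a
path attribute (the helpers files_being_uploaded/files_being_downloaded come
from the test; the rest is illustrative):

    def get_discovery_message(self):
        """Build the discovery bubble text from the current queues."""
        parts = []
        # recompute the filenames on every call so a stale name never shows up
        if self.files_uploading:
            self.uploading_filename = self.files_uploading[0].path
            parts.append(files_being_uploaded(
                self.uploading_filename, len(self.files_uploading)))
        if self.files_downloading:
            self.downloading_filename = self.files_downloading[0].path
            parts.append(files_being_downloaded(
                self.downloading_filename, len(self.files_downloading)))
        return "\n".join(parts)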

[ Diego Sarmentero <email address hidden> ]
  - Fixed import problems in ubuntuone/platform (LP: #898292).
  - Handle encoding of the options returned by ConfigGlue (LP: #818197).
  - Implement a multiplatform expand_user (LP: #888228); see the sketch below.
  - Fixed failure when trying to create ubuntuone_log_dir (LP: #881940).
  - Resolved unicode issues in the Ubuntu One client.
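
A sketch of the multiplatform expand_user, reconstructed from the behaviour the
new tests in tests/platform/test_os_helper.py check (bytes input, utf-8 only,
'~' expands to the platform home, '~user' style paths pass through); xdg_home
below is a placeholder standing in for ubuntuone.platform.xdg_home:

    import os

    # placeholder for the real ubuntuone.platform.xdg_home value
    xdg_home = os.path.expanduser('~')

    def expand_user(path):
        """Expand a leading '~' to the platform home directory (bytes only)."""
        assert isinstance(path, str), 'path must be a bytes sequence'
        try:
            path.decode('utf-8')
        except UnicodeDecodeError:
            raise AssertionError('path must be utf-8 encoded bytes')
        tilde = '~'
        if path == tilde:
            return xdg_home
        if not path.startswith(tilde + os.path.sep):
            # '~user' and '~~whatever' style paths are returned unchanged
            return path
        return path.replace(tilde, xdg_home, 1)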

[ Natalia B. Bidart <email address hidden> ]
  - Fewer unhelpful debug messages in the log.
  - Import from non-deprecated module ubuntuone.devtools.testcases
    (LP: #898324).

Description of the change

All green on Ubuntu and Windows (tested on Windows 7).

IRL tested on Ubuntu.

Revision history for this message
Roberto Alsina (ralsina) wrote:

Seems to work on Windows; code was reviewed on trunk...

review: Approve

Preview Diff

1=== modified file 'configure.ac'
2--- configure.ac 2011-12-01 19:23:32 +0000
3+++ configure.ac 2011-12-19 20:54:24 +0000
4@@ -1,7 +1,7 @@
5 dnl Process this file with autoconf to produce a configure script.
6 AC_PREREQ(2.53)
7
8-AC_INIT([ubuntuone-client], [2.99])
9+AC_INIT([ubuntuone-client], [2.99.0])
10 AC_CONFIG_SRCDIR([config.h.in])
11
12 AM_INIT_AUTOMAKE([1.10 foreign])
13
14=== modified file 'contrib/testing/testcase.py'
15--- contrib/testing/testcase.py 2011-11-08 15:29:55 +0000
16+++ contrib/testing/testcase.py 2011-12-19 20:54:24 +0000
17@@ -31,7 +31,7 @@
18
19 from twisted.internet import defer
20 from twisted.trial.unittest import TestCase as TwistedTestCase
21-from ubuntuone.devtools.testcase import skipIfOS
22+from ubuntuone.devtools.testcases import skipIfOS
23 from zope.interface import implements
24 from zope.interface.verify import verifyObject
25
26
27=== modified file 'run-tests.bat'
28--- run-tests.bat 2011-10-06 10:16:08 +0000
29+++ run-tests.bat 2011-12-19 20:54:24 +0000
30@@ -53,7 +53,7 @@
31 COPY windows\clientdefs.py ubuntuone\clientdefs.py
32 COPY windows\logging.conf data\logging.conf
33 :: execute the tests with a number of ignored linux only modules
34-"%PYTHONEXEPATH%\python.exe" "%PYTHONEXEPATH%\Scripts\u1trial" --reactor=twisted -c tests -p tests\platform\linux %*
35+"%PYTHONEXEPATH%\python.exe" "%PYTHONEXEPATH%\Scripts\u1trial" --reactor=twisted -c -p tests\platform\linux %* tests
36 "%PYTHONEXEPATH%\python.exe" "%PYTHONEXEPATH%\Scripts\u1lint"
37 :: test for style if we can, if pep8 is not present, move to the end
38 IF EXIST "%PYTHONEXEPATH%\Scripts\pep8.exe"
39
40=== modified file 'tests/platform/linux/eventlog/test_zg_listener.py'
41--- tests/platform/linux/eventlog/test_zg_listener.py 2011-10-25 02:24:29 +0000
42+++ tests/platform/linux/eventlog/test_zg_listener.py 2011-12-19 20:54:24 +0000
43@@ -32,7 +32,7 @@
44 # disabled, but fake it for running tests on lucid
45 try:
46 import zeitgeist.mimetypes
47- assert(zeitgeist.mimetypes is not None) # make pyflakes happy
48+ assert(zeitgeist.mimetypes is not None) # make pyflakes happy
49 except ImportError:
50
51 class FakeMimetypes(object):
52@@ -47,6 +47,7 @@
53 from contrib.testing.testcase import (
54 FakeMain, FakeMainTestCase, BaseTwistedTestCase)
55 from ubuntuone.devtools.handlers import MementoHandler
56+from ubuntuone.platform import expand_user
57 from ubuntuone.storageprotocol import client, delta
58 from ubuntuone.storageprotocol.request import ROOT
59 from ubuntuone.storageprotocol.sharersp import NotifyShareHolder
60@@ -101,6 +102,7 @@
61 if d:
62 d.callback(event)
63
64+
65 def listen_for(event_q, event, callback, count=1, collect=False):
66 """Setup a EQ listener for the specified event."""
67 class Listener(object):
68@@ -124,7 +126,7 @@
69 callback(args)
70
71 listener = Listener()
72- setattr(listener, 'handle_'+event, listener._handle_event)
73+ setattr(listener, 'handle_' + event, listener._handle_event)
74 event_q.subscribe(listener)
75 return listener
76
77@@ -225,7 +227,6 @@
78 MANIFESTATION_U1_CONTACT_DATA_OBJECT)
79 self.assertEqual(other_user.text, fake_username)
80
81-
82 @defer.inlineCallbacks
83 def test_share_deleted_is_logged(self):
84 """Test VolumeManager.delete_share."""
85@@ -317,7 +318,6 @@
86 MANIFESTATION_U1_CONTACT_DATA_OBJECT)
87 self.assertEqual(other_user.text, fake_username)
88
89-
90 @defer.inlineCallbacks
91 def test_share_unaccepted_is_logged(self):
92 """Test that an unaccepted share event is logged."""
93@@ -402,7 +402,7 @@
94 def create_udf(path, name, marker):
95 """Fake create_udf."""
96 # check that the marker is the full path to the udf
97- expanded_path = os.path.expanduser(path.encode('utf-8'))
98+ expanded_path = expand_user(path.encode('utf-8'))
99 udf_path = os.path.join(expanded_path, name.encode('utf-8'))
100 if str(marker) != udf_path:
101 d.errback(ValueError("marker != path - "
102@@ -451,7 +451,7 @@
103 def create_udf(path, name, marker):
104 """Fake create_udf."""
105 # check that the marker is the full path to the udf
106- expanded_path = os.path.expanduser(path.encode('utf-8'))
107+ expanded_path = expand_user(path.encode('utf-8'))
108 udf_path = os.path.join(expanded_path, name.encode('utf-8'))
109 if str(marker) != udf_path:
110 d.errback(ValueError("marker != path - "
111@@ -861,7 +861,7 @@
112 listen_for(self.main.event_q, 'SV_FILE_NEW', d.callback)
113 listen_for(self.main.event_q, 'AQ_DOWNLOAD_FINISHED', d2.callback)
114
115- deltas = [ self.filemp3delta ]
116+ deltas = [self.filemp3delta]
117 kwargs = dict(volume_id=ROOT, delta_content=deltas, end_generation=11,
118 full=True, free_bytes=10)
119 self.main.sync.handle_AQ_DELTA_OK(**kwargs)
120@@ -872,7 +872,7 @@
121 self.assertEqual(node.is_dir, False)
122 self.assertEqual(node.generation, self.filemp3delta.generation)
123
124- yield d # wait for SV_FILE_NEW
125+ yield d # wait for SV_FILE_NEW
126
127 dlargs = dict(
128 share_id=self.filemp3delta.share_id,
129@@ -880,7 +880,7 @@
130 server_hash="server hash")
131 self.main.event_q.push("AQ_DOWNLOAD_FINISHED", **dlargs)
132
133- yield d2 # wait for AQ_DOWNLOAD_FINISHED
134+ yield d2 # wait for AQ_DOWNLOAD_FINISHED
135
136 self.assertEqual(len(self.listener.zg.events), 1)
137 event = self.listener.zg.events[0]
138@@ -908,7 +908,7 @@
139 d = defer.Deferred()
140 listen_for(self.main.event_q, 'SV_DIR_NEW', d.callback)
141
142- deltas = [ self.dirdelta ]
143+ deltas = [self.dirdelta]
144 kwargs = dict(volume_id=ROOT, delta_content=deltas, end_generation=11,
145 full=True, free_bytes=10)
146 self.main.sync.handle_AQ_DELTA_OK(**kwargs)
147@@ -919,7 +919,7 @@
148 self.assertEqual(node.is_dir, True)
149 self.assertEqual(node.generation, self.dirdelta.generation)
150
151- yield d # wait for SV_DIR_NEW
152+ yield d # wait for SV_DIR_NEW
153
154 self.assertEqual(len(self.listener.zg.events), 1)
155 event = self.listener.zg.events[0]
156@@ -949,7 +949,7 @@
157 listen_for(self.main.event_q, 'SV_FILE_NEW', d.callback)
158 listen_for(self.main.event_q, 'AQ_DOWNLOAD_FINISHED', d2.callback)
159
160- deltas = [ self.filemp3delta ]
161+ deltas = [self.filemp3delta]
162 kwargs = dict(volume_id=ROOT, delta_content=deltas, end_generation=11,
163 full=True, free_bytes=10)
164 self.main.sync.handle_AQ_DELTA_OK(**kwargs)
165@@ -960,7 +960,7 @@
166 self.assertEqual(node.is_dir, False)
167 self.assertEqual(node.generation, self.filemp3delta.generation)
168
169- yield d # wait for SV_FILE_NEW
170+ yield d # wait for SV_FILE_NEW
171
172 # remove from the recent list
173 local_file_id = (self.filemp3delta.share_id, self.filemp3delta.node_id)
174@@ -972,7 +972,7 @@
175 server_hash="server hash")
176 self.main.event_q.push("AQ_DOWNLOAD_FINISHED", **dlargs)
177
178- yield d2 # wait for AQ_DOWNLOAD_FINISHED
179+ yield d2 # wait for AQ_DOWNLOAD_FINISHED
180
181 self.assertEqual(len(self.listener.zg.events), 1)
182 event = self.listener.zg.events[0]
183
184=== modified file 'tests/platform/linux/test_credentials.py'
185--- tests/platform/linux/test_credentials.py 2011-09-30 18:07:35 +0000
186+++ tests/platform/linux/test_credentials.py 2011-12-19 20:54:24 +0000
187@@ -25,8 +25,8 @@
188 import dbus.service
189
190 from twisted.internet.defer import Deferred, inlineCallbacks
191-from ubuntuone.devtools.testcase import DBusTestCase
192 from ubuntuone.devtools.handlers import MementoHandler
193+from ubuntuone.devtools.testcases.dbus import DBusTestCase
194
195 from ubuntuone.platform.credentials import (
196 CredentialsError,
197
198=== modified file 'tests/platform/linux/test_dbus.py'
199--- tests/platform/linux/test_dbus.py 2011-10-28 15:30:54 +0000
200+++ tests/platform/linux/test_dbus.py 2011-12-19 20:54:24 +0000
201@@ -20,7 +20,7 @@
202 import dbus
203
204 from twisted.internet import defer
205-from ubuntuone.devtools.testcase import DBusTestCase
206+from ubuntuone.devtools.testcases.dbus import DBusTestCase
207
208 from contrib.testing.testcase import (
209 FakeMainTestCase,
210
211=== modified file 'tests/platform/linux/test_session.py'
212--- tests/platform/linux/test_session.py 2011-07-27 17:31:25 +0000
213+++ tests/platform/linux/test_session.py 2011-12-19 20:54:24 +0000
214@@ -22,8 +22,8 @@
215
216 from dbus.mainloop.glib import DBusGMainLoop
217 from twisted.internet.defer import inlineCallbacks
218+from ubuntuone.devtools.testcases.dbus import DBusTestCase
219
220-from ubuntuone.devtools.testcase import DBusTestCase
221 from ubuntuone.platform.linux import session
222
223 INHIBIT_ALL = (session.INHIBIT_LOGGING_OUT |
224
225=== modified file 'tests/platform/linux/test_vm.py'
226--- tests/platform/linux/test_vm.py 2011-10-27 13:47:09 +0000
227+++ tests/platform/linux/test_vm.py 2011-12-19 20:54:24 +0000
228@@ -21,7 +21,7 @@
229
230 from twisted.internet import defer
231
232-from contrib.testing.testcase import FakeMain, environ
233+from contrib.testing.testcase import FakeMain
234 from tests.syncdaemon.test_vm import MetadataTestCase, BaseVolumeManagerTests
235 from ubuntuone.storageprotocol import request
236 from ubuntuone.syncdaemon.volume_manager import (
237@@ -39,12 +39,12 @@
238 def test_get_udf_path(self):
239 """Test for get_udf_path."""
240 suggested_path = u"suggested_path"
241- with environ('HOME', self.home_dir):
242- udf_path = get_udf_path(u"~/" + suggested_path)
243+ udf_path = get_udf_path(u"~/" + suggested_path)
244 self.assertEquals(os.path.join(self.home_dir,
245 suggested_path.encode('utf-8')),
246 udf_path)
247
248+
249 class MetadataOldLayoutTests(MetadataTestCase):
250 """Tests for 'old' layouts and metadata upgrade"""
251
252@@ -87,7 +87,7 @@
253 old_shelf[request.ROOT] = root_share
254 for idx in range(1, 10):
255 sid = str(uuid.uuid4())
256- old_shelf[sid] = _Share(
257+ old_shelf[sid] = _Share(
258 path=os.path.join(self.shares_dir, str(idx)), share_id=sid)
259 # ShareFileShelf.keys returns a generator
260 old_keys = [key for key in old_shelf.keys()]
261@@ -151,7 +151,7 @@
262 self.main = FakeMain(self.new_root_dir, self.new_shares_dir,
263 self.data_dir, self.partials_dir)
264 new_keys = [new_key for new_key in self.main.vm.shares.keys()]
265- self.assertEquals(2, len(new_keys)) # the fake share plus root
266+ self.assertEquals(2, len(new_keys)) # the fake share plus root
267 for key in [request.ROOT, share.id]:
268 self.assertIn(key, new_keys)
269 self.check_version()
270@@ -188,7 +188,7 @@
271 expected = []
272
273 for dirname, new_dirname in [(self.root_dir, self.new_root_dir),
274- (self.shares_dir, self.new_shares_dir)]:
275+ (self.shares_dir, self.new_shares_dir)]:
276 # a plain .conflict...
277 # ...on a file
278 open(dirname + '/1a.conflict', 'w').close()
279@@ -558,7 +558,7 @@
280 share = _Share(path=os.path.join(self.shares_dir, share_name),
281 share_id=sid, name=share_name,
282 node_id=str(uuid.uuid4()),
283- other_username='username'+str(idx),
284+ other_username='username' + str(idx),
285 other_visible_name='visible name ' + str(idx))
286 if idx % 2:
287 share.access_level = ACCESS_LEVEL_RW
288@@ -592,6 +592,7 @@
289 self.main = FakeMain(self.root_dir, self.shares_dir,
290 self.data_dir, self.partials_dir)
291 vm = self.main.vm
292+
293 def compare_share(share, old_share):
294 """Compare two shares, new and old"""
295 self.assertEquals(share.volume_id, old_share.id)
296@@ -635,7 +636,7 @@
297 share = _Share(path=os.path.join(self.shares_dir, share_name),
298 share_id=sid, name=share_name,
299 node_id=str(uuid.uuid4()),
300- other_username='username'+str(idx),
301+ other_username='username' + str(idx),
302 other_visible_name='visible name ' + str(idx))
303 if idx % 2:
304 share.access_level = ACCESS_LEVEL_RW
305@@ -684,6 +685,7 @@
306 self.main = FakeMain(self.root_dir, self.shares_dir,
307 self.data_dir, self.partials_dir)
308 vm = self.main.vm
309+
310 def compare_share(share, old_share):
311 """Compare two shares, new and old"""
312 self.assertEquals(share.volume_id, old_share.id)
313@@ -730,7 +732,8 @@
314 # create some old shares and shared metadata
315 legacy_shares = LegacyShareFileShelf(self.share_md_dir)
316 root_share = _Share(path=self.root_dir, share_id=request.ROOT,
317- access_level=ACCESS_LEVEL_RW, node_id=str(uuid.uuid4()))
318+ access_level=ACCESS_LEVEL_RW, node_id=str(
319+ uuid.uuid4()))
320 legacy_shares[request.ROOT] = root_share
321 for idx, name in enumerate(['share'] * 3):
322 sid = str(uuid.uuid4())
323@@ -738,7 +741,7 @@
324 share = _Share(path=os.path.join(self.shares_dir, share_name),
325 share_id=sid, name=share_name,
326 node_id=str(uuid.uuid4()),
327- other_username='username'+str(idx),
328+ other_username='username' + str(idx),
329 other_visible_name='visible name ' + str(idx))
330 if idx == 0:
331 share.access_level = ACCESS_LEVEL_RW
332@@ -794,6 +797,7 @@
333 self.main = FakeMain(self.root_dir, self.shares_dir,
334 self.data_dir, self.partials_dir)
335 vm = self.main.vm
336+
337 def compare_share(share, old_share):
338 """Compare two shares, new and old"""
339 old_id = getattr(old_share, 'id', None)
340@@ -845,7 +849,7 @@
341 share = _Share(path=os.path.join(self.shares_dir, share_name),
342 share_id=sid, name=share_name,
343 node_id=str(uuid.uuid4()),
344- other_username='username'+str(idx),
345+ other_username='username' + str(idx),
346 other_visible_name='visible name ' + str(idx))
347 if idx % 2:
348 share.access_level = ACCESS_LEVEL_RW
349@@ -876,6 +880,7 @@
350 self.set_md_version('')
351 # upgrade it!
352 old_upgrade_share_to_volume = MetadataUpgrader._upgrade_share_to_volume
353+
354 def upgrade_share_to_volume(share, shared=False):
355 raise ValueError('FAIL!')
356 MetadataUpgrader._upgrade_share_to_volume = upgrade_share_to_volume
357@@ -928,7 +933,7 @@
358 share = Share(path=os.path.join(self.shares_dir, share_name),
359 volume_id=sid, name=share_name,
360 node_id=str(uuid.uuid4()),
361- other_username='username'+str(idx),
362+ other_username='username' + str(idx),
363 other_visible_name='visible name ' + str(idx))
364 if idx % 2:
365 share.access_level = ACCESS_LEVEL_RW
366@@ -1005,4 +1010,3 @@
367 class BrokenNewMDVersionUpgradeTests(MetadataNewLayoutTests):
368 """MetadataNewLayoutTests with broken .version file."""
369 md_version_None = True
370-
371
372=== modified file 'tests/platform/test_os_helper.py'
373--- tests/platform/test_os_helper.py 2011-08-25 18:49:35 +0000
374+++ tests/platform/test_os_helper.py 2011-12-19 20:54:24 +0000
375@@ -1,6 +1,4 @@
376-#
377-# Author: Guillermo Gonzalez <guillermo.gonzalez@canonical.com>
378-# Manuel de la Pena <manuel@canonical.com>
379+# -*- encoding: utf-8 -*-
380 #
381 # Copyright 2010-2011 Canonical Ltd.
382 #
383@@ -30,10 +28,12 @@
384 BaseTwistedTestCase,
385 skip_if_win32_and_uses_readonly,
386 )
387+from ubuntuone import platform
388 from ubuntuone.platform import (
389 access,
390 allow_writes,
391 can_write,
392+ expand_user,
393 is_link,
394 listdir,
395 make_dir,
396@@ -91,6 +91,8 @@
397 yield super(OSWrapperTests, self).setUp(
398 test_dir_name=test_dir_name, test_file_name=test_file_name,
399 valid_file_path_builder=valid_file_path_builder)
400+ self.my_home = 'myhome'
401+ self.patch(platform, "xdg_home", self.my_home)
402 # make sure the file exists
403 open_file(self.testfile, 'w').close()
404
405@@ -344,6 +346,55 @@
406 e = self.assertRaises(OSError, move_to_trash, path)
407 self.assertEqual(e.errno, errno.ENOENT)
408
409+ def test_expand_user_not_start_with_tilde(self):
410+ """Test the expand_user function with an ordinary path."""
411+ path = 'userpath'
412+ result = expand_user(path)
413+ self.assertEqual(path, result)
414+
415+ def test_expand_user_start_with_tilde_no_backslash(self):
416+ """Test the expand_user function with tilde an ordinary path."""
417+ path = '~userpath'
418+ result = expand_user(path)
419+ self.assertEqual(path, result)
420+
421+ def test_expand_user_double_backslash(self):
422+ """Test the expand_user function with double backslash."""
423+ path = '~~userpath'
424+ result = expand_user(path)
425+ self.assertEqual(path, result)
426+
427+ def test_expand_user_start_with_tilde(self):
428+ """Test the expand_user function with a path like: ~/userpath."""
429+ path = os.path.join('~', 'userpath')
430+ result = expand_user(path)
431+ expected = os.path.join(self.my_home, 'userpath')
432+ self.assertEqual(expected, result)
433+
434+ def test_expand_user_tilde_and_backslash(self):
435+ """Test the expand_user function with tilde and backslash."""
436+ tilde = '~' + os.path.sep
437+ result = expand_user(tilde)
438+ expected = self.my_home + os.path.sep
439+ self.assertEqual(expected, result)
440+
441+ def test_expand_user_only_tilde(self):
442+ """Test the expand_user function returns with only tilde input."""
443+ tilde = '~'
444+ result = expand_user(tilde)
445+ self.assertEqual(self.my_home, result)
446+ self.assertFalse(result.endswith(os.path.sep))
447+
448+ def test_expand_user_fails_if_not_bytes(self):
449+ """Test the expand_user function input assertions."""
450+ path = u'userpath'
451+ self.assertRaises(AssertionError, expand_user, path)
452+
453+ def test_expand_user_fails_if_not_utf8_encoded(self):
454+ """Test the expand_user function input encoding."""
455+ path = u'usérpath'.encode('latin-1')
456+ self.assertRaises(AssertionError, expand_user, path)
457+
458
459 class RecursiveMoveTests(BaseTestCase):
460 """Tests for os wrapper functions."""
461
462=== modified file 'tests/platform/test_tools.py'
463--- tests/platform/test_tools.py 2011-11-12 17:09:09 +0000
464+++ tests/platform/test_tools.py 2011-12-19 20:54:24 +0000
465@@ -31,6 +31,7 @@
466 states,
467 volume_manager,
468 )
469+from ubuntuone import platform
470 from ubuntuone.platform import tools
471 from tests.platform import IPCTestCase
472
473@@ -532,6 +533,7 @@
474 @defer.inlineCallbacks
475 def test_create_folder(self):
476 """Test for Folders.create."""
477+ self.patch(platform, 'xdg_home', self.home_dir)
478 path = os.path.join(self.home_dir, u'ñoño')
479 volume_id = 'volume_id'
480 node_id = 'node_id'
481@@ -550,6 +552,7 @@
482 @defer.inlineCallbacks
483 def test_create_folder_error(self):
484 """Test for Folders.create with error."""
485+ self.patch(platform, 'xdg_home', self.home_dir)
486 path = os.path.join(self.home_dir, u'ñoño')
487
488 def create_udf(path, name, marker):
489
490=== modified file 'tests/platform/windows/test_os_helper.py'
491--- tests/platform/windows/test_os_helper.py 2011-09-02 11:34:17 +0000
492+++ tests/platform/windows/test_os_helper.py 2011-12-19 20:54:24 +0000
493@@ -1,5 +1,4 @@
494-#
495-# Author: Manuel de la Pena <manuel@canonical.com>
496+# -*- encoding: utf-8 -*-
497 #
498 # Copyright 2011 Canonical Ltd.
499 #
500@@ -17,7 +16,10 @@
501
502 """Specific tests for the os_helper on Windows."""
503
504+import errno
505 import os
506+import shutil
507+import sys
508
509 from twisted.internet import defer
510 from twisted.trial.unittest import TestCase
511@@ -29,7 +31,14 @@
512 FILE_GENERIC_READ,
513 FILE_GENERIC_WRITE,
514 )
515+from win32file import (
516+ FILE_ATTRIBUTE_NORMAL,
517+ FILE_ATTRIBUTE_SYSTEM,
518+ GetFileAttributesW,
519+ SetFileAttributesW
520+)
521
522+from ubuntuone.platform.windows import os_helper
523 from ubuntuone.platform.windows.os_helper import (
524 _set_file_attributes,
525 _unicode_to_bytes,
526@@ -40,6 +49,7 @@
527 assert_windows_path,
528 can_write,
529 get_syncdaemon_valid_path,
530+ get_user_sid,
531 get_windows_valid_path,
532 normpath,
533 set_dir_readwrite,
534@@ -50,6 +60,12 @@
535 from tests.platform.test_os_helper import OSWrapperTests, WalkTests
536
537
538+# ugly trick to stop pylint for complaining about
539+# WindowsError on Linux
540+if sys.platform != 'win32':
541+ WindowsError = Exception
542+
543+
544 def _build_invalid_windows_bytes_name():
545 invalid_unicodes = u''.join(WINDOWS_ILLEGAL_CHARS_MAP)
546 invalid_filename = 'test_file' + invalid_unicodes.encode('utf8')
547@@ -111,6 +127,16 @@
548 self.assertFalse(LONG_PATH_PREFIX in current_path)
549
550
551+class FakeSecurityInfo(object):
552+
553+ user_sid = 'user_sid'
554+
555+ # pylint: disable=C0103
556+ def GetSecurityDescriptorOwner(self):
557+ return self.user_sid
558+ # pylint: enable=C0103
559+
560+
561 class TestAccess(BaseTwistedTestCase):
562 """Test specific windows implementation access details."""
563
564@@ -257,6 +283,23 @@
565 _set_file_attributes(self.valid_path, groups)
566 self.assertTrue(can_write(self.testfile))
567
568+ def fake_security_info(self, *args):
569+ return FakeSecurityInfo()
570+
571+ def test_get_user_sid(self):
572+ self.patch(os_helper, "GetSecurityInfo", self.fake_security_info)
573+ user_sid = get_user_sid()
574+ self.assertEqual(user_sid, FakeSecurityInfo.user_sid)
575+
576+ def test_set_file_attributes_missing_path(self):
577+ """Set file attr for a missing file."""
578+ groups = [(EVERYONE_SID, FILE_ALL_ACCESS)]
579+ # file does not exist.
580+ self.patch(os_helper.os.path, 'exists', lambda f: False)
581+ exc = self.assertRaises(WindowsError, _set_file_attributes,
582+ self.valid_path, groups)
583+ self.assertEqual(errno.ENOENT, exc.errno,
584+ 'Errno should be file not found.')
585
586 class DecoratorsTestCase(TestCase):
587 """Test case for all the validators and transformers."""
588@@ -346,3 +389,80 @@
589 name_transformer=_unicode_to_bytes)
590 super(TestIllegalPathsWalk, self).test_top_down(topdown=topdown,
591 expected=expected)
592+
593+
594+class TestSystemPaths(TestCase):
595+ """Tests related with the system paths."""
596+
597+ @defer.inlineCallbacks
598+ def setUp(self):
599+ """Set tests."""
600+ yield super(TestSystemPaths, self).setUp()
601+ self.system_paths = ['My Music', 'My Pictures']
602+ self.dirs = ['One', 'Two', 'Tree']
603+ self.files = ['File', 'Second File', 'Last file']
604+ self.temp = self.mktemp()
605+ self._make_test_files()
606+ self.addCleanup(shutil.rmtree, self.temp)
607+
608+ def _make_test_files(self):
609+ """Create the temp test files."""
610+
611+ # lets make the files for the tests
612+ for d in self.dirs:
613+ os.makedirs(os.path.join(self.temp, d))
614+
615+ for s in self.system_paths:
616+ path = os.path.join(self.temp, s)
617+ os.makedirs(path)
618+ self._set_as_system_path(path)
619+
620+ for f in self.files:
621+ path = os.path.join(self.temp, f)
622+ with open(path, 'w') as fd:
623+ fd.write('Im a test, blame TestSystemPaths!')
624+
625+ def _set_as_system_path(self, path):
626+ """Set a path to have the system attr."""
627+ attrs = GetFileAttributesW(path)
628+ attrs = attrs | FILE_ATTRIBUTE_SYSTEM
629+ SetFileAttributesW(path, attrs)
630+
631+
632+ def test_os_listdir(self):
633+ """Test the list dir."""
634+ expected_result = self.dirs + self.files
635+ self.assertEqual(sorted(expected_result),
636+ sorted(os_helper.listdir(self.temp)))
637+
638+ def test_os_walk(self):
639+ """Test the walk."""
640+ expected_dirs = ['One', 'Two', 'Tree']
641+ expected_files = ['File', 'Second File', 'Last file']
642+ result_dirs = []
643+ result_files = []
644+ for dirpath, dirs, files in os_helper.walk(self.temp):
645+ result_dirs.extend(dirs)
646+ result_files.extend(files)
647+ self.assertEqual(sorted(expected_dirs), sorted(result_dirs))
648+ self.assertEqual(sorted(expected_files), sorted(result_files))
649+
650+ def test_native_is_system_path_true(self):
651+ """Test the function that returns if is a sytem folder."""
652+
653+ def fake_get_attrs(path):
654+ """Fake the GetFileAttributes method."""
655+ return FILE_ATTRIBUTE_NORMAL | FILE_ATTRIBUTE_SYSTEM
656+
657+ self.patch(os_helper, 'GetFileAttributesW', fake_get_attrs)
658+ self.assertTrue(os_helper.native_is_system_path(self.temp))
659+
660+ def test_native_is_system_path_false(self):
661+ """Test the function that returns if is a sytem folder."""
662+
663+ def fake_get_attrs(path):
664+ """Fake the GetFileAttributes method."""
665+ return FILE_ATTRIBUTE_NORMAL
666+
667+ self.patch(os_helper, 'GetFileAttributesW', fake_get_attrs)
668+ self.assertFalse(os_helper.native_is_system_path(self.temp))
669
670=== modified file 'tests/platform/windows/test_tools.py'
671--- tests/platform/windows/test_tools.py 2011-10-28 19:17:47 +0000
672+++ tests/platform/windows/test_tools.py 2011-12-19 20:54:24 +0000
673@@ -14,14 +14,18 @@
674 # You should have received a copy of the GNU General Public License along
675 # with this program. If not, see <http://www.gnu.org/licenses/>.
676
677+import sys
678
679 from twisted.internet import defer
680 from twisted.trial.unittest import TestCase
681
682-from ubuntuone.platform.tools.windows import (
683- SyncDaemonToolProxy,
684- UbuntuOneClient,
685-)
686+from ubuntuone.platform.tools import windows
687+
688+
689+# ugly trick to stop pylint for complaining about
690+# WindowsError on Linux
691+if sys.platform != 'win32':
692+ WindowsError = None
693
694
695 class TestSyncDaemonTool(TestCase):
696@@ -30,8 +34,8 @@
697 @defer.inlineCallbacks
698 def setUp(self):
699 yield super(TestSyncDaemonTool, self).setUp()
700- self.patch(UbuntuOneClient, "connect", lambda _: defer.Deferred())
701- self.sdtool = SyncDaemonToolProxy()
702+ self.patch(windows.UbuntuOneClient, "connect", lambda _: defer.Deferred())
703+ self.sdtool = windows.SyncDaemonToolProxy()
704
705 def test_call_after_connection(self):
706 """Test the _call_after_connection method."""
707@@ -79,3 +83,14 @@
708 attr = getattr(self.sdtool, attr_name)
709 func_name = getattr(attr, "__name__", None)
710 self.assertNotEqual(func_name, "call_after_connection_inner")
711+
712+ def test_start_missing_exe(self):
713+ """Test starting the service when the exe is missing."""
714+ # file is missing
715+ self.patch(windows.os.path, 'exists', lambda f: False)
716+ key = 'key'
717+ path = 'path/to/exe'
718+ self.patch(windows, 'OpenKey', lambda k,p: key)
719+ self.patch(windows, 'QueryValueEx', lambda k,p: path)
720+
721+ self.assertFailure(self.sdtool.start(), WindowsError)
722
723=== modified file 'tests/status/test_aggregator.py'
724--- tests/status/test_aggregator.py 2011-10-27 13:47:09 +0000
725+++ tests/status/test_aggregator.py 2011-12-19 20:54:24 +0000
726@@ -1327,6 +1327,26 @@
727 result = self.aggregator.get_discovery_message()
728 self.assertEqual(expected, result)
729
730+ def test_get_discovery_message_clears_filenames(self):
731+ """Test the message that's shown on the discovery bubble."""
732+ uploading = 10
733+ downloading = 8
734+ filename = 'upfile0.ext'
735+ filename2 = 'downfile0.ext'
736+ self.aggregator.files_uploading.extend([
737+ FakeCommand(path='upfile%d.ext' % n) for n in range(uploading)])
738+ self.aggregator.uploading_filename = filename
739+ self.aggregator.files_downloading.extend([
740+ FakeCommand(path='downfile%d.ext' % n) for n in
741+ range(downloading)])
742+ self.aggregator.downloading_filename = 'STALE FILENAME'
743+ self.aggregator.uploading_filename = 'STALE FILENAME'
744+ expected = (
745+ aggregator.files_being_uploaded(filename, uploading) + "\n" +
746+ aggregator.files_being_downloaded(filename2, downloading))
747+ result = self.aggregator.get_discovery_message()
748+ self.assertEqual(expected, result)
749+
750 def test_get_final_status_message(self):
751 """The final status message."""
752 done = (5, 10)
753
754=== modified file 'tests/syncdaemon/test_config.py'
755--- tests/syncdaemon/test_config.py 2011-10-03 13:36:31 +0000
756+++ tests/syncdaemon/test_config.py 2011-12-19 20:54:24 +0000
757@@ -20,24 +20,20 @@
758
759 import logging
760 import os
761-import sys
762
763 from ConfigParser import ConfigParser
764 from twisted.internet import defer
765 from twisted.trial.unittest import TestCase
766-
767-from ubuntuone.devtools.testcase import skipIfOS
768-from ubuntuone.platform import open_file, path_exists
769 from ubuntu_sso.xdg_base_directory import (
770 xdg_data_home,
771 xdg_cache_home,
772 )
773+from ubuntuone.devtools.testcases import skipIfOS
774+
775+from contrib.testing.testcase import BaseTwistedTestCase
776+from ubuntuone import platform
777+from ubuntuone.platform import open_file, path_exists
778 from ubuntuone.syncdaemon import config
779-from contrib.testing.testcase import (
780- BaseTwistedTestCase,
781- environ,
782-)
783-
784
785
786 class TestConfigBasic(BaseTwistedTestCase):
787@@ -205,7 +201,6 @@
788 self.assertEquals(None, conf.get_throttling_read_limit())
789 self.assertEquals(None, conf.get_throttling_write_limit())
790
791-
792 def test_load_partial_config(self):
793 """test loading a partial config file and fallback to defaults"""
794 conf_file = os.path.join(self.test_root, 'test_load_config.conf')
795@@ -483,14 +478,13 @@
796 """Check that get_config_files uses paths in the right encoding."""
797 temp = self.mktemp()
798 fake_path = os.path.join(temp, u"Ñandú")
799- native_dir = fake_path.encode(sys.getfilesystemencoding())
800- os.makedirs(native_dir)
801- with open(os.path.join(native_dir, config.CONFIG_FILE), "w") as f:
802+ os.makedirs(fake_path)
803+ with open(os.path.join(fake_path, config.CONFIG_FILE), "w") as f:
804 f.write("this is a fake config file")
805 fake_load_config_paths = lambda _: [fake_path.encode("utf8")]
806 self.patch(config, "load_config_paths", fake_load_config_paths)
807 config_files = config.get_config_files()
808- branch_config = os.path.join(native_dir, config.CONFIG_FILE)
809+ branch_config = os.path.join(fake_path, config.CONFIG_FILE)
810 self.assertIn(branch_config, config_files)
811
812
813@@ -537,12 +531,12 @@
814 def test_good_value(self):
815 """Test the parser using a good value."""
816 homedir = os.path.join('', 'home', 'fake')
817- with environ('HOME', homedir):
818- expected = os.path.join(self.xdg_dir, 'hola', 'mundo')
819- actual = self.parser(self.good_value)
820- self.assertEqual(expected, actual)
821- self.assertIsInstance(actual, str)
822- self.assertNotIsInstance(actual, unicode)
823+ self.patch(platform, 'xdg_home', homedir)
824+ expected = os.path.join(self.xdg_dir, 'hola', 'mundo')
825+ actual = self.parser(self.good_value)
826+ self.assertEqual(expected, actual)
827+ self.assertIsInstance(actual, str)
828+ self.assertNotIsInstance(actual, unicode)
829
830 def test_bad_value(self):
831 """Test the parser using a bad value."""
832
833=== modified file 'tests/syncdaemon/test_eq_inotify.py'
834--- tests/syncdaemon/test_eq_inotify.py 2011-09-02 17:29:30 +0000
835+++ tests/syncdaemon/test_eq_inotify.py 2011-12-19 20:54:24 +0000
836@@ -24,8 +24,8 @@
837 import os
838
839 from twisted.internet import defer, reactor
840-from ubuntuone.devtools.testcase import skipIfOS
841 from ubuntuone.devtools.handlers import MementoHandler
842+from ubuntuone.devtools.testcases import skipIfOS
843
844 from contrib.testing.testcase import (
845 BaseTwistedTestCase,
846@@ -34,6 +34,7 @@
847 skip_if_win32_missing_fs_event,
848 )
849 from tests.syncdaemon.test_eventqueue import BaseEQTestCase
850+from ubuntuone import platform
851 from ubuntuone.platform import (
852 make_link,
853 make_dir,
854@@ -660,10 +661,7 @@
855 self.eq.subscribe(self.listener)
856 self.addCleanup(self.eq.unsubscribe, self.listener)
857
858- env_var = 'HOME'
859- old_value = os.environ.get(env_var, None)
860- os.environ[env_var] = self.home_dir
861- self.addCleanup(os.environ.__setitem__, env_var, old_value)
862+ self.patch(platform, 'xdg_home', self.home_dir)
863
864 # create UDF
865 suggested_path = u'~/Documents/Reading/Books/PDFs'
866
867=== modified file 'tests/syncdaemon/test_fileshelf.py'
868--- tests/syncdaemon/test_fileshelf.py 2011-08-10 18:27:24 +0000
869+++ tests/syncdaemon/test_fileshelf.py 2011-12-19 20:54:24 +0000
870@@ -24,8 +24,9 @@
871 import unittest
872
873 from twisted.internet import defer
874+from ubuntuone.devtools.testcases import skipIfOS
875
876-from ubuntuone.devtools.testcase import skipIfOS
877+from contrib.testing.testcase import BaseTwistedTestCase
878 from ubuntuone.platform import (
879 open_file,
880 path_exists,
881@@ -36,7 +37,6 @@
882 LRUCache,
883 CacheInconsistencyError,
884 )
885-from contrib.testing.testcase import BaseTwistedTestCase
886
887
888 BROKEN_PICKLE = '\axb80\x02}q\x01(U\x01aU\x04testq\x02U\x01bU\x06brokenq\x03u.'
889
890=== modified file 'tests/syncdaemon/test_interaction_interfaces.py'
891--- tests/syncdaemon/test_interaction_interfaces.py 2011-11-02 16:51:32 +0000
892+++ tests/syncdaemon/test_interaction_interfaces.py 2011-12-19 20:54:24 +0000
893@@ -1559,6 +1559,7 @@
894 info = yield d
895
896 udf.subscribed = True
897+ udf.local_rescanning = False
898 self.assertEqual(get_udf_dict(udf), info)
899
900 @defer.inlineCallbacks
901@@ -1589,6 +1590,7 @@
902 info = yield d
903
904 udf.subscribed = False
905+ udf.local_rescanning = False
906 self.assertEqual(get_udf_dict(udf), info)
907
908 @defer.inlineCallbacks
909@@ -1648,6 +1650,7 @@
910 info = yield d
911
912 share.subscribed = True
913+ share.local_rescanning = False
914 self.assertEqual(get_share_dict(share), info)
915
916 @defer.inlineCallbacks
917@@ -1678,6 +1681,7 @@
918 info = yield d
919
920 share.subscribed = False
921+ share.local_rescanning = False
922 self.assertEqual(get_share_dict(share), info)
923
924 @defer.inlineCallbacks
925
926=== modified file 'tests/syncdaemon/test_localrescan.py'
927--- tests/syncdaemon/test_localrescan.py 2011-11-02 16:51:32 +0000
928+++ tests/syncdaemon/test_localrescan.py 2011-12-19 20:54:24 +0000
929@@ -26,13 +26,14 @@
930
931 from twisted.internet import defer, reactor
932 from ubuntuone.devtools.handlers import MementoHandler
933-from ubuntuone.devtools.testcase import skipIfOS
934+from ubuntuone.devtools.testcases import skipIfOS
935
936 from contrib.testing.testcase import (
937 BaseTwistedTestCase,
938 FakeVolumeManager,
939 skip_if_win32_and_uses_readonly,
940 )
941+from ubuntuone import platform
942 from ubuntuone.platform import (
943 make_dir,
944 make_link,
945@@ -130,9 +131,7 @@
946 self.tritcask_dir = self.mktemp("tritcask")
947
948 # set the home for the tests
949- old_value = os.environ.get('HOME', None)
950- os.environ['HOME'] = self.home_dir
951- self.addCleanup(os.environ.__setitem__, 'HOME', old_value)
952+ self.patch(platform, 'xdg_home', self.home_dir)
953
954 self.vm = FakeVolumeManager(usrdir)
955 self.db = Tritcask(self.tritcask_dir)
956@@ -372,9 +371,7 @@
957 """Init."""
958 yield super(VolumeTestCase, self).setUp()
959
960- old_value = os.environ.get('HOME', None)
961- os.environ['HOME'] = self.home_dir
962- self.addCleanup(os.environ.__setitem__, 'HOME', old_value)
963+ self.patch(platform, 'xdg_home', self.home_dir)
964
965 self.lr = local_rescan.LocalRescan(self.vm, self.fsm, self.eq, self.aq)
966
967@@ -2021,9 +2018,7 @@
968 @defer.inlineCallbacks
969 def setUp(self):
970 yield super(RootBadStateTests, self).setUp()
971- old_value = os.environ.get('HOME', None)
972- os.environ['HOME'] = self.home_dir
973- self.addCleanup(os.environ.__setitem__, 'HOME', old_value)
974+ self.patch(platform, 'xdg_home', self.home_dir)
975
976 @defer.inlineCallbacks
977 def _test_it(self, volume):
978@@ -2262,9 +2257,7 @@
979
980 self.lr = local_rescan.LocalRescan(self.vm, self.fsm, self.eq, self.aq)
981
982- old_value = os.environ.get('HOME', None)
983- os.environ['HOME'] = self.home_dir
984- self.addCleanup(os.environ.__setitem__, 'HOME', old_value)
985+ self.patch(platform, 'xdg_home', self.home_dir)
986
987 # create UDF
988 suggested_path = u'~/Documents/Reading/Books/PDFs'
989
990=== modified file 'tests/syncdaemon/test_logger.py'
991--- tests/syncdaemon/test_logger.py 2011-10-27 11:39:43 +0000
992+++ tests/syncdaemon/test_logger.py 2011-12-19 20:54:24 +0000
993@@ -24,7 +24,7 @@
994 from twisted.trial import unittest
995
996 from ubuntuone.devtools.handlers import MementoHandler
997-from ubuntuone.devtools.testcase import skipIfOS
998+from ubuntuone.devtools.testcases import skipIfOS
999
1000 from ubuntuone.syncdaemon.logger import (
1001 DebugCapture,
1002
1003=== modified file 'tests/syncdaemon/test_main.py'
1004--- tests/syncdaemon/test_main.py 2011-10-21 13:40:09 +0000
1005+++ tests/syncdaemon/test_main.py 2011-12-19 20:54:24 +0000
1006@@ -20,6 +20,7 @@
1007
1008 from twisted.internet import defer, reactor
1009 from ubuntuone.devtools.handlers import MementoHandler
1010+from ubuntuone.platform import expand_user
1011
1012 from contrib.testing.testcase import (
1013 BaseTwistedTestCase, FAKED_CREDENTIALS,
1014@@ -263,13 +264,13 @@
1015
1016 def test_get_rootdir(self):
1017 """The get_rootdir returns the root dir."""
1018- expected = os.path.expanduser(os.path.join('~', 'Ubuntu Test One'))
1019+ expected = expand_user(os.path.join('~', 'Ubuntu Test One'))
1020 main = self.build_main(root_dir=expected)
1021 self.assertEqual(main.get_rootdir(), expected)
1022
1023 def test_get_sharesdir(self):
1024 """The get_sharesdir returns the shares dir."""
1025- expected = os.path.expanduser(os.path.join('~', 'Share it to Me'))
1026+ expected = expand_user(os.path.join('~', 'Share it to Me'))
1027 main = self.build_main(shares_dir=expected)
1028 self.assertEqual(main.get_sharesdir(), expected)
1029
1030
1031=== modified file 'tests/syncdaemon/test_sync.py'
1032--- tests/syncdaemon/test_sync.py 2011-09-02 12:22:35 +0000
1033+++ tests/syncdaemon/test_sync.py 2011-12-19 20:54:24 +0000
1034@@ -30,7 +30,7 @@
1035
1036 from twisted.internet import defer
1037 from twisted.python.failure import Failure
1038-from ubuntuone.devtools.testcase import skipIfOS
1039+from ubuntuone.devtools.testcases import skipIfOS
1040
1041 from contrib.testing.testcase import (
1042 FakeMain,
1043
1044=== modified file 'tests/syncdaemon/test_vm.py'
1045--- tests/syncdaemon/test_vm.py 2011-10-25 02:24:29 +0000
1046+++ tests/syncdaemon/test_vm.py 2011-12-19 20:54:24 +0000
1047@@ -30,19 +30,20 @@
1048
1049 from mocker import Mocker, MATCH
1050 from twisted.internet import defer, reactor
1051-
1052-from contrib.testing.testcase import (
1053- BaseTwistedTestCase,
1054- FakeMain,
1055-)
1056 from ubuntuone.devtools.handlers import MementoHandler
1057-from ubuntuone.devtools.testcase import skipIfOS
1058+from ubuntuone.devtools.testcases import skipIfOS
1059 from ubuntuone.storageprotocol import volumes, request
1060 from ubuntuone.storageprotocol.client import ListShares
1061 from ubuntuone.storageprotocol.sharersp import (
1062 NotifyShareHolder,
1063 ShareResponse,
1064 )
1065+
1066+from contrib.testing.testcase import (
1067+ BaseTwistedTestCase,
1068+ FakeMain,
1069+)
1070+from ubuntuone import platform
1071 from ubuntuone.syncdaemon import config, event_queue, tritcask
1072 from ubuntuone.syncdaemon.volume_manager import (
1073 ACCESS_LEVEL_RO,
1074@@ -97,15 +98,18 @@
1075 self.partials_dir = self.mktemp('partials_dir')
1076 self.main = FakeMain(self.root_dir, self.shares_dir,
1077 self.data_dir, self.partials_dir)
1078+ self.patch(platform, 'xdg_home', self.home_dir)
1079
1080 self.watches = set() # keep track of added watches
1081
1082 orig_add_watch = self.main.event_q.add_watch
1083+
1084 def fake_add_watch(path):
1085 self.watches.add(path)
1086 return orig_add_watch(path)
1087
1088 orig_rm_watch = self.main.event_q.rm_watch
1089+
1090 def fake_rm_watch(path):
1091 self.watches.remove(path)
1092 return orig_rm_watch(path)
1093@@ -135,6 +139,7 @@
1094 def _listen_for(self, event, callback, count=1, collect=False):
1095 """Setup a EQ listener for the especified event."""
1096 event_q = self.main.event_q
1097+
1098 class Listener(object):
1099 """A basic listener to handle the pushed event."""
1100
1101@@ -407,7 +412,8 @@
1102 @defer.inlineCallbacks
1103 def test_add_share_modify_scans_share(self):
1104 """Test that add_share scans the share."""
1105- share = self._create_share(access_level=ACCESS_LEVEL_RW, subscribed=True)
1106+ share = self._create_share(access_level=ACCESS_LEVEL_RW,
1107+ subscribed=True)
1108
1109 scan_d = defer.Deferred()
1110
1111@@ -426,6 +432,7 @@
1112 self.patch(self.main.lr, 'scan_dir', fake_scan_dir)
1113
1114 scratch_d = defer.Deferred()
1115+
1116 def fake_rescan_from_scratch(volume_id):
1117 """A fake scan share that check the arguments."""
1118 self.assertEqual(share.volume_id, volume_id)
1119@@ -483,7 +490,7 @@
1120 # add a inotify watch to the dir
1121 yield self.vm._add_watch(path)
1122 files = ['a_file', os.path.join('dir', 'file'),
1123- os.path.join('dir','subdir','file')]
1124+ os.path.join('dir', 'subdir', 'file')]
1125 for i, file in enumerate(files):
1126 path = os.path.join(share.path, file)
1127 self.main.fs.create(path, share.volume_id)
1128@@ -558,7 +565,7 @@
1129 response = ListShares(None)
1130 response.shares = [share_response]
1131 self.vm.handle_AQ_SHARES_LIST(response)
1132- self.assertEqual(2, len(self.vm.shares)) # the new shares and root
1133+ self.assertEqual(2, len(self.vm.shares)) # the new shares and root
1134 # check that the share is in the shares dict
1135 self.assertIn(str(share_id), self.vm.shares)
1136 share = self.vm.shares[str(share_id)]
1137@@ -582,7 +589,8 @@
1138 access_level=ACCESS_LEVEL_RO)
1139 yield self.vm.add_share(share)
1140 self.vm.handle_SV_SHARE_CHANGED(info=share_holder)
1141- self.assertEquals(ACCESS_LEVEL_RW, self.vm.shares[str(share_id)].access_level)
1142+ self.assertEquals(ACCESS_LEVEL_RW,
1143+ self.vm.shares[str(share_id)].access_level)
1144 self.vm.handle_SV_SHARE_DELETED(share_holder.share_id)
1145 self.assertNotIn('share_id', self.vm.shares)
1146
1147@@ -619,8 +627,8 @@
1148 response = ListShares(None)
1149 response.shares = [share_response, shared_response]
1150 self.vm.handle_AQ_SHARES_LIST(response)
1151- self.assertEqual(2, len(self.vm.shares)) # the new share and root
1152- self.assertEqual(1, len(self.vm.shared)) # the new shared
1153+ self.assertEqual(2, len(self.vm.shares)) # the new share and root
1154+ self.assertEqual(1, len(self.vm.shared)) # the new shared
1155 shared = self.vm.shared[str(shared_id)]
1156 self.assertEquals('fake_shared', shared.name)
1157 # check that the uuid is stored in fs
1158@@ -786,10 +794,12 @@
1159 self.assertIn(share.volume_id, self.vm.shares)
1160 self.assertEquals(False, share.accepted)
1161 # helper method, pylint: disable-msg=C0111
1162+
1163 def answer_share(share_id, answer):
1164 reactor.callLater(0.2, d.callback, (share_id, answer))
1165 return d
1166 self.main.action_q.answer_share = answer_share
1167+
1168 def callback(result):
1169 share_id, answer = result
1170 self.assertEquals(share.volume_id, share_id)
1171@@ -812,7 +822,7 @@
1172 response = ListShares(None)
1173 response.shares = [shared_response]
1174 self.vm.handle_AQ_SHARES_LIST(response)
1175- self.assertEquals(1, len(self.vm.shared)) # the new shares and root
1176+ self.assertEquals(1, len(self.vm.shared)) # the new shares and root
1177 shared = self.vm.shared['shared_id']
1178 self.assertEquals('fake_shared', shared.name)
1179 # check that the uuid is stored in fs
1180@@ -829,7 +839,8 @@
1181 # initialize the the root
1182 self.vm._got_root('root_uuid')
1183 # add the shared folder
1184- share = Share(path=path, volume_id='share_id', access_level=ACCESS_LEVEL_RO)
1185+ share = Share(path=path, volume_id='share_id',
1186+ access_level=ACCESS_LEVEL_RO)
1187 yield self.vm.add_shared(share)
1188 self.assertEquals(False, self.vm.shared['share_id'].accepted)
1189 # check that a answer notify of a missing share don't blowup
1190@@ -866,7 +877,7 @@
1191 response = ListShares(None)
1192 response.shares = [share_response, share_response_1]
1193 self.vm.handle_AQ_SHARES_LIST(response)
1194- self.assertEquals(3, len(self.vm.shares)) # the new shares and root
1195+ self.assertEquals(3, len(self.vm.shares)) # the new shares and root
1196 # check that the share is in the shares dict
1197 self.assertIn('share_id', self.vm.shares)
1198 self.assertIn('share_id_1', self.vm.shares)
1199@@ -1025,6 +1036,7 @@
1200 share = self._create_share()
1201
1202 scratch_d = defer.Deferred()
1203+
1204 def fake_rescan_from_scratch(volume_id):
1205 """A fake get_delta that check the arguments."""
1206 self.assertEquals(share.volume_id, volume_id)
1207@@ -1141,8 +1153,8 @@
1208
1209 self.assertTrue(self.vm.shares[share.volume_id].subscribed)
1210 # create a few files and directories
1211- dirs = ['dir', os.path.join('dir','subdir'),
1212- os.path.join('dir','empty_dir')]
1213+ dirs = ['dir', os.path.join('dir', 'subdir'),
1214+ os.path.join('dir', 'empty_dir')]
1215 for i, dir in enumerate(dirs):
1216 path = os.path.join(share.path, dir)
1217 with allow_writes(os.path.split(share.path)[0]):
1218@@ -1154,7 +1166,7 @@
1219 # add a inotify watch to the dir
1220 yield self.vm._add_watch(path)
1221 files = ['a_file', os.path.join('dir', 'file'),
1222- os.path.join('dir','subdir','file')]
1223+ os.path.join('dir', 'subdir', 'file')]
1224 for i, file in enumerate(files):
1225 path = os.path.join(share.path, file)
1226 with allow_writes(os.path.split(share.path)[0]):
1227@@ -1188,6 +1200,7 @@
1228 def _test_subscribe_share_generations(self, share):
1229 """Test subscribe_share with a generation."""
1230 scratch_d = defer.Deferred()
1231+
1232 def fake_rescan_from_scratch(volume_id):
1233 """A fake rescan_from_scratch that check the arguments."""
1234 self.assertEquals(share.volume_id, volume_id)
1235@@ -1255,7 +1268,7 @@
1236 self.main.event_q.rm_watch(share.path)
1237 self.assertTrue(self.vm.shares[share.volume_id].subscribed)
1238 # create a few files and directories
1239- dirs = ['dir', os.path.join('dir','subdir'),
1240+ dirs = ['dir', os.path.join('dir', 'subdir'),
1241 os.path.join('dir', 'empty_dir')]
1242 for i, dir in enumerate(dirs):
1243 path = os.path.join(share.path, dir)
1244@@ -1409,7 +1422,7 @@
1245 os.path.join(u'~', u'Documents'),
1246 os.path.join(u'~', u'Documents', u'Reading Años'),
1247 os.path.join(u'~', u'Documents', u'Reading Años', u'Books')]
1248- expected = [os.path.expanduser(p).encode('utf8') for p in expected]
1249+ expected = [platform.expand_user(p.encode('utf-8')) for p in expected]
1250
1251 udf = self._create_udf(suggested_path=suggested_path)
1252 self.assertEquals(expected, udf.ancestors)
1253@@ -1449,10 +1462,39 @@
1254 'watch for %r should be present.' % udf.path)
1255
1256 @defer.inlineCallbacks
1257+ def test_add_udf_with_content(self):
1258+ """Test for VolumeManager.add_udf with content on disk."""
1259+ # create a sync instance
1260+ from ubuntuone.syncdaemon import sync
1261+ sync = sync.Sync(self.main)
1262+ suggested_path = u"~/suggested_path"
1263+ udf = self._create_udf(suggested_path=suggested_path,
1264+ subscribed=True)
1265+ # create some files inside it
1266+ make_dir(udf.path)
1267+ for i in range(10):
1268+ with open_file(os.path.join(udf.path, 'file_%d' % (i,)), 'wb') as f:
1269+ f.write(os.urandom(10))
1270+ self.assertEqual(len(os.listdir(udf.path)), 10)
1271+ # patch the fake action queue to intercept make_file calls
1272+ called = []
1273+ self.main.action_q.make_file = lambda *a: called.append(a)
1274+ yield self.vm.add_udf(udf)
1275+ self.assertEqual(len(called), 10)
1276+ # check that the UDF is in the fsm metadata
1277+ mdobj = self.main.fs.get_by_path(udf.path)
1278+ self.assertEqual(mdobj.node_id, udf.node_id)
1279+ self.assertEqual(mdobj.share_id, udf.volume_id)
1280+ # check that there is a watch in the UDF
1281+ self.assertIn(udf.path, self.watches,
1282+ 'watch for %r should be present.' % udf.path)
1283+
1284+ @defer.inlineCallbacks
1285 def test_add_udf_calls_AQ(self):
1286 """Test that VolumeManager.add_udf calls AQ.rescan_from_scratch."""
1287 udf = self._create_udf(subscribed=True)
1288 scratch_d = defer.Deferred()
1289+
1290 def fake_rescan_from_scratch(volume_id):
1291 """A fake rescan_from_scratch that check the arguments."""
1292 self.assertEquals(udf.volume_id, volume_id)
1293@@ -1496,7 +1538,7 @@
1294 self.assertEquals(1, len(self.vm.udfs))
1295 # create a few files and directories
1296 dirs = ['dir', os.path.join('dir', 'subdir'),
1297- os.path.join('dir','empty_dir')]
1298+ os.path.join('dir', 'empty_dir')]
1299 for i, dir in enumerate(dirs):
1300 path = os.path.join(udf.path, dir)
1301 if not path_exists(path):
1302@@ -1505,8 +1547,8 @@
1303 self.main.fs.set_node_id(path, 'dir_node_id' + str(i))
1304 # add a inotify watch to the dir
1305 yield self.vm._add_watch(path)
1306- files = ['a_file', os.path.join('dir','file'),
1307- os.path.join('dir','subdir','file')]
1308+ files = ['a_file', os.path.join('dir', 'file'),
1309+ os.path.join('dir', 'subdir', 'file')]
1310 for i, file in enumerate(files):
1311 path = os.path.join(udf.path, file)
1312 self.main.fs.create(path, udf.volume_id)
1313@@ -1583,8 +1625,8 @@
1314 self.vm.handle_AQ_LIST_VOLUMES(response)
1315 yield d1
1316 yield d2
1317- self.assertEqual(2, len(self.vm.shares)) # the new share and root
1318- self.assertEqual(1, len(self.vm.udfs)) # the new udf
1319+ self.assertEqual(2, len(self.vm.shares)) # the new share and root
1320+ self.assertEqual(1, len(self.vm.udfs)) # the new udf
1321 # check that the share is in the shares dict
1322 self.assertIn(str(share_id), self.vm.shares)
1323 self.assertIn(str(udf_id), self.vm.udfs)
1324@@ -1600,8 +1642,8 @@
1325 self.assertEqual(root.node_id, str(root_volume.node_id))
1326 # now send the same list again and check
1327 self.vm.handle_AQ_LIST_VOLUMES(response)
1328- self.assertEqual(2, len(self.vm.shares)) # the share and root
1329- self.assertEqual(1, len(self.vm.udfs)) # one udf
1330+ self.assertEqual(2, len(self.vm.shares)) # the share and root
1331+ self.assertEqual(1, len(self.vm.udfs)) # one udf
1332 # check that the udf is the same.
1333 new_udf = self.vm.udfs[str(udf_id)]
1334 self.assertEqual(udf.__dict__, new_udf.__dict__)
1335@@ -1649,8 +1691,8 @@
1336 self._listen_for('VM_UDF_CREATED', d.callback)
1337 self.vm.handle_AQ_LIST_VOLUMES(response)
1338 yield d
1339- self.assertEqual(2, len(self.vm.shares)) # the new shares and root
1340- self.assertEqual(1, len(self.vm.udfs)) # the new shares and root
1341+ self.assertEqual(2, len(self.vm.shares)) # the new shares and root
1342+ self.assertEqual(1, len(self.vm.udfs)) # the new shares and root
1343 # check that the share is in the shares dict
1344 self.assertIn(str(share_id), self.vm.shares)
1345 self.assertIn(str(udf_id), self.vm.udfs)
1346@@ -1668,7 +1710,7 @@
1347 response = [root_volume]
1348 self.vm.refresh_volumes = lambda: self.fail('refresh_volumes called!')
1349 self.vm.handle_AQ_LIST_VOLUMES(response)
1350- self.assertEqual(1, len(self.vm.shares)) # the new share and root
1351+ self.assertEqual(1, len(self.vm.shares)) # the new share and root
1352 # check that the root is in the shares dict
1353 self.assertIn(request.ROOT, self.vm.shares)
1354 self.assertEqual(self.vm.shares[request.ROOT].node_id,
1355@@ -1695,8 +1737,8 @@
1356 # patch aq.rescan_from_scratch in order to intercept the calls
1357 root_from_scratch_d = defer.Deferred()
1358 share_from_scratch_d = defer.Deferred()
1359- from_scratch_deferreds = {'':root_from_scratch_d,
1360- str(share_id):share_from_scratch_d}
1361+ from_scratch_deferreds = {'': root_from_scratch_d,
1362+ str(share_id): share_from_scratch_d}
1363 self.patch(self.main.action_q, 'rescan_from_scratch',
1364 lambda vol_id: from_scratch_deferreds.pop(vol_id).callback(vol_id))
1365
1366@@ -1727,7 +1769,7 @@
1367
1368 def check():
1369 """The test itself."""
1370- self.assertEqual(2, len(self.vm.shares)) # the share and the root
1371+ self.assertEqual(2, len(self.vm.shares)) # the share and the root
1372 # check that the share is in the shares dict
1373 self.assertIn(str(share_id), self.vm.shares)
1374 share = self.vm.shares[str(share_id)]
1375@@ -1748,8 +1790,8 @@
1376 # patch aq.rescan_from_scratch in order to intercept the calls
1377 root_from_scratch_d = defer.Deferred()
1378 share_from_scratch_d = defer.Deferred()
1379- from_scratch_deferreds = {'':root_from_scratch_d,
1380- str(share_id):share_from_scratch_d}
1381+ from_scratch_deferreds = {'': root_from_scratch_d,
1382+ str(share_id): share_from_scratch_d}
1383 self.patch(self.main.action_q, 'rescan_from_scratch',
1384 lambda vol_id: from_scratch_deferreds.pop(vol_id).callback(vol_id))
1385
1386@@ -1793,8 +1835,8 @@
1387 # patch aq.rescan_from_scratch in order to intercept the calls
1388 root_from_scratch_d = defer.Deferred()
1389 share_from_scratch_d = defer.Deferred()
1390- from_scratch_deferreds = {'':root_from_scratch_d,
1391- str(udf_id):share_from_scratch_d}
1392+ from_scratch_deferreds = {'': root_from_scratch_d,
1393+ str(udf_id): share_from_scratch_d}
1394 self.patch(self.main.action_q, 'rescan_from_scratch',
1395 lambda vol_id: from_scratch_deferreds.pop(vol_id).callback(vol_id))
1396
1397@@ -1824,7 +1866,7 @@
1398
1399 def check():
1400 """The test itself."""
1401- self.assertEqual(1, len(self.vm.udfs)) # the new udf
1402+ self.assertEqual(1, len(self.vm.udfs)) # the new udf
1403 # check that the UDF is in the udfs dict
1404 self.assertIn(str(udf_id), self.vm.udfs)
1405 udf = self.vm.udfs[str(udf_id)]
1406@@ -1879,11 +1921,13 @@
1407 udf_id = uuid.uuid4()
1408 node_id = uuid.uuid4()
1409 # patch AQ.create_udf
1410+
1411 def create_udf(path, name, marker):
1412 """Fake create_udf"""
1413 self.main.event_q.push("AQ_CREATE_UDF_OK", volume_id=udf_id,
1414 node_id=node_id, marker=marker)
1415 self.main.action_q.create_udf = create_udf
1416+
1417 def check(info):
1418 """Check the udf attributes."""
1419 udf = info['udf']
1420@@ -1908,6 +1952,7 @@
1421 d = defer.Deferred()
1422 path = get_udf_path(u"~/ñoño/mirá que lindo mi udf")
1423 # patch AQ.create_udf
1424+
1425 def create_udf(path, name, marker):
1426 """Fake create_udf"""
1427 d.callback((path, name))
1428@@ -1934,10 +1979,12 @@
1429 yield self.vm.add_share(share)
1430 d = defer.Deferred()
1431 # patch AQ.delete_volume
1432+
1433 def delete_volume(volume_id, path):
1434 """Fake delete_volume"""
1435 self.main.event_q.push("AQ_DELETE_VOLUME_OK", volume_id=volume_id)
1436 self.main.action_q.delete_volume = delete_volume
1437+
1438 def check_udf(info):
1439 """Check the udf attributes."""
1440 deleted_udf = info['volume']
1441@@ -1972,6 +2019,7 @@
1442 yield self.vm.add_udf(udf)
1443 d = defer.Deferred()
1444 # patch AQ.delete_volume
1445+
1446 def delete_volume(volume_id, path):
1447 """Fake AQ.delete_volume."""
1448 if volume_id == udf.volume_id and path == udf.path:
1449@@ -2042,6 +2090,7 @@
1450 def _test_subscribe_udf_generations(self, udf):
1451 """Test subscribe_udf with a generation."""
1452 scratch_d = defer.Deferred()
1453+
1454 def fake_rescan_from_scratch(volume_id):
1455 """A fake rescan_from_scratch that check the arguments."""
1456 self.assertEquals(udf.volume_id, volume_id)
1457@@ -2092,8 +2141,8 @@
1458 yield self.vm.add_udf(udf)
1459 self.assertTrue(self.vm.udfs[udf.volume_id].subscribed)
1460 # create a few files and directories
1461- dirs = ['dir', os.path.join('dir','subdir'),
1462- os.path.join('dir','empty_dir')]
1463+ dirs = ['dir', os.path.join('dir', 'subdir'),
1464+ os.path.join('dir', 'empty_dir')]
1465 for i, dir in enumerate(dirs):
1466 path = os.path.join(udf.path, dir)
1467 if not path_exists(path):
1468@@ -2102,8 +2151,8 @@
1469 self.main.fs.set_node_id(path, 'dir_node_id' + str(i))
1470 # add a inotify watch to the dir
1471 yield self.vm._add_watch(path)
1472- files = ['a_file', os.path.join('dir','file'),
1473- os.path.join('dir','subdir','file')]
1474+ files = ['a_file', os.path.join('dir', 'file'),
1475+ os.path.join('dir', 'subdir', 'file')]
1476 for i, file in enumerate(files):
1477 path = os.path.join(udf.path, file)
1478 open_file(path, 'w').close()
1479@@ -2141,7 +2190,7 @@
1480 self.assertTrue(self.vm.udfs[udf.volume_id].subscribed)
1481 # create a few files and directories
1482 dirs = ['dir', os.path.join('dir', 'subdir'),
1483- os.path.join('dir','empty_dir')]
1484+ os.path.join('dir', 'empty_dir')]
1485 for i, path in enumerate(dirs):
1486 path = os.path.join(udf.path, path)
1487 if not path_exists(path):
1488@@ -2264,6 +2313,7 @@
1489 udf_id = uuid.uuid4()
1490 node_id = uuid.uuid4()
1491 # patch AQ.create_udf
1492+
1493 def create_udf(path, name, marker):
1494 """Fake create_udf"""
1495 self.main.event_q.push("AQ_CREATE_UDF_OK", volume_id=udf_id,
1496@@ -2289,6 +2339,7 @@
1497 d = defer.Deferred()
1498 path = get_udf_path(u'~/ñoño')
1499 # patch AQ.create_udf
1500+
1501 def create_udf(path, name, marker):
1502 """Fake create_udf"""
1503 self.main.event_q.push("AQ_CREATE_UDF_ERROR",
1504@@ -2297,6 +2348,7 @@
1505 self._listen_for('VM_UDF_CREATE_ERROR', d.callback)
1506 # fake VM state, call create_udf
1507 self.vm.create_udf(path)
1508+
1509 def check(info):
1510 """The callback"""
1511 self.assertEquals(info['path'], path)
1512@@ -2312,10 +2364,12 @@
1513 yield self.vm.add_udf(udf)
1514 d = defer.Deferred()
1515 # patch AQ.delete_volume
1516+
1517 def delete_volume(vol_id, path):
1518 """Fake delete_volume"""
1519 self.main.event_q.push("AQ_DELETE_VOLUME_OK", volume_id=vol_id)
1520 self.main.action_q.delete_volume = delete_volume
1521+
1522 def check(info):
1523 """Check the udf attributes."""
1524 deleted_udf = info['volume']
1525@@ -2336,11 +2390,13 @@
1526 yield self.vm.add_udf(udf)
1527 d = defer.Deferred()
1528 # patch AQ.delete_volume
1529+
1530 def delete_volume(vol_id, path):
1531 """Fake delete_volume"""
1532 self.main.event_q.push("AQ_DELETE_VOLUME_ERROR",
1533 volume_id=vol_id, error="ERROR!")
1534 self.main.action_q.delete_volume = delete_volume
1535+
1536 def check(info):
1537 """Check the udf attributes."""
1538 deleted_udf, error = info['volume_id'], info['error']
1539@@ -2429,7 +2485,8 @@
1540 d = defer.Deferred()
1541 self._listen_for('VM_UDF_CREATED', d.callback)
1542 rescan_cb = defer.Deferred()
1543- self.patch(self.main.action_q, 'rescan_from_scratch', rescan_cb.callback)
1544+ self.patch(self.main.action_q, 'rescan_from_scratch',
1545+ rescan_cb.callback)
1546
1547 self.vm.handle_SV_VOLUME_CREATED(udf_volume)
1548 info = yield d
1549@@ -2669,6 +2726,7 @@
1550 self.vm._got_root('root_uuid')
1551 udf_path = os.path.join(self.root_dir, 'udf_inside_root')
1552 # patch FakeAQ
1553+
1554 def create_udf(path, name, marker):
1555 """Fake create_udf"""
1556 d = dict(volume_id=uuid.uuid4(), node_id=uuid.uuid4(),
1557@@ -2692,6 +2750,7 @@
1558 udf_child = os.path.join(udf.path, 'd')
1559 yield self.vm.add_udf(udf)
1560 # patch FakeAQ
1561+
1562 def create_udf(path, name, marker):
1563 """Fake create_udf"""
1564 self.main.event_q.push("AQ_CREATE_UDF_OK", volume_id=uuid.uuid4(),
1565@@ -2713,6 +2772,7 @@
1566 yield self.vm.add_udf(udf)
1567 udf_parent_path = os.path.dirname(udf.path)
1568 # patch FakeAQ
1569+
1570 def create_udf(path, name, marker):
1571 """Fake create_udf."""
1572 self.main.event_q.push("AQ_CREATE_UDF_OK", volume_id=uuid.uuid4(),
1573@@ -2784,8 +2844,8 @@
1574 self._listen_for('VM_UDF_CREATED', d.callback)
1575 self.vm.handle_AQ_LIST_VOLUMES(response)
1576 yield d
1577- self.assertEquals(2, len(self.vm.shares)) # the new share and root
1578- self.assertEquals(1, len(self.vm.udfs)) # the new udf
1579+ self.assertEquals(2, len(self.vm.shares)) # the new share and root
1580+ self.assertEquals(1, len(self.vm.udfs)) # the new udf
1581 shared_id = uuid.uuid4()
1582 shared_response = ShareResponse.from_params(shared_id, 'from_me',
1583 'fake_share_uuid',
1584@@ -2796,7 +2856,7 @@
1585 shares_response.shares = [shared_response]
1586 self.vm.handle_AQ_SHARES_LIST(shares_response)
1587 # check that all the shares are still there
1588- self.assertEquals(2, len(self.vm.shares)) # the new share and root
1589+ self.assertEquals(2, len(self.vm.shares)) # the new share and root
1590
1591 @defer.inlineCallbacks
1592 def test_handle_SV_FREE_SPACE(self):
1593@@ -2811,6 +2871,7 @@
1594 yield self.vm.add_udf(udf)
1595 # override AQ.check_conditions
1596 d = defer.Deferred()
1597+
1598 def check_conditions():
1599 """Fake check_conditions that just keep count of calls."""
1600 if d.called:
1601@@ -2882,6 +2943,7 @@
1602 real_udf_path = os.path.join(self.home_dir, "my_udf")
1603 udf_path = os.path.join(self.home_dir, "MyUDF")
1604 # patch FakeAQ
1605+
1606 def create_udf(path, name, marker):
1607 """Fake create_udf"""
1608 self.main.event_q.push("AQ_CREATE_UDF_OK", volume_id=uuid.uuid4(),
1609@@ -2907,6 +2969,7 @@
1610 real_udf_path = os.path.join(self.home_dir, "udf_parent", "my_udf")
1611 udf_path = os.path.join(self.home_dir, "MyUDF")
1612 # patch FakeAQ
1613+
1614 def create_udf(path, name, marker):
1615 """Fake create_udf"""
1616 self.main.event_q.push("AQ_CREATE_UDF_OK", volume_id=uuid.uuid4(),
1617@@ -2968,8 +3031,8 @@
1618 # patch aq.rescan_from_scratch in order to intercept the calls
1619 root_from_scratch_d = defer.Deferred()
1620 share_from_scratch_d = defer.Deferred()
1621- from_scratch_deferreds = {'':root_from_scratch_d,
1622- str(share_id):share_from_scratch_d}
1623+ from_scratch_deferreds = {'': root_from_scratch_d,
1624+ str(share_id): share_from_scratch_d}
1625 self.patch(self.main.action_q, 'rescan_from_scratch',
1626 lambda vol_id: from_scratch_deferreds.pop(vol_id).callback(vol_id))
1627
1628@@ -3010,8 +3073,8 @@
1629 # patch aq.rescan_from_scratch in order to intercept the calls
1630 root_from_scratch_d = defer.Deferred()
1631 udf_from_scratch_d = defer.Deferred()
1632- from_scratch_deferreds = {'':root_from_scratch_d,
1633- str(udf_id):udf_from_scratch_d}
1634+ from_scratch_deferreds = {'': root_from_scratch_d,
1635+ str(udf_id): udf_from_scratch_d}
1636 self.patch(self.main.action_q, 'rescan_from_scratch',
1637 lambda vol_id: from_scratch_deferreds.pop(vol_id).callback(vol_id))
1638
1639@@ -3059,9 +3122,9 @@
1640 root_from_scratch_d = defer.Deferred()
1641 share_from_scratch_d = defer.Deferred()
1642 udf_from_scratch_d = defer.Deferred()
1643- from_scratch_deferreds = {'':root_from_scratch_d,
1644- str(share_id):share_from_scratch_d,
1645- str(udf_id):udf_from_scratch_d}
1646+ from_scratch_deferreds = {'': root_from_scratch_d,
1647+ str(share_id): share_from_scratch_d,
1648+ str(udf_id): udf_from_scratch_d}
1649 self.patch(self.main.action_q, 'rescan_from_scratch',
1650 lambda vol_id: from_scratch_deferreds.pop(vol_id).callback(vol_id))
1651
1652@@ -3092,7 +3155,8 @@
1653 def test_server_rescan_error(self):
1654 """Test the server_rescan method."""
1655 # patch fake action queue
1656- self.main.action_q.query_volumes = lambda: defer.fail(Exception('foo bar'))
1657+ self.main.action_q.query_volumes = lambda: defer.fail(
1658+ Exception('foo bar'))
1659 d = defer.Deferred()
1660 self._listen_for('SYS_SERVER_RESCAN_ERROR', d.callback)
1661 yield self.vm.server_rescan()
1662@@ -3101,6 +3165,7 @@
1663 # patch fake action queue
1664 self.main.action_q.query_volumes = lambda: defer.succeed([])
1665 # patch volume manager
1666+
1667 def broken_volumes_rescan_cb(_):
1668 raise ValueError('die!')
1669 self.vm._volumes_rescan_cb = broken_volumes_rescan_cb
1670@@ -3123,7 +3188,7 @@
1671 yield self.vm.server_rescan()
1672 called = yield d
1673 self.assertTrue(called)
1674- test_refresh_shares_called_after_server_rescan.timeout = 1
1675+ test_refresh_shares_called_after_server_rescan.timeout = 1
1676
1677 @defer.inlineCallbacks
1678 def test_server_rescan_clean_dead_udf(self):
1679@@ -3262,7 +3327,7 @@
1680 for event in events)
1681 self.assertIn(str(share_id), events_dict)
1682 self.assertIn(str(udf_id), events_dict)
1683- self.assertNotIn(request.ROOT, events_dict) # same gen as metadata
1684+ self.assertNotIn(request.ROOT, events_dict) # same gen as metadata
1685 self.assertEqual(10, events_dict[str(share_id)])
1686 self.assertEqual(5, events_dict[str(udf_id)])
1687 # now only change the root volume generation
1688@@ -3278,7 +3343,7 @@
1689 for event in events)
1690 self.assertNotIn(str(share_id), events_dict)
1691 self.assertNotIn(str(udf_id), events_dict)
1692- self.assertIn(request.ROOT, events_dict) # same generation as metadata
1693+ self.assertIn(request.ROOT, events_dict) # same generation as metadata
1694 self.assertEqual(100, events_dict[request.ROOT])
1695
1696 @defer.inlineCallbacks
1697@@ -3368,7 +3433,8 @@
1698 # delete the fsm metadata
1699 self.main.fs.delete_metadata(udf.path)
1700 d = defer.Deferred()
1701- self._listen_for('SV_VOLUME_NEW_GENERATION', d.callback, 2, collect=True)
1702+ self._listen_for('SV_VOLUME_NEW_GENERATION', d.callback, 2,
1703+ collect=True)
1704 self.patch(self.vm, '_scan_volume', defer.succeed)
1705 self.vm._volumes_rescan_cb(response)
1706 events = yield d
1707@@ -3441,7 +3507,8 @@
1708 yield self.vm.add_udf(udf)
1709 self.vm.update_generation(udf.volume_id, 10)
1710 d = defer.Deferred()
1711- self.patch(self.main.action_q, 'get_delta', lambda v, g: d.callback((v, g)))
1712+ self.patch(self.main.action_q, 'get_delta',
1713+ lambda v, g: d.callback((v, g)))
1714 self.main.event_q.push('SV_VOLUME_NEW_GENERATION',
1715 volume_id=udf.volume_id, generation=100)
1716 vol_id, gen = yield d
1717@@ -3501,7 +3568,8 @@
1718 self.vm._got_root('root_node_id')
1719 self.vm.update_generation(root.volume_id, 10)
1720 d = defer.Deferred()
1721- self.patch(self.main.action_q, 'get_delta', lambda v, g: d.callback((v, g)))
1722+ self.patch(self.main.action_q, 'get_delta',
1723+ lambda v, g: d.callback((v, g)))
1724 self.main.event_q.push('SV_VOLUME_NEW_GENERATION',
1725 volume_id=root.volume_id, generation=100)
1726 vol_id, gen = yield d
1727@@ -3544,7 +3612,8 @@
1728 yield self.vm.add_share(share)
1729 self.vm.update_generation(share.volume_id, 10)
1730 d = defer.Deferred()
1731- self.patch(self.main.action_q, 'get_delta', lambda v, g: d.callback((v, g)))
1732+ self.patch(self.main.action_q, 'get_delta',
1733+ lambda v, g: d.callback((v, g)))
1734 self.main.event_q.push('SV_VOLUME_NEW_GENERATION',
1735 volume_id=share.volume_id, generation=100)
1736 vol_id, gen = yield d
1737@@ -3650,7 +3719,7 @@
1738 response = ListShares(None)
1739 response.shares = [share_response]
1740 self.vm.handle_AQ_SHARES_LIST(response)
1741- self.assertEquals(1, len(self.vm.shared)) # the new shares and root
1742+ self.assertEquals(1, len(self.vm.shared)) # the new shares and root
1743 # check that the share is in the shares dict
1744 self.assertIn(str(share_id), self.vm.shared)
1745 shared = self.vm.shared[str(share_id)]
1746@@ -3674,7 +3743,8 @@
1747 """Create some directories."""
1748 yield super(MetadataTestCase, self).setUp()
1749 self.home_dir = self.mktemp('ubuntuonehacker')
1750- self.root_dir = self.mktemp(os.path.join('ubuntuonehacker', 'Ubuntu One'))
1751+ self.root_dir = self.mktemp(
1752+ os.path.join('ubuntuonehacker', 'Ubuntu One'))
1753 self.data_dir = os.path.join(self.tmpdir, 'data_dir')
1754 self.vm_data_dir = os.path.join(self.tmpdir, 'data_dir', 'vm')
1755 self.partials_dir = self.mktemp('partials')
1756@@ -3708,8 +3778,10 @@
1757 @defer.inlineCallbacks
1758 def setUp(self):
1759 yield super(GenerationsMetadataTestCase, self).setUp()
1760- self.share_md_dir = self.mktemp(os.path.join(self.vm_data_dir, 'shares'))
1761- self.shared_md_dir = self.mktemp(os.path.join(self.vm_data_dir, 'shared'))
1762+ self.share_md_dir = self.mktemp(
1763+ os.path.join(self.vm_data_dir, 'shares'))
1764+ self.shared_md_dir = self.mktemp(
1765+ os.path.join(self.vm_data_dir, 'shared'))
1766 self.shares_dir = self.mktemp('shares')
1767 self.shares_dir_link = os.path.join(self.u1_dir, 'Shared With Me')
1768 make_link(self.shares_dir, self.shares_dir_link)
1769@@ -3735,7 +3807,7 @@
1770 legacy_shares[root.volume_id] = root.__dict__
1771 udf = UDF(volume_id=str(uuid.uuid4()), node_id='udf_node_id',
1772 suggested_path='~/UDF',
1773- path=os.path.join('a','fake','UDF'))
1774+ path=os.path.join('a', 'fake', 'UDF'))
1775 udf.__dict__.pop('generation')
1776 legacy_udfs[udf.volume_id] = udf.__dict__
1777 shares = VMFileShelf(self.share_md_dir)
1778@@ -3770,8 +3842,10 @@
1779 def setUp(self):
1780 """Create the MetadataUpgrader instance."""
1781 yield super(MetadataUpgraderTests, self).setUp()
1782- self.share_md_dir = self.mktemp(os.path.join(self.vm_data_dir, 'shares'))
1783- self.shared_md_dir = self.mktemp(os.path.join(self.vm_data_dir, 'shared'))
1784+ self.share_md_dir = self.mktemp(
1785+ os.path.join(self.vm_data_dir, 'shares'))
1786+ self.shared_md_dir = self.mktemp(
1787+ os.path.join(self.vm_data_dir, 'shared'))
1788 self.udfs_md_dir = os.path.join(self.vm_data_dir, 'udfs')
1789 self._tritcask_dir = self.mktemp('tritcask')
1790 self.shares_dir = self.mktemp('shares')
1791@@ -3786,6 +3860,7 @@
1792 self.udfs_md_dir, self.root_dir,
1793 self.shares_dir,
1794 self.shares_dir_link, self.db)
1795+
1796 @defer.inlineCallbacks
1797 def tearDown(self):
1798 """Restorre _get_md_version"""
1799@@ -3801,7 +3876,7 @@
1800 self.root_dir, self.shares_dir]:
1801 if path_exists(path):
1802 self.rmtree(path)
1803- make_dir(os.path.join(self.root_dir, 'My Files'),recursive=True)
1804+ make_dir(os.path.join(self.root_dir, 'My Files'), recursive=True)
1805 shares_dir = os.path.join(self.root_dir, 'Shared With Me')
1806 make_dir(shares_dir, recursive=True)
1807 set_dir_readonly(self.root_dir)
1808@@ -3843,7 +3918,7 @@
1809 """Test _guess_metadata_version method for version 5."""
1810 # fake a version 5 layout and metadata
1811 shelf = LegacyShareFileShelf(self.share_md_dir)
1812- shelf['foobar'] = _Share(path=os.path.join('foo','bar'),
1813+ shelf['foobar'] = _Share(path=os.path.join('foo', 'bar'),
1814 share_id='foobar')
1815 version = self.md_upgrader._guess_metadata_version()
1816 self.assertEquals(version, '5')
1817@@ -3852,7 +3927,7 @@
1818 """Test _guess_metadata_version method for version 6."""
1819 # fake a version 6 layout and metadata
1820 shelf = VMFileShelf(self.share_md_dir)
1821- shelf['foobar'] = Share(path=os.path.join('foo','bar'),
1822+ shelf['foobar'] = Share(path=os.path.join('foo', 'bar'),
1823 volume_id='foobar')
1824 version = self.md_upgrader._guess_metadata_version()
1825 self.assertEquals(version, '6')
1826@@ -3861,9 +3936,9 @@
1827 """Test _guess_metadata_version method for mixed version 5 and 6."""
1828 # fake a version 6 layout and metadata
1829 shelf = LegacyShareFileShelf(self.share_md_dir)
1830- shelf['old_share'] = _Share(path=os.path.join('foo','bar'),
1831+ shelf['old_share'] = _Share(path=os.path.join('foo', 'bar'),
1832 share_id='old_share')
1833- shelf['new_share'] = Share(path=os.path.join('bar','foo'),
1834+ shelf['new_share'] = Share(path=os.path.join('bar', 'foo'),
1835 volume_id='new_share').__dict__
1836 version = self.md_upgrader._guess_metadata_version()
1837 self.assertEquals(version, '5')
1838@@ -3883,7 +3958,7 @@
1839 def test_upgrade_metadata_5_no_os_rename(self):
1840 """Test that when we upgrade we use the os_helper.rename."""
1841 shelf = LegacyShareFileShelf(self.share_md_dir)
1842- shelf['foobar'] = _Share(path=os.path.join('foo','bar'),
1843+ shelf['foobar'] = _Share(path=os.path.join('foo', 'bar'),
1844 share_id='foobar')
1845 mocker = Mocker()
1846 # ensure that we do use the platform method and not the renamed one
1847@@ -3945,8 +4020,8 @@
1848 def test_store_dicts(self):
1849 """Test that the info stored for a volume is actually a dict."""
1850 udf = UDF(volume_id=str(uuid.uuid4()), node_id='udf_node_id',
1851- suggested_path= '~/UDF',
1852- path=os.path.join('a', 'fake','UDF'))
1853+ suggested_path='~/UDF',
1854+ path=os.path.join('a', 'fake', 'UDF'))
1855 self.shelf[udf.volume_id] = udf
1856 self.assertEqual(udf, self.shelf[udf.volume_id])
1857 pickled_dict = self.shelf._db.get(1000, udf.volume_id)
1858
1859=== modified file 'tests/syncdaemon/test_vm_helper.py'
1860--- tests/syncdaemon/test_vm_helper.py 2011-08-17 20:20:21 +0000
1861+++ tests/syncdaemon/test_vm_helper.py 2011-12-19 20:54:24 +0000
1862@@ -26,7 +26,7 @@
1863 from tests.syncdaemon.test_vm import BaseVolumeManagerTests
1864
1865 from contrib.testing.testcase import BaseTwistedTestCase
1866-from ubuntuone.platform import os_helper
1867+from ubuntuone.platform import expand_user, os_helper
1868 from ubuntuone.syncdaemon import vm_helper
1869 from ubuntuone.syncdaemon.vm_helper import (
1870 create_shares_link,
1871@@ -46,7 +46,7 @@
1872
1873 path = get_udf_path(suggested_path)
1874 expected = suggested_path.replace(u'/', os.path.sep)
1875- expected = os.path.expanduser(expected.encode('utf8'))
1876+ expected = expand_user(expected.encode('utf8'))
1877 self.assertEqual(path, expected)
1878
1879 def test_get_udf_path(self):
1880
1881=== modified file 'ubuntuone/platform/__init__.py'
1882--- ubuntuone/platform/__init__.py 2011-10-05 21:08:10 +0000
1883+++ ubuntuone/platform/__init__.py 2011-12-19 20:54:24 +0000
1884@@ -1,5 +1,26 @@
1885+# -*- encoding: utf-8 -*-
1886+#
1887+# Copyright 2009-2011 Canonical Ltd.
1888+#
1889+# This program is free software: you can redistribute it and/or modify it
1890+# under the terms of the GNU General Public License version 3, as published
1891+# by the Free Software Foundation.
1892+#
1893+# This program is distributed in the hope that it will be useful, but
1894+# WITHOUT ANY WARRANTY; without even the implied warranties of
1895+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
1896+# PURPOSE. See the GNU General Public License for more details.
1897+#
1898+# You should have received a copy of the GNU General Public License along
1899+# with this program. If not, see <http://www.gnu.org/licenses/>.
1900+
1901+"""Platform specific bindings."""
1902+
1903+import os
1904 import sys
1905
1906+from ubuntu_sso.xdg_base_directory import xdg_home
1907+
1908 # very hackish way to avoid "import *" to satisfy pyflakes
1909 # and to avoid import ubuntuone.platform.X as source (it wont work)
1910
1911@@ -13,6 +34,68 @@
1912 from ubuntuone.platform import credentials
1913 from ubuntuone.platform import tools
1914
1915-target = sys.modules[__name__]
1916-for k in dir(source):
1917- setattr(target, k, getattr(source, k))
1918+
1919+def expand_user(path):
1920+ """Fix Python expanduser for weird chars in windows."""
1921+    # Receives bytes, returns bytes (both utf-8 encoded)
1922+ assert isinstance(path, str)
1923+ try:
1924+ path.decode('utf-8')
1925+ except UnicodeDecodeError:
1926+ raise AssertionError('The path %r must be encoded in utf-8' % path)
1927+ tilde = '~'
1928+ if not path.startswith(tilde) or \
1929+ (len(path) > 1 and path[1:2] != os.path.sep):
1930+ return path
1931+ result = path.replace('~', xdg_home, 1)
1932+
1933+ assert isinstance(result, str)
1934+ try:
1935+ result.decode('utf-8')
1936+ except UnicodeDecodeError:
1937+ raise AssertionError('The path %r must be encoded in utf-8' % result)
1938+ return result
1939+
1940+
1941+platform = source.platform
1942+access = source.access
1943+allow_writes = source.allow_writes
1944+can_write = source.can_write
1945+get_path_list = source.get_path_list
1946+is_link = source.is_link
1947+is_root = source.is_root
1948+listdir = source.listdir
1949+make_dir = source.make_dir
1950+make_link = source.make_link
1951+move_to_trash = source.move_to_trash
1952+native_rename = source.native_rename
1953+normpath = source.normpath
1954+open_file = source.open_file
1955+path_exists = source.path_exists
1956+read_link = source.read_link
1957+recursive_move = source.recursive_move
1958+remove_dir = source.remove_dir
1959+remove_file = source.remove_file
1960+remove_link = source.remove_link
1961+remove_tree = source.remove_tree
1962+rename = source.rename
1963+set_application_name = source.set_application_name
1964+set_dir_readonly = source.set_dir_readonly
1965+set_dir_readwrite = source.set_dir_readwrite
1966+set_file_readonly = source.set_file_readonly
1967+set_file_readwrite = source.set_file_readwrite
1968+set_no_rights = source.set_no_rights
1969+stat_path = source.stat_path
1970+walk = source.walk
1971+
1972+# From Logger
1973+setup_filesystem_logging = source.setup_filesystem_logging
1974+get_filesystem_logger = source.get_filesystem_logger
1975+
1976+# From File System Notifications
1977+FilesystemMonitor = source.FilesystemMonitor
1978+_GeneralINotifyProcessor = source._GeneralINotifyProcessor
1979+
1980+# IPC
1981+ExternalInterface = source.ExternalInterface
1982+is_already_running = source.is_already_running
1983
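
The new expand_user above keeps syncdaemon's bytes-in/bytes-out contract and only replaces a leading '~'. A minimal Python 2 sketch of that contract, with os.path.expanduser('~') standing in for the xdg_home value the real helper takes from ubuntu_sso:

    import os


    def expand_user_sketch(path):
        """Mimic the bytes-in/bytes-out contract of expand_user above."""
        assert isinstance(path, str)       # Python 2: str is bytes
        path.decode('utf-8')               # must be valid utf-8, else this raises
        if not path.startswith('~') or (len(path) > 1 and
                                        path[1:2] != os.path.sep):
            return path                    # '~someuser' and plain paths pass through
        home = os.path.expanduser('~')     # stand-in for ubuntu_sso's xdg_home
        return path.replace('~', home, 1)


    print(expand_user_sketch(os.path.join('~', 'Ubuntu One')))
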
1984=== added file 'ubuntuone/platform/constants.py'
1985--- ubuntuone/platform/constants.py 1970-01-01 00:00:00 +0000
1986+++ ubuntuone/platform/constants.py 2011-12-19 20:54:24 +0000
1987@@ -0,0 +1,27 @@
1988+# -*- coding: utf-8 *-*
1989+#
1990+# Copyright 2011 Canonical Ltd.
1991+#
1992+# This program is free software: you can redistribute it and/or modify it
1993+# under the terms of the GNU General Public License version 3, as published
1994+# by the Free Software Foundation.
1995+#
1996+# This program is distributed in the hope that it will be useful, but
1997+# WITHOUT ANY WARRANTY; without even the implied warranties of
1998+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
1999+# PURPOSE. See the GNU General Public License for more details.
2000+#
2001+# You should have received a copy of the GNU General Public License along
2002+# with this program. If not, see <http://www.gnu.org/licenses/>.
2003+
2004+"""Constants for the proper platform."""
2005+
2006+# To start hashing a file, no new hash request should arrive
2007+# for it within this amount of seconds:
2008+
2009+import sys
2010+
2011+if sys.platform == "win32":
2012+ HASHQUEUE_DELAY = 3.0
2013+else:
2014+ HASHQUEUE_DELAY = 0.5
2015
2016=== added file 'ubuntuone/platform/event_logging.py'
2017--- ubuntuone/platform/event_logging.py 1970-01-01 00:00:00 +0000
2018+++ ubuntuone/platform/event_logging.py 2011-12-19 20:54:24 +0000
2019@@ -0,0 +1,30 @@
2020+# -*- coding: utf-8 *-*
2021+#
2022+# Copyright 2011 Canonical Ltd.
2023+#
2024+# This program is free software: you can redistribute it and/or modify it
2025+# under the terms of the GNU General Public License version 3, as published
2026+# by the Free Software Foundation.
2027+#
2028+# This program is distributed in the hope that it will be useful, but
2029+# WITHOUT ANY WARRANTY; without even the implied warranties of
2030+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
2031+# PURPOSE. See the GNU General Public License for more details.
2032+#
2033+# You should have received a copy of the GNU General Public License along
2034+# with this program. If not, see <http://www.gnu.org/licenses/>.
2035+
2036+"""Builds a syncdaemon listener that logs events if ZG is installed."""
2037+
2038+import sys
2039+
2040+
2041+if sys.platform == "win32":
2042+ from ubuntuone.platform.windows import event_logging
2043+ source = event_logging
2044+else:
2045+ from ubuntuone.platform.linux import event_logging
2046+ source = event_logging
2047+ is_zeitgeist_installed = source.is_zeitgeist_installed
2048+
2049+get_listener = source.get_listener
2050
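
event_logging above, and the launcher, messaging, notification, os_helper and session facades added below, all share one shape: pick the backend module by sys.platform at import time, then re-export a fixed set of names. A schematic, self-contained sketch of that pattern; ntpath and posixpath are only stand-ins for the real windows and linux backends:

    import sys

    if sys.platform == "win32":
        import ntpath as source      # stand-in for a ubuntuone.platform.windows module
    else:
        import posixpath as source   # stand-in for a ubuntuone.platform.linux module

    # only names that both backends provide become the platform-neutral API
    join = source.join
    isabs = source.isabs

    print(join('shares', 'link'))
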
2051=== added file 'ubuntuone/platform/launcher.py'
2052--- ubuntuone/platform/launcher.py 1970-01-01 00:00:00 +0000
2053+++ ubuntuone/platform/launcher.py 2011-12-19 20:54:24 +0000
2054@@ -0,0 +1,31 @@
2055+# -*- coding: utf-8 *-*
2056+#
2057+# Copyright 2011 Canonical Ltd.
2058+#
2059+# This program is free software: you can redistribute it and/or modify it
2060+# under the terms of the GNU General Public License version 3, as published
2061+# by the Free Software Foundation.
2062+#
2063+# This program is distributed in the hope that it will be useful, but
2064+# WITHOUT ANY WARRANTY; without even the implied warranties of
2065+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
2066+# PURPOSE. See the GNU General Public License for more details.
2067+#
2068+# You should have received a copy of the GNU General Public License along
2069+# with this program. If not, see <http://www.gnu.org/licenses/>.
2070+"""Use libunity to show a progressbar and emblems on the launcher icon."""
2071+
2072+import sys
2073+
2074+
2075+if sys.platform == "win32":
2076+ from ubuntuone.platform.windows import launcher
2077+ source = launcher
2078+else:
2079+ from ubuntuone.platform.linux import launcher
2080+ source = launcher
2081+
2082+U1_DOTDESKTOP = source.U1_DOTDESKTOP
2083+
2084+UbuntuOneLauncher = source.UbuntuOneLauncher
2085+DummyLauncher = source.DummyLauncher
2086
2087=== modified file 'ubuntuone/platform/linux/__init__.py'
2088--- ubuntuone/platform/linux/__init__.py 2011-10-21 15:49:18 +0000
2089+++ ubuntuone/platform/linux/__init__.py 2011-12-19 20:54:24 +0000
2090@@ -20,6 +20,7 @@
2091
2092 platform = "linux"
2093
2094+
2095 from ubuntuone.platform.linux.os_helper import (
2096 access,
2097 allow_writes,
2098
2099=== modified file 'ubuntuone/platform/linux/messaging.py'
2100--- ubuntuone/platform/linux/messaging.py 2011-07-06 16:59:58 +0000
2101+++ ubuntuone/platform/linux/messaging.py 2011-12-19 20:54:24 +0000
2102@@ -40,8 +40,6 @@
2103 from ubuntuone.status.messaging import AbstractMessaging
2104 from ubuntuone.status.logger import logger
2105
2106-APPLICATION_NAME = 'Ubuntu One Client'
2107-
2108
2109 # pylint: disable=W0613
2110 def open_volumes():
2111
2112=== modified file 'ubuntuone/platform/linux/session.py'
2113--- ubuntuone/platform/linux/session.py 2011-10-14 20:02:23 +0000
2114+++ ubuntuone/platform/linux/session.py 2011-12-19 20:54:24 +0000
2115@@ -21,16 +21,16 @@
2116
2117 from twisted.internet import defer
2118
2119-SESSION_MANAGER_BUSNAME = "org.gnome.SessionManager"
2120-SESSION_MANAGER_IFACE = "org.gnome.SessionManager"
2121-SESSION_MANAGER_PATH = "/org/gnome/SessionManager"
2122-
2123 INHIBIT_LOGGING_OUT = 1
2124 INHIBIT_USER_SWITCHING = 2
2125 INHIBIT_SUSPENDING_COMPUTER = 4
2126 INHIBIT_SESSION_IDLE = 8
2127 INHIBIT_LOGOUT_SUSPEND = INHIBIT_LOGGING_OUT | INHIBIT_SUSPENDING_COMPUTER
2128
2129+SESSION_MANAGER_BUSNAME = "org.gnome.SessionManager"
2130+SESSION_MANAGER_IFACE = "org.gnome.SessionManager"
2131+SESSION_MANAGER_PATH = "/org/gnome/SessionManager"
2132+
2133 APP_ID = "Ubuntu One"
2134 TOPLEVEL_XID = 0
2135
2136
2137=== added file 'ubuntuone/platform/messaging.py'
2138--- ubuntuone/platform/messaging.py 1970-01-01 00:00:00 +0000
2139+++ ubuntuone/platform/messaging.py 2011-12-19 20:54:24 +0000
2140@@ -0,0 +1,42 @@
2141+# -*- coding: utf-8 *-*
2142+# ubuntuone.syncdaemon.platform.messaging - Messages to the user
2143+#
2144+# Copyright 2011 Canonical Ltd.
2145+#
2146+# This program is free software: you can redistribute it and/or modify it
2147+# under the terms of the GNU General Public License version 3, as published
2148+# by the Free Software Foundation.
2149+#
2150+# This program is distributed in the hope that it will be useful, but
2151+# WITHOUT ANY WARRANTY; without even the implied warranties of
2152+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
2153+# PURPOSE. See the GNU General Public License for more details.
2154+#
2155+# You should have received a copy of the GNU General Public License along
2156+# with this program. If not, see <http://www.gnu.org/licenses/>.
2157+"""Module that implements sending messages to the end user."""
2158+
2159+# TODO: We may want to enable different messaging systems. When none
2160+# of them are available, we should fall back to silently discarding
2161+# messages.
2162+
2163+import sys
2164+
2165+
2166+APPLICATION_NAME = 'Ubuntu One Client'
2167+
2168+
2169+if sys.platform == "win32":
2170+ from ubuntuone.platform.windows import messaging
2171+ source = messaging
2172+ hide_message = source.hide_message
2173+else:
2174+ from ubuntuone.platform.linux import messaging
2175+ source = messaging
2176+ DBUS_BUS_NAME = source.DBUS_BUS_NAME
2177+ DBUS_IFACE_GUI = source.DBUS_IFACE_GUI
2178+ TRANSLATION_DOMAIN = source.TRANSLATION_DOMAIN
2179+ _server_callback = source._server_callback
2180+
2181+Messaging = source.Messaging
2182+open_volumes = source.open_volumes
2183
2184=== added file 'ubuntuone/platform/notification.py'
2185--- ubuntuone/platform/notification.py 1970-01-01 00:00:00 +0000
2186+++ ubuntuone/platform/notification.py 2011-12-19 20:54:24 +0000
2187@@ -0,0 +1,34 @@
2188+# -*- coding: utf-8 *-*
2189+# ubuntuone.syncdaemon.platform.notification - User Notification
2190+#
2191+# Copyright 2011 Canonical Ltd.
2192+#
2193+# This program is free software: you can redistribute it and/or modify it
2194+# under the terms of the GNU General Public License version 3, as published
2195+# by the Free Software Foundation.
2196+#
2197+# This program is distributed in the hope that it will be useful, but
2198+# WITHOUT ANY WARRANTY; without even the implied warranties of
2199+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
2200+# PURPOSE. See the GNU General Public License for more details.
2201+#
2202+# You should have received a copy of the GNU General Public License along
2203+# with this program. If not, see <http://www.gnu.org/licenses/>.
2204+"""Module that implements notification of the end user."""
2205+
2206+import sys
2207+
2208+
2209+if sys.platform == "win32":
2210+ from ubuntuone.platform.windows import notification
2211+ source = notification
2212+else:
2213+ from ubuntuone.platform.linux import notification
2214+ source = notification
2215+ NOTIFY_MODULE = source.NOTIFY_MODULE
2216+ ICON_NAME = source.ICON_NAME
2217+
2218+
2219+APPLICATION_NAME = source.APPLICATION_NAME
2220+
2221+Notification = source.Notification
2222
2223=== added file 'ubuntuone/platform/os_helper.py'
2224--- ubuntuone/platform/os_helper.py 1970-01-01 00:00:00 +0000
2225+++ ubuntuone/platform/os_helper.py 2011-12-19 20:54:24 +0000
2226@@ -0,0 +1,57 @@
2227+# -*- coding: utf-8 *-*
2228+#
2229+# Copyright 2011 Canonical Ltd.
2230+#
2231+# This program is free software: you can redistribute it and/or modify it
2232+# under the terms of the GNU General Public License version 3, as published
2233+# by the Free Software Foundation.
2234+#
2235+# This program is distributed in the hope that it will be useful, but
2236+# WITHOUT ANY WARRANTY; without even the implied warranties of
2237+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
2238+# PURPOSE. See the GNU General Public License for more details.
2239+#
2240+# You should have received a copy of the GNU General Public License along
2241+# with this program. If not, see <http://www.gnu.org/licenses/>.
2242+
2243+"""Multiplatform tools to interact with the os."""
2244+
2245+import sys
2246+
2247+
2248+if sys.platform == "win32":
2249+ from ubuntuone.platform.windows import os_helper
2250+ source = os_helper
2251+else:
2252+ from ubuntuone.platform.linux import os_helper
2253+ source = os_helper
2254+
2255+set_no_rights = source.set_no_rights
2256+set_file_readonly = source.set_file_readonly
2257+set_file_readwrite = source.set_file_readwrite
2258+set_dir_readonly = source.set_dir_readonly
2259+set_dir_readwrite = source.set_dir_readwrite
2260+allow_writes = source.allow_writes
2261+remove_file = source.remove_file
2262+remove_tree = source.remove_tree
2263+remove_dir = source.remove_dir
2264+path_exists = source.path_exists
2265+make_dir = source.make_dir
2266+open_file = source.open_file
2267+rename = source.rename
2268+native_rename = source.native_rename
2269+recursive_move = source.recursive_move
2270+make_link = source.make_link
2271+read_link = source.read_link
2272+is_link = source.is_link
2273+remove_link = source.remove_link
2274+listdir = source.listdir
2275+walk = source.walk
2276+access = source.access
2277+can_write = source.can_write
2278+stat_path = source.stat_path
2279+move_to_trash = source.move_to_trash
2280+set_application_name = source.set_application_name
2281+is_root = source.is_root
2282+get_path_list = source.get_path_list
2283+normpath = source.normpath
2284
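
Callers are expected to go through this facade instead of os/os.path so the win32 quirks (long path prefix, system folders, security attributes) stay hidden. A hedged usage sketch, assuming ubuntuone-client is importable; the scratch directory name is arbitrary:

    import os
    import tempfile

    from ubuntuone.platform import listdir, make_dir, path_exists, remove_dir

    base = os.path.join(tempfile.mkdtemp(), 'example')   # arbitrary scratch path
    make_dir(base, recursive=True)
    assert path_exists(base)
    print(listdir(base))          # on win32, system folders are already filtered out
    remove_dir(base)
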
2285=== added file 'ubuntuone/platform/session.py'
2286--- ubuntuone/platform/session.py 1970-01-01 00:00:00 +0000
2287+++ ubuntuone/platform/session.py 2011-12-19 20:54:24 +0000
2288@@ -0,0 +1,47 @@
2289+# -*- coding: utf-8 *-*
2290+#
2291+# Copyright 2011 Canonical Ltd.
2292+#
2293+# This program is free software: you can redistribute it and/or modify it
2294+# under the terms of the GNU General Public License version 3, as published
2295+# by the Free Software Foundation.
2296+#
2297+# This program is distributed in the hope that it will be useful, but
2298+# WITHOUT ANY WARRANTY; without even the implied warranties of
2299+# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
2300+# PURPOSE. See the GNU General Public License for more details.
2301+#
2302+# You should have received a copy of the GNU General Public License along
2303+# with this program. If not, see <http://www.gnu.org/licenses/>.
2304+
2305+"""Inhibit session logout when busy."""
2306+
2307+import sys
2308+
2309+
2310+INHIBIT_LOGGING_OUT = 1
2311+INHIBIT_USER_SWITCHING = 2
2312+INHIBIT_SUSPENDING_COMPUTER = 4
2313+INHIBIT_SESSION_IDLE = 8
2314+INHIBIT_LOGOUT_SUSPEND = INHIBIT_LOGGING_OUT | INHIBIT_SUSPENDING_COMPUTER
2315+
2316+
2317+if sys.platform == "win32":
2318+ from ubuntuone.platform.windows import session
2319+ source = session
2320+else:
2321+ from ubuntuone.platform.linux import session
2322+ source = session
2323+ SESSION_MANAGER_BUSNAME = source.SESSION_MANAGER_BUSNAME
2324+ SESSION_MANAGER_IFACE = source.SESSION_MANAGER_IFACE
2325+ SESSION_MANAGER_PATH = source.SESSION_MANAGER_PATH
2326+ APP_ID = source.APP_ID
2327+ TOPLEVEL_XID = source.TOPLEVEL_XID
2328+
2329+
2330+Inhibitor = source.Inhibitor
2331+
2332+
2333+def inhibit_logout_suspend(reason):
2334+ """Inhibit the suspend and logout. The result can be cancelled."""
2335+ return Inhibitor().inhibit(INHIBIT_LOGOUT_SUSPEND, reason)
2336
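
A usage sketch for the new helper, assuming (as the linux backend suggests) that the Deferred returned by inhibit_logout_suspend fires with an object exposing cancel(); this is an illustration, not code from the branch:

    from twisted.internet import defer

    from ubuntuone.platform.session import inhibit_logout_suspend


    @defer.inlineCallbacks
    def critical_section():
        """Keep the session up while doing work that must not be interrupted."""
        inhibitor = yield inhibit_logout_suspend('Ubuntu One: sync in progress')
        try:
            pass    # ... work that should not be cut short by logout/suspend ...
        finally:
            yield inhibitor.cancel()    # assumption: the result exposes cancel()
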
2337=== modified file 'ubuntuone/platform/tools/windows.py'
2338--- ubuntuone/platform/tools/windows.py 2011-11-07 15:39:32 +0000
2339+++ ubuntuone/platform/tools/windows.py 2011-12-19 20:54:24 +0000
2340@@ -16,7 +16,10 @@
2341
2342 """SyncDaemon Tools."""
2343
2344+import errno
2345 import subprocess
2346+import sys
2347+import os
2348
2349 from twisted.internet import defer
2350 from _winreg import OpenKey, HKEY_LOCAL_MACHINE, QueryValueEx
2351@@ -30,6 +33,11 @@
2352 U1_REG_PATH = r'Software\\Ubuntu One'
2353 SD_INSTALL_PATH = 'SyncDaemonInstallPath'
2354
2355+# ugly trick to stop pylint from complaining about
2356+# WindowsError on Linux
2357+if sys.platform != 'win32':
2358+ WindowsError = None
2359+
2360
2361 class IPCError(Exception):
2362 """An IPC specific error signal."""
2363@@ -154,5 +162,9 @@
2364 # to launch the sd on windows
2365 key = OpenKey(HKEY_LOCAL_MACHINE, U1_REG_PATH)
2366 path = QueryValueEx(key, SD_INSTALL_PATH)[0]
2367+ if not os.path.exists(path):
2368+            # either the .exe was moved or the registry value is wrong
2369+ return defer.fail(WindowsError(errno.ENOENT,
2370+ 'Could not start syncdaemon: File not found %s' % path))
2371 p = subprocess.Popen([path])
2372 return defer.succeed(p)
2373
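
The hunk above makes the start helper fail its Deferred with ENOENT instead of crashing in Popen when the registry points at a missing .exe. A standalone sketch of how a caller can consume that error path; start_syncdaemon_sketch is a hypothetical stand-in, and OSError is used so the snippet also runs off Windows:

    import errno

    from twisted.internet import defer


    def start_syncdaemon_sketch(binary_found=False):
        """Hypothetical stand-in for the helper patched above."""
        if not binary_found:
            # mirror the new error path: fail the Deferred instead of raising
            return defer.fail(OSError(errno.ENOENT, 'syncdaemon binary not found'))
        return defer.succeed(object())


    def explain(failure):
        failure.trap(OSError)
        print('could not start syncdaemon: %s' % failure.value)


    start_syncdaemon_sketch().addErrback(explain)
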
2374=== modified file 'ubuntuone/platform/windows/__init__.py'
2375--- ubuntuone/platform/windows/__init__.py 2011-10-21 15:49:18 +0000
2376+++ ubuntuone/platform/windows/__init__.py 2011-12-19 20:54:24 +0000
2377@@ -22,6 +22,16 @@
2378 platform = "win32"
2379
2380
2381+import constants
2382+import event_logging
2383+import filesystem_notifications
2384+import launcher
2385+import logger
2386+import messaging
2387+import notification
2388+import os_helper
2389+import session
2390+
2391 from ubuntuone.platform.windows.os_helper import (
2392 access,
2393 allow_writes,
2394
2395=== modified file 'ubuntuone/platform/windows/messaging.py'
2396--- ubuntuone/platform/windows/messaging.py 2011-02-08 22:12:36 +0000
2397+++ ubuntuone/platform/windows/messaging.py 2011-12-19 20:54:24 +0000
2398@@ -23,8 +23,6 @@
2399
2400 from ubuntuone.status.messaging import AbstractMessaging
2401
2402-APPLICATION_NAME = 'Ubuntu One Client'
2403-
2404
2405 class Messaging(AbstractMessaging):
2406 """Notification of the end user."""
2407
2408=== modified file 'ubuntuone/platform/windows/os_helper.py'
2409--- ubuntuone/platform/windows/os_helper.py 2011-10-14 20:02:23 +0000
2410+++ ubuntuone/platform/windows/os_helper.py 2011-12-19 20:54:24 +0000
2411@@ -1,5 +1,4 @@
2412-# Authors: Manuel de la Pena <manuel@canonical.com>
2413-# Natalia B. Bidart <natalia.bidart@canonical.com>
2414+# -*- encoding: utf-8 -*-
2415 #
2416 # Copyright 2011 Canonical Ltd.
2417 #
2418@@ -32,12 +31,14 @@
2419 FILE_GENERIC_WRITE,
2420 FILE_ALL_ACCESS,
2421 )
2422+import win32api
2423 from pywintypes import error as PyWinError
2424-from win32api import GetUserName
2425 from win32com.client import Dispatch
2426 from win32com.shell import shell, shellcon
2427 from win32file import (
2428+ GetFileAttributesW,
2429 MoveFileExW,
2430+ FILE_ATTRIBUTE_SYSTEM,
2431 MOVEFILE_COPY_ALLOWED,
2432 MOVEFILE_REPLACE_EXISTING,
2433 MOVEFILE_WRITE_THROUGH,
2434@@ -49,9 +50,12 @@
2435 CreateWellKnownSid,
2436 DACL_SECURITY_INFORMATION,
2437 GetFileSecurity,
2438+ GetSecurityInfo,
2439 LookupAccountName,
2440 OBJECT_INHERIT_ACE,
2441+ OWNER_SECURITY_INFORMATION,
2442 SetFileSecurity,
2443+ SE_KERNEL_OBJECT,
2444 WinBuiltinAdministratorsSid,
2445 WinWorldSid,
2446 )
2447@@ -65,11 +69,13 @@
2448 logger = logging.getLogger('ubuntuone.SyncDaemon.VM')
2449 platform = 'win32'
2450
2451+# missing win32file constant
2452+INVALID_FILE_ATTRIBUTES = -1
2453+
2454 # LONG_PATH_PREFIX will always be appended only to windows paths,
2455 # which should always be unicode.
2456 LONG_PATH_PREFIX = u'\\\\?\\'
2457
2458-USER_SID = LookupAccountName("", GetUserName())[0]
2459 EVERYONE_SID = CreateWellKnownSid(WinWorldSid)
2460 ADMINISTRATORS_SID = CreateWellKnownSid(WinBuiltinAdministratorsSid)
2461
2462@@ -93,6 +99,16 @@
2463 LINUX_ILLEGAL_CHARS_MAP[value] = key
2464
2465
2466+def get_user_sid():
2467+ process_handle = win32api.GetCurrentProcess()
2468+ security_info = GetSecurityInfo(process_handle, SE_KERNEL_OBJECT,
2469+ OWNER_SECURITY_INFORMATION)
2470+ return security_info.GetSecurityDescriptorOwner()
2471+
2472+
2473+USER_SID = get_user_sid()
2474+
2475+
2476 def _int_to_bin(n):
2477 """Convert an int to a bin string of 32 bits."""
2478 return "".join([str((n >> y) & 1) for y in range(32 - 1, -1, -1)])
2479@@ -417,7 +433,7 @@
2480 def _set_file_attributes(path, groups):
2481 """Set file attributes using the wind32api."""
2482 if not os.path.exists(path):
2483- raise WindowsError('Path %s could not be found.' % path)
2484+ raise WindowsError(errno.ENOENT, 'Path %s could not be found.' % path)
2485
2486 # No need to do any specific handling for invalid characters since
2487 # 'path' is a valid windows path.
2488@@ -627,6 +643,11 @@
2489 shortcut = shell_script.CreateShortCut(destination)
2490 shortcut.Targetpath = target
2491 shortcut.save()
2492+ except AttributeError:
2493+ # This try-except is required at least for the current implementation
2494+        # to allow the application to run even when the path has unicode chars
2495+ logger.exception('Link couldn\'t be created for '
2496+ 'destination %r:', destination)
2497 except:
2498 logger.exception('make_link could not be completed for target %r, '
2499 'destination %r:', target, destination)
2500@@ -653,6 +674,14 @@
2501 return os.path.exists(path)
2502
2503
2504+def native_is_system_path(path):
2505+ """Return if the path has the sys attr."""
2506+ attrs = GetFileAttributesW(path)
2507+ if attrs == INVALID_FILE_ATTRIBUTES:
2508+ return False
2509+ return FILE_ATTRIBUTE_SYSTEM & attrs == FILE_ATTRIBUTE_SYSTEM
2510+
2511+
2512 @windowspath()
2513 def remove_link(path):
2514 """Removes a link."""
2515@@ -686,7 +715,12 @@
2516 if not directory.endswith(os.path.sep):
2517 directory += os.path.sep
2518
2519- return map(_unicode_to_bytes, os.listdir(directory))
2520+ # On top of the above we have the issue in which python os.listdir does
2521+ # return those paths that are system paths. Those paths are the ones that
2522+ # we do not want to work with.
2523+
2524+ return map(_unicode_to_bytes, [p for p in os.listdir(directory) if not
2525+ native_is_system_path(os.path.join(directory, p))])
2526
2527
2528 @windowspath()
2529@@ -701,10 +735,17 @@
2530 handled.
2531
2532 """
2533+    # Interestingly, while os.listdir DOES return the system folders, os.walk
2534+    # does not. Nevertheless, let's filter the same way here so that if
2535+    # python's os.walk changes at some point, we do the same in BOTH methods.
2536 for dirpath, dirnames, filenames in os.walk(path, topdown):
2537 dirpath = _unicode_to_bytes(dirpath.replace(LONG_PATH_PREFIX, u''))
2538- dirnames = map(_unicode_to_bytes, dirnames)
2539- filenames = map(_unicode_to_bytes, filenames)
2540+ if native_is_system_path(dirpath):
2541+ continue
2542+ dirnames = map(_unicode_to_bytes, [p for p in dirnames if
2543+ not native_is_system_path(os.path.join(dirpath, p))])
2544+ filenames = map(_unicode_to_bytes, [p for p in filenames if not
2545+ native_is_system_path(os.path.join(dirpath, p))])
2546 yield dirpath, dirnames, filenames
2547
2548
2549
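
The listdir and walk changes above reduce to one predicate: skip any entry whose FILE_ATTRIBUTE_SYSTEM bit is set. A rough, self-contained sketch of that filter using ctypes instead of pywin32; on non-Windows platforms the predicate simply reports False:

    import ctypes
    import os
    import sys

    FILE_ATTRIBUTE_SYSTEM = 0x4


    def is_system_path(path):
        """Rough ctypes equivalent of native_is_system_path (expects unicode)."""
        if sys.platform != 'win32':
            return False
        attrs = ctypes.windll.kernel32.GetFileAttributesW(path)
        if attrs in (-1, 0xFFFFFFFF):    # INVALID_FILE_ATTRIBUTES
            return False
        return bool(attrs & FILE_ATTRIBUTE_SYSTEM)


    def listdir_no_system(directory):
        """os.listdir minus system entries, mirroring the patched listdir."""
        return [name for name in os.listdir(directory)
                if not is_system_path(os.path.join(directory, name))]


    print(listdir_no_system(u'.'))
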
2550=== modified file 'ubuntuone/status/aggregator.py'
2551--- ubuntuone/status/aggregator.py 2011-10-21 15:49:18 +0000
2552+++ ubuntuone/status/aggregator.py 2011-12-19 20:54:24 +0000
2553@@ -646,12 +646,14 @@
2554 lines = []
2555 files_uploading = len(self.files_uploading)
2556 if files_uploading > 0:
2557+ self.uploading_filename = os.path.basename(
2558+ self.files_uploading[0].path)
2559 lines.append(files_being_uploaded(
2560 self.uploading_filename, files_uploading))
2561 files_downloading = len(self.files_downloading)
2562 if files_downloading > 0:
2563- self.downloading_filename = self.files_downloading[0].path.split(
2564- os.path.sep)[-1]
2565+ self.downloading_filename = os.path.basename(
2566+ self.files_downloading[0].path)
2567 lines.append(files_being_downloaded(
2568 self.downloading_filename, files_downloading))
2569 return "\n".join(lines)
2570@@ -712,8 +714,8 @@
2571 (command.share_id, command.node_id)] = command.deflated_size
2572 # pylint: disable=W0201
2573 if not self.downloading_filename:
2574- self.downloading_filename = self.files_downloading[0].path.split(
2575- os.path.sep)[-1]
2576+ self.downloading_filename = os.path.basename(
2577+ self.files_downloading[0].path)
2578 # pylint: enable=W0201
2579 self.update_progressbar()
2580 logger.debug(
2581@@ -743,8 +745,8 @@
2582 (command.share_id, command.node_id)] = command.deflated_size
2583 # pylint: disable=W0201
2584 if not self.uploading_filename:
2585- self.uploading_filename = self.files_uploading[0].path.split(
2586- os.path.sep)[-1]
2587+ self.uploading_filename = os.path.basename(
2588+ self.files_uploading[0].path)
2589 # pylint: enable=W0201
2590 self.update_progressbar()
2591 logger.debug(
2592
2593=== modified file 'ubuntuone/syncdaemon/config.py'
2594--- ubuntuone/syncdaemon/config.py 2011-11-02 16:51:32 +0000
2595+++ ubuntuone/syncdaemon/config.py 2011-12-19 20:54:24 +0000
2596@@ -26,12 +26,13 @@
2597 from ubuntu_sso.xdg_base_directory import (
2598 load_config_paths,
2599 native_path,
2600- syncdaemon_path,
2601 save_config_path,
2602 xdg_data_home,
2603 xdg_cache_home,
2604 )
2605
2606+from ubuntuone.platform import expand_user
2607+
2608 # the try/except is to work with older versions of configglue (that
2609 # had everything that is now configglue.inischema.* as configglue.*).
2610 # The naming shenanigans are to work around pyflakes being completely
2611@@ -85,7 +86,8 @@
2612 Return the path using user home + value.
2613
2614 """
2615- result = syncdaemon_path(os.path.expanduser(path_from_unix(value)))
2616+ path = path_from_unix(value)
2617+ result = expand_user(path)
2618 assert isinstance(result, str)
2619 return result
2620
2621
2622=== modified file 'ubuntuone/syncdaemon/interaction_interfaces.py'
2623--- ubuntuone/syncdaemon/interaction_interfaces.py 2011-11-02 18:14:42 +0000
2624+++ ubuntuone/syncdaemon/interaction_interfaces.py 2011-12-19 20:54:24 +0000
2625@@ -251,7 +251,6 @@
2626 """Return the free space for the given volume."""
2627 return self.main.vm.get_free_space(volume_id)
2628
2629- @log_call(logger.debug, with_result=False)
2630 def waiting(self):
2631 """Return a list of the operations in action queue."""
2632 waiting = []
2633@@ -262,7 +261,6 @@
2634 waiting.append((operation, str(id(cmd)), data))
2635 return waiting
2636
2637- @log_call(logger.debug, with_result=False)
2638 def waiting_metadata(self):
2639 """Return a list of the operations in the meta-queue.
2640
2641@@ -279,7 +277,6 @@
2642 waiting_metadata.append((operation, data))
2643 return waiting_metadata
2644
2645- @log_call(logger.debug, with_result=False)
2646 def waiting_content(self):
2647 """Return a list of files that are waiting to be up- or downloaded.
2648
2649@@ -1099,7 +1096,6 @@
2650
2651 self.interface.sync_daemon.QuotaExceeded(volume_dict)
2652
2653- @log_call(logger.debug)
2654 def handle_SYS_QUEUE_ADDED(self, command):
2655 """Handle SYS_QUEUE_ADDED.
2656
2657@@ -1117,7 +1113,6 @@
2658 sanitize_dict(data)
2659 self.interface.status.RequestQueueAdded(op_name, str(op_id), data)
2660
2661- @log_call(logger.debug)
2662 def handle_SYS_QUEUE_REMOVED(self, command):
2663 """Handle SYS_QUEUE_REMOVED.
2664
2665@@ -1250,17 +1245,17 @@
2666 @log_call(logger.debug)
2667 def get_rootdir(self):
2668 """Return the root dir/mount point."""
2669- return self.main.get_rootdir()
2670+ return self.main.get_rootdir().decode('utf-8')
2671
2672 @log_call(logger.debug)
2673 def get_sharesdir(self):
2674 """Return the shares dir/mount point."""
2675- return self.main.get_sharesdir()
2676+ return self.main.get_sharesdir().decode('utf-8')
2677
2678 @log_call(logger.debug)
2679 def get_sharesdir_link(self):
2680 """Return the shares dir/mount point."""
2681- return self.main.get_sharesdir_link()
2682+ return self.main.get_sharesdir_link().decode('utf-8')
2683
2684 @log_call(logger.debug)
2685 def wait_for_nirvana(self, last_event_interval):
2686
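
The get_rootdir/get_sharesdir changes above mark the unicode boundary: paths stay utf-8 bytes inside syncdaemon and are decoded only when they leave through the IPC layer. A tiny sketch of that convention; the function names below are illustrative, not the project's:

    import os


    def internal_rootdir():
        """Stand-in for main.get_rootdir(): paths are utf-8 bytes internally."""
        return os.path.join('/home', 'user', '\xc3\xb1o\xc3\xb1o')   # 'ñoño' in utf-8


    def exposed_rootdir():
        """Stand-in for the IPC-facing getter: decode right at the boundary."""
        return internal_rootdir().decode('utf-8')


    path = exposed_rootdir()
    assert isinstance(path, unicode)
    print(repr(path))
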
2687=== modified file 'ubuntuone/syncdaemon/vm_helper.py'
2688--- ubuntuone/syncdaemon/vm_helper.py 2011-08-25 14:38:35 +0000
2689+++ ubuntuone/syncdaemon/vm_helper.py 2011-12-19 20:54:24 +0000
2690@@ -21,7 +21,7 @@
2691
2692 import os
2693
2694-
2695+from ubuntuone.platform import expand_user
2696 from ubuntuone.platform import (
2697 is_link,
2698 make_link,
2699@@ -78,7 +78,6 @@
2700 # only create the link if it does not exist
2701 make_link(source, dest)
2702 result = True
2703-
2704 return result
2705
2706
2707@@ -95,7 +94,7 @@
2708
2709 path = path.decode('utf8')
2710
2711- user_home = os.path.expanduser(u'~')
2712+ user_home = expand_user('~')
2713 start_list = os.path.abspath(user_home).split(os.path.sep)
2714 path_list = os.path.abspath(path).split(os.path.sep)
2715
2716@@ -123,5 +122,5 @@
2717 assert isinstance(suggested_path, unicode)
2718 # Unicode boundary! the suggested_path is Unicode in protocol and server,
2719 # but here we use bytes for paths
2720- path = suggested_path.replace(u'/', os.path.sep)
2721- return os.path.expanduser(path).encode("utf8")
2722+ path = suggested_path.replace(u'/', os.path.sep).encode('utf-8')
2723+ return expand_user(path)
2724
2725=== modified file 'ubuntuone/syncdaemon/volume_manager.py'
2726--- ubuntuone/syncdaemon/volume_manager.py 2011-10-25 02:24:29 +0000
2727+++ ubuntuone/syncdaemon/volume_manager.py 2011-12-19 20:54:24 +0000
2728@@ -28,6 +28,7 @@
2729 from itertools import ifilter
2730
2731 from twisted.internet import defer
2732+from ubuntuone.platform import expand_user
2733 from ubuntuone.storageprotocol import request
2734 from ubuntuone.storageprotocol.volumes import (
2735 ShareVolume,
2736@@ -119,8 +120,12 @@
2737
2738 # default generation for all volumes without a value in the metadata
2739 generation = None
2740+    # transient value used while a local rescan runs during resubscribe;
2741+    # always set to False, no matter what is passed to the constructor
2742+ local_rescanning = False
2743
2744- def __init__(self, volume_id, node_id, generation=None, subscribed=False):
2745+ def __init__(self, volume_id, node_id, generation=None, subscribed=False,
2746+ local_rescanning=None):
2747 """Create the volume."""
2748 # id and node_id should be str or None
2749 assert isinstance(volume_id, basestring) or volume_id is None
2750@@ -230,7 +235,7 @@
2751 @property
2752 def active(self):
2753 """Return True if this Share is accepted."""
2754- return self.accepted and self.subscribed
2755+ return self.accepted and self.subscribed and not self.local_rescanning
2756
2757 def __eq__(self, other):
2758 result = (super(Share, self).__eq__(other) and
2759@@ -328,7 +333,7 @@
2760 Return a list of paths that are 'syncdaemon' valid.
2761
2762 """
2763- user_home = os.path.expanduser('~')
2764+ user_home = expand_user('~')
2765 path_list = get_path_list(self.path)
2766 common_prefix = os.path.commonprefix([self.path, user_home])
2767 common_list = get_path_list(common_prefix)
2768@@ -346,7 +351,7 @@
2769 @property
2770 def active(self):
2771 """Returns True if the UDF is subscribed."""
2772- return self.subscribed
2773+ return self.subscribed and not self.local_rescanning
2774
2775 @classmethod
2776 def from_udf_volume(cls, udf_volume, path):
2777@@ -861,13 +866,13 @@
2778 self._create_share_dir(share)
2779 self.log.debug('add_share: volume active, temporarly unsubscribe '
2780 'to rescan (scan_local? %r).', share.can_write())
2781- share.subscribed = False
2782+ share.local_rescanning = True
2783 self.shares[share.volume_id] = share
2784
2785 def subscribe(result):
2786 """Subscribe the share after the local rescan."""
2787 volume = self.get_volume(share.volume_id)
2788- volume.subscribed = True
2789+ volume.local_rescanning = False
2790 self.store_volume(volume)
2791 return result
2792
2793@@ -1089,7 +1094,7 @@
2794 if udf.subscribed:
2795 self.log.debug('add_udf: volume subscribed, '
2796 'temporarly unsubscribe to do local rescan.')
2797- udf.subscribed = False
2798+ udf.local_rescanning = True
2799 self.udfs[udf.volume_id] = udf
2800 if not path_exists(udf.path):
2801 make_dir(udf.path, recursive=True)
2802@@ -1102,7 +1107,7 @@
2803
2804 """
2805 volume = self.get_volume(udf.volume_id)
2806- volume.subscribed = True
2807+ volume.local_rescanning = False
2808 self.store_volume(volume)
2809 return result
2810
2811@@ -1277,11 +1282,14 @@
2812
2813 """
2814 volume = self.get_volume(volume_id)
2815+ volume.local_rescanning = False
2816+ self.store_volume(volume)
2817+ return result
2818+
2819+ try:
2820 volume.subscribed = True
2821+ volume.local_rescanning = True
2822 self.store_volume(volume)
2823- return result
2824-
2825- try:
2826 d = self._scan_volume(volume, scan_local=volume.can_write())
2827 except KeyError, e:
2828 push_error(error="METADATA_DOES_NOT_EXIST")
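
The volume_manager changes above replace the old trick of temporarily flipping subscribed off during local rescan with a transient local_rescanning flag, so the stored subscribed value is left alone while the volume is being scanned. A condensed sketch of how the two flags combine into the active property; the class is illustrative, not the project's:

    class VolumeSketch(object):
        """Illustrative only: mirrors the subscribed/local_rescanning interplay."""

        def __init__(self, subscribed=True):
            self.subscribed = subscribed       # persisted with the volume metadata
            self.local_rescanning = False      # transient, never stored as True

        @property
        def active(self):
            return self.subscribed and not self.local_rescanning


    vol = VolumeSketch(subscribed=True)
    vol.local_rescanning = True      # local rescan starts: volume goes inactive...
    assert not vol.active
    vol.local_rescanning = False     # ...and is active again when the rescan ends,
    assert vol.active                # without ever touching 'subscribed'
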
