Merge lp:~dobey/ubuntuone-client/update-4-0 into lp:ubuntuone-client/stable-4-0

Proposed by dobey
Status: Merged
Merged at revision: 1256
Proposed branch: lp:~dobey/ubuntuone-client/update-4-0
Merge into: lp:ubuntuone-client/stable-4-0
Diff against target: 2941 lines (+894/-756)
31 files modified
contrib/testing/testcase.py (+11/-3)
run-mac-tests (+1/-1)
tests/platform/filesystem_notifications/common.py (+122/-303)
tests/platform/filesystem_notifications/test_darwin.py (+84/-388)
tests/platform/filesystem_notifications/test_fsevents_daemon.py (+8/-8)
tests/platform/filesystem_notifications/test_windows.py (+344/-0)
tests/platform/ipc/test_external_interface.py (+14/-0)
tests/platform/test_tools.py (+10/-0)
tests/status/test_aggregator.py (+82/-1)
tests/syncdaemon/test_fsm.py (+27/-0)
tests/syncdaemon/test_interaction_interfaces.py (+28/-20)
tests/syncdaemon/test_vm.py (+13/-2)
ubuntuone/platform/filesystem_notifications/monitor/__init__.py (+3/-0)
ubuntuone/platform/filesystem_notifications/monitor/common.py (+1/-0)
ubuntuone/platform/filesystem_notifications/monitor/darwin/fsevents_daemon.py (+20/-20)
ubuntuone/platform/ipc/ipc_client.py (+17/-1)
ubuntuone/platform/ipc/linux.py (+33/-0)
ubuntuone/platform/ipc/perspective_broker.py (+7/-2)
ubuntuone/platform/os_helper/__init__.py (+1/-0)
ubuntuone/platform/os_helper/darwin.py (+5/-2)
ubuntuone/platform/os_helper/linux.py (+1/-0)
ubuntuone/platform/os_helper/unix.py (+5/-0)
ubuntuone/platform/os_helper/windows.py (+2/-0)
ubuntuone/platform/tools/__init__.py (+7/-0)
ubuntuone/status/aggregator.py (+16/-1)
ubuntuone/syncdaemon/__init__.py (+5/-0)
ubuntuone/syncdaemon/event_queue.py (+1/-0)
ubuntuone/syncdaemon/filesystem_manager.py (+3/-1)
ubuntuone/syncdaemon/interaction_interfaces.py (+6/-2)
ubuntuone/syncdaemon/status_listener.py (+13/-1)
ubuntuone/syncdaemon/volume_manager.py (+4/-0)
To merge this branch: bzr merge lp:~dobey/ubuntuone-client/update-4-0
Reviewer: Roberto Alsina (community), Status: Approve
Review via email: mp+120847@code.launchpad.net

Commit message

[Diego Sarmentero]

    - Fixing IPC signals on Windows and changing the deque size to 5.
    - Adding IPC support to share the menu data (LP: #1032659).
    - Collect and return the data for the menu from the aggregator (LP: #1032659); see the sketch below.
    - Refactoring tests for filesystem notifications.
    - Use the correct namespace (LP: #1026209).

[Rodney Dawes]

    - Move the patching of user_home so it happens before it is used elsewhere in setUp.

[Alejandro J. Cura]

    - The DownloadFinished IPC signal is now emitted after the partial file is committed (LP: #1031197).

[Roberto Alsina]

    - Added a check that the UDF path is not a file (LP: #1033582).

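As context for the menu-data items above: the only parts visible in this proposal are the RECENT_TRANSFERS and UPLOADING keys and the fake menu_data() added to contrib/testing/testcase.py, plus the "deque size to 5" note in the commit message. The following is a minimal sketch of what an aggregator-side helper could look like; the class name, attribute names and key values are assumptions for illustration, not the branch's actual code.

from collections import deque

# Assumed key values; only the constant names RECENT_TRANSFERS and
# UPLOADING (imported from ubuntuone.syncdaemon) appear in the diff.
RECENT_TRANSFERS = 'recent-transfers'
UPLOADING = 'uploading'


class MenuDataAggregator(object):
    """Hypothetical helper that collects the data shown in the menu."""

    def __init__(self):
        # Keep only the last five finished transfers, matching the
        # "changing deque size to 5" item in the commit message.
        self.recent_transfers = deque(maxlen=5)
        self.files_uploading = []

    def transfer_finished(self, filename):
        """Record a finished transfer for the menu."""
        self.recent_transfers.append(filename)

    def menu_data(self):
        """Return the data the IPC layer publishes for the menu."""
        return {RECENT_TRANSFERS: list(self.recent_transfers),
                UPLOADING: list(self.files_uploading)}
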
Roberto Alsina (ralsina):
review: Approve
Ubuntu One Auto Pilot (otto-pilot) wrote:

The attempt to merge lp:~dobey/ubuntuone-client/update-4-0 into lp:ubuntuone-client/stable-4-0 failed. Below is the output from the failed tests.

/usr/bin/gnome-autogen.sh
checking for autoconf >= 2.53...
  testing autoconf2.50... not found.
  testing autoconf... found 2.69
checking for automake >= 1.10...
  testing automake-1.11... found 1.11.5
checking for libtool >= 1.5...
  testing libtoolize... found 2.4.2
checking for intltool >= 0.30...
  testing intltoolize... found 0.50.2
checking for pkg-config >= 0.14.0...
  testing pkg-config... found 0.26
checking for gtk-doc >= 1.0...
  testing gtkdocize... found 1.18
Checking for required M4 macros...
Checking for forbidden M4 macros...
Processing ./configure.ac
Running libtoolize...
libtoolize: putting auxiliary files in `.'.
libtoolize: copying file `./ltmain.sh'
libtoolize: putting macros in AC_CONFIG_MACRO_DIR, `m4'.
libtoolize: copying file `m4/libtool.m4'
libtoolize: copying file `m4/ltoptions.m4'
libtoolize: copying file `m4/ltsugar.m4'
libtoolize: copying file `m4/ltversion.m4'
libtoolize: copying file `m4/lt~obsolete.m4'
Running intltoolize...
Running gtkdocize...
Running aclocal-1.11...
Running autoconf...
Running autoheader...
Running automake-1.11...
Running ./configure --enable-gtk-doc --enable-debug ...
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking how to create a ustar tar archive... gnutar
checking whether make supports nested variables... yes
checking for style of include used by make... GNU
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking dependency style of gcc... gcc3
checking for library containing strerror... none required
checking for gcc... (cached) gcc
checking whether we are using the GNU C compiler... (cached) yes
checking whether gcc accepts -g... (cached) yes
checking for gcc option to accept ISO C89... (cached) none needed
checking dependency style of gcc... (cached) gcc3
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking how to print strings... printf
checking for a sed that does not truncate output... /bin/sed
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for fgrep... /bin/grep -F
checking for ld used by gcc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking whether ln -s works... yes
checking the maximum length of command line arguments... 1572864
checking whether the shell understands some XSI constructs......

1256. By dobey

[Diego Sarmentero]

    - Fixing IPC signals on Windows and changing the deque size to 5.
    - Adding IPC support to share the menu data (LP: #1032659).
    - Collect and return the data for the menu from the aggregator (LP: #1032659).
    - Refactoring tests for filesystem notifications.
    - Use the correct namespace (LP: #1026209).

[Rodney Dawes]

    - Move the patching of user_home so it happens before it is used elsewhere in setUp.

[Alejandro J. Cura]

    - The DownloadFinished IPC signal is now emitted after the partial file is committed (LP: #1031197).

[Roberto Alsina]

    - Added a check that the UDF path is not a file (LP: #1033582); see the sketch below.

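The UDF path check from LP: #1033582 lands in ubuntuone/syncdaemon/volume_manager.py (+4 lines per the file summary above), whose hunk is not shown in this excerpt. A rough sketch of such a guard follows; the helper name and exception are assumptions, not the branch's code.

import os


def validate_udf_path(path):
    """Refuse a UDF path that points at an existing file."""
    # Illustrative only: the branch adds an equivalent check in
    # volume_manager.py; the helper name and error type are assumed here.
    if os.path.isfile(path):
        raise ValueError("UDF path %r is a file, not a directory" % (path,))
    return path
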
Preview Diff

=== modified file 'contrib/testing/testcase.py'
--- contrib/testing/testcase.py 2012-06-22 09:59:14 +0000
+++ contrib/testing/testcase.py 2012-08-22 18:22:29 +0000
@@ -57,6 +57,8 @@
57 main,57 main,
58 local_rescan,58 local_rescan,
59 tritcask,59 tritcask,
60 RECENT_TRANSFERS,
61 UPLOADING,
60)62)
61from ubuntuone.syncdaemon import logger63from ubuntuone.syncdaemon import logger
62from ubuntuone import platform64from ubuntuone import platform
@@ -208,6 +210,10 @@
208210
209 show_all_notifications = True211 show_all_notifications = True
210212
213 def menu_data(self):
214 """Fake menu_data."""
215 return {RECENT_TRANSFERS: [], UPLOADING: []}
216
211217
212class FakeMain(main.Main):218class FakeMain(main.Main):
213 """ A fake Main class to setup the tests """219 """ A fake Main class to setup the tests """
@@ -385,6 +391,11 @@
385 def setUp(self):391 def setUp(self):
386 yield super(BaseTwistedTestCase, self).setUp()392 yield super(BaseTwistedTestCase, self).setUp()
387 self.__root = None393 self.__root = None
394
395 # Patch the user home
396 self.home_dir = self.mktemp('ubuntuonehacker')
397 self.patch(platform, "user_home", self.home_dir)
398
388 # use the config from the branch399 # use the config from the branch
389 new_get_config_files = lambda: [os.path.join(os.environ['ROOTDIR'],400 new_get_config_files = lambda: [os.path.join(os.environ['ROOTDIR'],
390 'data', 'syncdaemon.conf')]401 'data', 'syncdaemon.conf')]
@@ -405,9 +416,6 @@
405 self.log = logging.getLogger("ubuntuone.SyncDaemon.TEST")416 self.log = logging.getLogger("ubuntuone.SyncDaemon.TEST")
406 self.log.info("starting test %s.%s", self.__class__.__name__,417 self.log.info("starting test %s.%s", self.__class__.__name__,
407 self._testMethodName)418 self._testMethodName)
408 # Patch the user home
409 self.home_dir = self.mktemp('ubuntuonehacker')
410 self.patch(platform, "user_home", self.home_dir)
411 self.patch(action_queue.tunnel_runner, "TunnelRunner",419 self.patch(action_queue.tunnel_runner, "TunnelRunner",
412 self.tunnel_runner_class)420 self.tunnel_runner_class)
413421
414422
=== modified file 'run-mac-tests'
--- run-mac-tests 2012-07-30 19:31:32 +0000
+++ run-mac-tests 2012-08-22 18:22:29 +0000
@@ -27,7 +27,7 @@
27# version. If you delete this exception statement from all source27# version. If you delete this exception statement from all source
28# files in the program, then also delete it here.28# files in the program, then also delete it here.
2929
30PYTHONPATH=../ubuntu-sso-client/:../ubuntuone-storage-protocol:../ubuntuone-dev-tools:$PYTHONPATH30PYTHONPATH=../ubuntu-sso-client/:../ubuntuone-storage-protocol:../ubuntuone-dev-tools:../ubuntuone-fsevents-daemon/python:$PYTHONPATH
3131
32set -e32set -e
33if [ $# -ne 0 ]; then33if [ $# -ne 0 ]; then
3434
=== renamed file 'tests/platform/filesystem_notifications/test_windows.py' => 'tests/platform/filesystem_notifications/common.py'
--- tests/platform/filesystem_notifications/test_windows.py 2012-07-18 15:18:04 +0000
+++ tests/platform/filesystem_notifications/common.py 2012-08-22 18:22:29 +0000
@@ -38,42 +38,45 @@
38import itertools38import itertools
3939
40from twisted.internet import defer40from twisted.internet import defer
41from win32file import FILE_NOTIFY_INFORMATION
42
43from contrib.testing.testcase import BaseTwistedTestCase41from contrib.testing.testcase import BaseTwistedTestCase
44from ubuntuone.devtools.handlers import MementoHandler42from ubuntuone.devtools.handlers import MementoHandler
45from ubuntuone.platform.os_helper import windows as os_helper
46from ubuntuone.platform.filesystem_notifications.pyinotify_agnostic import (43from ubuntuone.platform.filesystem_notifications.pyinotify_agnostic import (
44 EventsCodes,
47 ProcessEvent,45 ProcessEvent,
48 IN_CLOSE_WRITE,46 IN_CLOSE_WRITE,
49 IN_CREATE,47 IN_CREATE,
50 IN_DELETE,48 IN_DELETE,
51 IN_OPEN,49 IN_OPEN,
52)50)
53from ubuntuone.platform.filesystem_notifications.monitor import (
54 windows as filesystem_notifications,
55)
56from ubuntuone.platform.filesystem_notifications import notify_processor51from ubuntuone.platform.filesystem_notifications import notify_processor
57from ubuntuone.platform.filesystem_notifications.monitor.common import (52from ubuntuone.platform.filesystem_notifications.monitor.common import (
58 FilesystemMonitor,53 FilesystemMonitor,
59 Watch,54 Watch,
60 WatchManager,55 WatchManager,
61)56)
62from ubuntuone.platform.filesystem_notifications.monitor.windows import (57from ubuntuone.platform.filesystem_notifications.monitor import ACTIONS
63 ACTIONS,58from ubuntuone.platform.os_helper import get_os_valid_path
64 FILE_NOTIFY_CHANGE_FILE_NAME,59
65 FILE_NOTIFY_CHANGE_DIR_NAME,60OP_FLAGS = EventsCodes.FLAG_COLLECTIONS['OP_FLAGS']
66 FILE_NOTIFY_CHANGE_ATTRIBUTES,61IS_DIR = EventsCodes.FLAG_COLLECTIONS['SPECIAL_FLAGS']['IN_ISDIR']
67 FILE_NOTIFY_CHANGE_SIZE,
68 FILE_NOTIFY_CHANGE_LAST_WRITE,
69 FILE_NOTIFY_CHANGE_SECURITY,
70 FILE_NOTIFY_CHANGE_LAST_ACCESS,
71)
7262
73#create a rever mapping to use it in the tests.63#create a rever mapping to use it in the tests.
74REVERSE_WINDOWS_ACTIONS = {}64REVERSE_OS_ACTIONS = {}
75for key, value in ACTIONS.iteritems():65for key, value in ACTIONS.items():
76 REVERSE_WINDOWS_ACTIONS[value] = key66 REVERSE_OS_ACTIONS[value] = key
67
68
69class FakeEventsProcessor(object):
70
71 """Handle fake events creation and processing."""
72
73 def create_fake_event(self):
74 """Create a fake filesystem event."""
75 raise NotImplementedError
76
77 def custom_process_events(self):
78 """Process a fake event."""
79 raise NotImplementedError
7780
7881
79class FakeException(Exception):82class FakeException(Exception):
@@ -127,32 +130,17 @@
127 @defer.inlineCallbacks130 @defer.inlineCallbacks
128 def setUp(self):131 def setUp(self):
129 yield super(TestWatch, self).setUp()132 yield super(TestWatch, self).setUp()
133 self.path = ''
134 self.invalid_path = ''
135 self.common_path = ''
130 self.basedir = self.mktemp('test_root')136 self.basedir = self.mktemp('test_root')
131 self.mask = FILE_NOTIFY_CHANGE_FILE_NAME | \137 self.mask = None
132 FILE_NOTIFY_CHANGE_DIR_NAME | \
133 FILE_NOTIFY_CHANGE_ATTRIBUTES | \
134 FILE_NOTIFY_CHANGE_SIZE | \
135 FILE_NOTIFY_CHANGE_LAST_WRITE | \
136 FILE_NOTIFY_CHANGE_SECURITY | \
137 FILE_NOTIFY_CHANGE_LAST_ACCESS
138 self.memento = MementoHandler()138 self.memento = MementoHandler()
139 self.memento.setLevel(logging.DEBUG)139 self.memento.setLevel(logging.DEBUG)
140 self.raw_events = []140 self.raw_events = []
141 self.paths_checked = []141 self.paths_checked = []
142 old_is_dir = Watch._path_is_dir142 old_is_dir = Watch._path_is_dir
143143 self.fake_events_processor = FakeEventsProcessor()
144 def file_notify_information_wrapper(buf, data):
145 """Wrapper that gets the events and adds them to the list."""
146 events = FILE_NOTIFY_INFORMATION(buf, data)
147 # we want to append the list because that is what will be logged.
148 # If we use extend we wont have the same logging because it will
149 # group all events in a single lists which is not what the COM API
150 # does.
151 str_events = [
152 (filesystem_notifications.ACTIONS_NAMES[action], path) for action, path in
153 events]
154 self.raw_events.append(str_events)
155 return events
156144
157 def path_is_dir_wrapper(watch, path):145 def path_is_dir_wrapper(watch, path):
158 """Wrapper that gets the checked paths."""146 """Wrapper that gets the checked paths."""
@@ -160,8 +148,6 @@
160 self.paths_checked.append((path, result))148 self.paths_checked.append((path, result))
161 return result149 return result
162150
163 self.patch(filesystem_notifications, 'FILE_NOTIFY_INFORMATION',
164 file_notify_information_wrapper)
165 self.patch(Watch, '_path_is_dir', path_is_dir_wrapper)151 self.patch(Watch, '_path_is_dir', path_is_dir_wrapper)
166152
167 @defer.inlineCallbacks153 @defer.inlineCallbacks
@@ -169,7 +155,7 @@
169 """Perform the file operations and returns the recorded events."""155 """Perform the file operations and returns the recorded events."""
170 handler = TestCaseHandler(number_events=number_events)156 handler = TestCaseHandler(number_events=number_events)
171 manager = WatchManager(handler)157 manager = WatchManager(handler)
172 yield manager.add_watch(os_helper.get_windows_valid_path(path), mask)158 yield manager.add_watch(get_os_valid_path(path), mask)
173 # change the logger so that we can check the logs if we wanted159 # change the logger so that we can check the logs if we wanted
174 manager._wdm[0].log.addHandler(self.memento)160 manager._wdm[0].log.addHandler(self.memento)
175 # clean logger later161 # clean logger later
@@ -198,7 +184,7 @@
198 create_file, 1)184 create_file, 1)
199 event = events[0]185 event = events[0]
200 self.assertFalse(event.dir)186 self.assertFalse(event.dir)
201 self.assertEqual(0x100, event.mask)187 self.assertEqual(OP_FLAGS['IN_CREATE'], event.mask)
202 self.assertEqual('IN_CREATE', event.maskname)188 self.assertEqual('IN_CREATE', event.maskname)
203 self.assertEqual(os.path.split(file_name)[1], event.name)189 self.assertEqual(os.path.split(file_name)[1], event.name)
204 self.assertEqual('.', event.path)190 self.assertEqual('.', event.path)
@@ -218,7 +204,7 @@
218 create_dir, 1)204 create_dir, 1)
219 event = events[0]205 event = events[0]
220 self.assertTrue(event.dir)206 self.assertTrue(event.dir)
221 self.assertEqual(0x40000100, event.mask)207 self.assertEqual(OP_FLAGS['IN_CREATE'] | IS_DIR, event.mask)
222 self.assertEqual('IN_CREATE|IN_ISDIR', event.maskname)208 self.assertEqual('IN_CREATE|IN_ISDIR', event.maskname)
223 self.assertEqual(os.path.split(dir_name)[1], event.name)209 self.assertEqual(os.path.split(dir_name)[1], event.name)
224 self.assertEqual('.', event.path)210 self.assertEqual('.', event.path)
@@ -240,7 +226,7 @@
240 remove_file, 1)226 remove_file, 1)
241 event = events[0]227 event = events[0]
242 self.assertFalse(event.dir)228 self.assertFalse(event.dir)
243 self.assertEqual(0x200, event.mask)229 self.assertEqual(OP_FLAGS['IN_DELETE'], event.mask)
244 self.assertEqual('IN_DELETE', event.maskname)230 self.assertEqual('IN_DELETE', event.maskname)
245 self.assertEqual(os.path.split(file_name)[1], event.name)231 self.assertEqual(os.path.split(file_name)[1], event.name)
246 self.assertEqual('.', event.path)232 self.assertEqual('.', event.path)
@@ -262,36 +248,15 @@
262 remove_dir, 1)248 remove_dir, 1)
263 event = events[0]249 event = events[0]
264 self.assertTrue(event.dir)250 self.assertTrue(event.dir)
265 self.assertEqual(0x40000200, event.mask)251 self.assertEqual(OP_FLAGS['IN_DELETE'] | IS_DIR, event.mask)
266 self.assertEqual('IN_DELETE|IN_ISDIR', event.maskname)252 self.assertEqual('IN_DELETE|IN_ISDIR', event.maskname)
267 self.assertEqual('.', event.path)253 self.assertEqual('.', event.path)
268 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)254 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)
269 self.assertEqual(0, event.wd)255 self.assertEqual(0, event.wd)
270256
271 @defer.inlineCallbacks
272 def test_file_write(self):257 def test_file_write(self):
273 """Test that the correct event is raised when a file is written."""258 """Test that the correct event is raised when a file is written."""
274 file_name = os.path.join(self.basedir, 'test_file_write')259 raise NotImplementedError
275 # create the file before recording
276 fd = open(file_name, 'w')
277 # clean behind us by removing the file
278 self.addCleanup(os.remove, file_name)
279
280 def write_file():
281 """Action for the test."""
282 fd.write('test')
283 fd.close()
284
285 events = yield self._perform_operations(self.basedir, self.mask,
286 write_file, 1)
287 event = events[0]
288 self.assertFalse(event.dir)
289 self.assertEqual(0x2, event.mask)
290 self.assertEqual('IN_MODIFY', event.maskname)
291 self.assertEqual(os.path.split(file_name)[1], event.name)
292 self.assertEqual('.', event.path)
293 self.assertEqual(os.path.join(self.basedir, file_name), event.pathname)
294 self.assertEqual(0, event.wd)
295260
296 @defer.inlineCallbacks261 @defer.inlineCallbacks
297 def test_file_moved_to_watched_dir_same_watcher(self):262 def test_file_moved_to_watched_dir_same_watcher(self):
@@ -313,7 +278,7 @@
313 move_to_event = events[1]278 move_to_event = events[1]
314 # first test the move from279 # first test the move from
315 self.assertFalse(move_from_event.dir)280 self.assertFalse(move_from_event.dir)
316 self.assertEqual(0x40, move_from_event.mask)281 self.assertEqual(OP_FLAGS['IN_MOVED_FROM'], move_from_event.mask)
317 self.assertEqual('IN_MOVED_FROM', move_from_event.maskname)282 self.assertEqual('IN_MOVED_FROM', move_from_event.maskname)
318 self.assertEqual(os.path.split(from_file_name)[1],283 self.assertEqual(os.path.split(from_file_name)[1],
319 move_from_event.name)284 move_from_event.name)
@@ -323,7 +288,7 @@
323 self.assertEqual(0, move_from_event.wd)288 self.assertEqual(0, move_from_event.wd)
324 # test the move to289 # test the move to
325 self.assertFalse(move_to_event.dir)290 self.assertFalse(move_to_event.dir)
326 self.assertEqual(0x80, move_to_event.mask)291 self.assertEqual(OP_FLAGS['IN_MOVED_TO'], move_to_event.mask)
327 self.assertEqual('IN_MOVED_TO', move_to_event.maskname)292 self.assertEqual('IN_MOVED_TO', move_to_event.maskname)
328 self.assertEqual(os.path.split(to_file_name)[1], move_to_event.name)293 self.assertEqual(os.path.split(to_file_name)[1], move_to_event.name)
329 self.assertEqual('.', move_to_event.path)294 self.assertEqual('.', move_to_event.path)
@@ -354,7 +319,7 @@
354 move_file, 1)319 move_file, 1)
355 event = events[0]320 event = events[0]
356 self.assertFalse(event.dir)321 self.assertFalse(event.dir)
357 self.assertEqual(0x200, event.mask)322 self.assertEqual(OP_FLAGS['IN_DELETE'], event.mask)
358 self.assertEqual('IN_DELETE', event.maskname)323 self.assertEqual('IN_DELETE', event.maskname)
359 self.assertEqual(os.path.split(from_file_name)[1], event.name)324 self.assertEqual(os.path.split(from_file_name)[1], event.name)
360 self.assertEqual('.', event.path)325 self.assertEqual('.', event.path)
@@ -382,7 +347,7 @@
382 move_files, 1)347 move_files, 1)
383 event = events[0]348 event = events[0]
384 self.assertFalse(event.dir)349 self.assertFalse(event.dir)
385 self.assertEqual(0x100, event.mask)350 self.assertEqual(OP_FLAGS['IN_CREATE'], event.mask)
386 self.assertEqual('IN_CREATE', event.maskname)351 self.assertEqual('IN_CREATE', event.maskname)
387 self.assertEqual(os.path.split(to_file_name)[1], event.name)352 self.assertEqual(os.path.split(to_file_name)[1], event.name)
388 self.assertEqual('.', event.path)353 self.assertEqual('.', event.path)
@@ -409,7 +374,8 @@
409 move_to_event = events[1]374 move_to_event = events[1]
410 # first test the move from375 # first test the move from
411 self.assertTrue(move_from_event.dir)376 self.assertTrue(move_from_event.dir)
412 self.assertEqual(0x40000040, move_from_event.mask)377 self.assertEqual(OP_FLAGS['IN_MOVED_FROM'] | IS_DIR,
378 move_from_event.mask)
413 self.assertEqual('IN_MOVED_FROM|IN_ISDIR', move_from_event.maskname)379 self.assertEqual('IN_MOVED_FROM|IN_ISDIR', move_from_event.maskname)
414 self.assertEqual(os.path.split(from_dir_name)[1], move_from_event.name)380 self.assertEqual(os.path.split(from_dir_name)[1], move_from_event.name)
415 self.assertEqual('.', move_from_event.path)381 self.assertEqual('.', move_from_event.path)
@@ -418,7 +384,7 @@
418 self.assertEqual(0, move_from_event.wd)384 self.assertEqual(0, move_from_event.wd)
419 # test the move to385 # test the move to
420 self.assertTrue(move_to_event.dir)386 self.assertTrue(move_to_event.dir)
421 self.assertEqual(0x40000080, move_to_event.mask)387 self.assertEqual(OP_FLAGS['IN_MOVED_TO'] | IS_DIR, move_to_event.mask)
422 self.assertEqual('IN_MOVED_TO|IN_ISDIR', move_to_event.maskname)388 self.assertEqual('IN_MOVED_TO|IN_ISDIR', move_to_event.maskname)
423 self.assertEqual(os.path.split(to_dir_name)[1], move_to_event.name)389 self.assertEqual(os.path.split(to_dir_name)[1], move_to_event.name)
424 self.assertEqual('.', move_to_event.path)390 self.assertEqual('.', move_to_event.path)
@@ -447,7 +413,7 @@
447 move_dir, 1)413 move_dir, 1)
448 event = events[0]414 event = events[0]
449 self.assertTrue(event.dir)415 self.assertTrue(event.dir)
450 self.assertEqual(0x40000200, event.mask)416 self.assertEqual(OP_FLAGS['IN_DELETE'] | IS_DIR, event.mask)
451 self.assertEqual('IN_DELETE|IN_ISDIR', event.maskname)417 self.assertEqual('IN_DELETE|IN_ISDIR', event.maskname)
452 self.assertEqual('.', event.path)418 self.assertEqual('.', event.path)
453 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)419 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)
@@ -471,7 +437,7 @@
471 move_dir, 1)437 move_dir, 1)
472 event = events[0]438 event = events[0]
473 self.assertTrue(event.dir)439 self.assertTrue(event.dir)
474 self.assertEqual(0x40000100, event.mask)440 self.assertEqual(OP_FLAGS['IN_CREATE'] | IS_DIR, event.mask)
475 self.assertEqual('IN_CREATE|IN_ISDIR', event.maskname)441 self.assertEqual('IN_CREATE|IN_ISDIR', event.maskname)
476 self.assertEqual(os.path.split(from_dir_name)[1], event.name)442 self.assertEqual(os.path.split(from_dir_name)[1], event.name)
477 self.assertEqual('.', event.path)443 self.assertEqual('.', event.path)
@@ -484,8 +450,9 @@
484 handler = TestCaseHandler(number_events=0)450 handler = TestCaseHandler(number_events=0)
485 manager = WatchManager(handler)451 manager = WatchManager(handler)
486 # add a watch that will always exclude all actions452 # add a watch that will always exclude all actions
487 manager.add_watch(os_helper.get_windows_valid_path(self.basedir),453 manager.add_watch(get_os_valid_path(self.basedir),
488 self.mask, exclude_filter=lambda x: True)454 self.mask, auto_add=True,
455 exclude_filter=lambda x: True)
489 # execution the actions456 # execution the actions
490 file_name = os.path.join(self.basedir, 'test_file_create')457 file_name = os.path.join(self.basedir, 'test_file_create')
491 open(file_name, 'w').close()458 open(file_name, 'w').close()
@@ -502,16 +469,18 @@
502 """Memorize the processed events."""469 """Memorize the processed events."""
503 events.append(event)470 events.append(event)
504471
505 path = u'\\\\?\\C:\\path' # a valid windows path472 child = 'child'
506 child = u'child'473 watch = Watch(1, self.path, None)
507 watch = Watch(1, path, fake_processor)474 watch.ignore_path(os.path.join(self.path, child))
508 watch.ignore_path(os.path.join(path, child))
509 paths_to_ignore = []475 paths_to_ignore = []
510 for file_name in 'abcdef':476 for file_name in 'abcdef':
511 paths_to_ignore.append((1, os.path.join(child, file_name)))477 paths_to_ignore.append(
478 self.fake_events_processor.create_fake_event(
479 os.path.join(child, file_name)))
512 # ensure that the watch is watching480 # ensure that the watch is watching
513 watch._watching = True481 watch.platform_watch.watching = True
514 watch.platform_watch._process_events(paths_to_ignore)482 self.fake_events_processor.custom_process_events(
483 watch, paths_to_ignore)
515 self.assertEqual(0, len(events),484 self.assertEqual(0, len(events),
516 'All events should have been ignored.')485 'All events should have been ignored.')
517486
@@ -523,17 +492,18 @@
523 """Memorize the processed events."""492 """Memorize the processed events."""
524 events.append(event)493 events.append(event)
525494
526 path = u'\\\\?\\C:\\path' # a valid windows path495 child = 'child'
527 child = u'child'496 watch = Watch(1, self.path, fake_processor)
528 watch = Watch(1, path, fake_processor)497 watch.ignore_path(os.path.join(self.path, child))
529 watch.ignore_path(os.path.join(path, child))
530 paths_not_to_ignore = []498 paths_not_to_ignore = []
531 for file_name in 'abcdef':499 for file_name in 'abcdef':
532 paths_not_to_ignore.append((1, os.path.join(500 event = self.fake_events_processor.create_fake_event(
533 child + file_name, file_name)))501 os.path.join(child + file_name, file_name))
502 paths_not_to_ignore.append(event)
534 # ensure that the watch is watching503 # ensure that the watch is watching
535 watch.platform_watch.watching = True504 watch.platform_watch.watching = True
536 watch.platform_watch._process_events(paths_not_to_ignore)505 self.fake_events_processor.custom_process_events(
506 watch, paths_not_to_ignore)
537 self.assertEqual(len(paths_not_to_ignore), len(events),507 self.assertEqual(len(paths_not_to_ignore), len(events),
538 'No events should have been ignored.')508 'No events should have been ignored.')
539509
@@ -545,21 +515,22 @@
545 """Memorize the processed events."""515 """Memorize the processed events."""
546 events.append(event.pathname)516 events.append(event.pathname)
547517
548 child = u'child'518 child = 'child'
549 path = u'\\\\?\\C:\\path\\' # a valid windows path519 watch = Watch(1, self.path, fake_processor)
550 watch = Watch(1, path, fake_processor)520 watch.ignore_path(os.path.join(self.path, child))
551 watch.ignore_path(os.path.join(path, child))
552 paths_not_to_ignore = []521 paths_not_to_ignore = []
553 paths_to_ignore = []522 paths_to_ignore = []
554 expected_events = []523 expected_events = []
555 for file_name in 'abcdef':524 for file_name in 'abcdef':
556 valid = os.path.join(child + file_name, file_name)525 valid = os.path.join(child + file_name, file_name)
557 paths_to_ignore.append((1, os.path.join(child, file_name)))526 paths_to_ignore.append((1, os.path.join(child, file_name)))
558 paths_not_to_ignore.append((1, valid))527 paths_not_to_ignore.append(
559 expected_events.append(os.path.join('C:\\path', valid))528 self.fake_events_processor.create_fake_event(valid))
529 expected_events.append(os.path.join(self.common_path, valid))
560 # ensure that the watch is watching530 # ensure that the watch is watching
561 watch.platform_watch.watching = True531 watch.platform_watch.watching = True
562 watch.platform_watch._process_events(paths_not_to_ignore)532 self.fake_events_processor.custom_process_events(
533 watch, paths_not_to_ignore)
563 self.assertEqual(len(paths_not_to_ignore), len(events),534 self.assertEqual(len(paths_not_to_ignore), len(events),
564 'Wrong number of events ignored.')535 'Wrong number of events ignored.')
565 self.assertTrue(all([event in expected_events for event in events]),536 self.assertTrue(all([event in expected_events for event in events]),
@@ -573,17 +544,19 @@
573 """Memorize the processed events."""544 """Memorize the processed events."""
574 events.append(event)545 events.append(event)
575546
576 path = u'\\\\?\\C:\\path' # a valid windows path547 child = 'child'
577 child = u'child'548 watch = Watch(1, self.path, fake_processor)
578 watch = Watch(1, path, fake_processor)549 watch.ignore_path(os.path.join(self.path, child))
579 watch.ignore_path(os.path.join(path, child))550 watch.remove_ignored_path(os.path.join(self.path, child))
580 watch.remove_ignored_path(os.path.join(path, child))
581 paths_not_to_ignore = []551 paths_not_to_ignore = []
582 for file_name in 'abcdef':552 for file_name in 'abcdef':
583 paths_not_to_ignore.append((1, os.path.join(child, file_name)))553 event = self.fake_events_processor.create_fake_event(
554 os.path.join(child, file_name))
555 paths_not_to_ignore.append(event)
584 # ensure that the watch is watching556 # ensure that the watch is watching
585 watch.platform_watch.watching = True557 watch.platform_watch.watching = True
586 watch.platform_watch._process_events(paths_not_to_ignore)558 self.fake_events_processor.custom_process_events(
559 watch, paths_not_to_ignore)
587 self.assertEqual(len(paths_not_to_ignore), len(events),560 self.assertEqual(len(paths_not_to_ignore), len(events),
588 'All events should have been accepted.')561 'All events should have been accepted.')
589562
@@ -595,190 +568,98 @@
595 """Memorize the processed events."""568 """Memorize the processed events."""
596 events.append(event.pathname)569 events.append(event.pathname)
597570
598 path = u'\\\\?\\C:\\path' # a valid windows path571 child_a = 'childa'
599 child_a = u'childa'572 child_b = 'childb'
600 child_b = u'childb'573 watch = Watch(1, self.path, fake_processor)
601 watch = Watch(1, path, fake_processor)574 watch.ignore_path(os.path.join(self.path, child_a))
602 watch.ignore_path(os.path.join(path, child_a))575 watch.ignore_path(os.path.join(self.path, child_b))
603 watch.ignore_path(os.path.join(path, child_b))576 watch.remove_ignored_path(os.path.join(self.path, child_a))
604 watch.remove_ignored_path(os.path.join(path, child_a))
605 paths_to_ignore = []577 paths_to_ignore = []
606 paths_not_to_ignore = []578 paths_not_to_ignore = []
607 expected_events = []579 expected_events = []
608 for file_name in 'abcdef':580 for file_name in 'abcdef':
609 paths_to_ignore.append((1, os.path.join(child_b, file_name)))581 paths_to_ignore.append((1, os.path.join(child_b, file_name)))
610 valid = os.path.join(child_a, file_name)582 valid = os.path.join(child_a, file_name)
611 paths_not_to_ignore.append((1, valid))583 event = self.fake_events_processor.create_fake_event(valid)
612 expected_events.append(os.path.join('C:\\path', valid))584 paths_not_to_ignore.append(event)
585 expected_events.append(os.path.join(self.common_path, valid))
613 # ensure that the watch is watching586 # ensure that the watch is watching
614 watch.platform_watch.watching = True587 watch.platform_watch.watching = True
615 watch.platform_watch._process_events(paths_not_to_ignore)588 self.fake_events_processor.custom_process_events(
589 watch, paths_not_to_ignore)
616 self.assertEqual(len(paths_not_to_ignore), len(events),590 self.assertEqual(len(paths_not_to_ignore), len(events),
617 'All events should have been accepted.')591 'All events should have been accepted.')
618 self.assertTrue(all([event in expected_events for event in events]),592 self.assertTrue(all([event in expected_events for event in events]),
619 'Paths ignored that should have not been ignored.')593 'Paths ignored that should have not been ignored.')
620594
621 @defer.inlineCallbacks
622 def test_call_deferred_already_called(self):
623 """Test that the function is not called."""
624 method_args = []
625
626 def fake_call(*args, **kwargs):
627 """Execute the call."""
628 method_args.append((args, kwargs),)
629
630 path = u'\\\\?\\C:\\path' # a valid windows path
631 watch = Watch(1, path, None)
632 yield watch.platform_watch._watch_started_deferred.callback(True)
633 watch.platform_watch._call_deferred(fake_call, None)
634 self.assertEqual(0, len(method_args))
635
636 def test_call_deferred_not_called(self):
637 """Test that is indeed called."""
638 method_args = []
639
640 def fake_call(*args, **kwargs):
641 """Execute the call."""
642 method_args.append((args, kwargs),)
643
644 path = u'\\\\?\\C:\\path' # a valid windows path
645 watch = Watch(1, path, None)
646 watch.platform_watch._call_deferred(fake_call, None)
647 self.assertEqual(1, len(method_args))
648
649 def test_started_property(self):
650 """Test that the started property returns the started deferred."""
651 path = u'\\\\?\\C:\\path' # a valid windows path
652 watch = Watch(1, path, None)
653 self.assertEqual(watch.started, watch.platform_watch._watch_started_deferred)
654
655 def test_stopped_property(self):
656 """Test that the stopped property returns the stopped deferred."""
657 path = u'\\\\?\\C:\\path' # a valid windows path
658 watch = Watch(1, path, None)
659 self.assertEqual(watch.stopped, watch.platform_watch._watch_stopped_deferred)
660
661 def random_error(self, *args):595 def random_error(self, *args):
662 """Throw a fake exception."""596 """Throw a fake exception."""
663 raise FakeException()597 raise FakeException()
664598
665 @defer.inlineCallbacks
666 def test_start_watching_fails_early_in_thread(self):
667 """An early failure inside the thread should errback the deferred."""
668 test_path = self.mktemp("test_directory")
669 self.patch(filesystem_notifications, "CreateFileW", self.random_error)
670 watch = Watch(1, test_path, None)
671 d = watch.start_watching()
672 yield self.assertFailure(d, FakeException)
673
674 @defer.inlineCallbacks
675 def test_start_watching_fails_late_in_thread(self):
676 """A late failure inside the thread should errback the deferred."""
677 test_path = self.mktemp("test_directory")
678 self.patch(filesystem_notifications, "ReadDirectoryChangesW",
679 self.random_error)
680 watch = Watch(1, test_path, None)
681 d = watch.start_watching()
682 yield self.assertFailure(d, FakeException)
683
684 @defer.inlineCallbacks
685 def test_close_handle_is_called_on_error(self):
686 """CloseHandle is called when there's an error in the watch thread."""
687 test_path = self.mktemp("test_directory")
688 close_called = []
689 self.patch(filesystem_notifications, "CreateFileW", lambda *_: None)
690 self.patch(filesystem_notifications, "CloseHandle",
691 close_called.append)
692 self.patch(filesystem_notifications, "ReadDirectoryChangesW",
693 self.random_error)
694 watch = Watch(1, test_path, None)
695 d = watch.start_watching()
696 yield self.assertFailure(d, FakeException)
697 self.assertEqual(len(close_called), 1)
698 yield watch.stop_watching()
699
700 @defer.inlineCallbacks
701 def test_stop_watching_fired_when_watch_thread_finishes(self):
702 """The deferred returned is fired when the watch thread finishes."""
703 test_path = self.mktemp("another_test_directory")
704 watch = Watch(1, test_path, None)
705 yield watch.start_watching()
706 self.assertNotEqual(watch.platform_watch._watch_handle, None)
707 yield watch.stop_watching()
708 self.assertEqual(watch.platform_watch._watch_handle, None)
709
710 def test_is_path_dir_missing_no_subdir(self):599 def test_is_path_dir_missing_no_subdir(self):
711 """Test when the path does not exist and is no a subdir."""600 """Test when the path does not exist and is no a subdir."""
712 path = u'\\\\?\\C:\\path\\to\\no\\dir'
713 test_path = self.mktemp("test_directory")601 test_path = self.mktemp("test_directory")
714 self.patch(os.path, 'exists', lambda path: False)602 self.patch(os.path, 'exists', lambda path: False)
715 watch = Watch(1, test_path, None)603 watch = Watch(1, test_path, None)
716 self.assertFalse(watch._path_is_dir(path))604 self.assertFalse(watch._path_is_dir(self.invalid_path))
717605
718 def test_is_path_dir_missing_in_subdir(self):606 def test_is_path_dir_missing_in_subdir(self):
719 """Test when the path does not exist and is a subdir."""607 """Test when the path does not exist and is a subdir."""
720 path = u'\\\\?\\C:\\path\\to\\no\\dir'
721 test_path = self.mktemp("test_directory")608 test_path = self.mktemp("test_directory")
722 self.patch(os.path, 'exists', lambda path: False)609 self.patch(os.path, 'exists', lambda path: False)
723 watch = Watch(1, test_path, None)610 watch = Watch(1, test_path, None)
724 watch._subdirs.add(path)611 watch._subdirs.add(self.invalid_path)
725 self.assertTrue(watch._path_is_dir(path))612 self.assertTrue(watch._path_is_dir(self.invalid_path))
726613
727 def test_is_path_dir_present_is_dir(self):614 def test_is_path_dir_present_is_dir(self):
728 """Test when the path is present and is dir."""615 """Test when the path is present and is dir."""
729 path = u'\\\\?\\C:\\path\\to\\no\\dir'
730 test_path = self.mktemp("test_directory")616 test_path = self.mktemp("test_directory")
731 self.patch(os.path, 'exists', lambda path: True)617 self.patch(os.path, 'exists', lambda path: True)
732 self.patch(os.path, 'isdir', lambda path: True)618 self.patch(os.path, 'isdir', lambda path: True)
733 watch = Watch(1, test_path, None)619 watch = Watch(1, test_path, None)
734 watch._subdirs.add(path)620 watch._subdirs.add(self.invalid_path)
735 self.assertTrue(watch._path_is_dir(path))621 self.assertTrue(watch._path_is_dir(self.invalid_path))
736622
737 def test_is_path_dir_present_no_dir(self):623 def test_is_path_dir_present_no_dir(self):
738 """Test when the path is present but not a dir."""624 """Test when the path is present but not a dir."""
739 path = u'\\\\?\\C:\\path\\to\\no\\dir'
740 test_path = self.mktemp("test_directory")625 test_path = self.mktemp("test_directory")
741 self.patch(os.path, 'exists', lambda path: True)626 self.patch(os.path, 'exists', lambda path: True)
742 self.patch(os.path, 'isdir', lambda path: False)627 self.patch(os.path, 'isdir', lambda path: False)
743 watch = Watch(1, test_path, None)628 watch = Watch(1, test_path, None)
744 watch._subdirs.add(path)629 watch._subdirs.add(self.invalid_path)
745 self.assertFalse(watch._path_is_dir(path))630 self.assertFalse(watch._path_is_dir(self.invalid_path))
746631
747 def test_update_subdirs_create_not_present(self):632 def test_update_subdirs_create_not_present(self):
748 """Test when we update on a create event and not present."""633 """Test when we update on a create event and not present."""
749 path = u'\\\\?\\C:\\path\\to\\no\\dir'
750 test_path = self.mktemp("test_directory")634 test_path = self.mktemp("test_directory")
751 watch = Watch(1, test_path, None)635 watch = Watch(1, test_path, None)
752 watch._update_subdirs(path, REVERSE_WINDOWS_ACTIONS[IN_CREATE])636 watch._update_subdirs(self.invalid_path, REVERSE_OS_ACTIONS[IN_CREATE])
753 self.assertTrue(path in watch._subdirs)637 self.assertTrue(self.invalid_path in watch._subdirs)
754638
755 def test_update_subdirs_create_present(self):639 def test_update_subdirs_create_present(self):
756 """Test when we update on a create event and is present."""640 """Test when we update on a create event and is present."""
757 path = u'\\\\?\\C:\\path\\to\\no\\dir'
758 test_path = self.mktemp("test_directory")641 test_path = self.mktemp("test_directory")
759 watch = Watch(1, test_path, None)642 watch = Watch(1, test_path, None)
760 watch._subdirs.add(path)643 watch._subdirs.add(self.invalid_path)
761 old_length = len(watch._subdirs)644 old_length = len(watch._subdirs)
762 watch._update_subdirs(path, REVERSE_WINDOWS_ACTIONS[IN_CREATE])645 watch._update_subdirs(self.invalid_path, REVERSE_OS_ACTIONS[IN_CREATE])
763 self.assertTrue(path in watch._subdirs)646 self.assertTrue(self.invalid_path in watch._subdirs)
764 self.assertEqual(old_length, len(watch._subdirs))647 self.assertEqual(old_length, len(watch._subdirs))
765648
766 def test_update_subdirs_delete_not_present(self):649 def test_update_subdirs_delete_not_present(self):
767 """Test when we delete and is not present."""650 """Test when we delete and is not present."""
768 path = u'\\\\?\\C:\\path\\to\\no\\dir'
769 test_path = self.mktemp("test_directory")651 test_path = self.mktemp("test_directory")
770 watch = Watch(1, test_path, None)652 watch = Watch(1, test_path, None)
771 watch._update_subdirs(path, REVERSE_WINDOWS_ACTIONS[IN_DELETE])653 watch._update_subdirs(self.invalid_path, REVERSE_OS_ACTIONS[IN_DELETE])
772 self.assertTrue(path not in watch._subdirs)654 self.assertTrue(self.invalid_path not in watch._subdirs)
773655
774 def test_update_subdirs_delete_present(self):656 def test_update_subdirs_delete_present(self):
775 """Test when we delete and is present."""657 """Test when we delete and is present."""
776 path = u'\\\\?\\C:\\path\\to\\no\\dir'
777 test_path = self.mktemp("test_directory")658 test_path = self.mktemp("test_directory")
778 watch = Watch(1, test_path, None)659 watch = Watch(1, test_path, None)
779 watch._subdirs.add(path)660 watch._subdirs.add(self.invalid_path)
780 watch._update_subdirs(path, REVERSE_WINDOWS_ACTIONS[IN_DELETE])661 watch._update_subdirs(self.invalid_path, REVERSE_OS_ACTIONS[IN_DELETE])
781 self.assertTrue(path not in watch._subdirs)662 self.assertTrue(self.invalid_path not in watch._subdirs)
782663
783664
784class TestWatchManager(BaseTwistedTestCase):665class TestWatchManager(BaseTwistedTestCase):
@@ -788,11 +669,8 @@
788 def setUp(self):669 def setUp(self):
789 """Set each of the tests."""670 """Set each of the tests."""
790 yield super(TestWatchManager, self).setUp()671 yield super(TestWatchManager, self).setUp()
791 self.parent_path = u'\\\\?\\C:\\' # a valid windows path
792 self.path = self.parent_path + u'path'
793 self.watch = Watch(1, self.path, None)
794 self.manager = WatchManager(None)672 self.manager = WatchManager(None)
795 self.manager._wdm = {1: self.watch}673 self.fake_events_processor = FakeEventsProcessor()
796674
797 @defer.inlineCallbacks675 @defer.inlineCallbacks
798 def test_stop(self):676 def test_stop(self):
@@ -808,25 +686,9 @@
808 yield self.manager.stop()686 yield self.manager.stop()
809 self.assertTrue(self.was_called, 'The watch stop should be called.')687 self.assertTrue(self.was_called, 'The watch stop should be called.')
810688
811 @defer.inlineCallbacks
812 def test_stop_multiple(self):689 def test_stop_multiple(self):
813 """Test that stop is fired when *all* watches have stopped."""690 """Test that stop is fired when *all* watches have stopped."""
814691 raise NotImplementedError
815 def fake_stop_watching(watch):
816 """Another fake stop watch."""
817 return watch.stopped
818
819 self.patch(Watch, "stop_watching", fake_stop_watching)
820 second_path = self.parent_path + u"second_path"
821 second_watch = Watch(2, second_path, None)
822 self.manager._wdm[2] = second_watch
823 d = self.manager.stop()
824 self.assertFalse(d.called, "Not fired before all watches end")
825 self.watch.stopped.callback(None)
826 self.assertFalse(d.called, "Not fired before all watches end")
827 second_watch.stopped.callback(None)
828 yield d
829 self.assertTrue(d.called, "Fired after the watches ended")
830692
831 def test_get_present_watch(self):693 def test_get_present_watch(self):
832 """Test that we can get a Watch using is wd."""694 """Test that we can get a Watch using is wd."""
@@ -836,7 +698,6 @@
836 """Test that we get an error when trying to get a missing wd."""698 """Test that we get an error when trying to get a missing wd."""
837 self.assertRaises(KeyError, self.manager.get_watch, (1,))699 self.assertRaises(KeyError, self.manager.get_watch, (1,))
838700
839 @defer.inlineCallbacks
840 def test_delete_present_watch(self):701 def test_delete_present_watch(self):
841 """Test that we can remove a present watch."""702 """Test that we can remove a present watch."""
842 self.was_called = False703 self.was_called = False
@@ -865,7 +726,6 @@
865 self.manager.add_watch(self.path, mask)726 self.manager.add_watch(self.path, mask)
866 self.assertEqual(1, len(self.manager._wdm))727 self.assertEqual(1, len(self.manager._wdm))
867 self.assertTrue(self.was_called, 'The watch start was not called.')728 self.assertTrue(self.was_called, 'The watch start was not called.')
868 self.assertEqual(self.path + os.path.sep, self.manager._wdm[0].path)
869729
870 def test_get_watch_present_wd(self):730 def test_get_watch_present_wd(self):
871 """Test that the correct path is returned."""731 """Test that the correct path is returned."""
@@ -889,10 +749,8 @@
889 """A watch on an unwatched path returns None."""749 """A watch on an unwatched path returns None."""
890 self.assertEqual(None, self.manager.get_wd(self.parent_path))750 self.assertEqual(None, self.manager.get_wd(self.parent_path))
891751
892 @defer.inlineCallbacks
893 def test_rm_present_wd(self):752 def test_rm_present_wd(self):
894 """Test the removal of a present watch."""753 """Test the removal of a present watch."""
895
896 def fake_stop_watching():754 def fake_stop_watching():
897 """Fake stop watch."""755 """Fake stop watch."""
898 return defer.succeed(True)756 return defer.succeed(True)
@@ -913,8 +771,9 @@
913 self.manager.rm_path(self.path)771 self.manager.rm_path(self.path)
914 self.assertEqual(self.watch, self.manager._wdm.get(1))772 self.assertEqual(self.watch, self.manager._wdm.get(1))
915 self.watch._watching = True773 self.watch._watching = True
916 self.watch.platform_watch._process_events(774 event = self.fake_events_processor.create_fake_event(
917 [(1, os.path.join(self.path, 'test'))])775 os.path.join(self.path, 'test'))
776 self.fake_events_processor.custom_process_events(self.watch, [event])
918 self.assertEqual(0, len(events))777 self.assertEqual(0, len(events))
919778
920 def test_rm_child_path(self):779 def test_rm_child_path(self):
@@ -926,41 +785,27 @@
926 events.append(event.pathname)785 events.append(event.pathname)
927786
928 self.watch._processor = fake_processor787 self.watch._processor = fake_processor
929 child = os.path.join(self.path, u'child')788 child = os.path.join(self.path, 'child')
930 self.manager.rm_path(child)789 self.manager.rm_path(child)
931 self.assertEqual(self.watch, self.manager._wdm[1])790 self.assertEqual(self.watch, self.manager._wdm[1])
932 # assert that the correct event is ignored791 # assert that the correct event is ignored
933 self.watch.platform_watch.watching = True792 self.watch.platform_watch.watching = True
934 self.watch.platform_watch._process_events(793 event = self.fake_events_processor.create_fake_event(
935 [(1, os.path.join('child', 'test'))])794 os.path.join('child', 'test'))
795 self.fake_events_processor.custom_process_events(self.watch, [event])
936 self.assertEqual(0, len(events))796 self.assertEqual(0, len(events))
937 # assert that other events are not ignored797 # assert that other events are not ignored
938 self.watch.platform_watch._process_events([(1, 'test')])798 event2 = self.fake_events_processor.create_fake_event('test')
799 self.fake_events_processor.custom_process_events(self.watch, [event2])
939 self.assertEqual(1, len(events))800 self.assertEqual(1, len(events))
940801
941802
942class TestWatchManagerAddWatches(BaseTwistedTestCase):803class TestWatchManagerAddWatches(BaseTwistedTestCase):
943 """Test the watch manager."""804 """Test the watch manager."""
944 timeout = 5
945805
946 def test_add_watch_twice(self):806 def test_add_watch_twice(self):
947 """Adding a watch twice succeeds when the watch is running."""807 """Adding a watch twice succeeds when the watch is running."""
948 self.patch(Watch, "start_watching", lambda self: self.started)808 raise NotImplementedError
949 manager = WatchManager(None)
950 # no need to stop watching because start_watching is fake
951
952 path = u'\\\\?\\C:\\test' # a valid windows path
953 mask = 'fake bit mask'
954 d1 = manager.add_watch(path, mask)
955 d2 = manager.add_watch(path, mask)
956
957 self.assertFalse(d1.called, "Should not be called yet.")
958 self.assertFalse(d2.called, "Should not be called yet.")
959
960 manager._wdm.values()[0].started.callback(True)
961
962 self.assertTrue(d1.called, "Should already be called.")
963 self.assertTrue(d2.called, "Should already be called.")
964809
965810
966class FakeEvent(object):811class FakeEvent(object):
@@ -1083,16 +928,6 @@
1083 self.assertEqual(event, self.general.called_methods[0][1])928 self.assertEqual(event, self.general.called_methods[0][1])
1084 self.assertEqual(paths, self.general.called_methods[0][2])929 self.assertEqual(paths, self.general.called_methods[0][2])
1085930
1086 def test_platform_is_ignored(self):
1087 """Test that we do indeed ignore the correct paths."""
1088 not_ignored = 'test'
1089 ignored = not_ignored + '.lnk'
1090 path_is_ignored = notify_processor.common.path_is_ignored
1091 self.assertFalse(path_is_ignored(not_ignored),
1092 'Only links should be ignored.')
1093 self.assertTrue(path_is_ignored(ignored),
1094 'Links should be ignored.')
1095
1096 def test_is_ignored(self):931 def test_is_ignored(self):
1097 """Test that we do ensure that the path is ignored."""932 """Test that we do ensure that the path is ignored."""
1098 path = 'path'933 path = 'path'
@@ -1406,31 +1241,15 @@
14061241
1407class FilesystemMonitorTestCase(BaseTwistedTestCase):1242class FilesystemMonitorTestCase(BaseTwistedTestCase):
1408 """Tests for the FilesystemMonitor."""1243 """Tests for the FilesystemMonitor."""
1244
1409 timeout = 51245 timeout = 5
14101246
1411 def test_add_watch_twice(self):1247 def test_add_watch_twice(self):
1412 """Check the deferred returned by a second add_watch."""1248 """Check the deferred returned by a second add_watch."""
1413 self.patch(Watch, "start_watching", lambda self: self.started)1249 raise NotImplementedError
1414 monitor = FilesystemMonitor(None, None)1250
1415 # no need to stop watching because start_watching is fake
1416
1417 parent_path = 'C:\\test' # a valid windows path in utf-8 bytes
1418 child_path = parent_path + "\\child"
1419 d1 = monitor.add_watch(parent_path)
1420 d2 = monitor.add_watch(child_path)
1421
1422 self.assertFalse(d1.called, "Should not be called yet.")
1423 self.assertFalse(d2.called, "Should not be called yet.")
1424
1425 monitor._watch_manager._wdm.values()[0].started.callback(True)
1426
1427 self.assertTrue(d1.called, "Should already be called.")
1428 self.assertTrue(d2.called, "Should already be called.")
1429
1430 @defer.inlineCallbacks
1431 def test_add_watches_to_udf_ancestors(self):1251 def test_add_watches_to_udf_ancestors(self):
1432 """Test that the ancestor watches are not added."""1252 """Test that the ancestor watches are not added."""
1433
1434 class FakeVolume(object):1253 class FakeVolume(object):
1435 """A fake UDF."""1254 """A fake UDF."""
14361255
14371256
=== modified file 'tests/platform/filesystem_notifications/test_darwin.py'
--- tests/platform/filesystem_notifications/test_darwin.py 2012-07-18 15:18:04 +0000
+++ tests/platform/filesystem_notifications/test_darwin.py 2012-08-22 18:22:29 +0000
@@ -28,16 +28,13 @@
28# files in the program, then also delete it here.28# files in the program, then also delete it here.
29"""Test the filesystem notifications on MAC OS."""29"""Test the filesystem notifications on MAC OS."""
3030
31import itertools
31import logging32import logging
32import os33import os
33import tempfile34import tempfile
34import thread35import thread
35import itertools
3636
37import fsevents
38from twisted.internet import defer37from twisted.internet import defer
39
40from contrib.testing.testcase import BaseTwistedTestCase
41from ubuntuone.devtools.handlers import MementoHandler38from ubuntuone.devtools.handlers import MementoHandler
42from ubuntuone.platform.filesystem_notifications.monitor import (39from ubuntuone.platform.filesystem_notifications.monitor import (
43 common,40 common,
@@ -51,7 +48,6 @@
51 WatchManager,48 WatchManager,
52)49)
53from ubuntuone.platform.filesystem_notifications.pyinotify_agnostic import (50from ubuntuone.platform.filesystem_notifications.pyinotify_agnostic import (
54 EventsCodes,
55 ProcessEvent,51 ProcessEvent,
56 IN_CLOSE_WRITE,52 IN_CLOSE_WRITE,
57 IN_CREATE,53 IN_CREATE,
@@ -59,6 +55,7 @@
59 IN_OPEN,55 IN_OPEN,
60)56)
61from tests.platform.filesystem_notifications import BaseFSMonitorTestCase57from tests.platform.filesystem_notifications import BaseFSMonitorTestCase
58from tests.platform.filesystem_notifications import common as common_tests
6259
6360
64# A reverse mapping for the tests61# A reverse mapping for the tests
@@ -67,23 +64,18 @@
67 REVERSE_MACOS_ACTIONS[value] = key64 REVERSE_MACOS_ACTIONS[value] = key
6865
6966
70OP_FLAGS = EventsCodes.FLAG_COLLECTIONS['OP_FLAGS']67class FakeEventsProcessor(object):
71IS_DIR = EventsCodes.FLAG_COLLECTIONS['SPECIAL_FLAGS']['IN_ISDIR']68
7269 """Handle fake events creation and processing."""
7370
74class FakeException(Exception):71 def create_fake_event(self, filename):
75 """A fake Exception used in tests."""72 """Create a fake file event."""
7673 return FakeFileEvent(256, None, filename)
7774
78class FakeVolume(object):75 def custom_process_events(self, watch, events):
79 """A fake volume."""76 """Adapt to each platform way to process events."""
8077 for event in events:
81 def __init__(self, path, ancestors):78 watch.platform_watch._process_events(event)
82 """Create a new instance."""
83 super(FakeVolume, self).__init__()
84 self.volume_id = path
85 self.path = path
86 self.ancestors = ancestors
8779
8880
89class FakeFileEvent(object):81class FakeFileEvent(object):
@@ -135,7 +127,7 @@
135 assert self.main_thread_id == thread.get_ident()127 assert self.main_thread_id == thread.get_ident()
136128
137129
138class TestWatch(BaseTwistedTestCase):130class TestWatch(common_tests.TestWatch):
139 """Test the watch so that it returns the same events as pyinotify."""131 """Test the watch so that it returns the same events as pyinotify."""
140132
141 timeout = 5133 timeout = 5
@@ -143,6 +135,9 @@
143 @defer.inlineCallbacks135 @defer.inlineCallbacks
144 def setUp(self):136 def setUp(self):
145 yield super(TestWatch, self).setUp()137 yield super(TestWatch, self).setUp()
138 self.path = '/Users/username/folder'
139 self.common_path = '/Users/username/folder'
140 self.invalid_path = '/Users/username/path/to/not/dir'
146 self.basedir = self.mktemp('test_root')141 self.basedir = self.mktemp('test_root')
147 self.mask = None142 self.mask = None
148 self.stream = None143 self.stream = None
@@ -151,6 +146,7 @@
151 self.raw_events = []146 self.raw_events = []
152 self.paths_checked = []147 self.paths_checked = []
153 old_is_dir = Watch._path_is_dir148 old_is_dir = Watch._path_is_dir
149 self.fake_events_processor = FakeEventsProcessor()
154150
155 def path_is_dir_wrapper(watch, path):151 def path_is_dir_wrapper(watch, path):
156 """Wrapper that gets the checked paths."""152 """Wrapper that gets the checked paths."""
@@ -158,37 +154,32 @@
158 self.paths_checked.append((path, result))154 self.paths_checked.append((path, result))
159 return result155 return result
160156
161 self.patch(Watch, '_path_is_dir',157 self.patch(Watch, '_path_is_dir', path_is_dir_wrapper)
162 path_is_dir_wrapper)158
163159 def test_not_ignore_path(self):
164 @defer.inlineCallbacks160 """Test that we do get the events when they do not match."""
165 def _perform_operations(self, path, mask, actions, number_events):161 self.patch(filesystem_notifications.reactor, 'callFromThread',
166 """Perform the file operations and returns the recorded events."""162 lambda x, e: x(e))
167 handler = TestCaseHandler(number_events=number_events)163 super(TestWatch, self).test_not_ignore_path()
168 manager = WatchManager(handler)164
169 yield manager.add_watch(path, mask)165 def test_undo_ignore_path_ignored(self):
170 # change the logger so that we can check the logs if we wanted166 """Test that we do deal with events from an old ignored path."""
171 manager._wdm[0].log.addHandler(self.memento)167 self.patch(filesystem_notifications.reactor, 'callFromThread',
172 # clean logger later168 lambda x, e: x(e))
173 self.addCleanup(manager._wdm[0].log.removeHandler, self.memento)169 super(TestWatch, self).test_not_ignore_path()
174 # execution the actions170
175 actions()171 def test_undo_ignore_path_other_ignored(self):
176 # process the recorded events172 """Test that we can undo and the other path is ignored."""
177 ret = yield handler.deferred173 self.patch(filesystem_notifications.reactor, 'callFromThread',
178 self.addCleanup(manager.stop)174 lambda x, e: x(e))
179 defer.returnValue(ret)175 super(TestWatch, self).test_not_ignore_path()
180176
181 def _assert_logs(self, events):177 def test_mixed_ignore_path(self):
182 """Assert the debug logs."""178 """Test that we do get the correct events."""
183 logs = []179 self.patch(filesystem_notifications.reactor, 'callFromThread',
184 msg = 'Is path %r a dir? %s'180 lambda x, e: x(e))
185 logs.extend([msg % data for data in self.paths_checked])181 super(TestWatch, self).test_mixed_ignore_path()
186 msg = 'Pushing event %r to processor.'182
187 logs.extend([msg % e for e in events])
188 for msg in logs:
189 self.assertTrue(self.memento.check_debug(msg))
190
191 @defer.inlineCallbacks
192 def test_file_create(self):183 def test_file_create(self):
193 """Test that the correct event is returned on a file create."""184 """Test that the correct event is returned on a file create."""
194 file_name = os.path.join(self.basedir, 'test_file_create')185 file_name = os.path.join(self.basedir, 'test_file_create')
@@ -205,14 +196,12 @@
205 create_file, 1)196 create_file, 1)
206 event = events[0]197 event = events[0]
207 self.assertFalse(event.dir)198 self.assertFalse(event.dir)
208 self.assertEqual(OP_FLAGS['IN_CREATE'], event.mask)199 self.assertEqual(common_tests.OP_FLAGS['IN_CREATE'], event.mask)
209 self.assertEqual('IN_CREATE', event.maskname)200 self.assertEqual('IN_CREATE', event.maskname)
210 self.assertEqual(os.path.split(file_name)[1], event.name)201 self.assertEqual(os.path.split(file_name)[1], event.name)
211 self.assertEqual('.', event.path)202 self.assertEqual('.', event.path)
212 self.assertEqual(os.path.join(self.basedir, file_name), event.pathname)203 self.assertEqual(os.path.join(self.basedir, file_name), event.pathname)
213 self.assertEqual(0, event.wd)204 self.assertEqual(0, event.wd)
214 # assert the logging
215 self._assert_logs(events)
216205
217 @defer.inlineCallbacks206 @defer.inlineCallbacks
218 def test_dir_create(self):207 def test_dir_create(self):
@@ -227,14 +216,13 @@
227 create_dir, 1)216 create_dir, 1)
228 event = events[0]217 event = events[0]
229 self.assertTrue(event.dir)218 self.assertTrue(event.dir)
230 self.assertEqual(OP_FLAGS['IN_CREATE'] | IS_DIR, event.mask)219 self.assertEqual(common_tests.OP_FLAGS['IN_CREATE'] |
220 common_tests.IS_DIR, event.mask)
231 self.assertEqual('IN_CREATE|IN_ISDIR', event.maskname)221 self.assertEqual('IN_CREATE|IN_ISDIR', event.maskname)
232 self.assertEqual(os.path.split(dir_name)[1], event.name)222 self.assertEqual(os.path.split(dir_name)[1], event.name)
233 self.assertEqual('.', event.path)223 self.assertEqual('.', event.path)
234 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)224 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)
235 self.assertEqual(0, event.wd)225 self.assertEqual(0, event.wd)
236 # assert the logging
237 self._assert_logs(events)
238226
239 @defer.inlineCallbacks227 @defer.inlineCallbacks
240 def test_file_remove(self):228 def test_file_remove(self):
@@ -251,14 +239,12 @@
251 remove_file, 1)239 remove_file, 1)
252 event = events[0]240 event = events[0]
253 self.assertFalse(event.dir)241 self.assertFalse(event.dir)
254 self.assertEqual(OP_FLAGS['IN_DELETE'], event.mask)242 self.assertEqual(common_tests.OP_FLAGS['IN_DELETE'], event.mask)
255 self.assertEqual('IN_DELETE', event.maskname)243 self.assertEqual('IN_DELETE', event.maskname)
256 self.assertEqual(os.path.split(file_name)[1], event.name)244 self.assertEqual(os.path.split(file_name)[1], event.name)
257 self.assertEqual('.', event.path)245 self.assertEqual('.', event.path)
258 self.assertEqual(os.path.join(self.basedir, file_name), event.pathname)246 self.assertEqual(os.path.join(self.basedir, file_name), event.pathname)
259 self.assertEqual(0, event.wd)247 self.assertEqual(0, event.wd)
260 # assert the logging
261 self._assert_logs(events)
262248
263 @defer.inlineCallbacks249 @defer.inlineCallbacks
264 def test_dir_remove(self):250 def test_dir_remove(self):
@@ -275,13 +261,12 @@
275 remove_dir, 1)261 remove_dir, 1)
276 event = events[0]262 event = events[0]
277 self.assertTrue(event.dir)263 self.assertTrue(event.dir)
278 self.assertEqual(OP_FLAGS['IN_DELETE'] | IS_DIR, event.mask)264 self.assertEqual(common_tests.OP_FLAGS['IN_DELETE'] |
265 common_tests.IS_DIR, event.mask)
279 self.assertEqual('IN_DELETE|IN_ISDIR', event.maskname)266 self.assertEqual('IN_DELETE|IN_ISDIR', event.maskname)
280 self.assertEqual('.', event.path)267 self.assertEqual('.', event.path)
281 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)268 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)
282 self.assertEqual(0, event.wd)269 self.assertEqual(0, event.wd)
283 # assert the logging
284 self._assert_logs(events)
285270
286 @defer.inlineCallbacks271 @defer.inlineCallbacks
287 def test_file_write(self):272 def test_file_write(self):
@@ -301,14 +286,12 @@
301 write_file, 1)286 write_file, 1)
302 event = events[0]287 event = events[0]
303 self.assertFalse(event.dir)288 self.assertFalse(event.dir)
304 self.assertEqual(OP_FLAGS['IN_CREATE'], event.mask)289 self.assertEqual(common_tests.OP_FLAGS['IN_CREATE'], event.mask)
305 self.assertEqual('IN_CREATE', event.maskname)290 self.assertEqual('IN_CREATE', event.maskname)
306 self.assertEqual(os.path.split(file_name)[1], event.name)291 self.assertEqual(os.path.split(file_name)[1], event.name)
307 self.assertEqual('.', event.path)292 self.assertEqual('.', event.path)
308 self.assertEqual(os.path.join(self.basedir, file_name), event.pathname)293 self.assertEqual(os.path.join(self.basedir, file_name), event.pathname)
309 self.assertEqual(0, event.wd)294 self.assertEqual(0, event.wd)
310 # assert the logging
311 self._assert_logs(events)
312295
313 @defer.inlineCallbacks296 @defer.inlineCallbacks
314 def test_file_moved_to_watched_dir_same_watcher(self):297 def test_file_moved_to_watched_dir_same_watcher(self):
@@ -330,7 +313,8 @@
330 move_to_event = events[1]313 move_to_event = events[1]
331 # first test the move from314 # first test the move from
332 self.assertFalse(move_from_event.dir)315 self.assertFalse(move_from_event.dir)
333 self.assertEqual(OP_FLAGS['IN_MOVED_FROM'], move_from_event.mask)316 self.assertEqual(common_tests.OP_FLAGS['IN_MOVED_FROM'],
317 move_from_event.mask)
334 self.assertEqual('IN_MOVED_FROM', move_from_event.maskname)318 self.assertEqual('IN_MOVED_FROM', move_from_event.maskname)
335 self.assertEqual(os.path.split(from_file_name)[1],319 self.assertEqual(os.path.split(from_file_name)[1],
336 move_from_event.name)320 move_from_event.name)
@@ -340,7 +324,8 @@
340 self.assertEqual(0, move_from_event.wd)324 self.assertEqual(0, move_from_event.wd)
341 # test the move to325 # test the move to
342 self.assertFalse(move_to_event.dir)326 self.assertFalse(move_to_event.dir)
343 self.assertEqual(OP_FLAGS['IN_MOVED_TO'], move_to_event.mask)327 self.assertEqual(common_tests.OP_FLAGS['IN_MOVED_TO'],
328 move_to_event.mask)
344 self.assertEqual('IN_MOVED_TO', move_to_event.maskname)329 self.assertEqual('IN_MOVED_TO', move_to_event.maskname)
345 self.assertEqual(os.path.split(to_file_name)[1], move_to_event.name)330 self.assertEqual(os.path.split(to_file_name)[1], move_to_event.name)
346 self.assertEqual('.', move_to_event.path)331 self.assertEqual('.', move_to_event.path)
@@ -351,8 +336,6 @@
351 self.assertEqual(0, move_to_event.wd)336 self.assertEqual(0, move_to_event.wd)
352 # assert that both cookies are the same337 # assert that both cookies are the same
353 self.assertEqual(move_from_event.cookie, move_to_event.cookie)338 self.assertEqual(move_from_event.cookie, move_to_event.cookie)
354 # assert the logging
355 self._assert_logs(events)
356339
357 @defer.inlineCallbacks340 @defer.inlineCallbacks
358 def test_file_moved_to_not_watched_dir(self):341 def test_file_moved_to_not_watched_dir(self):
@@ -372,15 +355,13 @@
372 move_file, 1)355 move_file, 1)
373 event = events[0]356 event = events[0]
374 self.assertFalse(event.dir)357 self.assertFalse(event.dir)
375 self.assertEqual(OP_FLAGS['IN_DELETE'], event.mask)358 self.assertEqual(common_tests.OP_FLAGS['IN_DELETE'], event.mask)
376 self.assertEqual('IN_DELETE', event.maskname)359 self.assertEqual('IN_DELETE', event.maskname)
377 self.assertEqual(os.path.split(from_file_name)[1], event.name)360 self.assertEqual(os.path.split(from_file_name)[1], event.name)
378 self.assertEqual('.', event.path)361 self.assertEqual('.', event.path)
379 self.assertEqual(os.path.join(self.basedir, from_file_name),362 self.assertEqual(os.path.join(self.basedir, from_file_name),
380 event.pathname)363 event.pathname)
381 self.assertEqual(0, event.wd)364 self.assertEqual(0, event.wd)
382 # assert the logging
383 self._assert_logs(events)
384365
385 @defer.inlineCallbacks366 @defer.inlineCallbacks
386 def test_file_move_from_not_watched_dir(self):367 def test_file_move_from_not_watched_dir(self):
@@ -402,15 +383,13 @@
402 move_files, 1)383 move_files, 1)
403 event = events[0]384 event = events[0]
404 self.assertFalse(event.dir)385 self.assertFalse(event.dir)
405 self.assertEqual(OP_FLAGS['IN_CREATE'], event.mask)386 self.assertEqual(common_tests.OP_FLAGS['IN_CREATE'], event.mask)
406 self.assertEqual('IN_CREATE', event.maskname)387 self.assertEqual('IN_CREATE', event.maskname)
407 self.assertEqual(os.path.split(to_file_name)[1], event.name)388 self.assertEqual(os.path.split(to_file_name)[1], event.name)
408 self.assertEqual('.', event.path)389 self.assertEqual('.', event.path)
409 self.assertEqual(os.path.join(self.basedir, to_file_name),390 self.assertEqual(os.path.join(self.basedir, to_file_name),
410 event.pathname)391 event.pathname)
411 self.assertEqual(0, event.wd)392 self.assertEqual(0, event.wd)
412 # assert the logging
413 self._assert_logs(events)
414393
415 @defer.inlineCallbacks394 @defer.inlineCallbacks
416 def test_dir_moved_to_watched_dir_same_watcher(self):395 def test_dir_moved_to_watched_dir_same_watcher(self):
@@ -431,7 +410,8 @@
431 move_to_event = events[1]410 move_to_event = events[1]
432 # first test the move from411 # first test the move from
433 self.assertTrue(move_from_event.dir)412 self.assertTrue(move_from_event.dir)
434 self.assertEqual(OP_FLAGS['IN_MOVED_FROM'] | IS_DIR,413 self.assertEqual(common_tests.OP_FLAGS['IN_MOVED_FROM'] |
414 common_tests.IS_DIR,
435 move_from_event.mask)415 move_from_event.mask)
436 self.assertEqual('IN_MOVED_FROM|IN_ISDIR', move_from_event.maskname)416 self.assertEqual('IN_MOVED_FROM|IN_ISDIR', move_from_event.maskname)
437 self.assertEqual(os.path.split(from_dir_name)[1], move_from_event.name)417 self.assertEqual(os.path.split(from_dir_name)[1], move_from_event.name)
@@ -441,7 +421,8 @@
441 self.assertEqual(0, move_from_event.wd)421 self.assertEqual(0, move_from_event.wd)
442 # test the move to422 # test the move to
443 self.assertTrue(move_to_event.dir)423 self.assertTrue(move_to_event.dir)
444 self.assertEqual(OP_FLAGS['IN_MOVED_TO'] | IS_DIR, move_to_event.mask)424 self.assertEqual(common_tests.OP_FLAGS['IN_MOVED_TO'] |
425 common_tests.IS_DIR, move_to_event.mask)
445 self.assertEqual('IN_MOVED_TO|IN_ISDIR', move_to_event.maskname)426 self.assertEqual('IN_MOVED_TO|IN_ISDIR', move_to_event.maskname)
446 self.assertEqual(os.path.split(to_dir_name)[1], move_to_event.name)427 self.assertEqual(os.path.split(to_dir_name)[1], move_to_event.name)
447 self.assertEqual('.', move_to_event.path)428 self.assertEqual('.', move_to_event.path)
@@ -452,8 +433,6 @@
452 self.assertEqual(0, move_to_event.wd)433 self.assertEqual(0, move_to_event.wd)
453 # assert that both cookies are the same434 # assert that both cookies are the same
454 self.assertEqual(move_from_event.cookie, move_to_event.cookie)435 self.assertEqual(move_from_event.cookie, move_to_event.cookie)
455 # assert the logging
456 self._assert_logs(events)
457436
458 @defer.inlineCallbacks437 @defer.inlineCallbacks
459 def test_dir_moved_to_not_watched_dir(self):438 def test_dir_moved_to_not_watched_dir(self):
@@ -473,13 +452,12 @@
473 move_dir, 1)452 move_dir, 1)
474 event = events[0]453 event = events[0]
475 self.assertTrue(event.dir)454 self.assertTrue(event.dir)
476 self.assertEqual(OP_FLAGS['IN_DELETE'] | IS_DIR, event.mask)455 self.assertEqual(common_tests.OP_FLAGS['IN_DELETE'] |
456 common_tests.IS_DIR, event.mask)
477 self.assertEqual('IN_DELETE|IN_ISDIR', event.maskname)457 self.assertEqual('IN_DELETE|IN_ISDIR', event.maskname)
478 self.assertEqual('.', event.path)458 self.assertEqual('.', event.path)
479 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)459 self.assertEqual(os.path.join(self.basedir, dir_name), event.pathname)
480 self.assertEqual(0, event.wd)460 self.assertEqual(0, event.wd)
481 # assert the logging
482 self._assert_logs(events)
483461
484 @defer.inlineCallbacks462 @defer.inlineCallbacks
485 def test_dir_move_from_not_watched_dir(self):463 def test_dir_move_from_not_watched_dir(self):
@@ -499,7 +477,8 @@
499 move_dir, 1)477 move_dir, 1)
500 event = events[0]478 event = events[0]
501 self.assertTrue(event.dir)479 self.assertTrue(event.dir)
502 self.assertEqual(OP_FLAGS['IN_CREATE'] | IS_DIR, event.mask)480 self.assertEqual(common_tests.OP_FLAGS['IN_CREATE'] |
481 common_tests.IS_DIR, event.mask)
503 self.assertEqual('IN_CREATE|IN_ISDIR', event.maskname)482 self.assertEqual('IN_CREATE|IN_ISDIR', event.maskname)
504 self.assertEqual(os.path.split(from_dir_name)[1], event.name)483 self.assertEqual(os.path.split(from_dir_name)[1], event.name)
505 self.assertEqual('.', event.path)484 self.assertEqual('.', event.path)
@@ -521,149 +500,6 @@
521 self.assertEqual(0, len(handler.processed_events))500 self.assertEqual(0, len(handler.processed_events))
522 test_exclude_filter.skip = "we must rethink this test."501 test_exclude_filter.skip = "we must rethink this test."
523502
524 def test_ignore_path(self):
525 """Test that events from a path are ignored."""
526 events = []
527
528 def fake_processor(event):
529 """Memorize the processed events."""
530 events.append(event)
531
532 path = '/Users/username/folder'
533 child = 'child'
534 watch = Watch(1, path, fake_processor)
535 watch.ignore_path(os.path.join(path, child))
536 # ensure that the watch is watching
537 watch.platform_watch.watching = True
538 for file_name in 'abcdef':
539 event = FakeFileEvent(256, None, os.path.join(child, file_name))
540 watch.platform_watch._process_events(event)
541 self.assertEqual(0, len(events),
542 'All events should have been ignored.')
543
544 def test_not_ignore_path(self):
545 """Test that we do get the events when they do not match."""
546 events = []
547
548 def fake_processor(event):
549 """Memorize the processed events."""
550 events.append(event)
551
552 self.patch(filesystem_notifications.reactor, 'callFromThread',
553 lambda x, e: x(e))
554
555 path = '/Users/username/folder'
556 child = 'child'
557 watch = Watch(1, path, fake_processor)
558 watch.ignore_path(os.path.join(path, child))
559 paths_not_to_ignore = []
560 for file_name in 'abcdef':
561 event = FakeFileEvent(256, None, os.path.join(child + file_name,
562 file_name))
563 paths_not_to_ignore.append(event)
564 # ensure that the watch is watching
565 watch.platform_watch.watching = True
566 for event in paths_not_to_ignore:
567 watch.platform_watch._process_events(event)
568 self.assertEqual(len(paths_not_to_ignore), len(events),
569 'No events should have been ignored.')
570
571 def test_mixed_ignore_path(self):
572 """Test that we do get the correct events."""
573 events = []
574
575 def fake_processor(event):
576 """Memorize the processed events."""
577 events.append(event.pathname)
578
579 self.patch(filesystem_notifications.reactor, 'callFromThread',
580 lambda x, e: x(e))
581
582 child = 'child'
583 path = '/Users/username/folder'
584 watch = Watch(1, path, fake_processor)
585 watch.ignore_path(os.path.join(path, child))
586 paths_not_to_ignore = []
587 paths_to_ignore = []
588 expected_events = []
589 for file_name in 'abcdef':
590 valid = os.path.join(child + file_name, file_name)
591 paths_to_ignore.append((1, os.path.join(child, file_name)))
592 event = FakeFileEvent(256, None, valid)
593 paths_not_to_ignore.append(event)
594 expected_events.append(os.path.join(path, valid))
595 # ensure that the watch is watching
596 watch.platform_watch.watching = True
597 for event in paths_not_to_ignore:
598 watch.platform_watch._process_events(event)
599 self.assertEqual(len(paths_not_to_ignore), len(events),
600 'Wrong number of events ignored.')
601 self.assertTrue(all([event in expected_events for event in events]),
602 'Paths ignored that should have not been ignored.')
603
604 def test_undo_ignore_path_ignored(self):
605 """Test that we do deal with events from and old ignored path."""
606 events = []
607
608 def fake_processor(event):
609 """Memorize the processed events."""
610 events.append(event)
611
612 self.patch(filesystem_notifications.reactor, 'callFromThread',
613 lambda x, e: x(e))
614
615 path = '/Users/username/folder'
616 child = 'child'
617 watch = Watch(1, path, fake_processor)
618 watch.ignore_path(os.path.join(path, child))
619 watch.remove_ignored_path(os.path.join(path, child))
620 paths_not_to_ignore = []
621 for file_name in 'abcdef':
622 event = FakeFileEvent(256, None, os.path.join(child, file_name))
623 paths_not_to_ignore.append(event)
624 # ensure that the watch is watching
625 watch.platform_watch.watching = True
626 for event in paths_not_to_ignore:
627 watch.platform_watch._process_events(event)
628 self.assertEqual(len(paths_not_to_ignore), len(events),
629 'All events should have been accepted.')
630
631 def test_undo_ignore_path_other_ignored(self):
632 """Test that we can undo and the other path is ignored."""
633 events = []
634
635 def fake_processor(event):
636 """Memorize the processed events."""
637 events.append(event.pathname)
638
639 self.patch(filesystem_notifications.reactor, 'callFromThread',
640 lambda x, e: x(e))
641
642 path = '/Users/username/folder'
643 child_a = 'childa'
644 child_b = 'childb'
645 watch = Watch(1, path, fake_processor)
646 watch.ignore_path(os.path.join(path, child_a))
647 watch.ignore_path(os.path.join(path, child_b))
648 watch.remove_ignored_path(os.path.join(path, child_a))
649 paths_to_ignore = []
650 paths_not_to_ignore = []
651 expected_events = []
652 for file_name in 'abcdef':
653 paths_to_ignore.append((1, os.path.join(child_b, file_name)))
654 valid = os.path.join(child_a, file_name)
655 event = FakeFileEvent(256, None, valid)
656 paths_not_to_ignore.append(event)
657 expected_events.append(os.path.join(path, valid))
658 # ensure that the watch is watching
659 watch.platform_watch.watching = True
660 for event in paths_not_to_ignore:
661 watch.platform_watch._process_events(event)
662 self.assertEqual(len(paths_not_to_ignore), len(events),
663 'All events should have been accepted.')
664 self.assertTrue(all([event in expected_events for event in events]),
665 'Paths ignored that should have not been ignored.')
666
667 def test_stream_created(self):503 def test_stream_created(self):
668 """Test that the stream is created."""504 """Test that the stream is created."""
669 def fake_call(*args, **kwargs):505 def fake_call(*args, **kwargs):
@@ -684,7 +520,7 @@
684520
685 def random_error(self, *args):521 def random_error(self, *args):
686 """Throw a fake exception."""522 """Throw a fake exception."""
687 raise FakeException()523 raise common_tests.FakeException()
688524
689 def test_is_path_dir_missing_no_subdir(self):525 def test_is_path_dir_missing_no_subdir(self):
690 """Test when the path does not exist and is no a subdir."""526 """Test when the path does not exist and is no a subdir."""
@@ -760,7 +596,7 @@
760 self.assertTrue(path not in watch._subdirs)596 self.assertTrue(path not in watch._subdirs)
761597
762598
763class TestWatchManager(BaseTwistedTestCase):599class TestWatchManager(common_tests.TestWatchManager):
764 """Test the watch manager."""600 """Test the watch manager."""
765601
766 @defer.inlineCallbacks602 @defer.inlineCallbacks
@@ -772,24 +608,19 @@
772 self.watch = Watch(1, self.path, None)608 self.watch = Watch(1, self.path, None)
773 self.manager = WatchManager(None)609 self.manager = WatchManager(None)
774 self.manager._wdm = {1: self.watch}610 self.manager._wdm = {1: self.watch}
611 self.stream = None
612 self.fake_events_processor = FakeEventsProcessor()
775613
776 @defer.inlineCallbacks614 @defer.inlineCallbacks
777 def test_stop(self):615 def test_stop(self):
778 """Test that the different watches are stopped."""616 """Test that the different watches are stopped."""
779 self.was_called = False617 self.patch(self.manager.manager.observer, "unschedule",
780618 lambda x: None)
781 def fake_stop_watching(watch):
782 """Fake stop watch."""
783 self.was_called = True
784 return defer.succeed(True)
785
786 self.patch(Watch, "stop_watching", fake_stop_watching)
787 self.patch(self.manager.manager.observer, "unschedule", lambda x: None)619 self.patch(self.manager.manager.observer, "unschedule", lambda x: None)
788 yield self.manager.stop()620 yield super(TestWatchManager, self).test_stop()
789 self.assertTrue(self.was_called, 'The watch stop should be called.')
790621
791 def test_stop_multiple(self):622 def test_stop_multiple(self):
792 """The watches should became watching=False and the observer stopped."""623 """Watches should became watching=False and the observer stopped."""
793 self.patch(self.manager.manager.observer, "unschedule", lambda x: None)624 self.patch(self.manager.manager.observer, "unschedule", lambda x: None)
794 second_path = self.parent_path + "second_path"625 second_path = self.parent_path + "second_path"
795 second_watch = Watch(2, second_path, None)626 second_watch = Watch(2, second_path, None)
@@ -810,28 +641,6 @@
810 """Test that we get an error when trying to get a missing wd."""641 """Test that we get an error when trying to get a missing wd."""
811 self.assertRaises(KeyError, self.manager.get_watch, (1,))642 self.assertRaises(KeyError, self.manager.get_watch, (1,))
812643
813 @defer.inlineCallbacks
814 def test_delete_present_watch(self):
815 """Test that we can remove a present watch."""
816 self.was_called = False
817
818 def stop_watching():
819 """Fake stop watch."""
820 self.was_called = True
821 return defer.succeed(True)
822
823 def fake_unschedule(s):
824 """Fake function that should receive a Stream object."""
825 self.stream = s
826
827 self.patch(self.manager.manager.observer, "unschedule",
828 fake_unschedule)
829
830 self.watch.stop_watching = stop_watching
831 yield self.manager.del_watch(1)
832 self.assertIsInstance(self.stream, fsevents.Stream)
833 self.assertRaises(KeyError, self.manager.get_watch, (1,))
834
835 def test_add_single_watch(self):644 def test_add_single_watch(self):
836 """Test the addition of a new single watch."""645 """Test the addition of a new single watch."""
837 self.was_called = False646 self.was_called = False
@@ -849,10 +658,6 @@
849 self.assertTrue(self.was_called, 'The watch start was not called.')658 self.assertTrue(self.was_called, 'The watch start was not called.')
850 self.assertEqual(self.path + os.path.sep, self.manager._wdm[0].path)659 self.assertEqual(self.path + os.path.sep, self.manager._wdm[0].path)
851660
852 def test_get_watch_present_wd(self):
853 """Test that the correct path is returned."""
854 self.assertEqual(self.path + os.path.sep, self.manager.get_path(1))
855
856 def test_get_watch_missing_wd(self):661 def test_get_watch_missing_wd(self):
857 """Test that the correct path is returned."""662 """Test that the correct path is returned."""
858 self.manager._wdm = {}663 self.manager._wdm = {}
@@ -873,60 +678,24 @@
873678
874 def test_rm_present_wd(self):679 def test_rm_present_wd(self):
875 """Test the removal of a present watch."""680 """Test the removal of a present watch."""
876 self.patch(self.watch, "stop_watching", lambda: None)
877 self.patch(self.manager.manager.observer, "unschedule", lambda x: None)681 self.patch(self.manager.manager.observer, "unschedule", lambda x: None)
878 self.manager.rm_watch(1)682 super(TestWatchManager, self).test_rm_present_wd()
879 self.assertEqual(None, self.manager._wdm.get(1))
880
881 def test_rm_root_path(self):
882 """Test the removal of a root path."""
883 events = []
884
885 def fake_processor(event):
886 """Memorize the processed events."""
887 events.append(event.pathname)
888
889 self.watch._processor = fake_processor
890 self.manager.rm_path(self.path)
891 self.assertEqual(self.watch, self.manager._wdm.get(1))
892 self.watch._watching = True
893 event = FakeFileEvent(256, None, os.path.join(self.path, 'test'))
894 self.watch.platform_watch._process_events(event)
895 self.assertEqual(0, len(events))
896683
897 def test_rm_child_path(self):684 def test_rm_child_path(self):
898 """Test the removal of a child path."""685 """Test the removal of a child path."""
899 events = []
900
901 def fake_processor(event):
902 """Memorize the processed events."""
903 events.append(event.pathname)
904
905 self.patch(filesystem_notifications.reactor, 'callFromThread',686 self.patch(filesystem_notifications.reactor, 'callFromThread',
906 lambda x, e: x(e))687 lambda x, e: x(e))
907688 super(TestWatchManager, self).test_rm_child_path()
908 self.watch._processor = fake_processor689
909 child = os.path.join(self.path, 'child')690
910 self.manager.rm_path(child)691class TestWatchManagerAddWatches(common_tests.TestWatchManagerAddWatches):
911 self.assertEqual(self.watch, self.manager._wdm[1])
912 # assert that the correct event is ignored
913 self.watch.platform_watch.watching = True
914 event = FakeFileEvent(256, None, os.path.join('child', 'test'))
915 self.watch.platform_watch._process_events(event)
916 self.assertEqual(0, len(events))
917 # assert that other events are not ignored
918 event2 = FakeFileEvent(256, None, 'test')
919 self.watch.platform_watch._process_events(event2)
920 self.assertEqual(1, len(events))
921
922
923class TestWatchManagerAddWatches(BaseTwistedTestCase):
924 """Test the watch manager."""692 """Test the watch manager."""
925 timeout = 5693 timeout = 5
926694
927 def test_add_watch_twice(self):695 def test_add_watch_twice(self):
928 """Adding a watch twice succeeds when the watch is running."""696 """Adding a watch twice succeeds when the watch is running."""
929 self.patch(Watch, "start_watching", lambda self: None)697 self.patch(Watch, "start_watching", lambda self: None)
698 self.patch(Watch, "started", lambda self: True)
930 manager = WatchManager(None)699 manager = WatchManager(None)
931 # no need to stop watching because start_watching is fake700 # no need to stop watching because start_watching is fake
932701
@@ -936,7 +705,7 @@
936 d2 = manager.add_watch(path, mask)705 d2 = manager.add_watch(path, mask)
937706
938 self.assertTrue(d1.result, "Should not be called yet.")707 self.assertTrue(d1.result, "Should not be called yet.")
939 self.assertFalse(d2.result, "Should not be called yet.")708 self.assertTrue(d2, "Should not be called yet.")
940709
941710
942class FakeEvent(object):711class FakeEvent(object):
@@ -953,88 +722,15 @@
953 self.cookie = cookie722 self.cookie = cookie
954723
955724
956class FakeLog(object):725class TestNotifyProcessor(common_tests.TestNotifyProcessor):
957 """A fake log that is used by the general processor."""
958
959 def __init__(self):
960 """Create the fake."""
961 self.called_methods = []
962
963 def info(self, *args):
964 """Fake the info call."""
965 self.called_methods.append(('info', args))
966
967
968class FakeGeneralProcessor(object):
969 """Fake implementation of the general processor."""
970
971 def __init__(self):
972 """Create the fake."""
973 self.called_methods = []
974 self.paths_to_return = []
975 self.log = FakeLog()
976 self.share_id = None
977 self.ignore = False
978
979 def rm_from_mute_filter(self, event, paths):
980 """Fake rm_from_mute_filter."""
981 self.called_methods.append(('rm_from_mute_filter', event, paths))
982
983 def add_to_mute_filter(self, event, paths):
984 """Fake add_to_move_filter."""
985 self.called_methods.append(('add_to_mute_filter', event, paths))
986
987 def is_ignored(self, path):
988 """Fake is_ignored."""
989 self.called_methods.append(('is_ignored', path))
990 return self.ignore
991
992 def push_event(self, event):
993 """Fake push event."""
994 self.called_methods.append(('push_event', event))
995
996 def eq_push(self, event, path=None, path_to=None, path_from=None):
997 """Fake event to push event."""
998 self.called_methods.append(('eq_push', event, path, path_to,
999 path_from))
1000
1001 def get_paths_starting_with(self, fullpath, include_base=False):
1002 """Fake get_paths_starting_with."""
1003 self.called_methods.append(('get_paths_starting_with', fullpath,
1004 include_base))
1005 return self.paths_to_return
1006
1007 def get_path_share_id(self, path):
1008 """Fake get_path_share_id."""
1009 self.called_methods.append(('get_path_share_id', path))
1010 return self.share_id
1011
1012 def rm_watch(self, path):
1013 """Fake the remove watch."""
1014 self.called_methods.append(('rm_watch', path))
1015
1016 def freeze_begin(self, path):
1017 """Fake freeze_begin"""
1018 self.called_methods.append(('freeze_begin', path))
1019
1020 def freeze_rollback(self):
1021 """Fake rollback."""
1022 self.called_methods.append(('freeze_rollback',))
1023
1024 def freeze_commit(self, path):
1025 """Fake freeze commit."""
1026 self.called_methods.append(('freeze_commit', path))
1027
1028
1029class TestNotifyProcessor(BaseTwistedTestCase):
1030 """Test the notify processor."""726 """Test the notify processor."""
1031727
1032 @defer.inlineCallbacks728 @defer.inlineCallbacks
1033 def setUp(self):729 def setUp(self):
1034 """set up the diffeent tests."""730 """set up the different tests."""
1035 yield super(TestNotifyProcessor, self).setUp()731 yield super(TestNotifyProcessor, self).setUp()
1036 self.processor = notify_processor.NotifyProcessor(None)732 self.processor = notify_processor.NotifyProcessor(None)
1037 self.general = FakeGeneralProcessor()733 self.general = common_tests.FakeGeneralProcessor()
1038 self.processor.general_processor = self.general734 self.processor.general_processor = self.general
1039735
1040 def test_rm_from_mute_filter(self):736 def test_rm_from_mute_filter(self):
1041737
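
The ignore-path cases themselves now live in the shared common module; the darwin subclass above only patches reactor.callFromThread so the watch delivers events inline instead of queuing them on the reactor thread. A minimal sketch of that stand-in (illustrative, not part of the branch):

    def inline_call_from_thread(callback, event):
        """Test-only replacement for reactor.callFromThread."""
        # run the processor callback right away so the test can assert
        # on the collected events without spinning the reactor
        return callback(event)

    # equivalent to the patch used in the tests:
    #     self.patch(filesystem_notifications.reactor, 'callFromThread',
    #                lambda x, e: x(e))
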
=== modified file 'tests/platform/filesystem_notifications/test_fsevents_daemon.py'
--- tests/platform/filesystem_notifications/test_fsevents_daemon.py 2012-07-19 14:13:06 +0000
+++ tests/platform/filesystem_notifications/test_fsevents_daemon.py 2012-08-22 18:22:29 +0000
@@ -26,14 +26,14 @@
26# do not wish to do so, delete this exception statement from your26# do not wish to do so, delete this exception statement from your
27# version. If you delete this exception statement from all source27# version. If you delete this exception statement from all source
28# files in the program, then also delete it here.28# files in the program, then also delete it here.
29"""Tests for the fsevents daemon integration."""29"""Tests for the fseventsd daemon integration."""
3030
31import os31import os
3232
33from twisted.internet import defer, protocol33from twisted.internet import defer, protocol
3434
35from contrib.testing.testcase import BaseTwistedTestCase35from contrib.testing.testcase import BaseTwistedTestCase
36from ubuntuone.darwin import fsevents36from ubuntuone import fseventsd
37from ubuntuone.devtools.testcases.txsocketserver import TidyUnixServer37from ubuntuone.devtools.testcases.txsocketserver import TidyUnixServer
38from ubuntuone.platform.filesystem_notifications.monitor.darwin import (38from ubuntuone.platform.filesystem_notifications.monitor.darwin import (
39 fsevents_daemon,39 fsevents_daemon,
@@ -272,7 +272,7 @@
272 head, _ = os.path.split(destination_path)272 head, _ = os.path.split(destination_path)
273 self.factory.watched_paths.append(head)273 self.factory.watched_paths.append(head)
274 event = FakeDaemonEvent()274 event = FakeDaemonEvent()
275 event.event_type = fsevents.FSE_RENAME275 event.event_type = fseventsd.FSE_RENAME
276 event.event_paths.extend([source_path, destination_path])276 event.event_paths.extend([source_path, destination_path])
277 converted_events = self.factory.convert_in_pyinotify_event(event)277 converted_events = self.factory.convert_in_pyinotify_event(event)
278 self.assertEqual(1, len(converted_events))278 self.assertEqual(1, len(converted_events))
@@ -289,7 +289,7 @@
289 head, _ = os.path.split(source_path)289 head, _ = os.path.split(source_path)
290 self.factory.watched_paths.append(head)290 self.factory.watched_paths.append(head)
291 event = FakeDaemonEvent()291 event = FakeDaemonEvent()
292 event.event_type = fsevents.FSE_RENAME292 event.event_type = fseventsd.FSE_RENAME
293 event.event_paths.extend([source_path, destination_path])293 event.event_paths.extend([source_path, destination_path])
294 converted_events = self.factory.convert_in_pyinotify_event(event)294 converted_events = self.factory.convert_in_pyinotify_event(event)
295 self.assertEqual(1, len(converted_events))295 self.assertEqual(1, len(converted_events))
@@ -306,7 +306,7 @@
306 head, _ = os.path.split(source_path)306 head, _ = os.path.split(source_path)
307 self.factory.watched_paths.append(head)307 self.factory.watched_paths.append(head)
308 event = FakeDaemonEvent()308 event = FakeDaemonEvent()
309 event.event_type = fsevents.FSE_RENAME309 event.event_type = fseventsd.FSE_RENAME
310 event.event_paths.extend([source_path, destination_path])310 event.event_paths.extend([source_path, destination_path])
311 converted_events = self.factory.convert_in_pyinotify_event(event)311 converted_events = self.factory.convert_in_pyinotify_event(event)
312 self.assertEqual(2, len(converted_events))312 self.assertEqual(2, len(converted_events))
@@ -337,7 +337,7 @@
337 """Test processing the drop of the events."""337 """Test processing the drop of the events."""
338 func_called = []338 func_called = []
339 event = FakeDaemonEvent()339 event = FakeDaemonEvent()
340 event.event_type = fsevents.FSE_EVENTS_DROPPED340 event.event_type = fseventsd.FSE_EVENTS_DROPPED
341341
342 def fake_events_dropped():342 def fake_events_dropped():
343 """A fake events dropped implementation."""343 """A fake events dropped implementation."""
@@ -354,7 +354,7 @@
354 self.factory.ignored_paths.append(head)354 self.factory.ignored_paths.append(head)
355 event = FakeDaemonEvent()355 event = FakeDaemonEvent()
356 event.event_paths.append(event_path)356 event.event_paths.append(event_path)
357 event.event_type = fsevents.FSE_CREATE_FILE357 event.event_type = fseventsd.FSE_CREATE_FILE
358 self.factory.process_event(event)358 self.factory.process_event(event)
359 self.assertEqual(0, len(self.processor.processed_events))359 self.assertEqual(0, len(self.processor.processed_events))
360360
@@ -365,7 +365,7 @@
365 self.factory.watched_paths.append(head)365 self.factory.watched_paths.append(head)
366 event = FakeDaemonEvent()366 event = FakeDaemonEvent()
367 event.event_paths.append(event_path)367 event.event_paths.append(event_path)
368 event.event_type = fsevents.FSE_CREATE_FILE368 event.event_type = fseventsd.FSE_CREATE_FILE
369 self.factory.process_event(event)369 self.factory.process_event(event)
370 self.assertEqual(1, len(self.processor.processed_events))370 self.assertEqual(1, len(self.processor.processed_events))
371 self.assertEqual(event_path,371 self.assertEqual(event_path,
372372
=== added file 'tests/platform/filesystem_notifications/test_windows.py'
--- tests/platform/filesystem_notifications/test_windows.py 1970-01-01 00:00:00 +0000
+++ tests/platform/filesystem_notifications/test_windows.py 2012-08-22 18:22:29 +0000
@@ -0,0 +1,344 @@
1#
2# Authors: Manuel de la Pena <manuel@canonical.com>
3# Alejandro J. Cura <alecu@canonical.com>
4#
5# Copyright 2011-2012 Canonical Ltd.
6#
7# This program is free software: you can redistribute it and/or modify it
8# under the terms of the GNU General Public License version 3, as published
9# by the Free Software Foundation.
10#
11# This program is distributed in the hope that it will be useful, but
12# WITHOUT ANY WARRANTY; without even the implied warranties of
13# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR
14# PURPOSE. See the GNU General Public License for more details.
15#
16# You should have received a copy of the GNU General Public License along
17# with this program. If not, see <http://www.gnu.org/licenses/>.
18#
19# In addition, as a special exception, the copyright holders give
20# permission to link the code of portions of this program with the
21# OpenSSL library under certain conditions as described in each
22# individual source file, and distribute linked combinations
23# including the two.
24# You must obey the GNU General Public License in all respects
25# for all of the code used other than OpenSSL. If you modify
26# file(s) with this exception, you may extend this exception to your
27# version of the file(s), but you are not obligated to do so. If you
28# do not wish to do so, delete this exception statement from your
29# version. If you delete this exception statement from all source
30# files in the program, then also delete it here.
31"""Test the filesystem notifications on windows."""
32
33import os
34
35from twisted.internet import defer
36from win32file import FILE_NOTIFY_INFORMATION
37
38from ubuntuone.platform.filesystem_notifications.monitor import (
39 common,
40 windows as filesystem_notifications,
41)
42from ubuntuone.platform.filesystem_notifications.monitor.common import (
43 FilesystemMonitor,
44 Watch,
45 WatchManager,
46)
47from ubuntuone.platform.filesystem_notifications.monitor.windows import (
48 FILE_NOTIFY_CHANGE_FILE_NAME,
49 FILE_NOTIFY_CHANGE_DIR_NAME,
50 FILE_NOTIFY_CHANGE_ATTRIBUTES,
51 FILE_NOTIFY_CHANGE_SIZE,
52 FILE_NOTIFY_CHANGE_LAST_WRITE,
53 FILE_NOTIFY_CHANGE_SECURITY,
54 FILE_NOTIFY_CHANGE_LAST_ACCESS,
55)
56from tests.platform.filesystem_notifications import common as common_tests
57
58
59class FakeEventsProcessor(object):
60
61 """Handle fake events creation and processing."""
62
63 def create_fake_event(self, filename):
64 """Create a fake file event."""
65 return (1, filename)
66
67 def custom_process_events(self, watch, events):
68 """Adapt to each platform way to process events."""
69 watch.platform_watch._process_events(events)
70
71
72class TestWatch(common_tests.TestWatch):
73 """Test the watch so that it returns the same events as pyinotify."""
74
75 timeout = 5
76
77 @defer.inlineCallbacks
78 def setUp(self):
79 yield super(TestWatch, self).setUp()
80 self.path = u'\\\\?\\C:\\path' # a valid windows path
81 self.common_path = u'C:\\path'
82 self.invalid_path = u'\\\\?\\C:\\path\\to\\no\\dir'
83 self.mask = FILE_NOTIFY_CHANGE_FILE_NAME | \
84 FILE_NOTIFY_CHANGE_DIR_NAME | \
85 FILE_NOTIFY_CHANGE_ATTRIBUTES | \
86 FILE_NOTIFY_CHANGE_SIZE | \
87 FILE_NOTIFY_CHANGE_LAST_WRITE | \
88 FILE_NOTIFY_CHANGE_SECURITY | \
89 FILE_NOTIFY_CHANGE_LAST_ACCESS
90 self.fake_events_processor = FakeEventsProcessor()
91
92 def file_notify_information_wrapper(buf, data):
93 """Wrapper that gets the events and adds them to the list."""
94 events = FILE_NOTIFY_INFORMATION(buf, data)
95 # we want to append the list because that is what will be logged.
96 # If we use extend we won't have the same logging because it will
97 # group all events in a single list which is not what the COM API
98 # does.
99 str_events = [
100 (common.ACTIONS_NAMES[action], path) for action, path in
101 events]
102 self.raw_events.append(str_events)
103 return events
104
105 self.patch(filesystem_notifications, 'FILE_NOTIFY_INFORMATION',
106 file_notify_information_wrapper)
107
108 @defer.inlineCallbacks
109 def test_file_write(self):
110 """Test that the correct event is raised when a file is written."""
111 file_name = os.path.join(self.basedir, 'test_file_write')
112 # create the file before recording
113 fd = open(file_name, 'w')
114 # clean behind us by removing the file
115 self.addCleanup(os.remove, file_name)
116
117 def write_file():
118 """Action for the test."""
119 fd.write('test')
120 fd.close()
121
122 events = yield self._perform_operations(self.basedir, self.mask,
123 write_file, 1)
124 event = events[0]
125 self.assertFalse(event.dir)
126 self.assertEqual(0x2, event.mask)
127 self.assertEqual('IN_MODIFY', event.maskname)
128 self.assertEqual(os.path.split(file_name)[1], event.name)
129 self.assertEqual('.', event.path)
130 self.assertEqual(os.path.join(self.basedir, file_name), event.pathname)
131 self.assertEqual(0, event.wd)
132
133 @defer.inlineCallbacks
134 def test_call_deferred_already_called(self):
135 """Test that the function is not called."""
136 method_args = []
137
138 def fake_call(*args, **kwargs):
139 """Execute the call."""
140 method_args.append((args, kwargs),)
141
142 watch = Watch(1, self.path, None)
143 yield watch.platform_watch._watch_started_deferred.callback(True)
144 watch.platform_watch._call_deferred(fake_call, None)
145 self.assertEqual(0, len(method_args))
146
147 def test_call_deferred_not_called(self):
148 """Test that is indeed called."""
149 method_args = []
150
151 def fake_call(*args, **kwargs):
152 """Execute the call."""
153 method_args.append((args, kwargs),)
154
155 watch = Watch(1, self.path, None)
156 watch.platform_watch._call_deferred(fake_call, None)
157 self.assertEqual(1, len(method_args))
158
159 def test_started_property(self):
160 """Test that the started property returns the started deferred."""
161 watch = Watch(1, self.path, None)
162 self.assertEqual(watch.started,
163 watch.platform_watch._watch_started_deferred)
164
165 def test_stopped_property(self):
166 """Test that the stopped property returns the stopped deferred."""
167 watch = Watch(1, self.path, None)
168 self.assertEqual(watch.stopped,
169 watch.platform_watch._watch_stopped_deferred)
170
171 @defer.inlineCallbacks
172 def test_start_watching_fails_early_in_thread(self):
173 """An early failure inside the thread should errback the deferred."""
174 test_path = self.mktemp("test_directory")
175 self.patch(filesystem_notifications, "CreateFileW", self.random_error)
176 watch = Watch(1, test_path, None)
177 d = watch.start_watching()
178 yield self.assertFailure(d, common_tests.FakeException)
179
180 @defer.inlineCallbacks
181 def test_start_watching_fails_late_in_thread(self):
182 """A late failure inside the thread should errback the deferred."""
183 test_path = self.mktemp("test_directory")
184 self.patch(filesystem_notifications, "ReadDirectoryChangesW",
185 self.random_error)
186 watch = Watch(1, test_path, None)
187 d = watch.start_watching()
188 yield self.assertFailure(d, common_tests.FakeException)
189
190 @defer.inlineCallbacks
191 def test_close_handle_is_called_on_error(self):
192 """CloseHandle is called when there's an error in the watch thread."""
193 test_path = self.mktemp("test_directory")
194 close_called = []
195 self.patch(filesystem_notifications, "CreateFileW", lambda *_: None)
196 self.patch(filesystem_notifications, "CloseHandle",
197 close_called.append)
198 self.patch(filesystem_notifications, "ReadDirectoryChangesW",
199 self.random_error)
200 watch = Watch(1, test_path, self.mask)
201 d = watch.start_watching()
202 yield self.assertFailure(d, common_tests.FakeException)
203 self.assertEqual(len(close_called), 1)
204 yield watch.stop_watching()
205
206 @defer.inlineCallbacks
207 def test_stop_watching_fired_when_watch_thread_finishes(self):
208 """The deferred returned is fired when the watch thread finishes."""
209 test_path = self.mktemp("another_test_directory")
210 watch = Watch(1, test_path, self.mask)
211 yield watch.start_watching()
212 self.assertNotEqual(watch.platform_watch._watch_handle, None)
213 yield watch.stop_watching()
214 self.assertEqual(watch.platform_watch._watch_handle, None)
215
216
217class TestWatchManager(common_tests.TestWatchManager):
218 """Test the watch manager."""
219
220 @defer.inlineCallbacks
221 def setUp(self):
222 """Set each of the tests."""
223 yield super(TestWatchManager, self).setUp()
224 self.parent_path = u'\\\\?\\C:\\' # a valid windows path
225 self.path = self.parent_path + u'path'
226 self.watch = Watch(1, self.path, None)
227 self.manager._wdm = {1: self.watch}
228 self.fake_events_processor = FakeEventsProcessor()
229
230 def test_add_single_watch(self):
231 """Test the addition of a new single watch."""
232 self.was_called = False
233
234 def fake_start_watching(*args):
235 """Fake start watch."""
236 self.was_called = True
237
238 self.patch(Watch, "start_watching", fake_start_watching)
239 self.manager._wdm = {}
240
241 mask = 'bit_mask'
242 self.manager.add_watch(self.path, mask)
243 self.assertEqual(1, len(self.manager._wdm))
244 self.assertTrue(self.was_called, 'The watch start was not called.')
245 self.assertEqual(self.path + os.path.sep, self.manager._wdm[0].path)
246 self.assertEqual(filesystem_notifications.FILESYSTEM_MONITOR_MASK,
247 self.manager._wdm[0].platform_watch._mask)
248
249 @defer.inlineCallbacks
250 def test_stop_multiple(self):
251 """Test that stop is fired when *all* watches have stopped."""
252
253 def fake_stop_watching(watch):
254 """Another fake stop watch."""
255 return watch.stopped
256
257 self.patch(Watch, "stop_watching", fake_stop_watching)
258 second_path = self.parent_path + u"second_path"
259 second_watch = Watch(2, second_path, None)
260 self.manager._wdm[2] = second_watch
261 d = self.manager.stop()
262 self.assertFalse(d.called, "Not fired before all watches end")
263 self.watch.stopped.callback(None)
264 self.assertFalse(d.called, "Not fired before all watches end")
265 second_watch.stopped.callback(None)
266 yield d
267 self.assertTrue(d.called, "Fired after the watches ended")
268
269
270class TestWatchManagerAddWatches(common_tests.TestWatchManagerAddWatches):
271 """Test the watch manager."""
272 timeout = 5
273
274 def test_add_watch_twice(self):
275 """Adding a watch twice succeeds when the watch is running."""
276 self.patch(Watch, "start_watching", lambda self: self.started)
277 manager = WatchManager(None)
278 # no need to stop watching because start_watching is fake
279
280 path = u'\\\\?\\C:\\test' # a valid windows path
281 mask = 'fake bit mask'
282 d1 = manager.add_watch(path, mask)
283 d2 = manager.add_watch(path, mask)
284
285 self.assertFalse(d1.called, "Should not be called yet.")
286 self.assertFalse(d2.called, "Should not be called yet.")
287
288 manager._wdm.values()[0].started.callback(True)
289
290 self.assertTrue(d1.called, "Should already be called.")
291 self.assertTrue(d2.called, "Should already be called.")
292
293
294class TestNotifyProcessor(common_tests.TestNotifyProcessor):
295 """Test the notify processor."""
296
297 @defer.inlineCallbacks
298 def setUp(self):
299 """set up the diffeent tests."""
300 yield super(TestNotifyProcessor, self).setUp()
301
302
303class FilesystemMonitorTestCase(common_tests.FilesystemMonitorTestCase):
304 """Tests for the FilesystemMonitor."""
305 timeout = 5
306
307 def test_add_watch_twice(self):
308 """Check the deferred returned by a second add_watch."""
309 self.patch(Watch, "start_watching", lambda self: self.started)
310 monitor = FilesystemMonitor(None, None)
311 # no need to stop watching because start_watching is fake
312
313 parent_path = 'C:\\test' # a valid windows path in utf-8 bytes
314 child_path = parent_path + "\\child"
315 d1 = monitor.add_watch(parent_path)
316 d2 = monitor.add_watch(child_path)
317
318 self.assertFalse(d1.called, "Should not be called yet.")
319 self.assertFalse(d2.called, "Should not be called yet.")
320
321 monitor._watch_manager._wdm.values()[0].started.callback(True)
322
323 self.assertTrue(d1.called, "Should already be called.")
324 self.assertTrue(d2.called, "Should already be called.")
325
326 @defer.inlineCallbacks
327 def test_add_watches_to_udf_ancestors(self):
328 """Test that the ancestor watches are not added."""
329
330 class FakeVolume(object):
331 """A fake UDF."""
332
333 def __init__(self, ancestors):
334 """Create a new instance."""
335 self.ancestors = ancestors
336
337 ancestors = ['~', '~\\Pictures', '~\\Pictures\\Home', ]
338 volume = FakeVolume(ancestors)
339 monitor = FilesystemMonitor(None, None)
340 added = yield monitor.add_watches_to_udf_ancestors(volume)
341 self.assertTrue(added, 'We should always return true.')
342 # lets ensure that we never added the watches
343 self.assertEqual(0, len(monitor._watch_manager._wdm.values()),
344 'No watches should have been added.')
0345
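
Both the darwin and the windows suites feed the shared tests through a tiny FakeEventsProcessor; the only platform difference is how a batch of fake events reaches the watch. Side by side, the two hooks defined above amount to:

    # darwin: each fake event is pushed through the watch individually
    def darwin_process_events(watch, events):
        for event in events:
            watch.platform_watch._process_events(event)

    # windows: ReadDirectoryChangesW hands over a whole batch, so the
    # fake processor passes the list through in a single call
    def windows_process_events(watch, events):
        watch.platform_watch._process_events(events)
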
=== modified file 'tests/platform/ipc/test_external_interface.py'
--- tests/platform/ipc/test_external_interface.py 2012-04-30 14:24:55 +0000
+++ tests/platform/ipc/test_external_interface.py 2012-08-22 18:22:29 +0000
@@ -40,6 +40,10 @@
40 StatusTestCase,40 StatusTestCase,
41 SyncDaemonTestCase,41 SyncDaemonTestCase,
42)42)
43from ubuntuone.syncdaemon import (
44 RECENT_TRANSFERS,
45 UPLOADING,
46)
4347
44STR = 'something'48STR = 'something'
45STR_STR_DICT = {'foo': 'bar'}49STR_STR_DICT = {'foo': 'bar'}
@@ -132,6 +136,16 @@
132 self.assert_remote_method('waiting_metadata',136 self.assert_remote_method('waiting_metadata',
133 in_signature=None, out_signature='a(sa{ss})')137 in_signature=None, out_signature='a(sa{ss})')
134138
139 @defer.inlineCallbacks
140 def test_sync_menu(self):
141 """Test sync_menu."""
142 result = {RECENT_TRANSFERS: [], UPLOADING: []}
143 method = 'sync_menu'
144 yield self.assert_method_called(self.service.status,
145 method, result)
146 self.assert_remote_method(method,
147 in_signature=None, out_signature='a{sv}')
148
135149
136class EventsTests(EventsTestCase):150class EventsTests(EventsTestCase):
137 """Basic tests for the Events exposed object."""151 """Basic tests for the Events exposed object."""
138152
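
The sync_menu status method added above is exported over IPC with out_signature 'a{sv}' and returns a dict keyed by the RECENT_TRANSFERS and UPLOADING constants from ubuntuone.syncdaemon. A sketch of the payload shape (the paths and numbers are made up; the tuple layout follows the aggregator tests below):

    from ubuntuone.syncdaemon import RECENT_TRANSFERS, UPLOADING

    menu_data = {
        # paths of recently finished uploads, newest last
        RECENT_TRANSFERS: ['photo.jpg', 'report.odt'],
        # in-progress uploads as (path, size, bytes written) tuples
        UPLOADING: [('video.avi', 2000, 450)],
    }
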
=== modified file 'tests/platform/test_tools.py'
--- tests/platform/test_tools.py 2012-05-30 15:35:49 +0000
+++ tests/platform/test_tools.py 2012-08-22 18:22:29 +0000
@@ -36,12 +36,15 @@
36from ubuntuone.devtools.handlers import MementoHandler36from ubuntuone.devtools.handlers import MementoHandler
3737
38from contrib.testing.testcase import FakeCommand, skipIfOS38from contrib.testing.testcase import FakeCommand, skipIfOS
39
39from ubuntuone.syncdaemon import (40from ubuntuone.syncdaemon import (
40 action_queue,41 action_queue,
41 event_queue,42 event_queue,
42 interaction_interfaces,43 interaction_interfaces,
43 states,44 states,
44 volume_manager,45 volume_manager,
46 RECENT_TRANSFERS,
47 UPLOADING,
45)48)
46from ubuntuone.platform import tools49from ubuntuone.platform import tools
47from tests.platform import IPCTestCase50from tests.platform import IPCTestCase
@@ -243,6 +246,13 @@
243 self.assertEqual('share_id', result['volume_id'])246 self.assertEqual('share_id', result['volume_id'])
244 self.assertEqual(False, self.main.vm.shares['share_id'].accepted)247 self.assertEqual(False, self.main.vm.shares['share_id'].accepted)
245248
249 @defer.inlineCallbacks
250 def test_sync_menu(self):
251 """Test accept_share method."""
252 result = yield self.tool.sync_menu()
253 self.assertIn(RECENT_TRANSFERS, result)
254 self.assertIn(UPLOADING, result)
255
246256
247class TestWaitForSignals(TestToolsBase):257class TestWaitForSignals(TestToolsBase):
248 """Test case for the wait_for_signals method from SyncDaemonTool."""258 """Test case for the wait_for_signals method from SyncDaemonTool."""
249259
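
On the client side, the new SyncDaemonTool.sync_menu call tested above returns a deferred that fires with that same dict. A minimal usage sketch, assuming a running syncdaemon and the default SyncDaemonTool constructor:

    from twisted.internet import defer
    from ubuntuone.platform import tools
    from ubuntuone.syncdaemon import RECENT_TRANSFERS, UPLOADING

    @defer.inlineCallbacks
    def dump_sync_menu():
        """Print the data the sync menu would show."""
        tool = tools.SyncDaemonTool()   # constructor defaults are assumed
        data = yield tool.sync_menu()
        print 'recent transfers:', data[RECENT_TRANSFERS]
        print 'uploading:', data[UPLOADING]
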
=== modified file 'tests/status/test_aggregator.py'
--- tests/status/test_aggregator.py 2012-04-09 20:07:05 +0000
+++ tests/status/test_aggregator.py 2012-08-22 18:22:29 +0000
@@ -42,7 +42,11 @@
42from ubuntuone.status import aggregator42from ubuntuone.status import aggregator
43from ubuntuone.status.notification import AbstractNotification43from ubuntuone.status.notification import AbstractNotification
44from ubuntuone.status.messaging import AbstractMessaging44from ubuntuone.status.messaging import AbstractMessaging
45from ubuntuone.syncdaemon import status_listener45from ubuntuone.syncdaemon import (
46 status_listener,
47 RECENT_TRANSFERS,
48 UPLOADING,
49)
46from ubuntuone.syncdaemon.volume_manager import Share, UDF, Root50from ubuntuone.syncdaemon.volume_manager import Share, UDF, Root
4751
48FILENAME = 'example.txt'52FILENAME = 'example.txt'
@@ -706,6 +710,8 @@
706 self.share_id = path710 self.share_id = path
707 self.node_id = path711 self.node_id = path
708 self.deflated_size = 10000712 self.deflated_size = 10000
713 self.size = 0
714 self.n_bytes_written = 0
709715
710716
711class FakeVolumeManager(object):717class FakeVolumeManager(object):
@@ -733,6 +739,7 @@
733 self.files_uploading = []739 self.files_uploading = []
734 self.files_downloading = []740 self.files_downloading = []
735 self.progress_events = []741 self.progress_events = []
742 self.recent_transfers = aggregator.deque(maxlen=10)
736743
737 def queue_done(self):744 def queue_done(self):
738 """The queue completed all operations."""745 """The queue completed all operations."""
@@ -762,6 +769,7 @@
762 """An upload just finished."""769 """An upload just finished."""
763 if command in self.files_uploading:770 if command in self.files_uploading:
764 self.files_uploading.remove(command)771 self.files_uploading.remove(command)
772 self.recent_transfers.append(command.path)
765 self.queued_commands.discard(command)773 self.queued_commands.discard(command)
766774
767 def progress_made(self, share_id, node_id, n_bytes, deflated_size):775 def progress_made(self, share_id, node_id, n_bytes, deflated_size):
@@ -796,6 +804,70 @@
796 self.fakevm,804 self.fakevm,
797 self.status_frontend)805 self.status_frontend)
798806
807 def test_recent_transfers(self):
808 """Check that it generates a tuple with the recent transfers."""
809 self.patch(status_listener.action_queue, "Upload", FakeCommand)
810 fake_command = FakeCommand('path1')
811 self.listener.handle_SYS_QUEUE_ADDED(fake_command)
812 self.listener.handle_SYS_QUEUE_REMOVED(fake_command)
813 fake_command = FakeCommand('path2')
814 self.listener.handle_SYS_QUEUE_ADDED(fake_command)
815 self.listener.handle_SYS_QUEUE_REMOVED(fake_command)
816 fake_command = FakeCommand('path3')
817 self.listener.handle_SYS_QUEUE_ADDED(fake_command)
818 self.listener.handle_SYS_QUEUE_REMOVED(fake_command)
819 transfers = self.status_frontend.recent_transfers()
820 expected = ['path1', 'path2', 'path3']
821 self.assertEqual(transfers, expected)
822
823 menu_data = self.listener.menu_data()
824 self.assertEqual(menu_data,
825 {UPLOADING: [], RECENT_TRANSFERS: expected})
826
827 def test_file_uploading(self):
828 """Check that it returns a list with the path, size, and progress."""
829 fc = FakeCommand(path='testfile.txt')
830 fc.size = 200
831 self.status_frontend.upload_started(fc)
832 uploading = self.status_frontend.files_uploading()
833 expected = [('testfile.txt', 200, 0)]
834 self.assertEqual(uploading, expected)
835 menu_data = self.listener.menu_data()
836 self.assertEqual(menu_data,
837 {UPLOADING: expected, RECENT_TRANSFERS: []})
838
839 fc.size = 1000
840 fc.n_bytes_written = 200
841 fc2 = FakeCommand(path='testfile2.txt')
842 fc2.size = 2000
843 fc2.n_bytes_written = 450
844 self.status_frontend.upload_started(fc2)
845 uploading = self.status_frontend.files_uploading()
846 expected = [('testfile.txt', 1000, 200), ('testfile2.txt', 2000, 450)]
847 self.assertEqual(uploading, expected)
848
849 menu_data = self.listener.menu_data()
850 self.assertEqual(menu_data,
851 {UPLOADING: expected, RECENT_TRANSFERS: []})
852
853 def test_menu_data_full_response(self):
854 """Check that listener.menu_data returns both uploading and recent."""
855 self.patch(status_listener.action_queue, "Upload", FakeCommand)
856 fake_command = FakeCommand('path1')
857 self.listener.handle_SYS_QUEUE_ADDED(fake_command)
858 self.listener.handle_SYS_QUEUE_REMOVED(fake_command)
859 fc = FakeCommand(path='testfile.txt')
860 fc.size = 1000
861 fc.n_bytes_written = 200
862 self.status_frontend.upload_started(fc)
863 uploading = self.status_frontend.files_uploading()
864 transfers = self.status_frontend.recent_transfers()
865 expected = {UPLOADING: [('testfile.txt', 1000, 200)],
866 RECENT_TRANSFERS: ['path1']}
867
868 self.assertEqual(
869 {UPLOADING: uploading, RECENT_TRANSFERS: transfers}, expected)
870
799 def test_file_published(self):871 def test_file_published(self):
800 """A file published event is processed."""872 """A file published event is processed."""
801 share_id = "fake share id"873 share_id = "fake share id"
@@ -1308,6 +1380,15 @@
1308 self.assertEqual(1380 self.assertEqual(
1309 {(fc.share_id, fc.node_id): (fc.deflated_size)},1381 {(fc.share_id, fc.node_id): (fc.deflated_size)},
1310 self.aggregator.progress)1382 self.aggregator.progress)
1383 self.assertEqual(len(self.aggregator.recent_transfers), 1)
1384
1385 def test_max_recent_files(self):
1386 """Check that the queue doesn't exceed the 5 items."""
1387 for i in range(15):
1388 fc = FakeCommand()
1389 self.status_frontend.upload_started(fc)
1390 self.status_frontend.upload_finished(fc)
1391 self.assertEqual(len(self.aggregator.recent_transfers), 5)
13111392
1312 def test_progress_made(self):1393 def test_progress_made(self):
1313 """Progress on up and downloads is tracked."""1394 """Progress on up and downloads is tracked."""
13141395
=== modified file 'tests/syncdaemon/test_fsm.py'
--- tests/syncdaemon/test_fsm.py 2012-04-09 20:07:05 +0000
+++ tests/syncdaemon/test_fsm.py 2012-08-22 18:22:29 +0000
@@ -1626,6 +1626,33 @@
1626 mdobj = self.fsm.get_by_mdid(mdid)1626 mdobj = self.fsm.get_by_mdid(mdid)
1627 self.assertEqual(mdobj.stat, stat_path(path))1627 self.assertEqual(mdobj.stat, stat_path(path))
16281628
1629 def test_commit_partial_pushes_event(self):
1630 """Test that the right event is pushed after the commit."""
1631 listener = Listener()
1632 self.eq.subscribe(listener)
1633
1634 path = os.path.join(self.share.path, "thisfile")
1635 open_file(path, "w").close()
1636 mdobj = self.create_node("thisfile")
1637 mdid = mdobj.mdid
1638 oldstat = stat_path(path)
1639 self.assertEqual(mdobj.stat, oldstat)
1640
1641 # create a partial
1642 self.fsm.create_partial(mdobj.node_id, mdobj.share_id)
1643 fh = self.fsm.get_partial_for_writing(mdobj.node_id, mdobj.share_id)
1644 fh.write("foobar")
1645 fh.close()
1646 mdobj = self.fsm.get_by_mdid(mdid)
1647 self.assertEqual(mdobj.stat, oldstat)
1648
1649 # commit the partial
1650 self.fsm.commit_partial(mdobj.node_id, mdobj.share_id, "localhash")
1651 mdobj = self.fsm.get_by_mdid(mdid)
1652
1653 kwargs = dict(share_id=mdobj.share_id, node_id=mdobj.node_id)
1654 self.assertTrue(("FSM_PARTIAL_COMMITED", kwargs) in listener.events)
1655
1629 def test_move(self):1656 def test_move(self):
1630 """Test that move refreshes stat."""1657 """Test that move refreshes stat."""
1631 path1 = os.path.join(self.share.path, "thisfile1")1658 path1 = os.path.join(self.share.path, "thisfile1")
16321659
=== modified file 'tests/syncdaemon/test_interaction_interfaces.py'
--- tests/syncdaemon/test_interaction_interfaces.py 2012-04-09 20:07:05 +0000
+++ tests/syncdaemon/test_interaction_interfaces.py 2012-08-22 18:22:29 +0000
@@ -1226,18 +1226,19 @@
1226 self.addCleanup(self.main.event_q.unsubscribe, self.sd_obj)1226 self.addCleanup(self.main.event_q.unsubscribe, self.sd_obj)
12271227
12281228
1229class DownloadTestCase(SyncdaemonEventListenerTestCase):1229class UploadTestCase(SyncdaemonEventListenerTestCase):
1230 """Test the Download events in SyncdaemonEventListener."""1230 """Test the Upload events in SyncdaemonEventListener."""
12311231
1232 add_fsm_key = True1232 add_fsm_key = True
1233 direction = 'Download'1233 direction = 'Upload'
1234 bytes_key = 'n_bytes_read'1234 bytes_key = 'n_bytes_written'
1235 hash_kwarg = 'server_hash'1235 hash_kwarg = 'hash'
1236 extra_finished_args = {}1236 extra_finished_args = dict(new_generation='new_generation', hash='')
1237 finished_event = 'AQ_UPLOAD_FINISHED'
12371238
1238 @defer.inlineCallbacks1239 @defer.inlineCallbacks
1239 def setUp(self):1240 def setUp(self):
1240 yield super(DownloadTestCase, self).setUp()1241 yield super(UploadTestCase, self).setUp()
1241 self.deferred = None1242 self.deferred = None
1242 self.signal_name = None1243 self.signal_name = None
1243 if self.add_fsm_key:1244 if self.add_fsm_key:
@@ -1304,7 +1305,7 @@
1304 return self.deferred1305 return self.deferred
13051306
1306 def test_handle_finished(self):1307 def test_handle_finished(self):
1307 """Test the handle_AQ_<direction>_FINISHED method."""1308 """Test the handle_<finished_event> method."""
1308 self.signal_name = self.direction + 'Finished'1309 self.signal_name = self.direction + 'Finished'
1309 self.deferred = defer.Deferred()1310 self.deferred = defer.Deferred()
13101311
@@ -1320,10 +1321,9 @@
1320 self.patch(self.sd_obj.interface.status, 'SignalError',1321 self.patch(self.sd_obj.interface.status, 'SignalError',
1321 self.error_handler)1322 self.error_handler)
13221323
1323 kwargs = {'share_id': '', 'node_id': 'node_id', self.hash_kwarg: ''}1324 kwargs = {'share_id': '', 'node_id': 'node_id'}
1324 kwargs.update(self.extra_finished_args)1325 kwargs.update(self.extra_finished_args)
1325 self.main.event_q.push('AQ_%s_FINISHED' % self.direction.upper(),1326 self.main.event_q.push(self.finished_event, **kwargs)
1326 **kwargs)
1327 return self.deferred1327 return self.deferred
13281328
1329 def test_handle_event_error(self):1329 def test_handle_event_error(self):
@@ -1350,21 +1350,29 @@
1350 return self.deferred1350 return self.deferred
13511351
13521352
1353class DownloadTestCase(UploadTestCase):
1354 """Test the Download events in SyncdaemonEventListener."""
1355
1356 direction = 'Download'
1357 bytes_key = 'n_bytes_read'
1358 hash_kwarg = 'server_hash'
1359 extra_finished_args = {}
1360 finished_event = 'FSM_PARTIAL_COMMITED'
1361
1362 # The download is special, because we don't want to throw the ipc signal on
1363 # AQ_DOWNLOAD_FINISHED but instead we should wait for FSM_PARTIAL_COMMITED
1364
1365 def test_ignore_pre_partial_commit_event(self):
1366 """The AQ_DOWNLOAD_FINISHED signal is ignored."""
1367 self.assertNotIn("handle_AQ_DOWNLOAD_FINISHED", vars(self.sd_class))
1368
1369
1353class DownloadNoKeyTestCase(DownloadTestCase):1370class DownloadNoKeyTestCase(DownloadTestCase):
1354 """Test the Download events when there is a fsm KeyError."""1371 """Test the Download events when there is a fsm KeyError."""
13551372
1356 add_fsm_key = False1373 add_fsm_key = False
13571374
13581375
1359class UploadTestCase(DownloadTestCase):
1360 """Test the Upload events in SyncdaemonEventListener."""
1361
1362 direction = 'Upload'
1363 bytes_key = 'n_bytes_written'
1364 hash_kwarg = 'hash'
1365 extra_finished_args = dict(new_generation='new_generation')
1366
1367
1368class UploadNoKeyTestCase(UploadTestCase):1376class UploadNoKeyTestCase(UploadTestCase):
1369 """Test the Upload events when there is a fsm KeyError."""1377 """Test the Upload events when there is a fsm KeyError."""
13701378
13711379
=== modified file 'tests/syncdaemon/test_vm.py'
--- tests/syncdaemon/test_vm.py 2012-04-09 20:07:05 +0000
+++ tests/syncdaemon/test_vm.py 2012-08-22 18:22:29 +0000
@@ -2752,7 +2752,7 @@
27522752
2753 result, msg = self.vm.validate_path_for_folder(folder_path)2753 result, msg = self.vm.validate_path_for_folder(folder_path)
2754 self.assertTrue(result)2754 self.assertTrue(result)
2755 self.assertIs(msg, "", 2755 self.assertIs(msg, "",
2756 '%r must be a valid path for creating a folder.' % folder_path)2756 '%r must be a valid path for creating a folder.' % folder_path)
27572757
2758 def test_validate_UDF_path_if_folder_shares_a_prefix_with_an_udf(self):2758 def test_validate_UDF_path_if_folder_shares_a_prefix_with_an_udf(self):
@@ -2769,7 +2769,7 @@
27692769
2770 result, msg = self.vm.validate_path_for_folder(tricky_path)2770 result, msg = self.vm.validate_path_for_folder(tricky_path)
2771 self.assertTrue(result)2771 self.assertTrue(result)
2772 self.assertIs(msg, "", 2772 self.assertIs(msg, "",
2773 '%r must be a valid path for creating a folder.' % tricky_path)2773 '%r must be a valid path for creating a folder.' % tricky_path)
27742774
2775 def test_validate_UDF_path_not_valid_if_outside_home(self):2775 def test_validate_UDF_path_not_valid_if_outside_home(self):
@@ -2818,6 +2818,17 @@
2818 self.assertIsNot(msg, "",2818 self.assertIsNot(msg, "",
2819 '%r must be an invalid path for creating a folder.' % udf_parent)2819 '%r must be an invalid path for creating a folder.' % udf_parent)
28202820
2821 def test_not_valid_if_folder_is_file(self):
2822 """A link path is not valid."""
2823 self.patch(volume_manager.os.path, 'isdir', lambda p: False)
2824 self.patch(volume_manager, 'path_exists', lambda p: True)
2825 path_link = os.path.join(self.home_dir, 'Test Me')
2826
2827 result, msg = self.vm.validate_path_for_folder(path_link)
2828 self.assertFalse(result)
2829 self.assertIsNot(msg, "",
2830 '%r must be an invalid path for creating a folder.' % path_link)
2831
2821 def test_not_valid_if_folder_is_link(self):2832 def test_not_valid_if_folder_is_link(self):
2822 """A link path is not valid."""2833 """A link path is not valid."""
2823 self.patch(volume_manager, 'is_link', lambda p: True)2834 self.patch(volume_manager, 'is_link', lambda p: True)
28242835
=== modified file 'ubuntuone/platform/filesystem_notifications/monitor/__init__.py'
--- ubuntuone/platform/filesystem_notifications/monitor/__init__.py 2012-07-18 15:18:04 +0000
+++ ubuntuone/platform/filesystem_notifications/monitor/__init__.py 2012-08-22 18:22:29 +0000
@@ -42,11 +42,13 @@
42if sys.platform == 'win32':42if sys.platform == 'win32':
43 from ubuntuone.platform.filesystem_notifications.monitor import (43 from ubuntuone.platform.filesystem_notifications.monitor import (
44 common,44 common,
45 windows,
45 )46 )
4647
47 FILEMONITOR_IDS = {48 FILEMONITOR_IDS = {
48 DEFAULT_MONITOR: common.FilesystemMonitor,49 DEFAULT_MONITOR: common.FilesystemMonitor,
49 }50 }
51 ACTIONS = windows.ACTIONS
5052
51elif sys.platform == 'darwin':53elif sys.platform == 'darwin':
52 from ubuntuone.platform.filesystem_notifications.monitor import darwin54 from ubuntuone.platform.filesystem_notifications.monitor import darwin
@@ -58,6 +60,7 @@
58 DEFAULT_MONITOR: common.FilesystemMonitor,60 DEFAULT_MONITOR: common.FilesystemMonitor,
59 'daemon': darwin.fsevents_daemon.FilesystemMonitor,61 'daemon': darwin.fsevents_daemon.FilesystemMonitor,
60 }62 }
63 ACTIONS = darwin.fsevents_client.ACTIONS
61else:64else:
62 from ubuntuone.platform.filesystem_notifications.monitor import (65 from ubuntuone.platform.filesystem_notifications.monitor import (
63 linux,66 linux,
6467
=== modified file 'ubuntuone/platform/filesystem_notifications/monitor/common.py'
--- ubuntuone/platform/filesystem_notifications/monitor/common.py 2012-07-18 09:05:26 +0000
+++ ubuntuone/platform/filesystem_notifications/monitor/common.py 2012-08-22 18:22:29 +0000
@@ -65,6 +65,7 @@
65else:65else:
66 raise ImportError('Not supported platform')66 raise ImportError('Not supported platform')
6767
68
68# a map between the few events that we have on common platforms and those69# a map between the few events that we have on common platforms and those
69# found in pyinotify70# found in pyinotify
70ACTIONS = source.ACTIONS71ACTIONS = source.ACTIONS
7172
=== modified file 'ubuntuone/platform/filesystem_notifications/monitor/darwin/fsevents_daemon.py'
--- ubuntuone/platform/filesystem_notifications/monitor/darwin/fsevents_daemon.py 2012-07-19 14:13:06 +0000
+++ ubuntuone/platform/filesystem_notifications/monitor/darwin/fsevents_daemon.py 2012-08-22 18:22:29 +0000
@@ -25,7 +25,7 @@
25# do not wish to do so, delete this exception statement from your25# do not wish to do so, delete this exception statement from your
26# version. If you delete this exception statement from all source26# version. If you delete this exception statement from all source
27# files in the program, then also delete it here.27# files in the program, then also delete it here.
28"""Filesystem notifications based on the fsevents daemon.."""28"""Filesystem notifications based on the fseventsd daemon.."""
2929
30import logging30import logging
31import os31import os
@@ -42,7 +42,7 @@
42)42)
4343
44from ubuntuone import logger44from ubuntuone import logger
45from ubuntuone.darwin import fsevents45from ubuntuone import fseventsd
46from ubuntuone.platform.filesystem_notifications.notify_processor import (46from ubuntuone.platform.filesystem_notifications.notify_processor import (
47 NotifyProcessor,47 NotifyProcessor,
48)48)
@@ -61,24 +61,24 @@
6161
62TRACE = logger.TRACE62TRACE = logger.TRACE
6363
64# map the fsevents actions to those from pyinotify64# map the fseventsd actions to those from pyinotify
65DARWIN_ACTIONS = {65DARWIN_ACTIONS = {
66 fsevents.FSE_CREATE_FILE: IN_CREATE,66 fseventsd.FSE_CREATE_FILE: IN_CREATE,
67 fsevents.FSE_DELETE: IN_DELETE,67 fseventsd.FSE_DELETE: IN_DELETE,
68 fsevents.FSE_STAT_CHANGED: IN_MODIFY,68 fseventsd.FSE_STAT_CHANGED: IN_MODIFY,
69 fsevents.FSE_CONTENT_MODIFIED: IN_MODIFY,69 fseventsd.FSE_CONTENT_MODIFIED: IN_MODIFY,
70 fsevents.FSE_CREATE_DIR: IN_CREATE,70 fseventsd.FSE_CREATE_DIR: IN_CREATE,
71}71}
7272
73# list of those events from which we do not care73# list of those events from which we do not care
74DARWIN_IGNORED_ACTIONS = (74DARWIN_IGNORED_ACTIONS = (
75 fsevents.FSE_UNKNOWN,75 fseventsd.FSE_UNKNOWN,
76 fsevents.FSE_INVALID,76 fseventsd.FSE_INVALID,
77 fsevents.FSE_EXCHANGE,77 fseventsd.FSE_EXCHANGE,
78 fsevents.FSE_FINDER_INFO_CHANGED,78 fseventsd.FSE_FINDER_INFO_CHANGED,
79 fsevents.FSE_CHOWN,79 fseventsd.FSE_CHOWN,
80 fsevents.FSE_XATTR_MODIFIED,80 fseventsd.FSE_XATTR_MODIFIED,
81 fsevents.FSE_XATTR_REMOVED,81 fseventsd.FSE_XATTR_REMOVED,
82)82)
8383
84# translates quickly the event and it's is_dir state to our standard events84# translates quickly the event and it's is_dir state to our standard events
@@ -95,7 +95,7 @@
95 IN_MOVED_TO: 'FS_FILE_CREATE',95 IN_MOVED_TO: 'FS_FILE_CREATE',
96 IN_MOVED_TO | IN_ISDIR: 'FS_DIR_CREATE'}96 IN_MOVED_TO | IN_ISDIR: 'FS_DIR_CREATE'}
9797
98# TODO: This should be in fsevents to be imported!98# TODO: This should be in fseventsd to be imported!
99# Path to the socket used by the daemon99# Path to the socket used by the daemon
100DAEMON_SOCKET = '/var/run/ubuntuone_fsevents_daemon'100DAEMON_SOCKET = '/var/run/ubuntuone_fsevents_daemon'
101101
@@ -134,14 +134,14 @@
134 return unicodedata.normalize('NFC', path).encode('utf-8')134 return unicodedata.normalize('NFC', path).encode('utf-8')
135135
136136
137class PyInotifyEventsFactory(fsevents.FsEventsFactory):137class PyInotifyEventsFactory(fseventsd.FsEventsFactory):
138 """Factory that process events and converts them in pyinotify ones."""138 """Factory that process events and converts them in pyinotify ones."""
139139
140 def __init__(self, processor,140 def __init__(self, processor,
141 ignored_events=DARWIN_IGNORED_ACTIONS):141 ignored_events=DARWIN_IGNORED_ACTIONS):
142 """Create a new instance."""142 """Create a new instance."""
143 # old style class143 # old style class
144 fsevents.FsEventsFactory.__init__(self)144 fseventsd.FsEventsFactory.__init__(self)
145 self._processor = processor145 self._processor = processor
146 self._ignored_events = ignored_events146 self._ignored_events = ignored_events
147 self.watched_paths = []147 self.watched_paths = []
@@ -215,7 +215,7 @@
215 """Get an event from the daemon and convert it in a pyinotify one."""215 """Get an event from the daemon and convert it in a pyinotify one."""
216 # the rename is a special type of event because it has to be either216 # the rename is a special type of event because it has to be either
217 # converted is a pair of events or in a single one (CREATE or DELETE)217 # converted is a pair of events or in a single one (CREATE or DELETE)
218 if event.event_type == fsevents.FSE_RENAME:218 if event.event_type == fseventsd.FSE_RENAME:
219 is_create = self.is_create(event)219 is_create = self.is_create(event)
220 if is_create or self.is_delete(event):220 if is_create or self.is_delete(event):
221 mask = IN_CREATE if is_create else IN_DELETE221 mask = IN_CREATE if is_create else IN_DELETE
@@ -278,7 +278,7 @@
278 if event.event_type in self._ignored_events:278 if event.event_type in self._ignored_events:
279 # Do nothing because sd does not care about such info279 # Do nothing because sd does not care about such info
280 return280 return
281 if event.event_type == fsevents.FSE_EVENTS_DROPPED:281 if event.event_type == fseventsd.FSE_EVENTS_DROPPED:
282 # this should not be very common but we have to deal with it282 # this should not be very common but we have to deal with it
283 return self.events_dropper()283 return self.events_dropper()
284 events = self.convert_in_pyinotify_event(event)284 events = self.convert_in_pyinotify_event(event)
285285
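
The hunk above renames the helper module from fsevents to fseventsd but keeps the two-step translation this monitor performs: a daemon event type is first mapped to a pyinotify-style mask, and that mask (combined with the is_dir flag) is then mapped to a syncdaemon event name. A minimal sketch of that translation, using stand-in constants rather than the real pyinotify and fseventsd values:

    # Stand-in bit masks; the real module imports these from pyinotify.
    IN_CREATE, IN_DELETE, IN_ISDIR = 0x100, 0x200, 0x40000000

    # Stand-in daemon event ids; the real ones come from ubuntuone.fseventsd.
    FSE_CREATE_FILE, FSE_CREATE_DIR, FSE_DELETE = 0, 1, 2

    DARWIN_ACTIONS = {
        FSE_CREATE_FILE: IN_CREATE,
        FSE_CREATE_DIR: IN_CREATE,
        FSE_DELETE: IN_DELETE,
    }

    EVENT_NAMES = {
        IN_CREATE: 'FS_FILE_CREATE',
        IN_CREATE | IN_ISDIR: 'FS_DIR_CREATE',
        IN_DELETE: 'FS_FILE_DELETE',
        IN_DELETE | IN_ISDIR: 'FS_DIR_DELETE',
    }

    def translate(event_type, is_dir):
        """Map a daemon event to the syncdaemon event name."""
        mask = DARWIN_ACTIONS[event_type]
        if is_dir:
            mask |= IN_ISDIR
        return EVENT_NAMES[mask]

    print(translate(FSE_CREATE_DIR, is_dir=True))    # FS_DIR_CREATE
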
=== modified file 'ubuntuone/platform/ipc/ipc_client.py'
--- ubuntuone/platform/ipc/ipc_client.py 2012-05-22 14:28:56 +0000
+++ ubuntuone/platform/ipc/ipc_client.py 2012-08-22 18:22:29 +0000
@@ -197,6 +197,22 @@
197 def current_uploads(self):197 def current_uploads(self):
198 """Return a list of files with a upload in progress."""198 """Return a list of files with a upload in progress."""
199199
200 @remote
201 def sync_menu(self):
202 """
203 This method returns a dictionary, with the following keys and values:
204
205 Key: 'recent-transfers'
206 Value: a list of strings (paths), each being the name of a file that
207 was recently transferred.
208
209 Key: 'uploading'
210 Value: a list of tuples, with each tuple having the following items:
211 * str: the path of a file that's currently being uploaded
212 * int: size of the file
213 * int: bytes written
214 """
215
200 @signal216 @signal
201 def on_content_queue_changed(self):217 def on_content_queue_changed(self):
202 """Emit ContentQueueChanged."""218 """Emit ContentQueueChanged."""
@@ -234,7 +250,7 @@
234 """Emit UploadFileProgress."""250 """Emit UploadFileProgress."""
235251
236 @signal252 @signal
237 def on_upload_finished(self, upload, **info):253 def on_upload_finished(self, upload, *info):
238 """Emit UploadFinished."""254 """Emit UploadFinished."""
239255
240 @signal256 @signal
241257
=== modified file 'ubuntuone/platform/ipc/linux.py'
--- ubuntuone/platform/ipc/linux.py 2012-05-22 14:07:55 +0000
+++ ubuntuone/platform/ipc/linux.py 2012-08-22 18:22:29 +0000
@@ -38,6 +38,10 @@
38from xml.etree import ElementTree38from xml.etree import ElementTree
3939
40from ubuntuone.platform.launcher import UbuntuOneLauncher40from ubuntuone.platform.launcher import UbuntuOneLauncher
41from ubuntuone.syncdaemon import (
42 RECENT_TRANSFERS,
43 UPLOADING,
44)
4145
42# Disable the "Invalid Name" check here, as we have lots of DBus style names46# Disable the "Invalid Name" check here, as we have lots of DBus style names
43# pylint: disable-msg=C010347# pylint: disable-msg=C0103
@@ -173,6 +177,35 @@
173 warnings.warn('Use "waiting" method instead.', DeprecationWarning)177 warnings.warn('Use "waiting" method instead.', DeprecationWarning)
174 return self.service.status.waiting_content()178 return self.service.status.waiting_content()
175179
180 @dbus.service.method(DBUS_IFACE_STATUS_NAME, out_signature='a{sv}')
181 def sync_menu(self):
182 """
183 This method returns a dictionary, with the following keys and values:
184
185 Key: 'recent-transfers'
186 Value: a list of strings (paths), each being the name of a file that
187 was recently transferred.
188
189 Key: 'uploading'
190 Value: a list of tuples, with each tuple having the following items:
191 * str: the path of a file that's currently being uploaded
192 * int: size of the file
193 * int: bytes written
194 """
195 data = self.service.status.sync_menu()
196 uploading = data[UPLOADING]
197 transfers = data[RECENT_TRANSFERS]
198 upload_data = dbus.Array(signature="(sii)")
199 transfer_data = dbus.Array(signature="s")
200 for up in uploading:
201 upload_data.append(dbus.Struct(up, signature="sii"))
202 for transfer in transfers:
203 transfer_data.append(transfer)
204 result = dbus.Dictionary(signature="sv")
205 result[UPLOADING] = upload_data
206 result[RECENT_TRANSFERS] = transfer_data
207 return result
208
176 @dbus.service.signal(DBUS_IFACE_STATUS_NAME)209 @dbus.service.signal(DBUS_IFACE_STATUS_NAME)
177 def DownloadStarted(self, path):210 def DownloadStarted(self, path):
178 """Fire a signal to notify that a download has started."""211 """Fire a signal to notify that a download has started."""
179212
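
The new sync_menu D-Bus method hand-packs the menu data into typed containers so the a{sv} out-signature is honoured even when the lists are empty. A minimal sketch of the same packing with sample values, assuming the dbus-python API used above:

    import dbus

    uploading = [('testfile.txt', 1000, 200)]    # (path, size, bytes written)
    transfers = ['path1', 'path2']               # recent transfer paths

    upload_data = dbus.Array(signature="(sii)")
    for up in uploading:
        upload_data.append(dbus.Struct(up, signature="sii"))

    transfer_data = dbus.Array(transfers, signature="s")

    result = dbus.Dictionary(signature="sv")
    result['uploading'] = upload_data
    result['recent-transfers'] = transfer_data

Giving the containers explicit signatures avoids dbus-python having to guess element types, which would otherwise be ambiguous when there are no uploads or transfers to report.
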
=== modified file 'ubuntuone/platform/ipc/perspective_broker.py'
--- ubuntuone/platform/ipc/perspective_broker.py 2012-07-13 16:06:27 +0000
+++ ubuntuone/platform/ipc/perspective_broker.py 2012-08-22 18:22:29 +0000
@@ -231,6 +231,7 @@
231 'waiting',231 'waiting',
232 'waiting_metadata',232 'waiting_metadata',
233 'waiting_content',233 'waiting_content',
234 'sync_menu',
234 ]235 ]
235236
236 signal_mapping = {237 signal_mapping = {
@@ -291,6 +292,10 @@
291 warnings.warn('Use "waiting" method instead.', DeprecationWarning)292 warnings.warn('Use "waiting" method instead.', DeprecationWarning)
292 return self.service.status.waiting_content()293 return self.service.status.waiting_content()
293294
295 def sync_menu(self):
296 """Return the info necessary to construct the menu."""
297 return self.service.status.sync_menu()
298
294 @signal299 @signal
295 def DownloadStarted(self, path):300 def DownloadStarted(self, path):
296 """Fire a signal to notify that a download has started."""301 """Fire a signal to notify that a download has started."""
@@ -300,7 +305,7 @@
300 """Fire a signal to notify about a download progress."""305 """Fire a signal to notify about a download progress."""
301306
302 @signal307 @signal
303 def DownloadFinished(self, path, info):308 def DownloadFinished(self, path, *info):
304 """Fire a signal to notify that a download has finished."""309 """Fire a signal to notify that a download has finished."""
305310
306 @signal311 @signal
@@ -312,7 +317,7 @@
312 """Fire a signal to notify about an upload progress."""317 """Fire a signal to notify about an upload progress."""
313318
314 @signal319 @signal
315 def UploadFinished(self, path, info):320 def UploadFinished(self, path, *info):
316 """Fire a signal to notify that an upload has finished."""321 """Fire a signal to notify that an upload has finished."""
317322
318 @signal323 @signal
319324
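
Both DownloadFinished/UploadFinished here and the matching handlers in ipc_client.py switch from a fixed info argument (or **info) to *info, so a signal can be forwarded with however many positional values the platform layer supplies. A small, self-contained illustration of why the variadic form is more forgiving (plain Python, not the perspective broker plumbing):

    def fixed_handler(path, info):
        return (path, info)

    def variadic_handler(path, *info):
        return (path, info)

    args = ('~/Ubuntu One/report.odt',)   # signal emitted with no extra payload

    try:
        fixed_handler(*args)              # breaks: 'info' is missing
    except TypeError as error:
        print('fixed signature fails: %s' % error)

    print(variadic_handler(*args))        # works: ('~/Ubuntu One/report.odt', ())
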
=== modified file 'ubuntuone/platform/os_helper/__init__.py'
--- ubuntuone/platform/os_helper/__init__.py 2012-06-22 19:37:36 +0000
+++ ubuntuone/platform/os_helper/__init__.py 2012-08-22 18:22:29 +0000
@@ -73,6 +73,7 @@
7373
74# Decorators74# Decorators
7575
76get_os_valid_path = source.get_os_valid_path
76is_valid_syncdaemon_path = source.is_valid_syncdaemon_path77is_valid_syncdaemon_path = source.is_valid_syncdaemon_path
77is_valid_os_path = source.is_valid_os_path78is_valid_os_path = source.is_valid_os_path
78os_path = source.os_path79os_path = source.os_path
7980
=== modified file 'ubuntuone/platform/os_helper/darwin.py'
--- ubuntuone/platform/os_helper/darwin.py 2012-07-10 18:41:15 +0000
+++ ubuntuone/platform/os_helper/darwin.py 2012-08-22 18:22:29 +0000
@@ -29,8 +29,8 @@
29"""29"""
30Darwin import for ubuntuone-client30Darwin import for ubuntuone-client
3131
32This module has to have all darwin specific modules and provide the api required32This module has to have all darwin specific modules and provide the
33to support the darwin platform.33api required to support the darwin platform.
34"""34"""
3535
36import errno36import errno
@@ -70,6 +70,7 @@
70is_root = unix.is_root70is_root = unix.is_root
71get_path_list = unix.get_path_list71get_path_list = unix.get_path_list
72normpath = unix.normpath72normpath = unix.normpath
73get_os_valid_path = unix.get_os_valid_path
7374
7475
75def move_to_trash(path):76def move_to_trash(path):
@@ -121,6 +122,8 @@
121def is_valid_os_path(path_indexes=None):122def is_valid_os_path(path_indexes=None):
122 def decorator(func):123 def decorator(func):
123 def wrapped(*args, **kwargs):124 def wrapped(*args, **kwargs):
125 for i in path_indexes:
126 assert isinstance(args[i], str), 'Path %r should be str.'
124 return func(*args, **kwargs)127 return func(*args, **kwargs)
125 return wrapped128 return wrapped
126 return decorator129 return decorator
127130
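
The darwin is_valid_os_path decorator now asserts that every positional argument named in path_indexes is a plain str before calling through. A simplified, standalone sketch of that pattern (the real helpers also normalise and convert the paths):

    def is_valid_os_path(path_indexes=None):
        """Check that the positional args at path_indexes look like os paths."""
        path_indexes = path_indexes or [0]

        def decorator(func):
            def wrapped(*args, **kwargs):
                for i in path_indexes:
                    assert isinstance(args[i], str), \
                        'Path %r should be str.' % (args[i],)
                return func(*args, **kwargs)
            return wrapped
        return decorator

    @is_valid_os_path(path_indexes=[0])
    def remove_file(path):
        print('removing %s' % path)

    remove_file('/tmp/example.txt')       # passes the check
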
=== modified file 'ubuntuone/platform/os_helper/linux.py'
--- ubuntuone/platform/os_helper/linux.py 2012-06-27 12:51:20 +0000
+++ ubuntuone/platform/os_helper/linux.py 2012-08-22 18:22:29 +0000
@@ -123,6 +123,7 @@
123is_root = unix.is_root123is_root = unix.is_root
124get_path_list = unix.get_path_list124get_path_list = unix.get_path_list
125normpath = unix.normpath125normpath = unix.normpath
126get_os_valid_path = unix.get_os_valid_path
126is_valid_syncdaemon_path = None127is_valid_syncdaemon_path = None
127is_valid_os_path = None128is_valid_os_path = None
128os_path = None129os_path = None
129130
=== modified file 'ubuntuone/platform/os_helper/unix.py'
--- ubuntuone/platform/os_helper/unix.py 2012-06-27 14:02:14 +0000
+++ ubuntuone/platform/os_helper/unix.py 2012-08-22 18:22:29 +0000
@@ -185,3 +185,8 @@
185def normpath(path):185def normpath(path):
186 """Normalize path, eliminating double slashes, etc."""186 """Normalize path, eliminating double slashes, etc."""
187 return os.path.normpath(path)187 return os.path.normpath(path)
188
189
190def get_os_valid_path(path):
191 """Return a valid os path."""
192 return os.path.abspath(path)
188193
=== modified file 'ubuntuone/platform/os_helper/windows.py'
--- ubuntuone/platform/os_helper/windows.py 2012-07-03 17:16:57 +0000
+++ ubuntuone/platform/os_helper/windows.py 2012-08-22 18:22:29 +0000
@@ -249,6 +249,8 @@
249 assert_windows_path(result)249 assert_windows_path(result)
250 return result250 return result
251251
252get_os_valid_path = get_windows_valid_path
253
252254
253def _unicode_to_bytes(path):255def _unicode_to_bytes(path):
254 """Convert a unicode path to a bytes path."""256 """Convert a unicode path to a bytes path."""
255257
=== modified file 'ubuntuone/platform/tools/__init__.py'
--- ubuntuone/platform/tools/__init__.py 2012-05-30 15:35:49 +0000
+++ ubuntuone/platform/tools/__init__.py 2012-08-22 18:22:29 +0000
@@ -326,6 +326,13 @@
326326
327 @defer.inlineCallbacks327 @defer.inlineCallbacks
328 @log_call(logger.debug)328 @log_call(logger.debug)
329 def sync_menu(self):
330 """Return a deferred that will be fired with the sync menu data."""
331 results = yield self.proxy.call_method('status', 'sync_menu')
332 defer.returnValue(results)
333
334 @defer.inlineCallbacks
335 @log_call(logger.debug)
329 def accept_share(self, share_id):336 def accept_share(self, share_id):
330 """Accept the share with id: share_id."""337 """Accept the share with id: share_id."""
331 d = self.wait_for_signals(signal_ok='ShareAnswerResponse',338 d = self.wait_for_signals(signal_ok='ShareAnswerResponse',
332339
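
With sync_menu exposed through the tools layer, a frontend can fetch the menu data over IPC with a single deferred call. A hypothetical usage sketch (SyncDaemonTool is the existing tools class that gains the method above; a running syncdaemon is assumed):

    from twisted.internet import defer, reactor
    from ubuntuone.platform.tools import SyncDaemonTool

    @defer.inlineCallbacks
    def show_menu_data():
        tool = SyncDaemonTool()
        data = yield tool.sync_menu()
        # 'uploading' -> [(path, size, bytes_written), ...]
        for path, size, written in data['uploading']:
            print('%s: %d/%d bytes' % (path, written, size))
        # 'recent-transfers' -> [path, ...]
        for path in data['recent-transfers']:
            print('recently transferred: %s' % path)
        reactor.stop()

    reactor.callWhenRunning(show_menu_data)
    reactor.run()
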
=== modified file 'ubuntuone/status/aggregator.py'
--- ubuntuone/status/aggregator.py 2012-04-09 20:07:05 +0000
+++ ubuntuone/status/aggregator.py 2012-08-22 18:22:29 +0000
@@ -33,7 +33,7 @@
33import itertools33import itertools
34import operator34import operator
35import os35import os
3636from collections import deque
3737
38import gettext38import gettext
3939
@@ -624,6 +624,7 @@
624 self.finished_delay = 10624 self.finished_delay = 10
625 self.progress = {}625 self.progress = {}
626 self.to_do = {}626 self.to_do = {}
627 self.recent_transfers = deque(maxlen=5)
627628
628 def get_notification(self):629 def get_notification(self):
629 """Create a new toggleable notification object."""630 """Create a new toggleable notification object."""
@@ -775,6 +776,7 @@
775 if command.deflated_size is not None:776 if command.deflated_size is not None:
776 self.progress[777 self.progress[
777 (command.share_id, command.node_id)] = command.deflated_size778 (command.share_id, command.node_id)] = command.deflated_size
779 self.recent_transfers.append(command.path)
778 logger.debug("unqueueing command: %s", command.__class__.__name__)780 logger.debug("unqueueing command: %s", command.__class__.__name__)
779 self.update_progressbar()781 self.update_progressbar()
780782
@@ -806,6 +808,19 @@
806 self.messaging = Messaging()808 self.messaging = Messaging()
807 self.quota_timer = None809 self.quota_timer = None
808810
811 def recent_transfers(self):
812 """Return a tuple with the recent transfers paths."""
813 return list(self.aggregator.recent_transfers)
814
815 def files_uploading(self):
816 """Return a list with the files being uploading."""
817 uploading = []
818 for upload in self.aggregator.files_uploading:
819 if upload.size != 0:
820 uploading.append((upload.path, upload.size,
821 upload.n_bytes_written))
822 return uploading
823
809 def file_published(self, public_url):824 def file_published(self, public_url):
810 """A file was published."""825 """A file was published."""
811 status_event = FilePublishingStatus(new_public_url=public_url)826 status_event = FilePublishingStatus(new_public_url=public_url)
812827
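
The aggregator changes above are small but central to the new menu: a bounded deque keeps only the five most recent transfer paths, and only uploads with a known size are reported as (path, size, bytes_written) tuples. A minimal sketch of that bookkeeping, with a hypothetical FakeUpload standing in for the real action queue commands:

    from collections import deque

    class FakeUpload(object):
        """Stand-in for an action_queue Upload command."""
        def __init__(self, path, size=0, n_bytes_written=0):
            self.path = path
            self.size = size
            self.n_bytes_written = n_bytes_written

    recent_transfers = deque(maxlen=5)
    for i in range(15):
        recent_transfers.append('path%d' % i)
    print(list(recent_transfers))          # only the last five paths survive

    files_uploading = [FakeUpload('a.txt', 1000, 200), FakeUpload('b.txt')]
    uploading = [(u.path, u.size, u.n_bytes_written)
                 for u in files_uploading if u.size != 0]
    print(uploading)                       # [('a.txt', 1000, 200)]
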
=== modified file 'ubuntuone/syncdaemon/__init__.py'
--- ubuntuone/syncdaemon/__init__.py 2012-04-09 20:07:05 +0000
+++ ubuntuone/syncdaemon/__init__.py 2012-08-22 18:22:29 +0000
@@ -36,3 +36,8 @@
36 "volumes",36 "volumes",
37 "generations",37 "generations",
38 ])38 ])
39
40
41#Sync Menu data constants
42RECENT_TRANSFERS = 'recent-transfers'
43UPLOADING = 'uploading'
3944
=== modified file 'ubuntuone/syncdaemon/event_queue.py'
--- ubuntuone/syncdaemon/event_queue.py 2012-07-31 08:26:30 +0000
+++ ubuntuone/syncdaemon/event_queue.py 2012-08-22 18:22:29 +0000
@@ -159,6 +159,7 @@
159159
160 'FSM_FILE_CONFLICT': ('old_name', 'new_name'),160 'FSM_FILE_CONFLICT': ('old_name', 'new_name'),
161 'FSM_DIR_CONFLICT': ('old_name', 'new_name'),161 'FSM_DIR_CONFLICT': ('old_name', 'new_name'),
162 'FSM_PARTIAL_COMMITED': ('share_id', 'node_id'),
162163
163 'VM_UDF_SUBSCRIBED': ('udf',),164 'VM_UDF_SUBSCRIBED': ('udf',),
164 'VM_UDF_SUBSCRIBE_ERROR': ('udf_id', 'error'),165 'VM_UDF_SUBSCRIBE_ERROR': ('udf_id', 'error'),
165166
=== modified file 'ubuntuone/syncdaemon/filesystem_manager.py'
--- ubuntuone/syncdaemon/filesystem_manager.py 2012-06-22 11:29:10 +0000
+++ ubuntuone/syncdaemon/filesystem_manager.py 2012-08-22 18:22:29 +0000
@@ -1142,7 +1142,7 @@
1142 return fd1142 return fd
11431143
1144 def commit_partial(self, node_id, share_id, local_hash):1144 def commit_partial(self, node_id, share_id, local_hash):
1145 """Create a .partial in disk and set the flag in metadata."""1145 """Commit a file from a .partial to disk."""
1146 mdid = self._idx_node_id[(share_id, node_id)]1146 mdid = self._idx_node_id[(share_id, node_id)]
1147 mdobj = self.fs[mdid]1147 mdobj = self.fs[mdid]
1148 if mdobj["is_dir"]:1148 if mdobj["is_dir"]:
@@ -1166,6 +1166,8 @@
1166 mdobj["info"]["is_partial"] = False1166 mdobj["info"]["is_partial"] = False
1167 mdobj["stat"] = get_stat(path)1167 mdobj["stat"] = get_stat(path)
1168 self.fs[mdid] = mdobj1168 self.fs[mdid] = mdobj
1169 self.eq.push("FSM_PARTIAL_COMMITED", share_id=share_id,
1170 node_id=node_id)
11691171
1170 def remove_partial(self, node_id, share_id):1172 def remove_partial(self, node_id, share_id):
1171 """Remove a .partial in disk and set the flag in metadata."""1173 """Remove a .partial in disk and set the flag in metadata."""
11721174
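
commit_partial now pushes FSM_PARTIAL_COMMITED as its last step, which is what lets the DownloadFinished IPC signal fire only after the partial is committed (LP: #1031197). A simplified sketch of that ordering, with fake stand-ins for the metadata store and the event queue:

    class FakeEventQueue(object):
        """Collects pushed events instead of dispatching them."""
        def __init__(self):
            self.events = []

        def push(self, event_name, **kwargs):
            self.events.append((event_name, kwargs))

    def commit_partial(fs, eq, share_id, node_id, local_hash):
        mdobj = fs[(share_id, node_id)]
        mdobj['local_hash'] = local_hash       # record the new content hash
        mdobj['info']['is_partial'] = False    # the .partial is gone now
        fs[(share_id, node_id)] = mdobj        # persist the metadata first...
        eq.push('FSM_PARTIAL_COMMITED',        # ...and only then notify listeners
                share_id=share_id, node_id=node_id)

    eq = FakeEventQueue()
    fs = {('share', 'node'): {'info': {'is_partial': True}}}
    commit_partial(fs, eq, 'share', 'node', 'localhash')
    print(eq.events)   # [('FSM_PARTIAL_COMMITED', {'share_id': ..., 'node_id': ...})]
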
=== modified file 'ubuntuone/syncdaemon/interaction_interfaces.py'
--- ubuntuone/syncdaemon/interaction_interfaces.py 2012-05-22 14:07:55 +0000
+++ ubuntuone/syncdaemon/interaction_interfaces.py 2012-08-22 18:22:29 +0000
@@ -302,6 +302,10 @@
302 waiting_content.append(data)302 waiting_content.append(data)
303 return waiting_content303 return waiting_content
304304
305 def sync_menu(self):
306 """Return the info necessary to construct the menu."""
307 return self.main.status_listener.menu_data()
308
305309
306class SyncdaemonFileSystem(SyncdaemonObject):310class SyncdaemonFileSystem(SyncdaemonObject):
307 """An interface to the FileSystem Manager."""311 """An interface to the FileSystem Manager."""
@@ -816,8 +820,8 @@
816 self._path_from_ids(share_id, node_id, 'DownloadFileProgress', info)820 self._path_from_ids(share_id, node_id, 'DownloadFileProgress', info)
817821
818 @log_call(logger.debug)822 @log_call(logger.debug)
819 def handle_AQ_DOWNLOAD_FINISHED(self, share_id, node_id, server_hash):823 def handle_FSM_PARTIAL_COMMITED(self, share_id, node_id):
820 """Handle AQ_DOWNLOAD_FINISHED."""824 """Handle FSM_PARTIAL_COMMITED."""
821 self._path_from_ids(share_id, node_id, 'DownloadFinished', info={})825 self._path_from_ids(share_id, node_id, 'DownloadFinished', info={})
822826
823 @log_call(logger.debug)827 @log_call(logger.debug)
824828
=== modified file 'ubuntuone/syncdaemon/status_listener.py'
--- ubuntuone/syncdaemon/status_listener.py 2012-04-09 20:07:05 +0000
+++ ubuntuone/syncdaemon/status_listener.py 2012-08-22 18:22:29 +0000
@@ -31,7 +31,12 @@
31"""Listener for event queue that updates the UI to show syncdaemon status."""31"""Listener for event queue that updates the UI to show syncdaemon status."""
3232
33from ubuntuone.status.aggregator import StatusFrontend33from ubuntuone.status.aggregator import StatusFrontend
34from ubuntuone.syncdaemon import action_queue, config34from ubuntuone.syncdaemon import (
35 action_queue,
36 config,
37 RECENT_TRANSFERS,
38 UPLOADING,
39)
35from ubuntuone.syncdaemon.interaction_interfaces import (40from ubuntuone.syncdaemon.interaction_interfaces import (
36 get_share_dict, get_udf_dict)41 get_share_dict, get_udf_dict)
37from ubuntuone.syncdaemon.volume_manager import UDF, Root42from ubuntuone.syncdaemon.volume_manager import UDF, Root
@@ -66,6 +71,13 @@
66 user_conf = config.get_user_config()71 user_conf = config.get_user_config()
67 self.show_all_notifications = user_conf.get_show_all_notifications()72 self.show_all_notifications = user_conf.get_show_all_notifications()
6873
74 def menu_data(self):
75 """Return the info necessary to construct the sync menu."""
76 uploading = self.status_frontend.files_uploading()
77 transfers = self.status_frontend.recent_transfers()
78 data = {RECENT_TRANSFERS: transfers, UPLOADING: uploading}
79 return data
80
69 def get_show_all_notifications(self):81 def get_show_all_notifications(self):
70 """Get the value of show_all_notifications."""82 """Get the value of show_all_notifications."""
71 return self._show_all_notifications83 return self._show_all_notifications
7284
=== modified file 'ubuntuone/syncdaemon/volume_manager.py'
--- ubuntuone/syncdaemon/volume_manager.py 2012-05-14 21:24:24 +0000
+++ ubuntuone/syncdaemon/volume_manager.py 2012-08-22 18:22:29 +0000
@@ -1207,6 +1207,10 @@
1207 if is_link(path):1207 if is_link(path):
1208 return (False, "UDFs can not be a symlink")1208 return (False, "UDFs can not be a symlink")
12091209
1210 # check if path exists but is not a directory
1211 if path_exists(path) and not os.path.isdir(path):
1212 return (False, "The path exists but is not a folder")
1213
1210 # check if the path it's ok (outside root and1214 # check if the path it's ok (outside root and
1211 # isn't a ancestor or child of another UDF)1215 # isn't a ancestor or child of another UDF)
1212 if self._is_nested_udf(path):1216 if self._is_nested_udf(path):
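
The new check rejects a UDF path that exists but is a regular file (LP: #1033582), and it runs after the symlink check. A minimal standalone sketch of that validation order (the real validate_path_for_folder also rejects nested UDFs and paths outside the home directory):

    import os

    def validate_path_for_folder(path):
        """Tell whether path can host a new folder, with a reason if not."""
        if os.path.islink(path):
            return (False, "UDFs can not be a symlink")
        if os.path.exists(path) and not os.path.isdir(path):
            return (False, "The path exists but is not a folder")
        return (True, "")

    print(validate_path_for_folder('/etc/hostname'))   # an existing file: rejected
    print(validate_path_for_folder('/tmp/brand-new'))  # does not exist yet: accepted
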
