Merge lp:~super-friends/friends/raring into lp:friends

Proposed by Ken VanDine on 2013-03-20
Status: Merged
Approved by: Ken VanDine on 2013-03-20
Approved revision: 182
Merged at revision: 164
Proposed branch: lp:~super-friends/friends/raring
Merge into: lp:friends
Diff against target: 3182 lines (+1267/-632)
34 files modified
debian/changelog (+18/-0)
debian/friends-dispatcher.install (+3/-0)
debian/rules (+4/-1)
friends/main.py (+9/-8)
friends/protocols/facebook.py (+47/-20)
friends/protocols/flickr.py (+5/-2)
friends/protocols/foursquare.py (+6/-1)
friends/protocols/twitter.py (+7/-3)
friends/service/dispatcher.py (+58/-8)
friends/tests/data/facebook-full.dat (+366/-1)
friends/tests/data/flickr-full.dat (+13/-1)
friends/tests/mocks.py (+50/-2)
friends/tests/test_account.py (+3/-62)
friends/tests/test_cache.py (+0/-4)
friends/tests/test_dispatcher.py (+75/-10)
friends/tests/test_facebook.py (+153/-72)
friends/tests/test_flickr.py (+59/-76)
friends/tests/test_foursquare.py (+7/-15)
friends/tests/test_identica.py (+2/-10)
friends/tests/test_mock_dispatcher.py (+0/-2)
friends/tests/test_model.py (+11/-2)
friends/tests/test_notify.py (+1/-14)
friends/tests/test_protocols.py (+154/-79)
friends/tests/test_shortener.py (+0/-2)
friends/tests/test_twitter.py (+52/-32)
friends/utils/account.py (+3/-23)
friends/utils/authentication.py (+6/-1)
friends/utils/base.py (+97/-143)
friends/utils/model.py (+9/-20)
service/configure.ac (+1/-0)
service/src/Makefile.am (+3/-2)
service/src/service.vala (+43/-15)
setup.py (+1/-0)
tools/debug_live.py (+1/-1)
To merge this branch: bzr merge lp:~super-friends/friends/raring
Reviewer Review Type Date Requested Status
PS Jenkins bot (community) continuous-integration Approve on 2013-03-20
Ken VanDine Approve on 2013-03-20
Review via email: mp+154365@code.launchpad.net

Commit message

  * Stop deduplicating messages across protocols, simplifying model
    schema (LP: #1156941)
  * Add schema columns for latitude, longitude, and location name.
  * Fix 'likes' column from gdouble to guint64.
  * Add geotagging support from foursquare, facebook, flickr.
  * Implement since= for Facebook, reducing bandwidth usage.
  * Automatically prepend the required @mention to Twitter
    replies (LP: #1156829)
  * Automatically linkify URLs that get published to the model.
  * Fix the publishing of Facebook Stories (LP: #1155785)
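The since= support above hinges on a small per-protocol cache that only ever moves timestamps forward (the PostIdCache added in the diff). A minimal sketch of the idea, with illustrative names rather than the actual friends classes: because ISO 8601 UTC strings sort lexically in chronological order, a plain string comparison is enough to keep only the newest timestamp per stream.

```python
class MonotonicTimestampCache(dict):
    """Keep only the most recent ISO 8601 timestamp seen per stream."""

    def __setitem__(self, key, value):
        if '/' in key:
            # Skip derived streams like "reply_to/..." or "search/...";
            # only the main streams need a since= bookmark.
            return
        # Lexical comparison works because ISO 8601 strings
        # (e.g. "2013-03-20T13:27:53Z") sort chronologically.
        if value > self.get(key, ''):
            super().__setitem__(key, value)


cache = MonotonicTimestampCache()
cache['messages'] = '2013-03-11T23:46:06Z'
cache['messages'] = '2013-03-10T08:00:00Z'    # older value: ignored
cache['reply_to/42'] = '2013-03-12T00:00:00Z'  # derived stream: ignored
```

On the next refresh, `cache.get('messages', default_window)` is handed to the API as the since= parameter, so only posts newer than the last fetch are downloaded. (The real PostIdCache additionally persists itself as JSON via JsonCache.)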

Description of the change

Merge the changes queued up for raring into trunk, now that the FFe (bug #1156979) has been approved.

Ken VanDine (ken-vandine) wrote :

All of these changes were reviewed before being merged into the raring branch.

review: Approve
review: Approve (continuous-integration)

Preview Diff

1=== modified file 'debian/changelog'
2--- debian/changelog 2013-03-19 05:03:07 +0000
3+++ debian/changelog 2013-03-20 13:27:53 +0000
4@@ -1,3 +1,21 @@
5+friends (0.1.3-0ubuntu1) UNRELEASED; urgency=low
6+
7+ [ Robert Bruce Park ]
8+ * Keep the Dispatcher alive for 30s beyond the return of the final
9+ method invocation.
10+ * Stop deduplicating messages across protocols, simplifying model
11+ schema (LP: #1156941)
12+ * Add schema columns for latitude, longitude, and location name.
13+ * Fix 'likes' column from gdouble to guint64.
14+ * Add geotagging support from foursquare, facebook, flickr.
15+ * Implement since= for Facebook, reducing bandwidth usage.
16+ * Automatically prepend the required @mention to Twitter
17+ replies (LP: #1156829)
18+ * Automatically linkify URLs that get published to the model.
19+ * Fix the publishing of Facebook Stories (LP: #1155785)
20+
21+ -- Ken VanDine <ken.vandine@canonical.com> Wed, 20 Mar 2013 09:14:15 -0400
22+
23 friends (0.1.2daily13.03.19-0ubuntu1) raring; urgency=low
24
25 [ Martin Pitt ]
26
27=== modified file 'debian/friends-dispatcher.install'
28--- debian/friends-dispatcher.install 2013-02-05 01:25:43 +0000
29+++ debian/friends-dispatcher.install 2013-03-20 13:27:53 +0000
30@@ -5,4 +5,7 @@
31 usr/lib/python3/dist-packages/friends/service/*
32 usr/lib/python3/dist-packages/friends/shorteners/*
33 usr/lib/python3/dist-packages/friends/utils/*
34+usr/lib/python3/dist-packages/friends/tests/mocks.py
35+usr/lib/python3/dist-packages/friends/tests/__init__.py
36+usr/lib/python3/dist-packages/friends/tests/data/*
37 usr/share/dbus-1/services/com.canonical.Friends.Dispatcher.service
38
39=== modified file 'debian/rules'
40--- debian/rules 2013-02-13 05:43:57 +0000
41+++ debian/rules 2013-03-20 13:27:53 +0000
42@@ -18,8 +18,11 @@
43 override_dh_auto_install:
44 python3 setup.py install --root=$(CURDIR)/debian/tmp --install-layout=deb
45 python3 setup.py install_service_files -d $(CURDIR)/debian/tmp/usr
46+ rm -rf $(CURDIR)/debian/tmp/usr/lib/python3/dist-packages/friends/tests/test*
47+ rm -rf $(CURDIR)/debian/tmp/usr/lib/python3/dist-packages/friends/*/__pycache__
48+ rm -rf $(CURDIR)/debian/tmp/usr/lib/python3/dist-packages/friends/__pycache__
49 dh_auto_install -D service
50- dh_install --list-missing
51+ dh_install --fail-missing
52
53 override_dh_auto_test:
54 dbus-test-runner -t make -p check -m 300
55
56=== modified file 'friends/main.py'
57--- friends/main.py 2013-03-01 20:28:27 +0000
58+++ friends/main.py 2013-03-20 13:27:53 +0000
59@@ -43,11 +43,16 @@
60
61 if args.test:
62 from friends.service.mock_service import Dispatcher
63+ from friends.tests.mocks import populate_fake_data
64+
65+ populate_fake_data()
66 Dispatcher()
67+
68 try:
69 loop.run()
70 except KeyboardInterrupt:
71 pass
72+
73 sys.exit(0)
74
75
76@@ -57,11 +62,9 @@
77 GObject.threads_init(None)
78
79 from friends.service.dispatcher import Dispatcher, DBUS_INTERFACE
80-from friends.utils.base import _OperationThread, _publish_lock
81-from friends.utils.base import Base, initialize_caches
82+from friends.utils.base import Base, initialize_caches, _publish_lock
83 from friends.utils.model import Model, prune_model
84 from friends.utils.logging import initialize
85-from friends.utils.avatar import Avatar
86
87
88 # Optional performance profiling module.
89@@ -84,10 +87,6 @@
90 print(class_name)
91 return
92
93- # Our threading implementation needs to know how to quit the
94- # application once all threads have completed.
95- _OperationThread.shutdown = loop.quit
96-
97 # Disallow multiple instances of friends-dispatcher
98 bus = dbus.SessionBus()
99 obj = bus.get_object('org.freedesktop.DBus', '/org/freedesktop/DBus')
100@@ -140,7 +139,9 @@
101 log.info('Starting friends-dispatcher main loop')
102 loop.run()
103 except KeyboardInterrupt:
104- log.info('Stopped friends-dispatcher main loop')
105+ pass
106+
107+ log.info('Stopped friends-dispatcher main loop')
108
109 # This bit doesn't run until after the mainloop exits.
110 if args.performance and yappi is not None:
111
112=== modified file 'friends/protocols/facebook.py'
113--- friends/protocols/facebook.py 2013-02-27 22:22:38 +0000
114+++ friends/protocols/facebook.py 2013-03-20 13:27:53 +0000
115@@ -24,10 +24,9 @@
116 import time
117 import logging
118
119-from datetime import datetime, timedelta
120-
121 from friends.utils.avatar import Avatar
122 from friends.utils.base import Base, feature
123+from friends.utils.cache import JsonCache
124 from friends.utils.http import Downloader, Uploader
125 from friends.utils.time import parsetime, iso8601utc
126 from friends.errors import FriendsError
127@@ -42,10 +41,17 @@
128 FACEBOOK_ADDRESS_BOOK = 'friends-facebook-contacts'
129
130
131+TEN_DAYS = 864000 # seconds
132+
133+
134 log = logging.getLogger(__name__)
135
136
137 class Facebook(Base):
138+ def __init__(self, account):
139+ super().__init__(account)
140+ self._timestamps = PostIdCache(self._name + '_ids')
141+
142 def _whoami(self, authdata):
143 """Identify the authenticating user."""
144 me_data = Downloader(
145@@ -59,15 +65,22 @@
146 # We can't do much with this entry.
147 return
148
149+ place = entry.get('place', {})
150+ location = place.get('location', {})
151+
152 args = dict(
153+ message_id=message_id,
154 stream=stream,
155- message=entry.get('message', ''),
156+ message=entry.get('message', '') or entry.get('story', ''),
157 icon_uri=entry.get('icon', ''),
158 link_picture=entry.get('picture', ''),
159 link_name=entry.get('name', ''),
160 link_url=entry.get('link', ''),
161 link_desc=entry.get('description', ''),
162 link_caption=entry.get('caption', ''),
163+ location=place.get('name', ''),
164+ latitude=location.get('latitude', 0.0),
165+ longitude=location.get('longitude', 0.0),
166 )
167
168 # Posts gives us a likes dict, while replies give us an int.
169@@ -89,10 +102,16 @@
170 # Normalize the timestamp.
171 timestamp = entry.get('updated_time', entry.get('created_time'))
172 if timestamp is not None:
173- args['timestamp'] = iso8601utc(parsetime(timestamp))
174+ timestamp = args['timestamp'] = iso8601utc(parsetime(timestamp))
175+ # We need to record timestamps for use with since=. Note that
176+ # _timestamps is a special dict subclass that only accepts
177+ # timestamps that are larger than the existing value, so at any
178+ # given time it will map the stream to the most
179+ # recent timestamp we've seen for that stream.
180+ self._timestamps[stream] = timestamp
181
182 # Publish this message into the SharedModel.
183- self._publish(message_id, **args)
184+ self._publish(**args)
185
186 # If there are any replies, publish them as well.
187 for comment in entry.get('comments', {}).get('data', []):
188@@ -109,6 +128,7 @@
189
190 while True:
191 response = Downloader(url, params).get_json()
192+
193 if self._is_error(response):
194 break
195
196@@ -137,24 +157,18 @@
197 # We've gotten everything Facebook is going to give us.
198 return entries
199
200- def _get(self, url, stream, since=None):
201+ def _get(self, url, stream):
202 """Retrieve a list of Facebook objects.
203
204 A maximum of 50 objects are requested.
205-
206- :param since: Only get objects posted since this date. If not given,
207- then only objects younger than 10 days are retrieved. The value
208- is a number seconds since the epoch.
209- :type since: float
210 """
211 access_token = self._get_access_token()
212- if since is None:
213- when = datetime.now() - timedelta(days=10)
214- else:
215- when = datetime.fromtimestamp(since)
216+ since = self._timestamps.get(
217+ stream, iso8601utc(int(time.time()) - TEN_DAYS))
218+
219 entries = []
220 params = dict(access_token=access_token,
221- since=when.isoformat(),
222+ since=since,
223 limit=self._DOWNLOAD_LIMIT)
224
225 entries = self._follow_pagination(url, params)
226@@ -163,15 +177,15 @@
227 self._publish_entry(entry, stream=stream)
228
229 @feature
230- def home(self, since=None):
231+ def home(self):
232 """Gather and publish public timeline messages."""
233- self._get(ME_URL + '/home', 'messages', since)
234+ self._get(ME_URL + '/home', 'messages')
235 return self._get_n_rows()
236
237 @feature
238- def wall(self, since=None):
239+ def wall(self):
240 """Gather and publish messages written on user's wall."""
241- self._get(ME_URL + '/feed', 'mentions', since)
242+ self._get(ME_URL + '/feed', 'mentions')
243 return self._get_n_rows()
244
245 @feature
246@@ -369,3 +383,16 @@
247 def delete_contacts(self):
248 source = self._get_eds_source(FACEBOOK_ADDRESS_BOOK)
249 return self._delete_service_contacts(source)
250+
251+
252+class PostIdCache(JsonCache):
253+ """Persist most-recent timestamps as JSON."""
254+
255+ def __setitem__(self, key, value):
256+ if key.find('/') >= 0:
257+ # Don't flood the cache with irrelevant "reply_to/..." and
258+ # "search/..." streams, we only need the main streams.
259+ return
260+ # Thank SCIENCE for lexically-sortable timestamp strings!
261+ if value > self.get(key, ''):
262+ JsonCache.__setitem__(self, key, value)
263
264=== modified file 'friends/protocols/flickr.py'
265--- friends/protocols/flickr.py 2013-02-27 23:04:36 +0000
266+++ friends/protocols/flickr.py 2013-03-20 13:27:53 +0000
267@@ -113,7 +113,7 @@
268 method='flickr.photos.getContactsPhotos',
269 format='json',
270 nojsoncallback='1',
271- extras='date_upload,owner_name,icon_server',
272+ extras='date_upload,owner_name,icon_server,geo',
273 )
274
275 response = self._get_url(args)
276@@ -166,7 +166,10 @@
277 link_caption=data.get('title', ''),
278 link_url=img_url,
279 link_picture=img_src,
280- link_icon=img_thumb)
281+ link_icon=img_thumb,
282+ latitude=data.get('latitude', 0.0),
283+ longitude=data.get('longitude', 0.0),
284+ )
285 return self._get_n_rows()
286
287 # http://www.flickr.com/services/api/upload.api.html
288
289=== modified file 'friends/protocols/foursquare.py'
290--- friends/protocols/foursquare.py 2013-02-26 19:05:10 +0000
291+++ friends/protocols/foursquare.py 2013-03-20 13:27:53 +0000
292@@ -85,6 +85,8 @@
293 checkin_id = checkin.get('id', '')
294 tz_offset = checkin.get('timeZoneOffset', 0)
295 epoch = checkin.get('createdAt', 0)
296+ venue = checkin.get('venue', {})
297+ location = venue.get('location', {})
298 self._publish(
299 message_id=checkin_id,
300 stream='messages',
301@@ -94,6 +96,9 @@
302 message=checkin.get('shout', ''),
303 likes=checkin.get('likes', {}).get('count', 0),
304 icon_uri=Avatar.get_image(avatar_url),
305- url=checkin.get('venue', {}).get('canonicalUrl', ''),
306+ url=venue.get('canonicalUrl', ''),
307+ location=venue.get('name', ''),
308+ latitude=location.get('lat', 0.0),
309+ longitude=location.get('lng', 0.0),
310 )
311 return self._get_n_rows()
312
313=== modified file 'friends/protocols/twitter.py'
314--- friends/protocols/twitter.py 2013-03-08 02:31:15 +0000
315+++ friends/protocols/twitter.py 2013-03-20 13:27:53 +0000
316@@ -26,12 +26,10 @@
317 import logging
318
319 from urllib.parse import quote
320-from gi.repository import GLib
321
322 from friends.utils.avatar import Avatar
323 from friends.utils.base import Base, feature
324 from friends.utils.cache import JsonCache
325-from friends.utils.model import Model
326 from friends.utils.http import BaseRateLimiter, Downloader
327 from friends.utils.time import parsetime, iso8601utc
328 from friends.errors import FriendsError
329@@ -70,7 +68,7 @@
330 super().__init__(account)
331 self._rate_limiter = RateLimiter()
332 # Can be 'twitter_ids' or 'identica_ids'
333- self._tweet_ids = TweetIdCache(self.__class__.__name__.lower() + '_ids')
334+ self._tweet_ids = TweetIdCache(self._name + '_ids')
335
336 def _whoami(self, authdata):
337 """Identify the authenticating user."""
338@@ -254,6 +252,12 @@
339 order for Twitter to actually accept this as a reply. Otherwise it
340 will just be an ordinary tweet.
341 """
342+ try:
343+ sender = '@{}'.format(self._fetch_cell(message_id, 'sender_nick'))
344+ if message.find(sender) < 0:
345+ message = sender + ' ' + message
346+ except FriendsError:
347+ pass
348 url = self._api_base.format(endpoint='statuses/update')
349 tweet = self._get_url(url, dict(in_reply_to_status_id=message_id,
350 status=message))
351
352=== modified file 'friends/service/dispatcher.py'
353--- friends/service/dispatcher.py 2013-02-20 13:08:27 +0000
354+++ friends/service/dispatcher.py 2013-03-20 13:27:53 +0000
355@@ -28,12 +28,13 @@
356 import dbus.service
357
358 from gi.repository import GLib
359+from contextlib import ContextDecorator
360
361 from friends.utils.avatar import Avatar
362 from friends.utils.account import AccountManager
363 from friends.utils.manager import protocol_manager
364 from friends.utils.menus import MenuManager
365-from friends.utils.model import Model
366+from friends.utils.model import Model, persist_model
367 from friends.shorteners import lookup
368
369
370@@ -43,6 +44,48 @@
371 STUB = lambda *ignore, **kwignore: None
372
373
374+# Avoid race condition during shut-down
375+_exit_lock = threading.Lock()
376+
377+
378+class ManageTimers(ContextDecorator):
379+ """Exit the dispatcher 30s after the most recent method call returns."""
380+ timers = set()
381+ callback = STUB
382+
383+ def __enter__(self):
384+ self.clear_all_timers()
385+
386+ def __exit__(self, *ignore):
387+ self.set_new_timer()
388+
389+ def clear_all_timers(self):
390+ log.debug('Clearing {} shutdown timer(s)...'.format(len(self.timers)))
391+ while self.timers:
392+ GLib.source_remove(self.timers.pop())
393+
394+ def set_new_timer(self):
395+ # Concurrency will cause two methods to exit near each other,
396+ # causing two timers to be set, so we have to clear them again.
397+ self.clear_all_timers()
398+ log.debug('Starting new shutdown timer...')
399+ self.timers.add(GLib.timeout_add_seconds(30, self.terminate))
400+
401+ def terminate(self, *ignore):
402+ """Exit the dispatcher, but only if there are no active subthreads."""
403+ with _exit_lock:
404+ if threading.activeCount() < 2:
405+ log.debug('No threads found, shutting down.')
406+ persist_model()
407+ self.timers.add(GLib.idle_add(self.callback))
408+ else:
409+ log.debug('Delaying shutdown because active threads found.')
410+ self.set_new_timer()
411+
412+
413+exit_after_idle = ManageTimers()
414+
415+
416 class Dispatcher(dbus.service.Object):
417 """This is the primary handler of dbus method calls."""
418 __dbus_object_path__ = '/com/canonical/friends/Dispatcher'
419@@ -59,10 +102,13 @@
420 self.menu_manager = MenuManager(self.Refresh, self.mainloop.quit)
421 Model.connect('row-added', self._increment_unread_count)
422
423+ ManageTimers.callback = mainloop.quit
424+
425 def _increment_unread_count(self, model, itr):
426 self._unread_count += 1
427 self.menu_manager.update_unread_count(self._unread_count)
428
429+ @exit_after_idle
430 @dbus.service.method(DBUS_INTERFACE)
431 def Refresh(self):
432 """Download new messages from each connected protocol."""
433@@ -80,6 +126,7 @@
434 # If a protocol doesn't support receive then ignore it.
435 pass
436
437+ @exit_after_idle
438 @dbus.service.method(DBUS_INTERFACE)
439 def ClearIndicators(self):
440 """Indicate that messages have been read.
441@@ -92,8 +139,8 @@
442 service.ClearIndicators()
443 """
444 self.menu_manager.update_unread_count(0)
445- GLib.idle_add(self.mainloop.quit)
446
447+ @exit_after_idle
448 @dbus.service.method(DBUS_INTERFACE,
449 in_signature='sss',
450 out_signature='s',
451@@ -117,7 +164,7 @@
452 """
453 if account_id:
454 accounts = [self.account_manager.get(account_id)]
455- if accounts == [None]:
456+ if None in accounts:
457 message = 'Could not find account: {}'.format(account_id)
458 failure(message)
459 log.error(message)
460@@ -138,6 +185,7 @@
461 if not called:
462 failure('No accounts supporting {} found.'.format(action))
463
464+ @exit_after_idle
465 @dbus.service.method(DBUS_INTERFACE,
466 in_signature='s',
467 out_signature='s',
468@@ -162,7 +210,7 @@
469 sent = True
470 log.debug(
471 'Sending message to {}'.format(
472- account.protocol.__class__.__name__))
473+ account.protocol._Name))
474 account.protocol(
475 'send',
476 message,
477@@ -172,6 +220,7 @@
478 if not sent:
479 failure('No send_enabled accounts found.')
480
481+ @exit_after_idle
482 @dbus.service.method(DBUS_INTERFACE,
483 in_signature='sss',
484 out_signature='s',
485@@ -205,6 +254,7 @@
486 failure(message)
487 log.error(message)
488
489+ @exit_after_idle
490 @dbus.service.method(DBUS_INTERFACE,
491 in_signature='sss',
492 out_signature='s',
493@@ -262,6 +312,7 @@
494 failure(message)
495 log.error(message)
496
497+ @exit_after_idle
498 @dbus.service.method(DBUS_INTERFACE, in_signature='s', out_signature='s')
499 def GetFeatures(self, protocol_name):
500 """Returns a list of features supported by service as json string.
501@@ -274,9 +325,9 @@
502 features = json.loads(service.GetFeatures('facebook'))
503 """
504 protocol = protocol_manager.protocols.get(protocol_name)
505- GLib.idle_add(self.mainloop.quit)
506- return json.dumps(protocol.get_features())
507+ return json.dumps(protocol.get_features() if protocol else [])
508
509+ @exit_after_idle
510 @dbus.service.method(DBUS_INTERFACE, in_signature='s', out_signature='s')
511 def URLShorten(self, url):
512 """Shorten a URL.
513@@ -291,7 +342,6 @@
514 service = dbus.Interface(obj, DBUS_INTERFACE)
515 short_url = service.URLShorten(url)
516 """
517- GLib.idle_add(self.mainloop.quit)
518 service_name = self.settings.get_string('urlshorter')
519 log.info('Shortening URL {} with {}'.format(url, service_name))
520 if (lookup.is_shortened(url) or
521@@ -305,7 +355,7 @@
522 log.exception('URL shortening class: {}'.format(service))
523 return url
524
525+ @exit_after_idle
526 @dbus.service.method(DBUS_INTERFACE)
527 def ExpireAvatars(self):
528 Avatar.expire_old_avatars()
529- GLib.idle_add(self.mainloop.quit)
530
531=== modified file 'friends/tests/data/facebook-full.dat'
532--- friends/tests/data/facebook-full.dat 2012-10-13 01:27:15 +0000
533+++ friends/tests/data/facebook-full.dat 2013-03-20 13:27:53 +0000
534@@ -1,1 +1,366 @@
535-{"paging": {"previous": "https://graph.facebook.com/101/home?access_token=ABC&limit=25&since=1348682101&__previous=1"}, "data": [{"picture": "https://fbexternal-a.akamaihd.net/rush.jpg", "from": {"category": "Entertainment", "name": "Rush is a Band", "id": "117402931676347"}, "name": "Rush is a Band Blog", "comments": {"count": 0}, "actions": [{"link": "https://www.facebook.com/117402931676347/posts/287578798009078", "name": "Comment"}, {"link": "https://www.facebook.com/117402931676347/posts/287578798009078", "name": "Like"}], "updated_time": "2012-09-26T17:34:00+0000", "caption": "www.rushisaband.com", "link": "http://www.rushisaband.com/blog/Rush-Clockwork-Angels-tour", "likes": {"count": 16, "data": [{"name": "Alex Lifeson", "id": "801"}, {"name": "Vlada Lee", "id": "801"}, {"name": "Richard Peart", "id": "803"}, {"name": "Eric Lifeson", "id": "804"}]}, "created_time": "2012-09-26T17:34:00+0000", "message": "Rush takes off to the Great White North", "icon": "https://s-static.ak.facebook.com/rsrc.php/v2/yD/r/a.gif", "type": "link", "id": "108", "status_type": "shared_story", "description": "Rush is a Band: Neil Peart, Geddy Lee, Alex Lifeson"}, {"picture": "https://images.gibson.com/Rush_Clockwork-Angels_t.jpg", "likes": {"count": 27, "data": [{"name": "Tracy Lee", "id": "805"}, {"name": "Wendy Peart", "id": "806"}, {"name": "Vlada Lifeson", "id": "807"}, {"name": "Chevy Lee", "id": "808"}]}, "from": {"category": "Entertainment", "name": "Rush is a Band", "id": "117402931676347"}, "name": "Top 10 Alex Lifeson Guitar Moments", "comments": {"count": 5, "data": [{"created_time": "2012-09-26T17:16:00+0000", "message": "OK Don...10) Headlong Flight", "from": {"name": "Bruce Peart", "id": "809"}, "id": "117402931676347_386054134801436_3235476"}, {"created_time": "2012-09-26T17:49:06+0000", "message": "No Cygnus X-1 Bruce? 
I call shenanigans!", "from": {"name": "Don Lee", "id": "810"}, "id": "117402931676347_386054134801436_3235539"}]}, "actions": [{"link": "https://www.facebook.com/117402931676347/posts/386054134801436", "name": "Comment"}, {"link": "https://www.facebook.com/117402931676347/posts/386054134801436", "name": "Like"}], "updated_time": "2012-09-26T17:49:06+0000", "caption": "www2.gibson.com", "link": "http://www2.gibson.com/Alex-Lifeson.aspx", "shares": {"count": 11}, "created_time": "2012-09-26T16:42:15+0000", "message": "http://www2.gibson.com/Alex-Lifeson-0225-2011.aspx", "icon": "https://s-static.ak.facebook.com/rsrc.php/v2/yD/r/a.gif", "type": "link", "id": "109", "status_type": "shared_story", "description": "For millions of Rush fans old and new, it\u2019s a pleasure"}]}
536\ No newline at end of file
537+{
538+ "data": [
539+ {
540+ "id": "fake_id",
541+ "from": {
542+ "name": "Yours Truly",
543+ "id": "56789"
544+ },
545+ "message": "Writing code that supports geotagging data from facebook. If y'all could make some geotagged facebook posts for me to test with, that'd be super.",
546+ "actions": [
547+ {
548+ "name": "Comment",
549+ "link": "https://www.facebook.com/fake/posts/id"
550+ },
551+ {
552+ "name": "Like",
553+ "link": "https://www.facebook.com/fake/posts/id"
554+ }
555+ ],
556+ "privacy": {
557+ "value": ""
558+ },
559+ "place": {
560+ "id": "103135879727382",
561+ "name": "Victoria, British Columbia",
562+ "location": {
563+ "street": "",
564+ "zip": "",
565+ "latitude": 48.4333,
566+ "longitude": -123.35
567+ }
568+ },
569+ "type": "status",
570+ "status_type": "mobile_status_update",
571+ "created_time": "2013-03-12T21:27:56+0000",
572+ "updated_time": "2013-03-13T23:29:07+0000",
573+ "likes": {
574+ "data": [
575+ {
576+ "name": "Anna",
577+ "id": "12345"
578+ }
579+ ],
580+ "count": 1
581+ },
582+ "comments": {
583+ "data": [
584+ {
585+ "id": "fake as a snake",
586+ "from": {
587+ "name": "Grandma",
588+ "id": "9876"
589+ },
590+ "message": "If I knew what a geotagged facebook post was I might be able to comply!",
591+ "created_time": "2013-03-12T22:56:17+0000"
592+ },
593+ {
594+ "id": "faker than cake!",
595+ "from": {
596+ "name": "Father",
597+ "id": "234"
598+ },
599+ "message": "don't know how",
600+ "created_time": "2013-03-12T23:29:45+0000"
601+ },
602+ {
603+ "id": "still fake",
604+ "from": {
605+ "name": "Mother",
606+ "id": "456"
607+ },
608+ "message": "HUH!!!!",
609+ "created_time": "2013-03-13T02:20:27+0000"
610+ },
611+ {
612+ "id": "this one is real",
613+ "from": {
614+ "name": "Yours Truly",
615+ "id": "56789"
616+ },
617+ "message": "Coming up with tons of fake data is hard!",
618+ "created_time": "2013-03-13T23:29:07+0000"
619+ }
620+ ],
621+ "count": 4
622+ }
623+ },
624+ {
625+ "id": "270843027745_10151370303782746",
626+ "from": {
627+ "category": "Shopping/retail",
628+ "category_list": [
629+ {
630+ "id": "128003127270269",
631+ "name": "Bike Shop"
632+ }
633+ ],
634+ "name": "Western Cycle Source for Sports",
635+ "id": "270843027745"
636+ },
637+ "story": "Western Cycle Source for Sports updated their cover photo.",
638+ "story_tags": {
639+ "0": [
640+ {
641+ "id": "270843027745",
642+ "name": "Western Cycle Source for Sports",
643+ "offset": 0,
644+ "length": 31,
645+ "type": "page"
646+ }
647+ ]
648+ },
649+ "picture": "https://fbcdn-photos-a.akamaihd.net/hphotos-ak-snc7/482418_10151370303672746_1924798223_s.jpg",
650+ "link": "https://www.facebook.com/photo.php?fbid=10151370303672746&set=a.10150598301902746.381693.270843027745&type=1&relevant_count=1",
651+ "icon": "https://fbstatic-a.akamaihd.net/rsrc.php/v2/yz/r/StEh3RhPvjk.gif",
652+ "actions": [
653+ {
654+ "name": "Comment",
655+ "link": "https://www.facebook.com/270843027745/posts/10151370303782746"
656+ },
657+ {
658+ "name": "Like",
659+ "link": "https://www.facebook.com/270843027745/posts/10151370303782746"
660+ }
661+ ],
662+ "privacy": {
663+ "value": ""
664+ },
665+ "place": {
666+ "id": "270843027745",
667+ "name": "Western Cycle Source for Sports",
668+ "location": {
669+ "street": "1550 8th Ave",
670+ "city": "Regina",
671+ "state": "SK",
672+ "country": "Canada",
673+ "zip": "S4R 1E4",
674+ "latitude": 50.45679,
675+ "longitude": -104.60276
676+ }
677+ },
678+ "type": "photo",
679+ "object_id": "10151370303672746",
680+ "created_time": "2013-03-11T23:46:06+0000",
681+ "updated_time": "2013-03-11T23:46:06+0000",
682+ "likes": {
683+ "data": [
684+ {
685+ "name": "Lou Schwindt",
686+ "id": "57"
687+ },
688+ {
689+ "name": "Maureen Daniel",
690+ "id": "72"
691+ },
692+ {
693+ "name": "Lee Watson",
694+ "id": "696"
695+ },
696+ {
697+ "name": "Rob Nelson",
698+ "id": "40"
699+ }
700+ ],
701+ "count": 10
702+ },
703+ "comments": {
704+ "count": 0
705+ }
706+ },
707+ {
708+ "id": "161247843901324_629147610444676",
709+ "from": {
710+ "category": "Hotel",
711+ "category_list": [
712+ {
713+ "id": "164243073639257",
714+ "name": "Hotel"
715+ }
716+ ],
717+ "name": "Best Western Denver Southwest",
718+ "id": "161247843901324"
719+ },
720+ "message": "Today only -- Come meet Caroline and Meredith and Stanley the Stegosaurus (& Greg & Joe, too!) at the TechZulu Trend Lounge, Hilton Garden Inn 18th floor, 500 N Interstate 35, Austin, Texas. Monday, March 11th, 4:00pm to 7:00 pm. Also here Hannah Hart (My Drunk Kitchen) and Angry Video Game Nerd producer, Sean Keegan. Stanley is in the lobby.",
721+ "picture": "https://fbcdn-photos-a.akamaihd.net/hphotos-ak-snc7/601266_629147587111345_968504279_s.jpg",
722+ "link": "https://www.facebook.com/photo.php?fbid=629147587111345&set=a.173256162700492.47377.161247843901324&type=1&relevant_count=1",
723+ "icon": "https://fbstatic-a.akamaihd.net/rsrc.php/v2/yz/r/StEh3RhPvjk.gif",
724+ "actions": [
725+ {
726+ "name": "Comment",
727+ "link": "https://www.facebook.com/161247843901324/posts/629147610444676"
728+ },
729+ {
730+ "name": "Like",
731+ "link": "https://www.facebook.com/161247843901324/posts/629147610444676"
732+ }
733+ ],
734+ "privacy": {
735+ "value": ""
736+ },
737+ "place": {
738+ "id": "132709090079327",
739+ "name": "Hilton Garden Inn Austin Downtown/Convention Center",
740+ "location": {
741+ "street": "500 North Interstate 35",
742+ "city": "Austin",
743+ "state": "TX",
744+ "country": "United States",
745+ "zip": "78701",
746+ "latitude": 30.265384957204,
747+ "longitude": -97.735604602521
748+ }
749+ },
750+ "type": "photo",
751+ "status_type": "added_photos",
752+ "object_id": "629147587111345",
753+ "created_time": "2013-03-11T20:49:10+0000",
754+ "updated_time": "2013-03-11T23:51:25+0000",
755+ "likes": {
756+ "data": [
757+ {
758+ "name": "Andrew Henninger",
759+ "id": "11"
760+ },
761+ {
762+ "name": "Sarah Brents",
763+ "id": "22"
764+ },
765+ {
766+ "name": "Thomas Bush",
767+ "id": "33"
768+ },
769+ {
770+ "name": "Jennifer Tornetta",
771+ "id": "44"
772+ }
773+ ],
774+ "count": 84
775+ },
776+ "comments": {
777+ "data": [
778+ {
779+ "id": "1612484301324_6294760444676_1126376",
780+ "from": {
781+ "name": "Amy Gibbs",
782+ "id": "55"
783+ },
784+ "message": "You have to love a family that travels with their stegasaurus.",
785+ "created_time": "2013-03-11T23:51:05+0000",
786+ "likes": 2
787+ },
788+ {
789+ "id": "1612843901324_6294761044676_1124378",
790+ "from": {
791+ "name": "Amy Gibbs",
792+ "id": "55"
793+ },
794+ "message": "*stegosaurus...sorry!",
795+ "created_time": "2013-03-11T23:51:25+0000",
796+ "likes": 1
797+ }
798+ ],
799+ "count": 11
800+ }
801+ },
802+ {
803+ "id": "104443_100085049977",
804+ "from": {
805+ "name": "Guy Frenchie",
806+ "id": "1244414"
807+ },
808+ "story": "Guy Frenchie did some things with some stuff.",
809+ "story_tags": {
810+ "0": [
811+ {
812+ "id": "1244414",
813+ "name": "Guy Frenchie",
814+ "offset": 0,
815+ "length": 16,
816+ "type": "user"
817+ }
818+ ],
819+ "26": [
820+ {
821+ "id": "37067557",
822+ "name": "somebody",
823+ "offset": 26,
824+ "length": 10,
825+ "type": "page"
826+ }
827+ ],
828+ "48": [
829+ {
830+ "id": "50681138",
831+ "name": "What do you think about things and stuff?",
832+ "offset": 48,
833+ "length": 52
834+ }
835+ ]
836+ },
837+ "icon": "https://fbstatic-a.akamaihd.net/rsrc.php/v2/yg/r/5PpICR5KcPe.png",
838+ "actions": [
839+ {
840+ "name": "Comment",
841+ "link": "https://www.facebook.com/1244414/posts/100085049977"
842+ },
843+ {
844+ "name": "Like",
845+ "link": "https://www.facebook.com/1244414/posts/100085049977"
846+ }
847+ ],
848+ "privacy": {
849+ "value": ""
850+ },
851+ "type": "question",
852+ "object_id": "584616119",
853+ "application": {
854+ "name": "Questions",
855+ "id": "101502535258"
856+ },
857+ "created_time": "2013-03-15T19:57:14+0000",
858+ "updated_time": "2013-03-15T19:57:14+0000",
859+ "likes": {
860+ "data": [
861+ {
862+ "name": "Kevin Diner",
863+ "id": "55520"
864+ },
865+ {
866+ "name": "Bozo the Clown",
867+ "id": "13960"
868+ }
869+ ],
870+ "count": 3
871+ },
872+ "comments": {
873+ "data": [
874+ {
875+ "id": "14446143_102008355988977_100927",
876+ "from": {
877+ "name": "Seymour Butts",
878+ "id": "505677"
879+ },
880+ "message": "seems legit",
881+ "created_time": "2013-03-13T12:20:19+0000",
882+ "likes": 2
883+ },
884+ {
885+ "id": "120143_1020035588977_1019440",
886+ "from": {
887+ "name": "Andre the Giant",
888+ "id": "100390199"
889+ },
890+ "message": "Anybody want a peanut?",
891+ "created_time": "2013-03-13T12:23:25+0000"
892+ }
893+ ],
894+ "count": 22
895+ }
896+ }
897+ ],
898+ "paging": {
899+ "previous": "https://graph.facebook.com/me/home&limit=25&since=1234",
900+ "next": "https://graph.facebook.com/me/home&limit=25&until=4321"
901+ }
902+}
903
904=== modified file 'friends/tests/data/flickr-full.dat'
905--- friends/tests/data/flickr-full.dat 2012-10-20 15:35:30 +0000
906+++ friends/tests/data/flickr-full.dat 2013-03-20 13:27:53 +0000
907@@ -1,1 +1,13 @@
908-{"photos": {"photo": [{"username": "Geddy Lee", "secret": "abc", "title": "ant", "owner": "123", "id": "801", "dateupload": "2012-05-10T13:36:45", "server": "1"}, {"username": "Alex Lifeson", "secret": "def", "ownername": "Alex Lifeson", "title": "bee", "owner": "456", "id": "802", "server": "1"}, {"username": "Neil Peart", "title": "cat", "farm": "animalz", "server": "1", "iconserver": "9", "secret": "ghi", "ownername": "Bob Dobbs", "owner": "789", "id": "803", "iconfarm": "iconz"}]}}
909+{ "photos": {
910+ "photo": [
911+ { "id": "8552892154", "secret": "a", "server": "8378", "farm": 9, "owner": "47303164@N00", "username": "raise my voice", "title": "Chocolate chai #yegcoffee", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363117902", "ownername": "raise my voice", "iconserver": 93, "iconfarm": 1, "latitude": 0, "longitude": 0, "accuracy": 0, "context": 0 },
912+ { "id": "8552845358", "secret": "b", "server": "8085", "farm": 9, "owner": "47303164@N00", "username": "raise my voice", "title": "Torah ark #yegjew", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363116818", "ownername": "raise my voice", "iconserver": 93, "iconfarm": 1, "latitude": 0, "longitude": 0, "accuracy": 0, "context": 0 },
913+ { "id": "8552661200", "secret": "c", "server": "8522", "farm": 9, "owner": "60551783@N00", "username": "Reinhard.Pantke", "title": "Henningsvaer", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363112533", "ownername": "Reinhard.Pantke", "iconserver": 8, "iconfarm": 1, "latitude": 0, "longitude": 0, "accuracy": 0, "context": 0 },
914+ { "id": "8550946245", "secret": "d", "server": "8107", "farm": 9, "owner": "60551783@N00", "username": "Reinhard.Pantke", "title": "Summerfeeling on Lofoten", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363098878", "ownername": "Reinhard.Pantke", "iconserver": 8, "iconfarm": 1, "latitude": 0, "longitude": 0, "accuracy": 0, "context": 0 },
915+ { "id": "8550829193", "secret": "e", "server": "8246", "farm": 9, "owner": "27204141@N05", "username": "Nelson Webb", "title": "St. Michael - The Archangel", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363096450", "ownername": "Nelson Webb", "iconserver": "2047", "iconfarm": 3, "latitude": 53.833156, "longitude": -112.330784, "accuracy": 15, "context": 0, "place_id": "4Y55lnhZVrNO", "woeid": "8496", "geo_is_family": 0, "geo_is_friend": 0, "geo_is_contact": 0, "geo_is_public": 1 },
916+ { "id": "8551930826", "secret": "f", "server": "8247", "farm": 9, "owner": "27204141@N05", "username": "Nelson Webb", "title": "Pine scented air freshener", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363096449", "ownername": "Nelson Webb", "iconserver": "2047", "iconfarm": 3, "latitude": 53.878136, "longitude": -112.335162, "accuracy": 15, "context": 0, "place_id": "4Y55lnhZVrNO", "woeid": "8496", "geo_is_family": 0, "geo_is_friend": 0, "geo_is_contact": 0, "geo_is_public": 1 },
917+ { "id": "8549658873", "secret": "g", "server": "8239", "farm": 9, "owner": "30584843@N00", "username": "Mark Iocchelli", "title": "Sleepy Hollow", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363055714", "ownername": "Mark Iocchelli", "iconserver": 22, "iconfarm": 1, "latitude": 0, "longitude": 0, "accuracy": 0, "context": 0 },
918+ { "id": "8548811967", "secret": "h", "server": "8229", "farm": 9, "owner": "47303164@N00", "username": "raise my voice", "title": "Trying out The Wokkery #yegfood", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363028798", "ownername": "raise my voice", "iconserver": 93, "iconfarm": 1, "latitude": 0, "longitude": 0, "accuracy": 0, "context": 0 },
919+ { "id": "8548753789", "secret": "i", "server": "8512", "farm": 9, "owner": "30584843@N00", "username": "Mark Iocchelli", "title": "Alberta Rail Pipeline", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363027071", "ownername": "Mark Iocchelli", "iconserver": 22, "iconfarm": 1, "latitude": 0, "longitude": 0, "accuracy": 0, "context": 0 },
920+ { "id": "8549607582", "secret": "j", "server": "8087", "farm": 9, "owner": "60404254@N00", "username": "tavis_mcnally", "title": "26 weeks", "ispublic": 1, "isfriend": 0, "isfamily": 0, "dateupload": "1363022277", "ownername": "tavis_mcnally", "iconserver": "2182", "iconfarm": 3, "latitude": 0, "longitude": 0, "accuracy": 0, "context": 0 }
921+ ], "total": "290", "page": 1, "per_page": 10, "pages": 29 }, "stat": "ok" }
922
923=== modified file 'friends/tests/mocks.py'
924--- friends/tests/mocks.py 2013-02-19 17:00:41 +0000
925+++ friends/tests/mocks.py 2013-03-20 13:27:53 +0000
926@@ -30,15 +30,19 @@
927 import hashlib
928 import logging
929 import threading
930+import tempfile
931+import shutil
932
933 from io import StringIO
934 from logging.handlers import QueueHandler
935 from pkg_resources import resource_listdir, resource_string
936 from queue import Empty, Queue
937 from urllib.parse import urlsplit
938+from gi.repository import Dee
939
940 from friends.utils.base import Base
941 from friends.utils.logging import LOG_FORMAT
942+from friends.utils.model import COLUMN_TYPES
943
944
945 try:
946@@ -51,6 +55,50 @@
947 NEWLINE = '\n'
948
949
950+# Create a test model that will not interfere with the user's environment.
951+# We'll use this object as a mock of the real model.
952+TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
953+TestModel.set_schema_full(COLUMN_TYPES)
954+
955+
956+@mock.patch('friends.utils.http._soup', mock.Mock())
957+@mock.patch('friends.utils.base.Model', TestModel)
958+@mock.patch('friends.utils.base.Base._get_access_token',
959+ mock.Mock(return_value='Access Tolkien'))
960+def populate_fake_data():
961+ """Dump a mixture of random data from our testsuite into TestModel.
962+
963+ This is invoked by running 'friends-dispatcher --test' so that you
964+ can have some phony data in the model to test against.
965+
966+ Just remember that the data appears in a separate model so as not
967+ to interfere with the user's official DeeModel stream.
968+ """
969+ from friends.utils.cache import JsonCache
970+ from friends.protocols.facebook import Facebook
971+ from friends.protocols.flickr import Flickr
972+ from friends.protocols.twitter import Twitter
973+ from gi.repository import Dee
974+
975+ temp_cache = tempfile.mkdtemp()
976+ root = JsonCache._root = os.path.join(temp_cache, '{}.json')
977+
978+ protocols = {
979+ 'facebook-full.dat': Facebook(FakeAccount(account_id=1)),
980+ 'flickr-full.dat': Flickr(FakeAccount(account_id=2)),
981+ 'twitter-home.dat': Twitter(FakeAccount(account_id=3)),
982+ }
983+
984+ for fake_name, protocol in protocols.items():
985+ protocol.source_registry = EDSRegistry()
986+ with mock.patch('friends.utils.http.Soup.Message',
987+ FakeSoupMessage('friends.tests.data',
988+ fake_name)) as fake:
989+ protocol.receive()
990+
991+ shutil.rmtree(temp_cache)
992+
993+
994 class FakeAuth:
995 id = 'fakeauth id'
996 method = 'fakeauth method'
997@@ -61,7 +109,7 @@
998 class FakeAccount:
999 """A fake account object for testing purposes."""
1000
1001- def __init__(self, service=None):
1002+ def __init__(self, service=None, account_id=88):
1003 self.access_token = None
1004 self.secret_token = None
1005 self.user_full_name = None
1006@@ -69,7 +117,7 @@
1007 self.user_id = None
1008 self.auth = FakeAuth()
1009 self.login_lock = threading.Lock()
1010- self.id = '1234'
1011+ self.id = account_id
1012 self.protocol = Base(self)
1013
1014
1015
1016=== modified file 'friends/tests/test_account.py'
1017--- friends/tests/test_account.py 2013-02-05 01:11:35 +0000
1018+++ friends/tests/test_account.py 2013-03-20 13:27:53 +0000
1019@@ -23,19 +23,11 @@
1020
1021 import unittest
1022
1023-from gi.repository import Dee
1024-
1025 from friends.errors import UnsupportedProtocolError
1026 from friends.protocols.flickr import Flickr
1027-from friends.tests.mocks import FakeAccount, LogMock, SettingsIterMock, mock
1028+from friends.tests.mocks import FakeAccount, LogMock, SettingsIterMock
1029+from friends.tests.mocks import TestModel, mock
1030 from friends.utils.account import Account, AccountManager
1031-from friends.utils.model import COLUMN_TYPES
1032-
1033-
1034-# Create a test model that will not interfere with the user's environment.
1035-# We'll use this object as a mock of the real model.
1036-TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
1037-TestModel.set_schema_full(COLUMN_TYPES)
1038
1039
1040 class TestAccount(unittest.TestCase):
1041@@ -159,7 +151,6 @@
1042 assert self.account != None
1043
1044
1045-
1046 accounts_manager = mock.Mock()
1047 accounts_manager.new_for_service_type(
1048 'microblogging').get_enabled_account_services.return_value = []
1049@@ -199,7 +190,7 @@
1050 # the account manager's mapping.
1051 manager = AccountManager()
1052 manager._add_new_account(self.account_service)
1053- self.assertIn('1234', manager._accounts)
1054+ self.assertIn(88, manager._accounts)
1055
1056 def test_account_manager_enabled_event(self):
1057 manager = AccountManager()
1058@@ -210,56 +201,6 @@
1059 manager._on_enabled_event(accounts_manager, 2)
1060 account.protocol.assert_called_once_with('receive')
1061
1062- def test_account_manager_delete_account_no_account(self):
1063- # Deleting an account removes the global_id from the mapping. But if
1064- # that global id is missing, then it does not cause an exception.
1065- manager = AccountManager()
1066- manager._get_service = mock.Mock()
1067- manager._get_service.return_value = self.account_service
1068- self.assertNotIn('1234', manager._accounts)
1069- manager._on_account_deleted(accounts_manager, '1234')
1070- self.assertNotIn('1234', manager._accounts)
1071-
1072- @mock.patch('friends.utils.base.Model', TestModel)
1073- @mock.patch('friends.utils.base._seen_ids', {})
1074- def test_account_manager_delete_account(self):
1075- # Deleting an account removes the id from the mapping. But if
1076- # that id is missing, then it does not cause an exception.
1077- manager = AccountManager()
1078- manager._get_service = mock.Mock()
1079- manager._get_service.return_value = self.account_service
1080- manager._add_new_account(self.account_service)
1081- self.assertIn('1234', manager._accounts)
1082- manager._on_account_deleted(accounts_manager, '1234')
1083- self.assertNotIn('1234', manager._accounts)
1084-
1085- @mock.patch('friends.utils.base.Model', TestModel)
1086- @mock.patch('friends.utils.base._seen_ids', {})
1087- def test_account_manager_delete_account_preserve_messages(self):
1088- # Deleting an Account should not delete messages from the row
1089- # that exist on other protocols too.
1090- manager = AccountManager()
1091- manager._get_service = mock.Mock()
1092- manager._get_service.return_value = self.account_service
1093- manager._add_new_account(self.account_service)
1094- example_row = [[['twitter', '6', '1234'],
1095- ['base', '1234', '5678']],
1096- 'messages', 'Fred Flintstone', '', 'fred', True,
1097- '2012-08-28T19:59:34', 'Yabba dabba dooooo!', '', '',
1098- 0.0, False, '', '', '', '', '', '']
1099- result_row = [[['twitter', '6', '1234']],
1100- 'messages', 'Fred Flintstone', '', 'fred', True,
1101- '2012-08-28T19:59:34', 'Yabba dabba dooooo!', '', '',
1102- 0.0, False, '', '', '', '', '', '']
1103- row_iter = TestModel.append(*example_row)
1104- from friends.utils.base import _seen_ids
1105- _seen_ids[
1106- ('base', '1234', '5678')
1107- ] = TestModel.get_position(row_iter)
1108- self.assertEqual(list(TestModel.get_row(0)), example_row)
1109- manager._on_account_deleted(accounts_manager, '1234')
1110- self.assertEqual(list(TestModel.get_row(0)), result_row)
1111-
1112
1113 @mock.patch('gi.repository.Accounts.Manager', accounts_manager)
1114 class TestAccountManagerRealAccount(unittest.TestCase):
1115
1116=== modified file 'friends/tests/test_cache.py'
1117--- friends/tests/test_cache.py 2013-03-08 02:31:15 +0000
1118+++ friends/tests/test_cache.py 2013-03-20 13:27:53 +0000
1119@@ -21,14 +21,10 @@
1120
1121
1122 import os
1123-import time
1124 import shutil
1125 import tempfile
1126 import unittest
1127
1128-from datetime import date, timedelta
1129-from pkg_resources import resource_filename
1130-
1131 from friends.utils.cache import JsonCache
1132
1133
1134
1135=== modified file 'friends/tests/test_dispatcher.py'
1136--- friends/tests/test_dispatcher.py 2013-02-19 17:00:41 +0000
1137+++ friends/tests/test_dispatcher.py 2013-03-20 13:27:53 +0000
1138@@ -26,7 +26,7 @@
1139
1140 from dbus.mainloop.glib import DBusGMainLoop
1141
1142-from friends.service.dispatcher import Dispatcher, STUB
1143+from friends.service.dispatcher import Dispatcher, ManageTimers, STUB
1144 from friends.tests.mocks import LogMock, mock
1145
1146
1147@@ -61,7 +61,11 @@
1148 self.dispatcher.account_manager.get_all.assert_called_once_with()
1149 account.protocol.assert_called_once_with('receive')
1150
1151- self.assertEqual(self.log_mock.empty(), 'Refresh requested\n')
1152+ self.assertEqual(self.log_mock.empty(),
1153+ 'Clearing 1 shutdown timer(s)...\n'
1154+ 'Refresh requested\n'
1155+ 'Clearing 0 shutdown timer(s)...\n'
1156+ 'Starting new shutdown timer...\n')
1157
1158 def test_clear_indicators(self):
1159 self.dispatcher.menu_manager = mock.Mock()
1160@@ -81,7 +85,10 @@
1161 'like', '23346356767354626', success=STUB, failure=STUB)
1162
1163 self.assertEqual(self.log_mock.empty(),
1164- '345: like 23346356767354626\n')
1165+ 'Clearing 1 shutdown timer(s)...\n'
1166+ '345: like 23346356767354626\n'
1167+ 'Clearing 0 shutdown timer(s)...\n'
1168+ 'Starting new shutdown timer...\n')
1169
1170 def test_failing_do(self):
1171 account = mock.Mock()
1172@@ -93,7 +100,10 @@
1173 self.assertEqual(account.protocol.call_count, 0)
1174
1175 self.assertEqual(self.log_mock.empty(),
1176- 'Could not find account: 6\n')
1177+ 'Clearing 1 shutdown timer(s)...\n'
1178+ 'Could not find account: 6\n'
1179+ 'Clearing 0 shutdown timer(s)...\n'
1180+ 'Starting new shutdown timer...\n')
1181
1182 def test_send_message(self):
1183 account1 = mock.Mock()
1184@@ -128,7 +138,10 @@
1185 success=STUB, failure=STUB)
1186
1187 self.assertEqual(self.log_mock.empty(),
1188- 'Replying to 2, objid\n')
1189+ 'Clearing 1 shutdown timer(s)...\n'
1190+ 'Replying to 2, objid\n'
1191+ 'Clearing 0 shutdown timer(s)...\n'
1192+ 'Starting new shutdown timer...\n')
1193
1194 def test_send_reply_failed(self):
1195 account = mock.Mock()
1196@@ -140,8 +153,11 @@
1197 self.assertEqual(account.protocol.call_count, 0)
1198
1199 self.assertEqual(self.log_mock.empty(),
1200- 'Replying to 2, objid\n' +
1201- 'Could not find account: 2\n')
1202+ 'Clearing 1 shutdown timer(s)...\n'
1203+ 'Replying to 2, objid\n'
1204+ 'Could not find account: 2\n'
1205+ 'Clearing 0 shutdown timer(s)...\n'
1206+ 'Starting new shutdown timer...\n')
1207
1208 def test_upload_async(self):
1209 account = mock.Mock()
1210@@ -166,7 +182,10 @@
1211 )
1212
1213 self.assertEqual(self.log_mock.empty(),
1214- 'Uploading file://path/to/image.png to 2\n')
1215+ 'Clearing 1 shutdown timer(s)...\n'
1216+ 'Uploading file://path/to/image.png to 2\n'
1217+ 'Clearing 0 shutdown timer(s)...\n'
1218+ 'Starting new shutdown timer...\n')
1219
1220 def test_get_features(self):
1221 self.assertEqual(json.loads(self.dispatcher.GetFeatures('facebook')),
1222@@ -205,6 +224,52 @@
1223 self.dispatcher.URLShorten(long_url),
1224 'short url')
1225 lookup_mock.is_shortened.assert_called_once_with(long_url)
1226- self.dispatcher.settings.get_boolean.assert_called_once_with('shorten-urls')
1227+ self.dispatcher.settings.get_boolean.assert_called_once_with(
1228+ 'shorten-urls')
1229 lookup_mock.lookup.assert_called_once_with('is.gd')
1230- lookup_mock.lookup.return_value.shorten.assert_called_once_with(long_url)
1231+ lookup_mock.lookup.return_value.shorten.assert_called_once_with(
1232+ long_url)
1233+
1234+ @mock.patch('friends.service.dispatcher.GLib')
1235+ def test_manage_timers_clear(self, glib):
1236+ manager = ManageTimers()
1237+ manager.timers = {1}
1238+ manager.__enter__()
1239+ glib.source_remove.assert_called_once_with(1)
1240+ manager.timers = {1, 2, 3}
1241+ manager.clear_all_timers()
1242+ self.assertEqual(glib.source_remove.call_count, 4)
1243+
1244+ @mock.patch('friends.service.dispatcher.GLib')
1245+ def test_manage_timers_set(self, glib):
1246+ manager = ManageTimers()
1247+ manager.timers = set()
1248+ manager.clear_all_timers = mock.Mock()
1249+ manager.__exit__()
1250+ glib.timeout_add_seconds.assert_called_once_with(30, manager.terminate)
1251+ manager.clear_all_timers.assert_called_once_with()
1252+ self.assertEqual(len(manager.timers), 1)
1253+
1254+ @mock.patch('friends.service.dispatcher.persist_model')
1255+ @mock.patch('friends.service.dispatcher.threading')
1256+ @mock.patch('friends.service.dispatcher.GLib')
1257+ def test_manage_timers_terminate(self, glib, thread, persist):
1258+ manager = ManageTimers()
1259+ manager.timers = set()
1260+ thread.activeCount.return_value = 1
1261+ manager.terminate()
1262+ thread.activeCount.assert_called_once_with()
1263+ persist.assert_called_once_with()
1264+ glib.idle_add.assert_called_once_with(manager.callback)
1265+
1266+ @mock.patch('friends.service.dispatcher.persist_model')
1267+ @mock.patch('friends.service.dispatcher.threading')
1268+ @mock.patch('friends.service.dispatcher.GLib')
1269+ def test_manage_timers_dont_kill_threads(self, glib, thread, persist):
1270+ manager = ManageTimers()
1271+ manager.timers = set()
1272+ manager.set_new_timer = mock.Mock()
1273+ thread.activeCount.return_value = 10
1274+ manager.terminate()
1275+ thread.activeCount.assert_called_once_with()
1276+ manager.set_new_timer.assert_called_once_with()
1277
1278=== modified file 'friends/tests/test_facebook.py'
1279--- friends/tests/test_facebook.py 2013-02-27 22:22:38 +0000
1280+++ friends/tests/test_facebook.py 2013-03-20 13:27:53 +0000
1281@@ -15,27 +15,26 @@
1282
1283 """Test the Facebook plugin."""
1284
1285+
1286 __all__ = [
1287 'TestFacebook',
1288 ]
1289
1290
1291+import os
1292+import tempfile
1293 import unittest
1294+import shutil
1295
1296-from gi.repository import Dee, GLib
1297+from gi.repository import GLib
1298 from pkg_resources import resource_filename
1299
1300 from friends.protocols.facebook import Facebook
1301-from friends.tests.mocks import FakeAccount, FakeSoupMessage, LogMock, mock
1302+from friends.tests.mocks import FakeAccount, FakeSoupMessage, LogMock
1303+from friends.tests.mocks import TestModel, mock
1304 from friends.tests.mocks import EDSBookClientMock, EDSSource, EDSRegistry
1305 from friends.errors import ContactsError, FriendsError, AuthorizationError
1306-from friends.utils.model import COLUMN_TYPES
1307-
1308-
1309-# Create a test model that will not interfere with the user's environment.
1310-# We'll use this object as a mock of the real model.
1311-TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
1312-TestModel.set_schema_full(COLUMN_TYPES)
1313+from friends.utils.cache import JsonCache
1314
1315
1316 @mock.patch('friends.utils.http._soup', mock.Mock())
1317@@ -44,12 +43,16 @@
1318 """Test the Facebook API."""
1319
1320 def setUp(self):
1321+ self._temp_cache = tempfile.mkdtemp()
1322+ self._root = JsonCache._root = os.path.join(
1323+ self._temp_cache, '{}.json')
1324 self.account = FakeAccount()
1325 self.protocol = Facebook(self.account)
1326 self.protocol.source_registry = EDSRegistry()
1327
1328 def tearDown(self):
1329 TestModel.clear()
1330+ shutil.rmtree(self._temp_cache)
1331
1332 def test_features(self):
1333 # The set of public features.
1334@@ -106,75 +109,153 @@
1335 # Receive the wall feed for a user.
1336 self.maxDiff = None
1337 self.account.access_token = 'abc'
1338- self.assertEqual(self.protocol.receive(), 4)
1339- self.assertEqual(TestModel.get_n_rows(), 4)
1340+ self.assertEqual(self.protocol.receive(), 12)
1341+ self.assertEqual(TestModel.get_n_rows(), 12)
1342+ self.assertEqual(list(TestModel.get_row(0)), [
1343+ 'facebook',
1344+ 88,
1345+ 'fake_id',
1346+ 'mentions',
1347+ 'Yours Truly',
1348+ '56789',
1349+ 'Yours Truly',
1350+ False,
1351+ '2013-03-13T23:29:07Z',
1352+ 'Writing code that supports geotagging data from facebook. ' +
1353+ 'If y\'all could make some geotagged facebook posts for me ' +
1354+ 'to test with, that\'d be super.',
1355+ GLib.get_user_cache_dir() +
1356+ '/friends/avatars/5c4e74c64b1a09343558afc1046c2b1d176a2ba2',
1357+ 'https://www.facebook.com/56789',
1358+ 1,
1359+ False,
1360+ '',
1361+ '',
1362+ '',
1363+ '',
1364+ '',
1365+ '',
1366+ 'Victoria, British Columbia',
1367+ 48.4333,
1368+ -123.35,
1369+ ])
1370 self.assertEqual(list(TestModel.get_row(2)), [
1371- [['facebook',
1372- '1234',
1373- '117402931676347_386054134801436_3235476']],
1374- 'reply_to/109',
1375- 'Bruce Peart',
1376- '809',
1377- 'Bruce Peart',
1378- False,
1379- '2012-09-26T17:16:00Z',
1380- 'OK Don...10) Headlong Flight',
1381- GLib.get_user_cache_dir() +
1382- '/friends/avatars/b688c8def0455d4a3853d5fcdfaf0708645cfd3e',
1383- 'https://www.facebook.com/809',
1384- 0.0,
1385- False,
1386- '',
1387- '',
1388- '',
1389- '',
1390- '',
1391- ''])
1392- self.assertEqual(list(TestModel.get_row(0)), [
1393- [['facebook', '1234', '108']],
1394- 'mentions',
1395- 'Rush is a Band',
1396- '117402931676347',
1397- 'Rush is a Band',
1398- False,
1399- '2012-09-26T17:34:00Z',
1400- 'Rush takes off to the Great White North',
1401- GLib.get_user_cache_dir() +
1402- '/friends/avatars/7d1a70e6998f4a38954e93ca03d689463f71d63b',
1403- 'https://www.facebook.com/117402931676347',
1404- 16.0,
1405- False,
1406- 'https://fbexternal-a.akamaihd.net/rush.jpg',
1407- 'Rush is a Band Blog',
1408- 'http://www.rushisaband.com/blog/Rush-Clockwork-Angels-tour',
1409- 'Rush is a Band: Neil Peart, Geddy Lee, Alex Lifeson',
1410- 'www.rushisaband.com',
1411- ''])
1412- self.assertEqual(list(TestModel.get_row(1)), [
1413- [['facebook', '1234', '109']],
1414- 'mentions',
1415- 'Rush is a Band',
1416- '117402931676347',
1417- 'Rush is a Band',
1418- False,
1419- '2012-09-26T17:49:06Z',
1420- 'http://www2.gibson.com/Alex-Lifeson-0225-2011.aspx',
1421- GLib.get_user_cache_dir() +
1422- '/friends/avatars/7d1a70e6998f4a38954e93ca03d689463f71d63b',
1423- 'https://www.facebook.com/117402931676347',
1424- 27.0,
1425- False,
1426- 'https://images.gibson.com/Rush_Clockwork-Angels_t.jpg',
1427- 'Top 10 Alex Lifeson Guitar Moments',
1428- 'http://www2.gibson.com/Alex-Lifeson.aspx',
1429- 'For millions of Rush fans old and new, it’s a pleasure',
1430- 'www2.gibson.com',
1431- ''])
1432+ 'facebook',
1433+ 88,
1434+ 'faker than cake!',
1435+ 'reply_to/fake_id',
1436+ 'Father',
1437+ '234',
1438+ 'Father',
1439+ False,
1440+ '2013-03-12T23:29:45Z',
1441+ 'don\'t know how',
1442+ GLib.get_user_cache_dir() +
1443+ '/friends/avatars/9b9379ccc7948e4804dff7914bfa4c6de3974df5',
1444+ 'https://www.facebook.com/234',
1445+ 0,
1446+ False,
1447+ '',
1448+ '',
1449+ '',
1450+ '',
1451+ '',
1452+ '',
1453+ '',
1454+ 0.0,
1455+ 0.0,
1456+ ])
1457+ self.assertEqual(list(TestModel.get_row(6)), [
1458+ 'facebook',
1459+ 88,
1460+ '161247843901324_629147610444676',
1461+ 'mentions',
1462+ 'Best Western Denver Southwest',
1463+ '161247843901324',
1464+ 'Best Western Denver Southwest',
1465+ False,
1466+ '2013-03-11T23:51:25Z',
1467+ 'Today only -- Come meet Caroline and Meredith and Stanley the ' +
1468+ 'Stegosaurus (& Greg & Joe, too!) at the TechZulu Trend Lounge, ' +
1469+ 'Hilton Garden Inn 18th floor, 500 N Interstate 35, Austin, ' +
1470+ 'Texas. Monday, March 11th, 4:00pm to 7:00 pm. Also here ' +
1471+ 'Hannah Hart (My Drunk Kitchen) and Angry Video Game Nerd ' +
1472+ 'producer, Sean Keegan. Stanley is in the lobby.',
1473+ GLib.get_user_cache_dir() +
1474+ '/friends/avatars/5b2d70e788df790b9c8db4c6a138fc4a1f433ec9',
1475+ 'https://www.facebook.com/161247843901324',
1476+ 84,
1477+ False,
1478+ 'https://fbcdn-photos-a.akamaihd.net/hphotos-ak-snc7/' +
1479+ '601266_629147587111345_968504279_s.jpg',
1480+ '',
1481+ 'https://www.facebook.com/photo.php?fbid=629147587111345&set=a.173256162700492.47377.161247843901324&type=1&relevant_count=1',
1482+ '',
1483+ '',
1484+ '',
1485+ 'Hilton Garden Inn Austin Downtown/Convention Center',
1486+ 30.265384957204,
1487+ -97.735604602521,
1488+ ])
1489+ self.assertEqual(list(TestModel.get_row(9)), [
1490+ 'facebook',
1491+ 88,
1492+ '104443_100085049977',
1493+ 'mentions',
1494+ 'Guy Frenchie',
1495+ '1244414',
1496+ 'Guy Frenchie',
1497+ False,
1498+ '2013-03-15T19:57:14Z',
1499+ 'Guy Frenchie did some things with some stuff.',
1500+ GLib.get_user_cache_dir() +
1501+ '/friends/avatars/3f5e276af0c43f6411d931b829123825ede1968e',
1502+ 'https://www.facebook.com/1244414',
1503+ 3,
1504+ False,
1505+ '',
1506+ '',
1507+ '',
1508+ '',
1509+ '',
1510+ '',
1511+ '',
1512+ 0.0,
1513+ 0.0,
1514+ ])
1515
1516 # XXX We really need full coverage of the receive() method, including
1517 # cases where some data is missing, or can't be converted
1518 # (e.g. timestamps), and paginations.
1519
1520+ @mock.patch('friends.utils.base.Model', TestModel)
1521+ @mock.patch('friends.utils.http.Soup.Message',
1522+ FakeSoupMessage('friends.tests.data', 'facebook-full.dat'))
1523+ @mock.patch('friends.protocols.facebook.Facebook._login',
1524+ return_value=True)
1525+ @mock.patch('friends.utils.base._seen_ids', {})
1526+ def test_home_since_id(self, *mocks):
1527+ self.account.access_token = 'access'
1528+ self.account.secret_token = 'secret'
1529+ self.account.auth.parameters = dict(
1530+ ConsumerKey='key',
1531+ ConsumerSecret='secret')
1532+ self.assertEqual(self.protocol.home(), 12)
1533+
1534+ with open(self._root.format('facebook_ids'), 'r') as fd:
1535+ self.assertEqual(fd.read(), '{"messages": "2013-03-15T19:57:14Z"}')
1536+
1537+ follow = self.protocol._follow_pagination = mock.Mock()
1538+ follow.return_value = []
1539+ self.assertEqual(self.protocol.home(), 12)
1540+ follow.assert_called_once_with(
1541+ 'https://graph.facebook.com/me/home',
1542+ dict(limit=50,
1543+ since='2013-03-15T19:57:14Z',
1544+ access_token='access',
1545+ )
1546+ )
1547+
1548 @mock.patch('friends.protocols.facebook.Downloader')
1549 def test_send_to_my_wall(self, dload):
1550 dload().get_json.return_value = dict(id='post_id')
1551
1552=== modified file 'friends/tests/test_flickr.py'
1553--- friends/tests/test_flickr.py 2013-02-27 23:04:36 +0000
1554+++ friends/tests/test_flickr.py 2013-03-20 13:27:53 +0000
1555@@ -22,18 +22,12 @@
1556
1557 import unittest
1558
1559-from gi.repository import GLib, Dee
1560+from gi.repository import GLib
1561
1562 from friends.errors import AuthorizationError, FriendsError
1563 from friends.protocols.flickr import Flickr
1564-from friends.tests.mocks import FakeAccount, FakeSoupMessage, LogMock, mock
1565-from friends.utils.model import COLUMN_INDICES, COLUMN_TYPES
1566-
1567-
1568-# Create a test model that will not interfere with the user's environment.
1569-# We'll use this object as a mock of the real model.
1570-TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
1571-TestModel.set_schema_full(COLUMN_TYPES)
1572+from friends.tests.mocks import FakeAccount, FakeSoupMessage, LogMock
1573+from friends.tests.mocks import TestModel, mock
1574
1575
1576 @mock.patch('friends.utils.http._soup', mock.Mock())
1577@@ -133,7 +127,7 @@
1578 'http://api.flickr.com/services/rest',
1579 method='GET',
1580 params=dict(
1581- extras='date_upload,owner_name,icon_server',
1582+ extras='date_upload,owner_name,icon_server,geo',
1583 format='json',
1584 nojsoncallback='1',
1585 api_key='fake',
1586@@ -157,77 +151,66 @@
1587 @mock.patch('friends.utils.base.Model', TestModel)
1588 def test_flickr_data(self):
1589 # Start by setting up a fake account id.
1590- self.account.id = 'lerxst'
1591+ self.account.id = 69
1592 with mock.patch.object(self.protocol, '_get_access_token',
1593 return_value='token'):
1594- self.assertEqual(self.protocol.receive(), 3)
1595- self.assertEqual(TestModel.get_n_rows(), 3)
1596+ self.assertEqual(self.protocol.receive(), 10)
1597+ self.assertEqual(TestModel.get_n_rows(), 10)
1598
1599 self.assertEqual(
1600 list(TestModel.get_row(0)),
1601- [[['flickr', 'lerxst', '801']],
1602- 'images',
1603- '',
1604- '123',
1605- '',
1606- False,
1607- '2012-05-10T13:36:45Z',
1608- '',
1609- '',
1610- '',
1611- 0.0,
1612- False,
1613- '',
1614- '',
1615- '',
1616- '',
1617- 'ant',
1618- '',
1619- ])
1620-
1621- self.assertEqual(
1622- list(TestModel.get_row(1)),
1623- [[['flickr', 'lerxst', '802']],
1624- 'images',
1625- 'Alex Lifeson',
1626- '456',
1627- 'Alex Lifeson',
1628- True,
1629- '',
1630- '',
1631- '',
1632- '',
1633- 0.0,
1634- False,
1635- '',
1636- '',
1637- '',
1638- '',
1639- 'bee',
1640- '',
1641- ])
1642-
1643- self.assertEqual(
1644- list(TestModel.get_row(2)),
1645- [[['flickr', 'lerxst', '803']],
1646- 'images',
1647- 'Bob Dobbs',
1648- '789',
1649- 'Bob Dobbs',
1650- False,
1651- '',
1652- '',
1653- GLib.get_user_cache_dir() +
1654- '/friends/avatars/b913501d6face9d13f3006b731a711b596d23099',
1655- 'http://www.flickr.com/people/789',
1656- 0.0,
1657- False,
1658- 'http://farmanimalz.static.flickr.com/1/789_ghi_m.jpg',
1659- '',
1660- 'http://farmanimalz.static.flickr.com/1/789_ghi_b.jpg',
1661- '',
1662- 'cat',
1663- 'http://farmanimalz.static.flickr.com/1/789_ghi_t.jpg',
1664+ ['flickr',
1665+ 69,
1666+ '8552892154',
1667+ 'images',
1668+ 'raise my voice',
1669+ '47303164@N00',
1670+ 'raise my voice',
1671+ True,
1672+ '2013-03-12T19:51:42Z',
1673+ '',
1674+ GLib.get_user_cache_dir() +
1675+ '/friends/avatars/7b30ff0140dd9b80f2b1782a2802c3ce785fa0ce',
1676+ 'http://www.flickr.com/people/47303164@N00',
1677+ 0,
1678+ False,
1679+ 'http://farm9.static.flickr.com/8378/47303164@N00_a_m.jpg',
1680+ '',
1681+ 'http://farm9.static.flickr.com/8378/47303164@N00_a_b.jpg',
1682+ '',
1683+ 'Chocolate chai #yegcoffee',
1684+ 'http://farm9.static.flickr.com/8378/47303164@N00_a_t.jpg',
1685+ '',
1686+ 0.0,
1687+ 0.0,
1688+ ])
1689+
1690+ self.assertEqual(
1691+ list(TestModel.get_row(4)),
1692+ ['flickr',
1693+ 69,
1694+ '8550829193',
1695+ 'images',
1696+ 'Nelson Webb',
1697+ '27204141@N05',
1698+ 'Nelson Webb',
1699+ True,
1700+ '2013-03-12T13:54:10Z',
1701+ '',
1702+ GLib.get_user_cache_dir() +
1703+ '/friends/avatars/cae2939354a33fea5f008df91bb8e25920be5dc3',
1704+ 'http://www.flickr.com/people/27204141@N05',
1705+ 0,
1706+ False,
1707+ 'http://farm9.static.flickr.com/8246/27204141@N05_e_m.jpg',
1708+ '',
1709+ 'http://farm9.static.flickr.com/8246/27204141@N05_e_b.jpg',
1710+ '',
1711+ 'St. Michael - The Archangel',
1712+ 'http://farm9.static.flickr.com/8246/27204141@N05_e_t.jpg',
1713+ '',
1714+ 53.833156,
1715+ -112.330784,
1716 ])
1717
1718 @mock.patch('friends.utils.http.Soup.form_request_new_from_multipart',
1719
1720=== modified file 'friends/tests/test_foursquare.py'
1721--- friends/tests/test_foursquare.py 2013-02-26 19:13:31 +0000
1722+++ friends/tests/test_foursquare.py 2013-03-20 13:27:53 +0000
1723@@ -22,20 +22,12 @@
1724
1725 import unittest
1726
1727-from gi.repository import Dee
1728-
1729 from friends.protocols.foursquare import FourSquare
1730-from friends.tests.mocks import FakeAccount, FakeSoupMessage, LogMock, mock
1731-from friends.utils.model import COLUMN_TYPES
1732+from friends.tests.mocks import FakeAccount, FakeSoupMessage, LogMock
1733+from friends.tests.mocks import TestModel, mock
1734 from friends.errors import AuthorizationError
1735
1736
1737-# Create a test model that will not interfere with the user's environment.
1738-# We'll use this object as a mock of the real model.
1739-TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
1740-TestModel.set_schema_full(COLUMN_TYPES)
1741-
1742-
1743 @mock.patch('friends.utils.http._soup', mock.Mock())
1744 @mock.patch('friends.utils.base.notify', mock.Mock())
1745 class TestFourSquare(unittest.TestCase):
1746@@ -93,11 +85,11 @@
1747 self.assertEqual(self.protocol.receive(), 1)
1748 self.assertEqual(1, TestModel.get_n_rows())
1749 expected = [
1750- [['foursquare', '1234', '50574c9ce4b0a9a6e84433a0']],
1751+ 'foursquare', 88, '50574c9ce4b0a9a6e84433a0',
1752 'messages', 'Jimbob Smith', '', '', True, '2012-09-17T19:15:24Z',
1753 "Working on friends's foursquare plugin.",
1754- '~/.cache/friends/avatar/hash', '', 0.0, False, '', '', '',
1755- '', '', '',
1756+ '~/.cache/friends/avatar/hash', '', 0, False, '', '', '',
1757+ '', '', '', 'Pop Soda\'s Coffee House & Gallery',
1758+ 49.88873164336725, -97.158043384552,
1759 ]
1760- for got, want in zip(TestModel.get_row(0), expected):
1761- self.assertEqual(got, want)
1762+ self.assertEqual(list(TestModel.get_row(0)), expected)
1763
1764=== modified file 'friends/tests/test_identica.py'
1765--- friends/tests/test_identica.py 2013-03-08 02:31:15 +0000
1766+++ friends/tests/test_identica.py 2013-03-20 13:27:53 +0000
1767@@ -26,23 +26,15 @@
1768 import unittest
1769 import shutil
1770
1771-from gi.repository import Dee
1772-
1773 from friends.protocols.identica import Identica
1774-from friends.tests.mocks import FakeAccount, LogMock, mock
1775+from friends.tests.mocks import FakeAccount, LogMock, TestModel, mock
1776 from friends.utils.cache import JsonCache
1777-from friends.utils.model import COLUMN_TYPES
1778 from friends.errors import AuthorizationError
1779
1780
1781-# Create a test model that will not interfere with the user's environment.
1782-# We'll use this object as a mock of the real model.
1783-TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
1784-TestModel.set_schema_full(COLUMN_TYPES)
1785-
1786-
1787 @mock.patch('friends.utils.http._soup', mock.Mock())
1788 @mock.patch('friends.utils.base.notify', mock.Mock())
1789+@mock.patch('friends.utils.base.Model', TestModel)
1790 class TestIdentica(unittest.TestCase):
1791 """Test the Identica API."""
1792
1793
1794=== modified file 'friends/tests/test_mock_dispatcher.py'
1795--- friends/tests/test_mock_dispatcher.py 2013-02-05 01:11:35 +0000
1796+++ friends/tests/test_mock_dispatcher.py 2013-03-20 13:27:53 +0000
1797@@ -22,13 +22,11 @@
1798
1799 import dbus.service
1800 import unittest
1801-import json
1802
1803 from dbus.mainloop.glib import DBusGMainLoop
1804
1805 from friends.service.mock_service import Dispatcher as MockDispatcher
1806 from friends.service.dispatcher import Dispatcher
1807-from friends.tests.mocks import LogMock, mock
1808
1809
1810 # Set up the DBus main loop.
1811
1812=== modified file 'friends/tests/test_model.py'
1813--- friends/tests/test_model.py 2013-02-05 01:11:35 +0000
1814+++ friends/tests/test_model.py 2013-03-20 13:27:53 +0000
1815@@ -26,10 +26,8 @@
1816
1817 import unittest
1818
1819-from friends.utils.model import Model
1820 from friends.utils.model import prune_model, persist_model
1821 from friends.tests.mocks import LogMock, mock
1822-from gi.repository import Dee
1823
1824
1825 class TestModel(unittest.TestCase):
1826@@ -42,6 +40,17 @@
1827 self.log_mock.stop()
1828
1829 @mock.patch('friends.utils.model.Model')
1830+ def test_persist_model(self, model):
1831+ model.__len__.return_value = 500
1832+ model.is_synchronized.return_value = True
1833+ persist_model()
1834+ model.is_synchronized.assert_called_once_with()
1835+ model.flush_revision_queue.assert_called_once_with()
1836+ self.assertEqual(self.log_mock.empty(),
1837+ 'Trying to save Dee.SharedModel with 500 rows.\n' +
1838+ 'Saving Dee.SharedModel with 500 rows.\n')
1839+
1840+ @mock.patch('friends.utils.model.Model')
1841 @mock.patch('friends.utils.model.persist_model')
1842 def test_prune_one(self, persist, model):
1843 model.get_n_rows.return_value = 8001
1844
1845=== modified file 'friends/tests/test_notify.py'
1846--- friends/tests/test_notify.py 2013-02-05 01:11:35 +0000
1847+++ friends/tests/test_notify.py 2013-03-20 13:27:53 +0000
1848@@ -22,20 +22,11 @@
1849
1850 import unittest
1851
1852-from gi.repository import Dee
1853-
1854-from friends.tests.mocks import FakeAccount, mock
1855+from friends.tests.mocks import FakeAccount, TestModel, mock
1856 from friends.utils.base import Base
1857-from friends.utils.model import COLUMN_TYPES
1858 from friends.utils.notify import notify
1859
1860
1861-# Create a test model that will not interfere with the user's environment.
1862-# We'll use this object as a mock of the real model.
1863-TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
1864-TestModel.set_schema_full(COLUMN_TYPES)
1865-
1866-
1867 class TestNotifications(unittest.TestCase):
1868 """Test notification details."""
1869
1870@@ -43,7 +34,6 @@
1871 TestModel.clear()
1872
1873 @mock.patch('friends.utils.base.Model', TestModel)
1874- @mock.patch('friends.utils.base._seen_messages', {})
1875 @mock.patch('friends.utils.base._seen_ids', {})
1876 @mock.patch('friends.utils.base.notify')
1877 def test_publish_all(self, notify):
1878@@ -57,7 +47,6 @@
1879 notify.assert_called_once_with('Benjamin', 'notify!', '')
1880
1881 @mock.patch('friends.utils.base.Model', TestModel)
1882- @mock.patch('friends.utils.base._seen_messages', {})
1883 @mock.patch('friends.utils.base._seen_ids', {})
1884 @mock.patch('friends.utils.base.notify')
1885 def test_publish_mentions_private(self, notify):
1886@@ -73,7 +62,6 @@
1887 notify.assert_called_once_with('Benjamin', 'This message is private!', '')
1888
1889 @mock.patch('friends.utils.base.Model', TestModel)
1890- @mock.patch('friends.utils.base._seen_messages', {})
1891 @mock.patch('friends.utils.base._seen_ids', {})
1892 @mock.patch('friends.utils.base.notify')
1893 def test_publish_mention_fail(self, notify):
1894@@ -89,7 +77,6 @@
1895 self.assertEqual(notify.call_count, 0)
1896
1897 @mock.patch('friends.utils.base.Model', TestModel)
1898- @mock.patch('friends.utils.base._seen_messages', {})
1899 @mock.patch('friends.utils.base._seen_ids', {})
1900 @mock.patch('friends.utils.base.notify')
1901 def test_publish_mention_none(self, notify):
1902
1903=== modified file 'friends/tests/test_protocols.py'
1904--- friends/tests/test_protocols.py 2013-02-05 01:11:35 +0000
1905+++ friends/tests/test_protocols.py 2013-03-20 13:27:53 +0000
1906@@ -24,21 +24,12 @@
1907 import unittest
1908 import threading
1909
1910-from gi.repository import Dee
1911-
1912 from friends.protocols.flickr import Flickr
1913 from friends.protocols.twitter import Twitter
1914-from friends.tests.mocks import FakeAccount, LogMock, mock
1915-from friends.utils.base import Base, feature
1916+from friends.tests.mocks import FakeAccount, LogMock, TestModel, mock
1917+from friends.utils.base import Base, feature, linkify_string
1918 from friends.utils.manager import ProtocolManager
1919-from friends.utils.model import (
1920- COLUMN_INDICES, COLUMN_NAMES, COLUMN_TYPES, Model)
1921-
1922-
1923-# Create a test model that will not interfere with the user's environment.
1924-# We'll use this object as a mock of the real model.
1925-TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
1926-TestModel.set_schema_full(COLUMN_TYPES)
1927+from friends.utils.model import COLUMN_INDICES, Model
1928
1929
1930 class TestProtocolManager(unittest.TestCase):
1931@@ -147,42 +138,39 @@
1932 count = Model.get_n_rows()
1933 self.assertEqual(TestModel.get_n_rows(), 0)
1934 base = Base(FakeAccount())
1935- base._publish('alpha', message='a')
1936- base._publish('beta', message='b')
1937- base._publish('omega', message='c')
1938+ base._publish(message_id='alpha', message='a')
1939+ base._publish(message_id='beta', message='b')
1940+ base._publish(message_id='omega', message='c')
1941 self.assertEqual(Model.get_n_rows(), count)
1942 self.assertEqual(TestModel.get_n_rows(), 3)
1943
1944 @mock.patch('friends.utils.base.Model', TestModel)
1945 @mock.patch('friends.utils.base._seen_ids', {})
1946- @mock.patch('friends.utils.base._seen_messages', {})
1947 def test_seen_dicts_successfully_instantiated(self):
1948- from friends.utils.base import _seen_ids, _seen_messages
1949+ from friends.utils.base import _seen_ids
1950 from friends.utils.base import initialize_caches
1951 self.assertEqual(TestModel.get_n_rows(), 0)
1952 base = Base(FakeAccount())
1953- base._publish('alpha', sender='a', message='a')
1954- base._publish('beta', sender='a', message='a')
1955- base._publish('omega', sender='a', message='b')
1956- self.assertEqual(TestModel.get_n_rows(), 2)
1957+ base._publish(message_id='alpha', sender='a', message='a')
1958+ base._publish(message_id='beta', sender='a', message='a')
1959+ base._publish(message_id='omega', sender='a', message='b')
1960+ self.assertEqual(TestModel.get_n_rows(), 3)
1961 _seen_ids.clear()
1962- _seen_messages.clear()
1963 initialize_caches()
1964- self.assertEqual(sorted(list(_seen_messages.keys())), ['aa', 'ab'])
1965- self.assertEqual(sorted(list(_seen_ids.keys())),
1966- [('base', '1234', 'alpha'),
1967- ('base', '1234', 'beta'),
1968- ('base', '1234', 'omega')])
1969- # These two point at the same row because sender+message are identical
1970- self.assertEqual(_seen_ids[('base', '1234', 'alpha')],
1971- _seen_ids[('base', '1234', 'beta')])
1972+ self.assertEqual(
1973+ _seen_ids,
1974+ dict(alpha=0,
1975+ beta=1,
1976+ omega=2,
1977+ )
1978+ )
1979
1980 @mock.patch('friends.utils.base.Model', TestModel)
1981 def test_invalid_argument(self):
1982 base = Base(FakeAccount())
1983 self.assertEqual(0, TestModel.get_n_rows())
1984 with self.assertRaises(TypeError) as cm:
1985- base._publish('message_id', invalid_argument='not good')
1986+ base._publish(message_id='message_id', invalid_argument='not good')
1987 self.assertEqual(str(cm.exception),
1988 'Unexpected keyword arguments: invalid_argument')
1989
1990@@ -192,12 +180,11 @@
1991 base = Base(FakeAccount())
1992 self.assertEqual(0, TestModel.get_n_rows())
1993 with self.assertRaises(TypeError) as cm:
1994- base._publish('p.middy', bad='no', wrong='yes')
1995+ base._publish(message_id='p.middy', bad='no', wrong='yes')
1996 self.assertEqual(str(cm.exception),
1997 'Unexpected keyword arguments: bad, wrong')
1998
1999 @mock.patch('friends.utils.base.Model', TestModel)
2000- @mock.patch('friends.utils.base._seen_messages', {})
2001 @mock.patch('friends.utils.base._seen_ids', {})
2002 def test_one_message(self):
2003 # Test that publishing a message inserts a row into the model.
2004@@ -215,28 +202,34 @@
2005 liked=True))
2006 self.assertEqual(1, TestModel.get_n_rows())
2007 row = TestModel.get_row(0)
2008- # For convenience.
2009- def V(column_name):
2010- return row[COLUMN_INDICES[column_name]]
2011- self.assertEqual(V('message_ids'),
2012- [['base', '1234', '1234']])
2013- self.assertEqual(V('stream'), 'messages')
2014- self.assertEqual(V('sender'), 'fred')
2015- self.assertEqual(V('sender_nick'), 'freddy')
2016- self.assertTrue(V('from_me'))
2017- self.assertEqual(V('timestamp'), 'today')
2018- self.assertEqual(V('message'), 'hello, @jimmy')
2019- self.assertEqual(V('likes'), 10)
2020- self.assertTrue(V('liked'))
2021- # All the other columns have empty string values.
2022- empty_columns = set(COLUMN_NAMES) - set(
2023- ['message_ids', 'stream', 'sender', 'sender_nick', 'from_me',
2024- 'timestamp', 'comments', 'message', 'likes', 'liked'])
2025- for column_name in empty_columns:
2026- self.assertEqual(row[COLUMN_INDICES[column_name]], '')
2027+ self.assertEqual(
2028+ list(row),
2029+ ['base',
2030+ 88,
2031+ '1234',
2032+ 'messages',
2033+ 'fred',
2034+ '',
2035+ 'freddy',
2036+ True,
2037+ 'today',
2038+ 'hello, @jimmy',
2039+ '',
2040+ '',
2041+ 10,
2042+ True,
2043+ '',
2044+ '',
2045+ '',
2046+ '',
2047+ '',
2048+ '',
2049+ '',
2050+ 0.0,
2051+ 0.0,
2052+ ])
2053
2054 @mock.patch('friends.utils.base.Model', TestModel)
2055- @mock.patch('friends.utils.base._seen_messages', {})
2056 @mock.patch('friends.utils.base._seen_ids', {})
2057 def test_unpublish(self):
2058 base = Base(FakeAccount())
2059@@ -246,32 +239,24 @@
2060 sender='fred',
2061 message='hello, @jimmy'))
2062 self.assertTrue(base._publish(
2063+ message_id='1234',
2064+ sender='fred',
2065+ message='hello, @jimmy'))
2066+ self.assertTrue(base._publish(
2067 message_id='5678',
2068 sender='fred',
2069 message='hello, +jimmy'))
2070- self.assertEqual(1, TestModel.get_n_rows())
2071- self.assertEqual(TestModel[0][0],
2072- [['base', '1234', '1234'],
2073- ['base', '1234', '5678']])
2074+ self.assertEqual(2, TestModel.get_n_rows())
2075 base._unpublish('1234')
2076 self.assertEqual(1, TestModel.get_n_rows())
2077- self.assertEqual(TestModel[0][0],
2078- [['base', '1234', '5678']])
2079 base._unpublish('5678')
2080 self.assertEqual(0, TestModel.get_n_rows())
2081
2082 @mock.patch('friends.utils.base.Model', TestModel)
2083- @mock.patch('friends.utils.base._seen_messages', {})
2084 @mock.patch('friends.utils.base._seen_ids', {})
2085 def test_duplicate_messages_identified(self):
2086- # When two messages which are deemed identical, by way of the
2087- # _make_key() test in base.py, are published, only one ends up in the
2088- # model. However, the message_ids list-of-lists gets both sets of
2089- # identifiers.
2090 base = Base(FakeAccount())
2091 self.assertEqual(0, TestModel.get_n_rows())
2092- # Insert the first message into the table. The key will be the string
2093- # 'fredhellojimmy'
2094 self.assertTrue(base._publish(
2095 message_id='1234',
2096 stream='messages',
2097@@ -282,11 +267,9 @@
2098 message='hello, @jimmy',
2099 likes=10,
2100 liked=True))
2101- # Insert the second message into the table. Note that because
2102- # punctuation was stripped from the above message, this one will also
2103- # have the key 'fredhellojimmy', thus it will be deemed a duplicate.
2104+ # Duplicate
2105 self.assertTrue(base._publish(
2106- message_id='5678',
2107+ message_id='1234',
2108 stream='messages',
2109 sender='fred',
2110 sender_nick='freddy',
2111@@ -300,13 +283,8 @@
2112 # The first published message wins.
2113 row = TestModel.get_row(0)
2114 self.assertEqual(row[COLUMN_INDICES['message']], 'hello, @jimmy')
2115- # Both message ids will be present, in the order they were published.
2116- self.assertEqual(row[COLUMN_INDICES['message_ids']],
2117- [['base', '1234', '1234'],
2118- ['base', '1234', '5678']])
2119
2120 @mock.patch('friends.utils.base.Model', TestModel)
2121- @mock.patch('friends.utils.base._seen_messages', {})
2122 @mock.patch('friends.utils.base._seen_ids', {})
2123 def test_duplicate_ids_not_duplicated(self):
2124 # When two messages are actually identical (same ids and all),
2125@@ -325,12 +303,34 @@
2126 message='hello, @jimmy'))
2127 self.assertEqual(1, TestModel.get_n_rows())
2128 row = TestModel.get_row(0)
2129- # The same message_id should not appear twice.
2130- self.assertEqual(row[COLUMN_INDICES['message_ids']],
2131- [['base', '1234', '1234']])
2132+ self.assertEqual(
2133+ list(row),
2134+ ['base',
2135+ 88,
2136+ '1234',
2137+ 'messages',
2138+ 'fred',
2139+ '',
2140+ '',
2141+ False,
2142+ '',
2143+ 'hello, @jimmy',
2144+ '',
2145+ '',
2146+ 0,
2147+ False,
2148+ '',
2149+ '',
2150+ '',
2151+ '',
2152+ '',
2153+ '',
2154+ '',
2155+ 0.0,
2156+ 0.0,
2157+ ])
2158
2159 @mock.patch('friends.utils.base.Model', TestModel)
2160- @mock.patch('friends.utils.base._seen_messages', {})
2161 @mock.patch('friends.utils.base._seen_ids', {})
2162 def test_similar_messages_allowed(self):
2163 # Because both the sender and message contribute to the unique key we
2164@@ -392,3 +392,78 @@
2165
2166 def test_features(self):
2167 self.assertEqual(MyProtocol.get_features(), ['feature_1', 'feature_2'])
2168+
2169+ def test_linkify_string(self):
2170+ # String with no URL is unchanged.
2171+ self.assertEqual('Hello!', linkify_string('Hello!'))
2172+ # http:// works.
2173+ self.assertEqual(
2174+ '<a href="http://www.example.com">http://www.example.com</a>',
2175+ linkify_string('http://www.example.com'))
2176+ # https:// works, too.
2177+ self.assertEqual(
2178+ '<a href="https://www.example.com">https://www.example.com</a>',
2179+ linkify_string('https://www.example.com'))
2180+ # http:// is optional if you include www.
2181+ self.assertEqual(
2182+ '<a href="www.example.com">www.example.com</a>',
2183+ linkify_string('www.example.com'))
2184+ # Haha, nobody uses ftp anymore!
2185+ self.assertEqual(
2186+ '<a href="ftp://example.com/">ftp://example.com/</a>',
2187+ linkify_string('ftp://example.com/'))
2188+ # Trailing periods are not linkified.
2189+ self.assertEqual(
2190+ '<a href="http://example.com">http://example.com</a>.',
2191+ linkify_string('http://example.com.'))
2192+ # URL can contain periods without getting cut off.
2193+ self.assertEqual(
2194+ '<a href="http://example.com/products/buy.html">'
2195+ 'http://example.com/products/buy.html</a>.',
2196+ linkify_string('http://example.com/products/buy.html.'))
2197+ # Don't linkify trailing brackets.
2198+ self.assertEqual(
2199+ 'Example Co (<a href="http://example.com">http://example.com</a>).',
2200+ linkify_string('Example Co (http://example.com).'))
2201+ # Don't linkify trailing exclamation marks.
2202+ self.assertEqual(
2203+ 'Go to <a href="https://example.com">https://example.com</a>!',
2204+ linkify_string('Go to https://example.com!'))
2205+ # Don't linkify trailing commas, also ensure all links are found.
2206+ self.assertEqual(
2207+ '<a href="www.example.com">www.example.com</a>, <a '
2208+ 'href="http://example.com/stuff">http://example.com/stuff</a>, and '
2209+ '<a href="http://example.com/things">http://example.com/things</a> '
2210+ 'are my favorite sites.',
2211+ linkify_string('www.example.com, http://example.com/stuff, and '
2212+ 'http://example.com/things are my favorite sites.'))
2213+ # Don't linkify trailing question marks.
2214+ self.assertEqual(
2215+ 'Ever been to <a href="www.example.com">www.example.com</a>?',
2216+ linkify_string('Ever been to www.example.com?'))
2217+ # URLs can contain question marks ok.
2218+ self.assertEqual(
2219+ 'Like <a href="http://example.com?foo=bar&grill=true">'
2220+ 'http://example.com?foo=bar&grill=true</a>?',
2221+ linkify_string('Like http://example.com?foo=bar&grill=true?'))
2222+ # Multi-line strings are also supported.
2223+ self.assertEqual(
2224+ 'Hey, visit us online!\n\n'
2225+ '<a href="http://example.com">http://example.com</a>',
2226+ linkify_string('Hey, visit us online!\n\nhttp://example.com'))
2227+ # Don't accidentally duplicate linkification.
2228+ self.assertEqual(
2229+ '<a href="www.example.com">click here!</a>',
2230+ linkify_string('<a href="www.example.com">click here!</a>'))
2231+ self.assertEqual(
2232+ '<a href="www.example.com">www.example.com</a>',
2233+ linkify_string('<a href="www.example.com">www.example.com</a>'))
2234+ self.assertEqual(
2235+ '<a href="www.example.com">www.example.com</a> is our website',
2236+ linkify_string(
2237+ '<a href="www.example.com">www.example.com</a> is our website'))
2238+ # This, apparently, is valid HTML.
2239+ self.assertEqual(
2240+ '<a href = "www.example.com">www.example.com</a>',
2241+ linkify_string(
2242+ '<a href = "www.example.com">www.example.com</a>'))
2243
2244=== modified file 'friends/tests/test_shortener.py'
2245--- friends/tests/test_shortener.py 2013-02-13 02:05:37 +0000
2246+++ friends/tests/test_shortener.py 2013-03-20 13:27:53 +0000
2247@@ -22,8 +22,6 @@
2248
2249 import unittest
2250
2251-from operator import getitem
2252-
2253 from friends.shorteners import isgd, ougd, linkeecom, lookup, tinyurlcom
2254 from friends.tests.mocks import FakeSoupMessage, mock
2255
2256
2257=== modified file 'friends/tests/test_twitter.py'
2258--- friends/tests/test_twitter.py 2013-03-08 02:31:15 +0000
2259+++ friends/tests/test_twitter.py 2013-03-20 13:27:53 +0000
2260@@ -26,22 +26,16 @@
2261 import unittest
2262 import shutil
2263
2264-from gi.repository import GLib, Dee
2265+from gi.repository import GLib
2266 from urllib.error import HTTPError
2267
2268 from friends.protocols.twitter import RateLimiter, Twitter
2269-from friends.tests.mocks import FakeAccount, FakeSoupMessage, LogMock, mock
2270+from friends.tests.mocks import FakeAccount, FakeSoupMessage, LogMock
2271+from friends.tests.mocks import TestModel, mock
2272 from friends.utils.cache import JsonCache
2273-from friends.utils.model import COLUMN_TYPES
2274 from friends.errors import AuthorizationError
2275
2276
2277-# Create a test model that will not interfere with the user's environment.
2278-# We'll use this object as a mock of the real model.
2279-TestModel = Dee.SharedModel.new('com.canonical.Friends.TestSharedModel')
2280-TestModel.set_schema_full(COLUMN_TYPES)
2281-
2282-
2283 @mock.patch('friends.utils.http._soup', mock.Mock())
2284 @mock.patch('friends.utils.base.notify', mock.Mock())
2285 class TestTwitter(unittest.TestCase):
2286@@ -124,7 +118,6 @@
2287 FakeSoupMessage('friends.tests.data', 'twitter-home.dat'))
2288 @mock.patch('friends.protocols.twitter.Twitter._login',
2289 return_value=True)
2290- @mock.patch('friends.utils.base._seen_messages', {})
2291 @mock.patch('friends.utils.base._seen_ids', {})
2292 def test_home(self, *mocks):
2293 self.account.access_token = 'access'
2294@@ -138,31 +131,32 @@
2295
2296 # This test data was ripped directly from Twitter's API docs.
2297 expected = [
2298- [[['twitter', '1234', '240558470661799936']],
2299+ ['twitter', 88, '240558470661799936',
2300 'messages', 'OAuth Dancer', '119476949', 'oauth_dancer', False,
2301 '2012-08-28T21:16:23Z', 'just another test',
2302 GLib.get_user_cache_dir() +
2303 '/friends/avatars/ded4ba3c00583ee511f399d0b2537731ca14c39d',
2304 'https://twitter.com/oauth_dancer/status/240558470661799936',
2305- 0.0, False, '', '', '', '', '', '',
2306+ 0, False, '', '', '', '', '', '', '', 0.0, 0.0,
2307 ],
2308- [[['twitter', '1234', '240556426106372096']],
2309+ ['twitter', 88, '240556426106372096',
2310 'messages', 'Raffi Krikorian', '8285392', 'raffi', False,
2311- '2012-08-28T21:08:15Z', 'lecturing at the "analyzing big data ' +
2312- 'with twitter" class at @cal with @othman http://t.co/bfj7zkDJ',
2313+ '2012-08-28T21:08:15Z', 'lecturing at the "analyzing big data '
2314+ 'with twitter" class at @cal with @othman '
2315+ '<a href="http://t.co/bfj7zkDJ">http://t.co/bfj7zkDJ</a>',
2316 GLib.get_user_cache_dir() +
2317 '/friends/avatars/0219effc03a3049a622476e6e001a4014f33dc31',
2318 'https://twitter.com/raffi/status/240556426106372096',
2319- 0.0, False, '', '', '', '', '', '',
2320+ 0, False, '', '', '', '', '', '', '', 0.0, 0.0,
2321 ],
2322- [[['twitter', '1234', '240539141056638977']],
2323+ ['twitter', 88, '240539141056638977',
2324 'messages', 'Taylor Singletary', '819797', 'episod', False,
2325 '2012-08-28T19:59:34Z',
2326 'You\'d be right more often if you thought you were wrong.',
2327 GLib.get_user_cache_dir() +
2328 '/friends/avatars/0c829cb2934ad76489be21ee5e103735d9b7b034',
2329 'https://twitter.com/episod/status/240539141056638977',
2330- 0.0, False, '', '', '', '', '', '',
2331+ 0, False, '', '', '', '', '', '', '', 0.0, 0.0,
2332 ],
2333 ]
2334 for i, expected_row in enumerate(expected):
2335@@ -173,7 +167,6 @@
2336 FakeSoupMessage('friends.tests.data', 'twitter-home.dat'))
2337 @mock.patch('friends.protocols.twitter.Twitter._login',
2338 return_value=True)
2339- @mock.patch('friends.utils.base._seen_messages', {})
2340 @mock.patch('friends.utils.base._seen_ids', {})
2341 def test_home_since_id(self, *mocks):
2342 self.account.access_token = 'access'
2343@@ -193,13 +186,11 @@
2344 'https://api.twitter.com/1.1/statuses/' +
2345 'home_timeline.json?count=50&since_id=240558470661799936')
2346
2347-
2348 @mock.patch('friends.utils.base.Model', TestModel)
2349 @mock.patch('friends.utils.http.Soup.Message',
2350 FakeSoupMessage('friends.tests.data', 'twitter-send.dat'))
2351 @mock.patch('friends.protocols.twitter.Twitter._login',
2352 return_value=True)
2353- @mock.patch('friends.utils.base._seen_messages', {})
2354 @mock.patch('friends.utils.base._seen_ids', {})
2355 def test_from_me(self, *mocks):
2356 self.account.access_token = 'access'
2357@@ -216,18 +207,17 @@
2358
2359 # This test data was ripped directly from Twitter's API docs.
2360 expected_row = [
2361- [['twitter', '1234', '240558470661799936']],
2362+ 'twitter', 88, '240558470661799936',
2363 'messages', 'OAuth Dancer', '119476949', 'oauth_dancer', True,
2364 '2012-08-28T21:16:23Z', 'just another test',
2365 GLib.get_user_cache_dir() +
2366 '/friends/avatars/ded4ba3c00583ee511f399d0b2537731ca14c39d',
2367 'https://twitter.com/oauth_dancer/status/240558470661799936',
2368- 0.0, False, '', '', '', '', '', '',
2369+ 0, False, '', '', '', '', '', '', '', 0.0, 0.0,
2370 ]
2371 self.assertEqual(list(TestModel.get_row(0)), expected_row)
2372
2373 @mock.patch('friends.utils.base.Model', TestModel)
2374- @mock.patch('friends.utils.base._seen_messages', {})
2375 @mock.patch('friends.utils.base._seen_ids', {})
2376 def test_home_url(self):
2377 get_url = self.protocol._get_url = mock.Mock(return_value=['tweet'])
2378@@ -240,7 +230,6 @@
2379 'https://api.twitter.com/1.1/statuses/home_timeline.json?count=50')
2380
2381 @mock.patch('friends.utils.base.Model', TestModel)
2382- @mock.patch('friends.utils.base._seen_messages', {})
2383 @mock.patch('friends.utils.base._seen_ids', {})
2384 def test_mentions(self):
2385 get_url = self.protocol._get_url = mock.Mock(return_value=['tweet'])
2386@@ -254,7 +243,6 @@
2387 'mentions_timeline.json?count=50')
2388
2389 @mock.patch('friends.utils.base.Model', TestModel)
2390- @mock.patch('friends.utils.base._seen_messages', {})
2391 @mock.patch('friends.utils.base._seen_ids', {})
2392 def test_user(self):
2393 get_url = self.protocol._get_url = mock.Mock(return_value=['tweet'])
2394@@ -267,7 +255,6 @@
2395 'https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=')
2396
2397 @mock.patch('friends.utils.base.Model', TestModel)
2398- @mock.patch('friends.utils.base._seen_messages', {})
2399 @mock.patch('friends.utils.base._seen_ids', {})
2400 def test_list(self):
2401 get_url = self.protocol._get_url = mock.Mock(return_value=['tweet'])
2402@@ -280,7 +267,6 @@
2403 'https://api.twitter.com/1.1/lists/statuses.json?list_id=some_list_id')
2404
2405 @mock.patch('friends.utils.base.Model', TestModel)
2406- @mock.patch('friends.utils.base._seen_messages', {})
2407 @mock.patch('friends.utils.base._seen_ids', {})
2408 def test_lists(self):
2409 get_url = self.protocol._get_url = mock.Mock(
2410@@ -294,7 +280,6 @@
2411 'https://api.twitter.com/1.1/lists/list.json')
2412
2413 @mock.patch('friends.utils.base.Model', TestModel)
2414- @mock.patch('friends.utils.base._seen_messages', {})
2415 @mock.patch('friends.utils.base._seen_ids', {})
2416 def test_private(self):
2417 get_url = self.protocol._get_url = mock.Mock(return_value=['tweet'])
2418@@ -346,7 +331,6 @@
2419 ])
2420
2421 @mock.patch('friends.utils.base.Model', TestModel)
2422- @mock.patch('friends.utils.base._seen_messages', {})
2423 @mock.patch('friends.utils.base._seen_ids', {})
2424 def test_send_private(self):
2425 get_url = self.protocol._get_url = mock.Mock(return_value='tweet')
2426@@ -406,6 +390,44 @@
2427 'tweet @pumpichank!',
2428 in_reply_to_status_id='1234'))
2429
2430+ @mock.patch('friends.utils.base.Model', TestModel)
2431+ @mock.patch('friends.utils.http.Soup.Message',
2432+ FakeSoupMessage('friends.tests.data', 'twitter-home.dat'))
2433+ @mock.patch('friends.protocols.twitter.Twitter._login',
2434+ return_value=True)
2435+ @mock.patch('friends.utils.base._seen_ids', {})
2436+ def test_send_thread_prepend_nick(self, *mocks):
2437+ self.account.access_token = 'access'
2438+ self.account.secret_token = 'secret'
2439+ self.account.auth.parameters = dict(
2440+ ConsumerKey='key',
2441+ ConsumerSecret='secret')
2442+ self.assertEqual(0, TestModel.get_n_rows())
2443+ self.assertEqual(self.protocol.home(), 3)
2444+ self.assertEqual(3, TestModel.get_n_rows())
2445+
2446+ # If you forgot to @mention in your reply, we add it for you.
2447+ get = self.protocol._get_url = mock.Mock()
2448+ self.protocol._publish_tweet = mock.Mock()
2449+ self.protocol.send_thread(
2450+ '240556426106372096',
2451+ 'Exciting and original response!')
2452+ get.assert_called_once_with(
2453+ 'https://api.twitter.com/1.1/statuses/update.json',
2454+ dict(status='@raffi Exciting and original response!',
2455+ in_reply_to_status_id='240556426106372096'))
2456+
2457+ # If you remembered the @mention, we won't duplicate it.
2458+ get.reset_mock()
2459+ self.protocol.send_thread(
2460+ '240556426106372096',
2461+ 'You are the greatest, @raffi!')
2462+ get.assert_called_once_with(
2463+ 'https://api.twitter.com/1.1/statuses/update.json',
2464+ dict(status='You are the greatest, @raffi!',
2465+ in_reply_to_status_id='240556426106372096'))
2466+
2467+
2468 def test_delete(self):
2469 get_url = self.protocol._get_url = mock.Mock(return_value='tweet')
2470 publish = self.protocol._unpublish = mock.Mock()
2471@@ -466,7 +488,6 @@
2472 dict(id='1234'))
2473
2474 @mock.patch('friends.utils.base.Model', TestModel)
2475- @mock.patch('friends.utils.base._seen_messages', {})
2476 @mock.patch('friends.utils.base._seen_ids', {})
2477 def test_tag(self):
2478 get_url = self.protocol._get_url = mock.Mock(
2479@@ -486,7 +507,6 @@
2480 'https://api.twitter.com/1.1/search/tweets.json?q=%23yegbike')
2481
2482 @mock.patch('friends.utils.base.Model', TestModel)
2483- @mock.patch('friends.utils.base._seen_messages', {})
2484 @mock.patch('friends.utils.base._seen_ids', {})
2485 def test_search(self):
2486 get_url = self.protocol._get_url = mock.Mock(
2487
2488=== modified file 'friends/utils/account.py'
2489--- friends/utils/account.py 2013-02-05 01:11:35 +0000
2490+++ friends/utils/account.py 2013-03-20 13:27:53 +0000
2491@@ -43,7 +43,6 @@
2492 # are added or deleted.
2493 manager = Accounts.Manager.new_for_service_type('microblogging')
2494 manager.connect('enabled-event', self._on_enabled_event)
2495- manager.connect('account-deleted', self._on_account_deleted)
2496 # Add all the currently known accounts.
2497 for account_service in manager.get_enabled_account_services():
2498 self._add_new_account(account_service)
2499@@ -63,25 +62,6 @@
2500 account = self._add_new_account(account_service)
2501 if account is not None:
2502 account.protocol('receive')
2503- else:
2504- # If an account has been disabled in UOA, we should remove
2505- # it's messages from the SharedModel.
2506- self._unpublish_entire_account(account_id)
2507-
2508- def _on_account_deleted(self, manager, account_id):
2509- account_service = self._get_service(manager, account_id)
2510- if account_service is not None:
2511- log.debug('Deleting account {}'.format(account_id))
2512- self._unpublish_entire_account(account_id)
2513- else:
2514- log.error('Tried to delete invalid account: {}'.format(account_id))
2515-
2516- def _unpublish_entire_account(self, account_id):
2517- """Delete all the account's messages from the SharedModel."""
2518- log.debug('Deleting all messages from {}.'.format(account_id))
2519- account = self._accounts.pop(str(account_id), None)
2520- if account is not None:
2521- account.protocol._unpublish_all()
2522
2523 def _add_new_account(self, account_service):
2524 try:
2525@@ -89,14 +69,14 @@
2526 except UnsupportedProtocolError as error:
2527 log.info(error)
2528 else:
2529- self._accounts[str(new_account.id)] = new_account
2530+ self._accounts[new_account.id] = new_account
2531 return new_account
2532
2533 def get_all(self):
2534 return self._accounts.values()
2535
2536 def get(self, account_id, default=None):
2537- return self._accounts.get(str(account_id), default)
2538+ return self._accounts.get(int(account_id), default)
2539
2540
2541 class AuthData:
2542@@ -132,12 +112,12 @@
2543 self.auth = AuthData(account_service.get_auth_data())
2544 # The provider in libaccounts should match the name of our protocol.
2545 account = account_service.get_account()
2546+ self.id = account.id
2547 self.protocol_name = account.get_provider_name()
2548 protocol_class = protocol_manager.protocols.get(self.protocol_name)
2549 if protocol_class is None:
2550 raise UnsupportedProtocolError(self.protocol_name)
2551 self.protocol = protocol_class(self)
2552- self.id = str(account.id)
2553 # Connect responders to changes in the account information.
2554 account_service.connect('changed', self._on_account_changed, account)
2555 self._on_account_changed(account_service, account)
2556
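The account.py hunk above drops the `str()` round-tripping and keys the accounts registry by native integer ids, normalizing lookups with `int()`. A minimal sketch of that pattern (the class and method names here are illustrative stand-ins, not the real friends API):

```python
class AccountRegistry:
    """Toy registry keyed by integer account ids, as in the diff above."""

    def __init__(self):
        self._accounts = {}

    def add(self, account):
        # account.id is now stored as a plain int, not str(account.id).
        self._accounts[account.id] = account

    def get(self, account_id, default=None):
        # Callers may pass '6' or 6; normalize to int before lookup.
        return self._accounts.get(int(account_id), default)
```

This keeps D-Bus callers (which often hand over ids as strings) working while the dictionary itself stays consistently int-keyed.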
2557=== modified file 'friends/utils/authentication.py'
2558--- friends/utils/authentication.py 2013-02-05 01:11:35 +0000
2559+++ friends/utils/authentication.py 2013-03-20 13:27:53 +0000
2560@@ -66,5 +66,10 @@
2561 def _login_cb(self, session, reply, error, user_data):
2562 self._reply = reply
2563 if error:
2564- raise AuthorizationError(self.account.id, error.message)
2565+ exception = AuthorizationError(self.account.id, error.message)
2566+ # Mardy says this error can happen during normal operation.
2567+ if error.message.endswith('userActionFinished error: 10'):
2568+ log.error(str(exception))
2569+ else:
2570+ raise exception
2571 log.debug('Login completed')
2572
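The authentication.py change above downgrades one known-benign signon failure from a raised exception to a logged error. A standalone sketch of that selective suppression (this helper function and its signature are an assumption for illustration; the real code lives in the `_login_cb` callback):

```python
import logging

log = logging.getLogger(__name__)


class AuthorizationError(Exception):
    """Stand-in for friends.errors.AuthorizationError."""


def handle_login_error(account_id, message):
    """Suppress the known-benign signon error; raise anything else."""
    exception = AuthorizationError('{}: {}'.format(account_id, message))
    # Mardy says this error can happen during normal operation, so it
    # is logged rather than raised.
    if message.endswith('userActionFinished error: 10'):
        log.error(str(exception))
    else:
        raise exception
```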
2573=== modified file 'friends/utils/base.py'
2574--- friends/utils/base.py 2013-03-08 02:31:15 +0000
2575+++ friends/utils/base.py 2013-03-20 13:27:53 +0000
2576@@ -25,7 +25,6 @@
2577
2578 import re
2579 import time
2580-import string
2581 import logging
2582 import threading
2583
2584@@ -43,23 +42,49 @@
2585
2586
2587 STUB = lambda *ignore, **kwignore: None
2588-IGNORED = string.punctuation + string.whitespace
2589-SCHEME_RE = re.compile('http[s]?://|friends:/', re.IGNORECASE)
2590-EMPTY_STRING = ''
2591 COMMA_SPACE = ', '
2592 AVATAR_IDX = COLUMN_INDICES['icon_uri']
2593 FROM_ME_IDX = COLUMN_INDICES['from_me']
2594 STREAM_IDX = COLUMN_INDICES['stream']
2595 SENDER_IDX = COLUMN_INDICES['sender']
2596 MESSAGE_IDX = COLUMN_INDICES['message']
2597-IDS_IDX = COLUMN_INDICES['message_ids']
2598+ID_IDX = COLUMN_INDICES['message_id']
2599+ACCT_IDX = COLUMN_INDICES['account_id']
2600 TIME_IDX = COLUMN_INDICES['timestamp']
2601
2602-
2603-# This is a mapping from Dee.SharedModel row keys to the DeeModelIters
2604-# representing the rows matching those keys. It is used for quickly finding
2605-# duplicates when we want to insert new rows into the model.
2606-_seen_messages = {}
2607+# See friends/tests/test_protocols.py for further documentation
2608+LINKIFY_REGEX = re.compile(
2609+ r"""
2610+ # Do not match if URL is preceded by '"' or '>'
2611+ # This is used to prevent duplication of linkification.
2612+ (?<![\"\>])
2613+ # Record everything that we're about to match.
2614+ (
2615+ # URLs can start with 'http://', 'https://', 'ftp://', or 'www.'
2616+ (?:(?:https?|ftp)://|www\.)
2617+ # Match many non-whitespace characters, but not greedily.
2618+ (?:\S+?)
2619+ # Stop recording the match.
2620+ )
2621+ # This section will peek ahead (without matching) in order to
2622+ # determine precisely where the URL actually *ends*.
2623+ (?=
2624+ # Do not include any trailing period, comma, exclamation mark,
2625+ # question mark, or closing parentheses, if any are present.
2626+ [.,!?\)]*
2627+ # With "trailing" defined as immediately preceding the first
2628+ # space, or end-of-string.
2629+ (?:\s|$)
2630+ # But abort the whole thing if the URL ends with '</a>',
2631+ # again to prevent duplication of linkification.
2632+ (?!</a>)
2633+ )""",
2634+ flags=re.VERBOSE).sub
2635+
2636+
2637+# This is a mapping from message_ids to DeeModel row index ints. It is
2638+# used for quickly and easily preventing the same message from being
2639+# published multiple times by mistake.
2640 _seen_ids = {}
2641
2642
2643@@ -67,9 +92,6 @@
2644 # publishing new data into the SharedModel.
2645 _publish_lock = threading.Lock()
2646
2647-# Avoid race condition during shut-down
2648-_exit_lock = threading.Lock()
2649-
2650
2651 log = logging.getLogger(__name__)
2652
2653@@ -92,56 +114,27 @@
2654 return method
2655
2656
2657-def _make_key(row):
2658- """Return a unique key for a row in the model.
2659-
2660- This is used for fuzzy comparisons with messages that are already in the
2661- model. We don't want duplicate messages to show up in the stream of
2662- messages that are visible to the user. But different social media sites
2663- attach different semantic meanings to different punctuation marks, so we
2664- want to ignore those for the sake of determining whether one message is
2665- actually identical to another or not. Thus, we need to strip out this
2666- punctuation for the sake of comparing the strings. For example:
2667-
2668- Fred uses Friends to post identical messages on Twitter and Google+
2669- (pretend that we support G+ for a moment). Fred writes 'Hey jimbob, been
2670- to http://example.com lately?', and this message might show up on Twitter
2671- like 'Hey @jimbob, been to example.com lately?', but it might show up on
2672- G+ like 'Hey +jimbob, been to http://example.com lately?'. So we need to
2673- strip out all the possibly different bits in order to identify that these
2674- messages are the same for our purposes. In both of these cases, the
2675- string is converted into 'Heyjimbobbeentoexamplecomlately' and then they
2676- compare equally, so we've identified a duplicate message.
2677- """
2678- # Given a 'row' of data, the sender and message fields are concatenated
2679- # together to form the raw key. Then we strip out details such as url
2680- # schemes, punctuation, and whitespace, that allow for the fuzzy matching.
2681- key = SCHEME_RE.sub('', row[SENDER_IDX] + row[MESSAGE_IDX])
2682- # Now remove all punctuation and whitespace.
2683- return EMPTY_STRING.join([char for char in key if char not in IGNORED])
2684-
2685-
2686 def initialize_caches():
2687- """Populate _seen_ids and _seen_messages with Model data.
2688+ """Populate _seen_ids with Model data.
2689
2690 Our Dee.SharedModel persists across instances, so we need to
2691- populate these caches at launch.
2692+ populate this cache at launch.
2693 """
2694- for i in range(Model.get_n_rows()):
2695- row_iter = Model.get_iter_at_row(i)
2696- row = Model.get_row(row_iter)
2697- _seen_messages[_make_key(row)] = i
2698- for triple in row[IDS_IDX]:
2699- _seen_ids[tuple(triple)] = i
2700- log.debug(
2701- '_seen_ids: {}, _seen_messages: {}'.format(
2702- len(_seen_ids), len(_seen_messages)))
2703+ # Don't create a new dict; we need to keep the same dict object in
2704+ # memory since it gets imported into a few different places that
2705+ # would not get the updated reference to the new dict.
2706+ _seen_ids.clear()
2707+ _seen_ids.update({row[ID_IDX]: i for i, row in enumerate(Model)})
2708+ log.debug('_seen_ids: {}'.format(len(_seen_ids)))
2709+
2710+
2711+def linkify_string(string):
2712+ """Finds all URLs in a string and turns them into HTML links."""
2713+ return LINKIFY_REGEX(r'<a href="\1">\1</a>', string)
2714
2715
2716 class _OperationThread(threading.Thread):
2717 """Manage async callbacks, and log subthread exceptions."""
2718- # main.py will replace this with a reference to the mainloop.quit method
2719- shutdown = lambda: log.error('Failed to exit friends-dispatcher main loop')
2720
2721 def __init__(self, *args, id=None, success=STUB, failure=STUB, **kws):
2722 self._id = id
2723@@ -173,13 +166,6 @@
2724 log.debug('{} has completed in {:.2f}s, thread exiting.'.format(
2725 self._id, elapsed))
2726
2727- # If this is the last thread to exit, then the refresh is
2728- # completed and we should save the model, and then exit.
2729- with _exit_lock:
2730- if threading.activeCount() < 3:
2731- persist_model()
2732- GLib.idle_add(self.shutdown)
2733-
2734
2735 class Base:
2736 """Parent class for any protocol plugin such as Facebook or Twitter.
2737@@ -213,6 +199,8 @@
2738
2739 def __init__(self, account):
2740 self._account = account
2741+ self._Name = self.__class__.__name__
2742+ self._name = self._Name.lower()
2743
2744 def _whoami(self, result):
2745 """Use OAuth login results to identify the authenticating user.
2746@@ -240,7 +228,7 @@
2747 """
2748 raise NotImplementedError(
2749 '{} protocol has no _whoami() method.'.format(
2750- self.__class__.__name__))
2751+ self._Name))
2752
2753 def receive(self):
2754 """Poll the social network for new messages.
2755@@ -279,7 +267,7 @@
2756 """
2757 raise NotImplementedError(
2758 '{} protocol has no receive() method.'.format(
2759- self.__class__.__name__))
2760+ self._Name))
2761
2762 def __call__(self, operation, *args, success=STUB, failure=STUB, **kwargs):
2763 """Call an operation, i.e. a method, with arguments in a sub-thread.
2764@@ -311,7 +299,7 @@
2765 raise NotImplementedError(operation)
2766 method = getattr(self, operation)
2767 _OperationThread(
2768- id='{}.{}'.format(self.__class__.__name__, operation),
2769+ id='{}.{}'.format(self._Name, operation),
2770 target=method,
2771 success=success,
2772 failure=failure,
2773@@ -323,7 +311,7 @@
2774 """Return the number of rows in the Dee.SharedModel."""
2775 return len(Model)
2776
2777- def _publish(self, message_id, **kwargs):
2778+ def _publish(self, **kwargs):
2779 """Publish fresh data into the model, ignoring duplicates.
2780
2781 This method inserts a new full row into the Dee.SharedModel
2782@@ -352,35 +340,31 @@
2783 present. Otherwise, False is returned if the message could not be
2784 appended.
2785 """
2786- # Initialize the row of arguments to contain the message_ids value.
2787- # The column value is a list of lists (see friends/utils/model.py for
2788- # details), and because the arguments are themselves a list, this gets
2789- # initialized as a triply-nested list.
2790- triple = [self.__class__.__name__.lower(),
2791- self._account.id,
2792- message_id]
2793- args = [[triple]]
2794- # Now iterate through all the column names listed in the SCHEMA,
2795- # except for the first, since we just composed its value in the
2796- # preceding line. Pop matching column values from the kwargs, in the
2797- # order which they appear in the SCHEMA. If any are left over at the
2798- # end of this, raise a TypeError indicating the unexpected column
2799- # names.
2800- #
2801- # Missing column values default to the empty string.
2802- for column_name, column_type in SCHEMA[1:]:
2803+ # These bits don't need to be set by the caller; we can infer them.
2804+ kwargs.update(
2805+ dict(
2806+ protocol=self._name,
2807+ account_id=self._account.id
2808+ )
2809+ )
2810+ # linkify the message
2811+ kwargs['message'] = linkify_string(kwargs.get('message', ''))
2812+ args = []
2813+ # Now iterate through all the column names listed in the
2814+ # SCHEMA, and pop matching column values from the kwargs, in
2815+ # the order which they appear in the SCHEMA. If any are left
2816+ # over at the end of this, raise a TypeError indicating the
2817+ # unexpected column names.
2818+ for column_name, column_type in SCHEMA:
2819 args.append(kwargs.pop(column_name, DEFAULTS[column_type]))
2820 if len(kwargs) > 0:
2821 raise TypeError('Unexpected keyword arguments: {}'.format(
2822 COMMA_SPACE.join(sorted(kwargs))))
2823 with _publish_lock:
2824- # Don't let duplicate messages into the model, but do record the
2825- # unique message ids of each duplicate message.
2826- key = _make_key(args)
2827- row_idx = _seen_messages.get(key)
2828- if row_idx is None:
2829- # We haven't seen this message before.
2830- _seen_messages[key] = Model.get_position(Model.append(*args))
2831+ message_id = args[ID_IDX]
2832+ # Don't let duplicate messages into the model
2833+ if message_id not in _seen_ids:
2834+ _seen_ids[message_id] = Model.get_position(Model.append(*args))
2835 # I think it's safe not to notify the user about
2836 # messages that they sent themselves...
2837 if not args[FROM_ME_IDX] and self._do_notify(args[STREAM_IDX]):
2838@@ -389,25 +373,7 @@
2839 args[MESSAGE_IDX],
2840 args[AVATAR_IDX],
2841 )
2842- else:
2843- # We have seen this before, so append to the matching column's
2844- # message_ids list, this message's id.
2845- row = Model.get_row(Model.get_iter_at_row(row_idx))
2846- # Remember that row[IDS] is the nested list-of-lists of
2847- # message_ids. args[IDS] is the nested list-of-lists for the
2848- # message that we're publishing. The outer list of the latter
2849- # will always be of size 1. We want to take the inner list
2850- # from args and append it to the list-of-lists (i.e.
2851- # message_ids) of the row already in the model. To make sure
2852- # the model gets updated, we need to insert into the row, thus
2853- # it's best to concatenate the two lists together and store it
2854- # back into the column.
2855- if triple not in row[IDS_IDX]:
2856- row[IDS_IDX] = row[IDS_IDX] + args[IDS_IDX]
2857- # Tuple-ize triple because lists, being mutable, cannot be used as
2858- # dictionary keys.
2859- _seen_ids[tuple(triple)] = _seen_messages.get(key)
2860- return key in _seen_messages
2861+ return message_id in _seen_ids
2862
2863 def _unpublish(self, message_id):
2864 """Remove message_id from the Dee.SharedModel.
2865@@ -416,45 +382,24 @@
2866 published.
2867 :type message_id: string
2868 """
2869- triple = (self.__class__.__name__.lower(),
2870- self._account.id,
2871- message_id)
2872- log.debug('Unpublishing {}!'.format(triple))
2873+ log.debug('Unpublishing {}!'.format(message_id))
2874
2875- row_idx = _seen_ids.pop(triple, None)
2876+ row_idx = _seen_ids.pop(message_id, None)
2877 if row_idx is None:
2878 raise FriendsError('Tried to delete an invalid message id.')
2879
2880- row_iter = Model.get_iter_at_row(row_idx)
2881- row = Model.get_row(row_iter)
2882-
2883- if len(row[IDS_IDX]) == 1:
2884- # Message only exists on one protocol, delete it
2885- del _seen_messages[_make_key(row)]
2886- Model.remove(row_iter)
2887- # Shift our cached indexes up one, when one gets deleted.
2888- for key, value in _seen_ids.items():
2889- if value > row_idx:
2890- _seen_ids[key] = value - 1
2891- else:
2892- # Message exists on other protocols too, only drop id
2893- row[IDS_IDX] = [ids for ids
2894- in row[IDS_IDX]
2895- if ids[-1] != message_id]
2896-
2897- def _unpublish_all(self):
2898- """Remove all of this account's messages from the Model.
2899-
2900- Saves the Model to disk after it is done purging rows."""
2901- for triple in _seen_ids.copy():
2902- if self._account.id in triple:
2903- self._unpublish(triple[-1])
2904- persist_model()
2905+ Model.remove(Model.get_iter_at_row(row_idx))
2906+
2907+ # Shift our cached indexes up one, when one gets deleted.
2908+ for key, value in _seen_ids.items():
2909+ if value > row_idx:
2910+ _seen_ids[key] = value - 1
2911
2912 def _get_access_token(self):
2913 """Return an access token, logging in if necessary.
2914
2915- :return: The access_token, if we are successfully logged in."""
2916+ :return: The access_token, if we are successfully logged in.
2917+ """
2918 if self._account.access_token is None:
2919 self._login()
2920
2921@@ -512,15 +457,14 @@
2922 subthread needs to log in. You do not have to worry about
2923 subthread race conditions inside this method.
2924 """
2925- protocol = self.__class__.__name__
2926 log.debug('{} to {}'.format(
2927- 'Re-authenticating' if old_token else 'Logging in', protocol))
2928+ 'Re-authenticating' if old_token else 'Logging in', self._Name))
2929
2930 result = Authentication(self._account).login()
2931
2932 self._account.access_token = result.get('AccessToken')
2933 self._whoami(result)
2934- log.debug('{} UID: {}'.format(protocol, self._account.user_id))
2935+ log.debug('{} UID: {}'.format(self._Name, self._account.user_id))
2936
2937 def _get_oauth_headers(self, method, url, data=None, headers=None):
2938 """Basic wrapper around oauthlib that we use for Twitter and Flickr."""
2939@@ -559,6 +503,16 @@
2940 message = None
2941 raise FriendsError(message or str(error))
2942
2943+ def _fetch_cell(self, message_id, column_name):
2944+ """Find a column value associated with a specific message_id."""
2945+ row_id = _seen_ids.get(message_id)
2946+ col_idx = COLUMN_INDICES.get(column_name)
2947+ if None not in (row_id, col_idx):
2948+ row = Model.get_row(row_id)
2949+ return row[col_idx]
2950+ else:
2951+ raise FriendsError('Value could not be found.')
2952+
2953 def _new_book_client(self, source):
2954 client = EBook.BookClient.new(source)
2955 client.open_sync(False, None)
2956
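The linkification machinery added to base.py is self-contained enough to exercise on its own. This sketch re-creates the regex verbatim from the hunk above so its edge cases (trailing punctuation, already-linkified text) can be checked in isolation:

```python
import re

# Re-creation of the LINKIFY_REGEX added in the base.py hunk above.
LINKIFY_REGEX = re.compile(
    r"""
    (?<![\"\>])             # skip URLs already inside an href or tag
    (                       # capture the URL itself
      (?:(?:https?|ftp)://|www\.)
      (?:\S+?)              # non-greedy: stop as early as the lookahead allows
    )
    (?=
      [.,!?\)]*             # trailing punctuation is excluded from the link
      (?:\s|$)              # ...when followed by whitespace or end-of-string
      (?!</a>)              # abort if this would re-linkify an existing link
    )""",
    flags=re.VERBOSE).sub


def linkify_string(string):
    """Find all URLs in a string and turn them into HTML links."""
    return LINKIFY_REGEX(r'<a href="\1">\1</a>', string)
```

Because the lookbehind and the `</a>` lookahead both reject already-wrapped URLs, the function is idempotent: running it twice produces the same output as running it once.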
2957=== modified file 'friends/utils/model.py'
2958--- friends/utils/model.py 2013-02-05 01:11:35 +0000
2959+++ friends/utils/model.py 2013-03-20 13:27:53 +0000
2960@@ -41,26 +41,11 @@
2961 log = logging.getLogger(__name__)
2962
2963
2964-# Most of this schema is very straightforward, but the 'message_ids' column
2965-# needs a bit of explanation:
2966-#
2967-# It is a two-dimensional array (ie, an array of arrays). Each inner
2968-# array contains three elements: the name of the protocol
2969-# (introspected from the name of the class that implements the
2970-# protocol), the account_id as a string (like '6' or '3'), followed by
2971-# the message_id for that particular service.
2972-#
2973-# Then, there will be one of these triples present for every service on which
2974-# the message exists. So for example, if the user posts the same message to
2975-# both facebook and twitter, that message will appear as a single row in this
2976-# schema, and the 'message_ids' column will look something like this:
2977-#
2978-# [
2979-# ['facebook', '2', '12345'],
2980-# ['twitter', '3', '987654'],
2981-# ]
2982+# DO NOT EDIT THIS WITHOUT ADJUSTING service.vala IN LOCKSTEP
2983 SCHEMA = (
2984- ('message_ids', 'aas'),
2985+ ('protocol', 's'), # Same as UOA 'provider_name'
2986+ ('account_id', 't'), # Same as UOA account id
2987+ ('message_id', 's'),
2988 ('stream', 's'),
2989 ('sender', 's'),
2990 ('sender_id', 's'),
2991@@ -70,7 +55,7 @@
2992 ('message', 's'),
2993 ('icon_uri', 's'),
2994 ('url', 's'),
2995- ('likes', 'd'),
2996+ ('likes', 't'),
2997 ('liked', 'b'),
2998 ('link_picture', 's'),
2999 ('link_name', 's'),
3000@@ -78,6 +63,9 @@
3001 ('link_desc', 's'),
3002 ('link_caption', 's'),
3003 ('link_icon', 's'),
3004+ ('location', 's'),
3005+ ('latitude', 'd'),
3006+ ('longitude', 'd'),
3007 )
3008
3009
3010@@ -91,6 +79,7 @@
3011 'b': False,
3012 's': '',
3013 'd': 0,
3014+ 't': 0,
3015 }
3016
3017
3018
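Each schema column pairs a name with a one-letter GVariant type code, and DEFAULTS maps every code to a neutral value. The rewritten `_publish()` in base.py builds a row by popping kwargs in schema order and rejecting leftovers; a condensed sketch of that mechanism (the schema is truncated here for brevity):

```python
# Abbreviated copy of the schema above; 's' = string, 't' = uint64,
# 'b' = boolean, 'd' = double.
SCHEMA = (
    ('protocol', 's'),
    ('account_id', 't'),
    ('message_id', 's'),
    ('from_me', 'b'),
    ('message', 's'),
    ('likes', 't'),
    ('latitude', 'd'),
)

DEFAULTS = {'b': False, 's': '', 'd': 0, 't': 0}


def build_row(**kwargs):
    """Pop kwargs in SCHEMA order, defaulting any missing columns."""
    args = [kwargs.pop(name, DEFAULTS[code]) for name, code in SCHEMA]
    if kwargs:
        # Anything left over was not a real column name.
        raise TypeError('Unexpected keyword arguments: {}'.format(
            ', '.join(sorted(kwargs))))
    return args
```

This is why missing columns silently become `''`, `0`, or `False`, while a misspelled column name fails loudly with a TypeError.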
3019=== modified file 'service/configure.ac'
3020--- service/configure.ac 2013-02-07 23:36:14 +0000
3021+++ service/configure.ac 2013-03-20 13:27:53 +0000
3022@@ -18,6 +18,7 @@
3023
3024 DEE_REQUIRED=1.0.0
3025 PKG_CHECK_MODULES(BASE,
3026+ libaccounts-glib
3027 gio-2.0
3028 dee-1.0 >= $DEE_REQUIRED)
3029
3030
3031=== modified file 'service/src/Makefile.am'
3032--- service/src/Makefile.am 2013-02-04 19:42:37 +0000
3033+++ service/src/Makefile.am 2013-03-20 13:27:53 +0000
3034@@ -2,14 +2,15 @@
3035 friends-service
3036
3037 INCLUDES = \
3038- $(BASE_CFLAGS)
3039+ $(BASE_CFLAGS)
3040
3041 VALAFLAGS = \
3042+ --pkg accounts \
3043 --pkg dee-1.0 \
3044 --pkg gio-2.0
3045
3046 friends_service_LDADD = \
3047- $(BASE_LIBS)
3048+ $(BASE_LIBS)
3049
3050 friends_service_SOURCES = \
3051 service.vala
3052
3053=== modified file 'service/src/service.vala'
3054--- service/src/service.vala 2013-02-20 13:24:44 +0000
3055+++ service/src/service.vala 2013-03-20 13:27:53 +0000
3056@@ -16,6 +16,7 @@
3057 * Authored by Ken VanDine <ken.vandine@canonical.com>
3058 */
3059
3060+using Ag;
3061
3062 [DBus (name = "com.canonical.Friends.Dispatcher")]
3063 private interface Dispatcher : GLib.Object {
3064@@ -35,11 +36,34 @@
3065 private Dee.Model model;
3066 private Dee.SharedModel shared_model;
3067 private unowned Dee.ResourceManager resources;
3068+ private Ag.Manager acct_manager;
3069 private Dispatcher dispatcher;
3070 public int interval { get; set; }
3071
3072 public Master ()
3073 {
3074+ acct_manager = new Ag.Manager.for_service_type ("microblogging");
3075+ acct_manager.account_deleted.connect ((manager, account_id) => {
3076+ debug ("Account %u deleted from UOA, purging...", account_id);
3077+ uint purged = 0;
3078+ uint rows = model.get_n_rows ();
3079+ // Destructively iterate over the Model from back to
3080+ // front; I know "i < rows" looks kinda goofy here,
3081+ // but what's happening is that i is unsigned, so once
3082+ // it hits 0, i-- will overflow to a very large
3083+ // number, and then "i < rows" will fail, stopping the
3084+ // iteration at index 0.
3085+ for (uint i = rows - 1; i < rows; i--) {
3086+ var itr = model.get_iter_at_row (i);
3087+ if (model.get_uint64 (itr, 1) == account_id) {
3088+ model.remove (itr);
3089+ purged++;
3090+ }
3091+ }
3092+ debug ("Purged %u rows.", purged);
3093+ }
3094+ );
3095+
3096 resources = Dee.ResourceManager.get_default ();
3097 model = new Dee.SequenceModel ();
3098 Dee.SequenceModel? _m = null;
3099@@ -50,24 +74,29 @@
3100 debug ("Failed to load model from resource manager: %s", e.message);
3101 }
3102
3103- string[] SCHEMA = {"aas",
3104- "s",
3105- "s",
3106- "s",
3107- "s",
3108- "b",
3109+ string[] SCHEMA = {"s",
3110+ "t",
3111+ "s",
3112+ "s",
3113+ "s",
3114+ "s",
3115+ "s",
3116+ "b",
3117+ "s",
3118+ "s",
3119+ "s",
3120+ "s",
3121+ "t",
3122+ "b",
3123+ "s",
3124+ "s",
3125+ "s",
3126 "s",
3127 "s",
3128 "s",
3129 "s",
3130 "d",
3131- "b",
3132- "s",
3133- "s",
3134- "s",
3135- "s",
3136- "s",
3137- "s"};
3138+ "d"};
3139
3140 bool schemaReset = false;
3141
3142@@ -77,7 +106,7 @@
3143 // Compare columns from cached model's schema
3144 string[] _SCHEMA = _m.get_schema ();
3145 if (_SCHEMA.length != SCHEMA.length)
3146- schemaReset = true;
3147+ schemaReset = true;
3148 else
3149 {
3150 for (int i=0; i < _SCHEMA.length;i++ )
3151@@ -87,7 +116,6 @@
3152 debug ("SCHEMA MISMATCH");
3153 schemaReset = true;
3154 }
3155-
3156 }
3157 }
3158 if (!schemaReset)
3159
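The Vala purge loop above walks the model destructively from back to front, leaning on unsigned underflow to terminate ("i < rows" fails once `i` wraps past zero). The same back-to-front idea, sketched in Python with an ordinary countdown (plain lists stand in for the Dee model; column 1 holds the account id, matching the schema):

```python
def purge_account(rows, account_id, acct_idx=1):
    """Remove every row belonging to account_id; return the purge count.

    Iterating from the last index down to 0 means deletions never shift
    the indexes of rows we have not visited yet.
    """
    purged = 0
    for i in range(len(rows) - 1, -1, -1):
        if rows[i][acct_idx] == account_id:
            del rows[i]
            purged += 1
    return purged
```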
3160=== modified file 'setup.py'
3161--- setup.py 2013-02-05 01:11:35 +0000
3162+++ setup.py 2013-03-20 13:27:53 +0000
3163@@ -29,6 +29,7 @@
3164 include_package_data=True,
3165 package_data = {
3166 'friends.service.templates': ['*.service.in'],
3167+ 'friends.tests.data': ['*.dat'],
3168 },
3169 data_files = [
3170 ('/usr/share/glib-2.0/schemas',
3171
3172=== modified file 'tools/debug_live.py'
3173--- tools/debug_live.py 2013-02-19 09:27:53 +0000
3174+++ tools/debug_live.py 2013-03-20 13:27:53 +0000
3175@@ -59,7 +59,7 @@
3176 Model.connect('row-added', row_added)
3177
3178 for account in a._accounts.values():
3179- if account.protocol.__class__.__name__.lower() == protocol.lower():
3180+ if account.protocol._name == protocol.lower():
3181 found = True
3182 account.protocol(*args)
3183
