[urlgrab] Not handling posting exceptions well
Affects: Ibid
Status: Fix Released
Importance: Low
Assigned to: Stefano Rivera
Milestone: (none)
Bug Description
Query: aoeu oeu osth.com eoeiiuiu
ERROR:scripts.
Traceback (most recent call last):
File "scripts/
processor.
File "./ibid/
method(event, *match.groups())
File "./ibid/
self.
File "./ibid/
resp = opener.
File "/usr/lib/
response = self._open(req, data)
File "/usr/lib/
'_open', req)
File "/usr/lib/
result = func(*args)
File "/usr/lib/
return self.do_
File "/usr/lib/
raise URLError(err)
URLError: <urlopen error [Errno 110] Connection timed out>
sqlite> select * from urls;
sqlite>
The URL didn't get stored, due to the exception.
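The shape of the bug is that the fetch happens before the URL is recorded, so a URLError (here a connection timeout) aborts the whole handler. A minimal sketch of the repair, storing first and treating the fetch as best-effort; `store` and `fetch` are hypothetical stand-ins for the plugin's database insert and HTTP request, not ibid's actual API:

```python
from urllib.error import URLError


def grab_url(url, store, fetch):
    """Record a seen URL, then best-effort fetch its page.

    `store` and `fetch` are illustrative callables, not ibid's real
    plugin interface.
    """
    # Store first, so a network failure can't lose the URL.
    store(url)
    try:
        return fetch(url)
    except URLError:
        # Connection timed out or refused: skip the title lookup;
        # the URL itself is already in the table.
        return None
```

With this ordering, the `select * from urls` above would show the row even when the remote host never answers.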
Related branches
- marcog (community): Approve
- Jonathan Hitchcock: Approve
- Max Rabkin: Approve
Diff: 40 lines (+3/-9), 1 file modified: ibid/plugins/urlgrab.py (+3/-9)
Changed in ibid:
status: In Progress → Fix Committed
Changed in ibid:
status: Fix Committed → Fix Released
I'm sure we used to handle this correctly.
Also, considering how often this slows the bot's responses down, I think it would be a good idea to move the title determination and the delicious posting into an asynchronous action.
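The asynchronous suggestion could be sketched as a single daemon worker thread fed from a queue, so the event handler returns immediately instead of waiting on the network. `start_worker` and `handler` are illustrative names, not ibid's actual plugin interface:

```python
import threading
from queue import Queue


def start_worker(handler):
    """Run `handler(url)` for each queued URL on a background thread.

    Illustrative only: in the plugin, `handler` would wrap the title
    lookup and the delicious post. A real implementation would also
    want a shutdown sentinel.
    """
    q = Queue()

    def worker():
        while True:
            url = q.get()
            try:
                handler(url)
            except Exception:
                pass  # a slow or broken lookup must never block the bot
            finally:
                q.task_done()

    threading.Thread(target=worker, daemon=True).start()
    return q
```

The event handler then just does `q.put(url)` and replies without waiting; a timeout like the one in the traceback above only delays the background thread.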