Merge lp:~josejuan05/parcel-tracker/trunk into lp:parcel-tracker

Proposed by josejuan05
Status: Merged
Merged at revision: 507
Proposed branch: lp:~josejuan05/parcel-tracker/trunk
Merge into: lp:parcel-tracker
Diff against target: 26 lines (+9/-1)
1 file modified
parcel_tracker_lib/postservices/uspscom.py (+9/-1)
To merge this branch: bzr merge lp:~josejuan05/parcel-tracker/trunk
Reviewer: Vsevolod Velichko (Approve)
Review via email: mp+340570@code.launchpad.net

Description of the change

The USPS postservice has been failing for me for the past several months. The problem appears to be that when a request hits USPS's Akamai endpoint without cookies enabled, the Akamai server returns a 403(?) error.
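For illustration only, here is a minimal sketch of keeping cookies across requests with the Python 3 standard library. It is not the project's actual fetch code; the real patch simply passes use_cookies=True to the tracker's own _fetch_url helper, as shown in the diff below.

    # Hypothetical sketch, not parcel-tracker's _fetch_url: it only shows the
    # general idea of accepting and resending cookies so the Akamai front end
    # does not reject the request.
    import urllib.request
    from http.cookiejar import CookieJar

    def fetch_with_cookies(url):
        """Fetch a URL while storing any cookies the server sets and
        sending them back on redirects and follow-up requests."""
        jar = CookieJar()
        opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
        request = urllib.request.Request(url, headers={'User-Agent': 'Mozilla/5.0'})
        with opener.open(request) as response:
            return response.read()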

Enabling cookies solved the problem for all but one tracking number (9400111699000432487028), for which no location was returned.

For that number USPS returned a status but no location, so the parsing code, which splits each entry on a newline and treats the first line as the status and the second as the location, failed: with no newline there was no second array entry. The proposed code change isn't beautiful, and I'd gladly accept a better solution.
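As a standalone illustration of the guard (the function name and sample entry text below are made up, not the project's code), the fix amounts to checking how many lines the entry has before indexing:

    def split_status_location(entry):
        """Split one tracking-history entry into (status, location).

        Entries normally look like "status\nlocation", but some carry only a
        status line; in that case fall back to an empty location instead of
        raising IndexError on the missing second element.
        """
        lines = entry.strip().split('\n')
        status = lines[0].strip()
        location = lines[1].strip() if len(lines) > 1 else ''
        return status, location

    # Usage with made-up entry text:
    print(split_status_location('Delivered, In/At Mailbox\nSANTA CLARA, CA 95051'))
    print(split_status_location('In Transit to Next Facility'))  # no location line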

Vsevolod Velichko (torkvemada):
review: Approve
Vsevolod Velichko (torkvemada) wrote:

Great, thanks for your patch. Merged into 18.03.1

Preview Diff

=== modified file 'parcel_tracker_lib/postservices/uspscom.py'
--- parcel_tracker_lib/postservices/uspscom.py 2017-11-17 21:46:49 +0000
+++ parcel_tracker_lib/postservices/uspscom.py 2018-03-04 08:05:01 +0000
@@ -32,6 +32,11 @@
     name = "USPS.com"
     url = 'https://tools.usps.com/go/TrackConfirmAction?tLabels=%(number)s'

+    def _get_page(self):
+        fields = {'number': self.number}
+        url = self.url % fields
+        return self._fetch_url(url, data=None, headers={}, use_cookies=True)
+
     def _parse_page(self, html):
         html = html.decode('utf-8', 'ignore')
         res = re.search(r'<div id="trackingHistory_1"[^>]*>(.*?)END Body content for \'Tracking History\'', html, re.DOTALL | re.UNICODE)
@@ -49,6 +54,9 @@
         remainder = re.sub(r'\n\s*\n', '\n', unescape(untagify(remainder)).strip(), re.UNICODE)
         remainder = remainder.split('\n')
         operation = remainder[0].strip()
-        location = remainder[1].strip()
+        if len(remainder)>1:
+            location = remainder[1].strip()
+        else:
+            location=''
         result.append((operation, opdate, location))
         return result
