Merge lp:~alecu/ubuntu-sso-client/proxy-integration-tests into lp:ubuntu-sso-client
Status: Merged
Approved by: Alejandro J. Cura
Approved revision: 838
Merged at revision: 831
Proposed branch: lp:~alecu/ubuntu-sso-client/proxy-integration-tests
Merge into: lp:ubuntu-sso-client
Diff against target: 726 lines (+419/-30), 8 files modified
  ubuntu_sso/utils/webclient/common.py (+14/-3)
  ubuntu_sso/utils/webclient/gsettings.py (+63/-0)
  ubuntu_sso/utils/webclient/libsoup.py (+32/-4)
  ubuntu_sso/utils/webclient/qtnetwork.py (+32/-4)
  ubuntu_sso/utils/webclient/tests/test_gsettings.py (+131/-0)
  ubuntu_sso/utils/webclient/tests/test_webclient.py (+88/-16)
  ubuntu_sso/utils/webclient/tests/webclient_demo.py (+51/-0)
  ubuntu_sso/utils/webclient/txweb.py (+8/-3)
To merge this branch: bzr merge lp:~alecu/ubuntu-sso-client/proxy-integration-tests
Related bugs:
Reviewer: Manuel de la Peña (community), status: Approve
Reviewer: Natalia Bidart (community), status: Approve
Review via email: mp+87538@code.launchpad.net
Description of the change
Proxy aware webclient with QtNetwork and libsoup backends, and integration tests. (LP: #884963 LP: #884968 LP: #884970 LP: #884971)
This branch also forces the webclient to take unicode IRIs (instead of ascii URIs) (LP: #911844)
NOTE: this branch needs the squid support from the following branch:
* lp:~mandel/ubuntuone-dev-tools/proxy-testcase
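The IRI change (LP: #911844) means request() now rejects byte strings and ASCII-encodes the address itself; in the branch this is done with httplib2's iri2uri. A minimal standard-library stand-in for that idea (the helper name and error message below are illustrative, not the branch's exact code; it also drops any userinfo and fragment, which the real iri2uri handles):

```python
# Minimal sketch of IRI -> URI conversion, a stand-in for the
# httplib2.iri2uri call used in the branch: the host is IDNA-encoded
# and the path/query are percent-encoded down to plain ASCII.
from urllib.parse import quote, urlsplit, urlunsplit

def iri_to_uri(iri):
    """Turn a unicode IRI into an ASCII URI, rejecting byte strings."""
    if not isinstance(iri, str):
        raise TypeError("request() takes a unicode IRI, not bytes")
    parts = urlsplit(iri)
    netloc = parts.hostname.encode("idna").decode("ascii") if parts.hostname else ""
    if parts.port:
        netloc += ":%d" % parts.port
    return urlunsplit((
        parts.scheme,
        netloc,
        quote(parts.path, safe="/%"),    # e.g. /café -> /caf%C3%A9
        quote(parts.query, safe="=&%"),
        "",
    ))
```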
Alejandro J. Cura (alecu) wrote:
Natalia Bidart (nataliabidart) wrote:
I have this PEP8 issue:
./ubuntu_
Manuel de la Peña (mandel) wrote:
With squid3 the tests won't pass. The solution is to make the MockServer time out the connection that the proxy keeps open. Please take a look at ubuntuone-dev-tools for an example.
Alejandro J. Cura (alecu) wrote:
> I have this PEP8 issue:
>
> ./ubuntu_
> before ')'
I can't reproduce that on my Precise installation.
Anyway, I'm rewriting that bit of code so it looks better.
836. By Alejandro J. Cura

    Forcefully disconnect squid3 that uses HTTP/1.1 and lint fixes
Manuel de la Peña (mandel) wrote:
I don't understand why GSETTINGS_CMDLINE is stored as a single string if you are going to call .split() on it. Isn't it clearer to store it as it will be used?
Alejandro J. Cura (alecu) wrote:
> I don't understand why GSETTINGS_CMDLINE is stored as a single string if you
> are going to call .split() on it. Isn't it clearer to store it as it will
> be used?
I think it's clearer to store it as just a string; to me, using a list of strings for this adds noise to the code.
Also, if we ever need to change that command line, it's easier to change it this way.
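For reference, the pattern under discussion, as it appears in gsettings.py in the diff below, can be sketched like this (simplified: the real module goes on to map the flat keys into host/port/auth settings):

```python
# The command line is stored as one readable string and split into an
# argv list only at the call site, mirroring gsettings.py from this branch.
GSETTINGS_CMDLINE = "gsettings list-recursively org.gnome.system.proxy"

def parse_gsettings_output(output):
    """Parse `gsettings list-recursively` output into a flat dict."""
    settings = {}
    for line in output.split("\n"):
        try:
            path, key, value = line.split(" ", 2)
        except ValueError:
            continue  # skip blank or malformed lines
        if value.startswith("'"):
            parsed = value[1:-1]          # quoted string
        elif value.startswith("["):
            parsed = value                # list, kept as raw text
        elif value in ("true", "false"):
            parsed = value == "true"      # boolean
        else:
            parsed = int(value)           # plain integer
        settings[path + "." + key] = parsed
    return settings
```

The argv list only exists at the point of use, as in the branch: `subprocess.check_output(GSETTINGS_CMDLINE.split())`.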
Natalia Bidart (nataliabidart) wrote:
Module docstring for ubuntu_
"""A webclient backend that uses QtNetwork."""
I would guess this is a copy-and-paste typo?
837. By Alejandro J. Cura

    fix for a docstring
Natalia Bidart (nataliabidart) wrote:
So, I ran the suite on Windows and I'm getting:

    ubuntu_
    WebClientTestCase
    test_get_iri ... Traceback (most recent call last):
    Failure: twisted.
    test_webclient.
    unning at 8.0 secs
    [ERROR]
    test_
    Failure: twisted.
    test_webclient.
    ror) still running at 8.0 secs
    [ERROR]
    test_
    Failure: twisted.
    DelayedCalls: (set twisted.
    <DelayedCall 0x425ea08 [0.22000002861s] called=0 cancelled=0 SaveSite._updateLogDateTime()>
    <DelayedCall 0x425eaf8 [5.21399998665s] called=0 cancelled=0 onTimeout(<Deferred at 0x425ea30>)>
    [ERROR]
    Traceback (most recent call last):
    Failure: twisted.
    Selectables:
    <<class 'twisted.
    bclient.SaveSite on 50543>
    [ERROR]
Alejandro J. Cura (alecu) wrote:
I tried increasing the test timeout on that TestCase from 8 to 28 seconds, and now all tests pass.
I don't think that's a clean solution, so tomorrow I'll look for a better one.
838. By Alejandro J. Cura

    127.0.0.1 instead of localhost, because tests on windows timeout before resolving
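The r838 fix works because connecting to the literal address 127.0.0.1 never touches the resolver, whereas "localhost" forced a name lookup that on Windows could outlast the 8-second test timeout. A small illustration of binding to an ephemeral port and building the test IRI, using plain sockets rather than the Twisted MockWebServer from the diff:

```python
import socket

def get_test_iri(sock):
    """Build the IRI for a test server bound to an ephemeral port.

    127.0.0.1 instead of "localhost": dialing the literal address
    skips name resolution entirely.
    """
    port = sock.getsockname()[1]
    return u"http://127.0.0.1:%d/" % port

server = socket.socket()
server.bind(("127.0.0.1", 0))  # port 0: the OS picks a free port
server.listen(1)
print(get_test_iri(server))
server.close()
```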
Natalia Bidart (nataliabidart) wrote:
Looks great now!
Manuel de la Peña (mandel) wrote:
Ran the tests on Ubuntu Precise and Windows 7; all pass as expected. Code looks great.
Preview Diff
1 | === modified file 'ubuntu_sso/utils/webclient/common.py' |
2 | --- ubuntu_sso/utils/webclient/common.py 2011-12-13 19:26:16 +0000 |
3 | +++ ubuntu_sso/utils/webclient/common.py 2012-01-11 04:40:28 +0000 |
4 | @@ -17,6 +17,7 @@ |
5 | |
6 | import time |
7 | |
8 | +from httplib2 import iri2uri |
9 | from oauth import oauth |
10 | from twisted.internet import defer |
11 | |
12 | @@ -46,7 +47,7 @@ |
13 | self.username = username |
14 | self.password = password |
15 | |
16 | - def request(self, url, method="GET", extra_headers=None, |
17 | + def request(self, iri, method="GET", extra_headers=None, |
18 | oauth_credentials=None): |
19 | """Return a deferred that will be fired with a Response object.""" |
20 | raise NotImplementedError |
21 | @@ -57,8 +58,18 @@ |
22 | # TODO: get the synchronized timestamp |
23 | return defer.succeed(time.time()) |
24 | |
25 | + def force_use_proxy(self, settings): |
26 | + """Setup this webclient to use the given proxy settings.""" |
27 | + raise NotImplementedError |
28 | + |
29 | + def iri_to_uri(self, iri): |
30 | + """Transform a unicode iri into a ascii uri.""" |
31 | + if not isinstance(iri, unicode): |
32 | + raise TypeError |
33 | + return bytes(iri2uri(iri)) |
34 | + |
35 | @staticmethod |
36 | - def build_oauth_headers(method, url, credentials, timestamp): |
37 | + def build_oauth_headers(method, uri, credentials, timestamp): |
38 | """Build an oauth request given some credentials.""" |
39 | consumer = oauth.OAuthConsumer(credentials["consumer_key"], |
40 | credentials["consumer_secret"]) |
41 | @@ -68,7 +79,7 @@ |
42 | if timestamp: |
43 | parameters["oauth_timestamp"] = timestamp |
44 | request = oauth.OAuthRequest.from_consumer_and_token( |
45 | - http_url=url, |
46 | + http_url=uri, |
47 | http_method=method, |
48 | parameters=parameters, |
49 | oauth_consumer=consumer, |
50 | |
51 | === added file 'ubuntu_sso/utils/webclient/gsettings.py' |
52 | --- ubuntu_sso/utils/webclient/gsettings.py 1970-01-01 00:00:00 +0000 |
53 | +++ ubuntu_sso/utils/webclient/gsettings.py 2012-01-11 04:40:28 +0000 |
54 | @@ -0,0 +1,63 @@ |
55 | +# -*- coding: utf-8 -*- |
56 | +# |
57 | +# Copyright 2011 Canonical Ltd. |
58 | +# |
59 | +# This program is free software: you can redistribute it and/or modify it |
60 | +# under the terms of the GNU General Public License version 3, as published |
61 | +# by the Free Software Foundation. |
62 | +# |
63 | +# This program is distributed in the hope that it will be useful, but |
64 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
65 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
66 | +# PURPOSE. See the GNU General Public License for more details. |
67 | +# |
68 | +# You should have received a copy of the GNU General Public License along |
69 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
70 | +"""Retrieve the proxy configuration from Gnome.""" |
71 | + |
72 | +import subprocess |
73 | + |
74 | +GSETTINGS_CMDLINE = "gsettings list-recursively org.gnome.system.proxy" |
75 | + |
76 | + |
77 | +def get_proxy_settings(): |
78 | + """Parse the proxy settings as returned by the gsettings executable.""" |
79 | + output = subprocess.check_output(GSETTINGS_CMDLINE.split()) |
80 | + gsettings = {} |
81 | + base_len = len("org.gnome.system.proxy.") |
82 | + # pylint: disable=E1103 |
83 | + for line in output.split("\n"): |
84 | + try: |
85 | + path, key, value = line.split(" ", 2) |
86 | + except ValueError: |
87 | + continue |
88 | + if value.startswith("'"): |
89 | + parsed_value = value[1:-1] |
90 | + elif value.startswith('['): |
91 | + parsed_value = value |
92 | + elif value in ('true', 'false'): |
93 | + parsed_value = (value == 'true') |
94 | + else: |
95 | + parsed_value = int(value) |
96 | + relative_key = (path + "." + key)[base_len:] |
97 | + gsettings[relative_key] = parsed_value |
98 | + |
99 | + mode = gsettings["mode"] |
100 | + if mode == "none": |
101 | + settings = {} |
102 | + elif mode == "manual": |
103 | + settings = { |
104 | + "host": gsettings["http.host"], |
105 | + "port": gsettings["http.port"], |
106 | + } |
107 | + if gsettings["http.use-authentication"]: |
108 | + settings.update({ |
109 | + "username": gsettings["http.authentication-user"], |
110 | + "password": gsettings["http.authentication-password"], |
111 | + }) |
112 | + else: |
113 | + # If mode is automatic the PAC javascript should be interpreted |
114 | + # on each request. That is out of scope so it's ignored for now |
115 | + settings = {} |
116 | + |
117 | + return settings |
118 | |
119 | === modified file 'ubuntu_sso/utils/webclient/libsoup.py' |
120 | --- ubuntu_sso/utils/webclient/libsoup.py 2011-12-20 01:46:51 +0000 |
121 | +++ ubuntu_sso/utils/webclient/libsoup.py 2012-01-11 04:40:28 +0000 |
122 | @@ -17,6 +17,8 @@ |
123 | |
124 | import httplib |
125 | |
126 | +from collections import defaultdict |
127 | + |
128 | from twisted.internet import defer |
129 | |
130 | from ubuntu_sso.utils.webclient.common import ( |
131 | @@ -26,6 +28,9 @@ |
132 | WebClientError, |
133 | ) |
134 | |
135 | +URI_ANONYMOUS_TEMPLATE = "http://{host}:{port}/" |
136 | +URI_USERNAME_TEMPLATE = "http://{username}:{password}@{host}:{port}/" |
137 | + |
138 | |
139 | class WebClient(BaseWebClient): |
140 | """A webclient with a libsoup backend.""" |
141 | @@ -43,7 +48,11 @@ |
142 | def _on_message(self, session, message, d): |
143 | """Handle the result of an http message.""" |
144 | if message.status_code == httplib.OK: |
145 | - response = Response(message.response_body.data) |
146 | + headers = defaultdict(list) |
147 | + response_headers = message.get_property("response-headers") |
148 | + add_header = lambda key, value, _: headers[key].append(value) |
149 | + response_headers.foreach(add_header, None) |
150 | + response = Response(message.response_body.data, headers) |
151 | d.callback(response) |
152 | elif message.status_code == httplib.UNAUTHORIZED: |
153 | e = UnauthorizedError(message.reason_phrase) |
154 | @@ -58,9 +67,10 @@ |
155 | auth.authenticate(self.username, self.password) |
156 | |
157 | @defer.inlineCallbacks |
158 | - def request(self, url, method="GET", extra_headers=None, |
159 | + def request(self, iri, method="GET", extra_headers=None, |
160 | oauth_credentials=None): |
161 | """Return a deferred that will be fired with a Response object.""" |
162 | + uri = self.iri_to_uri(iri) |
163 | if extra_headers: |
164 | headers = dict(extra_headers) |
165 | else: |
166 | @@ -68,12 +78,12 @@ |
167 | |
168 | if oauth_credentials: |
169 | timestamp = yield self.get_timestamp() |
170 | - oauth_headers = self.build_oauth_headers(method, url, |
171 | + oauth_headers = self.build_oauth_headers(method, uri, |
172 | oauth_credentials, timestamp) |
173 | headers.update(oauth_headers) |
174 | |
175 | d = defer.Deferred() |
176 | - message = self.soup.Message.new(method, url) |
177 | + message = self.soup.Message.new(method, uri) |
178 | |
179 | for key, value in headers.iteritems(): |
180 | message.request_headers.append(key, value) |
181 | @@ -82,6 +92,24 @@ |
182 | response = yield d |
183 | defer.returnValue(response) |
184 | |
185 | + def force_use_proxy(self, settings): |
186 | + """Setup this webclient to use the given proxy settings.""" |
187 | + # pylint: disable=W0511 |
188 | + proxy_uri = self.get_proxy_uri(settings) |
189 | + self.session.set_property("proxy-uri", proxy_uri) |
190 | + |
191 | + def get_proxy_uri(self, settings): |
192 | + """Get a Soup.URI for the proxy, or None if disabled.""" |
193 | + if "host" in settings and "port" in settings: |
194 | + template = URI_ANONYMOUS_TEMPLATE |
195 | + if "username" in settings and "password" in settings: |
196 | + template = URI_USERNAME_TEMPLATE |
197 | + uri = template.format(**settings) |
198 | + return self.soup.URI.new(uri) |
199 | + else: |
200 | + # If the proxy host is not set, use no proxy |
201 | + return None |
202 | + |
203 | def shutdown(self): |
204 | """End the soup session for this webclient.""" |
205 | self.session.abort() |
206 | |
207 | === modified file 'ubuntu_sso/utils/webclient/qtnetwork.py' |
208 | --- ubuntu_sso/utils/webclient/qtnetwork.py 2011-12-13 19:26:16 +0000 |
209 | +++ ubuntu_sso/utils/webclient/qtnetwork.py 2012-01-11 04:40:28 +0000 |
210 | @@ -15,12 +15,17 @@ |
211 | # with this program. If not, see <http://www.gnu.org/licenses/>. |
212 | """A webclient backend that uses QtNetwork.""" |
213 | |
214 | +import sys |
215 | + |
216 | +from collections import defaultdict |
217 | + |
218 | from PyQt4.QtCore import ( |
219 | QCoreApplication, |
220 | QUrl, |
221 | ) |
222 | from PyQt4.QtNetwork import ( |
223 | QNetworkAccessManager, |
224 | + QNetworkProxy, |
225 | QNetworkReply, |
226 | QNetworkRequest, |
227 | ) |
228 | @@ -32,6 +37,7 @@ |
229 | UnauthorizedError, |
230 | WebClientError, |
231 | ) |
232 | +from ubuntu_sso.utils.webclient import gsettings |
233 | |
234 | |
235 | class WebClient(BaseWebClient): |
236 | @@ -44,12 +50,22 @@ |
237 | self.nam.finished.connect(self._handle_finished) |
238 | self.nam.authenticationRequired.connect(self._handle_authentication) |
239 | self.replies = {} |
240 | + self.setup_proxy() |
241 | + |
242 | + def setup_proxy(self): |
243 | + """Setup the proxy settings if needed.""" |
244 | + # QtNetwork knows how to use the system settings on both Win and Mac |
245 | + if sys.platform.startswith("linux"): |
246 | + settings = gsettings.get_proxy_settings() |
247 | + if settings: |
248 | + self.force_use_proxy(settings) |
249 | |
250 | @defer.inlineCallbacks |
251 | - def request(self, url, method="GET", extra_headers=None, |
252 | + def request(self, iri, method="GET", extra_headers=None, |
253 | oauth_credentials=None): |
254 | """Return a deferred that will be fired with a Response object.""" |
255 | - request = QNetworkRequest(QUrl(url)) |
256 | + uri = self.iri_to_uri(iri) |
257 | + request = QNetworkRequest(QUrl(uri)) |
258 | |
259 | if extra_headers: |
260 | headers = dict(extra_headers) |
261 | @@ -58,7 +74,7 @@ |
262 | |
263 | if oauth_credentials: |
264 | timestamp = yield self.get_timestamp() |
265 | - oauth_headers = self.build_oauth_headers(method, url, |
266 | + oauth_headers = self.build_oauth_headers(method, uri, |
267 | oauth_credentials, timestamp) |
268 | headers.update(oauth_headers) |
269 | |
270 | @@ -87,7 +103,10 @@ |
271 | d = self.replies.pop(reply) |
272 | error = reply.error() |
273 | if not error: |
274 | - response = Response(reply.readAll()) |
275 | + headers = defaultdict(list) |
276 | + for key, value in reply.rawHeaderPairs(): |
277 | + headers[str(key)].append(str(value)) |
278 | + response = Response(reply.readAll(), headers) |
279 | d.callback(response) |
280 | else: |
281 | if error == QNetworkReply.AuthenticationRequiredError: |
282 | @@ -96,6 +115,15 @@ |
283 | exception = WebClientError(reply.errorString()) |
284 | d.errback(exception) |
285 | |
286 | + def force_use_proxy(self, settings): |
287 | + """Setup this webclient to use the given proxy settings.""" |
288 | + proxy = QNetworkProxy(QNetworkProxy.HttpProxy, |
289 | + hostName=settings.get("host", ""), |
290 | + port=settings.get("port", 0), |
291 | + user=settings.get("username", ""), |
292 | + password=settings.get("password", "")) |
293 | + self.nam.setProxy(proxy) |
294 | + |
295 | def shutdown(self): |
296 | """Shut down all pending requests (if possible).""" |
297 | self.nam.deleteLater() |
298 | |
299 | === added file 'ubuntu_sso/utils/webclient/tests/test_gsettings.py' |
300 | --- ubuntu_sso/utils/webclient/tests/test_gsettings.py 1970-01-01 00:00:00 +0000 |
301 | +++ ubuntu_sso/utils/webclient/tests/test_gsettings.py 2012-01-11 04:40:28 +0000 |
302 | @@ -0,0 +1,131 @@ |
303 | +# -*- coding: utf-8 -*- |
304 | +# |
305 | +# Copyright 2011 Canonical Ltd. |
306 | +# |
307 | +# This program is free software: you can redistribute it and/or modify it |
308 | +# under the terms of the GNU General Public License version 3, as published |
309 | +# by the Free Software Foundation. |
310 | +# |
311 | +# This program is distributed in the hope that it will be useful, but |
312 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
313 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
314 | +# PURPOSE. See the GNU General Public License for more details. |
315 | +# |
316 | +# You should have received a copy of the GNU General Public License along |
317 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
318 | +"""Test the gsettings parser.""" |
319 | + |
320 | +from twisted.trial.unittest import TestCase |
321 | + |
322 | +from ubuntu_sso.utils.webclient import gsettings |
323 | + |
324 | +# Some settings are not used as described in: |
325 | +# https://bugzilla.gnome.org/show_bug.cgi?id=648237 |
326 | + |
327 | +TEMPLATE_GSETTINGS_OUTPUT = """\ |
328 | +org.gnome.system.proxy autoconfig-url '{autoconfig_url}' |
329 | +org.gnome.system.proxy ignore-hosts {ignore_hosts:s} |
330 | +org.gnome.system.proxy mode '{mode}' |
331 | +org.gnome.system.proxy.ftp host '{ftp_host}' |
332 | +org.gnome.system.proxy.ftp port {ftp_port} |
333 | +org.gnome.system.proxy.http authentication-password '{auth_password}' |
334 | +org.gnome.system.proxy.http authentication-user '{auth_user}' |
335 | +org.gnome.system.proxy.http host '{http_host}' |
336 | +org.gnome.system.proxy.http port {http_port} |
337 | +org.gnome.system.proxy.http use-authentication {http_use_auth} |
338 | +org.gnome.system.proxy.https host '{https_host}' |
339 | +org.gnome.system.proxy.https port {https_port} |
340 | +org.gnome.system.proxy.socks host '{socks_host}' |
341 | +org.gnome.system.proxy.socks port {socks_port} |
342 | +""" |
343 | + |
344 | +BASE_GSETTINGS_VALUES = { |
345 | + "autoconfig_url": "", |
346 | + "ignore_hosts": ["localhost", "127.0.0.0/8"], |
347 | + "mode": "none", |
348 | + "ftp_host": "", |
349 | + "ftp_port": 0, |
350 | + "auth_password": "", |
351 | + "auth_user": "", |
352 | + "http_host": "", |
353 | + "http_port": 0, |
354 | + "http_use_auth": "false", |
355 | + "https_host": "", |
356 | + "https_port": 0, |
357 | + "socks_host": "", |
358 | + "socks_port": 0, |
359 | +} |
360 | + |
361 | + |
362 | +class ProxySettingsTestCase(TestCase): |
363 | + """Test the getting of the proxy settings.""" |
364 | + |
365 | + def test_gsettings_cmdline_correct(self): |
366 | + """The command line used to get the proxy settings is the right one.""" |
367 | + expected = "gsettings list-recursively org.gnome.system.proxy".split() |
368 | + called = [] |
369 | + |
370 | + def append_output(args): |
371 | + """Append the output and return some settings.""" |
372 | + called.append(args) |
373 | + return TEMPLATE_GSETTINGS_OUTPUT.format(**BASE_GSETTINGS_VALUES) |
374 | + |
375 | + self.patch(gsettings.subprocess, "check_output", append_output) |
376 | + gsettings.get_proxy_settings() |
377 | + self.assertEqual(called[0], expected) |
378 | + |
379 | + def test_gsettings_parser_none(self): |
380 | + """Test a parser of gsettings.""" |
381 | + expected = {} |
382 | + fake_output = TEMPLATE_GSETTINGS_OUTPUT.format(**BASE_GSETTINGS_VALUES) |
383 | + self.patch(gsettings.subprocess, "check_output", |
384 | + lambda _: fake_output) |
385 | + ps = gsettings.get_proxy_settings() |
386 | + self.assertEqual(ps, expected) |
387 | + |
388 | + def test_gsettings_parser_http_anonymous(self): |
389 | + """Test a parser of gsettings.""" |
390 | + template_values = dict(BASE_GSETTINGS_VALUES) |
391 | + expected_host = "expected_host" |
392 | + expected_port = 54321 |
393 | + expected = { |
394 | + "host": expected_host, |
395 | + "port": expected_port, |
396 | + } |
397 | + template_values.update({ |
398 | + "mode": "manual", |
399 | + "http_host": expected_host, |
400 | + "http_port": expected_port, |
401 | + }) |
402 | + fake_output = TEMPLATE_GSETTINGS_OUTPUT.format(**template_values) |
403 | + self.patch(gsettings.subprocess, "check_output", |
404 | + lambda _: fake_output) |
405 | + ps = gsettings.get_proxy_settings() |
406 | + self.assertEqual(ps, expected) |
407 | + |
408 | + def test_gsettings_parser_http_authenticated(self): |
409 | + """Test a parser of gsettings.""" |
410 | + template_values = dict(BASE_GSETTINGS_VALUES) |
411 | + expected_host = "expected_host" |
412 | + expected_port = 54321 |
413 | + expected_user = "carlitos" |
414 | + expected_password = "very secret password" |
415 | + expected = { |
416 | + "host": expected_host, |
417 | + "port": expected_port, |
418 | + "username": expected_user, |
419 | + "password": expected_password, |
420 | + } |
421 | + template_values.update({ |
422 | + "mode": "manual", |
423 | + "http_host": expected_host, |
424 | + "http_port": expected_port, |
425 | + "auth_user": expected_user, |
426 | + "auth_password": expected_password, |
427 | + "http_use_auth": "true", |
428 | + }) |
429 | + fake_output = TEMPLATE_GSETTINGS_OUTPUT.format(**template_values) |
430 | + self.patch(gsettings.subprocess, "check_output", |
431 | + lambda _: fake_output) |
432 | + ps = gsettings.get_proxy_settings() |
433 | + self.assertEqual(ps, expected) |
434 | |
435 | === modified file 'ubuntu_sso/utils/webclient/tests/test_webclient.py' |
436 | --- ubuntu_sso/utils/webclient/tests/test_webclient.py 2011-12-14 15:29:02 +0000 |
437 | +++ ubuntu_sso/utils/webclient/tests/test_webclient.py 2012-01-11 04:40:28 +0000 |
438 | @@ -25,6 +25,7 @@ |
439 | from twisted.web import http, resource, server, guard |
440 | |
441 | from ubuntuone.devtools.testcases import TestCase |
442 | +from ubuntuone.devtools.testcases.squid import SquidTestCase |
443 | |
444 | from ubuntu_sso.utils import webclient |
445 | |
446 | @@ -50,6 +51,8 @@ |
447 | GUARDED = "guarded" |
448 | OAUTHRESOURCE = "oauthresource" |
449 | |
450 | +WEBCLIENT_MODULE_NAME = webclient.webclient_module().__name__ |
451 | + |
452 | |
453 | def sample_get_credentials(): |
454 | """Will return the sample credentials right now.""" |
455 | @@ -109,6 +112,29 @@ |
456 | return "ERROR: Expected OAuth header not present." |
457 | |
458 | |
459 | +class SaveHTTPChannel(http.HTTPChannel): |
460 | + """A save protocol to be used in tests.""" |
461 | + |
462 | + protocolInstance = None |
463 | + |
464 | + def connectionMade(self): |
465 | + """Keep track of the given protocol.""" |
466 | + SaveHTTPChannel.protocolInstance = self |
467 | + http.HTTPChannel.connectionMade(self) |
468 | + |
469 | + |
470 | +class SaveSite(server.Site): |
471 | + """A site that let us know when it closed.""" |
472 | + |
473 | + protocol = SaveHTTPChannel |
474 | + |
475 | + def __init__(self, *args, **kwargs): |
476 | + """Create a new instance.""" |
477 | + server.Site.__init__(self, *args, **kwargs) |
478 | + # we disable the timeout in the tests, we will deal with it manually. |
479 | + self.timeOut = None |
480 | + |
481 | + |
482 | class MockWebServer(object): |
483 | """A mock webserver for testing""" |
484 | |
485 | @@ -134,23 +160,25 @@ |
486 | [cred_factory]) |
487 | root.putChild(GUARDED, guarded_resource) |
488 | |
489 | - site = server.Site(root) |
490 | + self.site = SaveSite(root) |
491 | application = service.Application('web') |
492 | self.service_collection = service.IServiceCollection(application) |
493 | #pylint: disable=E1101 |
494 | - self.tcpserver = internet.TCPServer(0, site) |
495 | + self.tcpserver = internet.TCPServer(0, self.site) |
496 | self.tcpserver.setServiceParent(self.service_collection) |
497 | self.service_collection.startService() |
498 | |
499 | - def get_url(self): |
500 | - """Build the url for this mock server.""" |
501 | + def get_iri(self): |
502 | + """Build the iri for this mock server.""" |
503 | #pylint: disable=W0212 |
504 | port_num = self.tcpserver._port.getHost().port |
505 | - return "http://localhost:%d/" % port_num |
506 | + return u"http://127.0.0.1:%d/" % port_num |
507 | |
508 | def stop(self): |
509 | """Shut it down.""" |
510 | #pylint: disable=E1101 |
511 | + if self.site.protocol.protocolInstance: |
512 | + self.site.protocol.protocolInstance.timeoutConnection() |
513 | return self.service_collection.stopService() |
514 | |
515 | |
516 | @@ -206,40 +234,46 @@ |
517 | yield super(WebClientTestCase, self).setUp() |
518 | self.ws = MockWebServer() |
519 | self.addCleanup(self.ws.stop) |
520 | - self.base_url = self.ws.get_url() |
521 | + self.base_iri = self.ws.get_iri() |
522 | self.wc = webclient.webclient_factory() |
523 | self.addCleanup(self.wc.shutdown) |
524 | # pylint: disable=W0511 |
525 | # TODO: skewed timestamp correction |
526 | |
527 | @defer.inlineCallbacks |
528 | - def test_get_url(self): |
529 | - """A url is successfully retrieved from the mock webserver.""" |
530 | - result = yield self.wc.request(self.base_url + SIMPLERESOURCE) |
531 | + def test_request_takes_an_iri(self): |
532 | + """Passing a non-unicode iri fails.""" |
533 | + d = self.wc.request(bytes(self.base_iri + SIMPLERESOURCE)) |
534 | + yield self.assertFailure(d, TypeError) |
535 | + |
536 | + @defer.inlineCallbacks |
537 | + def test_get_iri(self): |
538 | + """Passing in a unicode iri works fine.""" |
539 | + result = yield self.wc.request(self.base_iri + SIMPLERESOURCE) |
540 | self.assertEqual(SAMPLE_RESOURCE, result.content) |
541 | |
542 | @defer.inlineCallbacks |
543 | - def test_get_url_error(self): |
544 | + def test_get_iri_error(self): |
545 | """The errback is called when there's some error.""" |
546 | - yield self.assertFailure(self.wc.request(self.base_url + THROWERROR), |
547 | + yield self.assertFailure(self.wc.request(self.base_iri + THROWERROR), |
548 | webclient.WebClientError) |
549 | |
550 | @defer.inlineCallbacks |
551 | def test_unauthorized(self): |
552 | """Detect when a request failed with the UNAUTHORIZED http code.""" |
553 | - yield self.assertFailure(self.wc.request(self.base_url + UNAUTHORIZED), |
554 | + yield self.assertFailure(self.wc.request(self.base_iri + UNAUTHORIZED), |
555 | webclient.UnauthorizedError) |
556 | |
557 | @defer.inlineCallbacks |
558 | def test_method_head(self): |
559 | """The HTTP method is used.""" |
560 | - result = yield self.wc.request(self.base_url + HEADONLY, method="HEAD") |
561 | + result = yield self.wc.request(self.base_iri + HEADONLY, method="HEAD") |
562 | self.assertEqual("", result.content) |
563 | |
564 | @defer.inlineCallbacks |
565 | def test_send_extra_headers(self): |
566 | """The extra_headers are sent to the server.""" |
567 | - result = yield self.wc.request(self.base_url + VERIFYHEADERS, |
568 | + result = yield self.wc.request(self.base_iri + VERIFYHEADERS, |
569 | extra_headers=SAMPLE_HEADERS) |
570 | self.assertEqual(SAMPLE_RESOURCE, result.content) |
571 | |
572 | @@ -249,17 +283,55 @@ |
573 | other_wc = webclient.webclient_factory(username=SAMPLE_USERNAME, |
574 | password=SAMPLE_PASSWORD) |
575 | self.addCleanup(other_wc.shutdown) |
576 | - result = yield other_wc.request(self.base_url + GUARDED) |
577 | + result = yield other_wc.request(self.base_iri + GUARDED) |
578 | self.assertEqual(SAMPLE_RESOURCE, result.content) |
579 | |
580 | @defer.inlineCallbacks |
581 | def test_request_is_oauth_signed(self): |
582 | """The request is oauth signed.""" |
583 | - result = yield self.wc.request(self.base_url + OAUTHRESOURCE, |
584 | + result = yield self.wc.request(self.base_iri + OAUTHRESOURCE, |
585 | oauth_credentials=SAMPLE_CREDENTIALS) |
586 | self.assertEqual(SAMPLE_RESOURCE, result.content) |
587 | |
588 | |
589 | +class BasicProxyTestCase(SquidTestCase): |
590 | + """Test that the proxy works at all.""" |
591 | + |
592 | + @defer.inlineCallbacks |
593 | + def setUp(self): |
594 | + yield super(BasicProxyTestCase, self).setUp() |
595 | + self.ws = MockWebServer() |
596 | + self.addCleanup(self.ws.stop) |
597 | + self.base_iri = self.ws.get_iri() |
598 | + self.wc = webclient.webclient_factory() |
599 | + self.addCleanup(self.wc.shutdown) |
600 | + |
601 | + def assert_header_contains(self, headers, expected): |
602 | + """One of the headers matching key must contain a given value.""" |
603 | + self.assertTrue(any(expected in value for value in headers)) |
604 | + |
605 | + @defer.inlineCallbacks |
606 | + def test_anonymous_proxy_is_used(self): |
607 | + """The anonymous proxy is used by the webclient.""" |
608 | + settings = self.get_nonauth_proxy_settings() |
609 | + self.wc.force_use_proxy(settings) |
610 | + result = yield self.wc.request(self.base_iri + SIMPLERESOURCE) |
611 | + self.assert_header_contains(result.headers["Via"], "squid") |
612 | + |
613 | + @defer.inlineCallbacks |
614 | + def test_authenticated_proxy_is_used(self): |
615 | + """The authenticated proxy is used by the webclient.""" |
616 | + settings = self.get_auth_proxy_settings() |
617 | + self.wc.force_use_proxy(settings) |
618 | + result = yield self.wc.request(self.base_iri + SIMPLERESOURCE) |
619 | + self.assert_header_contains(result.headers["Via"], "squid") |
620 | + |
621 | + if WEBCLIENT_MODULE_NAME.endswith(".txweb"): |
622 | + reason = "txweb does not support proxies." |
623 | + test_anonymous_proxy_is_used.skip = reason |
624 | + test_authenticated_proxy_is_used.skip = reason |
625 | + |
626 | + |
627 | class OAuthTestCase(TestCase): |
628 | """Test for the oauth signing code.""" |
629 | |
630 | |
631 | === added file 'ubuntu_sso/utils/webclient/tests/webclient_demo.py' |
632 | --- ubuntu_sso/utils/webclient/tests/webclient_demo.py 1970-01-01 00:00:00 +0000 |
633 | +++ ubuntu_sso/utils/webclient/tests/webclient_demo.py 2012-01-11 04:40:28 +0000 |
634 | @@ -0,0 +1,51 @@ |
635 | +# -*- coding: utf-8 -*- |
636 | +# |
637 | +# Copyright 2011 Canonical Ltd. |
638 | +# |
639 | +# This program is free software: you can redistribute it and/or modify it |
640 | +# under the terms of the GNU General Public License version 3, as published |
641 | +# by the Free Software Foundation. |
642 | +# |
643 | +# This program is distributed in the hope that it will be useful, but |
644 | +# WITHOUT ANY WARRANTY; without even the implied warranties of |
645 | +# MERCHANTABILITY, SATISFACTORY QUALITY, or FITNESS FOR A PARTICULAR |
646 | +# PURPOSE. See the GNU General Public License for more details. |
647 | +# |
648 | +# You should have received a copy of the GNU General Public License along |
649 | +# with this program. If not, see <http://www.gnu.org/licenses/>. |
650 | + |
651 | +"""Sample program that uses the Qt webclient.""" |
652 | + |
653 | +import sys |
654 | +from PyQt4 import QtGui |
655 | +# need to create the QApplication before installing the reactor |
656 | +QtGui.QApplication(sys.argv) |
657 | +import qt4reactor |
658 | +qt4reactor.install() |
659 | + |
660 | +from pprint import pprint |
661 | +from ubuntu_sso.utils.webclient import qtnetwork |
662 | +from twisted.internet import defer, reactor |
663 | + |
664 | + |
665 | +@defer.inlineCallbacks |
666 | +def do_web_call(): |
667 | + """Wait for the webcall to finish and stop the reactor.""" |
668 | + webclient = qtnetwork.WebClient() |
669 | + d = webclient.request(u"http://slashdot.org") |
670 | + try: |
671 | + response = yield d |
672 | + pprint(response.headers.items()) |
673 | + finally: |
674 | + reactor.stop() |
675 | + |
676 | + |
677 | +def main(): |
678 | + """The main function of this test script.""" |
679 | + do_web_call() |
680 | + # pylint: disable=E1101 |
681 | + reactor.run() |
682 | + |
683 | + |
684 | +if __name__ == "__main__": |
685 | + main() |
686 | |
687 | === modified file 'ubuntu_sso/utils/webclient/txweb.py' |
688 | --- ubuntu_sso/utils/webclient/txweb.py 2011-12-13 19:26:16 +0000 |
689 | +++ ubuntu_sso/utils/webclient/txweb.py 2012-01-11 04:40:28 +0000 |
690 | @@ -32,9 +32,10 @@ |
691 | """A simple web client that does not support proxies, yet.""" |
692 | |
693 | @defer.inlineCallbacks |
694 | - def request(self, url, method="GET", extra_headers=None, |
695 | + def request(self, iri, method="GET", extra_headers=None, |
696 | oauth_credentials=None): |
697 | """Get the page, or fail trying.""" |
698 | + uri = self.iri_to_uri(iri) |
699 | if extra_headers: |
700 | headers = dict(extra_headers) |
701 | else: |
702 | @@ -42,7 +43,7 @@ |
703 | |
704 | if oauth_credentials: |
705 | timestamp = yield self.get_timestamp() |
706 | - oauth_headers = self.build_oauth_headers(method, url, |
707 | + oauth_headers = self.build_oauth_headers(method, uri, |
708 | oauth_credentials, timestamp) |
709 | headers.update(oauth_headers) |
710 | |
711 | @@ -51,10 +52,14 @@ |
712 | headers["Authorization"] = "Basic " + auth |
713 | |
714 | try: |
715 | - result = yield client.getPage(url, method=method, headers=headers) |
716 | + result = yield client.getPage(uri, method=method, headers=headers) |
717 | response = Response(result) |
718 | defer.returnValue(response) |
719 | except error.Error as e: |
720 | if int(e.status) == http.UNAUTHORIZED: |
721 | raise UnauthorizedError(e.message) |
722 | raise WebClientError(e.message) |
723 | + |
724 | + def force_use_proxy(self, settings): |
725 | + """Setup this webclient to use the given proxy settings.""" |
726 | + # No proxy support in twisted.web.client |